Sample records for estimation program capable

  1. Constellation Program Life-cycle Cost Analysis Model (LCAM)

    NASA Technical Reports Server (NTRS)

    Prince, Andy; Rose, Heidi; Wood, James

    2008-01-01

    The Constellation Program (CxP) is NASA's effort to replace the Space Shuttle, return humans to the moon, and prepare for a human mission to Mars. The major elements of the Constellation Lunar sortie design reference mission architecture are shown. Unlike the Apollo Program of the 1960s, affordability is a major concern of United States policy makers and NASA management. To measure Constellation affordability, a total-ownership-cost, life-cycle parametric cost estimating capability is required. This capability is being developed by the Constellation Systems Engineering and Integration (SE&I) Directorate, and is called the Lifecycle Cost Analysis Model (LCAM). The requirements for LCAM are based on the need to have a parametric estimating capability in order to do top-level program analysis, evaluate design alternatives, and explore options for future systems. By estimating the total cost of ownership within the context of the planned Constellation budget, LCAM can provide Program and NASA management with the cost data necessary to identify the most affordable alternatives. LCAM is also a key component of the Integrated Program Model (IPM), an SE&I-developed capability that combines parametric sizing tools with cost, schedule, and risk models to perform program analysis. LCAM is used in the generation of cost estimates for system-level trades and analyses. It draws upon the legacy of previous architecture-level cost models, such as the Exploration Systems Mission Directorate (ESMD) Architecture Cost Model (ARCOM) developed for Simulation Based Acquisition (SBA), and ATLAS. LCAM is used to support requirements and design trade studies by calculating changes in cost relative to a baseline option cost. Estimated costs are generally low fidelity to accommodate available input data and available cost estimating relationships (CERs). LCAM is capable of interfacing with the Integrated Program Model to provide the cost estimating capability for that suite of tools.

  2. IMAGES: A digital computer program for interactive modal analysis and gain estimation for eigensystem synthesis

    NASA Technical Reports Server (NTRS)

    Jones, R. L.

    1984-01-01

    An interactive digital computer program for modal analysis and gain estimation for eigensystem synthesis was written. Both mathematical and operational considerations are described; however, the mathematical presentation is limited to those concepts essential to the operational capability of the program. The program is capable of both modal and spectral synthesis of multi-input control systems. It is user friendly, has scratchpad capability and dynamic memory, and can be used to design either state or output feedback systems.

  3. Biometrics Enabling Capability Increment 1 (BEC Inc 1)

    DTIC Science & Technology

    2016-03-01

    2016 Major Automated Information System Annual Report: Biometrics Enabling Capability Increment 1 (BEC Inc 1), Defense Acquisition Management... Date Assigned: July 15, 2015... Program Name: Biometrics Enabling Capability Increment 1 (BEC Inc 1)... therefore, no Original Estimate has been established... Program Description: The Biometrics Enabling Capability (BEC

  4. Parallel computers - Estimate errors caused by imprecise data

    NASA Technical Reports Server (NTRS)

    Kreinovich, Vladik; Bernat, Andrew; Villa, Elsa; Mariscal, Yvonne

    1991-01-01

    A new approach to the problem of estimating errors caused by imprecise data is proposed in the context of software engineering. A software device is used to produce an ideal solution to the problem, when the computer is capable of computing errors of arbitrary programs. The software engineering aspect of this problem is to describe a device for computing the error estimates in software terms and then to provide precise numbers with error estimates to the user. The feasibility of the program capable of computing both some quantity and its error estimate in the range of possible measurement errors is demonstrated.
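    The abstract leaves the software device unspecified; one classical way to realize "a program that computes both a quantity and its error estimate" is naive interval arithmetic, sketched below (the `Interval` class and the sample function are illustrative assumptions, not the authors' design):

```python
# Minimal interval-arithmetic sketch: each value carries its error bounds,
# and every operation propagates those bounds alongside the result.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

# a measurement x = 2.0 +/- 0.1 pushed through f(x) = x*x + x
x = Interval(1.9, 2.1)
y = x * x + x   # y brackets every value f can take on the input range
```

    For monotone expressions like this one the bounds are tight; in general, interval arithmetic may overestimate the error, which is part of what makes computing exact error ranges a hard problem.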

  5. Space Programs: NASA's Independent Cost Estimating Capability Needs Improvement

    DTIC Science & Technology

    1992-11-01

    United States General Accounting Office, Washington, D.C. ... advisory committee's recommendation to strengthen NASA's independent cost estimating capability. Congress and the executive branch need accurate cost estimates in deciding whether to undertake or continue space programs, which often cost millions or even billions of dollars. In December 1990, the

  6. Spacelab baseline ECS trace contaminant removal test program

    NASA Technical Reports Server (NTRS)

    Ray, C. D.; Stanley, J. B.

    1977-01-01

    An estimate of the Spacelab Baseline Environmental Control System's trace contaminant removal capability was required to allow determination of the need for a supplemental trace contaminant removal system. Results from a test program to determine this removal capability are presented.

  7. 76 FR 4348 - Labor-Management Cooperation Grant Program Information Collection Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ...: Application for Federal Assistance (SF-424), Accounting System and Financial Capability Questionnaire (LM-3... the estimated time per response is 60 minutes. The Accounting System and Financial Capability... the information were not collected, there could be no accounting for the activities of the program...

  8. Improving the Discipline of Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Piland, William M.; Pine, David J.; Wilson, Delano M.

    2000-01-01

    The need to improve the quality and accuracy of cost estimates of proposed new aerospace systems has been widely recognized. The industry has done the best job of maintaining related capability with improvements in estimation methods and giving appropriate priority to the hiring and training of qualified analysts. Some parts of Government, and the National Aeronautics and Space Administration (NASA) in particular, continue to need major improvements in this area. Recently, NASA recognized that its cost estimation and analysis capabilities had eroded to the point that the ability to provide timely, reliable estimates was impacting the confidence in planning many program activities. As a result, this year the Agency established a lead role for cost estimation and analysis. The Independent Program Assessment Office located at the Langley Research Center was given this responsibility.

  9. COSTMODL: An automated software development cost estimation tool

    NASA Technical Reports Server (NTRS)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sector. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms, including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
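    COSTMODL's actual cost estimating relationships are not given in the abstract; as an illustration of the kind of recalibratable effort equation such tools use, here is a COCOMO-style power-law sketch (the default coefficients and the `drivers` multipliers are placeholders an organization would recalibrate, not COSTMODL's values):

```python
# COCOMO-style effort model sketch: effort (person-months) grows as a
# power law of size in thousands of source lines (KSLOC), scaled by
# multiplicative cost drivers. a and b are calibration constants;
# these defaults are illustrative only.
def effort_pm(ksloc, a=2.8, b=1.05, drivers=()):
    effort = a * ksloc ** b
    for m in drivers:           # e.g. required reliability, team experience
        effort *= m
    return effort
```

    Recalibration to local data, the feature the abstract emphasizes, amounts to refitting a and b, for example by regressing log(effort) on log(KSLOC) over completed projects.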

  10. Dual diagnosis capability in mental health and addiction treatment services: An assessment of programs across multiple state systems

    PubMed Central

    McGovern, Mark P.; Lambert-Harris, Chantal; Gotham, Heather J.; Claus, Ronald E.; Xie, Haiyi

    2012-01-01

    Despite increased awareness of the benefits of integrated services for persons with co-occurring substance use and psychiatric disorders, estimates of the availability of integrated services vary widely. The present study utilized standardized measures of program capacity to address co-occurring disorders, the Dual Diagnosis Capability in Addiction Treatment (DDCAT) and Dual Diagnosis Capability in Mental Health Treatment (DDCMHT) indexes, and sampled 256 programs across the United States. Approximately 18% of addiction treatment and 9% of mental health programs met criteria for dual diagnosis capable services. This is the first report on public access to integrated services using objective measures. PMID:23183873

  11. The Counterfactual Self-Estimation of Program Participants: Impact Assessment without Control Groups or Pretests

    ERIC Educational Resources Information Center

    Mueller, Christoph Emanuel; Gaus, Hansjoerg; Rech, Joerg

    2014-01-01

    This article proposes an innovative approach to estimating the counterfactual without the necessity of generating information from either a control group or a before-measure. Building on the idea that program participants are capable of estimating the hypothetical state they would be in had they not participated, the basics of the Roy-Rubin model…
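    The record is truncated, but the arithmetic behind the approach is straightforward: each participant's self-estimated counterfactual substitutes for a control-group outcome, and the program effect is the mean of the individual differences. A toy sketch with invented numbers:

```python
# hypothetical post-program outcomes and each participant's self-estimated
# counterfactual ("what would your outcome have been without the program?")
observed       = [72, 65, 80, 58, 77, 69]
counterfactual = [60, 62, 71, 55, 70, 61]

# individual differences and their mean stand in for the treatment effect
effects = [o - c for o, c in zip(observed, counterfactual)]
avg_effect = sum(effects) / len(effects)
```

    The Roy-Rubin framing enters when asking whether such self-estimates are unbiased stand-ins for the true unobserved potential outcomes.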

  12. A real-time digital program for estimating aircraft stability and control parameters from flight test data by using the maximum likelihood method

    NASA Technical Reports Server (NTRS)

    Grove, R. D.; Mayhew, S. C.

    1973-01-01

    A computer program (Langley program C1123) has been developed for estimating aircraft stability and control parameters from flight test data. These parameters are estimated by the maximum likelihood estimation procedure implemented on a real-time digital simulation system, which uses the Control Data 6600 computer. This system allows the investigator to interact with the program in order to obtain satisfactory results. Part of this system, the control and display capabilities, is described for this program. This report also describes the computer program by presenting the program variables, subroutines, flow charts, listings, and operational features. Program usage is demonstrated with a test case using pseudo or simulated flight data.

  13. GEODYN programmers guide, volume 2, part 1

    NASA Technical Reports Server (NTRS)

    Mullins, N. E.; Goad, C. C.; Dao, N. C.; Martin, T. V.; Boulware, N. L.; Chin, M. M.

    1972-01-01

    A guide to the GEODYN Program is presented. The program estimates orbit and geodetic parameters. It possesses the capability to estimate that set of orbital elements, station positions, measurement biases, and a set of force model parameters such that the orbital tracking data from multiple arcs of multiple satellites best fit the entire set of estimated parameters. GEODYN consists of 113 different program segments, including the main program, subroutines, functions, and block data routines. All are in G or H level FORTRAN and are currently operational on GSFC's IBM 360/95 and IBM 360/91.

  14. A Study of Simulator Capabilities in an Operational Training Program.

    ERIC Educational Resources Information Center

    Meyer, Donald E.; And Others

    The experiment was conducted to determine the effects of simulator training to criterion proficiency upon time required in the aircraft. Data were also collected on proficiency levels attained, self-confidence levels, individual estimates of capability, and sources from which that capability was derived. Subjects for the experiment--48 airline…

  15. User's manual for MMLE3, a general FORTRAN program for maximum likelihood parameter estimation

    NASA Technical Reports Server (NTRS)

    Maine, R. E.; Iliff, K. W.

    1980-01-01

    A user's manual for the FORTRAN IV computer program MMLE3 is presented. MMLE3 is a maximum likelihood parameter estimation program capable of handling general bilinear dynamic equations of arbitrary order with measurement noise and/or state noise (process noise). The theory and use of the program are described. The basic MMLE3 program is quite general and, therefore, applicable to a wide variety of problems. The basic program can interact with a set of user-written, problem-specific routines to simplify the use of the program on specific systems. A set of user routines for the aircraft stability and control derivative estimation problem is provided with the program.
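    MMLE3 itself is a general FORTRAN code for bilinear systems; the core output-error maximum-likelihood idea can be sketched in a few lines for a first-order system with Gaussian measurement noise (the grid search below is a crude stand-in for MMLE3's iterative optimization, and all numbers are invented for illustration):

```python
import random

random.seed(0)

# simulate a first-order system x[k+1] = a*x[k] + u[k] plus measurement noise
a_true = 0.8
u = [1.0 if k < 5 else 0.0 for k in range(50)]   # known input: a short pulse
x = [0.0]
for k in range(49):
    x.append(a_true * x[k] + u[k])
z = [xi + random.gauss(0.0, 0.05) for xi in x]   # noisy measurements

def neg_log_like(a):
    # propagate the model with candidate parameter a and compare to z;
    # for Gaussian measurement noise, -log(likelihood) is, up to constants,
    # the sum of squared output errors
    xh = [0.0]
    for k in range(49):
        xh.append(a * xh[k] + u[k])
    return sum((zi - xi) ** 2 for zi, xi in zip(z, xh))

# crude likelihood maximization over a parameter grid
a_hat = min((i / 1000.0 for i in range(2000)), key=neg_log_like)
```

    The estimate recovers a value close to the true parameter despite the measurement noise; MMLE3 replaces the grid with Newton-type iterations and handles vector states, state noise, and user-supplied model routines.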

  16. GEODYN operations description, volume 3. [computer program for estimation of orbit and geodetic parameters

    NASA Technical Reports Server (NTRS)

    Martin, T. V.; Mullins, N. E.

    1972-01-01

    The operating and set-up procedures for the multi-satellite, multi-arc GEODYN Orbit Determination Program are described. All system output is analyzed. The GEODYN Program is the nucleus of the entire GEODYN system. It is a definitive orbit and geodetic parameter estimation program capable of simultaneously processing observations from multiple arcs of multiple satellites. GEODYN has two modes of operation: (1) the data reduction mode and (2) the orbit generation mode.

  17. Selected Tether Applications Cost Model

    NASA Technical Reports Server (NTRS)

    Keeley, Michael G.

    1988-01-01

    Diverse cost-estimating techniques and data combined into single program. Selected Tether Applications Cost Model (STACOM 1.0) is interactive accounting software tool providing means for combining several independent cost-estimating programs into fully-integrated mathematical model capable of assessing costs, analyzing benefits, providing file-handling utilities, and putting out information in text and graphical forms to screen, printer, or plotter. Program based on Lotus 1-2-3, version 2.0. Developed to provide clear, concise traceability and visibility into methodology and rationale for estimating costs and benefits of operations of Space Station tether deployer system.

  18. Improving The Discipline of Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Piland, William M.; Pine, David J.; Wilson, Delano M.

    2000-01-01

    The need to improve the quality and accuracy of cost estimates of proposed new aerospace systems has been widely recognized. The industry has done the best job of maintaining related capability with improvements in estimation methods and giving appropriate priority to the hiring and training of qualified analysts. Some parts of Government, and the National Aeronautics and Space Administration (NASA) in particular, continue to need major improvements in this area. Recently, NASA recognized that its cost estimation and analysis capabilities had eroded to the point that the ability to provide timely, reliable estimates was impacting the confidence in planning many program activities. As a result, this year the Agency established a lead role for cost estimation and analysis. The Independent Program Assessment Office located at the Langley Research Center was given this responsibility. This paper presents the plans for the newly established role. Described is how the Independent Program Assessment Office, working with all NASA Centers, NASA Headquarters, other Government agencies, and industry, is focused on creating cost estimation and analysis as a professional discipline that will be recognized equally with the technical disciplines needed to design new space and aeronautics activities. Investments in selected new analysis tools, creating advanced training opportunities for analysts, and developing career paths for future analysts engaged in the discipline are all elements of the plan. Plans also include increasing the human resources available to conduct independent cost analysis of Agency programs during their formulation, to improve near-term capability to conduct economic cost-benefit assessments, to support NASA management's decision process, and to provide cost analysis results emphasizing "full-cost" and "full-life cycle" considerations. The Agency cost analysis improvement plan has been approved for implementation starting this calendar year. Adequate financial and human resources are being made available to accomplish the goals of this important effort, and all indications are that NASA's cost estimation and analysis core competencies will be substantially improved within the foreseeable future.

  19. TRAC-P1: an advanced best estimate computer program for PWR LOCA analysis. I. Methods, models, user information, and programming details

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-05-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos Scientific Laboratory (LASL) to provide an advanced "best estimate" predictive capability for the analysis of postulated accidents in light water reactors (LWRs). TRAC-P1 provides this analysis capability for pressurized water reactors (PWRs) and for a wide variety of thermal-hydraulic experimental facilities. It features a three-dimensional treatment of the pressure vessel and associated internals; two-phase nonequilibrium hydrodynamics models; flow-regime-dependent constitutive equation treatment; reflood tracking capability for both bottom flood and falling film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. The TRAC-P1 User's Manual is composed of two separate volumes. Volume I gives a description of the thermal-hydraulic models and numerical solution methods used in the code. Detailed programming and user information is also provided. Volume II presents the results of the developmental verification calculations.

  20. Department of the Navy Acquisition and Capabilities Guidebook

    DTIC Science & Technology

    2012-05-01

    Cost Estimates/Service Cost Position..................................... 5-1 5.1.2 Cost Analysis Requirements Description (CARD) 5-2 5.1.3...Description (CARD). 7. Satisfactory review of program health. 8. Concurrence with draft TDS, TES, and SEP. 9. Approval of full funding...Description (CARD) SECNAV M-5000.2 May 2012 5-3 Enclosure (1) A sound cost estimate is based on a well-defined program. The CARD is used

  1. Network capability estimation. Vela network evaluation and automatic processing research. Technical report. [NETWORTH]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snell, N.S.

    1976-09-24

    NETWORTH is a computer program which calculates the detection and location capability of seismic networks. A modified version of NETWORTH has been developed. This program has been used to evaluate the effect of station 'downtime', the signal amplitude variance, and the station detection threshold upon network detection capability. In this version all parameters may be changed separately for individual stations. The capability of using signal amplitude corrections has been added. The function of amplitude corrections is to remove possible bias in the magnitude estimate due to inhomogeneous signal attenuation. These corrections may be applied to individual stations, individual epicenters, or individual station/epicenter combinations. An option has been added to calculate the effect of station 'downtime' upon network capability. This study indicates that, if capability loss due to detection errors can be minimized, then station detection threshold and station reliability will be the fundamental limits to network performance. A baseline network of thirteen stations has been evaluated. These stations are as follows: Alaskan Long Period Array (ALPA); Ankara (ANK); Chiang Mai (CHG); Korean Seismic Research Station (KSRS); Large Aperture Seismic Array (LASA); Mashhad (MSH); Mundaring (MUN); Norwegian Seismic Array (NORSAR); New Delhi (NWDEL); Red Knife, Ontario (RK-ON); Shillong (SHL); Taipei (TAP); and White Horse, Yukon (WH-YK).
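    The report's formulation is not reproduced here, but the kind of calculation such a program performs can be sketched: combine per-station detection probabilities (threshold, amplitude scatter, and uptime) into the probability that enough stations detect an event. The model below, including the four-station detection rule and all numbers, is an illustrative assumption, not NETWORTH's actual method:

```python
import math

def station_detect_prob(mb, threshold, sigma, uptime):
    # P(detection) for one station: Gaussian scatter of observed magnitude
    # about the event m_b, gated by the station's availability (uptime)
    z = (mb - threshold) / sigma
    phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return uptime * phi

def network_detect_prob(mb, stations, need=4):
    # P(at least `need` stations detect), via a dynamic program over the
    # distribution of the number of detecting stations
    ps = [station_detect_prob(mb, t, s, u) for t, s, u in stations]
    dist = [1.0]                   # dist[j] = P(exactly j detections so far)
    for p in ps:
        dist = [(dist[j] * (1.0 - p) if j < len(dist) else 0.0)
                + (dist[j - 1] * p if j > 0 else 0.0)
                for j in range(len(dist) + 1)]
    return sum(dist[need:])

# a uniform 13-station network: threshold m_b 4.0, scatter 0.3, 90% uptime
stations = [(4.0, 0.3, 0.9)] * 13
p_large = network_detect_prob(4.5, stations)   # event well above threshold
p_small = network_detect_prob(3.5, stations)   # event well below threshold
```

    Lowering `uptime` caps each station's detection probability no matter how strong the event, which is one way to see why the study singles out station reliability as a fundamental limit on network performance.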

  2. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 6: IPAD system development and operation

    NASA Technical Reports Server (NTRS)

    Redhed, D. D.; Tripp, L. L.; Kawaguchi, A. S.; Miller, R. E., Jr.

    1973-01-01

    The strategy of the IPAD implementation plan presented proposes a three-phase development of the IPAD system and technical modules, and the transfer of this capability from the development environment to the aerospace vehicle design environment. The system and technical module capabilities for each phase of development are described. The system and technical module programming languages are recommended, as well as the initial host computer system hardware and operating system. The cost of developing the IPAD technology is estimated. A schedule displaying the flowtime required for each development task is given. A PERT chart gives the developmental relationships of each of the tasks, and an estimate of the operational cost of the IPAD system is offered.

  3. Probabilistic evaluation of earthquake detection and location capability for Illinois, Indiana, Kentucky, Ohio, and West Virginia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mauk, F.J.; Christensen, D.H.

    1980-09-01

    Probabilistic estimations of earthquake detection and location capabilities for the states of Illinois, Indiana, Kentucky, Ohio and West Virginia are presented in this document. The algorithm used in these epicentrality and minimum-magnitude estimations is a version of the program NETWORTH by Wirth, Blandford, and Husted (DARPA Order No. 2551, 1978) which was modified for local array evaluation at the University of Michigan Seismological Observatory. Estimations of earthquake detection capability for the years 1970 and 1980 are presented in four regional minimum m_b magnitude contour maps. Regional 90% confidence error ellipsoids are included for m_b magnitude events from 2.0 through 5.0 at 0.5 m_b unit increments. The close agreement between these predicted epicentral 90% confidence estimates and the calculated error ellipses associated with actual earthquakes within the studied region suggests that these error determinations can be used to estimate the reliability of epicenter location. 8 refs., 14 figs., 2 tabs.

  4. LACIE performance predictor final operational capability program description, volume 3

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The requirements and processing logic for the LACIE Error Model program (LEM) are described. This program is an integral part of the Large Area Crop Inventory Experiment (LACIE) system. LEM is that portion of the LPP (LACIE Performance Predictor) which simulates the sample segment classification, strata yield estimation, and production aggregation. LEM controls repetitive Monte Carlo trials based on input error distributions to obtain statistical estimates of the wheat area, yield, and production at different levels of aggregation. LEM interfaces with the rest of the LPP through a set of data files.
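    LEM's structure suggests a compact sketch: perturb each stratum's area and yield by assumed error distributions, aggregate production, and repeat to get Monte Carlo statistics. The strata, error levels, and distributions below are invented for illustration, not LACIE's:

```python
import random
import statistics

random.seed(1)

# hypothetical strata: (area in kha, yield in t/ha); true production = 876
strata = [(120.0, 2.1), (80.0, 1.8), (200.0, 2.4)]

def one_trial():
    # draw each stratum's area and yield from its assumed error distribution
    total = 0.0
    for area, yld in strata:
        a = random.gauss(area, 0.05 * area)   # 5% classification (area) error
        y = random.gauss(yld, 0.08 * yld)     # 8% yield-model error
        total += a * y                        # production = area * yield
    return total

# repetitive Monte Carlo trials -> statistics of the aggregated production
trials = [one_trial() for _ in range(5000)]
mean_production = statistics.mean(trials)
cv = statistics.stdev(trials) / mean_production   # relative uncertainty
```

    The coefficient of variation of the aggregate is what a performance predictor like the LPP reports: how much the input error distributions propagate into uncertainty at each aggregation level.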

  5. Survey of engineering computational methods and experimental programs for estimating supersonic missile aerodynamic characteristics

    NASA Technical Reports Server (NTRS)

    Sawyer, W. C.; Allen, J. M.; Hernandez, G.; Dillenius, M. F. E.; Hemsch, M. J.

    1982-01-01

    This paper presents a survey of engineering computational methods and experimental programs used for estimating the aerodynamic characteristics of missile configurations. Emphasis is placed on those methods which are suitable for preliminary design of conventional and advanced concepts. An analysis of the technical approaches of the various methods is made in order to assess their suitability to estimate longitudinal and/or lateral-directional characteristics for different classes of missile configurations. Some comparisons between the predicted characteristics and experimental data are presented. These comparisons are made for a large variation in flow conditions and model attitude parameters. The paper also presents known experimental research programs developed for the specific purpose of validating analytical methods and extending the capability of data-base programs.

  6. Preventive Maintenance Process

    NASA Technical Reports Server (NTRS)

    Ciaruffoli, Veronica; Bramley, Craig; Matteson, Mike

    2001-01-01

    The Preventive Maintenance (PM) program at Stennis Space Center (SSC) evolved from an ineffective and poorly organized state to a highly organized state in which it became capable of tracking equipment, planning jobs with man-hour estimates, and supporting outsourcing. This viewgraph presentation traces the steps the program took to improve itself.

  7. Evaluating practical vs. theoretical inspection system capability with a new programmed defect test mask designed for 3X and 4X technology nodes

    NASA Astrophysics Data System (ADS)

    Glasser, Joshua; Pratt, Tim

    2008-10-01

    Programmed defect test masks serve the useful purpose of evaluating inspection system sensitivity and capability. It is widely recognized that when evaluating inspection system capability, it is important to understand the actual sensitivity of the inspection system in production; yet unfortunately we have observed that many test masks are a more accurate judge of theoretical sensitivity than of real-world usable capability. Use of ineffective test masks leaves the purchaser of inspection equipment open to the risks of over-estimating the capability of their inspection solution and over-specifying defect sensitivity to their customers. This can result in catastrophic yield loss for device makers. In this paper we examine some of the lithography-related technology advances which place an increasing burden on mask inspection complexity, such as MEEF, defect printability estimation, aggressive OPC, double patterning, and OPC jogs. We evaluate the key inspection system component contributors to successful mask inspection, including what can "go wrong" with these components. We designed and fabricated a test mask which both (a) more faithfully represents actual production use cases and (b) stresses the key components of the inspection system. Its patterns represent 32nm, 36nm, and 45nm logic, flash, and DRAM technology, with metal- and poly-like background patterns containing programmed defects. The mask takes into consideration requirements of advanced lithography, such as MEEF, defect printability, assist features, nearly-repetitive patterns, and data preparation. It is a complex tritone mask designed for annular immersion lithography.

  8. Programmer's manual for MMLE3, a general FORTRAN program for maximum likelihood parameter estimation

    NASA Technical Reports Server (NTRS)

    Maine, R. E.

    1981-01-01

    MMLE3 is a maximum likelihood parameter estimation program capable of handling general bilinear dynamic equations of arbitrary order with measurement noise and/or state noise (process noise). The basic MMLE3 program is quite general and, therefore, applicable to a wide variety of problems. The basic program can interact with a set of user-written, problem-specific routines to simplify the use of the program on specific systems. A set of user routines for the aircraft stability and control derivative estimation problem is provided with the program. The implementation of the program on specific computer systems is discussed. The structure of the program is diagrammed, and the function and operation of individual routines are described. Complete listings and reference maps of the routines are included on microfiche as a supplement. Four test cases are discussed; listings of the input cards and program output for the test cases are included on microfiche as a supplement.

  9. IRTPRO 2.1 for Windows (Item Response Theory for Patient-Reported Outcomes)

    ERIC Educational Resources Information Center

    Paek, Insu; Han, Kyung T.

    2013-01-01

    This article reviews a new item response theory (IRT) model estimation program, IRTPRO 2.1, for Windows that is capable of unidimensional and multidimensional IRT model estimation for existing and user-specified constrained IRT models for dichotomously and polytomously scored item response data. (Contains 1 figure and 2 notes.)
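    As a flavor of what such IRT programs estimate, here is a minimal two-parameter logistic (2PL) sketch with a grid-search maximum-likelihood ability estimate; IRTPRO's actual estimation methods are far more sophisticated, and the items below are invented:

```python
import math

def p_correct(theta, a, b):
    # 2PL model: P(correct) given ability theta, discrimination a, difficulty b
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def mle_theta(responses, items):
    # maximum-likelihood ability estimate via grid search over theta
    def loglike(theta):
        ll = 0.0
        for u, (a, b) in zip(responses, items):
            p = p_correct(theta, a, b)
            ll += math.log(p) if u else math.log(1.0 - p)
        return ll
    grid = [g / 100.0 for g in range(-400, 401)]
    return max(grid, key=loglike)

# five items of equal discrimination, difficulties -2..2; the examinee
# answers the three easiest correctly and misses the two hardest
items = [(1.0, b) for b in (-2.0, -1.0, 0.0, 1.0, 2.0)]
theta_hat = mle_theta([1, 1, 1, 0, 0], items)
```

    Fitting the item parameters themselves (rather than theta for a known test), possibly in several dimensions and under user-specified constraints, is the harder problem programs like IRTPRO address.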

  10. EGADS: A microcomputer program for estimating the aerodynamic performance of general aviation aircraft

    NASA Technical Reports Server (NTRS)

    Melton, John E.

    1994-01-01

    EGADS is a comprehensive preliminary design tool for estimating the performance of light, single-engine general aviation aircraft. The software runs on the Apple Macintosh series of personal computers and assists amateur designers and aeronautical engineering students in performing the many repetitive calculations required in the aircraft design process. The program makes full use of the mouse and standard Macintosh interface techniques to simplify the input of various design parameters. Extensive graphics, plotting, and text output capabilities are also included.

  11. GDF v2.0, an enhanced version of GDF

    NASA Astrophysics Data System (ADS)

    Tsoulos, Ioannis G.; Gavrilis, Dimitris; Dermatas, Evangelos

    2007-12-01

    An improved version of the function estimation program GDF is presented. The main enhancements of the new version include: multi-output function estimation, the capability of defining custom functions in the grammar, and selection of the error function. The new version has been evaluated on a series of classification and regression datasets that are widely used for the evaluation of such methods. It is compared to two known neural networks and outperforms them on 5 of the 10 datasets.
    Program summary
    Title of program: GDF v2.0
    Catalogue identifier: ADXC_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXC_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 98 147
    No. of bytes in distributed program, including test data, etc.: 2 040 684
    Distribution format: tar.gz
    Programming language: GNU C++
    Computer: The program is designed to be portable to all systems running the GNU C++ compiler
    Operating system: Linux, Solaris, FreeBSD
    RAM: 200 000 bytes
    Classification: 4.9
    Does the new version supersede the previous version?: Yes
    Nature of problem: The technique of function estimation tries to discover, from a series of input data, a functional form that best describes them. This can be performed with parametric models whose parameters adapt according to the input data.
    Solution method: Functional forms are created by genetic programming as approximations for the symbolic regression problem.
    Reasons for new version: The GDF package was extended to be more flexible and user-customizable than the old package. The user can extend the package by defining custom error functions, and can extend the grammar by adding new functions to the function repertoire. Also, the new version can perform function estimation of multi-output functions and can be used for classification problems.
    Summary of revisions: The following features have been added to the package: (1) Multi-output function approximation: the package can now approximate any function f: R^n → R^m, which also gives it the capability of performing classification and not only regression. (2) User-defined functions can be added to the repertoire of the grammar, extending the regression capabilities of the package; this feature is limited to 3 functions, but that number can easily be increased. (3) Capability of selecting the error function: apart from the mean square error, the package now offers other error functions, such as the mean absolute error and the maximum square error, and user-defined error functions can be added to the set. (4) More verbose output: the main program displays more information to the user, as well as the default values of the parameters; the package also lets the user define an output file where the output of the gdf program for the testing set is stored after the process terminates.
    Additional comments: A technical report describing the revisions, experiments and test runs is packaged with the source code.
    Running time: Depends on the training data.
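
    The selectable, user-extensible error functions described above can be sketched as a small dispatch table. This is an illustrative Python sketch of the idea, not GDF's C++ API; the function and variable names are invented.

```python
# Hedged sketch: pluggable error functions in the style GDF v2.0 describes.
# Names and the candidate expression are illustrative, not GDF's interface.

def mean_square_error(pred, target):
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(target)

def mean_absolute_error(pred, target):
    return sum(abs(p - t) for p, t in zip(pred, target)) / len(target)

def max_square_error(pred, target):
    return max((p - t) ** 2 for p, t in zip(pred, target))

# Users can register additional error functions here, as the package allows.
ERROR_FUNCTIONS = {
    "mse": mean_square_error,
    "mae": mean_absolute_error,
    "maxse": max_square_error,
}

def evaluate(candidate, xs, ys, error="mse"):
    """Score a candidate model (a callable) against data with a chosen error."""
    preds = [candidate(x) for x in xs]
    return ERROR_FUNCTIONS[error](preds, ys)

# Example: score f(x) = 2x against noisy doubling data.
xs = [1.0, 2.0, 3.0]
ys = [2.1, 3.9, 6.0]
score = evaluate(lambda x: 2.0 * x, xs, ys, error="mae")
```

    In a genetic-programming loop, the chosen error function would serve as the fitness measure for every candidate expression.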

  12. Software For Computing Reliability Of Other Software

    NASA Technical Reports Server (NTRS)

    Nikora, Allen; Antczak, Thomas M.; Lyu, Michael

    1995-01-01

    Computer Aided Software Reliability Estimation (CASRE) computer program developed for use in measuring reliability of other software. Easier for non-specialists in reliability to use than many other currently available programs developed for same purpose. CASRE incorporates mathematical modeling capabilities of public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in Windows software environment. Provides menu-driven command interface; enabling and disabling of menu options guides user through (1) selection of set of failure data, (2) execution of mathematical model, and (3) analysis of results from model. Written in C language.
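
    As a flavor of the failure-data analysis such tools automate, the following sketch implements the Laplace trend test, a standard pre-check for reliability growth before fitting a software reliability model. It is not CASRE or SMERFS code, and the failure data are invented.

```python
import math

# Hedged sketch: Laplace trend test over cumulative failure times.
# A clearly negative factor suggests reliability growth (failures arriving
# later and later), which is when growth models are worth fitting.

def laplace_factor(failure_times, observation_end):
    """Laplace factor u for n failures observed over (0, observation_end]."""
    n = len(failure_times)
    mean_time = sum(failure_times) / n
    return (mean_time - observation_end / 2) / (
        observation_end * math.sqrt(1 / (12 * n)))

# Failures clustered early in a 100-hour test: improving reliability.
u = laplace_factor([5.0, 12.0, 20.0, 31.0, 45.0], 100.0)
```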

  13. The Opportunity in Commercial Approaches for Future NASA Deep Space Exploration Elements

    NASA Technical Reports Server (NTRS)

    Zapata, Edgar

    2017-01-01

    This work joins two events, showing the potential for commercial public-private partnerships, modeled on programs like COTS, to significantly reduce the cost to NASA of other required deep space exploration capabilities. These other capabilities include landers, stages, and more. We mature the concept of "costed baseball cards," adding cost estimates to NASA's space systems "baseball cards."

  14. Demonstration of CBR Modeling and Simulation Tool (CBRSim) Capabilities. Installation Technology Transfer Program

    DTIC Science & Technology

    2009-04-01

    Capabilities. Construction Engineering Research Laboratory. Kathy L. Simunich, Timothy K. Perkins, David M. Bailey, David Brown, and... inversion height in convective conditions is estimated with a one-dimensional model of the atmospheric boundary layer based on the Driedonks slab model... tool and its capabilities. Installation geospatial data, in CAD format, were obtained for select buildings, roads, and topographic features in

  15. Parameter estimation supplement to the Mission Analysis Evaluation and Space Trajectory Operations program (MAESTRO)

    NASA Technical Reports Server (NTRS)

    Bjorkman, W. S.; Uphoff, C. W.

    1973-01-01

    This Parameter Estimation Supplement describes the PEST computer program and gives instructions for its use in the determination of lunar gravitational field coefficients. PEST was developed for use in the RAE-B lunar orbiting mission as a means of lunar field recovery. The observations processed by PEST are short-arc osculating orbital elements; these observations are the end product of an orbit determination process carried out with another program. PEST's end product is a set of harmonic coefficients to be used in long-term prediction of the lunar orbit. PEST employs some novel techniques in its estimation process, notably a square batch estimator and linear variational equations in the orbital elements (both osculating and mean) for measurement sensitivities. The program's capabilities are described, and operating instructions and input/output examples are given. PEST utilizes MAESTRO routines for its trajectory propagation. PEST's program structure and those subroutines not common to MAESTRO are described. Some theoretical background for the estimation process and a derivation of the linear variational equations for the Method 7 elements are included.

  16. Dynamic Positioning Capability Analysis for Marine Vessels Based on A DPCap Polar Plot Program

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Yang, Jian-min; Xu, Sheng-wen

    2018-03-01

    Dynamic positioning capability (DPCap) analysis is essential in the selection and configuration of thrusters and during preliminary investigation of the positioning ability of a newly designed vessel dynamic positioning system. DPCap analysis can help determine the maximum environmental forces that the DP system can counteract at given headings. The accuracy of the DPCap analysis is determined by the precise estimation of the environmental forces as well as the effectiveness of the thrust allocation logic. This paper is dedicated to developing an effective and efficient software program for DPCap analysis for marine vessels. Estimates of the environmental forces can be obtained from model tests, hydrodynamic computation, and empirical formulas. A quadratic programming method is adopted to allocate the total thrust among the thrusters of the vessel. A detailed description of the thrust allocation logic of the software program is given. The effectiveness of the new program, DPCap Polar Plot (DPCPP), was validated by a DPCap analysis for a supply vessel. The present study indicates that the developed program can be used in DPCap analysis for marine vessels. Moreover, DPCap analysis that considers thruster failure modes may offer guidance to the designers of vessels with stricter thruster-safety requirements.
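
    The thrust-allocation step can be illustrated in miniature. The sketch below solves the exact two-thruster case by Cramer's rule; the actual DPCPP program uses quadratic programming over many thrusters with constraints, so this is only a conceptual stand-in with invented geometry.

```python
import math

# Hedged sketch: allocate a demanded planar force (fx, fy) across two
# fixed-azimuth thrusters by solving the 2x2 system
#   [cos a1  cos a2] [u1]   [fx]
#   [sin a1  sin a2] [u2] = [fy]

def allocate(angles, fx, fy):
    """Return thrust magnitudes (u1, u2) for thruster azimuths (a1, a2)."""
    a1, a2 = angles
    c1, s1 = math.cos(a1), math.sin(a1)
    c2, s2 = math.cos(a2), math.sin(a2)
    det = c1 * s2 - c2 * s1          # nonzero when azimuths are not parallel
    u1 = (fx * s2 - c2 * fy) / det   # Cramer's rule, first column
    u2 = (c1 * fy - s1 * fx) / det   # Cramer's rule, second column
    return u1, u2

# One thruster along x, one along y: allocation is trivial to check.
u1, u2 = allocate((0.0, math.pi / 2), 10.0, 4.0)
```

    With more thrusters than force components the system is underdetermined, which is why the real program minimizes a quadratic cost instead of solving exactly.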

  17. Assembly-line Simulation Program

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Zendejas, Silvino; Malhotra, Shan

    1987-01-01

    Costs and profits estimated for models based on user inputs. Standard Assembly-line Manufacturing Industry Simulation (SAMIS) program generalized to be useful for production-line manufacturing companies. Provides accurate and reliable means of comparing alternative manufacturing processes. Used to assess impact of changes in such financial parameters as cost of resources and services, inflation rates, interest rates, tax policies, and required rate of return on equity. Most important capability is ability to estimate prices manufacturer would have to receive for its products to recover all costs of production and make specified profit. Written in TURBO PASCAL.
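
    The headline capability, the price a manufacturer must receive to recover all costs and earn a specified profit, can be sketched as follows. The cost structure and rates are illustrative assumptions, not SAMIS's actual model, which is far more detailed.

```python
# Hedged sketch: per-unit price covering variable cost, amortized fixed
# cost, and a required return on equity. All inputs are invented examples.

def required_price(unit_cost, annual_fixed_cost, annual_volume,
                   equity, required_return_on_equity):
    """Price per unit that recovers all costs plus the equity return."""
    fixed_per_unit = annual_fixed_cost / annual_volume
    profit_per_unit = equity * required_return_on_equity / annual_volume
    return unit_cost + fixed_per_unit + profit_per_unit

price = required_price(unit_cost=40.0, annual_fixed_cost=100_000.0,
                       annual_volume=10_000, equity=500_000.0,
                       required_return_on_equity=0.15)
```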

  18. P-8A Poseidon Multi Mission Maritime Aircraft (P-8A)

    DTIC Science & Technology

    2015-12-01

    focus also includes procurement of depot and intermediate level maintenance capabilities, full scale fatigue testing, and continued integration and... Confidence Level of cost estimate for current APB: 50%. The current APB cost estimate provided sufficient resources to execute the program under... normal conditions, encountering average levels of technical, schedule, and programmatic risk and external interference. It was consistent with

  19. Standardization in software conversion of (ROM) estimating

    NASA Technical Reports Server (NTRS)

    Roat, G. H.

    1984-01-01

    Technical problems and their solutions comprise by far the majority of the work involved in space simulation engineering. Fixed-price contracts with schedule award fees are becoming increasingly prevalent. Accurate estimation of these jobs is critical to keeping costs within limits and to predicting realistic contract schedule dates. Computerized estimating may hold the answer to these new problems, though until now it has been complex, expensive, and geared to the business world rather than to technical people. The objective of this effort was to provide a simple program on a desktop computer capable of producing a Rough Order of Magnitude (ROM) estimate in a short time. The program is not intended to provide a highly detailed breakdown of costs to a customer, but to provide a number that can be used as a rough estimate on short notice. With more debugging and fine tuning, a more detailed estimate can be made.
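
    A ROM estimate of this kind reduces to a simple roll-up. The sketch below is a hypothetical illustration; the labor rate and the single contingency factor are invented, and the paper's actual program surely carries more detail.

```python
# Hedged sketch: Rough Order of Magnitude (ROM) cost roll-up.
# Inputs are example values, not data from the paper's estimating program.

def rom_estimate(labor_hours, labor_rate, material_cost, contingency=0.25):
    """Labor plus material, inflated by a flat contingency factor."""
    base = labor_hours * labor_rate + material_cost
    return base * (1 + contingency)

estimate = rom_estimate(labor_hours=120, labor_rate=85.0, material_cost=4_000.0)
```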

  20. Development of a simple, self-contained flight test data acquisition system

    NASA Technical Reports Server (NTRS)

    Clarke, R.; Shane, D.; Roskam, J.; Rummer, D. I.

    1982-01-01

    The flight test system described combines state-of-the-art microprocessor technology and high accuracy instrumentation with parameter identification technology which minimize data and flight time requirements. The system was designed to avoid permanent modifications of the test airplane and allow quick installation. It is capable of longitudinal and lateral-directional stability and control derivative estimation. Details of this system, calibration and flight test procedures, and the results of the Cessna 172 flight test program are presented. The system proved easy to install, simple to operate, and capable of accurate estimation of stability and control parameters in the Cessna 172 flight tests.

  1. Road weather management performance measures : 2012 update.

    DOT National Transportation Integrated Search

    1997-01-01

    The goal of the cost analysis of the ITS National Architecture program is twofold. First, the evaluation is to produce a high-level estimate of the expenditures associated with implementing the physical elements and the functional capabilities of ITS...

  2. Proceedings of the Workshop on Computational Aspects in the Control of Flexible Systems, part 2

    NASA Technical Reports Server (NTRS)

    Taylor, Lawrence W., Jr. (Compiler)

    1989-01-01

    The Control/Structures Integration Program, a survey of available software for control of flexible structures, computational efficiency and capability, modeling and parameter estimation, and control synthesis and optimization software are discussed.

  3. Early Training Estimation System (ETES). Appendix F. User’s Guide

    DTIC Science & Technology

    1984-06-01

    Related to Early Training Estimation 2-17 2-5 Organizations Interviewed During Task 1 2-17 2-6 Potential Problem Solving Aids 2-24 2-7 Task Deletion... tasks are available, only the training program elements must be estimated. Thus, by adding comparability analysis procedures to SDT data base management... data base management capabilities of the SDT, and (3) conduct trade-off studies of proposed solutions to identified training problems. 1-17

  4. System IDentification Programs for AirCraft (SIDPAC)

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2002-01-01

    A collection of computer programs for aircraft system identification is described and demonstrated. The programs, collectively called System IDentification Programs for AirCraft, or SIDPAC, were developed in MATLAB as m-file functions. SIDPAC has been used successfully at NASA Langley Research Center with data from many different flight test programs and wind tunnel experiments. SIDPAC includes routines for experiment design, data conditioning, data compatibility analysis, model structure determination, equation-error and output-error parameter estimation in both the time and frequency domains, real-time and recursive parameter estimation, low order equivalent system identification, estimated parameter error calculation, linear and nonlinear simulation, plotting, and 3-D visualization. An overview of SIDPAC capabilities is provided, along with a demonstration of the use of SIDPAC with real flight test data from the NASA Glenn Twin Otter aircraft. The SIDPAC software is available without charge to U.S. citizens by request to the author, contingent on the requestor completing a NASA software usage agreement.
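
    Equation-error parameter estimation, one of the SIDPAC capabilities listed, amounts to linear least squares on measured states. The sketch below fits two coefficients to noise-free synthetic data via the normal equations; it is a minimal illustration, not SIDPAC's MATLAB implementation, and the coefficient values are invented.

```python
# Hedged sketch: equation-error least squares for a two-parameter linear
# model y = a*x1 + b*x2, solved via the 2x2 normal equations.

def least_squares_2param(x1, x2, y):
    """Return (a, b) minimizing ||y - (a*x1 + b*x2)||."""
    s11 = sum(v * v for v in x1)
    s12 = sum(u * v for u, v in zip(x1, x2))
    s22 = sum(v * v for v in x2)
    r1 = sum(u * v for u, v in zip(x1, y))
    r2 = sum(u * v for u, v in zip(x2, y))
    det = s11 * s22 - s12 * s12
    a = (r1 * s22 - s12 * r2) / det
    b = (s11 * r2 - s12 * r1) / det
    return a, b

# Synthetic, noise-free data generated from y = -1.2*alpha - 0.3*q,
# loosely mimicking a pitching-moment fit; values are invented.
alpha = [0.01, 0.02, 0.05, 0.03]
q = [0.1, -0.2, 0.05, 0.0]
y = [-1.2 * a - 0.3 * w for a, w in zip(alpha, q)]
a_hat, b_hat = least_squares_2param(alpha, q, y)
```

    With noise-free data the fit recovers the generating coefficients exactly, which is a useful sanity check before applying the same machinery to flight data.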

  5. Modifications to give HOPE/MDC 2.0 the capability to solve for or consider vent forces: Mission planning, mission analysis, and software formulation

    NASA Technical Reports Server (NTRS)

    Zyla, L. V.

    1979-01-01

    The modifications necessary to give the Houston Operations Predictor/Estimator (HOPE) program the capability to solve for or consider vent forces for orbit determination are described. The model implemented in solving for vent forces is described, along with the integrator problems encountered. A summary derivation of the mathematical principles applicable to the solve/consider methodology is provided.

  6. Canadian Forces in Joint Fires Support - Human Factors Analysis: Coalition Operations

    DTIC Science & Technology

    2010-08-01

    measurement/estimation of collateral damage (CDE). A tool for understanding the CDE of certain countries compared with that of NATO members and... Program (TDP). The TDP is aimed at concept development, evaluation for force design, and the demonstration of technologies fostered by Defence Research... logistics and the designation of targets on the joint targeting list. • Tactical capability / response time / training. The tactical capability of a fire

  7. Program for computer aided reliability estimation

    NASA Technical Reports Server (NTRS)

    Mathur, F. P. (Inventor)

    1972-01-01

    A computer program for estimating the reliability of self-repair and fault-tolerant systems with respect to selected system and mission parameters is presented. The computer program is capable of operation in an interactive conversational mode as well as in a batch mode and is characterized by maintenance of several general equations representative of basic redundancy schemes in an equation repository. Selected reliability functions applicable to any mathematical model formulated with the general equations, used singly or in combination with each other, are separately stored. One or more system and/or mission parameters may be designated as a variable. Data in the form of values for selected reliability functions is generated in a tabular or graphic format for each formulated model.
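
    The "general equations representative of basic redundancy schemes" can be illustrated with the standard closed forms for an exponentially failing element and an n-way parallel arrangement. The failure rate and mission time below are invented example inputs, not values from the program.

```python
import math

# Hedged sketch: two textbook reliability equations of the kind such an
# equation repository would hold. The exponential element model is a
# standard assumption, not necessarily the program's only option.

def element_reliability(failure_rate, mission_time):
    """Survival probability of one element with constant failure rate."""
    return math.exp(-failure_rate * mission_time)

def parallel_reliability(r, n):
    """n identical elements in parallel; the system fails only if all fail."""
    return 1 - (1 - r) ** n

r = element_reliability(failure_rate=1e-4, mission_time=1000.0)  # single unit
r_sys = parallel_reliability(r, n=3)                             # triple redundant
```

    Sweeping mission_time or n as the "designated variable" reproduces, in miniature, the tabular/graphic parameter studies the abstract describes.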

  8. The Probabilities of Unique Events

    DTIC Science & Technology

    2012-08-30

    social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as...investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only...of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of

  9. F-35 Joint Strike Fighter: Preliminary Observations on Program Progress

    DTIC Science & Technology

    2016-03-23

    partners are the United Kingdom, Italy, the Netherlands, Turkey, Canada, Australia, Denmark, and Norway. These nations contributed funds for system... Estimated delivery and production dates: Initial operational capability 2010-2012, undetermined; 2015-2018, undetermined; 5-6 years; Full-rate

  10. Adaption of a corrector module to the IMP dynamics program

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The corrector module of the RAEIOS program and the IMP dynamics computer program were combined to achieve a data-fitting capability with the more general spacecraft dynamics models of the IMP program. The IMP dynamics program provides models of spacecraft dynamics for satellites with long, flexible booms. The properties of the corrector are discussed, and a description is presented of the performance criteria and search logic for parameter estimation. A description is also given of the modifications made to add the corrector to the IMP program, including subroutine descriptions, common definitions, definition of input, and a description of output.

  11. Developing tolled-route demand estimation capabilities for Texas : opportunities for enhancement of existing models.

    DOT National Transportation Integrated Search

    2014-08-01

    The travel demand models developed and applied by the Transportation Planning and Programming Division : (TPP) of the Texas Department of Transportation (TxDOT) are daily three-step models (i.e., trip generation, trip : distribution, and traffic assi...

  12. Development of a Portfolio Management Approach with Case Study of the NASA Airspace Systems Program

    NASA Technical Reports Server (NTRS)

    Neitzke, Kurt W.; Hartman, Christopher L.

    2012-01-01

    A portfolio management approach was developed for the National Aeronautics and Space Administration's (NASA's) Airspace Systems Program (ASP). The purpose was to help inform ASP leadership regarding future investment decisions related to its existing portfolio of advanced technology concepts and capabilities (C/Cs) currently under development and to potentially identify new opportunities. The portfolio management approach is general in form and is extensible to other advanced technology development programs. It focuses on individual C/Cs and consists of three parts: 1) concept of operations (con-ops) development, 2) safety impact assessment, and 3) benefit-cost-risk (B-C-R) assessment. The first two parts are recommendations to ASP leaders and will be discussed only briefly, while the B-C-R part relates to the development of an assessment capability and will be discussed in greater detail. The B-C-R assessment capability enables estimation of the relative value of each C/C as compared with all other C/Cs in the ASP portfolio. Value is expressed in terms of a composite weighted utility function (WUF) rating, based on estimated benefits, costs, and risks. Benefit utility is estimated relative to achieving key NAS performance objectives, which are outlined in the ASP Strategic Plan.1 Risk utility focuses on C/C development and implementation risk, while cost utility focuses on the development and implementation portions of overall C/C life-cycle costs. Initial composite ratings of the ASP C/Cs were successfully generated; however, the limited availability of B-C-R information, which is used as input to the WUF model, reduced the meaningfulness of these initial investment ratings. Development of this approach, however, defined specific information-generation requirements for ASP C/C developers that will increase the meaningfulness of future B-C-R ratings.
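
    The composite WUF rating described above can be sketched as a normalized weighted sum of per-criterion utilities. The weights and utility values below are invented placeholders; the actual ASP model's inputs are not given in this summary.

```python
# Hedged sketch: composite weighted utility function (WUF) rating over
# benefit, cost, and risk utilities, each assumed to lie in [0, 1].

def composite_wuf(utilities, weights):
    """Weighted sum of utilities, with weights normalized to sum to 1."""
    total = sum(weights.values())
    return sum(utilities[k] * weights[k] / total for k in utilities)

# Illustrative C/C: strong benefit, middling cost and risk utilities,
# with benefit weighted twice as heavily as cost or risk.
rating = composite_wuf(
    utilities={"benefit": 0.8, "cost": 0.5, "risk": 0.6},
    weights={"benefit": 2.0, "cost": 1.0, "risk": 1.0},
)
```

    Ranking every C/C in a portfolio by this rating gives the relative-value comparison the abstract describes, with all the caveats about input quality it also notes.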

  13. The Opportunity in Commercial Approaches for Future NASA Deep Space Exploration Elements

    NASA Technical Reports Server (NTRS)

    Zapata, Edgar

    2017-01-01

    In 2011, NASA released a report assessing the market for commercial crew and cargo services to low Earth orbit (LEO). The report stated that NASA had spent a few hundred million dollars in the Commercial Orbital Transportation Services (COTS) program on the portion related to the development of the Falcon 9 launch vehicle. Yet a NASA cost model predicted the cost would have been significantly more with a non-commercial cost-plus contracting approach. By 2016 a NASA request for information stated it must "maximize the efficiency and sustainability of the Exploration Systems development programs", as "critical to free resources for reinvestment...such as other required deep space exploration capabilities." This work joins the previous two events, showing the potential for commercial, public private partnerships, modeled on programs like COTS, to reduce the cost to NASA significantly for "...other required deep space exploration capabilities." These other capabilities include landers, stages and more. We mature the concept of "costed baseball cards", adding cost estimates to NASA's space systems "baseball cards." We show some potential costs, including analysis, the basis of estimates, data sources and caveats to address a critical question - based on initial assessment, are significant agency resources justified for more detailed analysis and due diligence to understand and invest in public private partnerships for human deep space exploration systems? The cost analysis spans commercial to cost-plus contracting approaches, for smaller elements vs. larger, with some variation for lunar or Mars. By extension, we delve briefly into the potentially much broader significance of the individual cost estimates if taken together as a NASA investment portfolio where public private partnership are stitched together for deep space exploration. 
How might multiple improvements in individual systems add up to NASA human deep space exploration achievements, realistically, affordably, sustainably, in a relevant timeframe?

  14. Acquisition Program Lead Systems Integration/Lead Capabilities Integration Decision Support Methodology and Tool

    DTIC Science & Technology

    2015-04-30

    from the MIT Sloan School that provide a relative complexity score for functions (Product and Context Complexity). The PMA assesses the complexity...

  15. Weapon System Costing Methodology for Aircraft Airframes and Basic Structures. Volume I. Technical Volume

    DTIC Science & Technology

    1975-06-01

    the Air Force Flight Dynamics Laboratory for use in conceptual and preliminary design phases of weapon system development. The methods are a... trade study method provides an iterative capability stemming from a direct interface with design synthesis programs. A detailed cost data base and... system for data expansion is provided. The methods are designed for ease in changing cost estimating relationships and estimating coefficients

  16. Impossible Certainty: Cost Risk Analysis for Air Force Systems

    DTIC Science & Technology

    2006-01-01

    the estimated cost of weapon systems, which typically take many years to acquire and remain in operation for a long time. To make those estimates... times, uncertain, undefined, or unknown when estimates are prepared. New system development may involve further uncertainty due to unproven or... risk (a system requiring more money to complete than was forecast) and operational risk (a vital capability becoming unaffordable as the program

  17. A user-oriented and computerized model for estimating vehicle ride quality

    NASA Technical Reports Server (NTRS)

    Leatherwood, J. D.; Barker, L. M.

    1984-01-01

    A simplified empirical model and computer program for estimating passenger ride comfort within air and surface transportation systems are described. The model is based on subjective ratings from more than 3000 persons who were exposed to controlled combinations of noise and vibration in the passenger ride quality apparatus. This model has the capability of transforming individual elements of a vehicle's noise and vibration environment into subjective discomfort units and then combining the subjective units to produce a single discomfort index typifying passenger acceptance of the environment. The computational procedures required to obtain discomfort estimates are discussed, and a user oriented ride comfort computer program is described. Examples illustrating application of the simplified model to helicopter and automobile ride environments are presented.

  18. Program For Evaluation Of Reliability Of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, N.; Janosik, L. A.; Gyekenyesi, J. P.; Powers, Lynn M.

    1996-01-01

    CARES/LIFE predicts probability of failure of monolithic ceramic component as function of service time. Assesses risk that component fractures prematurely as result of subcritical crack growth (SCG). Effect of proof testing of components prior to service also considered. Coupled to such commercially available finite-element programs as ANSYS, ABAQUS, MARC, MSC/NASTRAN, and COSMOS/M. Also retains all capabilities of previous CARES code, which includes estimation of fast-fracture component reliability and Weibull parameters from inert strength (without SCG contributing to failure) specimen data. Estimates parameters that characterize SCG from specimen data as well. Written in ANSI FORTRAN 77 to be machine-independent. Program runs on any computer in which sufficient addressable memory (at least 8MB) and FORTRAN 77 compiler available. For IBM-compatible personal computer with minimum 640K memory, limited program available (CARES/PC, COSMIC number LEW-15248).
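
    Fast-fracture reliability codes of this family rest on the two-parameter Weibull failure probability. The sketch below evaluates that closed form with invented characteristic strength and Weibull modulus values; CARES/LIFE itself couples this to finite-element stress fields and subcritical crack growth models.

```python
import math

# Hedged sketch: two-parameter Weibull probability of failure for a
# ceramic element at a given uniform stress. sigma0 (characteristic
# strength) and m (Weibull modulus) are example values, not fitted data.

def weibull_failure_probability(stress, scale_sigma0, modulus_m):
    """P_f = 1 - exp(-(stress / sigma0)^m)."""
    return 1 - math.exp(-((stress / scale_sigma0) ** modulus_m))

pf = weibull_failure_probability(stress=300.0, scale_sigma0=500.0, modulus_m=10.0)
```

    A high modulus m concentrates failures near sigma0, which is why small changes in service stress can move the predicted risk sharply.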

  19. ARM-Led Improvements Aerosols in Climate and Climate Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghan, Steven J.; Penner, Joyce E.

    2016-07-25

    The DOE ARM program has played a foundational role in efforts to quantify aerosol effects on climate, beginning with the early back-of-the-envelope estimates of direct radiative forcing by anthropogenic sulfate and biomass burning aerosol (Penner et al., 1994). In this chapter we review the role that ARM has played in subsequent detailed estimates based on physically-based representations of aerosols in climate models. The focus is on quantifying the direct and indirect effects of anthropogenic aerosol on the planetary energy balance. Only recently have other DOE programs applied the aerosol modeling capability to simulate the climate response to the radiative forcing.

  20. Imperishable Networks: Complexity Theory and Communication Networking-Bridging the Gap Between Algorithmic Information Theory and Communication Networking

    DTIC Science & Technology

    2003-04-01

    generally considered to be passive data. Instead the genetic material should be capable of being algorithmic information, that is, program code or...

  1. Proceedings of the Workshop on Computational Aspects in the Control of Flexible Systems, part 1

    NASA Technical Reports Server (NTRS)

    Taylor, Lawrence W., Jr. (Compiler)

    1989-01-01

    Control/Structures Integration program software needs, computer aided control engineering for flexible spacecraft, computer aided design, computational efficiency and capability, modeling and parameter estimation, and control synthesis and optimization software for flexible structures and robots are among the topics discussed.

  2. 75 FR 33629 - Agency Information Collection Activities: Submission for Review; Information Collection Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-14

    ... Infrastructure against Cyber Threats (PREDICT) Program AGENCY: Science and Technology Directorate, DHS. ACTION... Infrastructure Against Cyber Threats (PREDICT) initiative. PREDICT is an initiative to facilitate the... effective threat assessment and increase cyber security capabilities. (4) An estimate of the total number of...

  3. Characterization of the Temperature Capabilities of Advanced Disk Alloy ME3

    NASA Technical Reports Server (NTRS)

    Gabb, Timothy P.; Telesman, Jack; Kantzos, Peter T.; O'Connor, Kenneth

    2002-01-01

    The successful development of an advanced powder metallurgy disk alloy, ME3, was initiated in the NASA High Speed Research/Enabling Propulsion Materials (HSR/EPM) Compressor/Turbine Disk program in cooperation with General Electric Engine Company and Pratt & Whitney Aircraft Engines. This alloy was designed using statistical screening and optimization of composition and processing variables to have extended durability at 1200 F in large disks. Disks of this alloy were produced at the conclusion of the program using a realistic scaled-up disk shape and processing to enable demonstration of these properties. The objective of the Ultra-Efficient Engine Technologies disk program was to assess the mechanical properties of these ME3 disks as functions of temperature in order to estimate the maximum temperature capabilities of this advanced alloy. These disks were sectioned, machined into specimens, and extensively tested. Additional sub-scale disks and blanks were processed and selectively tested to explore the effects of several processing variations on mechanical properties. Results indicate the baseline ME3 alloy and process can produce 1300 to 1350 F temperature capabilities, dependent on detailed disk and engine design property requirements.

  4. The microcomputer scientific software series 2: general linear model--regression.

    Treesearch

    Harold M. Rauscher

    1983-01-01

    The general linear model regression (GLMR) program provides the microcomputer user with a sophisticated regression analysis capability. The output provides a regression ANOVA table, estimators of the regression model coefficients, their confidence intervals, confidence intervals around the predicted Y-values, residuals for plotting, a check for multicollinearity, a...
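
    The core of such a regression program, coefficient estimates plus a goodness-of-fit summary, can be sketched for the simple one-predictor case. The data below are synthetic; GLMR itself also reports ANOVA tables, confidence intervals, and multicollinearity checks.

```python
# Hedged sketch: ordinary least squares for y = intercept + slope*x,
# with R-squared as the fit summary. Data are invented examples.

def simple_ols(xs, ys):
    """Return (slope, intercept, r_squared) for one-predictor OLS."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1 - ss_res / ss_tot
    return slope, intercept, r2

slope, intercept, r2 = simple_ols([1, 2, 3, 4], [2.1, 4.0, 6.2, 7.9])
```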

  5. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    NASA Astrophysics Data System (ADS)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

    Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be available at regional to national levels and cover long time periods. Because of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain in developing operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat imagery (16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model, in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research toward applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive, combined with other satellite, climate, and weather data, is creating previously unimagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field scales.

  6. Estimating Total Electron Content Using 1,000+ GPS Receivers

    NASA Technical Reports Server (NTRS)

    Komjathy, Attila; Mannucci, Anthony

    2006-01-01

    A computer program uses data from more than 1,000 Global Positioning System (GPS) receivers in an Internet-accessible global network to generate daily estimates of the global distribution of vertical total electron content (VTEC) of the ionosphere. This program supersedes an older program capable of processing readings from only about 200 GPS receivers. The program downloads the data via the Internet, then processes them in three stages. In the first stage, raw data from a global subnetwork of about 200 receivers are preprocessed, station by station, in a Kalman-filter-based least-squares estimation scheme that estimates differential biases for these receivers and for the satellites. In the second stage, an observation equation that incorporates the results from the first stage and the raw data from the remaining 800 receivers is solved to obtain the differential biases for those receivers. The only error sources that remain unaccounted for are multipath and receiver-noise contributions. The third stage is a postprocessing stage in which all the processed data are combined and used to generate new data products, including receiver differential biases and global and regional VTEC maps and animations.
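
    A small piece of such processing that can be shown compactly is the standard thin-shell mapping from slant to vertical TEC. The shell height and observation geometry below are illustrative assumptions, not the program's configuration.

```python
import math

# Hedged sketch: single-layer (thin-shell) ionosphere mapping function,
# commonly used to project slant TEC measured along a GPS ray to vertical
# TEC at the ionospheric pierce point. Shell height is an assumed value.

EARTH_RADIUS_KM = 6371.0

def vertical_tec(slant_tec, elevation_rad, shell_height_km=450.0):
    """Project slant TEC to vertical via the single-layer mapping function."""
    sin_zp = (EARTH_RADIUS_KM / (EARTH_RADIUS_KM + shell_height_km)) \
        * math.cos(elevation_rad)
    return slant_tec * math.sqrt(1 - sin_zp ** 2)

# A 50-TECU slant measurement at 30 degrees elevation maps to a smaller
# vertical value; at 90 degrees the slant and vertical values coincide.
vtec = vertical_tec(slant_tec=50.0, elevation_rad=math.radians(30.0))
```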

  7. Applying the 15 Public Health Emergency Preparedness Capabilities to Support Large-Scale Tuberculosis Investigations in Complex Congregate Settings

    PubMed Central

    Toren, Katelynne Gardner; Elsenboss, Carina; Narita, Masahiro

    2017-01-01

    Public Health—Seattle and King County, a metropolitan health department in western Washington, experiences rates of tuberculosis (TB) that are 1.6 times higher than state and national averages. The department’s TB Control Program uses public health emergency management tools and capabilities sustained with Centers for Disease Control and Prevention grant funding to manage large-scale complex case investigations. We have described 3 contact investigations in large congregate settings that the TB Control Program conducted in 2015 and 2016. The program managed the investigations using public health emergency management tools, with support from the Preparedness Program. The 3 investigations encompassed medical evaluation of more than 1600 people, used more than 100 workers, identified nearly 30 individuals with latent TB infection, and prevented an estimated 3 cases of active disease. These incidents exemplify how investments in public health emergency preparedness can enhance health outcomes in traditional areas of public health. PMID:28892445

  8. U.S. Air Force Operational Medicine: Using the Enterprise Estimating Supplies Program to Develop Materiel Solutions for the Air Force Optometry Augmentation Team (FFDOT)

    DTIC Science & Technology

    2010-11-09

    Report No. 10-13M, supported by the U.S. Air Force Medical Logistics Agency, under Work Unit No. 60334. The views expressed in this article are those...recommended 917Q line list. The Unit Type Code (UTC) capabilities, operational requirements, and materiel solutions were identified, and issues of...by 22%, and cost by 4%, or $9,500. Modeling and simulating a medical system like the FFDOT, with a range of capabilities and functional areas

  9. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    NASA Technical Reports Server (NTRS)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. 
COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo Professional 5.0 for recompilation. An executable is provided on the distribution diskettes. COSTMODL requires 512K RAM. The standard distribution medium for COSTMODL is three 5.25 inch 360K MS-DOS format diskettes. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. COSTMODL was developed in 1991. IBM PC is a registered trademark of International Business Machines. Borland and Turbo Pascal are registered trademarks of Borland International, Inc. Turbo Professional is a trademark of TurboPower Software. MS-DOS is a registered trademark of Microsoft Corporation.
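
    Of the five algorithms COSTMODL bundles, Basic COCOMO is the simplest to illustrate. The coefficients below are the published Basic COCOMO values; this is a sketch of that model family, not COSTMODL's implementation:

```python
# Basic COCOMO: effort E = a * KLOC**b (person-months), schedule TDEV = 2.5 * E**c (months)
COCOMO_BASIC = {
    "organic":       (2.4, 1.05, 0.38),
    "semi-detached": (3.0, 1.12, 0.35),
    "embedded":      (3.6, 1.20, 0.32),
}

def basic_cocomo(kloc, mode="organic"):
    """Return (effort, schedule, average staffing) for a project of `kloc`
    thousand delivered source lines in the given development mode."""
    a, b, c = COCOMO_BASIC[mode]
    effort = a * kloc ** b      # person-months
    tdev = 2.5 * effort ** c    # calendar months
    return effort, tdev, effort / tdev
```

    For a 32 KLOC organic project this gives roughly 91 person-months over about 14 months; the same size in embedded mode costs substantially more effort, reflecting the exponent b > 1 growing with project complexity.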

  10. SOLE: enhanced FIA data analysis capabilities

    Treesearch

    Michael Spinney; Paul Van Deusen

    2009-01-01

    The Southern On Line Estimator (SOLE) is an Internet-based annual forest inventory and analysis (FIA) data analysis tool developed cooperatively by the National Council for Air and Stream Improvement and the Forest Service, U.S. Department of Agriculture's Forest Inventory and Analysis program at the Southern Research Station. Recent development of SOLE has...

  11. Modeling of Army Research Laboratory EMP simulators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miletta, J.R.; Chase, R.J.; Luu, B.B.

    1993-12-01

    Models are required that permit the estimation of emitted field signatures from EMP simulators to design the simulator antenna structure, to establish the usable test volumes, and to estimate human exposure risk. This paper presents the capabilities and limitations of a variety of EMP simulator models useful to the Army's EMP survivability programs. Comparisons among frequency and time-domain models are provided for two powerful US Army Research Laboratory EMP simulators: AESOP (Army EMP Simulator Operations) and VEMPS II (Vertical EMP Simulator II).

  12. 1DTempPro V2: new features for inferring groundwater/surface-water exchange

    USGS Publications Warehouse

    Koch, Franklin W.; Voytek, Emily B.; Day-Lewis, Frederick D.; Healy, Richard W.; Briggs, Martin A.; Lane, John W.; Werkema, Dale D.

    2016-01-01

    A new version of the computer program 1DTempPro extends the original code to include new capabilities for (1) automated parameter estimation, (2) layer heterogeneity, and (3) time-varying specific discharge. The code serves as an interface to the U.S. Geological Survey model VS2DH and supports analysis of vertical one-dimensional temperature profiles under saturated flow conditions to assess groundwater/surface-water exchange and estimate hydraulic conductivity for cases where hydraulic head is known.
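
    1DTempPro itself drives the numerical VS2DH model, but the idea behind inferring exchange from a vertical temperature profile can be sketched with the classic steady-state analytic solution for one-dimensional conduction-advection (Bredehoeft and Papadopulos, 1965); here a simple grid search stands in for the code's automated parameter estimation:

```python
import math

def temp_profile(z, L, T0, TL, beta):
    """Steady 1-D conduction-advection temperature at depth z in a column of
    length L with boundary temperatures T0 (top) and TL (bottom).
    beta is the thermal Peclet number: rho_w * c_w * q * L / k."""
    if abs(beta) < 1e-12:
        return T0 + (TL - T0) * z / L  # pure-conduction (linear) limit
    return T0 + (TL - T0) * (math.exp(beta * z / L) - 1.0) / (math.exp(beta) - 1.0)

def estimate_beta(depths, temps, L, T0, TL, grid=None):
    """Grid-search the Peclet number that best matches observed temperatures;
    specific discharge q then follows from beta given the thermal properties."""
    grid = grid or [i / 100.0 for i in range(-500, 501)]
    def misfit(b):
        return sum((temp_profile(z, L, T0, TL, b) - t) ** 2
                   for z, t in zip(depths, temps))
    return min(grid, key=misfit)
```

    A curved profile bowing toward the bottom temperature indicates downward flow (positive beta); the fitted beta, times k / (rho_w * c_w * L), gives the vertical specific discharge.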

  13. Tree Canopy Light Interception Estimates in Almond and Walnut Orchards Using Ground, Low Flying Aircraft, and Satellite Based Methods to Improve Irrigation Scheduling Programs

    NASA Technical Reports Server (NTRS)

    Rosecrance, Richard C.; Johnson, Lee; Soderstrom, Dominic

    2016-01-01

    Canopy light interception is a main driver of water use and crop yield in almond and walnut production. Fractional green canopy cover (Fc) is a good indicator of light interception and can be estimated remotely from satellite using the normalized difference vegetation index (NDVI) data. Satellite-based Fc estimates could be used to inform crop evapotranspiration models, and hence support improvements in irrigation evaluation and management capabilities. Satellite estimates of Fc in almond and walnut orchards, however, need to be verified before incorporating them into irrigation scheduling or other crop water management programs. In this study, Landsat-based NDVI and Fc from NASA's Satellite Irrigation Management Support (SIMS) were compared with four estimates of canopy cover: 1. light bar measurement, 2. in-situ and image-based dimensional tree-crown analyses, 3. high-resolution NDVI data from low flying aircraft, and 4. orchard photos obtained via Google Earth and processed by an Image J thresholding routine. Correlations between the various estimates are discussed.
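
    The NDVI-to-Fc step can be sketched as follows; the two calibration endpoints are illustrative placeholders, not SIMS's published coefficients:

```python
def ndvi(red, nir):
    """Normalized difference vegetation index from red and NIR surface reflectance."""
    return (nir - red) / (nir + red)

# Hypothetical linear NDVI-to-Fc calibration endpoints (bare soil, full canopy).
NDVI_BARE, NDVI_FULL = 0.15, 0.85

def fractional_cover(ndvi_value):
    """Linearly rescale NDVI to fractional green canopy cover, clamped to [0, 1]."""
    fc = (ndvi_value - NDVI_BARE) / (NDVI_FULL - NDVI_BARE)
    return min(1.0, max(0.0, fc))
```

    In practice the endpoints would be fit against ground truth such as the light-bar and tree-crown measurements described above.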

  14. Tree canopy light interception estimates in almond and walnut orchards using ground, low flying aircraft, and satellite based methods to improve irrigation scheduling programs.

    NASA Astrophysics Data System (ADS)

    Rosecrance, R. C.; Johnson, L.; Soderstrom, D.

    2016-12-01

    Canopy light interception is a main driver of water use and crop yield in almond and walnut production. Fractional green canopy cover (Fc) is a good indicator of light interception and can be estimated remotely from satellite using the normalized difference vegetation index (NDVI) data. Satellite-based Fc estimates could be used to inform crop evapotranspiration models, and hence support improvements in irrigation evaluation and management capabilities. Satellite estimates of Fc in almond and walnut orchards, however, need to be verified before incorporating them into irrigation scheduling or other crop water management programs. In this study, Landsat-based NDVI and Fc from NASA's Satellite Irrigation Management Support (SIMS) were compared with four estimates of canopy cover: 1. light bar measurement, 2. in-situ and image-based dimensional tree-crown analyses, 3. high-resolution NDVI data from low flying aircraft, and 4. orchard photos obtained via Google Earth and processed by an Image J thresholding routine. Correlations between the various estimates are discussed.

  15. Department of the Navy Supporting Data for Fiscal Year 1984 Budget Estimates Descriptive Summaries Submitted to Congress January 1983. Research, Development, Test & Evaluation, Navy. Book 3. Tactical Programs, Intelligence, & Communications Management & Support

    DTIC Science & Technology

    1983-01-01

    the purpose of forecasting future hardware designs. The basis for this is a discernible time-application relationship between foreign technology and... relationship with the F-14, F-15, F-16 and F/A-18 program offices is maintained. Other programs which are related to full employment capability included target...Frequency. Relationship is for interoperability between U.S. Navy, North Atlantic Treaty Organization, and U.S. Air Force. C. (U) WORK PERFORMED BY: IN

  16. Interdisciplinary Distinguished Seminar Series

    DTIC Science & Technology

    2014-08-29

    official Department of the Army position, policy or decision, unless so designated by other documentation. 9. SPONSORING/MONITORING AGENCY NAME(S) AND...Received Book TOTAL: Patents Submitted Patents Awarded Awards Graduate Students Names of Post Doctorates Names of Faculty Supported Names of Under...capabilities, estimation and optimization techniques, image and color standards, efficient programming methods and efficient ASIC designs . This seminar will

  17. Research requirements for development of improved helicopter rotor efficiency

    NASA Technical Reports Server (NTRS)

    Davis, S. J.

    1976-01-01

    The research requirements for developing an improved-efficiency rotor for a civil helicopter are documented. The various design parameters affecting the hover and cruise efficiency of a rotor are surveyed, and the parameters capable of producing the greatest potential improvement are identified. Research and development programs to achieve these improvements are defined, and estimated costs and schedules are presented. Interaction of the improved efficiency rotor with other technological goals for an advanced civil helicopter is noted, including its impact on engine noise, hover and cruise performance, one-engine-inoperative hover capability, and maintenance and reliability.

  18. Assessment of CTAS ETA prediction capabilities

    NASA Astrophysics Data System (ADS)

    Bolender, Michael A.

    1994-11-01

    This report summarizes the work done to date in assessing the trajectory fidelity and estimated time of arrival (ETA) prediction capability of the NASA Ames Center-TRACON Automation System (CTAS) software. The CTAS software suite is a series of computer programs designed to aid air traffic controllers in their tasks of safely scheduling the landing sequence of approaching aircraft. In particular, this report concerns the accuracy of the available measurements (e.g., position, altitude, etc.) that are input to the software, as well as the accuracy of the final data that are made available to the air traffic controllers.

  19. Cost estimation model for advanced planetary programs, fourth edition

    NASA Technical Reports Server (NTRS)

    Spadoni, D. J.

    1983-01-01

    The development of the planetary program cost model is discussed. The model was updated to incorporate cost data from the most recent US planetary flight projects and extensively revised to capture more accurately the information in the historical cost data base. This data base comprises the historical cost data for 13 unmanned lunar and planetary flight programs. The revision was made with a twofold objective: to increase the flexibility of the model in dealing with the broad scope of scenarios under consideration for future missions, and to maintain, and possibly improve upon, confidence in the model's capabilities, with an expected accuracy of 20%. The model development included a labor/cost proxy analysis, selection of the functional forms of the estimating relationships, and test statistics. An analysis of the model is discussed and two sample applications of the cost model are presented.
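
    Parametric cost models of this kind are typically built from power-law cost estimating relationships (CERs) fit in log-log space. The functional form below is the standard one; the fitting sketch is not the model's actual CERs:

```python
import math

def fit_cer(masses, costs):
    """Fit cost = a * mass**b by ordinary least squares on log-transformed data,
    the usual form for a parametric cost estimating relationship."""
    xs = [math.log(m) for m in masses]
    ys = [math.log(c) for c in costs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

def cer_cost(a, b, mass):
    """Evaluate the fitted CER at a new driver value."""
    return a * mass ** b
```

    An exponent b < 1 encodes the economy of scale commonly seen in historical flight-program data: doubling spacecraft mass less than doubles estimated cost.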

  20. Control system software, simulation, and robotic applications

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    1991-01-01

    All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculoskeletal dynamics modeling. The program JACK will be used for estimating and animating whole-body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminate muscular load-sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capability is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.

  1. The Coastal Ocean Prediction Systems program: Understanding and managing our coastal ocean

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eden, H.F.; Mooers, C.N.K.

    1990-06-01

    The goal of COPS is to couple a program of regular observations to numerical models, through techniques of data assimilation, in order to provide a predictive capability for the US coastal ocean including the Great Lakes, estuaries, and the entire Exclusive Economic Zone (EEZ). The objectives of the program include: determining the predictability of the coastal ocean and the processes that govern the predictability; developing efficient prediction systems for the coastal ocean based on the assimilation of real-time observations into numerical models; and coupling the predictive systems for the physical behavior of the coastal ocean to predictive systems for biological, chemical, and geological processes to achieve an interdisciplinary capability. COPS will provide the basis for effective monitoring and prediction of coastal ocean conditions by optimizing the use of increased scientific understanding, improved observations, advanced computer models, and computer graphics to make the best possible estimates of sea level, currents, temperatures, salinities, and other properties of entire coastal regions.

  2. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill the need for a more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  3. National Aeronautics and Space Administration Budget Estimates, Fiscal Year 2011

    NASA Technical Reports Server (NTRS)

    2010-01-01

    The Budget includes three new robust exploration programs: (1) Technology demonstration program, $7.8 billion over five years. Funds the development and demonstration of technologies that reduce the cost and expand the capabilities of future exploration activities, including in-orbit refueling and storage. (2) Heavy-Lift and Propulsion R&D, $3.1 billion over five years. Funds R&D for new launch systems, propellants, materials, and combustion processes. (3) Robotic precursor missions, $3.0 billion over five years. Funds cost-effective means to scout exploration targets and identify hazards and resources for human visitation and habitation. In addition, the Budget enhances the current Human Research Program by 42% and supports the Participatory Exploration Program at $5 million per year for activities across many NASA programs.

  4. Computer program to assess impact of fatigue and fracture criteria on weight and cost of transport aircraft

    NASA Technical Reports Server (NTRS)

    Tanner, C. J.; Kruse, G. S.; Oman, B. H.

    1975-01-01

    A preliminary design analysis tool for rapidly performing trade-off studies involving fatigue, fracture, static strength, weight, and cost is presented. Analysis subprograms were developed for fatigue life, crack growth life, and residual strength, and were linked to a structural synthesis module which in turn was integrated into a computer program. The part definition module of a cost and weight analysis program was expanded to be compatible with the upgraded structural synthesis capability. The resultant vehicle design and evaluation program is named VDEP-2. It is an accurate and useful tool for estimating purposes at the preliminary design stage of airframe development. A sample case along with an explanation of program applications and input preparation is presented.

  5. Michigan's Physician Group Incentive Program offers a regional model for incremental 'fee for value' payment reform.

    PubMed

    Share, David A; Mason, Margaret H

    2012-09-01

    Blue Cross Blue Shield of Michigan partnered with providers across the state to create an innovative, "fee for value" physician incentive program that would deliver high-quality, efficient care. The Physician Group Incentive Program rewards physician organizations-formal groups of physicians and practices that can accept incentive payments on behalf of their members-based on the number of quality and utilization measures they adopt, such as generic drug dispensing rates, and on their performance on these measures across their patient populations. Physicians also receive payments for implementing a range of patient-centered medical home capabilities, such as patient registries, and they receive higher fees for office visits for incorporating these capabilities into routine practice while also improving performance. Taken together, the incentive dollars, fee increases, and care management payments amount to a potential increase in reimbursement of 40 percent or more from Blue Cross Blue Shield of Michigan for practices designated as high-performing patient-centered medical homes. At the same time, we estimate that implementing the patient-centered medical home capabilities was associated with $155 million in lower medical costs in program year 2011 for Blue Cross Blue Shield of Michigan members. We intend to devote a higher percentage of reimbursement over time to communities of caregivers that offer high-value, system-based care, and a lower percentage of reimbursement to individual physicians on a service-specific basis.

  6. The PAWS and STEM reliability analysis programs

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Stevenson, Philip H.

    1988-01-01

    The PAWS and STEM programs are new design/validation tools. These programs provide a flexible, user-friendly, language-based interface for the input of Markov models describing the behavior of fault-tolerant computer systems. These programs produce exact solutions of the probability of system failure and provide a conservative estimate of the number of significant digits in the solution. PAWS uses a Padé approximation as a solution technique; STEM uses a Taylor series. Both programs have the capability to solve numerically stiff models. PAWS and STEM possess complementary properties with regard to their input space, and an additional strength of these programs is that they accept input compatible with the SURE program. If used in conjunction with SURE, PAWS and STEM provide a powerful suite of programs for analyzing the reliability of fault-tolerant computer systems.
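
    A minimal sketch of the Taylor-series solution idea, applied to an absorbing Markov model of a fault-tolerant two-component system. The rates and model structure are illustrative assumptions, not PAWS/STEM internals:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(Q, t, terms=30, squarings=20):
    """exp(Q*t) via a scaled-and-squared truncated Taylor series
    (the series approach STEM takes; PAWS substitutes a Pade approximant)."""
    n = len(Q)
    s = t / (2 ** squarings)
    A = [[Q[i][j] * s for j in range(n)] for i in range(n)]      # scaled generator
    X = [[float(i == j) for j in range(n)] for i in range(n)]    # running sum = I
    term = [[float(i == j) for j in range(n)] for i in range(n)] # current Taylor term
    for k in range(1, terms):
        term = mat_mul(term, A)
        term = [[term[i][j] / k for j in range(n)] for i in range(n)]
        X = [[X[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    for _ in range(squarings):
        X = mat_mul(X, X)
    return X

# Two-component standby system: state 0 = both good, 1 = one failed,
# 2 = system failed (absorbing). Failure rate is an assumed example value.
LAM = 1e-3  # per-hour component failure rate
Q = [[-2 * LAM, 2 * LAM, 0.0],
     [0.0, -LAM, LAM],
     [0.0, 0.0, 0.0]]
P = mat_exp(Q, 1000.0)
p_fail = P[0][2]  # probability the system has failed by t = 1000 h
```

    For this small model the answer is known in closed form, (1 - exp(-LAM*t))**2, which the series solution reproduces; the scaling-and-squaring step is what keeps the truncated series accurate for stiff generators.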

  7. Supporting Data for Fiscal Year 1994. Budget Estimate Submission

    DTIC Science & Technology

    1993-04-01

    0603401F 405 36 Space Systems Environmental Interactions Technology 0603410F 416 38 Conventional Weapons Technology 0603601F 423 39 Advanced Radiation...Transfer Pilot Program (SBIR/STTR) 0603302F Space and Missile Rocket Propulsion 31 392 060341OF Space Systems Environmental Interactions Technology 36...Deliver Interactive Decode (Rapid Message Processing) capability in Communications Element. - (U) Conduct maintainability demonstration. - (U) Begin Initial

  8. Ballistic Missile Defense System (BMDS)

    DTIC Science & Technology

    2015-12-01

    Assessment and Program Evaluation CARD - Cost Analysis Requirements Description CDD - Capability Development Document CLIN - Contract Line Item Number CPD...Estimate RDT&E - Research, Development, Test, and Evaluation SAR - Selected Acquisition Report SCP - Service Cost Position TBD - To Be Determined TY - Then...BMDS December 2015 SAR March 23, 2016 16:29:09 UNCLASSIFIED 5 Mission and Description Mission and Description To develop, test, and field a layered

  9. Computer graphics for management: An abstract of capabilities and applications of the EIS system

    NASA Technical Reports Server (NTRS)

    Solem, B. J.

    1975-01-01

    The Executive Information Services (EIS) system, developed as a computer-based, time-sharing tool for making and implementing management decisions and including computer graphics capabilities, is described. The following resources are available through the EIS languages: a centralized corporate/government data base, customized and working data bases, report writing, general computational capability, specialized routines, modeling/programming capability, and graphics. Nearly all EIS graphs can be created by a single, on-line instruction. A large number of options are available, such as selection of graphic form, line control, shading, placement on the page, multiple images on a page, control of scaling and labeling, plotting of cumulative data sets, optional grid lines, and stacked charts. Examples of areas in which the EIS system may be used include research, estimating services, planning, budgeting, performance measurement, and national computer hook-up negotiations.

  10. [Age factor in a complex evaluation of health of air staff].

    PubMed

    Ushakov, I B; Batishcheva, G A; Chernov, Iu N; Khomenko, M N; Soldatov, S K

    2010-03-01

    A program for the comprehensive evaluation of the health of flight personnel was developed, providing a capability for early diagnosis of functional tension of physiological systems. Under this program, 73 airmen were examined using a battery of tests (assessment of postural control, personal and reactive anxiety, autonomic regulation, etc.). Length of service was found to be directly proportional to sympathoadrenal activity, with a compensatory decrease in adrenoreactivity. The most informative indices for assessing functional tension of psychophysiological functions, autonomic regulation, and the cardiovascular system were identified. The elaborated system of individual health evaluation was shown to permit diagnosis of prenosological conditions and identification of indications for rehabilitative treatment.

  11. Nasa Program Plan

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Major facts are given for NASA'S planned FY-1981 through FY-1985 programs in aeronautics, space science, space and terrestrial applications, energy technology, space technology, space transportation systems, space tracking and data systems, and construction of facilities. Competition and cooperation, reimbursable launchings, schedules and milestones, supporting research and technology, mission coverage, and required funding are considered. Tables and graphs summarize new initiatives, significant events, estimates of space shuttle flights, and major missions in astrophysics, planetary exploration, life sciences, environmental and resources observation, and solar terrestrial investigations. The growth in tracking and data systems capabilities is also depicted.

  12. Engine structures analysis software: Component Specific Modeling (COSMO)

    NASA Astrophysics Data System (ADS)

    McKnight, R. L.; Maffeo, R. J.; Schwartz, S.

    1994-08-01

    A component specific modeling software program has been developed for propulsion systems. This expert program is capable of formulating the component geometry as finite element meshes for structural analysis which, in the future, can be spun off as NURB geometry for manufacturing. COSMO currently has geometry recipes for combustors, turbine blades, vanes, and disks. Component geometry recipes for nozzles, inlets, frames, shafts, and ducts are being added. COSMO uses component recipes that work through neutral files with the Technology Benefit Estimator (T/BEST) program which provides the necessary base parameters and loadings. This report contains the users manual for combustors, turbine blades, vanes, and disks.

  13. Engine Structures Analysis Software: Component Specific Modeling (COSMO)

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Maffeo, R. J.; Schwartz, S.

    1994-01-01

    A component specific modeling software program has been developed for propulsion systems. This expert program is capable of formulating the component geometry as finite element meshes for structural analysis which, in the future, can be spun off as NURB geometry for manufacturing. COSMO currently has geometry recipes for combustors, turbine blades, vanes, and disks. Component geometry recipes for nozzles, inlets, frames, shafts, and ducts are being added. COSMO uses component recipes that work through neutral files with the Technology Benefit Estimator (T/BEST) program which provides the necessary base parameters and loadings. This report contains the users manual for combustors, turbine blades, vanes, and disks.

  14. Method and program product for determining a radiance field in an optical environment

    NASA Technical Reports Server (NTRS)

    Reinersman, Phillip N. (Inventor); Carder, Kendall L. (Inventor)

    2007-01-01

    A hybrid method is presented by which Monte Carlo techniques are combined with iterative relaxation techniques to solve the Radiative Transfer Equation in arbitrary one-, two- or three-dimensional optical environments. The optical environments are first divided into contiguous regions, or elements, with Monte Carlo techniques then being employed to determine the optical response function of each type of element. The elements are combined, and the iterative relaxation techniques are used to determine simultaneously the radiance field on the boundary and throughout the interior of the modeled environment. This hybrid model is capable of providing estimates of the underwater light field needed to expedite inspection of ship hulls and port facilities. It is also capable of providing estimates of the subaerial light field for structured, absorbing or non-absorbing environments such as shadows of mountain ranges within and without absorption spectral bands such as water vapor or CO2 bands.

  15. Hybrid Neural-Network: Genetic Algorithm Technique for Aircraft Engine Performance Diagnostics Developed and Demonstrated

    NASA Technical Reports Server (NTRS)

    Kobayashi, Takahisa; Simon, Donald L.

    2002-01-01

    As part of the NASA Aviation Safety Program, a unique model-based diagnostics method that employs neural networks and genetic algorithms for aircraft engine performance diagnostics has been developed and demonstrated at the NASA Glenn Research Center against a nonlinear gas turbine engine model. Neural networks are applied to estimate the internal health condition of the engine, and genetic algorithms are used for sensor fault detection, isolation, and quantification. This hybrid architecture combines the excellent nonlinear estimation capabilities of neural networks with the capability to rank the likelihood of various faults given a specific sensor suite signature. The method requires a significantly smaller data training set than a neural network approach alone does, and it performs the combined engine health monitoring objectives of performance diagnostics and sensor fault detection and isolation in the presence of nominal and degraded engine health conditions.

  16. NASA Instrument Cost/Schedule Model

    NASA Technical Reports Server (NTRS)

    Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George

    2011-01-01

    NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model (NICM). NICM is a cost and schedule estimator that contains: a system-level cost estimation tool; a subsystem-level cost estimation tool; a database of cost and technical parameters for over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life-cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, the data mining methods (cluster analysis, principal components analysis, regression analysis, and bootstrap cross validation), the estimating equations themselves, and a demonstration of the NICM tool suite.
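
    Bootstrap cross-validation, one of the data-mining methods listed, can be sketched on a simple least-squares fit; this is an illustration of the general technique, not NICM's implementation:

```python
import random

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def bootstrap_cv(xs, ys, reps=200, seed=0):
    """Bootstrap cross-validation: refit on each resample, score on the
    out-of-bag points, and average the held-out squared error."""
    rng = random.Random(seed)
    n = len(xs)
    errs = []
    for _ in range(reps):
        idx = [rng.randrange(n) for _ in range(n)]
        oob = [i for i in range(n) if i not in set(idx)]
        sub_x = [xs[i] for i in idx]
        if not oob or len(set(sub_x)) < 2:
            continue  # skip degenerate resamples
        a, b = fit_line(sub_x, [ys[i] for i in idx])
        errs.extend((ys[i] - (a + b * xs[i])) ** 2 for i in oob)
    return sum(errs) / len(errs)
```

    Because each point scores only on resamples that excluded it, the averaged error measures out-of-sample predictive accuracy rather than goodness of fit to the training data.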

  17. The Flight Optimization System Weights Estimation Method

    NASA Technical Reports Server (NTRS)

    Wells, Douglas P.; Horvath, Bryce L.; McCullers, Linwood A.

    2017-01-01

    FLOPS has been the primary aircraft synthesis software used by the Aeronautics Systems Analysis Branch at NASA Langley Research Center. It was created for rapid conceptual aircraft design and advanced technology impact assessments. FLOPS is a single computer program that includes weights estimation, aerodynamics estimation, engine cycle analysis, propulsion data scaling and interpolation, detailed mission performance analysis, takeoff and landing performance analysis, noise footprint estimation, and cost analysis. It is well known as a baseline and common denominator for aircraft design studies. FLOPS is capable of calibrating a model to known aircraft data, making it useful for new aircraft and modifications to existing aircraft. The weight estimation method in FLOPS is known to be of high fidelity for conventional tube-with-wing aircraft, and a substantial amount of effort went into its development. This report serves as comprehensive documentation of the FLOPS weight estimation method, presented along with its development process.

  18. IPEG- IMPROVED PRICE ESTIMATION GUIDELINES (IBM PC VERSION)

    NASA Technical Reports Server (NTRS)

    Aster, R. W.

    1994-01-01

    The Improved Price Estimation Guidelines, IPEG, program provides a simple yet accurate estimate of the price of a manufactured product. IPEG facilitates sensitivity studies of price estimates at considerably less expense than would be incurred by using the Standard Assembly-line Manufacturing Industry Simulation, SAMIS, program (COSMIC program NPO-16032). A difference of less than one percent between the IPEG and SAMIS price estimates has been observed with realistic test cases. However, the IPEG simplification of SAMIS allows the analyst with limited time and computing resources to perform a greater number of sensitivity studies than with SAMIS. Although IPEG was developed for the photovoltaics industry, it is readily adaptable to any standard assembly-line manufacturing industry. IPEG estimates the annual production price per unit. The input data include the costs of equipment, space, labor, materials, supplies, and utilities. Production can be simulated on either an industry-wide or a process-wide basis. Once the IPEG input file is prepared, the original price is estimated and sensitivity studies may be performed. The IPEG user selects a sensitivity variable and a set of values; IPEG then computes a price estimate and a variety of other cost parameters for every specified value of the sensitivity variable. IPEG is designed as an interactive system: it prompts the user for all required information and offers a variety of output options. The IPEG/PC program is written in TURBO PASCAL for interactive execution on an IBM PC computer under DOS 2.0 or above with at least 64K of memory. The IBM PC color display and color graphics adapter are needed to use the plotting capabilities in IPEG/PC. IPEG/PC was developed in 1984. The original IPEG program is written in SIMSCRIPT II.5 for interactive execution and has been implemented on an IBM 370 series computer with a central memory requirement of approximately 300K of 8-bit bytes.
The original IPEG was developed in 1980.
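    The annual price-per-unit estimate and sensitivity-sweep workflow described above can be sketched as follows. The cost categories come from the abstract (equipment, space, labor, materials, supplies, utilities); the simple markup formula and every number below are illustrative assumptions, not IPEG's actual pricing model.

    ```python
    # Minimal sketch of an IPEG-style price-per-unit estimate with a sensitivity
    # sweep. The markup formula and all figures are invented for illustration.

    def unit_price(equipment, space, labor, materials, supplies, utilities,
                   annual_units, markup=1.25):
        """Annual production price per unit: total annual cost times a markup."""
        total = equipment + space + labor + materials + supplies + utilities
        return markup * total / annual_units

    base = dict(equipment=400_000, space=60_000, labor=900_000,
                materials=1_200_000, supplies=80_000, utilities=150_000,
                annual_units=500_000)

    print(f"baseline price: ${unit_price(**base):.3f}/unit")

    # Sensitivity study: the user picks one variable and a set of values,
    # and the model reports the price at each value.
    for labor in (700_000, 900_000, 1_100_000):
        price = unit_price(**{**base, "labor": labor})
        print(f"labor=${labor:,}: price ${price:.3f}/unit")
    ```

    Because the model is a closed-form expression rather than a factory simulation, sweeping any input is essentially free, which is the economy IPEG offers over SAMIS.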

  19. IPEG- IMPROVED PRICE ESTIMATION GUIDELINES (IBM 370 VERSION)

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. G.

    1994-01-01

    The Improved Price Estimation Guidelines, IPEG, program provides a simple yet accurate estimate of the price of a manufactured product. IPEG facilitates sensitivity studies of price estimates at considerably less expense than would be incurred by using the Standard Assembly-line Manufacturing Industry Simulation, SAMIS, program (COSMIC program NPO-16032). A difference of less than one percent between the IPEG and SAMIS price estimates has been observed with realistic test cases. However, the IPEG simplification of SAMIS allows the analyst with limited time and computing resources to perform a greater number of sensitivity studies than with SAMIS. Although IPEG was developed for the photovoltaics industry, it is readily adaptable to any standard assembly-line manufacturing industry. IPEG estimates the annual production price per unit. The input data include the costs of equipment, space, labor, materials, supplies, and utilities. Production can be simulated on either an industry-wide or a process-wide basis. Once the IPEG input file is prepared, the original price is estimated and sensitivity studies may be performed. The IPEG user selects a sensitivity variable and a set of values; IPEG then computes a price estimate and a variety of other cost parameters for every specified value of the sensitivity variable. IPEG is designed as an interactive system: it prompts the user for all required information and offers a variety of output options. The IPEG/PC program is written in TURBO PASCAL for interactive execution on an IBM PC computer under DOS 2.0 or above with at least 64K of memory. The IBM PC color display and color graphics adapter are needed to use the plotting capabilities in IPEG/PC. IPEG/PC was developed in 1984. The original IPEG program is written in SIMSCRIPT II.5 for interactive execution and has been implemented on an IBM 370 series computer with a central memory requirement of approximately 300K of 8-bit bytes.
The original IPEG was developed in 1980.

  20. A Smart Irrigation Approach Aided by Monitoring Surface Soil Moisture using Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Wienhold, K. J.; Li, D.; Fang, N. Z.

    2017-12-01

    Soil moisture is a critical component in the optimization of irrigation scheduling in water resources management. Unmanned Aerial Vehicles (UAVs) equipped with multispectral sensors represent an emerging technology capable of detecting and estimating soil moisture for irrigation and crop management. This study demonstrates a method of using a UAV as an optical and thermal remote sensing platform, combined with genetic programming, to derive high-resolution surface soil moisture (SSM) estimates. The objective is to evaluate the feasibility of spatially variable irrigation management for a golf course (about 50 acres) in North Central Texas. Multispectral data were collected over the course of one month in the visible, near-infrared, and longwave-infrared spectrums using a UAV capable of rapid and safe deployment for daily estimates. The accuracy of the model predictions is quantified using a time domain reflectometry (TDR) soil moisture sensor and a holdout validation test set. The model produces reasonable estimates for SSM, with an average coefficient of correlation (r) = 0.87 and coefficient of determination (R2) = 0.76. The study suggests that the derived SSM estimates can be used to better inform irrigation scheduling decisions for lightly vegetated areas such as the turf or native roughs found on golf courses.
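    The holdout-validation statistics quoted above (r and R2) are standard quantities; how they are computed from paired observed/predicted values can be shown directly. The TDR "observed" and model "predicted" soil-moisture values below are invented for illustration.

    ```python
    import math

    # Invented holdout pairs: TDR-observed vs model-predicted soil moisture.
    observed  = [0.21, 0.18, 0.25, 0.30, 0.27, 0.16, 0.22, 0.33]
    predicted = [0.20, 0.19, 0.24, 0.28, 0.29, 0.17, 0.24, 0.31]

    def pearson_r(obs, pred):
        """Pearson coefficient of correlation between two equal-length series."""
        n = len(obs)
        mo, mp = sum(obs) / n, sum(pred) / n
        cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
        so = math.sqrt(sum((o - mo) ** 2 for o in obs))
        sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
        return cov / (so * sp)

    def r_squared(obs, pred):
        """Coefficient of determination: 1 - SSE/SST against the mean baseline."""
        mo = sum(obs) / len(obs)
        sse = sum((o - p) ** 2 for o, p in zip(obs, pred))
        sst = sum((o - mo) ** 2 for o in obs)
        return 1 - sse / sst

    print(f"r = {pearson_r(observed, predicted):.2f}")
    print(f"R^2 = {r_squared(observed, predicted):.2f}")
    ```

    Note the two statistics answer different questions: r measures linear association, while R2 penalizes any systematic bias in the predictions, which is why both are typically reported.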

  1. Experiment module concepts study. Volume 5 book 1, appendix A: Shuttle only task

    NASA Technical Reports Server (NTRS)

    1970-01-01

    Results of a preliminary investigation of the effect on the candidate experiment program implementation of experiment module operations in the absence of an orbiting space station and with the availability of the space shuttle orbiter vehicle only are presented. The fundamental hardware elements for shuttle-only operation of the program are: (1) integrated common experiment modules CM-1, CM-3, and CM-4, together with the propulsion slice; (2) support modules capable of supplying on-orbit crew life support, power, data management, and other services normally provided by a space station; (3) dormancy kits to enable normally attached modules to remain in orbit while shuttle returns to earth; and (4) shuttle orbiter. Preliminary cost estimates for 30 day on-orbit and 5 day on-orbit capabilities for a four year implementation period are $4.2 billion and $2.1 billion, respectively.

  2. Study of aerodynamic technology for VSTOL fighter attack aircraft

    NASA Technical Reports Server (NTRS)

    Burhans, W., Jr.; Crafta, V. J., Jr.; Dannenhoffer, N.; Dellamura, F. A.; Krepski, R. E.

    1978-01-01

    Vertical/short takeoff capability, supersonic dash capability, and transonic agility were investigated for the development of fighter/attack aircraft to be accommodated on ships smaller than present aircraft carriers. Topics covered include: (1) description of a viable V/STOL fighter/attack configuration (a high-wing, close-coupled-canard, twin-engine, control-configured aircraft) which meets or exceeds specified levels of vehicle performance; (2) estimates of vehicle aerodynamic characteristics and the methodology utilized to generate them; (3) description of propulsion system characteristics and vehicle mass properties; (4) identification of areas of aerodynamic uncertainty; and (5) a test program to investigate the areas of aerodynamic uncertainty in the conventional flight mode.

  3. Review of hardware cost estimation methods, models and tools applied to early phases of space mission planning

    NASA Astrophysics Data System (ADS)

    Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.

    2012-08-01

    The primary purpose of this paper is to review currently existing cost estimation methods, models, tools and resources applicable to the space sector. While key space sector methods are outlined, a specific focus is placed on hardware cost estimation on a system level, particularly for early mission phases during which specifications and requirements are not yet crystallised, and information is limited. For the space industry, cost engineering within the systems engineering framework is an integral discipline. The cost of any space program now constitutes a stringent design criterion, which must be considered and carefully controlled during the entire program life cycle. A first step to any program budget is a representative cost estimate which usually hinges on a particular estimation approach, or methodology. Therefore, appropriate selection of specific cost models, methods and tools is paramount; this is a difficult task given the highly variable nature and scope, as well as the scientific and technical requirements, applicable to each program. Numerous methods, models and tools exist. However new ways are needed to address very early, pre-Phase 0 cost estimation during the initial program research and establishment phase, when system specifications are limited but the available research budget needs to be established and defined. For highly specific vehicles such as reusable launchers with a manned capability, a lack of historical data implies that both the classic heuristic approach, such as parametric cost estimation based on underlying CERs, and the analogy approach are by definition limited. This review identifies prominent cost estimation models applied to the space sector, and their underlying cost driving parameters and factors. Strengths, weaknesses, and suitability to specific mission types and classes are also highlighted.
Current approaches which strategically amalgamate various cost estimation strategies both for formulation and validation of an estimate, and techniques and/or methods to attain representative and justifiable cost estimates are consequently discussed. Ultimately, the aim of the paper is to establish a baseline for development of a non-commercial, low cost, transparent cost estimation methodology to be applied during very early program research phases at a complete vehicle system level, for largely unprecedented manned launch vehicles in the future. This paper takes the first step to achieving this through the identification, analysis and understanding of established, existing techniques, models, tools and resources relevant within the space sector.

  4. CARES/Life Ceramics Durability Evaluation Software Enhanced for Cyclic Fatigue

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.

    1999-01-01

    The CARES/Life computer program predicts the probability of a monolithic ceramic component's failure as a function of time in service. The program has many features and options for materials evaluation and component design. It couples commercial finite element programs--which resolve a component's temperature and stress distribution--to reliability evaluation and fracture mechanics routines for modeling strength-limiting defects. The capability, flexibility, and uniqueness of CARES/Life have attracted many users representing a broad range of interests and have resulted in numerous awards for technological achievements and technology transfer. Recent work with CARES/Life was directed at enhancing the program's capabilities with regard to cyclic fatigue. Only in the last few years have ceramics been recognized to be susceptible to enhanced degradation from cyclic loading. To account for cyclic loads, researchers at the NASA Lewis Research Center developed a crack growth model that combines the Power Law (time-dependent) and the Walker Law (cycle-dependent) crack growth models. This combined model has the characteristics of Power Law behavior (decreased damage) at high R ratios (minimum load/maximum load) and of Walker Law behavior (increased damage) at low R ratios. In addition, a parameter estimation methodology for constant-amplitude, steady-state cyclic fatigue experiments was developed using nonlinear least squares and a modified Levenberg-Marquardt algorithm. This methodology is used to give best estimates of parameter values from cyclic fatigue specimen rupture data (usually tensile or flexure bar specimens) for a relatively small number of specimens. Methodology to account for runout data (specimens unfailed over the duration of the experiment) was also included.
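    The qualitative behavior of the combined model described above can be sketched: a Walker-type cycle-dependent term whose damage grows as the R ratio drops, plus a Power-Law-type time-dependent term that dominates at high R, where the load is nearly static. This is an illustrative sketch, not the CARES/Life formulation, and all parameter values are invented.

    ```python
    # Hedged sketch of combined cyclic-fatigue crack growth behavior.
    # All constants (C, n, m, A) are invented illustration values.

    def walker_rate(K_max, R, C=1e-12, n=4.0, m=0.5):
        """Cycle-dependent growth per cycle, da/dN = C * (K_max * (1-R)**m)**n."""
        return C * (K_max * (1.0 - R) ** m) ** n

    def power_law_rate(K_max, period_s, A=1e-13, n=4.0):
        """Time-dependent growth accumulated over one load cycle, approximating
        the stress intensity as K ~ K_max for period_s seconds."""
        return A * K_max ** n * period_s

    for R in (0.1, 0.5, 0.9):
        cyc = walker_rate(K_max=20.0, R=R)
        tim = power_law_rate(K_max=20.0, period_s=1.0)
        print(f"R={R}: cyclic term {cyc:.2e}, time term {tim:.2e}")
    ```

    With these (invented) constants the cyclic term dominates at low R and the time term dominates at R = 0.9, mirroring the abstract's statement that damage is Walker-like at low R ratios and Power-Law-like at high ones.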

  5. Space station assembly/servicing capabilities

    NASA Technical Reports Server (NTRS)

    Joyce, Joseph

    1986-01-01

    The aim is to place a permanently manned space station, international in scope, in orbit around the Earth. The program is nearing the close of the system definition and preliminary design phase. The first shuttle launch for space station assembly on-orbit is estimated for January 1993. Topics perceived to be important to on-orbit assembly and servicing are discussed. The presentation consists of charts.

  6. Naval Sea Systems Command On Watch 2010

    DTIC Science & Technology

    2010-01-01

    surface targets, such as zodiacs and fast patrol boats found in the littoral environment. As for future capabilities and goals for the program, An...

  7. Nonlethal Munitions (NLM) Expand Warfighter Capabilities

    DTIC Science & Technology

    2008-03-01

    the Office of the Project Manager Close Combat Systems (PM CCS), part of Program Executive Office Ammunition (PEO Ammo), deployed the Army's first...provide commanders the flexibility to influence the situation favorably with increased safety to U.S. Forces while reducing risk of both noncombatant

  8. Reaction Mechanism Generator: Automatic construction of chemical kinetic mechanisms

    NASA Astrophysics Data System (ADS)

    Gao, Connie W.; Allen, Joshua W.; Green, William H.; West, Richard H.

    2016-06-01

    Reaction Mechanism Generator (RMG) constructs kinetic models composed of elementary chemical reaction steps using a general understanding of how molecules react. Species thermochemistry is estimated through Benson group additivity and reaction rate coefficients are estimated using a database of known rate rules and reaction templates. At its core, RMG relies on two fundamental data structures: graphs and trees. Graphs are used to represent chemical structures, and trees are used to represent thermodynamic and kinetic data. Models are generated using a rate-based algorithm which excludes species from the model based on reaction fluxes. RMG can generate reaction mechanisms for species involving carbon, hydrogen, oxygen, sulfur, and nitrogen. It also has capabilities for estimating transport and solvation properties, and it automatically computes pressure-dependent rate coefficients and identifies chemically-activated reaction paths. RMG is an object-oriented program written in Python, which provides a stable, robust programming architecture for developing an extensible and modular code base with a large suite of unit tests. Computationally intensive functions are cythonized for speed improvements.
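    RMG's core data-structure idea, molecules as graphs queried for reactive sites, can be shown in miniature. The matching below is a toy neighborhood test on an adjacency mapping, not RMG's actual subgraph-isomorphism machinery, and only the hydroxyl hydrogen is drawn explicitly.

    ```python
    # A molecule as a graph: atom id -> (element, neighbor ids). Toy sketch of
    # RMG's graph representation; ethanol CH3-CH2-OH with one explicit H.
    ethanol = {
        0: ("C", [1]),
        1: ("C", [0, 2]),
        2: ("O", [1, 3]),
        3: ("H", [2]),     # only the hydroxyl H is drawn explicitly here
    }

    def find_hydroxyl(mol):
        """Return atom ids of oxygens bonded to an explicit hydrogen (O-H groups)."""
        hits = []
        for idx, (element, neighbors) in mol.items():
            if element == "O" and any(mol[n][0] == "H" for n in neighbors):
                hits.append(idx)
        return hits

    print("hydroxyl oxygens:", find_hydroxyl(ethanol))
    ```

    Functional-group templates generalize this pattern: a query graph is matched against the molecule graph, and matched sites determine which reaction templates, and hence which rate rules from the tree-structured database, apply.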

  9. Back to BaySICS: a user-friendly program for Bayesian Statistical Inference from Coalescent Simulations.

    PubMed

    Sandoval-Castellanos, Edson; Palkopoulou, Eleftheria; Dalén, Love

    2014-01-01

    Inference of population demographic history has vastly improved in recent years due to a number of technological and theoretical advances, including the use of ancient DNA. Approximate Bayesian computation (ABC) stands among the most promising methods due to its simple theoretical foundation and exceptional flexibility. However, the limited availability of user-friendly programs that perform ABC analysis renders it difficult to implement, and hence programming skills are frequently required. In addition, there is limited availability of programs able to deal with heterochronous data. Here we present the software BaySICS: Bayesian Statistical Inference of Coalescent Simulations. BaySICS provides an integrated and user-friendly platform that performs ABC analyses by means of coalescent simulations from DNA sequence data. It estimates historical demographic population parameters and performs hypothesis testing by means of Bayes factors obtained from model comparisons. Although providing specific features that improve inference from datasets with heterochronous data, BaySICS also has several capabilities making it a suitable tool for analysing contemporary genetic datasets. Those capabilities include joint analysis of independent tables, a graphical interface, and the implementation of Markov-chain Monte Carlo without likelihoods.
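    The ABC idea underlying BaySICS can be sketched in its simplest rejection form: draw a parameter from the prior, simulate data, and keep the draw only if a summary statistic lands close to the observed one. The "model" here is a plain normal sample, not a coalescent simulation, and all numbers are illustrative.

    ```python
    import random

    # Minimal ABC rejection sampler: infer the mean of a population from a
    # summary statistic alone, with no likelihood evaluation.
    random.seed(42)
    observed_mean = 5.0          # summary statistic of the "observed" data
    n_obs, tolerance = 50, 0.3

    accepted = []
    for _ in range(5000):
        theta = random.uniform(0.0, 10.0)                        # flat prior draw
        sim = [random.gauss(theta, 2.0) for _ in range(n_obs)]   # simulate data
        if abs(sum(sim) / n_obs - observed_mean) < tolerance:    # rejection step
            accepted.append(theta)

    posterior_mean = sum(accepted) / len(accepted)
    print(f"{len(accepted)} draws accepted, posterior mean ~ {posterior_mean:.2f}")
    ```

    The accepted draws approximate the posterior; shrinking the tolerance tightens the approximation at the cost of more rejected simulations, which is the central trade-off in any ABC analysis.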

  10. Disk Alloy Development

    NASA Technical Reports Server (NTRS)

    Gabb, Tim; Gayda, John; Telesman, Jack

    2001-01-01

    The advanced powder metallurgy disk alloy ME3 was designed using statistical screening and optimization of composition and processing variables in the NASA HSR/EPM disk program to have extended durability at 1150 to 1250 °F in large disks. Scaled-up disks of this alloy were produced at the conclusion of this program to demonstrate these properties in realistic disk shapes. The objective of the UEET disk program was to assess the mechanical properties of these ME3 disks as functions of temperature, in order to estimate the maximum temperature capabilities of this advanced alloy. Scaled-up disks processed in the HSR/EPM Compressor/Turbine Disk program were sectioned, machined into specimens, and tested in tensile, creep, fatigue, and fatigue crack growth tests by NASA Glenn Research Center, in cooperation with General Electric Engine Company and Pratt & Whitney Aircraft Engines. Additional sub-scale disks and blanks were processed and tested to explore the effects of several processing variations on mechanical properties. Scaled-up disks of an advanced regional disk alloy, Alloy 10, were used to evaluate dual microstructure heat treatments. This allowed demonstration of an improved balance of properties in disks with higher strength and fatigue resistance in the bores and higher creep and dwell fatigue crack growth resistance in the rims. Results indicate the baseline ME3 alloy and process has 1300 to 1350 °F temperature capabilities, dependent on detailed disk and engine design property requirements. Chemistry and process enhancements show promise for further increasing temperature capabilities.

  11. Legacy model integration for enhancing hydrologic interdisciplinary research

    NASA Astrophysics Data System (ADS)

    Dozier, A.; Arabi, M.; David, O.

    2013-12-01

    Many challenges are introduced to interdisciplinary research in and around the hydrologic science community due to advances in computing technology and modeling capabilities in different programming languages, across different platforms and frameworks by researchers in a variety of fields with a variety of experience in computer programming. Many new hydrologic models as well as optimization, parameter estimation, and uncertainty characterization techniques are developed in scripting languages such as Matlab, R, Python, or in newer languages such as Java and the .Net languages, whereas many legacy models have been written in FORTRAN and C, which complicates inter-model communication for two-way feedbacks. However, most hydrologic researchers and industry personnel have little knowledge of the computing technologies that are available to address the model integration process. Therefore, the goal of this study is to address these new challenges by utilizing a novel approach based on a publish-subscribe-type system to enhance modeling capabilities of legacy socio-economic, hydrologic, and ecologic software. Enhancements include massive parallelization of executions and access to legacy model variables at any point during the simulation process by another program without having to compile all the models together into an inseparable 'super-model'. Thus, this study provides two-way feedback mechanisms between multiple different process models that can be written in various programming languages and can run on different machines and operating systems. Additionally, a level of abstraction is given to the model integration process that allows researchers and other technical personnel to perform more detailed and interactive modeling, visualization, optimization, calibration, and uncertainty analysis without requiring deep understanding of inter-process communication. 
To be compatible, a program must be written in a programming language with bindings to a common implementation of the message passing interface (MPI), which includes FORTRAN, C, Java, the .NET languages, Python, R, Matlab, and many others. The system is tested on a longstanding legacy hydrologic model, the Soil and Water Assessment Tool (SWAT), to observe and enhance speed-up capabilities for various optimization, parameter estimation, and model uncertainty characterization techniques, which is particularly important for computationally intensive hydrologic simulations. Initial results indicate that the legacy extension system significantly decreases developer time, computation time, and the cost of purchasing commercial parallel processing licenses, while enhancing interdisciplinary research by providing detailed two-way feedback mechanisms between various process models with minimal changes to legacy code.
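    The publish-subscribe integration described above can be reduced to a toy: one "legacy model" publishes a state variable at each time step, another program reads it mid-simulation and feeds a value back, and neither is compiled into the other. A real deployment would put MPI or sockets where this in-process bus sits; the models and numbers here are invented.

    ```python
    # Toy publish-subscribe bus enabling two-way feedback between two "models"
    # without merging them into one executable. Purely illustrative.

    class Bus:
        def __init__(self):
            self.topics = {}        # last published value per topic
            self.subscribers = {}   # topic -> list of callbacks

        def publish(self, topic, value):
            self.topics[topic] = value
            for callback in self.subscribers.get(topic, []):
                callback(value)

        def subscribe(self, topic, callback):
            self.subscribers.setdefault(topic, []).append(callback)

    bus = Bus()

    # "Economic model": subscribes to streamflow, publishes a demand response.
    bus.subscribe("streamflow", lambda q: bus.publish("demand", 100.0 - 0.5 * q))

    # "Hydrologic model": runs its time loop, publishing state, reading feedback.
    flow = 40.0
    for step in range(3):
        bus.publish("streamflow", flow)
        demand = bus.topics["demand"]       # two-way feedback mid-simulation
        flow = flow - 0.1 * demand + 5.0    # toy water-balance update
        print(f"step {step}: flow={flow:.2f}, demand={demand:.2f}")
    ```

    The key property is the one the abstract emphasizes: the subscriber reads legacy-model variables at any point during the simulation without the two codes being linked into an inseparable "super-model".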

  12. Right Size Determining the Staff Necessary to Sustain Simulation and Computing Capabilities for Nuclear Security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikkel, Daniel J.; Meisner, Robert

    The Advanced Simulation and Computing Campaign, herein referred to as the ASC Program, is a core element of the science-based Stockpile Stewardship Program (SSP), which enables assessment, certification, and maintenance of the safety, security, and reliability of the U.S. nuclear stockpile without the need to resume nuclear testing. The use of advanced parallel computing has transitioned from proof-of-principle to become a critical element for assessing and certifying the stockpile. As the initiative phase of the ASC Program came to an end in the mid-2000s, the National Nuclear Security Administration redirected resources to other urgent priorities, and the resulting staff reductions in ASC occurred without the benefit of analysis of their impact on modern stockpile stewardship, which depends on these new simulation capabilities. Consequently, in mid-2008 the ASC Program management commissioned a study to estimate the essential size and balance needed to sustain advanced simulation as a core component of stockpile stewardship. The ASC Program requires a minimum base staff size of 930 (which includes the number of staff necessary to maintain critical technical disciplines as well as to execute required programmatic tasks) to sustain its essential ongoing role in stockpile stewardship.

  13. Portfolio Management

    NASA Technical Reports Server (NTRS)

    Duncan, Sharon L.

    2011-01-01

    Enterprise Business Information Services Division (EBIS) supports the Laboratory and its functions through the implementation and support of business information systems on behalf of its business community. EBIS has five strategic focus areas: (1) improve project estimating, planning, and delivery capability; (2) improve maintainability and sustainability of the EBIS application portfolio; (3) leap forward in IT leadership; (4) comprehensive talent management; and (5) a continuous IT security program. Portfolio management is a strategy in which software applications are managed as assets.

  14. Mass-Rearing Hydrellia Pakistanae Deonier and H. balciunasi Bock for the Management of Hydrilla verticillata

    DTIC Science & Technology

    2009-06-01

    capability. Mass Rearing: The ability to mass-produce large numbers of high quality insect biocontrol agents can be a tremendous asset when...implementing a biocontrol program. Common sense would dictate that releasing a high number of agents allows for higher establishment success, more rapid...varying densities of biocontrol agent (costs estimated using a constant weight). Table 1. Hypothetical calculation of Hydrellia pakistanae production cost

  15. Department of the Army Justification of Estimates for Fiscal Years 1988/1989. Procurement/Appropriations-Construction Program Submitted to Congress.

    DTIC Science & Technology

    1987-01-01

    ...to service the IDS equipment... Fencing shall be provided around the IDS area with necessary gates and roads... will provide enhanced and expanded capabilities. The EMCS (with 2,000 points) will control package boilers at 24 locations, one furnace, chiller

  16. Fitting integrated enzyme rate equations to progress curves with the use of a weighting matrix.

    PubMed Central

    Franco, R; Aran, J M; Canela, E I

    1991-01-01

    A method is presented for fitting the pairs of values (product formed, time) taken from progress curves to the integrated rate equation. The procedure is applied to the estimation of the kinetic parameters of the adenosine deaminase system. Simulation studies demonstrate the capabilities of this strategy. A copy of the FORTRAN77 program used can be obtained from the authors by request. PMID:2006914
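    Progress-curve fitting with an integrated rate equation can be sketched with the integrated Michaelis-Menten form, Vmax*t = P + Km*ln(S0/(S0-P)), which gives time explicitly as a function of product formed. This is only an illustration of the general approach: the fit below is an unweighted grid search (the paper's method uses a weighting matrix and a proper optimizer), and the "experimental" points are synthetic with Km = 25, Vmax = 2, S0 = 100.

    ```python
    import math

    # Synthetic progress-curve fit against the integrated Michaelis-Menten
    # equation. All kinetic values are invented for illustration.
    S0, Km_true, Vmax_true = 100.0, 25.0, 2.0

    def t_of_p(P, Km, Vmax):
        """Time to reach product P from the integrated rate equation."""
        return (P + Km * math.log(S0 / (S0 - P))) / Vmax

    # Synthetic progress-curve data: (product formed, time) pairs.
    products = [10.0, 25.0, 45.0, 70.0, 90.0]
    times = [t_of_p(P, Km_true, Vmax_true) for P in products]

    def sse(Km, Vmax):
        """Unweighted sum of squared time residuals over the progress curve."""
        return sum((t_of_p(P, Km, Vmax) - t) ** 2 for P, t in zip(products, times))

    best = min(((sse(Km / 10, Vmax / 10), Km / 10, Vmax / 10)
                for Km in range(100, 401, 5)      # Km grid: 10.0 .. 40.0
                for Vmax in range(10, 41, 1)),    # Vmax grid: 1.0 .. 4.0
               key=lambda triple: triple[0])
    print(f"fitted Km ~ {best[1]}, Vmax ~ {best[2]}")
    ```

    Fitting time as a function of product sidesteps solving the implicit equation for P(t); the weighting matrix of the paper would enter where the unweighted residuals are summed here.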

  17. NASA's Potential Contributions for Remediation of Retention Ponds Using Solar Ultraviolet Radiation and Photocatalysis

    NASA Technical Reports Server (NTRS)

    Underwood, Lauren W.; Ryan, Robert E.

    2007-01-01

    This Candidate Solution uses NASA Earth science research on atmospheric ozone and aerosols data (1) to help improve the prediction capabilities of water runoff models that are used to estimate runoff pollution from retention ponds, and (2) to understand the pollutant removal contribution and potential of photocatalytically coated materials that could be used in these ponds. Models (the EPA's SWMM and the USGS SLAMM) exist that estimate the release of pollutants into the environment from storm-water-related retention pond runoff. UV irradiance data acquired from the satellite mission Aura and from the OMI Surface UV algorithm will be incorporated into these models to enhance their capabilities, not only by increasing the general understanding of retention pond function (both the efficacy and efficiency) but additionally by adding photocatalytic materials to these retention ponds, augmenting their performance. State and local officials who run pollution protection programs could then develop and implement photocatalytic technologies for water pollution control in retention ponds and use them in conjunction with existing runoff models. More effective decisions about water pollution protection programs could be made, the persistence and toxicity of waste generated could be minimized, and subsequently our natural water resources would be improved. This Candidate Solution is in alignment with the Water Management and Public Health National Applications.

  18. Analysis of the impact of safeguards criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mullen, M.F.; Reardon, P.T.

    As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts: the first explains the technical approach and methodology; the second contains an example application of the methodology; the third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; there are important interactions between variables, that is, the effects of a given variable often depend on the level or value of some other variable, and with the methodology used in this study these interactions can be quantitatively analyzed; and reasonably good approximate prediction equations can be developed using the methodology described here.
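    How the parameters above interact can be illustrated with the textbook normal-theory formulation of a one-sided detection test: the probability of detecting a diversion of the goal quantity, at a fixed false alarm probability, given a combined operator/inspector measurement uncertainty. This is a generic sketch, not the PNL Task C.5 model, and all numbers are illustrative.

    ```python
    import math

    # Detection probability for a one-sided test at false alarm probability 0.05:
    # P(detect) = 1 - Phi(z_alpha - M/sigma), where M is the goal quantity and
    # sigma the combined measurement uncertainty. Generic illustration only.

    def phi(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    def p_detect(goal_qty, sigma):
        """Detection probability at fixed false-alarm probability alpha = 0.05."""
        z_alpha = 1.6449  # one-sided 95% point of the standard normal
        return 1.0 - phi(z_alpha - goal_qty / sigma)

    # Better inspector measurement capability (smaller sigma) -> higher detection,
    # matching the report's finding that measurement capability, goal quantity,
    # and throughput drive the probability of detection.
    for sigma in (8.0, 4.0, 2.0):
        print(f"sigma={sigma}: P(detect 8 kg) = {p_detect(8.0, sigma):.2f}")
    ```

    The interaction effects the report emphasizes are visible even here: the payoff of halving sigma depends strongly on the goal quantity, since only the ratio M/sigma enters the test.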

  19. General aviation crash safety program at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.

    1976-01-01

    The purpose of the crash safety program is to support development of the technology to define and demonstrate new structural concepts for improved crash safety and occupant survivability in general aviation aircraft. The program involves three basic areas of research: full-scale crash simulation testing, nonlinear structural analyses necessary to predict failure modes and collapse mechanisms of the vehicle, and evaluation of energy absorption concepts for specific component design. Both analytical and experimental methods are being used to develop expertise in these areas. Analyses include both simplified procedures for estimating energy absorption capabilities and more complex computer programs for analysis of general airframe response. Full-scale tests of typical structures as well as tests on structural components are being used to verify the analyses and to demonstrate improved design concepts.

  20. Maternal influenza immunization in Malawi: Piloting a maternal influenza immunization program costing tool by examining a prospective program

    PubMed Central

    Pecenka, Clint; Munthali, Spy; Chunga, Paul; Levin, Ann; Morgan, Win; Lambach, Philipp; Bhat, Niranjan; Neuzil, Kathleen M.; Ortiz, Justin R.

    2017-01-01

    Background: This costing study in Malawi is a first evaluation of a Maternal Influenza Immunization Program Costing Tool (Costing Tool) for maternal immunization. The tool was designed to help low- and middle-income countries plan for maternal influenza immunization programs, which differ from infant vaccination programs because of differences in the target population and potential differences in delivery strategy or venue. Methods: This analysis examines the incremental costs of a prospective seasonal maternal influenza immunization program added to a successful routine childhood immunization and antenatal care program. The Costing Tool estimates financial and economic costs for different vaccine delivery scenarios for each of the major components of the expanded immunization program. Results: In our base scenario, which specifies a donated single-dose pre-filled vaccine formulation, the total financial cost of a program that would reach 2.3 million women is approximately $1.2 million over five years. The economic cost of the program, including the donated vaccine, is $10.4 million over the same period. The financial and economic costs per immunized pregnancy are $0.52 and $4.58, respectively. Other scenarios examine lower vaccine uptake, reaching 1.2 million women, and a vaccine purchased at $2.80 per dose with an alternative presentation. Conclusion: This study estimates the financial and economic costs associated with a prospective maternal influenza immunization program in a low-income country. In some scenarios, the incremental delivery cost of a maternal influenza immunization program may be as low as some estimates for childhood vaccination programs, assuming the routine childhood immunization and antenatal care systems are capable of serving as the platform for an additional vaccination program. However, purchasing influenza vaccines at the prices assumed in this analysis, instead of having them donated, is likely to be challenging for lower-income countries. This result should be considered a starting point for understanding the costs of maternal immunization programs in low- and middle-income countries. PMID:29281710

  1. Maternal influenza immunization in Malawi: Piloting a maternal influenza immunization program costing tool by examining a prospective program.

    PubMed

    Pecenka, Clint; Munthali, Spy; Chunga, Paul; Levin, Ann; Morgan, Win; Lambach, Philipp; Bhat, Niranjan; Neuzil, Kathleen M; Ortiz, Justin R; Hutubessy, Raymond

    2017-01-01

    This costing study in Malawi is a first evaluation of a Maternal Influenza Immunization Program Costing Tool (Costing Tool) for maternal immunization. The tool was designed to help low- and middle-income countries plan for maternal influenza immunization programs, which differ from infant vaccination programs because of differences in the target population and potential differences in delivery strategy or venue. This analysis examines the incremental costs of a prospective seasonal maternal influenza immunization program added to a successful routine childhood immunization and antenatal care program. The Costing Tool estimates financial and economic costs for different vaccine delivery scenarios for each of the major components of the expanded immunization program. In our base scenario, which specifies a donated single-dose pre-filled vaccine formulation, the total financial cost of a program that would reach 2.3 million women is approximately $1.2 million over five years. The economic cost of the program, including the donated vaccine, is $10.4 million over the same period. The financial and economic costs per immunized pregnancy are $0.52 and $4.58, respectively. Other scenarios examine lower vaccine uptake, reaching 1.2 million women, and a vaccine purchased at $2.80 per dose with an alternative presentation. This study estimates the financial and economic costs associated with a prospective maternal influenza immunization program in a low-income country. In some scenarios, the incremental delivery cost of a maternal influenza immunization program may be as low as some estimates for childhood vaccination programs, assuming the routine childhood immunization and antenatal care systems are capable of serving as the platform for an additional vaccination program. However, purchasing influenza vaccines at the prices assumed in this analysis, instead of having them donated, is likely to be challenging for lower-income countries. This result should be considered a starting point for understanding the costs of maternal immunization programs in low- and middle-income countries.
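    As a sanity check, the per-immunized-pregnancy figures follow directly from the totals reported in the abstract. A minimal sketch (the economic figure comes out slightly below the reported $4.58 because the published totals are themselves rounded):

```python
# Per-immunized-pregnancy cost = total program cost / women reached.
# All figures are the base-scenario totals reported in the abstract.
financial_cost = 1.2e6   # USD, financial cost over five years
economic_cost = 10.4e6   # USD, economic cost including the donated vaccine
women_reached = 2.3e6

financial_per_pregnancy = financial_cost / women_reached
economic_per_pregnancy = economic_cost / women_reached

print(round(financial_per_pregnancy, 2))  # → 0.52
print(round(economic_per_pregnancy, 2))
```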

  2. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
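    Dakota's sampling-based uncertainty quantification can be illustrated with a minimal sketch: draw samples of uncertain inputs, push them through a simulation, and summarize the output distribution. The quadratic "simulation" and the input distributions below are stand-ins invented for this example, not anything from the Dakota manual:

```python
import random
import statistics

def simulation(x, y):
    # Stand-in for a wrapped simulation code: a simple analytic response.
    return x ** 2 + 0.5 * y

random.seed(0)
# Assumed uncertain inputs: x ~ N(1.0, 0.1), y ~ N(2.0, 0.2).
samples = [simulation(random.gauss(1.0, 0.1), random.gauss(2.0, 0.2))
           for _ in range(10_000)]

mean = statistics.fmean(samples)
stdev = statistics.stdev(samples)
print(f"response mean ~ {mean:.3f}, stdev ~ {stdev:.3f}")
```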

  3. Smoothing-Based Relative Navigation and Coded Aperture Imaging

    NASA Technical Reports Server (NTRS)

    Saenz-Otero, Alvar; Liebe, Carl Christian; Hunter, Roger C.; Baker, Christopher

    2017-01-01

    This project will develop efficient smoothing software for incremental estimation of the relative poses and velocities between multiple small spacecraft in a formation, and a small, long-range depth sensor based on coded aperture imaging that is capable of identifying other spacecraft in the formation. The smoothing algorithm will obtain the maximum a posteriori estimate of the relative poses between the spacecraft by using all available sensor information in the spacecraft formation. This algorithm will be portable between different satellite platforms that possess different sensor suites and computational capabilities, and will be adaptable in the case that one or more satellites in the formation become inoperable. It will obtain a solution that approaches an exact solution, as opposed to one with the linearization approximations typical of filtering algorithms. Thus, the algorithms developed and demonstrated as part of this program will enhance the applicability of small spacecraft to multi-platform operations, such as precisely aligned constellations and fractionated satellite systems.

  4. Annual and average estimates of water-budget components based on hydrograph separation and PRISM precipitation for gaged basins in the Appalachian Plateaus Region, 1900-2011

    USGS Publications Warehouse

    Nelms, David L.; Messinger, Terence; McCoy, Kurt J.

    2015-07-14

    As part of the U.S. Geological Survey’s Groundwater Resources Program study of the Appalachian Plateaus aquifers, annual and average estimates of water-budget components based on hydrograph separation and precipitation data from the parameter-elevation regressions on independent slopes model (PRISM) were determined at 849 continuous-record streamflow-gaging stations from Mississippi to New York, covering the period 1900 to 2011. Only complete calendar years (January to December) of streamflow record at each gage were used to determine estimates of base flow, which is that part of streamflow attributed to groundwater discharge; such estimates can serve as a proxy for annual recharge. For each year, estimates of annual base flow, runoff, and base-flow index were determined using computer programs—PART, HYSEP, and BFI—that automate the separation procedures. These streamflow-hydrograph analysis methods are provided with version 1.0 of the U.S. Geological Survey Groundwater Toolbox, a new program that provides graphing, mapping, and analysis capabilities in a Windows environment. Annual values of precipitation were estimated by calculating the average of cell values intercepted by basin boundaries as previously defined in the GAGES–II dataset. Estimates of annual evapotranspiration were then calculated from the difference between precipitation and streamflow.
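    The base-flow index computed by programs such as BFI can be illustrated with a toy fixed-interval separation in the spirit of HYSEP. The ten-day synthetic series and five-day window below are invented for the example; the real programs tie their interval widths to drainage area:

```python
# Toy hydrograph separation: take the minimum daily flow in each N-day
# window as base flow, then compute the base-flow index (BFI) as
# total base flow / total streamflow.
def fixed_interval_baseflow(flow, interval=5):
    baseflow = []
    for start in range(0, len(flow), interval):
        block = flow[start:start + interval]
        baseflow.extend([min(block)] * len(block))
    return baseflow

streamflow = [10, 12, 30, 18, 11, 10, 9, 25, 14, 10]  # synthetic daily flows
baseflow = fixed_interval_baseflow(streamflow)
bfi = sum(baseflow) / sum(streamflow)
print(f"BFI ~ {bfi:.2f}")
```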

  5. Photometric redshift estimation based on data mining with PhotoRApToR

    NASA Astrophysics Data System (ADS)

    Cavuoti, S.; Brescia, M.; De Stefano, V.; Longo, G.

    2015-03-01

    Photometric redshifts (photo-z) are crucial to the scientific exploitation of modern panchromatic digital surveys. In this paper we present PhotoRApToR (Photometric Research Application To Redshift): a Java/C++ based desktop application capable of solving non-linear regression and multi-variate classification problems, specialized in particular for photo-z estimation. It embeds a machine learning algorithm, namely a multi-layer neural network trained by the Quasi Newton learning rule, and special tools dedicated to pre- and post-processing of data. PhotoRApToR has been successfully tested on several scientific cases. The application is available for free download from the DAME Program web site.
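    PhotoRApToR itself uses a multi-layer neural network; purely to illustrate the regression task (photometric magnitudes in, redshift out), here is a k-nearest-neighbour stand-in on invented data:

```python
def knn_photoz(train_mags, train_z, query, k=3):
    # Average the redshifts of the k training galaxies closest in
    # magnitude space (squared Euclidean distance).
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(mags, query)), z)
        for mags, z in zip(train_mags, train_z)
    )
    return sum(z for _, z in dists[:k]) / k

# Invented training set: (mag1, mag2) photometry with known spectroscopic z.
train_mags = [(20.1, 19.5), (21.0, 20.2), (22.4, 21.8), (20.3, 19.6)]
train_z = [0.10, 0.25, 0.60, 0.12]

z_est = knn_photoz(train_mags, train_z, query=(20.2, 19.5))
print(f"photo-z estimate ~ {z_est:.3f}")
```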

  6. Final report on the development of the geographic position locator (GPL). Volume 12. Data reduction A3FIX: subroutine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niven, W.A.

    The long-term position accuracy of an inertial navigation system depends primarily on the ability of the gyroscopes to maintain a near-perfect reference orientation. Small imperfections in the gyroscopes cause them to drift slowly away from their initial orientation, thereby producing errors in the system's calculations of position. A3FIX is a computer program subroutine developed to estimate inertial navigation system gyro drift rates with the navigator stopped or moving slowly. It processes data on the navigation system's position error to arrive at estimates of the north-south and vertical gyro drift rates. It also computes changes in the east-west gyro drift rate if the navigator is stopped and if data on the system's azimuth error changes are also available. The report describes the subroutine and its capabilities, and gives examples of gyro drift rate estimates that were computed during the testing of a high-quality inertial system under the PASSPORT program at the Lawrence Livermore Laboratory. The appendices provide mathematical derivations of the estimation equations used in the subroutine, a discussion of the estimation errors, and a program listing and flow diagram. The appendices also contain a derivation of closed-form solutions to the navigation equations to clarify the effects that motion and time-varying drift rates induce in the phase-plane relationships between the Schuler-filtered errors in latitude and azimuth and between the Schuler-filtered errors in latitude and longitude.
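    The core idea is recoverable from the abstract: with the navigator stopped, a constant gyro drift shows up as a roughly linear ramp in position error, so the drift rate can be estimated as a least-squares slope. The sketch below ignores the Schuler dynamics for simplicity and uses synthetic data:

```python
# Estimate gyro drift as the least-squares slope of position error vs. time.
def least_squares_slope(times, errors):
    n = len(times)
    mx = sum(times) / n
    my = sum(errors) / n
    return (sum((t - mx) * (e - my) for t, e in zip(times, errors))
            / sum((t - mx) ** 2 for t in times))

hours = [0.0, 1.0, 2.0, 3.0, 4.0]
position_error_nm = [0.000, 0.051, 0.098, 0.152, 0.201]  # nautical miles, synthetic

drift = least_squares_slope(hours, position_error_nm)
print(f"drift ~ {drift:.3f} nm/h")
```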

  7. DoD High Performance Computing Modernization Program FY16 Annual Report

    DTIC Science & Technology

    2018-05-02

    vortex shedding from rotor blade tips using adaptive mesh refinement gives Helios the unique capability to assess the interaction of these vortices...with the fuselage and nearby rotor blades. Helios provides all the benefits for rotary-winged aircraft that Kestrel does for fixed-wing aircraft...rotor blade upgrade of the CH-47F Chinook helicopter to achieve up to an estimated 2,000 pounds increase in hover thrust (~10%) with limited

  8. NASA SMD Airborne Science Capabilities for Development and Testing of New Instruments

    NASA Technical Reports Server (NTRS)

    Fladeland, Matthew

    2015-01-01

    The SMD NASA Airborne Science Program operates and maintains a fleet of highly modified aircraft to support instrument development, satellite instrument calibration, data product validation and earth science process studies. This poster will provide an overview of aircraft available to NASA researchers including performance specifications and modifications for instrument support, processes for requesting aircraft time and developing cost estimates for proposals, and policies and procedures required to ensure safety of flight.

  9. A Conceptual Methodology for Assessing Acquisition Requirements Robustness against Technology Uncertainties

    NASA Astrophysics Data System (ADS)

    Chou, Shuo-Ju

    2011-12-01

    In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies and the noise variables are tactics and scenarios that will be associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and program development budget and schedule requirements. Unfortunately, the Technology Readiness Assessment (TRA) process currently used by acquisition program managers and decision-makers to measure technology uncertainty during critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe the uncertainties of program technology elements can only provide a qualitative and nondescript estimation of those uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. 
This process should provide acquisition program managers and decision-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties and subsequent impacts on capability, budget, and schedule requirements resulted in the conclusion that an analysis process that coupled a probabilistic analysis technique such as Monte Carlo Simulations with quantitative and parametric models of technology performance impact and technology development time and cost requirements would allow the probabilities of meeting specific constraints of these requirements to be established. These probabilities of requirements success metrics can then be used as a quantitative and probabilistic measure of program requirements robustness against technology uncertainties. Combined with a Multi-Objective Genetic Algorithm optimization process and computer-based Decision Support System, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition as well as formulation of program development and risk management strategies. To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques to provide a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties. 
    In order to demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of this method was performed on a notional program for acquiring the Carrier-based Suppression of Enemy Air Defenses (SEAD) capability using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights regarding the benefits and inner workings of this methodology, as well as its limitations, which should be addressed in the future to narrow the gap between the current state and the desired state.
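    The Monte Carlo coupling described above can be sketched minimally: sample technology cost outcomes from assumed distributions, roll them up into a program cost, and report the fraction of samples meeting a budget cap. The distributions and the $500M cap below are illustrative assumptions, not values from the thesis:

```python
import random

# Probability-of-success metric: the fraction of Monte Carlo samples in
# which total program development cost stays within the budget constraint.
random.seed(1)
BUDGET_CAP = 500.0  # $M, hypothetical

def sampled_program_cost():
    # Triangular(low, high, mode) outcomes for two immature technologies,
    # plus a normally distributed integration cost. All values invented.
    tech_a = random.triangular(100, 200, 130)
    tech_b = random.triangular(150, 350, 220)
    integration = random.gauss(60, 10)
    return tech_a + tech_b + integration

n = 20_000
successes = sum(sampled_program_cost() <= BUDGET_CAP for _ in range(n))
p = successes / n
print(f"P(cost within budget) ~ {p:.2f}")
```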

  10. Consensus Prediction of Charged Single Alpha-Helices with CSAHserver.

    PubMed

    Dudola, Dániel; Tóth, Gábor; Nyitray, László; Gáspári, Zoltán

    2017-01-01

    Charged single alpha-helices (CSAHs) constitute a rare structural motif characterized by a high density of regularly alternating residues with positively and negatively charged side chains. Such segments exhibit unique structural properties; however, there are only a handful of proteins in which their existence has been experimentally verified. Therefore, establishing a pipeline that is capable of predicting the presence of CSAH segments with a low false positive rate is of considerable importance. Here we describe a consensus-based approach that relies on two conceptually different CSAH detection methods and a final filter based on the estimated helix-forming capabilities of the segments. This pipeline has been shown to be capable of identifying previously uncharacterized CSAH segments that could be verified experimentally. The method is available as a web server at http://csahserver.itk.ppke.hu and also as a downloadable standalone program suitable for scanning larger sequence collections.
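    One ingredient of CSAH detection, the high density of charged residues, can be sketched with a sliding-window scan. The window width, threshold, and sequence below are invented; CSAHserver's two real detection methods and its helix-propensity filter are considerably more sophisticated:

```python
# Flag sequence windows dense in charged residues (D, E, K, R).
CHARGE = {"D": -1, "E": -1, "K": +1, "R": +1}

def charged_fraction(window):
    return sum(1 for aa in window if aa in CHARGE) / len(window)

def scan(seq, width=10, threshold=0.8):
    # Return start positions of windows meeting the charge-density cutoff.
    return [i for i in range(len(seq) - width + 1)
            if charged_fraction(seq[i:i + width]) >= threshold]

seq = "MAEEKRKEEKRREEKAGGSPLV"  # invented sequence with a charged run
print(scan(seq))
```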

  11. Development of a New Data Tool for Computing Launch and Landing Availability with Respect to Surface Weather

    NASA Technical Reports Server (NTRS)

    Burns, K. Lee; Altino, Karen

    2008-01-01

    The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Their existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability given an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to provide the Space Shuttle program the ability to estimate the impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing site weather to produce landing availability, and concurrent analysis of multiple sites, to assist in operational landing site selection. In addition, the Constellation program has also expressed interest in the APRA tool, and has requested several additional capabilities to address some Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun for a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the currently existing tool.
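    APRA-style availability is, at bottom, a climatological exceedance probability: the fraction of historical observations violating a constraint, subtracted from one. A minimal sketch with invented wind data:

```python
# Availability = 1 - empirical probability of exceeding the constraint limit.
def availability(observations, limit):
    exceed = sum(1 for v in observations if v > limit)
    return 1.0 - exceed / len(observations)

# Invented climatology: peak ground winds (knots) for a set of January days.
january_peak_winds = [18, 22, 31, 15, 27, 35, 19, 24, 29, 21]

print(f"availability at a 30-knot limit: {availability(january_peak_winds, 30):.2f}")
```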

  12. Astrometric Telescope Facility preliminary systems definition study. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Sobeck, Charlie

    1987-01-01

    The Astrometric Telescope Facility (ATF) is a spaceborne observatory proposed for use on the Space Station (SS) as an Initial Operating Capability (IOC) payload. The primary objective of the ATF will be the search for extrasolar planetary systems and a detailed investigation of any discovered systems. In addition, it will have the capability of conducting other astrophysics investigations; e.g., measuring precise distances and motions of stars within our galaxy. The purposes of the study were to: (1) define mission and system requirements; (2) define a strawman system concept for the facility at the Prephase A level; (3) define the need for additional trade studies or technology development; and (4) estimate program cost for the strawman concept. It has been assumed for the study that the ATF will be a SS payload, will use a SS-provided Coarse Pointing System (CPS), will meet SS constraints, and will make maximum use of existing flight qualified designs or designs to be qualified by the SS program for general SS use.

  13. The Probabilities of Unique Events

    PubMed Central

    Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224
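    The theory's signature prediction is easy to state computationally: naive conjunction estimates split the difference between the conjuncts, which violates the rule that P(A and B) can be no greater than min(P(A), P(B)). A sketch:

```python
def split_difference(p_a, p_b):
    # The paper's predicted (fallacious) intuitive estimate of P(A and B).
    return (p_a + p_b) / 2

p_a, p_b = 0.9, 0.2
estimate = split_difference(p_a, p_b)
# A coherent estimate could never exceed the less probable conjunct.
violation = estimate > min(p_a, p_b)
print(round(estimate, 2), violation)  # → 0.55 True
```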

  14. Libya, Algeria and Egypt: crude oil potential from known deposits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dietzman, W.D.; Rafidi, N.R.; Ross, T.A.

    1982-04-01

    An analysis is presented of the discovered crude oil resources, reserves, and estimated annual production from known fields of the Republics of Libya, Algeria, and Egypt. Proved reserves are defined as the remaining producible oil as of a specified date under operating practice in effect at that time and include estimated recoverable oil in undrilled portions of a given structure or structures. Also included in the proved reserve category are the estimated indicated additional volumes of recoverable oil from the entire oil reservoir where fluid injection programs have been started in a portion, or portions, of the reservoir. The indicated additional reserves (probable reserves) reported herein are the volumes of crude oil that might be obtained with the installation of secondary recovery or pressure maintenance operations in reservoirs where none have been previously installed. The sum of cumulative production, proved reserves, and probable reserves is defined as the ultimate oil recovery from known deposits; and resources are defined as the original oil in place (OOIP). An assessment was made of the availability of crude oil under three assumed sustained production rates for each country; an assessment was also made of each country's capability of sustaining production at, or near, the 1980 rates assuming different limiting reserve-to-production ratios. Also included is an estimate of the potential maximum producing capability from known deposits that might be obtained from known accumulations under certain assumptions, using a simple time series approach. The theoretical maximum oil production capability from known fields at any time is the maximum deliverability rate assuming there are no equipment, investment, market, or political constraints.
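    The reserve-to-production screening used in the assessment reduces to simple arithmetic. The reserves and production rate below are invented for illustration, not the report's country figures:

```python
# R/P ratio: years of production sustainable at a given rate, and the
# maximum rate consistent with a limiting R/P ratio.
reserves = 22_000.0          # million barrels, proved + probable (invented)
production_rate = 1.8 * 365  # million barrels per year (1.8 Mb/d, invented)

rp_ratio = reserves / production_rate          # years at the current rate
max_rate_at_rp10 = reserves / 10 / 365         # Mb/d sustainable at R/P = 10

print(f"R/P ~ {rp_ratio:.1f} years; max rate at R/P=10 ~ {max_rate_at_rp10:.1f} Mb/d")
```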

  15. Reaction Mechanism Generator: Automatic construction of chemical kinetic mechanisms

    DOE PAGES

    Gao, Connie W.; Allen, Joshua W.; Green, William H.; ...

    2016-02-24

    Reaction Mechanism Generator (RMG) constructs kinetic models composed of elementary chemical reaction steps using a general understanding of how molecules react. Species thermochemistry is estimated through Benson group additivity and reaction rate coefficients are estimated using a database of known rate rules and reaction templates. At its core, RMG relies on two fundamental data structures: graphs and trees. Graphs are used to represent chemical structures, and trees are used to represent thermodynamic and kinetic data. Models are generated using a rate-based algorithm which excludes species from the model based on reaction fluxes. RMG can generate reaction mechanisms for species involving carbon, hydrogen, oxygen, sulfur, and nitrogen. It also has capabilities for estimating transport and solvation properties, and it automatically computes pressure-dependent rate coefficients and identifies chemically-activated reaction paths. RMG is an object-oriented program written in Python, which provides a stable, robust programming architecture for developing an extensible and modular code base with a large suite of unit tests. Computationally intensive functions are cythonized for speed improvements.
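    Benson group additivity, which RMG uses for thermochemistry, sums per-group contributions to estimate a molecule's properties. A sketch with rough literature-style values for two alkane groups (the numbers are illustrative, not RMG's database values):

```python
# Estimate the standard enthalpy of formation as the sum of Benson group
# contributions. Group values are approximate illustrative numbers.
GROUP_VALUES = {             # kcal/mol
    "C-(C)(H)3": -10.2,      # primary carbon (CH3 bonded to one C)
    "C-(C)2(H)2": -4.9,      # secondary carbon (CH2 bonded to two C)
}

def enthalpy_of_formation(groups):
    return sum(GROUP_VALUES[g] for g in groups)

# n-butane decomposes into two terminal CH3 groups and two CH2 groups.
n_butane = ["C-(C)(H)3", "C-(C)2(H)2", "C-(C)2(H)2", "C-(C)(H)3"]
print(f"dHf(n-butane) ~ {enthalpy_of_formation(n_butane):.1f} kcal/mol")
```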

  16. Development of the Spacecraft Materials Selector Expert System

    NASA Technical Reports Server (NTRS)

    Pippin, H. G.

    2000-01-01

    A specific knowledge base to evaluate the on-orbit performance of selected materials on spacecraft is being developed under contract to the NASA SEE program. An artificial intelligence software package, the Boeing Expert System Tool (BEST), contains an inference engine used to operate knowledge bases constructed to selectively recall and distribute information about materials performance in space applications. This same system is used to make estimates of the environmental exposures expected for a given space flight. The performance capabilities of the Spacecraft Materials Selector (SMS) knowledge base are described in this paper. A case history for a planned flight experiment on ISS is shown as an example of the use of the SMS, and capabilities and limitations of the knowledge base are discussed.

  17. MSFC crack growth analysis computer program, version 2 (users manual)

    NASA Technical Reports Server (NTRS)

    Creager, M.

    1976-01-01

    An updated version of the George C. Marshall Space Flight Center Crack Growth Analysis Program is described. The updated computer program has significantly expanded capabilities over the original one. This increased capability includes an extensive expansion of the library of stress intensity factors, plotting capability, increased design iteration capability, and the capability of performing proof test logic analysis. The technical approaches used within the computer program are presented, and the input and output formats and options are described. Details of the stress intensity equations, example data, and example problems are presented.
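    Crack-growth programs of this kind typically integrate a growth law such as the Paris relation da/dN = C(ΔK)^m. The abstract does not specify the MSFC program's formulation, so the sketch below uses a generic edge-crack geometry with illustrative material constants:

```python
import math

def cycles_to_grow(a0, af, dsigma, C=1e-11, m=3.0, Y=1.12, da=1e-5):
    # Integrate da/dN = C * (dK)^m with dK = Y * dsigma * sqrt(pi * a),
    # stepping the crack length a in small increments da.
    a, cycles = a0, 0.0
    while a < af:
        dK = Y * dsigma * math.sqrt(math.pi * a)  # MPa*sqrt(m)
        cycles += da / (C * dK ** m)              # dN = da / (C dK^m)
        a += da
    return cycles

# Grow a 1 mm edge crack to 10 mm under a 100 MPa stress range (invented).
cycles = cycles_to_grow(a0=0.001, af=0.01, dsigma=100.0)
print(f"estimated life ~ {cycles:.0f} cycles")
```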

  18. Enhancing Web applications in radiology with Java: estimating MR imaging relaxation times.

    PubMed

    Dagher, A P; Fitzpatrick, M; Flanders, A E; Eng, J

    1998-01-01

    Java is a relatively new programming language that has been used to develop a World Wide Web-based tool for estimating magnetic resonance (MR) imaging relaxation times, thereby demonstrating how Java may be used for Web-based radiology applications beyond improving the user interface of teaching files. A standard processing algorithm coded with Java is downloaded along with the hypertext markup language (HTML) document. The user (client) selects the desired pulse sequence and inputs data obtained from a region of interest on the MR images. The algorithm is used to modify selected MR imaging parameters in an equation that models the phenomenon being evaluated. MR imaging relaxation times are estimated, and confidence intervals and a P value expressing the accuracy of the final results are calculated. Design features such as simplicity, object-oriented programming, and security restrictions allow Java to expand the capabilities of HTML by offering a more versatile user interface that includes dynamic annotations and graphics. Java also allows the client to perform more sophisticated information processing and computation than is usually associated with Web applications. Java is likely to become a standard programming option, and the development of stand-alone Java applications may become more common as Java is integrated into future versions of computer operating systems.
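    The kind of relaxation-time fit the applet performs can be sketched for T2: multi-echo signal intensities follow S = S0 exp(-TE/T2), so a log-linear least-squares fit recovers T2. The echo times and signals below are synthetic (generated with T2 = 80 ms), not data from the article:

```python
import math

def estimate_t2(te_values, signals):
    # ln(S) = ln(S0) - TE/T2, so the least-squares slope of ln(S) vs. TE
    # is -1/T2.
    n = len(te_values)
    ys = [math.log(s) for s in signals]
    mean_x = sum(te_values) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(te_values, ys))
             / sum((x - mean_x) ** 2 for x in te_values))
    return -1.0 / slope

te = [20.0, 40.0, 60.0, 80.0]                       # echo times, ms
signal = [1000.0 * math.exp(-t / 80.0) for t in te]  # synthetic, T2 = 80 ms

print(f"estimated T2 ~ {estimate_t2(te, signal):.1f} ms")  # → 80.0
```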

  19. Computer program for analysis of high speed, single row, angular contact, spherical roller bearing, SASHBEAN. Volume 1: User's guide

    NASA Technical Reports Server (NTRS)

    Aggarwal, Arun K.

    1993-01-01

    The computer program SASHBEAN (Sikorsky Aircraft Spherical Roller High Speed Bearing Analysis) analyzes and predicts the operating characteristics of a Single Row, Angular Contact, Spherical Roller Bearing (SRACSRB). The program runs on an IBM or IBM-compatible personal computer and, for a given set of input data, analyzes the bearing design for its ring deflections (axial and radial), roller deflections, contact areas and stresses, induced axial thrust, rolling element and cage rotation speeds, lubrication parameters, fatigue lives, and amount of heat generated in the bearing. The dynamic loading of rollers due to centrifugal forces and gyroscopic moments, which becomes quite significant at high speeds, is fully considered in this analysis. For a known application and its parameters, the program is also capable of performing steady-state and time-transient thermal analyses of the bearing system. The steady-state analysis capability allows the user to estimate the expected steady-state temperature map in and around the bearing under normal operating conditions. The transient analysis feature, on the other hand, provides the user a means to simulate the 'lost lubricant' condition and predict a time-temperature history of various critical points in the system. The bearing's 'time-to-failure' estimate may also be made from this (transient) analysis by considering the bearing as failed when a certain temperature limit is reached in the bearing components. The program is fully interactive and allows the user to get started and access most of its features with a minimum of training. For the most part, the program is menu driven, and adequate help messages are provided to guide a new user through the various menu options and data input screens. All input data, both for mechanical and thermal analyses, are read through graphical input screens, thereby eliminating any need for a separate text editor/word processor to edit or create data files. Provision is also available to select and view the contents of output files on the monitor screen if no paper printouts are required. A separate volume (Volume 2) of this documentation describes in detail the underlying mathematical formulations, assumptions, and solution algorithms of this program.
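    The "lost lubricant" transient reduces, in the simplest possible model, to a single lumped thermal node heated at a constant rate and cooled by convection; time-to-failure is when the node crosses a limit temperature. All parameter values below are illustrative, not SASHBEAN's:

```python
# Explicit-Euler integration of dT/dt = (Q_in - h*A*(T - T_amb)) / (m*cp)
# until the node reaches the failure temperature.
def time_to_limit(q_in, h_a, m_cp, t_amb, t_limit, dt=1.0):
    t, temp = 0.0, t_amb
    while temp < t_limit:
        temp += dt * (q_in - h_a * (temp - t_amb)) / m_cp
        t += dt
        if t > 1e6:
            return None  # steady state is below the limit; never reached
    return t

# Invented parameters: 500 W heat input, h*A = 2 W/K, m*cp = 900 J/K.
seconds = time_to_limit(q_in=500.0, h_a=2.0, m_cp=900.0,
                        t_amb=20.0, t_limit=180.0)
print(f"time to 180 C ~ {seconds:.0f} s")
```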

  20. Joint Precision Approach and Landing System Increment 1A (JPALS Inc 1A)

    DTIC Science & Technology

    2015-12-01

    Selected Acquisition Report (SAR) RCS: DD-A&T(Q&A)823-238 Joint Precision Approach and Landing System Increment 1A (JPALS Inc 1A) As of FY 2017...President’s Budget Defense Acquisition Management Information Retrieval (DAMIR) March 10, 2016 11:30:56 UNCLASSIFIED JPALS Inc 1A December 2015 SAR...Fiscal Year FYDP - Future Years Defense Program ICE - Independent Cost Estimate IOC - Initial Operational Capability Inc - Increment JROC - Joint

  1. Department of the Army Justification of Estimates for Fiscal Year 1983 Submitted to Congress February 1982. Part 5 (Other), Part 6 (National Guard Equipment) and Part 7 (PEMA).

    DTIC Science & Technology

    1982-02-01

    JUSTIFICATION OF ESTIMATES: The major categories of vehicles included in the request are Tactical Vehicles - $1039.7 million for dolly sets, trailers, semitrailers and trucks...will provide a hybrid analog and digital capability to support the transition to the future systems. The program provides for automated operation

  2. Software engineering project management - A state-of-the-art report

    NASA Technical Reports Server (NTRS)

    Thayer, R. H.; Lehman, J. H.

    1977-01-01

    The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.

  3. Capabilities and applications of the Program to Optimize Simulated Trajectories (POST). Program summary document

    NASA Technical Reports Server (NTRS)

    Brauer, G. L.; Cornick, D. E.; Stevenson, R.

    1977-01-01

The capabilities and applications of the three-degree-of-freedom (3DOF) version and the six-degree-of-freedom (6DOF) version of the Program to Optimize Simulated Trajectories (POST) are summarized. The document supplements the detailed program manuals by providing additional information that motivates and clarifies basic capabilities, input procedures, applications and computer requirements of these programs. The information will enable prospective users to evaluate the programs, and to determine if they are applicable to their problems. Enough information is given to enable managerial personnel to evaluate the capabilities of the programs, and the document describes the POST structure, formulation, input and output procedures, sample cases, and computer requirements. The report also provides answers to basic questions concerning planet and vehicle modeling, simulation accuracy, optimization capabilities, and general input rules. Several sample cases are presented.

  4. Preliminary performance estimates of a highly maneuverable remotely piloted vehicle. [computerized synthesis program to assess effects of vehicle and mission parameters

    NASA Technical Reports Server (NTRS)

    Nelms, W. P., Jr.; Axelson, J. A.

    1974-01-01

    A computerized synthesis program has been used to assess the effects of various vehicle and mission parameters on the performance of a highly maneuverable remotely piloted vehicle (RPV) for the air-to-air combat role. The configuration used in the study is a trapezoidal-wing and body concept, with forward-mounted stabilizing and control surfaces. The study mission consists of an outbound cruise, an acceleration phase, a series of subsonic and supersonic turns, and a return cruise. Performance is evaluated in terms of both the required vehicle weight to accomplish this mission and combat effectiveness as measured by turning and acceleration capability. The report describes the synthesis program, the mission, the vehicle, and the results of sensitivity and trade studies.

  5. Rotordynamics on the PC: Further Capabilities of ARDS

    NASA Technical Reports Server (NTRS)

    Fleming, David P.

    1997-01-01

Rotordynamics codes for personal computers are now becoming available. One of the most capable codes is Analysis of RotorDynamic Systems (ARDS) which uses the component mode synthesis method to analyze a system of up to 5 rotating shafts. ARDS was originally written for a mainframe computer but has been successfully ported to a PC; its basic capabilities for steady-state and transient analysis were reported in an earlier paper. Additional functions have now been added to the PC version of ARDS. These functions include: 1) Estimation of the peak response following blade loss without resorting to a full transient analysis; 2) Calculation of response sensitivity to input parameters; 3) Formulation of optimum rotor and damper designs to place critical speeds in desirable ranges or minimize bearing loads; 4) Production of Poincaré plots so the presence of chaotic motion can be ascertained. ARDS produces printed and plotted output. The executable code uses the full array sizes of the mainframe version and fits on a high density floppy disc. Examples of all program capabilities are presented and discussed.

  6. Graphical workstation capability for reliability modeling

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Koppen, Sandra V.; Haley, Pamela J.

    1992-01-01

In addition to computational capabilities, software tools for estimating the reliability of fault-tolerant digital computer systems must also provide a means of interfacing with the user. Described here is the new graphical interface capability of the hybrid automated reliability predictor (HARP), a software package that implements advanced reliability modeling techniques. The graphics oriented (GO) module provides the user with a graphical language for modeling system failure modes through the selection of various fault-tree gates, including sequence-dependency gates, or by a Markov chain. By using this graphical input language, a fault tree becomes a convenient notation for describing a system. In accounting for any sequence dependencies, HARP converts the fault-tree notation to a complex stochastic process that is reduced to a Markov chain, which it can then solve for system reliability. The graphics capability is available for use on an IBM-compatible PC, a Sun, and a VAX workstation. The GO module is written in the C programming language and uses the Graphical Kernel System (GKS) standard for graphics implementation. The PC, VAX, and Sun versions of the HARP GO module are currently in beta-testing stages.
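The fault-tree-to-Markov reduction that HARP performs can be illustrated with a minimal sketch (not HARP itself; the system and failure rates are invented): a two-component parallel system fails only when both components fail, which reduces to a three-state Markov chain whose transient solution is a matrix exponential.

```python
# Sketch of the fault-tree-to-Markov idea (not the HARP code): a two-
# component parallel system reduced to a 3-state Markov chain, solved
# with a matrix exponential. The failure rate is an assumed value.
import numpy as np
from scipy.linalg import expm

lam = 1e-3          # per-hour failure rate of each component (assumed)
# States: 0 = both up, 1 = one up, 2 = system failed (absorbing)
Q = np.array([[-2*lam, 2*lam,  0.0],
              [  0.0,  -lam,   lam],
              [  0.0,   0.0,   0.0]])

t = 1000.0                       # mission time in hours
p0 = np.array([1.0, 0.0, 0.0])   # start with both components up
p_t = p0 @ expm(Q * t)
reliability = 1.0 - p_t[2]       # probability the system survives to t
```

For this simple chain the closed form is R(t) = 2e^(-lam*t) - e^(-2*lam*t), which the matrix exponential reproduces; HARP's value is doing the same reduction automatically for far larger fault trees with sequence dependencies.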

  7. Satellite services system program plan

    NASA Technical Reports Server (NTRS)

    Hoffman, Stephen J.

    1985-01-01

The purpose is to determine the potential for servicing from the Space Shuttle Orbiter and to assess NASA's role as the catalyst in bringing about routine on-orbit servicing. Specifically this study seeks to determine what requirements, in terms of both funds and time, are needed to make the Shuttle Orbiter not only a transporter of spacecraft but a servicing vehicle for those spacecraft as well. The scope of this effort is to focus on the near term development of a generic servicing capability. To make this capability truly generic and attractive requires that the customer's point of view be taken and transformed into a widely usable set of hardware. Achieving this capability in the near term also requires minimal reliance on advanced technology. With this background and scope, this study will proceed through three general phases to arrive at the desired program costs and schedule. The first step will be to determine the servicing requirements of the user community. This will provide the basis for the second phase, which is to develop hardware concepts to meet these needs. Finally, a cost estimate will be made for each of the new hardware concepts and a phased hardware development plan will be established for the acquisition of these items based on the inputs obtained from the user community.

  8. Current and future technology in radial and axial gas turbines

    NASA Technical Reports Server (NTRS)

    Rohlik, H. E.

    1983-01-01

    Design approaches and flow analysis techniques currently employed by aircraft engine manufacturers are assessed. Studies were performed to define the characteristics of aircraft and engines for civil missions of the 1990's and beyond. These studies, coupled with experience in recent years, identified the critical technologies needed to meet long range goals in fuel economy and other operating costs. Study results, recent and current research and development programs, and an estimate of future design and analytic capabilities are discussed.

  9. Air Force Operational Medicine: Using the Enterprise Estimating Supplies Program to Develop Materiel Solutions for the Operational Requirements of the Air Force Ophthalmology Augmentation Team (FFEYE)

    DTIC Science & Technology

    2010-10-14

non-battle injuries, and illnesses. International Classification of Diseases, Ninth Revision (ICD-9) coded patient conditions, selected by the...for a range of surgical and non-surgical injuries and illnesses, typically seen and treated by an ophthalmologist and one technician working 12-hour...receive them. The "Equipment/supplies" column identifies the items needed to complete the "Insert endo-trach tube" task at that level of capability. Not

  10. A Framework for Dimensioning VDL-2 Air-Ground Networks

    NASA Technical Reports Server (NTRS)

    Ribeiro, Leila Z.; Monticone, Leone C.; Snow, Richard E.; Box, Frank; Apaza, Rafel; Bretmersky, Steven

    2014-01-01

This paper describes a framework developed at MITRE for dimensioning a Very High Frequency (VHF) Digital Link Mode 2 (VDL-2) Air-to-Ground network. This framework was developed to support the FAA's Data Communications (Data Comm) program by providing estimates of expected capacity required for the air-ground network services that will support Controller-Pilot-Data-Link Communications (CPDLC), as well as the spectrum needed to operate the system at required levels of performance. The Data Comm program is part of the FAA's NextGen initiative to implement advanced communication capabilities in the National Airspace System (NAS). The first component of the framework is the radio-frequency (RF) coverage design for the network ground stations. Then we proceed to describe the approach used to assess the aircraft geographical distribution and the data traffic demand expected in the network. The next step is the resource allocation utilizing optimization algorithms developed in MITRE's Spectrum Prospector™ tool to propose frequency assignment solutions, and a NASA-developed VDL-2 tool to perform simulations and determine whether a proposed plan meets the desired performance requirements. The framework presented is capable of providing quantitative estimates of multiple variables related to the air-ground network, in order to satisfy established coverage, capacity and latency performance requirements. Outputs include: coverage provided at different altitudes; data capacity required in the network, aggregated or on a per ground station basis; spectrum (pool of frequencies) needed for the system to meet a target performance; optimized frequency plan for a given scenario; expected performance given spectrum available; and, estimates of throughput distributions for a given scenario. We conclude with a discussion aimed at providing insight into the tradeoffs and challenges identified with respect to radio resource management for VDL-2 air-ground networks.

  11. Economic benefit of fertility control in wild horse populations

    USGS Publications Warehouse

    Bartholow, J.

    2007-01-01

I projected costs for several contraceptive treatments that could be used by the Bureau of Land Management (BLM) to manage 4 wild horse (Equus caballus) populations. Potential management alternatives included existing roundup and selective removal methods combined with contraceptives of different duration and effectiveness. I projected costs for a 20-year economic life using the WinEquus wild horse population model and state-by-state cost estimates reflecting BLM's operational expenses. Findings revealed that 1) currently available 2-year contraceptives in most situations are capable of reducing variable operating costs by 15%, 2) experimental 3-year contraceptives may be capable of reducing costs by 18%, and 3) combining contraceptives with modest changes to herd sex ratio (e.g., 55-60% M) could trim costs by 30%. Predicted savings can increase when contraception is applied in conjunction with a removal policy that targets horses aged 0-4 years instead of 0-5 years. However, reductions in herd size result in greater variation in annual operating expenses. Because the horse program's variable operating costs make up about half of the total program costs (which include other fixed costs), contraceptive application and management can only reduce total costs by 14%, saving about $6.1 million per year. None of the contraceptive options I examined eliminated the need for long-term holding facilities over the 20-year period simulated, but the number of horses held may be reduced by about 17% with contraceptive treatment. Cost estimates were most sensitive to the oldest age adoptable and per-day holding costs. The BLM will experience significant cost savings as carefully designed contraceptive programs become widespread in the wild horse herds it manages.

  12. A Web-Based System for Bayesian Benchmark Dose Estimation.

    PubMed

    Shao, Kan; Shapiro, Andrew J

    2018-01-11

    Benchmark dose (BMD) modeling is an important step in human health risk assessment and is used as the default approach to identify the point of departure for risk assessment. A probabilistic framework for dose-response assessment has been proposed and advocated by various institutions and organizations; therefore, a reliable tool is needed to provide distributional estimates for BMD and other important quantities in dose-response assessment. We developed an online system for Bayesian BMD (BBMD) estimation and compared results from this software with U.S. Environmental Protection Agency's (EPA's) Benchmark Dose Software (BMDS). The system is built on a Bayesian framework featuring the application of Markov chain Monte Carlo (MCMC) sampling for model parameter estimation and BMD calculation, which makes the BBMD system fundamentally different from the currently prevailing BMD software packages. In addition to estimating the traditional BMDs for dichotomous and continuous data, the developed system is also capable of computing model-averaged BMD estimates. A total of 518 dichotomous and 108 continuous data sets extracted from the U.S. EPA's Integrated Risk Information System (IRIS) database (and similar databases) were used as testing data to compare the estimates from the BBMD and BMDS programs. The results suggest that the BBMD system may outperform the BMDS program in a number of aspects, including fewer failed BMD and BMDL calculations and estimates. The BBMD system is a useful alternative tool for estimating BMD with additional functionalities for BMD analysis based on most recent research. Most importantly, the BBMD has the potential to incorporate prior information to make dose-response modeling more reliable and can provide distributional estimates for important quantities in dose-response assessment, which greatly facilitates the current trend for probabilistic risk assessment. https://doi.org/10.1289/EHP1289.
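The abstract's core idea, using MCMC sampling to obtain a distribution (rather than a point estimate) for the benchmark dose, can be sketched in miniature. This is not the BBMD system; the dose-response data, the one-parameter quantal-linear model P(d) = 1 - exp(-b*d), and the exponential prior are all invented for illustration.

```python
# Minimal sketch of Bayesian BMD estimation via Metropolis MCMC
# (illustrative only; data, model, and prior are hypothetical).
import numpy as np

dose  = np.array([0.0, 10.0, 50.0, 100.0])   # dose groups
n     = np.array([50, 50, 50, 50])           # animals per group
cases = np.array([0, 5, 20, 33])             # responders (invented)

def log_post(b):
    if b <= 0:
        return -np.inf
    p = np.clip(1.0 - np.exp(-b * dose), 1e-12, 1 - 1e-12)
    loglik = np.sum(cases * np.log(p) + (n - cases) * np.log(1 - p))
    return loglik - b                        # Exponential(1) prior on b

rng = np.random.default_rng(1)
b, lp, samples = 0.01, log_post(0.01), []
for _ in range(20000):                       # random-walk Metropolis
    prop = b + rng.normal(0, 0.002)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        b, lp = prop, lp_prop
    samples.append(b)
samples = np.array(samples[5000:])           # discard burn-in

# BMD for 10% extra risk: solve 1 - exp(-b*BMD) = 0.1 for each draw,
# giving a full posterior distribution rather than a single value.
bmd = -np.log(0.9) / samples
bmd_median, bmdl = np.median(bmd), np.percentile(bmd, 5)
```

The 5th percentile of the posterior BMD distribution plays the role of a BMDL, which is the kind of distributional output the abstract contrasts with traditional point-estimate BMD software.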

  13. Post-licensure rapid immunization safety monitoring program (PRISM) data characterization.

    PubMed

    Baker, Meghan A; Nguyen, Michael; Cole, David V; Lee, Grace M; Lieu, Tracy A

    2013-12-30

    The Post-Licensure Rapid Immunization Safety Monitoring (PRISM) program is the immunization safety monitoring component of FDA's Mini-Sentinel project, a program to actively monitor the safety of medical products using electronic health information. FDA sought to assess the surveillance capabilities of this large claims-based distributed database for vaccine safety surveillance by characterizing the underlying data. We characterized data available on vaccine exposures in PRISM, estimated how much additional data was gained by matching with select state and local immunization registries, and compared vaccination coverage estimates based on PRISM data with other available data sources. We generated rates of computerized codes representing potential health outcomes relevant to vaccine safety monitoring. Standardized algorithms including ICD-9 codes, number of codes required, exclusion criteria and location of the encounter were used to obtain the background rates. The majority of the vaccines routinely administered to infants, children, adolescents and adults were well captured by claims data. Immunization registry data in up to seven states comprised between 5% and 9% of data for all vaccine categories with the exception of 10% for hepatitis B and 3% and 4% for rotavirus and zoster respectively. Vaccination coverage estimates based on PRISM's computerized data were similar to but lower than coverage estimates from the National Immunization Survey and Healthcare Effectiveness Data and Information Set. For the 25 health outcomes of interest studied, the rates of potential outcomes based on ICD-9 codes were generally higher than rates described in the literature, which are typically clinically confirmed cases. PRISM program's data on vaccine exposures and health outcomes appear complete enough to support robust safety monitoring. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. SPHERES National Lab Facility

    NASA Technical Reports Server (NTRS)

    Benavides, Jose

    2014-01-01

SPHERES is a facility of the ISS National Laboratory with three IVA nano-satellites designed and delivered by MIT to research estimation, control, and autonomy algorithms. Since Fall 2010, the SPHERES system has been operationally supported and managed by NASA Ames Research Center (ARC). A SPHERES Program Office was established and is located at NASA Ames Research Center. The SPHERES Program Office coordinates all SPHERES related research and STEM activities on-board the International Space Station (ISS), as well as current and future payload development. By working aboard ISS under crew supervision, it provides a risk tolerant test-bed environment for distributed satellite free-flying control algorithms. If anything goes wrong, reset and try again! NASA has made the capability available to other U.S. government agencies, schools, commercial companies and students to expand the pool of ideas for how to test and use these bowling ball-sized droids. For many of the researchers, SPHERES offers the only opportunity to do affordable on-orbit characterization of their technology in the microgravity environment. Future utilization of SPHERES as a facility will grow its capabilities as a platform for science, technology development, and education.

  15. NASA evolution of exploration architectures

    NASA Technical Reports Server (NTRS)

    Roberts, Barney B.

    1991-01-01

A series of charts and diagrams is used to provide a detailed overview of the evolution of NASA space exploration architectures. The pre-Apollo programs including the Wernher von Braun feasibility study are discussed and the evolution of the Apollo program itself is treated in detail. The post-Apollo era is reviewed and attention is given to the resurgence of strategic planning exemplified by both ad hoc and formal efforts at planning. Results of NASA's study of the main elements of the Space Exploration Initiative which examined technical scenarios, science opportunities, required technologies, international considerations, institutional strengths and needs, and resource estimates are presented. The 90-day study concludes that, among other things, major investments in challenging technologies are required, the scientific opportunities provided by the program are considerable, current launch capabilities are inadequate, and Space Station Freedom is essential.

  16. Assessment of environments for Mars Science Laboratory entry, descent, and surface operations

    USGS Publications Warehouse

    Vasavada, Ashwin R.; Chen, Allen; Barnes, Jeffrey R.; Burkhart, P. Daniel; Cantor, Bruce A.; Dwyer-Cianciolo, Alicia M.; Fergason, Robini L.; Hinson, David P.; Justh, Hilary L.; Kass, David M.; Lewis, Stephen R.; Mischna, Michael A.; Murphy, James R.; Rafkin, Scot C.R.; Tyler, Daniel; Withers, Paul G.

    2012-01-01

    The Mars Science Laboratory mission aims to land a car-sized rover on Mars' surface and operate it for at least one Mars year in order to assess whether its field area was ever capable of supporting microbial life. Here we describe the approach used to identify, characterize, and assess environmental risks to the landing and rover surface operations. Novel entry, descent, and landing approaches will be used to accurately deliver the 900-kg rover, including the ability to sense and "fly out" deviations from a best-estimate atmospheric state. A joint engineering and science team developed methods to estimate the range of potential atmospheric states at the time of arrival and to quantitatively assess the spacecraft's performance and risk given its particular sensitivities to atmospheric conditions. Numerical models are used to calculate the atmospheric parameters, with observations used to define model cases, tune model parameters, and validate results. This joint program has resulted in a spacecraft capable of accessing, with minimal risk, the four finalist sites chosen for their scientific merit. The capability to operate the landed rover over the latitude range of candidate landing sites, and for all seasons, was verified against an analysis of surface environmental conditions described here. These results, from orbital and model data sets, also drive engineering simulations of the rover's thermal state that are used to plan surface operations.

  17. Dual Mission Scenarios for the Human Lunar Campaign - Performance, Cost and Risk Benefits

    NASA Technical Reports Server (NTRS)

    Saucillo, Rudolph J.; Reeves, David M.; Chrone, Jonathan D.; Stromgren, Chel; Reeves, John D.; North, David D.

    2008-01-01

Scenarios for human lunar operations with capabilities significantly beyond Constellation Program baseline missions are potentially feasible based on the concept of dual, sequential missions utilizing a common crew and a single Ares I/CEV (Crew Exploration Vehicle). For example, scenarios possible within the scope of baseline technology planning include outpost-based sortie missions and dual sortie missions. Top level cost benefits of these dual sortie scenarios may be estimated by comparison to the Constellation Program reference two-mission-per-year lunar campaign. The primary cost benefit is the accomplishment of Mission B with a "single launch solution" since no Ares I launch is required. Cumulative risk to the crew is lowered since crew exposure to launch risks and Earth return risks is reduced versus comparable Constellation Program reference two-mission-per-year scenarios. Payload-to-the-lunar-surface capability is substantially increased in the Mission B sortie as a result of additional propellant available for Lunar Lander #2 descent. This additional propellant is a result of EDS #2 transferring a smaller stack through trans-lunar injection and using remaining propellant to perform a portion of the lunar orbit insertion (LOI) maneuver. This paper describes these dual mission concepts, including cost, risk and performance benefits per lunar sortie site, and provides an initial feasibility assessment.

  18. YF-17/ADEN system study

    NASA Technical Reports Server (NTRS)

    Gowadia, N. S.; Bard, W. D.; Wooten, W. H.

    1979-01-01

The YF-17 aircraft was evaluated as a candidate nonaxisymmetric nozzle flight demonstrator. Configuration design modifications, control system design, flight performance assessment, and program plan and cost are summarized. Two aircraft configurations were studied. The first was modified as required to install only the augmented deflector exhaust nozzle (ADEN). The second one added a canard installation to take advantage of the full (up to 20 deg) nozzle vectoring capability. Results indicate that: (1) the program is feasible and can be accomplished at reasonable cost and low risk; (2) installation of ADEN increases the aircraft weight by 600 kg (1325 lb); (3) the control system can be modified to accomplish direct lift, pointing capability, variable static margin and deceleration modes of operation; (4) unvectored thrust-minus-drag is similar to the baseline YF-17; and (5) vectoring does not improve maneuvering performance. However, some potential benefits in direct lift, aircraft pointing, handling at low dynamic pressure and takeoff/landing ground roll are available. A 27 month program with 12 months of flight test is envisioned, with the cost estimated to be $15.9 million for the canard equipped aircraft and $13.2 million for the version without canard. The feasibility of adding a thrust reverser to the YF-17/ADEN was investigated.

  19. Overview of ASC Capability Computing System Governance Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott W.

    This document contains a description of the Advanced Simulation and Computing Program's Capability Computing System Governance Model. Objectives of the Governance Model are to ensure that the capability system resources are allocated on a priority-driven basis according to the Program requirements; and to utilize ASC Capability Systems for the large capability jobs for which they were designed and procured.

  20. Practical input optimization for aircraft parameter estimation experiments. Ph.D. Thesis, 1990

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1993-01-01

The object of this research was to develop an algorithm for the design of practical, optimal flight test inputs for aircraft parameter estimation experiments. A general, single pass technique was developed which allows global optimization of the flight test input design for parameter estimation using the principles of dynamic programming with the input forms limited to square waves only. Provision was made for practical constraints on the input, including amplitude constraints, control system dynamics, and selected input frequency range exclusions. In addition, the input design was accomplished while imposing output amplitude constraints required by model validity and considerations of safety during the flight test. The algorithm has multiple input design capability, with optional inclusion of a constraint allowing only one control to move at a time, so that a human pilot can implement the inputs. It is shown that the technique can be used to design experiments for estimation of open loop model parameters from closed loop flight test data. The report includes a new formulation of the optimal input design problem, a description of a new approach to the solution, and a summary of the characteristics of the algorithm, followed by three example applications of the new technique which demonstrate the quality and expanded capabilities of the input designs produced by the new technique. In all cases, the new input design approach showed significant improvement over previous input design methods in terms of achievable parameter accuracies.
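The flavor of the square-wave input design problem can be shown with a much-simplified sketch. This is not the thesis's dynamic-programming algorithm: it brute-forces a small set of square-wave patterns for a hypothetical first-order model x' = a*x + u, maximizing a scalar Fisher information for the pole a while enforcing an output amplitude constraint, which mirrors the constrained-optimization structure the abstract describes.

```python
# Simplified illustration (brute force, not dynamic programming):
# choose the square-wave input maximizing Fisher information for the
# pole of x' = a*x + u, subject to an output amplitude constraint.
# Model, rates, and constraint values are invented for illustration.
import itertools
import numpy as np

a_true, dt, n_steps = -1.0, 0.1, 40
best_info, best_pattern = -np.inf, None

for pattern in itertools.product([-1.0, 1.0], repeat=8):  # 8 segments
    u = np.repeat(pattern, n_steps // 8)
    x, s, info, ok = 0.0, 0.0, 0.0, True
    for uk in u:
        # Euler integration of the state x and the sensitivity s = dx/da
        s += dt * (a_true * s + x)
        x += dt * (a_true * x + uk)
        if abs(x) > 1.5:          # output amplitude constraint
            ok = False
            break
        info += s * s             # scalar Fisher information (unit noise)
    if ok and info > best_info:
        best_info, best_pattern = info, pattern
```

The dynamic-programming formulation in the thesis avoids this exponential enumeration by building the optimal switching sequence stage by stage, which is what makes realistic multi-input designs tractable.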

  1. Incorporating Alternative Care Site Characteristics Into Estimates of Substitutable ED Visits.

    PubMed

    Trueger, Nathan Seth; Chua, Kao-Ping; Hussain, Aamir; Liferidge, Aisha T; Pitts, Stephen R; Pines, Jesse M

    2017-07-01

    Several recent efforts to improve health care value have focused on reducing emergency department (ED) visits that potentially could be treated in alternative care sites (ie, primary care offices, retail clinics, and urgent care centers). Estimates of the number of these visits may depend on assumptions regarding the operating hours and functional capabilities of alternative care sites. However, methods to account for the variability in these characteristics have not been developed. To develop methods to incorporate the variability in alternative care site characteristics into estimates of ED visit "substitutability." Our approach uses the range of hours and capabilities among alternative care sites to estimate lower and upper bounds of ED visit substitutability. We constructed "basic" and "extended" criteria that captured the plausible degree of variation in each site's hours and capabilities. To illustrate our approach, we analyzed data from 22,697 ED visits by adults in the 2011 National Hospital Ambulatory Medical Care Survey, defining a visit as substitutable if it was treat-and-release and met both the operating hours and functional capabilities criteria. Use of the combined basic hours/basic capabilities criteria and extended hours/extended capabilities generated lower and upper bounds of estimates. Our criteria classified 5.5%-27.1%, 7.6%-20.4%, and 10.6%-46.0% of visits as substitutable in primary care offices, retail clinics, and urgent care centers, respectively. Alternative care sites vary widely in operating hours and functional capabilities. Methods such as ours may help incorporate this variability into estimates of ED visit substitutability.
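The bounding method can be sketched with a toy example. The visit records, operating-hours windows, and capability flags below are all invented; the point is only the mechanic of classifying each treat-and-release visit under a strict ("basic") and a permissive ("extended") criteria set to obtain lower and upper bounds on substitutability.

```python
# Toy sketch of the lower/upper-bound classification (hypothetical
# visits and criteria; not the NHAMCS analysis itself).
visits = [
    {"hour": 10, "needs_imaging": False, "treat_and_release": True},
    {"hour": 22, "needs_imaging": False, "treat_and_release": True},
    {"hour": 14, "needs_imaging": True,  "treat_and_release": True},
    {"hour": 11, "needs_imaging": False, "treat_and_release": False},
]

def substitutable(v, open_hours, allow_imaging):
    # Substitutable only if treat-and-release, within operating hours,
    # and within the site's functional capabilities.
    return (v["treat_and_release"]
            and open_hours[0] <= v["hour"] < open_hours[1]
            and (allow_imaging or not v["needs_imaging"]))

# Basic: 9am-5pm, no imaging.  Extended: 7am-11pm, imaging on site.
lower = sum(substitutable(v, (9, 17), False) for v in visits) / len(visits)
upper = sum(substitutable(v, (7, 23), True) for v in visits) / len(visits)
```

Applying both criteria sets to the same visits yields an interval estimate, which is how the study reports ranges such as 5.5%-27.1% for primary care offices.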

  2. Do older adults perceive postural constraints for reach estimation?

    PubMed

    Cordova, Alberto; Gabbard, Carl

    2014-01-01

BACKGROUND/STUDY CONTEXT: Recent evidence indicates that older persons have difficulty mentally representing intended movements. Furthermore, in an estimation of reach paradigm using motor imagery, a form of mental representation, older persons significantly overestimated their ability compared with young adults. The authors tested the notion that older adults may also have difficulty perceiving the postural constraints associated with reach estimation. The authors compared young (Mage = 22 years) and older (Mage = 67 years) adults on reach estimation while seated and in a more posturally demanding standing and leaning forward position. The expectation was a significant postural effect with the standing condition, as evidenced by reduced overestimation. Whereas there was no difference between groups in the seated condition (both overestimated), older adults underestimated whereas the younger group once again overestimated in the standing condition. From one perspective, these results show that older adults do perceive postural constraints in light of their own physical capabilities. That is, that group perceived greater postural demands with the standing posture and elected to program a more conservative strategy, resulting in underestimation.

  3. 1r2dinv: A finite-difference model for inverse analysis of two dimensional linear or radial groundwater flow

    USGS Publications Warehouse

    Bohling, Geoffrey C.; Butler, J.J.

    2001-01-01

We have developed a program for inverse analysis of two-dimensional linear or radial groundwater flow problems. The program, 1r2dinv, uses standard finite difference techniques to solve the groundwater flow equation for a horizontal or vertical plane with heterogeneous properties. In radial mode, the program simulates flow to a well in a vertical plane, transforming the radial flow equation into an equivalent problem in Cartesian coordinates. The physical parameters in the model are horizontal or x-direction hydraulic conductivity, anisotropy ratio (vertical to horizontal conductivity in a vertical model, y-direction to x-direction in a horizontal model), and specific storage. The program allows the user to specify arbitrary and independent zonations of these three parameters and also to specify which zonal parameter values are known and which are unknown. The Levenberg-Marquardt algorithm is used to estimate parameters from observed head values. Particularly powerful features of the program are the ability to perform simultaneous analysis of heads from different tests and the inclusion of the wellbore in the radial mode. These capabilities allow the program to be used for analysis of suites of well tests, such as multilevel slug tests or pumping tests in a tomographic format. The combination of information from tests stressing different vertical levels in an aquifer provides the means for accurately estimating vertical variations in conductivity, a factor profoundly influencing contaminant transport in the subsurface. © 2001 Elsevier Science Ltd. All rights reserved.
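The Levenberg-Marquardt step at the heart of 1r2dinv can be illustrated in miniature. This sketch is not the 1r2dinv code: it fits a hydraulic conductivity K and boundary head h0 for a hypothetical one-dimensional Darcy flow model h(x) = h0 - q*x/K from noisy synthetic head observations, using SciPy's LM implementation in place of the program's finite-difference forward model.

```python
# Toy Levenberg-Marquardt head-data inversion (illustrative only;
# the 1-D model, discharge q, and "true" parameters are invented).
import numpy as np
from scipy.optimize import least_squares

q = 0.5                                   # specific discharge (assumed known)
x = np.linspace(0.0, 100.0, 11)           # observation locations (m)
K_true, h0_true = 10.0, 50.0              # parameters to be recovered
rng = np.random.default_rng(0)
h_obs = h0_true - q * x / K_true + rng.normal(0.0, 0.01, x.size)

def residuals(p):
    # Misfit between modeled and observed heads for trial (K, h0)
    K, h0 = p
    return (h0 - q * x / K) - h_obs

fit = least_squares(residuals, x0=[1.0, 40.0], method="lm")
K_est, h0_est = fit.x
```

In the real program the residual function is a 2-D finite-difference flow solve with zoned parameters, and heads from several tests can be stacked into one residual vector, which is what enables the tomographic analyses the abstract describes.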

  4. The Spacecraft Materials Selector: An Artificial Intelligence System for Preliminary Design Trade Studies, Materials Assessments, and Estimates of Environments Present

    NASA Technical Reports Server (NTRS)

    Pippin, H. G.; Woll, S. L. B.

    2000-01-01

    Institutions need ways to retain valuable information even as experienced individuals leave an organization. Modern electronic systems have enough capacity to retain large quantities of information that can mitigate the loss of experience. Performance information for long-term space applications is relatively scarce and specific information (typically held by a few individuals within a single project) is often rather narrowly distributed. Spacecraft operate under severe conditions and the consequences of hardware and/or system failures, in terms of cost, loss of information, and time required to replace the loss, are extreme. These risk factors place a premium on appropriate choice of materials and components for space applications. An expert system is a very cost-effective method for sharing valuable and scarce information about spacecraft performance. Boeing has an artificial intelligence software package, called the Boeing Expert System Tool (BEST), to construct and operate knowledge bases to selectively recall and distribute information about specific subjects. A specific knowledge base to evaluate the on-orbit performance of selected materials on spacecraft has been developed under contract to the NASA SEE program. The performance capabilities of the Spacecraft Materials Selector (SMS) knowledge base are described. The knowledge base is a backward-chaining, rule-based system. The user answers a sequence of questions, and the expert system provides estimates of optical and mechanical performance of selected materials under specific environmental conditions. The initial operating capability of the system will include data for Kapton, silverized Teflon, selected paints, silicone-based materials, and certain metals. 
For situations where a mission profile (launch date, orbital parameters, mission duration, spacecraft orientation) is not precisely defined, the knowledge base still attempts to provide qualitative observations about materials performance and likely exposures. Prior to the NASA contract, a knowledge base, the Spacecraft Environments Assistant (SEA), was initially developed by Boeing to estimate the environmental factors important for a specific spacecraft mission profile. The NASA SEE program has funded specific enhancements to the capability of this knowledge base. The SEA qualitatively identifies over 25 environmental factors that may influence the performance of a spacecraft during its operational lifetime. For cases where sufficiently detailed answers are provided to questions asked by the knowledge base, atomic oxygen fluence levels, proton and/or electron fluence and dose levels, and solar exposure hours are calculated. The SMS knowledge base incorporates the previously developed SEA knowledge base. A case history for a previous flight experiment will be shown as an example, and capabilities and limitations of the system will be discussed.
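The SMS knowledge base is described above as a backward-chaining, rule-based system: the engine starts from a goal and recursively checks whether the facts required by some rule can themselves be established. A minimal sketch of that inference style in Python (all rule and fact names are invented for illustration, not drawn from BEST or SMS):

```python
# Minimal backward-chaining inference sketch. Each rule maps a goal fact
# to lists of conditions that would establish it; facts the user has
# "answered" are given up front. Rule and fact names are hypothetical.

RULES = {
    "atomic_oxygen_risk": [["low_earth_orbit", "ram_facing_surface"]],
    "recommend_protective_coating": [["atomic_oxygen_risk", "material_is_kapton"]],
}

def prove(goal, facts, rules=RULES):
    """Try to establish `goal` from known facts, recursing through rules."""
    if goal in facts:
        return True
    for conditions in rules.get(goal, []):
        if all(prove(c, facts, rules) for c in conditions):
            return True
    return False

user_answers = {"low_earth_orbit", "ram_facing_surface", "material_is_kapton"}
print(prove("recommend_protective_coating", user_answers))  # True
```

Backward chaining suits this kind of expert system because only the questions relevant to the user's goal ever need to be asked.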

  5. Research requirements for emergency power to permit hover-one-engine-inoperative helicopter operation

    NASA Technical Reports Server (NTRS)

    Yost, J. H.

    1976-01-01

    The research and technology demonstration requirements to achieve emergency-power capability for a civil helicopter are documented. The goal for emergency power is the ability to hover with one engine inoperative, transition to minimum-power forward flight, and continue to a safe landing where emergency power may or may not be required. The best method to obtain emergency power is to augment the basic engine power by increasing the engine's speed and turbine-inlet temperature, combined with water-alcohol injection at the engine inlet. Other methods, including turbine boost power and flywheel energy, offer potential for obtaining emergency power for minimum time durations. Costs and schedules are estimated for a research and development program to bring emergency power through a hardware-demonstration test. Interaction of engine emergency-power capability with other helicopter systems is examined.

  6. Development of a New VLBI Data Analysis Software

    NASA Technical Reports Server (NTRS)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software and formulate its main goals. The software should be flexible and modular, so that it can implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and of production quality for processing standard VLBI sessions. It also needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  7. Studies of aerodynamic technology for VSTOL fighter/attack aircraft

    NASA Technical Reports Server (NTRS)

    Nelms, W. P.

    1978-01-01

The paper summarizes several studies to develop aerodynamic technology for high-performance VSTOL aircraft anticipated after 1990. A contracted study jointly sponsored by NASA-Ames and the David Taylor Naval Ship Research and Development Center is emphasized. Four contractors analyzed two vertical-attitude and three horizontal-attitude takeoff and landing concepts with gross weights ranging from about 10,433 kg (23,000 lb) to 17,236 kg (38,000 lb). The aircraft have supersonic capability, high maneuver performance (sustained load factor of 6.2 at Mach 0.6, 3048 m (10,000 ft)), and a 4536-kg (10,000-lb) STO overload capability. The contractors estimated the aerodynamics and identified the aerodynamic uncertainties associated with their concepts. Example uncertainties relate to propulsion-induced flows, canard-wing interactions, and top inlets. Wind-tunnel research programs were proposed to investigate these uncertainties.

  8. A break-even analysis of a community rehabilitation falls prevention service.

    PubMed

    Comans, Tracy; Brauer, Sandy; Haines, Terry

    2009-06-01

To identify and compare the minimum number of clients that a multidisciplinary falls prevention service delivered through domiciliary or centre-based care needs to treat to allow the service to reach a 'break-even' point. A break-even analysis was undertaken for each of two models of care for a multidisciplinary community rehabilitation falls prevention service. The two models comprised either a centre-based group exercise and education program or a similar program delivered individually in the client's home. The service consisted of a physiotherapist, occupational therapist and therapy assistant. The participants were adults aged over 65 years who had experienced previous falls. Costs were based on the actual cost of running a community rehabilitation team located in Brisbane. Benefits were obtained by estimating the savings gained to society from the number of falls prevented by the program, on the basis of the falls reduction rates obtained in similar multidisciplinary programs. It is estimated that a multidisciplinary community falls prevention team would need to see 57 clients per year to break even using a centre-based model of care, and 78 clients for a domiciliary-based model. The service this study was based on has the capability to see around 300 clients per year in a centre-based service or 200-250 clients per year in a home-based service. Based on the best available estimates of the costs of falls, multidisciplinary falls prevention teams in the community targeting people at high risk of falls are worth funding from a societal viewpoint.
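The break-even point reported above is, in essence, the fixed annual cost of running the service divided by the estimated societal saving per client treated. A minimal sketch of that calculation, with purely hypothetical figures (the study's actual costs and falls-reduction rates are not reproduced here):

```python
import math

def break_even_clients(annual_service_cost, falls_prevented_per_client,
                       saving_per_fall_prevented):
    """Minimum clients/year for societal savings to cover the service cost."""
    saving_per_client = falls_prevented_per_client * saving_per_fall_prevented
    return math.ceil(annual_service_cost / saving_per_client)

# Hypothetical figures for illustration only:
print(break_even_clients(annual_service_cost=250_000,
                         falls_prevented_per_client=0.6,
                         saving_per_fall_prevented=7_500))  # 56
```

The ceiling is taken because the service must see a whole number of clients; each model of care in the study differs only in its cost and effectiveness inputs.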

  9. A cooperative strategy for parameter estimation in large scale systems biology models.

    PubMed

    Villaverde, Alejandro F; Egea, Jose A; Banga, Julio R

    2012-06-22

Mathematical models play a key role in systems biology: they summarize the currently available knowledge in a way that allows experimentally verifiable predictions to be made. Model calibration consists of finding the parameters that give the best fit to a set of experimental data, which entails minimizing a cost function that measures the goodness of this fit. Most mathematical models in systems biology present three characteristics which make this problem very difficult to solve: they are highly non-linear, they have a large number of parameters to be estimated, and the information content of the available experimental data is frequently scarce. Hence, there is a need for global optimization methods capable of solving this problem efficiently. A new approach for parameter estimation of large scale models, called Cooperative Enhanced Scatter Search (CeSS), is presented. Its key feature is the cooperation between different programs ("threads") that run in parallel on different processors. Each thread implements a state-of-the-art metaheuristic, the enhanced Scatter Search algorithm (eSS). Cooperation, meaning information sharing between threads, modifies the systemic properties of the algorithm and speeds up performance. Two parameter estimation problems involving models related to the central carbon metabolism of E. coli, which include different regulatory levels (metabolic and transcriptional), are used as case studies. The performance and capabilities of the method are also evaluated using benchmark problems of large-scale global optimization, with excellent results. The cooperative CeSS strategy is a general-purpose technique that can be applied to any model calibration problem. Its capability has been demonstrated by calibrating two large-scale models of different characteristics, improving the performance of previously existing methods in both cases.
The cooperative metaheuristic presented here can be easily extended to incorporate other global and local search solvers and specific structural information for particular classes of problems.
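The cooperation idea, search threads running in parallel and periodically sharing their best solution so each can restart from the best found anywhere, can be sketched as follows. This toy version uses a simple random local search on an invented quadratic objective; the real CeSS threads run the enhanced Scatter Search metaheuristic, not this:

```python
# Toy sketch of cooperative parallel search: threads periodically publish
# their best solution to, or adopt the best solution from, a shared store.
# Objective, step sizes and sharing interval are all hypothetical.
import random
import threading

def objective(x):  # toy "cost function": minimum at x = [1, 2, 3]
    return sum((xi - t) ** 2 for xi, t in zip(x, [1.0, 2.0, 3.0]))

shared = {"best_x": [0.0, 0.0, 0.0], "best_f": objective([0.0, 0.0, 0.0])}
lock = threading.Lock()

def search_thread(seed, iters=2000, share_every=100):
    rng = random.Random(seed)
    x = list(shared["best_x"])
    for i in range(iters):
        cand = [xi + rng.gauss(0, 0.1) for xi in x]  # random local move
        if objective(cand) < objective(x):
            x = cand
        if i % share_every == 0:
            with lock:  # cooperation: publish or adopt the global best
                if objective(x) < shared["best_f"]:
                    shared["best_x"], shared["best_f"] = list(x), objective(x)
                elif shared["best_f"] < objective(x):
                    x = list(shared["best_x"])

threads = [threading.Thread(target=search_thread, args=(s,)) for s in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(round(shared["best_f"], 3))
```

As in CeSS, information sharing changes the behaviour of the ensemble: a thread stuck in a poor region is pulled toward the best solution found by any of its peers.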

  10. A cooperative strategy for parameter estimation in large scale systems biology models

    PubMed Central

    2012-01-01

Background Mathematical models play a key role in systems biology: they summarize the currently available knowledge in a way that allows experimentally verifiable predictions to be made. Model calibration consists of finding the parameters that give the best fit to a set of experimental data, which entails minimizing a cost function that measures the goodness of this fit. Most mathematical models in systems biology present three characteristics which make this problem very difficult to solve: they are highly non-linear, they have a large number of parameters to be estimated, and the information content of the available experimental data is frequently scarce. Hence, there is a need for global optimization methods capable of solving this problem efficiently. Results A new approach for parameter estimation of large scale models, called Cooperative Enhanced Scatter Search (CeSS), is presented. Its key feature is the cooperation between different programs (“threads”) that run in parallel on different processors. Each thread implements a state-of-the-art metaheuristic, the enhanced Scatter Search algorithm (eSS). Cooperation, meaning information sharing between threads, modifies the systemic properties of the algorithm and speeds up performance. Two parameter estimation problems involving models related to the central carbon metabolism of E. coli, which include different regulatory levels (metabolic and transcriptional), are used as case studies. The performance and capabilities of the method are also evaluated using benchmark problems of large-scale global optimization, with excellent results. Conclusions The cooperative CeSS strategy is a general-purpose technique that can be applied to any model calibration problem. Its capability has been demonstrated by calibrating two large-scale models of different characteristics, improving the performance of previously existing methods in both cases.
The cooperative metaheuristic presented here can be easily extended to incorporate other global and local search solvers and specific structural information for particular classes of problems. PMID:22727112

  11. Department of the Navy Supporting Data for Fiscal Year 1984 Budget Estimates Descriptive Summaries Submitted to Congress January 1983. Research, Development, Test and Evaluation, Navy. Book 1. Technology Base, Advanced Technology Development, Strategic Programs.

    DTIC Science & Technology

    1983-01-01

... (2) Assess maturity of on-going efforts and integrate appropriate development into an effective globally distributed command support... numerical techniques for nonlinear media-structure shock interaction, including effects of elastic-plastic deformation, have been developed and used to... shuttle flight; develop camera payload for SPARTAN (free flyer) flight from shuttle. Develop detailed interpretive-system capability for global ultraviolet

  12. Department of the Navy FY 1990/FY 1991 Biennial Budget Estimates. Military Construction and Family Housing Program FY 1991. Justification Data Submitted to Congress

    DTIC Science & Technology

    1989-01-01

environmental review process as indicated by the County Traffic Engineers for safe and secure transport of ordnance as well as the chosen alternative... from other appropriations: None. ... IMPACT IF NOT PROVIDED: Activity must rely on truck refuelers which are not capable of handling the demand. Time delays, logistics and safety

  13. Tactical radar technology study. Volume 1: Executive summary

    NASA Astrophysics Data System (ADS)

    Rosien, R.; Cardone, L.; Hammers, D.; Klein, A.; Nozawa, E.

    1980-03-01

    This report presents results of a study to identify new technology required to provide advanced multi-threat performance capabilities in future tactical surveillance radar designs. A baseline design with optional subsystem characteristics has been synthesized to provide both functional and operational survivability in a dynamic and hostile situation postulated for the post 1985 time frame. Comparisons have been made of available technology with that required by the new baseline design to identify new technology requirements. Recommendations are presented for critical new technology programs including estimates of technical risks, costs and required development time.

  14. Air Force Operational Medicine: Using the Enterprise Estimating Supplies Program to Develop Materiel Solutions for the Operational Requirements of the Air Force Oral Surgery Team (FFMAX)

    DTIC Science & Technology

    2010-10-14

non-battle injuries, and illnesses. International Classification of Diseases, Ninth Revision (ICD-9) coded patient conditions that have been selected... The patient stream was used to simulate the equipment and supply requirements for the range of surgical cases and non-surgical injuries and illnesses... supplies” column identifies the items needed to complete the “Insert endo-trach tube” task at that level of capability. Not shown in this figure are

  15. Summary of long-baseline systematics session at CETUP*2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cherdack, Daniel; Worcester, Elizabeth

    2015-10-15

    A session studying systematics in long-baseline neutrino oscillation physics was held July 14-18, 2014 as part of CETUP* 2014. Systematic effects from flux normalization and modeling, modeling of cross sections and nuclear interactions, and far detector effects were addressed. Experts presented the capabilities of existing and planned tools. A program of study to determine estimates of and requirements for the size of these effects was designed. This document summarizes the results of the CETUP* systematics workshop and the current status of systematic uncertainty studies in long-baseline neutrino oscillation measurements.

  16. U.S. Air Force Operational Medicine: Using the Enterprise Estimating Supplies Program to Develop Materiel Solutions for the Operational Requirements of the EMEDS Specialty Care Augmentation Team

    DTIC Science & Technology

    2011-06-28

    EXTERNA NOS 5 075 INFECTIOUS MONONUCLEOSIS 1 864.01 LIVER HEMATOMA/CONTUSION 1 928.8 MULT CRUSHING INJURY LEG 4 817.0 MULTIPLE FX HAND-CLOSED 1 782.1...medical assets since 2004. Air Force medical modeling capabilities currently capture care and treatment of the sick and injured from the first...begins with the identification of likely patient types to be encountered by a particular type of medical treatment asset, including combat wounds

  17. Scope Complexity Options Risks Excursions (SCORE) Factor Mathematical Description.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Samberson, Jonell Nicole; Shettigar, Subhasini

The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options, resulting in scores. SCORE factors extend this capability by providing estimates of complexity relative to a base system (i.e., all design options are normalized to one weapon system). First, a clearly defined set of scope elements for a warhead option is established. The complexity of each scope element is estimated by Subject Matter Experts (SMEs), including a level of uncertainty, relative to a specific reference system. When determining factors, complexity estimates for a scope element can be directly tied to the base system or chained together via comparable scope elements in a string of reference systems that ends with the base system. The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC). Historically, it has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).
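The factor-chaining step described above amounts to multiplying relative complexity estimates along a string of reference systems until the chain terminates at the base system. A minimal sketch with hypothetical system names and ratios (not real SCORE data):

```python
# Sketch of factor chaining: a scope element's complexity is estimated
# relative to some reference system, and relative estimates are multiplied
# along a chain ending at the common base system. Names and ratios are
# hypothetical.

# relative[(a, b)] = complexity of a's scope element / complexity of b's
relative = {
    ("option_X", "system_B"): 1.4,   # SME estimate vs. reference system B
    ("system_B", "base"): 0.8,       # reference system B vs. base system
}

def factor_vs_base(system, chain):
    """Multiply relative estimates along `chain` down to the base system."""
    f = 1.0
    for a, b in zip([system] + chain, chain):
        f *= relative[(a, b)]
    return f

print(factor_vs_base("option_X", ["system_B", "base"]))  # ~1.12 (1.4 * 0.8)
```

Because every chain ends at the same base system, all design options land on one common complexity scale, which is what normalization to one weapon system requires.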

  18. CHARMM: The Biomolecular Simulation Program

    PubMed Central

    Brooks, B.R.; Brooks, C.L.; MacKerell, A.D.; Nilsson, L.; Petrella, R.J.; Roux, B.; Won, Y.; Archontis, G.; Bartels, C.; Boresch, S.; Caflisch, A.; Caves, L.; Cui, Q.; Dinner, A.R.; Feig, M.; Fischer, S.; Gao, J.; Hodoscek, M.; Im, W.; Kuczera, K.; Lazaridis, T.; Ma, J.; Ovchinnikov, V.; Paci, E.; Pastor, R.W.; Post, C.B.; Pu, J.Z.; Schaefer, M.; Tidor, B.; Venable, R. M.; Woodcock, H. L.; Wu, X.; Yang, W.; York, D.M.; Karplus, M.

    2009-01-01

    CHARMM (Chemistry at HARvard Molecular Mechanics) is a highly versatile and widely used molecular simulation program. It has been developed over the last three decades with a primary focus on molecules of biological interest, including proteins, peptides, lipids, nucleic acids, carbohydrates and small molecule ligands, as they occur in solution, crystals, and membrane environments. For the study of such systems, the program provides a large suite of computational tools that include numerous conformational and path sampling methods, free energy estimators, molecular minimization, dynamics, and analysis techniques, and model-building capabilities. In addition, the CHARMM program is applicable to problems involving a much broader class of many-particle systems. Calculations with CHARMM can be performed using a number of different energy functions and models, from mixed quantum mechanical-molecular mechanical force fields, to all-atom classical potential energy functions with explicit solvent and various boundary conditions, to implicit solvent and membrane models. The program has been ported to numerous platforms in both serial and parallel architectures. This paper provides an overview of the program as it exists today with an emphasis on developments since the publication of the original CHARMM paper in 1983. PMID:19444816

  19. Direction-of-arrival estimation for co-located multiple-input multiple-output radar using structural sparsity Bayesian learning

    NASA Astrophysics Data System (ADS)

    Wen, Fang-Qing; Zhang, Gong; Ben, De

    2015-11-01

This paper addresses the direction-of-arrival (DOA) estimation problem for co-located multiple-input multiple-output (MIMO) radar with random arrays. The spatially distributed sparsity of the targets in the background makes compressive sensing (CS) desirable for DOA estimation. A spatial CS framework is presented, which links the DOA estimation problem to support recovery from a known over-complete dictionary. A modified statistical model is developed to accurately represent the intra-block correlation of the received signal. A structural sparsity Bayesian learning algorithm is proposed for the sparse recovery problem. The proposed algorithm, which exploits intra-signal correlation, is capable of being applied to limited data support and low signal-to-noise ratio (SNR) scenes. Furthermore, the proposed algorithm has a lower computational load than the classical Bayesian algorithm. Simulation results show that the proposed algorithm achieves more accurate DOA estimation than the traditional multiple signal classification (MUSIC) algorithm and other CS recovery algorithms. Project supported by the National Natural Science Foundation of China (Grant Nos. 61071163, 61271327, and 61471191), the Funding for Outstanding Doctoral Dissertation in Nanjing University of Aeronautics and Astronautics, China (Grant No. BCXJ14-08), the Funding of Innovation Program for Graduate Education of Jiangsu Province, China (Grant No. KYLX 0277), the Fundamental Research Funds for the Central Universities, China (Grant No. 3082015NP2015504), and the Priority Academic Program Development of Jiangsu Higher Education Institutions (PADA), China.
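The spatial CS framing, an array snapshot expressed as a sparse combination of steering vectors drawn from an over-complete angle dictionary, can be illustrated with a single-source, noiseless matched-dictionary sketch. This is not the paper's block-sparse Bayesian learning algorithm, and the array geometry below is hypothetical:

```python
# Compressive-sensing view of DOA estimation: the snapshot is sparse in an
# over-complete dictionary of steering vectors on an angle grid. This is a
# one-source matched-dictionary sketch, NOT the structural sparsity
# Bayesian learning algorithm of the paper.
import numpy as np

M = 16                                   # half-wavelength uniform linear array
grid_deg = np.arange(-90, 91, 1.0)       # candidate DOAs (dictionary grid)

def steering(deg):
    return np.exp(1j * np.pi * np.arange(M) * np.sin(np.deg2rad(deg)))

A = np.stack([steering(d) for d in grid_deg], axis=1)   # M x 181 dictionary

y = steering(23.0)                       # noiseless single-source snapshot

# Support recovery: pick the dictionary atom best correlated with y.
est = grid_deg[np.argmax(np.abs(A.conj().T @ y))]
print(est)  # 23.0
```

The Bayesian and block-sparse machinery in the paper addresses what this sketch glosses over: multiple correlated sources, noise, and limited snapshots.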

  20. PPP/nonreal-time trajectory program interface requirements and capabilities

    NASA Technical Reports Server (NTRS)

    Mcgavern, J. L.; Arbet, J. D.

    1975-01-01

The selection process for interfacing a nonreal-time trajectory program with the procedures and performance program is outlined; the interface provides summary data timelines for any desired trajectory profile. Consideration was given to two separate digital programs for satisfying these capabilities: the CDC 6400 digital program BANDITO and the UNIVAC 1110 SVDS program.

  1. Second AIAA/NASA USAF Symposium on Automation, Robotics and Advanced Computing for the National Space Program

    NASA Technical Reports Server (NTRS)

    Myers, Dale

    1987-01-01

    An introduction is given to NASA goals in the development of automation (expert systems) and robotics technologies in the Space Station program. Artificial intelligence (AI) has been identified as a means to lowering ground support costs. Telerobotics will enhance space assembly, servicing and repair capabilities, and will be used for an estimated half of the necessary EVA tasks. The general principles guiding NASA in the design, development, ground-testing, interactions with industry and construction of the Space Station component systems are summarized. The telerobotics program has progressed to a point where a telerobot servicer is a firm component of the first Space Station element launch, to support assembly, maintenance and servicing of the Station. The University of Wisconsin has been selected for the establishment of a Center for the Commercial Development of Space, specializing in space automation and robotics.

  2. Implementation of a Wavefront-Sensing Algorithm

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey S.; Dean, Bruce; Aronstein, David

    2013-01-01

    A computer program has been written as a unique implementation of an image-based wavefront-sensing algorithm reported in "Iterative-Transform Phase Retrieval Using Adaptive Diversity" (GSC-14879-1), NASA Tech Briefs, Vol. 31, No. 4 (April 2007), page 32. This software was originally intended for application to the James Webb Space Telescope, but is also applicable to other segmented-mirror telescopes. The software is capable of determining optical-wavefront information using, as input, a variable number of irradiance measurements collected in defocus planes about the best focal position. The software also uses input of the geometrical definition of the telescope exit pupil (otherwise denoted the pupil mask) to identify the locations of the segments of the primary telescope mirror. From the irradiance data and mask information, the software calculates an estimate of the optical wavefront (a measure of performance) of the telescope generally and across each primary mirror segment specifically. The software is capable of generating irradiance data, wavefront estimates, and basis functions for the full telescope and for each primary-mirror segment. Optionally, each of these pieces of information can be measured or computed outside of the software and incorporated during execution of the software.
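The iterative-transform idea underlying such image-based wavefront sensing can be sketched with a single-plane Gerchberg-Saxton style loop: alternately enforce the measured focal-plane amplitude and the known pupil mask. The actual software uses multiple defocus planes and adaptive diversity; all sizes and data below are synthetic:

```python
# Minimal iterative-transform (Gerchberg-Saxton style) sketch of recovering
# a pupil-plane phase from a known pupil mask and a measured focal-plane
# irradiance. Single focus plane only; synthetic data throughout.
import numpy as np

N = 64
yy, xx = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
mask = (xx**2 + yy**2 < (N // 4)**2).astype(float)   # circular "exit pupil"

rng = np.random.default_rng(0)
true_phase = 0.3 * rng.standard_normal((N, N)) * mask
measured_irradiance = np.abs(np.fft.fft2(mask * np.exp(1j * true_phase)))**2

field = mask * np.exp(1j * np.zeros((N, N)))         # flat-wavefront guess
for _ in range(200):
    focal = np.fft.fft2(field)
    # enforce the measured focal-plane amplitude, keep the current phase
    focal = np.sqrt(measured_irradiance) * np.exp(1j * np.angle(focal))
    pupil = np.fft.ifft2(focal)
    # enforce the known pupil support, keep the estimated phase
    field = mask * np.exp(1j * np.angle(pupil))

wavefront_estimate = np.angle(field) * mask          # phase map over the pupil
```

Using several defocused irradiance planes, as the software described above does, resolves the sign and twin-image ambiguities that a single-plane loop like this cannot.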

  3. The design and analysis of simple low speed flap systems with the aid of linearized theory computer programs

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.

    1985-01-01

    The purpose here is to show how two linearized theory computer programs in combination may be used for the design of low speed wing flap systems capable of high levels of aerodynamic efficiency. A fundamental premise of the study is that high levels of aerodynamic performance for flap systems can be achieved only if the flow about the wing remains predominantly attached. Based on this premise, a wing design program is used to provide idealized attached flow camber surfaces from which candidate flap systems may be derived, and, in a following step, a wing evaluation program is used to provide estimates of the aerodynamic performance of the candidate systems. Design strategies and techniques that may be employed are illustrated through a series of examples. Applicability of the numerical methods to the analysis of a representative flap system (although not a system designed by the process described here) is demonstrated in a comparison with experimental data.

  4. Perl-speaks-NONMEM (PsN)--a Perl module for NONMEM related programming.

    PubMed

    Lindbom, Lars; Ribbing, Jakob; Jonsson, E Niclas

    2004-08-01

The NONMEM program is the most widely used nonlinear regression software in population pharmacokinetic/pharmacodynamic (PK/PD) analyses. In this article we describe a programming library, Perl-speaks-NONMEM (PsN), intended for programmers who aim to use the computational capability of NONMEM in external applications. The library is object oriented and written in the programming language Perl. The classes of the library are built around NONMEM's data, model and output files. The specification of the NONMEM model is easily set or changed through the model and data file classes, while the output from a model fit is accessed through the output file class. The classes have methods that help the programmer perform common repetitive tasks, e.g. summarising the output from a NONMEM run, setting the initial estimates of a model based on a previous run, or truncating values over a certain threshold in the data file. PsN creates a basis for the development of high-level software using NONMEM as the regression tool.
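The file-wrapper pattern described above (classes around the model, data and output files, with helper methods for repetitive edits) can be illustrated in Python. The class and method names here are invented for illustration; they are not PsN's actual Perl API:

```python
# Illustration, with invented names, of the file-wrapper pattern the PsN
# library is described as using: each class owns one file's parsed content
# and exposes helpers for common edits. NOT PsN's actual API.

class OutputFile:
    def __init__(self, final_estimates):
        self.final_estimates = dict(final_estimates)

class ModelFile:
    def __init__(self, initial_estimates):
        self.initial_estimates = dict(initial_estimates)  # e.g. {"THETA1": 0.5}

    def set_initial_estimates_from(self, output):
        """Carry final estimates of a previous run over as new initials."""
        self.initial_estimates.update(output.final_estimates)

class DataFile:
    def __init__(self, rows):
        self.rows = [dict(r) for r in rows]

    def truncate(self, column, threshold):
        """Cap values in `column` at `threshold`."""
        for r in self.rows:
            r[column] = min(r[column], threshold)

model = ModelFile({"THETA1": 0.5})
model.set_initial_estimates_from(OutputFile({"THETA1": 0.83}))
data = DataFile([{"DV": 120.0}, {"DV": 40.0}])
data.truncate("DV", 100.0)
print(model.initial_estimates["THETA1"], data.rows[0]["DV"])  # 0.83 100.0
```

The point of the design is the same as PsN's: scripts manipulate parsed objects and their helper methods rather than editing NONMEM control streams by hand.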

  5. Computer program to perform cost and weight analysis of transport aircraft. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A digital computer program for evaluating the weight and costs of advanced transport designs was developed. The resultant program, intended for use at the preliminary design level, incorporates both batch mode and interactive graphics run capability. The basis of the weight and cost estimation method developed is a unique way of predicting the physical design of each detail part of a vehicle structure at a time when only configuration concept drawings are available. In addition, the technique relies on methods to predict the precise manufacturing processes and the associated material required to produce each detail part. Weight data are generated in four areas of the program. Overall vehicle system weights are derived on a statistical basis as part of the vehicle sizing process. Theoretical weights, actual weights, and the weight of the raw material to be purchased are derived as part of the structural synthesis and part definition processes based on the computed part geometry.

  6. Shuttle-Derived Launch Vehicles' Capabilities: An Overview

    NASA Technical Reports Server (NTRS)

    Rothschild, William J.; Bailey, Debra A.; Henderson, Edward M.; Crumbly, Chris

    2005-01-01

    Shuttle-Derived Launch Vehicle (SDLV) concepts have been developed by a collaborative team comprising the Johnson Space Center, Marshall Space Flight Center, Kennedy Space Center, ATK-Thiokol, Lockheed Martin Space Systems Company, The Boeing Company, and United Space Alliance. The purpose of this study was to provide timely information on a full spectrum of low-risk, cost-effective options for STS-Derived Launch Vehicle concepts to support the definition of crew and cargo launch requirements for the Space Exploration Vision. Since the SDLV options use high-reliability hardware, existing facilities, and proven processes, they can provide relatively low-risk capabilities to launch extremely large payloads to low Earth orbit. This capability to reliably lift very large, high-dollar-value payloads could reduce mission operational risks by minimizing the number of complex on-orbit operations compared to architectures based on multiple smaller launchers. The SDLV options also offer several logical spiral development paths for larger exploration payloads. All of these development paths make practical and cost-effective use of existing Space Shuttle Program (SSP) hardware, infrastructure, and launch and flight operations systems. By utilizing these existing assets, the SDLV project could support the safe and orderly transition of the current SSP through the planned end of life in 2010. The SDLV concept definition work during 2004 focused on three main configuration alternatives: a side-mount heavy lifter (approximately 77 MT payload), an in-line medium lifter (approximately 22 MT Crew Exploration Vehicle payload), and an in-line heavy lifter (greater than 100 MT payload). This paper provides an overview of the configuration, performance capabilities, reliability estimates, concept of operations, and development plans for each of the various SDLV alternatives. 
While development, production, and operations costs have been estimated for each of the SDLV configuration alternatives, these proprietary data have not been included in this paper.

  7. Field Programmable Gate Array Failure Rate Estimation Guidelines for Launch Vehicle Fault Tree Models

    NASA Technical Reports Server (NTRS)

    Al Hassan, Mohammad; Britton, Paul; Hatfield, Glen Spencer; Novack, Steven D.

    2017-01-01

Today's launch vehicles' complex electronics and avionics systems make heavy use of Field Programmable Gate Array (FPGA) integrated circuits (ICs) for their superb speed and reconfiguration capabilities. Consequently, FPGAs are prevalent ICs in communication protocols such as MIL-STD-1553B and in control signal commands such as solenoid valve actuations. This paper will identify reliability concerns and high-level guidelines for estimating FPGA total failure rates in a launch vehicle application. The paper will discuss hardware, hardware description language, and radiation-induced failures. The hardware contribution of the approach accounts for physical failures of the IC. The hardware description language portion will discuss the high-level FPGA programming languages and software/code reliability growth. The radiation portion will discuss FPGA susceptibility to space environment radiation.

  8. CSpace: an integrated workplace for the graphical and algebraic analysis of phase assemblages on 32-bit wintel platforms

    NASA Astrophysics Data System (ADS)

    Torres-Roldan, Rafael L.; Garcia-Casco, Antonio; Garcia-Sanchez, Pedro A.

    2000-08-01

    CSpace is a program for the graphical and algebraic analysis of composition relations within chemical systems. The program is particularly suited to the needs of petrologists, but could also prove useful for mineralogists, geochemists and other environmental scientists. A few examples of what can be accomplished with CSpace are the mapping of compositions into some desired set of system/phase components, the estimation of reaction/mixing coefficients and assessment of phase-rule compatibility relations within or between complex mineral assemblages. The program also allows dynamic inspection of compositional relations by means of barycentric plots. CSpace provides an integrated workplace for data management, manipulation and plotting. Data management is done through a built-in spreadsheet-like editor, which also acts as a data repository for the graphical and algebraic procedures. Algebraic capabilities are provided by a mapping engine and a matrix analysis tool, both of which are based on singular-value decomposition. The mapping engine uses a general approach to linear mapping, capable of handling determined, underdetermined and overdetermined problems. The matrix analysis tool is implemented as a task "wizard" that guides the user through a number of steps to perform matrix approximation (finding nearest rank-deficient models of an input composition matrix), and inspection of null-reaction space relationships (i.e. of implicit linear relations among the elements of the composition matrix). Graphical capabilities are provided by a graph engine that directly links with the contents of the data editor. The graph engine can generate sophisticated 2-D ternary (triangular) and 3D quaternary (tetrahedral) barycentric plots and includes features such as interactive re-sizing and rotation, on-the-fly coordinate scaling and support for automated drawing of tie lines.
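Both algebraic capabilities described above reduce to standard linear algebra: mapping compositions onto a set of components is a (possibly under- or overdetermined) least-squares problem, and the nearest rank-deficient model of a composition matrix follows from truncating its singular values (the Eckart-Young result). A sketch with hypothetical data, using NumPy rather than CSpace's own engine:

```python
# (1) Map a phase composition onto system components via least squares;
# (2) find the nearest rank-deficient model of a composition matrix by
#     truncated SVD. Compositions here are hypothetical.
import numpy as np

# Columns = system components (element abundances); rows = elements.
components = np.array([[2.0, 0.0],
                       [0.0, 1.0],
                       [1.0, 1.0]])
phase = np.array([4.0, 1.0, 3.0])            # phase composition to map

coeffs, *_ = np.linalg.lstsq(components, phase, rcond=None)
print(np.round(coeffs, 6))                   # ~[2. 1.]: phase = 2*c1 + 1*c2

# Nearest rank-1 model of a composition matrix (Eckart-Young via SVD):
X = np.array([[1.0, 2.0], [2.0, 4.1]])
U, s, Vt = np.linalg.svd(X)
X_rank1 = s[0] * np.outer(U[:, 0], Vt[0])
print(np.linalg.svd(X_rank1)[1][1] < 1e-10)  # second singular value ~ 0
```

A rank-deficient composition matrix is what signals an implicit linear relation, i.e. a possible reaction, among the phases, which is why CSpace's matrix analysis wizard looks for nearby rank-deficient models.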

  9. A Lunar Surface System Supportability Technology Development Roadmap

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.; Struk, Peter M.; Taleghani, Barmac K.

    2009-01-01

    This paper discusses the establishment of a Supportability Technology Development Roadmap as a guide for developing capabilities intended to allow NASA's Constellation program to enable a supportable, sustainable and affordable exploration of the Moon and Mars. Presented is a discussion of "supportability," in terms of space facility maintenance, repair and related logistics, and a comparison of how lunar outpost supportability differs from the International Space Station. Supportability lessons learned from NASA and Department of Defense experience, and their impact on a future lunar outpost, are discussed. A supportability concept for future missions to the Moon and Mars that involves a transition from a highly logistics-dependent to a logistically independent operation is discussed. Lunar outpost supportability capability needs are summarized and a supportability technology development strategy is established. The resulting Lunar Surface Systems Supportability Strategy defines general criteria that will be used to select technologies that will enable future flight crews to act effectively to respond to problems and exploit opportunities in an environment of extreme resource scarcity and isolation. This strategy also introduces the concept of exploiting flight hardware as a supportability resource. The technology roadmap involves development of three mutually supporting technology categories: Diagnostics Test & Verification, Maintenance & Repair, and Scavenging & Recycling. The technology roadmap establishes two distinct technology types, "Embedded" and "Process" technologies, with different implementations and thus different criteria and development approaches. The supportability technology roadmap addresses the technology readiness level and estimated development schedule for technology groups, and includes down-selection decision gates that correlate with the lunar program milestones.
The resulting supportability technology roadmap is intended to develop a set of technologies with the widest possible capability and utility, with minimal impact on crew time and training, while remaining within the time and cost constraints of the Constellation program.

  10. A Lunar Surface System Supportability Technology Development Roadmap

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.; Struk, Peter M.; Taleghani, Barmac K.

    2011-01-01

    This paper discusses the establishment of a Supportability Technology Development Roadmap as a guide for developing capabilities intended to allow NASA's Constellation program to enable a supportable, sustainable and affordable exploration of the Moon and Mars. Presented is a discussion of supportability, in terms of space facility maintenance, repair and related logistics, and a comparison of how lunar outpost supportability differs from the International Space Station. Supportability lessons learned from NASA and Department of Defense experience, and their impact on a future lunar outpost, are discussed. A supportability concept for future missions to the Moon and Mars that involves a transition from a highly logistics-dependent to a logistically independent operation is discussed. Lunar outpost supportability capability needs are summarized and a supportability technology development strategy is established. The resulting Lunar Surface Systems Supportability Strategy defines general criteria that will be used to select technologies that will enable future flight crews to act effectively to respond to problems and exploit opportunities in an environment of extreme resource scarcity and isolation. This strategy also introduces the concept of exploiting flight hardware as a supportability resource. The technology roadmap involves development of three mutually supporting technology categories: Diagnostics Test and Verification, Maintenance and Repair, and Scavenging and Recycling. The technology roadmap establishes two distinct technology types, "Embedded" and "Process" technologies, with different implementations and thus different criteria and development approaches. The supportability technology roadmap addresses the technology readiness level and estimated development schedule for technology groups, and includes down-selection decision gates that correlate with the lunar program milestones.
The resulting supportability technology roadmap is intended to develop a set of technologies with the widest possible capability and utility, with minimal impact on crew time and training, while remaining within the time and cost constraints of the Constellation program.

  11. The NASA automation and robotics technology program

    NASA Technical Reports Server (NTRS)

    Holcomb, Lee B.; Montemerlo, Melvin D.

    1986-01-01

    The development and objectives of the NASA automation and robotics technology program are reviewed. The objectives of the program are to utilize AI and robotics to increase the probability of mission success; decrease the cost of ground control; and increase the capability and flexibility of space operations. There is a need for real-time computational capability; an effective man-machine interface; and techniques to validate automated systems. Current programs in the areas of sensing and perception, task planning and reasoning, control execution, operator interface, and system architecture and integration are described. Programs aimed at demonstrating the capabilities of telerobotics and system autonomy are discussed.

  12. U.S. Geological Survey groundwater toolbox, a graphical and mapping interface for analysis of hydrologic data (version 1.0): user guide for estimation of base flow, runoff, and groundwater recharge from streamflow data

    USGS Publications Warehouse

    Barlow, Paul M.; Cunningham, William L.; Zhai, Tong; Gray, Mark

    2015-01-01

    This report is a user guide for the streamflow-hydrograph analysis methods provided with version 1.0 of the U.S. Geological Survey (USGS) Groundwater Toolbox computer program. These include six hydrograph-separation methods to determine the groundwater-discharge (base-flow) and surface-runoff components of streamflow—the Base-Flow Index (BFI; Standard and Modified), HYSEP (Fixed Interval, Sliding Interval, and Local Minimum), and PART methods—and the RORA recession-curve displacement method and associated RECESS program to estimate groundwater recharge from streamflow data. The Groundwater Toolbox is a customized interface built on the nonproprietary, open source MapWindow geographic information system software. The program provides graphing, mapping, and analysis capabilities in a Microsoft Windows computing environment. In addition to these hydrograph-analysis methods, the Groundwater Toolbox allows for the retrieval of hydrologic time-series data (streamflow, groundwater levels, and precipitation) from the USGS National Water Information System, downloading of a suite of preprocessed geographic information system coverages and meteorological data from the National Oceanic and Atmospheric Administration National Climatic Data Center, and analysis of data with several preprocessing and postprocessing utilities. With its data retrieval and analysis tools, the Groundwater Toolbox provides methods to estimate many of the components of the water budget for a hydrologic basin, including precipitation; streamflow; base flow; runoff; groundwater recharge; and total, groundwater, and near-surface evapotranspiration.
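As a rough illustration of what a local-minimum hydrograph separation does, here is a toy sketch under simplifying assumptions (this is not the USGS HYSEP implementation, and the window parameter is illustrative): local minima of daily streamflow are selected as turning points, then connected by interpolation to trace the base-flow line.

```python
import numpy as np

def base_flow_local_minimum(q, n=5):
    """Toy local-minimum hydrograph separation (not USGS HYSEP code):
    mark days whose flow is the minimum of a (2n+1)-day window as
    turning points, then linearly interpolate base flow between them."""
    q = np.asarray(q, dtype=float)
    idx = [i for i in range(len(q))
           if q[i] == q[max(0, i - n):i + n + 1].min()]
    # connect the turning points; flat extrapolation at the ends
    bf = np.interp(np.arange(len(q)), idx, q[idx])
    # base flow can never exceed total streamflow
    return np.minimum(bf, q)
```

The separated base flow can then be summed over a water year to estimate the groundwater-discharge component of the basin water budget.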

  13. Vehicle Design Evaluation Program (VDEP). A computer program for weight sizing, economic, performance and mission analysis of fuel-conservative aircraft, multibodied aircraft and large cargo aircraft using both JP and alternative fuels

    NASA Technical Reports Server (NTRS)

    Oman, B. H.

    1977-01-01

    The NASA Langley Research Center vehicle design evaluation program (VDEP-2) was expanded by (1) incorporating into the program a capability to conduct preliminary design studies on subsonic commercial transport type aircraft using both JP and such alternate fuels as hydrogen and methane; (2) incorporating an aircraft detailed mission and performance analysis capability; and (3) developing and incorporating an external loads analysis capability. The resulting computer program (VDEP-3) provides a preliminary design tool that enables the user to perform integrated sizing, structural analysis, and cost studies on subsonic commercial transport aircraft. Both versions of the VDEP-3 program, designated Preliminary Analysis VDEP-3 and Detailed Analysis VDEP-3, utilize the same vehicle sizing subprogram, which includes a detailed mission analysis capability as well as a geometry and weight analysis for multibodied configurations.

  14. Commercial treatability study capabilities for application to the US Department of Energy's anticipated mixed waste streams. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-09-01

    US DOE mixed low-level and mixed transuranic waste inventory was estimated at 181,000 cubic meters (about 2,000 waste streams). Treatability studies may be used as part of DOE's mixed waste management program. Commercial treatability study suppliers have been identified that either have current capability in their own facilities or have access to licensed facilities. Numerous federal and state regulations, as well as DOE Order 5820.2A, impact the performance of treatability studies. Generators, transporters, and treatability study facilities are subject to regulation. From a mixed-waste standpoint, a key requirement is that the treatability study facility must have an NRC or state license that allows it to possess radioactive materials. From a RCRA perspective, the facility must support treatability study activities with the applicable plans, reports, and documentation. If PCBs are present in the waste, TSCA will also be an issue. CERCLA requirements may apply, and both DOE and NRC regulations will impact the transportation of DOE mixed waste to an off-site treatment facility. DOE waste managers will need to be cognizant of all applicable regulations as mixed-waste treatability study programs are initiated.

  15. New DMSP database of precipitating auroral electrons and ions

    NASA Astrophysics Data System (ADS)

    Redmon, Robert J.; Denig, William F.; Kilcommons, Liam M.; Knipp, Delores J.

    2017-08-01

    Since the mid-1970s, the Defense Meteorological Satellite Program (DMSP) spacecraft have operated instruments for monitoring the space environment from low Earth orbit. As the program evolved, so have the measurement capabilities such that modern DMSP spacecraft include a comprehensive suite of instruments providing estimates of precipitating electron and ion fluxes, cold/bulk plasma composition and moments, the geomagnetic field, and optical emissions in the far and extreme ultraviolet. We describe the creation of a new public database of precipitating electrons and ions from the Special Sensor J (SSJ) instrument, complete with original counts, calibrated differential fluxes adjusted for penetrating radiation, estimates of the total kinetic energy flux and characteristic energy, uncertainty estimates, and accurate ephemerides. These are provided in a common and self-describing format that covers 30+ years of DMSP spacecraft from F06 (launched in 1982) to F18 (launched in 2009). This new database is accessible at the National Centers for Environmental Information and the Coordinated Data Analysis Web. We describe how the new database is being applied to high-latitude studies of the colocation of kinetic and electromagnetic energy inputs, ionospheric conductivity variability, field-aligned currents, and auroral boundary identification. We anticipate that this new database will support a broad range of space science endeavors from single observatory studies to coordinated system science investigations.

  16. Exoplanet Yield Estimation for Decadal Study Concepts using EXOSIMS

    NASA Astrophysics Data System (ADS)

    Morgan, Rhonda; Lowrance, Patrick; Savransky, Dmitry; Garrett, Daniel

    2016-01-01

    The anticipated upcoming large mission study concepts for the direct imaging of exo-earths present an exciting opportunity for exoplanet discovery and characterization. While these telescope concepts would also be capable of conducting a broad range of astrophysical investigations, the most difficult technology challenges are driven by the requirements for imaging exo-earths. The exoplanet science yield for these mission concepts will drive design trades and mission concept comparisons. To assist in these trade studies, the Exoplanet Exploration Program Office (ExEP) is developing a yield estimation tool that emphasizes transparency and consistent comparison of various design concepts. The tool will provide a parametric estimate of science yield of various mission concepts using contrast curves from physics-based model codes and Monte Carlo simulations of design reference missions using realistic constraints, such as solar avoidance angles, the observatory orbit, propulsion limitations of star shades, the accessibility of candidate targets, local and background zodiacal light levels, and background confusion by stars and galaxies. The Python tool utilizes Dmitry Savransky's EXOSIMS (Exoplanet Open-Source Imaging Mission Simulator) design reference mission simulator that is being developed for the WFIRST Preliminary Science program. ExEP is extending and validating the tool for future mission concepts under consideration for the upcoming 2020 decadal review. We present a validation plan and preliminary yield results for a point design.

  17. Structural reliability methods: Code development status

    NASA Astrophysics Data System (ADS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.
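The failure probability that NESSUS/FPI approximates from structural sensitivities can be illustrated with a brute-force Monte Carlo sketch (a toy limit state with made-up parameters, not NESSUS code); fast probability integration exists precisely because direct sampling like this is far too expensive when each sample requires a full finite-element solve.

```python
import random

# Toy reliability problem: limit state g = R - S, with resistance R
# and load S both normally distributed (illustrative values).
# Failure is the event g < 0; FPI estimates P[g < 0] efficiently,
# while here we simply count failures over many random draws.
random.seed(0)
N = 200_000
failures = sum(
    1 for _ in range(N)
    if random.gauss(10.0, 1.0) - random.gauss(7.0, 1.0) < 0
)
p_fail = failures / N  # analytic value is Phi(-3/sqrt(2)) ~ 0.017
```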

  18. Structural reliability methods: Code development status

    NASA Technical Reports Server (NTRS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-01-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  19. Space station experiment definition: Long-term cryogenic fluid storage

    NASA Technical Reports Server (NTRS)

    Jetley, R. L.; Scarlotti, R. D.

    1987-01-01

    The conceptual design of a space station Technology Development Mission (TDM) experiment to demonstrate and evaluate cryogenic fluid storage and transfer technologies is presented. The experiment will be deployed on the initial operational capability (IOC) space station for a four-year duration. It is modular in design, consisting of three phases to test the following technologies: passive thermal technologies (phase 1), fluid transfer (phase 2), and active refrigeration (phase 3). Use of existing hardware was a primary consideration throughout the design effort. A conceptual design of the experiment was completed, including configuration sketches, system schematics, equipment specifications, and space station resources and interface requirements. These requirements were entered into the NASA Space Station Mission Data Base. A program plan was developed defining a twelve-year development and flight plan. Program cost estimates are given.

  20. Large Deployable Reflector (LDR) system concept and technology definition study. Volume 2: Technology assessment and technology development plan

    NASA Technical Reports Server (NTRS)

    Agnew, Donald L.; Jones, Peter A.

    1989-01-01

    A study was conducted to define reasonable and representative LDR system concepts for the purpose of defining a technology development program aimed at providing the requisite technological capability necessary to start LDR development by the end of 1991. This volume presents thirteen technology assessments and technology development plans, as well as an overview and summary of the LDR concepts. Twenty-two proposed augmentation projects are described (selected from more than 30 candidates). The five LDR technology areas most in need of supplementary support are: cryogenic cooling; astronaut assembly of the optically precise LDR in space; active segmented primary mirror; dynamic structural control; and primary mirror contamination control. Three broad, time-phased, five-year programs were synthesized from the 22 projects, scheduled, and funding requirements estimated.

  1. Impact of upgraded in vivo lung measurement capability on an internal dosimetry program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carbaugh, E.H.; Sula, M.J.; Aldridge, T.L.

    1985-08-01

    Implementation of high-purity germanium (Ge) detectors in place of sodium iodide (NaI) detectors for in vivo lung measurements of low-energy photon-emitting radionuclides resulted in significant improvement in detection capability and corresponding improvements in the monitoring of potentially exposed workers. Lung activities below those detectable with the NaI system were discovered during the first 18 months of operation. In a number of cases, these activities were estimated to represent intakes resulting in lung doses as high as 25% of the 15 rem/y United States Department of Energy Radiation Protection Standard. Evaluation of these lung activities and their associated intakes was substantially more time consuming than originally anticipated due to calibration differences between the Ge and NaI systems and to the difficulty of completing some of the follow-up investigations.

  2. Data acquisition and control system with a programmable logic controller (PLC) for a pulsed chemical oxygen-iodine laser

    NASA Astrophysics Data System (ADS)

    Yu, Haijun; Li, Guofu; Duo, Liping; Jin, Yuqi; Wang, Jian; Sang, Fengting; Kang, Yuanfu; Li, Liucheng; Wang, Yuanhu; Tang, Shukai; Yu, Hongliang

    2015-02-01

    A user-friendly data acquisition and control system (DACS) for a pulsed chemical oxygen-iodine laser (PCOIL) has been developed. It is implemented by an industrial control computer, a PLC, and a distributed input/output (I/O) module, as well as valves and transmitters. The system is capable of handling 200 analogue/digital channels for performing various operations such as on-line acquisition, display, safety measures and control of various valves. These operations are controlled either by control switches configured on a PC while not running, or by a pre-determined sequence or timings during the run. The system is capable of real-time acquisition and on-line estimation of important diagnostic parameters for optimization of a PCOIL. The DACS has been programmed using programmable logic controller (PLC) software. Using this DACS, more than 200 runs were performed successfully.

  3. Advanced application flight experiment breadboard pulse compression radar altimeter program

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Design, development and performance of the pulse compression radar altimeter is described. The high resolution breadboard system is designed to operate from an aircraft at 10 kft above the ocean and to accurately measure altitude, sea wave height and sea reflectivity. The minicomputer-controlled Ku-band system provides six basic variables and an extensive digital recording capability for experimentation purposes. Signal bandwidths of 360 MHz are obtained using a reflective array compression line. Stretch processing is used to achieve 1000:1 pulse compression. The system range command LSB is 0.62 ns, or 9.25 cm. A second order altitude tracker, aided by accelerometer inputs, is implemented in the system software. During flight tests the system demonstrated an altitude resolution capability of 2.1 cm and sea wave height estimation accuracy of 10%. The altitude measurement performance exceeds that of the Skylab and GEOS-C predecessors by approximately an order of magnitude.

  4. Whitecap coverage from aerial photography

    NASA Technical Reports Server (NTRS)

    Austin, R. W.

    1970-01-01

    A program for determining the feasibility of deriving sea surface wind speeds by remotely sensing ocean surface radiances in the nonglitter regions is discussed. With a knowledge of the duration and geographical extent of the wind field, information about the conventional sea state may be derived. The use of optical techniques for determining sea state has obvious limitations. For example, such means can be used only in daylight and only when a clear path of sight is available between the sensor and the surface. However, sensors and vehicles capable of providing the data needed for such techniques are planned for the near future; therefore, a secondary or backup capability can be provided with little added effort. The information currently being sought regarding white water coverage is also of direct interest to those working with passive microwave systems, the study of energy transfer between winds and ocean currents, the aerial estimation of wind speeds, and many others.

  5. Kalman filter for onboard state of charge estimation and peak power capability analysis of lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Dong, Guangzhong; Wei, Jingwen; Chen, Zonghai

    2016-10-01

    To evaluate the continuous and instantaneous load capability of a battery, this paper describes a joint estimator for state-of-charge (SOC) and state-of-function (SOF) of lithium-ion batteries (LIB) based on the Kalman filter (KF). The SOC is a widely used index for the remaining usable capacity of a battery. The SOF represents the peak power capability of the battery. It can be determined by real-time SOC estimation and terminal voltage prediction, which can be derived from impedance parameters. However, the open-circuit voltage (OCV) of LiFePO4 is highly nonlinear with SOC, which leads to difficulties in SOC estimation. To solve these problems, this paper proposes an onboard SOC estimation method. Firstly, a simplified linearized equivalent-circuit model is developed to simulate the dynamic characteristics of a battery, where the OCV is regarded as a linearized function of SOC. Then, the system states are estimated based on the KF. Besides, the factors that influence peak power capability are analyzed according to statistical data. Finally, the performance of the proposed methodology is demonstrated by experiments conducted on LiFePO4 LIBs under different operating currents and temperatures. Experimental results indicate that the proposed approach is suitable for battery onboard SOC and SOF estimation.
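A minimal scalar sketch of this kind of estimator (all parameter values are illustrative, not those of the paper): the OCV is linearized as a*SOC + b, the SOC is propagated by coulomb counting, and the Kalman gain corrects the prediction using the terminal-voltage measurement.

```python
# Scalar Kalman filter for SOC with a linearized OCV model
# v = a*SOC + b - R0*i (illustrative parameters, discharge positive).
a, b, R0 = 0.8, 3.2, 0.01      # OCV slope (V), offset (V), ohmic R (ohm)
Q_cap = 2.0 * 3600.0           # capacity in ampere-seconds (2 Ah)
dt = 1.0                       # sample time, s
q_proc, r_meas = 1e-7, 1e-3    # process / measurement noise variances

def kf_step(soc, P, current, v_meas):
    # predict: coulomb counting
    soc_pred = soc - current * dt / Q_cap
    P_pred = P + q_proc
    # update with the terminal-voltage measurement
    v_pred = a * soc_pred + b - R0 * current
    K = P_pred * a / (a * a * P_pred + r_meas)
    soc_new = soc_pred + K * (v_meas - v_pred)
    P_new = (1.0 - K * a) * P_pred
    return soc_new, P_new
```

Starting from a deliberately wrong initial SOC, repeated `kf_step` calls pull the estimate toward the value consistent with the measured voltage; the SOF then follows by predicting terminal voltage at the design current limits.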

  6. Huygens probe entry, descent, and landing trajectory reconstruction using the Program to Optimize Simulated Trajectories II

    NASA Astrophysics Data System (ADS)

    Striepe, Scott Allen

    The objectives of this research were to develop a reconstruction capability using the Program to Optimize Simulated Trajectories II (POST2), apply this capability to reconstruct the Huygens Titan probe entry, descent, and landing (EDL) trajectory, evaluate the newly developed POST2 reconstruction module, analyze the reconstructed trajectory, and assess the pre-flight simulation models used for Huygens EDL simulation. An extended Kalman filter (EKF) module was developed and integrated into POST2 to enable trajectory reconstruction (especially when using POST2-based mission specific simulations). Several validation cases, ranging from a single, constant parameter estimate to multivariable estimation cases similar to an actual mission flight, were executed to test the POST2 reconstruction module. Trajectory reconstruction of the Huygens entry probe at Titan was accomplished using accelerometer measurements taken during flight to adjust an estimated state (e.g., position, velocity, parachute drag, wind velocity, etc.) in a POST2-based simulation developed to support EDL analyses and design prior to entry. Although the main emphasis of the trajectory reconstruction was to evaluate models used in the NASA pre-entry trajectory simulation, the resulting reconstructed trajectory was also assessed to provide an independent evaluation of the ESA result. 
Major findings from this analysis include: altitude profiles from this analysis agree well with other NASA and ESA results but not with radar data, although a scale factor of about 0.93 would bring the radar measurements into agreement with these results; entry capsule aerodynamics predictions (axial component only) were well within the 3-sigma bounds established pre-flight for most of the entry when compared to reconstructed values; main parachute drag 9% to 19% above the ESA model was determined from the reconstructed trajectory; and, based on the tilt sensor and accelerometer data, the conclusion from this assessment was that the probe was tilted about 10 degrees during the drogue parachute phase.

  7. ESTIMATION OF INTERNAL EXPOSURE TO URANIUM WITH UNCERTAINTY FROM URINALYSIS DATA USING THE InDEP COMPUTER CODE

    PubMed Central

    Anderson, Jeri L.; Apostoaei, A. Iulian; Thomas, Brian A.

    2015-01-01

    The National Institute for Occupational Safety and Health (NIOSH) is currently studying mortality in a cohort of 6409 workers at a former uranium processing facility. As part of this study, over 220 000 urine samples were used to reconstruct organ doses due to internal exposure to uranium. Most of the available computational programs designed for analysis of bioassay data handle a single case at a time, and thus require a significant outlay of time and resources for the exposure assessment of a large cohort. NIOSH is currently supporting the development of a computer program, InDEP (Internal Dose Evaluation Program), to facilitate internal radiation exposure assessment as part of epidemiological studies of both uranium- and plutonium-exposed cohorts. A novel feature of InDEP is its batch processing capability which allows for the evaluation of multiple study subjects simultaneously. InDEP analyses bioassay data and derives intakes and organ doses with uncertainty estimates using least-squares regression techniques or using the Bayes’ Theorem as applied to internal dosimetry (Bayesian method). This paper describes the application of the current version of InDEP to formulate assumptions about the characteristics of exposure at the study facility that were used in a detailed retrospective intake and organ dose assessment of the cohort. PMID:22683620

  8. Aspect-Oriented Subprogram Synthesizes UML Sequence Diagrams

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Osborne, Richard N.

    2006-01-01

    The Rational Sequence computer program described elsewhere includes a subprogram that utilizes the capability for aspect-oriented programming when that capability is present. This subprogram is denoted the Rational Sequence (AspectJ) component because it uses AspectJ, which is an extension of the Java programming language that introduces aspect-oriented programming techniques into the language.

  9. Effect of a pharmacist-managed hypertension program on health system costs: an evaluation of the Study of Cardiovascular Risk Intervention by Pharmacists-Hypertension (SCRIP-HTN).

    PubMed

    Houle, Sherilyn K D; Chuck, Anderson W; McAlister, Finlay A; Tsuyuki, Ross T

    2012-06-01

    To quantify the potential cost savings of a community pharmacy-based hypertension management program based on the results of the Study of Cardiovascular Risk Intervention by Pharmacists-Hypertension (SCRIP-HTN) study in terms of avoided cardiovascular events (myocardial infarction, stroke, and heart failure hospitalization), and to compare these cost savings with the cost of the pharmacist intervention program. An economic model was developed to estimate the potential cost avoidance in direct health care resources from reduced cardiovascular events over a 1-year period. The SCRIP-HTN study found that patients with diabetes mellitus and hypertension who were receiving the pharmacist intervention had a greater mean reduction in systolic blood pressure of 5.6 mm Hg than patients receiving usual care. For our model, published meta-analysis data were used to compute cardiovascular event absolute risk reductions associated with a 5.6-mm Hg reduction in systolic blood pressure over 6 months. Costs/event were obtained from administrative data, and probabilistic sensitivity analyses were performed to assess the robustness of the results. Two program scenarios were evaluated: one with monthly follow-up for a total of 1 year with sustained blood pressure reduction, and the other in which pharmacist care ended after the 6-month program but the effects on systolic blood pressure diminished over time. The cost-saving results from the economic model were then compared with the costs of the program. Annual estimated cost savings (in 2011 Canadian dollars) from avoided cardiovascular events were $265/patient (95% confidence interval [CI] $63-467) if the program lasted 1 year or $221/patient (95% CI $72-371) if pharmacist care ceased after 6 months with an assumed loss of effect afterward.
Estimated pharmacist costs were $90/patient for 6 months or $150/patient for 1 year, suggesting that pharmacist-managed programs are cost saving, with the annual net total cost savings/patient estimated to be $131 for a program lasting 6 months or $115 for a program lasting 1 year. Our model found that community pharmacist interventions capable of reducing systolic blood pressure by 5.6 mm Hg within 6 months are cost saving and result in improved patient outcomes. Wider adoption of pharmacist-managed hypertension care for patients with diabetes and hypertension is encouraged. © 2012 Pharmacotherapy Publications, Inc.
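The net figures quoted above follow directly from the reported savings and program costs; a quick arithmetic check (all values taken from the abstract, in 2011 Canadian dollars per patient):

```python
# Per-patient savings from avoided events vs. pharmacist program cost.
savings_1yr, cost_1yr = 265, 150   # program continued for a full year
savings_6mo, cost_6mo = 221, 90    # pharmacist care ends at 6 months

net_1yr = savings_1yr - cost_1yr   # net saving, 1-year program: 115
net_6mo = savings_6mo - cost_6mo   # net saving, 6-month program: 131
```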

  10. Neural network fusion capabilities for efficient implementation of tracking algorithms

    NASA Astrophysics Data System (ADS)

    Sundareshan, Malur K.; Amoozegar, Farid

    1997-03-01

    The ability to efficiently fuse information of different forms to facilitate intelligent decision making is one of the major capabilities of trained multilayer neural networks that is now being recognized. While development of innovative adaptive control algorithms for nonlinear dynamical plants that attempt to exploit these capabilities seems to be more popular, a corresponding development of nonlinear estimation algorithms using these approaches, particularly for application in target surveillance and guidance operations, has not received similar attention. We describe the capabilities and functionality of neural network algorithms for data fusion and implementation of tracking filters. To discuss details and to serve as a vehicle for quantitative performance evaluations, the illustrative case of estimating the position and velocity of surveillance targets is considered. Efficient target-tracking algorithms that can utilize data from a host of sensing modalities and are capable of reliably tracking even uncooperative targets executing fast and complex maneuvers are of interest in a number of applications. The primary motivation for employing neural networks in these applications comes from the efficiency with which more features extracted from different sensor measurements can be utilized as inputs for estimating target maneuvers. A system architecture that efficiently integrates the fusion capabilities of a trained multilayer neural net with the tracking performance of a Kalman filter is described. The innovation lies in the way the fusion of multisensor data is accomplished to facilitate improved estimation without increasing the computational complexity of the dynamical state estimator itself.
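    As a conceptual sketch of the estimator side of such an architecture (not the paper's design), here is a minimal constant-velocity Kalman filter whose position measurement could be supplied by a neural-net fusion stage; the matrices and noise levels are illustrative assumptions:

```python
import numpy as np

# Constant-velocity Kalman filter; z could be a fused position measurement.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition for [position, velocity]
H = np.array([[1.0, 0.0]])              # we observe (fused) position only
Q = 0.01 * np.eye(2)                    # process noise covariance (assumed)
R = np.array([[0.5]])                   # measurement noise covariance (assumed)

def kf_step(x, P, z):
    """One predict/update cycle of the tracking filter."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Track a target moving at unit velocity; measurements happen to be exact here.
x, P = np.array([0.0, 1.0]), np.eye(2)
for t in range(1, 6):
    x, P = kf_step(x, P, np.array([float(t)]))
print(x)   # converges to position 5, velocity 1
```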

  11. Power capability evaluation for lithium iron phosphate batteries based on multi-parameter constraints estimation

    NASA Astrophysics Data System (ADS)

    Wang, Yujie; Pan, Rui; Liu, Chang; Chen, Zonghai; Ling, Qiang

    2018-01-01

    The battery power capability is intimately correlated with the climbing, braking, and accelerating performance of electric vehicles. Accurate power capability prediction can not only guarantee safety but also regulate driving behavior and optimize battery energy usage. However, the battery model is highly nonlinear, especially for lithium iron phosphate batteries. Besides, the hysteresis loop in the open-circuit voltage curve can easily cause large errors in model prediction. In this work, a multi-parameter constraints dynamic estimation method is proposed to predict the battery's continuous power capability over a given period. A high-fidelity battery model that considers battery polarization and hysteresis is presented to approximate the high nonlinearity of the lithium iron phosphate battery. Explicit analyses of power capability under multiple constraints are elaborated; in particular, the state-of-energy is considered in the power capability assessment. Furthermore, to solve the nonlinear state estimation problem and suppress noise interference, an unscented Kalman filter (UKF)-based state observer is employed for power capability prediction. The performance of the proposed methodology is demonstrated by experiments under different dynamic characterization schedules. The charge and discharge power capabilities of the lithium iron phosphate batteries are quantitatively assessed under different time scales and temperatures.
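    The multi-constraint idea can be illustrated with a deliberately simplified internal-resistance model (not the paper's high-fidelity polarization/hysteresis model): the deliverable power is set by whichever of the voltage, current, or state-of-charge limits binds first. All parameter values below are hypothetical.

```python
# Toy multi-constraint power capability with a simple internal-resistance model.
def discharge_power_capability(ocv, r_int, v_min, i_max, soc, soc_min):
    """Peak discharge power (W) limited by voltage, current, and SOC constraints."""
    if soc <= soc_min:
        return 0.0                        # SOC constraint forbids further discharge
    i_volt = (ocv - v_min) / r_int        # current that pulls terminal voltage to v_min
    i_lim = min(i_volt, i_max)            # binding constraint: voltage or current limit
    v_term = ocv - i_lim * r_int          # terminal voltage at the limiting current
    return v_term * i_lim

# Example: 3.3 V cell, 10 mOhm resistance, 2.5 V cutoff, 50 A limit, SOC 50%.
print(discharge_power_capability(3.3, 0.01, 2.5, 50.0, 0.5, 0.1))   # 140.0 W here
```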

  12. NDE reliability and probability of detection (POD) evolution and paradigm shift

    NASA Astrophysics Data System (ADS)

    Singh, Surendra

    2014-02-01

    The subject of NDE reliability and POD has gone through multiple phases since its humble beginning in the late 1960s. This was followed by several programs, including the important one nicknamed "Have Cracks - Will Travel," or in short "Have Cracks," conducted by the Lockheed-Georgia Company for the US Air Force during 1974-1978. This and other studies ultimately led to a series of developments in the field of reliability and POD, from the introduction of fracture mechanics and Damage Tolerant Design (DTD), to the statistical framework of Berens and Hovey in 1981 for POD estimation, to MIL-HDBK-1823 (1999) and 1823A (2009). During the last decade, various groups and researchers have further studied reliability and POD using Model Assisted POD (MAPOD), Simulation Assisted POD (SAPOD), and Bayesian statistics. Each of these developments had one objective: improving the accuracy of life prediction in components, which to a large extent depends on the reliability and capability of NDE methods. Reliable detection and sizing of large flaws in components is therefore essential. Currently, POD is used for studying the reliability and capability of NDE methods, though POD data offer no absolute truth regarding NDE reliability, i.e., system capability, effects of flaw morphology, and quantification of human factors. Furthermore, reliability and POD have often been treated as synonymous, but POD is not NDE reliability. POD is a subset of reliability, which consists of six phases: 1) sample selection using DOE, 2) NDE equipment setup and calibration, 3) System Measurement Evaluation (SME), including Gage Repeatability & Reproducibility (Gage R&R) and Analysis of Variance (ANOVA), 4) NDE system capability and electronic and physical saturation, 5) acquiring and fitting data to a model, and data analysis, and 6) POD estimation. 
This paper provides an overview of the major POD milestones of the last several decades and discusses the rationale for using Integrated Computational Materials Engineering (ICME), MAPOD, SAPOD, and Bayesian statistics to study controllable and non-controllable variables, including human factors, in estimating POD. Another objective is to list gaps between "hoped for" capabilities and those validated or observed in fielded failed hardware.
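    POD estimation in the MIL-HDBK-1823A tradition typically fits a log-odds model, POD(a) = 1 / (1 + exp(-(ln a - mu) / sigma)). A minimal sketch with hypothetical parameters (not data from any study discussed above):

```python
import math

# Log-odds POD model; mu and sigma here are assumed, illustrative values.
def pod(a, mu, sigma):
    """Probability of detecting a flaw of size a."""
    return 1.0 / (1.0 + math.exp(-(math.log(a) - mu) / sigma))

def a_p(p, mu, sigma):
    """Flaw size detected with probability p (a90 is the usual quantity)."""
    return math.exp(mu + sigma * math.log(p / (1.0 - p)))

mu, sigma = math.log(0.05), 0.3      # assumed: median detectable flaw size 0.05 in.
a90 = a_p(0.90, mu, sigma)
print(round(a90, 4))                 # the "90% POD" flaw size for these parameters
```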

  13. Air Force Operational Medicine: Using the Estimating Supplies Program to Develop Materiel Solutions for the Operational Clinical Requirements for the U.S. Air Force Otolaryngology Team (FFENT)

    DTIC Science & Technology

    2007-10-10

    Nose and Throat UTC self-sufficient for a 7-day period, and will support tasks like OR Team Preparation and Patient Assessment. Category Weight Cube...These line items would need to be added to the FFENT AS for the FFENT to be self-sufficient for 7 days, as discussed in the current FFENT CONOPS...additions enable the Ear, Nose and Throat UTC to be self-sufficient for a 7-day period and meet its capabilities as stated in the CONOPS. Discussion

  14. Research requirements for development of regenerative engines for helicopters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Semple, R.D.

    1976-12-01

    The improved specific fuel consumption of the regenerative engine was compared to a simple-cycle turboshaft engine. The performance improvement and fuel saving are obtained at the expense of increased engine weight, development and production costs, and maintenance costs. Costs and schedules are estimated for the elements of the research and development program. Interaction of the regenerative engine with other technology goals for an advanced civil helicopter is examined, including its impact on engine noise, hover and cruise performance, helicopter empty weight, drive-system efficiency and weight, one-engine-inoperative hover capability, and maintenance and reliability.

  15. Research requirements for development of regenerative engines for helicopters

    NASA Technical Reports Server (NTRS)

    Semple, R. D.

    1976-01-01

    The improved specific fuel consumption of the regenerative engine was compared to a simple-cycle turboshaft engine. The performance improvement and fuel saving are obtained at the expense of increased engine weight, development and production costs, and maintenance costs. Costs and schedules are estimated for the elements of the research and development program. Interaction of the regenerative engine with other technology goals for an advanced civil helicopter is examined, including its impact on engine noise, hover and cruise performance, helicopter empty weight, drive-system efficiency and weight, one-engine-inoperative hover capability, and maintenance and reliability.

  16. Department of the Navy Supporting Data for Fiscal Year 1983 Budget Estimates Descriptive Summaries Submitted to Congress February 1982. Research, Development, Test & Evaluation, Navy. Book 1 of 3. Technology Base, Advanced Technology Development, Strategic Programs.

    DTIC Science & Technology

    1982-02-01

    optimization methods have been developed for problems in production and distribution modeling including design and evaluation of storage alternatives under...and winds using high frequency, X-band doppler, pulse-limited, and Delta-K radars. Development of millimeter-wave radiometric imaging systems and...generic system design concept for a system capable of defending the Fleet from the high angle threat 1.4 The first model of the drive system for a

  17. Mechanical Analysis of W78/88-1 Life Extension Program Warhead Design Options

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Nathan

    2014-09-01

    Life Extension Program (LEP) is a program to repair/replace components of nuclear weapons to ensure the ability to meet military requirements. The W78/88-1 LEP encompasses the modernization of two major nuclear weapon reentry systems into an interoperable warhead. Several design concepts exist to provide different options for robust safety and security themes, maximum non-nuclear commonality, and cost. Simulation is one capability used to evaluate the mechanical performance of the designs in various operational environments, plan for system and component qualification efforts, and provide insight into the survivability of the warhead in environments that are not currently testable. The simulation efforts use several Sandia-developed tools through the Advanced Simulation and Computing program, including Cubit for mesh generation, the DART Model Manager, SIERRA codes running on the HPC TLCC2 platforms, DAKOTA, and ParaView. Several programmatic objectives were met using the simulation capability, including: (1) providing early environmental specification estimates that may be used by component designers to understand the severity of the loads their components will need to survive, (2) providing guidance for load levels and configurations for subassembly tests intended to represent operational environments, and (3) recommending design options including modified geometry and material properties. These objectives were accomplished through regular interactions with component, system, and test engineers while using the laboratory's computational infrastructure to effectively perform ensembles of simulations. Because NNSA has decided to defer the LEP program, simulation results are being documented and models are being archived for future reference. However, some advanced and exploratory efforts will continue to mature key technologies, using the results from these and ongoing simulations for design insights, test planning, and model validation.

  18. COMPUTER SUPPORT SYSTEMS FOR ESTIMATING CHEMICAL TOXICITY: PRESENT CAPABILITIES AND FUTURE TRENDS

    EPA Science Inventory

    Computer Support Systems for Estimating Chemical Toxicity: Present Capabilities and Future Trends

    A wide variety of computer-based artificial intelligence (AI) and decision support systems exist currently to aid in the assessment of toxicity for environmental chemicals. T...

  19. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 1, Numerical methods and input instructions: Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trent, D.S.; Eyler, L.L.

    TEMPEST offers simulation capabilities over a wide range of hydrothermal problems that are definable by input instructions. These capabilities are summarized by categories as follows: modeling capabilities; program control; and I/O control. 10 refs., 22 figs., 2 tabs. (LSP)

  20. Practical Considerations for Optic Nerve Estimation in Telemedicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karnowski, Thomas Paul; Aykac, Deniz; Chaum, Edward

    The projected increase in diabetes in the United States and worldwide has created a need for broad-based, inexpensive screening for diabetic retinopathy (DR), an eye disease which can lead to vision impairment. A telemedicine network with retina cameras and automated quality control, physiological feature location, and lesion/anomaly detection is a low-cost way of achieving broad-based screening. In this work we report on the effect of quality estimation on an optic nerve (ON) detection method with a confidence metric. We report on an improvement of the fusion technique using a data set from an ophthalmologist's practice, then show the results of the method as a function of image quality on a set of images from an on-line telemedicine network collected in Spring 2009 and another broad-based screening program. We show that the fusion method, combined with quality estimation processing, can improve detection performance and also provide a method for utilizing a physician-in-the-loop for images that may exceed the capabilities of automated processing.

  1. Field development will cost $1 billion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rintoul, B.

    1980-10-01

    The development of the Belridge property 40 miles west of Bakersfield in the San Joaquin Valley is not only one of the biggest production developments of this decade in California but also one of the most challenging. It will call on advanced expertise and, ultimately, on techniques that are still in the research stage. The program calls for drilling at least 3000 wells and reworking another 2200 wells. In excess of $100 million is being committed in 1980 alone. Since acquiring the property, Shell has increased the estimate of proved developed and undeveloped reserves to approximately 598 million bbl of hydrocarbon liquids and 364 billion cu ft of natural gas. The higher estimates mirror the company's confidence in its capability to recover a larger amount of the in-place oil and gas than previously expected.

  2. A versatile pitch tracking algorithm: from human speech to killer whale vocalizations.

    PubMed

    Shapiro, Ari Daniel; Wang, Chao

    2009-07-01

    In this article, a pitch tracking algorithm [named discrete logarithmic Fourier transformation-pitch detection algorithm (DLFT-PDA)], originally designed for human telephone speech, was modified for killer whale vocalizations. The multiple frequency components of some of these vocalizations demand a spectral (rather than temporal) approach to pitch tracking. The DLFT-PDA algorithm derives reliable estimations of pitch and the temporal change of pitch from the harmonic structure of the vocal signal. Scores from both estimations are combined in a dynamic programming search to find a smooth pitch track. The algorithm is capable of tracking killer whale calls that contain simultaneous low and high frequency components and compares favorably across most signal-to-noise ratio ranges to the peak-picking and sidewinder algorithms that have previously been used for tracking killer whale vocalizations.
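    The dynamic-programming search for a smooth track can be sketched generically: each frame contributes candidate pitches with local scores, and transitions between frames are penalized by the size of the pitch jump. The scoring below is illustrative, not the DLFT-PDA's:

```python
# Viterbi-style DP over per-frame pitch candidates with a smoothness penalty.
def smooth_pitch_track(candidates, scores, penalty=0.01):
    """candidates[t]: pitch candidates (Hz) in frame t; scores[t][i]: local score."""
    n = len(candidates)
    best = [list(scores[0])]                 # best cumulative score ending at each candidate
    back = [[0] * len(candidates[0])]        # backpointers for traceback
    for t in range(1, n):
        row, ptr = [], []
        for j, f in enumerate(candidates[t]):
            # best predecessor: previous cumulative score minus a jump penalty
            k = max(range(len(candidates[t - 1])),
                    key=lambda i: best[t - 1][i] - penalty * abs(candidates[t - 1][i] - f))
            ptr.append(k)
            row.append(scores[t][j] + best[t - 1][k]
                       - penalty * abs(candidates[t - 1][k] - f))
        best.append(row)
        back.append(ptr)
    j = max(range(len(candidates[-1])), key=lambda i: best[-1][i])
    track = [candidates[-1][j]]
    for t in range(n - 1, 0, -1):            # trace the winning path backwards
        j = back[t][j]
        track.append(candidates[t - 1][j])
    return track[::-1]

cands = [[100, 200], [105, 210], [110, 215]]   # two candidates per frame (Hz)
scrs = [[1.0, 0.9], [0.8, 0.9], [1.0, 0.7]]
print(smooth_pitch_track(cands, scrs))          # picks the smooth low-frequency path
```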

  3. Merging Sounder and Imager Data for Improved Cloud Depiction on SNPP and JPSS.

    NASA Astrophysics Data System (ADS)

    Heidinger, A. K.; Holz, R.; Li, Y.; Platnick, S. E.; Wanzong, S.

    2017-12-01

    Under the NOAA GOES-R Algorithm Working Group (AWG) Program, NOAA supports the development of an Infrared (IR) Optimal Estimation (OE) Cloud Height Algorithm (ACHA). ACHA is an enterprise solution that supports many geostationary and polar-orbiting imager sensors. ACHA is operational at NOAA on SNPP VIIRS and has been adopted as the cloud height algorithm for the NASA NPP Atmospheric Suite of products. Being an OE algorithm, ACHA is flexible and capable of using additional observations and constraints. We have modified ACHA to use sounder (CrIS) observations to improve cloud detection, typing, and height estimation. Specifically, these improvements include retrievals in multi-layer scenarios and improved performance in polar regions. This presentation will describe the process for merging VIIRS and CrIS and a demonstration of the improvements.

  4. Generalized environmental control and life support system computer program (G189A) configuration control. [computer subroutine libraries for shuttle orbiter analyses

    NASA Technical Reports Server (NTRS)

    Blakely, R. L.

    1973-01-01

    A G189A simulation of the shuttle orbiter EC/LSS was prepared and used to study payload support capabilities. Two master program libraries of the G189A computer program were prepared for the NASA/JSC computer system. Several new component subroutines were added to the G189A program library, and many existing subroutines were revised to improve their capabilities. A number of special analyses were performed in support of a NASA/JSC shuttle orbiter EC/LSS payload support capability study.

  5. WTAQ version 2-A computer program for analysis of aquifer tests in confined and water-table aquifers with alternative representations of drainage from the unsaturated zone

    USGS Publications Warehouse

    Barlow, Paul M.; Moench, Allen F.

    2011-01-01

    The computer program WTAQ simulates axial-symmetric flow to a well pumping from a confined or unconfined (water-table) aquifer. WTAQ calculates dimensionless or dimensional drawdowns that can be used with measured drawdown data from aquifer tests to estimate aquifer hydraulic properties. Version 2 of the program, which is described in this report, provides an alternative analytical representation of drainage to water-table aquifers from the unsaturated zone to that available in the initial version of the code. The revised drainage model explicitly accounts for hydraulic characteristics of the unsaturated zone, specifically the moisture retention and relative hydraulic conductivity of the soil. The revised program also retains the original conceptualizations of drainage from the unsaturated zone that were available with version 1 of the program, providing alternative approaches to simulating the drainage process. Version 2 includes all other simulation capabilities of the first version, including partial penetration of the pumped well and of observation wells and piezometers, well-bore storage and skin effects at the pumped well, and delayed drawdown response of observation wells and piezometers.
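    For orientation, the simplest confined-aquifer case of what WTAQ computes corresponds to the classical Theis solution, s = Q/(4πT)·W(u) with u = r²S/(4Tt); a sketch of the well-function series follows. WTAQ itself handles far more (partial penetration, wellbore storage, skin effects, unsaturated-zone drainage), none of which appear here.

```python
import math

# Theis well function via its series expansion (accurate for small u).
def well_function(u, terms=60):
    """W(u) = -gamma - ln(u) + sum_{n>=1} (-1)^(n+1) u^n / (n * n!)."""
    w = -0.5772156649015329 - math.log(u) + u    # Euler's constant, log, n = 1 term
    term, sign = u, -1.0
    for n in range(2, terms):
        term *= u * (n - 1) / (n * n)            # u^n/(n*n!) from the previous term
        w += sign * term
        sign = -sign
    return w

def theis_drawdown(Q, T, S, r, t):
    """Drawdown at radius r and time t for pumping rate Q (consistent units)."""
    u = r * r * S / (4.0 * T * t)
    return Q / (4.0 * math.pi * T) * well_function(u)

print(round(well_function(0.01), 3))   # 4.038, matching tabulated values of W(u)
```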

  6. Parallel mutual information estimation for inferring gene regulatory networks on GPUs

    PubMed Central

    2011-01-01

    Background Mutual information is a measure of similarity between two variables. It has been widely used in various application domains including computational biology, machine learning, statistics, image processing, and financial computing. Previously used simple histogram-based mutual information estimators lack the precision of kernel-based methods. The recently introduced B-spline function based mutual information estimation method is competitive with the kernel-based methods in terms of quality but at a lower computational complexity. Results We present a new approach to accelerate the B-spline function based mutual information estimation algorithm with commodity graphics hardware. To derive an efficient mapping onto this type of architecture, we have used the Compute Unified Device Architecture (CUDA) programming model to design and implement a new parallel algorithm. Our implementation, called CUDA-MI, can achieve speedups of up to 82 using double precision on a single GPU compared to a multi-threaded implementation on a quad-core CPU for large microarray datasets. We have used the results obtained by CUDA-MI to infer gene regulatory networks (GRNs) from microarray data. The comparisons to existing methods including ARACNE and TINGe show that CUDA-MI produces GRNs of higher quality in less time. Conclusions CUDA-MI is publicly available open-source software, written in the CUDA and C++ programming languages. It obtains significant speedups over a multi-threaded CPU implementation by fully exploiting the compute capability of commonly used CUDA-enabled low-cost GPUs. PMID:21672264
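    As a baseline for comparison, the simple histogram (plug-in) estimator that the B-spline method improves on can be written in a few lines: I(X;Y) = Σ p(x,y)·log( p(x,y) / (p(x)·p(y)) ) over binned data. The binning choices here are illustrative:

```python
import math
from collections import Counter

# Plug-in (histogram) mutual information estimator in nats.
def histogram_mi(xs, ys, bins=4):
    """MI between two equal-length sequences after uniform binning."""
    def binned(vs):
        lo, hi = min(vs), max(vs)
        if hi == lo:
            return [0] * len(vs)
        return [min(int((v - lo) / (hi - lo) * bins), bins - 1) for v in vs]
    bx, by = binned(xs), binned(ys)
    n = len(xs)
    pxy = Counter(zip(bx, by))            # joint bin counts
    px, py = Counter(bx), Counter(by)     # marginal bin counts
    return sum((c / n) * math.log((c / n) / ((px[i] / n) * (py[j] / n)))
               for (i, j), c in pxy.items())

xs = [0.0, 0.1, 0.5, 0.6, 1.0, 1.1, 1.5, 1.6]
ys = [v * 2.0 for v in xs]                # perfectly dependent toy data
print(histogram_mi(xs, ys))               # equals the bin entropy, log(4) nats
```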

  7. NDE detectability of fatigue-type cracks in high-strength alloys: NDI reliability assessments

    NASA Technical Reports Server (NTRS)

    Christner, Brent K.; Long, Donald L.; Rummel, Ward D.

    1988-01-01

    This program was conducted to generate quantitative flaw detection capability data for the nondestructive evaluation (NDE) techniques typically practiced by aerospace contractors. Inconel 718 and Haynes 188 alloy test specimens containing fatigue flaws with a wide distribution of sizes were used to assess the flaw detection capabilities at a number of contractor and government facilities. During this program 85 inspection sequences were completed, presenting a total of 20,994 fatigue cracks to 53 different inspectors. The inspection sequences completed included 78 liquid penetrant, 4 eddy current, and 3 ultrasonic evaluations. The results of the assessment inspections are presented and discussed. In generating the flaw detection capability data base, procedures for data collection, data analysis, and specimen care and maintenance were developed, demonstrated, and validated. The data collection procedures and methods that evolved during this program for the measurement of flaw detection capabilities and the effects of inspection variables on performance are discussed. The Inconel 718 and Haynes 188 test specimens that were used in conducting this program, together with the NDE assessment procedures that were demonstrated, provide NASA with the capability to accurately assess the flaw detection capabilities of specific inspection procedures being applied or proposed for use on current and future fracture control hardware programs.

  8. Report by the International Space Station (ISS) Management and Cost Evaluation (IMCE) Task Force

    NASA Technical Reports Server (NTRS)

    Young, A. Thomas; Kellogg, Yvonne (Technical Monitor)

    2001-01-01

    The International Space Station (ISS) Management and Cost Evaluation Task Force (IMCE) was chartered to conduct an independent external review and assessment of the ISS cost, budget, and management. In addition, the Task Force was asked to provide recommendations that could provide maximum benefit to the U.S. taxpayers and the International Partners within the President's budget request. The Task Force has made the following principal findings: (1) The ISS Program's technical achievements to date, as represented by on-orbit capability, are extraordinary; (2) The Existing ISS Program Plan for executing the FY 02-06 budget is not credible; (3) The existing deficiencies in management structure, institutional culture, cost estimating, and program control must be acknowledged and corrected for the Program to move forward in a credible fashion; (4) Additional budget flexibility, from within the Office of Space Flight (OSF) must be provided for a credible core complete program; (5) The research support program is proceeding assuming the budget that was in place before the FY02 budget runout reduction of $1B; (6) There are opportunities to maximize research on the core station program with modest cost impact; (7) The U.S. Core Complete configuration (three person crew) as an end-state will not achieve the unique research potential of the ISS; (8) The cost estimates for the U.S.-funded enhancement options (e.g., permanent seven person crew) are not sufficiently developed to assess credibility. After these findings, the Task Force has formulated several primary recommendations which are published here and include: (1) Major changes must be made in how the ISS program is managed; (2) Additional cost reductions are required within the baseline program; (3) Additional funds must be identified and applied from the Human Space Flight budget; (4) A clearly defined program with a credible end-state, agreed to by all stakeholders, must be developed and implemented.

  9. An Interactive Preprocessor Program with Graphics for a Three-Dimensional Finite Element Code.

    ERIC Educational Resources Information Center

    Hamilton, Claude Hayden, III

    The development and capabilities of an interactive preprocessor program with graphics for an existing three-dimensional finite element code is presented. This preprocessor program, EDGAP3D, is designed to be used in conjunction with the Texas Three Dimensional Grain Analysis Program (TXCAP3D). The code presented in this research is capable of the…

  10. 34 CFR 668.16 - Standards of administrative capability.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... completion of Title IV, HEA program training provided or approved by the Secretary, and previous experience... participation in a Title IV, HEA program, does not have more than 33 percent of its undergraduate regular... Participation in Title IV, HEA Programs § 668.16 Standards of administrative capability. To begin and to...

  11. 34 CFR 668.16 - Standards of administrative capability.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... completion of Title IV, HEA program training provided or approved by the Secretary, and previous experience... Participation in Title IV, HEA Programs § 668.16 Standards of administrative capability. To begin and to continue to participate in any Title IV, HEA program, an institution shall demonstrate to the Secretary...

  12. 34 CFR 668.16 - Standards of administrative capability.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... completion of Title IV, HEA program training provided or approved by the Secretary, and previous experience... Participation in Title IV, HEA Programs § 668.16 Standards of administrative capability. To begin and to continue to participate in any Title IV, HEA program, an institution shall demonstrate to the Secretary...

  13. 34 CFR 668.16 - Standards of administrative capability.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... completion of Title IV, HEA program training provided or approved by the Secretary, and previous experience... Participation in Title IV, HEA Programs § 668.16 Standards of administrative capability. To begin and to continue to participate in any Title IV, HEA program, an institution shall demonstrate to the Secretary...

  14. Analysis capabilities for plutonium-238 programs

    NASA Astrophysics Data System (ADS)

    Wong, A. S.; Rinehart, G. H.; Reimus, M. H.; Pansoy-Hjelvik, M. E.; Moniz, P. F.; Brock, J. C.; Ferrara, S. E.; Ramsey, S. S.

    2000-07-01

    In this presentation, an overview of the analysis capabilities that support 238Pu programs is given. These capabilities include neutron emission rate and calorimetric measurements, metallography/ceramography, ultrasonic examination, particle size determination, and chemical analyses. The data obtained from these measurements provide baseline parameters for fuel clad impact testing, fuel processing, product certification, and waste disposal. Several in-line analysis capabilities will also be utilized for process control in the full-scale 238Pu Aqueous Scrap Recovery line in FY01.

  15. Educational Experiences of Embry-Riddle Students through NASA Research Collaboration

    NASA Technical Reports Server (NTRS)

    Schlee, Keith; Chatman, Yadira; Ristow, James; Gangadharan, Sathya; Sudermann, James; Walker, Charles

    2007-01-01

    NASA's educational programs benefit students while increasing the overall productivity of the organization. The NASA Graduate Student Research Program (GSRP) awards fellowships for graduate study leading to both masters and doctoral degrees in several technical fields, while the Cooperative Education program allows undergraduate and graduate students the chance to gain work experience in the field. The Mission Analysis Branch of the Expendable Launch Vehicles Division at NASA Kennedy Space Center has utilized these two programs with students from Embry-Riddle Aeronautical University to conduct research in modeling and developing a parameter estimation method for spacecraft fuel slosh using simple pendulum analogs. Simple pendulum models are used to understand complicated spacecraft fuel slosh behavior. A robust parameter estimation process will help to identify the parameters that will predict the response fairly accurately during the initial stages of design. NASA's Cooperative Education Program trains the next wave of new hires while allowing graduate and undergraduate college students to gain valuable "real-world" work experience. It gives NASA a no-risk capability to evaluate the true performance of a prospective new hire without relying solely on a paper resume, while providing the students with a greater hiring potential upon graduation, at NASA or elsewhere. In addition, graduate students serve as mentors for undergraduate students and provide a unique learning environment. Providing students with a unique opportunity to work on "real-world" aerospace problems ultimately reinforces their problem solving abilities and their communication skills (in terms of interviewing, resume writing, technical writing, presentation, and peer review) that are vital for the workforce to succeed.
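    The pendulum-analog idea can be illustrated with the textbook relation f = √(g/L)/(2π): given a measured slosh frequency, the analog pendulum length follows by inversion. This is a toy version of the richer parameter estimation described above; the values are illustrative only.

```python
import math

# Simple pendulum analog: natural frequency and its inversion for length.
G = 9.81  # m/s^2

def pendulum_frequency(length_m):
    """Natural frequency (Hz) of a simple pendulum of the given length."""
    return math.sqrt(G / length_m) / (2.0 * math.pi)

def estimate_length(measured_freq_hz):
    """Invert the frequency relation to recover the pendulum-analog length (m)."""
    return G / (2.0 * math.pi * measured_freq_hz) ** 2

f = pendulum_frequency(0.5)              # forward model: a 0.5 m pendulum
print(round(estimate_length(f), 6))      # recovers 0.5
```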

  16. AMDTreat 5.0+ with PHREEQC titration module to compute caustic chemical quantity, effluent quality, and sludge volume

    USGS Publications Warehouse

    Cravotta, Charles A.; Means, Brent P; Arthur, Willam; McKenzie, Robert M; Parkhurst, David L.

    2015-01-01

    Alkaline chemicals are commonly added to discharges from coal mines to increase pH and decrease concentrations of acidity and dissolved aluminum, iron, manganese, and associated metals. The annual cost of chemical treatment depends on the type and quantities of chemicals added and sludge produced. The AMDTreat computer program, initially developed in 2003, is widely used to compute such costs on the basis of the user-specified flow rate and water quality data for the untreated AMD. Although AMDTreat can use results of empirical titration of net-acidic or net-alkaline effluent with caustic chemicals to accurately estimate costs for treatment, such empirical data are rarely available. A titration simulation module using the geochemical program PHREEQC has been incorporated with AMDTreat 5.0+ to improve the capability of AMDTreat to estimate: (1) the quantity and cost of caustic chemicals to attain a target pH, (2) the chemical composition of the treated effluent, and (3) the volume of sludge produced by the treatment. The simulated titration results for selected caustic chemicals (NaOH, CaO, Ca(OH)2, Na2CO3, or NH3) without aeration or with pre-aeration can be compared with or used in place of empirical titration data to estimate chemical quantities, treated effluent composition, sludge volume (precipitated metals plus unreacted chemical), and associated treatment costs. This paper describes the development, evaluation, and potential utilization of the PHREEQC titration module with the new AMDTreat 5.0+ computer program available at http://www.amd.osmre.gov/.

  17. Lessons Learned for Planning and Estimating Operations Support Requirements

    NASA Technical Reports Server (NTRS)

    Newhouse, Marilyn

    2011-01-01

    Operations (phase E) costs are typically small compared to the spacecraft development and test costs. This, combined with the long lead time for realizing operations costs, can lead projects to focus on hardware development schedules and costs, de-emphasizing estimation of operations support requirements during proposal, early design, and replan cost exercises. The Discovery and New Frontiers (D&NF) programs comprise small, cost-capped missions supporting scientific exploration of the solar system. Even moderate yearly underestimates of the operations costs can present significant LCC impacts for deep space missions with long operational durations, and any LCC growth can directly impact the programs' ability to fund new missions. The D&NF Program Office at Marshall Space Flight Center recently studied cost overruns for 7 D&NF missions related to phase C/D development of operational capabilities and phase E mission operations. The goal was to identify the underlying causes for the overruns and develop practical mitigations to assist the D&NF projects in identifying potential operations risks and controlling the associated impacts to operations development and execution costs. The study found that the drivers behind these overruns include overly optimistic assumptions regarding the savings resulting from the use of heritage technology, late development of operations requirements, inadequate planning for sustaining engineering and the special requirements of long duration missions (e.g., knowledge retention and hardware/software refresh), and delayed completion of ground system development work. This presentation summarizes the study and the results, providing a set of lessons NASA can use to improve early estimation and validation of operations costs.

  18. NASA/FAA general aviation crash dynamics program

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.; Hayduk, R. J.; Carden, H. D.

    1981-01-01

    The program involves controlled full scale crash testing, nonlinear structural analyses to predict large deflection elastoplastic response, and load attenuating concepts for use in improved seat and subfloor structure. Both analytical and experimental methods are used to develop expertise in these areas. Analyses include simplified procedures for estimating energy dissipating capabilities and comprehensive computerized procedures for predicting airframe response. These analyses are developed to provide designers with methods for predicting accelerations, loads, and displacements on collapsing structure. Tests on typical full scale aircraft and on full and subscale structural components are performed to verify the analyses and to demonstrate load attenuating concepts. A special apparatus was built to test emergency locator transmitters when attached to representative aircraft structure. The apparatus is shown to provide a good simulation of the longitudinal crash pulse observed in full scale aircraft crash tests.

  19. A computer program for uncertainty analysis integrating regression and Bayesian methods

    USGS Publications Warehouse

    Lu, Dan; Ye, Ming; Hill, Mary C.; Poeter, Eileen P.; Curtis, Gary

    2014-01-01

    This work develops a new functionality in UCODE_2014 to evaluate Bayesian credible intervals using the Markov Chain Monte Carlo (MCMC) method. The MCMC capability in UCODE_2014 is based on the FORTRAN version of the differential evolution adaptive Metropolis (DREAM) algorithm of Vrugt et al. (2009), which estimates the posterior probability density function of model parameters in high-dimensional and multimodal sampling problems. The UCODE MCMC capability provides eleven prior probability distributions and three ways to initialize the sampling process. It evaluates parametric and predictive uncertainties and it has parallel computing capability based on multiple chains to accelerate the sampling process. This paper tests and demonstrates the MCMC capability using a 10-dimensional multimodal mathematical function, a 100-dimensional Gaussian function, and a groundwater reactive transport model. The use of the MCMC capability is made straightforward and flexible by adopting the JUPITER API protocol. With the new MCMC capability, UCODE_2014 can be used to calculate three types of uncertainty intervals, which all can account for prior information: (1) linear confidence intervals which require linearity and Gaussian error assumptions and typically 10s–100s of highly parallelizable model runs after optimization, (2) nonlinear confidence intervals which require a smooth objective function surface and Gaussian observation error assumptions and typically 100s–1,000s of partially parallelizable model runs after optimization, and (3) MCMC Bayesian credible intervals which require few assumptions and commonly 10,000s–100,000s or more partially parallelizable model runs. Ready access allows users to select methods best suited to their work, and to compare methods in many circumstances.
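The mechanics of deriving a Bayesian credible interval from posterior samples can be sketched with a single-chain Metropolis sampler; this is a deliberately minimal stand-in for the multi-chain DREAM algorithm used by UCODE_2014, and the data, prior, and proposal width are all hypothetical.

```python
import numpy as np

# Minimal Metropolis sampler (a much simpler stand-in for the multi-chain
# DREAM algorithm of UCODE_2014) estimating a 95% Bayesian credible interval
# for the mean of noisy observations. Prior on mu: N(0, 10^2).
rng = np.random.default_rng(1)
obs = rng.normal(3.0, 1.0, size=50)     # synthetic data, true mean = 3

def log_post(mu):
    # log prior + log likelihood (Gaussian noise, unit variance), up to a constant
    return -0.5 * (mu / 10.0) ** 2 - 0.5 * np.sum((obs - mu) ** 2)

samples, mu = [], 0.0
for _ in range(20000):
    prop = mu + rng.normal(0.0, 0.5)    # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop                       # accept the proposal
    samples.append(mu)

post = np.array(samples[5000:])         # discard burn-in
lo, hi = np.quantile(post, [0.025, 0.975])
print(f"95% credible interval for mu: [{lo:.2f}, {hi:.2f}]")
```

DREAM improves on this sketch by running multiple interacting chains with adaptive proposals, which is what makes the high-dimensional, multimodal problems described above tractable.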

  20. Decision Support System For Management Of Low-Level Radioactive Waste Disposal At The Nevada Test Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shott, G.; Yucel, V.; Desotell, L.

    2006-07-01

    The long-term safety of U.S. Department of Energy (DOE) low-level radioactive disposal facilities is assessed by conducting a performance assessment -- a systematic analysis that compares estimated risks to the public and the environment with performance objectives contained in DOE Manual 435.1-1, Radioactive Waste Management Manual. Before site operations begin, facility design features such as final inventory, waste form characteristics, and closure cover design may be uncertain. Site operators need a modeling tool that can be used throughout the operational life of the disposal site to guide decisions regarding the acceptance of problematic waste streams, new disposal cell design, environmental monitoring program design, and final site closure. In response to these needs the National Nuclear Security Administration Nevada Site Office (NNSA/NSO) has developed a decision support system for the Area 5 Radioactive Waste Management Site in Frenchman Flat on the Nevada Test Site. The core of the system is a probabilistic inventory and performance assessment model implemented in the GoldSim® simulation platform. The modeling platform supports multiple graphic capabilities that allow clear documentation of the model data sources, conceptual model, mathematical implementation, and results. The combined models have the capability to estimate disposal site inventory, contaminant concentrations in environmental media, and radiological doses to members of the public engaged in various activities at multiple locations. The model allows rapid assessment and documentation of the consequences of waste management decisions using the most current site characterization information, radionuclide inventory, and conceptual model. The model is routinely used to provide annual updates of site performance, evaluate the consequences of disposal of new waste streams, develop waste concentration limits, optimize the design of new disposal cells, and assess the adequacy of environmental monitoring programs. (authors)

  1. Neuropeptide Signaling Networks and Brain Circuit Plasticity.

    PubMed

    McClard, Cynthia K; Arenkiel, Benjamin R

    2018-01-01

    The brain is a remarkable network of circuits dedicated to sensory integration, perception, and response. The computational power of the brain is estimated to dwarf that of most modern supercomputers, but perhaps its most fascinating capability is to structurally refine itself in response to experience. In the language of computers, the brain is loaded with programs that encode when and how to alter its own hardware. This programmed "plasticity" is a critical mechanism by which the brain shapes behavior to adapt to changing environments. The expansive array of molecular commands that help execute this programming is beginning to emerge. Notably, several neuropeptide transmitters, previously best characterized for their roles in hypothalamic endocrine regulation, have increasingly been recognized for mediating activity-dependent refinement of local brain circuits. Here, we discuss recent discoveries that reveal how local signaling by corticotropin-releasing hormone reshapes mouse olfactory bulb circuits in response to activity and further explore how other local neuropeptide networks may function toward similar ends.

  2. Analytical and physical modeling program for the NASA Lewis Research Center's Altitude Wind Tunnel (AWT)

    NASA Technical Reports Server (NTRS)

    Abbott, J. M.; Deidrich, J. H.; Groeneweg, J. F.; Povinelli, L. A.; Reid, L.; Reinmann, J. J.; Szuch, J. R.

    1985-01-01

    An effort is currently underway at the NASA Lewis Research Center to rehabilitate and extend the capabilities of the Altitude Wind Tunnel (AWT). This extended capability will include a maximum test section Mach number of about 0.9 at an altitude of 55,000 ft and a -20 F stagnation temperature (octagonal test section, 20 ft across the flats). In addition, the AWT will include an icing and acoustic research capability. In order to ensure a technically sound design, an AWT modeling program (both analytical and physical) was initiated to provide essential input to the AWT final design process. This paper describes the modeling program, including the rationale and criteria used in program definition, and presents some early program results.

  3. Photovoltaic Systems Test Facilities: Existing capabilities compilation

    NASA Technical Reports Server (NTRS)

    Volkmer, K.

    1982-01-01

    A general description of photovoltaic systems test facilities (PV-STFs) operated under the U.S. Department of Energy's photovoltaics program is given. Descriptions of a number of privately operated facilities having test capabilities appropriate to photovoltaic hardware development are given. A summary of specific, representative test capabilities at the system and subsystem level is presented for each listed facility. The range of system and subsystem test capabilities available to serve the needs of both the photovoltaics program and the private sector photovoltaics industry is given.

  4. Predicting the payload capability of cable logging systems including the effect of partial suspension

    Treesearch

    Gary D. Falk

    1981-01-01

    A systematic procedure for predicting the payload capability of running, live, and standing skylines is presented. Three hand-held calculator programs are used to predict payload capability that includes the effect of partial suspension. The programs allow for predictions for downhill yarding and for yarding away from the yarder. The equations and basic principles...

  5. Nuclear and Radiological Forensics and Attribution Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, D K; Niemeyer, S

    2005-11-04

    The goal of the U.S. Department of Homeland Security (DHS) Nuclear and Radiological Forensics and Attribution Program is to develop the technical capability for the nation to rapidly, accurately, and credibly attribute the origins and pathways of interdicted or collected materials, intact nuclear devices, and radiological dispersal devices. A robust attribution capability contributes to threat assessment, prevention, and deterrence of nuclear terrorism; it also supports the Federal Bureau of Investigation (FBI) in its investigative mission to prevent and respond to nuclear terrorism. Development of the capability involves two major elements: (1) the ability to collect evidence and make forensic measurements, and (2) the ability to interpret the forensic data. The Program leverages the existing capability throughout the U.S. Department of Energy (DOE) national laboratory complex in a way that meets the requirements of the FBI and other government users. At the same time the capability is being developed, the Program also conducts investigations for a variety of sponsors using the current capability. The combination of operations and R&D in one program helps to ensure a strong linkage between the needs of the user community and the scientific development.

  6. The Perceived Technology Proficiency of Students in a Teacher Education Program

    ERIC Educational Resources Information Center

    Coffman, Vonda G.

    2013-01-01

    The purpose of this study is to determine the perceived technology capabilities of different levels of undergraduate students of Kent State University in the College of Education, Health, and Human Services teacher education programs; to determine if the perceived technology capabilities of students beginning the teacher education program differ…

  7. Exploration Medical Capability (ExMC) Program

    NASA Technical Reports Server (NTRS)

    Kalla, Elizabeth

    2006-01-01

    This document reviews NASA's Exploration Medical Capability (ExMC) program. The new space exploration program, outlined by the President, will present new challenges to the crew's health. The project goals are to develop and validate requirements for reliable, efficient, and robust medical systems and treatments for space exploration to maximize crew performance for mission objectives.

  8. ICF Annual Report 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Correll, D

    The continuing objective of Lawrence Livermore National Laboratory's (LLNL's) Inertial Confinement Fusion (ICF) Program is to demonstrate thermonuclear fusion ignition and energy gain in the laboratory and to support the nuclear weapons program in its use of ICF facilities. The underlying theme of all ICF activities as a science research and development program is the Department of Energy's (DOE's) Defense Programs (DP) science-based Stockpile Stewardship Program (SSP). The mission of the US Inertial Fusion Program is twofold: (1) to address high-energy-density physics issues for the SSP and (2) to develop a laboratory microfusion capability for defense and energy applications. In pursuit of this mission, the ICF Program has developed a state-of-the-art capability to investigate high-energy-density physics in the laboratory. The near-term goals pursued by the ICF Program in support of its mission are demonstrating fusion ignition in the laboratory and expanding the Program's capabilities in high-energy-density science. The National Ignition Facility (NIF) project is a cornerstone of this effort.

  9. Analytical Model for Estimating Terrestrial Cosmic Ray Fluxes Nearly Anytime and Anywhere in the World: Extension of PARMA/EXPACS.

    PubMed

    Sato, Tatsuhiko

    2015-01-01

    By extending our previously established model, here we present a new model called "PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0," which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth's atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R2 values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research.
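PARMA's core approach, i.e. replacing an expensive air-shower simulation with analytical functions whose parameters are fitted to the simulated results, can be illustrated with a toy example. PARMA's actual functions have many parameters per particle species; here, a hypothetical set of neutron fluxes versus atmospheric depth is fitted with a single exponential attenuation model via a log-linear least-squares fit.

```python
import numpy as np

# Illustrative only: PARMA fits analytical functions to PHITS air-shower
# results. Here, hypothetical neutron fluxes vs. atmospheric depth are
# fitted with phi(d) = phi0 * exp(-d / L) using a log-linear fit.
depth = np.array([200., 400., 600., 800., 1000.])   # g/cm^2 (hypothetical)
flux = np.array([120., 55., 26., 12., 5.6])         # arbitrary units

slope, intercept = np.polyfit(depth, np.log(flux), 1)
L = -1.0 / slope                # attenuation length, g/cm^2
phi0 = np.exp(intercept)        # extrapolated flux at zero depth

# Goodness of fit (R^2 in log space), analogous to the R2 check in the paper
pred = phi0 * np.exp(-depth / L)
ss_res = np.sum((np.log(flux) - np.log(pred)) ** 2)
ss_tot = np.sum((np.log(flux) - np.log(flux).mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"L = {L:.0f} g/cm^2, R^2 = {r2:.4f}")
```

Once fitted, evaluating the analytical form is essentially free, which is what lets PARMA/EXPACS return fluxes "instantaneously" compared with rerunning the transport simulation.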

  10. Analytical Model for Estimating Terrestrial Cosmic Ray Fluxes Nearly Anytime and Anywhere in the World: Extension of PARMA/EXPACS

    PubMed Central

    Sato, Tatsuhiko

    2015-01-01

    By extending our previously established model, here we present a new model called “PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0,” which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth’s atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R2 values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research. PMID:26674183

  11. New DMSP Database of Precipitating Auroral Electrons and Ions.

    PubMed

    Redmon, Robert J; Denig, William F; Kilcommons, Liam M; Knipp, Delores J

    2017-08-01

    Since the mid-1970s, the Defense Meteorological Satellite Program (DMSP) spacecraft have operated instruments for monitoring the space environment from low earth orbit. As the program evolved, so too have the measurement capabilities, such that modern DMSP spacecraft include a comprehensive suite of instruments providing estimates of precipitating electron and ion fluxes, cold/bulk plasma composition and moments, the geomagnetic field, and optical emissions in the far and extreme ultraviolet. We describe the creation of a new public database of precipitating electrons and ions from the Special Sensor J (SSJ) instrument, complete with original counts, calibrated differential fluxes adjusted for penetrating radiation, estimates of the total kinetic energy flux and characteristic energy, uncertainty estimates, and accurate ephemerides. These are provided in a common and self-describing format that covers 30+ years of DMSP spacecraft from F06 (launched in 1982) through F18 (launched in 2009). This new database is accessible at the National Centers for Environmental Information (NCEI) and the Coordinated Data Analysis Web (CDAWeb). We describe how the new database is being applied to high latitude studies of: the co-location of kinetic and electromagnetic energy inputs, ionospheric conductivity variability, field aligned currents and auroral boundary identification. We anticipate that this new database will support a broad range of space science endeavors from single observatory studies to coordinated system science investigations.

  12. Self-teaching digital-computer program for fail-operational control of a turbojet engine in a sea-level test stand

    NASA Technical Reports Server (NTRS)

    Wallhagen, R. E.; Arpasi, D. J.

    1974-01-01

    The design and evaluation are described of a digital turbojet engine control which is capable of sensing catastrophic failures in either the engine rotor speed or the compressor discharge static-pressure signal and is capable of switching control modes to maintain near normal operation. The control program was developed for and tested on a turbojet engine located in a sea-level test stand. The control program is also capable of acquiring all the data that are necessary for the fail-operational control to function.

  13. Development of NASA Technical Standards Program Relative to Enhancing Engineering Capabilities

    NASA Technical Reports Server (NTRS)

    Gill, Paul S.; Vaughan, William W.

    2003-01-01

    The enhancement of engineering capabilities is an important aspect of any organization, especially those engaged in aerospace development activities. Technical Standards are one of the key elements of this endeavor. The NASA Technical Standards Program was formed in 1997 in response to the NASA Administrator's directive to develop an Agencywide Technical Standards Program. The Program's principal objective involved converting Center-unique technical standards into Agencywide standards and the adoption/endorsement of non-Government technical standards in lieu of government standards. In the process of these actions, the potential for further enhancement of the Agency's engineering capabilities was noted relative to the value of being able to access, Agencywide, the necessary full-text technical standards, standards update notifications, and lessons learned integrated with technical standards, all available to the user from one Website. This was accomplished and is now being enhanced based on feedback from the Agency's engineering staff and supporting contractors. This paper addresses the development experiences with the NASA Technical Standards Program and the enhancement of the Agency's engineering capabilities provided by the Program's products. Metrics are provided on significant aspects of the Program.

  14. Incorporating a Capability for Estimating Inhalation Doses in ...

    EPA Pesticide Factsheets

    Report and Data Files This report presents the approach to be used to incorporate in the U.S. Environmental Protection Agency’s TEVA-SPOT software (U.S.EPA 2014) a capability for estimating inhalation doses that result from the most important sources of contaminated aerosols and volatile contaminants during a contamination event.

  15. Performance and Weight Estimates for an Advanced Open Rotor Engine

    NASA Technical Reports Server (NTRS)

    Hendricks, Eric S.; Tong, Michael T.

    2012-01-01

    NASA's Environmentally Responsible Aviation Project and Subsonic Fixed Wing Project are focused on developing concepts and technologies which may enable dramatic reductions in the environmental impact of future-generation subsonic aircraft. The open rotor concept (also historically referred to as an unducted fan or advanced turboprop) may allow for the achievement of this objective by reducing engine fuel consumption. To evaluate the potential impact of open rotor engines, cycle modeling and engine weight estimation capabilities have been developed. The initial development of the cycle modeling capabilities in the Numerical Propulsion System Simulation (NPSS) tool was presented in a previous paper. Following that initial development, further advancements have been made to the cycle modeling and weight estimation capabilities for open rotor engines and are presented in this paper. The developed modeling capabilities are used to predict the performance of an advanced open rotor concept using modern counter-rotating propeller designs. Finally, performance and weight estimates for this engine are presented and compared to results from a previous NASA study of advanced geared and direct-drive turbofans.

  16. Solute and heat transport model of the Henry and Hilleke laboratory experiment

    USGS Publications Warehouse

    Langevin, C.D.; Dausman, A.M.; Sukop, M.C.

    2010-01-01

    SEAWAT is a coupled version of MODFLOW and MT3DMS designed to simulate variable-density ground water flow and solute transport. The most recent version of SEAWAT, called SEAWAT Version 4, includes new capabilities to represent simultaneous multispecies solute and heat transport. To test the new features in SEAWAT, the laboratory experiment of Henry and Hilleke (1972) was simulated. Henry and Hilleke used warm fresh water to recharge a large sand-filled glass tank. A cold salt water boundary was represented on one side. Adjustable heating pads were used to heat the bottom and left sides of the tank. In the laboratory experiment, Henry and Hilleke observed both salt water and fresh water flow systems separated by a narrow transition zone. After minor tuning of several input parameters with a parameter estimation program, results from the SEAWAT simulation show good agreement with the experiment. SEAWAT results suggest that heat loss to the room was more than expected by Henry and Hilleke, and that multiple thermal convection cells are the likely cause of the widened transition zone near the hot end of the tank. Other computer programs with similar capabilities may benefit from benchmark testing with the Henry and Hilleke laboratory experiment. Journal Compilation © 2009 National Ground Water Association.

  17. Molecular Sieve Bench Testing and Computer Modeling

    NASA Technical Reports Server (NTRS)

    Mohamadinejad, Habib; DaLee, Robert C.; Blackmon, James B.

    1995-01-01

    The design of an efficient four-bed molecular sieve (4BMS) CO2 removal system for the International Space Station depends on many mission parameters, such as duration, crew size, cost of power, volume, fluid interface properties, etc. A need for space vehicle CO2 removal system models capable of accurately performing extrapolated hardware predictions is inevitable due to the change of the parameters which influences the CO2 removal system capacity. The purpose is to investigate the mathematical techniques required for a model capable of accurate extrapolated performance predictions and to obtain test data required to estimate mass transfer coefficients and verify the computer model. Models have been developed to demonstrate that the finite difference technique can be successfully applied to sorbents and conditions used in spacecraft CO2 removal systems. The nonisothermal, axially dispersed, plug flow model with linear driving force for 5X sorbent and pore diffusion for silica gel are then applied to test data. A more complex model, a non-darcian model (two dimensional), has also been developed for simulation of the test data. This model takes into account the channeling effect on column breakthrough. Four FORTRAN computer programs are presented: a two-dimensional model of flow adsorption/desorption in a packed bed; a one-dimensional model of flow adsorption/desorption in a packed bed; a model of thermal vacuum desorption; and a model of a tri-sectional packed bed with two different sorbent materials. The programs are capable of simulating up to four gas constituents for each process, which can be increased with a few minor changes.
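The plug-flow linear-driving-force (LDF) formulation mentioned above can be sketched as a minimal isothermal finite difference model. This is a toy stand-in for the report's nonisothermal FORTRAN programs: it is one-dimensional, ignores heat effects, axial dispersion, and channeling, and every parameter value below is hypothetical.

```python
import numpy as np

# Minimal isothermal 1-D plug-flow adsorption model with a linear driving
# force (LDF) uptake term, solved by explicit upwind finite differences.
n, dz = 50, 0.01            # grid cells, cell length (m)
u, dt = 0.05, 0.05          # interstitial velocity (m/s), time step (s)
k_ldf, K = 0.02, 5.0        # LDF coefficient (1/s), linear isotherm slope
c = np.zeros(n)             # gas-phase concentration (normalized)
q = np.zeros(n)             # adsorbed-phase loading (normalized)
c_in = 1.0                  # feed concentration at the inlet

for _ in range(4000):       # march 200 s forward in time
    dq = k_ldf * (K * c - q)            # LDF uptake rate toward equilibrium
    q += dt * dq
    # upwind advection plus loss of gas-phase species to the sorbent
    c_up = np.concatenate(([c_in], c[:-1]))
    c += dt * (-u * (c - c_up) / dz - dq)
    c = np.clip(c, 0.0, c_in)           # keep the explicit scheme bounded

print(f"outlet breakthrough c/c_in = {c[-1]:.3f}")
```

The explicit scheme is stable here because the Courant number u*dt/dz = 0.25 is below 1; the report's models additionally carry energy balances and nonlinear isotherms, which this sketch omits.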

  18. Myokit: A simple interface to cardiac cellular electrophysiology.

    PubMed

    Clerx, Michael; Collins, Pieter; de Lange, Enno; Volders, Paul G A

    2016-01-01

    Myokit is a new powerful and versatile software tool for modeling and simulation of cardiac cellular electrophysiology. Myokit consists of an easy-to-read modeling language, a graphical user interface, single and multi-cell simulation engines and a library of advanced analysis tools accessible through a Python interface. Models can be loaded from Myokit's native file format or imported from CellML. Model export is provided to C, MATLAB, CellML, CUDA and OpenCL. Patch-clamp data can be imported and used to estimate model parameters. In this paper, we review existing tools to simulate the cardiac cellular action potential to find that current tools do not cater specifically to model development and that there is a gap between easy-to-use but limited software and powerful tools that require strong programming skills from their users. We then describe Myokit's capabilities, focusing on its model description language, simulation engines and import/export facilities in detail. Using three examples, we show how Myokit can be used for clinically relevant investigations, multi-model testing and parameter estimation in Markov models, all with minimal programming effort from the user. This way, Myokit bridges a gap between performance, versatility and user-friendliness. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Computer modelling of cyclic deformation of high-temperature materials. Technical progress report, 1 September-30 November 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duesbery, M.S.

    1993-11-30

    This program aims at improving current methods of lifetime assessment by building in the characteristics of the micro-mechanisms known to be responsible for damage and failure. The broad approach entails the integration and, where necessary, augmentation of the micro-scale research results currently available in the literature into a macro-scale model with predictive capability. In more detail, the program will develop a set of hierarchically structured models at different length scales, from atomic to macroscopic, at each level taking as parametric input the results of the model at the next smaller scale. In this way the known microscopic properties can be transported by systematic procedures to the unknown macro-scale region. It may not be possible to eliminate empiricism completely, because some of the quantities involved cannot yet be estimated to the required degree of precision. In this case the aim will be at least to eliminate functional empiricism. Restriction of empiricism to the choice of parameters to be input to known functional forms permits some confidence in extrapolation procedures and has the advantage that the models can readily be updated as better estimates of the parameters become available.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, Kenneth; Oxstrand, Johanna

    The Digital Architecture effort is a part of the Department of Energy (DOE) sponsored Light-Water Reactor Sustainability (LWRS) Program conducted at Idaho National Laboratory (INL). The LWRS program is performed in close collaboration with industry research and development (R&D) programs that provide the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants (NPPs). One of the primary missions of the LWRS program is to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. Therefore, a major objective of the LWRS program is the development of a seamless digital environment for plant operations and support by integrating information from plant systems with plant processes for nuclear workers through an array of interconnected technologies. In order to get the most benefits of the advanced technology suggested by the different research activities in the LWRS program, the nuclear utilities need a digital architecture in place to support the technology. A digital architecture can be defined as a collection of information technology (IT) capabilities needed to support and integrate a wide spectrum of real-time digital capabilities for nuclear power plant performance improvements. It is not hard to imagine that many processes within the plant can be largely improved from both a system and human performance perspective by utilizing a plant wide (or near plant wide) wireless network. For example, a plant wide wireless network allows for real time plant status information to easily be accessed in the control room, field workers’ computer-based procedures can be updated based on the real time plant status, and status on ongoing procedures can be incorporated into smart schedules in the outage command center to allow for more accurate planning of critical tasks. 
The goal of the digital architecture project is to provide a long-term strategy to integrate plant systems, plant processes, and plant workers. This includes technologies to improve nuclear worker efficiency and human performance; to offset a range of plant surveillance and testing activities with new on-line monitoring technologies; to improve command, control, and collaboration in settings such as outage control centers and work execution centers; and finally to improve operator performance with new operator aid technologies for the control room. The requirements identified through the activities in the Digital Architecture project will be used to estimate the amount of traffic on the network and hence to estimate the minimum bandwidth needed.

  1. Scheduling language and algorithm development study. Volume 2: Use of the basic language and module library

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. A.; Cornick, D. E.; Flater, J. F.; Odoherty, R. J.; Peterson, F. M.; Ramsey, H. R.; Willoughby, J. K.

    1974-01-01

    The capabilities of the specified scheduling language and the program module library are outlined. The summary is written with the potential user in mind and, therefore, provides maximum insight into how the capabilities will be helpful in writing scheduling programs. Simple examples and illustrations are provided to assist the potential user in applying the capabilities to his problem.

  2. Minimum Detectable Dose as a Measure of Bioassay Programme Capability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carbaugh, Eugene H.

    2003-01-01

    This paper suggests that minimum detectable dose (MDD) be used to describe the capability of bioassay programs for which intakes are expected to be rare. This allows expression of the capability in units that correspond directly to primary dose limits. The concept uses the well-established analytical statistic minimum detectable amount (MDA) as the starting point and assumes MDA detection at a prescribed time post intake. The resulting dose can then be used as an indication of the adequacy or capability of the program for demonstrating compliance with the performance criteria. MDDs can be readily tabulated or plotted to demonstrate the effectiveness of different types of monitoring programs. The inclusion of cost factors for bioassay measurements can allow optimisation.
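    The MDD concept described above lends itself to a simple calculation: scale the MDA up to a minimum detectable intake using the intake retention fraction expected at the measurement time, then convert that intake to dose with a dose coefficient. A minimal sketch follows; the function names and all numeric values are hypothetical illustrations, not values from the paper.

```python
# Illustrative sketch of the minimum detectable dose (MDD) concept:
# convert an analytical minimum detectable amount (MDA) into dose units
# via an intake retention fraction (IRF) and a dose coefficient.
# All numbers below are hypothetical.

def minimum_detectable_dose(mda_bq, irf_at_t, dose_coeff_msv_per_bq):
    """MDD (mSv) for a bioassay measurement performed at time t post intake.

    mda_bq: minimum detectable amount in the sample (Bq)
    irf_at_t: fraction of the intake expected in the sample at time t
    dose_coeff_msv_per_bq: committed effective dose per unit intake (mSv/Bq)
    """
    minimum_detectable_intake = mda_bq / irf_at_t   # intake at the detection limit
    return minimum_detectable_intake * dose_coeff_msv_per_bq

# Example: MDA of 1 Bq, 2% of the intake present at the measurement time,
# dose coefficient of 1e-2 mSv/Bq (all hypothetical values).
mdd = minimum_detectable_dose(1.0, 0.02, 1e-2)
```

Tabulating this quantity over a grid of measurement times is what produces the MDD curves the paper describes.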

  3. Using H. Stephen Glenn's Developing Capable People Program with Adults in Montana: How Effective Is the Curriculum?

    ERIC Educational Resources Information Center

    Astroth, Kirk A.; Lorbeer, Scott

    1998-01-01

    Pre/posttest scores of 30 participants in H. Stephen Glenn's Developing Capable People (DCP) program offered by Montana Extension showed that DCP effectively increased the use of positive behaviors and decreased negative behaviors in adults interacting with youth. These changes were sustained over 18 months after program completion. (SK)

  4. Real-time flutter identification

    NASA Technical Reports Server (NTRS)

    Roy, R.; Walker, R.

    1985-01-01

    The techniques and a FORTRAN 77 MOdal Parameter IDentification (MOPID) computer program developed for identification of the frequencies and damping ratios of multiple flutter modes in real time are documented. Physically meaningful model parameterization was combined with state-of-the-art recursive identification techniques and applied to the problem of real-time flutter mode monitoring. The performance of the algorithm in terms of convergence speed and parameter estimation error is demonstrated for several simulated data cases, and the results of actual flight data analysis from two different vehicles are presented. The results indicate that the algorithm is capable of real-time monitoring of aircraft flutter characteristics with a high degree of reliability.
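    The recursive identification idea behind tools like MOPID can be illustrated, in much simplified form, by fitting a second-order autoregressive model to response samples with recursive least squares and converting the estimated poles to a modal frequency and damping ratio. This is a sketch of the general technique, not the MOPID algorithm itself; the mode parameters and sample rate below are invented.

```python
import cmath
import math

def rls_ar2(y, lam=1.0):
    """Recursive least squares fit of y[n] = a1*y[n-1] + a2*y[n-2]."""
    theta = [0.0, 0.0]                      # [a1, a2] estimates
    P = [[1e6, 0.0], [0.0, 1e6]]            # large initial covariance
    for n in range(2, len(y)):
        phi = [y[n-1], y[n-2]]
        Pphi = [P[0][0]*phi[0] + P[0][1]*phi[1],
                P[1][0]*phi[0] + P[1][1]*phi[1]]
        denom = lam + phi[0]*Pphi[0] + phi[1]*Pphi[1]
        k = [Pphi[0]/denom, Pphi[1]/denom]  # gain vector
        err = y[n] - (theta[0]*phi[0] + theta[1]*phi[1])
        theta = [theta[0] + k[0]*err, theta[1] + k[1]*err]
        P = [[(P[i][j] - k[i]*Pphi[j])/lam for j in range(2)] for i in range(2)]
    return theta

def freq_damping(a1, a2, dt):
    """Map AR(2) coefficients to a modal frequency (Hz) and damping ratio."""
    p = (a1 + cmath.sqrt(a1*a1 + 4*a2)) / 2   # upper pole of z^2 - a1*z - a2
    s = cmath.log(p) / dt                      # continuous-time pole
    wn = abs(s)
    return wn / (2*math.pi), -s.real / wn

# Simulate a single 5 Hz flutter mode with 2% damping (invented values)
dt, f_true, z_true = 0.01, 5.0, 0.02
wn = 2*math.pi*f_true
wd = wn*math.sqrt(1 - z_true**2)
y = [math.exp(-z_true*wn*n*dt)*math.cos(wd*n*dt) for n in range(500)]
a1, a2 = rls_ar2(y)
f_est, zeta_est = freq_damping(a1, a2, dt)
```

The forgetting factor `lam` can be set below 1.0 to track slowly varying flutter parameters, which is the essence of real-time monitoring.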

  5. BlueSNP: R package for highly scalable genome-wide association studies using Hadoop clusters.

    PubMed

    Huang, Hailiang; Tata, Sandeep; Prill, Robert J

    2013-01-01

    Computational workloads for genome-wide association studies (GWAS) are growing in scale and complexity outpacing the capabilities of single-threaded software designed for personal computers. The BlueSNP R package implements GWAS statistical tests in the R programming language and executes the calculations across computer clusters configured with Apache Hadoop, a de facto standard framework for distributed data processing using the MapReduce formalism. BlueSNP makes computationally intensive analyses, such as estimating empirical p-values via data permutation, and searching for expression quantitative trait loci over thousands of genes, feasible for large genotype-phenotype datasets. http://github.com/ibm-bioinformatics/bluesnp
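    The empirical p-value calculation that BlueSNP distributes across a Hadoop cluster can be sketched in single-machine form. The association statistic below (difference in mean phenotype between genotype groups) is a stand-in for BlueSNP's actual tests, and all names and data are hypothetical.

```python
import random

def empirical_p_value(genotypes, phenotypes, n_perm=1000, seed=0):
    """Permutation-based empirical p-value for a simple association statistic."""
    rng = random.Random(seed)

    def stat(phen):
        carriers = [p for g, p in zip(genotypes, phen) if g > 0]
        others = [p for g, p in zip(genotypes, phen) if g == 0]
        return abs(sum(carriers)/len(carriers) - sum(others)/len(others))

    observed = stat(phenotypes)
    phen = list(phenotypes)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(phen)                 # break the genotype-phenotype link
        if stat(phen) >= observed:
            hits += 1
    # add-one correction keeps p > 0 with a finite number of permutations
    return (hits + 1) / (n_perm + 1)

# Strong association: carriers (g = 1) have clearly higher phenotype values
g = [0]*10 + [1]*10
y = [0.0]*10 + [5.0]*10
p = empirical_p_value(g, y)
```

Running this loop independently for millions of SNPs is an embarrassingly parallel workload, which is why the MapReduce formalism fits it well.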

  6. Evaluating science return in space exploration initiative architectures

    NASA Technical Reports Server (NTRS)

    Budden, Nancy Ann; Spudis, Paul D.

    1993-01-01

    Science is an important aspect of the Space Exploration Initiative, a program to explore the Moon and Mars with people and machines. Different SEI mission architectures are evaluated on the basis of three variables: access (to the planet's surface), capability (including number of crew, equipment, and supporting infrastructure), and time (the total number of man-hours available for scientific activities). This technique allows us to estimate the scientific return to be expected from different architectures and from different implementations of the same architecture. Our methodology allows us to maximize the scientific return from the initiative by illuminating the different emphases and returns that result from the alternative architectural decisions.

  7. Portability studies of modular data base managers. Interim reports. [Running CDC's DATATRAN 2 on IBM 360/370 and IBM's JOSHUA on CDC computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kopp, H.J.; Mortensen, G.A.

    1978-04-01

    Approximately 60% of the full CDC 6600/7600 DATATRAN 2.0 capability was made operational on IBM 360/370 equipment. Sufficient capability was made operational to demonstrate adequate performance for modular program linking applications. Also demonstrated were the basic capabilities and performance required to support moderate-sized data base applications and moderately active scratch input/output applications. Approximately one to two calendar years are required to develop DATATRAN 2.0 capabilities fully for the entire spectrum of applications proposed. Included in the next stage of conversion should be syntax checking and syntax conversion features that would foster greater FORTRAN compatibility between IBM- and CDC-developed modules. The batch portion of the JOSHUA Modular System, which was developed by Savannah River Laboratory to run on an IBM computer, was examined for the feasibility of conversion to run on a Control Data Corporation (CDC) computer. Portions of the JOSHUA Precompiler were changed so as to be operable on the CDC computer. The Data Manager and Batch Monitor were also examined for conversion feasibility, but no changes were made in them. It appears to be feasible to convert the batch portion of the JOSHUA Modular System to run on a CDC computer with an estimated additional two to three man-years of effort. 9 tables.

  8. Identification of potential compensatory muscle strategies in a breast cancer survivor population: A combined computational and experimental approach.

    PubMed

    Chopp-Hurley, Jaclyn N; Brookham, Rebecca L; Dickerson, Clark R

    2016-12-01

    Biomechanical models are often used to estimate the muscular demands of various activities. However, specific muscle dysfunctions typical of unique clinical populations are rarely considered. Due to iatrogenic tissue damage, pectoralis major capability is markedly reduced in breast cancer survivors, which could influence arm internal and external rotation muscular strategies. Accordingly, an optimization-based muscle force prediction model was systematically modified to emulate breast cancer survivors by adjusting pectoralis capability and enforcing an empirical muscular co-activation relationship. Model permutations were evaluated through comparisons between predicted muscle forces and empirically measured muscle activations in survivors. Similarities between empirical data and model outputs were influenced by muscle type, hand force, pectoralis major capability, and co-activation constraints. Differences in magnitude were lower when the co-activation constraint was enforced (-18.4% [31.9]) than unenforced (-23.5% [27.6]) (p<0.0001). This research demonstrates that muscle dysfunction in breast cancer survivors can be reflected by including a capability constraint for pectoralis major. Further refinement of the co-activation constraint for survivors could improve its generalizability across this population and activities. Improving biomechanical models to more accurately represent clinical populations can provide novel information that can help in the development of optimal treatment programs for breast cancer survivors. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. ASC FY17 Implementation Plan, Rev. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamilton, P. G.

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions.

  10. Development of National Program of Cancer Registries SAS Tool for Population-Based Cancer Relative Survival Analysis.

    PubMed

    Dong, Xing; Zhang, Kevin; Ren, Yuan; Wilson, Reda; O'Neil, Mary Elizabeth

    2016-01-01

    Studying population-based cancer survival by leveraging the high-quality cancer incidence data collected by the Centers for Disease Control and Prevention's National Program of Cancer Registries (NPCR) can offer valuable insight into the cancer burden and impact in the United States. We describe the development and validation of a SAS macro tool that calculates population-based cancer site-specific relative survival estimates comparable to those obtained through SEER*Stat. The NPCR relative survival analysis SAS tool (NPCR SAS tool) was developed based on the relative survival method and SAS macros developed by Paul Dickman. NPCR cancer incidence data from 25 states submitted in November 2012 were used, specifically cases diagnosed from 2003 to 2010 with follow-up through 2010. Decennial and annual complete life tables published by the National Center for Health Statistics (NCHS) for 2000 through 2009 were used. To assess comparability between the 2 tools, 5-year relative survival rates were calculated for 25 cancer sites by sex, race, and age group using the NPCR SAS tool and the National Cancer Institute's SEER*Stat 8.1.5 software. A module to create data files for SEER*Stat was also developed for the NPCR SAS tool. Comparison of the results produced by both SAS and SEER*Stat showed comparable and reliable relative survival estimates for NPCR data. For a majority of the sites, the net differences between the NPCR SAS tool and SEER*Stat-produced relative survival estimates ranged from -0.1% to 0.1%. The estimated standard errors were highly comparable between the 2 tools as well. The NPCR SAS tool will allow researchers to accurately estimate cancer 5-year relative survival estimates that are comparable to those produced by SEER*Stat for NPCR data. Comparison of output from the NPCR SAS tool and SEER*Stat provided additional quality control capabilities for evaluating data prior to producing NPCR relative survival estimates.
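    The relative survival method underlying both tools divides cumulative observed survival in the cancer cohort by the cumulative expected survival of a comparable general-population group drawn from life tables. A minimal sketch of that ratio follows; the interval probabilities are invented for illustration and are not NPCR data.

```python
# Sketch of the core relative survival idea (Dickman-style): cumulative
# observed survival divided by cumulative expected survival from
# general-population life tables. All numbers are hypothetical.

def cumulative_survival(interval_probs):
    """Cumulative survival from per-interval conditional survival probabilities."""
    out, cum = [], 1.0
    for p in interval_probs:
        cum *= p
        out.append(cum)
    return out

def relative_survival(observed_intervals, expected_intervals):
    obs = cumulative_survival(observed_intervals)
    exp = cumulative_survival(expected_intervals)
    return [o / e for o, e in zip(obs, exp)]

# Five annual intervals: conditional survival in the cancer cohort vs the
# expected survival of a matched general-population group (made-up values).
observed = [0.90, 0.92, 0.94, 0.95, 0.96]
expected = [0.98, 0.98, 0.97, 0.97, 0.97]
rs = relative_survival(observed, expected)   # rs[-1] is 5-year relative survival
```

A production tool like the NPCR SAS macro adds the real machinery: matching each patient to life-table expected survival by age, sex, race, and calendar year, and computing standard errors.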

  11. Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meisner, Robert; McCoy, Michel; Archer, Bill

    2013-09-11

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Moreover, ASC's business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools.

  12. NASTRAN interfacing modules within the Integrated Analysis Capability (IAC) Program

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1986-01-01

    The IAC program provides the framework required for the development of an extensive multidisciplinary analysis capability. Several NASTRAN related capabilities were developed which can all be expanded in a routine manner to meet in-house unique needs. Plans are to complete the work discussed herein and to provide it to the engineering community through COSMIC. Release is to be after the current IAC Level 2 contract work on the IAC executive system is completed and meshed with the interfacing modules and analysis capabilities under development at the GSFC.

  13. cp-R, an interface to the R programming language for clinical laboratory method comparisons.

    PubMed

    Holmes, Daniel T

    2015-02-01

    Clinical scientists frequently need to compare two different bioanalytical methods as part of assay validation/monitoring. As a matter of necessity, regression methods for quantitative comparison in clinical chemistry, hematology, and other clinical laboratory disciplines must allow for error in both the x and y variables. Traditionally, the methods popularized by 1) Deming and 2) Passing and Bablok have been recommended. While commercial tools exist, no simple open-source tool is available. The purpose of this work was to develop an entirely open-source, GUI-driven program for bioanalytical method comparisons capable of performing these regression methods and able to produce highly customized graphical output. The GUI is written in Python and PyQt4, with R scripts performing regression and graphical functions. The program can be run from source code or as a pre-compiled binary executable. The software performs three forms of regression and offers weighting where applicable. Confidence bands of the regression are calculated using bootstrapping for the Deming and Passing-Bablok methods. Users can customize regression plots according to the tools available in R and can produce output in any of jpg, png, tiff, or bmp at any desired resolution, or in ps and pdf vector formats. Bland-Altman plots and some regression diagnostic plots are also generated. Correctness of regression parameter estimates was confirmed against existing R packages. The program allows for rapid and highly customizable graphical output capable of conforming to the publication requirements of any clinical chemistry journal. Quick method comparisons can also be performed and the results cut and pasted into spreadsheet or word processing applications. We present a simple and intuitive open-source tool for quantitative method comparison in a clinical laboratory environment. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
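    Of the regression methods mentioned, Deming regression has a simple closed-form solution that is easy to sketch. The example below assumes equal error variance in both methods (variance ratio of 1.0), and the two-method data are hypothetical; this is the textbook estimator, not cp-R's internal code.

```python
import math

def deming(x, y, delta=1.0):
    """Deming regression slope/intercept, allowing for error in both x and y.

    delta: ratio of y-error variance to x-error variance (1.0 assumes equal
    analytical imprecision in the two methods).
    """
    n = len(x)
    mx, my = sum(x)/n, sum(y)/n
    sxx = sum((xi - mx)**2 for xi in x) / (n - 1)
    syy = sum((yi - my)**2 for yi in y) / (n - 1)
    sxy = sum((xi - mx)*(yi - my) for xi, yi in zip(x, y)) / (n - 1)
    slope = (syy - delta*sxx
             + math.sqrt((syy - delta*sxx)**2 + 4*delta*sxy**2)) / (2*sxy)
    intercept = my - slope*mx
    return slope, intercept

# Two hypothetical methods measuring the same analyte; method B reads
# roughly 10% high with a small constant offset.
a = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
b = [1.2, 2.25, 3.35, 4.4, 5.55, 6.6]
slope, intercept = deming(a, b)
```

Bootstrapping confidence bands, as cp-R does, amounts to resampling the (x, y) pairs with replacement and recomputing this estimator many times.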

  14. Improved CPAS Photogrammetric Capabilities for Engineering Development Unit (EDU) Testing

    NASA Technical Reports Server (NTRS)

    Ray, Eric S.; Bretz, David R.

    2013-01-01

    This paper focuses on two key improvements to the photogrammetric analysis capabilities of the Capsule Parachute Assembly System (CPAS) for the Orion vehicle. The Engineering Development Unit (EDU) system deploys Drogue and Pilot parachutes via mortar, where an important metric is the muzzle velocity. This can be estimated using a high speed camera pointed along the mortar trajectory. The distance to the camera is computed from the apparent size of features of known dimension. This method was validated with a ground test and compares favorably with simulations. The second major photogrammetric product is measuring the geometry of the Main parachute cluster during steady-state descent using onboard cameras. This is challenging as the current test vehicles are suspended by a single-point attachment unlike earlier stable platforms suspended under a confluence fitting. The mathematical modeling of fly-out angles and projected areas has undergone significant revision. As the test program continues, several lessons were learned about optimizing the camera usage, installation, and settings to obtain the highest quality imagery possible.
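    The distance-from-apparent-size idea used for the mortar muzzle velocity estimate can be sketched with a pinhole camera model: a feature of known physical size L that appears l pixels wide lies at range Z = f*L/l, and tracking the apparent size frame to frame gives range versus time, hence speed. The focal length, feature size, pixel measurements, and frame rate below are all invented, not CPAS values.

```python
# Pinhole-model ranging sketch: range from the apparent size of a feature
# of known dimension, then average speed along the camera axis from the
# change in range between frames. All numbers are hypothetical.

def range_from_size(focal_px, true_size_m, apparent_size_px):
    """Range (m) to a feature of known size under the pinhole model."""
    return focal_px * true_size_m / apparent_size_px

def muzzle_velocity(focal_px, true_size_m, sizes_px, frame_rate_hz):
    """Average speed along the camera axis from per-frame apparent sizes."""
    ranges = [range_from_size(focal_px, true_size_m, s) for s in sizes_px]
    dt = 1.0 / frame_rate_hz
    return (ranges[-1] - ranges[0]) / (dt * (len(ranges) - 1))

# 1000 px focal length, 0.3 m feature, apparent size shrinking from
# 100 px to 60 px over 5 frames at 100 fps (made-up numbers).
v = muzzle_velocity(1000.0, 0.3, [100.0, 88.0, 78.0, 68.0, 60.0], 100.0)
```

In practice a least-squares fit of range versus time over many frames would be used rather than a two-endpoint difference.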

  15. Direct Estimation of Power Distribution in Reactors for Nuclear Thermal Space Propulsion

    NASA Astrophysics Data System (ADS)

    Aldemir, Tunc; Miller, Don W.; Burghelea, Andrei

    2004-02-01

    A recently proposed constant temperature power sensor (CTPS) has the capability to directly measure the local power deposition rate in nuclear reactor cores proposed for space thermal propulsion. Such a capability reduces the uncertainties in the estimated power peaking factors and hence increases the reliability of the nuclear engine. The CTPS operation is sensitive to changes in the local thermal conditions. A procedure is described for the automatic on-line calibration of the sensor through estimation of changes in thermal conditions.

  16. The Efficacy of Machine Learning Programs for Navy Manpower Analysis

    DTIC Science & Technology

    1993-03-01

    This thesis investigated the efficacy of two machine learning programs for Navy manpower analysis. Two machine learning programs, AIM and IXL, were... to generate models from the two commercial machine learning programs. Using a held-out subset of the data, the capabilities of the three models were... partial effects. The author recommended further investigation of AIM's capabilities, and testing in an operational environment. Keywords: machine learning, AIM, IXL.

  17. System-of-Systems Technology-Portfolio-Analysis Tool

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.
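    The parametric roll-up that ATLAS-style tools perform can be sketched in miniature: each system's estimated mass feeds a cost estimating relationship (CER), commonly of power-law form cost = a * mass**b, and the system costs sum to a portfolio estimate. The coefficients and masses below are invented for illustration and are not ATLAS values.

```python
# Hypothetical sketch of a parametric cost roll-up: a power-law CER per
# system, summed across a portfolio. Coefficients are made up.

def cer_cost(mass_kg, a=1.2, b=0.8):
    """Power-law CER: cost (arbitrary $M) as a function of dry mass."""
    return a * mass_kg**b

def portfolio_cost(system_masses):
    """Total estimated cost for a set of systems in the portfolio."""
    return sum(cer_cost(m) for m in system_masses)

cost = portfolio_cost([1000.0, 2500.0, 400.0])
```

The exponent b < 1 captures the usual economy of scale in parametric cost models; real tools calibrate a and b per system class from historical data.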

  18. Seasat-A ASVT: Commercial demonstration experiments. Results analysis methodology for the Seasat-A case studies

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The SEASAT-A commercial demonstration program ASVT is described. The program consists of a set of experiments involving (1) the evaluation of a real-time data distribution system, the SEASAT-A user data distribution system, which provided near real-time dissemination of ocean condition and weather data products from the U.S. Navy Fleet Numerical Weather Central to a selected set of commercial and industrial users, and (2) case studies, performed by commercial and industrial users, using the data gathered by SEASAT-A during its operational life. The impact of the SEASAT-A data on business operations is evaluated by the commercial and industrial users. The approach followed in the performance of the case studies is described, along with the methodology used in the analysis and integration of the case study results to estimate the actual and potential economic benefits of improved ocean condition and weather forecast data.

  19. Taking the Measure of Massive Stars and their Environments with the CHARA Array Long-baseline Interferometer

    NASA Astrophysics Data System (ADS)

    Gies, Douglas R.

    2017-11-01

    Most massive stars are so distant that their angular diameters are too small for direct resolution. However, the observational situation is now much more favorable, thanks to new opportunities available with optical/IR long-baseline interferometry. The Georgia State University Center for High Angular Resolution Astronomy Array at Mount Wilson Observatory is a six-telescope instrument with a maximum baseline of 330 meters, which is capable of resolving stellar disks with diameters as small as 0.2 milliarcsec. The distant stars are no longer out of range, and many kinds of investigations are possible. Here we summarize a number of studies involving angular diameter measurements and effective temperature estimates for OB stars, binary and multiple stars (including the σ Orionis system), and outflows in Luminous Blue Variables. An enlarged visitors program will begin in 2017 that will open many opportunities for new programs in high angular resolution astronomy.
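    The 0.2 milliarcsecond figure quoted above can be checked with the standard rule of thumb that an interferometer resolves angular scales of roughly lambda/(2B). Assuming a visible wavelength of about 0.7 micron (an assumed value, not stated in the abstract) and CHARA's 330 m maximum baseline:

```python
import math

# Back-of-envelope check of interferometric resolution: theta ~ lambda/(2B),
# converted from radians to milliarcseconds. The wavelength is an assumption.

RAD_TO_MAS = 180/math.pi * 3600 * 1000   # radians -> milliarcseconds

def resolution_mas(wavelength_m, baseline_m):
    return (wavelength_m / (2*baseline_m)) * RAD_TO_MAS

theta = resolution_mas(0.7e-6, 330.0)    # ~0.2 mas, consistent with the text
```

The calculation lands close to the quoted 0.2 mas, which is why a 330 m baseline suffices to resolve the disks of distant OB stars.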

  20. Systems Analysis for Large Army Formations.

    DTIC Science & Technology

    1984-06-01

    Science & Technology Division, April 1982. 8. Deitel, H.M., An Introduction to Operating Systems, Addison-Wesley Systems Programming Series, 1982, pp... information about enemy units and their structure. The corresponding application program should provide the user with the capability to enter, maintain and... corresponding application program should provide the Operations Subsystem personnel the capability to enter, retrieve, and modify proposed changes to

  1. VTGRAPH - GRAPHIC SOFTWARE TOOL FOR VT TERMINALS

    NASA Technical Reports Server (NTRS)

    Wang, C.

    1994-01-01

    VTGRAPH is a graphics software tool for DEC/VT or VT-compatible terminals, which are widely used by government and industry. It is a FORTRAN- or C-callable library designed to allow the user to work in the many computer environments that use VT terminals for window management and graphics. It also provides a PLOT10-like package plus color or shade capability for VT240, VT241, and VT300 terminals. The program is transportable to many different computers which use VT terminals. With this graphics package, the user can easily design friendlier user interface programs and design PLOT10 programs on VT terminals with different computer systems. VTGRAPH was developed using the ReGIS graphics set, which provides a full range of graphics capabilities. The basic VTGRAPH capabilities are as follows: window management, PLOT10-compatible drawing, generic program routines for two- and three-dimensional plotting, and color or shaded graphics capability. The program was developed in VAX FORTRAN in 1988. VTGRAPH requires a ReGIS graphics set terminal and a FORTRAN compiler. The program has been run on a DEC MicroVAX 3600 series computer operating under VMS 5.0, and has a virtual memory requirement of 5KB.

  2. The Impact of Advanced Greenhouse Gas Measurement Science on Policy Goals and Research Strategies

    NASA Astrophysics Data System (ADS)

    Abrahams, L.; Clavin, C.; McKittrick, A.

    2016-12-01

    In support of the Paris agreement, accurate characterization of U.S. greenhouse gas (GHG) emissions estimates has been an area of increased scientific focus. Over the last several years, the scientific community has placed significant emphasis on understanding, quantifying, and reconciling measurement and modeling methods that characterize methane emissions from petroleum and natural gas sources. This work has prompted national policy discussions and led to the improvement of regional and national methane emissions estimates. Research campaigns focusing on reconciling atmospheric measurements ("top-down") and process-based emissions estimates ("bottom-up") have sought to identify where measurement technology advances could inform policy objectives. A clear next step is the development and deployment of advanced detection capabilities that could aid U.S. emissions mitigation and verification goals. The breadth of policy-relevant outcomes associated with advances in GHG measurement science is demonstrated by recent improvements in the petroleum and natural gas sector emission estimates in the EPA Greenhouse Gas Inventory, ambitious efforts to apply inverse modeling results to inform or validate the national GHG inventory, and outcomes from federal GHG measurement science technology development programs. In this work, we explore the variety of policy-relevant outcomes impacted by advances in GHG measurement science, with an emphasis on improving GHG inventory estimates, identifying emissions mitigation strategies, and informing technology development requirements.

  3. Approaches in highly parameterized inversion - PEST++, a Parameter ESTimation code optimized for large environmental models

    USGS Publications Warehouse

    Welter, David E.; Doherty, John E.; Hunt, Randall J.; Muffels, Christopher T.; Tonkin, Matthew J.; Schreuder, Willem A.

    2012-01-01

    An object-oriented parameter estimation code was developed to incorporate benefits of object-oriented programming techniques for solving large parameter estimation modeling problems. The code is written in C++ and is a formulation and expansion of the algorithms included in PEST, a widely used parameter estimation code written in Fortran. The new code is called PEST++ and is designed to lower the barriers of entry for users and developers while providing efficient algorithms that can accommodate large, highly parameterized problems. This effort has focused on (1) implementing the most popular features of PEST in a fashion that is easy for novice or experienced modelers to use and (2) creating a software design that is easy to extend; that is, this effort provides a documented object-oriented framework designed from the ground up to be modular and extensible. In addition, all PEST++ source code and its associated libraries, as well as the general run manager source code, have been integrated in the Microsoft Visual Studio® 2010 integrated development environment. The PEST++ code is designed to provide a foundation for an open-source development environment capable of producing robust and efficient parameter estimation tools for the environmental modeling community into the future.
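    The core iteration in PEST-style parameter estimation is a damped Gauss-Newton (Gauss-Marquardt-Levenberg) scheme: perturb each parameter to build a finite-difference Jacobian of model outputs, then solve the damped normal equations for a parameter upgrade vector. The sketch below applies that idea to a toy two-parameter exponential, not to an environmental model; all names and values are illustrative.

```python
import math

def glm_estimate(model, observed, p0, n_iter=30, lam=0.01):
    """Damped Gauss-Newton (Marquardt) iteration for a 2-parameter model."""
    p = list(p0)
    for _ in range(n_iter):
        base = model(p)
        r = [o - m for o, m in zip(observed, base)]          # residuals
        cols = []                                            # Jacobian columns
        for j in range(len(p)):
            dp = 1e-6 * max(abs(p[j]), 1.0)                  # perturbation size
            q = list(p)
            q[j] += dp
            cols.append([(mi - bi)/dp for mi, bi in zip(model(q), base)])
        # damped normal equations (J'J + lam*I) delta = J'r, for 2 parameters
        a11 = sum(c*c for c in cols[0]) + lam
        a22 = sum(c*c for c in cols[1]) + lam
        a12 = sum(c0*c1 for c0, c1 in zip(cols[0], cols[1]))
        b1 = sum(c*ri for c, ri in zip(cols[0], r))
        b2 = sum(c*ri for c, ri in zip(cols[1], r))
        det = a11*a22 - a12*a12
        p[0] += (a22*b1 - a12*b2)/det
        p[1] += (a11*b2 - a12*b1)/det
    return p

# Toy "model": amplitude and decay rate of an exponential observed at 5 times
times = [0.0, 0.5, 1.0, 1.5, 2.0]
def model(p):
    return [p[0]*math.exp(-p[1]*t) for t in times]

obs = model([2.0, 0.7])                  # synthetic, noise-free observations
p_est = glm_estimate(model, obs, [1.0, 1.0])
```

In a highly parameterized setting the Jacobian has thousands of columns, each requiring a model run, which is why PEST++ couples this iteration to a parallel run manager.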

  4. Estimating the State of Aerodynamic Flows in the Presence of Modeling Errors

    NASA Astrophysics Data System (ADS)

    da Silva, Andre F. C.; Colonius, Tim

    2017-11-01

    The ensemble Kalman filter (EnKF) has proven successful in fields such as meteorology, in which high-dimensional nonlinear systems render classical estimation techniques impractical. When the model used to forecast state evolution misrepresents important aspects of the true dynamics, estimator performance may degrade. In this work, parametrization and state augmentation are used to track misspecified boundary conditions (e.g., free stream perturbations). The resolution error is modeled as a Gaussian-distributed random variable with the mean (bias) and variance to be determined. The dynamics of the flow past a NACA 0009 airfoil at high angles of attack and moderate Reynolds number is represented by a Navier-Stokes solver with immersed boundary capabilities. The pressure distribution on the airfoil or the velocity field in the wake, both randomized by synthetic noise, are sampled as measurement data and incorporated into the estimated state and bias following Kalman's analysis scheme. Insights about how to specify the modeling error covariance matrix and its impact on estimator performance are conveyed. This work has been supported in part by a grant from AFOSR (FA9550-14-1-0328) with Dr. Douglas Smith as program manager, and by a Science without Borders scholarship from the Ministry of Education of Brazil (Capes Foundation - BEX 12966/13-4).
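    The state-augmentation idea can be illustrated on a toy problem far simpler than a Navier-Stokes flow: append an unknown constant bias to the state vector and let the ensemble Kalman update estimate both jointly from noisy measurements. The dynamics, noise levels, and bias value below are invented for the sketch.

```python
import random
import statistics

# Toy EnKF with bias augmentation: each member carries (x, b), the forecast
# model is x' = 0.9*x + b, and only x is observed. All numbers are made up.

random.seed(1)

def enkf_bias(measurements, n_ens=200, meas_std=0.1):
    ens = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n_ens)]
    for z in measurements:
        ens = [(0.9*x + b + random.gauss(0, 0.01), b) for x, b in ens]  # forecast
        xs = [x for x, _ in ens]
        bs = [b for _, b in ens]
        mx, mb = statistics.mean(xs), statistics.mean(bs)
        pxx = sum((x - mx)**2 for x in xs) / (n_ens - 1)
        pbx = sum((b - mb)*(x - mx) for b, x in zip(bs, xs)) / (n_ens - 1)
        s = pxx + meas_std**2
        kx, kb = pxx/s, pbx/s            # Kalman gains for x and for the bias
        updated = []
        for x, b in ens:
            zp = z + random.gauss(0, meas_std)   # perturbed observation
            updated.append((x + kx*(zp - x), b + kb*(zp - x)))
        ens = updated
    return (statistics.mean(x for x, _ in ens),
            statistics.mean(b for _, b in ens))

# Truth: x' = 0.9*x + 0.5, i.e. an unknown constant bias of 0.5
x_true, zs = 0.0, []
for _ in range(60):
    x_true = 0.9*x_true + 0.5
    zs.append(x_true + random.gauss(0, 0.1))
x_est, b_est = enkf_bias(zs)
```

The bias is never observed directly; it is recovered through its sample correlation with the observed variable, which is exactly the mechanism the augmented EnKF relies on for misspecified boundary conditions.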

  5. Impact of expanded newborn screening--United States, 2006.

    PubMed

    2008-09-19

    Universal newborn screening for selected metabolic, endocrine, hematologic, and functional disorders is a well-established practice of state public health programs. Recent developments in tandem mass spectrometry (MS/MS), which is now capable of high-throughput multi-analyte analysis, have enabled newborn screening to include many more disorders detectable from a newborn blood spot. In 2006, to address the substantial variation that existed from state to state in the number of disorders included in newborn screening panels, the American College of Medical Genetics (ACMG), under guidance from the Health Resources and Services Administration, recommended a uniform panel of 29 disorders, which was subsequently endorsed by the federal Advisory Committee on Heritable Disorders in Newborns and Children. After 2006, most states began to expand their panels to include all 29 disorders; currently, 21 states and the District of Columbia have fully implemented the ACMG panel. To estimate the burden to state newborn screening programs resulting from this expansion, CDC used 2001-2006 data from those states with well-established MS/MS screening programs to estimate the number of children in the United States who would have been identified with disorders in 2006 if all 50 states and the District of Columbia had been using the ACMG panel. This report describes the results of that analysis, which indicated that, although such an expansion would have increased the number of children identified by only 47% (from 4,370 to 6,439), these children would have had many rare disorders that require local or regional capacity to deliver expertise in screening, diagnosis, and management. The findings underscore the need for public health and health-care delivery systems to build or expand the programs required to manage the rare disorders detected through expanded newborn screening, while also continuing programs to address more common disorders.

  6. Survey of computer programs for heat transfer analysis

    NASA Astrophysics Data System (ADS)

    Noor, A. K.

    An overview is presented of the current capabilities of thirty-eight computer programs that can be used for solution of heat transfer problems. These programs range from the large, general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to the small, special-purpose codes with limited user community, such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form, followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) It is useful only in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) Since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.

  7. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1982-01-01

    An overview is presented of the current capabilities of thirty-eight computer programs that can be used for the solution of heat transfer problems. These programs range from large, general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to small, special-purpose codes with a limited user community, such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form, followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) it is useful only in the initial selection of the programs most suitable for a particular application; the final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.

  8. The NASA Space Launch System Program Systems Engineering Approach for Affordability

    NASA Technical Reports Server (NTRS)

    Hutt, John J.; Whitehead, Josh; Hanson, John

    2017-01-01

    The National Aeronautics and Space Administration is currently developing the Space Launch System (SLS) to provide the United States with a capability to launch large payloads into low Earth orbit and deep space. One of the development tenets of the SLS Program is affordability. One initiative to enhance affordability is the SLS approach to requirements definition, verification, and system certification. The key aspects of this initiative include: 1) minimizing the number of requirements, 2) elimination of explicit verification requirements, 3) use of certified models of subsystem capability in lieu of requirements when appropriate, and 4) certification of capability beyond minimum required capability. Implementation of each aspect is described and compared to a "typical" systems engineering implementation, including a discussion of relative risk. Examples of each implementation within the SLS Program are provided.

  9. Effectiveness evaluation of STOL transport operations (phase 2). [computer simulation program of commercial short haul aircraft operations

    NASA Technical Reports Server (NTRS)

    Welp, D. W.; Brown, R. A.; Ullman, D. G.; Kuhner, M. B.

    1974-01-01

    A computer simulation program which models a commercial short-haul aircraft operating in the civil air system was developed. The purpose of the program is to evaluate the effect of a given aircraft avionics capability on the ability of the aircraft to perform on-time carrier operations. The program outputs consist primarily of those quantities which can be used to determine direct operating costs. These include: (1) schedule reliability or delays, (2) repairs/replacements, (3) fuel consumption, and (4) cancellations. More comprehensive models of the terminal area environment were added and a simulation of an existing airline operation was conducted to obtain a form of model verification. The capability of the program to provide comparative results (sensitivity analysis) was then demonstrated by modifying the aircraft avionics capability for additional computer simulations.

  10. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation. Appendix A: ROBSIM user's guide

    NASA Technical Reports Server (NTRS)

    Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelley, J. H.; Depkovich, T. M.; Wolfe, W. J.; Nguyen, T.

    1986-01-01

    The purpose of the Robotics Simulation Program (ROBSIM) is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotics systems. ROBSIM is programmed in FORTRAN 77 for use on a VAX 11/750 computer under the VMS operating system. This user's guide describes the capabilities of the ROBSIM programs, including the system definition function, the analysis tools function, and the postprocessor function. The options a user may encounter with each of these executables are explained in detail, and the different program prompts appearing to the user are included. Some useful suggestions concerning the appropriate answers to be given by the user are provided. An example user-interactive run is enclosed for each of the main program services, and some of the capabilities are illustrated.

  11. KSC-2012-1265

    NASA Image and Video Library

    2012-02-07

    CAPE CANAVERAL, Fla. -- Commercial Crew Program (CCP) Manager Ed Mango, left, and Deputy Program Manager Brent Jett host a Program Strategy Forum at NASA's Kennedy Space Center in Florida. The forum was held to update industry partners about NASA's next phase of developing commercial space transportation system capabilities. CCP is helping to mature the design and development of a crew transportation system with the overall goal of accelerating a United States-led capability to the International Space Station. The goal of the program is to drive down the cost of space travel as well as open up space to more people than ever before by balancing industry’s own innovative capabilities with NASA's 50 years of human spaceflight experience. For more information, visit www.nasa.gov/commercialcrew. Photo credit: NASA/Kim Shiflett

  12. RPM-WEBBSYS: A web-based computer system to apply the rational polynomial method for estimating static formation temperatures of petroleum and geothermal wells

    NASA Astrophysics Data System (ADS)

    Wong-Loya, J. A.; Santoyo, E.; Andaverde, J. A.; Quiroz-Ruiz, A.

    2015-12-01

    A Web-Based Computer System (RPM-WEBBSYS) has been developed for the application of the Rational Polynomial Method (RPM) to estimate static formation temperatures (SFT) of geothermal and petroleum wells. The system is also capable of reproducing the full thermal recovery processes that occur during well completion. RPM-WEBBSYS has been programmed using advances in information technology to perform computations of SFT more efficiently. RPM-WEBBSYS may be easily and rapidly executed by using any computing device (e.g., personal computers and portable computing devices such as tablets or smartphones) with Internet access and a web browser. The computer system was validated using bottomhole temperature (BHT) measurements logged in a synthetic heat transfer experiment, where a good match between predicted and true SFT was achieved. RPM-WEBBSYS was finally applied to BHT logs collected from well drilling and shut-in operations, where the typical problems of under- and over-estimation of the SFT (exhibited by most existing analytical methods) were effectively corrected.
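The idea behind the RPM, fitting a rational polynomial to the build-up of bottomhole temperature with shut-in time and extrapolating to infinite time, can be sketched as follows. This is a minimal low-order illustration with synthetic data, not the authors' exact formulation:

```python
import numpy as np

# Synthetic shut-in times (hours) and BHT measurements (deg C) recovering
# toward an assumed true static formation temperature of 120 deg C.
t = np.array([6.0, 12.0, 18.0, 24.0, 30.0, 36.0])
T = 120.0 - 60.0 / (1.0 + 0.15 * t)          # synthetic recovery curve

# Fit T(t) = (p0 + p1*t) / (1 + q1*t) by rewriting it as the linear system
#   T = p0 + p1*t - q1*(t*T)
A = np.column_stack([np.ones_like(t), t, -t * T])
p0, p1, q1 = np.linalg.lstsq(A, T, rcond=None)[0]

# As t -> infinity, T -> p1/q1: the static formation temperature estimate.
sft = p1 / q1
print(round(sft, 1))
```

Because the synthetic recovery curve is itself a first-order rational function, the least-squares fit recovers the 120 deg C asymptote essentially exactly; field BHT data would of course carry noise and model error.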

  13. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1986-01-01

    An overview is given of the current capabilities of thirty-three computer programs that are used to solve heat transfer problems. The programs considered range from large general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ABAQUS, ANSYS, EAL, MARC, MITAS II, MSC/NASTRAN, and SAMCEF) to small, special-purpose codes with a limited user community, such as ANDES, NTEMP, TAC2D, TAC3D, TEPSA, and TRUMP. The majority of the programs use either finite elements or finite differences for the spatial discretization. The capabilities of the programs are listed in tabular form, followed by a summary of the major features of each program. The information presented herein is based on a questionnaire sent to the developers of each program. This information is preceded by brief background material needed for effective evaluation and use of computer programs for heat transfer analysis. The present survey is useful in the initial selection of the programs most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program.
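Most of the surveyed codes discretize space with finite elements or finite differences. The kernel of the finite-difference approach, for 1-D transient conduction, can be sketched as follows (illustrative values, not tied to any surveyed program):

```python
import numpy as np

# Explicit finite-difference solution of 1-D transient conduction
# dT/dt = alpha * d2T/dx2 on a rod with fixed-temperature ends.
alpha, L, n = 1e-4, 1.0, 21          # diffusivity (m^2/s), length (m), nodes
dx = L / (n - 1)
dt = 0.4 * dx * dx / alpha           # satisfies stability limit dt <= dx^2/(2*alpha)

T = np.full(n, 20.0)                 # initial temperature (deg C)
T[0], T[-1] = 100.0, 100.0           # fixed boundary temperatures

for _ in range(2000):
    # Each interior node moves toward the average of its neighbors.
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])

print(round(T[n // 2], 1))           # midpoint temperature after 2000 steps
```

With the Fourier number held at 0.4 (below the explicit-scheme limit of 0.5), each update is a convex combination of neighboring values, so the solution stays bounded by the boundary temperatures while relaxing toward steady state.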

  14. Neural network fusion capabilities for efficient implementation of tracking algorithms

    NASA Astrophysics Data System (ADS)

    Sundareshan, Malur K.; Amoozegar, Farid

    1996-05-01

    The ability to efficiently fuse information of different forms to facilitate intelligent decision-making is one of the major capabilities of trained multilayer neural networks that has been recognized in recent times. While the development of innovative adaptive control algorithms for nonlinear dynamical plants that attempt to exploit these capabilities has been more popular, a corresponding development of nonlinear estimation algorithms using these approaches, particularly for application in target surveillance and guidance operations, has not received similar attention. In this paper we describe the capabilities and functionality of neural network algorithms for data fusion and the implementation of nonlinear tracking filters. To discuss the details and to serve as a vehicle for quantitative performance evaluations, the illustrative case of estimating the position and velocity of surveillance targets is considered. Efficient target tracking algorithms that can utilize data from a host of sensing modalities and are capable of reliably tracking even uncooperative targets executing fast and complex maneuvers are of interest in a number of applications. The primary motivation for employing neural networks in these applications comes from the efficiency with which features extracted from different sensor measurements can be utilized as inputs for estimating target maneuvers. Such an approach results in an overall nonlinear tracking filter which has several advantages over popular efforts at designing nonlinear estimation algorithms for tracking applications, the principal one being the reduction of mathematical and computational complexities. A system architecture that efficiently integrates the processing capabilities of a trained multilayer neural net with the tracking performance of a Kalman filter is described in this paper.
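The architecture pairs a neural network with a Kalman filter for position/velocity estimation. The Kalman-filter half, for a constant-velocity target observed through noisy position measurements, can be sketched as follows (synthetic data and a generic model, not the paper's exact architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity state transition
H = np.array([[1.0, 0.0]])                 # only position is measured
Q = 1e-4 * np.eye(2)                       # process noise covariance
R = np.array([[0.25]])                     # measurement noise covariance

x = np.array([0.0, 0.0])                   # state estimate [position, velocity]
P = np.eye(2)                              # estimate covariance

true_pos, true_vel = 0.0, 1.0
for _ in range(50):
    true_pos += true_vel * dt
    z = true_pos + rng.normal(0.0, 0.5)    # noisy position measurement
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P

print(x)   # estimated [position, velocity]
```

In the paper's scheme, the neural network would supply fused, feature-based inputs (e.g., maneuver estimates) to such a filter rather than replace it.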

  15. NDE reliability and probability of detection (POD) evolution and paradigm shift

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Surendra

    2014-02-18

    The subject of NDE Reliability and POD has gone through multiple phases since its humble beginning in the late 1960s. This was followed by several programs, including the important one nicknamed "Have Cracks – Will Travel," or in short "Have Cracks," by Lockheed Georgia Company for the US Air Force during 1974–1978. This and other studies ultimately led to a series of developments in the field of reliability and POD, starting from the introduction of fracture mechanics and Damage Tolerant Design (DTD), to the statistical framework of Berens and Hovey in 1981 for POD estimation, to MIL-HDBK-1823 (1999) and 1823A (2009). During the last decade, various groups and researchers have further studied reliability and POD using Model Assisted POD (MAPOD), Simulation Assisted POD (SAPOD), and Bayesian statistics. Each of these developments had one objective: improving the accuracy of life prediction in components, which to a large extent depends on the reliability and capability of NDE methods. Therefore, it is essential to have reliable detection and sizing of large flaws in components. Currently, POD is used for studying the reliability and capability of NDE methods, though POD data offers no absolute truth regarding NDE reliability, i.e., system capability, effects of flaw morphology, and quantifying the human factors. Furthermore, reliability and POD have been treated as alike in meaning, but POD is not NDE reliability. POD is a subset of reliability, which consists of six phases: 1) sample selection using DOE, 2) NDE equipment setup and calibration, 3) System Measurement Evaluation (SME), including Gage Repeatability and Reproducibility (Gage R and R) and Analysis Of Variance (ANOVA), 4) NDE system capability and electronic and physical saturation, 5) acquiring and fitting data to a model, and data analysis, and 6) POD estimation.
This paper provides an overview of all major POD milestones of the last several decades and discusses the rationale for using Integrated Computational Materials Engineering (ICME), MAPOD, SAPOD, and Bayesian statistics for studying controllable and non-controllable variables, including human factors, for estimating POD. Another objective is to list gaps between "hoped for" versus validated or fielded failed hardware.
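The POD-estimation phase is commonly performed by fitting a logistic model of detection probability versus log flaw size, the hit/miss approach associated with MIL-HDBK-1823A. A minimal sketch with hypothetical inspection data (the flaw sizes and hit/miss outcomes below are invented for illustration):

```python
import numpy as np

# Hit/miss POD: fit POD(a) = 1 / (1 + exp(-(b0 + b1*ln a))) by logistic
# regression in log flaw size, using a few Newton (IRLS) steps.
a = np.array([0.5, 0.8, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0])   # flaw size, mm
hits = np.array([0, 0, 1, 0, 1, 1, 1, 1])                # detected? (hypothetical)

X = np.column_stack([np.ones_like(a), np.log(a)])
beta = np.zeros(2)
for _ in range(25):                                      # Newton / IRLS iterations
    p = 1.0 / (1.0 + np.exp(-X @ beta))                  # current POD predictions
    W = p * (1.0 - p)                                    # logistic weights
    grad = X.T @ (hits - p)
    hess = X.T @ (X * W[:, None])
    beta = beta + np.linalg.solve(hess, grad)

# a90: flaw size detected with 90% probability under the fitted model.
a90 = np.exp((np.log(9.0) - beta[0]) / beta[1])
print(round(a90, 2))
```

A full 1823A analysis would add confidence bounds (e.g., a90/95) via the estimator's covariance; this sketch shows only the point fit.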

  16. Human Research Program Exploration Medical Capability

    NASA Technical Reports Server (NTRS)

    Barsten, Kristina

    2010-01-01

    NASA's Human Research Program (HRP) conducts and coordinates research projects that provide human health and performance countermeasures, knowledge, technologies, and tools to enable safe, reliable, and productive human space exploration. The Program is divided into six major elements, which: a) provide the Program's knowledge and capabilities to conduct research addressing the human health and performance risks, and b) advance the readiness levels of technology and countermeasures to the point of transfer to the customer programs and organizations. The National Space Biomedical Research Institute (NSBRI) is a partner with the HRP in developing a successful research program.

  17. AFRL’s ALREST Physics-Based Combustion Stability Program

    DTIC Science & Technology

    2012-11-08

    enduring challenge because of the inherent complexities in the physics of multiphase turbulent flames. The present paper provides the Air Force...Combustor fidelity: URANS, LES, steady RANS, HLES. Current SOA capability with 2,000 cores; capability at program end in 2015 (2,000 cores + GPUs)...ALREST validation cases ("Final Exam"): stable single-element hydrogen (PSU), stable single-element methane (Singla), supercritical non

  18. Development of a curved pipe capability for the NASTRAN finite element program

    NASA Technical Reports Server (NTRS)

    Jeter, J. W., Jr.

    1977-01-01

    A curved pipe element capability for the NASTRAN structural analysis program is developed using the NASTRAN dummy element feature. A description is given of the theory involved in the subroutines which describe stiffness, mass, thermal and enforced deformation loads, and force and stress recovery for the curved pipe element. Incorporation of these subroutines into NASTRAN is discussed. Test problems are proposed. Instructions on use of the new element capability are provided.

  19. Software Technology for Adaptable Reliable Systems (STARS) Workshop March 24-27 1986.

    DTIC Science & Technology

    1986-03-01

    syntax is augmented to accept design notes in arbitrary...monitor program behavior. Trace and single-step facilities will provide the capability...capabilities of these workstations make them a logical choice for hosting a visual development environment. The final component of Vise is the...We...the following capabilities: graphical program display and...When the user picks the desired action, linguistic analysis is used to extract informa

  20. INFORM: An interactive data collection and display program with debugging capability

    NASA Technical Reports Server (NTRS)

    Cwynar, D. S.

    1980-01-01

    A computer program was developed to aid ASSEMBLY language programmers of mini- and microcomputers in solving the man-machine communication problems that exist when scaled integers are involved. In addition to producing displays of quasi-steady-state values, INFORM provides an interactive mode for debugging programs, making program patches, and modifying the displays. Auxiliary routines SAMPLE and DATAO add dynamic data acquisition and high-speed dynamic display capability to the program. Programming information and flow charts to aid in implementing INFORM on various machines, together with descriptions of all supportive software, are provided. Program modifications to satisfy the individual user's needs are considered.

  1. Extension of Generalized Fluid System Simulation Program's Fluid Property Database

    NASA Technical Reports Server (NTRS)

    Patel, Kishan

    2011-01-01

    This internship focused on the development of additional capabilities for the Generalized Fluid System Simulation Program (GFSSP). GFSSP is a thermo-fluid code used to evaluate system performance by a finite volume-based network analysis method. The program was developed primarily to analyze the complex internal flow of propulsion systems and is capable of solving many problems related to thermodynamics and fluid mechanics. GFSSP is integrated with thermodynamic programs that provide fluid properties for sub-cooled, superheated, and saturation states. For fluids that are not included in the thermodynamic property program, look-up property tables can be provided. The look-up property tables of the current release version can only handle sub-cooled and superheated states. The primary purpose of the internship was to extend the look-up tables to handle saturated states. This involves a) generation of a property table using REFPROP, a widely used thermodynamic property program, and b) modification of the Fortran source code to read in an additional property table containing saturation data for both saturated liquid and saturated vapor states. Also, a method was implemented to calculate the thermodynamic properties of user-fluids within the saturation region, given values of pressure and enthalpy. These additions required new code to be written, and older code had to be adjusted to accommodate the new capabilities. Ultimately, the changes will lead to the incorporation of this new capability in future versions of GFSSP. This paper describes the development and validation of the new capability.
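The saturation-region calculation described above, in which quality follows from pressure and enthalpy and mixture properties are interpolated between the saturated-liquid and saturated-vapor states, can be sketched as follows (the table values below are illustrative placeholders, not GFSSP or REFPROP data):

```python
# Thermodynamic properties inside the saturation dome, given pressure and
# enthalpy, from a look-up table of saturated-state values.
sat_table = {
    # P (kPa): (h_f, h_g) = saturated liquid/vapor enthalpy, kJ/kg
    100.0: (417.4, 2675.1),
    200.0: (504.7, 2706.2),
}

def quality(p_kpa, h):
    """Vapor quality x = (h - hf) / (hg - hf); 0 <= x <= 1 inside the dome."""
    hf, hg = sat_table[p_kpa]
    return (h - hf) / (hg - hf)

def mix(prop_f, prop_g, x):
    """Any mixture property interpolates linearly between saturated states."""
    return prop_f + x * (prop_g - prop_f)

x = quality(100.0, 1546.25)   # enthalpy halfway between hf and hg
print(x)
```

A production implementation would also interpolate between table pressures; this sketch shows only the quality and lever-rule steps for a tabulated pressure.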

  2. 7 CFR 281.4 - Determining Indian tribal organization capability.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ....4 Section 281.4 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE FOOD STAMP AND FOOD DISTRIBUTION PROGRAM ADMINISTRATION OF THE FOOD STAMP PROGRAM ON INDIAN RESERVATIONS § 281.4 Determining Indian tribal organization capability. (a...

  3. 7 CFR 281.4 - Determining Indian tribal organization capability.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ....4 Section 281.4 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE FOOD STAMP AND FOOD DISTRIBUTION PROGRAM ADMINISTRATION OF THE FOOD STAMP PROGRAM ON INDIAN RESERVATIONS § 281.4 Determining Indian tribal organization capability. (a...

  4. 7 CFR 281.4 - Determining Indian tribal organization capability.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ....4 Section 281.4 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE FOOD STAMP AND FOOD DISTRIBUTION PROGRAM ADMINISTRATION OF THE FOOD STAMP PROGRAM ON INDIAN RESERVATIONS § 281.4 Determining Indian tribal organization capability. (a...

  5. 7 CFR 281.4 - Determining Indian tribal organization capability.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....4 Section 281.4 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE FOOD STAMP AND FOOD DISTRIBUTION PROGRAM ADMINISTRATION OF THE FOOD STAMP PROGRAM ON INDIAN RESERVATIONS § 281.4 Determining Indian tribal organization capability. (a...

  6. Developing Soft Power Using Afloat Medical Capability

    DTIC Science & Technology

    2009-03-02

    the National Security Strategy. Depending on its program employment, it has the capability to effectively combine the other elements of national power...Strategy through the employment of combatant commanders' Theater Security Cooperation (TSC) Program in their areas of responsibility. The TSC program is...In the final phases of the Pacific campaign during World War II, tactical doctrine for employment of Navy hospital vessels changed, allowing them to

  7. Implementation of a Mentor-Protege Program by a Major Department of Defense Contractor

    DTIC Science & Technology

    1991-06-01

    Department of Defense contractors to furnish disadvantaged small business concerns with assistance designed to enhance their capabilities to perform...implementation by analyzing the perceptions of one large DoD contractor and the small disadvantaged business community regarding the Mentor-Protege program...offered by this program would be effective in improving the capabilities of small disadvantaged businesses. There are, however, several barriers present

  8. Educational Experiences of Embry-Riddle Students through NASA Research Collaboration

    NASA Technical Reports Server (NTRS)

    Schlee, Keith; Gangadharan, Sathya; Chatman, Yadira; Sudermann, James; Walker, Charles; Ristow, James

    2006-01-01

    NASA's educational programs benefit students and faculty while increasing the overall productivity of the organization. The NASA Graduate Student Research Program (GSRP) awards fellowships for graduate study leading to both masters and doctoral degrees in several technical fields. GSRP participants have the option to utilize NASA Centers and/or university research facilities. In addition, GSRP students can serve as mentors for undergraduate students to provide a truly unique learning experience. NASA's Cooperative Education Program allows undergraduate students the chance to gain "real-world" work experience in the field. It also gives NASA a no-risk capability to evaluate the true performance of a prospective new hire without relying solely on a "paper resume," while providing the students with greater hiring potential upon graduation, at NASA or elsewhere. University faculty can also benefit by participating in the NASA Faculty Fellowship Program (NFFP). This program gives faculty an opportunity to work with NASA peers. The Mission Analysis Branch of the Expendable Launch Vehicles Division at NASA Kennedy Space Center has utilized these programs with students from Embry-Riddle Aeronautical University (ERAU) to conduct research in modeling and developing a parameter estimation method for spacecraft fuel slosh using simple pendulum analogs. Simple pendulum models are used to understand complicated spacecraft fuel slosh behavior. A robust parameter estimation process will help to identify the parameters that will predict the response fairly accurately during the initial stages of design. These programs provide students with a unique opportunity to work on "real-world" aerospace problems, like spacecraft fuel slosh. This in turn reinforces their problem-solving abilities and their communication skills, such as interviewing, resume writing, technical writing, and presentation.
University collaborations with NASA and industry help students acquire skills that are vital for their success upon entering the workforce.
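The pendulum-analog parameter estimation described above can be illustrated with the classical log-decrement method, which recovers a slosh mode's natural frequency and damping ratio from a decaying oscillation record. This is a simple stand-in for the authors' procedure, applied here to synthetic data:

```python
import numpy as np

# Synthetic damped-pendulum slosh response with known parameters.
zeta_true, wn_true = 0.05, 2.0                     # damping ratio, natural freq (rad/s)
wd = wn_true * np.sqrt(1.0 - zeta_true**2)         # damped frequency
t = np.linspace(0.0, 30.0, 30001)
y = np.exp(-zeta_true * wn_true * t) * np.cos(wd * t)

# Locate successive response peaks.
peaks = [i for i in range(1, len(y) - 1) if y[i] > y[i - 1] and y[i] > y[i + 1]]
t1, t2 = t[peaks[0]], t[peaks[1]]

# Log decrement between consecutive peaks gives the damping ratio;
# the peak spacing gives the damped period, hence the natural frequency.
delta = np.log(y[peaks[0]] / y[peaks[1]])
zeta_est = delta / np.sqrt(4.0 * np.pi**2 + delta**2)
wn_est = 2.0 * np.pi / ((t2 - t1) * np.sqrt(1.0 - zeta_est**2))
print(round(wn_est, 2), round(zeta_est, 3))
```

With noisy test data, one would fit across many peaks (or use least squares on the full record) rather than a single peak pair.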

  9. Digital PIV (DPIV) Software Analysis System

    NASA Technical Reports Server (NTRS)

    Blackshire, James L.

    1997-01-01

    A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides an automated image capture, test correlation, and autocorrelation analysis capability for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitutes a complete DPIV analysis capability. The programs are run on an IBM PC/AT host computer running either Microsoft Windows 3.1 or Windows 95 using a 'quickwin' format that allows simple user interface and output capabilities to the windows environment.
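The core computation of such a correlation-based PIV package, recovering particle displacement from the peak of an FFT-based cross-correlation of two interrogation windows, can be sketched as follows (synthetic windows, not the LaRC code itself):

```python
import numpy as np

# Estimate the displacement between two PIV interrogation windows from the
# peak of their circular cross-correlation, computed via FFT.
rng = np.random.default_rng(1)
win1 = rng.random((32, 32))                  # first exposure (synthetic speckle)
shift = (3, 5)                               # true displacement (rows, cols)
win2 = np.roll(win1, shift, axis=(0, 1))     # second exposure: shifted copy

corr = np.fft.ifft2(np.fft.fft2(win1).conj() * np.fft.fft2(win2)).real
peak = np.unravel_index(np.argmax(corr), corr.shape)
print(peak)                                  # recovered (row, col) displacement
```

Real DPIV analysis adds sub-pixel peak interpolation and outlier validation on top of this kernel, as the data validation algorithm mentioned in the abstract suggests.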

  10. The Capabilities of the Graphical Observation Scheduling System (GROSS) as Used by the Astro-2 Spacelab Mission

    NASA Technical Reports Server (NTRS)

    Phillips, Shaun

    1996-01-01

    The Graphical Observation Scheduling System (GROSS) and its functionality and editing capabilities are reported on. The GROSS system was developed as a replacement for a suite of existing programs and associated processes, with the aim of providing a software tool that combines the functionality of several of the existing programs and provides a Graphical User Interface (GUI) that gives greater data visibility and editing capability. It is considered that the improved editing capability provided by this approach enhanced the efficiency of mission planning for the second astronomical Spacelab mission (ASTRO-2).

  11. Probabilistic Analysis and Density Parameter Estimation Within Nessus

    NASA Astrophysics Data System (ADS)

    Godines, Cody R.; Manteufel, Randall D.

    2002-12-01

    This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA providing a large number of students exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of a NESSUS manual has been revised and improved, with additional example problems being added to expand the scope of the example application problems. This report documents two research accomplishments in the integration of a new sampling algorithm into NESSUS and in the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called out early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods. 
For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation, and 99th percentile of the four responses at the 50 percent confidence level and using the same number of response evaluations for each method. In addition, LHS requires fewer calculations than MC in order to be 99.7 percent confident that a single mean, standard deviation, or 99th percentile estimate will be within at most 3 percent of the true value of each parameter. Again, this is shown for all of the test cases studied. For that reason it can be said that NESSUS is an important reliability tool that has a variety of sound probabilistic methods a user can employ; furthermore, the newest LHS module is a valuable enhancement of the program.
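The reported advantage of LHS over Monte Carlo, lower estimation error for the same number of response evaluations, can be reproduced in miniature. This is a generic illustration with a toy response function, not the NESSUS implementation or the SAE test cases:

```python
import numpy as np

# Compare plain Monte Carlo with Latin Hypercube Sampling when estimating
# the mean of a response y = x1 + x2**2 with x1, x2 ~ U(0, 1).
# LHS stratifies each input dimension, reducing the scatter of the estimate.
rng = np.random.default_rng(42)

def lhs(n, d):
    """One LHS design: each column is a stratified, shuffled U(0,1) sample."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        rng.shuffle(u[:, j])
    return u

def response(x):
    return x[:, 0] + x[:, 1] ** 2

n, reps = 50, 200
mc_means = [response(rng.random((n, 2))).mean() for _ in range(reps)]
lhs_means = [response(lhs(n, 2)).mean() for _ in range(reps)]

mc_err = np.std(mc_means)      # scatter of the MC mean estimate
lhs_err = np.std(lhs_means)    # scatter of the LHS mean estimate
print(lhs_err < mc_err)
```

For this additive response the variance reduction is dramatic; for responses with strong input interactions the LHS advantage is smaller, which is why studies such as the one above compare the methods across several test cases.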

  12. Probabilistic Analysis and Density Parameter Estimation Within Nessus

    NASA Technical Reports Server (NTRS)

    Godines, Cody R.; Manteufel, Randall D.; Chamis, Christos C. (Technical Monitor)

    2002-01-01

    This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA providing a large number of students exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of a NESSUS manual has been revised and improved, with additional example problems being added to expand the scope of the example application problems. This report documents two research accomplishments in the integration of a new sampling algorithm into NESSUS and in the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called out early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods. 
For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation, and 99th percentile of the four responses at the 50 percent confidence level and using the same number of response evaluations for each method. In addition, LHS requires fewer calculations than MC in order to be 99.7 percent confident that a single mean, standard deviation, or 99th percentile estimate will be within at most 3 percent of the true value of each parameter. Again, this is shown for all of the test cases studied. For that reason it can be said that NESSUS is an important reliability tool that has a variety of sound probabilistic methods a user can employ; furthermore, the newest LHS module is a valuable enhancement of the program.

  13. Parameter estimation accuracies of Galactic binaries with eLISA

    NASA Astrophysics Data System (ADS)

    Błaut, Arkadiusz

    2018-09-01

    We study the parameter estimation accuracy of nearly monochromatic sources of gravitational waves with future eLISA-like detectors. eLISA will be capable of observing millions of such signals, generated by orbiting pairs of compact binaries consisting of white dwarfs, neutron stars, or black holes, and of resolving and estimating the parameters of several thousand of them, providing crucial information regarding their orbital dynamics, formation rates, and evolutionary paths. Using the Fisher matrix analysis, we compare the accuracies of the estimated parameters for different mission designs defined by the GOAT advisory team established to assess the scientific capabilities and technological issues of the eLISA-like missions.
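The Fisher-matrix machinery used for such forecasts can be illustrated for a single monochromatic signal in white noise, a highly simplified stand-in for the eLISA response model (all parameter values below are arbitrary):

```python
import numpy as np

# Fisher-matrix error forecast for h(t) = A*cos(2*pi*f*t + phi) sampled in
# white noise of standard deviation sigma.
A, f, phi, sigma, T, dt = 1.0, 1e-3, 0.3, 5.0, 1.0e5, 10.0
t = np.arange(0.0, T, dt)

def h(params):
    a, fr, ph = params
    return a * np.cos(2.0 * np.pi * fr * t + ph)

p0 = np.array([A, f, phi])
eps = np.array([1e-6, 1e-9, 1e-6])     # central-difference step per parameter

# Numerical partial derivatives of the template w.r.t. each parameter.
D = [(h(p0 + e * np.eye(3)[i]) - h(p0 - e * np.eye(3)[i])) / (2.0 * e)
     for i, e in enumerate(eps)]

# Fisher matrix F_ij = (1/sigma^2) * sum_t dh_i * dh_j for white noise;
# its inverse approximates the parameter covariance.
F = np.array([[np.dot(Di, Dj) for Dj in D] for Di in D]) / sigma**2
cov = np.linalg.inv(F)
errs = np.sqrt(np.diag(cov))           # 1-sigma errors on A, f, phi
print(errs)
```

The familiar Fisher scalings appear here: the frequency error shrinks with observation time much faster than the amplitude and phase errors, which is why long eLISA mission durations sharpen frequency (and hence orbital-dynamics) measurements most.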

  14. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.
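One capability class named above, parameter estimation with nonlinear least squares, can be sketched generically. The example below uses SciPy rather than Dakota itself, with a synthetic model and data; it illustrates the idea, not Dakota's input syntax:

```python
# Parameter estimation by nonlinear least squares: recover (a, k) in the
# hypothetical model y = a*exp(-k*t) from noisy synthetic data.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
t = np.linspace(0.0, 4.0, 50)
true = np.array([2.5, 0.7])                       # "unknown" parameters (a, k)
data = true[0] * np.exp(-true[1] * t) + 0.02 * rng.standard_normal(t.size)

def residuals(theta):
    a, k = theta
    return a * np.exp(-k * t) - data              # misfit the optimizer minimizes

fit = least_squares(residuals, x0=[1.0, 1.0])     # iterative least-squares solve
print("estimated (a, k):", fit.x)
```

In a simulation-based workflow the residual function would wrap a call to the simulation code, which is the interfacing role Dakota plays.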

  15. Effect of present technology on airship capabilities

    NASA Technical Reports Server (NTRS)

    Madden, R. T.

    1975-01-01

    The effects of updating past airship designs with current materials and propulsion systems are presented, with the goal of determining new airship performance and productivity capabilities. New materials and power plants permit reductions in the empty weights and increases in the useful load capabilities of past airship designs. The increased useful load capability results in increased productivity for a given range: increased payload at the same operating speed, increased operating speed for the same payload weight, or a combination of both. Estimated investment and operating costs are presented to indicate the significant cost parameters in estimating transportation costs of payloads in cents per ton-mile. Investment costs are presented considering production lots of 1, 10, and 100 units. Operating costs are presented considering flight speeds and ranges.

  16. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clouse, C. J.; Edwards, M. J.; McCoy, M. G.

    2015-07-07

    Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides the high performance computing platforms upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

  17. Random harmonic analysis program, L221 (TEV156). Volume 1: Engineering and usage

    NASA Technical Reports Server (NTRS)

    Miller, R. D.; Graham, M. L.

    1979-01-01

    A digital computer program capable of calculating steady state solutions for linear second order differential equations due to sinusoidal forcing functions is described. The field of application of the program, the analysis of airplane response and loads due to continuous random air turbulence, is discussed. Optional capabilities including frequency dependent input matrices, feedback damping, gradual gust penetration, multiple excitation forcing functions, and a static elastic solution are described. Program usage and a description of the analysis used are presented.
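The core calculation such a program performs, the steady-state response of a linear second-order equation under sinusoidal forcing, can be sketched from the complex frequency-response function. The coefficients below are hypothetical, not from the L221 analysis:

```python
# Steady-state response of m*x'' + c*x' + k*x = F0*sin(w*t): amplitude and
# phase lag follow from the complex frequency-response function H(w).
import numpy as np

def steady_state(m, c, k, F0, w):
    # Complex dynamic stiffness (k - m*w^2) + i*c*w; its inverse is H(w)
    H = 1.0 / ((k - m * w**2) + 1j * c * w)
    amplitude = F0 * abs(H)
    phase_lag = -np.angle(H)        # response lags the forcing by this angle (rad)
    return amplitude, phase_lag

amp, phase = steady_state(m=1.0, c=0.4, k=25.0, F0=2.0, w=4.0)
print(f"amplitude = {amp:.4f}, phase lag = {phase:.4f} rad")
```

For random-turbulence analysis, the same frequency response evaluated over a band of frequencies is what gets combined with the gust power spectral density.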

  18. The JPL Library information retrieval system

    NASA Technical Reports Server (NTRS)

    Walsh, J.

    1975-01-01

    The development, capabilities, and products of the computer-based retrieval system of the Jet Propulsion Laboratory Library are described. The system handles books and documents, produces a book catalog, and provides a machine search capability. Programs and documentation are available to the public through NASA's computer software dissemination program.

  19. Financial Analysis of the Northeast Corridor Development Project : Volume 2. Appendixes E Through I.

    DOT National Transportation Integrated Search

    1976-11-01

    This appendix consists of two parts. The first part, Program Capability, contains a description of the capability of the program and is intended to bridge the gap between the descriptive material contained in Appendix D and the explanation of procedu...

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, Michel; Archer, Bill; Hendrickson, Bruce

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance; experimental research, development, and engineering programs; and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support them. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, the study of advanced nuclear weapons design and manufacturing processes, the analysis of accident scenarios and weapons aging, and the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details) and for quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  1. Sandia Laboratories technical capabilities: engineering analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundergan, C. D.

    1975-12-01

    This report characterizes the engineering analysis capabilities at Sandia Laboratories. Selected applications of these capabilities are presented to illustrate the extent to which they can be applied in research and development programs. (auth)

  2. Development of Modern Performance Assessment Tools and Capabilities for Underground Disposal of Transuranic Waste at WIPP

    NASA Astrophysics Data System (ADS)

    Zeitler, T.; Kirchner, T. B.; Hammond, G. E.; Park, H.

    2014-12-01

    The Waste Isolation Pilot Plant (WIPP) has been developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. Containment of TRU waste at the WIPP is regulated by the U.S. Environmental Protection Agency (EPA). The DOE demonstrates compliance with the containment requirements by means of performance assessment (PA) calculations. WIPP PA calculations estimate the probability and consequence of potential radionuclide releases from the repository to the accessible environment for a regulatory period of 10,000 years after facility closure. The long-term performance of the repository is assessed using a suite of sophisticated computational codes. In a broad modernization effort, the DOE has overseen the transfer of these codes to modern hardware and software platforms. Additionally, there is a current effort to establish new performance assessment capabilities through the further development of the PFLOTRAN software, a state-of-the-art massively parallel subsurface flow and reactive transport code. Improvements to the current computational environment will result in greater detail in the final models due to the parallelization afforded by the modern code. Parallelization will allow for relatively faster calculations, as well as a move from a two-dimensional calculation grid to a three-dimensional grid. The result of the modernization effort will be a state-of-the-art subsurface flow and transport capability that will serve WIPP PA into the future. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S Department of Energy.

  3. 10 CFR 26.123 - Testing facility capabilities.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Testing facility capabilities. 26.123 Section 26.123 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Licensee Testing Facilities § 26.123 Testing facility capabilities. Each licensee testing facility shall have the capability, at the same...

  4. 10 CFR 26.123 - Testing facility capabilities.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Testing facility capabilities. 26.123 Section 26.123 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Licensee Testing Facilities § 26.123 Testing facility capabilities. Each licensee testing facility shall have the capability, at the same...

  5. 10 CFR 26.123 - Testing facility capabilities.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Testing facility capabilities. 26.123 Section 26.123 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Licensee Testing Facilities § 26.123 Testing facility capabilities. Each licensee testing facility shall have the capability, at the same...

  6. 10 CFR 26.123 - Testing facility capabilities.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Testing facility capabilities. 26.123 Section 26.123 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Licensee Testing Facilities § 26.123 Testing facility capabilities. Each licensee testing facility shall have the capability, at the same...

  7. 10 CFR 26.123 - Testing facility capabilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Testing facility capabilities. 26.123 Section 26.123 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Licensee Testing Facilities § 26.123 Testing facility capabilities. Each licensee testing facility shall have the capability, at the same...

  8. Budget estimates, fiscal year 1995. Volume 1: Agency summary, human space flight, and science, aeronautics and technology

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The NASA budget request has been restructured in FY 1995 into four appropriations: human space flight; science, aeronautics, and technology; mission support; and inspector general. The human space flight appropriation provides funding for NASA's human space flight activities. This includes the on-orbit infrastructure (space station and Spacelab), transportation capability (space shuttle program, including operations, program support, and performance and safety upgrades), and the Russian cooperation program, which includes the flight activities associated with the cooperative research flights to the Russian Mir space station. These activities are funded in the following budget line items: space station, Russian cooperation, space shuttle, and payload utilization and operations. The science, aeronautics, and technology appropriation provides funding for the research and development activities of NASA. This includes funds to extend our knowledge of the earth, its space environment, and the universe, and to invest in new technologies, particularly in aeronautics, to ensure the future competitiveness of the nation. These objectives are achieved through the following elements: space science, life and microgravity sciences and applications, mission to planet earth, aeronautical research and technology, advanced concepts and technology, launch services, mission communication services, and academic programs.

  9. Westinghouse Hanford Company health and safety performance report. Fourth quarter calendar year 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lansing, K.A.

    1995-03-01

    Detailed information pertaining to As Low As Reasonably Achievable/Contamination Control Improvement Project (ALARA/CCIP) activities is outlined. Improved commitment to the WHC ALARA/CCIP Program was experienced throughout FY 1994. During CY 1994, 17 of 19 sitewide ALARA performance goals were completed on or ahead of schedule. Estimated total exposure by facility for CY 1994 is listed in tables by organization code for each dosimeter frequency. Facilities/areas continue to utilize the capabilities of the RPR tracking system in conjunction with the present site management action-tracking system to manage deficiencies, trend performance, and develop improved preventive efforts. Detailed information pertaining to occupational injuries/illnesses is provided. The Industrial Safety and Hygiene programs are described, which have generated several key initiatives that are believed responsible for improved safety performance. A breakdown of CY 1994 occupational injuries/illnesses by type, affected body group, cause, job type, age/gender, and facility is provided. The contributing experience of each WHC division/department in attaining this significant improvement is described along with tables charting specific trends. The Radiological Control Program is on schedule to meet all RL Site Management System milestones and program commitments.

  10. A Model for a Single Unmanned Aircraft Systems (UAS) Program Office Managing Joint ISR Capabilities

    DTIC Science & Technology

    2017-10-01

    reduction in manning from the multiple program office structure to the new single program management model. Additional information regarding this...OFFICE MANAGING JOINT ISR CAPABILITIES by Angela E. Burris A Research Report Submitted to the Faculty In Partial Fulfillment of...research paper is to answer how a single management office could provide greater agility for unmanned aircraft systems (UAS); supporting Joint concepts

  11. Effectiveness of the Civil Aviation Security Program.

    DTIC Science & Technology

    1976-09-20

    commerce--a proper balance appears to exist. Moreover, airline and airport security programs appear to be capable of responding to changes in the nature and level of current and future threats. The...delays and diversions were experienced. Airline and airport security measures continued to afford the necessary level of protection to U.S. air

  12. Improvements to the FATOLA computer program including nosewheel steering: Supplemental instruction manual

    NASA Technical Reports Server (NTRS)

    Carden, H. D.; Mcgehee, J. R.

    1978-01-01

    Modifications to a multidegree of freedom flexible aircraft take-off and landing analysis (FATOLA) computer program, which improved its simulation capabilities, are discussed, and supplemental instructions for use of the program are included. Sample analytical results which illustrate the capabilities of an added nosewheel steering option indicate consistent behavior of the airplane tracking, attitude, motions, and loads for the landing cases and steering situations which were investigated.

  13. Policy Options Analysis of Assistance to Firefighters Grant Program

    DTIC Science & Technology

    2014-03-01

    services in the grant process. The funding level, however, has been insufficient to address the unmet needs of fire services across the nation. The policy...capability, increasing regional capabilities and retaining local support for the AFG. The current approach to grant distribution was determined to provide the...The Assistance to Firefighters Grant Program (AFG) is a direct federal grant program, administered by the Department of Homeland Security, for fire

  14. Public Service Communication Satellite Program

    NASA Technical Reports Server (NTRS)

    Brown, J. P.

    1977-01-01

    The proposed NASA Public Service Communication Satellite Program consists of four different activities designed to fulfill the needs of the public service sector. These are: interaction with the users, experimentation with existing satellites, development of a limited-capability satellite for the earliest possible launch, and initiation of an R&D program to develop the greatly increased capability that future systems will require. This paper will discuss NASA efforts in each of these areas.

  15. DC-9 Flight Demonstration Program with Refanned JT8D Engines. Volume 3; Performance and Analysis

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The JT8D-109 engine has a sea level static, standard day bare engine takeoff thrust of 73,840 N. At sea level standard day conditions, the additional thrust of the JT8D-109 results in 2,040 kg additional takeoff gross weight capability for a given field length. Range loss of the DC-9 Refan airplane for long range cruise was determined. The Refan airplane demonstrated stall, static longitudinal stability, longitudinal control, longitudinal trim, minimum control speeds, and directional control characteristics similar to the DC-9-30 production airplane and complied with airworthiness requirements. Cruise, climb, and thrust reverser performance were evaluated. Structural and dynamic ground test, flight test, and analytical results substantiate Refan Program requirements that the nacelle, thrust reverser hardware, and the airplane structural modifications are flightworthy and certifiable and that the airplane meets flutter speed margins. Estimated unit cost of a DC-9 Refan retrofit program is $1.338 million in mid-1975 dollars, with about an equal split in cost between airframe and engine.

  16. A geometric approach to identify cavities in particle systems

    NASA Astrophysics Data System (ADS)

    Voyiatzis, Evangelos; Böhm, Michael C.; Müller-Plathe, Florian

    2015-11-01

    The implementation of a geometric algorithm to identify cavities in particle systems in an open-source python program is presented. The algorithm makes use of the Delaunay space tessellation. The present python software is based on platform-independent tools, leading to a portable program. Its successful execution provides information concerning the accessible volume fraction of the system, the size and shape of the cavities and the group of atoms forming each of them. The program can be easily incorporated into the LAMMPS software. An advantage of the present algorithm is that no a priori assumption on the cavity shape has to be made. As an example, the cavity size and shape distributions in a polyethylene melt system are presented for three spherical probe particles. This paper serves also as an introductory manual to the script. It summarizes the algorithm, its implementation, the required user-defined parameters as well as the format of the input and output files. Additionally, we demonstrate possible applications of our approach and compare its capability with the ones of well documented cavity size estimators.
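The Delaunay-based idea can be sketched generically: tetrahedra of the tessellation whose circumsphere would admit the probe particle are cavity candidates. This is an illustration of the geometric criterion, not the paper's program; the point set and probe radius are arbitrary:

```python
# Flag Delaunay tetrahedra whose circumsphere radius exceeds a probe radius,
# a simple geometric cavity criterion for a particle system.
import numpy as np
from scipy.spatial import Delaunay

def cavity_simplices(points, probe_radius):
    tri = Delaunay(points)
    flagged = []
    for simplex in tri.simplices:
        p = points[simplex]
        # Circumcenter c solves 2*(p_i - p_0) . c = |p_i|^2 - |p_0|^2, i = 1..3
        A = 2.0 * (p[1:] - p[0])
        b = np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)
        c = np.linalg.solve(A, b)
        if np.linalg.norm(c - p[0]) > probe_radius:
            flagged.append(simplex)
    return flagged

rng = np.random.default_rng(2)
pts = rng.random((60, 3)) * 10.0          # random particle coordinates (toy system)
holes = cavity_simplices(pts, probe_radius=1.0)
print(f"{len(holes)} candidate cavity tetrahedra")
```

A production tool would additionally merge adjacent flagged tetrahedra into connected cavities and account for particle radii, as the described program does.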

  17. Study of aerodynamic technology for VSTOL fighter/attack aircraft, phase 1

    NASA Technical Reports Server (NTRS)

    Driggers, H. H.

    1978-01-01

    A conceptual design study was performed of a vertical attitude takeoff and landing (VATOL) fighter/attack aircraft. The configuration has a close-coupled canard-delta wing, side two-dimensional ramp inlets, and two augmented turbofan engines with thrust vectoring capability. Performance and sensitivities to objective requirements were calculated. Aerodynamic characteristics were estimated based on contractor and NASA wind tunnel data. Computer simulations of VATOL transitions were performed. Successful transitions can be made, even with severe post-stall instabilities, if reaction controls are properly phased. Principal aerodynamic uncertainties identified were post-stall aerodynamics, transonic aerodynamics with thrust vectoring, and inlet performance in VATOL transition. A wind tunnel research program was recommended to resolve the aerodynamic uncertainties.

  18. Structural weights analysis of advanced aerospace vehicles using finite element analysis

    NASA Technical Reports Server (NTRS)

    Bush, Lance B.; Lentz, Christopher A.; Rehder, John J.; Naftel, J. Chris; Cerro, Jeffrey A.

    1989-01-01

    A conceptual/preliminary level structural design system has been developed for structural integrity analysis and weight estimation of advanced space transportation vehicles. The system includes a three-dimensional interactive geometry modeler, a finite element pre- and post-processor, a finite element analyzer, and a structural sizing program. Inputs to the system include the geometry, surface temperature, material constants, construction methods, and aerodynamic and inertial loads. The results are a sized vehicle structure capable of withstanding the static loads incurred during assembly, transportation, operations, and missions, and a corresponding structural weight. An analysis of the Space Shuttle external tank is included in this paper as a validation and benchmark case of the system.

  19. Solar Thermal Enhanced Oil Recovery, (STEOR) Volume 1: Executive summary

    NASA Astrophysics Data System (ADS)

    Elzinga, E.; Arnold, C.; Allen, D.; Garman, R.; Joy, P.; Mitchell, P.; Shaw, H.

    1980-11-01

    Thermal enhanced oil recovery is widely used in California to aid in the production of heavy oils. Steam injection either to stimulate individual wells or to drive oil to the producing wells, is by far the major thermal process today and has been in use for over 20 years. Since steam generation at the necessary pressures (generally below 4000 kPa (580 psia)) is within the capabilities of present day solar technology, it is logical to consider the possibilities of solar thermal enhanced oil recovery (STEOR). The present project consisted of an evaluation of STEOR. Program objectives, system selection, trade-off studies, preliminary design, cost estimate, development plan, and market and economic analysis are summarized.

  20. National Nuclear Forensics Expertise Development Program

    NASA Astrophysics Data System (ADS)

    Kentis, Samantha E.; Ulicny, William D.

    2009-08-01

    Over the course of the 2009 Federal Fiscal Year the United States (U.S.) Department of Homeland Security (DHS), in partnership with the Departments of Defense (DoD) and Energy (DOE), is continuing existing programs and introducing new programs designed to maintain a highly qualified, enduring workforce capable of performing the technical nuclear forensics mission. These student and university programs are designed to recruit the best and brightest students, develop university faculty and research capabilities, and engage the national laboratories in fields of study with application in nuclear forensics. This comprehensive effort constitutes the National Nuclear Forensics Expertise Development Program.

  1. Ensuring US National Aeronautics Test Capabilities

    NASA Technical Reports Server (NTRS)

    Marshall, Timothy J.

    2010-01-01

    U.S. leadership in aeronautics depends on ready access to technologically advanced, efficient, and affordable aeronautics test capabilities. These systems include major wind tunnels, propulsion test facilities, and flight test capabilities. The federal government owns the majority of the major aeronautics test capabilities in the United States, primarily through the National Aeronautics and Space Administration (NASA) and the Department of Defense (DoD). However, changes in the aerospace landscape, primarily the decrease in demand for testing over the last 20 years, required an overarching strategy for management of these national assets. Therefore, NASA established the Aeronautics Test Program (ATP) as a two-pronged strategic initiative to: (1) retain and invest in NASA aeronautics test capabilities considered strategically important to the agency and the nation, and (2) establish a strong, high-level partnership with the DoD. Test facility utilization is a critical factor for ATP because it relies on user occupancy fees to recover a substantial part of the operations costs for its facilities. Decreasing utilization is an indicator of excess capacity and, in some cases, low-risk redundancy (i.e., several facilities with basically the same capability and overall low utilization). However, low utilization does not necessarily translate to lack of strategic importance. Some facilities with relatively low utilization are nonetheless vitally important because of the unique nature of the capability and the foreseeable aeronautics testing needs. Unfortunately, since its inception, the customer base for ATP has continued to shrink. Utilization of ATP wind tunnels has declined by more than 50% from the FY 2006 levels. 
This significant decrease in customer usage is attributable to several factors, including the overall decline in new programs and projects in the aerospace sector; the impact of computational fluid dynamics (CFD) on the design, development, and research process; and the reductions in wind tunnel testing requirements within the largest consumer of ATP wind tunnel test time, the Aeronautics Research Mission Directorate (ARMD). Retirement of the Space Shuttle Program and recent perturbations of NASA's Constellation Program will exacerbate this downward trend. Therefore it is crucial that ATP periodically revisit and determine which of its test capabilities are strategically important, which qualify as low-risk redundancies that could be put in an inactive status or closed, and address the challenges associated with both sustainment and improvements to the test capabilities that must remain active. This presentation will provide an overview of the ATP vision, mission, and goals as well as the challenges and opportunities the program is facing both today and in the future. We will discuss the strategy ATP is taking over the next five years to address the National aeronautics test capability challenges and what the program will do to capitalize on its opportunities to ensure a ready, robust and relevant portfolio of National aeronautics test capabilities.

  2. Radio Science from an Optical Communications Signal

    NASA Technical Reports Server (NTRS)

    Moision, Bruce; Asmar, Sami; Oudrhiri, Kamal

    2013-01-01

    NASA is currently developing the capability to deploy deep space optical communications links. This creates the opportunity to utilize the optical link to obtain range, doppler, and signal intensity estimates. These may, in turn, be used to complement or extend the capabilities of current radio science. In this paper we illustrate the achievable precision in estimating range, doppler, and received signal intensity of a non-coherent optical link (the current state of the art for a deep-space link). We provide a joint estimation algorithm with performance close to the bound. We draw comparisons to estimates based on a coherent radio frequency signal, illustrating that large gains in either precision or observation time are possible with an optical link.

  3. Concept designs for NASA's Solar Electric Propulsion Technology Demonstration Mission

    NASA Technical Reports Server (NTRS)

    Mcguire, Melissa L.; Hack, Kurt J.; Manzella, David H.; Herman, Daniel A.

    2014-01-01

    Multiple Solar Electric Propulsion Technology Demonstration Mission concepts were developed to assess vehicle performance and estimated mission cost. Concepts ranged from a 10,000-kilogram spacecraft capable of delivering 4,000 kilograms of payload to one of the Earth-Moon Lagrange points in support of future human-crewed outposts, to a 180-kilogram spacecraft capable of performing an asteroid rendezvous mission after launch to a geostationary transfer orbit as a secondary payload. Low-cost and maximum Delta-V capability variants of a spacecraft concept based on utilizing a secondary payload adapter as the primary bus structure were developed, as were concepts designed to be co-manifested with another spacecraft on a single launch vehicle. Each of the Solar Electric Propulsion Technology Demonstration Mission concepts developed included an estimated spacecraft cost. These data suggest estimated spacecraft costs of $200 million to $300 million, regardless of launch vehicle costs, if 30-kilowatt-class solar arrays and the corresponding electric propulsion system currently under development are used as the basis for sizing the mission concept. The most affordable mission concept developed, based on subscale variants of the advanced solar arrays and electric propulsion technology currently under development by the NASA Space Technology Mission Directorate, has an estimated cost of $50 million and could provide a Delta-V capability comparable to much larger spacecraft concepts.
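The Delta-V capability trade mentioned above follows from the ideal rocket equation; the specific impulse and masses below are hypothetical placeholders, not the study's values:

```python
# Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m_wet / m_dry).
# High-Isp electric propulsion yields large delta-V from modest propellant mass.
import math

def delta_v(isp_s, wet_kg, dry_kg, g0=9.80665):
    return isp_s * g0 * math.log(wet_kg / dry_kg)

# Hypothetical example: Isp 2000 s, 10,000 kg wet mass, 8,000 kg dry mass
dv = delta_v(isp_s=2000.0, wet_kg=10000.0, dry_kg=8000.0)
print(f"delta-V ~ {dv/1000:.2f} km/s")
```

This is why a small spacecraft with a high-Isp subscale electric propulsion system can match the Delta-V of a much larger chemically propelled concept.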

  4. NASA's Space Launch System (SLS) Program: Mars Program Utilization

    NASA Technical Reports Server (NTRS)

    May, Todd A.; Creech, Stephen D.

    2012-01-01

    NASA's Space Launch System is being designed for safe, affordable, and sustainable human and scientific exploration missions beyond Earth's orbit (BEO), as directed by the NASA Authorization Act of 2010 and NASA's 2011 Strategic Plan. This paper describes how the SLS can dramatically change the Mars program's science and human exploration capabilities and objectives. Specifically, through its high-velocity change (delta V) and payload capabilities, SLS enables Mars science missions of unprecedented size and scope. By providing direct trajectories to Mars, SLS eliminates the need for complicated gravity-assist missions around other bodies in the solar system, reducing mission time, complexity, and cost. SLS's large payload capacity also allows for larger, more capable spacecraft or landers with more instruments, which can eliminate the need for complex packaging or "folding" mechanisms. By offering this capability, SLS can enable more science to be done more quickly than would be possible through other delivery mechanisms using longer mission times.

  5. MODFLOW-2000, the U.S. Geological Survey modular ground-water model; user guide to the observation, sensitivity, and parameter-estimation processes and three post-processing programs

    USGS Publications Warehouse

    Hill, Mary C.; Banta, E.R.; Harbaugh, A.W.; Anderman, E.R.

    2000-01-01

    This report documents the Observation, Sensitivity, and Parameter-Estimation Processes of the ground-water modeling computer program MODFLOW-2000. The Observation Process generates model-calculated values for comparison with measured, or observed, quantities. A variety of statistics is calculated to quantify this comparison, including a weighted least-squares objective function. In addition, a number of files are produced that can be used to compare the values graphically. The Sensitivity Process calculates the sensitivity of hydraulic heads throughout the model with respect to specified parameters using the accurate sensitivity-equation method. These are called grid sensitivities. If the Observation Process is active, it uses the grid sensitivities to calculate sensitivities for the simulated values associated with the observations. These are called observation sensitivities. Observation sensitivities are used to calculate a number of statistics that can be used (1) to diagnose inadequate data, (2) to identify parameters that probably cannot be estimated by regression using the available observations, and (3) to evaluate the utility of proposed new data. The Parameter-Estimation Process uses a modified Gauss-Newton method to adjust values of user-selected input parameters in an iterative procedure to minimize the value of the weighted least-squares objective function. Statistics produced by the Parameter-Estimation Process can be used to evaluate estimated parameter values; statistics produced by the Observation Process and post-processing program RESAN-2000 can be used to evaluate how accurately the model represents the actual processes; statistics produced by post-processing program YCINT-2000 can be used to quantify the uncertainty of model simulated values. 
Parameters are defined in the Ground-Water Flow Process input files and can be used to calculate most model inputs, such as: for explicitly defined model layers, horizontal hydraulic conductivity, horizontal anisotropy, vertical hydraulic conductivity or vertical anisotropy, specific storage, and specific yield; and, for implicitly represented layers, vertical hydraulic conductivity. In addition, parameters can be defined to calculate the hydraulic conductance of the River, General-Head Boundary, and Drain Packages; areal recharge rates of the Recharge Package; maximum evapotranspiration of the Evapotranspiration Package; pumpage or the rate of flow at defined-flux boundaries of the Well Package; and the hydraulic head at constant-head boundaries. The spatial variation of model inputs produced using defined parameters is very flexible, including interpolated distributions that require the summation of contributions from different parameters. Observations can include measured hydraulic heads or temporal changes in hydraulic heads, measured gains and losses along head-dependent boundaries (such as streams), flows through constant-head boundaries, and advective transport through the system, which generally would be inferred from measured concentrations. MODFLOW-2000 is intended for use on any computer operating system. The program consists of algorithms programmed in Fortran 90, which efficiently performs numerical calculations and is fully compatible with the newer Fortran 95. The code is easily modified to be compatible with FORTRAN 77. Coordination for multiple processors is accommodated using Message Passing Interface (MPI) commands. The program is designed in a modular fashion that is intended to support inclusion of new capabilities.
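The modified Gauss-Newton procedure described above iteratively adjusts parameter values to minimize a weighted least-squares objective. A minimal sketch of the core iteration, using a toy exponential decline model rather than MODFLOW-2000's actual flow equations (all names and values below are illustrative):

```python
import numpy as np

def gauss_newton(f, jac, t, y, w, p, iters=25):
    """Minimize S(p) = sum_i w_i * (y_i - f(t_i, p))**2 by Gauss-Newton steps."""
    W = np.diag(w)
    for _ in range(iters):
        r = y - f(t, p)                      # residuals
        J = jac(t, p)                        # sensitivity matrix dF/dp
        # Normal equations: (J^T W J) dp = J^T W r
        dp = np.linalg.solve(J.T @ W @ J, J.T @ W @ r)
        p = p + dp
        if np.linalg.norm(dp) < 1e-10:
            break
    return p

# Toy "head decline" model: h(t) = p0 * exp(-p1 * t)
f = lambda t, p: p[0] * np.exp(-p[1] * t)
jac = lambda t, p: np.column_stack([np.exp(-p[1] * t),
                                    -p[0] * t * np.exp(-p[1] * t)])

t = np.linspace(0.0, 5.0, 20)
y = f(t, np.array([10.0, 0.7]))      # noise-free synthetic observations
w = np.ones_like(t)                  # unit weights
p_est = gauss_newton(f, jac, t, y, w, np.array([8.0, 0.6]))
```

Each step solves the normal equations (J^T W J) dp = J^T W r; the "modified" method in MODFLOW-2000 additionally applies safeguards such as damping and scaling, which are not shown here.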

  6. TRU Waste Management Program. Cost/schedule optimization analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detamore, J.A.; Raudenbush, M.H.; Wolaver, R.W.

    This Current Year Work Plan presents in detail a description of the activities to be performed by the Joint Integration Office Rockwell International (JIO/RI) during FY86. It breaks down the activities into two major work areas: Program Management and Program Analysis. Program Management is performed by the JIO/RI by providing technical planning and guidance for the development of advanced TRU waste management capabilities. This includes equipment/facility design, engineering, construction, and operations. These functions are integrated to allow transition from interim storage to final disposition. JIO/RI tasks include program requirements identification, long-range technical planning, budget development, program planning document preparation, task guidance development, task monitoring, task progress information gathering and reporting to DOE, interfacing with other agencies and DOE lead programs, integrating public involvement with program efforts, and preparation of reports for DOE detailing program status. Program Analysis is performed by the JIO/RI to support identification and assessment of alternatives, and development of long-term TRU waste program capabilities. These analyses include short-term analyses in response to DOE information requests, along with performing an RH Cost/Schedule Optimization report. Systems models will be developed, updated, and upgraded as needed to enhance JIO/RI's capability to evaluate the adequacy of program efforts in various fields. A TRU program data base will be maintained and updated to provide DOE with timely responses to inventory related questions.

  7. 75 FR 44 - Temporary Suspension of the Population Estimates and Income Estimates Challenge Programs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-04

    ..., conduct research to enhance the estimates and challenge programs, and to integrate the updates from the... local governments would increase the administrative and evaluative complexity of this program for the... comparison with the population estimates, conducting research to enhance the estimates and challenge programs...

  8. Design and simulation of sensor networks for tracking Wifi users in outdoor urban environments

    NASA Astrophysics Data System (ADS)

    Thron, Christopher; Tran, Khoi; Smith, Douglas; Benincasa, Daniel

    2017-05-01

    We present a proof-of-concept investigation into the use of sensor networks for tracking of WiFi users in outdoor urban environments. Sensors are fixed, and are capable of measuring signal power from users' WiFi devices. We derive a maximum likelihood estimate for user location based on instantaneous sensor power measurements. The algorithm takes into account the effects of power control, and is self-calibrating in that the signal power model used by the location algorithm is adjusted and improved as part of the operation of the network. Simulation results to verify the system's performance are presented. The simulation scenario is based on a 1.5 km2 area of lower Manhattan. The self-calibration mechanism was verified for initial rms (root mean square) errors of up to 12 dB in the channel power estimates: rms errors were reduced by over 60% in 300 track-hours, in systems with limited power control. Under typical operating conditions with (without) power control, location rms errors are about 8.5 (5) meters, with 90% of errors falling within 9 (13) meters, for both pedestrian and vehicular users. The distance error distributions for smaller distances (<30 m) are well-approximated by an exponential distribution, while the distributions for large distance errors have fat tails. The issue of optimal sensor placement in the sensor network is also addressed. We specify a linear programming algorithm for determining sensor placement for networks with a reduced number of sensors. In our test case, the algorithm produces a network with 18.5% fewer sensors and comparable location-estimation accuracy. Finally, we discuss future research directions for improving the accuracy and capabilities of sensor network systems in urban environments.
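The maximum likelihood location estimate described in this record can be illustrated under a common simplifying assumption: a log-distance path-loss model with i.i.d. Gaussian measurement noise, for which ML estimation reduces to a least-squares search over candidate positions. The path-loss constants, sensor layout, and grid below are illustrative assumptions, not the paper's calibrated model:

```python
import numpy as np

def ml_locate(sensors, power_db, p0=-30.0, n=3.0, grid=200, extent=500.0):
    """Grid-search ML location from received powers, assuming
    P_i = p0 - 10*n*log10(d_i) + Gaussian noise (ML == least squares)."""
    xs = np.linspace(0.0, extent, grid)
    X, Y = np.meshgrid(xs, xs)
    sse = np.zeros_like(X)                    # summed squared residuals
    for (sx, sy), p in zip(sensors, power_db):
        d = np.hypot(X - sx, Y - sy) + 1e-6   # avoid log10(0)
        pred = p0 - 10.0 * n * np.log10(d)
        sse += (p - pred) ** 2
    i, j = np.unravel_index(np.argmin(sse), sse.shape)
    return X[i, j], Y[i, j]

# Noise-free check: powers generated from the same model at (120, 340)
sensors = [(0.0, 0.0), (500.0, 0.0), (0.0, 500.0), (500.0, 500.0)]
user = (120.0, 340.0)
power = [-30.0 - 30.0 * np.log10(np.hypot(user[0] - sx, user[1] - sy))
         for sx, sy in sensors]
x_hat, y_hat = ml_locate(sensors, power)
```

With noise-free measurements the estimate lands on the grid point nearest the true location; the self-calibration step in the paper would additionally refine p0 and n from tracked data.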

  9. A Study on Coexistence Capability Evaluations of the Enhanced Channel Hopping Mechanism in WBANs

    PubMed Central

    Wei, Zhongcheng; Sun, Yongmei; Ji, Yuefeng

    2017-01-01

    As an important coexistence technology, channel hopping can reduce the interference among Wireless Body Area Networks (WBANs). However, it simultaneously brings some issues, such as energy waste, long latency and communication interruptions. In this paper, we propose an enhanced channel hopping mechanism that allows multiple WBANs to coexist in the same channel. In order to evaluate the coexistence performance, some critical metrics are designed to reflect the possibility of channel conflict. Furthermore, by taking the queuing and non-queuing behaviors into consideration, we present a set of analysis approaches to evaluate the coexistence capability. On the one hand, we present both service-dependent and service-independent analysis models to estimate the number of coexisting WBANs. On the other hand, based on the uniform distribution assumption and the additive property of Poisson streams, we put forward two approximate methods to compute the number of occupied channels. Extensive simulation results demonstrate that our estimation approaches can provide an effective solution for coexistence capability estimation. Moreover, the enhanced channel hopping mechanism can significantly improve the coexistence capability and support a larger arrival rate of WBANs. PMID:28098818

  10. Graphical Visualization of Human Exploration Capabilities

    NASA Technical Reports Server (NTRS)

    Rodgers, Erica M.; Williams-Byrd, Julie; Arney, Dale C.; Simon, Matthew A.; Williams, Phillip A.; Barsoum, Christopher; Cowan, Tyler; Larman, Kevin T.; Hay, Jason; Burg, Alex

    2016-01-01

    NASA's pioneering space strategy will require advanced capabilities to expand the boundaries of human exploration on the Journey to Mars (J2M). The Evolvable Mars Campaign (EMC) architecture serves as a framework to identify critical capabilities that need to be developed and tested in order to enable a range of human exploration destinations and missions. Agency-wide System Maturation Teams (SMT) are responsible for the maturation of these critical exploration capabilities and help formulate, guide and resolve performance gaps associated with the EMC-identified capabilities. Systems Capability Organization Reporting Engine boards (SCOREboards) were developed to integrate the SMT data sets into cohesive human exploration capability stories that can be used to promote dialog and communicate NASA's exploration investments. Each SCOREboard provides a graphical visualization of SMT capability development needs that enable exploration missions, and presents a comprehensive overview of data that outlines a roadmap of system maturation needs critical for the J2M. SCOREboards are generated by a computer program that extracts data from a main repository, sorts the data based on a tiered data reduction structure, and then plots the data according to specified user inputs. The ability to sort and plot varying data categories provides the flexibility to present specific SCOREboard capability roadmaps based on customer requests. This paper presents the development of the SCOREboard computer program and shows multiple complementary, yet different datasets through a unified format designed to facilitate comparison between datasets. Example SCOREboard capability roadmaps are presented followed by a discussion of how the roadmaps are used to: 1) communicate capability developments and readiness of systems for future missions, and 2) influence the definition of NASA's human exploration investment portfolio through capability-driven processes. 
The paper concludes with a description of planned future work to modify the computer program to include additional data and of alternate capability roadmap formats currently under consideration.

  11. Surgical Capabilities for Exploration and Colonization Space Flight - An Exploratory Symposium

    NASA Technical Reports Server (NTRS)

    Pantalos, George; Strangman, Gary; Doarn, Charles R.; Broderick, Timothy; Antonsen, Erik

    2015-01-01

    Identify realistic and achievable pathways for surgical capabilities during exploration and colonization space operations and develop a list of recommendations to the NASA Human Research Program to address challenges to developing surgical capabilities.

  12. Computer routine adds plotting capabilities to existing programs

    NASA Technical Reports Server (NTRS)

    Harris, J. C.; Linnekin, J. S.

    1966-01-01

    PLOTAN, a generalized plot analysis routine written for the IBM 7094 computer, minimizes the difficulties in adding plot capabilities to large existing programs. PLOTAN is used in conjunction with a binary tape writing routine and has the ability to plot any variable on the intermediate binary tape as a function of any other.

  13. Guide for Teacher Preparation in Driver Education. Driving School Edition.

    ERIC Educational Resources Information Center

    National Highway Traffic Safety Administration (DOT), Washington, DC.

    This guide provides operators of driving schools with a body of information and procedures that will enable them to develop programs capable of meeting the particular needs of their students, and capable of being administered within the resources available to them. The guide does not attempt to prescribe one specific instructional program but…

  14. Mariner Jupiter/Saturn LCSSE thruster/valve assembly and injection propulsion unit rocket engine assemblies: 0.2-lbf T/VA development and margin limit test report

    NASA Technical Reports Server (NTRS)

    Clark, E. C.

    1975-01-01

    Thruster valve assemblies (T/VA's) were subjected to the development test program for the combined JPL Low-Cost Standardized Spacecraft Equipment (LCSSE) and Mariner Jupiter/Saturn '77 spacecraft (MJS) programs. The development test program was designed to achieve the following program goals: (1) demonstrate T/VA design compliance with JPL specifications; (2) conduct a complete performance Cf map of the T/VA over the full operating range of environments; (3) demonstrate T/VA life capability and life-margin characteristics for steady-state, limit-cycle, and momentum wheel desaturation duty cycles; (4) verify structural design capability; and (5) generate a computerized performance model capable of predicting T/VA operation over pressures ranging from 420 to 70 psia, propellant temperatures ranging from 140 F to 40 F, and pulse widths from 0.008 second to steady-state operation with unlimited duty cycle capability, and of predicting the transient performance associated with reactor heatup for any given duty cycle, start temperature, feed pressure, and propellant temperature conditions.

  15. Destructive analysis capabilities for plutonium and uranium characterization at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tandon, Lav; Kuhn, Kevin J; Drake, Lawrence R

    Los Alamos National Laboratory's (LANL) Actinide Analytical Chemistry (AAC) group has been in existence since the Manhattan Project. It maintains a complete set of analytical capabilities for performing complete characterization (elemental assay, isotopic, metallic and nonmetallic trace impurities) of uranium and plutonium samples in different forms. For a majority of the customers there are strong quality assurance (QA) and quality control (QC) objectives, including highest accuracy and precision with well-defined uncertainties associated with the analytical results. Los Alamos participates in various international and national programs such as the Plutonium Metal Exchange Program, New Brunswick Laboratory's (NBL's) Safeguards Measurement Evaluation Program (SME) and several other inter-laboratory round robin exercises to monitor and evaluate the data quality generated by AAC. These programs also provide independent verification of analytical measurement capabilities, and allow any technical problems with analytical measurements to be identified and corrected. This presentation will focus on key analytical capabilities for destructive analysis in AAC and also comparative data between LANL and peer groups for Pu assay and isotopic analysis.

  16. Navy/Marine Corps innovative science and technology developments for future enhanced mine detection capabilities

    NASA Astrophysics Data System (ADS)

    Holloway, John H., Jr.; Witherspoon, Ned H.; Miller, Richard E.; Davis, Kenn S.; Suiter, Harold R.; Hilton, Russell J.

    2000-08-01

    JMDT is a Navy/Marine Corps 6.2 Exploratory Development program that is closely coordinated with the 6.4 COBRA acquisition program. The objective of the program is to develop innovative science and technology to enhance future mine detection capabilities. Prior to transition to acquisition, the COBRA ATD was extremely successful in demonstrating a passive airborne multispectral video sensor system operating in the tactical Pioneer unmanned aerial vehicle (UAV), combined with an integrated ground station subsystem to detect and locate minefields from surf zone to inland areas. JMDT is investigating advanced technology solutions for future enhancements in minefield detection capability beyond the current COBRA ATD demonstrated capabilities. JMDT has recently received next-generation, innovative hardware specified by the Coastal Systems Station and developed under contract. This hardware includes an agile-tuning multispectral, polarimetric, digital video camera and advanced multi-wavelength laser illumination technologies to extend the same sorts of multispectral detections from a UAV into the night and over shallow water and other difficult littoral regions. One of these illumination devices is an ultra-compact, highly-efficient near-IR laser diode array. The other is a multi-wavelength range-gateable laser. Additionally, in conjunction with this new technology, algorithm enhancements are being developed in JMDT for future naval capabilities which will outperform the already impressive record of automatic detection of minefields demonstrated by the COBRA ATD.

  17. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    USGS Publications Warehouse

    Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.
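The population-exposure figures PAGER reports can be sketched as a simple binning of gridded population by estimated shaking intensity (Modified Mercalli Intensity, MMI). The grid values and bins below are purely illustrative, not PAGER's actual data or methodology:

```python
import numpy as np

# Hypothetical per-grid-cell population and estimated shaking intensity
population = np.array([1200, 55000, 8000, 300, 90000])   # people per cell
mmi = np.array([7.2, 5.1, 6.4, 8.0, 4.2])                # estimated MMI per cell

# Tally people exposed in each intensity bin [lo, hi)
bins = [(4, 5), (5, 6), (6, 7), (7, 8), (8, 10)]
exposure = {f"MMI {lo}-{hi}": int(population[(mmi >= lo) & (mmi < hi)].sum())
            for lo, hi in bins}
```

The real system derives the intensity grid from ShakeMap and combines it with gridded population databases; this sketch shows only the final aggregation step.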

  18. FY 1986 current fiscal year work plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This Current Year Work Plan presents in detail a description of the activities to be performed by the Joint Integration Office/RI during FY86. It breaks down the activities into two major work areas: Program Management and Program Analysis. Program Management is performed by the JIO/RI by providing technical planning and guidance for the development of advanced TRU waste management capabilities. This includes equipment/facility design, engineering, construction, and operations. These functions are integrated to allow transition from interim storage to final disposition. JIO/RI tasks include program requirements identification, long-range technical planning, budget development, program planning document preparation, task guidance development, task monitoring, task progress information gathering and reporting to DOE, interfacing with other agencies and DOE lead programs, integrating public involvement with program efforts, and preparation of reports for DOE detailing program status. Program Analysis is performed by the JIO/RI to support identification and assessment of alternatives, and development of long-term TRU waste program capabilities. These analyses include short-term analyses in response to DOE information requests, along with performing an RH Cost/Schedule Optimization report. System models will be developed, updated, and upgraded as needed to enhance JIO/RI's capability to evaluate the adequacy of program efforts in various fields. A TRU program data base will be maintained and updated to provide DOE with timely responses to inventory related questions.

  19. Point cloud modeling using the homogeneous transformation for non-cooperative pose estimation

    NASA Astrophysics Data System (ADS)

    Lim, Tae W.

    2015-06-01

    A modeling process to simulate point cloud range data that a lidar (light detection and ranging) sensor produces is presented in this paper in order to support the development of non-cooperative pose (relative attitude and position) estimation approaches which will help improve proximity operation capabilities between two adjacent vehicles. The algorithms in the modeling process were based on the homogeneous transformation, which has been employed extensively in robotics and computer graphics, as well as in recently developed pose estimation algorithms. Using a flash lidar in a laboratory testing environment, point cloud data of a test article was simulated and compared against the measured point cloud data. The simulated and measured data sets match closely, validating the modeling process. The modeling capability enables close examination of the characteristics of point cloud images of an object as it undergoes various translational and rotational motions. Relevant characteristics that will be crucial in non-cooperative pose estimation were identified such as shift, shadowing, perspective projection, jagged edges, and differential point cloud density. These characteristics will have to be considered in developing effective non-cooperative pose estimation algorithms. The modeling capability will allow extensive non-cooperative pose estimation performance simulations prior to field testing, saving development cost and providing performance metrics of the pose estimation concepts and algorithms under evaluation. The modeling process also provides "truth" pose of the test objects with respect to the sensor frame so that the pose estimation error can be quantified.
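The homogeneous transformation underlying the modeling process packs rotation and translation into a single 4x4 matrix, so a point cloud can be moved, and the motion inverted, with one matrix product. A minimal sketch with illustrative values (not the paper's lidar model):

```python
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_points(T, pts):
    """Apply T to an (N, 3) point cloud via homogeneous coordinates."""
    homo = np.hstack([pts, np.ones((pts.shape[0], 1))])  # (N, 4)
    return (T @ homo.T).T[:, :3]

# Example: rotate 90 degrees about z, then translate by (1, 2, 3)
theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
T = homogeneous(Rz, np.array([1.0, 2.0, 3.0]))
cloud = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
moved = transform_points(T, cloud)
back = transform_points(np.linalg.inv(T), moved)   # recovers the original cloud
```

Because the "truth" pose T is known exactly in such simulations, the pose-estimation error of any candidate algorithm can be quantified directly, as the abstract notes.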

  20. Characterizing the Survey Strategy and Initial Orbit Determination Abilities of the NASA MCAT Telescope for Geosynchronous Orbital Debris Environmental Studies

    NASA Technical Reports Server (NTRS)

    Frith, James; Barker, Ed; Cowardin, Heather; Buckalew, Brent; Anz-Meador, Phillip; Lederer, Susan

    2017-01-01

    The NASA Orbital Debris Program Office (ODPO) recently commissioned the Meter Class Autonomous Telescope (MCAT) on Ascension Island with the primary goal of obtaining population statistics of the geosynchronous (GEO) orbital debris environment. To help facilitate this, studies have been conducted using MCAT's known and projected capabilities to estimate the accuracy and timeliness with which it can survey the GEO environment. A simulated GEO debris population is created, sampled at various cadences, and run through the Constrained Admissible Region Multi Hypotheses Filter (CAR-MHF). The orbits computed from the results are then compared to the simulated data to assess MCAT's ability to accurately determine the orbits of debris at various sample rates. Additionally, estimates of the rate at which MCAT will be able to produce a complete GEO survey are presented, using collected weather data and the proposed observation data collection cadence. The specific methods and results are presented here.

  1. Thermal conductivity and emissivity measurements of uranium carbides

    NASA Astrophysics Data System (ADS)

    Corradetti, S.; Manzolaro, M.; Andrighetto, A.; Zanonato, P.; Tusseau-Nenez, S.

    2015-10-01

    Thermal conductivity and emissivity measurements on different types of uranium carbide are presented, in the context of the ActiLab Work Package in ENSAR, a project within the 7th Framework Program of the European Commission. Two specific techniques were used to carry out the measurements, both taking place in a laboratory dedicated to the research and development of materials for the SPES (Selective Production of Exotic Species) target. In the case of thermal conductivity, the dependence of this property on temperature was estimated using the inverse parameter estimation method, taking temperature and emissivity measurements as references. Emissivity at different temperatures was obtained for several types of uranium carbide using a dual-frequency infrared pyrometer. Differences between the analyzed materials are discussed according to their compositional and microstructural properties. Such information can help in the careful design of materials capable of working under extreme conditions in next-generation ISOL (Isotope Separation On-Line) facilities for the generation of radioactive ion beams.

  2. Impact of the HITECH Act on physicians' adoption of electronic health records.

    PubMed

    Mennemeyer, Stephen T; Menachemi, Nir; Rahurkar, Saurabh; Ford, Eric W

    2016-03-01

    The Health Information Technology for Economic and Clinical Health (HITECH) Act has distributed billions of dollars to physicians as incentives for adopting certified electronic health records (EHRs) through the meaningful use (MU) program, ultimately aimed at improving healthcare outcomes. The authors examine the extent to which the MU program impacted the EHR adoption curve that existed prior to the Act. Bass and Gamma Shifted Gompertz (G/SG) diffusion models of the adoption of "Any" and "Basic" EHR systems in physicians' offices, using consistent data series covering 2001-2013 and 2006-2013, respectively, are estimated to determine whether adoption was stimulated during either a PrePay (2009-2010) period of subsidy anticipation or a PostPay (2011-2013) period when payments were actually made. Adoption of Any EHR system may have increased by as much as 7 percentage points above the level predicted in the absence of the MU subsidies. This estimate, however, lacks statistical significance and becomes smaller or negative under alternative model specifications. No substantial effects are found for Basic systems. The models suggest that adoption was largely driven by "imitation" effects (q-coefficient) as physicians mimic their peers' technology use or respond to mandates. Small and often insignificant "innovation" effects (p-coefficient) are found, suggesting little enthusiasm by physicians who are leaders in technology adoption. The authors find weak evidence of the impact of the MU program on EHR uptake. This is consistent with reports that many current EHR systems reduce physician productivity, lack data sharing capabilities, and need to incorporate other key interoperability features (e.g., application programming interfaces).
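The Bass diffusion model referenced above expresses cumulative adoption through an innovation coefficient p and an imitation coefficient q, with closed form F(t) = (1 - e^(-(p+q)t)) / (1 + (q/p) e^(-(p+q)t)). A short sketch with illustrative coefficients (not the paper's estimated values):

```python
import numpy as np

def bass_cdf(t, p, q):
    """Bass diffusion: cumulative adoption fraction at time t,
    with innovation coefficient p and imitation coefficient q."""
    e = np.exp(-(p + q) * t)
    return (1.0 - e) / (1.0 + (q / p) * e)

# Compare a weakly and a strongly imitation-driven diffusion curve
t = np.arange(0, 15)
low_q = bass_cdf(t, p=0.03, q=0.1)
high_q = bass_cdf(t, p=0.03, q=0.5)
```

A large q relative to p, as the study reports, produces the familiar S-shaped curve in which most adoption follows peer uptake rather than independent innovation.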

  3. rFRET: A comprehensive, Matlab-based program for analyzing intensity-based ratiometric microscopic FRET experiments.

    PubMed

    Nagy, Peter; Szabó, Ágnes; Váradi, Tímea; Kovács, Tamás; Batta, Gyula; Szöllősi, János

    2016-04-01

    Fluorescence or Förster resonance energy transfer (FRET) remains one of the most widely used methods for assessing protein clustering and conformation. Although it is a method with solid physical foundations, many applications of FRET fall short of providing quantitative results due to inappropriate calibration and controls. This shortcoming is especially valid for microscopy where currently available tools have limited or no capability at all to display parameter distributions or to perform gating. Since users of multiparameter flow cytometry usually apply these tools, the absence of these features in applications developed for microscopic FRET analysis is a significant limitation. Therefore, we developed a graphical user interface-controlled Matlab application for the evaluation of ratiometric, intensity-based microscopic FRET measurements. The program can calculate all the necessary overspill and spectroscopic correction factors and the FRET efficiency and it displays the results on histograms and dot plots. Gating on plots and mask images can be used to limit the calculation to certain parts of the image. It is an important feature of the program that the calculated parameters can be determined by regression methods, maximum likelihood estimation (MLE) and from summed intensities in addition to pixel-by-pixel evaluation. The confidence interval of calculated parameters can be estimated using parameter simulations if the approximate average number of detected photons is known. The program is not only user-friendly, but it provides rich output, it gives the user freedom to choose from different calculation modes and it gives insight into the reliability and distribution of the calculated parameters. © 2016 International Society for Advancement of Cytometry.
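One common formulation of ratiometric, intensity-based FRET corrects the raw FRET channel for donor bleed-through and acceptor cross-excitation, then converts the sensitized emission to an efficiency. The sketch below follows that generic three-channel scheme; the correction factors and intensities are illustrative assumptions, and rFRET's exact equations may differ:

```python
import numpy as np

def fret_efficiency(I_DD, I_DA, I_AA, S1, S2, G):
    """Pixel-wise FRET efficiency from three-channel intensities.
    I_DD: donor ex / donor em;  I_DA: donor ex / acceptor em (raw FRET);
    I_AA: acceptor ex / acceptor em.  S1: donor bleed-through factor,
    S2: acceptor cross-excitation factor, G: sensitized emission produced
    per unit of quenched donor signal (all values illustrative)."""
    Fc = I_DA - S1 * I_DD - S2 * I_AA        # corrected sensitized emission
    return Fc / (Fc + G * I_DD)

# Two example "pixels"
I_DD = np.array([100.0, 200.0])
I_DA = np.array([80.0, 90.0])
I_AA = np.array([150.0, 150.0])
E = fret_efficiency(I_DD, I_DA, I_AA, S1=0.3, S2=0.1, G=1.0)
```

In a full evaluation, S1, S2, and G come from donor-only, acceptor-only, and calibration samples; applying the formula per pixel yields the efficiency distributions that tools like rFRET display as histograms.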

  4. Some thoughts on problems associated with various sampling media used for environmental monitoring

    USGS Publications Warehouse

    Horowitz, A.J.

    1997-01-01

    Modern analytical instrumentation is capable of measuring a variety of trace elements at concentrations down into the single or double digit parts-per-trillion (ng l-1) range. This holds for the three most common sample media currently used in environmental monitoring programs: filtered water, whole-water and separated suspended sediment. Unfortunately, current analytical capabilities have exceeded the current capacity to collect both uncontaminated and representative environmental samples. The success of any trace element monitoring program requires that this issue be both understood and addressed. The environmental monitoring of trace elements requires the collection of calendar- and event-based dissolved and suspended sediment samples. There are unique problems associated with the collection and chemical analyses of both types of sample media. Over the past 10 years, reported ambient dissolved trace element concentrations have declined. Generally, these decreases do not reflect better water quality, but rather improvements in the procedures used to collect, process, preserve and analyze these samples without contaminating them during these steps. Further, recent studies have shown that the currently accepted operational definition of dissolved constituents (material passing a 0.45 μm membrane filter) is inadequate owing to sampling and processing artifacts. The existence of these artifacts raises questions about the generation of accurate, precise and comparable 'dissolved' trace element data. Suspended sediment and associated trace elements can display marked short- and long-term spatial and temporal variability. This implies that spatially representative samples can only be obtained by generating composites using depth- and width-integrated sampling techniques.
Additionally, temporal variations have led to the view that the determination of annual trace element fluxes may require nearly constant (e.g., high-frequency) sampling and subsequent chemical analyses. Ultimately, sampling frequency for flux estimates becomes dependent on the time period of concern (daily, weekly, monthly, yearly) and the amount of acceptable error associated with these estimates.
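
    The dependence of flux-estimate error on sampling frequency can be illustrated with a toy calculation: subsample a synthetic daily concentration-discharge record at different intervals and compare the resulting annual flux estimates. All series and coefficients below are hypothetical, not from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(365)
# synthetic daily discharge (m3/s) with storm peaks, and a trace-element
# concentration (ug/L) that rises with flow (assumed relationship)
discharge = 10 + 5 * np.sin(2 * np.pi * days / 365) + rng.gamma(2.0, 2.0, 365)
conc = 0.5 * discharge + rng.normal(0, 0.5, 365)

seconds_per_day = 86400
# "true" annual flux from the complete daily record
true_flux = np.sum(conc * discharge) * seconds_per_day

for interval in (1, 7, 30):            # daily, weekly, monthly sampling
    idx = days[::interval]
    est = np.mean(conc[idx] * discharge[idx]) * 365 * seconds_per_day
    err = 100 * (est - true_flux) / true_flux
    print(f"sampled every {interval:2d} d: flux error {err:+.1f}%")
```

Sparser sampling misses storm-driven peaks, so the error grows as the interval lengthens, which is the trade-off between sampling frequency and acceptable flux-estimate error that the abstract describes.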

  5. Coupled rotor/airframe vibration analysis program manual. Volume 1: User's and programmer's instructions

    NASA Technical Reports Server (NTRS)

    Cassarino, S.; Sopher, R.

    1982-01-01

    User instructions and software descriptions for the base program of the coupled rotor/airframe vibration analysis are provided. The functional capabilities and procedures for running the program are provided. Interfaces with external programs are discussed. The procedure of synthesizing a dynamic system and the various solution methods are described. Input data and output results are presented. Detailed information is provided on the program structure. Sample test case results for five representative dynamic configurations are provided and discussed. System responses are plotted to demonstrate the available plotting capabilities. Instructions to install and execute SIMVIB on the CDC computer system are provided.

  6. Manned Systems Utilization Analysis. Study 2.1: Space Servicing Pilot Program Study. [for automated payloads

    NASA Technical Reports Server (NTRS)

    Wolfe, R. R.

    1975-01-01

    Space servicing of automated payloads was studied for potential cost benefits to future payload operations. Background information is provided on space servicing in general, and on a pilot flight test program in particular. A flight test is recommended to demonstrate space servicing. An overall program plan is provided which builds upon the pilot program through an interim servicing capability. A multipayload servicing concept for the time when the full capability tug becomes operational is presented. The space test program is specifically designed to provide low-cost booster vehicles and a flight test platform for several experiments on a single flight.

  7. Validation of Multibody Program to Optimize Simulated Trajectories II Parachute Simulation with Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Queen, Eric M.; Hotchko, Nathaniel J.

    2009-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed, tested and validated. This capability uses the Program to Optimize Simulated Trajectories II (POST 2). The standard version of POST 2 allows trajectory simulation of multiple bodies without force interaction. In the current implementation, the force interaction between the parachute and the suspended bodies has been modeled using flexible lines, allowing accurate trajectory simulation of the individual bodies in flight. The POST 2 multibody capability is intended to be general purpose and applicable to any parachute entry trajectory simulation. This research paper explains the motivation for multibody parachute simulation, discusses implementation methods, and presents validation of this capability.

  8. NASA's Space Launch System: Systems Engineering Approach for Affordability and Mission Success

    NASA Technical Reports Server (NTRS)

    Hutt, John J.; Whitehead, Josh; Hanson, John

    2017-01-01

    NASA is working toward the first launch of the Space Launch System, a new, unmatched capability for deep space exploration with launch readiness planned for 2019. Since program start in 2011, SLS has passed several major formal design milestones, and every major element of the vehicle has produced test and flight hardware. The SLS approach to systems engineering has been key to the program's success. Key aspects of the SLS SE&I approach include: 1) minimizing the number of requirements, 2) elimination of explicit verification requirements, 3) use of certified models of subsystem capability in lieu of requirements when appropriate and 4) certification of capability beyond minimum required capability.

  9. The MAGIC of CINEMA: first in-flight science results from a miniaturised anisotropic magnetoresistive magnetometer

    NASA Astrophysics Data System (ADS)

    Archer, M. O.; Horbury, T. S.; Brown, P.; Eastwood, J. P.; Oddy, T. M.; Whiteside, B. J.; Sample, J. G.

    2015-06-01

    We present the first in-flight results from a novel miniaturised anisotropic magnetoresistive space magnetometer, MAGIC (MAGnetometer from Imperial College), aboard the first CINEMA (CubeSat for Ions, Neutrals, Electrons and MAgnetic fields) spacecraft in low Earth orbit. An attitude-independent calibration technique is detailed using the International Geomagnetic Reference Field (IGRF), which is temperature dependent in the case of the outboard sensor. We show that the sensors accurately measure the expected absolute field to within 2% in attitude mode and 1% in science mode. Using a simple method we are able to estimate the spacecraft's attitude using the magnetometer only, thus characterising CINEMA's spin, precession and nutation. Finally, we show that the outboard sensor is capable of detecting transient physical signals with amplitudes of ~ 20-60 nT. These include field-aligned currents at the auroral oval, qualitatively similar to previous observations, which agree in location with measurements from the DMSP (Defense Meteorological Satellite Program) and POES (Polar-orbiting Operational Environmental Satellites) spacecraft. Thus, we demonstrate and discuss the potential science capabilities of the MAGIC instrument onboard a CubeSat platform.

  10. Design analysis of levitation facility for space processing applications. [Skylab program, space shuttles

    NASA Technical Reports Server (NTRS)

    Frost, R. T.; Kornrumpf, W. P.; Napaluch, L. J.; Harden, J. D., Jr.; Walden, J. P.; Stockhoff, E. H.; Wouch, G.; Walker, L. H.

    1974-01-01

    Containerless processing facilities for the space laboratory and space shuttle are defined. Materials process examples representative of the most severe requirements for the facility in terms of electrical power, radio frequency equipment, and the use of an auxiliary electron beam heater were used to discuss matters having the greatest effect upon the space shuttle pallet payload interfaces and envelopes. Improved weight, volume, and efficiency estimates for the RF generating equipment were derived. Results are particularly significant because of the reduced requirements for heat rejection from electrical equipment, one of the principal envelope problems for shuttle pallet payloads. It is shown that although experiments on containerless melting of high temperature refractory materials make it desirable to consider the highest peak powers which can be made available on the pallet, total energy requirements are kept relatively low by the very fast processing times typical of containerless experiments, which allows consideration of heat rejection capabilities lower than the peak power demand if energy storage in system heat capacitances is considered. Batteries are considered to avoid a requirement for fuel cells capable of furnishing this brief peak power demand.
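
    The energy-storage argument can be made concrete with a rough calculation. The numbers below (20 kW peak for a 60 s melt cycle, with the stored heat rejected over roughly one 90-minute orbit) are assumed for illustration and are not from the study:

```python
# hypothetical numbers illustrating peak power vs. average heat rejection
peak_power_kw = 20.0        # brief RF heating demand (assumed)
process_time_s = 60.0       # fast containerless melt cycle (assumed)
rejection_time_s = 5400.0   # time available to reject the stored heat

# energy absorbed into system heat capacitance during the process
energy_mj = peak_power_kw * 1e3 * process_time_s / 1e6

# continuous rejection rate needed if the heat is dumped slowly
avg_rejection_w = energy_mj * 1e6 / rejection_time_s
print(energy_mj, avg_rejection_w)   # 1.2 MJ stored, ~222 W continuous
```

Even though the instantaneous demand is 20 kW, the short duty cycle means the radiator only needs to handle a few hundred watts on average, which is the point the abstract makes about heat rejection capability versus peak power demand.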

  11. Development of biomechanical models for human factors evaluations

    NASA Technical Reports Server (NTRS)

    Woolford, Barbara; Pandya, Abhilash; Maida, James

    1991-01-01

    Previewing human capabilities in a computer-aided engineering mode has assisted greatly in planning well-designed systems without the cost and time involved in mockups and engineering models. To date, the computer models have focused on such variables as field of view, accessibility and fit, and reach envelopes. Program outputs have matured from simple static pictures to animations viewable from any eyepoint. However, while kinematics models are available, there are few biomechanical models available for estimating strength and motion patterns. Those, such as Crew Chief, that are available are based on strength measurements taken in specific positions. Johnson Space Center is pursuing a biomechanical model which will use strength data collected on single joints at two or three velocities to attempt to predict compound motions of several joints simultaneously and the resulting force at the end effector. Two lines of research are coming together to produce this result. One is an attempt to use optimal control theory to predict joint motion in complex motions, and another is the development of graphical representation of human capabilities. The progress to date in this research is described.

  12. Estimating global per-capita carbon emissions with VIIRS nighttime lights satellite data

    NASA Astrophysics Data System (ADS)

    Jasmin, T.; Desai, A. R.; Pierce, R. B.

    2015-12-01

    With the launch of the Suomi National Polar-orbiting Partnership (NPP) satellite in November 2011, we now have a nighttime lights remote sensing capability vastly improved over that of the predecessor Defense Meteorological Satellite Program (DMSP), owing to the improved spatial and radiometric resolution provided by the Visible Infrared Imaging Radiometer Suite (VIIRS) Day Night Band (DNB), along with technology improvements in data transfer, processing, and storage. This development opens doors for novel scientific applications utilizing remotely sensed low-level visible light, for purposes ranging from estimating population to inferring factors relating to economic development. For example, the success of future international agreements to reduce greenhouse gas emissions will depend on mechanisms for remote compliance monitoring. Here, we discuss the implementation and evaluation of the VRCE system (VIIRS Remote Carbon Estimates), developed at the University of Wisconsin-Madison, which provides monthly independent, unbiased estimates of per-capita carbon emissions. Cloud-free global composites of Earth nocturnal lighting are generated from VIIRS DNB at full spatial resolution (750 meters). A population equation is derived from a linear regression of DNB radiance sums at the state level to U.S. Census data. CO2 emissions are derived from a linear regression of VIIRS DNB radiance sums to U.S. Department of Energy emission estimates. Regional coefficients for factors such as the percentage of energy use from renewable sources are factored in, and together these equations are used to generate per-capita CO2 emission estimates at the country level.
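
    A minimal sketch of the regression chain described above, using synthetic stand-ins for the DNB radiance sums, census populations, and DOE emission estimates. None of these values or coefficients come from VRCE; they only illustrate the two linear regressions combined into a per-capita estimate:

```python
import numpy as np

# hypothetical state-level training data (synthetic, for illustration)
radiance_sum = np.array([1.2e6, 3.4e6, 0.8e6, 5.1e6, 2.2e6])  # DNB sums
population = np.array([2.1e6, 6.0e6, 1.3e6, 9.2e6, 3.9e6])    # census
emissions_mt = np.array([35.0, 98.0, 22.0, 150.0, 64.0])      # Mt CO2/yr

# the two linear regressions described in the abstract
pop_slope, pop_icpt = np.polyfit(radiance_sum, population, 1)
em_slope, em_icpt = np.polyfit(radiance_sum, emissions_mt, 1)

def per_capita_co2(radiance):
    """Per-capita CO2 (t/person/yr) from a region's DNB radiance sum."""
    pop = pop_slope * radiance + pop_icpt
    em = em_slope * radiance + em_icpt
    return em * 1e6 / pop      # Mt -> t, divided by estimated population

print(per_capita_co2(2.0e6))
```

The operational system additionally folds in regional coefficients (e.g., renewable energy share) before producing country-level estimates, which this sketch omits.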

  13. RL10 Engine Ability to Transition from Atlas to Shuttle/Centaur Program

    NASA Technical Reports Server (NTRS)

    Baumeister, Joseph F.

    2015-01-01

    A key launch vehicle design feature is the ability to take advantage of new technologies while minimizing expensive and time consuming development and test programs. With successful space launch experiences and the unique features of both the National Aeronautics and Space Administration (NASA) Space Transportation System (Space Shuttle) and Atlas/Centaur programs, it became attractive to leverage these capabilities. The Shuttle/Centaur Program was created to transition the existing Centaur vehicle to be launched from the Space Shuttle cargo bay. This provided the ability to launch heavier and larger payloads, and take advantage of new unique launch operational capabilities. A successful Shuttle/Centaur Program required the Centaur main propulsion system to quickly accommodate the new operating conditions for two new Shuttle/Centaur configurations and evolve to function in the human Space Shuttle environment. This paper describes the transition of the Atlas/Centaur RL10 engine to the Shuttle/Centaur configurations; shows the unique versatility and capability of the engine; and highlights the importance of ground testing. Propulsion testing outcomes emphasize the value added benefits of testing heritage hardware and the significant impact to existing and future programs.

  14. RL10 Engine Ability to Transition from Atlas to Shuttle/Centaur Program

    NASA Technical Reports Server (NTRS)

    Baumeister, Joseph F.

    2014-01-01

    A key launch vehicle design feature is the ability to take advantage of new technologies while minimizing expensive and time consuming development and test programs. With successful space launch experiences and the unique features of both the National Aeronautics and Space Administration (NASA) Space Transportation System (Space Shuttle) and Atlas/Centaur programs, it became attractive to leverage these capabilities. The Shuttle/Centaur Program was created to transition the existing Centaur vehicle to be launched from the Space Shuttle cargo bay. This provided the ability to launch heavier and larger payloads, and take advantage of new unique launch operational capabilities. A successful Shuttle/Centaur Program required the Centaur main propulsion system to quickly accommodate the new operating conditions for two new Shuttle/Centaur configurations and evolve to function in the human Space Shuttle environment. This paper describes the transition of the Atlas/Centaur RL10 engine to the Shuttle/Centaur configurations; shows the unique versatility and capability of the engine; and highlights the importance of ground testing. Propulsion testing outcomes emphasize the value added benefits of testing heritage hardware and the significant impact to existing and future programs.

  15. Program Facilitates CMMI Appraisals

    NASA Technical Reports Server (NTRS)

    Sweetser, Wesley

    2005-01-01

    A computer program has been written to facilitate appraisals according to the methodology of Capability Maturity Model Integration (CMMI). [CMMI is a government/industry standard, maintained by the Software Engineering Institute at Carnegie Mellon University, for objectively assessing the engineering capability and maturity of an organization (especially, an organization that produces software)]. The program assists in preparation for a CMMI appraisal by providing drop-down lists suggesting required artifacts or evidence. It identifies process areas for which similar evidence is required and includes a copy feature that reduces or eliminates repetitive data entry. It generates reports to show the entire framework for reference, the appraisal artifacts to determine readiness for an appraisal, and lists of interviewees and questions to ask them during the appraisal. During an appraisal, the program provides screens for entering observations and ratings, and reviewing evidence provided thus far. Findings concerning strengths and weaknesses can be exported for use in a report or a graphical presentation. The program generates a chart showing capability level ratings of the organization. A context-sensitive Windows help system enables a novice to use the program and learn about the CMMI appraisal process.

  16. The X2000 Program: An Institutional Approach to Enabling Smaller Spacecraft

    NASA Technical Reports Server (NTRS)

    Deutsch, Les; Salvo, Chris; Woerner, Dave

    2000-01-01

    NASA's X2000 Program is important for many reasons: it develops the technology that will enable new types of deep space exploration; it is a new, faster, and cheaper process for technology infusion into NASA missions; and it transfers these capabilities to US industry so they are available for future spacecraft. Many of these new capabilities are relevant to Earth missions as well. X2000 will work with the NASA Goddard Space Flight Center (and others) to help make these capabilities available to a larger community.

  17. Roll Damping Characterisation Program: User Guide

    DTIC Science & Technology

    2014-06-01

    integral to conducting accurate numerical simulations of maritime platforms in support of the Australian Defence Organisation’s capability acquisition...programs and the Royal Australian Navy’s in-theatre operations and through-life capability management. This report provides detailed operational...Research Scientist with the Australian Defence Science and Technology Organisation. After graduating from the University of Tasmania with a Bachelor

  18. Protecting Your Computer from Viruses

    ERIC Educational Resources Information Center

    Descy, Don E.

    2006-01-01

    A computer virus is defined as a software program capable of reproducing itself and usually capable of causing great harm to files or other programs on the same computer. The existence of computer viruses--or the necessity of avoiding viruses--is part of using a computer. With the advent of the Internet, the door was opened wide for these…

  19. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1990-01-01

    Archival reports on developments in programs managed by the JPL Office of Telecommunications and Data Acquisition (TDA) are provided. Topics covered include: DSN advanced systems (tracking and ground-based navigation; communications, spacecraft-ground; and station control and system technology) and DSN systems implementation (capabilities for existing projects; capabilities for new projects; TDA program management and analysis; and Goldstone solar system radar).

  20. At the Edge of Translation – Materials to Program Cells for Directed Differentiation

    PubMed Central

    Arany, Praveen R; Mooney, David J

    2010-01-01

    The rapid advancement in basic biology knowledge, especially in the stem cell field, has created new opportunities to develop biomaterials capable of orchestrating the behavior of transplanted and host cells. Based on our current understanding of cellular differentiation, a conceptual framework for the use of materials to program cells in situ is presented, namely a domino versus a switchboard model, to highlight the use of single versus multiple cues in a controlled manner to modulate biological processes. Further, specific design principles of material systems to present soluble and insoluble cues that are capable of recruiting, programming and deploying host cells for various applications are presented. The evolution of biomaterials from simple inert substances used to fill defects, to the recent development of sophisticated material systems capable of programming cells in situ is providing a platform to translate our understanding of basic biological mechanisms to clinical care. PMID:20860763

  1. Program CONTRAST--A general program for the analysis of several survival or recovery rate estimates

    USGS Publications Warehouse

    Hines, J.E.; Sauer, J.R.

    1989-01-01

    This manual describes the use of program CONTRAST, which implements a generalized procedure for the comparison of several rate estimates. This method can be used to test both simple and composite hypotheses about rate estimates, and we discuss its application to multiple comparisons of survival rate estimates. Several examples of the use of program CONTRAST are presented. Program CONTRAST will run on IBM-compatible computers, and requires estimates of the rates to be tested, along with associated variance and covariance estimates.
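
    The generalized comparison CONTRAST implements can be sketched as a Wald-type chi-square test on linear contrasts of the rate estimates, using the rates and their variance-covariance matrix as inputs. The rates, variances, and contrast matrix below are hypothetical:

```python
import numpy as np

def contrast_test(theta, cov, C):
    """Wald-type chi-square statistic for H0: C @ theta = 0.
    theta: vector of rate estimates; cov: their variance-covariance
    matrix; C: contrast matrix (one contrast per row). Sketch of the
    general procedure, not CONTRAST's actual code."""
    d = C @ theta
    stat = d @ np.linalg.solve(C @ cov @ C.T, d)
    return stat, C.shape[0]          # statistic and degrees of freedom

# example: test equality of three survival-rate estimates
theta = np.array([0.62, 0.55, 0.70])
cov = np.diag([0.0016, 0.0025, 0.0012])   # variances, no covariances here
C = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])
stat, df = contrast_test(theta, cov, C)
print(stat, df)   # compare stat to a chi-square critical value with df = 2
```

Composite hypotheses are handled the same way by choosing different rows for the contrast matrix C.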

  2. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Technical Reports Server (NTRS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability provides a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and describes, step by step, the process the interface uses, with an example.

  3. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Astrophysics Data System (ADS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability provides a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and describes, step by step, the process the interface uses, with an example.

  4. Structural Reliability Using Probability Density Estimation Methods Within NESSUS

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C. (Technical Monitor); Godines, Cody Ric

    2003-01-01

    A reliability analysis studies a mathematical model of a physical system taking into account uncertainties of design variables, and common results are estimations of a response density, which also implies estimations of its parameters. Some common density parameters include the mean value, the standard deviation, and specific percentile(s) of the response, which are measures of central tendency, variation, and probability regions, respectively. Reliability analyses are important since the results can lead to different designs by calculating the probability of observing safe responses in each of the proposed designs. All of this is done at the expense of added computational time as compared to a single deterministic analysis, which will result in one value of the response out of many that make up the density of the response. Sampling methods, such as Monte Carlo (MC) and Latin hypercube sampling (LHS), can be used to perform reliability analyses and can compute nonlinear response density parameters even if the response is dependent on many random variables. Hence, both methods are very robust; however, they are computationally expensive to use in the estimation of the response density parameters. Both are among the 13 stochastic methods contained within the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) program. NESSUS is a probabilistic finite element analysis (FEA) program that was developed through funding from NASA Glenn Research Center (GRC). It has the additional capability of being linked to other analysis programs; therefore, probabilistic fluid dynamics, fracture mechanics, and heat transfer are only a few of the analyses possible with this software. The LHS method is the newest addition to the stochastic methods within NESSUS. Part of this work was to enhance NESSUS with the LHS method.
The new LHS module is complete, has been successfully integrated with NESSUS, and has been used to study four different test cases that have been proposed by the Society of Automotive Engineers (SAE). The test cases compare different probabilistic methods within NESSUS because it is important that a user can have confidence that estimates of stochastic parameters of a response will be within an acceptable error limit. For each response, the mean, standard deviation, and 0.99 percentile are repeatedly estimated, which allows confidence statements to be made for each parameter estimated and for each method. Thus, the ability of several stochastic methods to efficiently and accurately estimate density parameters is compared using four valid test cases. While all of the reliability methods performed quite well, the new LHS module within NESSUS was found to have a lower estimation error than MC when estimating the mean, standard deviation, and 0.99 percentile of the four different stochastic responses. LHS also required fewer calculations than MC to obtain low-error answers with high confidence. It can therefore be stated that NESSUS is an important reliability tool with a variety of sound probabilistic methods a user can employ, and the newest LHS module is a valuable enhancement of the program.
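
    The MC-versus-LHS comparison can be illustrated in one dimension: LHS places exactly one sample in each of n equal-probability strata, which typically reduces the error of a mean estimate for a smooth response. This is a toy sketch with an assumed response function, not one of the SAE test cases:

```python
import numpy as np

rng = np.random.default_rng(42)
n, trials = 100, 200

def response(u):
    """Toy nonlinear response of a Uniform(0,1) design variable."""
    return np.exp(u) + u ** 2

def lhs_uniform(n, rng):
    """1-D Latin hypercube sample: one uniform point in each of n strata."""
    strata = (np.arange(n) + rng.random(n)) / n
    return rng.permutation(strata)

# analytic mean: E[exp(U)] + E[U^2] = (e - 1) + 1/3
true_mean = np.e - 1 + 1 / 3

mc_err, lhs_err = [], []
for _ in range(trials):
    mc_err.append(abs(response(rng.random(n)).mean() - true_mean))
    lhs_err.append(abs(response(lhs_uniform(n, rng)).mean() - true_mean))

print(np.mean(mc_err), np.mean(lhs_err))  # LHS error is typically smaller
```

The stratification removes most of the between-stratum variance, which is why LHS reaches a given error level with fewer samples than plain MC for smooth responses.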

  5. Space shuttle propulsion estimation development verification, volume 1

    NASA Technical Reports Server (NTRS)

    Rogers, Robert M.

    1989-01-01

    The results of the Propulsion Estimation Development Verification are summarized. A computer program developed under a previous contract (NAS8-35324) was modified to include improved models for the Solid Rocket Booster (SRB) internal ballistics, the Space Shuttle Main Engine (SSME) power coefficient model, the vehicle dynamics using quaternions, and an improved Kalman filter algorithm based on the U-D factorized algorithm. As additional output, the estimated propulsion performance for each device is computed with the associated 1-sigma bounds. The outputs of the estimation program are provided in graphical plots. An additional effort was expended to examine the use of the estimation approach to evaluate single engine test data. In addition to the propulsion estimation program PFILTER, a program was developed to produce a best estimate of trajectory (BET). This program, LFILTER, also uses the U-D factorized form of the Kalman filter, as does PFILTER. The necessary definitions and equations explaining the Kalman filtering approach for the PFILTER program, the dynamics and measurement models used for this application, the program description, and program operation are presented.
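
    For reference, a single predict/update cycle of a standard Kalman filter is sketched below on a toy constant-velocity tracking problem. PFILTER and LFILTER use the numerically stabler U-D factorized form of the same algorithm, which this sketch does not implement; all matrices and measurements here are hypothetical:

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a textbook Kalman filter.
    (The U-D factorized form propagates P as U diag(D) U^T for
    numerical stability; this sketch uses the plain covariance form.)"""
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update with measurement z
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# constant-velocity toy model: state = [position, velocity]
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # unit time step
H = np.array([[1.0, 0.0]])               # measure position only
Q = 0.01 * np.eye(2)
R = np.array([[0.25]])
x, P = np.zeros(2), np.eye(2)
for z in [1.0, 2.1, 2.9, 4.2]:           # noisy position measurements
    x, P = kalman_step(x, P, np.array([z]), F, Q, H, R)
print(x)   # estimated position near 4, velocity near 1
```

The propulsion application replaces this toy model with the SRB/SSME dynamics and measurement models described in the report.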

  6. Variational Iterative Refinement Source Term Estimation Algorithm Assessment for Rural and Urban Environments

    NASA Astrophysics Data System (ADS)

    Delle Monache, L.; Rodriguez, L. M.; Meech, S.; Hahn, D.; Betancourt, T.; Steinhoff, D.

    2016-12-01

    It is necessary to accurately estimate the initial source characteristics in the event of an accidental or intentional release of a Chemical, Biological, Radiological, or Nuclear (CBRN) agent into the atmosphere. Accurate estimation of the source characteristics is important because they are often unknown and the Atmospheric Transport and Dispersion (AT&D) models rely heavily on these estimates to create hazard assessments. To correctly assess the source characteristics in an operational environment where time is critical, the National Center for Atmospheric Research (NCAR) has developed a Source Term Estimation (STE) method, known as the Variational Iterative Refinement STE algorithm (VIRSA). VIRSA consists of a combination of modeling systems. These systems include an AT&D model, its corresponding STE model, a Hybrid Lagrangian-Eulerian Plume Model (H-LEPM), and its mathematical adjoint model. In an operational scenario where we have information regarding the infrastructure of a city, the AT&D model used is the Urban Dispersion Model (UDM), and when using this model in VIRSA we refer to the system as uVIRSA. In all other scenarios where we do not have the city infrastructure information readily available, the AT&D model used is the Second-order Closure Integrated PUFF model (SCIPUFF) and the system is referred to as sVIRSA. VIRSA was originally developed using SCIPUFF 2.4 for the Defense Threat Reduction Agency and integrated into the Hazard Prediction and Assessment Capability and Joint Program for Information Systems Joint Effects Model. The results discussed here are the verification and validation of the upgraded system with SCIPUFF 3.0 and the newly implemented UDM capability. To verify uVIRSA and sVIRSA, synthetic concentration observation scenarios were created in urban and rural environments and the results of this verification are shown.
Finally, we validate the STE performance of uVIRSA using scenarios from the Joint Urban 2003 (JU03) experiment, which was held in Oklahoma City and also validate the performance of sVIRSA using scenarios from the FUsing Sensor Integrated Observing Network (FUSION) Field Trial 2007 (FFT07), held at Dugway Proving Grounds in rural Utah.

  7. Establishment of a Photon Data Section of the BNL National Nuclear Data Center: A preliminary proposal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanson, A.L.; Pearlstein, S.

    1992-05-01

    It is proposed to establish a Photon Data Section (PDS) of the BNL National Nuclear Data Center (NNDC). This would be a total program encompassing both photon-atom and photon-nucleus interactions. By utilizing the existing NNDC data base management expertise and on-line access capabilities, the implementation of photon interaction data activities within the existing NNDC nuclear structure and nuclear-reaction activities can reestablish a viable photon interaction data program at minimum cost. By taking advantage of the on-line capabilities, the x-ray users' community will have access to a dynamic, state-of-the-art data base of interaction information. The proposed information base would include data that presently are scattered throughout the literature usually in tabulated form. It is expected that the data bases would include at least the most precise data available in photoelectric cross sections, atomic form factors and incoherent scattering functions, anomalous scattering factors, oscillator strengths and oscillator densities, fluorescence yields, Auger electron yields, etc. It could also include information not presently available in tabulations or in existing data bases such as EXAFS (extended x-ray absorption fine structure) reference spectra, chemical bonding induced shifts in the photoelectric absorption edge, matrix corrections, x-ray Raman, and x-ray resonant Raman cross sections. The data base will also include the best estimates of the accuracy of the interaction data as it exists in the data base. It is proposed that the PDS would support computer programs written for calculating scattering cross sections for given solid angles, sample geometries, and polarization of incident x-rays, for calculating Compton profiles, and for analyzing data as in EXAFS and x-ray fluorescence.
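
    As an example of the kind of scattering-cross-section computation the proposed PDS would support, the unpolarized Klein-Nishina differential cross section for Compton scattering can be evaluated directly. This is the standard textbook formula, not code from the proposal:

```python
import math

R_E = 2.8179403262e-15     # classical electron radius (m)
ME_C2_KEV = 510.99895      # electron rest energy (keV)

def klein_nishina(e_kev, theta):
    """Unpolarized Klein-Nishina differential cross section (m^2/sr)
    for a photon of energy e_kev Compton-scattered at angle theta."""
    # scattered/incident photon energy ratio (Compton formula)
    ratio = 1.0 / (1.0 + (e_kev / ME_C2_KEV) * (1.0 - math.cos(theta)))
    return 0.5 * R_E ** 2 * ratio ** 2 * (
        ratio + 1.0 / ratio - math.sin(theta) ** 2)

# 100 keV photon scattered at 90 degrees
print(klein_nishina(100.0, math.pi / 2))
```

At theta = 0 the expression reduces to the Thomson forward value, r_e^2, a convenient sanity check for this kind of tabulation code.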

  9. An Automated Strategy for Binding-Pose Selection and Docking Assessment in Structure-Based Drug Design.

    PubMed

    Ballante, Flavio; Marshall, Garland R

    2016-01-25

    Molecular docking is a widely used technique in drug design to predict the binding pose of a candidate compound in a defined therapeutic target. Numerous docking protocols are available, each characterized by different search methods and scoring functions, thus providing variable predictive capability on the same ligand-protein system. To validate a docking protocol, it is necessary to determine a priori its ability to reproduce the experimental binding pose (i.e., by determining the docking accuracy (DA)) in order to select the most appropriate docking procedure and thus estimate the rate of success in docking novel compounds. As common docking programs generally use different root-mean-square deviation (RMSD) formulas, scoring functions, and result formats, it is both difficult and time-consuming to consistently determine and compare their predictive capabilities in order to identify the best protocol to use for the target of interest and to extrapolate the binding poses (i.e., best-docked (BD), best-cluster (BC), and best-fit (BF) poses) when applying a given docking program over thousands/millions of molecules during virtual screening. To reduce this difficulty, two new procedures called Clusterizer and DockAccessor have been developed and implemented for use with some common and "free-for-academics" programs such as AutoDock4, AutoDock4(Zn), AutoDock Vina, DOCK, MpSDockZn, PLANTS, and Surflex-Dock to automatically extrapolate BD, BC, and BF poses as well as to perform consistent cluster and DA analyses. Clusterizer and DockAccessor (code available over the Internet) represent two novel tools to collect computationally determined poses and detect the most predictive docking approach. Herein, an application to human lysine deacetylase (hKDAC) inhibitors is illustrated.

  10. SMP: A solid modeling program version 2.0

    NASA Technical Reports Server (NTRS)

    Randall, D. P.; Jones, K. H.; Vonofenheim, W. H.; Gates, R. L.; Matthews, C. G.

    1986-01-01

    The Solid Modeling Program (SMP) provides the capability to model complex solid objects through the composition of primitive geometric entities. In addition to the construction of solid models, SMP has extensive facilities for model editing, display, and analysis. The geometric model produced by the software system can be output in a format compatible with existing analysis programs such as PATRAN-G. The present version of the SMP software supports six primitives: boxes, cones, spheres, paraboloids, tori, and trusses. The details for creating each of the major primitive types are presented. The analysis capabilities of SMP, including interfaces to existing analysis programs, are discussed.

  11. A Government/Industry Summary of the Design Analysis Methods for Vibrations (DAMVIBS) Program

    NASA Technical Reports Server (NTRS)

    Kvaternik, Raymond G. (Compiler)

    1993-01-01

    The NASA Langley Research Center in 1984 initiated a rotorcraft structural dynamics program, designated DAMVIBS (Design Analysis Methods for VIBrationS), with the objective of establishing the technology base needed by the rotorcraft industry for developing an advanced finite-element-based dynamics design analysis capability for vibrations. An assessment of the program showed that the DAMVIBS Program has resulted in notable technical achievements and major changes in industrial design practice, all of which have significantly advanced the industry's capability to use and rely on finite-element-based dynamics analyses during the design process.

  12. Acoustic radiation from lined, unflanged ducts: Acoustic source distribution program

    NASA Technical Reports Server (NTRS)

    Beckemeyer, R. J.; Sawdy, D. T.

    1971-01-01

    An acoustic radiation analysis was developed to predict the far-field characteristics of fan noise radiated from an acoustically lined unflanged duct. This analysis is comprised of three modular digital computer programs which together provide a capability of accounting for the impedance mismatch at the duct exit plane. Admissible duct configurations include circular or annular, with or without an extended centerbody. This variation in duct configurations provides a capability of modeling inlet and fan duct noise radiation. The computer programs are described in detail.

  13. Helicopter crashworthiness research program

    NASA Technical Reports Server (NTRS)

    Farley, Gary L.; Boitnott, Richard L.; Carden, Huey D.

    1988-01-01

    Results are presented from the U.S. Army Aerostructures Directorate/NASA Langley Research Center joint research program on helicopter crashworthiness. Through the ongoing research program, an in-depth understanding was developed of the cause/effect relationships between material and architectural variables and the energy-absorption capability of composite materials and structures. Composite materials were found to be efficient energy absorbers. Graphite/epoxy subfloor structures were more efficient energy absorbers than comparable structures fabricated from Kevlar or aluminum. An accurate method for predicting the energy-absorption capability of beams was developed.

  14. Physician capability to electronically exchange clinical information, 2011.

    PubMed

    Patel, Vaishali; Swain, Matthew J; King, Jennifer; Furukawa, Michael F

    2013-10-01

    To provide national estimates of physician capability to electronically share clinical information with other providers and to describe variation in exchange capability across states and electronic health record (EHR) vendors using the 2011 National Ambulatory Medical Care Survey Electronic Medical Record Supplement. Survey of a nationally representative sample of nonfederal office-based physicians who provide direct patient care. The survey was administered by mail with telephone follow-up and had a 61% weighted response rate. The overall sample consisted of 4326 respondents. We calculated estimates of electronic exchange capability at the national and state levels, and applied multivariate analyses to examine the association between the capability to exchange different types of clinical information and physician and practice characteristics. In 2011, 55% of physicians had computerized capability to send prescriptions electronically; 67% had the capability to view lab results electronically; 42% were able to incorporate lab results into their EHR; 35% were able to send lab orders electronically; and 31% exchanged patient clinical summaries with other providers. The strongest predictor of exchange capability is adoption of an EHR. However, substantial variation exists across geography and EHR vendors in exchange capability, especially electronic exchange of clinical summaries. In 2011, a majority of office-based physicians could exchange lab and medication data, and approximately one-third could exchange clinical summaries with patients or other providers. EHRs serve as a key mechanism by which physicians can exchange clinical data, though physicians' capability to exchange varies by vendor and by state.

  15. Genetic network inference as a series of discrimination tasks.

    PubMed

    Kimura, Shuhei; Nakayama, Satoshi; Hatakeyama, Mariko

    2009-04-01

    Genetic network inference methods based on sets of differential equations generally require a great deal of time, as the equations must be solved many times. To reduce the computational cost, researchers have proposed other methods for inferring genetic networks by solving sets of differential equations only a few times, or even without solving them at all. When we try to obtain reasonable network models using these methods, however, we must estimate the time derivatives of the gene expression levels with great precision. In this study, we propose a new method to overcome the drawbacks of inference methods based on sets of differential equations. Our method infers genetic networks by obtaining classifiers capable of predicting the signs of the derivatives of the gene expression levels. For this purpose, we defined a genetic network inference problem as a series of discrimination tasks, then solved the defined series of discrimination tasks with a linear programming machine. Our experimental results demonstrated that the proposed method is capable of correctly inferring genetic networks, and doing so more than 500 times faster than the other inference methods based on sets of differential equations. Next, we applied our method to actual expression data of the bacterial SOS DNA repair system. And finally, we demonstrated that our approach relates to the inference method based on the S-system model. Though our method provides no estimation of the kinetic parameters, it should be useful for researchers interested only in the network structure of a target system. Supplementary data are available at Bioinformatics online.
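    The sign-discrimination idea described in this record can be illustrated with a minimal sketch. This is not the authors' code: their linear programming machine is replaced here by a plain perceptron, and the two-gene system is hypothetical. A linear classifier is trained to predict the sign of one gene's expression derivative from the expression levels of all genes; the signs of the learned weights then suggest activating or repressing regulatory links.

```python
import numpy as np

def fit_sign_classifier(X, s, epochs=200, lr=0.1):
    """Perceptron predicting the sign of one gene's time derivative.

    X : (n_samples, n_genes) expression levels
    s : (n_samples,) signs (+1/-1) of the target gene's derivative
    Returns the weight vector; the sign of w_j suggests whether gene j
    activates (+) or represses (-) the target gene.
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, y in zip(X, s):
            if y * (w @ x) <= 0:   # misclassified sample -> perceptron update
                w += lr * y * x
    return w

# Hypothetical two-gene system: dx0/dt = x1 - x0 (gene 1 activates gene 0,
# gene 0 self-decays); samples near the decision boundary are dropped so the
# data are separable with a margin and the perceptron converges.
rng = np.random.default_rng(0)
X = rng.uniform(0.1, 1.0, size=(200, 2))
deriv = X[:, 1] - X[:, 0]
mask = np.abs(deriv) > 0.1
w = fit_sign_classifier(X[mask], np.sign(deriv[mask]))
print(np.sign(w))   # expected: [-1.  1.] (repression by x0, activation by x1)
```

    The weight signs, not their magnitudes, carry the structural information here, which mirrors why the approach can skip solving differential equations entirely.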

  16. Development of NSSS Thermal-Hydraulic Model for KNPEC-2 Simulator Using the Best-Estimate Code RETRAN-3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kyung-Doo; Jeong, Jae-Jun; Lee, Seung-Wook

    The Nuclear Steam Supply System (NSSS) thermal-hydraulic model adopted in the Korea Nuclear Plant Education Center (KNPEC)-2 simulator was provided in the early 1980s. The reference plant for KNPEC-2 is the Yong Gwang Nuclear Unit 1, which is a Westinghouse-type 3-loop, 950 MW(electric) pressurized water reactor. Because of the limited computational capability at that time, it uses overly simplified physical models and assumptions for a real-time simulation of NSSS thermal-hydraulic transients. This may entail inaccurate results and thus the possibility of so-called "negative training," especially for complicated two-phase flows in the reactor coolant system. To resolve the problem, we developed a realistic NSSS thermal-hydraulic program (named ARTS code) based on the best-estimate code RETRAN-3D. The systematic assessment of ARTS has been conducted by both a stand-alone test and an integrated test in the simulator environment. The non-integrated stand-alone test (NIST) results were reasonable in terms of accuracy, real-time simulation capability, and robustness. After successful completion of the NIST, ARTS was integrated with a 3-D reactor kinetics model and other system models. The site acceptance test (SAT) has been completed successfully and confirmed to comply with the ANSI/ANS-3.5-1998 simulator software performance criteria. This paper presents our efforts for the ARTS development and some test results of the NIST and SAT.

  17. Computer modelling of cyclic deformation of high-temperature materials. Technical progress report, 16 November 1992-15 February 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duesbery, M.S.

    1993-02-26

    This program aims at improving current methods of lifetime assessment by building in the characteristics of the micro-mechanisms known to be responsible for damage and failure. The broad approach entails the integration and, where necessary, augmentation of the micro-scale research results currently available in the literature into a macro-scale model with predictive capability. In more detail, the program will develop a set of hierarchically structured models at different length scales, from atomic to macroscopic, at each level taking as parametric input the results of the model at the next smaller scale. In this way the known microscopic properties can be transported by systematic procedures to the unknown macro-scale region. It may not be possible to eliminate empiricism completely, because some of the quantities involved cannot yet be estimated to the required degree of precision. In this case the aim will be at least to eliminate functional empiricism.

  18. Efficient Convex Optimization for Energy-Based Acoustic Sensor Self-Localization and Source Localization in Sensor Networks.

    PubMed

    Yan, Yongsheng; Wang, Haiyan; Shen, Xiaohong; Leng, Bing; Li, Shuangquan

    2018-05-21

    Energy readings are an efficient and attractive measurement for collaborative acoustic source localization in practical applications, owing to their savings in both energy and computational cost. Maximum-likelihood problems are derived by fusing the acoustic energy readings transmitted from local sensors. Aiming to efficiently solve the nonconvex objective of the optimization problem, we present an approximate estimator of the original problem. Then, a direct norm relaxation and a semidefinite relaxation, respectively, are used to derive second-order cone programs, semidefinite programs, or mixtures of them for both sensor self-localization and source localization. Furthermore, by taking colored energy-reading noise into account, several minimax optimization problems are formulated, which are likewise relaxed via the direct norm relaxation and the semidefinite relaxation into convex optimization problems. A performance comparison with existing acoustic energy-based source localization methods is given, and the results show the validity of the proposed methods.
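    The energy-decay model underlying such methods admits a compact illustration. The sketch below is hypothetical and far cruder than the paper's SOCP/SDP relaxations: assuming the standard inverse-square model E_i = g/||x - p_i||^2 with unit gain, pairwise energy ratios give equations that are linear in the source position x and in r = ||x||^2, and dropping the coupling r = ||x||^2 leaves an ordinary least-squares problem.

```python
import numpy as np

def localize_from_energy(P, E):
    """Least-squares source localization from acoustic energy readings.

    Assumes E_i = g / ||x - p_i||^2 with unit gain g. For each sensor pair,
    E_i * d_i^2 = E_j * d_j^2 is linear in the unknowns u = [x, r] once r
    stands in for ||x||^2; the nonconvex constraint r = ||x||^2 is simply
    dropped (a crude stand-in for the paper's convex relaxations).
    P : (n, 2) sensor positions, E : (n,) energy readings.
    """
    A, b = [], []
    n = len(E)
    for i in range(n):
        for j in range(i + 1, n):
            A.append(np.concatenate([-2.0 * (E[i] * P[i] - E[j] * P[j]),
                                     [E[i] - E[j]]]))
            b.append(E[j] * P[j] @ P[j] - E[i] * P[i] @ P[i])
    u, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return u[:2]   # estimated source position; u[2] approximates ||x||^2

# Hypothetical layout: five sensors, noise-free readings from a unit source.
P = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0], [2.0, -1.0]])
source = np.array([1.0, 2.0])
E = 1.0 / np.sum((P - source) ** 2, axis=1)
print(localize_from_energy(P, E))   # recovers ~[1. 2.] in the noise-free case
```

    With noisy or colored readings this linearization degrades quickly, which is the motivation for the minimax formulations and semidefinite relaxations the abstract describes.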

  19. Advanced space engine preliminary design

    NASA Technical Reports Server (NTRS)

    Cuffe, J. P. B.; Bradie, R. E.

    1973-01-01

    A preliminary design was completed for an O2/H2, 89 kN (20,000 lb) thrust staged-combustion rocket engine that has a single-bell nozzle with an overall expansion ratio of 400:1. The engine has a best-estimate vacuum specific impulse of 4623.8 N-s/kg (471.5 sec) at full thrust and a mixture ratio of 6.0. The engine employs gear-driven, low-pressure pumps to provide low-NPSH capability, while individual turbine-driven, high-speed main pumps provide the system pressures required for high chamber pressure operation. The engine design dry weight for the fixed-nozzle configuration is 206.9 kg (456.3 lb). Engine overall length is 234 cm (92.1 in.). The extendible-nozzle version has a stowed length of 141.5 cm (55.7 in.). Critical technology items in the development of the engine were defined. Development program plans and their costs for development, production, operation, and flight support of the ASE were established for minimum-cost and minimum-time programs.

  20. Analytical study of striated nozzle flow with small radius of curvature ratio throats

    NASA Technical Reports Server (NTRS)

    Norton, D. J.; White, R. E.

    1972-01-01

    An analytical method was developed which is capable of estimating the chamber and throat conditions in a nozzle with a low radius of curvature throat. The method was programmed in standard FORTRAN IV and includes chemical equilibrium calculation subprograms (modified NASA Lewis program CEC71) as an integral part. The method determines detailed and gross rocket characteristics in the presence of striated flows and gives detailed results for the motor chamber and throat plane with as many as 20 discrete zones. The method employs a simultaneous solution of the mass, momentum, and energy equations and allows propellant types, O/F ratios, propellant distribution, nozzle geometry, and injection schemes to be varied so as to predict spatial velocity, density, pressure, and other thermodynamic variable distributions in the chamber as well as the throat. Results for small radius of curvature throats have shown good agreement with experimental results. Both gaseous and liquid injection may be considered, with frozen or equilibrium flow calculations.

  1. Applications of satellite ocean color sensors for monitoring and predicting harmful algal blooms

    USGS Publications Warehouse

    Stumpf, Richard P.

    2001-01-01

    The new satellite ocean color sensors offer a means of detecting and monitoring algal blooms in the ocean and coastal zone. Beginning with SeaWiFS (Sea Wide Field-of-view Sensor) in September 1997, these sensors provide coverage every 1 to 2 days with 1-km pixel view at nadir. Atmospheric correction algorithms designed for the coastal zone combined with regional chlorophyll algorithms can provide good and reproducible estimates of chlorophyll, providing the means of monitoring various algal blooms. Harmful algal blooms (HABs) caused by Karenia brevis in the Gulf of Mexico are particularly amenable to remote observation. The Gulf of Mexico has relatively clear water and K. brevis, in bloom conditions, tends to produce a major portion of the phytoplankton biomass. A monitoring program has begun in the Gulf of Mexico that integrates field data from state monitoring programs with satellite imagery, providing an improved capability for the monitoring of K. brevis blooms.

  2. Efficient Convex Optimization for Energy-Based Acoustic Sensor Self-Localization and Source Localization in Sensor Networks

    PubMed Central

    Yan, Yongsheng; Wang, Haiyan; Shen, Xiaohong; Leng, Bing; Li, Shuangquan

    2018-01-01

    Energy readings are an efficient and attractive measurement for collaborative acoustic source localization in practical applications, owing to their savings in both energy and computational cost. Maximum-likelihood problems are derived by fusing the acoustic energy readings transmitted from local sensors. Aiming to efficiently solve the nonconvex objective of the optimization problem, we present an approximate estimator of the original problem. Then, a direct norm relaxation and a semidefinite relaxation, respectively, are used to derive second-order cone programs, semidefinite programs, or mixtures of them for both sensor self-localization and source localization. Furthermore, by taking colored energy-reading noise into account, several minimax optimization problems are formulated, which are likewise relaxed via the direct norm relaxation and the semidefinite relaxation into convex optimization problems. A performance comparison with existing acoustic energy-based source localization methods is given, and the results show the validity of the proposed methods. PMID:29883410

  3. Preliminary performance estimates of an oblique, all-wing, remotely piloted vehicle for air-to-air combat

    NASA Technical Reports Server (NTRS)

    Nelms, W. P., Jr.; Bailey, R. O.

    1974-01-01

    A computerized aircraft synthesis program has been used to assess the effects of various vehicle and mission parameters on the performance of an oblique, all-wing, remotely piloted vehicle (RPV) for the highly maneuverable, air-to-air combat role. The study mission consists of an outbound cruise, an acceleration phase, a series of subsonic and supersonic turns, and a return cruise. The results are presented in terms of both the required vehicle weight to accomplish this mission and the combat effectiveness as measured by turning and acceleration capability. This report describes the synthesis program, the mission, the vehicle, and results from sensitivity studies. An optimization process has been used to establish the nominal RPV configuration of the oblique, all-wing concept for the specified mission. In comparison to a previously studied conventional wing-body canard design for the same mission, this oblique, all-wing nominal vehicle is lighter in weight and has higher performance.

  4. Reliability analysis of the F-8 digital fly-by-wire system

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Goodman, H. A.

    1981-01-01

    The F-8 Digital Fly-by-Wire (DFBW) flight test program, intended to provide the technology for advanced control systems that give aircraft enhanced performance and operational capability, is addressed. A detailed analysis of the experimental system was performed to estimate the probabilities of two significant safety-critical events: (1) loss of the primary flight control function, causing reversion to the analog bypass system; and (2) loss of the aircraft due to failure of the electronic flight control system. The analysis covers appraisal of risks due to random equipment failure, generic faults in the design of the system or its software, and induced failure due to external events. A unique diagrammatic technique was developed which details the combinatorial reliability equations for the entire system, promotes understanding of system failure characteristics, and identifies the most likely failure modes. The technique provides a systematic method of applying basic probability equations and is augmented by a computer program, written in a modular fashion, that duplicates the structure of these equations.

  5. Wastewater reclamation and recharge: A water management strategy for Albuquerque

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorder, P.J.; Brunswick, R.J.; Bockemeier, S.W.

    1995-12-31

    Approximately 61,000 acre-feet of the pumped water is annually discharged to the Rio Grande as treated wastewater. Albuquerque's Southside Water Reclamation Plant (SWRP) is the primary wastewater treatment facility for most of the Albuquerque area. Its current design capacity is 76 million gallons per day (mgd), which is expected to be adequate until about 2004. A master plan currently is being prepared (discussed here in the Wastewater Master Planning and the Zero Discharge Concept section) to provide guidelines for future expansions of the plant and wastewater infrastructure. Construction documents presently are being prepared to add ammonia and nitrogen removal capability to the plant, as required by its new discharge permit. The paper discusses water management strategies, indirect potable reuse for Albuquerque, water quality considerations for indirect potable reuse, treatment for potable reuse, geohydrological aspects of a recharge program, layout and estimated costs for a conceptual reclamation and recharge system, and work to be accomplished under phase 2 of the reclamation and recharge program.

  6. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.

  7. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the limitations of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority shortcomings within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.

  8. Fine tuning GPS clock estimation in the MCS

    NASA Technical Reports Server (NTRS)

    Hutsell, Steven T.

    1995-01-01

    With the completion of a 24-satellite operational constellation, GPS is fast approaching a critical milestone, Full Operational Capability (FOC). Although GPS is well capable of providing the timing accuracy and stability figures required by system specifications, the GPS community will continue to strive for further improvements in performance. The GPS Master Control Station (MCS) recently demonstrated that timing improvements are always possible through refinements to its composite clock and, hence, its Kalman filter state estimation, providing a small improvement to user accuracy.

  9. Time concurrency/phase-time synchronization in digital communications networks

    NASA Technical Reports Server (NTRS)

    Kihara, Masami; Imaoka, Atsushi

    1990-01-01

    Digital communications networks have the intrinsic capability of time synchronization, which makes it possible for networks to supply time signals to some applications and services. A practical estimation method for time concurrency on terrestrial networks is presented. Using this method, the time concurrency capability of the Nippon Telegraph and Telephone Corporation (NTT) digital communications network is estimated to be better than 300 ns rms at an advanced level, and 20 ns rms at the final level.

  10. 17 CFR 37.205 - Audit trail.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... trading; and (iv) Identification of each account to which fills are allocated. (3) Electronic analysis capability. A swap execution facility's audit trail program shall include electronic analysis capability with respect to all audit trail data in the transaction history database. Such electronic analysis capability...

  11. Validation and Error Characterization for the Global Precipitation Measurement

    NASA Technical Reports Server (NTRS)

    Bidwell, Steven W.; Adams, W. J.; Everett, D. F.; Smith, E. A.; Yuter, S. E.

    2003-01-01

    The Global Precipitation Measurement (GPM) is an international effort to increase scientific knowledge on the global water cycle with specific goals of improving the understanding and the predictions of climate, weather, and hydrology. These goals will be achieved through several satellites specifically dedicated to GPM along with the integration of numerous meteorological satellite data streams from international and domestic partners. The GPM effort is led by the National Aeronautics and Space Administration (NASA) of the United States and the National Space Development Agency (NASDA) of Japan. In addition to the spaceborne assets, international and domestic partners will provide ground-based resources for validating the satellite observations and retrievals. This paper describes the Global Precipitation Measurement validation effort to provide quantitative estimates of the errors of the GPM satellite retrievals. The GPM validation approach will build upon the research experience of the Tropical Rainfall Measuring Mission (TRMM) retrieval comparisons and its validation program. The GPM ground validation program will employ instrumentation, physical infrastructure, and research capabilities at Supersites located in important meteorological regimes of the globe. NASA will provide two Supersites, one in a tropical oceanic and the other in a mid-latitude continental regime. GPM international partners will provide Supersites for other important regimes. Those objectives or regimes not addressed by Supersites will be covered through focused field experiments. This paper describes the specific errors that GPM ground validation will address, quantify, and relate to the GPM satellite physical retrievals. GPM will attempt to identify the sources of error within retrievals, including those of instrument calibration, retrieval physical assumptions, and algorithm applicability. With the identification of error sources, improvements will be made to the respective calibration, assumption, or algorithm. The instrumentation and techniques of the Supersites will be discussed. The GPM core satellite, with its dual-frequency radar and conically scanning radiometer, will provide insight into precipitation drop-size distributions and potentially increased measurement capabilities of light rain and snowfall. The ground validation program will include instrumentation and techniques commensurate with these new measurement capabilities.

  12. Clients' experiences of a community based lifestyle modification program: a qualitative study.

    PubMed

    Chan, Ruth S M; Lok, Kris Y W; Sea, Mandy M M; Woo, Jean

    2009-10-01

    There is little information about how clients attending lifestyle modification programs view the outcomes. This qualitative study examined clients' experience of a community-based lifestyle modification program in Hong Kong. Semi-structured interviews were conducted with 25 clients attending the program. Clients perceived that the program had positive impacts on their health and nutrition knowledge. They experienced frustration, negative emotion, lack of motivation, and pressure from others during the program. The working environment and a lack of healthy food choices in restaurants were the major perceived environmental barriers to lifestyle modification. Clients valued nutritionists' capability of providing professional information and psychological support during the program. Our results suggest that nutritionists' capacity to provide quality consultations and patient-centered care is important for empowering clients to achieve lifestyle modification.

  13. Automated delay estimation at signalized intersections : phase I concept and algorithm development.

    DOT National Transportation Integrated Search

    2011-07-01

    Currently there are several methods to measure the performance of surface streets, but their capabilities in dynamically estimating vehicle delay are limited. The objective of this research is to develop a method to automate traffic delay estimation ...

  14. Pre- and postprocessing techniques for determining goodness of computational meshes

    NASA Technical Reports Server (NTRS)

    Oden, J. Tinsley; Westermann, T.; Bass, J. M.

    1993-01-01

    Research in error estimation, mesh conditioning, and solution enhancement for finite element, finite difference, and finite volume methods has been incorporated into AUDITOR, a modern, user-friendly code, which operates on 2D and 3D unstructured neutral files to improve the accuracy and reliability of computational results. Residual error estimation capabilities provide local and global estimates of solution error in the energy norm. Higher order results for derived quantities may be extracted from initial solutions. Within the X-MOTIF graphical user interface, extensive visualization capabilities support critical evaluation of results in linear elasticity, steady state heat transfer, and both compressible and incompressible fluid dynamics.

  15. Can Citizen Science Assist in Determining Koala (Phascolarctos cinereus) Presence in a Declining Population?

    PubMed

    Flower, Emily; Jones, Darryl; Bernede, Lilia

    2016-07-14

    The acceptance and application of citizen science has risen over the last 10 years, with this rise likely attributed to an increase in public awareness surrounding anthropogenic impacts affecting urban ecosystems. Citizen science projects have the potential to expand upon data collected by specialist researchers as they are able to gain access to previously unattainable information, consequently increasing the likelihood of an effective management program. The primary objective of this research was to develop guidelines for a successful regional-scale citizen science project following a critical analysis of 12 existing citizen science case studies. Secondly, the effectiveness of these guidelines was measured through the implementation of a citizen science project, Koala Quest, for the purpose of estimating the presence of koalas in a fragmented landscape. Consequently, this research aimed to determine whether citizen-collected data can augment traditional science research methods, by comparing and contrasting the abundance of koala sightings gathered by citizen scientists and professional researchers. Based upon the guidelines developed, Koala Quest methodologies were designed, the study conducted, and the efficacy of the project assessed. To combat the high variability of estimated koala populations due to differences in counting techniques, a national monitoring and evaluation program is required, in addition to a standardised method for conducting koala population estimates. Citizen science is a useful method for monitoring animals such as the koala, which are sparsely distributed throughout a vast geographical area, as the large numbers of volunteers recruited by a citizen science project are capable of monitoring a similarly broad spatial range.

  16. Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide

    NASA Astrophysics Data System (ADS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.

    1993-03-01

    A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960's and early 1970's and are highly unique to that design or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. 
The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS, presents an example problem, and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.

  17. Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide

    NASA Technical Reports Server (NTRS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.

    1993-01-01

    A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960's and early 1970's and are highly unique to that design or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. 
The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS, presents an example problem, and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.

  18. Development of visual 3D virtual environment for control software

    NASA Technical Reports Server (NTRS)

    Hirose, Michitaka; Myoi, Takeshi; Amari, Haruo; Inamura, Kohei; Stark, Lawrence

    1991-01-01

    Virtual environments for software visualization may enable complex programs to be created and maintained. A typical application might be for control of regional electric power systems. As these encompass broader computer networks than ever, construction of such systems becomes very difficult. Conventional text-oriented environments are useful in programming individual processors. However, they are obviously insufficient to program a large and complicated system that includes large numbers of computers connected to each other; such programming is called 'programming in the large.' As a solution for this problem, the authors are developing a graphic programming environment wherein one can visualize complicated software in a virtual 3D world. One of the major features of the environment is the 3D representation of concurrent processes. 3D representation is used to supply both network-wide interprocess programming capability (capability for 'programming in the large') and real-time programming capability. The authors' idea is to fuse both the block diagram (which is useful to check relationships among large numbers of processes or processors) and the time chart (which is useful to check precise timing for synchronization) into a single 3D space. The 3D representation gives us a capability for direct and intuitive planning or understanding of complicated relationships among many concurrent processes. To realize the 3D representation, a technology to enable easy handling of virtual 3D objects is a definite necessity. Using a stereo display system and a gesture input device (VPL DataGlove), our prototype of the virtual workstation has been implemented. The workstation can supply the 'sensation' of the virtual 3D space to a programmer. Software for the 3D programming environment is implemented on the workstation. According to preliminary assessments, a 50 percent reduction of programming effort is achieved by using the virtual 3D environment. 
The authors expect that the 3D environment has considerable potential in the field of software engineering.

  19. Ada Linear-Algebra Program

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.; Lawson, C. L.

    1988-01-01

    Routines provided for common scalar, vector, matrix, and quaternion operations. Computer program extends Ada programming language to include linear-algebra capabilities similar to those of the HAL/S programming language. Designed for such avionics applications as software for Space Station.

  20. Mars Sample Return Landed with Red Dragon

    NASA Technical Reports Server (NTRS)

    Stoker, Carol R.; Lemke, Lawrence G.

    2013-01-01

    A Mars Sample Return (MSR) mission is the highest priority science mission for the next decade as recommended by the recent Decadal Survey of Planetary Science. However, an affordable program to carry this out has not been defined. This paper describes a study that examined use of emerging commercial capabilities to land the sample return elements, with the goal of reducing mission cost. A team at NASA Ames examined the feasibility of the following scenario for MSR: A Falcon Heavy launcher injects a SpaceX Dragon crew capsule and trunk onto a Trans Mars Injection trajectory. The capsule is modified to carry all the hardware needed to return samples collected on Mars, including a Mars Ascent Vehicle (MAV), an Earth Return Vehicle (ERV), and Sample Collection and Storage hardware. The Dragon descends to land on the surface of Mars using SuperSonic Retro Propulsion (SSRP) as described by Braun and Manning [IEEEAC paper 0076, 2005]. Samples are acquired and delivered to the MAV by a prelanded asset, possibly the proposed 2020 rover. After samples are obtained and stored in the ERV, the MAV launches the sample-containing ERV from the surface of Mars. We examined cases where the ERV is delivered to either low Mars orbit (LMO), C3 = 0 (Mars escape), or an intermediate energy state. The ERV then provides the rest of the energy (delta V) required to perform trans-Earth injection (TEI), cruise, and insertion into a Moon-trailing Earth Orbit (MTEO). A later mission, possibly a crewed Dragon launched by a Falcon Heavy (not part of the current study), retrieves the sample container, packages the sample, and performs a controlled Earth re-entry to prevent Mars materials from accidentally contaminating Earth. 
The key analysis methods used in the study employed a set of parametric mass estimating relationships (MERs) and standard aerospace analysis software codes modified for the MAV class of launch vehicle to determine the range of performance parameters that produced converged spacecraft designs capable of meeting mission requirements. Subsystems modeled in this study included structures, power system, propulsion system, nose fairing, thermal insulation, actuation devices, and GN&C. Best-practice application of loads and design margins for all resources was used. Both storable and cryogenic propellant systems were examined. The landed mass and lander capsule size provide boundary conditions for the MAV design and packaging. We estimated the maximum mass the Dragon capsule is capable of landing. This and the volume available to store the MAV were deduced from publicly available data from SpaceX as well as our own engineering and aerodynamic estimates. The minimum gross-liftoff mass (GLOM) for the MAV was obtained for configurations that used pump-fed storable bi-propellant rocket engines for both the MAV and the ERV stage. The GLOM required fits within our internal estimate of the mass that Dragon can land at low elevation/optimal seasons on Mars. Based on the analysis, we show that a single Mars launch sample return mission is feasible using current commercial capabilities to deliver the return spacecraft assets.
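In practice, the parametric MERs described above are typically power-law fits to historical hardware data. As a minimal, hypothetical sketch only (the coefficient values below are invented for illustration and are not from the study), such a relationship might look like:

```python
def power_law_mer(driver, a, b):
    """Generic power-law mass estimating relationship: mass = a * driver**b.

    In practice a and b come from regression against historical hardware
    data; the coefficients used below are illustrative only.
    """
    return a * driver ** b

# Illustrative: tank structure mass (kg) as a function of propellant
# load (kg), with made-up coefficients (not from the study)
tank_mass = power_law_mer(1200.0, a=0.35, b=0.95)
```

A sizing tool chains many such relationships (one per subsystem) and iterates until the vehicle mass converges.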

  1. The Future of Nuclear Archaeology: Reducing Legacy Risks of Weapons Fissile Material

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Thomas W.; Reid, Bruce D.; Toomey, Christopher M.

    2014-01-01

    This report describes the value proposition for a "nuclear archeological" technical capability and applications program, targeted at resolving uncertainties regarding fissile materials production and use. At its heart, this proposition is that we can never be sure that all fissile material is adequately secure without a clear idea of what "all" means, and that uncertainty in this matter carries risk. We argue that this proposition is as valid today, under emerging state and possible non-state nuclear threats, as it was in an immediate post-Cold-War context, and describe how nuclear archeological methods can be used to verify fissile materials declarations, or estimate and characterize historical fissile materials production independently of declarations.

  2. CHAMP (Camera, Handlens, and Microscope Probe)

    NASA Technical Reports Server (NTRS)

    Mungas, Greg S.; Boynton, John E.; Balzer, Mark A.; Beegle, Luther; Sobel, Harold R.; Fisher, Ted; Klein, Dan; Deans, Matthew; Lee, Pascal; Sepulveda, Cesar A.

    2005-01-01

    CHAMP (Camera, Handlens And Microscope Probe) is a novel field microscope capable of color imaging with continuously variable spatial resolution, from infinity imaging down to diffraction-limited microscopy (3 micron/pixel). As a robotic arm-mounted imager, CHAMP supports stereo imaging with variable baselines, can continuously image targets at increasing magnification during an arm approach, can provide precision rangefinding estimates to targets, and can accommodate microscopic imaging of rough surfaces through an image filtering process called z-stacking. CHAMP was originally developed through the Mars Instrument Development Program (MIDP) in support of robotic field investigations, but may also find application in new areas such as robotic in-orbit servicing and maintenance operations associated with spacecraft and human operations. We overview CHAMP's instrument performance and basic design considerations below.

  3. LH2 airport requirements study

    NASA Technical Reports Server (NTRS)

    Brewer, G. D. (Editor)

    1976-01-01

    A preliminary assessment of the facilities and equipment which will be required at a representative airport is provided so that liquid hydrogen (LH2) can be used as fuel in long-range transport aircraft in 1995-2000. A complete facility was conceptually designed, sized to meet the projected air traffic requirement. The facility includes the liquefaction plant, LH2 storage capability, and the LH2 fuel handling system. The requirements for ground support and maintenance for the LH2-fueled aircraft were analyzed. An estimate was made of the capital and operating costs which might be expected for the facility. Recommendations were made for design modifications to the reference aircraft, reflecting results of the analysis of airport fuel handling requirements, and for a program of additional technology development for air-terminal-related items.

  4. Space and radiation protection: scientific requirements for space research

    NASA Technical Reports Server (NTRS)

    Schimmerling, W.

    1995-01-01

    Ionizing radiation poses a significant risk to humans living and working in space. The major sources of radiation are solar disturbances and galactic cosmic rays. The components of this radiation are energetic charged particles, protons, as well as fully ionized nuclei of all elements. The biological effects of these particles cannot be extrapolated in a straightforward manner from available data on x-rays and gamma-rays. A radiation protection program that meets the needs of spacefaring nations must have a solid scientific basis, capable not only of predicting biological effects, but also of making reliable estimates of the uncertainty in these predictions. A strategy leading to such predictions is proposed, and scientific requirements arising from this strategy are discussed.

  5. Monitoring spacecraft atmosphere contaminants by laser absorption spectroscopy

    NASA Technical Reports Server (NTRS)

    Steinfeld, J. I.

    1976-01-01

    Laser-based spectrophotometric methods which have been proposed for the detection of trace concentrations of gaseous contaminants include Raman backscattering (LIDAR) and passive radiometry (LOPAIR). Remote sensing techniques using laser spectrometry are presented, and in particular a simple long-path laser absorption method (LOLA), capable of resolving complex mixtures of closely related trace contaminants at ppm levels, is discussed. A number of species were selected for study which are representative of those most likely to accumulate in closed environments, such as submarines or long-duration manned space flights. Computer programs were developed which will permit a real-time analysis of the monitored atmosphere. Estimates of the dynamic range of this monitoring technique for various system configurations, and comparison with other methods of analysis, are given.

  6. Flight program language requirements. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The activities and results of a study for the definition of flight program language requirements are described. A set of detailed requirements are presented for a language capable of supporting onboard application programming for the Marshall Space Flight Center's anticipated future activities in the decade of 1975-85. These requirements are based, in part, on the evaluation of existing flight programming language designs to determine the applicability of these designs to anticipated flight programming activities. The coding of benchmark problems in the selected programming languages is discussed. These benchmarks are in the form of program kernels selected from existing flight programs. This approach was taken to ensure that the results of the study would reflect state-of-the-art language capabilities, as well as to determine whether an existing language design should be selected for adaptation.

  7. Attitude Estimation in Fractionated Spacecraft Cluster Systems

    NASA Technical Reports Server (NTRS)

    Hadaegh, Fred Y.; Blackmore, James C.

    2011-01-01

    Attitude estimation was examined in fractionated free-flying spacecraft. Instead of a single, monolithic spacecraft, a fractionated free-flying spacecraft uses multiple spacecraft modules. These modules are connected only through wireless communication links and, potentially, wireless power links. The key advantage of this concept is the ability to respond to uncertainty. For example, if a single spacecraft module in the cluster fails, a new one can be launched at a lower cost and risk than would be incurred with on-orbit servicing or replacement of the monolithic spacecraft. In order to create such a system, however, it is essential to know what the navigation capabilities of the fractionated system are as a function of the capabilities of the individual modules, and to have an algorithm that can perform estimation of the attitudes and relative positions of the modules with fractionated sensing capabilities. Looking specifically at fractionated attitude estimation with star trackers and optical relative attitude sensors, a set of mathematical tools has been developed that specifies the set of sensors necessary to ensure that the attitude of the entire cluster ("cluster attitude") can be observed. Also developed was a navigation filter that can estimate the cluster attitude if these conditions are satisfied. Each module in the cluster may have either a star tracker, a relative attitude sensor, or both. An extended Kalman filter can be used to estimate the attitude of all modules. A range of estimation performances can be achieved depending on the sensors used and the topology of the sensing network.
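As a rough, hypothetical illustration of the Kalman-filtering idea mentioned above, reduced to a single axis with a scalar state and assumed noise variances (the actual filter estimates full attitudes across multiple modules), one predict/update cycle fusing a gyro rate with a star-tracker angle might look like:

```python
import numpy as np

def kalman_attitude_step(theta, P, omega, z, dt, Q=1e-5, R=1e-3):
    """One predict/update cycle of a scalar Kalman filter for a
    single-axis attitude angle theta (rad).

    omega -- gyro rate measurement (rad/s), used in the prediction
    z     -- star-tracker angle measurement (rad)
    Q, R  -- assumed process/measurement noise variances (illustrative)
    """
    # Predict: propagate the angle using the gyro rate
    theta_pred = theta + omega * dt
    P_pred = P + Q
    # Update: fuse the star-tracker angle measurement
    K = P_pred / (P_pred + R)               # Kalman gain
    theta_new = theta_pred + K * (z - theta_pred)
    P_new = (1.0 - K) * P_pred
    return theta_new, P_new

# Toy run: track a constant 0.01 rad/s slew from noisy measurements
rng = np.random.default_rng(0)
theta_true, theta_est, P, dt = 0.0, 0.0, 1.0, 0.1
for _ in range(200):
    theta_true += 0.01 * dt
    omega = 0.01 + rng.normal(0.0, 1e-3)    # noisy gyro rate
    z = theta_true + rng.normal(0.0, 0.03)  # noisy star-tracker angle
    theta_est, P = kalman_attitude_step(theta_est, P, omega, z, dt)
```

The covariance P shrinks as measurements accumulate; in the fractionated case the state vector grows to hold every module's attitude, and the observability conditions in the abstract determine whether that larger filter converges.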

  8. Global cross-station assessment of neuro-fuzzy models for estimating daily reference evapotranspiration

    NASA Astrophysics Data System (ADS)

    Shiri, Jalal; Nazemi, Amir Hossein; Sadraddini, Ali Ashraf; Landeras, Gorka; Kisi, Ozgur; Fard, Ahmad Fakheri; Marti, Pau

    2013-02-01

    Accurate estimation of reference evapotranspiration is important for irrigation scheduling, water resources management and planning, and other agricultural water management issues. In the present paper, the capabilities of generalized neuro-fuzzy (GNF) models were evaluated for estimating reference evapotranspiration using two separate sets of weather data from humid and non-humid regions of Spain and Iran. In this way, the data from some weather stations in the Basque Country and Valencia region (Spain) were used for training the neuro-fuzzy models (in humid and non-humid regions, respectively), and subsequently the data from these regions were pooled to evaluate the generalization capability of a general neuro-fuzzy model in humid and non-humid regions. The developed models were tested in stations of Iran located in humid and non-humid regions. The obtained results showed the capabilities of the generalized neuro-fuzzy model in estimating reference evapotranspiration in different climatic zones. Global GNF models calibrated using both non-humid and humid data were found to successfully estimate ET0 in both non-humid and humid regions of Iran (the lowest MAE values are about 0.23 mm for non-humid Iranian regions and 0.12 mm for humid regions). The non-humid GNF models calibrated using non-humid data performed much better than the humid GNF models calibrated using humid data in the non-humid region, while the humid GNF model gave better estimates in the humid region.
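The MAE figures quoted above are a standard accuracy metric. As a minimal sketch with made-up daily ET0 values (not the study's data), it is computed as:

```python
import numpy as np

def mean_absolute_error(observed, predicted):
    """MAE between observed and model-estimated daily ET0 (mm/day)."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.mean(np.abs(observed - predicted)))

# Hypothetical daily ET0 values (mm/day); NOT data from the study
obs  = [3.1, 4.0, 2.7, 5.2, 4.4]
pred = [3.3, 3.8, 2.9, 5.0, 4.7]
mae = mean_absolute_error(obs, pred)   # roughly 0.22 mm/day for this toy data
```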

  9. A Proposed Strategy for the U.S. to Develop and Maintain a Mainstream Capability Suite ("Warehouse") for Automated/Autonomous Rendezvous and Docking in Low Earth Orbit and Beyond

    NASA Technical Reports Server (NTRS)

    Krishnakumar, Kalmanje S.; Stillwater, Ryan A.; Babula, Maria; Moreau, Michael C.; Riedel, J. Ed; Mrozinski, Richard B.; Bradley, Arthur; Bryan, Thomas C.

    2012-01-01

    The ability of space assets to rendezvous and dock/capture/berth is a fundamental enabler for numerous classes of NASA's missions, and is therefore an essential capability for the future of NASA. Mission classes include: ISS crew rotation, crewed exploration beyond low-Earth-orbit (LEO), on-orbit assembly, ISS cargo supply, crewed satellite servicing, robotic satellite servicing / debris mitigation, robotic sample return, and robotic small body (e.g. near-Earth object, NEO) proximity operations. For a variety of reasons to be described, NASA programs requiring Automated/Autonomous Rendezvous and Docking/Capture/Berthing (AR&D) capabilities are currently spending an order of magnitude more than necessary and taking twice as long as necessary to achieve their AR&D capability, "reinventing the wheel" for each program, and have fallen behind all of our foreign counterparts in AR&D technology (especially autonomy) in the process. To ensure future missions' reliability and crew safety (when applicable), and to achieve the noted cost and schedule savings by eliminating the costs of continually "reinventing the wheel", the NASA AR&D Community of Practice (CoP) recommends NASA develop an AR&D Warehouse, detailed herein, which does not exist today. The term "warehouse" is used herein to refer to a toolbox or capability suite that has pre-integrated selectable supply-chain hardware and reusable software components that are considered ready-to-fly, low-risk, reliable, versatile, scalable, cost-effective, architecture and destination independent, that can be confidently utilized operationally on human spaceflight and robotic vehicles over a variety of mission classes and design reference missions, especially beyond LEO. The CoP also believes that it is imperative that NASA coordinate and integrate all current and proposed technology development activities into a cohesive cross-Agency strategy to produce and utilize this AR&D warehouse. 
An initial estimate indicates that if NASA strategically coordinates the development of a robust AR&D capability across the Agency, the cost of implementing AR&D on a spacecraft could be reduced from roughly $70M per mission to as low as $7M per mission, and the associated development time could be reduced from 4 years to 2 years, after the warehouse is completely developed. Table 1 shows the clear long-term benefits to the Agency in terms of costs and schedules for various missions. (The methods used to arrive at the Table 1 numbers are presented in Appendices A and B.)

  10. New reflective symmetry design capability in the JPL-IDEAS Structure Optimization Program

    NASA Technical Reports Server (NTRS)

    Strain, D.; Levy, R.

    1986-01-01

    The JPL-IDEAS antenna structure analysis and design optimization computer program was modified to process half structure models of symmetric structures subjected to arbitrary external static loads, synthesize the performance, and optimize the design of the full structure. Significant savings in computation time and cost (more than 50%) were achieved compared to the cost of full model computer runs. The addition of the new reflective symmetry analysis design capabilities to the IDEAS program allows processing of structure models whose size would otherwise prevent automated design optimization. The new program produced synthesized full model iterative design results identical to those of actual full model program executions at substantially reduced cost, time, and computer storage.

  11. Space Launch System (SLS) Program Overview NASA Research Announcement (NRA) Advanced Booster (AB) Engineering Demonstration and Risk Reduction (EDRR) Industry Day

    NASA Technical Reports Server (NTRS)

    May, Todd A.

    2011-01-01

    SLS is a national capability that empowers entirely new exploration for missions of national importance. Program key tenets are safety, affordability, and sustainability. SLS builds on a solid foundation of experience and current capacities to enable a timely initial capability and evolve to a flexible heavy-lift capability through competitive opportunities: (1) reduce risks leading to an affordable Advanced Booster that meets the evolved capabilities of SLS; and (2) enable competition by mitigating targeted Advanced Booster risks to enhance SLS affordability and performance. The road ahead promises to be an exciting journey for present and future generations, and we look forward to working with you to continue America's space exploration.

  12. Scoring the Icecap-a capability instrument. Estimation of a UK general population tariff.

    PubMed

    Flynn, Terry N; Huynh, Elisabeth; Peters, Tim J; Al-Janabi, Hareth; Clemens, Sam; Moody, Alison; Coast, Joanna

    2015-03-01

    This paper reports the results of a best-worst scaling (BWS) study to value the Investigating Choice Experiments Capability Measure for Adults (ICECAP-A), a new capability measure among adults, in a UK setting. A main effects plan plus its foldover was used to estimate weights for each of the four levels of all five attributes. The BWS study was administered to 413 randomly sampled individuals, together with sociodemographic and other questions. Scale-adjusted latent class analyses identified two preference and two (variance) scale classes. Ability to characterize preference and scale heterogeneity was limited, but data quality was good, and the final model exhibited a high pseudo-r-squared. After adjusting for heterogeneity, a population tariff was estimated. This showed that 'attachment' and 'stability' each account for around 22% of the space, and 'autonomy', 'achievement' and 'enjoyment' account for around 18% each. Across all attributes, greater value was placed on the difference between the lowest levels of capability than between the highest. This tariff will enable ICECAP-A to be used in economic evaluation both within the field of health and across public policy generally. © 2013 The Authors. Health Economics published by John Wiley & Sons Ltd.

  13. The Renovation and Future Capabilities of the Thacher Observatory

    NASA Astrophysics Data System (ADS)

    O'Neill, Katie; Osuna, Natalie; Edwards, Nick; Klink, Douglas; Swift, Jonathan; Vyhnal, Chris; Meyer, Kurt

    2016-01-01

    The Thacher School is in the process of renovating the campus observatory with a new meter class telescope and full automation capabilities for the purpose of scientific research and education. New equipment on site has provided a preliminary site characterization including seeing and V-band sky brightness measurements. These data, along with commissioning data from the MINERVA project (which uses comparable hardware) are used to estimate the capabilities of the observatory once renovation is complete. Our V-band limiting magnitude is expected to be better than 21.3 for a one minute integration time, and we estimate that milli-magnitude precision photometry will be possible for a V=14.5 point source over approximately 5 min timescales. The quick response, autonomous operation, and multi-band photometric capabilities of the renovated observatory will make it a powerful follow-up science facility for exoplanets, eclipsing binaries, near-Earth objects, stellar variability, and supernovae.
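The milli-magnitude precision quoted above can be related to signal-to-noise ratio through the standard photometric relation sigma_m ≈ 2.5 / (ln 10 × SNR) ≈ 1.0857/SNR (a textbook relation, not taken from this abstract). A quick sketch of what that target implies:

```python
import math

def mag_uncertainty(snr):
    """Photometric uncertainty (mag) for a given signal-to-noise ratio,
    via the standard relation sigma_m = 2.5 / (ln 10 * SNR) ~ 1.0857/SNR."""
    return 2.5 / (math.log(10.0) * snr)

# SNR needed for milli-magnitude (0.001 mag) precision
snr_needed = 2.5 / (math.log(10.0) * 0.001)
print(round(snr_needed))  # about 1086
```

So milli-magnitude photometry on a V=14.5 source over ~5 min timescales implies sustaining a per-measurement SNR on the order of a thousand, which constrains aperture, throughput, and scintillation noise.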

  14. Modifications to the streamtube curvature program. Volume 1: Program modifications and user's manual. [user manuals (computer programs) for transonic flow of nacelles and intake systems of turbofan engines

    NASA Technical Reports Server (NTRS)

    Ferguson, D. R.; Keith, J. S.

    1975-01-01

    The improvements which have been incorporated in the Streamtube Curvature Program to enhance both its computational and diagnostic capabilities are described. Detailed descriptions are given of the revisions incorporated to more reliably handle the jet stream-external flow interaction at trailing edges. Also presented are the augmented boundary layer procedures and a variety of other program changes relating to program diagnostics and extended solution capabilities. An updated User's Manual, that includes information on the computer program operation, usage, and logical structure, is presented. User documentation includes an outline of the general logical flow of the program and detailed instructions for program usage and operation. From the standpoint of the programmer, the overlay structure is described. The input data, output formats, and diagnostic printouts are covered in detail and illustrated with three typical test cases.

  15. The U.S. Space Grant College and Fellowship Program

    NASA Technical Reports Server (NTRS)

    Dasch, E. Julius; Schwartz, Elaine T.; Keffer, Lynne

    1990-01-01

    The U.S. NASA Space Grant College and Fellowship Program, congressionally mandated in 1987, consists of two phases. Phase I consisted of the designation of 21 university consortia as 'Space Grant Colleges/Consortia' which received support from NASA to conduct programs to achieve, maintain, and advance a balanced program of research capability, curriculum, and public service. Program descriptions for phase II are given. This phase is designed to broaden participation in the Space Grant Program by targeting states that currently are not as involved in NASA programs as are the states for which phase I was constructed. Under phase II, states will compete in either the Programs Grants or the Capability Enhancement Grants category. Only one proposal per state will be accepted with the state determining in which category it will compete. The amount of total award, $150,000, is the same in both categories and includes funds for university-administered fellowship programs.

  16. An Overview of the NASA Sounding Rocket and Balloon Programs

    NASA Technical Reports Server (NTRS)

    Eberspeaker, Philip J.; Smith, Ira S.

    2003-01-01

    The U.S. National Aeronautics and Space Administration (NASA) Sounding Rockets and Balloon Programs conduct a total of 50 to 60 missions per year in support of the NASA scientific community. These missions support investigations sponsored by NASA's Offices of Space Science, Life and Microgravity Sciences & Applications, and Earth Science. The Goddard Space Flight Center has management and implementation responsibility for these programs. The NASA Sounding Rockets Program provides the science community with payload development support, environmental testing, launch vehicles, and launch operations from fixed and mobile launch ranges. Sounding rockets continue to provide a cost-effective way to make in situ observations from 50 to 1500 km in the near-earth environment and to uniquely cover the altitude regime between 50 km and 130 km above the Earth's surface. New technology efforts include GPS payload event triggering, tailored trajectories, new vehicle configuration development to expand current capabilities, and the feasibility assessment of an ultra high altitude sounding rocket vehicle. The NASA Balloon Program continues to make advancements and developments in its capabilities for support of the scientific ballooning community. The Long Duration Balloon (LDB) is capable of providing flight durations in excess of two weeks and has had many successful flights since its development. The NASA Balloon Program is currently engaged in the development of the Ultra Long Duration Balloon (ULDB), which will be capable of providing flight times of up to 100 days. Additional development efforts are focusing on ultra high altitude balloons, station-keeping techniques and planetary balloon technologies.

  17. Workstations take over conceptual design

    NASA Technical Reports Server (NTRS)

    Kidwell, George H.

    1987-01-01

    Workstations provide sufficient computing memory and speed for early evaluations of aircraft design alternatives to identify those worthy of further study. It is recommended that the programming of such machines permit integrated calculations of the configuration and performance analysis of new concepts, along with the capability of changing up to 100 variables at a time and swiftly viewing the results. Computations can be augmented through links to mainframes and supercomputers. Programming, particularly debugging, is enhanced by the capability of working with one program line at a time and having available on-screen error indices. Workstation networks permit on-line communication among users and with persons and computers outside the facility. Application of the capabilities is illustrated through a description of NASA-Ames design efforts, performed on a MicroVAX network, for an oblique wing for a jet.

  18. Tri-Laboratory Linux Capacity Cluster 2007 SOW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seager, M

    2007-03-22

    The Advanced Simulation and Computing (ASC) Program (formerly known as the Accelerated Strategic Computing Initiative, ASCI) has led the world in capability computing for the last ten years. Capability computing is defined as a world-class platform (in the Top10 of the Top500.org list) with scientific simulations running at scale on the platform. Example systems are ASCI Red, Blue-Pacific, Blue-Mountain, White, Q, RedStorm, and Purple. ASC applications have scaled to multiple thousands of CPUs and accomplished a long list of mission milestones on these ASC capability platforms. However, the computing demands of the ASC and Stockpile Stewardship programs also include a vast number of smaller scale runs for day-to-day simulations. Indeed, every 'hero' capability run requires many hundreds to thousands of much smaller runs in preparation and post processing activities. In addition, there are many aspects of the Stockpile Stewardship Program (SSP) that can be directly accomplished with these so-called 'capacity' calculations. The need for capacity is now so great within the program that it is increasingly difficult to allocate the computer resources required by the larger capability runs. To rectify the current 'capacity' computing resource shortfall, the ASC program has allocated a large portion of the overall ASC platforms budget to 'capacity' systems. In addition, within the next five to ten years the Life Extension Programs (LEPs) for major nuclear weapons systems must be accomplished. These LEPs and other SSP programmatic elements will further drive the need for capacity calculations and hence 'capacity' systems as well as future ASC capability calculations on 'capability' systems. To respond to this new workload analysis, the ASC program will be making a large sustained strategic investment in these capacity systems over the next ten years, starting with the United States Government Fiscal Year 2007 (GFY07).
However, given the growing need for 'capability' systems as well, the budget demands are extreme and new, more cost effective ways of fielding these systems must be developed. This Tri-Laboratory Linux Capacity Cluster (TLCC) procurement represents the ASC program's first investment vehicle in these capacity systems. It also represents a new strategy for quickly building, fielding and integrating many Linux clusters of various sizes into classified and unclassified production service through a concept of Scalable Units (SU). The programmatic objective is to dramatically reduce the overall Total Cost of Ownership (TCO) of these 'capacity' systems relative to the best practices in Linux Cluster deployments today. This objective only makes sense in the context of these systems quickly becoming very robust and useful production clusters under the crushing load that will be inflicted on them by the ASC and SSP scientific simulation capacity workload.

  19. Counterforce Targeting Capabilities and Challenges

    DTIC Science & Technology

    2004-08-01

    Counterforce Targeting Capabilities and Challenges, by Barry R. Schneider. The Counterproliferation Papers, Future Warfare Series No. 22, USAF, August 2004.

  20. Progress in Public Health Emergency Preparedness-United States, 2001-2016.

    PubMed

    Murthy, Bhavini Patel; Molinari, Noelle-Angelique M; LeBlanc, Tanya T; Vagi, Sara J; Avchen, Rachel N

    2017-09-01

    To evaluate the Public Health Emergency Preparedness (PHEP) program's progress toward meeting public health preparedness capability standards in state, local, and territorial health departments. All 62 PHEP awardees completed the Centers for Disease Control and Prevention's self-administered PHEP Impact Assessment as part of program review measuring public health preparedness capability before September 11, 2001 (9/11), and in 2014. We collected additional self-reported capability self-assessments from 2016. We analyzed trends in congressional funding for public health preparedness from 2001 to 2016. Before 9/11, most PHEP awardees reported limited preparedness capabilities, but considerable progress was reported by 2016. The number of jurisdictions reporting established capability functions within the countermeasures and mitigation domain had the largest increase, almost 200%, by 2014. However, more than 20% of jurisdictions still reported underdeveloped coordination between the health system and public health agencies in 2016. Challenges and barriers to building PHEP capabilities included lack of trained personnel, plans, and sustained resources. Considerable progress in public health preparedness capability was observed from before 9/11 to 2016. Support, sustainment, and advancement of public health preparedness capability is critical to ensure a strong public health infrastructure.

  1. Measuring Provider Performance for Physicians Participating in the Merit-Based Incentive Payment System.

    PubMed

    Squitieri, Lee; Chung, Kevin C

    2017-07-01

    In 2017, the Centers for Medicare and Medicaid Services began requiring all eligible providers to participate in the Quality Payment Program or face financial reimbursement penalty. The Quality Payment Program outlines two paths for provider participation: the Merit-Based Incentive Payment System and Advanced Alternative Payment Models. For the first performance period beginning in January of 2017, the Centers for Medicare and Medicaid Services estimates that approximately 83 to 90 percent of eligible providers will not qualify for participation in an Advanced Alternative Payment Model and therefore must participate in the Merit-Based Incentive Payment System program. The Merit-Based Incentive Payment System path replaces existing quality-reporting programs and adds several new measures to evaluate providers using four categories of data: (1) quality, (2) cost/resource use, (3) improvement activities, and (4) advancing care information. These categories will be combined to calculate a weighted composite score for each provider or provider group. Composite Merit-Based Incentive Payment System scores based on 2017 performance data will be used to adjust reimbursed payment in 2019. In this article, the authors provide relevant background for understanding value-based provider performance measurement. The authors also discuss Merit-Based Incentive Payment System reporting requirements and scoring methodology to provide plastic surgeons with the necessary information to critically evaluate their own practice capabilities in the context of current performance metrics under the Quality Payment Program.
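    The weighted composite described above can be illustrated with a toy calculation. The weights below reflect the announced 2017 transition-year values (quality 60 percent, cost 0 percent, improvement activities 15 percent, advancing care information 25 percent), and the category scores are hypothetical; consult current CMS rules before relying on either:

```python
# Hypothetical category scores on a 0-100 scale. The weights are the 2017
# transition-year values (quality 60%, cost 0%, improvement activities 15%,
# advancing care information 25%); check current CMS rules before reuse.
weights = {
    "quality": 0.60,
    "cost": 0.00,
    "improvement_activities": 0.15,
    "advancing_care_information": 0.25,
}
scores = {
    "quality": 80.0,
    "cost": 0.0,
    "improvement_activities": 100.0,
    "advancing_care_information": 90.0,
}

# Weighted composite: 0.60*80 + 0.00*0 + 0.15*100 + 0.25*90
composite = sum(weights[c] * scores[c] for c in weights)
print(round(composite, 1))  # -> 85.5
```

    CMS then maps a composite of this kind onto positive, neutral, or negative payment adjustments relative to a performance threshold.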

  2. Proximity Operations Nano-Satellite Flight Demonstration (PONSFD) Rendezvous Proximity Operations Design and Trade Studies

    NASA Astrophysics Data System (ADS)

    Griesbach, J.; Westphal, J. J.; Roscoe, C.; Hawes, D. R.; Carrico, J. P.

    2013-09-01

    The Proximity Operations Nano-Satellite Flight Demonstration (PONSFD) program will demonstrate rendezvous proximity operations (RPO), formation flying, and docking with a pair of 3U CubeSats. The program is sponsored by NASA Ames via the Office of the Chief Technologist (OCT) in support of its Small Spacecraft Technology Program (SSTP). The goal of the mission is to demonstrate complex RPO and docking operations with a pair of low-cost 3U CubeSat satellites using passive navigation sensors. The program encompasses the entire system evolution including system design, acquisition, satellite construction, launch, mission operations, and final disposal. The satellite is scheduled for launch in Fall 2015 with a 1-year mission lifetime. This paper provides a brief mission overview but will then focus on the current design and driving trade study results for the RPO mission specific processor and relevant ground software. The current design involves multiple on-board processors, each specifically tasked with providing mission-critical capabilities. These capabilities range from attitude determination and control to image processing. The RPO system processor is responsible for absolute and relative navigation, maneuver planning, attitude commanding, and abort monitoring for mission safety. A low power processor running a Linux operating system has been selected for implementation. Navigation is one of the RPO processor's key tasks. This entails processing data obtained from the on-board GPS unit as well as the on-board imaging sensors. To do this, Kalman filters will be hosted on the processor to ingest and process measurements for maintenance of position and velocity estimates with associated uncertainties. While each satellite carries a GPS unit, it will be used sparingly to conserve power. As such, absolute navigation will mainly consist of propagating past known states, and relative navigation will be considered to be of greater importance.
For relative observations, each spacecraft hosts 3 electro-optical sensors dedicated to imaging the companion satellite. The image processor will analyze the images to obtain estimates for range, bearing, and pose, with associated rates and uncertainties. These observations will be fed to the RPO processor's relative Kalman filter to perform relative navigation updates. This paper includes estimates for expected navigation accuracies for both absolute and relative position and velocity. Another key task for the RPO processor is maneuver planning. This includes automation to plan maneuvers to achieve a desired formation configuration or trajectory (including docking), as well as automation to safely react to potentially dangerous situations. This will allow each spacecraft to autonomously plan fuel-efficient maneuvers to achieve a desired trajectory as well as compute adjustment maneuvers to correct for thrusting errors. This paper discusses results from a trade study that has been conducted to examine maneuver targeting algorithms required on-board the spacecraft. Ground software will also work in conjunction with the on-board software to validate and approve maneuvers as necessary.
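    As a sketch of the kind of relative-navigation filtering described above, the following minimal linear Kalman filter tracks range and range rate from noisy range observations. The dynamics model, noise levels, and measurement setup are illustrative assumptions, not the PONSFD design:

```python
import numpy as np

# State x = [range (m), range rate (m/s)]; only range is observed.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity dynamics (assumed)
H = np.array([[1.0, 0.0]])              # measurement model: range only
Q = 1e-4 * np.eye(2)                    # process noise (assumed)
R = np.array([[0.25]])                  # 0.5 m range sigma (assumed)

x = np.array([[100.0], [0.0]])          # initial guess: 100 m, stationary
P = np.diag([25.0, 1.0])                # initial covariance

rng = np.random.default_rng(0)
true_range, true_rate = 95.0, -0.2      # truth: closing at 0.2 m/s
for _ in range(60):
    true_range += true_rate * dt
    z = true_range + rng.normal(0.0, 0.5)   # noisy range observation
    x = F @ x                               # predict
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R                     # update
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

# The estimate should settle near the true state (83 m, -0.2 m/s).
print(float(x[0, 0]), float(x[1, 0]))
```

    A flight filter would additionally fuse bearing and pose observations from the electro-optical sensors and carry a full relative-motion model, but the predict/update cycle is the same.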

  3. Advanced helium purge seals for Liquid Oxygen (LOX) turbopumps

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur; Lee, Chester C.

    1989-01-01

    Program objectives were to determine three advanced configurations of helium buffer seals capable of providing improved performance in a space shuttle main engine (SSME), high-pressure liquid oxygen (LOX) turbopump environment, and to provide NASA with the analytical tools to determine performance of a variety of seal configurations. The three seal designs included solid-ring fluid-film seals often referred to as floating ring seals, back-to-back fluid-film face seals, and a circumferential sectored seal that incorporated inherent clearance adjustment capabilities. Of the three seals designed, the sectored seal is favored because the self-adjusting clearance features accommodate the variations in clearance that will occur because of thermal and centrifugal distortions without compromising performance. Moreover, leakage can be contained well below the maximum target values; minimizing leakage is important on the SSME since helium is provided by an external tank. A reduction in tank size translates to an increase in payload that can be carried on board the shuttle. The computer codes supplied under this program included a code for analyzing a variety of gas-lubricated, floating ring, and sector seals; a code for analyzing gas-lubricated face seals; a code for optimizing and analyzing gas-lubricated spiral-groove face seals; and a code for determining fluid-film face seal response to runner excitations in as many as five degrees of freedom. These codes proved invaluable for optimizing designs and estimating final performance of the seals described.

  4. Image subsampling and point scoring approaches for large-scale marine benthic monitoring programs

    NASA Astrophysics Data System (ADS)

    Perkins, Nicholas R.; Foster, Scott D.; Hill, Nicole A.; Barrett, Neville S.

    2016-07-01

    Benthic imagery is an effective tool for quantitative description of ecologically and economically important benthic habitats and biota. The recent development of autonomous underwater vehicles (AUVs) allows surveying of spatial scales that were previously unfeasible. However, an AUV collects a large number of images, the scoring of which is time and labour intensive. There is a need to optimise the way that subsamples of imagery are chosen and scored to gain meaningful inferences for ecological monitoring studies. We examine the trade-off between the number of images selected within transects and the number of random points scored within images on the percent cover of target biota, the typical output of such monitoring programs. We also investigate the efficacy of various image selection approaches, such as systematic or random, on the bias and precision of cover estimates. We use simulated biotas that have varying size, abundance and distributional patterns. We find that a relatively small sampling effort is required to minimise bias. An increased precision for groups that are likely to be the focus of monitoring programs is best gained through increasing the number of images sampled rather than the number of points scored within images. For rare species, sampling using point count approaches is unlikely to provide sufficient precision, and alternative sampling approaches may need to be employed. The approach by which images are selected (simple random sampling, regularly spaced etc.) had no discernible effect on mean and variance estimates, regardless of the distributional pattern of biota. Field validation of our findings is provided through Monte Carlo resampling analysis of a previously scored benthic survey from temperate waters. We show that point count sampling approaches are capable of providing relatively precise cover estimates for candidate groups that are not overly rare. 
The amount of sampling required, in terms of both the number of images and number of points, varies with the abundance, size and distributional pattern of target biota. Therefore, we advocate either the incorporation of prior knowledge or the use of baseline surveys to establish key properties of intended target biota in the initial stages of monitoring programs.
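    The images-versus-points trade-off can be reproduced with a small Monte Carlo sketch: when true cover varies between images (patchiness), adding images reduces the variance of the cover estimate faster than adding points within images. All parameters below are illustrative, not taken from the study:

```python
import random

random.seed(42)

def estimate_cover(n_images, n_points, mean_cover=0.2, spread=0.1):
    """One simulated survey: score n_points random points in each of n_images
    images, where true cover varies image-to-image (spatial patchiness)."""
    hits = 0
    for _ in range(n_images):
        p = min(max(random.gauss(mean_cover, spread), 0.0), 1.0)
        hits += sum(random.random() < p for _ in range(n_points))
    return hits / (n_images * n_points)

def sd_of_estimates(n_images, n_points, reps=1000):
    """Standard deviation of the cover estimate over repeated surveys."""
    est = [estimate_cover(n_images, n_points) for _ in range(reps)]
    mean = sum(est) / reps
    return (sum((e - mean) ** 2 for e in est) / reps) ** 0.5

# Same total effort (500 scored points), split two different ways:
sd_many_images = sd_of_estimates(n_images=50, n_points=10)
sd_many_points = sd_of_estimates(n_images=10, n_points=50)
print(sd_many_images < sd_many_points)  # -> True: more images beats more points
```

    With between-image variance sigma^2 and within-image binomial variance p(1-p)/n_points, the estimator variance is roughly (sigma^2 + p(1-p)/n_points)/n_images, which is why extra images pay off more than extra points once patchiness dominates.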

  5. An external logic architecture for implementing traffic signal system control strategies.

    DOT National Transportation Integrated Search

    2011-09-01

    The built-in logic functions in traffic controllers have very limited capability to store information, to analyze input data, to estimate performance measures, and to adopt control strategy decisions. These capabilities are imperative to support traf...

  6. Lessons Learned From the Environmental Public Health Tracking Sub-County Data Pilot Project.

    PubMed

    Werner, Angela K; Strosnider, Heather; Kassinger, Craig; Shin, Mikyong

    2017-12-07

    Small area data are key to better understanding the complex relationships between environmental health, health outcomes, and risk factors at a local level. In 2014, the Centers for Disease Control and Prevention's National Environmental Public Health Tracking Program (Tracking Program) conducted the Sub-County Data Pilot Project with grantees to consider integration of sub-county data into the National Environmental Public Health Tracking Network (Tracking Network). The Tracking Program and grantees developed sub-county-level data for several data sets during this pilot project, working to standardize processes for submitting data and creating required geographies. Grantees documented the challenges they encountered during the pilot project and the decisions they made. This article covers the challenges revealed during the project. It includes insights into geocoding, aggregation, population estimates, and data stability and provides recommendations for moving forward. National standards for generating, analyzing, and sharing sub-county data should be established to build a system of sub-county data that allows for comparison of outcomes, geographies, and time. Increasing the availability and accessibility of small area data will not only enhance the Tracking Network's capabilities but also contribute to an improved understanding of environmental health and informed decision making at a local level.

  7. Elliptical orbit performance computer program

    NASA Technical Reports Server (NTRS)

    Myler, T. R.

    1981-01-01

    A FORTRAN coded computer program which generates and plots elliptical orbit performance capability of space boosters for presentation purposes is described. Orbital performance capability of space boosters is typically presented as payload weight as a function of perigee and apogee altitudes. These parameters are derived from a parametric computer simulation of the booster flight which yields the payload weight as a function of velocity and altitude at insertion. The process of converting from velocity and altitude to apogee and perigee altitude and plotting the results as a function of payload weight is mechanized with the ELOPE program. The program theory, user instructions, input/output definitions, subroutine descriptions and detailed FORTRAN coding information are included.
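    The velocity/altitude-to-apsides conversion that ELOPE mechanizes can be sketched with the vis-viva equation, assuming a horizontal (zero flight-path angle) insertion so that the insertion point is itself an apsis. The function below is an illustrative reconstruction, not ELOPE's actual coding:

```python
import math

MU = 398600.4418  # Earth gravitational parameter, km^3/s^2
RE = 6378.137     # Earth equatorial radius, km

def apsides_from_insertion(alt_km, v_km_s):
    """Apogee and perigee altitudes (km) for a horizontal insertion
    (zero flight-path angle) at the given altitude and inertial speed."""
    r = RE + alt_km
    energy = v_km_s ** 2 / 2.0 - MU / r              # vis-viva specific energy
    a = -MU / (2.0 * energy)                         # semi-major axis
    h = r * v_km_s                                   # specific angular momentum
    e = math.sqrt(max(0.0, 1.0 - h ** 2 / (MU * a)))  # eccentricity
    return a * (1.0 + e) - RE, a * (1.0 - e) - RE    # apogee alt, perigee alt

# Insertion at 200 km with 8.0 km/s (above circular speed, so 200 km is perigee):
apo, per = apsides_from_insertion(200.0, 8.0)
print(round(apo, 1), round(per, 1))  # perigee stays at 200 km; apogee near 980 km
```

    Sweeping insertion velocity and altitude through such a conversion, with payload weight attached to each point, yields exactly the apogee/perigee performance maps the abstract describes.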

  8. Calculating Trajectories And Orbits

    NASA Technical Reports Server (NTRS)

    Alderson, Daniel J.; Brady, Franklyn H.; Breckheimer, Peter J.; Campbell, James K.; Christensen, Carl S.; Collier, James B.; Ekelund, John E.; Ellis, Jordan; Goltz, Gene L.; Hintz, Gerarld R.

    1989-01-01

    Double-Precision Trajectory Analysis Program, DPTRAJ, and Orbit Determination Program, ODP, developed and improved over years to provide highly reliable and accurate navigation capability for deep-space missions like Voyager. Each collection of programs working together to provide desired computational results. DPTRAJ, ODP, and supporting utility programs capable of handling massive amounts of data and performing various numerical calculations required for solving navigation problems associated with planetary fly-by and lander missions. Used extensively in support of NASA's Voyager project. DPTRAJ-ODP available in two machine versions. UNIVAC version, NPO-15586, written in FORTRAN V, SFTRAN, and ASSEMBLER. VAX/VMS version, NPO-17201, written in FORTRAN V, SFTRAN, PL/1 and ASSEMBLER.

  9. Integrated modeling of advanced optical systems

    NASA Astrophysics Data System (ADS)

    Briggs, Hugh C.; Needels, Laura; Levine, B. Martin

    1993-02-01

    This poster session paper describes an integrated modeling and analysis capability being developed at JPL under funding provided by the JPL Director's Discretionary Fund and the JPL Control/Structure Interaction Program (CSI). The posters briefly summarize the program capabilities and illustrate them with an example problem. The computer programs developed under this effort will provide an unprecedented capability for integrated modeling and design of high performance optical spacecraft. The engineering disciplines supported include structural dynamics, controls, optics and thermodynamics. Such tools are needed in order to evaluate the end-to-end system performance of spacecraft such as OSI, POINTS, and SMMM. This paper illustrates the proof-of-concept tools that have been developed to establish the technology requirements and demonstrate the new features of integrated modeling and design. The current program also includes implementation of a prototype tool based upon the CAESY environment being developed under the NASA Guidance and Control Research and Technology Computational Controls Program. This prototype will be available late in FY-92. The development plan proposes a major software production effort to fabricate, deliver, support and maintain a national-class tool from FY-93 through FY-95.

  10. Estimating Mutual Information for High-to-Low Calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michaud, Isaac James; Williams, Brian J.; Weaver, Brian Phillip

    Presentation shows that KSG 2 is superior to KSG 1 because it scales locally automatically; KSG estimators are limited to a maximum MI due to sample size; LNC extends the capability of KSG without onerous assumptions; iLNC allows LNC to estimate information gain.
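    For context, the KSG family of mutual information estimators uses k-nearest-neighbor distances and digamma values. The brute-force sketch below implements the first KSG estimator (KSG 1) for one-dimensional samples; it is illustrative only and unrelated to the presentation's actual code:

```python
import random

EULER_GAMMA = 0.5772156649015329

def digamma_int(n):
    """psi(n) for a positive integer n: -gamma + sum_{j=1}^{n-1} 1/j."""
    return -EULER_GAMMA + sum(1.0 / j for j in range(1, n))

def ksg1_mi(xs, ys, k=3):
    """KSG estimator 1 of mutual information (nats) for 1-D samples,
    brute force O(N^2): for each point, take the Chebyshev distance to its
    k-th neighbour in the joint space, then count marginal neighbours."""
    n = len(xs)
    psi_sum = 0.0
    for i in range(n):
        d = sorted(max(abs(xs[j] - xs[i]), abs(ys[j] - ys[i]))
                   for j in range(n) if j != i)
        eps = d[k - 1]
        nx = sum(1 for j in range(n) if j != i and abs(xs[j] - xs[i]) < eps)
        ny = sum(1 for j in range(n) if j != i and abs(ys[j] - ys[i]) < eps)
        psi_sum += digamma_int(nx + 1) + digamma_int(ny + 1)
    return digamma_int(k) + digamma_int(n) - psi_sum / n

random.seed(1)
x = [random.gauss(0.0, 1.0) for _ in range(400)]
y_indep = [random.gauss(0.0, 1.0) for _ in range(400)]
y_dep = [xi + 0.2 * random.gauss(0.0, 1.0) for xi in x]

mi_indep = ksg1_mi(x, y_indep)  # near 0 for independent samples
mi_dep = ksg1_mi(x, y_dep)      # clearly positive for dependent samples
print(round(mi_indep, 2), round(mi_dep, 2))
```

    The sample-size ceiling mentioned in the abstract shows up here directly: for strongly dependent data the k-th neighbour distances collapse, and the estimate saturates unless N grows.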

  11. Contractor Accounting, Reporting and Estimating (CARE).

    DTIC Science & Technology

    Contractor Accounting Reporting and Estimating (CARE) provides check lists that may be used as guides in evaluating the accounting system, financial reporting , and cost estimating capabilities of the contractor. Experience gained from the Management Review Technique was used as a basis for the check lists. (Author)

  12. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, M.; Archer, B.; Hendrickson, B.

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  13. The NASA controls-structures interaction technology program

    NASA Technical Reports Server (NTRS)

    Newsom, Jerry R.; Layman, W. E.; Waites, H. B.; Hayduk, R. J.

    1990-01-01

    The interaction between a flexible spacecraft structure and its control system is commonly referred to as controls-structures interaction (CSI). The CSI technology program is developing the capability and confidence to integrate the structure and control system, so as to avoid interactions that cause problems and to exploit interactions to increase spacecraft capability. A NASA program has been initiated to advance CSI technology to a point where it can be used in spacecraft design for future missions. The CSI technology program is a multicenter program utilizing the resources of the NASA Langley Research Center (LaRC), the NASA Marshall Space Flight Center (MSFC), and the NASA Jet Propulsion Laboratory (JPL). The purpose is to describe the current activities, results to date, and future activities of the NASA CSI technology program.

  14. Tests of transferability and validation of disaggregate behavioral demand models for evaluating the energy conservation potential of alternative transportation policies in nine US cities. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1977-04-01

    A transportation policy analysis methodology described in Guidelines for Travel Demand Analyses of Program Measures to Promote Carpools, Vanpools, and Public Transportation, November, 1976 (EAPA 4:1921) is demonstrated. The results reported build upon the two levels of analysis capabilities (a fully calibrated and operational computer package based on a set of disaggregate travel demand models that were estimated on a random sample of urban travelers and a manual procedure or sketch planning pivot-point version of the above methodology) and have undertaken to accomplish the following objectives: transferability, testing the manual approach on actual applications, and validating the method. The first objective was investigated by examining and comparing disaggregate models that were estimated in 7 US cities by eight different organizations. The next two objectives were investigated using separate case studies: the Washington, DC, Shirley Highway preferential transit and carpool lanes; the Portland, Oregon, Banfield Expressway preferential transit and carpool lanes; the Los Angeles, Santa Monica Freeway preferential Diamond Lane and ramp metering facilities for transit and carpools; the Minneapolis express bus on metered freeway project; and the Portland, Oregon, carpool matching and promotion programs for the general public and for employer-based groups. Principal findings are summarized and results consolidated. (MCW)

  15. Nutrient Tracking Tool - A user-friendly tool for evaluating the water and air quality and quantity as affected by various agricultural management practices

    NASA Astrophysics Data System (ADS)

    Saleh, A.; Niraula, R.; Gallego, O.; Osei, E.; Kannan, N.

    2017-12-01

    The Nutrient Tracking Tool (NTT) is a user-friendly web-based computer program that estimates nutrient (nitrogen and phosphorus) and sediment losses from fields managed under a variety of cropping patterns and management practices. The NTT includes a user-friendly web-based interface and is linked to the Agricultural Policy Environmental eXtender (APEX) model. It also accesses USDA-NRCS's Web Soil Survey to obtain field, weather, and soil information. NTT provides producers, government officials, and other users with a fast and efficient method of estimating nutrient, sediment, and atmospheric gas (N2O, CO2, and NH4) losses, and crop production under different conservation practice regimes at the farm level. The information obtained from NTT can help producers determine the most cost-effective conservation practice(s) to reduce nutrient and sediment losses while optimizing crop production. Also, the recent version of NTT (NTTg3) has been developed for countries without access to national databases, such as soils and weather. NTTg3 has also been designed as an easy-to-use APEX interface. NTT is currently being evaluated for trading and other programs in the Chesapeake Bay region and numerous US states. During this presentation the new capabilities of NTTg3 will be described and demonstrated.

  16. Successful Development of Generic Capabilities in an Undergraduate Medical Education Program

    ERIC Educational Resources Information Center

    McNeil, H. Patrick; Scicluna, Helen A.; Boyle, Patrick; Grimm, Michael C.; Gibson, Kathryn A.; Jones, Philip D.

    2012-01-01

    The development of generic capabilities or graduate attributes in communication, teamwork, critical analysis of information, problem solving and ethical practice is widely recognised as a desired outcome of higher education. This emphasis on generic capabilities has emerged despite ongoing debates about the concept and development of such…

  17. Investigation of the Capability of Compact Polarimetric SAR Interferometry to Estimate Forest Height

    NASA Astrophysics Data System (ADS)

    Zhang, Hong; Xie, Lei; Wang, Chao; Chen, Jiehong

    2013-08-01

    The main objective of this paper is to investigate the capability of compact Polarimetric SAR Interferometry (C-PolInSAR) for forest height estimation. For this, the pseudo fully polarimetric interferometric (F-PolInSAR) covariance matrix is first reconstructed; then the three-stage inversion algorithm, the hybrid algorithm, and the MUSIC and Capon algorithms are applied to both the C-PolInSAR covariance matrix and the pseudo F-PolInSAR covariance matrix. Forest height estimation is demonstrated using L-band data generated by the PolSARProSim simulator and X-band airborne data acquired by the East China Research Institute of Electronic Engineering, China Electronics Technology Group Corporation.

  18. Martin Marietta, Y-12 Plant Laboratory Partnership Program Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koger, J.

    1995-02-10

    The Y-12 Plant currently embraces three mission areas: stockpile surveillance, maintaining production capability, and storage of special nuclear materials. The Y-12 Plant also contributes to the nation's economic strength by partnering with industry in deploying technology. This partnering has been supported to a great extent through the Technology Transfer Initiative (TTI) directed by DOE/Defense Programs (DP-14). The Oak Ridge Centers for Manufacturing Technology (ORCMT) was established to draw upon the manufacturing and fabrication capabilities at the Y-12 Plant to coordinate and support collaborative efforts, between DP and the domestic industrial sector, toward the development of technologies which offer mutual benefit to both DOE/DP programs and the private sector. Most of the needed technologies for the "Factory of the Future" (FOF) are being pursued as core areas at the Y-12 Plant. As a result, 85% of DP-14 projects already support the FOF. The unique capabilities of ORCMT can be applied to a wide range of manufacturing problems to enhance the capabilities of the US industrial base and its economic outcome. The ORCMT has an important role to play in DOE's Technology Transfer Initiative because its capabilities are focused on applied manufacturing and technology deployment, which have a more near-term impact on private sector competitiveness. The Y-12 Plant uses the ORCMT to help maintain its own core competencies for the FOF by challenging its engineers and capabilities with technical problems from industry. Areas of strength at the Y-12 Plant that could impact the FOF include modeling of processes and advanced materials; intelligent inspection systems with standardized operator interfaces, analysis software, and part programming language; electronic transfer of designs and features; existing computer-based concurrent engineering; and knowledge-based forming processes.

  19. HYDES: A generalized hybrid computer program for studying turbojet or turbofan engine dynamics

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.

    1974-01-01

    This report describes HYDES, a hybrid computer program capable of simulating one-spool turbojet, two-spool turbojet, or two-spool turbofan engine dynamics. HYDES is also capable of simulating two- or three-stream turbofans with or without mixing of the exhaust streams. The program is intended to reduce the time required for implementing dynamic engine simulations. HYDES was developed for the Lewis Research Center's Electronic Associates (EAI) 690 Hybrid Computing System and satisfies the 16384-word core-size and hybrid-interface limits of that machine. The program could be modified to run on other computing systems. The use of HYDES to simulate a single-spool turbojet and a two-spool, two-stream turbofan engine is demonstrated. The form of the required input data is shown, and samples of output listings (teletype) and transient plots (x-y plotter) are provided. HYDES is shown to be capable of performing both steady-state (design and off-design) analyses and transient analyses.

  20. Low Cost Sensors-Current Capabilities and Gaps

    EPA Science Inventory

    1. Present the findings from a recent technology review of gas- and particulate-phase sensors 2. Focus on the lower-cost sensors 3. Discuss current capabilities, estimated range of measurement, selectivity, deployment platforms, response time, and expected range of acceptabl...

  1. Description and User Manual for a Web-Based Interface to a Transit-Loss Accounting Program for Monument and Fountain Creeks, El Paso and Pueblo Counties, Colorado

    USGS Publications Warehouse

    Kuhn, Gerhard; Krammes, Gary S.; Beal, Vivian J.

    2007-01-01

    The U.S. Geological Survey, in cooperation with Colorado Springs Utilities, the Colorado Water Conservation Board, and the El Paso County Water Authority, began a study in 2004 with the following objectives: (1) Apply a stream-aquifer model to Monument Creek, (2) use the results of the modeling to develop a transit-loss accounting program for Monument Creek, (3) revise an existing accounting program for Fountain Creek to easily incorporate ongoing and future changes in management of return flows of reusable water, and (4) integrate the two accounting programs into a single program and develop a Web-based interface to the integrated program that incorporates simple and reliable data entry that is automated to the fullest extent possible. This report describes the results of completing objectives (2), (3), and (4) of that study. The accounting program for Monument Creek was developed first by (1) using the existing accounting program for Fountain Creek as a prototype, (2) incorporating the transit-loss results from a stream-aquifer modeling analysis of Monument Creek, and (3) developing new output reports. The capabilities of the existing accounting program for Fountain Creek then were incorporated into the program for Monument Creek, and the output reports were expanded to include Fountain Creek. A Web-based interface to the new transit-loss accounting program then was developed that provided automated data entry. An integrated system of 34 nodes and 33 subreaches was created by combining the independent node and subreach systems used in the previously completed stream-aquifer modeling studies for the Monument and Fountain Creek reaches.
Important operational criteria that were implemented in the new transit-loss accounting program for Monument and Fountain Creeks included the following: (1) Retain all the reusable water-management capabilities incorporated into the existing accounting program for Fountain Creek; (2) enable daily accounting and transit-loss computations for a variable number of reusable return flows discharged into Monument Creek at selected locations; (3) enable diversion of all or a part of a reusable return flow at any selected node for purposes of storage in off-stream reservoirs or other similar types of reusable water management; and (4) provide flexibility in the accounting program to change the number of return-flow entities, the locations at which the return flows discharge into Monument or Fountain Creeks, or the locations to which the return flows are delivered. The primary component of the Web-based interface is a data-entry form that displays data stored in the accounting program input file; the data-entry form allows for entry and modification of new data, which are then written to the input file. When the data-entry form is displayed, up-to-date discharge data for each station are automatically computed and entered on the data-entry form. Data for native return flows, reusable return flows, reusable return flow diversions, and native diversions also are entered automatically or manually, if needed. In computing the estimated quantities of reusable return flow and the associated transit losses, the accounting program uses two sets of computations. The first set of computations is made between any two adjacent streamflow-gaging stations (termed 'stream-segment loop'); the primary purpose of the stream-segment loop is to estimate the loss or gain in native discharge between the two adjacent streamflow-gaging stations.
The second set of computations is made between any two adjacent nodes (termed 'subreach loop'); the actual transit-loss computations are made in the subreach loop, using the result from the stream-segment loop. The stream-segment loop is completed for a stream segment, and then the subreach loop is completed for each subreach within the segment. When the subreach loop is completed for all subreaches within a stream segment, the stream-segment loop is initiated for the ne
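
    The two-level computation described above can be sketched schematically. The loss model, field names, and all numbers below are hypothetical illustrations, not the accounting program's actual method:

```python
# Schematic sketch of the nested stream-segment / subreach computation
# (hypothetical loss model and data; not the USGS program's actual method).

def account_transit_losses(segments):
    """segments: list of dicts with gage discharges and per-subreach loss rates."""
    results = []
    for seg in segments:
        # Stream-segment loop: estimate native gain/loss between the two
        # adjacent streamflow-gaging stations bounding this segment.
        native_change = seg["q_downstream_gage"] - seg["q_upstream_gage"]
        q = seg["reusable_inflow"]
        # Subreach loop: apply transit losses subreach by subreach,
        # using the segment-level result as context.
        for loss_fraction in seg["subreach_loss_fractions"]:
            q *= (1.0 - loss_fraction)
        results.append({"delivered": q, "native_change": native_change})
    return results

result = account_transit_losses([{
    "q_upstream_gage": 50.0,              # discharge at upstream gage, cfs
    "q_downstream_gage": 48.0,            # discharge at downstream gage, cfs
    "reusable_inflow": 10.0,              # reusable return flow entering segment
    "subreach_loss_fractions": [0.02, 0.03, 0.01],
}])
```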

  2. GPS-based Microenvironment Tracker (MicroTrac) Model to ...

    EPA Pesticide Factsheets

    A critical aspect of air pollution exposure assessment is the estimation of the time spent by individuals in various microenvironments (ME). Accounting for the time spent in different ME with different pollutant concentrations can reduce exposure misclassifications, while failure to do so can add uncertainty and bias to risk estimates. In this study, a classification model, called MicroTrac, was developed to estimate time of day and duration spent in eight ME (indoors and outdoors at home, work, school; inside vehicles; other locations) from global positioning system (GPS) data and geocoded building boundaries. Based on a panel study, MicroTrac estimates were compared to 24 h diary data from 7 participants on workdays and 2 participants on nonworkdays, with corresponding GPS data and building boundaries of home, school, and work. MicroTrac correctly classified the ME for 99.5% of the daily time spent by the participants. The capability of MicroTrac could help to reduce the time-location uncertainty in air pollution exposure models and exposure metrics for individuals in health studies. The National Exposure Research Laboratory’s (NERL’s) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA’s mission to protect human health and the environment. HEASD’s research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA’s strategic plan. More specifically, our division conducts research to characterize
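
    As a rough illustration of the idea (not the actual MicroTrac classifier, which uses geocoded building boundaries and richer decision rules), a GPS fix can be assigned to a microenvironment label and time totals accumulated per label. All coordinates, thresholds, and place names below are made up:

```python
# Illustrative microenvironment (ME) classification from GPS fixes.
# Building boundaries are simplified to axis-aligned bounding boxes;
# all coordinates and the speed threshold are hypothetical.

BUILDINGS = {
    "home": (35.10, 35.11, -78.90, -78.89),   # (lat_min, lat_max, lon_min, lon_max)
    "work": (35.20, 35.21, -78.80, -78.79),
}

def classify_fix(lat, lon, speed_mps):
    """Assign a single GPS fix to a microenvironment label."""
    if speed_mps > 3.0:                       # sustained fast movement -> in a vehicle
        return "in-vehicle"
    for name, (la0, la1, lo0, lo1) in BUILDINGS.items():
        if la0 <= lat <= la1 and lo0 <= lon <= lo1:
            return name
    return "other"

def time_in_me(track, dt_s=60):
    """Sum seconds per microenvironment for a track of (lat, lon, speed) fixes."""
    totals = {}
    for lat, lon, v in track:
        me = classify_fix(lat, lon, v)
        totals[me] = totals.get(me, 0) + dt_s
    return totals

# 3 fixes at home, 2 in a vehicle, 4 at work, one minute apart
track = ([(35.105, -78.895, 0.0)] * 3
         + [(35.15, -78.85, 15.0)] * 2
         + [(35.205, -78.795, 0.5)] * 4)
totals = time_in_me(track)
```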

  3. Sensor data security level estimation scheme for wireless sensor networks.

    PubMed

    Ramos, Alex; Filho, Raimir Holanda

    2015-01-19

    Due to their increasing dissemination, wireless sensor networks (WSNs) have become the target of more and more sophisticated attacks, even capable of circumventing both attack detection and prevention mechanisms. This may cause WSN users, who totally trust these security mechanisms, to think that a sensor reading is secure, even when an adversary has corrupted it. For that reason, a scheme capable of estimating the security level (SL) that these mechanisms provide to sensor data is needed, so that users can be aware of the actual security state of this data and can make better decisions on its use. However, existing security estimation schemes proposed for WSNs fully ignore detection mechanisms and analyze solely the security provided by prevention mechanisms. In this context, this work presents the sensor data security estimator (SDSE), a new comprehensive security estimation scheme for WSNs. SDSE is designed for estimating the sensor data security level based on security metrics that analyze both attack prevention and detection mechanisms. In order to validate our proposed scheme, we have carried out extensive simulations that show the high accuracy of SDSE estimates.
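
    The paper's core idea, combining metrics from both prevention and detection mechanisms rather than prevention alone, might be sketched as a weighted aggregate. The metric names and weights below are hypothetical, not SDSE's actual metrics:

```python
# Hypothetical sketch of a combined security-level (SL) estimate in [0, 1]
# that accounts for both attack prevention and attack detection mechanisms.
# Metric names and weights are invented for illustration only.

def security_level(prevention, detection, w_prev=0.5, w_det=0.5):
    """Each argument is a dict of metric name -> normalized score in [0, 1]."""
    def avg(scores):
        return sum(scores.values()) / len(scores) if scores else 0.0
    return w_prev * avg(prevention) + w_det * avg(detection)

prevention = {"key_strength": 0.9, "auth_coverage": 0.7}   # made-up metrics
detection = {"ids_accuracy": 0.8, "alert_latency": 0.6}
sl = security_level(prevention, detection)
```

A prevention-only scheme would report 0.8 here; folding in the weaker detection scores lowers the estimate, which is the kind of blind spot the authors argue against.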

  4. Sensor Data Security Level Estimation Scheme for Wireless Sensor Networks

    PubMed Central

    Ramos, Alex; Filho, Raimir Holanda

    2015-01-01

    Due to their increasing dissemination, wireless sensor networks (WSNs) have become the target of more and more sophisticated attacks, even capable of circumventing both attack detection and prevention mechanisms. This may cause WSN users, who totally trust these security mechanisms, to think that a sensor reading is secure, even when an adversary has corrupted it. For that reason, a scheme capable of estimating the security level (SL) that these mechanisms provide to sensor data is needed, so that users can be aware of the actual security state of this data and can make better decisions on its use. However, existing security estimation schemes proposed for WSNs fully ignore detection mechanisms and analyze solely the security provided by prevention mechanisms. In this context, this work presents the sensor data security estimator (SDSE), a new comprehensive security estimation scheme for WSNs. SDSE is designed for estimating the sensor data security level based on security metrics that analyze both attack prevention and detection mechanisms. In order to validate our proposed scheme, we have carried out extensive simulations that show the high accuracy of SDSE estimates. PMID:25608215

  5. Small Projects Rapid Integration and Test Environment (SPRITE): Application for Increasing Robustness

    NASA Technical Reports Server (NTRS)

    Rakoczy, John; Heater, Daniel; Lee, Ashley

    2013-01-01

    Marshall Space Flight Center's (MSFC) Small Projects Rapid Integration and Test Environment (SPRITE) is a Hardware-In-The-Loop (HWIL) facility that provides rapid development, integration, and testing capabilities for small projects (CubeSats, payloads, spacecraft, and launch vehicles). This facility environment focuses on efficient processes and modular design to support rapid prototyping, integration, testing and verification of small projects at an affordable cost, especially compared to larger type HWIL facilities. SPRITE (Figure 1) consists of a "core" capability or "plant" simulation platform utilizing a graphical programming environment capable of being rapidly re-configured for any potential test article's space environments, as well as a standard set of interfaces (i.e. Mil-Std 1553, Serial, Analog, Digital, etc.). SPRITE also allows this level of interface testing of components and subsystems very early in a program, thereby reducing program risk.

  6. Updated Panel-Method Computer Program

    NASA Technical Reports Server (NTRS)

    Ashby, Dale L.

    1995-01-01

    Panel code PMARC_12 (Panel Method Ames Research Center, version 12) computes potential-flow fields around complex three-dimensional bodies such as complete aircraft models. Contains several advanced features, including internal mathematical modeling of flow, time-stepping wake model for simulating either steady or unsteady motions, capability for Trefftz-plane computation of induced drag, capability for computation of off-body and on-body streamlines, and capability for computation of boundary-layer parameters by use of two-dimensional integral boundary-layer method along surface streamlines. Investigators interested in visual representations of phenomena may want to consider obtaining program GVS (ARC-13361), General Visualization System. GVS is Silicon Graphics IRIS program created to support scientific-visualization needs of PMARC_12. GVS available separately from COSMIC. PMARC_12 written in standard FORTRAN 77, with exception of NAMELIST extension used for input.

  7. A user's guide for DTIZE an interactive digitizing and graphical editing computer program

    NASA Technical Reports Server (NTRS)

    Thomas, C. C.

    1981-01-01

    A guide for DTIZE, a two-dimensional digitizing program with graphical editing capability, is presented. DTIZE provides the capability to simultaneously create and display a picture on the display screen. Data descriptions may be permanently saved in three different formats. DTIZE creates the picture graphics in the locator mode, thus inputting one coordinate each time the terminator button is pushed. Graphic input (GIN) devices are also used to select commands from the function menu. These menu commands and the program's interactive prompting sequences provide a complete capability for creating, editing, and permanently recording a graphical picture file. DTIZE is written in the FORTRAN IV language for the Tektronix 4081 graphic system, utilizing the Plot 80 Distributed Graphics Library (DGL) subroutines. The Tektronix 4953/3954 Graphic Tablet with mouse, pen, or joystick is used as the graphics input device to create picture graphics.

  8. Graduate capabilities for health service managers: reconfiguring health management education @UNSW.

    PubMed

    Meyer, Lois D; Hodgkinson, Alan R; Knight, Rosemary; Ho, Maria Theresa; di Corpo, Sophie K; Bhalla, Sonal

    2007-08-01

    The Master of Health Administration program at UNSW was extensively revised in 2006 to ensure that it effectively meets the challenging and dynamic environment of health service managers in local and global health contexts. This paper describes the innovative approach to the redesign of the health management program within the Faculty of Medicine. It outlines the method and considerations undertaken, particularly in identifying and embedding new graduate capabilities within the program. The paper concludes that using an outcomes-based approach and engaging with key stakeholders provides opportunity to identify and promote critical capabilities needed by managers to support the challenges confronting health services, including workforce flexibility. Further research is required on how such curriculum initiatives might impact on the performance of health service managers, but initial indications are that the health industry recognises the need and value of this approach.

  9. Building a Predictive Capability for Decision-Making that Supports MultiPEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmichael, Joshua Daniel

    Multi-phenomenological explosion monitoring (multiPEM) is a developing science that uses multiple geophysical signatures of explosions to better identify and characterize their sources. MultiPEM researchers seek to integrate explosion signatures together to provide stronger detection, parameter estimation, or screening capabilities between different sources or processes. This talk will address forming a predictive capability for screening waveform explosion signatures to support multiPEM.

  10. Meeting report: Estimating the benefits of reducing hazardous air pollutants--summary of 2009 workshop and future considerations.

    PubMed

    Gwinn, Maureen R; Craig, Jeneva; Axelrad, Daniel A; Cook, Rich; Dockins, Chris; Fann, Neal; Fegley, Robert; Guinnup, David E; Helfand, Gloria; Hubbell, Bryan; Mazur, Sarah L; Palma, Ted; Smith, Roy L; Vandenberg, John; Sonawane, Babasaheb

    2011-01-01

    Quantifying the benefits of reducing hazardous air pollutants (HAPs, or air toxics) has been limited by gaps in toxicological data, uncertainties in extrapolating results from high-dose animal experiments to estimate human effects at lower doses, limited ambient and personal exposure monitoring data, and insufficient economic research to support valuation of the health impacts often associated with exposure to individual air toxics. To address some of these issues, the U.S. Environmental Protection Agency held the Workshop on Estimating the Benefits of Reducing Hazardous Air Pollutants (HAPs) in Washington, DC, from 30 April to 1 May 2009. Experts from multiple disciplines discussed how best to move forward on air toxics benefits assessment, with a focus on developing near-term capability to conduct quantitative benefits assessment. Proposed methodologies involved analysis of data-rich pollutants and application of this analysis to other pollutants, using dose-response modeling of animal data for estimating benefits to humans, determining dose-equivalence relationships for different chemicals with similar health effects, and analysis similar to that used for criteria pollutants. Limitations and uncertainties in economic valuation of benefits assessment for HAPs were discussed as well. These discussions highlighted the complexities in estimating the benefits of reducing air toxics, and participants agreed that alternative methods for benefits assessment of HAPs are needed. Recommendations included clearly defining the key priorities of the Clean Air Act air toxics program to identify the most effective approaches for HAPs benefits analysis, focusing on susceptible and vulnerable populations, and improving dose-response estimation for quantification of benefits.

  11. Estimation of process capability indices from the results of limit gauge inspection of dimensional parameters in machining industry

    NASA Astrophysics Data System (ADS)

    Masterenko, Dmitry A.; Metel, Alexander S.

    2018-03-01

    The process capability indices Cp and Cpk are widely used in modern quality management as statistical measures of the ability of a process to produce output X within specification limits. The customer's requirement to ensure Cp ≥ 1.33 is often applied in contracts. Capability index estimates may be calculated from estimates of the mean µ and the variability 6σ, and for this, the quality characteristic must be measured in a sample of pieces. That, in turn, requires advanced measuring devices and well-qualified staff. On the other hand, quality inspection by attributes, fulfilled with limit gauges (go/no-go), is much simpler and has a higher performance, but it does not give the numerical values of the quality characteristic. The described method allows estimating the mean and the variability of the process on the basis of the results of limit gauge inspection with a certain lower limit LCL and upper limit UCL, which separate the pieces into three groups: X < LCL (n1 pieces), LCL ≤ X < UCL (n2 pieces), and X ≥ UCL (n3 pieces). So-called Pittman-type estimates, developed by the authors, are functions of n1, n2, and n3 and allow calculation of the estimated µ and σ. Thus, Cp and Cpk may also be estimated without precise measurements. The estimates can be used in quality inspection of lots of pieces as well as in monitoring and control of the manufacturing process. This is important for improving the quality of machined parts through tolerance control.
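
    For reference, when the quality characteristic can be measured directly, Cp and Cpk follow from the sample mean and standard deviation; this is the quantity the paper estimates indirectly from the gauge counts n1, n2, n3. A minimal sketch with made-up measurements:

```python
# Computing Cp and Cpk from a measured sample (hypothetical data).
# Cp compares the tolerance width to the 6-sigma process spread;
# Cpk additionally penalizes an off-center process mean.
import statistics

def capability_indices(sample, lsl, usl):
    """Return (Cp, Cpk) for a sample against lower/upper specification limits."""
    mu = statistics.mean(sample)
    sigma = statistics.stdev(sample)              # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)                # potential capability (spread only)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)   # actual capability (centering)
    return cp, cpk

# Example: a centered process (mean 10.00, sigma 0.02) inside limits 9.7..10.3
sample = [9.98, 10.01, 10.00, 9.99, 10.02, 10.00, 9.97, 10.03]
cp, cpk = capability_indices(sample, lsl=9.7, usl=10.3)
```

With these numbers both indices come out to 5.0, comfortably above the contractual 1.33 threshold mentioned above.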

  12. The Shock and Vibration Digest, Volume 14, Number 4

    DTIC Science & Technology

    1982-04-01

    temperature, humidity, shock, and vibration -- can influence this capability; as a result an almost continuous program of research and development has...producing reliability tests. For some time there has been interest in the Army Test Methodology program for developing a vibration system capable...geology of the Livermore Valley is obtained. 82-768 Transient Stress Wave Propagation in HTGR Fuel Element Impacts I.T. Almajan and P.D. Smith

  13. Defense Acquisitions: Assessments of Selected Weapon Programs

    DTIC Science & Technology

    2010-03-01

    improved availability for small terminals. It is to replace the Ultra High Frequency (UHF) Follow-On (UFO) satellite system currently in operation...of MUOS capabilities is time-critical due to the operational failures of two UFO satellites. The MUOS program has taken several steps to address...failures of two UFO satellites. Based on the current health of on-orbit satellites, UHF communication capabilities are predicted to fall below the

  14. The C-27J Spartan Procurement Program: A Case Study in USAF Sourcing Practices for National Security

    DTIC Science & Technology

    2012-06-15

    Figure 2: Mobility System Utilization by MCRS Case...into the viability of other missions. Mobility Capabilities and Requirements Study 2016 The Mobility Capabilities and Requirements Study 2016 (MCRS-16)...the second since 9-11, and it was released in February 2010 using the programmed force in the 2009 President's Budget (PB09). The MCRS-16 Executive

  15. The Use of a UNIX-Based Workstation in the Information Systems Laboratory

    DTIC Science & Technology

    1989-03-01

    system. The conclusions of the research and the resulting recommendations are presented in Chapter III. These recommendations include how to manage...required to run the program on a new system, these should not be significant changes. 2. Processing Environment The UNIX processing environment is...interactive with multi-tasking and multi-user capabilities. Multi-tasking refers to the fact that many programs can be run concurrently. This capability

  16. Developing Systems Engineering Experience Accelerator (SEEA) Prototype and Roadmap -- Increment 4

    DTIC Science & Technology

    2017-08-08

    of an acquisition program, two categories of new capabilities were added to the UAV experience. Based on a student project at Stevens Institute of...program for a new unmanned aerial vehicle (UAV) system. It was based on the concept of the learners assuming this role shortly after preliminary...University curriculum for systems engineers. First, several new capabilities have been added. These include a trade study for additional technical

  17. Partial Validation of Multibody Program to Optimize Simulated Trajectories II (POST II) Parachute Simulation With Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Ben; Queen, Eric M.

    2002-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed. This capability uses the Program to Optimize Simulated Trajectories II (POST II). Previously, POST II had the ability to simulate multiple bodies without interacting forces. The current implementation is used for the simulation of parachute trajectories, in which the parachute and suspended bodies can be treated as rigid bodies. An arbitrary set of connecting lines can be included in the model and are treated as massless spring-dampers. This paper discusses details of the connection-line modeling and results of several test cases used to validate the capability.

  18. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, Michel; Archer, Bill; Matzen, M. Keith

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  19. Investments by NASA to build planetary protection capability

    NASA Astrophysics Data System (ADS)

    Buxbaum, Karen; Conley, Catharine; Lin, Ying; Hayati, Samad

    NASA continues to invest in capabilities that will enable or enhance planetary protection planning and implementation for future missions. These investments are critical to the Mars Exploration Program and will be increasingly important as missions are planned for exploration of the outer planets and their icy moons. Since the last COSPAR Congress, there has been an opportunity to respond to the advice of NRC-PREVCOM and the analysis of the MEPAG Special Regions Science Analysis Group. This stimulated research into such things as expanded bioburden reduction options, modern molecular assays and genetic inventory capability, and approaches to understand or avoid recontamination of spacecraft parts and samples. Within NASA, a portfolio of PP research efforts has been supported through the NASA Office of Planetary Protection, the Mars Technology Program, and the Mars Program Office. The investment strategy focuses on technology investments designed to enable future missions and reduce their costs. In this presentation we will provide an update on research and development supported by NASA to enhance planetary protection capability. Copyright 2008 California Institute of Technology. Government sponsorship acknowledged.

  20. Program of Policy Studies in Science and Technology, supplement to seven year review

    NASA Technical Reports Server (NTRS)

    Mayo, L. H.

    1975-01-01

    The activities of the Program of Policy Studies are described and evaluated. Awards, seminars, and publications are included, along with student researcher profiles, the graduate program in science, technology, and public policy, and a statement of program capability.

  1. SPOT Program

    NASA Technical Reports Server (NTRS)

    Smith, Jason T.; Welsh, Sam J.; Farinetti, Antonio L.; Wegner, Tim; Blakeslee, James; Deboeck, Toni F.; Dyer, Daniel; Corley, Bryan M.; Ollivierre, Jarmaine; Kramer, Leonard

    2010-01-01

    A Spacecraft Position Optimal Tracking (SPOT) program was developed to process Global Positioning System (GPS) data, sent via telemetry from a spacecraft, to generate accurate navigation estimates of the vehicle position and velocity (state vector) using a Kalman filter. This program uses the GPS onboard receiver measurements to sequentially calculate the vehicle state vectors and provide this information to ground flight controllers. It is the first real-time ground-based shuttle navigation application using onboard sensors. The program is compact, portable, self-contained, and can run on a variety of UNIX or Linux computers. The program has a modular object-oriented design that supports application-specific plugins such as data corruption remediation pre-processing and remote graphics display. The Kalman filter is extensible to additional sensor types or force models. The Kalman filter design is also robust against data dropouts because it uses physical models for state and covariance propagation in the absence of data. The design of this program separates the functionalities of SPOT into six different executable processes. This allows the individual processes to be connected in an a la carte manner, making the feature set and executable complexity of SPOT adaptable to the needs of the user. Also, these processes need not be executed on the same workstation, because SPOT processes can communicate across a Local Area Network (LAN). Thus, SPOT can be executed in a distributed sense, with the capability for a team of flight controllers to efficiently share the same trajectory information currently being computed by the program. SPOT is used in the Mission Control Center (MCC) for Space Shuttle Program (SSP) and International Space Station Program (ISSP) operations, and can also be used as a post-flight analysis tool. It is primarily used for situational awareness and for contingency situations.
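
    The dropout behavior described above, propagating the state and covariance with a physical model when no measurement arrives and correcting when one does, can be illustrated with a minimal one-dimensional constant-velocity Kalman filter. This is a sketch of the general technique, not the SPOT implementation; all tuning values are arbitrary:

```python
# Minimal 1-D constant-velocity Kalman filter that keeps propagating through
# data dropouts (measurement = None) and corrects only when data arrives.
# State x = [position, velocity]; measurement observes position (H = [1, 0]).

def kalman_track(measurements, dt=1.0, q=1e-3, r=0.25):
    """Return position estimates for a list of position measurements or None."""
    x = [0.0, 0.0]                        # state estimate
    P = [[1.0, 0.0], [0.0, 1.0]]          # state covariance
    out = []
    for z in measurements:
        # Predict: x = F x, P = F P F^T + Q, with F = [[1, dt], [0, 1]]
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + q]]
        # Update only when a measurement is available
        if z is not None:
            s = P[0][0] + r                      # innovation covariance
            k = [P[0][0] / s, P[1][0] / s]       # Kalman gain
            y = z - x[0]                         # innovation
            x = [x[0] + k[0] * y, x[1] + k[1] * y]
            P = [[(1 - k[0]) * P[0][0], (1 - k[0]) * P[0][1]],
                 [P[1][0] - k[1] * P[0][0], P[1][1] - k[1] * P[0][1]]]
        out.append(x[0])
    return out

# Constant-velocity truth (position = t) with a two-sample dropout
est = kalman_track([0.0, 1.0, 2.0, None, None, 5.0, 6.0])
```

During the dropout, the filter coasts on its velocity estimate while the covariance grows, so when data returns, the gain is large and the estimate snaps back toward the measurements.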

  2. [Thinking about several problems of the research of our family planning strategy].

    PubMed

    Shi, H

    1989-03-01

    On the basis of 1982 census data, it is estimated that from 1987-1997, 13 million women will enter the age of marriage and child-bearing each year. The task of keeping the population size around 1.2 billion by the year 2000 is arduous. Great efforts have to be made to continue encouraging one child per couple, to pursue the current plans and policies, and to maintain strict control over fertility. Keeping population growth in pace with economic growth, the environment, ecological balance, availability of per capita resources, education programs, employment capability, health services, maternal and child care, social welfare, and social security should be a component of the long-term development strategy of the country. Family planning is a comprehensive program which involves long cycles and complicated factors; viewpoints of expediency in guiding policy and program formulation for short-term benefits are inappropriate. The emphasis of family planning program strategy should be placed on the rural areas, where the majority of the population resides. Specifically, the major strategic thrusts should be the linkage between policy implementation and reception, and between family planning publicity and changes in ideation on fertility; integrated urban and rural program management relating to migration; and differentiation of policy towards minority populations and areas in different stages of economic development. In order to achieve the above strategies, several measures are proposed.
(1) strengthening family planning program and organization structure; (2) providing information on population and contraception; (3) establishing family planning program network for infiltration effects; (4) using government financing, taxation, loan, social welfare and penalty to regulate fertility motivations; (5) improving the system of target allocation and data reporting to facilitate program implementation; (6) strengthening population projection and policy research; (7) and strengthening training of family planning personnel to improve program efficiency.

  3. Community Psychology and the Capabilities Approach

    PubMed Central

    2016-01-01

    What makes for a good life? The capabilities approach to this question has much to offer community psychology, particularly with respect to marginalized groups. Capabilities are freedoms to engage in valued social activities and roles—what people can do and be given both their capacities, and environmental opportunities and constraints. Economist Amartya Sen’s focus on freedoms and agency resonates with psychological calls for empowerment, and philosopher Martha Nussbaum’s specification of requirements for a life that is fully human provides an important guide for social programs. Community psychology’s focus on mediating structures has much to offer the capabilities approach. Parallels between capabilities, as enumerated by Nussbaum, and settings that foster positive youth development, as described in a National Research Council Report (Eccles and Gootman (Eds) in Community programs to promote youth development. National Academy Press, Washington, 2002) suggest extensions of the approach to children. Community psychologists can contribute to theory about ways to create and modify settings to enhance capabilities as well as empowerment and positive youth development. Finally, capabilities are difficult to measure, because they involve freedoms to choose but only choices actually made or enacted can be observed. The variation in activities or goals across members of a setting provides a measure of the capabilities that the setting fosters. PMID:25822113

  4. Community psychology and the capabilities approach.

    PubMed

    Shinn, Marybeth

    2015-06-01

    What makes for a good life? The capabilities approach to this question has much to offer community psychology, particularly with respect to marginalized groups. Capabilities are freedoms to engage in valued social activities and roles-what people can do and be given both their capacities, and environmental opportunities and constraints. Economist Amartya Sen's focus on freedoms and agency resonates with psychological calls for empowerment, and philosopher Martha Nussbaum's specification of requirements for a life that is fully human provides an important guide for social programs. Community psychology's focus on mediating structures has much to offer the capabilities approach. Parallels between capabilities, as enumerated by Nussbaum, and settings that foster positive youth development, as described in a National Research Council Report (Eccles and Gootman (Eds) in Community programs to promote youth development. National Academy Press, Washington, 2002) suggest extensions of the approach to children. Community psychologists can contribute to theory about ways to create and modify settings to enhance capabilities as well as empowerment and positive youth development. Finally, capabilities are difficult to measure, because they involve freedoms to choose but only choices actually made or enacted can be observed. The variation in activities or goals across members of a setting provides a measure of the capabilities that the setting fosters.

  5. The Necessity of Functional Analysis for Space Exploration Programs

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Breidenthal, Julian C.

    2011-01-01

    As NASA moves toward expanded commercial spaceflight within its human exploration capability, there is increased emphasis on how to allocate responsibilities between government and commercial organizations to achieve coordinated program objectives. The practice of program-level functional analysis offers an opportunity for improved understanding of collaborative functions among heterogeneous partners. Functional analysis is contrasted with the physical analysis more commonly done at the program level, and is shown to provide theoretical performance, risk, and safety advantages beneficial to a government-commercial partnership. Performance advantages include faster convergence to acceptable system solutions; discovery of superior solutions with higher commonality, greater simplicity and greater parallelism by substituting functional for physical redundancy to achieve robustness and safety goals; and greater organizational cohesion around program objectives. Risk advantages include avoidance of rework by revelation of some kinds of architectural and contractual mismatches before systems are specified, designed, constructed, or integrated; avoidance of cost and schedule growth by more complete and precise specifications of cost and schedule estimates; and higher likelihood of successful integration on the first try. Safety advantages include effective delineation of must-work and must-not-work functions for integrated hazard analysis, the ability to formally demonstrate completeness of safety analyses, and provably correct logic for certification of flight readiness. The key mechanism for realizing these benefits is the development of an inter-functional architecture at the program level, which reveals relationships between top-level system requirements that would otherwise be invisible using only a physical architecture. 
This paper describes the advantages and pitfalls of functional analysis as a means of coordinating the actions of large heterogeneous organizations for space exploration programs.

  6. A survey of electric and hybrid vehicle simulation programs

    NASA Technical Reports Server (NTRS)

    Bevan, J.; Heimburger, D. A.; Metcalfe, M. A.

    1978-01-01

    Results of a survey conducted within the United States to determine the extent of development and capabilities of automotive performance simulation programs suitable for electric and hybrid vehicle studies are summarized. Altogether, 111 programs were identified as being in a usable state. The complexity of the existing programs spans a range from a page of simple desktop calculator instructions to 300,000 lines of a high-level programming language. The capability to simulate electric vehicles was most common, heat-engines second, and hybrid vehicles least common. Batch-operated programs are slightly more common than interactive ones, and one-third can be operated in either mode. The most commonly used language was FORTRAN, the language typically used by engineers. The higher-level simulation languages (e.g. SIMSCRIPT, GPSS, SIMULA) used by "model builders" were conspicuously lacking.

  7. Scoring the Icecap-A Capability Instrument. Estimation of a UK General Population Tariff†

    PubMed Central

    Flynn, Terry N; Huynh, Elisabeth; Peters, Tim J; Al-Janabi, Hareth; Clemens, Sam; Moody, Alison; Coast, Joanna

    2015-01-01

    This paper reports the results of a best–worst scaling (BWS) study to value the Investigating Choice Experiments Capability Measure for Adults (ICECAP-A), a new capability measure among adults, in a UK setting. A main effects plan plus its foldover was used to estimate weights for each of the four levels of all five attributes. The BWS study was administered to 413 randomly sampled individuals, together with sociodemographic and other questions. Scale-adjusted latent class analyses identified two preference and two (variance) scale classes. Ability to characterize preference and scale heterogeneity was limited, but data quality was good, and the final model exhibited a high pseudo-r-squared. After adjusting for heterogeneity, a population tariff was estimated. This showed that ‘attachment’ and ‘stability’ each account for around 22% of the space, and ‘autonomy’, ‘achievement’ and ‘enjoyment’ account for around 18% each. Across all attributes, greater value was placed on the difference between the lowest levels of capability than between the highest. This tariff will enable ICECAP-A to be used in economic evaluation both within the field of health and across public policy generally. © 2013 The Authors. Health Economics published by John Wiley & Sons Ltd. PMID:24254584

  8. Shuttle cryogenic supply system. Optimization study. Volume 5 B-1: Programmers manual for math models

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A computer program for rapid parametric evaluation of various types of cryogenic spacecraft systems is presented. The mathematical techniques of the program provide the capability for in-depth analysis combined with rapid problem solution for the production of a large quantity of soundly based trade-study data. The program requires a large data bank capable of providing characteristic performance data for a wide variety of component assemblies used in cryogenic systems. The program data requirements are divided into: (1) the semipermanent data tables and source data for performance characteristics, and (2) the variable input data, which contains input parameters that may be perturbed for parametric system studies.

  9. Space radiation studies

    NASA Technical Reports Server (NTRS)

    Gregory, J. C.

    1986-01-01

    Instrument design and data analysis expertise was provided in support of several space radiation monitoring programs. The Verification of Flight Instrumentation (VFI) program at NASA included both the Active Radiation Detector (ARD) and the Nuclear Radiation Monitor (NRM). Design, partial fabrication, calibration, and partial data-analysis support were provided to the ARD program, as well as detector head design and fabrication, software development, and partial data-analysis support to the NRM program. The ARD flew on Spacelab-1 in 1983, performed flawlessly, and was returned to MSFC after flight with unchanged calibration factors. The NRM, flown on Spacelab-2 in 1985, also performed without fault, not only recording the ambient gamma-ray background on the Spacelab, but also recording radiation events of astrophysical significance.

  10. Low NO(x) heavy fuel combustor program

    NASA Technical Reports Server (NTRS)

    Lister, E.; Niedzwiecki, R. W.; Nichols, L.

    1980-01-01

    The paper deals with the 'Low NO(x) Heavy Fuel Combustor Program'. Main program objectives are to generate and demonstrate the technology required to develop durable gas turbine combustors for utility and industrial applications, which are capable of sustained, environmentally acceptable operation with minimally processed petroleum residual fuels. The program will focus on 'dry' reductions of oxides of nitrogen (NO(x)), improved combustor durability and satisfactory combustion of minimally processed petroleum residual fuels. Other technology advancements sought include: fuel flexibility for operation with petroleum distillates, blends of petroleum distillates and residual fuels, and synfuels (fuel oils derived from coal or shale); acceptable exhaust emissions of carbon monoxide, unburned hydrocarbons, sulfur oxides and smoke; and retrofit capability to existing engines.

  11. The Boeing plastic analysis capability for engines

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1976-01-01

    The current BOPACE program is described as a nonlinear stress analysis program based on a family of isoparametric finite elements. The theoretical, user, programmer, and preprocessing aspects are discussed, and example problems are included. New features in the current program version include substructuring, an out-of-core Gauss wavefront equation solver, multipoint constraints, combined material and geometric nonlinearities, automatic calculation of inertia effects, provision for distributed as well as concentrated mechanical loads, follower forces, singular crack-tip elements, the SAIL automatic generation capability, and expanded user control over input quantity definition, output selection, and program execution. BOPACE is written in FORTRAN IV and is currently available for both the IBM 360/370 and the UNIVAC 1108 machines.

  12. Semi-Automated Identification of Rocks in Images

    NASA Technical Reports Server (NTRS)

    Bornstein, Benjamin; Castano, Andres; Anderson, Robert

    2006-01-01

    Rock Identification Toolkit Suite is a computer program that assists users in identifying and characterizing rocks shown in images returned by the Mars Exploration Rover mission. Included in the program are components for automated finding of rocks, interactive adjustments of outlines of rocks, active contouring of rocks, and automated analysis of shapes in two dimensions. The program assists users in evaluating the surface properties of rocks and soil and reports basic properties of rocks. The program requires either the Mac OS X operating system running on a G4 (or more capable) processor or a Linux operating system running on a Pentium (or more capable) processor, plus at least 128 MB of random-access memory.

  13. Composite structural materials

    NASA Technical Reports Server (NTRS)

    Ansell, G. S.; Loewy, R. G.; Wiberley, S. E.

    1981-01-01

    The composite aircraft program component (CAPCOMP) is a graduate level project conducted in parallel with a composite structures program. The composite aircraft program glider (CAPGLIDE) is an undergraduate demonstration project which has as its objectives the design, fabrication, and testing of a foot launched ultralight glider using composite structures. The objective of the computer aided design (COMPAD) portion of the composites project is to provide computer tools for the analysis and design of composite structures. The major thrust of COMPAD is in the finite element area with effort directed at implementing finite element analysis capabilities and developing interactive graphics preprocessing and postprocessing capabilities. The criteria for selecting research projects to be conducted under the innovative and supporting research (INSURE) program are described.

  14. Low NO(x) heavy fuel combustor program

    NASA Technical Reports Server (NTRS)

    Lister, E.; Niedzwiecki, R. W.; Nichols, L.

    1979-01-01

    The 'low nitrogen oxides heavy fuel combustor' program is described. Main program objectives are to generate and demonstrate the technology required to develop durable gas turbine combustors for utility and industrial applications, which are capable of sustained, environmentally acceptable operation with minimally processed petroleum residual fuels. The program will focus on 'dry' reductions of oxides of nitrogen, improved combustor durability, and satisfactory combustion of minimally processed petroleum residual fuels. Other technology advancements sought include: fuel flexibility for operation with petroleum distillates, blends of petroleum distillates and residual fuels, and synfuels (fuel oils derived from coal or shale); acceptable exhaust emissions of carbon monoxide, unburned hydrocarbons, sulfur oxides and smoke; and retrofit capability to existing engines.

  15. Creation of an instrument maintenance program at W. M. Keck Observatory

    NASA Astrophysics Data System (ADS)

    Hill, G. M.; Kwok, S. H.; Mader, J. A.; Wirth, G. D.; Dahm, S. E.; Goodrich, R. W.

    2014-08-01

    Until a few years ago, the W. M. Keck Observatory (WMKO) did not have a systematic program of instrument maintenance at a level appropriate for a world-leading observatory. We describe the creation of such a program within the context of WMKO's lean operations model, which posed challenges but also guided the design of the system and resulted in some unique and notable capabilities. These capabilities and the flexibility of the system have led to its adoption across the Observatory for virtually all PMs. The success of the Observatory in implementing the program and its impact on instrument reliability are presented. Lessons learned are reviewed and strategic implications discussed.

  16. Deriving Continuous Fields of Tree Cover at 1-m over the Continental United States From the National Agriculture Imagery Program (NAIP) Imagery to Reduce Uncertainties in Forest Carbon Stock Estimation

    NASA Astrophysics Data System (ADS)

    Ganguly, S.; Basu, S.; Mukhopadhyay, S.; Michaelis, A.; Milesi, C.; Votava, P.; Nemani, R. R.

    2013-12-01

    An unresolved issue with coarse-to-medium resolution satellite-based forest carbon mapping over regional to continental scales is the high level of uncertainty in above ground biomass (AGB) estimates caused by the absence of forest cover information at a high enough spatial resolution (current spatial resolution is limited to 30-m). To put confidence in existing satellite-derived AGB density estimates, it is imperative to create continuous fields of tree cover at a sufficiently high resolution (e.g. 1-m) such that large uncertainties in forested area are reduced. The proposed work will provide means to reduce uncertainty in present satellite-derived AGB maps and Forest Inventory and Analysis (FIA) based regional estimates. Our primary objective will be to create Very High Resolution (VHR) estimates of tree cover at a spatial resolution of 1-m for the Continental United States using all available National Agriculture Imagery Program (NAIP) color-infrared imagery from 2010 through 2012. We will leverage the existing capabilities of the NASA Earth Exchange (NEX) high performance computing and storage facilities. The proposed 1-m tree cover map can be further aggregated to provide percent tree cover at any medium-to-coarse resolution spatial grid, which will aid in reducing uncertainties in AGB density estimation at the respective grid and overcome current limitations imposed by medium-to-coarse resolution land cover maps. We have implemented a scalable and computationally efficient parallelized framework for tree-cover delineation; the core components of the algorithm include a feature-extraction process, a Statistical Region Merging image-segmentation algorithm, and a classification stage based on a Deep Belief Network and a Feedforward Backpropagation Neural Network.
An initial pilot exercise has been performed over the state of California (~11,000 scenes) to create a wall-to-wall 1-m tree cover map, and the classification accuracy has been assessed. Results show an improvement in accuracy of tree-cover delineation as compared to existing forest cover maps from NLCD, especially over fragmented, heterogeneous, and urban landscapes. Estimates of VHR tree cover will complement and enhance the accuracy of present remote-sensing based AGB modeling approaches and forest inventory based estimates at both national and local scales. A requisite step will be to characterize the inherent uncertainties in tree cover estimates and propagate them to estimate AGB.
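    The aggregation step mentioned above, rolling a 1-m binary tree mask up to percent tree cover on a coarser grid, reduces to block-averaging the mask over each coarse cell. A minimal sketch follows; the `percent_tree_cover` helper is hypothetical, and the actual NEX pipeline is far more elaborate.

```python
import numpy as np

def percent_tree_cover(mask, block):
    """Aggregate a high-resolution binary tree mask (1 = tree, 0 = not tree)
    to percent cover on a coarser grid whose cells span block x block
    fine pixels. Assumes the mask dimensions divide evenly by block."""
    h, w = mask.shape
    assert h % block == 0 and w % block == 0
    # Reshape so each coarse cell's fine pixels sit on axes 1 and 3,
    # then average them away and scale to percent.
    coarse = mask.reshape(h // block, block, w // block, block)
    return 100.0 * coarse.mean(axis=(1, 3))

# 4x4 toy mask aggregated to a 2x2 grid of 2x2 blocks
mask = np.array([[1, 1, 0, 0],
                 [1, 0, 0, 0],
                 [0, 0, 1, 1],
                 [0, 0, 1, 1]])
pct = percent_tree_cover(mask, 2)   # → [[75., 0.], [0., 100.]]
```

The same reshape-and-mean pattern applies at any target resolution, e.g. aggregating 1-m pixels to a 30-m grid with `block=30`.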

  17. Tuning in to Another Person's Action Capabilities: Perceiving Maximal Jumping-Reach Height from Walking Kinematics

    ERIC Educational Resources Information Center

    Ramenzoni, Veronica; Riley, Michael A.; Davis, Tehran; Shockley, Kevin; Armstrong, Rachel

    2008-01-01

    Three experiments investigated the ability to perceive the maximum height to which another actor could jump to reach an object. Experiment 1 determined the accuracy of estimates for another actor's maximal reach-with-jump height and compared these estimates to estimates of the actor's standing maximal reaching height and to estimates of the…

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, L.T.; Hickey, M.

    This paper summarizes the progress to date by CH2M HILL and the UKAEA in developing a parametric modelling capability for estimating the costs of large nuclear decommissioning projects in the United Kingdom (UK) and Europe. The ability to successfully apply parametric cost estimating techniques will be a key factor in commercial success in the UK and European multi-billion dollar waste management, decommissioning, and environmental restoration markets. The most useful parametric models will be those that incorporate individual components representing major elements of work: reactor decommissioning, fuel cycle facility decommissioning, waste management facility decommissioning, and environmental restoration. Models must be sufficiently robust to estimate indirect costs and overheads, permit pricing analysis and adjustment, and accommodate the intricacies of international monetary exchange, currency fluctuations, and contingency. The development of a parametric cost estimating capability is also a key component in building a forward estimating strategy. The forward estimating strategy will enable the preparation of accurate and cost-effective out-year estimates, even when work scope is poorly defined or as yet indeterminate. Preparation of cost estimates for work outside the organization's current sites, for which detailed measurement is not possible and historical cost data do not exist, will also be facilitated. (authors)

  19. The value of forage measurement information in rangeland management. [implementation of satellite data in range management

    NASA Technical Reports Server (NTRS)

    Lietzke, K. R.

    1975-01-01

    An economic model and simulation are developed to estimate the potential social benefit arising from the use of alternative measurement systems in rangeland management. In order to estimate these benefits, it was necessary to model three separate systems: the range environment, the rangeland manager, and the information system which links the two. The rancher's decision-making behavior is modeled according to sound economic principles. Results indicate substantial potential benefits, particularly in assisting management of government-operated ranges; possible annual benefits in this area range from $20 to $46 million, depending upon the system capabilities assumed. Possible annual benefits in privately managed stocker operations range from $2.8 to $49.5 million, depending upon where actual rancher capabilities lie and what system capabilities are assumed.

  20. Equivalent source modeling of the main field using MAGSAT data

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The software was considerably enhanced to accommodate a more comprehensive examination of data available for field modeling using the equivalent sources method by (1) implementing a dynamic core allocation capability into the software system for the automatic dimensioning of the normal matrix; (2) implementing a time-dependent model for the dipoles; (3) incorporating the capability to input specialized data formats in a fashion similar to models in spherical harmonics; and (4) implementing the optional ability to simultaneously estimate observatory anomaly biases where annual-means data are utilized. The time dependence capability was demonstrated by estimating a component model of 21 deg resolution using the 14-day MAGSAT data set of Goddard's MGST (12/80). The equivalent source model reproduced both the constant and the secular variation found in MGST (12/80).
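    Estimating a static dipole strength together with a linear secular-variation rate for each source, as described above, can be posed as a single linear least-squares problem: if each observation is y_i = sum_j A[i, j] * (m_j + s_j * t_i), stacking the geometry matrix with its time-multiplied copy recovers both coefficient sets at once. The sketch below uses a random synthetic geometry matrix standing in for the actual dipole Green's functions; all names and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_src = 200, 5
A = rng.normal(size=(n_obs, n_src))    # geometry matrix (stand-in for dipole Green's functions)
t = rng.uniform(0.0, 1.0, n_obs)       # observation times (e.g. years from epoch)
m_true = rng.normal(size=n_src)        # constant dipole strengths
s_true = 0.1 * rng.normal(size=n_src)  # linear secular-variation rates

# Noiseless synthetic observations: y_i = sum_j A[i,j] * (m_j + s_j * t_i)
y = A @ m_true + (A * t[:, None]) @ s_true

# Augment the design matrix with time-multiplied columns, then solve
# one ordinary least-squares problem for both coefficient sets.
G = np.hstack([A, A * t[:, None]])
coef, *_ = np.linalg.lstsq(G, y, rcond=None)
m_est, s_est = coef[:n_src], coef[n_src:]
```

Adding simultaneously estimated observatory biases amounts to appending further columns to `G`, one indicator column per observatory.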
