78 FR 77399 - Basic Health Program: Proposed Federal Funding Methodology for Program Year 2015
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-23
... American Indians and Alaska Natives F. Example Application of the BHP Funding Methodology III. Collection... effectively 138 percent due to the application of a required 5 percent income disregard in determining the... correct errors in applying the methodology (such as mathematical errors). Under section 1331(d)(3)(ii) of...
2008-07-23
This final rule applies to the Temporary Assistance for Needy Families (TANF) program and requires States, the District of Columbia and the Territories (hereinafter referred to as the "States") to use the "benefiting program" cost allocation methodology in U.S. Office of Management and Budget (OMB) Circular A-87 (2 CFR part 225). It is the judgment and determination of HHS/ACF that the "benefiting program" cost allocation methodology is the appropriate methodology for the proper use of Federal TANF funds. The Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA) of 1996 gave federally-recognized Tribes the opportunity to operate their own Tribal TANF programs. Federally-recognized Indian tribes operating approved Tribal TANF programs have always followed the "benefiting program" cost allocation methodology in accordance with OMB Circular A-87 (2 CFR part 225) and the applicable regulatory provisions at 45 CFR 286.45(c) and (d). This final rule contains no substantive changes to the proposed rule published on September 27, 2006.
Reliability Centered Maintenance - Methodologies
NASA Technical Reports Server (NTRS)
Kammerer, Catherine C.
2009-01-01
Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.
A Visual Programming Methodology for Tactical Aircrew Scheduling and Other Applications
1991-12-01
programming methodology and environment of a user-specific application remains with and is delivered as part of the application, then there is another factor...animation is useful, not only for scheduling applications, but as a general programming methodology. Of course, there are a number of improvements...possible using Excel because there is nothing to prevent access to cells. However, it is easy to imagine a spreadsheet which can support the
Seventh NASTRAN User's Colloquium
NASA Technical Reports Server (NTRS)
1978-01-01
The general application of finite element methodology and the specific application of NASTRAN to a wide variety of static and dynamic structural problems are described. Topics include: fluids and thermal applications, NASTRAN programming, substructuring methods, unique new applications, general auxiliary programs, specific applications, and new capabilities.
NASA Astrophysics Data System (ADS)
Sharma, Amita; Sarangdevot, S. S.
2010-11-01
Aspect-Oriented Programming (AOP) methodology has been investigated in the development of real-world business application software—Financial Accounting Software. The Eclipse-AJDT environment has been used as open source enhanced IDE support for programming in the AOP language—AspectJ. Crosscutting concerns have been identified and modularized as aspects. This reduces the complexity of the design considerably due to elimination of code scattering and tangling. Improvement in modularity, quality and performance is achieved. The study concludes that AOP methodology in the Eclipse-AJDT environment offers powerful support for modular design and implementation of real-world quality business software.
Basic principles, methodology, and applications of remote sensing in agriculture
NASA Technical Reports Server (NTRS)
Moreira, M. A. (Principal Investigator); Deassuncao, G. V.
1984-01-01
The basic principles of remote sensing applied to agriculture and the methods used in data analysis are described. Emphasis is placed on the importance of developing a methodology that may aid crop forecasting, basic concepts of spectral signatures of vegetation, the methodology of LANDSAT data utilization in agriculture, and the application of INPE's (Institute for Space Research) remote sensing program in agriculture.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-02
... applications under the Nutrition, Physical Activity, and Obesity Program. These applications have been..., Physical Activity, and Obesity Program Funding Opportunity Announcement (FOA). Award Information... based on methodology published in the Nutrition, Physical Activity, and Obesity Program CDC-RFA-DP08-805...
Estimating the return on investment in disease management programs using a pre-post analysis.
Fetterolf, Donald; Wennberg, David; Devries, Andrea
2004-01-01
Disease management programs have become increasingly popular over the past 5-10 years. Recent increases in overall medical costs have precipitated new concerns about the cost-effectiveness of medical management programs, concerns that now extend to the directors of these programs. Initial success of the disease management movement is being challenged on the grounds that reported results have been the result of the application of faulty, if intuitive, methodologies. This paper discusses the use of "pre-post" methodology approaches in the analysis of disease management programs, and areas where application of this approach can result in spurious results and incorrect financial outcome assessments. The paper includes a checklist of these items for use by operational staff working with the programs, and a comprehensive bibliography that addresses many of the issues discussed.
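To make the pitfall concrete, here is a minimal sketch (hypothetical numbers, not drawn from the paper) contrasting a naive pre-post savings calculation with one adjusted for an assumed secular cost trend; regression to the mean, which the paper also discusses, would additionally require a comparison group:

```python
# Illustrative sketch (not the paper's method): naive pre-post ROI versus a
# trend-adjusted version. All numbers are hypothetical.

def roi(savings, program_cost):
    """Return on investment: net savings per dollar of program cost."""
    return (savings - program_cost) / program_cost

pre_cost_pmpm = 450.0    # per-member-per-month cost before enrollment
post_cost_pmpm = 430.0   # observed cost after enrollment
members = 1000
months = 12
program_cost = 120_000.0

# Naive pre-post: attributes the entire cost drop to the program.
naive_savings = (pre_cost_pmpm - post_cost_pmpm) * members * months
print(f"naive ROI: {roi(naive_savings, program_cost):.2f}")

# Trend-adjusted: compares against what costs were expected to do anyway
# (an assumed 4% medical inflation here).
expected_post = pre_cost_pmpm * 1.04
adjusted_savings = (expected_post - post_cost_pmpm) * members * months
print(f"trend-adjusted ROI: {roi(adjusted_savings, program_cost):.2f}")
```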
Barrett, R. F.; Crozier, P. S.; Doerfler, D. W.; ...
2014-09-28
Computational science and engineering application programs are typically large, complex, and dynamic, and are often constrained by distribution limitations. As a means of making tractable rapid explorations of scientific and engineering application programs in the context of new, emerging, and future computing architectures, a suite of miniapps has been created to serve as proxies for full scale applications. Each miniapp is designed to represent a key performance characteristic that does or is expected to significantly impact the runtime performance of an application program. In this paper we introduce a methodology for assessing the ability of these miniapps to effectively represent these performance issues. We applied this methodology to four miniapps, examining the linkage between them and an application they are intended to represent. Herein we evaluate the fidelity of that linkage. This work represents the initial steps required to begin to answer the question, "Under what conditions does a miniapp represent a key performance characteristic in a full app?"
Introduction to Computational Methods for Stability and Control (COMSAC)
NASA Technical Reports Server (NTRS)
Hall, Robert M.; Fremaux, C. Michael; Chambers, Joseph R.
2004-01-01
This Symposium is intended to bring together the often distinct cultures of the Stability and Control (S&C) community and the Computational Fluid Dynamics (CFD) community. The COMSAC program is itself a new effort by NASA Langley to accelerate the application of high end CFD methodologies to the demanding job of predicting stability and control characteristics of aircraft. This talk is intended to set the stage by establishing the need for a program like COMSAC; it is not intended to give details of the program itself. The topics include: 1) S&C Challenges; 2) Aero prediction methodology; 3) CFD applications; 4) NASA COMSAC planning; 5) Objectives of symposium; and 6) Closing remarks.
Observations of fallibility in applications of modern programming methodologies
NASA Technical Reports Server (NTRS)
Gerhart, S. L.; Yelowitz, L.
1976-01-01
Errors, inconsistencies, or confusing points are noted in a variety of published algorithms, many of which are being used as examples in formulating or teaching principles of such modern programming methodologies as formal specification, systematic construction, and correctness proving. Common properties of these points of contention are abstracted. These properties are then used to pinpoint possible causes of the errors and to formulate general guidelines which might help to avoid further errors. The common characteristic of mathematical rigor and reasoning in these examples is noted, leading to some discussion about fallibility in mathematics, and its relationship to fallibility in these programming methodologies. The overriding goal is to cast a more realistic perspective on the methodologies, particularly with respect to older methodologies, such as testing, and to provide constructive recommendations for their improvement.
NASA Technical Reports Server (NTRS)
Culclasure, D. F.; Sigmon, J. L.; Carter, J. M.
1973-01-01
The activities of the NASA Biomedical Applications Team at Southwest Research Institute between 25 August 1972 and 15 November 1973 are reported. The program background and methodology are discussed, along with the technology applications and biomedical community impacts.
An Overview of R in Health Decision Sciences.
Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam
2017-10-01
As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. It is supported by a large community of users who have generated an extensive collection of well-documented packages and functions. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.
NASA Technical Reports Server (NTRS)
Onwubiko, Chinyere; Onyebueke, Landon
1996-01-01
This program report is the final report covering all the work done on this project. The goal of this project is technology transfer of methodologies to improve design process. The specific objectives are: 1. To learn and understand the Probabilistic design analysis using NESSUS. 2. To assign Design Projects to either undergraduate or graduate students on the application of NESSUS. 3. To integrate the application of NESSUS into some selected senior level courses in Civil and Mechanical Engineering curricula. 4. To develop courseware in Probabilistic Design methodology to be included in a graduate level Design Methodology course. 5. To study the relationship between the Probabilistic design methodology and Axiomatic design methodology.
Development of a North American paleoclimate pollen-based reconstruction database application
NASA Astrophysics Data System (ADS)
Ladd, Matthew; Mosher, Steven; Viau, Andre
2013-04-01
Recent efforts in synthesizing paleoclimate records across the globe have warranted an effort to standardize the different paleoclimate archives currently available in order to facilitate data-model comparisons and hence improve our estimates of future climate change. It is often the case that the methodology and programs make it challenging for other researchers to reproduce the results of a reconstruction; therefore, there is a need to standardize paleoclimate reconstruction databases in an application specific to proxy data. Here we present a methodology based on the open source R language and North American pollen databases (e.g., NAPD, NEOTOMA), in which the application can easily be used to perform new reconstructions and quickly analyze and output/plot the data. The application was developed to easily test methodological and spatial/temporal issues that might affect the reconstruction results, and it allows users to spend more time analyzing and interpreting results instead of on data management and processing. Some of the unique features of this R program are its two modules, each with a menu making the user feel at ease with the program; the ability to use different pollen sums; selection among 70 available climate variables; substitution of an appropriate modern climate dataset; a user-friendly regional target domain; temporal resolution criteria; linear interpolation; and many other features for a thorough exploratory data analysis. The application program will be available for North American pollen-based reconstructions and will eventually be made available as a package through the CRAN repository by late 2013.
Medically related activities of application team program
NASA Technical Reports Server (NTRS)
1971-01-01
Application team methodology identifies and specifies problems in technology transfer programs to biomedical areas through direct contact with users of aerospace technology. The availability of reengineering sources increases impact of the program on the medical community and results in broad scale application of some bioinstrumentation systems. Examples are given that include devices adapted to the rehabilitation of neuromuscular disorders, power sources for artificial organs, and automated monitoring and detection equipment in clinical medicine.
Tautin, J.; Lebreton, J.-D.; North, P.M.
1993-01-01
Capture-recapture methodology has advanced greatly in the last twenty years and is now a major factor driving the continuing evolution of the North American bird banding program. Bird banding studies are becoming more scientific, with improved study designs and analytical procedures. Researchers and managers are gaining more reliable knowledge, which in turn improves the conservation of migratory birds. The advances in capture-recapture methodology have benefited gamebird studies primarily, but nongame bird studies will benefit similarly as they expand greatly in the next decade. Further theoretical development of capture-recapture methodology should be encouraged, and, to maximize benefits of the methodology, work on practical applications should be increased.
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android statistics data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various topics in basic statistics along with parametric statistical data analysis. The output of the application is parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed using the Java programming language. The server-side programming language is PHP with the CodeIgniter framework, and the database used is MySQL. The system development methodology used is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and to make it easier for students to understand statistical analysis on mobile devices.
Biomedical application in space, pilot program in the southern California region
NASA Technical Reports Server (NTRS)
Kelton, A. A.
1979-01-01
A pilot program to promote utilization of the Shuttle/Spacelab for medical and biological research applied to terrestrial needs is presented. The program was limited to the Southern California region and consisted of the following five tasks: (1) preparation of educational materials; (2) identification of principal investigators; (3) initial contact and visit; (4) development of promising applications; and (5) evaluation of regional program methodology.
Acquisition: Navy Transition of Advanced Technology Programs to Military Applications
2003-02-04
The audit objective was to determine whether the Navy was successful in transitioning advanced technology projects to military applications...they relate to the audit objective. See Appendix A for a discussion of the audit scope and methodology, the review of the management control program, and prior coverage related to the audit objectives.
NASA Astrophysics Data System (ADS)
Michalik, Peter; Mital, Dusan; Zajac, Jozef; Brezikova, Katarina; Duplak, Jan; Hatala, Michal; Radchenko, Svetlana
2016-10-01
The article deals with the use of intelligent relays and PLC systems in practice, their architecture, and the principles of programming and simulation for the education process at all types of schools, from secondary schools to universities. The aim of the article is to propose simple example applications that demonstrate the programming methodology on real, simple practical examples and show the use of selected instructions. The practical part describes the process of creating schematics and defining function blocks, along with methodologies for creating programs and simulating output reactions to changing inputs on intelligent relays.
Indexing NASA programs for technology transfer methods development and feasibility
NASA Technical Reports Server (NTRS)
Clingman, W. H.
1972-01-01
This project was undertaken to evaluate the application of a previously developed indexing methodology to ongoing NASA programs. These programs are covered by the NASA Program Approval Documents (PADs). Each PAD contains a technical plan for the area it covers. It was proposed that these could be used to generate an index to the complete NASA program. To test this hypothesis, two PADs were selected by the NASA Technology Utilization Office for trial indexing. Twenty-five individuals indexed the two PADs using NASA Thesaurus terms. The results demonstrated the feasibility of indexing ongoing NASA programs using PADs as the source of information. The same indexing methodology could be applied to other documents containing a brief description of a technical plan. Results of this project showed that over 85% of the concepts in the technology should be covered by the indexing, and over 85% of the descriptors chosen would be accurate. This completeness and accuracy is considered satisfactory for application in technology transfer.
Analysis of the impact of safeguards criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mullen, M.F.; Reardon, P.T.
As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; there are important interactions between variables, that is, the effects of a given variable often depend on the level or value of some other variable, and with the methodology used in this study these interactions can be quantitatively analyzed; and reasonably good approximate prediction equations can be developed using the methodology described here.
A survey of application: genomics and genetic programming, a new frontier.
Khan, Mohammad Wahab; Alam, Mansaf
2012-08-01
The aim of this paper is to provide an introduction to the rapidly developing field of genetic programming (GP). Particular emphasis is placed on the application of GP to genomics. First, the basic methodology of GP is introduced. This is followed by a review of applications in the areas of gene network inference, gene expression data analysis, SNP analysis, epistasis analysis and gene annotation. Finally, the paper concludes by suggesting potential avenues of future research on genetic programming, opportunities to extend the technique, and areas for possible practical applications. Copyright © 2012 Elsevier Inc. All rights reserved.
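For readers new to GP, the following is a minimal, self-contained sketch of the basic loop such a survey introduces: a population of random arithmetic expression trees evolved toward a target function. It is illustrative only; genomics applications use far richer function sets and fitness measures.

```python
# Minimal genetic-programming sketch: evolve expression trees by mutation
# and truncation selection toward the target f(x) = x**2 + x.
import random, operator

OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}
TERMINALS = ['x', 1.0, 2.0]

def random_tree(depth=3):
    if depth <= 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return (random.choice(list(OPS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def fitness(tree):
    # Mean squared error against the target (lower is better).
    xs = [i / 4 for i in range(-8, 9)]
    return sum((evaluate(tree, x) - (x * x + x)) ** 2 for x in xs) / len(xs)

def mutate(tree, depth=3):
    # Replace a random subtree with a fresh random tree.
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return random_tree(depth)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left, depth - 1), right)
    return (op, left, mutate(right, depth - 1))

random.seed(0)
population = [random_tree() for _ in range(200)]
for _ in range(30):
    population.sort(key=fitness)
    survivors = population[:50]                      # truncation selection
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(150)]   # offspring via mutation
best = min(population, key=fitness)
print("best MSE:", fitness(best), "tree:", best)
```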
Nanosatellite and Plug-and-Play Architecture 2 (NAPA 2)
2017-02-28
potentially other militarily relevant roles. The "i-Missions" focus area studies the kinetics of rapid mission development. The methodology involves...the US and Sweden in the Nanosatellite and Plug-and-play Architecture or "NAPA" program) is to pioneer a methodology for creating mission-capable 6U...spacecraft. The methodology involves interchangeable blackbox (self-describing) components, software (middleware and applications), advanced
French, Michael T; Salomé, Helena J; Sindelar, Jody L; McLellan, A Thomas
2002-04-01
To provide detailed methodological guidelines for using the Drug Abuse Treatment Cost Analysis Program (DATCAP) and Addiction Severity Index (ASI) in a benefit-cost analysis of addiction treatment. A representative benefit-cost analysis of three outpatient programs was conducted to demonstrate the feasibility and value of the methodological guidelines. Procedures are outlined for using resource use and cost data collected with the DATCAP. Techniques are described for converting outcome measures from the ASI to economic (dollar) benefits of treatment. Finally, principles are advanced for conducting a benefit-cost analysis and a sensitivity analysis of the estimates. The DATCAP was administered at three outpatient drug-free programs in Philadelphia, PA, for 2 consecutive fiscal years (1996 and 1997). The ASI was administered to a sample of 178 treatment clients at treatment entry and at 7 months postadmission. The DATCAP and ASI appear to have significant potential for contributing to an economic evaluation of addiction treatment. The benefit-cost analysis and subsequent sensitivity analysis both showed that total economic benefit was greater than total economic cost at the three outpatient programs, but this representative application is meant to stimulate future economic research rather than to justify treatment per se. This study used previously validated, research-proven instruments and methods to perform a practical benefit-cost analysis of real-world treatment programs. The study demonstrates one way to combine economic and clinical data and offers a methodological foundation for future economic evaluations of addiction treatment.
ERIC Educational Resources Information Center
Simon, Charles W.
A major part of the Naval Training Equipment Center's Aviation Wide Angle Visual System (AWAVS) program involves behavioral research to provide a basis for establishing design criteria for flight trainers. As part of the task of defining the purpose and approach of this program, the applications of advanced experimental methods are explained and…
Application of a Dynamic Programming Algorithm for Weapon Target Assignment
2016-02-01
...optimisation techniques to support the decision-making process. This report documents the methodology used to identify, develop and assess a
NASA Technical Reports Server (NTRS)
Howard, R. A.; North, D. W.; Pezier, J. P.
1975-01-01
A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
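A minimal sketch of the rollback computation such a decision tree implies follows; the tree, probabilities, and values are hypothetical stand-ins, not the paper's Jupiter Orbiter numbers:

```python
# Decision-tree rollback: chance nodes average over outcomes, decision
# nodes pick the best branch. Structure and numbers are hypothetical.

def rollback(node):
    """Return the expected value of a nested decision/chance tree."""
    kind = node[0]
    if kind == 'leaf':                   # ('leaf', value)
        return node[1]
    if kind == 'chance':                 # ('chance', [(prob, subtree), ...])
        return sum(p * rollback(sub) for p, sub in node[1])
    if kind == 'decision':               # ('decision', {name: subtree})
        return max(rollback(sub) for sub in node[1].values())
    raise ValueError(kind)

tree = ('decision', {
    'sterilize lander': ('chance', [
        (0.999, ('leaf', 80.0)),         # mission value net of sterilization cost
        (0.001, ('leaf', -10000.0)),     # contamination despite sterilization
    ]),
    'no sterilization': ('chance', [
        (0.99, ('leaf', 100.0)),
        (0.01, ('leaf', -10000.0)),
    ]),
})

print("expected value of best strategy:", rollback(tree))
for name, sub in tree[1].items():
    print(name, "->", rollback(sub))
```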
A Methodology and a Web Platform for the Collaborative Development of Context-Aware Systems
Martín, David; López-de-Ipiña, Diego; Alzua-Sorzabal, Aurkene; Lamsfus, Carlos; Torres-Manzanera, Emilio
2013-01-01
Information and services personalization is essential for an optimal user experience. Systems have to be able to acquire data about the user's context, process them in order to identify the user's situation and finally, adapt the functionality of the system to that situation, but the development of context-aware systems is complex. Data coming from distributed and heterogeneous sources have to be acquired, processed and managed. Several programming frameworks have been proposed in order to simplify the development of context-aware systems. These frameworks offer high-level application programming interfaces for programmers that complicate the involvement of domain experts in the development life-cycle. The participation of users that do not have programming skills but are experts in the application domain can speed up and improve the development process of these kinds of systems. Apart from that, there is a lack of methodologies to guide the development process. This article presents as main contributions, the implementation and evaluation of a web platform and a methodology to collaboratively develop context-aware systems by programmers and domain experts. PMID:23666131
Fuzzy Linear Programming and its Application in Home Textile Firm
NASA Astrophysics Data System (ADS)
Vasant, P.; Ganesan, T.; Elamvazuthi, I.
2011-06-01
In this paper, a new fuzzy linear programming (FLP) based methodology using a specific membership function, termed the modified logistic membership function, is proposed. The modified logistic membership function is first formulated, and its flexibility in taking up vagueness in parameters is established by an analytical approach. The developed FLP methodology provides confidence in applying it to a real-life industrial production planning problem. This approach to solving industrial production planning problems allows feedback among the decision maker, the implementer, and the analyst.
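As a rough illustration of this kind of membership function (the exact form and constants are the paper's; the parameterization below is an assumption), a logistic curve can map a fuzzy cost parameter to a degree of satisfaction:

```python
# Sketch of a logistic-type fuzzy membership function; the normalization
# and default steepness below are assumptions, not the paper's constants.
import math

def logistic_membership(x, lo, hi, gamma=13.8):
    """Map x in [lo, hi] to a satisfaction degree in (0, 1).

    Near lo the degree approaches 1 (fully acceptable); near hi it
    approaches 0 (barely acceptable); gamma controls the vagueness.
    """
    t = (x - lo) / (hi - lo)          # normalize to [0, 1]
    return 1.0 / (1.0 + math.exp(gamma * (t - 0.5)))

# Example: a raw-material cost between a fully satisfactory 100 and a
# barely tolerable 160 (hypothetical values).
for cost in (100, 115, 130, 145, 160):
    print(cost, round(logistic_membership(cost, 100, 160), 3))
```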
A distorted-wave methodology for electron-ion impact excitation - Calculation for two-electron ions
NASA Technical Reports Server (NTRS)
Bhatia, A. K.; Temkin, A.
1977-01-01
A distorted-wave program is being developed for calculating the excitation of few-electron ions by electron impact. It uses the exchange approximation to represent the exact initial-state wavefunction in the T-matrix expression for the excitation amplitude. The program has been implemented for excitation of the 2^(1,3)S and 2^(1,3)P states of two-electron ions. Some of the astrophysical applications of these cross sections as well as the motivation and requirements of the calculational methodology are discussed.
Incorporation of lean methodology into pharmacy residency programs.
John, Natalie; Snider, Holly; Edgerton, Lisa; Whalin, Laurie
2017-03-15
The implementation of lean methodology into pharmacy residency programs at a community teaching hospital is described. New Hanover Regional Medical Center, a community teaching hospital in southeastern North Carolina, fully adopted a lean culture in 2010. Given the success of lean strategies organizationally, this methodology was used to assist with the evaluation and development of its pharmacy residency programs in 2014. Lean tools and activities have also been incorporated into residency requirements and rotation learning activities. The majority of lean events correspond to the required competency areas evaluating leadership and management, teaching, and education. These events have included participation in and facilitation of various lean problem-solving and communication tools. The application of the 4 rules of lean has resulted in enhanced management of the programs and provides a set of tools by which continual quality improvement can be ensured. Regular communication and direct involvement of all invested parties have been critical in developing and sustaining new improvements. In addition to program enhancements, lean methodology offers novel methods by which residents may be incorporated into leadership activities. The incorporation of lean methodology into pharmacy residency programs has translated into a variety of realized and potential benefits for the programs, the preceptors and residents, and the health system. Specific areas of growth have included quality-improvement processes, the expansion of leadership opportunities for residents, and improved communication among program directors, preceptors, and residents. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Methodology for 2012 Application Trends Survey
ERIC Educational Resources Information Center
Graduate Management Admission Council, 2012
2012-01-01
From early June to late July 2012, the Graduate Management Admission Council[R] (GMAC[R]) conducted the "Application Trends Survey", its annual survey of business school admission professionals worldwide to assess how application volume at MBA and other graduate management programs compared with that from the same period in 2011. This…
Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.
1989-01-01
The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory to provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized. The first program is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization and analytical sensitivities. The second program provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application the current activities associated with the Active Flexible Wing project are reviewed.
40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
.... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...
40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
.... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...
40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
.... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...
40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
.... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...
40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
.... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...
Education and Training in Ethical Decision Making: Comparing Context and Orientation
ERIC Educational Resources Information Center
Perri, David F.; Callanan, Gerard A.; Rotenberry, Paul F.; Oehlers, Peter F.
2009-01-01
Purpose: The purpose of this paper is to present a teaching methodology for improving the understanding of ethical decision making. This pedagogical approach is applicable in college courses and in corporate training programs. Design/methodology/approach: Participants are asked to analyze a set of eight ethical dilemmas with differing situational…
34 CFR 428.21 - What selection criteria does the Secretary use?
Code of Federal Regulations, 2013 CFR
2013-07-01
... individuals enrolled in vocational education programs, and how those needs should influence teaching strategies and program design. (ii) Understanding of bilingual vocational training methodologies. (iii... points) The Secretary reviews each application for an effective plan of management that ensures proper...
34 CFR 428.21 - What selection criteria does the Secretary use?
Code of Federal Regulations, 2014 CFR
2014-07-01
... individuals enrolled in vocational education programs, and how those needs should influence teaching strategies and program design. (ii) Understanding of bilingual vocational training methodologies. (iii... points) The Secretary reviews each application for an effective plan of management that ensures proper...
Peterman, Amber; Palermo, Tia M; Handa, Sudhanshu; Seidenfeld, David
2018-03-01
Social scientists have increasingly invested in understanding how to improve data quality and measurement of sensitive topics in household surveys. We utilize the technique of list randomization to collect measures of physical intimate partner violence in an experimental impact evaluation of the Government of Zambia's Child Grant Program. The Child Grant Program is an unconditional cash transfer, which targeted female caregivers of children under the age of 5 in rural areas to receive the equivalent of US $24 as a bimonthly stipend. The implementation results show that the list randomization methodology functioned as planned, with approximately 15% of the sample identifying 12-month prevalence of physical intimate partner violence. According to this measure, after 4 years, the program had no measurable effect on partner violence. List randomization is a promising approach to incorporate sensitive measures into multitopic evaluations; however, more research is needed to improve upon methodology for application to measurement of violence. Copyright © 2017 John Wiley & Sons, Ltd.
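The prevalence estimator behind list randomization is simply the difference in mean item counts between the arm that sees the sensitive item and the arm that does not. A simulated sketch (not the Zambia data):

```python
# Double-list experiment sketch: the control arm counts endorsements of
# innocuous items only; the treatment arm's list adds the sensitive item.
# All data here are simulated.
import random
import statistics

random.seed(1)
true_prevalence = 0.15
n_per_arm = 2000

def count_nonsensitive():
    # Each respondent endorses some of 4 innocuous items.
    return sum(random.random() < p for p in (0.6, 0.4, 0.5, 0.3))

control = [count_nonsensitive() for _ in range(n_per_arm)]
treatment = [count_nonsensitive() + (random.random() < true_prevalence)
             for _ in range(n_per_arm)]

estimate = statistics.mean(treatment) - statistics.mean(control)
print(f"estimated 12-month prevalence: {estimate:.3f}")
```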
Application and systems software in Ada: Development experiences
NASA Technical Reports Server (NTRS)
Kuschill, Jim
1986-01-01
In its most basic sense, software development involves describing the tasks to be solved, including the given objects and the operations to be performed on those objects. Unfortunately, the way people describe objects and operations usually bears little resemblance to source code in most contemporary computer languages. There are two ways around this problem. One is to allow users to describe what they want the computer to do in everyday, typically imprecise English. The PRODOC methodology and software development environment are based on a second, more flexible and possibly even easier-to-use approach. Rather than hiding program structure, PRODOC represents such structure graphically using visual programming techniques. In addition, the program terminology used in PRODOC may be customized so as to match the way human experts in any given application area naturally describe the relevant data and operations. The PRODOC methodology is described in detail.
ERIC Educational Resources Information Center
Office of Education (DHEW), Washington, DC.
A conference, held in Washington, D. C., in 1967 by the Association for Educational Data Systems and the U.S. Office of Education, attempted to lay the groundwork for an efficient automatic data processing training program for the Federal Government utilizing new instructional methodologies. The rapid growth of computer applications and computer…
Application of Real Options Theory to DoD Software Acquisitions
2009-02-20
...The traditional real options valuation methodology, when enhanced and properly formulated around a proposed or existing software investment employing the spiral development approach...
The SERI solar energy storage program
NASA Technical Reports Server (NTRS)
Copeland, R. J.; Wright, J. D.; Wyman, C. E.
1980-01-01
The SERI solar energy storage program provides research on advanced technologies, systems analyses, and assessments of thermal energy storage for solar applications in support of the Thermal and Chemical Energy Storage Program of the DOE Division of Energy Storage Systems. Currently, research is in progress on direct contact latent heat storage and thermochemical energy storage and transport. Systems analyses are being performed of thermal energy storage for solar thermal applications, and surveys and assessments are being prepared of thermal energy storage in solar applications. A ranking methodology for comparing thermal storage systems (performance and cost) is presented. Research in latent heat storage and thermochemical storage and transport is reported.
Developing Supply Chain Management Program: A Competency Model
ERIC Educational Resources Information Center
Sauber, Matthew H.; McSurely, Hugh B.; Tummala, V. M. Rao
2008-01-01
Purpose: This paper aims to show the process of designing and measuring learning competencies in program development. Design/methodology/approach: The paper includes cross-sectoral comparisons to draw on programmatic and pedagogical strategies, more commonly utilized in vocational education, and transfer the application of these strategies into…
Research Management--Of What Nature Is the Concept?
ERIC Educational Resources Information Center
Cook, Desmond L.
Research management is defined as the application of both management and management science to a particular field of research and development activities. Seven components of research management include theory and methodology; the planning, implementation, and evaluation of research programs; communications; utilization; and special applications.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaminsky, J.; Tschanz, J.F.
In order to address barriers to community energy-conservation efforts, DOE has established the Comprehensive Community Energy Management (CCEM) program. The role of CCEM is to provide direction and technical support for energy-conservation efforts at the local level. The program to date has included project efforts to develop combinations and variations of community energy planning and management tools applicable to communities of diverse characteristics. This paper describes the salient features of some of the tools and relates them to the testing program soon to begin in several pilot-study communities. Two methodologies that arose within such an actual planning context are taken from DOE-sponsored projects in Clarksburg, West Virginia, and the proposed new capital city for Alaska. Energy management in smaller communities and/or communities with limited funding and manpower resources has received special attention. One project of this type developed a general methodology that emphasizes efficient ways for small communities to reach agreement on local energy problems and potential solutions; by this guidance, the community is led to understand where it should concentrate its efforts in subsequent management activities. Another project concerns rapid growth of either a new or an existing community that could easily outstrip the management resources available locally. This methodology strives to enable the community to seize the opportunity for energy conservation through integrating the design of its energy systems and its development pattern. The last methodology creates applicable tools for comprehensive community energy planning. (MCW)
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed in design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
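A toy Monte Carlo sketch of the PFA idea, propagating parameter and modeling-accuracy uncertainty through a failure model into a failure probability, follows; the stress/strength model and distributions are hypothetical stand-ins for the methodology's engineering analysis models:

```python
# Illustrative Monte Carlo: sample uncertain model parameters and a
# modeling-accuracy factor, count failures. Numbers are hypothetical.
import random

random.seed(0)
N = 100_000
failures = 0
for _ in range(N):
    strength = random.lognormvariate(5.0, 0.08)  # component capability
    stress = random.lognormvariate(4.7, 0.12)    # applied load
    model_error = random.gauss(1.0, 0.05)        # modeling-accuracy factor
    if stress * model_error > strength:
        failures += 1

print(f"prior failure probability estimate: {failures / N:.4f}")
# In PFA, this analysis-based estimate would then be updated with test
# and flight experience via the methodology's statistical procedures.
```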
ERIC Educational Resources Information Center
Hughes, Carolyn; Agran, Martin
1993-01-01
This literature review examines the effects of self-instructional programs on increasing independence of persons with moderate/severe mental retardation in integrated environments. The article discusses methodological issues, research needs, and recommendations for program implementation. The feasibility of using self-instruction to promote…
The Application of Cost-Benefit Analysis in Manpower Area.
ERIC Educational Resources Information Center
Barsby, Steven L.
The relative efficiency of various manpower programs as seen through cost-benefit analysis is assessed, and the contribution that cost-benefit analysis has made in evaluating manpower programs is discussed, taking into account a variety of methodologies presented in different studies. Vocational rehabilitation appears to yield the highest…
Selected Tether Applications Cost Model
NASA Technical Reports Server (NTRS)
Keeley, Michael G.
1988-01-01
Diverse cost-estimating techniques and data combined into single program. Selected Tether Applications Cost Model (STACOM 1.0) is interactive accounting software tool providing means for combining several independent cost-estimating programs into fully-integrated mathematical model capable of assessing costs, analyzing benefits, providing file-handling utilities, and putting out information in text and graphical forms to screen, printer, or plotter. Program based on Lotus 1-2-3, version 2.0. Developed to provide clear, concise traceability and visibility into methodology and rationale for estimating costs and benefits of operations of Space Station tether deployer system.
ERIC Educational Resources Information Center
Gaven, Patricia; Williams, R. David
An experiment is proposed which will study the advantages of satellite technology as a means for the standardization of teaching methodology in an attempt to socially integrate the rural Alaskan native. With "Man: A Course of Study" as the curricular base of the experiment, there will be a Library Experiment Program for Adults using…
Project MAC Progress Report 11
1974-12-01
...whether a subroutine would be useful as a part of some larger program, and, if so, how to use it [8]. The programming methodology employed by CALICO...
Determining Training Device Requirements in Army Aviation Systems
NASA Technical Reports Server (NTRS)
Poumade, M. L.
1984-01-01
A decision-making methodology which applies the systems approach to the training problem is discussed. Training is viewed as a total system instead of a collection of individual devices and unrelated techniques. The core of the methodology is the use of optimization techniques, such as the transportation algorithm and multiobjective goal programming, with training-task and training-device specific data. The role of computers, especially automated data bases and computer simulation models, in the development of training programs is also discussed. The approach can provide significant training enhancement and cost savings over the more traditional, intuitive form of training development and device requirements process. While given from an aviation perspective, the methodology is equally applicable to other training development efforts.
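The transportation-algorithm step mentioned in the abstract can be posed as a small linear program, as in this sketch with hypothetical devices, tasks, and costs (assumes NumPy and SciPy are available):

```python
# Transportation problem: allocate training-device hours to training tasks
# at minimum cost. Devices, tasks, and costs are hypothetical.
import numpy as np
from scipy.optimize import linprog

# cost[i][j]: cost per training hour of device i used for task j
cost = np.array([[4.0, 6.0, 9.0],
                 [5.0, 4.0, 7.0]])
supply = [120.0, 100.0]      # available hours per device
demand = [80.0, 70.0, 60.0]  # required hours per task

n_dev, n_task = cost.shape
c = cost.ravel()             # variables x[i*n_task + j]

# Each device can give out at most its supply of hours.
A_ub = np.zeros((n_dev, n_dev * n_task))
for i in range(n_dev):
    A_ub[i, i * n_task:(i + 1) * n_task] = 1.0

# Each task must receive exactly its required hours.
A_eq = np.zeros((n_task, n_dev * n_task))
for j in range(n_task):
    A_eq[j, j::n_task] = 1.0

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand,
              bounds=[(0, None)] * (n_dev * n_task))
print("total cost:", res.fun)
print("hours assigned:\n", res.x.reshape(n_dev, n_task))
```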
Parallel Programming Strategies for Irregular Adaptive Applications
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Biegel, Bryan (Technical Monitor)
2001-01-01
Achieving scalable performance for dynamic irregular applications is eminently challenging. Traditional message-passing approaches have been making steady progress towards this goal; however, they suffer from complex implementation requirements. The use of a global address space greatly simplifies the programming task, but can degrade the performance for such computations. In this work, we examine two typical irregular adaptive applications, Dynamic Remeshing and N-Body, under competing programming methodologies and across various parallel architectures. The Dynamic Remeshing application simulates flow over an airfoil, and refines localized regions of the underlying unstructured mesh. The N-Body experiment models two neighboring Plummer galaxies that are about to undergo a merger. Both problems demonstrate dramatic changes in processor workloads and interprocessor communication with time; thus, dynamic load balancing is a required component.
Teaching Environmental Education through PBL: Evaluation of a Teaching Intervention Program
NASA Astrophysics Data System (ADS)
Vasconcelos, Clara
2012-04-01
If our chosen aim in science education is to be inclusive and to improve students' learning achievements, then we must identify teaching methodologies that are appropriate for teaching and learning specific knowledge. Karagiorgi and Symeou (2005) remind us that instructional designers are thus challenged to translate the philosophy of constructivism into current practice. Thus, research in science education must focus on evaluating intervention programs which ensure the effective construction of knowledge and development of competencies. The present study reports the elaboration, application and evaluation of a problem-based learning (PBL) program with the aim of examining its effectiveness with students learning Environmental Education. Prior research on both PBL and Environmental Education (EE) was conducted within the context of science education so as to elaborate and construct the intervention program. Findings from these studies indicated that both the PBL methodology and EE are helpful for teachers and students. The PBL methodology was adopted in this study since it logically incorporates the application of a constructivist philosophy (Hendry et al. 1999), and it was expected that this approach would assist students towards achieving a specific set of competencies (Engel 1997). On the other hand, EE has evolved at a rapid pace within many countries in the new millennium (Hart 2007), unlike any other educational area. However, many authors still appear to believe that schools are failing to prepare students adequately in EE (Walsche 2008; Winter 2007). The following section describes the research that was conducted in both areas so as to devise the intervention program.
42 CFR 414.900 - Basis and scope.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Payment for Drugs and Biologicals... of the Act and outlines two payment methodologies applicable to drugs and biologicals covered under...
42 CFR 414.900 - Basis and scope.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Payment for Drugs and Biologicals... of the Act and outlines two payment methodologies applicable to drugs and biologicals covered under...
49 CFR 385.3 - Definitions and acronyms.
Code of Federal Regulations, 2010 CFR
2010-10-01
... applicable FMCSRs. RSPA means the Research and Special Programs Administration. Safety fitness determination... the Safety Fitness Rating Methodology (SFRM) set forth in Appendix B to this part and based on the...
Moore, J H
1995-06-01
A genetic algorithm for instrumentation control and optimization was developed using the LabVIEW graphical programming environment. The usefulness of this methodology for the optimization of a closed-loop control instrument is demonstrated with minimal complexity, and the programming is presented in detail to facilitate its adaptation to other LabVIEW applications. Closed-loop control instruments have a variety of applications in the biomedical sciences, including the regulation of physiological processes such as blood pressure. The program presented here should provide a useful starting point for those wishing to incorporate genetic algorithm approaches to LabVIEW-mediated optimization of closed-loop control instruments.
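Since LabVIEW is graphical, the paper's block diagrams cannot be reproduced here, but a Python stand-in conveys the loop: a genetic algorithm tuning a controller gain against a simulated closed-loop process (the process model and fitness below are hypothetical, not the paper's instrument):

```python
# GA tuning a proportional-controller gain on a simulated first-order
# process y' = -y + u. Process and parameters are hypothetical.
import random

def closed_loop_error(gain, setpoint=1.0, steps=50, dt=0.1):
    """Integrated absolute error of the closed loop under a given gain."""
    y, err_sum = 0.0, 0.0
    for _ in range(steps):
        u = gain * (setpoint - y)   # proportional control action
        y += dt * (-y + u)          # Euler step of the process
        err_sum += abs(setpoint - y) * dt
    return err_sum

random.seed(0)
pop = [random.uniform(0.0, 15.0) for _ in range(30)]
for _ in range(40):
    pop.sort(key=closed_loop_error)
    parents = pop[:10]
    # Offspring: blend two parents, then mutate.
    pop = parents + [0.5 * (random.choice(parents) + random.choice(parents))
                     + random.gauss(0.0, 0.5) for _ in range(20)]
best = min(pop, key=closed_loop_error)
print(f"best gain: {best:.2f}, IAE: {closed_loop_error(best):.3f}")
```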
Perceptions of Bachelor-Degree Graduates Regarding General Education Program Quality
ERIC Educational Resources Information Center
Bittinger, Sara-Beth
2017-01-01
This study was directed by a modified Delphi-methodology design to gain perspective of the perceptions of alumni regarding the value and applicability of the general education program. The expert-panel participants were 14 alumni of Frostburg State University from various majors, representative of all three colleges, who graduated between 2006 and…
Using "Kaizen" to Improve Graduate Business School Degree Programs
ERIC Educational Resources Information Center
Emiliani, M. L.
2005-01-01
Purpose: To illustrate the applicability of "kaizen" in higher education. Design/methodology/approach: "Kaizen" process was used for ten courses contained in a part-time executive MS degree program in management. Findings: "Kaizen" was found to be an effective process for improving graduate business school courses and the value proposition for…
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper will present an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.
Development of flight experiment task requirements. Volume 1: Summary
NASA Technical Reports Server (NTRS)
Hatterick, G. R.
1972-01-01
A study was conducted to develop the means to identify skills required of scientist passengers on advanced missions related to the space shuttle and RAM programs. The scope of the study was defined to include only the activities of on-orbit personnel which are directly related to, or required by, on-orbit experimentation and scientific investigations conducted on or supported by the shuttle orbiter. A program summary is presented which provides a description of the methodology developed, an overview of the activities performed during the study, and the results obtained through application of the methodology.
Epistasis analysis using artificial intelligence.
Moore, Jason H; Hill, Doug P
2015-01-01
Here we introduce an artificial intelligence (AI) methodology for detecting and characterizing epistasis in genetic association studies. The ultimate goal of our AI strategy is to analyze genome-wide genetics data as a human would, using sources of expert knowledge as a guide. The methodology presented here is based on computational evolution, which is a type of genetic programming. The ability to generate interesting solutions while at the same time learning how to solve the problem at hand distinguishes computational evolution from other genetic programming approaches. We provide a general overview of this approach and then present a few examples of its application to real data.
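The toy sketch below conveys only the flavor of evolving classifiers over genotype data: candidate operators are scored against a synthetic XOR-like epistatic pattern. Real computational evolution evolves full expression trees and its own variation operators, which this deliberately omits; all data and names are invented.

```python
# Toy "genetic programming" sketch for an epistatic (XOR-like) pattern.
import random

OPS = [lambda a, b: a and b, lambda a, b: a or b, lambda a, b: a != b]
OP_NAMES = ["AND", "OR", "XOR"]

# Synthetic data: disease status follows an XOR of two SNP risk genotypes.
data = [((s1, s2), s1 != s2)
        for s1 in (0, 1) for s2 in (0, 1) for _ in range(25)]

def accuracy(op):
    return sum(op(s1, s2) == status for (s1, s2), status in data) / len(data)

pop = [random.randrange(3) for _ in range(10)]       # population of operator indices
for _ in range(20):
    pop.sort(key=lambda i: -accuracy(OPS[i]))        # rank by classification accuracy
    pop = pop[:5] + [random.randrange(3) for _ in range(5)]  # elitism + fresh blood

best = pop[0]
print("best operator:", OP_NAMES[best], "accuracy:", accuracy(OPS[best]))
```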
Design of experiments enhanced statistical process control for wind tunnel check standard testing
NASA Astrophysics Data System (ADS)
Phillips, Ben D.
The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification, and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients, and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research is applicable to any wind tunnel check standard testing program.
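A minimal sketch of the coefficient-tracking step, assuming a synthetic history of one regression coefficient: an individuals chart with moving-range control limits, which is one common SPC choice, not necessarily the exact chart used at Langley.

```python
# Shewhart individuals chart for a tracked regression coefficient (synthetic data).
import statistics

coef_history = [0.102, 0.099, 0.101, 0.098, 0.103, 0.100, 0.097, 0.104]

mean = statistics.mean(coef_history)
# Moving-range estimate of short-term sigma (d2 = 1.128 for subgroups of 2)
mr = [abs(a - b) for a, b in zip(coef_history[1:], coef_history[:-1])]
sigma = statistics.mean(mr) / 1.128

ucl, lcl = mean + 3 * sigma, mean - 3 * sigma
for i, c in enumerate(coef_history):
    flag = "OUT" if not (lcl <= c <= ucl) else "ok"
    print(f"run {i}: coef={c:.3f} [{flag}]  limits=({lcl:.3f}, {ucl:.3f})")
```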
Rodríguez, J; Premier, G C; Dinsdale, R; Guwy, A J
2009-01-01
Mathematical modelling in environmental biotechnology has traditionally been a difficult resource to access for researchers and students without programming expertise. The degree of flexibility that research applications require from model implementation platforms restricts their use to expert programmers, while more user-friendly software packages do not normally incorporate the flexibility needed for most research applications. This work presents a methodology based on Excel and Matlab-Simulink for both flexible and accessible implementation of mathematical models by researchers with and without programming expertise. The models are almost fully defined in an Excel file, in which the names and values of the state variables and parameters are easily created. This information is automatically processed in Matlab to create the model structure, and almost immediate model simulation is possible after only a minimal Matlab code definition. The framework proposed also provides programming-expert researchers with a highly flexible and modifiable platform on which to base more complex model implementations. The method takes advantage of structural generalities in most mathematical models of environmental bioprocesses while enabling the integration of advanced elements (e.g. heuristic functions, correlations). The methodology has already been successfully used in a number of research studies.
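The spirit of the approach, declaring model structure as data and assembling the simulation automatically, can be sketched in Python standing in for the Excel/Matlab pair; the Monod-kinetics model and all names here are placeholders, not the paper's actual model.

```python
# "Excel sheet" stand-in: state variables and parameters declared as data,
# with the simulation assembled automatically (hedged sketch).
import numpy as np
from scipy.integrate import solve_ivp

states = {"S": 10.0, "X": 0.5}                  # substrate, biomass
params = {"mu_max": 0.4, "Ks": 2.0, "Y": 0.5}   # Monod kinetic parameters

def rhs(t, y, p):
    S, X = y
    mu = p["mu_max"] * S / (p["Ks"] + S)        # Monod growth rate
    return [-mu * X / p["Y"], mu * X]           # dS/dt, dX/dt

sol = solve_ivp(rhs, (0, 24), list(states.values()), args=(params,))
print(dict(zip(states, sol.y[:, -1])))          # final state values
```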
A methodology for comprehensive strategic planning and program prioritization
NASA Astrophysics Data System (ADS)
Raczynski, Christopher Michael
2008-10-01
The process developed in this work, Strategy Optimization for the Allocation of Resources (SOAR), is a strategic planning methodology based on Integrated Product and Process Development and systems engineering techniques. Utilizing a top-down approach, the process starts with the creation of the organization vision and its measures of effectiveness. These measures are prioritized based on their application to external world scenarios which will frame the future. The programs which will be used to accomplish this vision are identified by decomposing the problem. Information is gathered on the programs as to their application, cost, schedule, risk, and other pertinent characteristics. The relationships between the levels of the hierarchy are mapped utilizing subject matter experts. These connections are then used to determine the overall benefit of the programs to the vision of the organization. Through a Multi-Objective Genetic Algorithm (MOGA), a tradespace of potential program portfolios can be created among which the decision maker can allocate resources. The information and portfolios are presented to the decision maker through a Decision Support System (DSS) which collects and visualizes all the data in a single location. This methodology was tested on a science and technology planning exercise conducted by the United States Navy. A thorough decomposition was defined and technology programs were identified which had the potential to provide benefit to the vision. The prioritization of the top-level capabilities was performed through a rank-ordering scheme, and a previous naval application was used to demonstrate a cumulative voting scheme. Voting was performed utilizing the Nominal Group Technique to capture the relationships between the levels of the hierarchy. Interrelationships between the technologies were identified, a MOGA was utilized to optimize portfolios with respect to these constraints, and the resulting information was placed in a DSS. This formulation allowed the decision makers to assess which portfolio could provide the greatest benefit to the Navy while still fitting within the funding profile.
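A compact sketch of the MOGA portfolio idea: bit-string portfolios, two objectives (benefit up, cost down), a budget constraint, and a non-dominated front reported at the end. Benefits, costs, and the budget are invented numbers, and the GA is deliberately bare-bones compared with SOAR's.

```python
# Hedged multi-objective GA sketch over program portfolios.
import random

benefit = [9, 7, 6, 8, 4, 5, 3, 7]   # benefit of each program to the vision
cost    = [5, 4, 3, 6, 2, 3, 1, 5]   # cost of each program
BUDGET = 15

def objectives(bits):
    b = sum(v for v, s in zip(benefit, bits) if s)
    c = sum(v for v, s in zip(cost, bits) if s)
    return b, c                       # maximize b, minimize c

def feasible(bits):
    return sum(v for v, s in zip(cost, bits) if s) <= BUDGET

def dominates(p, q):
    (bp, cp), (bq, cq) = objectives(p), objectives(q)
    return bp >= bq and cp <= cq and (bp > bq or cp < cq)

pop = [[random.randint(0, 1) for _ in benefit] for _ in range(40)]
for _ in range(60):
    a, b = random.sample(pop, 2)
    cut = random.randrange(1, len(benefit))          # one-point crossover
    child = a[:cut] + b[cut:]
    child[random.randrange(len(child))] ^= 1         # bit-flip mutation
    if feasible(child):
        pop.append(child)
        pop.pop(0)                                   # steady-state replacement

# Report the feasible non-dominated front found.
front = [p for p in pop if feasible(p)
         and not any(dominates(q, p) for q in pop if feasible(q))]
for p in front:
    print(p, objectives(p))
```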
Warehouses information system design and development
NASA Astrophysics Data System (ADS)
Darajatun, R. A.; Sukanta
2017-12-01
The materials/goods handling function is fundamental for companies to ensure the smooth running of their warehouses, and efficiency and organization within every aspect of the business are essential to gain a competitive advantage. The purpose of this research is the design and development of a Kanban-based inventory storage and delivery system. The application aims to make inventory stock checks more efficient and effective. Users can easily record finished goods from the production department, warehouse, customers, and suppliers. The master data are designed to be as complete as possible so that the application can handle a variety of warehouse logistics processes. The author uses the Java programming language to develop the application, which is built as a Java Web application, while the database used is MySQL. The system development methodology used is the Waterfall methodology, which comprises the stages of Analysis, System Design, Implementation, Integration, and Operation and Maintenance. Data were collected through observation, interviews, and literature review.
NASA Technical Reports Server (NTRS)
Barr, B. G.; Martinko, E. A.
1976-01-01
Activities of the Kansas Applied Remote Sensing Program (KARS) designed to establish interactions on cooperative projects with decision makers in Kansas agencies in the development and application of remote sensing procedures are reported. Cooperative demonstration projects undertaken with several different agencies involved three principal areas of effort: Wildlife Habitat and Environmental Analysis; Urban and Regional Analysis; and Agricultural and Rural Analysis. These projects were designed to concentrate remote sensing concepts and methodologies on existing agency problems to ensure the continued relevancy of the program and maximize the possibility for immediate operational use. Completed projects are briefly discussed.
NASA Technical Reports Server (NTRS)
Rowell, Lawrence F.; Davis, John S.
1989-01-01
The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers that face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.
Tavakoli, Ali; Nikoo, Mohammad Reza; Kerachian, Reza; Soltani, Maryam
2015-04-01
In this paper, a new fuzzy methodology is developed to optimize water and waste load allocation (WWLA) in rivers under uncertainty. An interactive two-stage stochastic fuzzy programming (ITSFP) method is utilized to handle parameter uncertainties, which are expressed as fuzzy boundary intervals. An iterative linear programming (ILP) is also used for solving the nonlinear optimization model. To accurately consider the impacts of the water and waste load allocation strategies on the river water quality, a calibrated QUAL2Kw model is linked with the WWLA optimization model. The soil, water, atmosphere, and plant (SWAP) simulation model is utilized to determine the quantity and quality of each agricultural return flow. To control pollution loads of agricultural networks, it is assumed that a part of each agricultural return flow can be diverted to an evaporation pond and also another part of it can be stored in a detention pond. In detention ponds, contaminated water is exposed to solar radiation for disinfecting pathogens. Results of applying the proposed methodology to the Dez River system in the southwestern region of Iran illustrate its effectiveness and applicability for water and waste load allocation in rivers. In the planning phase, this methodology can be used for estimating the capacities of return flow diversion system and evaporation and detention ponds.
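The iterative-LP idea can be sketched as follows: the nonlinear river response is frozen at the current allocation, the resulting LP is solved, and the loop repeats to convergence. All coefficients are invented; the actual study couples the optimization to the calibrated QUAL2Kw model rather than the toy response used here.

```python
# Hedged sketch of iterative linear programming for waste load allocation.
import numpy as np
from scipy.optimize import linprog

cost = np.array([1.0, 2.0])          # unit cost of treating each discharge
load = np.array([10.0, 8.0])         # raw pollutant loads

x = np.array([0.5, 0.5])             # initial treatment fractions
for it in range(20):
    # Stand-in for the water-quality simulator: transfer coefficients that
    # depend weakly (nonlinearly) on the current allocation.
    a = 1.0 / (1.0 + 0.3 * x)
    # LP: minimize cost s.t. remaining load meets the standard:
    #   sum_i a_i*load_i*(1 - x_i) <= 6.0,  0 <= x <= 1
    res = linprog(cost, A_ub=[-(a * load)], b_ub=[6.0 - (a * load).sum()],
                  bounds=[(0, 1), (0, 1)])
    if np.allclose(res.x, x, atol=1e-6):
        break                        # allocation stabilized
    x = res.x

print("treatment fractions:", x)
```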
NASA-Ames workload research program
NASA Technical Reports Server (NTRS)
Hart, Sandra
1988-01-01
Research has been underway for several years to develop valid and reliable measures and predictors of workload as a function of operator state, task requirements, and system resources. Although the initial focus of this research was on aeronautics, the underlying principles and methodologies are equally applicable to space, and provide a set of tools that NASA and its contractors can use to evaluate design alternatives from the perspective of the astronauts. Objectives and approach of the research program are described, as well as the resources used in conducting research and the conceptual framework around which the program evolved. Next, standardized tasks are described, in addition to predictive models and assessment techniques and their application to the space program. Finally, some of the operational applications of these tasks and measures are reviewed.
Wendel, Jeanne; Dumitras, Diana
2005-06-01
This paper describes an analytical methodology for obtaining statistically unbiased outcome estimates for programs in which participation decisions may be correlated with variables that impact outcomes. This methodology is particularly useful for intraorganizational program evaluations conducted for business purposes. In this situation, data are likely to be available for a population of managed care members who are eligible to participate in a disease management (DM) program, with some electing to participate while others eschew the opportunity. The most pragmatic analytical strategy for in-house evaluation of such programs is likely to be the pre-intervention/post-intervention design in which the control group consists of people who were invited to participate in the DM program but declined the invitation. Regression estimates of program impacts may be statistically biased if factors that impact participation decisions are correlated with outcome measures. This paper describes an econometric procedure, the Treatment Effects model, developed to produce statistically unbiased estimates of program impacts in this type of situation. Two equations are estimated to (a) estimate the impacts of patient characteristics on decisions to participate in the program, and then (b) use this information to produce a statistically unbiased estimate of the impact of program participation on outcomes. This methodology is well established in economics and econometrics but has not been widely applied in the DM outcomes measurement literature; hence, this paper focuses on one illustrative application.
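A hedged two-step sketch of such a Treatment Effects estimation on synthetic data: a probit participation equation followed by an outcome regression augmented with the selection-hazard (inverse Mills ratio) term. Variable names and data are invented; production work would typically use a packaged full-maximum-likelihood estimator.

```python
# Two-step Treatment Effects sketch on synthetic data.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)                    # patient characteristic
z = rng.normal(size=n)                    # variable shifting enrollment only
u = rng.normal(size=n)                    # shared unobservable
d = (0.5 * x + 1.0 * z + u > 0).astype(float)                 # participation
y = 1.0 + 0.8 * x + 0.5 * d + 0.6 * u + rng.normal(size=n)    # outcome

# Step 1: probit for the participation decision.
probit = sm.Probit(d, sm.add_constant(np.column_stack([x, z]))).fit(disp=0)
xb = probit.fittedvalues                  # linear predictor
# Hazard term differs for participants vs. decliners.
lam = np.where(d == 1, norm.pdf(xb) / norm.cdf(xb),
               -norm.pdf(xb) / (1 - norm.cdf(xb)))

# Step 2: outcome regression including the hazard term.
X = sm.add_constant(np.column_stack([x, d, lam]))
ols = sm.OLS(y, X).fit()
print(ols.params)   # coefficient on d approximates the unbiased program effect
```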
Waste treatability guidance program. User's guide. Revision 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toth, C.
1995-12-21
DOE sites across the country generate and manage radioactive, hazardous, mixed, and sanitary wastes. It is necessary for each site to find the technologies and associated capacities required to manage its waste. One role of the DOE HQ Office of Environmental Restoration and Waste Management is to facilitate the integration of the site-specific plans into coherent national plans. DOE has developed a standard methodology for defining and categorizing waste streams into treatability groups based on characteristic parameters that influence waste management technology needs. This Waste Treatability Guidance Program automates the Guidance Document for the categorization of waste information into treatability groups; this application provides a consistent implementation of the methodology across the National TRU Program. This User's Guide provides instructions on how to use the program, including installation instructions and program operation. This document satisfies the requirements of the Software Quality Assurance Plan.
T-H-A-T-S: timber-harvesting-and-transport-simulator: with subroutines for Appalachian logging
A. Jeff Martin
1975-01-01
A computer program for simulating harvesting operations is presented. Written in FORTRAN IV, the program contains subroutines that were developed for Appalachian logging conditions. However, with appropriate modifications, the simulator would be applicable for most logging operations and locations. The details of model development and its methodology are presented,...
The Job Training Partnership Act and Computer-Assisted Instruction. Research Report 88-13.
ERIC Educational Resources Information Center
Education Turnkey Systems, Inc., Falls Church, VA.
A study sought to (1) determine the current and potential instructional application of computers in Job Training Partnership Act (JTPA) Titles II, III, and IV programs; and (2) present policy options that would increase the effective use of this technology in employment and training programs. Research methodology involved conducting an assessment…
ERIC Educational Resources Information Center
Gore, Peter H.; And Others
Design, application, and interpretation of a three-tiered sampling framework as a strategy for eliciting public participation in planning and program implementation is presented, with emphasis on implications for federal programs which mandate citizen participation (for example, Level B planning of Water Resources Planning Act, Federal Water…
NASA Astrophysics Data System (ADS)
Wu, Q. H.; Ma, J. T.
1993-09-01
A primary investigation into the application of genetic algorithms in optimal reactive power dispatch and voltage control is presented. The application was achieved, based on the (United Kingdom) National Grid 48-bus network model, using a novel genetic search approach. Simulation results, compared with those obtained using nonlinear programming methods, are included to show the potential of the genetic search methodology in economical and secure power system operations.
Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Kula, K.R.; Paik, I.K.; Chung, D.Y.
1996-12-31
Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.
A finite element program for postbuckling calculations (PSTBKL)
NASA Technical Reports Server (NTRS)
Simitses, G. T.; Carlson, R. L.; Riff, R.
1991-01-01
The object of the research reported herein was to develop a general mathematical model and solution methodologies for analyzing the structural response of thin, metallic shell structures under large transient, cyclic, or static thermomechanical loads. This report describes the computer program resulting from the research. Among the system responses associated with these loads and conditions are thermal buckling, creep buckling, and ratcheting. Thus geometric and material nonlinearities (of high order) have been anticipated and are considered in developing the mathematical model. The methodology is demonstrated through different problems of extension, shear, and of planar curved beams. Moreover, the importance of including large strains is clearly demonstrated through the chosen applications.
1990-01-31
and marketing. ...nological base for the applications, algorithms, and software aspects of neurocomputing. The PYGMALION will lead to the coordination of... this by realizing prototypes in two fields systems, system programming, methodology for system of speech application that represent market sectors of... customer relationships. These demonstrations will be developed and assessed in... throughout the internal market by facilitating product close co...
Designing application software in wide area network settings
NASA Technical Reports Server (NTRS)
Makpangou, Mesaac; Birman, Ken
1990-01-01
Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.
NASA Technical Reports Server (NTRS)
Palguta, T.; Bradley, W.; Stockton, T.
1988-01-01
Supportability issues for the 1.8 meter centrifuge in the Life Science Research Facility are addressed. The analysis focuses on reliability and maintainability and the potential impact on supportability and affordability. Standard logistics engineering methodologies that will be applied to all Office of Space Science and Applications' (OSSA) payload programs are outlined. These methodologies are applied to the 1.8 meter centrifuge.
A programing system for research and applications in structural optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Rogers, J. L., Jr.
1981-01-01
The paper describes a computer programming system designed to be used for methodology research as well as applications in structural optimization. The flexibility necessary for such diverse utilizations is achieved by combining, in a modular manner, a state-of-the-art optimization program, a production-level structural analysis program, and user-supplied, problem-dependent interface programs. Standard utility capabilities existing in modern computer operating systems are used to integrate these programs. This approach results in flexibility of the optimization procedure organization and versatility in the formulation of constraints and design variables. Features shown in numerical examples include: (1) variability of structural layout and overall shape geometry, (2) static strength and stiffness constraints, (3) local buckling failure, and (4) vibration constraints. The paper concludes with a review of further development trends for this programming system.
André, Francisco J; Cardenete, M Alejandro; Romero, Carlos
2009-05-01
Economic policy needs to pay increasing attention to environmental issues, which requires the development of methodologies able to incorporate environmental, as well as macroeconomic, goals in the design of public policies. Starting from this observation, this article proposes a methodology, based upon a Simonian satisficing logic made operational with the help of goal programming (GP) models, to address the joint design of macroeconomic and environmental policies. The methodology is applied to the Spanish economy, where a joint policy is elicited taking into consideration macroeconomic goals (economic growth, inflation, unemployment, public deficit) and environmental goals (CO2, NOx, and SOx emissions) within the context of a computable general equilibrium model. The results show how the government can "fine-tune" its policy according to different criteria using GP models. The resulting policies aggregate the environmental and the economic goals in different ways: maximum aggregate performance, maximum balance, and a lexicographic hierarchy of the goals.
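A minimal weighted goal-programming sketch: deviation variables measure the shortfall or overshoot of each goal, and the unwanted deviations are minimized with a linear program. The two policy instruments and goal coefficients below are invented, not taken from the Spanish CGE model.

```python
# Weighted goal programming via linprog (hedged sketch).
from scipy.optimize import linprog

# Decision vector: x = (instr1, instr2, d1-, d1+, d2-, d2+)
# Goal 1 (growth):    2*x1 + 1*x2 + d1- - d1+ = 4   (target 4, shortfall bad)
# Goal 2 (emissions): 1*x1 + 3*x2 + d2- - d2+ = 6   (target 6, overshoot bad)
A_eq = [[2, 1, 1, -1, 0, 0],
        [1, 3, 0, 0, 1, -1]]
b_eq = [4, 6]

# Penalize under-achieving growth (d1-) and over-shooting emissions (d2+).
c = [0, 0, 1.0, 0, 0, 1.5]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
print("instruments:", res.x[:2], "deviations:", res.x[2:])
```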
ERIC Educational Resources Information Center
Lipsey, Mark W.; Weiland, Christina; Yoshikawa, Hirokazu; Wilson, Sandra Jo; Hofer, Kerry G.
2015-01-01
Much of the currently available evidence on the causal effects of public prekindergarten programs on school readiness outcomes comes from studies that use a regression-discontinuity design (RDD) with the age cutoff to enter a program in a given year as the basis for assignment to treatment and control conditions. Because the RDD has high internal…
76 FR 21036 - Application of the Prevailing Wage Methodology in the H-2B Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-14
...: Notice. SUMMARY: On January 19, 2011, the Department of Labor (Department) published a final rule, Wage... work performed on or after January 1, 2012. Employers whose work commences in 2011 and continues into...-agricultural Employment H-2B Program, 76 FR 3452, Jan. 19, 2011. DATES: This Notice is effective on April 14...
NASTRAN benefits analysis. Volume 2: Final technical report
NASA Technical Reports Server (NTRS)
1972-01-01
Baseline data are developed for comparing the costs and benefits of the NASA structural analysis program and for determining impacts and benefits to current users. To develop this information, questionnaires were mailed to users, and personal and telephone interviews were conducted to solicit further information. The questions in the questionnaire and interviews related to benefits derived from the programs, areas of needed improvement, and applicable usage comments. The collected information was compiled and analyzed. Methodology, analyses, and results are presented. The information is applicable to issues preceding NASTRAN Level 15.
Structural Technology and Analysis Program (STAP) Delivery Order 0004: Durability Patch
NASA Astrophysics Data System (ADS)
Ikegami, Roy; Haugse, Eric; Trego, Angela; Rogers, Lynn; Maly, Joe
2001-06-01
Structural cracks in secondary structure, resulting from a high cycle fatigue (HCF) environment, are often referred to as nuisance cracks. This type of damage can result in costly inspections and repair. The repairs often do not last long because the repaired structure continues to respond in a resonant fashion to the environment. Although the use of materials for passive damping applications is well understood, there are few applications to high-cycle fatigue problems, because the required design information (characterization temperature, resonant response frequency, and strain levels) is difficult to determine. The Durability Patch and Damage Dosimeter Program addressed these problems by: (1) developing a damped repair design process which includes a methodology for designing the material and application characteristics required to optimally damp the repair; and (2) designing and developing a rugged, small, and lightweight data acquisition unit called the damage dosimeter. The dosimeter is a battery-operated, single-board computer capable of collecting three channels of strain and one channel of temperature, processing these data with user-developed algorithms written in the C programming language, and storing the processed data in resident memory. The dosimeter is used to provide flight data needed to characterize the vibration environment. The vibration environment is then used to design the damping material characteristics and repair. The repair design methodology and dosimeter were demonstrated on B-52, C-130, and F-15 aircraft applications.
Millimeter wave satellite concepts, volume 1
NASA Technical Reports Server (NTRS)
Hilsen, N. B.; Holland, L. D.; Thomas, R. E.; Wallace, R. W.; Gallagher, J. G.
1977-01-01
The identification of technologies necessary for the development of millimeter spectrum communication satellites was examined from a system point of view. The goals of the program were the development of a methodology, based on the technical requirements of potential services that might be assigned to millimeter wave bands, for identifying viable and appropriate technologies for future NASA millimeter research and development programs, and the testing of this methodology with selected user applications and services. The entire communications network, both ground and space subsystems, was studied. Cost, weight, and performance models for the subsystems, conceptual designs for point-to-point and broadcast communications satellites, and analytic relationships between subsystem parameters and overall link performance are discussed, along with baseline conceptual systems, sensitivity studies, model adjustment analyses, identification of critical technologies and their risks, and brief research and development program scenarios for the technologies judged to be moderate or extensive risks. Identification of technologies for millimeter satellite communication systems, and assessment of the relative risks of these technologies, was accomplished through subsystem modeling and link optimization for both point-to-point and broadcast applications.
Remote-sensing applications as utilized in Florida's coastal zone management program
NASA Technical Reports Server (NTRS)
Worley, D. R.
1975-01-01
Land use maps were developed from photomaps obtained by remote sensing in order to develop a comprehensive state plan for the protection, development, and zoning of coastal regions. Only photographic remote sensors have been used in support of the coastal council's planning/management methodology. Standard photointerpretation and cartographic application procedures for map compilation were used in preparing base maps.
Automated sizing of large structures by mixed optimization methods
NASA Technical Reports Server (NTRS)
Sobieszczanski, J.; Loendorf, D.
1973-01-01
A procedure for automating the sizing of wing-fuselage airframes was developed and implemented in the form of an operational program. The program combines fully stressed design to determine an overall material distribution with mass-strength and mathematical programming methods to design structural details accounting for realistic design constraints. The practicality and efficiency of the procedure are demonstrated for transport aircraft configurations. The methodology is sufficiently general to be applicable to other large and complex structures.
The First NASA Advanced Composites Technology Conference, part 1
NASA Technical Reports Server (NTRS)
Davis, John G., Jr. (Compiler); Bohon, Herman L. (Compiler)
1991-01-01
Papers are presented from the conference. The ACT program is a multiyear research initiative to achieve a national goal of technology readiness before the end of the decade. Conference papers recorded results of research in the ACT program on new materials development and processing, innovative design concepts, analysis development and validation, cost effective manufacturing methodology, and cost tracking and prediction procedures. Papers presented on major applications programs approved by the Department of Defense are also included.
Applications of artificial intelligence V; Proceedings of the Meeting, Orlando, FL, May 18-20, 1987
NASA Technical Reports Server (NTRS)
Gilmore, John F. (Editor)
1987-01-01
The papers contained in this volume focus on current trends in applications of artificial intelligence. Topics discussed include expert systems, image understanding, artificial intelligence tools, knowledge-based systems, heuristic systems, manufacturing applications, and image analysis. Papers are presented on expert system issues in automated, autonomous space vehicle rendezvous; traditional versus rule-based programming techniques; applications to the control of optional flight information; methodology for evaluating knowledge-based systems; and real-time advisory system for airborne early warning.
NASA Technical Reports Server (NTRS)
Elrad, Tzilla (Editor); Filman, Robert E. (Editor); Bader, Atef (Editor)
2001-01-01
Computer science has experienced an evolution in programming languages and systems, from the crude assembly and machine codes of the earliest computers through concepts such as formula translation, procedural programming, structured programming, functional programming, logic programming, and programming with abstract data types. Each of these steps in programming technology has advanced our ability to achieve clear separation of concerns at the source code level. Currently, the dominant programming paradigm is object-oriented programming - the idea that one builds a software system by decomposing a problem into objects and then writing the code of those objects. Such objects abstract together behavior and data into a single conceptual and physical entity. Object-orientation is reflected in the entire spectrum of current software development methodologies and tools - we have OO methodologies, analysis and design tools, and OO programming languages. Writing complex applications such as graphical user interfaces, operating systems, and distributed applications while maintaining comprehensible source code has been made possible with OOP. Success at developing simpler systems leads to aspirations for greater complexity. Object orientation is a clever idea, but has certain limitations. We are now seeing that many requirements do not decompose neatly into behavior centered on a single locus. Object technology has difficulty localizing concerns involving global constraints and pandemic behaviors, appropriately segregating concerns, and applying domain-specific knowledge. Post-object programming (POP) mechanisms that look to increase the expressiveness of the OO paradigm are a fertile arena for current research. Examples of POP technologies include domain-specific languages, generative programming, generic programming, constraint languages, reflection and metaprogramming, feature-oriented development, views/viewpoints, and asynchronous message brokering. (Czarnecki and Eisenecker's book includes a good survey of many of these technologies.)
Payload training methodology study
NASA Technical Reports Server (NTRS)
1990-01-01
The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PTC) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept covers the entire payload training program, from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) the Training and Simulation Needs Assessment Methodology; (2) the Simulation Approach Methodology; (3) the Simulation Definition Analysis Methodology; (4) the Simulator Requirements Standardization Methodology; (5) the Simulator Development Verification Methodology; and (6) the Simulator Validation Methodology.
[Educative programs based on self-management: an integrative review].
Nascimento, Luciana da Silva; de Gutierrez, Maria Gaby Rivero; De Domenico, Edvane Birelo Lopes
2010-06-01
The objective was to identify definitions and/or explanations of the term self-management in educative programs that aim at its development. The authors also aimed to describe the educative plans and results of the educative programs analyzed. The methodology was an integrative review of 15 published articles (2002 to 2007). The inclusion criteria were: the occurrence of the term self-management; the existence of an educative program for the development of self-management; and relevance to adult health. Self-management means the improvement or acquisition of abilities to solve problems in the biological, social, and affective scopes. The review pointed to different educational methodologies; however, it also showed the predominance of traditional methods, with conceptual contents of a physiopathological nature. The learning was evaluated as favorable, with caveats regarding application in different populations and contexts and regarding the increased costs of the educative intervention. It was concluded that research has demonstrated the importance of education for self-management, but is weakened by not relating the biopsychosocial demands of the chronic patient and by not describing in detail the teaching and evaluation methodologies employed.
Linear programming computational experience with onyx
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atrek, E.
1994-12-31
ONYX is a linear programming software package based on an efficient variation of the gradient projection method. When fully configured, it is intended for application to industrial size problems. While the computational experience is limited at the time of this abstract, the technique is found to be robust and competitive with existing methodology in terms of both accuracy and speed. An overview of the approach is presented together with a description of program capabilities, followed by a discussion of up-to-date computational experience with the program. Conclusions include advantages of the approach and envisioned future developments.
Engine environmental effects on composite behavior
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Smith, G. T.
1980-01-01
A series of programs was conducted to investigate and develop the application of composite materials to turbojet engines. A significant part of that effort was directed to establishing the impact resistance and defect growth characteristics of composite materials over the wide range of environmental conditions found in commercial turbojet engine operations. Both analytical and empirical efforts were involved. The experimental programs and the analytical methodology development, as well as an evaluation program for the use of composite materials as fan exit guide vanes, are summarized.
Characterization of Damage Accumulation in a C/SiC Composite at Elevated Temperatures
NASA Technical Reports Server (NTRS)
Telesman, Jack; Verrilli, Mike; Ghosn, Louis; Kantzos, Pete
1997-01-01
This research is part of a program aimed at evaluating and demonstrating the ability of candidate CMC materials for a variety of applications in reusable launch vehicles. The life and durability of these materials in rocket and engine applications are of major concern, and there is a need to develop and validate life prediction methodology. In this study, material characterization and mechanical testing were performed in order to identify the failure modes, degradation mechanisms, and progression of damage in a C/SiC composite at elevated temperatures. The motivation for this work is to provide the relevant damage information that will form the basis for the development of a physically based life prediction methodology.
Model-Driven Approach for Body Area Network Application Development.
Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata
2016-05-12
This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This makes it possible to obtain an adequate measure of QoS efficiently, through interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application.
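A toy sketch of the generator idea: a feature selection is mapped onto a parameterized meta-specification that emits concrete controller code. The feature names, template, and emitted functions are invented stand-ins for the paper's meta-language machinery.

```python
# Feature-driven code generation (hedged sketch with invented names).
FEATURES = {"sensor": "temperature", "rate_hz": 10, "encrypt": True}

TEMPLATE = '''\
def read_and_send():
    value = read_{sensor}()          # data collection from the body sensor
    {maybe_encrypt}
    transmit(value, rate_hz={rate_hz})
'''

def generate(features):
    enc = ("value = encrypt(value)   # QoS: security traded against energy"
           if features["encrypt"] else "# encryption disabled to save energy")
    return TEMPLATE.format(sensor=features["sensor"],
                           rate_hz=features["rate_hz"],
                           maybe_encrypt=enc)

print(generate(FEATURES))   # re-generation with new features yields a new app
```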
Forecasting the Economic Impact of Future Space Station Operations
NASA Technical Reports Server (NTRS)
Summer, R. A.; Smolensky, S. M.; Muir, A. H.
1967-01-01
Recent manned and unmanned Earth-orbital operations have suggested great promise of improved knowledge and of substantial economic and associated benefits to be derived from services offered by a space station. Proposed application areas include agriculture, forestry, hydrology, public health, oceanography, natural disaster warning, and search/rescue operations. The need for reliable estimates of economic and related Earth-oriented benefits to be realized from Earth-orbital operations is discussed, and recent work in this area is reviewed. Emphasis is given to those services based on remote sensing. Requirements for a uniform, comprehensive, and flexible methodology are discussed, and a brief review of the suggested methodology is presented. This methodology will be exercised through five case studies chosen from a gross inventory of almost 400 user candidates. The relationship of case study results to benefits in broader application areas is discussed. Some management implications of possible future program implementation are included.
2003-01-01
This report documents research findings from a RAND project titled "The Relative Cost Effectiveness of Military Advertising," the goal of which was to... develop and apply a methodology for assessing the cost effectiveness of the services' advertising programs and to provide guidance for a more... examines issues related to the effectiveness of recruiting advertising during the 1980s and 1990s. It describes the policy context, summarizes the current
Optimal Trajectories Generation in Robotic Fiber Placement Systems
NASA Astrophysics Data System (ADS)
Gao, Jiuchun; Pashkevich, Anatol; Caro, Stéphane
2017-06-01
The paper proposes a methodology for optimal trajectory generation in robotic fiber placement systems. A strategy to tune the parameters of the optimization algorithm at hand is also introduced. The presented technique transforms the original continuous problem into a discrete one in which the time-optimal motions are generated by dynamic programming. The developed strategy for tuning the optimization algorithm allows essentially reducing the computing time and obtaining trajectories that satisfy industrial constraints. The feasibility and advantages of the proposed methodology are confirmed by an application example.
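A minimal sketch of the discretize-then-dynamic-programming step: the path is cut into segments, a small set of admissible speeds plays the role of the discretized state, and a backward pass finds the minimum traversal time under a rate constraint. Grids and limits are invented numbers, not the paper's.

```python
# Backward dynamic programming over a discretized path/speed grid.
import math

SPEEDS = [0.5, 1.0, 1.5]        # admissible speeds at segment boundaries (m/s)
SEG_LEN = 0.1                    # segment length along the path (m)
ACC_MAX = 1.0                    # max speed change between segments (m/s)
N_SEG = 50

# cost[v] = best time to finish from the current segment at speed v
cost = {v: 0.0 for v in SPEEDS}
for seg in range(N_SEG - 1, -1, -1):
    new_cost = {}
    for v in SPEEDS:
        best = math.inf
        for v_next in SPEEDS:                 # enforce the rate constraint
            if abs(v_next - v) <= ACC_MAX:
                best = min(best, SEG_LEN / v + cost[v_next])
        new_cost[v] = best
    cost = new_cost

print("minimum traversal time:", min(cost.values()), "s")
```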
Space Station - An integrated approach to operational logistics support
NASA Technical Reports Server (NTRS)
Hosmer, G. J.
1986-01-01
Development of an efficient and cost effective operational logistics system for the Space Station will require logistics planning early in the program's design and development phase. This paper will focus on Integrated Logistics Support (ILS) Program techniques and their application to the Space Station program design, production and deployment phases to assure the development of an effective and cost efficient operational logistics system. The paper will provide the methodology and time-phased programmatic steps required to establish a Space Station ILS Program that will provide an operational logistics system based on planned Space Station program logistics support.
Development and Current Status of Skull-Image Superimposition - Methodology and Instrumentation.
Lan, Y
1992-12-01
This article presents a review of the literature and an evaluation of the development and application of skull-image superimposition technology - both instrumentation and methodology - contributed by a number of scholars since 1935. Along with a comparison of the methodologies involved in the two superimposition techniques - photographic and video - the author characterizes the techniques in action and the recent advances in computer image superimposition processing technology. The major disadvantage of conventional approaches is their reliance on subjective interpretation. Through painstaking comparison and analysis, computer image processing technology can make more conclusive identifications by directly testing and evaluating the various programmed indices. Copyright © 1992 Central Police University.
NASA Technical Reports Server (NTRS)
Boyce, Lola; Bast, Callie C.; Trimble, Greg A.
1992-01-01
This report presents the results of a fourth-year effort of a research program conducted for NASA-LeRC by the University of Texas at San Antonio (UTSA). The research included on-going development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subject to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue, or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation has been randomized and is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typical of applications in aerospace propulsion system components. Material data for Inconel 718 have been analyzed using the developed methodology.
NASA Technical Reports Server (NTRS)
Boyce, Lola; Bast, Callie C.; Trimble, Greg A.
1992-01-01
The results of a fourth-year effort of a research program conducted for NASA-LeRC by The University of Texas at San Antonio (UTSA) are presented. The research included on-going development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue, or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation was randomized and is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typical of applications in aerospace propulsion system components. Material data for Inconel 718 were analyzed using the developed methodology.
Programming Methodology for High Performance Applications on Tiled Architectures
2009-06-01
members, both voting and non-voting; • A page of links, including links to all available PCA home pages, the home pages of other DARPA programs of... HPEC-SI); and • Organizational information on the Morphware Forum, such as membership requirements and voting procedures. The web site was the... The following organizations are voting members of the Morphware Forum at this writing: o Defense Advanced Research Projects Agency o Georgia
NASA Technical Reports Server (NTRS)
Bekey, I.; Mayer, H. L.; Wolfe, M. G.
1976-01-01
The methodology of alternate world future scenarios is utilized for selecting a plausible, though not advocated, set of future scenarios, each of which results in a program plan appropriate for the respective environment. Each such program plan gives rise to different building block and technology requirements, which are analyzed for common need between NASA and the DoD for each of the alternate world scenarios. An essentially invariant set of system, building block, and technology development plans is presented at the conclusion, intended to allow protection of most of the options for system concepts regardless of what the actual future world environment turns out to be. Thus, building block and technology needs are derived which support: (1) each specific world scenario; (2) all the world scenarios identified in this study; or (3) generalized scenarios applicable to almost any future environment. The output included in this volume consists of the building blocks, i.e., transportation vehicles, orbital support vehicles, and orbital support facilities; the technology required to support the program plans; identification of their features which could support the DoD and NASA in common; and a complete discussion of the planning methodology.
42 CFR 495.212 - Limitation on review.
Code of Federal Regulations, 2013 CFR
2013-10-01
... PROGRAM Requirements Specific to Medicare Advantage (MA) Organizations § 495.212 Limitation on review. (a... methodology and standards for determining payment amounts and payment adjustments under the MA EHR EP... related to the fixed schedule for application of limitation on incentive payments for all qualifying MA...
42 CFR 495.212 - Limitation on review.
Code of Federal Regulations, 2011 CFR
2011-10-01
... PROGRAM Requirements Specific to Medicare Advantage (MA) Organizations § 495.212 Limitation on review. (a... methodology and standards for determining payment amounts and payment adjustments under the MA EHR EP... related to the fixed schedule for application of limitation on incentive payments for all qualifying MA...
42 CFR 495.212 - Limitation on review.
Code of Federal Regulations, 2012 CFR
2012-10-01
... PROGRAM Requirements Specific to Medicare Advantage (MA) Organizations § 495.212 Limitation on review. (a... methodology and standards for determining payment amounts and payment adjustments under the MA EHR EP... related to the fixed schedule for application of limitation on incentive payments for all qualifying MA...
42 CFR 495.212 - Limitation on review.
Code of Federal Regulations, 2010 CFR
2010-10-01
... PROGRAM Requirements Specific to Medicare Advantage (MA) Organizations § 495.212 Limitation on review. (a... methodology and standards for determining payment amounts and payment adjustments under the MA EHR EP... related to the fixed schedule for application of limitation on incentive payments for all qualifying MA...
42 CFR 495.212 - Limitation on review.
Code of Federal Regulations, 2014 CFR
2014-10-01
... PROGRAM Requirements Specific to Medicare Advantage (MA) Organizations § 495.212 Limitation on review. (a... methodology and standards for determining payment amounts and payment adjustments under the MA EHR EP... related to the fixed schedule for application of limitation on incentive payments for all qualifying MA...
Geo-Engineering through Internet Informatics (GEMINI)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doveton, John H.; Watney, W. Lynn
The program for development and methodologies was a 3-year interdisciplinary effort to develop an interactive, integrated Internet Website named GEMINI (Geo-Engineering Modeling through Internet Informatics) that would build real-time geo-engineering reservoir models over the Internet using the latest technology in Web applications.
42 CFR 86.13 - Project requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... the applicant has established good cause for its omission. (1) Provision of a method for development... method for implementation of the needed training; (3) Provision of an evaluation methodology, including... the training program; and (4) Provision of a method by which trainees will be selected. (b) In...
42 CFR 86.13 - Project requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... the applicant has established good cause for its omission. (1) Provision of a method for development... method for implementation of the needed training; (3) Provision of an evaluation methodology, including... the training program; and (4) Provision of a method by which trainees will be selected. (b) In...
42 CFR 86.13 - Project requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... the applicant has established good cause for its omission. (1) Provision of a method for development... method for implementation of the needed training; (3) Provision of an evaluation methodology, including... the training program; and (4) Provision of a method by which trainees will be selected. (b) In...
42 CFR 86.13 - Project requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... the applicant has established good cause for its omission. (1) Provision of a method for development... method for implementation of the needed training; (3) Provision of an evaluation methodology, including... the training program; and (4) Provision of a method by which trainees will be selected. (b) In...
Algorithm-Based Fault Tolerance for Numerical Subroutines
NASA Technical Reports Server (NTRS)
Turmon, Michael; Granat, Robert; Lou, John
2007-01-01
A software library implements a new methodology of detecting faults in numerical subroutines, thus enabling application programs that contain the subroutines to recover transparently from single-event upsets. The software library in question is fault-detecting middleware that is wrapped around the numerical subroutines. Conventional serial versions (based on LAPACK and FFTW) and a parallel version (based on ScaLAPACK) exist. The source code of the application program that contains the numerical subroutines is not modified, and the middleware is transparent to the user. The methodology used is a type of algorithm-based fault tolerance (ABFT). In ABFT, a checksum is computed before a computation and compared with the checksum of the computational result; an error is declared if the difference between the checksums exceeds some threshold. Novel normalization methods are used in the checksum comparison to ensure correct fault detections independent of algorithm inputs. In tests of this software reported in the peer-reviewed literature, the library was shown to enable detection of 99.9 percent of significant faults while generating no false alarms.
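A minimal sketch of the ABFT idea for matrix multiplication: checksums of the inputs predict the checksum of the product, and a normalized mismatch flags a fault. The normalization and threshold here are simple stand-ins for the library's (more sophisticated) methods.

```python
# Checksum-based ABFT for matrix multiply (hedged sketch).
import numpy as np

def abft_matmul(A, B, tol=1e-8):
    C = A @ B
    # Checksum identity: ones^T (A @ B) == (ones^T A) @ B
    predicted = (np.ones(A.shape[0]) @ A) @ B
    actual = np.ones(A.shape[0]) @ C
    # Normalize so detection does not depend on input magnitudes.
    scale = np.linalg.norm(predicted) + 1e-30
    if np.linalg.norm(actual - predicted) / scale > tol:
        raise RuntimeError("ABFT checksum mismatch: possible upset, recompute")
    return C

A, B = np.random.rand(50, 40), np.random.rand(40, 30)
C = abft_matmul(A, B)            # passes silently on a fault-free run
```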
Building quality into medical product software design.
Mallory, S R
1993-01-01
The software engineering and quality assurance disciplines are prerequisites for the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that these disciplines can be applied to software development most beneficially. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound: safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.
VIII. THE PAST, PRESENT, AND FUTURE OF DEVELOPMENTAL METHODOLOGY.
Little, Todd D; Wang, Eugene W; Gorrall, Britt K
2017-06-01
This chapter selectively reviews the evolution of quantitative practices in the field of developmental methodology. The chapter begins with an overview of the past in developmental methodology, discussing the implementation and dissemination of latent variable modeling and, in particular, longitudinal structural equation modeling. It then turns to the present state of developmental methodology, highlighting current methodological advances in the field. Additionally, this section summarizes ample quantitative resources, ranging from key quantitative methods journal articles to the various quantitative methods training programs and institutes. The chapter concludes with the future of developmental methodology and puts forth seven future innovations in the field. The innovations discussed span the topics of measurement, modeling, temporal design, and planned missing data designs. Lastly, the chapter closes with a brief overview of advanced modeling techniques such as continuous time models, state space models, and the application of Bayesian estimation in the field of developmental methodology. © 2017 The Society for Research in Child Development, Inc.
NASA Astrophysics Data System (ADS)
Cheng, Tao; Wu, Youwei; Shen, Xiaoqin; Lai, Wenyong; Huang, Wei
2018-01-01
In this work, a simple methodology was developed to enhance the patterning resolution of inkjet printing, involving process optimization as well as substrate modification and treatment. The line width of the inkjet-printed silver lines was successfully reduced to 1/3 of the original value using this methodology. Large-area flexible circuits with delicate patterns and good morphology were thus fabricated. The resultant flexible circuits showed excellent electrical conductivity, with a sheet resistance as low as 4.5 Ω/□, and strong tolerance to mechanical bending. The simple methodology is also applicable to substrates with various wettability, which suggests a general strategy to enhance the printing quality of inkjet printing for manufacturing high-performance large-area flexible electronics. Project supported by the National Key Basic Research Program of China (Nos. 2014CB648300, 2017YFB0404501), the National Natural Science Foundation of China (Nos. 21422402, 21674050), the Natural Science Foundation of Jiangsu Province (Nos. BK20140060, BK20130037, BK20140865, BM2012010), the Program for Jiangsu Specially-Appointed Professors (No. RK030STP15001), the Program for New Century Excellent Talents in University (No. NCET-13-0872), the NUPT "1311 Project" and Scientific Foundation (Nos. NY213119, NY213169), the Synergetic Innovation Center for Organic Electronics and Information Displays, the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD), the Leading Talent of Technological Innovation of National Ten-Thousands Talents Program of China, the Excellent Scientific and Technological Innovative Teams of Jiangsu Higher Education Institutions (No. TJ217038), the Program for Graduate Students Research and Innovation of Jiangsu Province (No. KYZZ16-0253), and the 333 Project of Jiangsu Province (Nos. BRA2017402, BRA2015374).
C3I system modification and EMC (electromagnetic compatibility) methodology, volume 1
NASA Astrophysics Data System (ADS)
Wilson, J. L.; Jolly, M. B.
1984-01-01
A methodology (i.e., a consistent set of procedures) for assessing the electromagnetic compatibility (EMC) of RF subsystem modifications on C3I aircraft was generated during this study (Volume 1). An IEMCAP (Intrasystem Electromagnetic Compatibility Analysis Program) database for the E-3A (AWACS) C3I aircraft RF subsystem was extracted to support the design of the EMC assessment methodology (Volume 2). Mock modifications were performed on the E-3A database to assess, using a preliminary form of the methodology, the resulting EMC impact. Application of the preliminary assessment methodology to modifications in the E-3A database served to fine-tune the form of a final assessment methodology. The resulting final assessment methodology is documented in this report in conjunction with the overall study goals, procedures, and database. It is recommended that a similar EMC assessment methodology be developed for the power subsystem within C3I aircraft. It is further recommended that future EMC assessment methodologies be developed around expert systems (i.e., computer intelligent agents) to manage both the extensive detail involved and the user requirement for transparency.
Use of Intervention Mapping to Enhance Health Care Professional Practice: A Systematic Review.
Durks, Desire; Fernandez-Llimos, Fernando; Hossain, Lutfun N; Franco-Trigo, Lucia; Benrimoj, Shalom I; Sabater-Hernández, Daniel
2017-08-01
Intervention Mapping is a planning protocol for developing behavior change interventions, the first three steps of which are intended to establish the foundations and rationales of such interventions. This systematic review aimed to identify programs that used Intervention Mapping to plan changes in health care professional practice. Specifically, it provides an analysis of the information provided by the programs in the first three steps of the protocol to determine their foundations and rationales of change. A literature search was undertaken in PubMed, Scopus, SciELO, and DOAJ using "Intervention Mapping" as keyword. Key information was gathered, including theories used, determinants of practice, research methodologies, theory-based methods, and practical applications. Seventeen programs aimed at changing a range of health care practices were included. The social cognitive theory and the theory of planned behavior were the most frequently used frameworks in driving change within health care practices. Programs used a large variety of research methodologies to identify determinants of practice. Specific theory-based methods (e.g., modelling and active learning) and practical applications (e.g., health care professional training and facilitation) were reported to inform the development of practice change interventions and programs. In practice, Intervention Mapping delineates a three-step systematic, theory- and evidence-driven process for establishing the theoretical foundations and rationales underpinning change in health care professional practice. The use of Intervention Mapping can provide health care planners with useful guidelines for the theoretical development of practice change interventions and programs.
ERIC Educational Resources Information Center
Comptroller General of the U.S., Washington, DC.
Efforts of the U.S. Department of Education to verify data submitted by applicants to the Pell Grant program were analyzed by the General Accounting Office. The effects of carrying out the Department's policy or methodology, called "validation," on financial aid applicants and colleges were assessed. Costs of 1982-1983 validation on…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ashford, Mike
The report describes the prospects for energy efficiency and greenhouse gas emissions reductions in Mexico, along with renewable energy potential. A methodology for developing emissions baselines is shown, in order to prepare project emissions reductions calculations. An application to the USIJI program was also prepared through this project, for a portfolio of energy efficiency projects.
THE EPA'S EMERGING FOCUS ON LIFE CYCLE ASSESSMENT
EPA has been actively engaged in LCA research since 1990 to help advance the methodology and application of life cycle thinking in decision making. Across the Agency consideration of the life cycle concept is increasing in the development of policies and programs. A major force i...
NASA Astrophysics Data System (ADS)
D'silva, Oneil; Kerrison, Roger
2013-09-01
A key feature for the increased utilization of space robotics is to automate extra-vehicular manned space activities and thus significantly reduce the potential for catastrophic hazards while simultaneously minimizing the overall costs associated with manned spaceflight. The principal scope of the paper is to evaluate the use of industry-standard probabilistic risk/safety assessment (PRA/PSA) methodologies and hazard risk frequency criteria as a hazard control. This paper illustrates the applicability of combining the selected probabilistic risk assessment methodology and hazard risk frequency criteria in order to apply the necessary safety controls that allow for the increased use of the Mobile Servicing System (MSS) robotic system on the International Space Station. This document considers factors such as component failure rates, software reliability, periods of operation and dormancy, and fault tree analyses, and their effects on the probabilistic risk assessments. The paper concludes with suggestions for incorporating existing industry risk/safety plans to create an applicable safety process for future activities and programs.
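To make the ingredients above concrete, here is a minimal sketch (not from the paper) of how constant failure rates, operating and dormancy periods, and fault-tree gates combine into a top-event probability; the tree structure and all rates are hypothetical.

    import math

    def p_fail(rate_per_hour, hours):
        """Failure probability over an exposure time under a constant
        failure rate (exponential model): P = 1 - exp(-lambda * t)."""
        return 1.0 - math.exp(-rate_per_hour * hours)

    # Illustrative component rates (per hour) and exposure periods.
    p_joint_motor   = p_fail(2e-6, 1000.0)   # operating period
    p_control_sw    = p_fail(5e-7, 1000.0)
    p_brake_dormant = p_fail(1e-7, 8000.0)   # dormancy counted separately

    # Fault-tree gates, assuming independent events.
    def gate_or(*ps):
        out = 1.0
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out

    def gate_and(*ps):
        out = 1.0
        for p in ps:
            out *= p
        return out

    # Hypothetical top event: motor fault AND brake loss, OR a software fault.
    p_top = gate_or(gate_and(p_joint_motor, p_brake_dormant), p_control_sw)
    print(f"top-event probability over the period: {p_top:.2e}")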
Bosslet, Gabriel T; Carlos, W Graham; Tybor, David J; McCallister, Jennifer; Huebert, Candace; Henderson, Ashley; Miles, Matthew C; Twigg, Homer; Sears, Catherine R; Brown, Cynthia; Farber, Mark O; Lahm, Tim; Buckley, John D
2017-04-01
Few data have been published regarding scoring tools for selection of postgraduate medical trainee candidates that have wide applicability. The authors present a novel scoring tool developed to assist postgraduate programs in generating an institution-specific rank list derived from selected elements of the U.S. Electronic Residency Application System (ERAS) application. The authors developed and validated an ERAS and interview day scoring tool at five pulmonary and critical care fellowship programs: the ERAS Application Scoring Tool-Interview Scoring Tool. This scoring tool was then tested for intrarater correlation versus subjective rankings of ERAS applications. The process for development of the tool was performed at four other institutions, and it was performed alongside and compared with the "traditional" ranking methods at the five programs and compared with the submitted National Residency Match Program rank list. The ERAS Application Scoring Tool correlated highly with subjective faculty rankings at the primary institution (average Spearman's r = 0.77). The ERAS Application Scoring Tool-Interview Scoring Tool method correlated well with traditional ranking methodology at all five institutions (Spearman's r = 0.54, 0.65, 0.72, 0.77, and 0.84). This study validates a process for selecting and weighting components of the ERAS application and interview day to create a customizable, institution-specific tool for ranking candidates to postgraduate medical education programs. This scoring system can be used in future studies to compare the outcomes of fellowship training.
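For readers who want to reproduce the kind of agreement statistic reported above, here is a minimal sketch using SciPy's Spearman rank correlation on hypothetical tool scores and faculty ranks (all data invented for illustration).

    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical tool scores and faculty subjective ranks for 10 applicants.
    tool_scores   = np.array([88, 75, 92, 61, 79, 95, 70, 83, 66, 90])
    faculty_ranks = np.array([ 3,  6,  2,  9,  5,  1,  7,  4, 10,  8])  # 1 = best

    # Negate scores so that a higher score corresponds to a better (lower) rank.
    rho, p_value = spearmanr(-tool_scores, faculty_ranks)
    print(f"Spearman's r = {rho:.2f} (p = {p_value:.3f})")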
A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used for identifying parts of code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for the exploration of various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
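A minimal sketch of the trace-free flavor of modeling described above: the address stream of a matrix-multiply loop nest is reconstructed from base addresses, array layout, and loop bounds, then replayed through a toy direct-mapped cache. All parameters (cache geometry, matrix size, base addresses) are illustrative, and the actual methodology models far more.

    N, ELEM = 64, 8                      # 64x64 matrices of 8-byte doubles
    LINE, NLINES = 64, 512               # 32 KB direct-mapped cache

    def misses(addr_stream):
        tags = [None] * NLINES           # direct-mapped: one tag per line
        miss = 0
        for addr in addr_stream:
            line = addr // LINE
            idx, tag = line % NLINES, line // NLINES
            if tags[idx] != tag:
                tags[idx] = tag
                miss += 1
        return miss

    def matmul_addrs(base_a, base_b, base_c):
        # Addresses touched by the statement C[i][j] += A[i][k] * B[k][j]
        for i in range(N):
            for j in range(N):
                for k in range(N):
                    yield base_a + (i * N + k) * ELEM
                    yield base_b + (k * N + j) * ELEM
                    yield base_c + (i * N + j) * ELEM

    total = 3 * N**3
    m = misses(matmul_addrs(0x10000, 0x30000, 0x50000))
    print(f"predicted miss ratio = {m / total:.3f}")

Swapping the loop order or blocking the nest changes only the generated address stream, which is exactly the kind of "what-if" comparison the methodology supports.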
An omnibus test for family-based association studies with multiple SNPs and multiple phenotypes.
Lasky-Su, Jessica; Murphy, Amy; McQueen, Matthew B; Weiss, Scott; Lange, Christoph
2010-06-01
We propose an omnibus family-based association test (MFBAT) that can be applied to multiple markers and multiple phenotypes and that has only one degree of freedom. The proposed test statistic extends current FBAT methodology to incorporate multiple markers as well as multiple phenotypes. Using simulation studies, power estimates for the proposed methodology are compared with the standard methodologies. On the basis of these simulations, we find that MFBAT substantially outperforms other methods, including haplotypic approaches and doing multiple tests with single single-nucleotide polymorphisms (SNPs) and single phenotypes. The practical relevance of the approach is illustrated by an application to asthma in which SNP/phenotype combinations are identified and reach overall significance that would not have been identified using other approaches. This methodology is directly applicable to cases in which there are multiple SNPs, such as candidate gene studies, cases in which there are multiple phenotypes, such as expression data, and cases in which there are multiple phenotypes and genotypes, such as genome-wide association studies that incorporate expression profiles as phenotypes. This program is available in the PBAT analysis package.
Program management aid for redundancy selection and operational guidelines
NASA Technical Reports Server (NTRS)
Hodge, P. W.; Davis, W. L.; Frumkin, B.
1972-01-01
Although this criterion was developed specifically for use on the shuttle program, it has application to many other multi-mission programs (e.g., aircraft or mechanisms). The methodology employed is directly applicable even if the tools (nomographs and equations) are for mission-peculiar cases. The redundancy selection criterion was developed to ensure that both the design and operational cost impacts (life cycle costs) were considered in the selection of the quantity of operational redundancy. These tools were developed as aids in expediting the decision process and are not intended as the automatic decision maker. This approach to redundancy selection is unique in that it enables a pseudo systems analysis to be performed on an equipment basis without waiting for all designs to be hardened.
Toward a Formal Evaluation of Refactorings
NASA Technical Reports Server (NTRS)
Paul, John; Kuzmina, Nadya; Gamboa, Ruben; Caldwell, James
2008-01-01
Refactoring is a software development strategy that characteristically alters the syntactic structure of a program without changing its external behavior [2]. In this talk we present a methodology for extracting formal models from programs in order to evaluate how incremental refactorings affect the verifiability of their structural specifications. We envision that this same technique may be applicable to other types of properties such as those that concern the design and maintenance of safety-critical systems.
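A minimal example of what "altering syntactic structure without changing external behavior" means in practice, here an extract-method refactoring (illustrative, not taken from the talk):

    # Before: one routine mixes parsing and summing.
    def report(lines):
        total = 0
        for line in lines:
            total += int(line.strip() or 0)
        return f"total={total}"

    # After an "extract method" refactoring: the syntactic structure changes,
    # the external behavior does not; report2(x) == report(x) for all inputs.
    def parse_value(line):
        return int(line.strip() or 0)

    def report2(lines):
        return f"total={sum(parse_value(l) for l in lines)}"

    assert report(["1", "2", " 3 "]) == report2(["1", "2", " 3 "])

Verifiability questions of the kind the talk raises concern whether the structural specification is easier or harder to prove against the refactored form.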
Research on Statistical Methodology Applicable to Technical Problems Associated with Navy Programs.
1987-10-01
the fleetwide use of organotin antifouling paints that contain tributyltin (TBT), a tin-based compound, as a biocide. Organotin antifouling compounds...with this program has been devoted to providing technical and analytical support on various related statistical aspects. The TBT-release rate of a...paint is determined by placing a coated specimen panel or cylinder into a container of synthetic seawater, measuring the increasing concentration of TBT in
NASA Technical Reports Server (NTRS)
1971-01-01
Appendixes are presented that provide model input requirements, a sample case, flow charts, and a program listing. At the beginning of each appendix, descriptive details and technical comments are provided to indicate any special instructions applicable to the use of that appendix. In addition, the program listing includes comment cards that state the purpose of each subroutine in the complete program and describe operations performed within that subroutine. The input requirements include details on the many options that adapt the program to the specific needs of the analyst for a particular problem.
1981-11-01
documenting attempts to define production functions in education. Levin and Hanushek characterize current methodologies as being deficient both conceptually...Evaluation and Policy Analysis, 2, 1, (1980), 27-36. 16. Hanushek, E. A., "Conceptual and Empirical Issues in Estimation of Educational Production
Overview of Computer Simulation Modeling Approaches and Methods
Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett
2005-01-01
The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...
Automatic mathematical modeling for real time simulation program (AI application)
NASA Technical Reports Server (NTRS)
Wang, Caroline; Purinton, Steve
1989-01-01
A methodology is described for automatic mathematical modeling and generating simulation models. The major objective was to create a user friendly environment for engineers to design, maintain, and verify their models; to automatically convert the mathematical models into conventional code for computation; and finally, to document the model automatically.
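As a loose analogy to the model-to-code conversion described above (the original work targeted real-time simulation models, not SymPy), here is a sketch of generating conventional code automatically from a symbolic model; the model expression is illustrative.

    import sympy as sp

    # Symbolic model of a damped oscillation term (illustrative only).
    t, w, zeta = sp.symbols("t w zeta", positive=True)
    model = sp.exp(-zeta * w * t) * sp.cos(w * sp.sqrt(1 - zeta**2) * t)

    # Conventional code for computation, generated automatically:
    print(sp.ccode(model))                  # emits a C expression string
    f = sp.lambdify((t, w, zeta), model)    # numeric Python function
    print(f(0.5, 2.0, 0.1))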
Using GIS Tools and Environmental Scanning to Forecast Industry Workforce Needs
ERIC Educational Resources Information Center
Gaertner, Elaine; Fleming, Kevin; Marquez, Michelle
2009-01-01
The Centers of Excellence (COE) provide regional workforce data on high growth, high demand industries and occupations for use by community colleges in program planning and resource enhancement. This article discusses the environmental scanning research methodology and its application to data-driven decision making in community college program…
Applied Virtual Reality Research and Applications at NASA/Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Hale, Joseph P.
1995-01-01
A Virtual Reality (VR) applications program has been under development at NASA/Marshall Space Flight Center (MSFC) since 1989. The objectives of the MSFC VR Applications Program are to develop, assess, validate, and utilize VR in hardware development, operations development and support, mission operations training and science training. Before this technology can be utilized with confidence in these applications, it must be validated for each particular class of application. That is, the precision and reliability with which it maps onto real settings and scenarios, representative of a class, must be calculated and assessed. The approach of the MSFC VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies. These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems. Specific validation studies for selected classes of applications have been completed or are currently underway. These include macro-ergonomic "control-room class" design analysis, Spacelab stowage reconfiguration training, a full-body micro-gravity functional reach simulator, and a gross anatomy teaching simulator. This paper describes the MSFC VR Applications Program and the validation studies.
Decision support and disease management: a logic engineering approach.
Fox, J; Thomson, R
1998-12-01
This paper describes the development and application of PROforma, a unified technology for clinical decision support and disease management. Work leading to the implementation of PROforma has been carried out in a series of projects funded by European agencies over the past 13 years. The work has been based on logic engineering, a distinct design and development methodology that combines concepts from knowledge engineering, logic programming, and software engineering. Several of the projects have used the approach to demonstrate a wide range of applications in primary and specialist care and clinical research. Concurrent academic research projects have provided a sound theoretical basis for the safety-critical elements of the methodology. The principal technical results of the work are the PROforma logic language for defining clinical processes and an associated suite of software tools for delivering applications, such as decision support and disease management procedures. The language supports four standard objects (decisions, plans, actions, and enquiries), each of which has an intuitive meaning with well-understood logical semantics. The development toolset includes a powerful visual programming environment for composing applications from these standard components, for verifying consistency and completeness of the resulting specification and for delivering stand-alone or embeddable applications. Tools and applications that have resulted from the work are described and illustrated, with examples from specialist cancer care and primary care. The results of a number of evaluation activities are included to illustrate the utility of the technology.
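The four standard task types lend themselves to a small class hierarchy; the sketch below only mirrors the PROforma vocabulary (the real language has formal logical semantics and dedicated tooling), and the example plan is invented.

    from dataclasses import dataclass, field

    @dataclass
    class Task:
        name: str

    @dataclass
    class Enquiry(Task):          # request data from the user or record
        items: list = field(default_factory=list)

    @dataclass
    class Decision(Task):         # choose among candidates by weighing arguments
        candidates: list = field(default_factory=list)

    @dataclass
    class Action(Task):           # a clinical act to be performed
        procedure: str = ""

    @dataclass
    class Plan(Task):             # container that sequences other tasks
        components: list = field(default_factory=list)

    workup = Plan("chest_pain_workup", components=[
        Enquiry("history", items=["onset", "duration"]),
        Decision("triage", candidates=["discharge", "admit"]),
        Action("ecg", procedure="12-lead ECG"),
    ])
    print([type(t).__name__ for t in workup.components])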
Parametric evaluation of the cost effectiveness of Shuttle payload vibroacoustic test plans
NASA Technical Reports Server (NTRS)
Stahle, C. V.; Gongloff, H. R.; Keegan, W. B.; Young, J. P.
1978-01-01
Consideration is given to alternate vibroacoustic test plans for sortie and free flyer Shuttle payloads. Statistical decision models for nine test plans provide a viable method of evaluating the cost effectiveness of alternate vibroacoustic test plans and the associated test levels. The methodology is a major step toward the development of a useful tool for the quantitative tailoring of vibroacoustic test programs to sortie and free flyer payloads. A broader application of the methodology is now possible by the use of the OCTAVE computer code.
Application of CFE/POST2 for Simulation of Launch Vehicle Stage Separation
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Tartabini, Paul V.; Toniolo, Matthew D.; Roithmayr, Carlos M.; Karlgaard, Christopher D.; Samareh, Jamshid A.
2009-01-01
The constraint force equation (CFE) methodology provides a framework for modeling constraint forces and moments acting at joints that connect multiple vehicles. With implementation in Program to Optimize Simulated Trajectories II (POST 2), the CFE provides a capability to simulate end-to-end trajectories of launch vehicles, including stage separation. In this paper, the CFE/POST2 methodology is applied to the Shuttle-SRB separation problem as a test and validation case. The CFE/POST2 results are compared with STS-1 flight test data.
NASA Technical Reports Server (NTRS)
Pearson, Don; Hamm, Dustin; Kubena, Brian; Weaver, Jonathan K.
2010-01-01
An updated version of the Platform Independent Software Components for the Exploration of Space (PISCES) software library is available. A previous version was reported in Library for Developing Spacecraft-Mission-Planning Software (MSC-22983), NASA Tech Briefs, Vol. 25, No. 7 (July 2001), page 52. To recapitulate: This software provides for Web-based, collaborative development of computer programs for planning trajectories and trajectory-related aspects of spacecraft-mission design. The library was built using state-of-the-art object-oriented concepts and software-development methodologies. The components of PISCES include Java-language application programs arranged in a hierarchy of classes that facilitates the reuse of the components. As its full name suggests, the PISCES library affords platform-independence: The Java language makes it possible to use the classes and application programs with a Java virtual machine, which is available in most Web-browser programs. Another advantage is expandability: Object orientation facilitates expansion of the library through creation of a new class. Improvements in the library since the previous version include development of orbital-maneuver-planning and rendezvous-launch-window application programs, enhancement of capabilities for propagation of orbits, and development of a desktop user interface.
Predicting the Coupling Properties of Axially-Textured Materials.
Fuentes-Cobas, Luis E; Muñoz-Romero, Alejandro; Montero-Cabrera, María E; Fuentes-Montero, Luis; Fuentes-Montero, María E
2013-10-30
A description of methods and computer programs for the prediction of "coupling properties" in axially-textured polycrystals is presented. Starting data are the single-crystal properties, texture and stereography. The validity of, and proper protocols for, applying the Voigt, Reuss and Hill approximations to estimate effective values of coupling properties are analyzed. Working algorithms for predicting the mentioned averages are given. Bunge's symmetrized spherical harmonics expansion of orientation distribution functions, inverse pole figures and (single- and poly-crystal) physical properties is applied in all stages of the proposed methodology. The established mathematical route has been systematized in a working computer program. The discussion of piezoelectricity in a representative textured ferro-piezoelectric ceramic illustrates the application of the proposed methodology. Polycrystal coupling properties predicted by the suggested route are fairly close to experimentally measured ones.
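For a scalar stand-in property, the Voigt, Reuss and Hill estimates reduce to texture-weighted arithmetic and harmonic means and their midpoint; a minimal sketch with invented numbers (the actual methodology averages tensor properties through the spherical-harmonics ODF machinery):

    import numpy as np

    # Single-crystal values of a scalar property for a set of orientations,
    # weighted by the texture's orientation distribution (illustrative data).
    values  = np.array([3.2, 2.8, 3.6, 3.0])   # property per orientation
    weights = np.array([0.4, 0.3, 0.2, 0.1])   # ODF weights, sum to 1

    voigt = np.sum(weights * values)           # arithmetic (uniform-strain) bound
    reuss = 1.0 / np.sum(weights / values)     # harmonic (uniform-stress) bound
    hill  = 0.5 * (voigt + reuss)              # Hill estimate: midpoint of bounds

    print(f"Voigt={voigt:.3f}  Reuss={reuss:.3f}  Hill={hill:.3f}")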
A study of commuter airplane design optimization
NASA Technical Reports Server (NTRS)
Roskam, J.; Wyatt, R. D.; Griswold, D. A.; Hammer, J. L.
1977-01-01
Problems of commuter airplane configuration design were studied to effect a minimization of direct operating costs. Factors considered were the minimization of fuselage drag, methods of wing design, and the estimated drag of an airplane submerged in a propeller slipstream; all design criteria were studied under a set of fixed performance, mission, and stability constraints. Configuration design data were assembled for application by a computerized design methodology program similar to the NASA-Ames General Aviation Synthesis Program.
Taylor, George C.
1971-01-01
Hydrologic instrumentation and methodology for assessing water-resource potentials have originated largely in the developed countries of the temperate zone. The developing countries lie largely in the tropic zone, which contains the full gamut of the earth's climatic environments, including most of those of the temperate zone. For this reason, most hydrologic techniques have world-wide applicability. Techniques for assessing water-resource potentials for the high-priority goals of economic growth are well established in the developing countries--but much more so in some than in others. Conventional techniques for measurement and evaluation of basic hydrologic parameters are now well understood in the developing countries and are generally adequate for their current needs and those of the immediate future. Institutional and economic constraints, however, inhibit growth of sustained programs of hydrologic data collection and application of the data to problems in engineering technology. Computer-based technology, including processing of hydrologic data and mathematical modelling of hydrologic parameters, is also well begun in many developing countries and has much wider potential application. In some developing countries, however, there is a tendency to look on the computer as a panacea for deficiencies in basic hydrologic data collection programs. This fallacy must be discouraged, as the computer is a tool and not a "magic box." There is no real substitute for sound programs of basic data collection. Nuclear and isotopic techniques are being used increasingly in the developed countries in the measurement and evaluation of virtually all hydrologic parameters for which conventional techniques have been used traditionally. Even in the developed countries, however, many hydrologists are not using nuclear techniques, simply because they lack knowledge of the principles involved and of the potential benefits. Nuclear methodology in hydrologic applications is generally more complex than the conventional and hence requires a high level of technical expertise for effective use. Application of nuclear techniques to hydrologic problems in the developing countries is likely to be marginal for some years to come, owing to the higher costs involved and expertise required. Nuclear techniques, however, would seem to have particular promise in studies of water movement in unsaturated soils and of erosion and sedimentation, where conventional techniques are inadequate, inefficient and in some cases costly. Remote sensing offers great promise for synoptic evaluations of water resources and hydrologic processes, including the transient phenomena of the hydrologic cycle. Remote sensing is not, however, a panacea for deficiencies in hydrologic data programs in the developing countries. Rather it is a means for extending and augmenting on-the-ground observations and surveys (ground truth) to evaluate water resources and hydrologic processes on a regional or even continental scale. With respect to economic growth goals in developing countries, there are few identifiable gaps in existing hydrologic instrumentation and methodology insofar as appraisal, development and management of available water resources are concerned. What is needed is acceleration of institutional development and professional motivation toward more effective use of existing and proven methodology.
Moreover, much sophisticated methodology can be applied effectively in the developing countries only when adequate levels of indigenous scientific skills have been reached and supportive institutional frameworks have evolved to viability.
Evaluative methodology for prioritizing transportation energy conservation strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pang, L.M.G.
An analytical methodology was developed for the purpose of prioritizing a set of transportation energy conservation (TEC) strategies within an urban environment. Steps involved in applying the methodology consist of 1) defining the goals, objectives and constraints of the given urban community, 2) identifying potential TEC strategies, 3) assessing the impact of the strategies, 4) applying the TEC evaluation model, and 5) utilizing a selection process to determine the optimal set of strategies for implementation. This research provides an overview of 21 TEC strategies, a quick-response technique for estimating energy savings, a multiattribute utility theory approach for assessing subjective impacts, and a computer program for making the strategy evaluations, all of which assist in expediting the execution of the entire methodology procedure. The critical element of the methodology is the strategy evaluation model, which incorporates a number of desirable concepts including 1) a comprehensive accounting of all relevant impacts, 2) the application of multiobjective decision-making techniques, 3) an approach to assure compatibility among quantitative and qualitative impact measures, 4) the inclusion of the decision maker's preferences in the evaluation procedure, and 5) the cost-effectiveness concept. Application of the methodology to Salt Lake City, Utah demonstrated its utility, ease of use and favorability among decision makers.
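A minimal sketch of the multiattribute utility step, additively weighting hypothetical impact scores to rank candidate strategies (strategy names, scores and weights are all invented):

    import numpy as np

    # Rows: candidate TEC strategies; columns: impact measures
    # (energy saved, cost, public acceptance), scaled 0-1 where 1 is best.
    strategies = ["carpool program", "signal retiming", "transit incentives"]
    impacts = np.array([
        [0.7, 0.4, 0.6],
        [0.5, 0.9, 0.8],
        [0.9, 0.3, 0.5],
    ])
    weights = np.array([0.5, 0.3, 0.2])   # decision maker's preferences, sum to 1

    utilities = impacts @ weights          # additive multiattribute utility
    for name, u in sorted(zip(strategies, utilities), key=lambda p: -p[1]):
        print(f"{name}: {u:.2f}")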
Developing CORBA-Based Distributed Scientific Applications From Legacy Fortran Programs
NASA Technical Reports Server (NTRS)
Sang, Janche; Kim, Chan; Lopez, Isaac
2000-01-01
An efficient methodology is presented for integrating legacy applications written in Fortran into a distributed object framework. Issues and strategies regarding the conversion and decomposition of Fortran codes into Common Object Request Broker Architecture (CORBA) objects are discussed. Fortran codes are modified as little as possible as they are decomposed into modules and wrapped as objects. A new conversion tool takes the Fortran application as input and generates the C/C++ header file and Interface Definition Language (IDL) file. In addition, the performance of the client server computing is evaluated.
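Although the paper's wrapping targets CORBA/IDL, the core issue of calling by-reference Fortran from a host language can be sketched with Python's ctypes; the library name, routine, and compile line below are hypothetical, and the underscore suffix assumes gfortran's default name mangling.

    import ctypes

    # Hypothetical legacy routine compiled into a shared library, e.g.:
    #   subroutine saxpy(n, a, x, y)   ! y := a*x + y
    #   gfortran -shared -fPIC -o libsaxpy.so saxpy.f90
    lib = ctypes.CDLL("./libsaxpy.so")

    n = ctypes.c_int(3)
    a = ctypes.c_float(2.0)
    x = (ctypes.c_float * 3)(1.0, 2.0, 3.0)
    y = (ctypes.c_float * 3)(0.0, 0.0, 0.0)

    # Fortran passes every argument by reference, hence byref() for scalars;
    # gfortran lower-cases the name and appends an underscore.
    lib.saxpy_(ctypes.byref(n), ctypes.byref(a), x, y)
    print(list(y))   # expected: [2.0, 4.0, 6.0]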
Learning Probabilities in Computer Engineering by Using a Competency- and Problem-Based Approach
ERIC Educational Resources Information Center
Khoumsi, Ahmed; Hadjou, Brahim
2005-01-01
Our department has redesigned its electrical and computer engineering programs by adopting a learning methodology based on competence development, problem solving, and the realization of design projects. In this article, we show how this pedagogical approach has been successfully used for learning probabilities and their application to computer…
The report gives results of work that is part of EPA's program to identify and characterize emissions sources not currently accounted for by either the existing Aerometric Information Retrieval System (AIRS) or State Implementation Plan (SIP) area source methodologies and to deve...
European Social Fund in Portugal: A Complex Question for Human Resource Development
ERIC Educational Resources Information Center
Tome, Eduardo
2012-01-01
Purpose: This article aims to review the application of the funds awarded by the European Social Fund (ESF) to Portugal, since 1986, from a human resource development (HRD) perspective. Design/methodology/approach: Several variables are analyzed: investment, absorption, people, impact of investment, evolution of skills, main programs, supply and…
30 CFR 777.13 - Reporting of technical data.
Code of Federal Regulations, 2010 CFR
2010-07-01
... descriptions of the methodology used to collect and analyze the data. (b) Technical analyses shall be planned... 30 Mineral Resources 3 2010-07-01 2010-07-01 false Reporting of technical data. 777.13 Section 777... PROGRAMS GENERAL CONTENT REQUIREMENTS FOR PERMIT APPLICATIONS § 777.13 Reporting of technical data. (a) All...
E-Halagat: An E-Learning System for Teaching the Holy Quran
ERIC Educational Resources Information Center
Elhadj, Yahya O. Mohamed
2010-01-01
Recently, there has been a great interest in Islamic software that tries to harness the computer to serve the religion. This brought about some applications and programs for the Holy Quran and its sciences, Hadith (Prophet's Tradition) and its methodology, Fiqh (Islamic jurisdiction), and Islamic law in general.…
Logistic Achievement Test Scaling and Equating with Fixed versus Estimated Lower Asymptotes.
ERIC Educational Resources Information Center
Phillips, S. E.
This study compared the lower asymptotes estimated by the maximum likelihood procedures of the LOGIST computer program with those obtained via application of the Norton methodology. The study also compared the equating results from the three-parameter logistic model with those obtained from the equipercentile, Rasch, and conditional…
Safeguards Technology Development Program 1st Quarter FY 2018 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prasad, Manoj K.
LLNL will evaluate the performance of a stilbene-based scintillation detector array for IAEA neutron multiplicity counting (NMC) applications. This effort will combine newly developed modeling methodologies and recently acquired high-efficiency stilbene detector units to quantitatively compare the prototype system performance with the conventional He-3 counters and liquid scintillator alternatives.
A Binary Programming Approach to Automated Test Assembly for Cognitive Diagnosis Models
ERIC Educational Resources Information Center
Finkelman, Matthew D.; Kim, Wonsuk; Roussos, Louis; Verschoor, Angela
2010-01-01
Automated test assembly (ATA) has been an area of prolific psychometric research. Although ATA methodology is well developed for unidimensional models, its application alongside cognitive diagnosis models (CDMs) is a burgeoning topic. Two suggested procedures for combining ATA and CDMs are to maximize the cognitive diagnostic index and to use a…
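ATA of the kind referenced above is naturally expressed as binary programming; a minimal sketch with the PuLP solver library, maximizing total item information under length and attribute-coverage constraints (the item bank and constraints are hypothetical):

    from pulp import LpProblem, LpMaximize, LpVariable, lpSum

    # Hypothetical item bank: information value and attribute coverage per item.
    info    = [0.9, 0.7, 0.8, 0.4, 0.6, 0.5]
    skill_a = [1,   0,   1,   1,   0,   0  ]   # item measures attribute A?
    skill_b = [0,   1,   1,   0,   1,   1  ]   # item measures attribute B?
    n_items = len(info)

    prob = LpProblem("automated_test_assembly", LpMaximize)
    x = [LpVariable(f"item_{i}", cat="Binary") for i in range(n_items)]

    prob += lpSum(info[i] * x[i] for i in range(n_items))          # objective
    prob += lpSum(x) == 4                                          # test length
    prob += lpSum(skill_a[i] * x[i] for i in range(n_items)) >= 2  # cover A
    prob += lpSum(skill_b[i] * x[i] for i in range(n_items)) >= 2  # cover B

    prob.solve()
    print([i for i in range(n_items) if x[i].value() == 1])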
42 CFR 422.6 - Cost-sharing in enrollment-related costs.
Code of Federal Regulations, 2014 CFR
2014-10-01
... counseling and assistance program) and section 1860D-1(c) of the Act (relating to dissemination of enrollment...—(1) Timing of collection. CMS collects the fees over 9 consecutive months beginning with January of... under title XVIII. (f) Assessment methodology. (1) The amount of the applicable portion of the user fee...
42 CFR 422.6 - Cost-sharing in enrollment-related costs.
Code of Federal Regulations, 2011 CFR
2011-10-01
... assistance program) and section 1860D-1(c) of the Act (relating to dissemination of enrollment information... which enrollment is effected or coordinated under section 1851 of the Act. (d) Collection of fees—(1... under title XVIII. (f) Assessment methodology. (1) The amount of the applicable portion of the user fee...
42 CFR 422.6 - Cost-sharing in enrollment-related costs.
Code of Federal Regulations, 2012 CFR
2012-10-01
... counseling and assistance program) and section 1860D-1(c) of the Act (relating to dissemination of enrollment...—(1) Timing of collection. CMS collects the fees over 9 consecutive months beginning with January of... under title XVIII. (f) Assessment methodology. (1) The amount of the applicable portion of the user fee...
42 CFR 422.6 - Cost-sharing in enrollment-related costs.
Code of Federal Regulations, 2013 CFR
2013-10-01
... counseling and assistance program) and section 1860D-1(c) of the Act (relating to dissemination of enrollment...—(1) Timing of collection. CMS collects the fees over 9 consecutive months beginning with January of... under title XVIII. (f) Assessment methodology. (1) The amount of the applicable portion of the user fee...
Ranked Set Sampling and Its Applications in Educational Statistics
ERIC Educational Resources Information Center
Stovall, Holly
2012-01-01
Over the past decade educational research has been stimulated by new legislation such as the No Child Left Behind Act. Increasing emphasis is being placed on accurately quantifying the success of treatment programs through student achievement scores, so precise estimation is vital for establishing the efficacy of new methodology. Ranked set…
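Since the snippet is truncated, it may help to recall the basic ranked set sampling procedure it refers to; a minimal sketch with hypothetical score data:

    import random

    def ranked_set_sample(population, m, key=lambda v: v):
        """Draw one ranked-set sample of size m: for the i-th of m sets,
        draw m units, rank them (here by the key), keep the i-th ranked unit."""
        sample = []
        for i in range(m):
            judgment_set = sorted(random.sample(population, m), key=key)
            sample.append(judgment_set[i])
        return sample

    scores = [random.gauss(500, 100) for _ in range(10_000)]  # invented scores
    print(ranked_set_sample(scores, m=4))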
Knowledge Management Model: Practical Application for Competency Development
ERIC Educational Resources Information Center
Lustri, Denise; Miura, Irene; Takahashi, Sergio
2007-01-01
Purpose: This paper seeks to present a knowledge management (KM) conceptual model for competency development and a case study in a law service firm, which implemented the KM model in a competencies development program. Design/methodology/approach: The case study method was applied according to Yin (2003) concepts, focusing a six-professional group…
THE EMERGING FOCUS ON LIFE-CYCLE ASSESSMENT IN THE U. S. ENVIRONMENTAL PROTECTION AGENCY
EPA has been actively engaged in LCA research since 1990 to help advance the methodology and application of life cycle thinking in decision-making. Across the Agency consideration of the life cycle concept is increasing in the development of policies and programs. A major force i...
Summaries of the thematic conferences on remote sensing for exploration geology
NASA Technical Reports Server (NTRS)
1989-01-01
The Thematic Conference series was initiated to address the need for concentrated discussion of particular remote sensing applications. The program is primarily concerned with the application of remote sensing to mineral and hydrocarbon exploration, with special emphasis on data integration, methodologies, and practical solutions for geologists. Some fifty invited papers are scheduled for eleven plenary sessions, formulated to address such important topics as basement tectonics and their surface expressions, spectral geology, applications for hydrocarbon exploration, and radar applications and future systems. Other invited presentations will discuss geobotanical remote sensing, mineral exploration, engineering and environmental applications, advanced image processing, and integration and mapping.
A Robustness Testing Campaign for IMA-SP Partitioning Kernels
NASA Astrophysics Data System (ADS)
Grixti, Stephen; Lopez Trecastro, Jorge; Sammut, Nicholas; Zammit-Mangion, David
2015-09-01
With time and space partitioned architectures becoming increasingly appealing to the European space sector, the dependability of partitioning kernel technology is a key factor to its applicability in European Space Agency projects. This paper explores the potential of the data type fault model, which injects faults through the Application Program Interface, in partitioning kernel robustness testing. This fault injection methodology has been tailored to investigate its relevance in uncovering vulnerabilities within partitioning kernels and potentially contributing towards fault removal campaigns within this domain. This is demonstrated through a robustness testing case study of the XtratuM partitioning kernel for SPARC LEON3 processors. The robustness campaign exposed a number of vulnerabilities in XtratuM, exhibiting the potential benefits of using such a methodology for the robustness assessment of partitioning kernels.
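The data type fault model can be sketched in a few lines: each parameter of an API entry point is replaced in turn with values of invalid type, and the target's reaction is logged. The target function below is a Python stand-in, not the XtratuM API.

    # Data-type fault injection sketch; INVALID values are illustrative.
    INVALID = [None, -1, 2**63, "junk", object()]

    def target_api(buffer_id: int, length: int):   # stand-in for a kernel call
        if not isinstance(buffer_id, int) or not isinstance(length, int):
            raise TypeError("bad parameter")
        if length < 0:
            raise ValueError("negative length")
        return 0

    def robustness_campaign(func, nominal_args):
        results = []
        for pos in range(len(nominal_args)):
            for bad in INVALID:
                args = list(nominal_args)
                args[pos] = bad                     # inject one faulty parameter
                try:
                    func(*args)
                    outcome = "accepted"            # potential vulnerability
                except Exception as exc:
                    outcome = type(exc).__name__    # graceful rejection
                results.append((pos, repr(bad), outcome))
        return results

    for row in robustness_campaign(target_api, [3, 128]):
        print(row)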
Machine learning in motion control
NASA Technical Reports Server (NTRS)
Su, Renjeng; Kermiche, Noureddine
1989-01-01
The existing methodologies for robot programming originate primarily from robotic applications to manufacturing, where uncertainties of the robots and their task environment may be minimized by repeated off-line modeling and identification. In space applications of robots, however, a higher degree of automation is required for robot programming because of the desire to minimize human intervention. We discuss a new paradigm of robotic programming which is based on the concept of machine learning. The goal is to let robots practice tasks by themselves, with the operational data used to automatically improve their motion performance. The underlying mathematical problem is to solve the dynamical inverse problem by iterative methods. One of the key questions is how to ensure the convergence of the iterative process. There have been a few small steps taken toward this important approach to robot programming. We give a representative result on the convergence problem.
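The "practice and improve" paradigm corresponds to what is now called iterative learning control; a minimal sketch on a toy first-order plant, where the condition |1 - gain*b| < 1 on the first input gain answers the convergence question for this simple case (plant and gains are illustrative, not from the paper):

    import numpy as np

    def run_trial(u, a=0.2, b=1.0):
        """Simulate one practice pass of the plant y[t+1] = a*y[t] + b*u[t]."""
        y = np.zeros(len(u) + 1)
        for t in range(len(u)):
            y[t + 1] = a * y[t] + b * u[t]
        return y[1:]

    T = 50
    ref = np.sin(np.linspace(0, np.pi, T))   # desired motion
    u = np.zeros(T)
    gain = 0.8                               # converges here: |1 - gain*b| < 1

    for trial in range(20):
        e = ref - run_trial(u)               # practice the task, measure error
        u = u + gain * e                     # learning update: u_{k+1} = u_k + L e_k
    print(f"max tracking error after practice: {np.max(np.abs(ref - run_trial(u))):.2e}")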
Automation of Shuttle Tile Inspection - Engineering methodology for Space Station
NASA Technical Reports Server (NTRS)
Wiskerchen, M. J.; Mollakarimi, C.
1987-01-01
The Space Systems Integration and Operations Research Applications (SIORA) Program was initiated in late 1986 as a cooperative applications research effort between Stanford University, NASA Kennedy Space Center, and Lockheed Space Operations Company. One of the major initial SIORA tasks was the application of automation and robotics technology to all aspects of the Shuttle tile processing and inspection system. This effort has adopted a systems engineering approach consisting of an integrated set of rapid prototyping testbeds in which a government/university/industry team of users, technologists, and engineers test and evaluate new concepts and technologies within the operational world of Shuttle. These integrated testbeds include speech recognition and synthesis, laser imaging inspection systems, distributed Ada programming environments, distributed relational database architectures, distributed computer network architectures, multimedia workbenches, and human factors considerations.
Application of quality function deployment in defense technology development
NASA Astrophysics Data System (ADS)
Cornejo, Estrella De Maria Forster
1998-12-01
Introduction. As advances in aviation technology take place, the research community recognizes, and the operator demands, that progress in technology be integrated with the weapon system's human component, the aircrew. Integrative programs are primarily concerned with three sub-systems: the operator, the aircraft, and the cockpit. To accomplish their integration, a "dialogue" between the various disciplines addressing these aspects of the weapon system is indispensable. Such dialogue is theorized to be possible via Quality Function Deployment (QFD). QFD emphasizes an understanding of the relationships between the requirements of the aircrew and the technology the research and engineering community provides. In establishing these relationships, program management concerns such as need, applicability, affordability, and transition of the technology are addressed. Procedures. QFD was incorporated in a Performance Methodology (PMM). This methodology associates a particular technology's Measures of Performance (MOPs) with the Measures of Effectiveness (MOEs) of the overall weapon system to which it is applied. Incorporation of QFD in the PMM was hypothesized to result in an improved PMM (Q-PMM) that would address both the aircrew's interests and those of program management. Both methodologies were performed. The g-ensemble was selected as the technology of interest. The Standard and Combat Edge designs were examined for comparison purposes. Technology MOPs were ranked in order of importance in accordance with both the PMM and its proposed improvement, the Q-PMM. These methodologies were then evaluated by way of an experiment in the human centrifuge. This experiment was to answer two questions: Is there a relationship between the technology's MOPs and the aircraft's MOEs? Given a MOP-MOE relationship, is there a difference between the two ensembles? Findings. The Q-PMM was superior to the PMM in addressing customer requirements. The Q-PMM was superior to the PMM in addressing program management concerns. The Q-PMM provided an improvement in defining the MOPs that best describe the overall system MOEs. Both MOP-MOE associations and a basis for comparison between the two ensembles were elucidated. Conclusion. The findings demonstrated QFD to be an effective approach to defense technology development.
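The core QFD computation of relating requirements to technical measures is a weighted relationship matrix; a minimal house-of-quality sketch with invented aircrew requirements, MOPs, and the conventional 9/3/1/0 relationship strengths:

    import numpy as np

    # House-of-quality sketch: aircrew requirements vs. technical MOPs.
    requirement_weights = np.array([5, 3, 4])   # importance of each requirement
    relationships = np.array([                  # rows: requirements, cols: MOPs
        [9, 3, 0],    # sustain +Gz tolerance
        [1, 9, 3],    # minimize fatigue
        [0, 3, 9],    # don/doff time
    ])
    mop_priority = requirement_weights @ relationships
    print(mop_priority)   # technical priorities used to rank the MOPs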
NERVA dynamic analysis methodology, SPRVIB
NASA Technical Reports Server (NTRS)
Vronay, D. F.
1972-01-01
The general dynamic computer code called SPRVIB (Spring Vib) developed in support of the NERVA (nuclear engine for rocket vehicle application) program is described. Using normal mode techniques, the program computes kinematical responses of a structure caused by various combinations of harmonic and elliptic forcing functions or base excitations. Provision is made for a graphical type of force or base excitation input to the structure. A description of the required input format and a listing of the program are presented, along with several examples illustrating the use of the program. SPRVIB is written in FORTRAN 4 computer language for use on the CDC 6600 or the IBM 360/75 computers.
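The normal-mode technique that SPRVIB relies on can be sketched in a few lines for an undamped two-degree-of-freedom system under harmonic forcing; the matrices and forcing frequency below are illustrative, and the program itself handles far more general excitations.

    import numpy as np

    # Modal superposition: steady-state response to a force F*sin(Omega*t).
    K = np.array([[4.0, -2.0], [-2.0, 4.0]])   # stiffness (illustrative)
    M = np.eye(2)                               # mass normalized to identity
    F = np.array([1.0, 0.0])                    # force amplitude vector
    Omega = 1.2                                 # forcing frequency (rad/s)

    w2, Phi = np.linalg.eigh(K)                 # modal frequencies^2, mode shapes
    q = (Phi.T @ F) / (w2 - Omega**2)           # modal amplitudes (off resonance)
    x = Phi @ q                                 # physical response amplitudes
    print(x)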
Variable thickness transient ground-water flow model. Volume 3. Program listings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reisenauer, A.E.
1979-12-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This is the third of three volumes of the description of the VTT (Variable Thickness Transient) Groundwater Hydrologic Model: the second-level (intermediate complexity) two-dimensional saturated groundwater flow model.
CSTI Earth-to-orbit propulsion research and technology program overview
NASA Technical Reports Server (NTRS)
Gentz, Steven J.
1993-01-01
NASA supports a vigorous Earth-to-orbit (ETO) research and technology program as part of its Civil Space Technology Initiative. The purpose of this program is to provide an up-to-date technology base to support future space transportation needs for a new generation of lower cost, operationally efficient, long-lived and highly reliable ETO propulsion systems by enhancing the knowledge, understanding and design methodology applicable to advanced oxygen/hydrogen and oxygen/hydrocarbon ETO propulsion systems. Program areas of interest include analytical models, advanced component technology, instrumentation, and validation/verification testing. Organizationally, the program is divided between technology acquisition and technology verification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marietta, Melvin Gary; Anderson, D. Richard; Bonano, Evaristo J.
2011-11-01
Sandia National Laboratories (SNL) is the world leader in the development of the detailed science underpinning the application of a probabilistic risk assessment methodology, referred to in this report as performance assessment (PA), for (1) understanding and forecasting the long-term behavior of a radioactive waste disposal system, (2) estimating the ability of the disposal system and its various components to isolate the waste, (3) developing regulations, (4) implementing programs to estimate the safety that the system can afford to individuals and to the environment, and (5) demonstrating compliance with the attendant regulatory requirements. This report documents the evolution of the SNL PA methodology from inception in the mid-1970s, summarizing major SNL PA applications including: the Subseabed Disposal Project PAs for high-level radioactive waste; the Waste Isolation Pilot Plant PAs for disposal of defense transuranic waste; the Yucca Mountain Project total system PAs for deep geologic disposal of spent nuclear fuel and high-level radioactive waste; PAs for the Greater Confinement Borehole Disposal boreholes at the Nevada National Security Site; and PA evaluations for disposal of high-level wastes and Department of Energy spent nuclear fuels stored at Idaho National Laboratory. In addition, the report summarizes smaller PA programs for long-term cover systems implemented for the Monticello, Utah, mill-tailings repository; a PA for the SNL Mixed Waste Landfill in support of environmental restoration; PA support for radioactive waste management efforts in Egypt, Iraq, and Taiwan; and, most recently, PAs for analysis of alternative high-level radioactive waste disposal strategies, including deep borehole disposal and geologic repositories in shale and granite. Finally, this report summarizes the extension of the PA methodology for radioactive waste disposal toward development of an enhanced PA system for carbon sequestration and storage systems. These efforts have produced a generic PA methodology for the evaluation of waste management systems that has gained wide acceptance within the international community. This report documents how this methodology has been used as an effective management tool to evaluate different disposal designs and sites; inform development of regulatory requirements; identify, prioritize, and guide research aimed at reducing uncertainties for objective estimations of risk; and support safety assessments.
Cognitive Networking With Regards to NASA's Space Communication and Navigation Program
NASA Technical Reports Server (NTRS)
Ivancic, William D.; Paulsen, Phillip E.; Vaden, Karl R.; Ponchak, Denise S.
2013-01-01
This report describes cognitive networking (CN) and its application to NASA's Space Communication and Networking (SCaN) Program. This report clarifies the terminology and framework of CN and provides some examples of cognitive systems. It then provides a methodology for developing and deploying CN techniques and technologies. Finally, the report attempts to answer specific questions regarding how CN could benefit SCaN. It also describes SCaN's current and target networks and proposes places where cognition could be deployed.
NASA Astrophysics Data System (ADS)
1992-10-01
The overall objective of the DARPA/Tri-Service RASSP program is to demonstrate a capability to rapidly specify, produce, and yield domain-specific, affordable signal processors for use in Department of Defense systems such as automatic target acquisition, tracking, and recognition, electronic countermeasures, communications, and SIGINT. The objective of the study phase is to specify a recommended program plan for the government to use as a template for procurement of the RASSP design system and demonstration program. To accomplish that objective, the study phase tasks are to specify a development methodology for signal processors (adaptable to various organizational design styles and application areas), analyze the requirements in CAD/CAE tools to support the development methodology, identify the state and development plans of the industry relative to this area, and recommend, as RASSP developments, the additional developments not currently being addressed by industry. In addition, the RASSP study phase will define an approach for electronically linking design centers to manufacturing centers so that a complete prototyping cycle can be accomplished with significantly reduced cycle time.
HPC Programming on Intel Many-Integrated-Core Hardware with MAGMA Port to Xeon Phi
Dongarra, Jack; Gates, Mark; Haidar, Azzam; ...
2015-01-01
This paper presents the design and implementation of several fundamental dense linear algebra (DLA) algorithms for multicore with Intel Xeon Phi coprocessors. In particular, we consider algorithms for solving linear systems. Further, we give an overview of the MAGMA MIC library, an open source, high performance library that incorporates the developments presented here and, more broadly, provides the DLA functionality equivalent to that of the popular LAPACK library while targeting heterogeneous architectures that feature a mix of multicore CPUs and coprocessors. The LAPACK-compliance simplifies the use of the MAGMA MIC library in applications, while providing them with portably performant DLA. High performance is obtained through the use of the high-performance BLAS, hardware-specific tuning, and a hybridization methodology whereby we split the algorithm into computational tasks of various granularities. Execution of those tasks is properly scheduled over the heterogeneous hardware by minimizing data movements and mapping algorithmic requirements to the architectural strengths of the various heterogeneous hardware components. Our methodology and programming techniques are incorporated into the MAGMA MIC API, which abstracts the application developer from the specifics of the Xeon Phi architecture and is therefore applicable to algorithms beyond the scope of DLA.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-29
...: Data collection to understand how NIH programs apply methodologies to improve their research programs... research programs apply methodologies to improve their organizational effectiveness. The degree of an...; 30-Day Comment Request; Data Collection To Understand How NIH Programs Apply Methodologies To Improve...
Outline of cost-benefit analysis and a case study
NASA Technical Reports Server (NTRS)
Kellizy, A.
1978-01-01
The methodology of cost-benefit analysis is reviewed and a case study involving solar cell technology is presented. Emphasis is placed on simplifying the technique in order to permit a technical person not trained in economics to undertake a cost-benefit study comparing alternative approaches to a given problem. The role of economic analysis in management decision making is discussed. In simplifying the methodology it was necessary to restrict the scope and applicability of this report. Additional considerations and constraints are outlined. Examples are worked out to demonstrate the principles. A computer program which performs the computational aspects appears in the appendix.
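The computational core of such a cost-benefit comparison is typically a discounted cash-flow calculation. A minimal sketch, with hypothetical cash flows standing in for the report's solar-cell alternatives (the original appendix program is not reproduced in this abstract):

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows at a given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Two hypothetical alternatives: upfront cost (negative), then yearly benefits
option_a = [-1000] + [180] * 10
option_b = [-1500] + [260] * 10
for name, flows in [("A", option_a), ("B", option_b)]:
    print(name, round(npv(flows, 0.07), 1))  # pick the larger NPV
```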
Campbell, Rebecca; Patterson, Debra; Bybee, Deborah
2011-03-01
This article reviews current epistemological and design issues in the mixed methods literature and then examines the application of one specific design, a sequential explanatory mixed methods design, in an evaluation of a community-based intervention to improve postassault care for sexual assault survivors. Guided by a pragmatist epistemological framework, this study collected quantitative and qualitative data to understand how the implementation of a Sexual Assault Nurse Examiner (SANE) program affected prosecution rates of adult sexual assault cases in a large midwestern community. Quantitative results indicated that the program was successful in affecting legal systems change and the qualitative data revealed the mediating mechanisms of the intervention's effectiveness. Challenges of implementing this design are discussed, including epistemological and practical difficulties that developed from blending methodologies into a single project. © The Author(s) 2011.
Energy efficient transport technology: Program summary and bibliography
NASA Technical Reports Server (NTRS)
Middleton, D. B.; Bartlett, D. W.; Hood, R. V.
1985-01-01
The Energy Efficient Transport (EET) Program began in 1976 as an element of the NASA Aircraft Energy Efficiency (ACEE) Program. The EET Program and the results of various applications of advanced aerodynamics and active controls technology (ACT) as applicable to future subsonic transport aircraft are discussed. Advanced aerodynamics research areas included high aspect ratio supercritical wings, winglets, advanced high lift devices, natural laminar flow airfoils, hybrid laminar flow control, nacelle aerodynamic and inertial loads, propulsion/airframe integration (e.g., long duct nacelles) and wing and empennage surface coatings. In-depth analytical/trade studies, numerous wind tunnel tests, and several flight tests were conducted. Improved computational methodology was also developed. The active control functions considered were maneuver load control, gust load alleviation, flutter mode control, angle of attack limiting, and pitch augmented stability. Current and advanced active control laws were synthesized and alternative control system architectures were developed and analyzed. Integrated application and fly-by-wire implementation of the active control functions were design requirements in one major subprogram. Additional EET research included interdisciplinary technology applications, integrated energy management, handling qualities investigations, reliability calculations, and economic evaluations related to fuel savings and cost of ownership of the selected improvements.
Programming research: where are we and where do we go from here?
Koletzko, Berthold; Symonds, Michael E; Olsen, Sjurdur F
2011-12-01
Convincing evidence has accumulated to show that both pre- and postnatal nutrition preprogram long-term health, well-being, and performance until adulthood and old age. There is a very large potential in the application of this knowledge to promote public health. One of the prerequisites for translational application is to strengthen the scientific evidence. More extensive knowledge is needed (eg, on effect sizes of early life programming in contemporary populations, on specific nutritional exposures, on sensitive time periods in early life, on precise underlying mechanisms, and on potential effect differences in subgroups characterized by, eg, genetic predisposition or sex). Future programming research should aim at filling the existing gaps in scientific knowledge, consider the entire lifespan, address socioeconomic issues, and foster innovation. Research should aim at results suitable for translational application (eg, by leading to health-promoting policies and evidence-based dietary recommendations in the perinatal period). International collaboration and a close research partnership of academia, industry, and small and medium enterprises may strengthen research and innovative potential enhancing the likelihood of translational application. The scientific know-how and methodology available today allow us to take major steps forward in the near future; hence, research on nutritional programming deserves high priority.
Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities
NASA Astrophysics Data System (ADS)
Shivanand M., Handigund; Shweta, Bhat
The Software Requirements Specification (SRS) of the organization is a text document prepared by strategic management incorporating the requirements of the organization. These requirements of the ongoing business/project development process involve the software tools, the hardware devices, the manual procedures, the application programs, and the communication commands. These components are appropriately ordered for achieving the mission of the concerned process, both in project development and in ongoing business processes, in different flow diagrams, viz. the activity chart, workflow diagram, activity diagram, component diagram, and deployment diagram. This paper proposes two generic, automatic methodologies for the design of the various flow diagrams of (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though both methodologies are independent, each complements the other in authenticating its correctness and completeness.
Application of a substructuring technique to the problem of crack extension and closure
NASA Technical Reports Server (NTRS)
Armen, H., Jr.
1974-01-01
A substructuring technique, originally developed for the efficient reanalysis of structures, is incorporated into the methodology associated with the plastic analysis of structures. An existing finite-element computer program that accounts for elastic-plastic material behavior under cyclic loading was modified to account for changing kinematic constraint conditions - crack growth and intermittent contact of crack surfaces in two-dimensional regions. Application of the analysis is presented for a center-crack panel problem to demonstrate the efficiency and accuracy of the technique.
Balanced Scorecards in Managing Higher Education Institutions: An Indian Perspective
ERIC Educational Resources Information Center
Umashankar, Venkatesh; Dutta, Kirti
2007-01-01
Purpose: The paper aims to look at the balanced scorecard (BSC) concept and discuss in what way it should be applied to higher education programs/institutions in the Indian context. Design/methodology/approach: The paper is based on extant literature on the balanced scorecard concept per se, as well as applications of BSC in higher education as…
Development of Robotics Applications in a Solid Propellant Mixing Laboratory
1988-06-01
implementation of robotic hardware and software into a laboratory environment requires a carefully structured series of phases which examines, in ... strategy. The general methodology utilized in this project is discussed in Appendix A. The proposed laboratory robotics development program was structured ... accessibility, potential modifications, safety precautions; (e) robot transport: slider mechanisms, linear tracks, gantry configuration, mobility; (f) ...
ERIC Educational Resources Information Center
Jobbitt, Todd
2014-01-01
The purpose of this survey was to ascertain Korean teacher-trainees' perspectives on the awareness, likability, perceived usefulness and prospective application of varied language teaching methods that they had been taught in a sixteen-week language teaching methodology course. What did the students think about these methods? Will students…
Error-rate prediction for programmable circuits: methodology, tools and studied cases
NASA Astrophysics Data System (ADS)
Velazco, Raoul
2013-05-01
This work presents an approach to predict the error rates due to Single Event Upsets (SEU) occurring in programmable circuits as a consequence of the impact of energetic particles present in the environment in which the circuits operate. For a chosen application, the error rate is predicted by combining the results obtained from radiation ground testing with the results of fault-injection campaigns performed off-beam, during which huge numbers of SEUs are injected during the execution of the studied application. The goal of this strategy is to obtain accurate results about different applications' error rates without using particle accelerator facilities, thus significantly reducing the cost of the sensitivity evaluation. As a case study, this methodology was applied to a complex processor, the PowerPC 7448, executing a program issued from a real space application, and to a crypto-processor application implemented in an SRAM-based FPGA and accepted to be embedded in the payload of a scientific satellite of NASA. The accuracy of the predicted error rates was confirmed by comparing, for the same circuit and application, predictions with measurements issued from radiation ground testing performed at the Cyclone cyclotron of the Heavy Ion Facility (HIF) at Louvain-la-Neuve (Belgium).
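The combination the abstract describes is commonly expressed as: predicted error rate = (static SEU cross-section x particle flux) x (fraction of injected SEUs that produce an application-level error). A minimal sketch under that assumption; the numeric values are illustrative only, not from the paper:

```python
def predicted_error_rate(sigma_seu, particle_flux, n_injected, n_app_errors):
    """
    sigma_seu     : static SEU cross-section from ground testing (cm^2/device)
    particle_flux : particle flux of the target environment (particles/cm^2/day)
    n_injected    : number of SEUs injected in the fault-injection campaign
    n_app_errors  : injected SEUs that produced an application-level error
    Returns predicted application errors per day.
    """
    tau_inj = n_app_errors / n_injected          # error probability per upset
    upsets_per_day = sigma_seu * particle_flux   # device upsets per day
    return upsets_per_day * tau_inj

# Illustrative: 1e-7 cm^2 cross-section, 1e5 particles/cm^2/day, 430/10000 errors
print(predicted_error_rate(1e-7, 1e5, 10_000, 430))
```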
Performance evaluation of a distance learning program.
Dailey, D J; Eno, K R; Brinkley, J F
1994-01-01
This paper presents a performance metric which uses a single number to characterize the response time for a non-deterministic client-server application operating over the Internet. When applied to a Macintosh-based distance learning application called the Digital Anatomist Browser, the metric allowed us to observe that "A typical student doing a typical mix of Browser commands on a typical data set will experience the same delay if they use a slow Macintosh on a local network or a fast Macintosh on the other side of the country accessing the data over the Internet." The methodology presented is applicable to other client-server applications that are rapidly appearing on the Internet.
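The paper's exact metric is not reproduced in this abstract, but a single-number characterization of a "typical mix" of commands is naturally an expectation of response time over that mix. A hedged sketch, with hypothetical Browser command names, fractions, and timings:

```python
def mean_response_time(command_mix, timings):
    """
    command_mix : {command: fraction of commands in a typical session}
    timings     : {command: measured mean response time (seconds)}
    Returns one expected-delay figure for the typical session.
    """
    return sum(frac * timings[cmd] for cmd, frac in command_mix.items())

mix = {"load_image": 0.2, "rotate": 0.5, "annotate": 0.3}   # hypothetical
local_slow_mac = {"load_image": 2.1, "rotate": 0.4, "annotate": 0.3}
remote_fast_mac = {"load_image": 2.2, "rotate": 0.4, "annotate": 0.3}
print(mean_response_time(mix, local_slow_mac),
      mean_response_time(mix, remote_fast_mac))  # nearly equal, as observed
```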
Application of parameter estimation to aircraft stability and control: The output-error approach
NASA Technical Reports Server (NTRS)
Maine, Richard E.; Iliff, Kenneth W.
1986-01-01
The practical application of parameter estimation methodology to the problem of estimating aircraft stability and control derivatives from flight test data is examined. The primary purpose of the document is to present a comprehensive and unified picture of the entire parameter estimation process and its integration into a flight test program. The document concentrates on the output-error method to provide a focus for detailed examination and to allow us to give specific examples of situations that have arisen. The document first derives the aircraft equations of motion in a form suitable for application to estimation of stability and control derivatives. It then discusses the issues that arise in adapting the equations to the limitations of analysis programs, using a specific program for an example. The roles and issues relating to mass distribution data, preflight predictions, maneuver design, flight scheduling, instrumentation sensors, data acquisition systems, and data processing are then addressed. Finally, the document discusses evaluation and the use of the analysis results.
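The output-error method adjusts model parameters until the simulated outputs, driven by the measured inputs, match the measured outputs. A minimal sketch using a toy first-order pitch-rate model rather than the full aircraft equations of motion derived in the document; the model, noise level, and parameter values are assumptions for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

def simulate(theta, u, dt):
    """First-order pitch-rate model q_dot = -a*q + b*de, Euler-integrated."""
    a, b = theta
    q = np.zeros_like(u)
    for k in range(1, len(u)):
        q[k] = q[k - 1] + dt * (-a * q[k - 1] + b * u[k - 1])
    return q

rng = np.random.default_rng(0)
dt, n = 0.02, 500
u = np.sin(0.5 * np.arange(n) * dt)            # "measured" elevator input
q_true = simulate((2.0, 1.5), u, dt)           # truth with a=2.0, b=1.5
z = q_true + 0.01 * rng.standard_normal(n)     # "measured" noisy output

# Output error: minimize the residual between simulated and measured outputs
res = least_squares(lambda th: simulate(th, u, dt) - z, x0=(1.0, 1.0))
print(res.x)   # estimates close to (2.0, 1.5)
```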
NASA Technical Reports Server (NTRS)
Jones, Kennie H.; Randall, Donald P.; Stallcup, Scott S.; Rowell, Lawrence F.
1988-01-01
The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational data base management system. In volume 2, a SYSTEM LIBRARY PROCESSOR is used to construct a DATA DICTIONARY describing all relations defined in the data base, and a TEMPLATE LIBRARY. A TEMPLATE is a description of all subsets of relations (including conditional selection criteria and sorting specifications) to be accessed as input or output for a given application. Together, these form the SYSTEM LIBRARY, which is used to automatically produce the data base schema, FORTRAN subroutines to retrieve/store data from/to the data base, and instructions to a generic REVIEWER program providing review/modification of data for a given template. Automation of these functions eliminates much of the tedious, error-prone work required by the usual approach to data base integration.
Web-4D-QSAR: A web-based application to generate 4D-QSAR descriptors.
Ataide Martins, João Paulo; Rougeth de Oliveira, Marco Antônio; Oliveira de Queiroz, Mário Sérgio
2018-06-05
A web-based application is developed to generate 4D-QSAR descriptors using the LQTA-QSAR methodology, based on molecular dynamics (MD) trajectories and topology information retrieved from the GROMACS package. The LQTAGrid module calculates the intermolecular interaction energies at each grid point, considering probes and all aligned conformations resulting from MD simulations. These interaction energies are the independent variables or descriptors employed in a QSAR analysis. A friendly front end web interface, built using the Django framework and Python programming language, integrates all steps of the LQTA-QSAR methodology in a way that is transparent to the user, and in the backend, GROMACS and LQTAGrid are executed to generate 4D-QSAR descriptors to be used later in the process of QSAR model building. © 2018 Wiley Periodicals, Inc.
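The grid-descriptor idea can be sketched compactly: evaluate a probe's Coulomb plus Lennard-Jones energy at every grid point, averaged over the aligned MD frames, giving one descriptor column per point. This is a schematic re-implementation under Lorentz-Berthelot mixing rules with fabricated coordinates and parameters, not the actual LQTAGrid code (which takes its parameters from GROMACS topologies):

```python
import numpy as np

def grid_energies(coords, charges, eps, sigma, grid,
                  q_probe=1.0, eps_probe=0.1, sigma_probe=3.0):
    """Probe interaction energies (kcal/mol) at each grid point,
    averaged over aligned conformations.

    coords : (n_frames, n_atoms, 3) aligned coordinates, angstroms
    charges, eps, sigma : per-atom force-field parameters
    grid   : (n_points, 3) grid coordinates
    """
    ke = 332.06  # Coulomb constant, kcal*angstrom/(mol*e^2)
    energies = np.zeros((len(coords), len(grid)))
    for f, frame in enumerate(coords):
        d = np.linalg.norm(grid[:, None, :] - frame[None, :, :], axis=2)
        e_coul = ke * q_probe * charges / d
        s = 0.5 * (sigma + sigma_probe)          # arithmetic-mean sigma
        e_mix = np.sqrt(eps * eps_probe)         # geometric-mean epsilon
        e_lj = 4 * e_mix * ((s / d) ** 12 - (s / d) ** 6)
        energies[f] = (e_coul + e_lj).sum(axis=1)
    return energies.mean(axis=0)  # one descriptor per grid point

# Tiny fabricated example: 2 frames, 2 atoms, 3 grid points
coords = np.random.default_rng(1).normal(size=(2, 2, 3)) * 2
print(grid_energies(coords, charges=np.array([0.3, -0.3]),
                    eps=np.array([0.15, 0.21]), sigma=np.array([3.4, 3.0]),
                    grid=np.array([[5., 0., 0.], [0., 5., 0.], [0., 0., 5.]])))
```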
What lies behind crop decisions? Coming to terms with revealing farmers' preferences
NASA Astrophysics Data System (ADS)
Gomez, C.; Gutierrez, C.; Pulido-Velazquez, M.; López Nicolás, A.
2016-12-01
The paper offers a fully fledged applied revealed-preference methodology to screen and represent farmers' choices as the solution of an optimal program involving trade-offs among the alternative welfare outcomes of crop decisions, such as profits, income security and ease of management. The recursive two-stage method is proposed as an alternative to cope with the methodological problems inherent in common-practice positive mathematical programming (PMP) methodologies. Differently from PMP, in the model proposed in this paper the non-linear costs that are required for both calibration and smooth adjustment are not at odds with the assumptions of linear Leontief technologies and fixed crop prices and input costs. The method frees the model from ad-hoc assumptions about costs and thus recovers the potential of economic analysis as a means to understand the rationale behind observed and forecasted farmers' decisions, enhancing the model's ability to support policy making in relevant domains such as agricultural policy, water management, risk management and climate change adaptation. After the introduction, where the methodological drawbacks and challenges are set out, section two presents the theoretical model, section three develops its empirical application to a Spanish irrigation district, and section four concludes and makes suggestions for further research.
Features and characterization needs of rubber composite structures
NASA Technical Reports Server (NTRS)
Tabaddor, Farhad
1989-01-01
Some of the major unique features of rubber composite structures are outlined. The features covered are those related to the material properties, but the analytical features are also briefly discussed. It is essential to recognize these features at the planning stage of any long-range analytical, experimental, or application program. The development of a general and comprehensive program which fully accounts for all the important characteristics of tires, under all the relevant modes of operation, may present a prohibitively expensive and impractical task in the near future. There is therefore a need to develop application methodologies that can utilize less general models, beyond their theoretical limitations and yet with reasonable reliability, through a proper mix of analytical, experimental, and testing activities.
Yeun, Eun Ja; Kwon, Hye Jin; Kim, Hyun Jeong
2012-06-01
This study was done to identify the awareness of gender equality among nursing college students and to provide basic data for educational solutions and desirable directions. A Q-methodology, which provides a means of analyzing the subjectivity of each item, was used. Thirty-four selected Q-statements from each of 20 women nursing college students were classified into a shape of normal distribution using a 9-point scale. Subjectivity regarding gender equality was analyzed with the pc-QUANL program. Four types of awareness of gender equality in nursing college students were identified. Type I was named 'pursuit of androgyny'; type II, 'difference-recognition'; type III, 'human-relationship emphasis'; and type IV, 'social-system emphasis'. The results of this study indicate that different approaches to educational programs on gender equality are recommended for nursing college students based on the four types of gender equality awareness.
Wilk, Szymon; Michalowski, Martin; Michalowski, Wojtek; Hing, Marisela Mainegra; Farion, Ken
2011-01-01
This paper describes a new methodological approach to reconciling adverse and contradictory activities (called points of contention) occurring when a patient is managed according to two or more concurrently used clinical practice guidelines (CPGs). The need to address these inconsistencies occurs when a patient with more than one disease, each of which is a comorbid condition, has to be managed according to different treatment regimens. We propose an automatic procedure that constructs a mathematical guideline model using the Constraint Logic Programming (CLP) methodology, uses this model to identify and mitigate encountered points of contention, and revises the considered CPGs accordingly. The proposed procedure is used as an alerting mechanism and coupled with a guideline execution engine warns the physician about potential problems with the concurrent application of two or more guidelines. We illustrate the operation of our procedure in a clinical scenario describing simultaneous use of CPGs for duodenal ulcer and transient ischemic attack.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, S.K.; Cole, C.R.; Bond, F.W.
1979-12-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This document consists of the description of the FE3DGW (Finite Element, Three-Dimensional Groundwater) hydrologic model, a third-level (high-complexity) three-dimensional finite element approach (Galerkin formulation) for saturated groundwater flow.
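FE3DGW itself is a three-dimensional Galerkin finite element code; as a sketch of the underlying approach, a one-dimensional analogue with linear elements and fixed-head boundaries fits in a few lines. This is a didactic stand-in under simplified assumptions (steady state, no sources), not the AEGIS code:

```python
import numpy as np

def solve_head_1d(K, L, h_left, h_right, n_el=10):
    """Steady 1-D saturated flow -d/dx(K dh/dx) = 0, linear Galerkin elements.

    K : hydraulic conductivity per element (scalar or array of length n_el)
    L : domain length; h_left/h_right : fixed-head (Dirichlet) boundaries
    """
    K = np.broadcast_to(np.asarray(K, float), (n_el,))
    dx = L / n_el
    n = n_el + 1
    A = np.zeros((n, n))
    for e in range(n_el):  # assemble 2x2 element stiffness matrices
        A[e:e + 2, e:e + 2] += K[e] / dx * np.array([[1, -1], [-1, 1]])
    b = np.zeros(n)
    A[0, :], A[0, 0], b[0] = 0, 1, h_left      # impose Dirichlet rows
    A[-1, :], A[-1, -1], b[-1] = 0, 1, h_right
    return np.linalg.solve(A, b)

# Homogeneous aquifer: head varies linearly from 10 m to 4 m, as expected
print(solve_head_1d(K=1e-4, L=100.0, h_left=10.0, h_right=4.0))
```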
First NASA Advanced Composites Technology Conference, Part 2
NASA Technical Reports Server (NTRS)
Davis, John G., Jr. (Compiler); Bohon, Herman L. (Compiler)
1991-01-01
Presented here is a compilation of papers presented at the first NASA Advanced Composites Technology (ACT) Conference held in Seattle, Washington, from 29 Oct. to 1 Nov. 1990. The ACT program is a major new multiyear research initiative to achieve a national goal of technology readiness before the end of the decade. Included are papers on materials development and processing, innovative design concepts, analysis development and validation, cost effective manufacturing methodology, and cost tracking and prediction procedures. Papers on major applications programs approved by the Department of Defense are also included.
A survey of computational aerodynamics in the United States
NASA Technical Reports Server (NTRS)
Gessow, A.; Morris, D. J.
1977-01-01
Programs in theoretical and computational aerodynamics in the United States are described. Those aspects of programs that relate to aeronautics are detailed. The role of analysis at various levels of sophistication is discussed as well as the inverse solution techniques that are of primary importance in design methodology. The research is divided into the broad categories of application for boundary layer flow, Navier-Stokes turbulence modeling, internal flows, two-dimensional configurations, subsonic and supersonic aircraft, transonic aircraft, and the space shuttle. A survey of representative work in each area is presented.
ERIC Educational Resources Information Center
Jenkins, Thomas H.; Seufert, Robert
This bibliography lists publications that deal with the application of social science theory, methodology, and research to urban action programs and social policy planning. Entries were selected primarily with social and urban planners in mind; however, they will be of value to anyone interested in the relationship between the social sciences and…
A computer simulator for development of engineering system design methodologies
NASA Technical Reports Server (NTRS)
Padula, S. L.; Sobieszczanski-Sobieski, J.
1987-01-01
A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.
Smartfiles: An OO approach to data file interoperability
NASA Technical Reports Server (NTRS)
Haines, Matthew; Mehrotra, Piyush; Vanrosendale, John
1995-01-01
Data files for scientific and engineering codes typically consist of a series of raw data values whose descriptions are buried in the programs that interact with these files. In this situation, making even minor changes in the file structure or sharing files between programs (interoperability) can only be done after careful examination of the data file and the I/O statements of the programs interacting with this file. In short, scientific data files lack self-description, and other self-describing data techniques are not always appropriate or useful for scientific data files. By applying an object-oriented methodology to data files, we can add the intelligence required to improve data interoperability and provide an elegant mechanism for supporting complex, evolving, or multidisciplinary applications, while still supporting legacy codes. As a result, scientists and engineers should be able to share datasets with far greater ease, simplifying multidisciplinary applications and greatly facilitating remote collaboration between scientists.
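The core idea, attaching self-description to raw values so any program can interpret the file, can be sketched as a tiny format with a metadata header followed by the payload. The class name, header schema, and encoding below are invented for illustration and are not the paper's actual Smartfiles design:

```python
import json
import struct

class SmartFile:
    """Toy self-describing binary file: a JSON header describing the payload,
    followed by raw float64 values."""

    @staticmethod
    def write(path, name, units, values):
        header = json.dumps({"name": name, "units": units,
                             "dtype": "float64", "count": len(values)}).encode()
        with open(path, "wb") as f:
            f.write(struct.pack("<I", len(header)))            # header length
            f.write(header)                                    # metadata
            f.write(struct.pack(f"<{len(values)}d", *values))  # payload

    @staticmethod
    def read(path):
        with open(path, "rb") as f:
            hlen = struct.unpack("<I", f.read(4))[0]
            meta = json.loads(f.read(hlen))
            data = struct.unpack(f"<{meta['count']}d", f.read(8 * meta["count"]))
        return meta, list(data)

SmartFile.write("pressure.dat", "static_pressure", "Pa", [101325.0, 101200.5])
print(SmartFile.read("pressure.dat"))  # a reader needs no a-priori format info
```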
NASA Technical Reports Server (NTRS)
Smith, Jeffrey H.; Levin, Richard R.; Carpenter, Elisabeth J.
1990-01-01
The results are described of an application of multiattribute analysis to the evaluation of high-leverage prototyping technologies in the automation and robotics (A and R) areas that might contribute to the Space Station (SS) Freedom baseline design. An implication is that high-leverage prototyping is beneficial to the SS Freedom Program as a means for transferring technology from the advanced development program to the baseline program. The process also highlights the tradeoffs to be made between subsidizing high-value, low-risk technology development versus high-value, high-risk technology developments. Twenty-one A and R technology tasks spanning a diverse array of technical concepts were evaluated using multiattribute decision analysis. Because of large uncertainties associated with characterizing the technologies, the methodology was modified to incorporate uncertainty. The attributes affecting the rankings included initial cost, operations cost, crew productivity, safety, resource requirements, growth potential, and spinoff potential. The four attributes of initial cost, operations cost, crew productivity, and safety affected the rankings the most.
NASA Technical Reports Server (NTRS)
Funk, Christie J.; Perry, Boyd, III; Silva, Walter A.; Newman, Brett
2014-01-01
A software program and associated methodology to study gust loading on aircraft exists for a class of geometrically simplified flexible configurations. This program consists of a simple aircraft response model with two rigid and three flexible symmetric degrees of freedom and allows for the calculation of various airplane responses due to a discrete one-minus-cosine gust as well as continuous turbulence. Simplifications, assumptions, and opportunities for potential improvements pertaining to the existing software program are first identified; then a revised version of the original software tool is developed with improved methodology to include more complex geometries, additional excitation cases, and additional output data so as to provide a more useful and precise tool for gust load analysis. In order to improve the original software program and enhance its usefulness, a wing control surface and a horizontal tail control surface are added, an extended application of the discrete one-minus-cosine gust input is employed, a supplemental continuous turbulence spectrum is implemented, and a capability to animate the total vehicle deformation response to gust inputs is included. These revisions and enhancements are implemented and an analysis of the results is used to validate the modifications.
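The discrete one-minus-cosine gust used as an excitation here has a standard closed form, U(s) = (U_ds/2)(1 - cos(pi*s/H)) for 0 <= s <= 2H, where U_ds is the design gust velocity and H the gust gradient distance. A minimal numpy sketch; the amplitude and gradient values are illustrative, not the program's:

```python
import numpy as np

def one_minus_cosine_gust(s, U_ds, H):
    """Discrete '1-cosine' gust velocity profile.

    s    : distance penetrated into the gust (same units as H)
    U_ds : design gust velocity (peak amplitude)
    H    : gust gradient distance (the gust is 2*H long)
    """
    s = np.asarray(s, dtype=float)
    w = 0.5 * U_ds * (1.0 - np.cos(np.pi * s / H))
    return np.where((s >= 0) & (s <= 2 * H), w, 0.0)  # zero outside the gust

# Example: 15 m/s gust, 110 m gradient distance; peak occurs at s = H
s = np.linspace(0.0, 250.0, 6)
print(one_minus_cosine_gust(s, U_ds=15.0, H=110.0))
```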
Application of subsize specimens in nuclear plant life extension
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosinski, S.T.; Kumar, A.S.; Cannon, S.C.
1991-01-01
The US Department of Energy is sponsoring a research effort through Sandia National Laboratories and the University of Missouri-Rolla to test a correlation for the upper shelf energy (USE) values obtained from the impact testing of subsize Charpy V-notch specimens to those obtained from the testing of full size samples. The program involves the impact testing of unirradiated and irradiated full, half, and third size Charpy V-notch specimens. To verify the applicability of the correlation on LWR materials, unirradiated and irradiated full, half, and third size Charpy V-notch specimens of a commercial pressure vessel steel (ASTM A533 Grade B) will be tested. This paper will provide details of the program and present results obtained from the application of the developed correlation methodology to the impact testing of the unirradiated full, half, and third size A533 Grade B Charpy V-notch specimens.
NASA Technical Reports Server (NTRS)
Biernacki, John; Juhasz, John; Sadler, Gerald
1991-01-01
A team of Space Station Freedom (SSF) system engineers is in the process of an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis (EMSA) methodology. Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that 49-70% of the 154 hazardous conditions could be caused by software or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
Structural Optimization Methodology for Rotating Disks of Aircraft Engines
NASA Technical Reports Server (NTRS)
Armand, Sasan C.
1995-01-01
In support of the preliminary evaluation of various engine technologies, a methodology has been developed for structurally designing the rotating disks of an aircraft engine. The structural design methodology, along with a previously derived methodology for predicting low-cycle fatigue life, was implemented in a computer program. An interface computer program was also developed that gathers the required data from a flowpath analysis program (WATE) being used at NASA Lewis. The computer program developed for this study requires minimum interaction with the user, thus allowing engineers with varying backgrounds in aeropropulsion to successfully execute it. The stress analysis portion of the methodology and the computer program were verified by employing the finite element analysis method. The 10th-stage, high-pressure-compressor disk of the Energy Efficient Engine Program (E3) engine was used to verify the stress analysis; the differences between the stresses and displacements obtained from the computer program developed for this study and from the finite element analysis were all below 3 percent for the problem solved. The computer program developed for this study was employed to structurally optimize the rotating disks of the E3 high-pressure compressor. The rotating disks designed by the computer program in this study were approximately 26 percent lighter than calculated from the E3 drawings. The methodology is presented herein.
Solar energy program evaluation: an introduction
DOE Office of Scientific and Technical Information (OSTI.GOV)
deLeon, P.
The Program Evaluation Methodology provides an overview of the practice and methodology of program evaluation and defines more precisely the evaluation techniques and methodologies that would be most appropriate to government organizations which are actively involved in the research, development, and commercialization of solar energy systems. Formal evaluation cannot be treated as a single methodological approach for assessing a program. There are four basic types of evaluation designs: the pre-experimental design; the quasi-experimental design based on time series; the quasi-experimental design based on comparison groups; and the true experimental design. This report is organized to first introduce the role and issues of evaluation, providing a set of issues to organize the subsequent sections detailing the national solar energy programs. These two themes are then integrated by examining the evaluation strategies and methodologies tailored to fit the particular needs of the various individual solar energy programs. (MCW)
Metal- matrix composite processing technologies for aircraft engine applications
NASA Astrophysics Data System (ADS)
Pank, D. R.; Jackson, J. J.
1993-06-01
Titanium metal-matrix composites (MMC) are prime candidate materials for aerospace applications because of their excellent high-temperature longitudinal strength and stiffness and low density compared with nickel- and steel-base materials. This article examines the steps GE Aircraft Engines (GEAE) has taken to develop an induction plasma deposition (IPD) processing method for the fabrication of Ti6242/SiC MMC material. Information regarding process methodology, microstructures, and mechanical properties of consolidated MMC structures will be presented. The work presented was funded under the GE Aircraft Engines IR&D program.
An automated methodology development. [software design for combat simulation
NASA Technical Reports Server (NTRS)
Hawley, L. R.
1985-01-01
The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.
Integrated Data Base Program: a status report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Notz, K.J.; Klein, J.A.
1984-06-01
The Integrated Data Base (IDB) Program provides official Department of Energy (DOE) data on spent fuel and radioactive waste inventories, projections, and characteristics. The accomplishments of FY 1983 are summarized for three broad areas: (1) upgrading and issuing of the annual report on spent fuel and radioactive waste inventories, projections, and characteristics, including ORIGEN2 applications and a quality assurance plan; (2) creation of a summary data file in user-friendly format for use on a personal computer and enhancing user access to program data; and (3) optimizing and documenting the data handling methodology used by the IDB Program and providing direct support to other DOE programs and sites in data handling. Plans for future work in these three areas are outlined. 23 references, 11 figures.
Development and exploration of a new methodology for the fitting and analysis of XAS data.
Delgado-Jaime, Mario Ulises; Kennepohl, Pierre
2010-01-01
A new data analysis methodology for X-ray absorption near-edge spectroscopy (XANES) is introduced and tested using several examples. The methodology has been implemented within the context of a new Matlab-based program discussed in a companion related article [Delgado-Jaime et al. (2010), J. Synchrotron Rad. 17, 132-137]. The approach makes use of a Monte Carlo search method to seek appropriate starting points for a fit model, allowing for the generation of a large number of independent fits with minimal user-induced bias. The applicability of this methodology is tested using various data sets on the Cl K-edge XAS data for tetragonal CuCl(4)(2-), a common reference compound used for calibration and covalency estimation in M-Cl bonds. A new background model function that effectively blends together background profiles with spectral features is an important component of the discussed methodology. The development of a robust evaluation function to fit multiple-edge data is discussed and the implications regarding standard approaches to data analysis are discussed and explored within these examples.
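The Monte Carlo search for starting points can be sketched as repeated random initialization of a conventional least-squares fit, keeping the best converged result. This is a minimal sketch with a toy arctangent edge model and fabricated data; the actual program's background model and evaluation function are more elaborate:

```python
import numpy as np
from scipy.optimize import curve_fit

def edge_model(x, e0, width, height, slope):
    """Toy XANES edge: arctangent step plus a linear background."""
    return height * (0.5 + np.arctan((x - e0) / width) / np.pi) + slope * x

rng = np.random.default_rng(7)
x = np.linspace(2815, 2840, 200)                   # eV, Cl K-edge region
y = edge_model(x, 2822.0, 1.2, 3.0, 0.01) + 0.05 * rng.standard_normal(x.size)

fits = []
for _ in range(200):                               # Monte Carlo starting points
    p0 = [rng.uniform(2816, 2838), rng.uniform(0.2, 5.0),
          rng.uniform(0.5, 6.0), rng.uniform(-0.05, 0.05)]
    try:
        popt, _ = curve_fit(edge_model, x, y, p0=p0, maxfev=2000)
        rss = float(np.sum((edge_model(x, *popt) - y) ** 2))
        fits.append((rss, popt))
    except RuntimeError:                           # this start failed to converge
        continue

best_rss, best_p = min(fits, key=lambda f: f[0])   # many independent fits,
print(best_rss, best_p)                            # minimal user-induced bias
```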
A hybrid nonlinear programming method for design optimization
NASA Technical Reports Server (NTRS)
Rajan, S. D.
1986-01-01
Solutions to engineering design problems formulated as nonlinear programming (NLP) problems usually require the use of more than one optimization technique. Moreover, the interaction between the user (analysis/synthesis) program and the NLP system can lead to interface, scaling, or convergence problems. An NLP solution system is presented that seeks to solve these problems by providing a programming system to ease the user-system interface. A simple set of rules is used to select an optimization technique or to switch from one technique to another in an attempt to detect, diagnose, and solve some potential problems. Numerical examples involving finite element based optimal design of space trusses and rotor bearing systems are used to illustrate the applicability of the proposed methodology.
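A minimal sketch of the technique-switching idea, using SciPy optimizers as stand-ins for the NLP techniques in the paper's system. The switching rule here (warm-start each successive method from the best point so far, stop once converged) is an assumption for illustration, not the paper's actual rule set:

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    """Classic ill-conditioned test function; minimum at (1, 1)."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1 - x[0]) ** 2

def hybrid_minimize(fun, x0, methods=("Nelder-Mead", "BFGS", "Powell"),
                    tol=1e-8):
    """Try a sequence of optimization techniques, switching when one stalls."""
    x = np.asarray(x0, dtype=float)
    best = None
    for method in methods:
        res = minimize(fun, x, method=method, tol=tol)
        if best is None or res.fun < best.fun:
            best = res
            x = res.x                 # warm-start the next technique
        if best.success and best.fun < tol:
            break                     # converged; no need to switch again
    return best

print(hybrid_minimize(rosenbrock, x0=[-1.2, 1.0]).x)  # close to (1, 1)
```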
Succession planning for technical experts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirk, Bernadette Lugue; Cain, Ronald A.; Dewji, Shaheen A.
This report describes a methodology for identifying, evaluating, and mitigating the loss of key technical skills at nuclear operations facilities. The methodology can be adapted for application within regulatory authorities and research and development organizations, and can be directly applied by international engagement partners of the Department of Energy's National Nuclear Security Administration (NNSA). The resultant product will be of direct benefit to two types of NNSA missions: (1) domestic human capital development programs tasked to provide focused technical expertise to succeed an aging nuclear operations workforce, and (2) international safeguards programs charged with maintaining operational safeguards for developing and existing nuclear power programs in nations where minimal available resources must be used effectively. This report considers succession planning and the critical skills necessary to meet an institution's goals and mission. Closely tied to succession planning are knowledge management and mentorship. In considering succession planning, critical skill sets are identified and are greatly dependent on the subject matter expert in question. This report also provides examples of critical skills that are job specific.
Embedding Agile Practices within a Plan-Driven Hierarchical Project Life Cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Millard, W. David; Johnson, Daniel M.; Henderson, John M.
2014-07-28
Organizations use structured, plan-driven approaches to provide continuity, direction, and control to large, multi-year programs. Projects within these programs vary greatly in size, complexity, level of maturity, technical risk, and clarity of the development objectives. Organizations that perform exploratory research, evolutionary development, and other R&D activities can obtain the benefits of Agile practices without losing the benefits of their program's overarching plan-driven structure. This paper describes application of Agile development methods on a large plan-driven sensor integration program. While the client employed plan-driven, requirements flow-down methodologies, tight project schedules and complex interfaces called for frequent end-to-end demonstrations to provide feedback during system development. The development process maintained the many benefits of plan-driven project execution with the rapid prototyping, integration, demonstration, and client feedback possible through Agile development methods. This paper also describes some of the tools and implementing mechanisms used to transition between and take advantage of each methodology, and presents lessons learned from the project management, system engineering, and developer's perspectives.
Ensemble: an Architecture for Mission-Operations Software
NASA Technical Reports Server (NTRS)
Norris, Jeffrey; Powell, Mark; Fox, Jason; Rabe, Kenneth; Shu, IHsiang; McCurdy, Michael; Vera, Alonso
2008-01-01
Ensemble is the name of an open architecture for, and a methodology for the development of, spacecraft mission operations software. Ensemble is also potentially applicable to the development of non-spacecraft mission-operations-type software. Ensemble capitalizes on the strengths of the open-source Eclipse software and its architecture to address several issues that have arisen repeatedly in the development of mission-operations software: Heretofore, mission-operations application programs have been developed in disparate programming environments and integrated during the final stages of development of missions. The programs have been poorly integrated, and it has been costly to develop, test, and deploy them. Users of each program have been forced to interact with several different graphical user interfaces (GUIs). Also, the strategy typically used in integrating the programs has yielded serial chains of operational software tools of such a nature that during use of a given tool, it has not been possible to gain access to the capabilities afforded by other tools. In contrast, the Ensemble approach offers a low-risk path towards tighter integration of mission-operations software tools.
ERIC Educational Resources Information Center
Macy, Barry A.; Mirvis, Philip H.
1982-01-01
A standardized methodology for identifying, defining, and measuring work behavior and performance rather than production, and a methodology that estimates the costs and benefits of work innovation are presented for assessing organizational effectiveness and program costs versus benefits in organizational change programs. Factors in a cost-benefit…
Evaluation Methodologies for Estimating the Likelihood of Program Implementation Failure
ERIC Educational Resources Information Center
Durand, Roger; Decker, Phillip J.; Kirkman, Dorothy M.
2014-01-01
Despite our best efforts as evaluators, program implementation failures abound. A wide variety of valuable methodologies have been adopted to explain and evaluate the "why" of these failures. Yet, typically these methodologies have been employed concurrently (e.g., project monitoring) or to the post-hoc assessment of program activities.…
Object-oriented microcomputer software for earthquake seismology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kroeger, G.C.
1993-02-01
A suite of graphically interactive applications for the retrieval, editing, and modeling of earthquake seismograms has been developed using object-oriented programming methodology and the C++ language. Retriever is an application which allows the user to search for, browse, and extract seismic data from CD-ROMs produced by the National Earthquake Information Center (NEIC). The user can restrict the date, size, location, and depth of desired earthquakes and extract selected data into a variety of common seismic file formats. Reformer is an application that allows the user to edit seismic data and data headers, and perform a variety of signal processing operations on that data. Synthesizer is a program for the generation and analysis of teleseismic P and SH synthetic seismograms. The program provides graphical manipulation of source parameters, crustal structures, and seismograms, as well as near real-time response in generating synthetics for arbitrary flat-layered crustal structures. All three applications use class libraries developed for implementing geologic and seismic objects and views. Standard seismogram view objects and objects that encapsulate the reading and writing of different seismic data file formats are shared by all three applications. The focal mechanism views in Synthesizer are based on a generic stereonet view object. Interaction with the native graphical user interface is encapsulated in a class library in order to simplify the porting of the software to different operating systems and application programming interfaces. The software was developed on the Apple Macintosh and is being ported to UNIX/X-Window platforms.
ERIC Educational Resources Information Center
Coryn, Chris L. S.; Schroter, Daniela C.; Hanssen, Carl E.
2009-01-01
Brinkerhoff's Success Case Method (SCM) was developed with the specific purpose of assessing the impact of organizational interventions (e.g., training and coaching) on business goals by analyzing extreme groups using case study techniques and storytelling. As an efficient and cost-effective method of evaluative inquiry, SCM is attractive in other…
Evolving rule-based systems in two medical domains using genetic programming.
Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan; Axer, Hubertus; Bjerregaard, Beth; von Keyserlingk, Diedrich Graf
2004-11-01
To demonstrate and compare the application of different genetic programming (GP) based intelligent methodologies for the construction of rule-based systems in two medical domains: the diagnosis of aphasia subtypes and the classification of pap-smear examinations. Past data represented (a) successful diagnoses of aphasia subtypes from collaborating medical experts through a free interview per patient, and (b) correctly classified smears (images of cells) by cyto-technologists, previously stained using the Papanicolaou method. Initially a hybrid approach is proposed, which combines standard genetic programming and heuristic hierarchical crisp rule-base construction. Then, genetic programming for the production of crisp rule-based systems is attempted. Finally, another hybrid intelligent model is composed of a grammar-driven genetic programming system for the generation of fuzzy rule-based systems. Results denote the effectiveness of the proposed systems, which are also compared, for their efficiency, accuracy, and comprehensibility, to an inductive machine learning approach and to a standard genetic programming symbolic expression approach. The proposed GP-based intelligent methodologies are able to produce accurate and comprehensible results for medical experts, performing competitively with other intelligent approaches. The aim of the authors was the production of accurate but also sensible decision rules that could potentially help medical doctors to extract conclusions, even at the expense of a higher classification score achievement.
Charpentier, R.R.; Klett, T.R.
2005-01-01
During the last 30 years, the methodology for assessment of undiscovered conventional oil and gas resources used by the U.S. Geological Survey has undergone considerable change. This evolution has been based on five major principles. First, the U.S. Geological Survey has responsibility for a wide range of U.S. and world assessments and requires a robust methodology suitable for immaturely explored as well as maturely explored areas. Second, the assessments should be based on as comprehensive a set of geological and exploration history data as possible. Third, the perils of methods that solely use statistical methods without geological analysis are recognized. Fourth, the methodology and course of the assessment should be documented as transparently as possible, within the limits imposed by the inevitable use of subjective judgement. Fifth, the multiple uses of the assessments require a continuing effort to provide the documentation in such ways as to increase utility to the many types of users. Undiscovered conventional oil and gas resources are those recoverable volumes in undiscovered, discrete, conventional structural or stratigraphic traps. The USGS 2000 methodology for these resources is based on a framework of assessing numbers and sizes of undiscovered oil and gas accumulations and the associated risks. The input is standardized on a form termed the Seventh Approximation Data Form for Conventional Assessment Units. Volumes of resource are then calculated using a Monte Carlo program named Emc2, but an alternative analytic (non-Monte Carlo) program named ASSESS also can be used. The resource assessment methodology continues to change. Accumulation-size distributions are being examined to determine how sensitive the results are to size-distribution assumptions. The resource assessment output is changing to provide better applicability for economic analysis. The separate methodology for assessing continuous (unconventional) resources also has been evolving. Further studies of the relationship between geologic models of conventional and continuous resources will likely impact the respective resource assessment methodologies. © 2005 International Association for Mathematical Geology.
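In outline, a Monte Carlo assessment of this kind samples a number of undiscovered accumulations and a size for each, then accumulates the totals over many trials and reports fractiles. A schematic sketch with invented triangular and lognormal inputs, not any USGS assessment-unit values (note that the conventional F95 fractile, a 95% chance of at least this much, is the 5th percentile of the totals):

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 100_000

# Hypothetical inputs: number of undiscovered accumulations ~ triangular,
# individual accumulation sizes ~ lognormal (million barrels, MMBO)
n_accum = rng.triangular(0, 12, 40, size=n_trials).astype(int)
totals = np.array([rng.lognormal(mean=2.0, sigma=1.0, size=n).sum()
                   for n in n_accum])

f95, f50, f5 = np.percentile(totals, [5, 50, 95])
print(f"F95={f95:.0f}  F50={f50:.0f}  F5={f5:.0f}  "
      f"mean={totals.mean():.0f} MMBO")
```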
Can Programming Frameworks Bring Smartphones into the Mainstream of Psychological Science?
Piwek, Lukasz; Ellis, David A
2016-01-01
Smartphones continue to provide huge potential for psychological science and the advent of novel research frameworks brings new opportunities for researchers who have previously struggled to develop smartphone applications. However, despite this renewed promise, smartphones have failed to become a standard item within psychological research. Here we consider the key issues that continue to limit smartphone adoption within psychological science and how these barriers might be diminishing in light of ResearchKit and other recent methodological developments. We conclude that while these programming frameworks are certainly a step in the right direction it remains challenging to create usable research-orientated applications with current frameworks. Smartphones may only become an asset for psychology and social science as a whole when development software that is both easy to use and secure becomes freely available.
Effective peer education in HIV: defining factors that maximise success.
Lambert, Steven M; Debattista, Joseph; Bodiroza, Aleksandar; Martin, Jack; Staunton, Shaun; Walker, Rebecca
2013-08-01
Background Peer education is considered an effective health promotion and education strategy, particularly to populations traditionally resistant to conventional forms of health information dissemination. This has made it very applicable to HIV education and prevention, where those who are affected or at risk are often amongst the most vulnerable in society. However, there still remains uncertainty as to the reasons for its effectiveness, what constitutes an effective methodology and why a consistent methodology can often result in widely variable outcomes. Between 2008 and 2010, three separate reviews of peer education were undertaken across more than 30 countries in three distinct geographical regions across the globe. The reviews sought to identify determinants of the strengths and weaknesses inherent in approaches to peer education, particularly targeting young people and the most at-risk populations. By assessing the implementation of peer education programs across a variety of social environments, it was possible to develop a contextual understanding for peer education's effectiveness and provide a picture of the social, cultural, political, legal and geographic enablers and disablers to effective peer education. Several factors were significant contributors to program success, not as strategies of methodology, but as elements of the social, cultural, political and organisational context in which peer education was situated. Contextual elements create environments supportive of peer education. Consequently, adherence to a methodology or strategy without proper regard to its situational context rarely contributes to effective peer education.
Designing nursing excellence through a National Quality Forum nurse scholar program.
Neumann, Julie A; Brady-Schluttner, Katherine A; Attlesey-Pries, Jacqueline M; Twedell, Diane M
2010-01-01
Closing the knowledge gap for current practicing nurses in the Institute of Medicine (IOM) core competencies is critical to providing safe patient care. The National Quality Forum (NQF) nurse scholar program is one organization's journey to close the gap in the IOM core competencies in a large teaching organization. The NQF nurse scholar program is positioned to provide a plan to assist current nurses to accelerate their learning about quality improvement, evidence-based practice, and informatics, 3 of the core competencies identified by the IOM, and focus on application of skills to NQF nurse-sensitive measures. Curriculum outline, educational methodologies, administrative processes, and aims of the project are discussed.
Thermal and orbital analysis of Earth monitoring Sun-synchronous space experiments
NASA Technical Reports Server (NTRS)
Killough, Brian D.
1990-01-01
The fundamentals of an Earth monitoring Sun-synchronous orbit are presented. A Sun-synchronous Orbit Analysis Program (SOAP) was developed to calculate orbital parameters for an entire year. The output from this program provides the required input data for the TRASYS thermal radiation computer code, which in turn computes the infrared, solar, and Earth albedo heat fluxes incident on a space experiment. Direct incident heat fluxes can be used as input to a generalized thermal analyzer program to size radiators and predict instrument operating temperatures. The SOAP computer code and its application to the thermal analysis methodology presented should prove useful to the thermal engineer during the design phases of Earth monitoring Sun-synchronous space experiments.
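The governing relation behind such an orbit is the J2 nodal-regression condition: the orbit plane must precess 360 degrees per year, which fixes inclination as a function of altitude. The short C++ sketch below computes that inclination for a circular orbit; it illustrates the underlying orbital mechanics only and is not the SOAP code, and the 700 km altitude is an arbitrary example.

    // Sketch: inclination of a circular Sun-synchronous orbit from the
    // J2 secular nodal-regression condition. Constants are standard values.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double PI = std::acos(-1.0);
        const double mu = 3.986004418e14;   // Earth GM, m^3/s^2
        const double Re = 6378137.0;        // Earth equatorial radius, m
        const double J2 = 1.08263e-3;       // Earth oblateness coefficient
        // Required nodal regression: 360 deg per year, eastward
        const double raanDot = 2.0 * PI / (365.2422 * 86400.0); // rad/s

        double altitude_km = 700.0;                  // example altitude
        double a = Re + altitude_km * 1000.0;        // semi-major axis (circular)
        double n = std::sqrt(mu / (a * a * a));      // mean motion, rad/s

        // J2 secular rate: dRAAN/dt = -1.5 * J2 * n * (Re/a)^2 * cos(i)  (e = 0)
        double cosi = -raanDot / (1.5 * J2 * n * (Re / a) * (Re / a));
        printf("Sun-synchronous inclination at %.0f km: %.2f deg\n",
               altitude_km, std::acos(cosi) * 180.0 / PI);
        return 0;
    }

At 700 km this yields roughly 98.2 degrees, the familiar slightly retrograde inclination of Earth-observation orbits.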
NASA Astrophysics Data System (ADS)
Weiss, Brian A.; Fronczek, Lisa; Morse, Emile; Kootbally, Zeid; Schlenoff, Craig
2013-05-01
Transformative Apps (TransApps) is a Defense Advanced Research Projects Agency (DARPA) funded program whose goal is to develop a range of militarily-relevant software applications ("apps") to enhance the operational effectiveness of military personnel on (and off) the battlefield. TransApps is also developing a military apps marketplace to facilitate rapid development and dissemination of applications that address user needs by connecting engaged communities of end-users with development groups. The National Institute of Standards and Technology's (NIST) role in the TransApps program is to design and implement evaluation procedures to assess the performance of: 1) the various software applications, 2) software-hardware interactions, and 3) the supporting online application marketplace. Specifically, NIST is responsible for evaluating 50+ tactically-relevant applications operating on numerous Android™-powered platforms. NIST efforts include functional regression testing and quantitative performance testing. This paper discusses the evaluation methodologies employed to assess the performance of three key program elements: 1) handheld-based applications and their integration with various hardware platforms, 2) client-based applications, and 3) network technologies operating on both the handheld and client systems, along with their integration into the application marketplace. Handheld-based applications are assessed using a combination of utility- and usability-based checklists and quantitative performance tests. Client-based applications are assessed both to replicate current overseas disconnected operations (i.e., no network connectivity between handhelds) and to assess the connected operations envisioned for later use. Finally, networked applications are assessed on handhelds to establish performance baselines for future operations in which connectivity is routinely available.
Low-thrust trajectory analysis for the geosynchronous mission
NASA Technical Reports Server (NTRS)
Jasper, T. P.
1973-01-01
Methodology employed in development of a computer program designed to analyze optimal low-thrust trajectories is described, and application of the program to a Solar Electric Propulsion Stage (SEPS) geosynchronous mission is discussed. To avoid the zero inclination and eccentricity singularities which plague many small-force perturbation techniques, a special set of state variables (equinoctial) is used. Adjoint equations are derived for the minimum time problem and are also free from the singularities. Solutions to the state and adjoint equations are obtained by both orbit averaging and precision numerical integration; an evaluation of these approaches is made.
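For readers unfamiliar with the element set mentioned above, the C++ sketch below shows the usual mapping from classical to equinoctial elements; the variable names follow common usage in the astrodynamics literature and are not necessarily the paper's notation. Every element stays finite and well-conditioned as eccentricity and inclination approach zero, which is exactly the singularity the abstract refers to.

    // Sketch of the classical -> equinoctial element mapping (common convention).
    #include <cmath>
    #include <cstdio>

    struct Equinoctial {
        double a;      // semi-major axis
        double h, k;   // e*sin(w+RAAN), e*cos(w+RAAN)  -- nonsingular as e -> 0
        double p, q;   // tan(i/2)*sin(RAAN), tan(i/2)*cos(RAAN) -- nonsingular as i -> 0
        double lambda; // mean longitude M + w + RAAN
    };

    Equinoctial fromClassical(double a, double e, double i,
                              double raan, double argp, double M) {
        Equinoctial eq;
        eq.a      = a;
        eq.h      = e * std::sin(argp + raan);
        eq.k      = e * std::cos(argp + raan);
        eq.p      = std::tan(i / 2.0) * std::sin(raan);
        eq.q      = std::tan(i / 2.0) * std::cos(raan);
        eq.lambda = M + argp + raan;
        return eq;
    }

    int main() {
        // Near-circular, near-equatorial geosynchronous end state (radians) --
        // hypothetical values where classical elements are nearly singular:
        Equinoctial eq = fromClassical(42164e3, 1e-4, 1e-5, 0.3, 0.2, 1.0);
        printf("h=%.3e k=%.3e p=%.3e q=%.3e\n", eq.h, eq.k, eq.p, eq.q);
        return 0;
    }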
Industrial machinery noise impact modeling, volume 1
NASA Astrophysics Data System (ADS)
Hansen, C. H.; Kugler, B. A.
1981-07-01
The development of a machinery noise computer model which may be used to assess the effect of occupational noise on the health and welfare of industrial workers is discussed. The purpose of the model is to provide EPA with the methodology to evaluate the personnel noise problem, to identify the equipment types responsible for the exposure and to assess the potential benefits of a given noise control action. Due to its flexibility in design and application, the model and supportive computer program can be used by other federal agencies, state governments, labor and industry as an aid in the development of noise abatement programs.
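As a concrete illustration of the kind of exposure index such a model must evaluate, the sketch below computes a daily noise dose and time-weighted average using the OSHA-style 5 dB exchange rate; the shift data are hypothetical, and the EPA model described in this report is not reproduced here.

    // Illustrative occupational noise dose calculation (OSHA-style exchange rate).
    #include <cmath>
    #include <cstdio>

    int main() {
        // (level dBA, hours at that level) for one shift -- hypothetical data
        const double exposure[][2] = { {95.0, 2.0}, {90.0, 4.0}, {85.0, 2.0} };
        double dose = 0.0; // percent of allowable daily dose
        for (const auto& e : exposure) {
            double T = 8.0 / std::pow(2.0, (e[0] - 90.0) / 5.0); // permissible hours
            dose += 100.0 * e[1] / T;
        }
        double twa = 90.0 + 16.61 * std::log10(dose / 100.0); // 8-h time-weighted avg
        printf("Daily dose: %.0f%%  TWA: %.1f dBA\n", dose, twa);
        return 0;
    }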
2014-08-06
This final rule will update the prospective payment rates for Medicare inpatient hospital services provided by inpatient psychiatric facilities (IPFs). These changes will be applicable to IPF discharges occurring during the fiscal year (FY) beginning October 1, 2014 through September 30, 2015. This final rule will also address implementation of ICD-10-CM and ICD-10-PCS codes; finalize a new methodology for updating the cost of living adjustment (COLA); and finalize new quality measures and reporting requirements under the IPF quality reporting program.
Programming Probabilistic Structural Analysis for Parallel Processing Computer
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Chamis, Christos C.; Murthy, Pappu L. N.
1991-01-01
The ultimate goal of this research program is to make Probabilistic Structural Analysis (PSA) computationally efficient and hence practical for the design environment by achieving large scale parallelism. The paper identifies the multiple levels of parallelism in PSA, identifies methodologies for exploiting this parallelism, describes the development of a parallel stochastic finite element code, and presents results of two example applications. It is demonstrated that speeds within five percent of those theoretically possible can be achieved. A special-purpose numerical technique, the stochastic preconditioned conjugate gradient method, is also presented and demonstrated to be extremely efficient for certain classes of PSA problems.
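The preconditioned conjugate gradient method named above has a compact deterministic core. The sketch below shows a Jacobi-preconditioned CG on a small symmetric positive-definite system; it is illustrative only and is not the paper's parallel stochastic implementation.

    // Minimal Jacobi-preconditioned conjugate gradient solver (dense, serial).
    #include <cmath>
    #include <cstdio>
    #include <vector>

    using Vec = std::vector<double>;
    using Mat = std::vector<Vec>;

    Vec matvec(const Mat& A, const Vec& x) {
        Vec y(x.size(), 0.0);
        for (size_t i = 0; i < A.size(); ++i)
            for (size_t j = 0; j < x.size(); ++j) y[i] += A[i][j] * x[j];
        return y;
    }

    double dot(const Vec& a, const Vec& b) {
        double s = 0.0;
        for (size_t i = 0; i < a.size(); ++i) s += a[i] * b[i];
        return s;
    }

    Vec pcg(const Mat& A, const Vec& b, double tol = 1e-10) {
        size_t n = b.size();
        Vec x(n, 0.0), r = b, z(n), p(n);
        for (size_t i = 0; i < n; ++i) z[i] = r[i] / A[i][i]; // Jacobi: M = diag(A)
        p = z;
        double rz = dot(r, z);
        for (size_t it = 0; it < 10 * n; ++it) {
            Vec Ap = matvec(A, p);
            double alpha = rz / dot(p, Ap);
            for (size_t i = 0; i < n; ++i) { x[i] += alpha * p[i]; r[i] -= alpha * Ap[i]; }
            if (std::sqrt(dot(r, r)) < tol) break;
            for (size_t i = 0; i < n; ++i) z[i] = r[i] / A[i][i];
            double rzNew = dot(r, z);
            for (size_t i = 0; i < n; ++i) p[i] = z[i] + (rzNew / rz) * p[i];
            rz = rzNew;
        }
        return x;
    }

    int main() {
        Mat A = {{4, 1, 0}, {1, 3, 1}, {0, 1, 2}};   // small SPD test matrix
        Vec b = {1, 2, 3};
        Vec x = pcg(A, b);
        printf("x = %.4f %.4f %.4f\n", x[0], x[1], x[2]);
        return 0;
    }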
Display/control requirements for automated VTOL aircraft
NASA Technical Reports Server (NTRS)
Hoffman, W. C.; Kleinman, D. L.; Young, L. R.
1976-01-01
A systematic design methodology for pilot displays in advanced commercial VTOL aircraft was developed and refined. The analyst is provided with a step-by-step procedure for conducting conceptual display/control configuration evaluations for simultaneous monitoring and control pilot tasks. The approach consists of three phases: formulation of information requirements, configuration evaluation, and system selection. Both the monitoring and control performance models are based upon the optimal control model of the human operator. Extensions to the conventional optimal control model required in the display design methodology include explicit optimization of control/monitoring attention; simultaneous monitoring and control performance predictions; and indifference threshold effects. The methodology was applied to NASA's experimental CH-47 helicopter in support of the VALT program. The CH-47 application examined system performance at six flight conditions. Four candidate configurations are suggested for evaluation in pilot-in-the-loop simulations and eventual flight tests.
[Life style: instrument in health promotion programs].
Jiménez, D
1993-05-01
Noncommunicable diseases are increasing in third world countries, including Chile. Life style is one of the principal factors influencing this increase; therefore, programs and health strategies to modify population life styles are needed. The programs developed to change life styles depend on the medical and sociocultural setting, and the concept becomes prominent when disease prevention is replaced by health promotion. The requirements for applying the concept of life style in health promotion plans and for fostering healthy life styles are: 1) Training in behavioral epidemiology. 2) Election of a biopsychosocial concept of life style. 3) Identification of the predominant setting and target population. 4) Choice of the appropriate educational methodologies to change behaviors. 5) Formalization of strategies according to the boundaries where the program is applied. 6) Specification of the qualifying requisites of the change agents, health promoters and program operators.
Software Requirements Engineering Methodology (Development)
1979-06-01
Higher Order Software [20]; and the Michael Jackson Design Methodology [21]. Although structured programming constructs have proven to be more useful...reviewed here. Similarly, the manual techniques for software design (e.g., HIPO Diagrams, Nassi-Schneidermann charts, Top-Down Design, the Michael Jackson Design Methodology, Yourdon's Structured Design) are not addressed. 6.1.3 Research Programs There are a number of research programs underway
NASA/CARES dual-use ceramic technology spinoff applications
NASA Technical Reports Server (NTRS)
Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.; Nemeth, Noel N.
1994-01-01
NASA has developed software that enables American industry to establish the reliability and life of ceramic structures in a wide variety of 21st Century applications. Designing ceramic components to survive at higher temperatures than the capability of most metals and in severe loading environments involves the disciplines of statistics and fracture mechanics. Successful application of advanced ceramics requires knowledge of material properties and the use of a probabilistic brittle material design methodology. The NASA program, known as CARES (Ceramics Analysis and Reliability Evaluation of Structures), is a comprehensive general purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. The latest version of this software, CARES/Life, is coupled to several commercially available finite element analysis programs (ANSYS, MSC/NASTRAN, ABAQUS, COSMOS/M, MARC), resulting in an advanced integrated design tool which is adapted to the computing environment of the user. The NASA-developed CARES software has been successfully used by industrial, government, and academic organizations to design and optimize ceramic components for many demanding applications. Industrial sectors impacted by this program include aerospace, automotive, electronic, medical, and energy applications. Dual-use applications include engine components, graphite and ceramic high temperature valves, TV picture tubes, ceramic bearings, electronic chips, glass building panels, infrared windows, radiant heater tubes, heat exchangers, and artificial hips, knee caps, and teeth.
A Screening Method for Assessing Cumulative Impacts
Alexeeff, George V.; Faust, John B.; August, Laura Meehan; Milanes, Carmen; Randles, Karen; Zeise, Lauren; Denton, Joan
2012-01-01
The California Environmental Protection Agency (Cal/EPA) Environmental Justice Action Plan calls for guidelines for evaluating “cumulative impacts.” As a first step toward such guidelines, a screening methodology for assessing cumulative impacts in communities was developed. The method, presented here, is based on the working definition of cumulative impacts adopted by Cal/EPA [1]: “Cumulative impacts means exposures, public health or environmental effects from the combined emissions and discharges in a geographic area, including environmental pollution from all sources, whether single or multi-media, routinely, accidentally, or otherwise released. Impacts will take into account sensitive populations and socio-economic factors, where applicable and to the extent data are available.” The screening methodology is built on this definition as well as current scientific understanding of environmental pollution and its adverse impacts on health, including the influence of both intrinsic, biological factors and non-intrinsic socioeconomic factors in mediating the effects of pollutant exposures. It addresses disparities in the distribution of pollution and health outcomes. The methodology provides a science-based tool to screen places for relative cumulative impacts, incorporating both the pollution burden on a community (including exposures to pollutants and their public health and environmental effects) and community characteristics, specifically sensitivity and socioeconomic factors. The screening methodology provides relative rankings to distinguish more highly impacted communities from less impacted ones. It may also help identify which factors are the greatest contributors to a community’s cumulative impact. It is not designed to provide quantitative estimates of community-level health impacts. A pilot screening analysis is presented here to illustrate the application of this methodology. Once guidelines are adopted, the methodology can serve as a screening tool to help Cal/EPA programs prioritize their activities and target those communities with the greatest cumulative impacts. PMID:22470315
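The ranking idea described above can be sketched in a few lines: an overall impact score formed from a pollution-burden component and a population-characteristics component. The indicator names, weights, and multiplicative combination below are hypothetical illustrations of screening-style tools, not Cal/EPA's adopted values.

    // Loose sketch of a relative cumulative-impact ranking (hypothetical data).
    #include <algorithm>
    #include <cstdio>
    #include <string>
    #include <vector>

    struct Community {
        std::string name;
        double exposure, envEffects;   // pollution-burden indicators (0-10)
        double sensitivity, socioecon; // population-characteristics indicators (0-10)
        double score() const {
            double burden  = (exposure + envEffects) / 2.0;
            double popChar = (sensitivity + socioecon) / 2.0;
            return burden * popChar;   // multiplicative combination (assumed)
        }
    };

    int main() {
        std::vector<Community> c = {
            {"A", 8.1, 6.0, 7.5, 8.8},
            {"B", 3.2, 2.5, 4.0, 3.1},
            {"C", 6.6, 7.9, 5.2, 6.4}};
        std::sort(c.begin(), c.end(),
                  [](const Community& x, const Community& y) {
                      return x.score() > y.score(); });
        for (const auto& k : c)
            printf("%s: relative impact score %.1f\n", k.name.c_str(), k.score());
        return 0;
    }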
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1987-01-01
The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.
Maru, Shoko; Byrnes, Joshua; Carrington, Melinda J; Stewart, Simon; Scuffham, Paul A
2015-01-01
Substantial variation in economic analyses of cardiovascular disease management programs hinders not only the proper assessment of cost-effectiveness but also the identification of heterogeneity of interest such as patient characteristics. The authors discuss the impact of reporting and methodological variation on the cost-effectiveness of cardiovascular disease management programs by introducing issues that could lead to different policy or clinical decisions, followed by the challenges associated with net intervention effects and generalizability. The authors conclude with practical suggestions to mitigate the identified issues. Improved transparency through standardized reporting practice is the first step to advance beyond one-off experiments (limited applicability outside the study itself). Transparent reporting is a prerequisite for rigorous cost-effectiveness analyses that provide unambiguous implications for practice: what type of program works for whom and how.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-04
... local HUD program staff. Questions on how to conduct FMR surveys or further methodological explanations... version of the RDD survey methodology for smaller, nonmetropolitan PHAs. This methodology is designed to...
Application of Oral Fluid Assays in Support of Mumps, Rubella and Varicella Control Programs.
Maple, Peter A C
2015-12-09
Detection of specific viral antibody or nucleic acid produced by infection or immunization, using oral fluid samples, offers increased potential for wider population uptake compared to blood sampling. This methodology is well established for the control of HIV and measles infections, but can also be applied to the control of other vaccine preventable infections, and this review describes the application of oral fluid assays in support of mumps, rubella and varicella national immunization programs. In England and Wales, individuals with suspected mumps or rubella, based on clinical presentation, can have an oral fluid swab sample taken for case confirmation. Universal varicella immunization of children has led to a drastic reduction of chickenpox in those countries where it is used; however, in England and Wales such a policy has not been instituted. Consequently, in England and Wales most children have had chickenpox by age 10 years; however, small but significant numbers of adults remain susceptible. Targeted varicella zoster virus (VZV) immunization of susceptible adolescents offers the potential to reduce the pool of susceptible adults, and oral fluid determination of VZV immunity in adolescents is a potential means of identifying susceptible individuals in need of VZV vaccination. The main application of oral fluid testing is in those circumstances where blood sampling is deemed not necessary, or is undesirable, and when the documented sensitivity and specificity of the oral fluid assay methodology to be used is considered sufficient for the purpose intended.
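The closing sentence turns on assay sensitivity and specificity; the small sketch below shows why prevalence also matters when judging whether an assay is sufficient for its intended purpose. The performance figures are hypothetical, not values from this review.

    // Positive predictive value from sensitivity, specificity, and prevalence.
    #include <cstdio>

    int main() {
        double sens = 0.95, spec = 0.98;           // hypothetical assay performance
        for (double prev : {0.001, 0.01, 0.10}) {  // hypothetical prevalence levels
            double ppv = sens * prev / (sens * prev + (1.0 - spec) * (1.0 - prev));
            printf("prevalence %5.1f%% -> positive predictive value %5.1f%%\n",
                   prev * 100.0, ppv * 100.0);
        }
        return 0;
    }

Even a highly specific assay yields a low predictive value when the condition sought is rare, which is one reason fitness for purpose has to be judged per application.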
Developing CORBA-Based Distributed Scientific Applications from Legacy Fortran Programs
NASA Technical Reports Server (NTRS)
Sang, Janche; Kim, Chan; Lopez, Isaac
2000-01-01
Recent progress in distributed object technology has enabled software applications to be developed and deployed easily such that objects or components can work together across the boundaries of the network, different operating systems, and different languages. A distributed object is not necessarily a complete application but rather a reusable, self-contained piece of software that co-operates with other objects in a plug-and-play fashion via a well-defined interface. The Common Object Request Broker Architecture (CORBA), a middleware standard defined by the Object Management Group (OMG), uses the Interface Definition Language (IDL) to specify such an interface for transparent communication between distributed objects. Since IDL can be mapped to any programming language, such as C++, Java, Smalltalk, etc., existing applications can be integrated into a new application and hence the tasks of code re-writing and software maintenance can be reduced. Many scientific applications in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with CORBA objects can increase the code's reusability. For example, scientists could link their scientific applications to vintage Fortran programs such as Partial Differential Equation (PDE) solvers in a plug-and-play fashion. Unfortunately, a CORBA IDL to Fortran mapping has not been proposed, and there seems to be no direct method of generating CORBA objects from Fortran without having to resort to manually writing C/C++ wrappers. In this paper, we present an efficient methodology to integrate Fortran legacy programs into a distributed object framework. Issues and strategies regarding the conversion and decomposition of Fortran codes into CORBA objects are discussed. Our goal is to keep the Fortran codes unmodified. The conversion-aided tool takes the Fortran application program as input and helps programmers generate the C/C++ header file and IDL file for wrapping the Fortran code. Programmers must determine for themselves how to decompose the legacy application into several reusable components based on the cohesion and coupling factors among the functions and subroutines. However, programming effort can still be greatly reduced because function headings and types have been converted to C++ and IDL styles. Most Fortran applications use the COMMON block to facilitate the transfer of a large number of variables among several functions. The COMMON block plays a role similar to that of global variables in C. In a CORBA-compliant programming environment, global variables cannot be used to pass values between objects. One approach to dealing with this problem is to put the COMMON variables into the parameter list. We do not adopt this approach because it requires modification of the Fortran source code, which violates our design consideration. Our approach is to extract the COMMON blocks and convert them into a structure-typed attribute in C++. Through attributes, each component can initialize the variables and return the computation results to the client. We have successfully tested the proposed conversion methodology using the f2c converter. Since f2c only translates Fortran to C, we still needed to edit the converted code to meet C++ and IDL syntax. For example, C++/IDL requires a tag in the structure type, while C does not.
In this paper, we identify the necessary changes to the f2c converter in order to directly generate the C++ header and the IDL file. Our future work is to add a GUI interface to ease the decomposition task by simply dragging and dropping icons.
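As a concrete illustration of the COMMON-block strategy described above, the C++ sketch below mirrors a Fortran COMMON block with a structure-typed attribute on a wrapper class. All names are invented for illustration; a real CORBA servant would be generated from IDL by an ORB, which is not used here, and the Fortran routine is replaced by a stand-in so the sketch is self-contained.

    // Fortran legacy code (hypothetical):
    //   COMMON /SOLVER/ NX, DT, TOL
    //
    // Corresponding IDL (shown as a comment; IDL is not compiled here):
    //   struct SolverState { long nx; double dt; double tol; };
    //   interface PdeSolver { attribute SolverState state; void solve(); };
    #include <cstdio>

    struct SolverState { int nx; double dt; double tol; }; // mirrors COMMON /SOLVER/

    // f2c-style binding to the legacy routine; defined as a stand-in below.
    extern "C" void solver_kernel_(SolverState* s) { s->tol /= 2.0; }

    class PdeSolverImpl {             // shape of the wrapper/servant class
        SolverState state_{};
    public:
        SolverState state() const { return state_; }        // attribute accessors
        void state(const SolverState& s) { state_ = s; }
        void solve() { solver_kernel_(&state_); }            // delegate to legacy code
    };

    int main() {
        PdeSolverImpl solver;
        solver.state({128, 0.01, 1e-6}); // client initializes the COMMON variables
        solver.solve();
        printf("tol after solve: %g\n", solver.state().tol); // result flows back out
        return 0;
    }

The design point is that state crosses the object boundary only through the attribute, so the Fortran source itself never has to change.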
The economics of project analysis: Optimal investment criteria and methods of study
NASA Technical Reports Server (NTRS)
Scriven, M. C.
1979-01-01
Insight is provided toward the development of an optimal program for investment analysis of project proposals offering commercial potential and its components. This involves a critique of economic investment criteria viewed in relation to requirements of engineering economy analysis. An outline for a systems approach to project analysis is given. Application of the Leontief input-output methodology to analysis of projects involving multiple processes and products is investigated. Effective application of elements of neoclassical economic theory to investment analysis of project components is demonstrated. Patterns of both static and dynamic activity levels are incorporated.
Applications of an OO Methodology and CASE to a DAQ System
NASA Astrophysics Data System (ADS)
Bee, C. P.; Eshghi, S.; Jones, R.; Kolos, S.; Magherini, C.; Maidantchik, C.; Mapelli, L.; Mornacchi, G.; Niculescu, M.; Patel, A.; Prigent, D.; Spiwoks, R.; Soloviev, I.; Caprini, M.; Duval, P. Y.; Etienne, F.; Ferrato, D.; Le van Suu, A.; Qian, Z.; Gaponenko, I.; Merzliakov, Y.; Ambrosini, G.; Ferrari, R.; Fumagalli, G.; Polesello, G.
The RD13 project has evaluated the use of the Object Oriented Information Engineering (OOIE) method during the development of several software components connected to the DAQ system. The method is supported by a sophisticated commercial CASE tool (Object Management Workbench) and programming environment (Kappa) which covers the full life-cycle of the software, including model simulation, code generation and application deployment. This paper gives an overview of the method, the CASE tool, and the DAQ components which have been developed, and relates our experiences with the method and tool, its integration into our development environment, and the spiral life-cycle it supports.
Artificial intelligence and the space station software support environment
NASA Technical Reports Server (NTRS)
Marlowe, Gilbert
1986-01-01
In a software system the size of the Space Station Software Support Environment (SSE), no single software development or implementation methodology is presently powerful enough to provide safe, reliable, maintainable, cost-effective real-time or near-real-time software. In an environment that must survive one of the harshest and longest lifetimes, software must be produced that will perform as predicted, from the first time it is executed to the last. Many of the software challenges that will be faced will require strategies borrowed from Artificial Intelligence (AI). AI is the only development area mentioned as an example of a legitimate reason for a waiver from the overall requirement to use the Ada programming language for software development. The limits of applicability of the Ada language, the Ada Programming Support Environment (of which the SSE is a special case), and software engineering to AI solutions are defined by describing a scenario that involves many facets of AI methodologies.
Wilk, Szymon; Michalowski, Martin; Michalowski, Wojtek; Hing, Marisela Mainegra; Farion, Ken
2011-01-01
This paper describes a new methodological approach to reconciling adverse and contradictory activities (called points of contention) that occur when a patient is managed according to two or more concurrently used clinical practice guidelines (CPGs). The need to address these inconsistencies arises when a patient with more than one disease, each of which is a comorbid condition, has to be managed according to different treatment regimens. We propose an automatic procedure that constructs a mathematical guideline model using the Constraint Logic Programming (CLP) methodology, uses this model to identify and mitigate encountered points of contention, and revises the considered CPGs accordingly. The proposed procedure, used as an alerting mechanism and coupled with a guideline execution engine, warns the physician about potential problems with the concurrent application of two or more guidelines. We illustrate the operation of our procedure in a clinical scenario describing the simultaneous use of CPGs for duodenal ulcer and transient ischemic attack. PMID:22195153
Cost benefit analysis of space communications technology: Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Holland, L. D.; Sassone, P. G.; Gallagher, J. J.; Robinette, S. L.; Vogler, F. H.; Zimmer, R. P.
1976-01-01
The questions of (1) whether or not NASA should support the further development of space communications technology, and, if so, (2) which technology's support should be given the highest priority are addressed. Insofar as the issues deal principally with resource allocation, an economics perspective is adopted. The resultant cost benefit methodology utilizes the net present value concept in three distinct analysis stages to evaluate and rank those technologies which pass a qualification test based upon probable (private sector) market failure. User-preference and technology state-of-the-art surveys were conducted (in 1975) to form a data base for the technology evaluation. The program encompassed near-future technologies in space communications earth stations and satellites, including the noncommunication subsystems of the satellite (station keeping, electrical power system, etc.). Results of the research program include confirmation of the applicability of the methodology as well as a list of space communications technologies ranked according to the estimated net present value of their support (development) by NASA.
NASA Technical Reports Server (NTRS)
Wiegmann, Douglas A.
2005-01-01
The NASA Aviation Safety Program (AvSP) has defined several products that will potentially modify airline and/or ATC operations, enhance aircraft systems, and improve the identification of potentially hazardous situations within the National Airspace System (NAS). Consequently, there is a need to develop methods for evaluating the potential safety benefit of each of these intervention products so that resources can be effectively invested. The work used expert judgments to develop Bayesian Belief Networks (BBNs) that model the potential impact that specific interventions may have. Specifically, the present report summarizes methodologies for improving the elicitation of probability estimates during expert evaluations of AvSP products for use in BBNs. The work involved joint efforts between Professor James Luxhoj from Rutgers University and researchers at the University of Illinois. The Rutgers project to develop BBNs received funding from NASA under the title "Probabilistic Decision Support for Evaluating Technology Insertion and Assessing Aviation Safety System Risk." The proposed project was funded separately but supported the existing Rutgers program.
A status of progress for the Laser Isotope Separation (LIS) process
NASA Technical Reports Server (NTRS)
Delionback, L. M.
1976-01-01
An overview of the Laser Isotope Separation (LIS) methodology is given together with illustrations showing a simplified version of the LIS technique, an example of the two-photon photoionization category, and a diagram depicting how the energy levels of various isotopes influence the LIS process. Applications were proposed for the LIS system which, in addition to enriching uranium, could in themselves develop into programs of tremendous scope and breadth. These include the treatment of radioactive wastes from light-water nuclear reactors, enriching the deuterium isotope to make heavy water, and enriching the light isotopes of such elements as titanium for aerospace weight-reducing programs. Economic comparisons of the LIS methodology with the current method of gaseous diffusion indicate an overwhelming advantage; the laser process promises to be 1000 times more efficient. The technique could also be utilized in chemical reactions, with the tuned laser serving as a universal catalyst to determine the speed and direction of a chemical reaction.
NASA Technical Reports Server (NTRS)
Birman, Kenneth; Cooper, Robert; Marzullo, Keith
1990-01-01
The ISIS project has developed a new methodology, virtual synchrony, for writing robust distributed software. High performance multicast, large scale applications, and wide area networks are the focus of interest. Several interesting applications that exploit the strengths of ISIS, including an NFS-compatible replicated file system, are being developed. The META project addresses distributed control in a soft real-time environment incorporating feedback. This domain encompasses examples as diverse as monitoring inventory and consumption on a factory floor, and performing load-balancing on a distributed computing system. One of the first uses of META is for distributed application management: the tasks of configuring a distributed program, dynamically adapting to failures, and monitoring its performance. Recent progress and current plans are reported.
Comparison of Requirements for Composite Structures for Aircraft and Space Applications
NASA Technical Reports Server (NTRS)
Raju, Ivatury S.; Elliott, Kenny B.; Hampton, Roy W.; Knight, Norman F., Jr.; Aggarwal, Pravin; Engelstad, Stephen P.; Chang, James B.
2010-01-01
In this paper, the aircraft and space vehicle requirements for composite structures are compared. It is a valuable exercise to study composite structural design approaches used in the airframe industry, and to adopt methodology that is applicable for space vehicles. The missions, environments, analysis methods, analysis validation approaches, testing programs, build quantities, inspection, and maintenance procedures used by the airframe industry, in general, are not transferable to spaceflight hardware. Therefore, while the application of composite design approaches from other industries is appealing, many aspects cannot be directly utilized. Nevertheless, experiences and research for composite aircraft structures may be of use in unexpected arenas as space exploration technology develops, and so continued technology exchanges are encouraged.
NASA Technical Reports Server (NTRS)
Shoham, Yoav
1994-01-01
The goal of our research is a methodology for creating robust software in distributed and dynamic environments. The approach taken is to endow software objects with explicit information about one another, to have them interact through a commitment mechanism, and to equip them with a speech-act-based communication language. System-level applications include software interoperation and compositionality. A government application of specific interest is an infrastructure for coordination among multiple planners. Daily activity applications include personal software assistants, such as programmable email, scheduling, and newsgroup agents. Research topics include definition of the mental state of agents, design of agent languages as well as interpreters for those languages, and mechanisms for coordination within agent societies such as artificial social laws and conventions.
NASA Astrophysics Data System (ADS)
Fu, Z. H.; Zhao, H. J.; Wang, H.; Lu, W. T.; Wang, J.; Guo, H. C.
2017-11-01
Economic restructuring, water resources management, population planning and environmental protection are subject to the inner uncertainties of a compound system with objectives that are competitive alternatives. Optimization models and water quality models are usually used to solve problems in a certain aspect. To overcome the uncertainty and coupling in regional planning management, an interval fuzzy program combined with a water quality model for regional planning and management has been developed in this study to obtain an "optimal" solution. The model is a hybrid methodology of interval parameter programming (IPP), fuzzy programming (FP), and a general one-dimensional water quality model. The method extends the traditional interval parameter fuzzy programming method by integrating a water quality model into the optimization framework. Meanwhile, water resources carrying capacity, an abstract concept, has been transformed into a specific and calculable index. Moreover, unlike many past studies of water resource management, population has been considered as a significant factor. The results suggested that the methodology was applicable for reflecting the complexities of regional planning and management systems within the planning period. Government policy makers could establish effective industrial structures, water resources utilization patterns and population planning, and better understand the tradeoffs among economic, water resources, population and environmental objectives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
OHara J. M.; Higgins, J.; Fleger, S.
The U.S. Nuclear Regulatory Commission (NRC) reviews the human factors engineering (HFE) programs of applicants for nuclear power plant construction permits, operating licenses, standard design certifications, and combined operating licenses. The purpose of these safety reviews is to help ensure that personnel performance and reliability are appropriately supported. Detailed design review procedures and guidance for the evaluations are provided in three key documents: the Standard Review Plan (NUREG-0800), the HFE Program Review Model (NUREG-0711), and the Human-System Interface Design Review Guidelines (NUREG-0700). These documents were last revised in 2007, 2004 and 2002, respectively. The NRC is committed to the periodic update and improvement of the guidance to ensure that it remains a state-of-the-art design evaluation tool. To this end, the NRC is updating its guidance to stay current with recent research on human performance, advances in HFE methods and tools, and new technology being employed in plant and control room design. NUREG-0711 is the first document to be addressed. We present the methodology used to update NUREG-0711 and summarize the main changes made. Finally, we discuss the current status of the update program and the future plans.
NASA Technical Reports Server (NTRS)
Funk, Christie J.
2013-01-01
A software program and associated methodology to study gust loading on aircraft exists for a class of geometrically simplified flexible configurations. This program consists of a simple aircraft response model with two rigid and three flexible symmetric degrees of freedom and allows for the calculation of various airplane responses due to a discrete one-minus-cosine gust as well as continuous turbulence. Simplifications, assumptions, and opportunities for potential improvement pertaining to the existing software program are first identified; a revised version of the original software tool is then developed with improved methodology to include more complex geometries, additional excitation cases, and additional output data, so as to provide a more useful and accurate tool for gust load analysis. Revisions are made in the categories of aircraft geometry, computation of aerodynamic forces and moments, and implementation of horizontal tail mode shapes. To enhance the usefulness of the original software program, a wing control surface and a horizontal tail control surface are added, an extended application of the discrete one-minus-cosine gust input is employed, a supplemental continuous turbulence spectrum is implemented, and a capability to animate the total vehicle deformation response to gust inputs is included. These revisions and enhancements are implemented, and an analysis of the results is used to validate the modifications.
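The discrete one-minus-cosine gust referenced above has a simple standard shape (the FAR/JAR-style profile). The sketch below evaluates it; the gust amplitude and gradient distance are example values, not the report's cases.

    // Discrete one-minus-cosine gust profile: U(s) = (Uds/2)(1 - cos(pi*s/H)),
    // defined over 0 <= s <= 2H and peaking at the gust gradient distance s = H.
    #include <cmath>
    #include <cstdio>

    double gustVelocity(double s, double Uds, double H) {
        const double PI = std::acos(-1.0);
        if (s < 0.0 || s > 2.0 * H) return 0.0;          // outside the gust field
        return 0.5 * Uds * (1.0 - std::cos(PI * s / H));
    }

    int main() {
        double Uds = 15.0;  // design gust velocity, m/s (example)
        double H   = 110.0; // gust gradient distance, m (example)
        for (double s = 0.0; s <= 2.0 * H; s += 55.0)
            printf("s = %6.1f m  U = %5.2f m/s\n", s, gustVelocity(s, Uds, H));
        return 0;
    }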
Koorts, Harriet; Gillison, Fiona
2015-11-06
Communities are a pivotal setting in which to promote increases in child and adolescent physical activity behaviours. Interventions implemented in these settings require effective evaluation to facilitate translation of findings to wider settings. The aims of this paper are to i) present findings from a RE-AIM evaluation of a community-based physical activity program, and ii) review the methodological challenges faced when applying RE-AIM in practice. A single mixed-methods case study was conducted based on a concurrent triangulation design. Five sources of data were collected via interviews, questionnaires, archival records, documentation and field notes. Evidence was triangulated within RE-AIM to assess individual and organisational-level program outcomes. Inconsistent availability of data and a lack of robust reporting challenged assessment of all five dimensions. Reach, Implementation and setting-level Adoption were less successful, Effectiveness and Maintenance at an individual and organisational level were moderately successful. Only community-level Adoption was highly successful, reflecting the key program goal to provide community-wide participation in sport and physical activity. This research highlighted important methodological constraints associated with the use of RE-AIM in practice settings. Future evaluators wishing to use RE-AIM may benefit from a mixed-method triangulation approach to offset challenges with data availability and reliability.
A Discrepancy-Based Methodology for Nuclear Training Program Evaluation.
ERIC Educational Resources Information Center
Cantor, Jeffrey A.
1991-01-01
A three-phase comprehensive process for commercial nuclear power training program evaluation is presented. The discrepancy-based methodology was developed after the Three Mile Island nuclear reactor accident. It facilitates analysis of program components to identify discrepancies among program specifications, actual outcomes, and industry…
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Railkar, Sudhir B.
1988-01-01
This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-09
... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research To Support the National... Redesign Research (NCVS-RR) program: Methodological Research to Support the National Crime Victimization...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-07
... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB No. 1121-NEW] Agency Information Collection Activities: Proposed Collection; Comments Requested Methodological Research To Support the National Crime... related to the National Crime Victimization Survey Redesign Research (NCVS-RR) program: Methodological...
Can Programming Frameworks Bring Smartphones into the Mainstream of Psychological Science?
Piwek, Lukasz; Ellis, David A.
2016-01-01
Smartphones continue to provide huge potential for psychological science and the advent of novel research frameworks brings new opportunities for researchers who have previously struggled to develop smartphone applications. However, despite this renewed promise, smartphones have failed to become a standard item within psychological research. Here we consider the key issues that continue to limit smartphone adoption within psychological science and how these barriers might be diminishing in light of ResearchKit and other recent methodological developments. We conclude that while these programming frameworks are certainly a step in the right direction it remains challenging to create usable research-orientated applications with current frameworks. Smartphones may only become an asset for psychology and social science as a whole when development software that is both easy to use and secure becomes freely available. PMID:27602010
MHSS: a material handling system simulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pomernacki, L.; Hollstien, R.B.
1976-04-07
A Material Handling System Simulator (MHSS) program is described that provides specialized functional blocks for modeling and simulation of nuclear material handling systems. Models of nuclear fuel fabrication plants may be built using functional blocks that simulate material receiving, storage, transport, inventory, processing, and shipping operations as well as the control and reporting tasks of operators or on-line computers. Blocks are also provided that allow the user to observe and gather statistical information on the dynamic behavior of simulated plants over single or replicated runs. Although it is currently being developed for the nuclear materials handling application, MHSS can be adapted to other industries in which material accountability is important. In this paper, emphasis is on the simulation methodology of the MHSS program with application to the nuclear material safeguards problem. (auth)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schock, A.; Noravian, H.; Or, C.
1997-12-31
This paper presents the background and introduction to the OSC AMTEC (Alkali Metal Thermal-to-Electrical Conversion) studies, which were conducted for the Department of Energy (DOE) and NASA's Jet Propulsion Laboratory (JPL). After describing the basic principle of AMTEC, the paper describes and explains the operation of multi-tube vapor/vapor cells, which have been under development by AMPS (Advanced Modular Power Systems, Inc.) for the Air Force Phillips Laboratory (AFPL) and JPL for possible application to the Europa Orbiter, Pluto Express, and other space missions. It then describes a novel OSC-generated methodology for analyzing the performance of such cells. This methodology consists of an iterative procedure for the coupled solution of the interdependent thermal, electrical, and fluid flow differential and integral equations governing the performance of AMTEC cells and generators, taking proper account of the non-linear axial variations of temperature, pressure, open-circuit voltage, inter-electrode voltages, current density, axial current, sodium mass flow rate, and power density. The paper illustrates that analytical procedure by applying it to OSC's latest cell design and by presenting detailed analytical results for that design. The OSC-developed analytic methodology constitutes a unique and powerful tool for accurate parametric analyses and design optimizations of multi-tube AMTEC cells and of radioisotope power systems. This is illustrated in two companion papers in these proceedings. The first of those papers applies the OSC-derived program to determine the effect of various design parameters on the performance of single AMTEC cells with adiabatic side walls, culminating in an OSC-recommended revised cell design. The second describes a number of OSC-generated AMTEC generator designs consisting of 2 and 3 GPHS heat source modules, 16 multi-tube converter cells, and a hybrid insulation design, and presents the results of applying the above analysis program to determine the applicability of those generators to possible future missions under consideration by NASA.
Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's
NASA Technical Reports Server (NTRS)
Jadaan, Osama
2003-01-01
This effort is to investigate probabilistic life prediction methodologies for ceramic matrix composites and MicroElectroMechanical Systems (MEMS) and to analyze designs that determine the stochastic properties of MEMS. For CMC's this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and, if feasible, to run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
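Two of the ideas named above, the Weibull failure probability for a stressed volume and the size effect it implies (larger specimens are weaker at equal failure probability), fit in a few lines. The parameter values below are illustrative only, not measured MEMS or CMC properties.

    // Weibull failure probability and size effect (illustrative parameters).
    #include <cmath>
    #include <cstdio>

    int main() {
        double m      = 10.0;   // Weibull modulus (illustrative)
        double sigma0 = 400.0;  // characteristic strength, MPa, for volume V0
        double V0     = 1.0;    // reference volume, mm^3

        // Failure probability of a uniformly stressed volume V at stress sigma:
        // Pf = 1 - exp(-(V/V0) * (sigma/sigma0)^m)
        auto Pf = [&](double V, double sigma) {
            return 1.0 - std::exp(-(V / V0) * std::pow(sigma / sigma0, m));
        };
        printf("Pf(V = 1 mm^3, 300 MPa) = %.4f\n", Pf(1.0, 300.0));

        // Size effect: strength ratio of two volumes at equal failure probability,
        // sigma1/sigma2 = (V2/V1)^(1/m)
        double V1 = 1.0, V2 = 1000.0;
        printf("sigma(V2)/sigma(V1) at equal Pf = %.3f\n",
               std::pow(V1 / V2, 1.0 / m));
        return 0;
    }

With m = 10, a thousandfold volume increase halves the strength at equal failure probability, which is why confirming Weibull behavior in bulge-test specimens matters before scaling designs.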
Integrated research training program of excellence in radiochemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lapi, Suzanne
2015-09-18
The overall goal of this "Integrated Research Training Program of Excellence in Radiochemistry" is to provide a rich and deep research experience in state-of-the-art radiochemistry and in the fundamentals of radioisotopic labeling and tracer methodology, to develop researchers who are capable of meeting the challenges of designing and preparing radiotracers of broad applicability for monitoring and imaging diverse biological systems and environmental processes. This program was based in the Departments of Radiology and Radiation Oncology at Washington University Medical School and the Department of Chemistry at the University of Illinois at Urbana-Champaign, and it was initially directed by Professor Michael J. Welch as Principal Investigator. After his passing in 2012, the program was led by Professor Suzanne E. Lapi. Programmatic content and participant progress were overseen by an Internal Advisory Committee of senior investigators consisting of the PIs, Professor Mach from the Department of Radiology at Washington University, and Professor John A. Katzenellenbogen of the Department of Chemistry at the University of Illinois. A small External Advisory Committee of experts in radiolabeled compounds and in their applications in environmental and plant science was also constituted to give overall program guidance.
Search and retrieval of office files using dBASE 3
NASA Technical Reports Server (NTRS)
Breazeale, W. L.; Talley, C. R.
1986-01-01
Described is a method of automating the office files retrieval process using a commercially available software package (dBASE III). The resulting product is a menu-driven computer program which requires no computer skills to operate. One part of the document is written for the potential user who has minimal computer experience and uses sample menu screens to explain the program, while a second part is oriented toward the computer-literate individual and includes rather detailed descriptions of the methodology and search routines. Although many of the programming techniques are explained, this document is not intended to be a tutorial on dBASE III. It is hoped that the document will serve as a stimulus for other applications of dBASE III.
DYNA3D: A computer code for crashworthiness engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hallquist, J.O.; Benson, D.J.
1986-09-01
A finite element program with crashworthiness applications has been developed at LLNL. DYNA3D, an explicit, fully vectorized, finite deformation structural dynamics program, has four capabilities that are critical for the efficient and realistic modeling of crash phenomena: (1) fully optimized nonlinear solid, shell, and beam elements for representing a structure; (2) a broad range of constitutive models for simulating material behavior; (3) sophisticated contact algorithms for impact interactions; and (4) a rigid body capability to represent the bodies away from the impact region at a greatly reduced cost without sacrificing accuracy in the momentum calculations. Basic methodologies of the program are briefly presented along with several crashworthiness calculations. Efficiencies of the Hughes-Liu and Belytschko-Tsay shell formulations are considered.
Preparing practicing dentists to engage in practice-based research
DeRouen, Timothy A.; Hujoel, Philippe; Leroux, Brian; Mancl, Lloyd; Sherman, Jeffrey; Hilton, Thomas; Berg, Joel; Ferracane, Jack
2013-01-01
Background: The authors describe an educational program designed to prepare practicing dentists to engage in practice-based research in their practices, a trend receiving more emphasis and funding from the National Institute of Dental and Craniofacial Research (NIDCR). Methods: The Northwest Practice-based REsearch Collaborative in Evidence-based DENTistry (PRECEDENT), an NIDCR-funded network of which the authors are members, developed a one-day educational program to educate practitioners in principles of good clinical research. The program has four components built around the following questions: “What is the question?”; “What are the options?”; “How do you evaluate the evidence?”; and “How do you conduct a study?” Results: The intensive one-day program, first offered in early 2006, which concluded with applications of research principles to research topics of interest to practitioners, was well received. Despite their admission that the research methodology by itself was not of great interest, the dentists recognized the importance of the background material in equipping them to conduct quality studies in their practices. Conclusions: Dentists interested in participating in practice-based research view training in research methodology as helpful to becoming better practitioner-investigators. The PRECEDENT training program seemed to reinforce their interest. Practice Implications: As dentistry evolves to become more evidence-based, more and more of the evidence will come from practice-based research. This training program prepares practicing dentists to become engaged in this trend. PMID:18310739
ERIC Educational Resources Information Center
Davis, Jamie D., Ed.; Erickson, Jill Shepard, Ed.; Johnson, Sharon R., Ed.; Marshall, Catherine A., Ed.; Running Wolf, Paulette, Ed.; Santiago, Rolando L., Ed.
This first symposium of the Work Group on American Indian Research and Program Evaluation Methodology (AIRPEM) explored American Indian and Alaska Native cultural considerations in relation to "best practices" in research and program evaluation. These cultural considerations include the importance of tribal consultation on research…
2017-09-01
The Expanded Application of Forensic Science and Law Enforcement Methodologies in Army Counterintelligence: A Research Project ... Jul 2017 ... CW2 Stockham, Braden E. National ... forensic science resources, law enforcement methodologies and procedures, and basic investigative training. In order to determine if these changes would
NASA Technical Reports Server (NTRS)
McDougal, Kristopher J.
2008-01-01
More and more test programs are requiring high frequency measurements. Marshall Space Flight Center's Cold Flow Test Facility has an interest in acquiring such data. The acquisition of these data requires special hardware and capabilities. This document provides a structured trade study approach for determining which additional capabilities of a VXI-based data acquisition system should be utilized to meet the test facility objectives. The paper is focused on the trade study approach, detailing and demonstrating the methodology. A case is presented in which a trade study was initially performed to provide a recommendation for the data system capabilities. Implementation details of the recommended alternative are briefly provided, as well as the system's performance during a subsequent test program. The paper then addresses revisiting the trade study with modified alternatives and attributes to address issues that arose during the subsequent test program. Although the model does not identify a single best alternative for all sensitivities, the trade study process does provide a much better understanding, making it possible to confidently recommend Alternative 3 as the preferred alternative.
NASA Technical Reports Server (NTRS)
Schlater, Nelson J.; Simonds, Charles H.; Ballin, Mark G.
1993-01-01
Applied research and technology development (R&TD) is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Given the increased awareness of limitations in resources, effective R&TD today needs a method for up-front assessment of competing technologies to help guide technology investment decisions. Such an assessment approach must account for uncertainties in system performance parameters, mission requirements and architectures, and internal and external events influencing a development program. The methodology known as decision analysis has the potential to address these issues. It was evaluated by performing a case study assessment of alternative carbon dioxide removal technologies for NASA's proposed First Lunar Outpost program. An approach was developed that accounts for the uncertainties in each technology's cost and performance parameters as well as programmatic uncertainties such as mission architecture. Life cycle cost savings relative to a baseline, adjusted for the cost of money, was used as a figure of merit to evaluate each of the alternative carbon dioxide removal technology candidates. The methodology was found to provide a consistent decision-making strategy for development of new life support technology. The case study results provided insight that was not possible from more traditional analysis approaches.
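The figure of merit described above, life cycle cost savings relative to a baseline adjusted for the cost of money, is a discounted cash flow comparison. A minimal sketch with hypothetical cash flows and discount rate:

    // NPV of life cycle cost savings for a candidate technology vs. a baseline.
    #include <cmath>
    #include <cstdio>

    double npvSavings(const double* baseCost, const double* altCost,
                      int years, double rate) {
        double npv = 0.0;
        for (int t = 0; t < years; ++t)              // discount each year's saving
            npv += (baseCost[t] - altCost[t]) / std::pow(1.0 + rate, t);
        return npv;
    }

    int main() {
        // Year-by-year life cycle costs, $M (year 0 = development outlay) -- hypothetical
        double base[] = {0.0, 4.0, 4.0, 4.0, 4.0};   // keep baseline technology
        double alt[]  = {6.0, 1.5, 1.5, 1.5, 1.5};   // develop the alternative
        double fom = npvSavings(base, alt, 5, 0.07); // 7% cost of money (assumed)
        printf("Life cycle cost savings (NPV): $%.2fM\n", fom);
        return 0;
    }

A positive value means the discounted operating savings outweigh the upfront development cost; in a full decision analysis each cost term would itself carry a probability distribution rather than a point estimate.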
Propulsion integration of hypersonic air-breathing vehicles utilizing a top-down design methodology
NASA Astrophysics Data System (ADS)
Kirkpatrick, Brad Kenneth
In recent years, a focus of aerospace engineering design has been the development of advanced design methodologies and frameworks to account for increasingly complex and integrated vehicles. Techniques such as parametric modeling, global vehicle analyses, and interdisciplinary data sharing have been employed in an attempt to improve the design process. The purpose of this study is to introduce a new approach to integrated vehicle design known as the top-down design methodology. In the top-down design methodology, the main idea is to relate design changes on the vehicle system and sub-system level to a set of over-arching performance and customer requirements. Rather than focusing on the performance of an individual system, the system is analyzed in terms of the net effect it has on the overall vehicle and other vehicle systems. This detailed level of analysis can only be accomplished through the use of high fidelity computational tools such as Computational Fluid Dynamics (CFD) or Finite Element Analysis (FEA). The utility of the top-down design methodology is investigated through its application to the conceptual and preliminary design of a long-range hypersonic air-breathing vehicle for a hypothetical next generation hypersonic vehicle (NHRV) program. System-level design is demonstrated through the development of the nozzle section of the propulsion system. From this demonstration of the methodology, conclusions are made about the benefits, drawbacks, and cost of using the methodology.
Marshall Space Flight Center's Virtual Reality Applications Program 1993
NASA Technical Reports Server (NTRS)
Hale, Joseph P., II
1993-01-01
A Virtual Reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. Other NASA Centers, most notably Ames Research Center (ARC), have contributed to the development of the VR enabling technologies and VR systems. This VR technology development has now reached a level of maturity where specific applications of VR as a tool can be considered. The objectives of the MSFC VR Applications Program are to develop, validate, and utilize VR as a Human Factors design and operations analysis tool and to assess and evaluate VR as a tool in other applications (e.g., training, operations development, mission support, and teleoperations planning). The long-term goals of this technology program are to enable specialized Human Factors analyses earlier in the hardware and operations development process and to develop more effective training and mission support systems. The capability to perform specialized Human Factors analyses earlier in the hardware and operations development process is required to better refine and validate requirements during the requirements definition phase. This leads to a more efficient design process where perturbations caused by late-occurring requirements changes are minimized. A validated set of VR analytical tools must be developed to enable a more efficient process for the design and development of space systems and operations. Similarly, training and mission support systems must exploit state-of-the-art computer-based technologies to maximize training effectiveness and enhance mission support. The approach of the VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies. These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weier, Dennis R.; Anderson, Kevin K.; Berman, Herbert S.
2005-03-10
The DST Integrity Plan (RPP-7574, 2003, Double-Shell Tank Integrity Program Plan, Rev. 1A, CH2M HILL Hanford Group, Inc., Richland, Washington) requires the ultrasonic wall thickness measurement of two vertical scans of the tank primary wall while using a single riser location. The resulting measurements are then used in extreme value methodology to predict the minimum wall thickness expected for the entire tank. The representativeness of using a single riser in this manner to draw conclusions about the entire circumference of a tank has been questioned. The only data available with which to address the representativeness question comes from Tank AY-101, since only for that tank have multiple risers been used for such inspection. The purpose of this report is to (1) further characterize AY-101 riser differences (relative to prior work); (2) propose a methodology for incorporating a "riser difference" uncertainty for subsequent tanks for which only a single riser is used; and (3) specifically apply the methodology to measurements made from a single riser in Tank AN-107.
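A minimal sketch of the extreme-value step, assuming hypothetical per-scan minimum thicknesses and using scipy's Gumbel (minimum) distribution; the scan data, the number of scan-equivalents per tank, and the percentile are all invented for illustration:

import numpy as np
from scipy import stats

scan_minima = np.array([0.487, 0.492, 0.481, 0.495, 0.489])  # inches, illustrative

# gumbel_l is scipy's extreme-value distribution for minima.
loc, scale = stats.gumbel_l.fit(scan_minima)

# If the full circumference corresponds to roughly m independent scan widths,
# the tank-wide minimum of m draws has CDF 1 - (1 - F(x))**m; invert it for a
# lower percentile of the tank-wide minimum thickness.
m, p = 40, 0.05
x = stats.gumbel_l.ppf(1.0 - (1.0 - p) ** (1.0 / m), loc, scale)
print("5th-percentile tank-wide minimum thickness:", round(float(x), 3), "in")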
NASA Technical Reports Server (NTRS)
Zyla, L. V.
1979-01-01
The modifications necessary to give the Houston Operations Predictor/Estimator (HOPE) program the capability to solve for or consider vent forces in orbit determination are described. The model implemented in solving for vent forces is described, along with the integrator problems encountered. A summary derivation of the mathematical principles applicable to solve/consider methodology is provided.
Comprehensive Reproductive System Care Program - Clinical Breast Care Project (CRSCP-CBCP)
2006-09-01
policy or decision unless so designated by other documentation...on those results. W81XWH-05-2-0053, Principal Investigator: Craig D. Shriver, COL MC. Summary of the methodology of the project. The five pillars...exploration of warehoused data from an individual patient. An application prototype has been developed to enable users to access clinical or experimental
2010-08-19
highlight the benefits of regenerative braking. Parameters within the drive cycle may include vehicle speed, elevation/grade changes, road surface...assist to downsize the engine due to infinite maximum speed requirements • Drive cycle less suited to regenerative braking improvement compared to...will be cycle dependent. A high-speed drive cycle may, for example, drive a focus on aerodynamic improvements, while high frequency of braking will
Laminated Object Manufacturing-Based Design Ceramic Matrix Composites
2001-04-01
components for DoD applications. Program goals included the development of (1) a new LOM-based design methodology for CMC, (2) optimized preceramic polymer...Detail of LOM Composites Forming System w/ glass fiber/polymer laminate...such as polymer matrix composites have faced similar barriers to implementation. These barriers have been overcome through the development of suitable
1989-10-01
Northeast Artificial Intelligence Consortium (NAIC)...RADC-TR-89-259, Vol XI (of twelve), Interim Report, October 1989. Northeast Artificial Intelligence Consortium (NAIC) Annual Report...Monitoring organization: Rome Air Development
A Methodology for Formal Hardware Verification, with Application to Microprocessors.
1993-08-29
concurrent programming languages. Proceedings of the NATO Advanced Study Institute on Logics and Models of Concurrent Systems (Colle-sur-Loup, France, 8-19...restricted class of formulas. Bose and Fisher [26] developed a symbolic model checker based on a Cosmos switch-level model. Their modeling approach...verification using SDVS: the method and a case study. 17th Annual Microprogramming Workshop (New Orleans, LA, 30 October-2 November 1984). Published as
Examination and Implementation of a Proposal for a Ph.D. Program in Administrative Sciences
1992-03-01
Review of two proposals recently approved by the Academic Council (i.e., the Computer Science and Mathematics Departments). C. SCOPE OF THE STUDY: Since WWII...and through the computer age, the application of administrative science theory and methodologies from the behavioral sciences and quantitative...roles in the U.S. Navy and DoD, providing people who firmly understand the technical and organizational aspects of computer-based systems which support
A revised load estimation procedure for the Susquehanna, Potomac, Patuxent, and Choptank rivers
Yochum, Steven E.
2000-01-01
The U.S. Geological Survey's Chesapeake Bay River Input Program has updated the nutrient and suspended-sediment load data base for the Susquehanna, Potomac, Patuxent, and Choptank Rivers using a multiple-window, center-estimate regression methodology. The revised method optimizes the seven-parameter regression approach that has been used historically by the program. The revised method estimates load using the fifth or center year of a sliding 9-year window. Each year a new model is run for each site and constituent, the most recent year is added, and the previous 4 years of estimates are updated. The fifth year in the 9-year window is considered the best estimate and is kept in the data base. The last year of estimation shows the most change from the previous year's estimate and this change approaches a minimum at the fifth year. Differences between loads computed using this revised methodology and the loads populating the historical data base have been noted but the load estimates do not typically change drastically. The data base resulting from the application of this revised methodology is populated by annual and monthly load estimates that are known with greater certainty than in the previous load data base.
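The multiple-window, center-estimate idea can be sketched as below. The seven-parameter log-linear form (intercept, flow, flow squared, time, time squared, and annual sine/cosine terms) is the standard one for such load regressions; the synthetic data, the plain least-squares fit, and the simple summation to an annual load are illustrative simplifications, not the Survey's actual code.

import numpy as np

def design(lnQ, t):
    # Seven-parameter model: ln(load) ~ 1, lnQ, lnQ^2, t, t^2, sin(2*pi*t), cos(2*pi*t)
    return np.column_stack([np.ones_like(t), lnQ, lnQ**2, t, t**2,
                            np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])

def center_estimates(year, lnQ, t, lnL, window=9):
    # Fit the model to each sliding 9-year window; keep the center (5th) year.
    best = {}
    years = np.unique(year)
    for i in range(len(years) - window + 1):
        win = years[i:i + window]
        m = np.isin(year, win)
        beta, *_ = np.linalg.lstsq(design(lnQ[m], t[m]), lnL[m], rcond=None)
        c = year == win[window // 2]
        best[int(win[window // 2])] = float(np.exp(design(lnQ[c], t[c]) @ beta).sum())
    return best

rng = np.random.default_rng(0)
year = np.repeat(np.arange(1985, 2000), 365)   # 15 synthetic years of daily data
t = np.arange(year.size) / 365.0
lnQ = rng.normal(size=year.size)               # log streamflow
lnL = 1.0 + 0.8 * lnQ + 0.1 * np.sin(2 * np.pi * t) + rng.normal(0, 0.1, year.size)
print(center_estimates(year, lnQ, t, lnL))     # one kept estimate per center year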
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.
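A toy Monte Carlo version of the probabilistic step: sample assumed uncertainties in primitive variables, propagate them through a stand-in response function, and read off a failure probability. All distributions and the response form are invented for illustration; they are not the SSME blade models.

import numpy as np

rng = np.random.default_rng(1)
N = 100000
pressure = rng.normal(3000.0, 150.0, N)       # psi, assumed uncertainty
temperature = rng.normal(1400.0, 60.0, N)     # deg F, assumed uncertainty
strength = rng.lognormal(np.log(10.0), 0.08, N)

# Hypothetical response: an effective stress that grows with load and temperature.
stress = 2.5e-3 * pressure + 1.0e-3 * temperature

print("estimated failure probability:", np.mean(stress > strength))
# np.sort(stress) gives the empirical CDF of the structural response variable.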
Technology, Data Bases and System Analysis for Space-to-Ground Optical Communications
NASA Technical Reports Server (NTRS)
Lesh, James
1995-01-01
Optical communications is becoming an ever more important option for designers of space-to-ground communications links, whether for government or commercial applications. In this paper the technology being developed by NASA for use in space-to-ground optical communications is presented. Next, a program which is collecting a long-term data base of atmospheric visibility statistics for optical propagation through the atmosphere is described. Finally, a methodology for utilizing the statistics of the atmospheric data base in the analysis of space-to-ground links is presented. This methodology takes into account the effects of station availability, is useful when comparing optical communications with microwave systems, and provides a rationale for establishing the recommended link margin.
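One simple ingredient of such a methodology can be sketched directly: if each candidate ground station has an independent probability of a cloud-free line of sight, the availability of a network is one minus the probability that every station is clouded out. The station probabilities below are invented, not values from the data base.

def network_availability(p_clear):
    # Probability that at least one station can close the optical link.
    q = 1.0
    for p in p_clear:
        q *= (1.0 - p)
    return 1.0 - q

print(network_availability([0.65, 0.70, 0.55]))  # three hypothetical stations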
NASA Astrophysics Data System (ADS)
McPhee, J.; William, Y. W.
2005-12-01
This work presents a methodology for pumping test design based on the reliability requirements of a groundwater model. Reliability requirements take into consideration the application of the model results in groundwater management, expressed in this case as a multiobjective management model. The pumping test design is formulated as a mixed-integer nonlinear programming (MINLP) problem and solved using a combination of genetic algorithm (GA) and gradient-based optimization. Bayesian decision theory provides a formal framework for assessing the influence of parameter uncertainty on the reliability of the proposed pumping test. The proposed methodology is useful for selecting a robust design that will outperform all other candidate designs under most potential 'true' states of the system.
McMillen, J Curtis; Lee, Bethany R; Jonson-Reid, Melissa
2008-05-01
This study assessed whether administrative data from the public child welfare system could be used to develop risk-adjusted performance reports for residential mental health programs for adolescents. Regression methods were used with 3,759 residential treatment spells for 2,784 children and youth to determine which outcomes could be adequately risk adjusted for case mix. Expected outcomes were created for each residential program given its case mix; then, expected and achieved outcomes were compared. For most programs, achieved results did not differ significantly from expected results for individual outcomes. Overall, outcomes achieved were not impressive. Only one quarter of spells resulted in a youth being maintained in a single less restrictive setting in the year following discharge. Methodological implications of this study suggest further refinements are needed for child welfare administrative data in order to develop risk-adjusted report cards of program performance.
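The risk-adjustment comparison can be sketched on synthetic data as follows: fit an outcome model to case-mix covariates, average the model's predictions within each program to get its expected rate, and compare with the achieved rate. The covariates, program count, and coefficients are all invented.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 3000
X = rng.normal(size=(n, 3))               # case-mix covariates (e.g., age, severity)
program = rng.integers(0, 10, size=n)     # ten hypothetical residential programs
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(0.8 * X[:, 0] - 0.5)))).astype(int)

expected = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
for p in range(10):
    m = program == p
    print(f"program {p}: achieved {y[m].mean():.2f} vs expected {expected[m].mean():.2f}")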
Methodological convergence of program evaluation designs.
Chacón-Moscoso, Salvador; Anguera, M Teresa; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa
2014-01-01
Nowadays, the confronting dichotomous view between experimental/quasi-experimental and non-experimental/ethnographic studies still exists but, despite the extensive use of non-experimental/ethnographic studies, the most systematic work on methodological quality has been developed based on experimental and quasi-experimental studies. This hinders evaluators' and planners' practice of empirical program evaluation, a sphere in which the distinction between types of study is changing continually and is less clear. Based on the classical validity framework of experimental/quasi-experimental studies, we carry out a review of the literature in order to analyze the convergence of design elements in methodological quality in primary studies in systematic reviews and ethnographic research. We specify the relevant design elements that should be taken into account in order to improve validity and generalization in program evaluation practice in different methodologies from a practical, methodological, and complementary view. We recommend ways to improve design elements so as to enhance validity and generalization in program evaluation practice.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-13
...--Methodology for Calculating "on" or "off" Total Unemployment Rate Indicators for Purposes of Determining..."on" or "off" total unemployment rate (TUR) indicators to determine when extended benefit (EB) periods...-State Extended Benefits Program--Methodology for Calculating "on" or "off" Total Unemployment Rate...
NASA Technical Reports Server (NTRS)
Ogburn, Marilyn E.; Foster, John V.; Hoffler, Keith D.
2005-01-01
This paper reviews the use of piloted simulation at Langley Research Center as part of the NASA High-Angle-of-Attack Technology Program (HATP), which was created to provide concepts and methods for the design of advanced fighter aircraft. A major research activity within this program is the development of the design processes required to take advantage of the benefits of advanced control concepts for high-angle-of-attack agility. Fundamental methodologies associated with the effective use of piloted simulation for this research are described, particularly those relating to the test techniques, validation of the test results, and design guideline/criteria development.
NASA Astrophysics Data System (ADS)
Fu, Linyun; Ma, Xiaogang; Zheng, Jin; Goldstein, Justin; Duggan, Brian; West, Patrick; Aulenbach, Steve; Tilmes, Curt; Fox, Peter
2014-05-01
This poster will show how we used a case-driven iterative methodology to develop an ontology to represent the content structure and the associated provenance information in a National Climate Assessment (NCA) report of the US Global Change Research Program (USGCRP). We applied the W3C PROV-O ontology to implement a formal representation of provenance. We argue that the use case-driven, iterative development process and the application of a formal provenance ontology help efficiently incorporate domain knowledge from earth and environmental scientists in a well-structured model interoperable in the context of the Web of Data.
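A minimal sketch of recording one such provenance link with the W3C PROV-O vocabulary via Python's rdflib; PROV-O and its terms are real, while the example namespace and the report-element names are hypothetical:

from rdflib import Graph, Namespace, RDF

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/nca#")

g = Graph()
g.bind("prov", PROV)

figure, dataset = EX["figure-2-1"], EX["temperature-dataset"]
g.add((figure, RDF.type, PROV.Entity))
g.add((dataset, RDF.type, PROV.Entity))
g.add((figure, PROV.wasDerivedFrom, dataset))  # the provenance link in the model

print(g.serialize(format="turtle"))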
Selected methods for quantification of community exposure to aircraft noise
NASA Technical Reports Server (NTRS)
Edge, P. M., Jr.; Cawthorn, J. M.
1976-01-01
A review of the state-of-the-art for the quantification of community exposure to aircraft noise is presented. Physical aspects, people-response considerations, and practicalities of useful application of scales of measure are included. Historical background up through the current technology is briefly presented. The developments of both single-event and multiple-event scales are covered. Scales currently at the forefront of interest are selected, and recommended methodology is presented for use in computer programming to translate aircraft noise data into predictions of community noise exposure. Brief consideration is given to future programming developments and to supportive research needs.
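As one concrete example of translating single-event data into a community exposure scale, the sketch below computes a day-night average sound level (DNL) from per-event sound exposure levels (SELs) with the standard 10 dB nighttime penalty; the operation counts and levels are invented:

import math

def dnl(events):
    # events: (sel_dB, is_night) pairs for one 24-hour day
    energy = sum(10 ** (sel / 10.0) * (10.0 if night else 1.0)
                 for sel, night in events)
    return 10.0 * math.log10(energy / 86400.0)  # 86,400 seconds per day

events = [(95.0, False)] * 40 + [(95.0, True)] * 5  # hypothetical daily operations
print("DNL =", round(dnl(events), 1), "dB")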
The Geostationary Operational Environmental Satellite (GOES) Product Generation System
NASA Technical Reports Server (NTRS)
Haines, S. L.; Suggs, R. J.; Jedlovec, G. J.
2004-01-01
The Geostationary Operational Environmental Satellite (GOES) Product Generation System (GPGS) is introduced and described. GPGS is a set of computer programs developed and maintained at the Global Hydrology and Climate Center and is designed to generate meteorological data products using visible and infrared measurements from the GOES-East Imager and Sounder instruments. The products that are produced by GPGS are skin temperature, total precipitable water, cloud top pressure, cloud albedo, surface albedo, and surface insolation. A robust cloud mask is also generated. The retrieval methodology for each product is described, including algorithm descriptions and required inputs and outputs for the programs. Validation is supplied where applicable.
Restricted hip mobility: clinical suggestions for self-mobilization and muscle re-education.
Reiman, Michael P; Matheson, J W
2013-10-01
Restricted hip mobility has shown strong correlation with various pathologies of the hip, lumbar spine, and lower extremity. Restricted mobility can consequently have deleterious effects not only at the involved joint but throughout the entire kinetic chain. Promising findings suggest benefit from skilled joint mobilization intervention for clients with various hip pathologies. Supervised home program interventions, while lacking specifically for the hip joint, are demonstrating promising results in other regions of the body. Application of an accompanying home program for the purpose of complementing skilled, in-clinic intervention is advisable for those clients that respond favorably to such methodology. Level of Evidence: 5.
Kang, Stella K; Rawson, James V; Recht, Michael P
2017-12-05
Provided with methodologic training, more imagers can contribute to the evidence basis on improved health outcomes and value in diagnostic imaging. The Value of Imaging Through Comparative Effectiveness Research Program was developed to provide hands-on, practical training in five core areas for comparative effectiveness and big biomedical data research: decision analysis, cost-effectiveness analysis, evidence synthesis, big data principles, and applications of big data analytics. The program's mixed format consists of web-based modules for asynchronous learning as well as in-person sessions for practical skills and group discussion. Seven diagnostic radiology subspecialties and cardiology are represented in the first group of program participants, showing the collective potential for greater depth of comparative effectiveness research in the imaging community.
NASA Technical Reports Server (NTRS)
Klein, Vladislav
2002-01-01
The program objectives were defined in the original proposal entitled 'Program of Research in Flight Dynamics in the JIAFS at NASA Langley Research Center', which originated on March 20, 1975, and in yearly renewals of the research program dated December 1, 1998 to December 31, 2002. The program included three major topics: 1) improvement of existing methods and development of new methods for flight and wind tunnel data analysis based on system identification methodology; 2) application of these methods to flight and wind tunnel data obtained from advanced aircraft; 3) modeling and control of aircraft. The principal investigator of the program was Dr. Vladislav Klein, Professor Emeritus at The George Washington University, DC. Seven Graduate Research Scholar Assistants (GRSA) participated in the program. The results of the research conducted during the four years of the total cooperative period were published in two NASA Technical Reports, three theses, and three papers. The list of these publications is included.
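A minimal equation-error sketch of the system identification step: regress a measured aerodynamic coefficient on candidate states to estimate stability and control derivatives. The model structure is a textbook pitching-moment form; the synthetic data and coefficient values are invented:

import numpy as np

rng = np.random.default_rng(2)
n = 500
alpha = rng.normal(0.0, 0.05, n)  # angle of attack, rad
q = rng.normal(0.0, 0.10, n)      # pitch rate, rad/s (nondimensionalized in practice)
de = rng.normal(0.0, 0.05, n)     # elevator deflection, rad

# "Measured" pitching-moment coefficient from an assumed true model plus noise.
Cm = -0.8 * alpha - 12.0 * q - 1.1 * de + rng.normal(0.0, 0.002, n)

X = np.column_stack([alpha, q, de])
coef, *_ = np.linalg.lstsq(X, Cm, rcond=None)
print("estimated [Cm_alpha, Cm_q, Cm_de]:", coef.round(2))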
NASA Technical Reports Server (NTRS)
1971-01-01
The optimal allocation of resources to the national space program over an extended time period requires the solution of a large combinatorial problem in which the program elements are interdependent. The computer model uses an accelerated search technique to solve this problem. The model contains a large number of options selectable by the user to provide flexible input and a broad range of output for use in sensitivity analyses of all entering elements. Examples of these options are budget smoothing under varied appropriation levels, entry of inflation and discount effects, and probabilistic output which provides quantified degrees of certainty that program costs will remain within planned budget. Criteria and related analytic procedures were established for identifying potential new space program directions. Used in combination with the optimal resource allocation model, new space applications can be analyzed in realistic perspective, including the advantage gained from existing space program plant and on-going programs such as the space transportation system.
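A toy version of the underlying combinatorial problem: choose interdependent program elements to maximize (already discounted) value under an annual budget cap. Exhaustive search stands in for the model's accelerated search technique, and every project name and number is invented.

from itertools import combinations

projects = {  # name: (annual_cost, discounted_value, prerequisites)
    "launch_system": (4.0, 10.0, ()),
    "station": (3.0, 7.0, ("launch_system",)),
    "earth_obs": (2.0, 5.0, ("launch_system",)),
    "deep_space": (5.0, 9.0, ()),
}
budget = 9.0

best, best_value = (), 0.0
for r in range(1, len(projects) + 1):
    for combo in combinations(projects, r):
        chosen = set(combo)
        # Interdependence: an element can be funded only with its prerequisites.
        if any(p not in chosen for c in combo for p in projects[c][2]):
            continue
        cost = sum(projects[c][0] for c in combo)
        value = sum(projects[c][1] for c in combo)
        if cost <= budget and value > best_value:
            best, best_value = combo, value
print(best, best_value)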
Design for dependability: A simulation-based approach. Ph.D. Thesis, 1993
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.
1994-01-01
This research addresses issues in simulation-based system-level dependability analysis of fault-tolerant computer systems. The issues and difficulties of providing a general simulation-based approach for system-level analysis are discussed, and a methodology that addresses these issues is presented. The proposed methodology is designed to permit the study of a wide variety of architectures under various fault conditions. It permits detailed functional modeling of architectural features such as sparing policies, repair schemes, and routing algorithms, as well as other fault-tolerant mechanisms, and it allows the execution of actual application software. One key benefit of this approach is that the behavior of a system under faults does not have to be pre-defined, as is normally done. Instead, a system can be simulated in detail and injected with faults to determine its failure modes. The thesis describes how object-oriented design is used to incorporate this methodology into a general purpose design and fault injection package called DEPEND. A software model is presented that uses abstractions of application programs to study the behavior and effect of software on hardware faults in the early design stage when actual code is not available. Finally, an acceleration technique that combines hierarchical simulation, time acceleration algorithms, and hybrid simulation to reduce simulation time is introduced.
An update on technical and methodological aspects for cardiac PET applications.
Presotto, Luca; Busnardo, Elena; Gianolli, Luigi; Bettinardi, Valentino
2016-12-01
Positron emission tomography (PET) is indicated for a large number of cardiac diseases: perfusion and viability studies are commonly used to evaluate coronary artery disease; PET can also be used to assess sarcoidosis and endocarditis, as well as to investigate amyloidosis. Furthermore, a hot topic for research is plaque characterization. Most of these studies are technically very challenging. High count rates and short acquisition times characterize perfusion scans, while very small targets have to be imaged in inflammation/infection and plaque examinations. Furthermore, cardiac PET suffers from respiratory and cardiac motion blur. Each type of study has specific requirements from the technical and methodological point of view, thus PET systems with overall high performance are required. Furthermore, in the era of hybrid PET/computed tomography (CT) and PET/magnetic resonance imaging (MRI) systems, the combination of complementary functional and anatomical information can be used to improve diagnosis and prognosis. Moreover, PET images can be qualitatively and quantitatively improved by exploiting information from the other modality, using advanced algorithms. In this review we report the latest technological and methodological innovations for PET cardiac applications, with particular reference to the state of the art of hybrid PET/CT and PET/MRI. We also report the most recent advancements in software, from reconstruction algorithms to image processing and analysis programs.
2000-02-01
[HIDS] Program: Power Drive Train Crack Detection Diagnostics and Prognostics, Life Usage Monitoring, and Damage Tolerance; Techniques, Methodologies, and...and Prognostics, Life Usage Monitoring, and Damage Tolerance; Techniques, Methodologies, and Experiences. Andrew Hess, Harrison Chin, William Hardman...continuing program to evaluate helicopter diagnostic, prognostic, and...deployed engine monitoring systems in fixed-wing aircraft, notably on the A
Comparison of Requirements for Composite Structures for Aircraft and Space Applications
NASA Technical Reports Server (NTRS)
Raju, Ivatury S.; Elliot, Kenny B.; Hampton, Roy W.; Knight, Norman F., Jr.; Aggarwal, Pravin; Engelstad, Stephen P.; Chang, James B.
2010-01-01
In this report, the aircraft and space vehicle requirements for composite structures are compared. It is a valuable exercise to study composite structural design approaches used in the airframe industry and to adopt methodology that is applicable for space vehicles. The missions, environments, analysis methods, analysis validation approaches, testing programs, build quantities, inspection, and maintenance procedures used by the airframe industry, in general, are not transferable to spaceflight hardware. Therefore, while the application of composite design approaches from aircraft and other industries is appealing, many aspects cannot be directly utilized. Nevertheless, experiences and research for composite aircraft structures may be of use in unexpected arenas as space exploration technology develops, and so continued technology exchanges are encouraged.
NASA Technical Reports Server (NTRS)
1976-01-01
The framework within which the Applications Systems Verification Tests (ASVTs) are performed and the economic consequences of improved meteorological information demonstrated is described. This framework considers the impact of improved information on decision processes, the data needs to demonstrate the economic impact of the improved information, the data availability, the methodology for determining and analyzing the collected data and demonstrating the economic impact of the improved information, and the possible methods of data collection. Three ASVTs are considered and program outlines and plans are developed for performing experiments to demonstrate the economic consequences of improved meteorological information. The ASVTs are concerned with the citrus crop in Florida, the cotton crop in Mississippi and a group of diverse crops in Oregon. The program outlines and plans include schedules, manpower estimates and funding requirements.
Duley, Aaron R; Janelle, Christopher M; Coombes, Stephen A
2004-11-01
The cardiovascular system has been extensively measured in a variety of research and clinical domains. Despite technological and methodological advances in cardiovascular science, the analysis and evaluation of phasic changes in heart rate persist as a way to assess numerous psychological concomitants. Some researchers, however, have pointed to constraints on data analysis when evaluating cardiac activity indexed by heart rate or heart period. Thus, an off-line application toolkit for heart rate analysis is presented. The program, written with National Instruments' LabVIEW, incorporates a variety of tools for off-line extraction and analysis of heart rate data. Current methods and issues concerning heart rate analysis are highlighted, and how the toolkit provides a flexible environment to ameliorate common problems that typically lead to trial rejection is discussed. Source code for this program may be downloaded from the Psychonomic Society Web archive at www.psychonomic.org/archive/.
CFD Code Development for Combustor Flows
NASA Technical Reports Server (NTRS)
Norris, Andrew
2003-01-01
During the lifetime of this grant, work has been performed in the areas of model development, code development, code validation, and code application. Model development included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics, and assumed-PDF work. Many of these models were then implemented in the code, and in addition many improvements were made to the code, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models, and turbulence-chemistry interaction methodology. Validation of all new models and code improvements was also performed, and the code was applied to the ZCET program and the NPSS GEW combustor program. Several important items remain under development, including the NOx post-processing, assumed-PDF model development, and chemical kinetic development. It is expected that this work will continue under the new grant.
NASA Astrophysics Data System (ADS)
Athaudage, Chandranath R. N.; Bradley, Alan B.; Lech, Margaret
2003-12-01
A dynamic programming-based optimization strategy for a temporal decomposition (TD) model of speech and its application to low-rate speech coding in storage and broadcasting is presented. In previous work with the spectral stability-based event localizing (SBEL) TD algorithm, the event localization was performed based on a spectral stability criterion. Although this approach gave reasonably good results, there was no assurance of the optimality of the event locations. In the present work, we have optimized the event localizing task using a dynamic programming-based optimization strategy. Simulation results show that an improved TD model accuracy can be achieved. A methodology for incorporating the optimized TD algorithm within the standard MELP speech coder for the efficient compression of speech spectral information is also presented. The performance evaluation results revealed that the proposed speech coding scheme achieves 50%-60% compression of speech spectral information with negligible degradation in the decoded speech quality.
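The dynamic-programming step can be sketched as optimal segmentation: choose K segments of a parameter sequence so that total within-segment distortion is minimized. Within-segment variance below is an illustrative stand-in for the TD model's actual criterion, and the data are synthetic.

import numpy as np

def segment(frames, K):
    n = len(frames)
    cost = np.zeros((n, n))  # cost[i, j]: distortion of one segment over frames i..j
    for i in range(n):
        for j in range(i, n):
            seg = frames[i:j + 1]
            cost[i, j] = np.sum((seg - seg.mean(axis=0)) ** 2)
    dp = np.full((K + 1, n), np.inf)  # dp[k, j]: best cost, k segments, frames 0..j
    back = np.zeros((K + 1, n), dtype=int)
    dp[1] = cost[0]
    for k in range(2, K + 1):
        for j in range(k - 1, n):
            for i in range(k - 2, j):
                c = dp[k - 1, i] + cost[i + 1, j]
                if c < dp[k, j]:
                    dp[k, j], back[k, j] = c, i + 1
    bounds, j = [], n - 1
    for k in range(K, 1, -1):  # walk back to recover segment start indices
        j = back[k, j] - 1
        bounds.append(j + 1)
    return sorted(bounds)

frames = np.concatenate([np.zeros((20, 3)), np.ones((15, 3)), np.full((10, 3), 2.0)])
print(segment(frames, 3))  # boundaries recovered at frames 20 and 35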
NASA Technical Reports Server (NTRS)
Norris, Jeffrey; Fox, Jason; Rabe, Kenneth; Shu, I-Hsiang; Powell, Mark
2007-01-01
The Plug-in Image Component Widget (PICWidget) is a software component for building digital imaging applications. The component is part of a methodology described in GIS Methodology for Planning Planetary-Rover Operations (NPO-41812), which appears elsewhere in this issue of NASA Tech Briefs. Planetary rover missions return a large number and wide variety of image data products that vary in complexity in many ways. Supported by a powerful, flexible image-data-processing pipeline, the PICWidget can process and render many types of imagery, including (but not limited to) thumbnail, subframed, downsampled, stereoscopic, and mosaic images; images coregistered with orbital data; and synthetic red/green/blue images. The PICWidget is capable of efficiently rendering images from data representing many more pixels than are available at a computer workstation where the images are to be displayed. The PICWidget is implemented as an Eclipse plug-in using the Standard Widget Toolkit, which provides a straightforward interface for re-use of the PICWidget in any number of application programs built upon the Eclipse application framework. Because the PICWidget is tile-based and performs aggressive tile caching, it has the flexibility to perform faster or slower, depending on whether more or less memory is available.
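The tile-caching idea can be sketched as a bounded least-recently-used cache keyed by tile coordinates. The class and method names below are hypothetical, not the PICWidget API.

from collections import OrderedDict

class TileCache:
    def __init__(self, capacity=256):
        self.capacity = capacity
        self._tiles = OrderedDict()  # (level, row, col) -> rendered tile data

    def get(self, key, render):
        # Return a cached tile, or render and cache it, evicting the LRU tile.
        if key in self._tiles:
            self._tiles.move_to_end(key)
            return self._tiles[key]
        tile = render(key)
        self._tiles[key] = tile
        if len(self._tiles) > self.capacity:
            self._tiles.popitem(last=False)
        return tile

cache = TileCache(capacity=2)
render = lambda key: "pixels for " + str(key)
for key in [(0, 0, 0), (0, 0, 1), (0, 1, 0)]:
    cache.get(key, render)
print(len(cache._tiles))  # 2: the least recently used tile was evicted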
Machine intelligence and autonomy for aerospace systems
NASA Technical Reports Server (NTRS)
Heer, Ewald (Editor); Lum, Henry (Editor)
1988-01-01
The present volume discusses progress toward intelligent robot systems in aerospace applications, NASA Space Program automation and robotics efforts, the supervisory control of telerobotics in space, machine intelligence and crew/vehicle interfaces, expert-system terms and building tools, and knowledge-acquisition for autonomous systems. Also discussed are methods for validation of knowledge-based systems, a design methodology for knowledge-based management systems, knowledge-based simulation for aerospace systems, knowledge-based diagnosis, planning and scheduling methods in AI, the treatment of uncertainty in AI, vision-sensing techniques in aerospace applications, image-understanding techniques, tactile sensing for robots, distributed sensor integration, and the control of articulated and deformable space structures.
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Beckman, Carol S.; Benzinger, Leonora; Beshers, George; Hammerslag, David; Kimball, John; Kirslis, Peter A.; Render, Hal; Richards, Paul; Terwilliger, Robert
1985-01-01
The SAGA system is a software environment that is designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. The SAGA system consists of a small number of software components that are adapted by the meta-tools into specific tools for use in the software development application. The modules are designed so that the meta-tools can construct an environment which is both integrated and flexible. The SAGA project is documented in several papers which are presented.
Centralized database for interconnection system design. [for spacecraft
NASA Technical Reports Server (NTRS)
Billitti, Joseph W.
1989-01-01
A database application called DFACS (Database, Forms and Applications for Cabling and Systems) is described. The objective of DFACS is to improve the speed and accuracy of interconnection system information flow during the design and fabrication stages of a project, while simultaneously supporting both the horizontal (end-to-end wiring) and the vertical (wiring by connector) design stratagems used by the Jet Propulsion Laboratory (JPL) project engineering community. The DFACS architecture is centered around a centralized database and program methodology which emulates the manual design process hitherto used at JPL. DFACS has been tested and successfully applied to existing JPL hardware tasks with a resulting reduction in schedule time and costs.
Tonkin, Emma; Brimblecombe, Julie; Wycherley, Thomas Philip
2017-03-01
Smartphone applications are increasingly being used to support nutrition improvement in community settings. However, there is a scarcity of practical literature to support researchers and practitioners in choosing or developing health applications. This work maps the features, key content, theoretical approaches, and methods of consumer testing of applications intended for nutrition improvement in community settings. A systematic, scoping review methodology was used to map published, peer-reviewed literature reporting on applications with a specific nutrition-improvement focus intended for use in the community setting. After screening, articles were grouped into 4 categories: dietary self-monitoring trials, nutrition improvement trials, application description articles, and qualitative application development studies. For mapping, studies were also grouped into categories based on the target population and aim of the application or program. Of the 4818 titles identified from the database search, 64 articles were included. The broad categories of features found to be included in applications generally corresponded to different behavior change support strategies common to many classic behavioral change models. Key content of applications generally focused on food composition, with tailored feedback most commonly used to deliver educational content. Consumer testing before application deployment was reported in just over half of the studies. Collaboration between practitioners and application developers promotes an appropriate balance of evidence-based content and functionality. This work provides a unique resource for program development teams and practitioners seeking to use an application for nutrition improvement in community settings.
Design and development of an IBM/VM menu system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cazzola, D.J.
1992-10-01
This report describes a full screen menu system developed using IBM's Interactive System Productivity Facility (ISPF) and the REXX programming language. The software was developed for the 2800 IBM/VM Electrical Computer Aided Design (ECAD) system. The system was developed to deliver electronic drawing definitions to a corporate drawing release system. Although this report documents the status of the menu system when it was retired, the methodologies used and the requirements defined are very applicable to replacement systems.
Mixed methods for telehealth research.
Caffery, Liam J; Martin-Khan, Melinda; Wade, Victoria
2017-10-01
Mixed methods research is important to health services research because the integrated qualitative and quantitative investigation can give a more comprehensive understanding of complex interventions such as telehealth than can a single-method study. Further, mixed methods research is applicable to translational research and program evaluation. Study designs relevant to telehealth research are described and supported by examples. Quality assessment tools, frameworks to assist in the reporting and review of mixed methods research, and related methodologies are also discussed.
2015-03-01
domains. Major model functions include: • Ground combat: light and heavy forces. • Air mobile forces. • Future forces. • Fixed-wing and rotary-wing...Constraints: • Study must be completed no later than 31 December 2014. • Entity behavior limited to select COMBATXXI Mobility, Unmanned Aerial System...and SQL backend, as well as any open application programming interface (API). • Allows data transparency and data-driven navigation through the model
Development and application of optimum sensitivity analysis of structures
NASA Technical Reports Server (NTRS)
Barthelemy, J. F. M.; Hallauer, W. L., Jr.
1984-01-01
The research focused on developing an algorithm applying optimum sensitivity analysis to multilevel optimization. The research efforts have been devoted to assisting NASA Langley's Interdisciplinary Research Office (IRO) in the development of a mature methodology for a multilevel approach to the design of complex (large and multidisciplinary) engineering systems. An effort was undertaken to identify promising multilevel optimization algorithms. In the current reporting period, the computer program generating baseline single-level solutions was completed and tested.
Developing and Implementing the Data Mining Algorithms in RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea
The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data. To post-process and analyze such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation while the system/physics code models the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is to analyze the large number of scenarios generated. Data mining techniques are typically used to better organize and understand data, i.e., recognizing patterns in the data. This report focuses on development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.
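As an illustration of applying one such data-mining algorithm to sampled scenario data of the kind this analysis produces, the sketch below clusters synthetic scenario outcomes with scikit-learn's KMeans; this is a generic example, not RAVEN's own API.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# Each row is one sampled scenario, e.g. (peak temperature, time of peak).
scenarios = np.vstack([
    rng.normal([900.0, 120.0], [20.0, 5.0], size=(500, 2)),
    rng.normal([1200.0, 80.0], [30.0, 8.0], size=(500, 2)),
])

labels = KMeans(n_clusters=2, n_init=10).fit_predict(scenarios)
for k in range(2):
    members = scenarios[labels == k]
    print("cluster", k, "scenarios:", len(members), "mean:", members.mean(axis=0).round(1))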
Munir, Ahmed R; Prem, Kumar D
2016-03-01
There is a growing awareness among teachers in the complementary and alternative medicine (CAM) disciplines that formal training in educational methodology can improve their performance as teachers and student evaluators. The Training of Trainers programs conducted by Rajiv Gandhi University of Health Sciences, Karnataka, in previous years have brought about a transformation among the teachers who attended those programs. The teachers were also witness to a changing perception among students towards teachers who adopt innovative teaching/assessment strategies. This report illustrates an innovative training activity that was adapted to design a reference model that can be developed into an operational model for large-scale execution. Teachers in the CAM institutions affiliated with Rajiv Gandhi University of Health Sciences, Karnataka, participated in a three-month 'Short Course in Educational Methodology'. This program was delivered in distance-learning mode. The course was organised into four modules. Study material was provided for each module in the form of a study guide and related reference articles in electronic form. There were three contact programs: an induction and introduction session at the beginning of the course, which gave an overview of the entire course and addressed the subject matter of Module 1; a first contact program addressing the learner needs of Modules 2 and 3; and a second contact program for the contents of Module 4. The participants were engaged during the entire course through interactive contact programs, self-study and application of concepts in their teaching/assessment practices, online submission of assignments, and microteaching presentation with peer review. The documentation and raw data generated during the course of training were used to generate an operational model for training university teachers of the health sciences faculty in general and teachers of the CAM disciplines in particular. Establishing a model of training for university teachers who are engaged in health sciences education provides a strong platform for teachers to realise their roles, evolve as conscientious and committed teachers, and infuse their learners with the passion and commitment to become competent in their professional performance.
Deterministic Multiaxial Creep and Creep Rupture Enhancements for CARES/Creep Integrated Design Code
NASA Technical Reports Server (NTRS)
Jadaan, Osama M.
1998-01-01
High temperature and long duration applications of monolithic ceramics can place their failure mode in the creep rupture regime. A previous model advanced by the authors described a methodology by which the creep rupture life of a loaded component can be predicted. That model was based on the life fraction damage accumulation rule in association with the modified Monkman-Grant creep rupture criterion. However, that model did not take into account the deteriorating state of the material due to creep damage (e.g., cavitation) as time elapsed. In addition, the material creep parameters used in that life prediction methodology were based on uniaxial creep curves displaying primary and secondary creep behavior, with no tertiary regime. The objective of this paper is to present a creep life prediction methodology based on a modified form of the Kachanov-Rabotnov continuum damage mechanics (CDM) theory. In this theory, the uniaxial creep rate is described in terms of stress, temperature, time, and the current state of material damage. This scalar damage state parameter is basically an abstract measure of the current state of material damage due to creep deformation. The damage rate is assumed to vary with stress, temperature, time, and the current state of damage itself. Multiaxial creep and creep rupture formulations of the CDM approach are presented in this paper. Parameter estimation methodologies based on nonlinear regression analysis are also described for both isothermal constant stress states and anisothermal variable stress conditions. This creep life prediction methodology was preliminarily added to the integrated design code CARES/Creep (Ceramics Analysis and Reliability Evaluation of Structures/Creep), which is a postprocessor program to commercially available finite element analysis (FEA) packages. Two examples, showing comparisons between experimental and predicted creep lives of ceramic specimens, are used to demonstrate the viability of this methodology and the CARES/Creep program.
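A minimal numerical sketch of the CDM idea: integrate an assumed Kachanov-Rabotnov damage-rate law, d(omega)/dt = B * sigma**chi / (1 - omega)**phi, at constant stress until the damage parameter omega approaches one, which gives a creep-rupture time. The material constants here are invented, not CARES/Creep values.

from scipy.integrate import solve_ivp

B, chi, phi = 1.0e-14, 5.0, 3.0   # invented material constants
sigma = 100.0                     # MPa, constant uniaxial stress

def damage_rate(t, omega):
    return B * sigma**chi / (1.0 - omega[0])**phi

rupture = lambda t, omega: omega[0] - 0.999   # stop just short of omega = 1
rupture.terminal = True

sol = solve_ivp(damage_rate, [0.0, 1.0e5], [0.0], events=rupture, max_step=50.0)
print("predicted rupture time (h):", round(float(sol.t_events[0][0])))
# Analytically, t_rupture = 1 / ((phi + 1) * B * sigma**chi) = 2500 h here.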
Training Programs: A Methodological Note.
ERIC Educational Resources Information Center
Romi, Shlomo; Teichman, Meir
2001-01-01
Discusses the importance of situational factors in training program development, particularly in programs intended for professionals in the fields of mental health and social welfare. Proposes a methodological manner for selecting situations relevant to future tasks that revolve around the role the professional is expected to fulfill. (Author/LRW)
Culturally Sensitive Parent Education: A Critical Review of Quantitative Research.
ERIC Educational Resources Information Center
Gorman, Jean Cheng; Balter, Lawrence
1997-01-01
Critically reviews the quantitative literature on culturally sensitive parent education programs, discussing issues of research methodology and program efficacy in producing change among ethnic minority parents and their children. Culturally sensitive programs for African American and Hispanic families are described in detail. Methodological flaws…
Development of a weight/sizing design synthesis computer program. Volume 1: Program formulation
NASA Technical Reports Server (NTRS)
Garrison, J. M.
1973-01-01
The development of a weight/sizing design synthesis methodology for use in support of the mainline space shuttle program is discussed. The methodology has a minimum number of data inputs and quick-turnaround capability. The methodology makes it possible to: (1) make weight comparisons between current shuttle configurations and proposed changes, (2) determine the effects of various subsystem trades on total system weight, and (3) determine the effects of weight on performance and performance on weight.
Heavy hydrocarbon main injector technology program
NASA Technical Reports Server (NTRS)
Arbit, H. A.; Tuegel, L. M.; Dodd, F. E.
1991-01-01
The Heavy Hydrocarbon Main Injector Program was an analytical, design, and test program to demonstrate an injection concept applicable to an Isolated Combustion Compartment of a full-scale, high-pressure LOX/RP-1 engine. Several injector patterns were tested in a 3.4-in. combustor. Based on these results, features of the most promising injector design were incorporated into a 5.7-in. injector which was then hot-fire tested. In turn, a preliminary design of a 5-compartment 2D combustor was based on this pattern. Additional subscale injector testing and analysis were also performed, with an emphasis on improving analytical techniques and acoustic cavity design methodology. Several of the existing 3.5-in. diameter injectors were hot-fire tested with and without acoustic cavities for spontaneous and dynamic stability characteristics.
NASA Technical Reports Server (NTRS)
Tartt, David M.; Hewett, Marle D.; Duke, Eugene L.; Cooper, James A.; Brumbaugh, Randal W.
1989-01-01
The Automated Flight Test Management System (ATMS) is being developed as part of the NASA Aircraft Automation Program. This program focuses on the application of interdisciplinary state-of-the-art technology in artificial intelligence, control theory, and systems methodology to problems of operating and flight testing high-performance aircraft. The development of a Flight Test Engineer's Workstation (FTEWS) is presented, with a detailed description of the system, technical details, and future planned developments. The goal of the FTEWS is to provide flight test engineers and project officers with an automated computer environment for planning, scheduling, and performing flight test programs. The FTEWS system is an outgrowth of the development of ATMS and is an implementation of a component of ATMS on SUN workstations.
An Assessment of IMPAC - Integrated Methodology for Propulsion and Airframe Controls
NASA Technical Reports Server (NTRS)
Walker, G. P.; Wagner, E. A.; Bodden, D. S.
1996-01-01
This report documents the work done under a NASA-sponsored contract to transition to industry technologies developed under the NASA Lewis Research Center IMPAC (Integrated Methodology for Propulsion and Airframe Control) program. The critical steps in IMPAC are exercised on an example integrated flight/propulsion control design for linear airframe/engine models of a conceptual STOVL (Short Take-Off and Vertical Landing) aircraft, and MATRIXX (TM) executive files to implement each step are developed. The results from the example study are analyzed and lessons learned are listed, along with recommendations that will improve the application of each design step. The end product of this research is a set of software requirements for developing a user-friendly control design tool which will automate the steps in the IMPAC methodology. Prototypes for a graphical user interface (GUI) are sketched to specify how the tool will interact with the user, and it is recommended to build the tool around existing computer-aided control design software packages.
NASA Handbook for Spacecraft Structural Dynamics Testing
NASA Technical Reports Server (NTRS)
Kern, Dennis L.; Scharton, Terry D.
2005-01-01
Recent advances in the area of structural dynamics and vibrations, in both methodology and capability, have the potential to make spacecraft system testing more effective from technical, cost, schedule, and hardware safety points of view. However, application of these advanced test methods varies widely among the NASA Centers and their contractors. Identification and refinement of the best of these test methodologies and implementation approaches has been an objective of efforts by the Jet Propulsion Laboratory on behalf of the NASA Office of the Chief Engineer. But to develop the most appropriate overall test program for a flight project from the selection of advanced methodologies, as well as conventional test methods, spacecraft project managers and their technical staffs will need overall guidance and technical rationale. Thus, the Chief Engineer's Office has recently tasked JPL to prepare a NASA Handbook for Spacecraft Structural Dynamics Testing. An outline of the proposed handbook, with a synopsis of each section, has been developed and is presented herein. Comments on the proposed handbook are solicited from the spacecraft structural dynamics testing community.
NASA Handbook for Spacecraft Structural Dynamics Testing
NASA Technical Reports Server (NTRS)
Kern, Dennis L.; Scharton, Terry D.
2004-01-01
Recent advances in the area of structural dynamics and vibrations, in both methodology and capability, have the potential to make spacecraft system testing more effective from technical, cost, schedule, and hardware safety points of view. However, application of these advanced test methods varies widely among the NASA Centers and their contractors. Identification and refinement of the best of these test methodologies and implementation approaches has been an objective of efforts by the Jet Propulsion Laboratory on behalf of the NASA Office of the Chief Engineer. But to develop the most appropriate overall test program for a flight project from the selection of advanced methodologies, as well as conventional test methods, spacecraft project managers and their technical staffs will need overall guidance and technical rationale. Thus, the Chief Engineer's Office has recently tasked JPL to prepare a NASA Handbook for Spacecraft Structural Dynamics Testing. An outline of the proposed handbook, with a synopsis of each section, has been developed and is presented herein. Comments on the proposed handbook are solicited from the spacecraft structural dynamics testing community.
Zhang, Melvyn W B; Ho, Cyrus S H; Ho, Roger C M
2014-01-01
Smartphone and smartphone-application use has become markedly more prevalent over the past decade. Previous research has highlighted the lack of critical appraisal of new applications, and has described a method of creating an application with just an Internet browser and a text editor, though this does not eliminate the challenges clinicians face. Moreover, despite high rates of smartphone application usage and acceptance, developing applications tailored to clinicians' needs and daily educational requirements remains costly for them and their centers. The objectives of the current research are thus to highlight a cost-effective methodology for developing interactive educational smartphone applications, and to determine whether medical students are receptive to such applications and what their perspectives are regarding the content. In this study, we elaborate how the Mastering Psychiatry Online Portal and web-based mobile application were developed using HTML5 as the core programming language. The online portal and web-based application were launched in July 2012 and usage data were obtained. Subsequently, a native application was developed, funded by an educational grant, and students were recruited after their end-of-posting clinical examination to complete a survey questionnaire on their perspectives. Initial analytics show that, since inception, the online portal has received a total of 15,803 views, with a total of 2,109 copies of the online textbook downloaded. As for the online videos, 5,895 viewers have watched the training videos from start to finish, and 722 users have accessed the mobile textbook application. A total of 185 students participated in the perspectives survey, with the majority holding positive views about the implementation of a smartphone application in psychiatry. This is one of the few studies describing how an educational application can be developed using a simple and cost-effective methodology, and it also demonstrates students' perspectives on smartphones in psychiatric education. Our methods might apply to future research involving the use of technology in education.
Advantages of virtual reality in the rehabilitation of balance and gait: Systematic review.
Cano Porras, Desiderio; Siemonsma, Petra; Inzelberg, Rivka; Zeilig, Gabriel; Plotnik, Meir
2018-05-29
Virtual reality (VR) has emerged as a therapeutic tool facilitating motor learning for balance and gait rehabilitation. The evidence, however, has not yet resulted in standardized guidelines. The aim of this study was to systematically review the application of VR-based rehabilitation of balance and gait in 6 neurologic cohorts, describing methodologic quality, intervention programs, and reported efficacy. This study follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses. VR-based treatments of Parkinson disease, multiple sclerosis, acute and chronic poststroke, traumatic brain injury, and cerebral palsy were researched in PubMed and Scopus, including earliest available records. Therapeutic validity (CONTENT scale) and risk of bias in randomized controlled trials (RCT) (Cochrane Collaboration tool) and non-RCT (Newcastle-Ottawa scale) were assessed. Ninety-seven articles were included, 68 published in 2013 or later. VR improved balance and gait in all cohorts, especially when combined with conventional rehabilitation. Most studies presented poor methodologic quality, lacked a clear rationale for intervention programs, and did not utilize motor learning principles meticulously. RCTs with more robust methodologic designs were widely recommended. Our results suggest that VR-based rehabilitation is developing rapidly, has the potential to improve balance and gait in neurologic patients, and brings additional benefits when combined with conventional rehabilitation. This systematic review provides detailed information for developing theory-driven protocols that may assist overcoming the observed lack of argued choices for intervention programs and motor learning implementation and serves as a reference for the design and planning of personalized VR-based treatments. PROSPERO CRD42016042051. © 2018 American Academy of Neurology.
[Methodologies for Ascertaining Local Education Needs and for Allocating and Developing Resources.
ERIC Educational Resources Information Center
Bellott, Fred
A survey of 125 school systems in the United States was conducted to investigate methodologies used for developing needs assessment programs at a local level. Schools were asked to reply to a questionnaire which attempted to detail and identify how needs assessment programs are set up, what methodologies are employed, the number of resultant…
ERIC Educational Resources Information Center
Frank, Martina W.; Walker-Moffat, Wendy
This study considered how 25 highly diverse after-school programs with funding of $5.6 million were evaluated during a 10-month period. The paper describes the evaluation methodologies used and determined which methodologies were most effective within a diverse and political context. The Bayview Fund for Youth Development (name assumed for…
Program of policy studies in science and technology
NASA Technical Reports Server (NTRS)
Mayo, L. H.
1973-01-01
The application of an interdisciplinary, problem-oriented capability to the performance of total social impact evaluations is discussed. The consequences of introducing new configurations, technological or otherwise, into future social environments are presented. The primary characteristics of the program are summarized: (1) emphasis on interdisciplinary, problem-oriented analysis; (2) development of intra- and inter-institutional arrangements for the purpose of analyzing social problems, evaluating existing programs, and assessing the social impacts of prospective policies, programs, and other public actions; (3) focus on methodological approaches to the projection of alternative future social environments, the identification of the effects of the introduction of new policies, programs, or other actions into the social system, and the evaluation of the social impacts of such effects; (4) availability of analytical resources for advisory and research tasks, and provision for use of program facilities as a neutral forum for the discussion of public issues involving the impact of advancing technology on social value-institutional processes.
IMPAC: An Integrated Methodology for Propulsion and Airframe Control
NASA Technical Reports Server (NTRS)
Garg, Sanjay; Ouzts, Peter J.; Lorenzo, Carl F.; Mattern, Duane L.
1991-01-01
The National Aeronautics and Space Administration is actively involved in the development of enabling technologies that will lead towards aircraft with new/enhanced maneuver capabilities such as Short Take-Off Vertical Landing (STOVL) and high angle of attack performance. Because of the high degree of dynamic coupling between the airframe and propulsion systems of these types of aircraft, one key technology is the integration of the flight and propulsion control. The NASA Lewis Research Center approach to developing Integrated Flight Propulsion Control (IFPC) technologies is an in-house research program referred to as IMPAC (Integrated Methodology for Propulsion and Airframe Control). The goals of IMPAC are to develop a viable alternative to the existing integrated control design methodologies that will allow for improved system performance and simplicity of control law synthesis and implementation, and to demonstrate the applicability of the methodology to a supersonic STOVL fighter aircraft. Based on some preliminary control design studies that included evaluation of the existing methodologies, the IFPC design methodology that is emerging at the Lewis Research Center consists of considering the airframe and propulsion system as one integrated system for an initial centralized controller design and then partitioning the centralized controller into separate airframe and propulsion system subcontrollers to ease implementation and to set meaningful design requirements for detailed subsystem control design and evaluation. An overview of IMPAC is provided and a detailed discussion of the various important design and evaluation steps in the methodology is included.
SIMCA T 1.0: A SAS Computer Program for Simulating Computer Adaptive Testing
ERIC Educational Resources Information Center
Raiche, Gilles; Blais, Jean-Guy
2006-01-01
Monte Carlo methodologies are frequently applied to study the sampling distribution of the estimated proficiency level in adaptive testing. These methods eliminate real situational constraints. However, these Monte Carlo methodologies are not currently supported by the available software programs, and when these programs are available, their…
Sánchez-Ribas, Jordi; Oliveira-Ferreira, Joseli; Rosa-Freitas, Maria Goreti; Trilla, Lluís; Silva-do-Nascimento, Teresa Fernandes
2015-09-01
Here we present the first in a series of articles about the ecology of immature stages of anophelines in the Brazilian Yanomami area. We propose a new larval habitat classification and a new larval sampling methodology. We also report some preliminary results illustrating the applicability of the methodology based on data collected in the Brazilian Amazon rainforest in a longitudinal study of two remote Yanomami communities, Parafuri and Toototobi. In these areas, we mapped and classified 112 natural breeding habitats located in low-order river systems based on their association with river flood pulses, seasonality and exposure to sun. Our classification rendered seven types of larval habitats: lakes associated with the river, which are subdivided into oxbow lakes and nonoxbow lakes, flooded areas associated with the river, flooded areas not associated with the river, rainfall pools, small forest streams, medium forest streams and rivers. The methodology for larval sampling was based on the accurate quantification of the effective breeding area, taking into account the area of the perimeter and subtypes of microenvironments present per larval habitat type using a laser range finder and a small portable inflatable boat. The new classification and new sampling methodology proposed herein may be useful in vector control programs.
Proceedings of the First NASA Ada Users' Symposium
NASA Technical Reports Server (NTRS)
1988-01-01
Ada has the potential to be a part of the most significant change in software engineering technology within NASA in the last twenty years. Thus, it is particularly important that all NASA centers be aware of Ada experience and plans at other centers. Ada activities across NASA are covered, with presenters representing five of the nine major NASA centers and the Space Station Freedom Program Office. Projects discussed included - Space Station Freedom Program Office: the implications of Ada on training, reuse, management and the software support environment; Johnson Space Center (JSC): early experience with the use of Ada, software engineering and Ada training and the evaluation of Ada compilers; Marshall Space Flight Center (MSFC): university research with Ada and the application of Ada to Space Station Freedom, the Orbital Maneuvering Vehicle, the Aero-Assist Flight Experiment and the Secure Shuttle Data System; Lewis Research Center (LeRC): the evolution of Ada software to support the Space Station Power Management and Distribution System; Jet Propulsion Laboratory (JPL): the creation of a centralized Ada development laboratory and current applications of Ada including the Real-time Weather Processor for the FAA; and Goddard Space Flight Center (GSFC): experiences with Ada in the Flight Dynamics Division and the Extreme Ultraviolet Explorer (EUVE) project and the implications of GSFC experience for Ada use in NASA. Despite the diversity of the presentations, several common themes emerged from the program: Methodology - NASA experience in general indicates that the effective use of Ada requires modern software engineering methodologies; Training - it is the software engineering principles and methods that surround Ada, rather than Ada itself, that require the major training effort; Reuse - due to training and transition costs, the use of Ada may initially decrease productivity, as was clearly found at GSFC; and Real-Time - work at LeRC, JPL, and GSFC shows that it is possible to use Ada for real-time applications.
Software life cycle methodologies and environments
NASA Technical Reports Server (NTRS)
Fridge, Ernest
1991-01-01
Products of this project will significantly improve the quality and productivity of Space Station Freedom Program software processes by improving software reliability and safety and by broadening the range of problems that can be solved with computational solutions. The project brings in Computer Aided Software Engineering (CASE) technology for: environments, such as the Engineering Script Language/Parts Composition System (ESL/PCS) application generator, the Intelligent User Interface for cost avoidance in setting up operational computer runs, the Framework programmable platform for defining process and software development work flow control, a process for bringing CASE technology into an organization's culture, and the CLIPS/CLIPS Ada language for developing expert systems; and methodologies, such as a method for developing fault-tolerant, distributed systems and a method for developing systems for common-sense reasoning and for solving expert system problems when only approximate truths are known.
NASA Technical Reports Server (NTRS)
Allen, Bradley P.; Holtzman, Peter L.
1987-01-01
An overview is presented of the Automated Software Development Workstation Project, an effort to explore knowledge-based approaches to increasing software productivity. The project focuses on applying the concept of domain specific automatic programming systems (D-SAPSs) to application domains at NASA's Johnson Space Center. A version of a D-SAPS developed in Phase 1 of the project for the domain of space station momentum management is described. How problems encountered during its implementation led researchers to concentrate on simplifying the process of building and extending such systems is discussed. Researchers propose to do this by attacking three observed bottlenecks in the D-SAPS development process through the increased automation of the acquisition of programming knowledge and the use of an object oriented development methodology at all stages of the program design. How these ideas are being implemented in the Bauhaus, a prototype workstation for D-SAPS development, is discussed.
NASA Technical Reports Server (NTRS)
Allen, Bradley P.; Holtzman, Peter L.
1988-01-01
An overview is presented of the Automated Software Development Workstation Project, an effort to explore knowledge-based approaches to increasing software productivity. The project focuses on applying the concept of domain specific automatic programming systems (D-SAPSs) to application domains at NASA's Johnson Space Center. A version of a D-SAPS developed in Phase 1 of the project for the domain of space station momentum management is described. How problems encountered during its implementation led researchers to concentrate on simplifying the process of building and extending such systems is discussed. Researchers propose to do this by attacking three observed bottlenecks in the D-SAPS development process through the increased automation of the acquisition of programming knowledge and the use of an object oriented development methodology at all stages of the program design. How these ideas are being implemented in the Bauhaus, a prototype workstation for D-SAPS development, is discussed.
Fu, Zhenghui; Wang, Han; Lu, Wentao; Guo, Huaicheng; Li, Wei
2017-12-01
Electric power systems span different fields and disciplines, involving economic, energy, and environmental systems. Uncertainty within this compound system is an inevitable problem. Therefore, an inexact multistage fuzzy-stochastic programming (IMFSP) method was developed for managing regional electric power systems constrained by environmental quality. A model combining interval-parameter programming, multistage stochastic programming, and fuzzy probability distributions was built to reflect the uncertain information and dynamic variation in the case study, and scenarios under different credibility degrees were considered. For all scenarios under consideration, corrective actions were allowed to be taken dynamically in accordance with the pre-regulated policies and the uncertainties encountered in reality. The results suggest that the methodology is applicable for handling the uncertainty of regional electric power management systems and can help decision makers establish an effective development plan.
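As a minimal illustration of the interval-parameter ingredient in methods such as IMFSP (the study's actual model, constraints, and data are not reproduced here), solving a small dispatch problem at the lower and upper bounds of an uncertain cost coefficient brackets the optimal cost; all names and numbers below are hypothetical.

from scipy.optimize import linprog

demand = 100.0                    # MWh that must be supplied (illustrative)
cap = [60.0, 80.0]                # capacities of plants A and B (illustrative)
cost_B = 4.0                      # known unit cost of plant B
cost_A_interval = (2.5, 3.5)      # uncertain unit cost of plant A as an interval

for cost_A in cost_A_interval:
    res = linprog(
        c=[cost_A, cost_B],                     # minimize total generation cost
        A_ub=[[-1.0, -1.0]], b_ub=[-demand],    # supply >= demand
        bounds=[(0, cap[0]), (0, cap[1])],
    )
    print(f"cost_A={cost_A}: dispatch={res.x}, total cost={res.fun:.1f}")

Running the program at both interval endpoints yields an interval-valued optimum (here [310, 370]), which is the kind of bracketed answer interval-parameter programming reports to decision makers.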
García, Manuel Pérez
2011-01-01
In this article, the author explains how the support of new technologies has helped historians to develop their research over the last few decades. The author therefore summarizes the application of database and genealogical programs to southern European family studies as a methodological tool. First, the author establishes the importance of creating databases using the File Maker program, and then explains the value of using genealogical programs such as Genopro and Heredis. The main aim of this article is to detail the use of these new technologies as applied to a particular study of southern Europe, specifically the Crown of Castile, during the late modern period. The use of these computer programs has helped to develop the field of social sciences and family history, in particular social history, during the last decade.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1989-01-01
Papers on rotorcraft and fatigue methodology are presented, covering topics such as reliability design for rotorcraft, a comparison between theory and fatigue test data on stress concentration factors, the retirement lives of rolling element bearings, hydrogen embrittlement risk analysis for high hardness steel parts, and rotating system load monitoring with minimum fixed system instrumentation. Additional topics include usage data collection to improve structural integrity of operational helicopters, usage monitoring of military helicopters, improvements to the fatigue substantiation of the H-60 composite tail rotor blade, helicopter surveillance programs, and potential application of automotive fatigue technology in rotorcraft design. Also, consideration is given to fatigue evaluation of C/MH-53E main rotor damper threaded joints, the SH-2F airframe fatigue test program, a ply termination concept for improving fracture and fatigue strength of composite laminates, the analysis and testing of composite panels subject to muzzle blast effects, the certification plan for an all-composite main rotor flexbeam, and the effects of stacking sequence on the flexural strength of composite beams.
Abdominal surgery process modeling framework for simulation using spreadsheets.
Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja
2015-08-01
We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike the existing methods, the proposed solution employs a modular approach for modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connection with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
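The spreadsheet constructs named in the abstract can be mimicked in a few lines of Python; this sketch uses a rand()-style draw for uncertain activity durations and a first-in-first-served queue. The activity names and duration ranges are illustrative, not taken from the published model.

import random
from collections import deque

# Hypothetical activities as (name, min hours, max hours)
activities = [("admission", 1, 2), ("surgery", 2, 5), ("recovery", 24, 72)]

def simulate(n_patients: int) -> None:
    queue = deque(range(n_patients))      # first-in-first-served ordering
    while queue:
        patient = queue.popleft()
        elapsed = 0.0
        for name, lo, hi in activities:
            elapsed += random.uniform(lo, hi)   # spreadsheet rand() analogue
        print(f"patient {patient} completed the cycle in {elapsed:.1f} h")

simulate(3)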
Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M
2013-09-01
Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project, and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion given offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, showing how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of various methodological elements described by Foucault, alongside specific application of these aspects.
Impact of uncertainty on modeling and testing
NASA Technical Reports Server (NTRS)
Coleman, Hugh W.; Brown, Kendall K.
1995-01-01
A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report will describe the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed facility at Marshall Space Flight Center. Section 2 will present a summary of the uncertainty analysis methodology used and discuss the specific applications to the TTB SSME test program. Section 3 will discuss the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relative to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.
Prioritization methodology for chemical replacement
NASA Technical Reports Server (NTRS)
Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott
1995-01-01
Since United States federal legislation has required ozone-depleting chemicals (Class 1 and 2) to be banned from production, the National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated to develop a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme - chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a GUIDELINE to help direct the research for replacement technology. The approach for prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used to determine numerical values corresponding to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research on chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced). The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology to be quantitatively compared in several categories, and a QFD matrix which allows process/chemical pairs to be rated against one another for importance (using consistent categories). Depending on the need for application, one can choose the part(s) needed or have the methodology completed in its entirety. For example, if a program needs to show the risk of changing a process/chemical, one may choose to use part of Matrix A and Matrix C. If a chemical is being used and the process must be changed, one might use the Process Concerns part of Matrix D for the existing process and all possible replacement processes. If an overall analysis of a program is needed, one may request the QFD to be completed.
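A rough sketch of the QFD-style numerical rating described above follows; the categories, weights, and scores are invented for illustration and are not those of NASA Technical Paper 3421.

# Each chemical/process pair is scored against weighted concern categories,
# and the weighted sum gives a replacement-urgency ranking.
# All categories, weights, and scores below are hypothetical.
weights = {"ozone depletion": 5, "worker safety": 4, "cost to replace": 2}

candidates = {
    "solvent A / degrease": {"ozone depletion": 9, "worker safety": 3, "cost to replace": 4},
    "solvent B / clean":    {"ozone depletion": 3, "worker safety": 7, "cost to replace": 6},
}

scores = {
    name: sum(weights[c] * s for c, s in marks.items())
    for name, marks in candidates.items()
}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: urgency score {score}")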
Bairi, Partha; Minami, Kosuke; Hill, Jonathan P; Nakanishi, Waka; Shrestha, Lok Kumar; Liu, Chao; Harano, Koji; Nakamura, Eiichi; Ariga, Katsuhiko
2016-09-27
Supramolecular assembly can be used to construct a wide variety of ordered structures by exploiting the cumulative effects of multiple noncovalent interactions. However, the construction of anisotropic nanostructures remains subject to some limitations. Here, we demonstrate the preparation of anisotropic fullerene-based nanostructures by supramolecular differentiation, which is the programmed control of multiple assembly strategies. We have carefully combined interfacial assembly and local phase separation phenomena. Two fullerene derivatives, PhH and C12H, were together formed into self-assembled anisotropic nanostructures by using this approach. This technique is applicable for the construction of anisotropic nanostructures without requiring complex molecular design or complicated methodology.
Integrating Formal Methods and Testing 2002
NASA Technical Reports Server (NTRS)
Cukic, Bojan
2002-01-01
Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only for formal verification and program testing, but also for the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given the assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or higher). The coming years shall address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The objectives are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a given confidence level; and D) quantify and justify the reliability estimate for systems developed using these methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
.... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...
Code of Federal Regulations, 2010 CFR
2010-01-01
.... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...
Code of Federal Regulations, 2012 CFR
2012-01-01
.... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...
Code of Federal Regulations, 2013 CFR
2013-01-01
.... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...
Code of Federal Regulations, 2014 CFR
2014-01-01
.... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...
Design and development of an IBM/VM menu system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cazzola, D.J.
1992-10-01
This report describes a full screen menu system developed using IBM's Interactive System Productivity Facility (ISPF) and the REXX programming language. The software was developed for the 2800 IBM/VM Electrical Computer Aided Design (ECAD) system. The system was developed to deliver electronic drawing definitions to a corporate drawing release system. Although this report documents the status of the menu system when it was retired, the methodologies used and the requirements defined are very applicable to replacement systems.
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, R. H.; Badger, W.; Beckman, C. S.; Beshers, G.; Hammerslag, D.; Kimball, J.; Kirslis, P. A.; Render, H.; Richards, P.; Terwilliger, R.
1984-01-01
The project to automate the management of software production systems is described. The SAGA system is a software environment designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. Several major components of the SAGA system have been completed in prototype form. The construction methods are described.
NASA Technical Reports Server (NTRS)
Rummler, D. R.
1976-01-01
The results are presented of investigations to apply regression techniques to the development of methodology for creep-rupture data analysis. Regression analysis techniques are applied to the explicit description of the creep behavior of materials for space shuttle thermal protection systems. A regression analysis technique is compared with five parametric methods for analyzing three simulated and twenty real data sets, and a computer program for the evaluation of creep-rupture data is presented.
Brain Training in Children and Adolescents: Is It Scientifically Valid?
Rossignoli-Palomeque, Teresa; Perez-Hernandez, Elena; González-Marqués, Javier
2018-01-01
Background: Brain training products are becoming increasingly popular for children and adolescents. Despite the marketing aimed at their use in the general population, these products may provide more benefits for specific neurologically impaired populations. A review of Brain Training (BT) products analyzing their efficacy while considering the methodological limitations of supporting research is required for practical applications. Method: searches were made of the PubMed database (until March 2017) for studies including: (1) empirical data on the use of brain training for children or adolescents and any effects on near transfer (NT) and/or far transfer (FT) and/or neuroplasticity, (2) use of brain training for cognitive training purposes, (3) commercially available training applications, (4) computer-based programs for children developed since the 1990s, and (5) relevant printed and peer-reviewed material. Results: Database searches yielded a total of 16,402 references, of which 70 met the inclusion criteria for the review. We classified programs in terms of neuroplasticity, near and far transfer, and long-term effects and their applied methodology. Regarding efficacy, only 10 studies (14.2%) have been found that support neuroplasticity, and the majority of brain training platforms claimed to be based on such concepts without providing any supporting scientific data. Thirty-six studies (51.4%) have shown far transfer (7 of them are non-independent) and only 11 (15.7%) maintained far transfer at follow-up. Considering the methodology, 40 studies (68.2%) were not randomized and controlled; for those randomized, only 9 studies (12.9%) were double-blind, and only 13 studies (18.6%) included active controls in their trials. Conclusion: Overall, few independent studies have found far transfer and long-term effects. The majority of independent results found only near transfer. There is a lack of double-blind randomized trials which include an active control group as well as a passive control to properly control for contaminant variables. Based on our results, Brain Training Programs as commercially available products are not as effective as first expected or as they promise in their advertisements.
Prada, Sergio I
2017-12-01
The Medicaid Drug Utilization Review (DUR) program is a 2-phase process conducted by Medicaid state agencies. The first phase is a prospective DUR and involves electronically monitoring prescription drug claims to identify prescription-related problems, such as therapeutic duplication, contraindications, incorrect dosage, or duration of treatment. The second phase is a retrospective DUR and involves ongoing and periodic examinations of claims data to identify patterns of fraud, abuse, underutilization, drug-drug interaction, or medically unnecessary care, implementing corrective actions when needed. The Centers for Medicare & Medicaid Services requires each state to measure prescription drug cost-savings generated from its DUR programs on an annual basis, but it provides no guidance or unified methodology for doing so. The objective of this study was to describe and synthesize the methodologies used by states to measure cost-savings using their Medicaid retrospective DUR program in federal fiscal years 2014 and 2015. For each state, the cost-savings methodologies included in the Medicaid DUR 2014 and 2015 reports were downloaded from Medicaid's website. The reports were then reviewed and synthesized. Methods described by the states were classified according to research designs often described in evaluation textbooks. In 2014, the most often used prescription drug cost-savings estimation methodology for the Medicaid retrospective DUR program was a simple pre-post intervention method, without a comparison group (ie, 12 states). In 2015, the most common methodology used was a pre-post intervention method, with a comparison group (ie, 14 states). Comparisons of savings attributed to the program among states are still unreliable, because of the lack of a common methodology for measuring cost-savings. There is great variation among states in the methods used to measure prescription drug utilization cost-savings. This analysis suggests that there is still room for improvement in terms of methodology transparency, which is important, because lack of transparency hinders states from learning from each other. Ultimately, the federal government needs to evaluate and improve its DUR program.
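The two designs the review distinguishes can be made concrete with hypothetical spending figures; the comparison-group variant is essentially a difference-in-differences calculation.

# Hypothetical pre/post spending totals, not data from any state's report.
pre_intervention, post_intervention = 10.0e6, 9.0e6   # claims subject to DUR ($)
pre_comparison, post_comparison = 10.0e6, 9.6e6       # comparison claims ($)

savings_pre_post = pre_intervention - post_intervention
trend = pre_comparison - post_comparison              # secular spending change
savings_did = savings_pre_post - trend                # nets out the trend

print(f"pre-post only:         ${savings_pre_post:,.0f}")
print(f"with comparison group: ${savings_did:,.0f}")

The comparison group removes spending changes that would have happened anyway, which is why the two designs can attribute very different savings to the same program.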
Cost benefit analysis for smart grid projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karali, Nihan; He, Gang; Mauzey, J
The U.S. is unusual in that a definition of the term “smart grid” was written into legislation, appearing in the Energy Independence and Security Act (2007). When the recession called for stimulus spending and the American Recovery and Reinvestment Act (ARRA, 2009) was passed, a framework already existed for identification of smart grid projects. About $4.5B of the U.S. Department of Energy’s (U.S. DOE’s) $37B allocation from ARRA was directed to smart grid projects of two types, investment grants and demonstrations. Matching funds from other sources more than doubled the total value of ARRA-funded smart grid projects. The Smart Grid Investment Grant Program (SGIG) consumed all but $620M of the ARRA funds, which was available for the 32 projects in the Smart Grid Demonstration Program (SGDP, or demonstrations). Given the economic potential of these projects and the substantial investments required, there was keen interest in estimating the benefits of the projects (i.e., quantifying and monetizing the performance of smart grid technologies). Common method development and application, data collection, and analysis to calculate and publicize the benefits were central objectives of the program. For this purpose standard methods and a software tool, the Smart Grid Computational Tool (SGCT), were developed by U.S. DOE and a spreadsheet model was made freely available to grantees and other analysts. The methodology was intended to define smart grid technologies or assets, the mechanisms by which they generate functions, their impacts and, ultimately, their benefits. The SGCT and its application to the Demonstration Projects are described, and actual projects in Southern California and in China are selected to test and illustrate the tool. The usefulness of the methodology and tool for international analyses is then assessed.
Clinical practice guidelines in breast cancer
Tyagi, N. Kumar; Dhesy-Thind, S.
2018-01-01
Background: A number of clinical practice guidelines (CPGs) concerning breast cancer (BCA) screening and management are available. Here, we review the strengths and weaknesses of CPGs from various professional organizations and consensus groups with respect to their methodologic quality, recommendations, and implementability. Methods: Guidelines from four groups were reviewed with respect to two clinical scenarios: adjuvant ovarian function suppression (OFS) in premenopausal women with early-stage estrogen receptor–positive BCA, and use of sentinel lymph node biopsy (SLNB) after neoadjuvant chemotherapy (NAC) for locally advanced BCA. Guidelines from the American Society of Clinical Oncology (ASCO), Cancer Care Ontario's Program in Evidence Based Care (CCO's PEBC), the U.S. National Comprehensive Cancer Network (NCCN), and the St. Gallen International Breast Cancer Consensus Conference were reviewed by two independent assessors. Guideline methodology and applicability were evaluated using the AGREE II tool. Results: The quality of the CPGs was greatest for the guidelines developed by ASCO and CCO's PEBC. The NCCN and St. Gallen guidelines were found to have lower scores for methodologic rigour. All guidelines scored poorly for applicability. The recommendations for OFS were similar in three guidelines. Recommendations by the various organizations for the use of SLNB after NAC were contradictory. Conclusions: Our review demonstrated that CPGs can be heterogeneous in methodologic quality. Low-quality CPG implementation strategies contribute to low uptake of, and adherence to, BCA CPGs. Further research examining the barriers to recommendations, such as intrinsic guideline characteristics and the needs of end users, is required. The use of BCA CPGs can narrow the knowledge-to-practice gap and improve patient outcomes.
Xpey' Relational Environments: an analytic framework for conceptualizing Indigenous health equity.
Kent, Alexandra; Loppie, Charlotte; Carriere, Jeannine; MacDonald, Marjorie; Pauly, Bernie
2017-12-01
Both health equity research and Indigenous health research are driven by the goal of promoting equitable health outcomes among marginalized and underserved populations. However, the two fields often operate independently, without collaboration. As a result, Indigenous populations are underrepresented in health equity research relative to the disproportionate burden of health inequities they experience. In this methodological article, we present Xpey' Relational Environments, an analytic framework that maps some of the barriers and facilitators to health equity for Indigenous peoples. Health equity research needs to include a focus on Indigenous populations and Indigenized methodologies, a shift that could fill gaps in knowledge with the potential to contribute to 'closing the gap' in Indigenous health. With this in mind, the Equity Lens in Public Health (ELPH) research program adopted the Xpey' Relational Environments framework to add a focus on Indigenous populations to our research on the prioritization and implementation of health equity. The analytic framework introduced an Indigenized health equity lens to our methodology, which facilitated the identification of social, structural and systemic determinants of Indigenous health. To test the framework, we conducted a pilot case study of one of British Columbia's regional health authorities, which included a review of core policies and plans as well as interviews and focus groups with frontline staff, managers and senior executives. ELPH's application of Xpey' Relational Environments serves as an example of the analytic framework's utility for exploring and conceptualizing Indigenous health equity in BC's public health system. Future applications of the framework should be embedded in Indigenous research methodologies.
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Toniolo, Matthew D.; Tartabini, Paul V.; Roithmayr, Carlos M.; Albertson, Cindy W.; Karlgaard, Christopher D.
2016-01-01
The objective of this report is to develop and implement a physics-based method for analysis and simulation of multi-body dynamics including launch vehicle stage separation. The constraint force equation (CFE) methodology discussed in this report provides such a framework for modeling constraint forces and moments acting at joints when the vehicles are still connected. Several stand-alone test cases involving various types of joints were developed to validate the CFE methodology. The results were compared with ADAMS (Registered Trademark) and Autolev, two industry-standard benchmark codes for multi-body dynamic analysis and simulation. However, these two codes are not designed for aerospace flight trajectory simulations. After this validation exercise, the CFE algorithm was implemented in the Program to Optimize Simulated Trajectories II (POST2) to provide a capability to simulate end-to-end trajectories of launch vehicles including stage separation. The POST2/CFE methodology was applied to the STS-1 Space Shuttle solid rocket booster (SRB) separation and the Hyper-X Research Vehicle (HXRV) separation from the Pegasus booster as a further test and validation of its application to launch vehicle stage separation problems. Finally, to demonstrate end-to-end simulation capability, POST2/CFE was applied to the ascent, orbit insertion, and booster return of a reusable two-stage-to-orbit (TSTO) vehicle concept. With these validation exercises, the POST2/CFE software can be used for performing conceptual-level end-to-end simulations, including launch vehicle stage separation, for problems similar to those discussed in this report.
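For intuition about constraint-force modeling in general (this is not the POST2/CFE implementation, and the numbers are illustrative), the Lagrange multiplier enforcing a rigid-rod distance constraint on a point mass can be computed directly.

import numpy as np

m, g = 2.0, 9.81                   # mass (kg) and gravity (m/s^2), illustrative
r = np.array([0.6, -0.8])          # position on a rod of length 1 m, pivot at origin
v = np.array([1.2, 0.9])           # velocity, tangent to the constraint (r.v = 0)
F = np.array([0.0, -m * g])        # applied force (gravity)

# Differentiating the constraint (r.r - L^2)/2 = 0 twice and substituting
# m*a = F + lam*r yields lam = -(r.F + m*v.v) / (r.r).
lam = -(r @ F + m * (v @ v)) / (r @ r)
constraint_force = lam * r          # the rod tension acting on the mass
print("rod force on mass:", constraint_force)

The same pattern, a constraint equation differentiated twice and solved for multipliers, is what a constraint-force formulation applies at each joint while connected bodies move together.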
An economic analysis methodology for project evaluation and programming.
DOT National Transportation Integrated Search
2013-08-01
Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key : quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...
Methodological, technical, and ethical issues of a computerized data system.
Rice, C A; Godkin, M A; Catlin, R J
1980-06-01
This report examines some methodological, technical, and ethical issues which need to be addressed in designing and implementing a valid and reliable computerized clinical data base. The report focuses on the data collection system used by four residency based family health centers, affiliated with the University of Massachusetts Medical Center. It is suggested that data reliability and validity can be maximized by: (1) standardizing encounter forms at affiliated health centers to eliminate recording biases and ensure data comparability; (2) using forms with a diagnosis checklist to reduce coding errors and increase the number of diagnoses recorded per encounter; (3) developing uniform diagnostic criteria; (4) identifying sources of error, including discrepancies of clinical data as recorded in medical records, encounter forms, and the computer; and (5) improving provider cooperation in recording data by distributing data summaries which reinforce the data's applicability to service provision. Potential applications of the data for research purposes are restricted by personnel and computer costs, confidentiality considerations, programming related issues, and, most importantly, health center priorities, largely focused on patient care, not research.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-30
... wage determinations based on the new prevailing wage methodology set forth in the Wage Rule, as to the... comment, we published a Final Rule on August 1, 2011, which set the new effective date for the Wage Rule... date of the Wage Methodology for the Temporary Non- agricultural Employment H-2B Program Final Rule...
2018-03-01
We apply our methodology to the criticism text written in the flight-training program student evaluations in order to construct a model that...factors.
Assuring data transparency through design methodologies
NASA Technical Reports Server (NTRS)
Williams, Allen
1990-01-01
This paper addresses the role of design methodologies and practices in the assurance of technology transparency. The development of several subsystems on large, long life cycle government programs was analyzed to glean those characteristics in the design, development, test, and evaluation that precluded or enabled the insertion of new technology. The programs examined were Minuteman, DSP, B-1B, and the Space Shuttle. All of these were long life cycle, technology-intensive programs. The design methodologies (or lack thereof) and design practices for each were analyzed in terms of the success or failure in incorporating evolving technology. Common elements contributing to the success or failure were extracted and compared to current methodologies being proposed by the Department of Defense and NASA. The relevance of these practices to the design and deployment of Space Station Freedom was evaluated. In particular, appropriate methodologies now being used on the core development contract were examined.
[Methodological problems in the scientific research on HIV /AIDS in Bolivia].
Hita, Susana Ramírez
2013-05-01
This paper discusses the methodological problems in scientific research on HIV/AIDS in Bolivia, in both epidemiology and the social sciences. Studies associated with this research served as the basis for the implementation of health programs run by The Global Fund, the Pan-American Health Organization, international cooperation agencies, non-governmental organizations, and the Bolivian Ministry of Health and Sports. An analysis of the methodological contradictions and weaknesses was made by reviewing the bibliography of the studies and by conducting qualitative methodological research focused on the quality of health care available to people living with HIV/AIDS in public hospitals and health centers, and on how programs targeted at this sector of the population are designed and delivered. In this manner, it was possible to observe the shortcomings of the methodological design in the epidemiological and social science studies which serve as the basis for the implementation of these health programs.
Compile-time estimation of communication costs in multicomputers
NASA Technical Reports Server (NTRS)
Gupta, Manish; Banerjee, Prithviraj
1991-01-01
An important problem facing numerous research projects on parallelizing compilers for distributed memory machines is that of automatically determining a suitable data partitioning scheme for a program. Any strategy for automatic data partitioning needs a mechanism for estimating the performance of a program under a given partitioning scheme, the most crucial part of which involves determining the communication costs incurred by the program. A methodology is described for estimating the communication costs at compile-time as functions of the numbers of processors over which various arrays are distributed. A strategy is described along with its theoretical basis, for making program transformations that expose opportunities for combining of messages, leading to considerable savings in the communication costs. For certain loops with regular dependences, the compiler can detect the possibility of pipelining, and thus estimate communication costs more accurately than it could otherwise. These results are of great significance to any parallelization system supporting numeric applications on multicomputers. In particular, they lay down a framework for effective synthesis of communication on multicomputers from sequential program references.
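A toy version of such a compile-time cost function, using a conventional latency-plus-bandwidth communication model with invented constants, shows how message combining reduces the startup-cost term as the processor count varies.

# ALPHA (per-message startup) and BETA (per-element transfer) are
# illustrative machine constants, not figures from the paper.
ALPHA, BETA = 100.0, 1.0

def shift_cost(n_elems: int, p: int, combined: bool) -> float:
    """Estimated cost of a nearest-neighbor shift of an n-element array on p procs."""
    per_proc = n_elems // p
    if combined:
        return ALPHA + BETA * per_proc      # one combined message per neighbor
    return per_proc * (ALPHA + BETA)        # one message per element

for p in (4, 16, 64):
    print(p, shift_cost(4096, p, combined=True), shift_cost(4096, p, combined=False))

Expressing the estimate as a function of p is what lets a compiler compare candidate data partitionings without running the program.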
Tool Support for Software Lookup Table Optimization
Wilcox, Chris; Strout, Michelle Mills; Bieman, James M.
2011-01-01
A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.
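The core idea Mesa automates can be sketched in a few lines; this illustration is not Mesa itself, and the function and table parameters are arbitrary.

import math

# Precompute a costly elementary function over a bounded input domain,
# then replace calls with an indexed, approximate lookup.
LO, HI, SIZE = 0.0, 10.0, 4096
STEP = (HI - LO) / (SIZE - 1)
TABLE = [math.exp(-(LO + i * STEP) ** 2) for i in range(SIZE)]  # exp(-x^2)

def fast_gauss(x: float) -> float:
    """Approximate exp(-x^2) for x in [LO, HI] via nearest-entry lookup."""
    i = int((x - LO) / STEP + 0.5)
    return TABLE[i]

print(fast_gauss(1.5), math.exp(-1.5 ** 2))   # approximate vs exact

The table size controls the accuracy/performance tradeoff the abstract describes: a larger SIZE shrinks the worst-case error at the cost of memory and cache pressure.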
A systematic collaborative process for assessing launch vehicle propulsion technologies
NASA Astrophysics Data System (ADS)
Odom, Pat R.
1999-01-01
A systematic, collaborative process for prioritizing candidate investments in space transportation systems technologies has been developed for the NASA Space Transportation Programs Office. The purpose of the process is to provide a repeatable and auditable basis for selecting technology investments to enable achievement of NASA's strategic space transportation objectives. The paper describes the current multilevel process and supporting software tool that has been developed. Technologies are prioritized across system applications to produce integrated portfolios for recommended funding. An example application of the process to the assessment of launch vehicle propulsion technologies is described and illustrated. The methodologies discussed in the paper are expected to help NASA and industry ensure maximum returns from technology investments under constrained budgets.
Gulf Coast Clean Energy Application Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dillingham, Gavin
The Gulf Coast Clean Energy Application Center was initiated to significantly improve market and regulatory conditions for the implementation of combined heat and power technologies. The GC CEAC was responsible for the development of CHP in Texas, Louisiana, and Oklahoma. Through this program we employed a variety of outreach and education techniques, developed and deployed assessment tools, and conducted market assessments. These efforts resulted in the growth of the combined heat and power market in the Gulf Coast region, with a realization of more efficient energy generation, reduced emissions, and a more resilient infrastructure. Specific to research, we did not formally investigate any techniques with a formal research design or methodology.
Analytical group decision making in natural resources: Methodology and application
Schmoldt, D.L.; Peterson, D.L.
2000-01-01
Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups have provided insights into the impediments to effective group processes and on techniques that can be applied in a group context. Nevertheless, little integration and few applications of these results have occurred in resource management decision processes, where formal groups are integral, either directly or indirectly. A group decision-making methodology is introduced as an effective approach for temporary, formal groups (e.g., workshops). It combines the following three components: (1) brainstorming to generate ideas; (2) the analytic hierarchy process to produce judgments, manage conflict, enable consensus, and plan for implementation; and (3) a discussion template (straw document). Resulting numerical assessments of alternative decision priorities can be analyzed statistically to indicate where group member agreement occurs and where priority values are significantly different. An application of this group process to fire research program development in a workshop setting indicates that the process helps focus group deliberations; mitigates groupthink, nondecision, and social loafing pitfalls; encourages individual interaction; identifies irrational judgments; and provides a large amount of useful quantitative information about group preferences. This approach can help facilitate scientific assessments and other decision-making processes in resource management.
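As a concrete illustration of the analytic hierarchy process component, the sketch below derives priority weights from a single reciprocal pairwise-comparison matrix and checks judgment consistency; the matrix entries are invented, and the workshop application described above aggregates many such judgments across participants.

```python
import numpy as np

# AHP priority derivation: principal eigenvector of a reciprocal
# pairwise-comparison matrix, plus Saaty's consistency ratio.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])      # A[i,j]: preference of option i over j

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)          # principal eigenvalue (lambda_max)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                         # normalized priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1) # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index for size n
print("priorities:", np.round(w, 3), " consistency ratio:", round(ci / ri, 3))
```

A consistency ratio below roughly 0.10 is conventionally taken as acceptably consistent; this check is the mechanism behind the process's ability to identify irrational judgments, mentioned above.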
Research opportunities in human behavior and performance
NASA Technical Reports Server (NTRS)
Christensen, J. M. (Editor); Talbot, J. M. (Editor)
1985-01-01
Extant information on the subject of psychological aspects of manned space flight are reviewed; NASA's psychology research program is examined; significant gaps in knowledge are identified; and suggestions are offered for future research program planning. Issues of human behavior and performance related to the United States space station, to the space shuttle program, and to both near and long term problems of a generic nature in applicable disciplines of psychology are considered. Topics covered include: (1) human performance requirements for a 90 day mission; (2) human perceptual, cognitive, and motor capabilities and limitations in space; (3) crew composition, individual competencies, crew competencies, selection criteria, and special training; (4) environmental factors influencing behavior; (5) psychosocial aspects of multiperson space crews in long term missions; (6) career determinants in NASA; (7) investigational methodology and equipment; and (8) psychological support.
Didactic Aspects of the Academic Discipline "History and Methodology of Mathematics"
ERIC Educational Resources Information Center
Sun, Hai; Varankina, Vera I.; Sadovaya, Victoriya V.
2017-01-01
The purpose of this article is to develop the content and methods, and to analyze the piloting, of the academic discipline "History and methodology of mathematics" for graduate students in Master's-level mathematical program tracks. The leading method in the study of this problem was the method of…
Methodologies for Evaluating the Impact of Contraceptive Social Marketing Programs.
ERIC Educational Resources Information Center
Bertrand, Jane T.; And Others
1989-01-01
An overview of the evaluation issues associated with contraceptive social marketing programs is provided. Methodologies covered include survey techniques, cost-effectiveness analyses, retail audits of sales data, time series analysis, nested logit analysis, and discriminant analysis. (TJH)
Júnez-Ferreira, H E; Herrera, G S
2013-04-01
This paper presents a new methodology for the optimal design of space-time hydraulic head monitoring networks and its application to the Valle de Querétaro aquifer in Mexico. The selection of the space-time monitoring points is done using a static Kalman filter combined with a sequential optimization method. The Kalman filter requires as input a space-time covariance matrix, which is derived from a geostatistical analysis. A sequential optimization method is used that selects, at each step, the space-time point that minimizes a function of the variance. We demonstrate the methodology by applying it to the redesign of the hydraulic head monitoring network of the Valle de Querétaro aquifer, with the objective of selecting, from a set of monitoring positions and times, those that minimize the spatiotemporal redundancy. The database for the geostatistical space-time analysis corresponds to information from 273 wells located within the aquifer for the period 1970-2007. A total of 1,435 hydraulic head data were used to construct the experimental space-time variogram. The results show that of the existing monitoring program, which consists of 418 space-time monitoring points, only 178 are not redundant. The implied reduction of monitoring costs was possible because the proposed method is successful in propagating information in space and time.
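A minimal sketch of the sequential selection loop described above follows; the covariance here is a random stand-in for the variogram-derived space-time matrix, and the measurement-error variance is an assumed value.

```python
import numpy as np

# Greedy space-time network design: repeatedly pick the monitoring point
# whose (scalar) measurement most reduces total estimation variance,
# updating the covariance with a static Kalman filter step.
rng = np.random.default_rng(1)
B = rng.normal(size=(40, 40))
P = B @ B.T                       # stand-in space-time covariance (SPD)
r = 0.1                           # assumed measurement-error variance

def measure(P, j, r):
    """Posterior covariance after observing space-time point j."""
    gain = P[:, j] / (P[j, j] + r)
    return P - np.outer(gain, P[j, :])

selected = []
for _ in range(10):
    j_best = min(range(P.shape[0]), key=lambda j: np.trace(measure(P, j, r)))
    selected.append(j_best)
    P = measure(P, j_best, r)

print("selected space-time points:", selected)
```

Each iteration weighs a candidate point against the information already propagated through the covariance, which is how the method can conclude that only 178 of 418 existing space-time points are non-redundant.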
Analysis of Turbulent Boundary-Layer over Rough Surfaces with Application to Projectile Aerodynamics
1988-12-01
... V. APPLICATION IN COMPONENT BUILD-UP METHODOLOGIES ... 1. COMPONENT BUILD-UP IN DRAG ... dimensional roughness. II. CLASSIFICATION OF PREDICTION METHODS: Prediction methods can be classified into two main approaches: (1) correlation methodologies ... data are available. V. APPLICATION IN COMPONENT BUILD-UP METHODOLOGIES: 1. COMPONENT BUILD-UP IN DRAG: The new correlation can be used for an engineering...
10 CFR 436.12 - Life cycle cost methodology.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 3 2012-01-01 2012-01-01 false Life cycle cost methodology. 436.12 Section 436.12 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses § 436.12 Life cycle cost methodology. The life cycle cost methodology...
10 CFR 436.12 - Life cycle cost methodology.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 3 2014-01-01 2014-01-01 false Life cycle cost methodology. 436.12 Section 436.12 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses § 436.12 Life cycle cost methodology. The life cycle cost methodology...
10 CFR 436.12 - Life cycle cost methodology.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 3 2013-01-01 2013-01-01 false Life cycle cost methodology. 436.12 Section 436.12 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses § 436.12 Life cycle cost methodology. The life cycle cost methodology...
10 CFR 436.12 - Life cycle cost methodology.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 3 2010-01-01 2010-01-01 false Life cycle cost methodology. 436.12 Section 436.12 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses § 436.12 Life cycle cost methodology. The life cycle cost methodology...
10 CFR 436.12 - Life cycle cost methodology.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 3 2011-01-01 2011-01-01 false Life cycle cost methodology. 436.12 Section 436.12 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses § 436.12 Life cycle cost methodology. The life cycle cost methodology...
Data Centric Development Methodology
ERIC Educational Resources Information Center
Khoury, Fadi E.
2012-01-01
Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-28
... Wage Rule revised the methodology by which we calculate the prevailing wages to be paid to H-2B workers... methodology by which we calculate the prevailing wages to be paid to H-2B workers and United States (U.S... concerning the calculation of the prevailing wage rate in the H-2B program. CATA v. Solis, Dkt. No. 103-1...
Ball, Sherry L; Stevenson, Lauren D; Ladebue, Amy C; McCreight, Marina S; Lawrence, Emily C; Oestreich, Taryn; Lambert-Kerzner, Anne C
2017-07-01
The Veterans Health Administration (VHA) is adapting to meet the changing needs of our Veterans. VHA leaders are promoting quality improvement strategies including Lean Six Sigma (LSS). This study used LSS tools to evaluate the Veterans Choice Program (VCP), a program that aims to improve access to health care services for eligible Veterans by expanding health care options to non-VHA providers. LSS was utilized to assess the current process and efficiency patterns of the VCP at 3 VHA Medical Centers. LSS techniques were used to assess data obtained through semistructured interviews with Veterans, staff, and providers to describe and evaluate the VCP process by identifying wastes and defects. The LSS methodology facilitated the process of targeting priorities for improvement and constructing suggestions to close identified gaps and inefficiencies. Identified key process wastes included inefficient exchange of clinical information between stakeholders in and outside of the VHA; poor dissemination of VCP programmatic information; shortages of VCP-participating providers; duplication of appointments; declines in care coordination; and lack of program adaptability to local processes. Recommendations for improvement were formulated using LSS. This evaluation illustrates how LSS can be utilized to assess a nationally mandated health care program. By focusing on stakeholder, staff, and Veteran perspectives, process defects in the VCP were identified and improvement recommendations were made. However, the current LSS language used is not intuitive in health care and similar applications of LSS may consider using new language and goals adapted specifically for health care.
NASA Technical Reports Server (NTRS)
Niiler, Pearn P.
2004-01-01
The scientific objectives of this research program were to utilize drifter and satellite sea level data to determine the time-mean and time-variable surface currents of the global ocean. Accomplishing these tasks required processing the drifter data to bring a wide variety of drifter configurations into a uniform format, and processing the along-track satellite altimeter data to compute the geostrophic current components normal to the track. These tasks were accomplished, resulting in an increase of drifter data by about 40% and in new algorithms for obtaining satellite-derived geostrophic velocity data consistent with the drifter observations of geostrophic time-variable currents. The methodologies and the research results obtained with them are reported in the publications listed in this paper.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-08-01
The objective of this report is to develop a generalized methodology for examining water distribution systems for adjustable speed drive (ASD) applications and to provide an example (the City of Chicago 68th Street Water Pumping Station) using the methodology. The City of Chicago water system was chosen as the candidate for analysis because it has a large service area distribution network with no storage provisions after the distribution pumps. Many industrial motors operate at only one speed or a few speeds. By speeding up or slowing down, ASDs achieve gentle startups and gradual shutdowns, thereby providing plant equipment a longer life with fewer breakdowns while minimizing the energy requirements. The test program substantiated that ASDs enhance product quality and increase productivity in many industrial operations, including extended equipment life. 35 figs.
Tudor-Locke, C E; Myers, A M
2001-03-01
Researchers and practitioners require guidelines for using electronic pedometers to objectively quantify physical activity (specifically ambulatory activity) for research and surveillance as well as clinical and program applications. Methodological considerations include choice of metric and length of monitoring frame as well as different data recording and collection procedures. A systematic review of 32 empirical studies suggests we can expect 12,000-16,000 steps/day for 8-10-year-old children (lower for girls than boys); 7,000-13,000 steps/day for relatively healthy, younger adults (lower for women than men); 6,000-8,500 steps/day for healthy older adults; and 3,500-5,500 steps/day for individuals living with disabilities and chronic illnesses. These preliminary recommendations should be modified and refined, as evidence and experience using pedometers accumulates.
How Much Can Non-industry Standard Measurement Methodologies Benefit Methane Reduction Programs?
NASA Astrophysics Data System (ADS)
Risk, D. A.; O'Connell, L.; Atherton, E.
2017-12-01
In recent years, energy sector methane emissions have been recorded in large part by applying modern non-industry-standard techniques. Industry may lack the regulatory flexibility to use such techniques, or in some cases may not understand the possible associated economic advantage. As progressive jurisdictions move from estimation towards routine measurement, the research community should provide guidance to help regulators and companies measure more effectively, and economically if possible. In this study, we outline a modelling experiment in which we explore the integration of non-industry-standard measurement techniques as part of a generalized compliance measurement program. The study was not intended to be exhaustive, or to recommend particular combinations, but instead to explore the inter-relationships between methodologies, development type, and compliance practice. We first defined the role, applicable scale, detection limits, working distances, and approximate deployment cost of several measurement methodologies. We then considered a variety of development types differing mainly in footprint, density, and emissions "profile". Using a Monte Carlo approach, we evaluated the effect of these various factors on the cost and confidence of the compliance measurement program. We found that when added individually, some of the research techniques were indeed able to deliver an improvement in cost and/or confidence when used alongside industry-standard Optical Gas Imaging. When applied in combination, the ideal fraction of each measurement technique depended on development type, emission profile, and whether confidence or cost was more important. Results suggest that measurement cost and confidence could be improved if energy companies exploited a wider range of measurement techniques, and in a manner tailored to each development. In the short term, combining clear scientific guidance with economic information could benefit immediate mitigation efforts more than developing new super-sensors would.
Establishing a methodology to evaluate teen driver-training programs.
DOT National Transportation Integrated Search
2013-11-01
The goal of this research project was to develop a methodology to assist the Wisconsin Department of Transportation (WisDOT) in the evaluation of the effectiveness of teen driver education programs over the short and long terms. The research effort w...
Measuring the complexity of design in real-time imaging software
NASA Astrophysics Data System (ADS)
Sangwan, Raghvinder S.; Vercellone-Smith, Pamela; Laplante, Phillip A.
2007-02-01
Due to the intricacies in the algorithms involved, the design of imaging software is considered to be more complex than non-image processing software (Sangwan et al, 2005). A recent investigation (Larsson and Laplante, 2006) examined the complexity of several image processing and non-image processing software packages along a wide variety of metrics, including those postulated by McCabe (1976), Chidamber and Kemerer (1994), and Martin (2003). This work found that it was not always possible to quantitatively compare the complexity between imaging applications and nonimage processing systems. Newer research and an accompanying tool (Structure 101, 2006), however, provides a greatly simplified approach to measuring software complexity. Therefore it may be possible to definitively quantify the complexity differences between imaging and non-imaging software, between imaging and real-time imaging software, and between software programs of the same application type. In this paper, we review prior results and describe the methodology for measuring complexity in imaging systems. We then apply a new complexity measurement methodology to several sets of imaging and non-imaging code in order to compare the complexity differences between the two types of applications. The benefit of such quantification is far reaching, for example, leading to more easily measured performance improvement and quality in real-time imaging code.
Imaging screening of catastrophic neurological events using a software tool: preliminary results.
Fernandes, A P; Gomes, A; Veiga, J; Ermida, D; Vardasca, T
2015-05-01
In Portugal, as in most countries, the most frequent organ donors are brain-dead donors. To answer the increasing need for transplants, donation programs have been implemented. The goal is to recognize virtually all the possible and potential brain-dead donors admitted to hospitals. The aim of this work was to describe preliminary results of a software application designed to identify victims of devastating neurological injury who may progress to brain death and may be possible organ donors. This was an observational, longitudinal study with retrospective data collection. The software application is an automatic algorithm based on natural language processing for selected keywords/expressions present in the cranio-encephalic computerized tomography (CE CT) scan reports to identify catastrophic neurological situations, with e-mail notification to the Transplant Coordinator (TC). The first 7 months of this application were analyzed and compared with the standard clinical evaluation methodology. The imaging identification tool showed a sensitivity of 77% and a specificity of 66%; the positive predictive value (PPV) was 0.8 and the negative predictive value (NPV) was 0.7 for the identification of catastrophic neurological events. The methodology proposed in this work seems promising in improving the screening efficiency of critical neurological events. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Noble, Bram F.; Christmas, Lisa M.
2008-01-01
This article presents a methodological framework for strategic environmental assessment (SEA) application. The overall objective is to demonstrate SEA as a systematic and structured policy, plan, and program (PPP) decision support tool. In order to accomplish this objective, a stakeholder-based SEA application to greenhouse gas (GHG) mitigation policy options in Canadian agriculture is presented. Using a mail-out impact assessment exercise, agricultural producers and nonproducers from across the Canadian prairie region were asked to evaluate five competing GHG mitigation options against 13 valued environmental components (VECs). Data were analyzed using multi-criteria and exploratory analytical techniques. The results suggest considerable variation in perceived impacts and GHG mitigation policy preferences, suggesting that a blanket policy approach to GHG mitigation will create gainers and losers based on soil type and associated cropping and on-farm management practices. It is possible to identify a series of regional greenhouse gas mitigation programs that are robust, socially meaningful, and operationally relevant to both agricultural producers and policy decision makers. The assessment demonstrates the ability of SEA to address, in an operational sense, environmental problems that are characterized by conflicting interests and competing objectives and alternatives. A structured and systematic SEA methodology provides the necessary decision support framework for the consideration of impacts, and allows for PPPs to be assessed based on a much broader set of properties, objectives, criteria, and constraints while maintaining rigor and accountability in the assessment process.
NASA Technical Reports Server (NTRS)
Powers, L. M.; Jadaan, O. M.; Gyekenyesi, J. P.
1998-01-01
The desirable properties of ceramics at high temperatures have generated interest in their use for structural applications such as in advanced turbine engine systems. Design lives for such systems can exceed 10,000 hours. The long life requirement necessitates subjecting the components to relatively low stresses. The combination of high temperatures and low stresses typically places failure for monolithic ceramics in the creep regime. The objective of this paper is to present a design methodology for predicting the lifetimes of structural components subjected to creep rupture conditions. This methodology utilizes commercially available finite element packages and takes into account the time-varying creep strain distributions (stress relaxation). The creep life of a component is discretized into short time steps, during which the stress and strain distributions are assumed constant. The damage is calculated for each time step based on a modified Monkman-Grant creep rupture criterion. Failure is assumed to occur when the normalized accumulated damage at any point in the component is greater than or equal to unity. The corresponding time will be the creep rupture life for that component. Examples are chosen to demonstrate the Ceramics Analysis and Reliability Evaluation of Structures/CREEP (CARES/CREEP) integrated design program, which is written for the ANSYS finite element package. Depending on the component size and loading conditions, it was found that in real structures one of two competing failure modes (creep or slow crack growth) will dominate. Applications to benchmark problems and engine components are included.
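In compact form, the damage rule just described can be written as follows, with $C$ and $m$ as material constants in a generic Monkman-Grant rupture relation; the modified criterion implemented in CARES/CREEP may differ in detail:

```latex
D \;=\; \sum_i \Delta D_i \;=\; \sum_i \frac{\Delta t_i}{t_f(\sigma_i, T_i)},
\qquad
t_f\,\dot{\varepsilon}_{\min}^{\,m} \;=\; C,
\qquad
\text{failure when } D \ge 1,
```

where $\Delta t_i$ is the length of time step $i$, during which the stress state $\sigma_i$ and temperature $T_i$ are held constant, and $t_f$ is the rupture life at that state.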
NASA Technical Reports Server (NTRS)
Gyekenyesi, J. P.; Powers, L. M.; Jadaan, O. M.
1998-01-01
The desirable properties of ceramics at high temperatures have generated interest in their use for structural applications such as in advanced turbine systems. Design lives for such systems can exceed 10,000 hours. The long life requirement necessitates subjecting the components to relatively low stresses. The combination of high temperatures and low stresses typically places failure for monolithic ceramics in the creep regime. The objective of this paper is to present a design methodology for predicting the lifetimes of structural components subjected to creep rupture conditions. This methodology utilizes commercially available finite element packages and takes into account the time-varying creep strain distributions (stress relaxation). The creep life of a component is discretized into short time steps, during which the stress and strain distributions are assumed constant. The damage is calculated for each time step based on a modified Monkman-Grant creep rupture criterion. Failure is assumed to occur when the normalized accumulated damage at any point in the component is greater than or equal to unity. The corresponding time will be the creep rupture life for that component. Examples are chosen to demonstrate the CARES/CREEP (Ceramics Analysis and Reliability Evaluation of Structures/CREEP) integrated design program, which is written for the ANSYS finite element package. Depending on the component size and loading conditions, it was found that in real structures one of two competing failure modes (creep or slow crack growth) will dominate. Applications to benchmark problems and engine components are included.
Linden, Ariel; Yarnold, Paul R
2016-12-01
Program evaluations often utilize various matching approaches to emulate the randomization process for group assignment in experimental studies. Typically, the matching strategy is implemented, and then covariate balance is assessed before estimating treatment effects. This paper introduces a novel analytic framework utilizing a machine learning algorithm called optimal discriminant analysis (ODA) for assessing covariate balance and estimating treatment effects, once the matching strategy has been implemented. This framework holds several key advantages over the conventional approach: application to any variable metric and number of groups; insensitivity to skewed data or outliers; and use of accuracy measures applicable to all prognostic analyses. Moreover, ODA accepts analytic weights, thereby extending the methodology to any study design where weights are used for covariate adjustment or more precise (differential) outcome measurement. One-to-one matching on the propensity score was used as the matching strategy. Covariate balance was assessed using standardized difference in means (conventional approach) and measures of classification accuracy (ODA). Treatment effects were estimated using ordinary least squares regression and ODA. Using empirical data, ODA produced results highly consistent with those obtained via the conventional methodology for assessing covariate balance and estimating treatment effects. When ODA is combined with matching techniques within a treatment effects framework, the results are consistent with conventional approaches. However, given that it provides additional dimensions and robustness to the analysis versus what can currently be achieved using conventional approaches, ODA offers an appealing alternative. © 2016 John Wiley & Sons, Ltd.
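For context, the conventional balance measure that ODA is compared against is the standardized difference in means; a minimal sketch on simulated data (group sizes and distributions invented) is:

```python
import numpy as np

# Standardized difference in means for one covariate across matched groups.
rng = np.random.default_rng(7)
treated = rng.normal(50, 10, 200)   # covariate in treated group
control = rng.normal(48, 10, 200)   # covariate in matched controls

def std_diff(a, b):
    """Difference in means scaled by the pooled standard deviation."""
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    return (a.mean() - b.mean()) / pooled_sd

print(f"standardized difference: {std_diff(treated, control):.3f}")
```

A common rule of thumb treats |d| < 0.10 as acceptable balance; ODA replaces this metric-specific check with classification-accuracy measures that apply to any variable metric, which is the advantage the paper emphasizes.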
ERIC Educational Resources Information Center
Elsherif, Entisar
2017-01-01
This adaptive methodological inquiry explored the affordances and constraints of one TESOL teacher education program in Libya as a conflict zone. Data was collected through seven documents and 33 questionnaires. Questionnaires were gathered from the investigated program's teacher-educators, student-teachers, and graduates, who were in-service…
Toward quantifying the effectiveness of water trading under uncertainty.
Luo, B; Huang, G H; Zou, Y; Yin, Y Y
2007-04-01
This paper presents a methodology for quantifying the effectiveness of water trading under uncertainty, by developing an optimization model based on the interval-parameter two-stage stochastic program (TSP) technique. In the study, the effectiveness of a water-trading program is measured by the water volume that can be released through trading, from a statistical point of view. The methodology can also deal with recourse water allocation problems generated by randomness in water availability and, at the same time, tackle uncertainties expressed as intervals in the trading system. The developed methodology was tested with a hypothetical water-trading program in an agricultural system in the Swift Current Creek watershed, Canada. Study results indicate that the methodology can effectively measure the effectiveness of a trading program by estimating the water volume released through trading over the long term. A sensitivity analysis was also conducted to analyze the effects of different trading costs on the trading program. It shows that the trading efforts become ineffective when the trading costs are too high. The case study also demonstrates that the trading program is more effective in a dry season, when total water availability is in shortage.
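Schematically, a two-stage stochastic program of the kind the model builds on can be written as follows (the notation is generic, not the paper's, and the interval-parameter extension replaces coefficients with intervals such as $[c_j^-, c_j^+]$):

```latex
\max_{x,\,y}\quad f \;=\; \sum_j c_j x_j \;-\; \sum_j \sum_k p_k\, d_j\, y_{jk}
\qquad \text{s.t.}\quad y_{jk} \;\ge\; x_j - A_{jk}, \qquad y_{jk} \;\ge\; 0,
```

where $x_j$ is the first-stage water-allocation target for user $j$, fixed before the random water availability is known; $y_{jk}$ is the second-stage shortage (recourse) of user $j$ under flow scenario $k$, which occurs with probability $p_k$ and penalty $d_j$; and trading enters through constraints that redistribute the scenario-dependent available water $A_{jk}$ among users.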
Analysis and methodology for aeronautical systems technology program planning
NASA Technical Reports Server (NTRS)
White, M. J.; Gershkoff, I.; Lamkin, S.
1983-01-01
A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
Standardized emissions inventory methodology for open-pit mining areas.
Huertas, Jose I; Camacho, Dumar A; Huertas, Maria E
2011-08-01
There is still interest in a unified methodology to quantify the mass of particulate material emitted into the atmosphere by activities inherent to open-pit mining. For the case of total suspended particles (TSP), the current practice is to estimate such emissions by developing inventories based on the emission factors recommended by the USEPA for this purpose. However, there are disputes over the specific emission factors that must be used for each activity and the applicability of such factors to cases quite different from the ones under which they were obtained. There is also a need for emission inventories of particulate matter with an aerodynamic diameter less than 10 μm (PM(10)) and for metrics to evaluate the emission control programs implemented by open-pit mines. To address these needs, work was carried out to establish a standardized TSP and PM(10) emission inventory methodology for open-pit mining areas. The proposed methodology was applied to seven of the eight mining companies operating in the northern part of Colombia, home to one of the world's largest open-pit coal mining operations (∼70 Mt/year). The results obtained show that transport on unpaved roads is the mining activity that generates most of the emissions and that the total emissions may be reduced by up to 72% by spraying water on the unpaved roads. Performance metrics were defined for the emission control programs implemented by mining companies. It was found that coal open-pit mines are emitting 0.726 and 0.180 kg of TSP and PM(10), respectively, per ton of coal produced. It was also found that these mines are using on average 1.148 m(2) of land per ton of coal produced per year.
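The inventory arithmetic itself reduces to summing emission factors times activity rates, with control efficiencies applied; a minimal sketch (all factors, rates, and activity names below are invented placeholders, not the paper's values) is:

```python
# E = sum_i EF_i * A_i * (1 - CE_i): emission factor times activity rate,
# discounted by the control efficiency of any mitigation measure.
activities = {
    # activity: (TSP factor kg/t moved, t moved per yr, control efficiency)
    "transport_unpaved_roads": (0.50, 1_000_000, 0.72),  # road watering
    "loading_unloading":       (0.05, 1_000_000, 0.00),
    "wind_erosion_stockpiles": (0.02, 2_000_000, 0.00),
}

tsp = sum(ef * rate * (1 - ce) for ef, rate, ce in activities.values())
print(f"TSP inventory: {tsp / 1e3:.1f} t/yr")
```

Dividing such a total by annual coal production yields the per-ton performance metrics reported above (0.726 kg TSP and 0.180 kg PM(10) per ton of coal).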
NASA Astrophysics Data System (ADS)
Nemeth, Noel N.; Jadaan, Osama M.; Palfi, Tamas; Baker, Eric H.
Brittle materials today are being used, or considered, for a wide variety of high tech applications that operate in harsh environments, including static and rotating turbine parts, thermal protection systems, dental prosthetics, fuel cells, oxygen transport membranes, radomes, and MEMS. Designing brittle material components to sustain repeated load without fracturing while using the minimum amount of material requires the use of a probabilistic design methodology. The NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code provides a general-purpose analysis tool that predicts the probability of failure of a ceramic component as a function of its time in service. This capability includes predicting the time-dependent failure probability of ceramic components against catastrophic rupture when subjected to transient thermomechanical loads (including cyclic loads). The developed methodology allows for changes in material response that can occur with temperature or time (i.e., changing fatigue and Weibull parameters with temperature or time). For this article an overview of the transient reliability methodology and how this methodology is extended to account for proof testing is described. The CARES/Life code has been modified to have the ability to interface with commercially available finite element analysis (FEA) codes executed for transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
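The probabilistic core of such codes is Weibull's weakest-link model; its simplest two-parameter, volume-flaw form is shown generically below (CARES/Life layers slow-crack-growth and transient-load extensions on top of it):

```latex
P_f(t) \;=\; 1 \;-\; \exp\!\left[\, -\int_V
  \left( \frac{\sigma_{\mathrm{eq}}(\mathbf{x}, t)}{\sigma_0} \right)^{m}
  \frac{dV}{V_0} \right],
```

where $m$ is the Weibull modulus, $\sigma_0$ the scale parameter, $V_0$ a reference volume, and $\sigma_{\mathrm{eq}}$ an equivalent stress whose evolution with the transient load history carries the time dependence.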
Situational Analysis for Complex Systems: Methodological Development in Public Health Research.
Martin, Wanda; Pauly, Bernie; MacDonald, Marjorie
2016-01-01
Public health systems have suffered infrastructure losses worldwide. Strengthening public health systems requires not only good policies and programs, but also the development of new research methodologies to support public health systems renewal. Our research team considers public health systems to be complex adaptive systems; as such, new methods are necessary to generate knowledge about the process of implementing public health programs and services. Within our program of research, we have employed situational analysis as a method for studying complex adaptive systems in four distinct research studies on public health program implementation. The purpose of this paper is to demonstrate the use of situational analysis as a method for studying complex systems and to highlight the need for further methodological development.
ERIC Educational Resources Information Center
Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.
2008-01-01
In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD…
NASA Astrophysics Data System (ADS)
Liao, Haitao; Wu, Wenwang; Fang, Daining
2018-07-01
A coupled approach combining the reduced space Sequential Quadratic Programming (SQP) method with the harmonic balance condensation technique for finding the worst resonance response is developed. The nonlinear equality constraints of the optimization problem are imposed on the condensed harmonic balance equations. Making use of the null space decomposition technique, the original optimization formulation in the full space is mathematically simplified and solved in the reduced space by means of the reduced SQP method. The transformation matrix that maps the full space to the null space of the constrained optimization problem is constructed via the coordinate basis scheme. The removal of the nonlinear equality constraints is accomplished, resulting in a simple optimization problem subject to bound constraints. Moreover, a second-order correction technique is introduced to overcome the Maratos effect. The combined application of the reduced SQP method and the condensation technique permits a large reduction of the computational cost. Finally, the effectiveness and applicability of the proposed methodology are demonstrated by two numerical examples.
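Schematically (the notation is generic, not the authors'), the worst-resonance problem and the reduced-space step take the form:

```latex
\max_{x}\; a(x)
\quad \text{s.t.}\quad R(x) = 0, \qquad x^{l} \le x \le x^{u},
```

where $a(x)$ is the resonance response amplitude and $R(x) = 0$ are the condensed harmonic balance equations. With $A = \partial R/\partial x$ the constraint Jacobian and $Z$ a null-space basis satisfying $AZ = 0$ (built via the coordinate basis scheme), the SQP step decomposes as $\Delta x = Y p_Y + Z p_Z$: the range-space part $Y p_Y$ restores feasibility of the harmonic balance equations, while the reduced problem for $p_Z$ is subject only to the bound constraints, which is what removes the nonlinear equality constraints from the optimization.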
NASA Technical Reports Server (NTRS)
Birman, Kenneth; Cooper, Robert; Marzullo, Keith
1990-01-01
ISIS and META are two distributed systems projects at Cornell University. The ISIS project has developed a new methodology, virtual synchrony, for writing robust distributed software. This approach is directly supported by the ISIS Toolkit, a programming system that is distributed to over 300 academic and industrial sites. Several interesting applications that exploit the strengths of ISIS, including an NFS-compatible replicated file system, are being developed. The META project is about distributed control in a soft real-time environment incorporating feedback. This domain encompasses examples as diverse as monitoring inventory and consumption on a factory floor and performing load-balancing on a distributed computing system. One of the first uses of META is for distributed application management: the tasks of configuring a distributed program, dynamically adapting to failures, and monitoring its performance. Recent progress and current plans are presented. This approach to distributed computing, a philosophy that is believed to significantly distinguish the work from that of others in the field, is explained.
NASA Technical Reports Server (NTRS)
Kvaternik, R. G.
1975-01-01
Two computational procedures for analyzing complex structural systems for their natural modes and frequencies of vibration are presented. Both procedures are based on a substructures methodology and both employ the finite-element stiffness method to model the constituent substructures. The first procedure is a direct method based on solving the eigenvalue problem associated with a finite-element representation of the complete structure. The second procedure is a component-mode synthesis scheme in which the vibration modes of the complete structure are synthesized from modes of substructures into which the structure is divided. The analytical basis of the methods contains a combination of features which enhance the generality of the procedures. The computational procedures exhibit a uniquely utilitarian character with respect to versatility, computational convenience, and ease of computer implementation. The computational procedures were implemented in two special-purpose computer programs. The results of the application of these programs to several structural configurations are shown and comparisons are made with experiment.
Assessment, development, and application of combustor aerothermal models
NASA Technical Reports Server (NTRS)
Holdeman, J. D.; Mongia, H. C.; Mularz, E. J.
1989-01-01
The gas turbine combustion system design and development effort is an engineering exercise to obtain an acceptable solution to the conflicting design trade-offs between combustion efficiency, gaseous emissions, smoke, ignition, restart, lean blowout, burner exit temperature quality, structural durability, and life cycle cost. For many years, these combustor design trade-offs have been carried out with the help of fundamental reasoning and extensive component and bench testing, backed by empirical and experience correlations. Recent advances in the capability of computational fluid dynamics codes have led to their application to complex 3-D flows such as those in the gas turbine combustor. A number of U.S. Government and industry sponsored programs have made significant contributions to the formulation, development, and verification of an analytical combustor design methodology which will better define the aerothermal loads in a combustor, and be a valuable tool for design of future combustion systems. The contributions made by NASA Hot Section Technology (HOST) sponsored Aerothermal Modeling and supporting programs are described.
Konstantinidis, Evdokimos I; Frantzidis, Christos A; Pappas, Costas; Bamidis, Panagiotis D
2012-07-01
In this paper the feasibility of adopting Graphics Processing Units (GPUs) for real-time emotion-aware computing is investigated, to boost the time-consuming computations employed in such applications. The proposed methodology was employed in the analysis of encephalographic and electrodermal data gathered while participants passively viewed emotionally evocative stimuli. The GPU effectiveness in processing electroencephalographic and electrodermal recordings is demonstrated by comparing the execution time of chaos/complexity analysis through nonlinear dynamics (multi-channel correlation dimension/D2) and signal processing algorithms (computation of skin conductance level/SCL) across various popular programming environments. Apart from the beneficial role of parallel programming, the adoption of special design techniques regarding memory management may further enhance the time minimization, which approaches a factor of 30 in comparison with ANSI C (single-core sequential execution). Therefore, the use of GPU parallel capabilities offers a reliable and robust solution for real-time sensing of the user's affective state. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
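The computational bottleneck that the GPUs address is the all-pairs distance evaluation inside the correlation-dimension (D2) estimate; a minimal CPU sketch of a Grassberger-Procaccia style computation (embedding parameters and test signal are illustrative, not the paper's) is:

```python
import numpy as np

def embed(x, dim, tau):
    """Time-delay embedding of a 1-D signal."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def correlation_dimension(x, dim=4, tau=2):
    pts = embed(x, dim, tau)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    dists = d[np.triu_indices(len(pts), k=1)]          # all-pairs distances
    radii = np.logspace(np.log10(np.percentile(dists, 1)),
                        np.log10(np.percentile(dists, 50)), 12)
    C = np.array([(dists < r).mean() for r in radii])  # correlation sums
    slope, _ = np.polyfit(np.log(radii), np.log(C), 1) # D2 = scaling slope
    return slope

rng = np.random.default_rng(0)
sig = np.sin(np.linspace(0, 60, 1200)) + 0.05 * rng.normal(size=1200)
print(f"estimated D2: {correlation_dimension(sig):.2f}")
```

The O(N^2) distance computation is exactly the kind of data-parallel kernel that maps well onto a GPU, consistent with the factor-of-30 speedup reported above.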
Program and Project Management Framework
NASA Technical Reports Server (NTRS)
Butler, Cassandra D.
2002-01-01
The primary objective of this project was to develop a framework and system architecture for integrating program and project management tools that may be applied consistently throughout Kennedy Space Center (KSC) to optimize planning, cost estimating, risk management, and project control. Project management methodology used in building interactive systems to accommodate the needs of the project managers is applied as a key component in assessing the usefulness and applicability of the framework and tools developed. Research for the project included investigation and analysis of industrial practices, KSC standards, policies, and techniques, Systems Management Office (SMO) personnel, and other documented experiences of project management experts. In addition, this project documents best practices derived from the literature as well as new or developing project management models, practices, and techniques.
Aeroelastic analysis for propellers - mathematical formulations and program user's manual
NASA Technical Reports Server (NTRS)
Bielawa, R. L.; Johnson, S. A.; Chi, R. M.; Gangwani, S. T.
1983-01-01
Mathematical development is presented for a specialized propeller-dedicated version of the G400 rotor aeroelastic analysis. The G400PROP analysis simulates aeroelastic characteristics particular to propellers, such as structural sweep, aerodynamic sweep, and high subsonic unsteady airloads (both stalled and unstalled). Formulations are presented for these expanded propeller-related methodologies. Results are given from limited application of the analysis to realistic blade configurations and operating conditions, including stable and unstable stall flutter test conditions. To enhance user efficiency and expand utilization, sections are included describing: (1) the structuring of the G400PROP FORTRAN coding; (2) the required input data; and (3) the output results. General information to facilitate operation and improve efficiency is also provided.
NASA Technical Reports Server (NTRS)
Kerr, Andrew W.
1990-01-01
The utilization of advanced simulation technology in the development of the non-real-time MANPRINT design tools in the Army/NASA Aircrew-Aircraft Integration (A3I) program is described. A description is then given of the Crew Station Research and Development Facilities, the primary tool for the application of MANPRINT principles. The purpose of the A3I program is to develop a rational, predictive methodology for helicopter cockpit system design that integrates human factors engineering with other principles at an early stage in the development process, avoiding the high cost of previous system design methods. Enabling technologies such as the MIDAS work station are examined, and the potential of low-cost parallel-processing systems is indicated.
NASA Technical Reports Server (NTRS)
Sreekanta Murthy, T.
1992-01-01
Results are summarized from the investigation of formal nonlinear-programming-based numerical optimization techniques for helicopter airframe vibration reduction. The objective and constraint functions and the sensitivity expressions used in the formulation of airframe vibration optimization problems are presented and discussed. Implementation of a new computational procedure based on MSC/NASTRAN and CONMIN in a computer program system called DYNOPT for optimizing airframes subject to strength, frequency, dynamic response, and dynamic stress constraints is described. An optimization methodology is proposed which is thought to provide a new way of applying formal optimization techniques during the various phases of the airframe design process. Numerical results obtained from the application of the DYNOPT optimization code to a helicopter airframe are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
March-Leuba, S.; Jansen, J.F.; Kress, R.L.
A new program package, Symbolic Manipulator Laboratory (SML), for the automatic generation of both kinematic and static manipulator models in symbolic form is presented. Critical design parameters may be identified and optimized using symbolic models, as shown in the sample application presented for the Future Armor Rearm System (FARS) arm. The computer-aided development of the symbolic models yields equations with reduced numerical complexity. Important consideration has been given to simplifying the closed-form solutions and to user-friendly operation. The main emphasis of this research is the development of a methodology, implemented in a computer program, capable of generating symbolic kinematic and static force models of manipulators. The fact that the models are obtained in trigonometrically reduced form is among the most significant results of this work and the most difficult to implement. Mathematica, a commercial program that allows symbolic manipulation, is used to implement the program package. SML is written such that the user can change any of the subroutines or create new ones easily. To assist the user, on-line help has been written to make SML a user-friendly package. Some sample applications are presented. The design and optimization of the 5-degrees-of-freedom (DOF) FARS manipulator using SML is discussed. Finally, the kinematic and static models of two different 7-DOF manipulators are calculated symbolically.
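SML itself is built on Mathematica, but the flavor of automatic symbolic model generation with trigonometric reduction can be sketched in Python with sympy; the planar two-link arm below is an illustrative stand-in, not the FARS manipulator.

```python
import sympy as sp

t1, t2, l1, l2 = sp.symbols('theta1 theta2 l1 l2', real=True)

def link(q, a):
    """Planar homogeneous transform: rotate by q, translate a along x."""
    return sp.Matrix([[sp.cos(q), -sp.sin(q), a * sp.cos(q)],
                      [sp.sin(q),  sp.cos(q), a * sp.sin(q)],
                      [0,          0,         1]])

# Compose link transforms, then trigonometrically reduce every entry.
T = (link(t1, l1) * link(t2, l2)).applyfunc(sp.trigsimp)
pos = T[:2, 2]                 # symbolic end-effector position
print(pos)                     # l1*cos(theta1) + l2*cos(theta1 + theta2), ...
print(pos.jacobian([t1, t2]))  # symbolic Jacobian
```

The trigsimp step stands in for the trigonometric reduction the authors identify as the most significant and most difficult feature to implement: without it, the product of transforms leaves products of sines and cosines rather than compact sum-angle terms.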
NASA Technical Reports Server (NTRS)
Jadaan, Osama M.; Powers, Lynn M.; Gyekenyesi, John P.
1997-01-01
The desirable properties of ceramics at high temperatures have generated interest in their use for structural applications such as in advanced turbine systems. Design lives for such systems can exceed 10,000 hours. Such long life requirements necessitate subjecting the components to relatively low stresses. The combination of high temperatures and low stresses typically places failure for monolithic ceramics in the creep regime. The objective of this work is to present a design methodology for predicting the lifetimes of structural components subjected to multiaxial creep loading. This methodology utilizes commercially available finite element packages and takes into account the time-varying creep stress distributions (stress relaxation). In this methodology, the creep life of a component is divided into short time steps, during which the stress and strain distributions are assumed constant. The damage, D, is calculated for each time step based on a modified Monkman-Grant creep rupture criterion. For components subjected to predominantly tensile loading, failure is assumed to occur when the normalized accumulated damage at any point in the component is greater than or equal to unity.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1990-01-01
Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
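As a minimal illustration of the model-fitting techniques the standard covers, the sketch below fits linear, quadratic, and exponential trends to a synthetic time series (data and parameters are invented):

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(20, dtype=float)                     # time index
y = 5.0 * np.exp(0.12 * t) + rng.normal(0, 1, 20)  # synthetic trend + noise

lin = np.polyfit(t, y, 1)       # linear model:    y ~ a*t + b
quad = np.polyfit(t, y, 2)      # quadratic model: y ~ a*t^2 + b*t + c
# Exponential model y ~ A*exp(k*t), fit by linear regression on log(y).
k, logA = np.polyfit(t, np.log(np.clip(y, 1e-9, None)), 1)

print(f"linear slope {lin[0]:.2f}; quadratic leading coeff {quad[0]:.3f}; "
      f"exponential rate {k:.3f}, amplitude {np.exp(logA):.2f}")
```

Comparing residuals or goodness of fit across the three candidate models is then the basis for identifying and interpreting the trend.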
10 CFR 436.14 - Methodological assumptions.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 3 2011-01-01 2011-01-01 false Methodological assumptions. 436.14 Section 436.14 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses § 436.14 Methodological assumptions. (a) Each Federal Agency shall...
10 CFR 436.14 - Methodological assumptions.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 3 2013-01-01 2013-01-01 false Methodological assumptions. 436.14 Section 436.14 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses § 436.14 Methodological assumptions. (a) Each Federal Agency shall...
10 CFR 436.14 - Methodological assumptions.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 3 2012-01-01 2012-01-01 false Methodological assumptions. 436.14 Section 436.14 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses § 436.14 Methodological assumptions. (a) Each Federal Agency shall...
10 CFR 436.14 - Methodological assumptions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 3 2010-01-01 2010-01-01 false Methodological assumptions. 436.14 Section 436.14 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses § 436.14 Methodological assumptions. (a) Each Federal Agency shall...
10 CFR 436.14 - Methodological assumptions.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 3 2014-01-01 2014-01-01 false Methodological assumptions. 436.14 Section 436.14 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses § 436.14 Methodological assumptions. (a) Each Federal Agency shall...
ERIC Educational Resources Information Center
Berney, Tomi D.; Keyes, Jose L.
The Bronx Computer Literacy and Methodologies of Bilingual Education Program for Vietnamese and Cambodian High School Students (Project CLIMB) served 221 students of limited English proficiency (LEP) at Christopher Columbus and Walton High Schools in the Bronx (New York City). The objectives of the program were to develop the students' academic…
Prada, Sergio I.
2017-01-01
Background: The Medicaid Drug Utilization Review (DUR) program is a 2-phase process conducted by Medicaid state agencies. The first phase is a prospective DUR and involves electronically monitoring prescription drug claims to identify prescription-related problems, such as therapeutic duplication, contraindications, incorrect dosage, or duration of treatment. The second phase is a retrospective DUR and involves ongoing and periodic examinations of claims data to identify patterns of fraud, abuse, underutilization, drug–drug interaction, or medically unnecessary care, implementing corrective actions when needed. The Centers for Medicare & Medicaid Services requires each state to measure prescription drug cost-savings generated from its DUR programs on an annual basis, but it provides no guidance or unified methodology for doing so. Objectives: To describe and synthesize the methodologies used by states to measure cost-savings using their Medicaid retrospective DUR program in federal fiscal years 2014 and 2015. Method: For each state, the cost-savings methodologies included in the Medicaid DUR 2014 and 2015 reports were downloaded from Medicaid's website. The reports were then reviewed and synthesized. Methods described by the states were classified according to research designs often described in evaluation textbooks. Discussion: In 2014, the most often used prescription drug cost-savings estimation methodology for the Medicaid retrospective DUR program was a simple pre-post intervention method, without a comparison group (ie, 12 states). In 2015, the most common methodology used was a pre-post intervention method, with a comparison group (ie, 14 states). Comparisons of savings attributed to the program among states are still unreliable, because of the lack of a common methodology for measuring cost-savings. Conclusion: There is great variation among states in the methods used to measure prescription drug utilization cost-savings. This analysis suggests that there is still room for improvement in terms of methodology transparency, which is important because a lack of transparency hinders states from learning from each other. Ultimately, the federal government needs to evaluate and improve its DUR program. PMID:29403573
Middlestadt, S E; Bhattacharyya, K; Rosenbaum, J; Fishbein, M; Shepherd, M
1996-01-01
Through one of its many HIV prevention programs, the Prevention Marketing Initiative, the Centers for Disease Control and Prevention promotes a multifaceted strategy for preventing the sexual transmission of HIV/AIDS among people less than 25 years of age. The Prevention Marketing Initiative is an application of marketing and consumer-oriented technologies that rely heavily on behavioral research and behavior change theories to bring the behavioral and social sciences to bear on practical program planning decisions. One objective of the Prevention Marketing Initiative is to encourage consistent and correct condom use among sexually active young adults. Qualitative formative research is being conducted in several segments of the population of heterosexually active, unmarried young adults between 18 and 25 using a semistructured elicitation procedure to identify and understand underlying behavioral determinants of consistent condom use. The purpose of this paper is to illustrate the use of this type of qualitative research methodology in designing effective theory-based behavior change interventions. Issues of research design and data collection and analysis are discussed. To illustrate the methodology, results of content analyses of selected responses to open-ended questions on consistent condom use are presented by gender (male, female), ethnic group (white, African American), and consistency of condom use (always, sometimes). This type of formative research can be applied immediately to designing programs and is invaluable for valid and relevant larger-scale quantitative research. PMID:8862153
Creation of security engineering programs by the Southwest Surety Institute
NASA Astrophysics Data System (ADS)
Romero, Van D.; Rogers, Bradley; Winfree, Tim; Walsh, Dan; Garcia, Mary Lynn
1998-12-01
The Southwest Surety Institute includes Arizona State University (ASU), Louisiana State University (LSU), New Mexico Institute of Mining and Technology (NM Tech), New Mexico State University (NMSU), and Sandia National Laboratories (SNL). The universities currently offer a full spectrum of post-secondary programs in security system design and evaluation, including an undergraduate minor, a graduate program, and continuing education programs. The programs are based on the methodology developed at Sandia National Laboratories over the past 25 years to protect critical nuclear assets. The programs combine basic concepts and principles from business, criminal justice, and technology to create an integrated performance-based approach to security system design and analysis. Existing university capabilities in criminal justice (NMSU), explosives testing and technology (NM Tech and LSU), and engineering technology (ASU) are leveraged to provide unique science-based programs that will emphasize the use of performance measures and computer analysis tools to prove the effectiveness of proposed systems in the design phase. Facility managers may then balance increased protection against the cost of implementation and risk mitigation, thereby enabling effective business decisions. Applications expected to benefit from these programs include corrections, law enforcement, counter-terrorism, critical infrastructure protection, financial and medical care fraud, industrial security, and border security.
Object-oriented programming for the biosciences.
Wiechert, W; Joksch, B; Wittig, R; Hartbrich, A; Höner, T; Möllney, M
1995-10-01
The development of software systems for the biosciences is always closely connected to experimental practice. Programs must be able to handle the inherent complexity and heterogeneous structure of biological systems in combination with the measuring equipment. Moreover, a high degree of flexibility is required to treat rapidly changing experimental conditions. Object-oriented methodology seems to be well suited for this purpose. It enables an evolutionary approach to software development that still maintains a high degree of modularity. This paper presents experience with object-oriented technology gathered during several years of programming in the fields of bioprocess development and metabolic engineering. It concentrates on the aspects of experimental support, data analysis, interaction and visualization. Several examples are presented and discussed in the general context of the experimental cycle of knowledge acquisition, thus pointing out the benefits and problems of object-oriented technology in the specific application field of the biosciences. Finally, some strategies for future development are described.
Durability evaluation of ceramic components using CARES/LIFE
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.
1994-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Application of this design methodology is demonstrated using experimental data from alumina bar and disk flexure specimens which exhibit SCG when exposed to water.
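The two statistical building blocks this abstract names are standard and can be written down directly; as a sketch in our notation (not copied from the CARES/LIFE source):

```latex
% Two-parameter Weibull cumulative distribution of strength
% (shape m, characteristic strength \sigma_\theta)
P_f(\sigma) = 1 - \exp\!\left[-\left(\frac{\sigma}{\sigma_\theta}\right)^{m}\right]

% Power-law subcritical crack growth: crack velocity vs. stress-intensity factor
v = \frac{da}{dt} = A\,K_I^{\,N}
```

Fitting m, \(\sigma_\theta\), A, and N to rupture data from naturally flawed specimens is what the abstract refers to as estimating the inert strength and fatigue parameters.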
Durability evaluation of ceramic components using CARES/LIFE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nemeth, N.N.; Janosik, L.A.; Gyekenyesi, J.P.
1996-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Application of this design methodology is demonstrated using experimental data from alumina bar and disk flexure specimens, which exhibit SCG when exposed to water.
NASALIFE - Component Fatigue and Creep Life Prediction Program
NASA Technical Reports Server (NTRS)
Gyekenyesi, John Z.; Murthy, Pappu L. N.; Mital, Subodh K.
2014-01-01
NASALIFE is a life prediction program for propulsion system components made of ceramic matrix composites (CMC) under cyclic thermo-mechanical loading and creep rupture conditions. Although the primary focus was for CMC components, the underlying methodologies are equally applicable to other material systems as well. The program references empirical data for low cycle fatigue (LCF), creep rupture, and static material properties as part of the life prediction process. Multiaxial stresses are accommodated by von Mises based methods and a Walker model is used to address mean stress effects. Varying loads are reduced by the rainflow counting method or a peak counting type method. Lastly, damage due to cyclic loading and damage due to creep are combined using Miner's rule to determine the total damage per mission and the number of missions the component can provide before failure.
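As a sketch of the final combination step the abstract describes (linear damage accumulation under Miner's rule), with invented S-N constants, cycle counts, and creep damage rather than NASALIFE data:

```python
# Minimal sketch of Miner's rule over one mission. All numbers are placeholders.

def cycles_to_failure(stress_amplitude, sn_coeff=8.0e15, sn_exponent=-4.0):
    """Toy power-law S-N curve: N_f = C * S**b."""
    return sn_coeff * stress_amplitude ** sn_exponent

def miner_damage(cycle_counts):
    """cycle_counts: (stress amplitude [MPa], cycles) pairs, e.g. the
    output of a rainflow count over one mission profile."""
    return sum(n / cycles_to_failure(s) for s, n in cycle_counts)

mission = [(300.0, 1200), (450.0, 80), (520.0, 5)]   # rainflow-counted cycles
d_fatigue = miner_damage(mission)
d_creep = 0.002                    # assumed creep damage per mission
d_total = d_fatigue + d_creep      # Miner's rule: damages add linearly
print(f"damage/mission = {d_total:.4f}; missions to failure ~ {1.0/d_total:.0f}")
```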
Bonnet, F; Solignac, S; Marty, J
2008-03-01
The purpose of benchmarking is to establish improvement processes by comparing activities against quality standards. The proposed methodology is illustrated by benchmark business cases performed inside healthcare facilities on items such as nosocomial infections or the organization of surgical facilities. Moreover, the authors have built a specific graphic tool, enhanced with balanced-scorecard numbers and mappings, so that the comparison between different anesthesia-reanimation services that are willing to start an improvement program is easy and relevant. This ready-made application becomes even more accurate when detailed activity tariffs are implemented.
2012-09-01
experiments. J. Aerosol Sci., 40, 603-612. Zheng, M., Cass, G. R., Schauer, J. J., Edgerton, E. S. (2002) Source Apportionment of PM2.5 in the...Energy Heavy Vehicle Research Program. The SERDP project WP1627 team consists of the following members (listed in alphabetical order of the last name)...aircraft emissions are dominated by a fleet of high payload aircraft, such as the C-130, B-1, B-52, and a variety of heavy-lift turboshaft vehicles
In-Suit Doppler Technology Assessment
NASA Technical Reports Server (NTRS)
Schulze, Arthur E.; Greene, Ernest R.; Nadeau, John J.
1991-01-01
The objective of this program was to perform a technology assessment survey of non-invasive air embolism detection utilizing Doppler ultrasound methodologies. The primary application of this technology will be a continuous monitor for astronauts while performing extravehicular activities (EVA's). The technology assessment was to include: (1) development of a full understanding of all relevant background research; and (2) a survey of the medical ultrasound marketplace for expertise, information, and technical capability relevant to this development. Upon completion of the assessment, LSR was to provide an overview of technological approaches and R&D/manufacturing organizations.
NASA Technical Reports Server (NTRS)
Foster, T. L.; Winans, L., Jr.
1973-01-01
The sampling of soils from the manufacture and assembly areas of the Viking spacecraft is reported and the methodology employed in the analysis of these samples for psychrophilic microorganisms, and temperature studies on these organisms is outlined. Results showing the major types of organisms and the percentage of obligate psychrophiles in each sample are given and discussed. Emphasis in all areas is toward application of these results to the objectives of the planetary quarantine program.
Synthesis: Intertwining product and process
NASA Technical Reports Server (NTRS)
Weiss, David M.
1990-01-01
Synthesis is a proposed systematic process for rapidly creating different members of a program family. Family members are described by variations in their requirements. Requirements variations are mapped to variations on a standard design to generate production quality code and documentation. The approach is made feasible by using principles underlying design for change. Synthesis incorporates ideas from rapid prototyping, application generators, and domain analysis. The goals of Synthesis and the Synthesis process are discussed. The technology needed and the feasibility of the approach are also briefly discussed. The status of current efforts to implement Synthesis methodologies is presented.
NASA Astrophysics Data System (ADS)
Tinoco, R. O.; Goldstein, E. B.; Coco, G.
2016-12-01
We use a machine learning approach to seek accurate, physically sound predictors to estimate two relevant flow parameters for open-channel vegetated flows: mean velocities and drag coefficients. A genetic programming algorithm is used to find a robust relationship between properties of the vegetation and flow parameters. We use data published from several laboratory experiments covering a broad range of conditions to obtain: a) in the case of mean flow, an equation that matches the accuracy of other predictors from recent literature while showing a less complex structure, and b) for drag coefficients, a predictor that relies on both single-element and array parameters. We investigate different criteria for dataset size and data selection to evaluate their impact on the resulting predictor, as well as simple strategies to obtain only dimensionally consistent equations and avoid the need for dimensional coefficients. The results show that a proper methodology can deliver physically sound models representative of the processes involved, such that genetic programming and machine learning techniques can be used as powerful tools to study complicated phenomena and develop not only purely empirical but "hybrid" models, coupling results from machine learning methodologies into physics-based models.
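A toy illustration of the idea in this abstract: search a space of candidate formulas for a drag-coefficient predictor built from dimensionless groups and keep the fittest. Real genetic programming evolves expression trees; this sketch only random-samples coefficients of one candidate form, and the "experimental" data are synthetic, not the authors' laboratory datasets.

```python
import math
import random

random.seed(1)
# synthetic "experiments": (volume fraction phi, stem Reynolds number Re) -> Cd
data = [(phi, re, 1.0 + 10.0 * phi + 30.0 / math.sqrt(re))
        for phi in (0.01, 0.03, 0.08) for re in (50, 200, 800)]

def random_formula():
    """Candidate Cd ~ a + b*phi + c/sqrt(Re): sample random coefficients.
    Both phi and 1/sqrt(Re) are dimensionless, so the form is consistent."""
    return [random.uniform(0, 2), random.uniform(0, 20), random.uniform(0, 60)]

def rmse(coef):
    err = [(coef[0] + coef[1] * p + coef[2] * r ** -0.5 - cd) ** 2
           for p, r, cd in data]
    return math.sqrt(sum(err) / len(err))

best = min((random_formula() for _ in range(20000)), key=rmse)
print("best coefficients:", [round(c, 2) for c in best], "rmse:", round(rmse(best), 3))
```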
NASA Astrophysics Data System (ADS)
Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.
2016-04-01
We investigate various uses of electricity demand in Greece (agricultural, commercial, domestic, and industrial use, as well as use by public and municipal authorities and street lighting) and we examine their relation with variables such as population, total area, population density and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to the level of prefecture. We visualize the results of the analysis and perform cluster and outlier analysis using the Anselin local Moran's I statistic, as well as hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight and better understanding of the regional development model in Greece and justifies the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
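The local Moran's I statistic used for the cluster/outlier step has a compact form; the sketch below computes it with plain NumPy on invented stand-in values and an invented contiguity matrix, not the prefecture-level demand data.

```python
import numpy as np

x = np.array([3.1, 2.9, 3.0, 8.5, 9.1, 1.2])       # e.g. demand per region
W = np.array([[0, 1, 1, 0, 0, 0],                   # symmetric adjacency
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 0],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 0, 1, 0]], dtype=float)
W = W / W.sum(axis=1, keepdims=True)                # row-standardize weights

z = x - x.mean()
m2 = (z ** 2).sum() / len(x)
local_i = (z / m2) * (W @ z)          # I_i = (z_i / m2) * sum_j w_ij z_j
print(np.round(local_i, 2))           # positive: cluster; negative: outlier
```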
Classification methodology for tritiated waste requiring interim storage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cana, D.; Dall'ava, D.; Decanis, C.
2015-03-15
Fusion machines like the ITER experimental research facility will use tritium as fuel. Therefore, most of the solid radioactive waste will result not only from activation by 14 MeV neutrons, but also from contamination by tritium. As a consequence, optimizing the treatment process for waste containing tritium (tritiated waste) is a major challenge. This paper summarizes the studies conducted in France within the framework of the French national plan for the management of radioactive materials and waste. The paper recommends a reference program for managing this waste based on its sorting, treatment and packaging by the producer. It also recommends setting up a 50-year temporary storage facility to allow for tritium decay and designing future disposal facilities using tritiated radwaste characteristics as input data. This paper first describes this waste program and then details an optimized classification methodology which takes into account tritium decay over a 50-year storage period. The paper also describes a specific application for purely tritiated waste and discusses the set-up expected to be implemented for ITER decommissioning waste (current assumption). Comparison between this optimized approach and other viable detritiation techniques will be drawn. (authors)
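The physics behind the 50-year figure is simple decay arithmetic; with tritium's half-life of about 12.3 years:

```latex
N(t) = N_0\,2^{-t/T_{1/2}}, \qquad T_{1/2}(^{3}\mathrm{H}) \approx 12.3\ \text{yr}
\;\Rightarrow\; \frac{N(50\ \text{yr})}{N_0} = 2^{-50/12.3} \approx 0.06
```

so roughly 6% of the initial tritium inventory (a reduction by a factor of about 17) remains after the 50-year interim storage, which is what allows some waste to be reclassified before disposal.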
Dorofeev, S B; Babenko, A I
2017-01-01
The article deals with analysis of national and international publications concerning methodological aspects of elaborating a systematic approach to the healthy life-style of the population. This scope of inquiry plays a key role in the development of human capital. The costs related to a healthy life-style are to be considered as a personal investment into future income due to the physical incrementation of human capital. The definitions of healthy life-style, its categories and supportive factors are to be considered in the process of development of strategies and programs of healthy life-style. The implementation of particular strategies entails the application of comprehensive information and educational programs meant for various categories of the population. Therefore, different motivation techniques are to be considered for children, adolescents, the able-bodied population, and the elderly. This approach should result in establishing particular responsibilities for the national government, territorial administrations, health care administrations, employers and the population itself. The necessity of complex legislative measures is emphasized. The recent social hygienic studies were focused mostly on particular aspects of the development of a healthy life-style of the population. Hence there is a demand for long-term exploration of the development of organizational and functional models implementing medical preventive measures on the basis of comprehensive information analysis using statistical, sociological and professional expertise.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler
Battery Life estimation is one of the key inputs required for Hybrid applications for all GM Hybrid/EV/EREV/PHEV programs. For each Hybrid vehicle program, GM has instituted multi-parameter Design of Experiments generating test data at Cell level and also Pack level on a reduced basis. Based on experience, generating test data on a pack level is found to be very expensive, resource intensive and sometimes less reliable. The proposed collaborative project will focus on a methodology to estimate Battery life based on cell degradation data combined with pack thermal modeling. NREL has previously developed cell-level battery aging models and pack-level thermal/electrical network models, though these models are currently not integrated. When coupled together, the models are expected to describe pack-level thermal and aging response of individual cells. GM and NREL will use data collected for GM's Bas+ battery system for evaluation of the proposed methodology and assess to what degree these models can replace pack-level aging experiments in the future.
Hors, Cora; Goldberg, Anna Carla; Almeida, Ederson Haroldo Pereira de; Babio Júnior, Fernando Galan; Rizzo, Luiz Vicente
2012-01-01
To introduce a program for the management of scientific research in a General Hospital employing the business management tools Lean Six Sigma and PMBOK for project management in this area. The Lean Six Sigma methodology was used to improve the management of the institution's scientific research through a specific tool (DMAIC) for identification and implementation of solutions, and their posterior analysis based on PMBOK practices. We present our solutions for the management of institutional research projects at the Sociedade Beneficente Israelita Brasileira Albert Einstein. The solutions were classified into four headings: people, processes, systems and organizational culture. A preliminary analysis of these solutions showed them to be completely or partially compliant with the processes described in the PMBOK Guide. In this post facto study, we verified that the solutions drawn from a project using the Lean Six Sigma methodology and based on PMBOK enabled the improvement of our processes dealing with the management of scientific research carried out in the institution, and constitute a model to contribute to the search for innovative science management solutions by other institutions dealing with scientific research in Brazil.
Architectural Methodology Report
NASA Technical Reports Server (NTRS)
Dhas, Chris
2000-01-01
The establishment of conventions between two communicating entities in the end systems is essential for communications. Examples of the kind of decisions that need to be made in establishing a protocol convention include the nature of the data representation, the format and the speed of the data representation over the communications path, and the sequence of control messages (if any) which are sent. One of the main functions of a protocol is to establish a standard path between the communicating entities. This is necessary to create a virtual communications medium with certain desirable characteristics. In essence, it is the function of the protocol to transform the characteristics of the physical communications environment into a more useful virtual communications model. The final function of a protocol is to establish standard data elements for communications over the path; that is, the protocol serves to create a virtual data element for exchange. Other systems may be constructed in which the transferred element is a program or a job. Finally, there are special purpose applications in which the element to be transferred may be a complex structure such as all or part of a graphic display. NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to describe the methodologies used in developing a protocol architecture for an in-space Internet node. The node would support NASA's four mission areas: Earth Science; Space Science; Human Exploration and Development of Space (HEDS); Aerospace Technology. This report presents the methodology for developing the protocol architecture. The methodology addresses the architecture for a computer communications environment. It does not address an analog voice architecture.
Intelligent systems/software engineering methodology - A process to manage cost and risk
NASA Technical Reports Server (NTRS)
Friedlander, Carl; Lehrer, Nancy
1991-01-01
A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.
New Briefing Methodology for the Brazilian Study and Monitoring of Space Weather (Embrace) Program.
NASA Astrophysics Data System (ADS)
Dal Lago, A.; Cecatto, J. R.; Costa, J. E. R.; Da Silva, L. A.; Rockenbach, M.; Braga, C. R.; Mendonca, R. R. S.; Mendes, O., Jr.; Koga, D.; Alves, L. R.; Becker-Guedes, F.; Wrasse, C. M.; Takahashi, H.; Resende, L.; Banik de Padua, M.; De Nardin, C. M.
2016-12-01
The Brazilian Study and Monitoring of Space Weather (Embrace) Program has been conducted by the National Institute for Space Research (INPE, Brazil) since 2008. Among the several activities of the Embrace program are weekly briefings, held since 2012, in which all space weather events of the past week are evaluated. At the beginning, an intuitive methodology was used, in which scientists were invited to present reports on their subjects of expertise: solar, interplanetary space, geomagnetism, ionosphere and upper atmosphere. Later on, an additional subject was introduced, with the inclusion of a separate report on the Earth's magnetosphere, with special attention to the dynamics of the Earth's radiation belts. Since late 2015, the need for a more efficient methodology was felt by the Embrace program, inspired by practices long used in meteorological weather and climate forecasting. In that sense, an attempt to develop scales of disturbances was made. The aim is to be able to represent the level of space weather activity in all reported subjects more rapidly. A huge effort was put together to produce sound indices, based on the statistical significance of occurrence of distinct levels. This methodology has been partially under practical evaluation since early 2016. In this work we present a report on the progress of the new methodology for Embrace program briefing meetings.
Aiken, Leona S; West, Stephen G; Millsap, Roger E
2008-01-01
In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD programs (86%) participated. This survey replicated and extended a previous survey (L. S. Aiken, S. G. West, L. B. Sechrest, & R. R. Reno, 1990), permitting examination of curriculum development. Most training supported laboratory and not field research. The median of 1.6 years of training in statistics and measurement was mainly devoted to the modally 1-year introductory statistics course, leaving little room for advanced study. Curricular enhancements were noted in statistics and to a minor degree in measurement. Additional coverage of both fundamental and innovative quantitative methodology is needed. The research design curriculum has largely stagnated, a cause for great concern. Elite programs showed no overall advantage in quantitative training. Forces that support curricular innovation are characterized. Human capital challenges to quantitative training, including recruiting and supporting young quantitative faculty, are discussed. Steps must be taken to bring innovations in quantitative methodology into the curriculum of PhD programs in psychology. PsycINFO Database Record (c) 2008 APA, all rights reserved.
Banerjee, Shyamashree; Gupta, Parth Sarthi Sen; Nayek, Arnab; Das, Sunit; Sur, Vishma Pratap; Seth, Pratyay; Islam, Rifat Nawaz Ul; Bandyopadhyay, Amal K
2015-01-01
Automated genome sequencing procedures are enriching the sequence database very fast. To achieve a balance between the entry of sequences into the database and their analyses, efficient software is required. To this end PHYSICO2, compared to the earlier PHYSICO and other public-domain tools, is most efficient in that it i] extracts physicochemical, window-dependent and homologous-position-based-substitution (PWS) properties, including positional and BLOCK-specific diversity and conservation, ii] provides users with optional flexibility in setting relevant input parameters, iii] helps users to prepare the BLOCK-FASTA file by use of the Automated Block Preparation Tool of the program, iv] performs fast, accurate and user-friendly analyses and v] redirects itemized outputs in Excel format along with detailed methodology. The program package contains documentation describing the application of methods. Overall the program acts as an efficient PWS-analyzer and finds application in sequence bioinformatics. Availability: PHYSICO2 is freely available at http://sourceforge.net/projects/physico2/ along with its documentation at https://sourceforge.net/projects/physico2/files/Documentation.pdf/download for all users. PMID:26339154
Application of Sequential Quadratic Programming to Minimize Smart Active Flap Rotor Hub Loads
NASA Technical Reports Server (NTRS)
Kottapalli, Sesi; Leyland, Jane
2014-01-01
In an analytical study, SMART active flap rotor hub loads have been minimized using nonlinear programming constrained optimization methodology. The recently developed NLPQLP system (Schittkowski, 2010) that employs Sequential Quadratic Programming (SQP) as its core algorithm was embedded into a driver code (NLP10x10) specifically designed to minimize active flap rotor hub loads (Leyland, 2014). Three types of practical constraints on the flap deflections have been considered. To validate the current application, two other optimization methods have been used: i) the standard, linear unconstrained method, and ii) the nonlinear Generalized Reduced Gradient (GRG) method with constraints. The new software code NLP10x10 has been systematically checked out. It has been verified that NLP10x10 is functioning as desired. The following are briefly covered in this paper: relevant optimization theory; implementation of the capability of minimizing a metric of all, or a subset, of the hub loads as well as the capability of using all, or a subset, of the flap harmonics; and finally, solutions for the SMART rotor. The eventual goal is to implement NLP10x10 in a real-time wind tunnel environment.
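As a hedged sketch of the core optimization step described above: minimize a quadratic hub-load metric over flap-harmonic amplitudes subject to deflection bounds. SciPy's SLSQP stands in for the NLPQLP code named in the abstract, and the sensitivity matrix T and baseline loads h0 are invented placeholders, not SMART rotor data.

```python
import numpy as np
from scipy.optimize import minimize

T = np.array([[2.0, 0.3, 0.1],
              [0.4, 1.5, 0.2],
              [0.1, 0.2, 1.8]])        # hub-load sensitivity to flap harmonics
h0 = np.array([5.0, -3.0, 2.0])        # baseline vibratory hub loads

def hub_load_metric(x):
    """Quadratic metric of residual hub loads for flap inputs x."""
    r = h0 + T @ x
    return float(r @ r)

bounds = [(-2.0, 2.0)] * 3             # each flap harmonic within +/- 2 deg
res = minimize(hub_load_metric, x0=np.zeros(3), method="SLSQP", bounds=bounds)
print(res.x, hub_load_metric(res.x))
```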
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.
2010-01-01
A methodology is described for generating first-order plant equations of motion for aeroelastic and aeroservoelastic applications. The description begins with the process of generating data files representing specialized mode-shapes, such as rigid-body and control surface modes, using both PATRAN and NASTRAN analysis. NASTRAN executes the 146 solution sequence using numerous Direct Matrix Abstraction Program (DMAP) calls to import the mode-shape files and to perform the aeroelastic response analysis. The aeroelastic response analysis calculates and extracts structural frequencies, generalized masses, frequency-dependent generalized aerodynamic force (GAF) coefficients, sensor deflections and load coefficients data as text-formatted data files. The data files are then re-sequenced and re-formatted using a custom written FORTRAN program. The text-formatted data files are stored and coefficients for s-plane equations are fitted to the frequency-dependent GAF coefficients using two Interactions of Structures, Aerodynamics and Controls (ISAC) programs. With tabular files from stored data created by ISAC, MATLAB generates the first-order aeroservoelastic plant equations of motion. These equations include control-surface actuator, turbulence, sensor and load modeling. Altitude varying root-locus plot and PSD plot results for a model of the F-18 aircraft are presented to demonstrate the capability.
NASA Technical Reports Server (NTRS)
Russo, Vincent; Johnston, Gary; Campbell, Roy
1988-01-01
The programming of the interrupt handling mechanisms, process switching primitives, scheduling mechanism, and synchronization primitives of an operating system for a multiprocessor requires both efficient code in order to support the needs of high-performance or real-time applications and careful organization to facilitate maintenance. Although many advantages have been claimed for object-oriented class hierarchical languages and their corresponding design methodologies, the application of these techniques to the design of the primitives within an operating system has not been widely demonstrated. To investigate the role of class hierarchical design in systems programming, the authors have constructed the Choices multiprocessor operating system architecture in the C++ programming language. During the implementation, it was found that many operating system design concerns can be represented advantageously using a class hierarchical approach, including: the separation of mechanism and policy; the organization of an operating system into layers, each of which represents an abstract machine; and the notions of process and exception management. In this paper, we discuss an implementation of the low-level primitives of this system and outline the strategy by which we developed our solution.
Selecting cockpit functions for speech I/O technology
NASA Technical Reports Server (NTRS)
Simpson, C. A.
1985-01-01
A general methodology for the initial selection of functions for speech generation and speech recognition technology is discussed. The SCR (Stimulus/Central-Processing/Response) compatibility model of Wickens et al. (1983) is examined, and its application is demonstrated for a particular cockpit display problem. Some limits of the applicability of that model are illustrated in the context of predicting overall pilot-aircraft system performance. A program of system performance measurement is recommended for the evaluation of candidate systems. It is suggested that no one measure of system performance can necessarily be depended upon to the exclusion of others. Systems response time, system accuracy, and pilot ratings are all important measures. Finally, these measures must be collected in the context of the total flight task environment.
NASA Technical Reports Server (NTRS)
Messinger, Ross
2008-01-01
An assessment was performed to identify the applicability of composite material technologies to major structural elements of the NASA Constellation program. A qualitative technology assessment methodology was developed to document the relative benefit of 24 structural systems with respect to 33 major structural elements of Ares I, Orion, Ares V, and Altair. Technology maturity assessments and development plans were obtained from more than 30 Boeing subject matter experts for more than 100 technologies. These assessment results and technology plans were combined to generate a four-level hierarchy of recommendations. An overarching strategy is suggested, followed by a Constellation-wide development plan, three integrated technology demonstrations, and three focused projects for a task order follow-on.
NASA Technical Reports Server (NTRS)
Mironenko, G.
1972-01-01
Programs for the analyses of the free or forced, undamped vibrations of one or two elastically-coupled lumped-parameter systems are presented. Bearing nonlinearities, casing and rotor distributed mass and elasticity, rotor imbalance, forcing functions, gyroscopic moments, rotary inertia, and shear and flexural deformations are all included in the system dynamics analysis. Because all bearings have nonlinear load-displacement characteristics, the solution is achieved by iteration. Rotor imbalances allowed by such considerations as pilot tolerances and runouts, as well as bearing clearances (allowing conical or cylindrical whirl), determine the forcing function magnitudes. The computer programs first obtain a solution wherein the bearings are treated as linear springs of given spring rates. Then, based upon the computed bearing reactions, new spring rates are predicted and another solution of the modified system is made. The iteration is continued until the changes to bearing spring rates and bearing reactions become negligibly small.
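A minimal sketch of the iteration loop described above: treat the bearing as a linear spring, solve the rotor response, re-estimate the spring rate from the computed deflection, and repeat until the change is negligible. The single-mass rotor and cubic bearing law are our illustrative stand-ins for the full beam model, not the program's actual equations.

```python
M, OMEGA, F0 = 50.0, 400.0, 2.0e3   # rotor mass [kg], speed [rad/s], imbalance force [N]
K0, K3 = 1.0e7, 5.0e12              # assumed bearing law: F = K0*x + K3*x**3

k = K0                              # initial linear spring-rate guess
for it in range(200):
    x = F0 / abs(k - M * OMEGA**2)  # undamped forced-response amplitude
    k_new = K0 + K3 * x**2          # secant stiffness of the cubic bearing at x
    if abs(k_new - k) <= 1e-6 * k:
        k = k_new
        break
    k = 0.5 * (k + k_new)           # relaxation to help convergence
print(f"whirl amplitude {x*1e3:.3f} mm, bearing rate {k:.3e} N/m, {it} iterations")
```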
Interactive multi-mode blade impact analysis
NASA Technical Reports Server (NTRS)
Alexander, A.; Cornell, R. W.
1978-01-01
The theoretical methodology used in developing an analysis for the response of turbine engine fan blades subjected to soft-body (bird) impacts is reported, and the computer program developed using this methodology as its basis is described. This computer program is an outgrowth of two programs that were previously developed for the purpose of studying problems of a similar nature (a 3-mode beam impact analysis and a multi-mode beam impact analysis). The present program utilizes an improved missile model that is interactively coupled with blade motion which is more consistent with actual observations. It takes into account local deformation at the impact area, blade camber effects, and the spreading of the impacted missile mass on the blade surface. In addition, it accommodates plate-type mode shapes. The analysis capability in this computer program represents a significant improvement in the development of the methodology for evaluating potential fan blade materials and designs with regard to foreign object impact resistance.
Pilotte, Nils; Zaky, Weam I.; Abrams, Brian P.; Chadee, Dave D.; Williams, Steven A.
2016-01-01
Background Given the continued successes of the world’s lymphatic filariasis (LF) elimination programs and the growing successes of many malaria elimination efforts, the necessity of low cost tools and methodologies applicable to long-term disease surveillance is greater than ever before. As many countries reach the end of their LF mass drug administration programs and a growing number of countries realize unprecedented successes in their malaria intervention efforts, the need for practical molecular xenomonitoring (MX), capable of providing surveillance for disease recrudescence in settings of decreased parasite prevalence is increasingly clear. Current protocols, however, require testing of mosquitoes in pools of 25 or fewer, making high-throughput examination a challenge. The new method we present here screens the excreta/feces from hundreds of mosquitoes per pool and provides proof-of-concept for a practical alternative to traditional methodologies resulting in significant cost and labor savings. Methodology/Principal Findings Excreta/feces of laboratory reared Aedes aegypti or Anopheles stephensi mosquitoes provided with a Brugia malayi microfilaria-positive or Plasmodium vivax-positive blood meal respectively were tested for the presence of parasite DNA using real-time PCR. A titration of samples containing various volumes of B. malayi-negative mosquito feces mixed with positive excreta/feces was also tested to determine sensitivity of detection. Real-time PCR amplification of B. malayi and P. vivax DNA from the excreta/feces of infected mosquitoes was demonstrated, and B. malayi DNA in excreta/feces from one to two mf-positive blood meal-receiving mosquitoes was detected when pooled with volumes of feces from as many as 500 uninfected mosquitoes. Conclusions/Significance While the operationalizing of excreta/feces testing may require the development of new strategies for sample collection, the high-throughput nature of this new methodology has the potential to greatly reduce MX costs. This will prove particularly useful in post-transmission-interruption settings, where this inexpensive approach to long-term surveillance will help to stretch the budgets of LF and malaria elimination programs. Furthermore, as this methodology is adaptable to the detection of both single celled (P. vivax) and multicellular eukaryotic pathogens (B. malayi), exploration of its use for the detection of various other mosquito-borne diseases including viruses should be considered. Additionally, integration strategies utilizing excreta/feces testing for the simultaneous surveillance of multiple diseases should be explored. PMID:27096156
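A back-of-envelope view of why the pooling described above is so economical: a positive pool only says that at least one contributing mosquito was infected, so prevalence is recovered with the standard maximum-likelihood estimator for pooled testing. The survey numbers below are hypothetical, not the paper's data.

```python
def mle_prevalence(n_pools, n_positive_pools, pool_size):
    """MLE for individual prevalence from pooled tests:
    p_hat = 1 - (1 - P_pos)**(1/k) for pools of size k."""
    p_pos = n_positive_pools / n_pools
    return 1.0 - (1.0 - p_pos) ** (1.0 / pool_size)

# hypothetical survey: 4 excreta/feces pools of 500 mosquitoes each, 1 positive
print(f"estimated prevalence: {mle_prevalence(4, 1, 500):.5f}")  # ~0.00058
```

Four PCR reactions cover 2000 mosquitoes here, versus 80 reactions under the traditional pools-of-25 protocol.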
The Heritage of Earth Science Applications in Policy, Business, and Management of Natural Resources
NASA Astrophysics Data System (ADS)
Macauley, M.
2012-12-01
From the first hand-held cameras on the Gemini space missions to present day satellite instruments, Earth observations have enhanced the management of natural resources including water, land, and air. Applications include the development of new methodology (for example, developing and testing algorithms or demonstrating how data can be used) and the direct use of data in decisionmaking and policy implementation. Using well-defined bibliographic search indices to systematically survey a broad social science literature, this project enables identification of a host of well-documented, practical and direct applications of Earth science data in resource management. This literature has not previously been well surveyed, aggregated, or analyzed for the heritage of lessons learned in practical application of Earth science data. In the absence of such a survey, the usefulness of Earth science data is underestimated and the factors that make people want to use -- and able to use -- the data are poorly understood. The project extends and updates previous analysis of social science applications of Landsat data to show their contemporary, direct use in new policy, business, and management activities and decisionmaking. The previous surveys (for example, Blumberg and Jacobson 1997; National Research Council 1998) find that the earliest attempts to use data are almost exclusively testing of methodology rather than direct use in resource management. Examples of methodology prototyping include Green et al. (1997) who demonstrate use of remote sensing to detect and monitor changes in land cover and use, Cowen et al. (1995) who demonstrate design and integration of GIS for environmental applications, Hutchinson (1991) who shows uses of data for famine early warning, and Brondizio et al. (1996) who show the link of thematic mapper data with botanical data. Blumberg and Jacobson (in Acevedo et al. 1996) show use of data in a study of urban development in the San Francisco Bay and the Baltimore-Washington metropolitan regions. The earliest direct application of Earth science information to actual decisionmaking began with the use of Landsat data in large-scale government demonstration programs and later, in smaller state and local agency projects. Many of these applications served as experiments to show how to use the data and to test their limitations. These activities served as precursors to more recent applications. Among the newest applications are the use of data to provide essential information to underpin monetary estimates of ecosystem services and the development of "credit" programs for these services. Another example is participatory (citizen science) resource management. This project also identifies the heritage of adoption factors - that is, determinants of the decision to use Earth science data. These factors include previous experience with Earth science data, reliable and transparent validation and verification techniques for new data, the availability and thoroughness of metadata, the ease of access and use of the data products, and technological innovation in computing and software (factors largely outside of the Earth science enterprise but influential in ease of direct use of Earth science data).
NASA Technical Reports Server (NTRS)
Szuch, J. R.; Krosel, S. M.; Bruton, W. M.
1982-01-01
A systematic, computer-aided, self-documenting methodology for developing hybrid computer simulations of turbofan engines is presented. The methodology that is presented makes use of a host program that can run on a large digital computer and a machine-dependent target (hybrid) program. The host program performs all the calculations and data manipulations that are needed to transform user-supplied engine design information to a form suitable for the hybrid computer. The host program also trims the self-contained engine model to match specified design-point information. Part I contains a general discussion of the methodology, describes a test case, and presents comparisons between hybrid simulation and specified engine performance data. Part II, a companion document, contains documentation, in the form of computer printouts, for the test case.
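The "trim" step described above is a root-finding problem: adjust free model inputs until the engine model reproduces the specified design point. A rough sketch under invented assumptions, with a toy two-equation balance standing in for the full turbofan model:

```python
from scipy.optimize import fsolve

DESIGN_THRUST, DESIGN_TEMP = 1.00, 1.00   # normalized design-point targets

def engine_residuals(u):
    """Residuals between model outputs and design-point data for trim
    variables u = (fuel-flow scalar, nozzle-area scalar). Toy relations."""
    wf, a8 = u
    thrust = 0.9 * wf + 0.2 * a8          # invented component maps
    temp = 1.3 * wf - 0.25 * a8
    return [thrust - DESIGN_THRUST, temp - DESIGN_TEMP]

wf, a8 = fsolve(engine_residuals, x0=[1.0, 1.0])
print(f"trimmed fuel flow {wf:.3f}, nozzle area {a8:.3f}")
```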
Advanced Turbine Technology Applications Project (ATTAP)
NASA Technical Reports Server (NTRS)
1992-01-01
This report is the fourth in a series of Annual Technical Summary Reports for the Advanced Turbine Technology Applications Project (ATTAP). This report covers plans and progress on ceramics development for commercial automotive applications over the period 1 Jan. - 31 Dec. 1991. Project effort conducted under this contract is part of the DOE Gas Turbine Highway Vehicle System program. This program is directed to provide the U.S. automotive industry the high-risk, long-range technology necessary to produce gas turbine engines for automobiles with reduced fuel consumption, reduced environmental impact, and a decreased reliance on scarce materials and resources. The program is oriented toward developing the high-risk technology of ceramic structural component design and fabrication, such that industry can carry this technology forward to production in the 1990s. The ATTAP test bed engine, carried over from the previous AGT101 project, is being used for verification testing of the durability of next-generation ceramic components, and their suitability for service at Reference Powertrain Design conditions. This document reports the technical effort conducted by GAPD and the ATTAP subcontractors during the fourth year of the project. Topics covered include ceramic processing definition and refinement, design improvements to the ATTAP test bed engine and test rigs and the methodology development of ceramic impact and fracture mechanisms. Appendices include reports by ATTAP subcontractors in the development of silicon nitride and silicon carbide families of materials and processes.
ERIC Educational Resources Information Center
Greenberg, Julie; Walsh, Kate; McKee, Arthur
2014-01-01
The "NCTQ Teacher Prep Review" evaluates the quality of programs that provide preservice preparation of public school teachers. As part of the "Review," this appendix reports on a pilot study of new standards for assessing the quality of alternative certification programs. Background and methodology for alternative…
Report #13-P-0430, September 24, 2013. The two Region 8 program offices that jointly implement the Lead Renovation, Repair and Painting Program do not have methodology or agreement for sharing SEE funding, which has led to confusion.
MULTI-MEDIA MICROBIOLOGICAL RISK ASSESSMENT METHODOLOGY FOR MUNICIPAL WASTEWATER SLUDGES
In order to reduce the risk of municipal sludge to acceptable levels, the U.S. EPA has undertaken a regulatory program based on risk assessment and risk management. The key to such a program is the development of a methodology which allows the regulatory agency to quantify the re...
42 CFR 441.472 - Budget methodology.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 4 2010-10-01 2010-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets the...
Payload/orbiter contamination control requirement study: Spacelab configuration contamination study
NASA Technical Reports Server (NTRS)
Bareiss, L. E.; Hetrick, M. A.; Ress, E. B.; Strange, D. A.
1976-01-01
The assessment of the Spacelab carrier induced contaminant environment was continued, and the ability of Spacelab to meet established contamination control criteria for the space transportation system program was determined. The primary areas considered included: (1) updating, refining, and improving the Spacelab contamination computer model and contamination analysis methodology, (2) establishing the resulting adjusted induced environment predictions for comparison with the applicable criteria, (3) determining the Spacelab design and operational requirements necessary to meet the criteria, (4) conducting mission feasibility analyses of the combined Spacelab/Orbiter contaminant environment for specific proposed mission and payload mixes, and (5) establishing a preliminary Spacelab mission support plan as well as model interface requirements. A summary of those activities conducted to date with respect to the modelling, analysis, and predictions of the induced environment, including any modifications in approach or methodology utilized in the contamination assessment of the Spacelab carrier, was presented.
[Counseling interventions for smoking cessation: systematic review].
Alba, Luz Helena; Murillo, Raúl; Castillo, Juan Sebastián
2013-04-01
A systematic review of the efficacy and safety of smoking cessation counseling was developed. The ADAPTE methodology was used, with a search for Clinical Practice Guidelines (CPG) in Medline, EMBASE, CINAHL, LILACS, and Cochrane. DELBI was used to select CPGs scoring over 60 in methodological rigor and applicability to the Colombian health system. Smoking cessation rates at 6 months were assessed according to counseling provider, model, and format. In total, 5 CPGs out of 925 references were selected, comprising 44 systematic reviews and meta-analyses. Brief counseling by physicians and intensive counseling by trained health professionals (individual, group, proactive telephone) are effective, with abstinence rates between 2.1% and 17.4%. Only practical counseling and motivational interviewing were found to be effective intensive interventions. The clinical effect of smoking cessation counseling is low and long-term cessation rates are uncertain. Cost-effectiveness analyses are recommended for the implementation of counseling in public health programs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loulou, Richard; Waaub, Jean-Philippe; Zaccour, Georges
2005-07-01
This volume on energy and environmental modeling describes a broad variety of modeling methodologies. It includes chapters covering: The Sustainability of Economic Growth by Cabo, Martin-Herran & Martinez-Garcia; Abatement Scenarios in the Swiss Housing Sector by L. Drouet and others; Support and Planning for Off-Site Emergency Management, by Geldermann and others; Hybrid Energy-Economy Models, by Jaccard; The World-MARKAL Model and Its Application, by Kanudia and others; Methodology for Evaluating a Market of Tradable CO2-Permits, by Kunsch and Springael; MERGE - A Model for Global Climate Change, by Manne and Richels; A Linear Programming Model for Capacity Expansion in an Autonomous Power Generation System, by Mavrotas and Diakoulaki; Transport and Climate Policy Modeling in the Transport Sector, by Paltsev and others; Analysis of Ontario Electricity Capacity Requirements and Emissions, by Pineau and Schott; Environmental Damage in Energy/Environmental Policy Evaluation, by Van Regemorter. 71 figs.
NASA Astrophysics Data System (ADS)
Abdelaziz, Chebboubi; Grégoire, Kessedjian; Olivier, Serot; Sylvain, Julien-Laferriere; Christophe, Sage; Florence, Martin; Olivier, Méplan; David, Bernard; Olivier, Litaize; Aurélien, Blanc; Herbert, Faust; Paolo, Mutti; Ulli, Köster; Alain, Letourneau; Thomas, Materna; Michal, Rapala
2017-09-01
The study of fission yields has a major impact on the characterization and understanding of the fission process and is mandatory for reactor applications. In the past, with the LOHENGRIN spectrometer of the ILL, priority has been given to studies in the light fission fragment mass range. The LPSC, in collaboration with ILL and CEA, has developed a measurement program on symmetric and heavy-mass fission fragment distributions. The combination of measurements with an ionisation chamber and Ge detectors is necessary to describe precisely the heavy fission fragment region in mass and charge. Recently, new measurements of fission yields and kinetic energy distributions have been made on the 233U(nth,f) reaction. The focus of this work has been on the new optical and statistical methodology and the self-normalization of the data to provide new absolute measurements, independent of any libraries, together with the associated experimental covariance matrix.
Programming methodology for a general purpose automation controller
NASA Technical Reports Server (NTRS)
Sturzenbecker, M. C.; Korein, J. U.; Taylor, R. H.
1987-01-01
The General Purpose Automation Controller is a multi-processor architecture for automation programming. A methodology has been developed whose aim is to simplify the task of programming distributed real-time systems for users in research or manufacturing. Programs are built by configuring function blocks (low-level computations) into processes using data flow principles. These processes are activated through the verb mechanism. Verbs are divided into two classes: those which support devices, such as robot joint servos, and those which perform actions on devices, such as motion control. This programming methodology was developed in order to achieve the following goals: (1) specifications for real-time programs which are to a high degree independent of hardware considerations such as processor, bus, and interconnect technology; (2) a component approach to software, so that software required to support new devices and technologies can be integrated by reconfiguring existing building blocks; (3) resistance to error and ease of debugging; and (4) a powerful command language interface.
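A rough sketch of the configuration model this abstract describes: low-level function blocks wired into a process by data flow and activated through a "verb". The class names and single-threaded firing order are our simplification, not the GPAC implementation.

```python
class FunctionBlock:
    """A low-level computation whose inputs are other blocks' outputs."""
    def __init__(self, fn, inputs):
        self.fn, self.inputs, self.output = fn, inputs, None
    def fire(self):
        self.output = self.fn(*(blk.output for blk in self.inputs))

class Source(FunctionBlock):
    """A block that just supplies a value (e.g. a sensor reading)."""
    def __init__(self, value):
        super().__init__(None, [])
        self.output = value
    def fire(self):
        pass

def verb(blocks):
    """Activate a process: fire its blocks in data-flow order."""
    for blk in blocks:
        blk.fire()
    return blocks[-1].output

# toy 'servo' verb for a joint: error -> gain -> command
setpoint, feedback = Source(1.00), Source(0.85)
error = FunctionBlock(lambda sp, fb: sp - fb, [setpoint, feedback])
command = FunctionBlock(lambda e: 12.0 * e, [error])
print(verb([error, command]))   # proportional command to the joint, ~1.8
```

Reconfiguring the same blocks into a different process is what gives the component approach its flexibility.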
Wyrick, David L; Rulison, Kelly L; Fearnow-Kenney, Melodie; Milroy, Jeffrey J; Collins, Linda M
2014-09-01
Given current pressures to increase the public health contributions of behavioral interventions, intervention scientists may wish to consider moving beyond the classical treatment package approach that focuses primarily on achieving statistical significance. They may wish also to focus on goals directly related to optimizing public health impact. The Multiphase Optimization Strategy (MOST) is an innovative methodological framework that draws on engineering principles to achieve more potent behavioral interventions. MOST is increasingly being adopted by intervention scientists seeking a systematic framework to engineer an optimized intervention. As with any innovation, there are challenges that arise with early adoption. This article describes the solutions to several critical questions that we addressed during the first-ever iterative application of MOST. Specifically, we describe how we have applied MOST to optimize an online program (myPlaybook) for the prevention of substance use among college student-athletes. Our application of MOST can serve as a blueprint for other intervention scientists who wish to design optimized behavioral interventions. We believe using MOST is feasible and has the potential to dramatically improve program effectiveness thereby advancing the public health impact of behavioral interventions.
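The engine of a MOST optimization trial is a factorial experiment that crosses candidate intervention components so each component's main effect can be estimated. A minimal sketch with three placeholder component names (ours, not the myPlaybook components):

```python
from itertools import product

components = ["norms_module", "expectancies_module", "booster_email"]
for run, levels in enumerate(product((0, 1), repeat=len(components)), start=1):
    condition = {c: bool(level) for c, level in zip(components, levels)}
    print(f"condition {run}: {condition}")
# 2**3 = 8 experimental conditions; each component's effect is the mean
# outcome difference between its 'on' and 'off' conditions, pooled over
# the other components. Only the effective components are retained in the
# optimized intervention package.
```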
2017-11-01
The Under-body Blast Methodology (UBM) for the Test and Evaluation (T&E) program was established to provide a capability for the US Army Test and Evaluation Command to assess the vulnerability of vehicles to under-body blast. Finite element (FE) models are part of the current UBM for T&E methodology...
NASA Technical Reports Server (NTRS)
Anderson, B. H.
1983-01-01
A broad program to develop advanced, reliable, and user-oriented three-dimensional viscous design techniques for supersonic inlet systems, and to encourage their transfer into the general user community, is discussed. Features of the program include: (1) develop effective methods of computing three-dimensional flows within a zonal modeling methodology; (2) ensure reasonable agreement between said analysis and selective sets of benchmark validation data; (3) develop user orientation into said analysis; and (4) explore and develop advanced numerical methodology.
G STL: the geostatistical template library in C++
NASA Astrophysics Data System (ADS)
Remy, Nicolas; Shtuka, Arben; Levy, Bruno; Caers, Jef
2002-10-01
The development of geostatistics has been mostly accomplished by application-oriented engineers in the past 20 years. The focus on concrete applications gave birth to many algorithms and computer programs designed to address different issues, such as estimating or simulating a variable while possibly accounting for secondary information such as seismic data, or integrating geological and geometrical data. At the core of any geostatistical data integration methodology is a well-designed algorithm. Yet, despite their obvious differences, all these algorithms share many commonalities on which to build a geostatistics programming library, lest the resulting library be poorly reusable and difficult to expand. Building on this observation, we design a comprehensive, yet flexible and easily reusable library of geostatistics algorithms in C++. The recent advent of the generic programming paradigm allows us to express the commonalities of the geostatistical algorithms elegantly in computer code. Generic programming, also referred to as "programming with concepts", provides a high level of abstraction without loss of efficiency. This last point is a major gain over object-oriented programming, which often trades efficiency for abstraction. It is not enough for a numerical library to be reusable; it also has to be fast. Because generic programming is "programming with concepts", the essential step in the library design is the careful identification and thorough definition of these concepts shared by most geostatistical algorithms. Building on these definitions, a generic and expandable code can be developed. To show the advantages of such a generic library, we use G STL to build two sequential simulation programs working on two different types of grids—a surface with faults and an unstructured grid—without requiring any change to the G STL code.
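The point about "programming with concepts" can be illustrated outside C++ templates; a rough Python analog of the same idea uses typing.Protocol, with an algorithm written against a Grid concept rather than a concrete grid type. This is our illustration, not G STL code.

```python
from typing import Iterator, Protocol

class Grid(Protocol):
    """The 'concept' an algorithm needs: iterable nodes with values."""
    def nodes(self) -> Iterator[int]: ...
    def value(self, node: int) -> float: ...

def mean_value(grid: Grid) -> float:
    """Generic algorithm: depends only on the Grid concept, so it runs
    unchanged on any grid type that models it."""
    vals = [grid.value(n) for n in grid.nodes()]
    return sum(vals) / len(vals)

class UnstructuredGrid:
    def __init__(self, data):            # data: {node_id: value}
        self._data = data
    def nodes(self):
        return iter(self._data)
    def value(self, node):
        return self._data[node]

print(mean_value(UnstructuredGrid({0: 1.0, 1: 3.0, 2: 5.0})))  # 3.0
```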
GMSEC Services Suite (GSS): An Agile Development Story (Ground System Architectures Workshop)
NASA Technical Reports Server (NTRS)
Ly, Vuong
2017-01-01
The GMSEC (Goddard Mission Services Evolution Center) Services Suite (GSS) is a collection of tools and software services, along with a robust, customizable web-based portal, that enables the user to capture, monitor, report, and analyze system-wide GMSEC data. Given our plug-and-play architecture and the need for rapid system development, we opted to follow the Scrum Agile methodology for software development. As one of the first few projects to implement the Agile methodology at NASA GSFC, in this presentation we present our approaches, tools, successes, and challenges in implementing this methodology. The GMSEC architecture provides a scalable, extensible ground and flight system for existing and future missions. GMSEC comes with a robust Application Programming Interface (GMSEC API) and a core set of Java-based GMSEC components that facilitate the development of a GMSEC-based ground system. Over the past few years, we have seen an uptick in the number of customers who are moving from a native desktop application environment to a web-based environment, particularly for data monitoring and analysis. We also see a need to separate the business logic from the GUI display in our Java-based components and to consolidate all the GUI displays into one interface. This combination of separation and consolidation brings immediate value to a GMSEC-based ground system through increased ease of data access via a uniform interface, built-in security measures, centralized configuration management, and ease of feature extensibility.
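The plug-and-play quality described above comes from a publish/subscribe message-bus style, which can be sketched generically. The code below is a hypothetical illustration only, not the real GMSEC API; MessageBus, Message, subscribe, publish, and the subject strings are all invented names.

```cpp
// Generic publish/subscribe sketch of a message-bus architecture in
// the GMSEC style. All names here are invented for illustration and
// are NOT the actual GMSEC API.
#include <functional>
#include <iostream>
#include <map>
#include <string>
#include <vector>

struct Message {
  std::string subject;  // hierarchical topic, e.g. "TLM.LOG"
  std::string payload;
};

class MessageBus {
 public:
  using Handler = std::function<void(const Message&)>;
  void subscribe(const std::string& subject, Handler h) {
    handlers_[subject].push_back(std::move(h));
  }
  void publish(const Message& msg) {
    for (auto& h : handlers_[msg.subject]) h(msg);  // fan out
  }
 private:
  std::map<std::string, std::vector<Handler>> handlers_;
};

int main() {
  MessageBus bus;
  // Components plug in without the publisher knowing they exist:
  bus.subscribe("TLM.LOG", [](const Message& m) {
    std::cout << "archiver got: " << m.payload << "\n";
  });
  bus.subscribe("TLM.LOG", [](const Message& m) {
    std::cout << "web portal got: " << m.payload << "\n";
  });
  bus.publish({"TLM.LOG", "battery voltage nominal"});
}
```

The design point is that a new consumer, such as a web-based monitoring display, attaches to the bus without any change to existing publishers, which is what makes separating business logic from GUI display straightforward in this style.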
Conducting preference assessments for youth with disorders of consciousness during rehabilitation.
Amari, Adrianna; Suskauer, Stacy J; Paasch, Valerie; Grodin, Lauren K; Slomine, Beth S
2017-08-01
Care and rehabilitation for individuals with disorders of consciousness (DOC) can be challenging; the use of observational data collection, individualized treatment programs, and preferred, personally meaningful and salient items may help address these challenges during assessment and intervention. In this article, we extend the predominantly adult literature on the use of salient items to promote differential responding by describing our methodology for identifying preferred items across sensory domains for application during inpatient rehabilitation with children with DOC. We provide details on the indirect and direct preference assessment procedures, rooted in applied behavior analysis, that we have tailored for this population. We describe the steps of the procedures, including a structured caregiver interview, a staff survey, item inclusion, an in vivo single-item stimulus preference assessment, and treatment. Clinical case examples further illustrate the implementation of our methodology, observed response topographies, individually identified preferred items, and their application for 3 children in a minimally conscious state. In addition, we introduce a new structured caregiver interview, the Preference Assessment for Youth with Disorders of Consciousness (PAYDOC), modeled on the Reinforcer Assessment for Individuals with Severe Disabilities (RAISD; Fisher, Piazza, Bowman, & Amari, 1996) and modified for future use as a clinical tool to enhance the assessment of preferences in this pediatric brain injury population. This methodology can be used to identify highly idiosyncratic stimuli that can be incorporated in multiple ways throughout rehabilitation to optimize care for youth with DOC.
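As a rough illustration of how observational data from a single-item stimulus preference assessment might be tallied, the sketch below counts approach responses per stimulus across trials and orders the stimuli into a preference hierarchy. It is entirely hypothetical: the trial data, stimulus names, and percentages are invented and do not come from the article's instrument.

```cpp
// Hypothetical tally for a single-item stimulus preference assessment:
// compute percentage of trials with an approach response per stimulus,
// then rank stimuli. All data invented for illustration.
#include <algorithm>
#include <iostream>
#include <string>
#include <utility>
#include <vector>

struct Trial {
  std::string stimulus;  // item presented (e.g. "music", "photo")
  bool approach;         // was an approach/orienting response observed?
};

int main() {
  std::vector<Trial> trials = {
      {"music", true},     {"music", true},     {"music", false},
      {"photo", true},     {"photo", false},    {"photo", false},
      {"vibration", false}, {"vibration", false}, {"vibration", true},
  };
  std::vector<std::pair<std::string, double>> pct;
  for (const auto& name : {"music", "photo", "vibration"}) {
    int presented = 0, approached = 0;
    for (const auto& t : trials)
      if (t.stimulus == name) { ++presented; approached += t.approach; }
    pct.emplace_back(name, 100.0 * approached / presented);
  }
  // Sort descending: highest approach percentage = most preferred.
  std::sort(pct.begin(), pct.end(),
            [](auto& a, auto& b) { return a.second > b.second; });
  for (auto& [name, p] : pct)
    std::cout << name << ": " << p << "% approach\n";
}
```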
NASA Astrophysics Data System (ADS)
Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.
2012-08-01
The primary purpose of this paper is to review currently existing cost estimation methods, models, tools and resources applicable to the space sector. While key space sector methods are outlined, a specific focus is placed on hardware cost estimation at the system level, particularly for early mission phases during which specifications and requirements are not yet crystallised and information is limited. For the space industry, cost engineering within the systems engineering framework is an integral discipline. The cost of any space program now constitutes a stringent design criterion, which must be considered and carefully controlled during the entire program life cycle. The first step toward any program budget is a representative cost estimate, which usually hinges on a particular estimation approach, or methodology. Appropriate selection of specific cost models, methods and tools is therefore paramount, a difficult task given the highly variable nature, scope, and scientific and technical requirements of each program. Numerous methods, models and tools exist. However, new ways are needed to address very early, pre-Phase 0 cost estimation during the initial program research and establishment phase, when system specifications are limited but the available research budget needs to be established and defined. For highly specific vehicles such as reusable launchers with a manned capability, the lack of historical data means that both the classic heuristic approach, such as parametric cost estimation based on underlying cost estimating relationships (CERs), and the analogy approach are, by definition, limited. This review identifies prominent cost estimation models applied to the space sector, and their underlying cost-driving parameters and factors. Strengths, weaknesses, and suitability to specific mission types and classes are also highlighted. Current approaches which strategically amalgamate various cost estimation strategies, both for formulation and validation of an estimate, and techniques and/or methods to attain representative and justifiable cost estimates are consequently discussed. Ultimately, the aim of the paper is to establish a baseline for the development of a non-commercial, low-cost, transparent cost estimation methodology to be applied during very early program research phases at a complete vehicle system level, for largely unprecedented manned launch vehicles in the future. This paper takes the first step toward that goal through the identification, analysis and understanding of established, existing techniques, models, tools and resources relevant within the space sector.
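As a concrete example of the parametric approach mentioned above, a common textbook CER takes the power-law form C = a·M^b, with coefficients fitted to cost data from analogous historical systems. The sketch below fits such a CER by least squares in log-log space and applies it to a hypothetical new system; every mass, cost, and fitted value is invented for illustration and drawn from no real program.

```cpp
// Hedged sketch of a power-law CER, C = a * M^b: fit a and b to
// (mass, cost) pairs from analogous systems via ordinary least squares
// in log-log space, then predict a new system's cost. Data invented.
#include <cmath>
#include <iostream>
#include <vector>

int main() {
  // Hypothetical analogues: dry mass (kg) vs. development cost (M$).
  std::vector<double> mass = {900, 1500, 2300, 4100};
  std::vector<double> cost = {120, 180, 260, 410};

  // Least squares on log C = log a + b * log M.
  const int n = mass.size();
  double sx = 0, sy = 0, sxx = 0, sxy = 0;
  for (int i = 0; i < n; ++i) {
    double x = std::log(mass[i]), y = std::log(cost[i]);
    sx += x; sy += y; sxx += x * x; sxy += x * y;
  }
  double b = (n * sxy - sx * sy) / (n * sxx - sx * sx);
  double a = std::exp((sy - b * sx) / n);

  double new_mass = 3000.0;                     // kg, proposed system
  double estimate = a * std::pow(new_mass, b);  // CER prediction, M$
  std::cout << "CER: C = " << a << " * M^" << b
            << "  ->  estimate " << estimate << " M$\n";
}
```

The analogy approach differs only in that the estimate is scaled directly from one or two closely comparable systems; both break down, as the abstract notes, when no comparable historical systems exist.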
NASA Technical Reports Server (NTRS)
Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David
1990-01-01
This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the Flight Dynamics Division (FDD) environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products of the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.
Administration, Best Practices, and Evaluation of the National Weather Center REU Program
NASA Astrophysics Data System (ADS)
Zaras, D. S.; Gonzalez-Espada, W.
2005-12-01
The National Weather Center Research Experiences for Undergraduates program in Norman, Oklahoma, is a unique undergraduate career exploration experience, drawing upon the resources available in the National Weather Center's (NWC) state, federal, and university groups. This program takes full advantage of our location by including a wide variety of professionals from throughout the NWC community as mentors and contributors of lectures, workshops, tours, field trips, and job shadow experiences to expose the students to a broad spectrum of research topics and careers in meteorology. Students actively practice good research methodology by being paired with mentors who are productive researchers. The program aims to provide a strong and transformative educational experience that models the life of a scientist. This presentation will include a brief overview of program administration, analysis of applicant characteristics, "best practices" learned since 2001, and new additions to the NWC program funded through a 2-Year Extension for Special Creativity. The presentation will conclude with a brief evaluation of how well the program meets its goals of helping students clarify graduate school and career plans, and build self-efficacy regarding their potential for a career in scientific research.
STS pilot user development program
NASA Technical Reports Server (NTRS)
Mcdowell, J. R.
1977-01-01
Full exploitation of STS capabilities will depend not only on extensive use of the STS for known space applications and research, but also on new, innovative ideas of use originating with both current and new users. In recognition of this, NASA has been engaged in a User Development Program for the STS. The program began with four small studies, each addressing a separate sector of potential new users to identify techniques and methodologies for user development. The collective results established that a user development function was not only feasible but necessary for NASA to realize the full potential of the STS. This final report begins with a description of the overall pilot program plan, which involved five specific tasks defined in the contract Statement of Work. Each task is then discussed in turn, except that two subjects, the development of principal investigators and of space processing users, are treated separately for improved continuity of thought. These discussions are followed by a summary of the primary results and conclusions of the Pilot User Development Program. Specific recommendations of the study are given.
Analysis of SBIR phase I and phase II review results at the National Institutes of Health.
Vener, K J; Calkins, B M
1991-09-01
A cohort of phase I and phase II summary statements for SBIR grant applications was evaluated to determine the strengths and weaknesses in approved and disapproved applications. Outcome variables (disapproval or unfunded status) were examined with respect to exposure variables (strengths or shortcomings). Logistic regression models were developed to measure the predictive value of shortcomings and strengths for the outcomes. Disapproved phase I results were compared with an earlier 1985 study. Although the magnitude of the frequencies of shortcomings was greater in the present study, the relative rankings within shortcoming classes were more alike than different. Also, the frequencies of shortcomings were, with one exception, not significantly different in the two studies. Differences in the summary statement review may have accounted for some differences observed between the 1985 data and the results of the present study. Comparisons of approved/disapproved and approved-unfunded/funded applications yielded the following observations. For phase I applicants, the lack of a clearly stated, testable hypothesis, a poorly qualified or poorly described investigative team, and inadequate methodological approaches contributed significantly (in that order) to a rating of disapproval. A critical flaw for phase II proposals was failure to accomplish the objectives of the phase I study. Methodological issues also dominate the distinctions in both comparison groups. A clear result of the data presented here and of that published previously is that SBIR applicants need continuing assistance to improve their chances of success. These results should serve as a guide to NIH staff as they provide information to prospective applicants focusing on key elements of the application. A continuing review of the SBIR program would be helpful to evaluate the quality of the submitted science.
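For readers unfamiliar with the modeling step, the sketch below shows the general shape of such a logistic regression: predicting disapproval from a single exposure variable (here, a count of methodological shortcomings), fit by gradient descent on the negative log-likelihood. The data and coefficients are fabricated and do not reproduce the study's models or variables.

```cpp
// Hedged sketch of a one-predictor logistic regression: model
// P(disapproved) = 1 / (1 + exp(-(b0 + b1 * shortcomings))), fit by
// gradient descent. All data values are fabricated for illustration.
#include <cmath>
#include <iostream>
#include <vector>

int main() {
  // (shortcoming count, disapproved?) pairs -- invented.
  std::vector<double> x = {0, 1, 1, 2, 3, 3, 4, 5};
  std::vector<int>    y = {0, 0, 0, 1, 0, 1, 1, 1};

  double b0 = 0.0, b1 = 0.0;
  const double lr = 0.1;
  for (int epoch = 0; epoch < 5000; ++epoch) {
    double g0 = 0, g1 = 0;
    for (size_t i = 0; i < x.size(); ++i) {
      double p = 1.0 / (1.0 + std::exp(-(b0 + b1 * x[i])));
      g0 += p - y[i];           // gradient of negative log-likelihood
      g1 += (p - y[i]) * x[i];
    }
    b0 -= lr * g0 / x.size();
    b1 -= lr * g1 / x.size();
  }
  // exp(b1) is the odds ratio of disapproval per added shortcoming.
  std::cout << "intercept " << b0 << ", slope " << b1
            << ", odds ratio " << std::exp(b1) << "\n";
}
```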
Measurement control workshop instructional materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibbs, Philip; Harvel, Charles; Clark, John
2012-09-01
An essential element in an effective nuclear materials control and accountability (MC&A) program is the measurement of the nuclear material as it is received, moved, processed and shipped. Quality measurement systems and methodologies determine the accuracy of the accountability values. Implementation of a measurement control program is essential to ensure that the measurement systems and methodologies perform as expected. A measurement control program also allows for a determination of the level of confidence in the accounting values.
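A common concrete mechanism behind such a measurement control program is a control chart on repeated measurements of a reference standard. The sketch below is generic and not drawn from the workshop materials; the reference value, standard deviation, check results, and limits are all invented for illustration.

```cpp
// Hedged sketch of a basic measurement-control check: periodically
// remeasure a reference standard of known value and flag the system
// when a result exceeds warning (2-sigma) or action (3-sigma) limits
// based on its historical standard deviation. All values invented.
#include <cmath>
#include <iostream>
#include <vector>

int main() {
  const double reference = 100.00;  // certified value of the standard
  const double sigma = 0.05;        // historical measurement std. dev.
  std::vector<double> checks = {100.03, 99.98, 100.07, 100.12, 99.96};

  for (size_t i = 0; i < checks.size(); ++i) {
    double z = (checks[i] - reference) / sigma;
    const char* status = std::fabs(z) > 3 ? "ACTION: out of control"
                       : std::fabs(z) > 2 ? "WARNING"
                       : "in control";
    std::cout << "check " << i + 1 << ": z = " << z
              << " -> " << status << "\n";
  }
}
```

When a control result breaches the action limit, the measurement system is taken out of service until the cause is found, which is how the program sustains confidence in the accountability values.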
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
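A minimal sketch of the statistical core of such an assessment, assuming a simple stress-strength limit state and invented parameter distributions (none of this reproduces the PFA models or software), is a Monte Carlo propagation of parameter uncertainty to a failure probability:

```cpp
// Hedged Monte Carlo sketch of probabilistic failure assessment:
// propagate uncertainty in the parameters of a simple stress-strength
// failure model to an estimated failure probability. The limit state
// and distributions are invented; this is not the PFA software.
#include <iostream>
#include <random>

int main() {
  std::mt19937 rng(7);
  // Uncertain inputs: applied stress and material strength (MPa).
  std::normal_distribution<double> stress(300.0, 30.0);
  std::normal_distribution<double> strength(450.0, 40.0);

  const int trials = 1000000;
  int failures = 0;
  for (int i = 0; i < trials; ++i)
    if (stress(rng) >= strength(rng)) ++failures;  // limit state crossed

  std::cout << "estimated P(failure) = "
            << static_cast<double>(failures) / trials << "\n";
}
```

In the full methodology, the resulting failure probability distribution would then be updated with test and flight experience through the prescribed statistical procedures described above.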
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
ERIC Educational Resources Information Center
Drake, Timothy A.
2011-01-01
Previous work has concentrated on the epistemological foundation of comparative and international education (CIE) graduate programs. This study focuses on programmatic size, philosophy, methodology, and pedagogy. It begins by reviewing previous studies. It then provides a theoretical framework and describes the size, relevance, content, and…