Sample records for detailed process evaluation

  1. Evaluation of low-residue soldering for military and commercial applications: A report from the Low-Residue Soldering Task Force

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iman, R.L.; Anderson, D.J.; Burress, R.V.

    1995-06-01

    The LRSTF combined the efforts of industry, military, and government to evaluate low-residue soldering processes for military and commercial applications. These processes were selected for evaluation because they provide a means for the military to support the presidential mandate while producing reliable hardware at a lower cost. This report presents the complete details and results of a testing program conducted by the LRSTF to evaluate low-residue soldering for printed wiring assemblies. A previous informal document provided details of the test plan used in this evaluation; many of those details are contained in this report. The test data are too massive to include in this report; however, these data are available on disk as Excel spreadsheets upon request. The main purpose of low-residue soldering is to eliminate waste streams during the manufacturing process.

  2. Detailed seismic evaluation of bridges on and over the parkways in Western Kentucky.

    DOT National Transportation Integrated Search

    2008-06-01

    The report outlines a rating system and details an evaluation procedure for the seismic evaluation of highway bridges. These processes are later used to investigate the structural integrity of selected highway bridges on and over the parkways in West...

  3. DEFINITIVE SOX CONTROL PROCESS EVALUATIONS: LIMESTONE, DOUBLE ALKALI, AND CITRATE FGD PROCESSES

    EPA Science Inventory

    The report gives results of a detailed comparative technical and economic evaluation of limestone slurry, generic double alkali, and citrate flue gas desulfurization (FGD) processes, assuming proven technology and using representative power plant, process design, and economic pre...

  4. Human Research Program Unique Processes, Criteria, and Guidelines (UPCG). Revision C, July 28, 2011

    NASA Technical Reports Server (NTRS)

    Chin, Duane

    2011-01-01

    This document defines the processes, criteria, and guidelines exclusive to managing the Human Research Program (HRP). The intent of this document is to provide instruction to the reader in the form of processes, criteria, and guidelines. Of the three instructional categories, processes contain the most detail because of the need for a systematic series of actions directed to some end. In contrast, criteria contain less detail than processes, with the idea of creating a rule or principle structure for evaluating or testing something. Guidelines are a higher-level indication of a course of action, typically with the least amount of detail. The lack of detail in guidelines allows the reader flexibility when performing an action or actions.

  5. Evaluation of stabilization techniques for ion implant processing

    NASA Astrophysics Data System (ADS)

    Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Narcy, Mark E.; Livesay, William R.

    1999-06-01

    With the integration of high-current ion implant processing into volume CMOS manufacturing, the need for photoresist stabilization to achieve a stable ion implant process is critical. This study compares electron beam stabilization, a non-thermal process, with more traditional thermal stabilization techniques such as hot plate baking and vacuum oven processing. The electron beam processing is carried out in a flood exposure system with no active heating of the wafer. These stabilization techniques are applied to typical ion implant processes that might be found in a CMOS production process flow. The stabilization processes are applied to a 1.1-micrometer-thick PFI-38A i-line photoresist film prior to ion implant processing. Post-stabilization CD variation is detailed with respect to wall slope and feature integrity. SEM photographs detail the effects of the stabilization technique on photoresist features. The thermal stability of the photoresist is shown for different levels of stabilization and post-stabilization thermal cycling. Thermal flow stability of the photoresist is detailed via SEM photographs. A significant improvement in thermal stability is achieved with the electron beam process, such that photoresist features are stable to temperatures in excess of 200 degrees C. Ion implant processing parameters are evaluated and compared for the different stabilization methods. Ion implant system end-station chamber pressure is detailed as a function of ion implant process and stabilization condition. The ion implant process conditions are detailed for varying factors such as ion current, energy, and total dose. A reduction in the ion implant system's end-station chamber pressure is achieved with the electron beam stabilization process over the other techniques considered. This reduction in end-station chamber pressure is shown to provide a reduction in total process time for a given ion implant dose.
Improvements in the ion implant process are detailed across several combinations of current and energy.

  6. Conversion of bioprocess ethanol to industrial chemical products - Applications of process models for energy-economic assessments

    NASA Technical Reports Server (NTRS)

    Rohatgi, Naresh K.; Ingham, John D.

    1992-01-01

    An assessment approach for accurate evaluation of bioprocesses for large-scale production of industrial chemicals is presented. Detailed energy-economic assessments of a potential esterification process were performed, in which ethanol vapor in the presence of water from a bioreactor is catalytically converted to ethyl acetate. Results show that such processes are likely to become more competitive as the cost of substrates decreases relative to petroleum costs. A commercial ASPEN process simulation provided a reasonably consistent comparison with energy economics calculated using JPL-developed software. Detailed evaluations of the sensitivity of production cost to material costs and annual production rates are discussed.
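
As a loose illustration of the kind of sensitivity analysis this record describes (not the JPL software or the ASPEN model), the sketch below uses a toy levelized-cost formula to show production cost falling as substrate cost drops and annual production rate rises. Every number and parameter name is hypothetical.

```python
# Hypothetical levelized-cost sketch: unit production cost as a function of
# substrate (material) cost and annual production rate. Illustrative only.

def unit_cost(substrate_cost_per_kg, annual_rate_kg,
              capital=5_000_000.0, crf=0.12, fixed_opex=400_000.0,
              yield_kg_product_per_kg_substrate=0.5):
    """Cost per kg of product (toy model, all figures hypothetical)."""
    annualized_capital = capital * crf  # capital recovery factor spreads capital over years
    substrate = substrate_cost_per_kg * annual_rate_kg / yield_kg_product_per_kg_substrate
    return (annualized_capital + fixed_opex + substrate) / annual_rate_kg

base         = unit_cost(0.30, 1_000_000)   # reference case
cheaper_feed = unit_cost(0.15, 1_000_000)   # substrate cost halved
bigger_plant = unit_cost(0.30, 2_000_000)   # production rate doubled

# Cost falls both with cheaper substrate and with scale (capital is amortized
# over more product), matching the qualitative conclusion of the abstract.
assert cheaper_feed < base
assert bigger_plant < base
```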

  7. Testing and evaluation of light ablation decontamination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demmer, R.L.; Ferguson, R.L.

    1994-10-01

    This report details the testing and evaluation of light ablation decontamination. It details WINCO contracted research and application of light ablation efforts by Ames Laboratory. Tests were conducted with SIMCON (simulated contamination) coupons and REALCON (actual radioactive metal coupons) under controlled conditions to compare cleaning effectiveness, speed and application to plant process type equipment.

  8. The Judicial Process as a Form of Program Evaluation.

    ERIC Educational Resources Information Center

    Ellsberry, James

    1980-01-01

    Maintaining that the judicial process is particularly effective as a form of program evaluation, this article details organizational procedures and lists the following advantages for use of the judicial process: issues are investigated in an open forum, the community can participate, and exciting opportunities for teaching and learning are…

  9. Process Evaluation for a Prison-based Substance Abuse Program.

    ERIC Educational Resources Information Center

    Staton, Michele; Leukefeld, Carl; Logan, T. K.; Purvis, Rick

    2000-01-01

    Presents findings from a process evaluation conducted in a prison-based substance abuse program in Kentucky. Discusses key components in the program, including a detailed program description, modifications in planned treatment strategies, program documentation, and perspectives of staff and clients. Findings suggest that prison-based programs have…

  10. Evaluator's Guide for Word Processing Software.

    ERIC Educational Resources Information Center

    Alberta Dept. of Education, Edmonton.

    This guide provides a detailed evaluation form, together with complete instructions for using it, which is designed to elicit answers to the following questions: (1) What features and abilities does a specific word processing program have? (2) On which computer(s) will the program work? (3) Is additional hardware/software necessary before the…

  11. A New Comparison Between Conventional Indexing (MEDLARS) and Automatic Text Processing (SMART)

    ERIC Educational Resources Information Center

    Salton, G.

    1972-01-01

    A new testing process is described. The design of the test procedure is covered in detail, and the several language processing features incorporated into the SMART system are individually evaluated. (20 references) (Author)

  12. Performance and evaluation of real-time multicomputer control systems

    NASA Technical Reports Server (NTRS)

    Shin, K. G.

    1983-01-01

    New performance measures, detailed examples, modeling of error detection process, performance evaluation of rollback recovery methods, experiments on FTMP, and optimal size of an NMR cluster are discussed.

  13. Evaluation of risk and benefit in thermal effusivity sensor for monitoring lubrication process in pharmaceutical product manufacturing.

    PubMed

    Uchiyama, Jumpei; Kato, Yoshiteru; Uemoto, Yoshifumi

    2014-08-01

    In the process design of tablet manufacturing, understanding and controlling the lubrication process is important from various viewpoints. A detailed analysis of thermal effusivity data in the lubrication process was conducted in this study. In addition, we evaluated the risks and benefits of the lubrication process through a detailed investigation. It was found that monitoring thermal effusivity mainly detected the physical change in bulk density, which was altered by dispersal of the lubricant and coating of the powder particles by the lubricant. Because monitoring thermal effusivity essentially monitors bulk density, thermal effusivity could correlate strongly with tablet hardness. Moreover, since the thermal effusivity sensor could detect not only the conventional change in bulk density but also fractional changes in thermal conductivity and heat capacity, the two-phase progress of the lubrication process could be revealed. However, the individual contributions of density, thermal conductivity, and heat capacity to thermal effusivity risk fluctuating with the formulation. After carefully considering which factors are liable to change with the formulation, the thermal effusivity sensor can be a useful process analytical technology tool for monitoring, for estimating tablet hardness, and for investigating the detailed mechanism of the lubrication process.

  14. The Use of AMET & Automated Scripts for Model Evaluation

    EPA Science Inventory

    Brief overview of EPA’s new CMAQ website to be launched publicly in June 2017. Details on the upcoming release of the Atmospheric Model Evaluation Tool (AMET) and the creation of automated scripts for post-processing and evaluating air quality model data.

  15. Practical Considerations in Evaluating Patient/Consumer Health Education Programs.

    ERIC Educational Resources Information Center

    Bryant, Nancy H.

    This report contains brief descriptions of seven evaluative efforts and outcomes of health education programs, some considerations of problems encountered in evaluating the programs, and detailed descriptions of two case studies: (1) a process evaluation of preoperative teaching and (2) a retrospective study of visiting nurse association use by…

  16. Integrated payload and mission planning, phase 3. Volume 1: Integrated payload and mission planning process evaluation

    NASA Technical Reports Server (NTRS)

    Sapp, T. P.; Davin, D. E.

    1977-01-01

    The integrated payload and mission planning process for STS payloads was defined, and discrete tasks which evaluate performance and support initial implementation of this process were conducted. The scope of activity was limited to NASA and NASA-related payload missions only. The integrated payload and mission planning process was defined in detail, including all related interfaces and scheduling requirements. Related to the payload mission planning process, a methodology for assessing early Spacelab mission manager assignment schedules was defined.

  17. Post Occupancy Evaluation of Educational Buildings and Equipment.

    ERIC Educational Resources Information Center

    Watson, Chris

    1997-01-01

    Details the post occupancy evaluation (POE) process for public buildings. POEs are used to improve design and optimize educational building and equipment use. The evaluation participants, the method used, the results and recommendations, model schools, and classroom alterations using POE are described. (9 references.) (RE)

  18. Modeling relief.

    PubMed

    Sumner, Walton; Xu, Jin Zhong; Roussel, Guy; Hagen, Michael D

    2007-10-11

    The American Board of Family Medicine deployed virtual patient simulations in 2004 to evaluate Diplomates' diagnostic and management skills. A previously reported dynamic process generates general symptom histories from time series data representing baseline values and reactions to medications. The simulator also must answer queries about details such as palliation and provocation. These responses often describe some recurring pattern, such as, "this medicine relieves my symptoms in a few minutes." The simulator can provide a detail stored as text, or it can evaluate a reference to a second query object. The second query object can generate details using a single Bayesian network to evaluate the effect of each drug in a virtual patient's medication list. A new medication option may not require redesign of the second query object if its implementation is consistent with related drugs. We expect this mechanism to maintain realistic responses to detail questions in complex simulations.

  19. MEASUREMENT OF INDOOR AIR EMISSIONS FROM DRY-PROCESS PHOTOCOPY MACHINES

    EPA Science Inventory

    The article provides background information on indoor air emissions from office equipment, with emphasis on dry-process photocopy machines. The test method is described in detail along with results of a study to evaluate the test method using four dry-process photocopy machines. ...

  20. Key Features of Academic Detailing: Development of an Expert Consensus Using the Delphi Method.

    PubMed

    Yeh, James S; Van Hoof, Thomas J; Fischer, Michael A

    2016-02-01

    Academic detailing is an outreach education technique that combines the direct social marketing traditionally used by pharmaceutical representatives with unbiased content summarizing the best evidence for a given clinical issue. Academic detailing is conducted with clinicians to encourage evidence-based practice in order to improve the quality of care and patient outcomes. The adoption of academic detailing has increased substantially since the original studies in the 1980s. However, the lack of standard agreement on its implementation makes the evaluation of academic detailing outcomes challenging. To identify consensus on the key elements of academic detailing among a group of experts with varying experiences in academic detailing. This study is based on an online survey of 20 experts with experience in academic detailing. We used the Delphi process, an iterative and systematic method of developing consensus within a group. We conducted 3 rounds of online surveys, which addressed 72 individual items derived from a previous literature review of 5 features of academic detailing, including (1) content, (2) communication process, (3) clinicians targeted, (4) change agents delivering intervention, and (5) context for intervention. Nonrespondents were removed from later rounds of the surveys. For most questions, a 4-point ordinal scale was used for responses. We defined consensus agreement as 70% of respondents for a single rating category or 80% for dichotomized ratings. The overall survey response rate was 95% (54 of 57 surveys) and nearly 92% consensus agreement on the survey items (66 of 72 items) by the end of the Delphi exercise. 
The experts' responses suggested that (1) focused clinician education offering support for clinical decision-making is a key component of academic detailing, (2) detailing messages need to be tailored and provide feasible strategies and solutions to challenging cases, and (3) academic detailers need to develop specific skill sets required to overcome barriers to changing clinician behavior. Consensus derived from this Delphi exercise can serve as a useful template of general principles in academic detailing initiatives and evaluation. The study findings are limited by the lack of standard definitions of certain terms used in the Delphi process.
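
The consensus thresholds this record describes lend themselves to a small worked example. The sketch below (hypothetical code, not the authors') flags a survey item as reaching consensus if at least 70% of respondents choose a single category on the 4-point scale, or at least 80% agree after dichotomizing the scale (1-2 vs. 3-4).

```python
# Illustrative Delphi consensus check matching the thresholds in the abstract:
# >=70% on one rating category, or >=80% after dichotomizing a 4-point scale.
from collections import Counter

def has_consensus(ratings, single=0.70, dichotomized=0.80):
    """ratings: list of ints 1..4 from one Delphi survey item."""
    n = len(ratings)
    top_share = max(Counter(ratings).values()) / n  # largest single category
    if top_share >= single:
        return True
    high_share = sum(r >= 3 for r in ratings) / n   # dichotomize: 3-4 vs 1-2
    return high_share >= dichotomized or (1 - high_share) >= dichotomized

assert has_consensus([4, 4, 4, 4, 4, 4, 4, 3, 2, 1])      # 70% chose "4"
assert has_consensus([3, 4, 3, 4, 3, 4, 3, 4, 2, 1])      # 80% rated 3-4
assert not has_consensus([1, 2, 3, 4, 1, 2, 3, 4, 1, 3])  # no agreement
```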

  1. Key Features of Academic Detailing: Development of an Expert Consensus Using the Delphi Method

    PubMed Central

    Yeh, James S.; Van Hoof, Thomas J.; Fischer, Michael A.

    2016-01-01

    Background Academic detailing is an outreach education technique that combines the direct social marketing traditionally used by pharmaceutical representatives with unbiased content summarizing the best evidence for a given clinical issue. Academic detailing is conducted with clinicians to encourage evidence-based practice in order to improve the quality of care and patient outcomes. The adoption of academic detailing has increased substantially since the original studies in the 1980s. However, the lack of standard agreement on its implementation makes the evaluation of academic detailing outcomes challenging. Objective To identify consensus on the key elements of academic detailing among a group of experts with varying experiences in academic detailing. Methods This study is based on an online survey of 20 experts with experience in academic detailing. We used the Delphi process, an iterative and systematic method of developing consensus within a group. We conducted 3 rounds of online surveys, which addressed 72 individual items derived from a previous literature review of 5 features of academic detailing, including (1) content, (2) communication process, (3) clinicians targeted, (4) change agents delivering intervention, and (5) context for intervention. Nonrespondents were removed from later rounds of the surveys. For most questions, a 4-point ordinal scale was used for responses. We defined consensus agreement as 70% of respondents for a single rating category or 80% for dichotomized ratings. Results The overall survey response rate was 95% (54 of 57 surveys) and nearly 92% consensus agreement on the survey items (66 of 72 items) by the end of the Delphi exercise. 
The experts' responses suggested that (1) focused clinician education offering support for clinical decision-making is a key component of academic detailing, (2) detailing messages need to be tailored and provide feasible strategies and solutions to challenging cases, and (3) academic detailers need to develop specific skill sets required to overcome barriers to changing clinician behavior. Conclusion Consensus derived from this Delphi exercise can serve as a useful template of general principles in academic detailing initiatives and evaluation. The study findings are limited by the lack of standard definitions of certain terms used in the Delphi process. PMID:27066195

  2. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of the estimated failure rates provided quantitative data for the fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
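
The fault-tree arithmetic implied by this method can be sketched as follows; the step names, gate structure, and failure-rate estimates are hypothetical illustrations, not data from the study.

```python
# Toy fault-tree evaluation: raters' mean estimated step-failure probabilities
# feed OR/AND gates. An OR gate fails if any input fails; an AND gate only if
# all inputs fail (independence assumed). All numbers are hypothetical.
from statistics import mean

def or_gate(probs):
    """P(at least one input fails) = 1 - prod(1 - p)."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(probs):
    """P(all inputs fail), e.g. a step fails AND no workaround rescues it."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Mean of several raters' estimates for each interface step (made up).
p_select_patient = mean([0.10, 0.20, 0.15])
p_enter_dose     = mean([0.30, 0.40, 0.35])
p_no_workaround  = 0.05  # workarounds usually rescue a failed step

step_failure   = or_gate([p_select_patient, p_enter_dose])
system_failure = and_gate([step_failure, p_no_workaround])

# Frequent step failure, rare overall system failure -- the same pattern the
# abstract reports once workarounds are accounted for.
assert step_failure > max(p_select_patient, p_enter_dose)
assert system_failure < step_failure
```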

  3. Qualification of data obtained during a severe accident. Illustrative examples from TMI-2 evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rempe, Joy L.; Knudson, Darrell L.

    2015-02-01

    The accidents at the Three Mile Island Unit 2 (TMI-2) Pressurized Water Reactor (PWR) and the Daiichi Units 1, 2, and 3 Boiling Water Reactors (BWRs) provide unique opportunities to evaluate instrumentation exposed to severe accident conditions. Conditions associated with the release of coolant and the hydrogen burn that occurred during the TMI-2 accident exposed instrumentation to harsh conditions, including direct radiation, radioactive contamination, and high humidity with elevated temperatures and pressures. Post-TMI-2 instrumentation evaluation programs focused on data required by TMI-2 operators to assess the condition of the reactor and containment and the effect of mitigating actions taken by these operators. Prior efforts also focused on sensors providing data required for subsequent forensic evaluations and accident simulations. This paper provides additional details related to the formal process used to develop a qualified TMI-2 database and presents data qualification details for three parameters: reactor coolant system (RCS) pressure, containment building temperature, and containment pressure. These selected examples illustrate the types of activities completed in the TMI-2 data qualification process and the importance of such a qualification effort. These details are described to facilitate implementation of a similar process using data and examinations at the Daiichi Units 1, 2, and 3 reactors so that BWR-specific benefits can be obtained.

  4. Revisiting photon-statistics effects on multiphoton ionization

    NASA Astrophysics Data System (ADS)

    Mouloudakis, G.; Lambropoulos, P.

    2018-05-01

    We present a detailed analysis of the effects of photon statistics on multiphoton ionization. Through a detailed study of the role of intermediate states, we evaluate the conditions under which the premise of nonresonant processes is valid. The limitations of its validity are manifested in the dependence of the process on the stochastic properties of the radiation and found to be quite sensitive to the intensity. The results are quantified through detailed calculations for coherent, chaotic, and squeezed vacuum radiation. Their significance in the context of recent developments in radiation sources such as the short-wavelength free-electron laser and squeezed vacuum radiation is also discussed.

  5. Global Consultation Processes: Lessons Learned from Refugee Teacher Consultation Research in Malaysia

    ERIC Educational Resources Information Center

    O'Neal, Colleen R.; Gosnell, Nicole M.; Ng, Wai Sheng; Clement, Jennifer; Ong, Edward

    2018-01-01

    The process of global consultation has received little attention despite its potential for promoting international mutual understanding with marginalized communities. This article details theory, entry, implementation, and evaluation processes for global consultation research, including lessons learned from our refugee teacher intervention. The…

  6. Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research, Enhanced Pearson eText with Loose-Leaf Version--Access Card Package. Fifth Edition

    ERIC Educational Resources Information Center

    Creswell, John W.

    2015-01-01

    "Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research" offers a truly balanced, inclusive, and integrated overview of the processes involved in educational research. This text first examines the general steps in the research process and then details the procedures for conducting specific types…

  7. The Role of Empirical Evidence in Modeling Speech Segmentation

    ERIC Educational Resources Information Center

    Phillips, Lawrence

    2015-01-01

    Choosing specific implementational details is one of the most important aspects of creating and evaluating a model. In order to properly model cognitive processes, choices for these details must be made based on empirical research. Unfortunately, modelers are often forced to make decisions in the absence of relevant data. My work investigates the…

  8. Lunar-base construction equipment and methods evaluation

    NASA Technical Reports Server (NTRS)

    Boles, Walter W.; Ashley, David B.; Tucker, Richard L.

    1993-01-01

    A process for evaluating lunar-base construction equipment and methods concepts is presented. The process is driven by the need for more quantitative, systematic, and logical methods for assessing further research and development requirements in an area where uncertainties are high, dependence upon terrestrial heuristics is questionable, and quantitative methods are seldom applied. Decision theory concepts are used in determining the value of accurate information and the process is structured as a construction-equipment-and-methods selection methodology. Total construction-related, earth-launch mass is the measure of merit chosen for mathematical modeling purposes. The work is based upon the scope of the lunar base as described in the National Aeronautics and Space Administration's Office of Exploration's 'Exploration Studies Technical Report, FY 1989 Status'. Nine sets of conceptually designed construction equipment are selected as alternative concepts. It is concluded that the evaluation process is well suited for assisting in the establishment of research agendas in an approach that is first broad, with a low level of detail, followed by more-detailed investigations into areas that are identified as critical due to high degrees of uncertainty and sensitivity.

  9. Process economics of renewable biorefineries: butanol and ethanol production in integrated bioprocesses from lignocellulosics and other industrial by-products

    USDA-ARS?s Scientific Manuscript database

    This chapter provides process economic details on production of butanol from lignocellulosic biomass and glycerol in integrated bioreactors where numerous unit operations are combined. In order to compare various processes, economic evaluations were performed using SuperPro Designer Software (versio...

  10. Impacts Assessment of Integrated Dynamic Transit Operations: Evaluation Plan and Addendum

    DOT National Transportation Integrated Search

    2016-04-01

    This document details the process that the Volpe Center intended to follow in evaluating the impacts of the Integrated Dynamic Transit Operations (IDTO) prototype demonstration in Columbus, Ohio and Central Florida. The document also includes the add...

  11. Impact Assessment of Integrated Dynamic Transit Operations Evaluation Plan and Addendum.

    DOT National Transportation Integrated Search

    2016-01-04

    This document details the process that the Volpe Center intended to follow in evaluating the impacts of the Integrated Dynamic Transit Operations (IDTO) prototype demonstration in Columbus, Ohio and Central Florida. The document also includes the add...

  12. "Expectations to Change" (E2C): A Participatory Method for Facilitating Stakeholder Engagement with Evaluation Findings

    ERIC Educational Resources Information Center

    Adams, Adrienne E.; Nnawulezi, Nkiru A.; Vandenberg, Lela

    2015-01-01

    From a utilization-focused evaluation perspective, the success of an evaluation is rooted in the extent to which the evaluation was used by stakeholders. This paper details the "Expectations to Change" (E2C) process, an interactive, workshop-based method designed to engage primary users with their evaluation findings as a means of…

  13. A toolbox for safety instrumented system evaluation based on improved continuous-time Markov chain

    NASA Astrophysics Data System (ADS)

    Wardana, Awang N. I.; Kurniady, Rahman; Pambudi, Galih; Purnama, Jaka; Suryopratomo, Kutut

    2017-08-01

    Safety instrumented systems (SIS) are designed to restore a plant to a safe condition when a pre-hazardous event occurs. They play a vital role, especially in the process industries. A SIS shall meet its safety requirement specifications, and to confirm this, the SIS shall be evaluated. Typically, the evaluation is calculated by hand. This paper presents a toolbox for SIS evaluation, developed on the basis of an improved continuous-time Markov chain, which supports a detailed evaluation approach. This paper also illustrates an industrial application of the toolbox to evaluate the arch burner safety system of a primary reformer. The results of the case study demonstrate that the toolbox can be used to evaluate an industrial SIS in detail and to plan its maintenance strategy.
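
As a rough illustration of the continuous-time Markov chain approach such a toolbox builds on, the sketch below integrates a toy 2-state availability model; the states, rates, and numbers are hypothetical and not taken from the paper.

```python
# Toy CTMC sketch for SIS evaluation: state 0 = SIS available, state 1 =
# dangerous undetected failure. The probability of being in state 1 at time t
# approximates the probability of failure on demand (PFD). Hypothetical rates.
import math

def ctmc_transient(Q, p0, t, steps=100_000):
    """Integrate dp/dt = p @ Q with explicit Euler (fine for tiny models)."""
    p = list(p0)
    h = t / steps
    n = len(p)
    for _ in range(steps):
        p = [p[j] + h * sum(p[i] * Q[i][j] for i in range(n)) for j in range(n)]
    return p

lam = 1e-4  # dangerous undetected failure rate per hour (hypothetical)
mu = 0.0    # undetected failures are not repaired between proof tests
Q = [[-lam, lam],
     [mu, -mu]]  # generator matrix: rows sum to zero

pfd_at_test = ctmc_transient(Q, [1.0, 0.0], t=8760)[1]  # one-year interval

# With mu = 0 the model has the closed form 1 - exp(-lam * t), so the
# numerical result can be checked against it.
assert abs(pfd_at_test - (1 - math.exp(-lam * 8760))) < 1e-3
```

For realistic SIS models (degraded states, proof-test coverage, common-cause failures) the chain grows, but the same transient solve applies.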

  14. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    ERIC Educational Resources Information Center

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  15. Revitalizing Adversary Evaluation: Deep Dark Deficits or Muddled Mistaken Musings

    ERIC Educational Resources Information Center

    Thurston, Paul

    1978-01-01

    The adversary evaluation model consists of utilizing the judicial process as a metaphor for educational evaluation. In this article, previous criticism of the model is addressed and its fundamental problems are detailed. It is speculated that the model could be improved by borrowing ideas from other legal forms of inquiry. (Author/GC)

  16. Sustainability Research: Biofuels, Processes and Supply Chains

    EPA Science Inventory

    Presentation will talk about sustainability at the EPA, summarily covering high level efforts and focusing in more detail on research in metrics for liquid biofuels and tools to evaluate sustainable processes. The presentation will also briefly touch on a new area of research, t...

  17. Facilities | Transportation Research | NREL

    Science.gov Websites

    Facilities provide detailed chemical characterization, performance property measurements, and stability research. The Technology Evaluation Center, an off-network data center, provides secure management, storage, and processing.

  18. Designing Flightdeck Procedures

    NASA Technical Reports Server (NTRS)

    Barshi, Immanuel; Mauro, Robert; Degani, Asaf; Loukopoulou, Loukia

    2016-01-01

    The primary goal of this document is to provide guidance on how to design, implement, and evaluate flight deck procedures. It provides a process for developing procedures that meet clear and specific requirements. This document provides a brief overview of: 1) the requirements for procedures, 2) a process for the design of procedures, and 3) a process for the design of checklists. The brief overview is followed by amplified procedures that follow the above steps and provide details for the proper design, implementation and evaluation of good flight deck procedures and checklists.

  19. MDCT evaluation of potential living renal donor, prior to laparoscopic donor nephrectomy: What the transplant surgeon wants to know?

    PubMed Central

    Ghonge, Nitin P; Gadanayak, Satyabrat; Rajakumari, Vijaya

    2014-01-01

    As Laparoscopic Donor Nephrectomy (LDN) offers several advantages for the donor, such as less post-operative pain, fewer cosmetic concerns and faster recovery time, there is a growing global trend toward LDN compared with open nephrectomy. Comprehensive pre-LDN donor evaluation includes assessment of renal morphology, including the pelvi-calyceal and vascular systems. Apart from donor selection, evaluation of the regional anatomy allows precise surgical planning. Due to limited visualization during laparoscopic renal harvesting, detailed pre-transplant evaluation of regional anatomy, including the renal venous anatomy, is of utmost importance. MDCT is the modality of choice for pre-LDN evaluation of potential renal donors. Apart from an appropriate scan protocol and post-processing methods, a detailed understanding of surgical techniques is essential for the radiologist for accurate image interpretation during pre-LDN MDCT evaluation of potential renal donors. This review article describes MDCT evaluation of potential living renal donors prior to LDN, with emphasis on scan protocol, post-processing methods and image interpretation. The article lays special emphasis on the surgical perspectives of pre-LDN MDCT evaluation and addresses important points that transplant surgeons want to know. PMID:25489130

  20. Protocol for the process evaluation of a complex intervention designed to increase the use of research in health policy and program organisations (the SPIRIT study).

    PubMed

    Haynes, Abby; Brennan, Sue; Carter, Stacy; O'Connor, Denise; Schneider, Carmen Huckel; Turner, Tari; Gallego, Gisselle

    2014-09-27

    Process evaluation is vital for understanding how interventions function in different settings, including if and why they have different effects or do not work at all. This is particularly important in trials of complex interventions in 'real world' organisational settings where causality is difficult to determine. Complexity presents challenges for process evaluation, and process evaluations that tackle complexity are rarely reported. This paper presents the detailed protocol for a process evaluation embedded in a randomised trial of a complex intervention known as SPIRIT (Supporting Policy In health with Research: an Intervention Trial). SPIRIT aims to build capacity for using research in health policy and program agencies. We describe the flexible and pragmatic methods used for capturing, managing and analysing data across three domains: (a) the intervention as it was implemented; (b) how people participated in and responded to the intervention; and (c) the contextual characteristics that mediated this relationship and may influence outcomes. Qualitative and quantitative data collection methods include purposively sampled semi-structured interviews at two time points, direct observation and coding of intervention activities, and participant feedback forms. We provide examples of the data collection and data management tools developed. This protocol provides a worked example of how to embed process evaluation in the design and evaluation of a complex intervention trial. It tackles complexity in the intervention and its implementation settings. To our knowledge, it is the only detailed example of the methods for a process evaluation of an intervention conducted as part of a randomised trial in policy organisations. We identify strengths and weaknesses, and discuss how the methods are functioning during early implementation. 
Using 'insider' consultation to develop methods is enabling us to optimise data collection while minimising discomfort and burden for participants. Embedding the process evaluation within the trial design is facilitating access to data, but may impair participants' willingness to talk openly in interviews. While it is challenging to evaluate the process of conducting a randomised trial of a complex intervention, our experience so far suggests that it is feasible and can add considerably to the knowledge generated.

  1. Orbiter data reduction complex data processing requirements for the OFT mission evaluation team (level C)

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This document addresses requirements for post-test data reduction in support of the Orbital Flight Tests (OFT) mission evaluation team, specifically those which are planned to be implemented in the ODRC (Orbiter Data Reduction Complex). Only those requirements which have been previously baselined by the Data Systems and Analysis Directorate configuration control board are included. This document serves as the control document between Institutional Data Systems Division and the Integration Division for OFT mission evaluation data processing requirements, and shall be the basis for detailed design of ODRC data processing systems.

  2. Measuring Outcomes in Children's Services.

    ERIC Educational Resources Information Center

    Christner, Anne Marshall, Ed.

    Outcomes evaluation can provide program managers and clinical directors in child welfare, juvenile justice, child mental health, and child protective services the necessary tools for program quality assurance and accountability. This guide describes the outcomes evaluation process and provides a summary of articles and reports detailing current…

  3. Assessing Threat Detection Scenarios through Hypothesis Generation and Testing

    DTIC Science & Technology

    2015-12-01

    experience as they evaluated scenarios with varying levels of uncertainty. This research focused on understanding the interaction of experience and...details more often than irrelevant details. Those findings provide insight into the cognitive processes Soldiers with varying levels of experience ...like to thank the Soldiers and leaders who participated in this research , especially for providing their time and for sharing their experiences

  4. A Qualitative Approach to the Evaluation of Expert Systems Shells.

    ERIC Educational Resources Information Center

    Slawson, Dean A.; And Others

    This study explores an approach to the evaluation of expert system shells using case studies. The methodology and some of the results of an evaluation of the prototype development of an expert system using the shell "M1" are detailed, including a description of the participants and the project, the data collection process and materials,…

  5. Identification of Technologies for Provision of Future Aeronautical Communications

    NASA Technical Reports Server (NTRS)

    Gilbert, Tricia; Dyer, Glen; Henriksen, Steve; Berger, Jason; Jin, Jenny; Boci, Tony

    2006-01-01

    This report describes the process, findings, and recommendations of the second of three phases of the Future Communications Study (FCS) technology investigation conducted by NASA Glenn Research Center and ITT Advanced Engineering & Sciences Division for the Federal Aviation Administration (FAA). The FCS is a collaborative research effort between the FAA and Eurocontrol to address frequency congestion and spectrum depletion for safety-critical air-ground communications. The goal of the technology investigation is to identify technologies that can support the long-term aeronautical mobile communication operating concept. A derived set of evaluation criteria traceable to the operating concept document is presented. An adaptation of the analytical hierarchy process is described and recommended for selecting candidates for detailed evaluation. Evaluations of a subset of technologies brought forward from the prescreening process are provided. Five of those are identified as candidates with the highest potential for continental airspace solutions in L-band (P-34, W-CDMA, LDL, B-VHF, and E-TDMA). Additional technologies are identified as best performers in the unique environments of remote/oceanic airspace in the satellite bands (Inmarsat SBB and a custom satellite solution) and the airport flight domain in C-band (802.16e). Details of the evaluation criteria, channel models, and the technology evaluations are provided in appendixes.
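
    The analytical hierarchy process mentioned in this abstract derives criterion weights from pairwise comparisons. A minimal sketch follows, using the standard row-geometric-mean approximation to the principal eigenvector; the criteria and judgment values are hypothetical illustrations, not the FCS data:

```python
# Minimal analytic hierarchy process (AHP) sketch: derive criterion
# weights from a reciprocal pairwise comparison matrix via the row
# geometric mean (a common approximation to the principal eigenvector).
import math

def ahp_weights(matrix):
    """Return normalized weights from a reciprocal pairwise matrix."""
    gmeans = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical judgments on a Saaty 1-9 scale for three criteria A, B, C.
pairwise = [
    [1.0, 3.0, 5.0],   # A moderately preferred to B, strongly to C
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
weights = ahp_weights(pairwise)  # normalized, sums to 1
```

    Candidate technologies would then be scored on each criterion and ranked by their weighted sums.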

  6. Qualification of Daiichi Units 1, 2, and 3 Data for Severe Accident Evaluations - Process and Illustrative Examples from Prior TMI-2 Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rempe, Joy Lynn; Knudson, Darrell Lee

    2014-09-01

    The accidents at the Three Mile Island Unit 2 (TMI-2) Pressurized Water Reactor (PWR) and the Daiichi Units 1, 2, and 3 Boiling Water Reactors (BWRs) provide unique opportunities to evaluate instrumentation exposed to severe accident conditions. Conditions associated with the release of coolant and the hydrogen burn that occurred during the TMI-2 accident exposed instrumentation to harsh conditions, including direct radiation, radioactive contamination, and high humidity with elevated temperatures and pressures. As part of a program initiated in 2012 by the Department of Energy Office of Nuclear Energy (DOE-NE), a review was completed to gain insights from prior TMI-2 sensor survivability and data qualification efforts. This initial review focused on the set of sensors deemed most important by post-TMI-2 instrumentation evaluation programs. Instrumentation evaluation programs focused on data required by TMI-2 operators to assess the condition of the reactor and containment and the effect of mitigating actions taken by these operators. In addition, prior efforts focused on sensors providing data required for subsequent forensic evaluations and accident simulations. To encourage the potential for similar activities to be completed for qualifying data from Daiichi Units 1, 2, and 3, this report provides additional details related to the formal process used to develop a qualified TMI-2 data base and presents data qualification details for three parameters: primary system pressure; containment building temperature; and containment pressure. As described within this report, sensor evaluations and data qualification required implementation of various processes, including comparisons with data from other sensors, analytical calculations, laboratory testing, and comparisons with sensors subjected to similar conditions in large-scale integral tests and with sensors that were similar in design to instruments easily removed from the TMI-2 plant for evaluations. 
As documented in this report, results from qualifying data for these parameters led to key insights related to TMI-2 accident progression. Hence, these selected examples illustrate the types of activities completed in the TMI-2 data qualification process and the importance of such a qualification effort. These details are documented in this report to facilitate implementation of a similar process using data and examinations at the Daiichi Units 1, 2, and 3 reactors so that BWR-specific benefits can be obtained.

  7. Evaluation of Treatment Technologies for Wastewater from Insensitive Munitions Production. Phase 1: Technology Down-Selection

    DTIC Science & Technology

    2013-11-01

    the AOP reactor according to the target process formulation. Gases were vented to a GAC vessel. ERDC/EL TR-13-20 94 10.2.2 Results and Discussion...destructive and filtration methods such as biological treatment (destructive), chemical reduction (destructive), reverse osmosis (RO)/nanofiltration ... filtration), and advanced oxidation processes (destructive). A comprehensive evaluation of alternatives relies on a detailed list of criteria, allowing for

  8. Evaluation of massless-spring modeling of suspension-line elasticity during the parachute unfurling process

    NASA Technical Reports Server (NTRS)

    Poole, L. R.; Huckins, E. K., III

    1972-01-01

    A general theory on mathematical modeling of elastic parachute suspension lines during the unfurling process was developed. Massless-spring modeling of suspension-line elasticity was evaluated in detail. For this simple model, equations which govern the motion were developed and numerically integrated. The results were compared with flight test data. In most regions, agreement was satisfactory. However, poor agreement was obtained during periods of rapid fluctuations in line tension.
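
    A massless-spring line model of the kind evaluated here treats tension as proportional to stretch beyond the natural length and zero when the line is slack. A minimal sketch with hypothetical line parameters (not the flight-test values):

```python
# Massless-spring suspension-line sketch: tension is proportional to
# stretch beyond the natural length and zero when the line is slack.
def line_tension(length, natural_length, stiffness):
    """Tension (N) in a massless elastic line of given stiffness (N/m)."""
    stretch = length - natural_length
    return stiffness * stretch if stretch > 0 else 0.0

# Hypothetical values: 10 m natural length, k = 5000 N/m.
t_taut = line_tension(10.2, 10.0, 5000.0)   # stretched 0.2 m -> ~1000 N
t_slack = line_tension(9.8, 10.0, 5000.0)   # slack -> 0 N
```

    Because the model carries no line mass, it cannot reproduce rapid tension fluctuations, consistent with the poor agreement reported during those periods.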

  9. Evaluating Organizational Change at a Multinational Transportation Corporation: Method and Reflections

    ERIC Educational Resources Information Center

    Plakhotnik, Maria S.

    2016-01-01

    The purpose of this perspective on practice is to share my experience conducting an organizational change evaluation using qualitative methodology at a multinational transportation company Global Logistics. I provide a detailed description of the three phase approach to data analysis and my reflections on the process.

  10. CONTROL OF SULFUR EMISSIONS FROM OIL SHALE RETORTING USING SPENT SHALE ABSORPTION

    EPA Science Inventory

    The paper gives results of a detailed engineering evaluation of the potential for using an absorption on spent shale process (ASSP) for controlling sulfur emissions from oil shale plants. The evaluation analyzes the potential effectiveness and cost of absorbing SO2 on combusted s...

  11. Development and Performance Evaluation of Image-Based Robotic Waxing System for Detailing Automobiles

    PubMed Central

    Hsu, Bing-Cheng

    2018-01-01

    Waxing is an important aspect of automobile detailing, aimed at protecting the finish of the car and preventing rust. At present, this delicate work is conducted manually due to the need for iterative adjustments to achieve acceptable quality. This paper presents a robotic waxing system in which surface images are used to evaluate the quality of the finish. An RGB-D camera is used to build a point cloud that details the sheet metal components to enable path planning for a robot manipulator. The robot is equipped with a multi-axis force sensor to measure and control the forces involved in the application and buffing of wax. Images of sheet metal components that were waxed by experienced car detailers were analyzed using image processing algorithms. A Gaussian distribution function and its parameterized values were obtained from the images for use as a performance criterion in evaluating the quality of surfaces prepared by the robotic waxing system. Waxing force and dwell time were optimized using a mathematical model based on the image-based criterion used to measure waxing performance. Experimental results demonstrate the feasibility of the proposed robotic waxing system and image-based performance evaluation scheme. PMID:29757940
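
    A performance criterion of the kind described, fitting a Gaussian (mean and standard deviation) to surface-intensity samples, might be sketched as follows; the patch values are hypothetical, and this sample-moment fit is only an illustration of the idea, not the authors' algorithm:

```python
# Sketch: characterize a waxed-surface intensity sample by fitting a
# Gaussian (mean, standard deviation); the fitted parameters can then
# serve as a surface-quality criterion, as in the paper's scheme.
import math

def fit_gaussian(samples):
    """Return (mean, std) of a sample, i.e. the moment-fit Gaussian."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return mean, math.sqrt(var)

# Hypothetical pixel intensities from a small surface patch.
patch = [118, 120, 122, 119, 121, 120, 118, 122]
mu, sigma = fit_gaussian(patch)
```

    A narrow fitted distribution (small sigma) indicates a uniform finish; deviations from the reference parameters learned from expert-waxed panels would flag under- or over-waxed regions.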

  12. Development and Performance Evaluation of Image-Based Robotic Waxing System for Detailing Automobiles.

    PubMed

    Lin, Chi-Ying; Hsu, Bing-Cheng

    2018-05-14

    Waxing is an important aspect of automobile detailing, aimed at protecting the finish of the car and preventing rust. At present, this delicate work is conducted manually due to the need for iterative adjustments to achieve acceptable quality. This paper presents a robotic waxing system in which surface images are used to evaluate the quality of the finish. An RGB-D camera is used to build a point cloud that details the sheet metal components to enable path planning for a robot manipulator. The robot is equipped with a multi-axis force sensor to measure and control the forces involved in the application and buffing of wax. Images of sheet metal components that were waxed by experienced car detailers were analyzed using image processing algorithms. A Gaussian distribution function and its parameterized values were obtained from the images for use as a performance criterion in evaluating the quality of surfaces prepared by the robotic waxing system. Waxing force and dwell time were optimized using a mathematical model based on the image-based criterion used to measure waxing performance. Experimental results demonstrate the feasibility of the proposed robotic waxing system and image-based performance evaluation scheme.

  13. Understanding the Federal Proposal Review Process.

    ERIC Educational Resources Information Center

    Cavin, Janis I.

    Information on the peer review process for the evaluation of federal grant proposals is presented to help college grants administrators and faculty develop good proposals. This guidebook provides an overview of the policies and conventions that govern the review and selection of proposals for funding, and details the review procedures of the…

  14. CRT image recording evaluation

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Performance capabilities and limitations of a fiber-optic coupled line scan CRT image recording system were investigated. The test program evaluated the following components: (1) P31 phosphor CRT with EMA faceplate; (2) P31 phosphor CRT with clear clad faceplate; (3) Type 7743 semi-gloss dry process positive print paper; (4) Type 777 flat finish dry process positive print paper; (5) Type 7842 dry process positive film; and (6) Type 1971 semi-gloss wet process positive print paper. Detailed test procedures used in each test are provided, along with a description of each test, the test data, and an analysis of the results.

  15. Evaluating Web accessibility at different processing phases

    NASA Astrophysics Data System (ADS)

    Fernandes, N.; Lopes, R.; Carriço, L.

    2012-09-01

    Modern Web sites use several techniques (e.g. DOM manipulation) that allow for the injection of new content into their Web pages (e.g. AJAX), as well as manipulation of the HTML DOM tree. As a consequence, the Web pages that are presented to users (i.e. after browser processing) differ from the original structure and content transmitted through HTTP communication (i.e. before browser processing). This poses a series of challenges for Web accessibility evaluation, especially for automated evaluation software. This article details an experimental study designed to understand the differences posed by accessibility evaluation after Web browser processing. We implemented a Javascript-based evaluator, QualWeb, that can perform WCAG 2.0 based accessibility evaluations in the two phases of browser processing. Our study shows that, in fact, there are considerable differences between the HTML DOM trees in both phases, which lead to distinct evaluation results. We discuss the impact of these results in the light of the potential problems that these differences can pose to designers and developers who use accessibility evaluators that function before browser processing.
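
    The core phenomenon, that the served HTML and the browser-rendered DOM differ, can be illustrated with a toy sketch. The HTML string and the simulated script injection below are hypothetical; real evaluators such as QualWeb operate on live browser DOM trees:

```python
# Sketch of why pre- and post-processing evaluations differ: client-side
# scripts inject nodes, so the rendered DOM is larger than the HTML that
# was served. The "injection" is simulated here with a string edit.
from html.parser import HTMLParser

class TagCounter(HTMLParser):
    """Count element start tags in an HTML fragment."""
    def __init__(self):
        super().__init__()
        self.count = 0
    def handle_starttag(self, tag, attrs):
        self.count += 1

def count_tags(html):
    counter = TagCounter()
    counter.feed(html)
    return counter.count

served = "<html><body><div id='app'></div></body></html>"
# Simulated client-side injection (what AJAX/DOM scripting would add);
# note the injected img has no alt text, an issue an evaluator would
# only see after browser processing.
rendered = served.replace("</div>", "<img src='x.png'></div>")
pre, post = count_tags(served), count_tags(rendered)
```

    An evaluator run only on the served HTML never sees the injected, inaccessible image.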

  16. Online Deviation Detection for Medical Processes

    PubMed Central

    Christov, Stefan C.; Avrunin, George S.; Clarke, Lori A.

    2014-01-01

    Human errors are a major concern in many medical processes. To help address this problem, we are investigating an approach for automatically detecting when performers of a medical process deviate from the acceptable ways of performing that process as specified by a detailed process model. Such deviations could represent errors and, thus, detecting and reporting deviations as they occur could help catch errors before harm is done. In this paper, we identify important issues related to the feasibility of the proposed approach and empirically evaluate the approach for two medical procedures, chemotherapy and blood transfusion. For the evaluation, we use the process models to generate sample process executions that we then seed with synthetic errors. The process models describe the coordination of activities of different process performers in normal, as well as in exceptional situations. The evaluation results suggest that the proposed approach could be applied in clinical settings to help catch errors before harm is done. PMID:25954343
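
    The detection idea, checking each observed step of an execution against the transitions a process model allows, can be sketched minimally as follows. The step names and the dictionary-based model are hypothetical simplifications; the paper's process models are far richer and cover exceptional situations:

```python
# Sketch of online deviation detection: compare each observed step of a
# process execution against the transitions allowed by a simple process
# model (here a dict mapping a step to its permitted successors).
allowed = {
    "verify_order": {"check_patient_id"},
    "check_patient_id": {"administer"},
    "administer": set(),
}

def first_deviation(trace):
    """Return index of the first disallowed step, or None if conforming."""
    for i in range(len(trace) - 1):
        if trace[i + 1] not in allowed.get(trace[i], set()):
            return i + 1
    return None

ok = first_deviation(["verify_order", "check_patient_id", "administer"])
bad = first_deviation(["verify_order", "administer"])  # skipped ID check
```

    Reporting the deviation index as soon as it occurs is what allows an error (here, a skipped patient-ID check) to be caught before harm is done.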

  17. Shortcomings of low-cost imaging systems for viewing computed radiographs.

    PubMed

    Ricke, J; Hänninen, E L; Zielinski, C; Amthauer, H; Stroszczynski, C; Liebig, T; Wolf, M; Hosten, N

    2000-01-01

    To assess potential advantages of a new PC-based viewing tool featuring image post-processing for viewing computed radiographs on low-cost hardware (PC) with a common display card and color monitor, and to evaluate the effect of using color versus monochrome monitors. Computed radiographs of a statistical phantom were viewed on a PC, with and without post-processing (spatial frequency and contrast processing), employing a monochrome or a color monitor. Findings were compared with viewing on a radiological workstation and evaluated with ROC analysis. Image post-processing significantly improved the perception of low-contrast details irrespective of the monitor used. No significant difference in perception was observed between monochrome and color monitors. Review at the radiological workstation was superior to review using the PC with image processing. Lower-quality hardware (graphics card and monitor) used in low-cost PCs negatively affects perception of low-contrast details in computed radiographs. In this situation, it is highly recommended to use spatial frequency and contrast processing. No significant quality gain was observed for the high-end monochrome monitor compared to the color display. However, the color monitor was more strongly affected by high ambient illumination.
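
    Spatial-frequency processing of the kind recommended here is typically an unsharp mask: subtract a local average to exaggerate edges. A one-dimensional toy sketch (the signal values and 3-sample window are illustrative, not the study's parameters):

```python
# Sketch of spatial-frequency (unsharp-mask) processing: subtract a
# local average from each sample to boost edges -- the kind of
# enhancement that improved low-contrast detail perception.
def unsharp_1d(signal, amount=1.0):
    """Edge-enhance a 1-D signal with a 3-sample moving-average blur."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - 1)
        hi = min(len(signal), i + 2)
        blur = sum(signal[lo:hi]) / (hi - lo)
        out.append(signal[i] + amount * (signal[i] - blur))
    return out

# A faint edge: intensity steps from 100 to 104. The mask undershoots
# on the dark side and overshoots on the bright side, widening the step.
enhanced = unsharp_1d([100, 100, 104, 104])
```

    On a display that compresses low-contrast detail, this widened step is what makes the edge visible again.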

  18. Morphology of the external genitalia of the adult male and female mice as an endpoint of sex differentiation

    PubMed Central

    Weiss, Dana A.; Rodriguez, Esequiel; Cunha, Tristan; Menshenina, Julia; Barcellos, Dale; Chan, Lok Yun; Risbridger, Gail; Baskin, Laurence; Cunha, Gerald

    2013-01-01

    Adult external genitalia (ExG) are the endpoints of normal sex differentiation. Detailed morphometric analysis and comparison of adult mouse ExG has revealed 10 homologous features distinguishing the penis and clitoris that define masculine vs. feminine sex differentiation. These features have enabled the construction of a simple metric to evaluate various intersex conditions in mutant or hormonally manipulated mice. This review focuses on the morphology of the adult mouse penis and clitoris through detailed analysis of histologic sections, scanning electron microscopy, and three-dimensional reconstruction. We also present previous results from evaluation of “non-traditional” mammals, such as the spotted hyena and wallaby to demonstrate the complex process of sex differentiation that involves not only androgen-dependent processes, but also estrogen-dependent and hormone-independent mechanisms. PMID:21893161

  19. Computer Courseware Evaluations. A Series of Reports Compiled by the Clearinghouse Computer Technology Project.

    ERIC Educational Resources Information Center

    Alberta Dept. of Education, Edmonton.

    This report reviews Apple computer courseware in business education, library skills, mathematics, science, special education, and word processing based on the curricular requirements of Alberta, Canada. It provides detailed evaluations of 23 authorized titles in business education (2), mathematics (20), and science (1); 3 of the math titles are…

  20. Flight test and evaluation of Omega navigation in a general aviation aircraft. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Howell, J. D.; Hoffman, W. C.; Hwoschinsky, P. V.; Wischmeyer, C. E.

    1975-01-01

    Detailed documentation for each flight of the Omega Flight Evaluation study is presented, including flight test description sheets and actual flight data plots. Computer programs used for data processing and flight planning are explained and the data formats utilized by the Custom Interface Unit are summarized.

  1. The impact of robustness of deformable image registration on contour propagation and dose accumulation for head and neck adaptive radiotherapy.

    PubMed

    Zhang, Lian; Wang, Zhi; Shi, Chengyu; Long, Tengfei; Xu, X George

    2018-05-30

    Deformable image registration (DIR) is the key process for contour propagation and dose accumulation in adaptive radiation therapy (ART). However, ART currently suffers from a lack of understanding of the "robustness" of this process, i.e., how the DIR-based image contours and the subsequent dose variations are affected by the algorithm itself and by its presetting parameters. The purpose of this research is to evaluate the DIR-caused variations in contour propagation and dose accumulation during ART using the RayStation treatment planning system. Ten head and neck cancer patients were selected for retrospective studies. Contours were performed by a single radiation oncologist and new treatment plans were generated on the weekly CT scans for all patients. For each DIR process, four deformation vector fields (DVFs) were generated to propagate contours and accumulate weekly dose by the following algorithms: (a) ANACONDA with simple presetting parameters, (b) ANACONDA with detailed presetting parameters, (c) MORFEUS with simple presetting parameters, and (d) MORFEUS with detailed presetting parameters. The geometric evaluation considered the DICE coefficient and Hausdorff distance. The dosimetric evaluation included D95, Dmax, Dmean, Dmin, and the Homogeneity Index. For the geometric evaluation, the DICE coefficient variations of the GTV were found to be 0.78 ± 0.11, 0.96 ± 0.02, 0.64 ± 0.15, and 0.91 ± 0.03 for simple ANACONDA, detailed ANACONDA, simple MORFEUS, and detailed MORFEUS, respectively. For the dosimetric evaluation, the corresponding Homogeneity Index variations were found to be 0.137 ± 0.115, 0.006 ± 0.032, 0.197 ± 0.096, and 0.006 ± 0.033, respectively. Consistent geometric and dosimetric variations were also observed in both large and small organs. Overall, the results demonstrated that contour propagation and dose accumulation in clinical ART were influenced by the DIR algorithm, and to a greater extent by the presetting parameters. 
A quality assurance procedure should be established for the proper use of a commercial DIR for adaptive radiation therapy. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
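
    The DICE coefficient used for the geometric evaluation above measures overlap between two contours: 2|A∩B| / (|A| + |B|), ranging from 0 (disjoint) to 1 (identical). A minimal sketch over hypothetical voxel index sets (not the study's contours):

```python
# DICE overlap coefficient between two contours, represented here as
# sets of voxel indices: 2*|A & B| / (|A| + |B|), in [0, 1].
def dice(a, b):
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # two empty contours are trivially identical
    return 2.0 * len(a & b) / (len(a) + len(b))

# Hypothetical reference contour vs. a DIR-propagated contour that is
# shifted by one voxel.
reference = {(0, 0), (0, 1), (1, 0), (1, 1)}
propagated = {(0, 1), (1, 0), (1, 1), (2, 1)}
score = dice(reference, propagated)  # 3 shared voxels of 4 each -> 0.75
```

    A quality-assurance procedure of the kind the authors recommend would flag propagated contours whose DICE score against a reference falls below a clinically chosen threshold.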

  2. Developing a Competency-Based Curriculum for a Dental Hygiene Program.

    ERIC Educational Resources Information Center

    DeWald, Janice P.; McCann, Ann L.

    1999-01-01

    Describes the three-step process used to develop a competency-based curriculum at the Caruth School of Dental Hygiene (Texas A&M University). The process involved development of a competency document (detailing three domains, nine major competencies, and 54 supporting competencies), an evaluation plan, and a curriculum inventory which defined…

  3. Get on Board the Cost Effective Way: A Tech Prep Replication Process.

    ERIC Educational Resources Information Center

    Moore, Wayne A.; Szul, Linda F.; Rivosecchi, Karen

    1997-01-01

    The Northwestern Pennsylvania Tech Prep Consortium model for replicating tech prep programs includes these steps: fact finding, local industry analysis, curriculum development, detailed description, marketing strategies, implementation, and program evaluation. (SK)

  4. Measuring diagnoses: ICD code accuracy.

    PubMed

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-10-01

    To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Main error sources along the "patient trajectory" include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the "paper trail" include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways.

  5. Technical/commercial feasibility study of the production of fuel-grade ethanol from corn: 100-million-gallon-per-year production facility in Myrtle Grove, Louisiana

    NASA Astrophysics Data System (ADS)

    1982-05-01

    The technical and economic feasibility of producing motor fuel alcohol from corn in a 100 million gallon per year plant to be constructed in Myrtle Grove, Louisiana is evaluated. The evaluation includes a detailed process design using proven technology, a capital cost estimate for the plant, a detailed analysis of the annual operating cost, a market study, a socioeconomic, environmental, health and safety analysis, and a complete financial analysis. Several other considerations for production of ethanol were evaluated including: cogeneration and fuel to be used in firing the boilers; single by-products vs. multiple by-products; and use of boiler flue gas for by-product drying.

  6. Silicon production process evaluations

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Chemical engineering analyses involving the preliminary process design of a plant (1,000 metric tons/year capacity) to produce silicon via the technology under consideration were accomplished. Major activities in the chemical engineering analyses included base case conditions, reaction chemistry, process flowsheet, material balance, energy balance, property data, equipment design, major equipment list, and production labor, carried forward for economic analysis. The process design package provided detailed data for raw materials, utilities, major process equipment, and production labor requirements necessary for polysilicon production in each process.

  7. Assessing the detail needed to capture rainfall-runoff dynamics with physics-based hydrologic response simulation

    USGS Publications Warehouse

    Mirus, B.B.; Ebel, B.A.; Heppner, C.S.; Loague, K.

    2011-01-01

    Concept development simulation with distributed, physics-based models provides a quantitative approach for investigating runoff generation processes across environmental conditions. Disparities within data sets employed to design and parameterize boundary value problems used in heuristic simulation inevitably introduce various levels of bias. The objective was to evaluate the impact of boundary value problem complexity on process representation for different runoff generation mechanisms. The comprehensive physics-based hydrologic response model InHM has been employed to generate base case simulations for four well-characterized catchments. The C3 and CB catchments are located within steep, forested environments dominated by subsurface stormflow; the TW and R5 catchments are located in gently sloping rangeland environments dominated by Dunne and Horton overland flows. Observational details are well captured within all four of the base case simulations, but the characterization of soil depth, permeability, rainfall intensity, and evapotranspiration differs for each. These differences are investigated through the conversion of each base case into a reduced case scenario, all sharing the same level of complexity. Evaluation of how individual boundary value problem characteristics impact simulated runoff generation processes is facilitated by quantitative analysis of integrated and distributed responses at high spatial and temporal resolution. Generally, the base case reduction causes moderate changes in discharge and runoff patterns, with the dominant process remaining unchanged. Moderate differences between the base and reduced cases highlight the importance of detailed field observations for parameterizing and evaluating physics-based models. 
Overall, similarities between the base and reduced cases indicate that the simpler boundary value problems may be useful for concept development simulation to investigate fundamental controls on the spectrum of runoff generation mechanisms. Copyright 2011 by the American Geophysical Union.

  8. Cerberus to Mind: Media as Sentinel in the Fight against Terrorism

    DTIC Science & Technology

    2006-05-01

splits. First, unfiltered signals arrive directly at the amygdala. The amygdala, as the evolutionary and memory-induced warehouse of fear, makes a...and detailed evaluation. The cortically-processed sensory inputs then arrive at the amygdala (with a time detail relative to the direct inputs from...the victims themselves.[4] Terror and the Media Democratic nations must try to find ways to starve the terrorist and the hijacker of the oxygen of

  9. Pantex Falling Man - Independent Review Panel Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertolini, Louis; Brannon, Nathan; Olson, Jared

    2014-11-01

Consolidated Nuclear Security (CNS) Pantex took the initiative to organize a Review Panel of subject matter experts to independently assess the adequacy of the Pantex Tripping Man Analysis methodology. The purpose of this report is to capture the details of the assessment including the scope, approach, results, and detailed Appendices. Along with the assessment of the analysis methodology, the panel evaluated the adequacy with which the methodology was applied as well as congruence with Department of Energy (DOE) standards 3009 and 3016. The approach included the review of relevant documentation, interactive discussion with Pantex staff, and the iterative process of evaluating critical lines of inquiry.

  10. Using experimental design modules for process characterization in manufacturing/materials processes laboratories

    NASA Technical Reports Server (NTRS)

    Ankenman, Bruce; Ermer, Donald; Clum, James A.

    1994-01-01

Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses. One course was a manufacturing processes course in Mechanical Engineering and the other course was a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. Evaluation of the modules' effectiveness has been done by both survey questionnaires and inclusion of the module methodology in course examination questions. Results of the evaluation have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and for providing training in quality for improved engineering practice.

  11. Model performance evaluation (validation and calibration) in model-based studies of therapeutic interventions for cardiovascular diseases : a review and suggested reporting framework.

    PubMed

    Haji Ali Afzali, Hossein; Gray, Jodi; Karnon, Jonathan

    2013-04-01

Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross-model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55% of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6% of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. 
Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed framework should usefully inform guidelines for preparing submissions to reimbursement bodies.

  12. Capital Budgeting Guidelines: How to Decide Whether to Fund a New Dorm or an Upgraded Computer Lab.

    ERIC Educational Resources Information Center

    Swiger, John; Klaus, Allen

    1996-01-01

    A process for college and university decision making and budgeting for capital outlays that focuses on evaluating the qualitative and quantitative benefits of each proposed project is described and illustrated. The process provides a means to solicit suggestions from those involved and provide detailed information for cost-benefit analysis. (MSE)

  13. Lab Procedures. Sludge Treatment and Disposal Course #166. Instructor's Guide [and] Student Workbook.

    ERIC Educational Resources Information Center

    Carnegie, John W.

Laboratory tests used to determine status and to evaluate and/or maintain process control of the various sludge treatment processes are introduced in this lesson. Neither detailed test procedures nor explanations of how the tests should be applied to every unit are provided; this information appears in other modules. The instructor's manual…

  14. Hellsgate Big Game Winter Range Wildlife Mitigation Site Specific Management Plan for the Hellsgate Project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berger, Matthew T.; Judd, Steven L.

This report contains a detailed site-specific management plan for the Hellsgate Winter Range Wildlife Mitigation Project. The report provides background information about the mitigation process, the review process, mitigation acquisitions, Habitat Evaluation Procedures (HEP) and mitigation crediting, current habitat conditions, desired future habitat conditions, restoration/enhancement efforts and maps.

  15. Evaluating crown fire rate of spread predictions from physics-based models

    Treesearch

    C. M. Hoffman; J. Ziegler; J. Canfield; R. R. Linn; W. Mell; C. H. Sieg; F. Pimont

    2015-01-01

    Modeling the behavior of crown fires is challenging due to the complex set of coupled processes that drive the characteristics of a spreading wildfire and the large range of spatial and temporal scales over which these processes occur. Detailed physics-based modeling approaches such as FIRETEC and the Wildland Urban Interface Fire Dynamics Simulator (WFDS) simulate...

  16. Evaluation of feedback interventions for improving the quality assurance of cancer screening in Japan: study design and report of the baseline survey.

    PubMed

    Machii, Ryoko; Saika, Kumiko; Higashi, Takahiro; Aoki, Ayako; Hamashima, Chisato; Saito, Hiroshi

    2012-02-01

The importance of quality assurance in cancer screening has recently gained increasing attention in Japan. To evaluate and improve quality, checklists and process indicators have been developed. To explore effective methods of enhancing quality in cancer screening, we started a randomized controlled study of the methods of evaluation and feedback for cancer control from 2009 to 2014. We randomly assigned 1270 municipal governments, equivalent to 71% of all Japanese municipal governments that performed screening programs, into three groups. The high-intensity intervention groups (n = 425) were individually evaluated using both checklist performance and process indicator values, while the low-intensity intervention groups (n = 421) were individually evaluated on the basis of only checklist performance. The control group (n = 424) received only a basic report that included the national average of checklist performance scores. We repeated the survey for each municipality's quality assurance activity performance using checklists and process indicators. In this paper, we report our study design and the result of the baseline survey. The checklist adherence rates were especially low in the checklist elements related to invitation of individuals, detailed monitoring of process indicators such as cancer detection rates according to screening histories and appropriate selection of screening facilities. Screening rate and percentage of examinees who underwent detailed examination tended to be lower for large cities when compared with smaller cities for all cancer sites. The performance of the Japanese cancer screening program in 2009 was identified for the first time.
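The three-arm assignment described above (1270 municipalities split into groups of 425, 421, and 424) can be sketched as a seeded shuffle followed by a fixed-size partition. This is only an illustration; the municipality IDs, seed, and partition order are hypothetical and not the study's actual randomization protocol.

```python
import random

def assign_three_arms(units, sizes, seed=42):
    """Randomly partition `units` into three arms with the given sizes."""
    assert sum(sizes) == len(units)
    rng = random.Random(seed)          # fixed seed for reproducibility
    shuffled = list(units)
    rng.shuffle(shuffled)
    high = shuffled[:sizes[0]]
    low = shuffled[sizes[0]:sizes[0] + sizes[1]]
    control = shuffled[sizes[0] + sizes[1]:]
    return high, low, control

# 1270 hypothetical municipality IDs, split as in the study design
high, low, control = assign_three_arms(range(1270), (425, 421, 424))
```

Seeding the generator makes the allocation auditable, which matters when the unit of randomization is a government body rather than an individual.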

  17. Three dimensional modeling of cirrus during the 1991 FIRE IFO 2: Detailed process study

    NASA Technical Reports Server (NTRS)

    Jensen, Eric J.; Toon, Owen B.; Westphal, Douglas L.

    1993-01-01

    A three-dimensional model of cirrus cloud formation and evolution, including microphysical, dynamical, and radiative processes, was used to simulate cirrus observed in the FIRE Phase 2 Cirrus field program (13 Nov. - 7 Dec. 1991). Sulfate aerosols, solution drops, ice crystals, and water vapor are all treated as interactive elements in the model. Ice crystal size distributions are fully resolved based on calculations of homogeneous freezing of solution drops, growth by water vapor deposition, evaporation, aggregation, and vertical transport. Visible and infrared radiative fluxes, and radiative heating rates are calculated using the two-stream algorithm described by Toon et al. Wind velocities, diffusion coefficients, and temperatures were taken from the MAPS analyses and the MM4 mesoscale model simulations. Within the model, moisture is transported and converted to liquid or vapor by the microphysical processes. The simulated cloud bulk and microphysical properties are shown in detail for the Nov. 26 and Dec. 5 case studies. Comparisons with lidar, radar, and in situ data are used to determine how well the simulations reproduced the observed cirrus. The roles played by various processes in the model are described in detail. The potential modes of nucleation are evaluated, and the importance of small-scale variations in temperature and humidity are discussed. The importance of competing ice crystal growth mechanisms (water vapor deposition and aggregation) are evaluated based on model simulations. Finally, the importance of ice crystal shape for crystal growth and vertical transport of ice are discussed.

  18. Measurement fidelity of heart rate variability signal processing: The devil is in the details

    PubMed Central

    Jarrin, Denise C.; McGrath, Jennifer J.; Giovanniello, Sabrina; Poirier, Paul; Lambert, Marie

    2017-01-01

Heart rate variability (HRV) is a particularly valuable quantitative marker of the flexibility and balance of the autonomic nervous system. Significant advances in software programs to automatically derive HRV have led to its extensive use in psychophysiological research. However, there is a lack of systematic comparisons across software programs used to derive HRV indices. Further, researchers report meager details on important signal-processing decisions, making synthesis across studies challenging. The aim of the present study was to evaluate the measurement fidelity of time- and frequency-domain HRV indices derived from three predominant signal processing software programs commonly used in clinical and research settings. Triplicate ECG recordings were derived from 20 participants using identical data acquisition hardware. Among the time-domain indices, there was strong to excellent correspondence (ICCavg = 0.93) for SDNN, SDANN, SDNNi, rMSSD, and pNN50. The frequency-domain indices yielded excellent correspondence (ICCavg = 0.91) for LF, HF, and LF/HF ratio, except for VLF, which exhibited poor correspondence (ICCavg = 0.19). Stringent user decisions and technical specifications for nuanced HRV processing details are essential to ensure measurement fidelity across signal processing software programs. PMID:22820268
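The time-domain indices compared above have standard definitions over successive RR intervals. A minimal sketch of three of them (the interval values are hypothetical, and the artifact-editing and resampling choices the study shows matter so much are deliberately omitted):

```python
import math

def time_domain_hrv(rr_ms):
    """SDNN, rMSSD, and pNN50 from a list of RR intervals in milliseconds."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    # SDNN: sample standard deviation of all RR intervals
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (n - 1))
    # Successive differences drive the short-term indices
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(n - 1)]
    rmssd = math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))
    # pNN50: percentage of successive differences exceeding 50 ms
    pnn50 = 100.0 * sum(1 for d in diffs if abs(d) > 50) / len(diffs)
    return sdnn, rmssd, pnn50

sdnn, rmssd, pnn50 = time_domain_hrv([800, 810, 790, 820, 805])
```

Even with identical formulas, programs can diverge on beat editing, detrending, and interpolation, which is the study's point about documenting those decisions.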

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szoka de Valladares, M.R.; Mack, S.

The DOE Hydrogen Program needs to develop criteria as part of a systematic evaluation process for proposal identification, evaluation and selection. The H Scan component of this process provides a framework in which a project proposer can fully describe their candidate technology system and its components. The H Scan complements traditional methods of capturing cost and technical information. It consists of a special set of survey forms designed to elicit information so expert reviewers can assess the proposal relative to DOE-specified selection criteria. The Analytic Hierarchy Process (AHP) component of the decision process assembles the management-defined evaluation and selection criteria into a coherent multi-level decision construct by which projects can be evaluated in pair-wise comparisons. The AHP model will reflect management's objectives and it will assist in the ranking of individual projects based on the extent to which each contributes to management's objectives. This paper contains a detailed description of the products and activities associated with the planning and evaluation process: the objectives or criteria; the H Scan; and the Analytic Hierarchy Process (AHP).
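The AHP step described above pairs each criterion against the others in a reciprocal comparison matrix and derives priority weights from it. A minimal sketch using the common column-normalization approximation to the principal eigenvector; the 3x3 matrix is hypothetical and does not represent DOE's actual criteria:

```python
def ahp_priorities(M):
    """Approximate AHP priority vector: normalize each column of the
    pairwise-comparison matrix, then average across each row."""
    n = len(M)
    col_sums = [sum(M[i][j] for i in range(n)) for j in range(n)]
    norm = [[M[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(row) / n for row in norm]

# Hypothetical 3-criterion matrix; entry M[i][j] says how strongly
# criterion i is preferred over criterion j (reciprocal by construction).
M = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
weights = ahp_priorities(M)  # sums to 1; criterion 1 weighted highest
```

Projects are then scored against each criterion and ranked by their weight-combined totals, which is how the pairwise judgments become a single ordering.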

  20. Entrepreneurial Spirit in Strategic Planning.

    ERIC Educational Resources Information Center

    Riggs, Donald E.

    1987-01-01

    Presents a model which merges the concepts of entrepreneurship with those of strategic planning to create a library management system. Each step of the process, including needs assessment and policy formation, strategy choice and implementation, and evaluation, is described in detail. (CLB)

  1. 76 FR 10209 - Corporate Credit Unions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-24

    ... natural person credit unions (NPCUs) may wish to form new corporates. Previous corporate chartering... NCUA's standards for evaluating applications. It also included detailed timelines for processing... natural person representatives of natural person credit unions (NPCUs)--``the subscribers''--may charter a...

  2. Advanced membrane devices. Interim report for October 1996--September 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laciak, D.V.; Langsam, M.; Lewnard, J.J.

    1997-12-31

    Under this Cooperative Agreement, Air Products and Chemicals, Inc. has continued to investigate and develop improved membrane technology for removal of carbon dioxide from natural gas. The task schedule for this reporting period included a detailed assessment of the market opportunity (Chapter 2), continued development and evaluation of membranes and membrane polymers (Chapter 3) and a detailed economic analysis comparing the potential of Air Products membranes to that of established acid gas removal processes (Chapter 4).

  3. Evaluating the use of prior information under different pacing conditions on aircraft inspection performance: The use of virtual reality technology

    NASA Astrophysics Data System (ADS)

    Bowling, Shannon Raye

The aircraft maintenance industry is a complex system consisting of human and machine components; because of this, much emphasis has been placed on improving aircraft-inspection performance. One proven technique for improving inspection performance is the use of training. There are several strategies that have been implemented for training, one of which is feedforward information. The use of prior information (feedforward) is known to positively affect inspection performance. This information can consist of knowledge about defect characteristics (types, severity/criticality, and location) and the probability of occurrence. Although several studies have been conducted that demonstrate the usefulness of feedforward as a training strategy, there are certain research issues that need to be addressed. This study evaluates the effect of feedforward information in a simulated 3-dimensional environment by the use of virtual reality. A controlled study was conducted to evaluate the effectiveness of feedforward information in a simulated aircraft inspection environment. The study was conducted in two phases. The first phase evaluated the difference between general and detailed inspection at different pacing levels. The second phase evaluated the effect of feedforward information pertaining to severity, probability and location. Analyses of the results showed that subjects performed significantly better during detailed inspection than during general inspection. Pacing also had the effect of reducing performance for both general and detailed inspection. The study also found that as the level of feedforward information increases, performance also increases. In addition to evaluating performance measures, the study also evaluated process and subjective measures. It was found that process measures such as number of fixation points, fixation groups, mean fixation duration, and percent area covered were all affected by the treatment levels. 
Analyses of the subjective measures also found a correlation between the perceived usefulness of feedforward information and the actual effect on performance. The study also examined the potential of virtual reality as a training tool and analyzed the effect different calculational algorithms have on determining various process measures.

  4. Evaluating Indicators and Life Cycle Inventories for Processes in Early Stages of Technical Readiness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Eric C; Smith, Raymond; Ruiz-Mercado, Gerardo

This presentation examines different methods for analyzing manufacturing processes in the early stages of technical readiness. Before developers know much detail about their processes, it is valuable to apply various assessments to evaluate their performance. One type of assessment evaluates performance indicators to describe how closely processes approach desirable objectives. Another type of assessment determines the life cycle inventories (LCI) of inputs and outputs for processes, where for a functional unit of product, the user evaluates the resources used and the releases to the environment. These results can be compared to similar processes or combined with the LCI of other processes to examine up- and down-stream chemicals. The inventory also provides a listing of the up-stream chemicals, which permits study of the whole life cycle. Performance indicators are evaluated in this presentation with the U.S. Environmental Protection Agency's GREENSCOPE (Gauging Reaction Effectiveness for ENvironmental Sustainability with a multi-Objective Process Evaluator) methodology, which evaluates processes in four areas: Environment, Energy, Economics, and Efficiency. The method develops relative scores for indicators that allow comparisons across various technologies. In this contribution, two conversion pathways for producing cellulosic ethanol from biomass, via thermochemical and biochemical routes, are studied. The information developed from the indicators and LCI can be used to inform the process design and the potential life cycle effects of up- and down-stream chemicals.
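The "relative scores" mentioned above place each indicator on a common scale between a best-case and a worst-case bound so that dissimilar indicators become comparable. A sketch of that normalization; the bounds and indicator values here are hypothetical, not GREENSCOPE's published ones:

```python
def indicator_score(actual, worst, best):
    """Relative indicator score on a 0-100% scale via best/worst-case
    normalization. The direction of improvement is encoded by which
    bound is 'best', so lower-is-better indicators need no special case."""
    return 100.0 * (actual - worst) / (best - worst)

# Hypothetical energy-intensity indicator (MJ/kg product), lower is better:
# worst case 50, best case 0, this process achieves 20.
score = indicator_score(20.0, worst=50.0, best=0.0)  # 60.0%
```

Scoring every indicator this way is what lets the four areas (Environment, Energy, Economics, Efficiency) be compared side by side across technologies.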

  5. System Evaluation and Life-Cycle Cost Analysis of a Commercial-Scale High-Temperature Electrolysis Hydrogen Production Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwin A. Harvego; James E. O'Brien; Michael G. McKellar

    2012-11-01

Results of a system evaluation and lifecycle cost analysis are presented for a commercial-scale high-temperature electrolysis (HTE) central hydrogen production plant. The plant design relies on grid electricity to power the electrolysis process and system components, and industrial natural gas to provide process heat. The HYSYS process analysis software was used to evaluate the reference central plant design capable of producing 50,000 kg/day of hydrogen. The HYSYS software performs mass and energy balances across all components to allow optimization of the design using a detailed process flow sheet and realistic operating conditions specified by the analyst. The lifecycle cost analysis was performed using the H2A analysis methodology developed by the Department of Energy (DOE) Hydrogen Program. This methodology utilizes Microsoft Excel spreadsheet analysis tools that require detailed plant performance information (obtained from HYSYS), along with financial and cost information to calculate lifecycle costs. The results of the lifecycle analyses indicate that for a 10% internal rate of return, a large central commercial-scale hydrogen production plant can produce 50,000 kg/day of hydrogen at an average cost of $2.68/kg. When the cost of carbon sequestration is taken into account, the average cost of hydrogen production increases by $0.40/kg to $3.08/kg.
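H2A is a full discounted-cash-flow model, but the shape of a per-kilogram cost can be illustrated much more simply by annualizing capital with a capital recovery factor at the 10% rate mentioned above. All plant numbers below are hypothetical placeholders, not the study's inputs:

```python
def crf(rate, years):
    """Capital recovery factor: converts an up-front capital cost into
    an equivalent uniform annual payment over the plant lifetime."""
    f = (1 + rate) ** years
    return rate * f / (f - 1)

def levelized_cost_per_kg(capital, rate, years, annual_opex, annual_kg):
    """Simplified levelized cost: annualized capital plus annual
    operating cost, divided by annual production."""
    return (capital * crf(rate, years) + annual_opex) / annual_kg

# Hypothetical plant: 50,000 kg/day, 350 operating days/yr, 20-yr life
annual_kg = 50_000 * 350
cost = levelized_cost_per_kg(capital=6.0e8, rate=0.10, years=20,
                             annual_opex=2.5e7, annual_kg=annual_kg)
```

The real methodology layers in taxes, depreciation, feedstock price escalation, and replacement costs, which is why a spreadsheet tool is used rather than a one-line formula.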

  6. Controlled Release of Antibiotics from Biodegradable Microcapsules for Wound Infection Control.

    DTIC Science & Technology

    1982-06-18

evaporation and phase separation methods were used in formulating the microcapsules.(11) The microencapsulation process will be described in detail in a...intensity to the antibiotic content. Using both microencapsulation processes, 14C-labeled ampicillin anhydrate microcapsules were synthesized.(12)...excellent technical assistance...SETTERSTROM, TICE, LEWIS, and MEYERS TABLE 1. IN VIVO AMPICILLIN MICROCAPSULES EVALUATED MICROENCAPSULATION

  7. Assessment of Southern California environment from ERTS-1

    NASA Technical Reports Server (NTRS)

    Bowden, L. W.; Viellenave, J. H.

    1973-01-01

ERTS-1 imagery is a useful source of data for evaluation of earth resources in Southern California. The improving quality of ERTS-1 imagery and our increasing ability to enhance it have resulted in studies of a variety of phenomena in several Southern California environments. These investigations have produced several significant results of varying detail. They include the detection and identification of macro-scale tectonic and vegetational patterns, as well as detailed analysis of urban and agricultural processes. The sequential nature of ERTS-1 imagery has allowed these studies to monitor significant changes in the environment. In addition, some preliminary work has begun directed toward assessing the impact of expanding recreation, agriculture and urbanization into the fragile desert environment. Refinement of enhancement and mapping techniques and more intensive analysis of ERTS-1 imagery should lead to a greater capability to extract detailed information for more precise evaluations and more accurate monitoring of earth resources in Southern California.

  8. Pyrotechnic hazards classification and evaluation program. Phase 2, segment 3: Test plan for determining hazards associated with pyrotechnic manufacturing processes

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A comprehensive test plan for determining the hazards associated with pyrotechnic manufacturing processes is presented. The rationale for each test is based on a systematic analysis of historical accounts of accidents and a detailed study of the characteristics of each manufacturing process. The most hazardous manufacturing operations have been determined to be pressing, mixing, reaming, and filling. The hazard potential of a given situation is evaluated in terms of the probabilities of initiation, communication, and transition to detonation (ICT). The characteristics which affect the ICT probabilities include the ignition mechanisms which are present either in normal or abnormal operation, the condition and properties of the pyrotechnic material, and the configuration of the processing equipment. Analytic expressions are derived which describe the physical conditions of the system, thus permitting a variety of processes to be evaluated in terms of a small number of experiments.
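The hazard evaluation above chains three conditional probabilities. A sketch of that composition (the numerical values are hypothetical; the report's actual expressions are derived from the physical conditions of each process):

```python
def p_detonation(p_init, p_comm_given_init, p_trans_given_comm):
    """Probability that an event initiates, communicates, and then
    transitions to detonation (ICT), assuming the chain structure:
    each stage can occur only if the previous stage occurred."""
    return p_init * p_comm_given_init * p_trans_given_comm

# Hypothetical estimates for a single pressing operation
p = p_detonation(p_init=1e-3, p_comm_given_init=0.5,
                 p_trans_given_comm=0.1)  # 5e-05
```

Factoring the hazard this way is what lets a small number of experiments, each targeting one stage, cover many distinct manufacturing processes.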

  9. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  10. [The processes of manuscript evaluation and publication in Medicina Clínica. The editorial committee of Medicina Clínica].

    PubMed

    Ribera, Josep M; Cardellach, Francesc; Selva, Albert

    2005-12-01

    The decision-making process includes a series of activities undertaken in biomedical journals from the moment a manuscript is received until it is accepted or rejected. Firstly, the manuscript is evaluated by the members of the Editorial Board, who analyze both its suitability for the journal and its scientific quality. After this initial evaluation, the article is evaluated by peer reviewers, an essential process to guarantee its scientific validity. Both the Editorial Board and the peer reviewers usually use checklists which are of enormous help in this task. Once the biomedical article has been accepted, the publication process is started, which in turn includes a series of steps, beginning with technical and medical review of the article's contents and ending with the article's publication in the journal. The present article provides a detailed description of the main technical and ethical issues involved in the processes of decision-making and publication of biomedical articles.

  11. Evaluation of the flame propagation within an SI engine using flame imaging and LES

    NASA Astrophysics Data System (ADS)

    He, Chao; Kuenne, Guido; Yildar, Esra; van Oijen, Jeroen; di Mare, Francesca; Sadiki, Amsini; Ding, Carl-Philipp; Baum, Elias; Peterson, Brian; Böhm, Benjamin; Janicka, Johannes

    2017-11-01

    This work shows experiments and simulations of the fired operation of a spark ignition engine with port-fuelled injection. The test rig considered is an optically accessible single cylinder engine specifically designed at TU Darmstadt for the detailed investigation of in-cylinder processes and model validation. The engine was operated under lean conditions using iso-octane as a substitute for gasoline. Experiments have been conducted to provide a sound database of the combustion process. A planar flame imaging technique has been applied within the swirl- and tumble-planes to provide statistical information on the combustion process to complement a pressure-based comparison between simulation and experiments. This data is then analysed and used to assess the large eddy simulation performed within this work. For the simulation, the engine code KIVA has been extended by the dynamically thickened flame model combined with chemistry reduction by means of pressure dependent tabulation. Sixty cycles have been simulated to perform a statistical evaluation. Based on a detailed comparison with the experimental data, a systematic study has been conducted to obtain insight into the most crucial modelling uncertainties.

  12. JTEC (Japanese Technology Evaluation Center) Panel Report on High Temperature Superconductivity in Japan

    NASA Technical Reports Server (NTRS)

    Shelton, Duane; Gamota, George

    1989-01-01

    The Japanese regard success in R and D in high temperature superconductivity as an important national objective. The results of a detailed evaluation of the current state of Japanese high temperature superconductivity development are provided. The analysis was performed by a panel of technical experts drawn from U.S. industry and academia, and is based on reviews of the relevant literature and visits to Japanese government, academic and industrial laboratories. Detailed appraisals are presented on the following: Basic research; superconducting materials; large scale applications; processing of superconducting materials; superconducting electronics and thin films. In all cases, comparisons are made with the corresponding state-of-the-art in the United States.

  13. Anaerobic digestion of stillage fractions - estimation of the potential for energy recovery in bioethanol plants.

    PubMed

    Drosg, B; Fuchs, W; Meixner, K; Waltenberger, R; Kirchmayr, R; Braun, R; Bochmann, G

    2013-01-01

Stillage processing can require more than one third of the thermal energy demand of a dry-grind bioethanol production plant. Therefore, for every stillage fraction occurring in stillage processing the potential of energy recovery by anaerobic digestion (AD) was estimated. In the case of whole stillage up to 128% of the thermal energy demand in the process can be provided, so even an energetically self-sufficient bioethanol production process is possible. For wet cake the recovery potential of thermal energy is 57%, for thin stillage 41%, for syrup 40% and for the evaporation condensate 2.5%. Specific issues for establishing AD of stillage fractions are evaluated in detail; these are high nitrogen concentrations, digestate treatment and trace element supply. If animal feed is co-produced at the bioethanol plant and digestate fractions are to be reused as process water, sufficient digestate quality is necessary. The most interesting stillage fractions as substrates for AD are whole stillage, thin stillage and the evaporation condensate. For these fractions process details are presented.

  14. 3D Printer-Manufacturing of Complex Geometry Elements

    NASA Astrophysics Data System (ADS)

    Ciubară, A.; Burlea, Ș L.; Axinte, M.; Cimpoeșu, R.; Chicet, D. L.; Manole, V.; Burlea, G.; Cimpoeșu, N.

    2018-06-01

    In the last 5-10 years, 3D printing has advanced incredibly in all fields, with a tremendous number of applications. Plastic materials exhibit highly beneficial mechanical properties while enabling complex designs impossible to achieve using conventional manufacturing. In this article, the printing process (filling degree, time, complications and finesse of detail) of a few plastic elements with complicated geometry and fine details was analyzed and commented on. 3D printing offers many of the thermoplastics and industrial materials found in conventional manufacturing. The advantages and disadvantages of 3D printing for plastic parts are discussed. The production time for an element with complex geometry, from design to final cut, was evaluated.

  15. Evaluation of patient centered medical home practice transformation initiatives.

    PubMed

    Crabtree, Benjamin F; Chase, Sabrina M; Wise, Christopher G; Schiff, Gordon D; Schmidt, Laura A; Goyzueta, Jeanette R; Malouin, Rebecca A; Payne, Susan M C; Quinn, Michael T; Nutting, Paul A; Miller, William L; Jaén, Carlos Roberto

    2011-01-01

    The patient-centered medical home (PCMH) has become a widely cited solution to the deficiencies in primary care delivery in the United States. To achieve the magnitude of change being called for in primary care, quality improvement interventions must focus on whole-system redesign, and not just isolated parts of medical practices. Investigators participating in 9 different evaluations of Patient Centered Medical Home implementation shared experiences, methodological strategies, and evaluation challenges for evaluating primary care practice redesign. A year-long iterative process of sharing and reflecting on experiences produced consensus on 7 recommendations for future PCMH evaluations: (1) look critically at models being implemented and identify aspects requiring modification; (2) include embedded qualitative and quantitative data collection to detail the implementation process; (3) capture details concerning how different PCMH components interact with one another over time; (4) understand and describe how and why physician and staff roles do, or do not, evolve; (5) identify the effectiveness of individual PCMH components and how they are used; (6) capture how primary care practices interface with other entities such as specialists, hospitals, and referral services; and (7) measure resources required for initiating and sustaining innovations. Broad-based longitudinal, mixed-methods designs that provide for shared learning among practice participants, program implementers, and evaluators are necessary to evaluate the novelty and promise of the PCMH model. All PCMH evaluations should be as comprehensive as possible, and at a minimum should include a combination of brief observations and targeted qualitative interviews along with quantitative measures.

  16. CARETS: A prototype regional environmental information system. Volume 12: User evaluation of experimental land use maps and related products from the central Atlantic test site

    NASA Technical Reports Server (NTRS)

    Alexander, R. H. (Principal Investigator); Mcginty, H. K., III

    1975-01-01

    The author has identified the following significant results. Recommendations resulting from the CARETS evaluation reflect the need to establish a flexible and reliable system for providing more detailed raw and processed land resource information as well as the need to improve the methods of making information available to users.

  17. US-VISIT Identity Matching Algorithm Evaluation Program: ADIS Algorithm Evaluation Project Plan Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grant, C W; Lenderman, J S; Gansemer, J D

    This document is an update to the 'ADIS Algorithm Evaluation Project Plan' specified in the Statement of Work for the US-VISIT Identity Matching Algorithm Evaluation Program, as deliverable II.D.1. The original plan was delivered in August 2010. This document modifies the plan to reflect revised deliverables resulting from delays in obtaining a database refresh, and describes the revised schedule of the program deliverables. The detailed description of the processes used, the statistical analysis processes and the results of the statistical analysis will be described fully in the program deliverables. The US-VISIT Identity Matching Algorithm Evaluation Program is work performed by Lawrence Livermore National Laboratory (LLNL) under IAA HSHQVT-07-X-00002 P00004 from the Department of Homeland Security (DHS).

  18. Accurate 3d Scanning of Damaged Ancient Greek Inscriptions for Revealing Weathered Letters

    NASA Astrophysics Data System (ADS)

    Papadaki, A. I.; Agrafiotis, P.; Georgopoulos, A.; Prignitz, S.

    2015-02-01

    In this paper, two non-invasive, non-destructive alternatives to the traditional and invasive technique of squeezes are presented, together with specially developed processing methods, aiming to help epigraphists reveal and analyse weathered letters in ancient Greek inscriptions carved in masonry or marble. The resulting 3D model would serve as a detailed basis for the epigraphists to try to decipher the inscription. The data were collected using a structured light scanner. The creation of the final accurate three-dimensional model is a complicated procedure requiring a large computational cost and considerable human effort. It includes the collection of geometric data in limited space and time, the creation of the surface, noise filtering and the merging of individual surfaces. The use of structured light scanners is time consuming and requires costly hardware and software. Therefore, an alternative methodology for collecting 3D data of the inscriptions was also implemented for reasons of comparison. Hence, image sequences from varying distances were collected using a calibrated DSLR camera, aiming to reconstruct the 3D scene through SfM techniques in order to evaluate the efficiency and the level of precision and detail of the obtained reconstructed inscriptions. Problems in the acquisition process, as well as difficulties in the alignment step and mesh optimization, are also discussed. A meta-processing framework is proposed and analysed. Finally, the results of the processing and analysis and the different 3D models are critically inspected and then evaluated by a specialist in terms of accuracy, quality and detail of the model and its capability of revealing damaged and "hidden" letters.

  19. Modeling Post-Accident Vehicle Egress

    DTIC Science & Technology

    2013-01-01

    interest for military situations may involve rolled-over vehicles for which detailed movement data are not available. In the current design process...test trials. These evaluations are expensive and time-consuming, and are often performed late in the design process when it is too difficult to...alter the design if weaknesses are discovered. Yet, due to the limitations of current software tools, digital human models (DHMs) are not yet widely

  20. Metrology: Calibration and measurement processes guidelines

    NASA Technical Reports Server (NTRS)

    Castrup, Howard T.; Eicke, Woodward G.; Hayes, Jerry L.; Mark, Alexander; Martin, Robert E.; Taylor, James L.

    1994-01-01

    The guide is intended as a resource to aid engineers and systems contractors in the design, implementation, and operation of metrology, calibration, and measurement systems, and to assist NASA personnel in the uniform evaluation of such systems supplied or operated by contractors. Methodologies and techniques acceptable in fulfilling metrology quality requirements for NASA programs are outlined. The measurement process is covered from a high level through more detailed discussions of key elements within the process. Emphasis is given to the flowdown of project requirements to measurement system requirements, then through the activities that will provide measurements with defined quality. In addition, innovations and techniques for error analysis, development of statistical measurement process control, optimization of calibration recall systems, and evaluation of measurement uncertainty are presented.
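
    As a minimal sketch of the kind of uncertainty evaluation the guide discusses, independent standard uncertainty components are commonly combined in quadrature (root-sum-square) and scaled by a coverage factor. The component values below are hypothetical, not figures from the guide.

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-square combination of independent standard uncertainties."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical budget: reference standard, repeatability, resolution.
u_c = combined_standard_uncertainty([0.3, 0.4, 0.12])

# Expanded uncertainty with a conventional coverage factor k = 2.
U = 2.0 * u_c
```

    The quadrature rule assumes the components are uncorrelated; correlated components require covariance terms, which this sketch omits.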

  1. 30 CFR 582.23 - Testing Plan.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OPERATIONS IN THE OUTER... detailed Mining Plan than is obtainable under an approved Delineation Plan, to prepare feasibility studies, to carry out a pilot program to evaluate processing techniques or technology or mining equipment, or...

  2. 30 CFR 582.23 - Testing Plan.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OPERATIONS IN THE OUTER... detailed Mining Plan than is obtainable under an approved Delineation Plan, to prepare feasibility studies, to carry out a pilot program to evaluate processing techniques or technology or mining equipment, or...

  3. 30 CFR 582.23 - Testing Plan.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OPERATIONS IN THE OUTER... detailed Mining Plan than is obtainable under an approved Delineation Plan, to prepare feasibility studies, to carry out a pilot program to evaluate processing techniques or technology or mining equipment, or...

  4. A Debugger for Computational Grid Applications

    NASA Technical Reports Server (NTRS)

    Hood, Robert; Jost, Gabriele; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation gives an overview of a debugger for computational grid applications. Details are given on NAS parallel tools groups (including parallelization support tools, evaluation of various parallelization strategies, and distributed and aggregated computing), debugger dependencies, scalability, initial implementation, the process grid, and information on Globus.

  5. 75 FR 61144 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-04

    ... design includes both client-level outcomes and process evaluation components. The purpose of the outcome... assessment). Child Data Collection Tool (all children; descriptive biopsychosocial measure). Children's...--Detailed Annual Burden for All Interviews & Surveys Number of Interviews and surveys Respondent respondents...

  6. Effects of Hygrothermal Cycling on the Chemical, Thermal, and Mechanical Properties of 862/W Epoxy Resin

    NASA Technical Reports Server (NTRS)

    Miller, Sandi G.; Roberts, Gary D.; Copa, Christine C.; Bail, Justin L.; Kohlman, Lee W.; Binienda, Wieslaw K.

    2011-01-01

    The hygrothermal aging characteristics of an epoxy resin were characterized over 1 year, which included 908 temperature and humidity cycles. The epoxy resin quickly showed evidence of aging through color change and increased brittleness. The influence of aging on the material's glass transition temperature (Tg) was evaluated by Differential Scanning Calorimetry (DSC) and Dynamic Mechanical Analysis (DMA). The Tg remained relatively constant throughout the year-long cyclic aging profile. The chemical composition was monitored by Fourier Transform Infrared Spectroscopy (FTIR), where evidence of chemical aging and advancement of cure was noted. The tensile strength of the resin was tested as it aged; this property was severely affected by the aging process in the form of reduced ductility and embrittlement. Detailed chemical evaluation suggests many aging mechanisms take place during exposure to hygrothermal conditions. This paper details the influence of processes such as advancement of cure, chemical degradation, and physical aging on the chemical and physical properties of the epoxy resin.

  7. Update on the Department of Energy's 1994 plutonium vulnerability assessment for the plutonium finishing plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HERZOG, K.R.

    1999-09-01

    This report reviews the environmental, safety, and health vulnerabilities associated with the continued storage of PFP's inventory of plutonium-bearing materials and other SNM. It re-evaluates the five vulnerabilities associated with SNM storage identified at the PFP in 1994. This new evaluation took a more detailed look and applied a risk-ranking process to help focus remediation efforts.

  8. Evaluation of the environmental impact of Brownfield remediation options: comparison of two life cycle assessment-based evaluation tools.

    PubMed

    Cappuyns, Valérie; Kessen, Bram

    2012-01-01

    The choice between different options for the remediation of a contaminated site traditionally relies on economic, technical and regulatory criteria without consideration of the environmental impact of the soil remediation process itself. In the present study, the environmental impact assessment of two potential soil remediation techniques (excavation with off-site cleaning, and in situ steam extraction) was performed using two life cycle assessment (LCA)-based evaluation tools, namely the REC (risk reduction, environmental merit and cost) method and the ReCiPe method. The comparison and evaluation of the different tools used to estimate the environmental impact of Brownfield remediation was based on a case study consisting of the remediation of a former oil and fat processing plant. For the environmental impact assessment, both the REC and ReCiPe methods result in a single score for the environmental impact of the soil remediation process and allow the same conclusion to be drawn: excavation and off-site cleaning has a more pronounced environmental impact than in situ soil remediation by means of steam extraction. The ReCiPe method takes into account more impact categories, but is also more complex to work with and needs more input data. Within the routine evaluation of soil remediation alternatives, a detailed LCA evaluation will often be too time consuming and costly, and the estimation of the environmental impact with the REC method will in most cases be sufficient. The case study worked out in this paper aims to provide a basis for a more sound selection of soil remediation technologies based on a more detailed assessment of the secondary impacts of soil remediation.
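
    Both methods reduce several impact-category results to one comparable number via normalization and weighting. The following is a hedged sketch of that aggregation step only; the category names, normalization factors and weights are invented placeholders, not values from REC or ReCiPe.

```python
# Single-score aggregation: normalize each category result, weight it,
# and sum. Inputs are hypothetical illustration values.
def single_score(impacts, norm, weights):
    """Weighted sum of normalized impact-category results."""
    return sum(weights[c] * impacts[c] / norm[c] for c in impacts)

excavation       = {"climate": 120.0, "toxicity": 8.0}
steam_extraction = {"climate": 45.0,  "toxicity": 5.0}
norm    = {"climate": 100.0, "toxicity": 10.0}
weights = {"climate": 0.6,   "toxicity": 0.4}

score_excavation = single_score(excavation, norm, weights)
score_steam      = single_score(steam_extraction, norm, weights)
# The lower score indicates the smaller estimated environmental impact.
```

    With these placeholder numbers the in situ option scores lower, mirroring the direction of the study's conclusion.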

  9. NBS: Nondestructive evaluation of nonuniformities in 2219 aluminum alloy plate: Relationship to processing

    NASA Technical Reports Server (NTRS)

    Swartzendruber, L.; Boettinger, W.; Ives, L.; Coriell, S.; Ballard, D.; Laughlin, D.; Clough, R.; Biancanieilo, F.; Blau, P.; Cahn, J.

    1980-01-01

    The compositional homogeneity, microstructure, hardness, electrical conductivity and mechanical properties of 2219 aluminum alloy plates are influenced by the process variables during casting, rolling and thermomechanical treatment. The details of these relationships were investigated for correctly processed 2219 plate, as well as for deviations caused by improper quenching after solution heat treatment. Primary emphasis was placed on the reliability of eddy-current electrical conductivity and hardness as NDE tools to detect variations in mechanical properties.

  10. Effect of Processing and Subsequent Storage on Nutrition

    NASA Technical Reports Server (NTRS)

    Perchonok, Michele; Lai, Oiki Sylvia

    2008-01-01

    The objective of this research is to determine the effects of thermal processing, freeze drying, irradiation, and storage time on the nutritional content of food, to evaluate the nutritional content of the food items currently used on the International Space Station and Shuttle, and to establish the need to institute countermeasures. (This study does not seek to address the effect of processing on nutrients in detail, but rather aims to place in context the overall nutritional status at the time of consumption).

  11. Vocabulary Development and Maintenance--Descriptors. ERIC Processing Manual, Section VIII (Part 1).

    ERIC Educational Resources Information Center

    Houston, Jim, Ed.

    Comprehensive rules, guidelines, and examples are provided for use by ERIC indexers and lexicographers in developing and maintaining the "Thesaurus of ERIC Descriptors." Evaluation and decision criteria, research procedures, and inputting details for adding new Descriptors are documented. Instructions for modifying existing Thesaurus…

  12. Domestic Aluminum Resources: Dilemmas of Development. Volume II. Appendixes II-VII. Detailed Agency Comments and GAO Response.

    DTIC Science & Technology

    1980-07-17

    31 Clay/hydrochloric acid, gas-induced crystallization 32 Clay/nitric acid evaporative crystallization 32 Clay/hydrochloric acid, evaporative...ALUMINA AND ALUMINUM TECHNOLOGIES 53 Evaluation of nonbauxitic alumina production processes 54 Clay/carbo-chlorination 54 Clay/hydrochloric acid, gas...reports that the miniplant program is centered on a single process--clay/hydrochloric acid gas precipitation. The Bureau of Mines has not retreated

  13. Addressing the hidden dimension in nursing education: promoting cultural competence.

    PubMed

    Carter, Kimberly F; Xu, Yu

    2007-01-01

    The authors describe a cultural competence quality enhancement process to address the retention challenge of students who speak English as second language and international students as part of a school of nursing's continuous program quality improvement to achieve excellence. The process, strategies, outcomes, and evaluation of the training program are detailed within the given geographical, institutional, and curriculum context. Lessons and continuing challenges are also specified.

  14. Extended performance electric propulsion power processor design study. Volume 2: Technical summary

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Inouye, L. Y.; Schoenfeld, A. D.

    1977-01-01

    Electric propulsion power processor technology has progressed during the past decade to the point that it is considered ready for application. Several power processor design concepts were evaluated and compared. Emphasis was placed on a 30 cm ion thruster power processor with a beam supply power rating of 2.2 kW to 10 kW for the main propulsion power stage. Extensions in power processor performance were defined and designed in sufficient detail to determine efficiency, component weight, part count, reliability and thermal control. A detailed design was performed on a microprocessor as the thyristor power processor controller. A reliability analysis was performed to evaluate the effect of the control electronics redesign. Preliminary electrical design, mechanical design and thermal analysis were performed on a 6 kW power transformer for the beam supply. Bi-Mod mechanical, structural and thermal control configurations were evaluated for the power processor, and preliminary estimates of mechanical weight were determined.

  15. Comparative study of resist stabilization techniques for metal etch processing

    NASA Astrophysics Data System (ADS)

    Becker, Gerry; Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Livesay, William R.

    1999-06-01

    This study investigates resist stabilization techniques as applied to a metal etch application. The techniques compared are conventional deep-UV/thermal stabilization (UV bake) and electron beam stabilization. The electron beam tool used in this study, an ElectronCure system from AlliedSignal Inc., Electron Vision Group, utilizes a flood electron source and a non-thermal process. These stabilization techniques are compared with respect to a metal etch process. Two types of resist are considered for stabilization and etch: a g/i-line resist, Shipley SPR-3012, and an advanced i-line resist, Shipley SPR 955-Cm. For each of these resists, the effects of stabilization on resist features are evaluated by post-stabilization SEM analysis. Etch selectivity is evaluated in all cases by using a timed metal etch and measuring the resist remaining relative to the total metal thickness etched. Etch selectivity is presented as a function of stabilization condition, and the effects of the stabilization type on this method of selectivity measurement are analyzed. SEM analysis was also performed on the features after a complete etch process and is detailed as a function of stabilization condition. Post-etch cleaning is also an important factor affected by pre-etch resist stabilization; results of post-etch cleaning are presented for both stabilization methods. SEM inspection of the metal features after resist removal processing is also detailed.

  16. Beauty and sublime. Comment on "Move me, astonish me…" delight my eyes and brain: The Vienna Integrated Model of top-down and bottom-up processes in Art Perception (VIMAP) and corresponding affective, evaluative, and neurophysiological correlates; by Matthew Pelowski et al.

    NASA Astrophysics Data System (ADS)

    Perlovsky, Leonid

    2017-07-01

    The VIMAP model presented in this review [1] is an interesting and detailed model of the neural mechanisms of aesthetic perception. In this Comment I address one deficiency of the model: it does not address in detail the fundamental notions of the VIMAP, beauty and the sublime. In this regard VIMAP is similar to other publications on aesthetics.

  17. Evaluation of Patient Centered Medical Home Practice Transformation Initiatives

    PubMed Central

    Crabtree, Benjamin F.; Chase, Sabrina M.; Wise, Christopher G.; Schiff, Gordon D.; Schmidt, Laura A.; Goyzueta, Jeanette R.; Malouin, Rebecca A.; Payne, Susan M. C.; Quinn, Michael T.; Nutting, Paul A.; Miller, William L.; Jaén, Carlos Roberto

    2011-01-01

    Background The patient-centered medical home (PCMH) has become a widely cited solution to the deficiencies in primary care delivery in the United States. To achieve the magnitude of change being called for in primary care, quality improvement interventions must focus on whole-system redesign, and not just isolated parts of medical practices. Methods Investigators participating in 9 different evaluations of Patient Centered Medical Home implementation shared experiences, methodological strategies, and evaluation challenges for evaluating primary care practice redesign. Results A year-long iterative process of sharing and reflecting on experiences produced consensus on 7 recommendations for future PCMH evaluations: (1) look critically at models being implemented and identify aspects requiring modification; (2) include embedded qualitative and quantitative data collection to detail the implementation process; (3) capture details concerning how different PCMH components interact with one another over time; (4) understand and describe how and why physician and staff roles do, or do not, evolve; (5) identify the effectiveness of individual PCMH components and how they are used; (6) capture how primary care practices interface with other entities such as specialists, hospitals, and referral services; and (7) measure resources required for initiating and sustaining innovations. Conclusions Broad-based longitudinal, mixed-methods designs that provide for shared learning among practice participants, program implementers, and evaluators are necessary to evaluate the novelty and promise of the PCMH model. All PCMH evaluations should be as comprehensive as possible, and at a minimum should include a combination of brief observations and targeted qualitative interviews along with quantitative measures. PMID:21079525

  18. Mechanoluminescence assisting agile optimization of processing design on surgical epiphysis plates

    NASA Astrophysics Data System (ADS)

    Terasaki, Nao; Toyomasu, Takashi; Sonohata, Motoki

    2018-04-01

    We propose a novel method for agile optimization of processing design by visualization of mechanoluminescence. To demonstrate the effect of the new method, epiphysis plates were processed to form dots (diameters: 1 and 1.5 mm) and the mechanical information was evaluated. As a result, the appearance of new strain concentrations was successfully visualized on the basis of mechanoluminescence, and the complex mechanical information was intuitively understood by the surgeons acting as designers. In addition, mechanoluminescence analysis clarified that small dots do not have serious mechanical effects such as strength reduction. Such detailed mechanical information, evaluated on the basis of mechanoluminescence, was successfully applied to judging the validity of the processing design. This clearly proves the effectiveness of the new mechanoluminescence-based methodology for assisting agile optimization of processing design.

  19. A time-driven, activity-based costing methodology for determining the costs of red blood cell transfusion in patients with beta thalassaemia major.

    PubMed

    Burns, K E; Haysom, H E; Higgins, A M; Waters, N; Tahiri, R; Rushford, K; Dunstan, T; Saxby, K; Kaplan, Z; Chunilal, S; McQuilten, Z K; Wood, E M

    2018-04-10

    To describe the methodology to estimate the total cost of administration of a single unit of red blood cells (RBC) in adults with beta thalassaemia major in an Australian specialist haemoglobinopathy centre. Beta thalassaemia major is a genetic disorder of haemoglobin associated with multiple end-organ complications and typically requiring lifelong RBC transfusion therapy. New therapeutic agents are becoming available based on advances in understanding of the disorder and its consequences. Assessment of the true total cost of transfusion, incorporating both product and activity costs, is required in order to evaluate the benefits and costs of these new therapies. We describe the bottom-up, time-driven, activity-based costing methodology used to develop process maps to provide a step-by-step outline of the entire transfusion pathway. Detailed flowcharts for each process are described. Direct observations and timing of the process maps document all activities, resources, staff, equipment and consumables in detail. The analysis will include costs associated with performing these processes, including resources and consumables. Sensitivity analyses will be performed to determine the impact of different staffing levels, timings and probabilities associated with performing different tasks. Thirty-one process maps have been developed, with over 600 individual activities requiring multiple timings. These will be used for future detailed cost analyses. Detailed process maps using bottom-up, time-driven, activity-based costing for determining the cost of RBC transfusion in thalassaemia major have been developed. These could be adapted for wider use to understand and compare the costs and complexities of transfusion in other settings. © 2018 British Blood Transfusion Society.
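
    The costing logic the abstract describes can be sketched as: total cost per RBC unit = sum over process-map activities of (time spent x resource cost rate), plus consumables and the product cost. The activities, minutes, rates and dollar figures below are hypothetical placeholders, not data from the study.

```python
# Time-driven, activity-based costing sketch for one RBC transfusion.
# All numbers are invented for illustration.
activities = [
    # (activity, minutes, staff cost rate $/min, consumables $)
    ("pre-transfusion sample collection", 10, 1.2, 3.0),
    ("crossmatch in laboratory",          25, 1.5, 12.0),
    ("bedside administration",            90, 1.2, 8.0),
]

def transfusion_cost(steps, product_cost):
    """Bottom-up cost of one unit: timed activities plus product cost."""
    activity_cost = sum(minutes * rate + consumables
                        for _, minutes, rate, consumables in steps)
    return activity_cost + product_cost

total = transfusion_cost(activities, product_cost=350.0)
```

    The study's sensitivity analyses would correspond to re-running such a calculation with varied staffing rates, timings and task probabilities.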

  20. Structure-property characterization of rheocast and VADER processed IN-100 superalloy. Ph.D. Thesis. Final Report

    NASA Technical Reports Server (NTRS)

    Cheng, J. J. A.; Apelian, D.

    1985-01-01

    Two recent solidification processes have been applied to the production of the IN-100 nickel-base superalloy: rheocasting and vacuum arc double electrode remelting (VADER). A detailed microstructural examination was made of the products of these two processes, and the associated tensile strength and fatigue crack propagation (FCP) rate at elevated temperature were evaluated. In rheocasting, the processing variables evaluated include stirring speed, isothermal stirring time and volume fraction solid during isothermal stirring. VADER-processed IN-100 was purchased from Special Metals Corp., New Hartford, NY. As-cast ingots were subjected to hot isostatic pressing (HIP) and heat treatment. Both rheocast and VADER-processed materials yield fine, equiaxed spherical structures, with reduced macrosegregation in comparison to ingot materials. The rheocast structures are discussed on the basis of the Vogel-Doherty-Cantor model of dendrite arm fragmentation. The rheocast ingots evaluated were superior in yield strength to both VADER and commercially cast IN-100 alloy. Rheocast and VADER ingots may have higher crack propagation resistance than P/M processed material.

  1. Basic as well as detailed neurosonograms can be performed by offline analysis of three-dimensional fetal brain volumes.

    PubMed

    Bornstein, E; Monteagudo, A; Santos, R; Strock, I; Tsymbal, T; Lenchner, E; Timor-Tritsch, I E

    2010-07-01

    To evaluate the feasibility and the processing time of offline analysis of three-dimensional (3D) brain volumes to perform a basic, as well as a detailed, targeted, fetal neurosonogram. 3D fetal brain volumes were obtained in 103 consecutive healthy fetuses that underwent routine anatomical survey at 20-23 postmenstrual weeks. Transabdominal gray-scale and power Doppler volumes of the fetal brain were acquired by one of three experienced sonographers (an average of seven volumes per fetus). Acquisition was first attempted in the sagittal and coronal planes. When the fetal position did not enable easy and rapid access to these planes, axial acquisition at the level of the biparietal diameter was performed. Offline analysis of each volume was performed by two of the authors in a blinded manner. A systematic technique of 'volume manipulation' was used to identify a list of 25 brain dimensions/structures comprising a complete basic evaluation, intracranial biometry and a detailed targeted fetal neurosonogram. The feasibility and reproducibility of obtaining diagnostic-quality images of the different structures was evaluated, and processing times were recorded, by the two examiners. Diagnostic-quality visualization was feasible in all of the 25 structures, with an excellent visualization rate (85-100%) reported in 18 structures, a good visualization rate (69-97%) reported in five structures and a low visualization rate (38-54%) reported in two structures, by the two examiners. An average of 4.3 and 5.4 volumes were used to complete the examination by the two examiners, with a mean processing time of 7.2 and 8.8 minutes, respectively. The overall agreement rate for diagnostic visualization of the different brain structures between the two examiners was 89.9%, with a kappa coefficient of 0.5 (P < 0.001). 
In experienced hands, offline analysis of 3D brain volumes is a reproducible modality that can identify all structures necessary to complete both a basic and a detailed second-trimester fetal neurosonogram. Copyright 2010 ISUOG. Published by John Wiley & Sons, Ltd.
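
    The inter-examiner agreement reported above (89.9% raw agreement, kappa 0.5) adjusts observed agreement for agreement expected by chance. A minimal sketch of Cohen's kappa follows; the example counts are hypothetical, not the study's data.

```python
# Cohen's kappa from a 2x2 agreement table.
# table[i][j]: items examiner A rated category i and examiner B rated j.
def cohens_kappa(table):
    n = sum(sum(row) for row in table)
    # Observed agreement: proportion on the diagonal.
    p_o = sum(table[i][i] for i in range(len(table))) / n
    # Chance agreement from the marginal totals.
    row_totals = [sum(r) for r in table]
    col_totals = [sum(c) for c in zip(*table)]
    p_e = sum(row_totals[i] * col_totals[i]
              for i in range(len(table))) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical visualized / not-visualized ratings by two examiners.
k = cohens_kappa([[70, 10], [5, 15]])
```

    A kappa of 0.5, as in the study, is conventionally read as moderate agreement beyond chance.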

  2. Stabilizing Waste Materials for Landfills

    ERIC Educational Resources Information Center

    Environmental Science and Technology, 1977

    1977-01-01

    The test procedures used to evaluate the suitability of landfilled materials of varying stability and to determine the leachate from such materials are reviewed. A process for stabilizing a mixture of sulfur dioxide sludge, fly ash, and bottom ash with lime and other additives for deposition in landfills is detailed. (BT)

  3. Scintigraphic evaluation in musculoskeletal sepsis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merkel, K.D.; Fitzgerald, R.H. Jr.; Brown, M.L.

    In this article, the mechanism of technetium, gallium, and indium-labeled white blood cell localization in septic processes is detailed, and the method of interpretation of these three isotopes with relationship to musculoskeletal infection is outlined. Specific clinical application of technetium, gallium, and indium-labeled white blood cell imaging for musculoskeletal sepsis is reviewed.

  4. Collaborative Textbook Selection: A Case Study Leading to Practical and Theoretical Considerations

    ERIC Educational Resources Information Center

    Czerwionka, Lori; Gorokhovsky, Bridget

    2015-01-01

    This case study developed a collaborative approach to the selection of a Spanish language textbook. The collaborative process consisted of six steps, detailed in this article: team building, generating evaluation criteria, formulating a meaningful rubric, selecting prospective textbooks, calculating rubric results, and reflectively reviewing…

  5. Integrating Sustainability Programs into the Facilities Capital Planning Process

    ERIC Educational Resources Information Center

    Buchanan, Susan

    2011-01-01

    With detailed information about the costs and benefits of potential green investments, educational facilities can effectively evaluate which initiatives will ultimately provide the greatest results over the short and long term. Based on its overall goals, every school, college, or university will have different values and therefore different…

  6. Visual Grading and Quality of 1-0 Northern Red Oak Seedlings

    Treesearch

    S.L. Clark; S.E. Schlarbaum; Paul P. Kormanik

    2000-01-01

    Past research has used detailed measurements of various growth characteristics to determine seedling grades and quality of northern red oak nursery stock. This study evaluates the effectiveness of a visual grading process, similar to those found in commercial nursery operations, to distinguish high-quality seedlings. Northern red oak (Quercus rubra...

  7. National Geodetic Satellite Program, Part 1

    NASA Technical Reports Server (NTRS)

    Henriksen, S. W. (Editor)

    1977-01-01

    The work performed by individual contributors to the National Geodetic Satellite Program is presented. The purpose of the organization, the instruments used in obtaining the data, a description of the data itself, the theory used in processing the data, and evaluation of the results are detailed for the participating organizations.

  8. Bilingual Program Application for Continuation Proposal: Compton Unified School District.

    ERIC Educational Resources Information Center

    Compton City Schools, CA.

    This document contains the continuation proposal for the fourth grade Compton bilingual education program. A review of the third year is included with details on process evaluation, project personnel and duties, new vocabulary developed by the project for lexical references, and inservice training of teachers. Information concerning the proposed…

  9. Metal spar/superhybrid shell composite fan blades. [for application to turbofan engines]

    NASA Technical Reports Server (NTRS)

    Salemme, C. T.; Murphy, G. C.

    1979-01-01

    The use of superhybrid materials in the manufacture and testing of large fan blades is analyzed. The foreign object damage (FOD) resistance of large metal spar/superhybrid fan blades is investigated. The technical effort reported comprised: (1) preliminary blade design; (2) detailed analysis of two selected superhybrid blade designs; (3) manufacture of two process evaluation blades and destructive evaluation; and (4) manufacture and whirligig testing of six prototype superhybrid blades.

  10. Energy Transfer Processes in (Lu,Gd)AlO3:Ce

    DTIC Science & Technology

    2001-01-01

    studies on energy transfer processes in Ce-activated Lu, Y and Gd aluminum perovskite crystals that contribute to production of scintillation light in...LuAlO3, GdAlO3, cerium, scintillators, VUV spectroscopy, luminescence, time profiles, energy transfer...The yttrium aluminum perovskite...The Czochralski-grown monocrystals of LuAP:Ce were first evaluated in a garnet-free perovskite phase by Lempicki et al. in 1994. More detailed

  11. Clinical Decision Support Alert Appropriateness: A Review and Proposal for Improvement

    PubMed Central

    McCoy, Allison B.; Thomas, Eric J.; Krousel-Wood, Marie; Sittig, Dean F.

    2014-01-01

    Background Many healthcare providers are adopting clinical decision support (CDS) systems to improve patient safety and meet meaningful use requirements. Computerized alerts that prompt clinicians about drug-allergy, drug-drug, and drug-disease warnings or provide dosing guidance are most commonly implemented. Alert overrides, which occur when clinicians do not follow the guidance presented by the alert, can hinder improved patient outcomes. Methods We present a review of CDS alerts and describe a proposal to develop novel methods for evaluating and improving CDS alerts that builds upon traditional informatics approaches. Our proposal incorporates previously described models for predicting alert overrides that utilize retrospective chart review to determine which alerts are clinically relevant and which overrides are justifiable. Results Despite increasing implementations of CDS alerts, detailed evaluations rarely occur because of the extensive labor involved in manual chart reviews to determine alert and response appropriateness. Further, most studies have solely evaluated alert overrides that are appropriate or justifiable. Our proposal expands the use of web-based monitoring tools with an interactive dashboard for evaluating CDS alert and response appropriateness that incorporates the predictive models. The dashboard provides 2 views, an alert detail view and a patient detail view, to provide a full history of alerts and help put the patient's events in context. Conclusion The proposed research introduces several innovations to address the challenges and gaps in alert evaluations. This research can transform alert evaluation processes across healthcare settings, leading to improved CDS, reduced alert fatigue, and increased patient safety. PMID:24940129

  12. Reducing Physical Risk Factors in Construction Work Through a Participatory Intervention: Protocol for a Mixed-Methods Process Evaluation.

    PubMed

    Ajslev, Jeppe; Brandt, Mikkel; Møller, Jeppe Lykke; Skals, Sebastian; Vinstrup, Jonas; Jakobsen, Markus Due; Sundstrup, Emil; Madeleine, Pascal; Andersen, Lars Louis

    2016-05-26

    Previous research has shown that reducing physical workload among workers in the construction industry is complicated. In order to address this issue, we developed a process evaluation in a formative mixed-methods design, drawing on existing knowledge of the potential barriers for implementation. We present the design of a mixed-methods process evaluation of the organizational, social, and subjective practices that play roles in the intervention study, integrating technical measurements to detect excessive physical exertion measured with electromyography and accelerometers, video documentation of working tasks, and a 3-phased workshop program. The evaluation is designed in an adapted process evaluation framework, addressing recruitment, reach, fidelity, satisfaction, intervention delivery, intervention received, and context of the intervention companies. Observational studies, interviews, and questionnaires among 80 construction workers organized in 20 work gangs, as well as health and safety staff, contribute to the creation of knowledge about these phenomena. At the time of publication, the process of participant recruitment is underway. Intervention studies are challenging to conduct and evaluate in the construction industry, often because of narrow time frames and ever-changing contexts. The mixed-methods design presents opportunities for obtaining detailed knowledge of the practices intra-acting with the intervention, while offering the opportunity to customize parts of the intervention.

  13. Biomedically relevant chemical and physical properties of coal combustion products.

    PubMed Central

    Fisher, G L

    1983-01-01

    The evaluation of the potential public and occupational health hazards of developing and existing combustion processes requires a detailed understanding of the physical and chemical properties of effluents available for human and environmental exposures. These processes produce complex mixtures of gases and aerosols which may interact synergistically or antagonistically with biological systems. Because of the physicochemical complexity of the effluents, the biomedically relevant properties of these materials must be carefully assessed. Subsequent to release from combustion sources, environmental interactions further complicate assessment of the toxicity of combustion products. This report provides an overview of the biomedically relevant physical and chemical properties of coal fly ash. Coal fly ash is presented as a model complex mixture for health and safety evaluation of combustion processes. PMID:6337824

  14. Efficiency and Accuracy in Thermal Simulation of Powder Bed Fusion of Bulk Metallic Glass

    NASA Astrophysics Data System (ADS)

    Lindwall, J.; Malmelöv, A.; Lundbäck, A.; Lindgren, L.-E.

    2018-05-01

    Additive manufacturing by powder bed fusion processes can be utilized to create bulk metallic glass as the process yields considerably high cooling rates. However, there is a risk that reheated material set in layers may become devitrified, i.e., crystallize. Therefore, it is advantageous to simulate the process to fully comprehend it and design it to avoid the aforementioned risk. However, a detailed simulation is computationally demanding. It is necessary to increase the computational speed while maintaining accuracy of the computed temperature field in critical regions. The current study evaluates a few approaches based on temporal reduction to achieve this. It is found that the evaluated approaches save a lot of time and accurately predict the temperature history.

  15. Signal design study for shuttle/TDRSS Ku-band uplink

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The adequacy of the signal design approach chosen for the TDRSS/orbiter uplink was evaluated. Critical functions and/or components associated with the baseline design were identified, and design alternatives were developed for those areas considered high risk. A detailed set of RF and signal processing performance specifications for the orbiter hardware associated with the TDRSS/orbiter Ku-band uplink was analyzed. The performance of a detailed design of the PN despreader, the PSK carrier synchronization loop, and the symbol synchronizer is identified. The performance of the downlink signal was studied by means of computer simulation to obtain a realistic determination of bit error rate degradations. The three-channel PM downlink signal was detailed by means of analysis and computer simulation.

  16. A Framework for Usability Evaluation in EHR Procurement.

    PubMed

    Tyllinen, Mari; Kaipio, Johanna; Lääveri, Tinja

    2018-01-01

    Usability should be considered already by the procuring organizations when selecting future systems. In this paper, we present a framework for usability evaluation during electronic health record (EHR) system procurement. We describe the objectives of the evaluation, the procedure, selected usability attributes and the evaluation methods to measure them. We also present the emphasis usability had in the selection process. We do not elaborate on the details of the results, the application of methods or gathering of data. Instead we focus on the components of the framework to inform and give an example to other similar procurement projects.

  17. Failure mode and effects analysis: A community practice perspective.

    PubMed

    Schuller, Bradley W; Burns, Angi; Ceilley, Elizabeth A; King, Alan; LeTourneau, Joan; Markovic, Alexander; Sterkel, Lynda; Taplin, Brigid; Wanner, Jennifer; Albert, Jeffrey M

    2017-11-01

    To report our early experiences with failure mode and effects analysis (FMEA) in a community practice setting. The FMEA facilitator received extensive training at the AAPM Summer School. Early efforts focused on department education and emphasized the need for process evaluation in the context of high profile radiation therapy accidents. A multidisciplinary team was assembled with representation from each of the major department disciplines. Stereotactic radiosurgery (SRS) was identified as the most appropriate treatment technique for the first FMEA evaluation, as it is largely self-contained and has the potential to produce high impact failure modes. Process mapping was completed using breakout sessions, and then compiled into a simple electronic format. Weekly sessions were used to complete the FMEA evaluation. Risk priority number (RPN) values > 100 or severity scores of 9 or 10 were considered high risk. The overall time commitment was also tracked. The final SRS process map contained 15 major process steps and 183 subprocess steps. Splitting the process map into individual assignments was a successful strategy for our group. The process map was designed to contain enough detail such that another radiation oncology team would be able to perform our procedures. Continuous facilitator involvement helped maintain consistent scoring during FMEA. Practice changes were made responding to the highest RPN scores, and new resulting RPN scores were below our high-risk threshold. The estimated person-hour equivalent for project completion was 258 hr. This report provides important details on the initial steps we took to complete our first FMEA, providing guidance for community practices seeking to incorporate this process into their quality assurance (QA) program. 
Determining the feasibility of implementing complex QA processes into different practice settings will take on increasing significance as the field of radiation oncology transitions into the new TG-100 QA paradigm. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
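The risk scoring described above reduces to simple arithmetic: the RPN is the product of severity, occurrence, and detectability scores (each typically on a 1-10 scale), and a failure mode is treated as high risk when the RPN exceeds 100 or severity is 9 or 10. A sketch of that rule (the example failure mode and scores are hypothetical):

```python
def assess_failure_mode(severity, occurrence, detectability,
                        rpn_threshold=100, severity_threshold=9):
    """Return (rpn, high_risk) using the thresholds described above.

    RPN = severity * occurrence * detectability, each scored 1-10.
    A mode is high risk if RPN > 100 or severity is 9 or 10.
    """
    rpn = severity * occurrence * detectability
    high_risk = rpn > rpn_threshold or severity >= severity_threshold
    return rpn, high_risk

# Hypothetical SRS failure mode: wrong plan parameters transferred
rpn, high_risk = assess_failure_mode(severity=8, occurrence=3, detectability=5)
# 8 * 3 * 5 = 120 > 100, so this mode would be flagged for mitigation
```

Mitigations aim to lower occurrence or improve detectability, since severity of a given failure usually cannot be changed; rescoring after a practice change shows whether the mode has dropped below the high-risk threshold.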

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anastasia M. Gribik; Ronald E. Mizia; Harry Gatley

    This project addresses both the technical and economic feasibility of replacing industrial gas in lime kilns with synthesis gas from the gasification of hog fuel. The technical assessment includes a materials evaluation, processing equipment needs, and suitability of the heat content of the synthesis gas as a replacement for industrial gas. The economic assessment includes estimations for capital, construction, operating, maintenance, and management costs for the reference plant. To perform these assessments, detailed models of the gasification and lime kiln processes were developed using Aspen Plus. The material and energy balance outputs from the Aspen Plus model were used as inputs to both the material and economic evaluations.

  19. Planning riparian restoration in the context of tamarix control in Western North America

    USGS Publications Warehouse

    Shafroth, P.B.; Beauchamp, Vanessa B.; Briggs, M.K.; Lair, K.; Scott, M.L.; Sher, A.A.

    2008-01-01

    Throughout the world, the condition of many riparian ecosystems has declined due to numerous factors, including encroachment of non-native species. In the western United States, millions of dollars are spent annually to control invasions of Tamarix spp., introduced small trees or shrubs from Eurasia that have colonized bottomland ecosystems along many rivers. Resource managers seek to control Tamarix in attempts to meet various objectives, such as increasing water yield and improving wildlife habitat. Often, riparian restoration is an implicit goal, but there has been little emphasis on a process or principles to effectively plan restoration activities, and many Tamarix removal projects are unsuccessful at restoring native vegetation. We propose and summarize the key steps in a planning process aimed at developing effective restoration projects in Tamarix-dominated areas. We discuss in greater detail the biotic and abiotic factors central to the evaluation of potential restoration sites and summarize information about plant communities likely to replace Tamarix under various conditions. Although many projects begin with implementation, which includes the actual removal of Tamarix, we stress the importance of pre-project planning that includes: (1) clearly identifying project goals; (2) developing realistic project objectives based on a detailed evaluation of site conditions; (3) prioritizing and selecting Tamarix control sites with the best chance of ecological recovery; and (4) developing a detailed tactical plan before Tamarix is removed. After removal, monitoring and maintenance as part of an adaptive management approach are crucial for evaluating project success and determining the most effective methods for restoring these challenging sites. © 2008 Society for Ecological Restoration International.

  20. A Numerical and Experimental Study of Damage Growth in a Composite Laminate

    NASA Technical Reports Server (NTRS)

    McElroy, Mark; Ratcliffe, James; Czabaj, Michael; Wang, John; Yuan, Fuh-Gwo

    2014-01-01

    The present study has three goals: (1) perform an experiment where a simple laminate damage process can be characterized in high detail; (2) evaluate the performance of existing commercially available laminate damage simulation tools by modeling the experiment; (3) observe and understand the underlying physics of damage in a composite honeycomb sandwich structure subjected to low-velocity impact. A quasi-static indentation experiment has been devised to provide detailed information about a simple mixed-mode damage growth process. The test specimens consist of an aluminum honeycomb core with a cross-ply laminate facesheet supported on a stiff uniform surface. When the sample is subjected to an indentation load, the honeycomb core provides support to the facesheet resulting in a gradual and stable damage growth process in the skin. This enables real time observation as a matrix crack forms, propagates through a ply, and then causes a delamination. Finite element analyses were conducted in ABAQUS/Explicit™ 6.13 that used continuum and cohesive modeling techniques to simulate facesheet damage and a geometric and material nonlinear model to simulate core crushing. The high fidelity of the experimental data allows a detailed investigation and discussion of the accuracy of each numerical modeling approach.

  1. Evaluation of Research Ethics Committees: Criteria for the Ethical Quality of the Review Process.

    PubMed

    Scherzinger, Gregor; Bobbert, Monika

    2017-01-01

    Repeatedly, the adequacy, performance and quality of the Ethics Committees that oversee medical research trials have been discussed. Although they play a crucial role in reviewing medical research and protecting human subjects, it is far from clear to what degree they fulfill the task they have been assigned. This has led to calls for an evaluation of their activity and, in some places, to the establishment of accreditation schemes. At the same time, IRBs have become the subject of detailed legislation in the ongoing global juridification of medical research. Unsurprisingly, there is a tendency to understand the evaluation of RECs as a question of controlling their legal compliance. This paper discusses the need for a quality evaluation of IRBs from an ethical point of view and, by systematically reviewing the major ethical guidelines for IRBs, proposes a system of criteria that should orient any evaluation of IRBs.

  2. Engaged for Change: A Community-Engaged Process for Developing Interventions to Reduce Health Disparities.

    PubMed

    Rhodes, Scott D; Mann-Jackson, Lilli; Alonzo, Jorge; Simán, Florence M; Vissman, Aaron T; Nall, Jennifer; Abraham, Claire; Aronson, Robert E; Tanner, Amanda E

    2017-12-01

    The science underlying the development of individual, community, system, and policy interventions designed to reduce health disparities has lagged behind other innovations. Few models, theoretical frameworks, or processes exist to guide intervention development. Our community-engaged research partnership has been developing, implementing, and evaluating efficacious interventions to reduce HIV disparities for over 15 years. Based on our intervention research experiences, we propose a novel 13-step process designed to demystify and guide intervention development. Our intervention development process includes steps such as establishing an intervention team to manage the details of intervention development; assessing community needs, priorities, and assets; generating intervention priorities; evaluating and incorporating theory; developing a conceptual or logic model; crafting activities; honing materials; administering a pilot, noting its process, and gathering feedback from all those involved; and editing the intervention based on what was learned. Here, we outline and describe each of these 13 steps.

  3. Effect of Processing and Subsequent Storage on Nutrition

    NASA Technical Reports Server (NTRS)

    Perchonok, Michele H.

    2009-01-01

    This viewgraph presentation includes the following objectives: 1) To determine the effects of thermal processing, freeze drying, irradiation, and storage time on the nutritional content of food; 2) To evaluate the nutritional content of the food items currently used on the International Space Station and Shuttle; and 3) To determine if there is a need to institute countermeasures. (This study does not seek to address the effect of processing on nutrients in detail, but rather aims to place in context the overall nutritional status at the time of consumption).

  4. Auditing radiation sterilization facilities

    NASA Astrophysics Data System (ADS)

    Beck, Jeffrey A.

    The diversity of radiation sterilization systems available today places renewed emphasis on the need for thorough Quality Assurance audits of these facilities. Evaluating compliance with Good Manufacturing Practices is an obvious requirement, but an effective audit must also evaluate installation and performance qualification programs (validation), and process control and monitoring procedures in detail. The present paper describes general standards that radiation sterilization operations should meet in each of these key areas, and provides basic guidance for conducting QA audits of these facilities.

  5. The difference between energy consumption and energy cost: Modelling energy tariff structures for water resource recovery facilities.

    PubMed

    Aymerich, I; Rieger, L; Sobhani, R; Rosso, D; Corominas, Ll

    2015-09-15

    The objective of this paper is to demonstrate the importance of incorporating more realistic energy cost models (based on current energy tariff structures) into existing water resource recovery facilities (WRRFs) process models when evaluating technologies and cost-saving control strategies. In this paper, we first introduce a systematic framework to model energy usage at WRRFs and a generalized structure to describe energy tariffs including the most common billing terms. Secondly, this paper introduces a detailed energy cost model based on a Spanish energy tariff structure coupled with a WRRF process model to evaluate several control strategies and provide insights into the selection of the contracted power structure. The results for a 1-year evaluation on a 115,000 population-equivalent WRRF showed monthly cost differences ranging from 7 to 30% when comparing the detailed energy cost model to an average energy price. The evaluation of different aeration control strategies also showed that using average energy prices and neglecting energy tariff structures may lead to biased conclusions when selecting operating strategies or comparing technologies or equipment. The proposed framework demonstrated that for cost minimization, control strategies should be paired with a specific optimal contracted power. Hence, the design of operational and control strategies must take into account the local energy tariff. Copyright © 2015 Elsevier Ltd. All rights reserved.
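The gap the authors describe between an average energy price and a real tariff structure is easy to see in a toy calculation: under a time-of-use tariff, the timing of consumption (e.g., when aeration runs) changes the bill even when total consumption does not. A minimal sketch with a hypothetical two-period tariff (prices, on-peak hours, and the load profile are invented for illustration and are not from the paper):

```python
def energy_cost_tou(hourly_kwh, peak_hours, peak_price, offpeak_price):
    """Energy charge under a simple two-period time-of-use tariff.

    hourly_kwh: 24 consumption values (kWh); peak_hours: set of hour indices.
    """
    return sum(kwh * (peak_price if h in peak_hours else offpeak_price)
               for h, kwh in enumerate(hourly_kwh))

def energy_cost_flat(hourly_kwh, avg_price):
    """Energy charge if billing used one average price for every hour."""
    return sum(hourly_kwh) * avg_price

# Hypothetical daily profile with aeration load concentrated in daytime
load = [100] * 8 + [150] * 12 + [100] * 4    # kWh per hour, 24 values
peak = set(range(8, 20))                      # 08:00-20:00 on-peak
tou = energy_cost_tou(load, peak, peak_price=0.15, offpeak_price=0.07)
flat = energy_cost_flat(load, avg_price=0.11)
```

Here the flat-price estimate understates the bill because consumption is concentrated on-peak; a control strategy that shifts the same kWh off-peak would look identical under the flat model but cheaper under the tariff model, which is the paper's central point.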

  6. Improving NASA's technology transfer process through increased screening and evaluation in the information dissemination program

    NASA Technical Reports Server (NTRS)

    Laepple, H.

    1979-01-01

    The current status of NASA's technology transfer system can be improved if the technology transfer process is better understood. This understanding will only be gained if detailed knowledge is developed about the factors generally influencing technology transfer, particularly those affecting transfer from government R and D agencies to industry. Secondary utilization of aerospace technology is made more difficult because it depends on a transfer process that crosses established organizational lines of authority and lies outside well understood patterns of technical application. In the absence of a sound theory of technology transfer, and because of the limited capability of government agencies to explore industry's needs, a team approach to screening and evaluation of NASA generated technologies is proposed. Under this approach, NASA and the other public- and private-sector organizations that influence the transfer of NASA generated technology would participate in a screening and evaluation process to determine the commercial feasibility of a wide range of technical applications.

  7. The pyroelectric behavior of lead free ferroelectric ceramics in thermally stimulated depolarization current measurements

    NASA Astrophysics Data System (ADS)

    González-Abreu, Y.; Peláiz-Barranco, A.; Garcia-Wong, A. C.; Guerra, J. D. S.

    2012-06-01

    The present paper presents a detailed analysis of the thermally stimulated processes in barium-modified SrBi2Nb2O9, a ferroelectric bi-layered perovskite that is one of the most promising candidates for non-volatile random access memory applications because of its excellent fatigue-resistant properties. A numerical method is used to separate the real pyroelectric current from the other thermally stimulated processes. A discharge due to the space charge injected during the poling process, the pyroelectric response, and a conductive process are discussed over a wide temperature range spanning the ferroelectric and paraelectric phases. The pyroelectric response is separated from the other components to evaluate the polarization behavior and several pyroelectric parameters: the remanent polarization, the pyroelectric coefficient, and the figure of merit, which show good results.

  8. SeaSat-A Satellite Scatterometer (SASS) Validation and Experiment Plan

    NASA Technical Reports Server (NTRS)

    Schroeder, L. C. (Editor)

    1978-01-01

    This plan was generated by the SeaSat-A satellite scatterometer experiment team to define the pre- and post-launch activities necessary to conduct sensor validation and geophysical evaluation. Details include an instrument and experiment description, performance requirements, success criteria, constraints, mission requirements, data processing requirements, and data analysis responsibilities.

  9. A Four-Stage Model for Planning Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Morrison, Gary R.; Ross, Steven M.

    1988-01-01

    Describes a flexible planning process for developing computer based instruction (CBI) in which the CBI design is implemented on paper between the lesson design and the program production. A four-stage model is explained, including (1) an initial flowchart, (2) storyboards, (3) a detailed flowchart, and (4) an evaluation. (16 references)…

  10. Development and evaluation of the bacterial fate and transport module for the agricultural policy/environmental extender (APEX) model

    USDA-ARS?s Scientific Manuscript database

    The Agricultural Policy/Environmental eXtender (APEX) is a watershed-scale water quality model that includes detailed representation of agricultural management but currently does not have microbial fate and transport simulation capabilities. The objective of this work was to develop a process-based ...

  11. Improving the Quality of Services in Residential Treatment Facilities: A Strength-Based Consultative Review Process

    ERIC Educational Resources Information Center

    Pavkov, Thomas W.; Lourie, Ira S.; Hug, Richard W.; Negash, Sesen

    2010-01-01

    This descriptive case study reports on the positive impact of a consultative review methodology used to conduct quality assurance reviews as part of the Residential Treatment Center Evaluation Project. The study details improvement in the quality of services provided to youth in unmonitored residential treatment facilities. Improvements were…

  12. Personnel Training--Secondary Vocational Agriculture Teacher Education.

    ERIC Educational Resources Information Center

    Brown, Herman D.; And Others

    This document consists of three parts. The first part is a report on a project conducted to develop computer software needed by vocational agriculture teachers in Texas. The report details the process used to assess and develop software, and provides guidelines that can be used by others in evaluating computer software for vocational agriculture…

  13. Design and Implementation of a Learning Analytics Toolkit for Teachers

    ERIC Educational Resources Information Center

    Dyckhoff, Anna Lea; Zielke, Dennis; Bultmann, Mareike; Chatti, Mohamed Amine; Schroeder, Ulrik

    2012-01-01

    Learning Analytics can provide powerful tools for teachers in order to support them in the iterative process of improving the effectiveness of their courses and to collaterally enhance their students' performance. In this paper, we present the theoretical background, design, implementation, and evaluation details of eLAT, a Learning Analytics…

  14. The Child Diary as a Research Tool

    ERIC Educational Resources Information Center

    Lamsa, Tiina; Ronka, Anna; Poikonen, Pirjo-Liisa; Malinen, Kaisa

    2012-01-01

    The aim of this article is to introduce the use of the child diary as a method in daily diary research. By describing the research process and detailing its structure, a child diary, a structured booklet in which children's parents and day-care personnel (N = 54 children) reported their observations, was evaluated. The participants reported the…

  15. The discrete Fourier transform algorithm for determining decay constants—Implementation using a field programmable gate array

    NASA Astrophysics Data System (ADS)

    Bostrom, G.; Atkinson, D.; Rice, A.

    2015-04-01

    Cavity ringdown spectroscopy (CRDS) uses the exponential decay constant of light exiting a high-finesse resonance cavity to determine analyte concentration, typically via absorption. We present a high-throughput data acquisition system that determines the decay constant in near real time using the discrete Fourier transform algorithm on a field programmable gate array (FPGA). A commercially available, high-speed, high-resolution, analog-to-digital converter evaluation board system is used as the platform for the system, after minor hardware and software modifications. The system outputs decay constants at a maximum rate of 4.4 kHz using an 8192-point fast Fourier transform by processing the intensity decay signal between ringdown events. We present the details of the system, including the modifications required to adapt the evaluation board to accurately process the exponential waveform. We also demonstrate the performance of the system, both stand-alone and incorporated into our existing CRDS system. Details of FPGA, microcontroller, and circuitry modifications are provided in the Appendix and computer code is available upon request from the authors.
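The core numerical idea, extracting a decay constant from a DFT rather than a least-squares fit, can be illustrated offline. The Fourier transform of a one-sided exponential y(t) = A exp(-t/tau) is A tau / (1 + i omega tau), so the ratio of the imaginary to the real part of a single low-frequency DFT bin yields tau directly. A sketch of this estimator in plain Python (only the arithmetic, under the assumption of a clean, fully-decayed, baseline-free record; the paper's FPGA system additionally handles triggering and real signals):

```python
import cmath
import math

def decay_constant_dft(samples, dt):
    """Estimate tau of y(t) = A*exp(-t/tau) from DFT bin 1.

    The transform of a one-sided exponential is A*tau/(1 + i*omega*tau),
    so tau = -Im(S1) / (omega1 * Re(S1)) with omega1 = 2*pi/(N*dt).
    Assumes the record is long enough for the decay to reach ~zero.
    """
    n = len(samples)
    w1 = cmath.exp(-2j * math.pi / n)
    s1 = sum(y * w1 ** k for k, y in enumerate(samples))  # DFT bin 1
    omega1 = 2.0 * math.pi / (n * dt)
    return -s1.imag / (omega1 * s1.real)

# Synthetic ringdown: tau = 200 samples in an 8192-point record
dt, tau = 1.0, 200.0
signal = [math.exp(-k * dt / tau) for k in range(8192)]
tau_est = decay_constant_dft(signal, dt)  # small discretization bias ~ dt/tau
```

The residual bias of order dt/tau comes from approximating the continuous transform with a discrete sum; it shrinks as the sampling rate rises, which is one reason a high-speed digitizer pairs well with this algorithm.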

  16. Modeling greenhouse gas emissions from dairy farms.

    PubMed

    Rotz, C Alan

    2017-11-15

    Dairy farms have been identified as an important source of greenhouse gas emissions. Within the farm, important emissions include enteric CH4 from the animals, CH4 and N2O from manure in housing facilities, during long-term storage, and during field application, and N2O from nitrification and denitrification processes in the soil used to produce feed crops and pasture. Models spanning a wide range of levels of detail have been developed to represent or predict these emissions. They include constant emission factors, variable process-related emission factors, empirical or statistical models, mechanistic process simulations, and life cycle assessment. To fully represent farm emissions, models representing the various emission sources must be integrated to capture the combined effects and interactions of all important components. Farm models have been developed using relationships across the full scale of detail, from constant emission factors to detailed mechanistic simulations. Simpler models, based upon emission factors and empirical relationships, tend to provide better tools for decision support, whereas more complex farm simulations provide better tools for research and education. To look beyond the farm boundaries, life cycle assessment provides an environmental accounting tool for quantifying and evaluating emissions over the full cycle, from producing the resources used on the farm through processing, distribution, consumption, and waste handling of the milk and dairy products produced. Models are useful for improving our understanding of farm processes and their interacting effects on greenhouse gas emissions. Through better understanding, they assist in the development and evaluation of mitigation strategies for reducing emissions and improving overall sustainability of dairy farms. Published by the Federation of Animal Science Societies and Elsevier Inc. on behalf of the American Dairy Science Association. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/3.0/).

  17. Developing visualisation software for rehabilitation: investigating the requirements of patients, therapists and the rehabilitation process

    PubMed Central

    Loudon, David; Macdonald, Alastair S.; Carse, Bruce; Thikey, Heather; Jones, Lucy; Rowe, Philip J.; Uzor, Stephen; Ayoade, Mobolaji; Baillie, Lynne

    2012-01-01

    This paper describes the ongoing process of the development and evaluation of prototype visualisation software, designed to assist in the understanding and the improvement of appropriate movements during rehabilitation. The process of engaging users throughout the research project is detailed in the paper, including how the design of the visualisation software is being adapted to meet the emerging understanding of the needs of patients and professionals, and of the rehabilitation process. The value of the process for the design of the visualisation software is illustrated with a discussion of the findings of pre-pilot focus groups with stroke survivors and therapists. PMID:23011812

  18. Aircraft Maneuvers for the Evaluation of Flying Qualities and Agility. Volume 1. Maneuver Development Process and Initial Maneuver Set

    DTIC Science & Technology

    1993-08-01

    subtitled "Simulation Data," consists of detailed information on the design parameter variations tested, subsequent statistical analyses conducted...used with confidence during the design process. The data quality can be examined in various forms such as statistical analyses of measure of merit data...merit, such as time to capture or maximum pitch rate, can be calculated from the simulation time history data. Statistical techniques are then used

  19. Configuration evaluation and criteria plan. Volume 2: Evaluation criteria plan (preliminary). Space Transportation Main Engine (STME) configuration study

    NASA Technical Reports Server (NTRS)

    Bair, E. K.

    1986-01-01

    The unbiased selection of the Space Transportation Main Engine (STME) configuration requires that the candidate engines be evaluated against a predetermined set of criteria which must be properly weighted to emphasize critical requirements defined prior to the actual evaluation. The evaluation and selection process involves the following functions: (1) determining if a configuration can satisfy basic STME requirements (yes/no); (2) defining the evaluation criteria; (3) selecting the relative importance or weighting of the criteria; (4) determining the weighting sensitivities; and (5) establishing a baseline for engine evaluation. The criteria weighting and sensitivities are cost-related and are based on mission models and vehicle requirements. The evaluation process is used as a coarse screen to determine the candidate engines for the parametric studies and as a fine screen to determine concept(s) for conceptual design. The criteria used for the coarse and fine screen evaluation processes are shown. The coarse screen process involves verifying that the candidate engines can meet the yes/no screening requirements and a semi-subjective quantitative evaluation. The fine screen engines have to meet all of the yes/no screening gates and are then subjected to a detailed evaluation or assessment using the quantitative cost evaluation processes. The option exists for recycling a concept through the quantitative portion of the screening and allows for some degree of optimization. The basic vehicle is a two-stage LOX/HC, LOX/LH2 parallel-burn vehicle capable of placing 150,000 lbs in low Earth orbit (LEO).
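    The coarse/fine screening logic described above (binary gates followed by a weighted quantitative evaluation) can be sketched generically. The candidate names, gate outcomes, criteria, and weights below are hypothetical placeholders, not values from the STME study.

```python
# Hypothetical two-stage screen: yes/no gates first, then weighted scoring.
candidates = {
    "Engine A": {"passes_gates": True,  "scores": {"cost": 7, "reliability": 9, "performance": 6}},
    "Engine B": {"passes_gates": True,  "scores": {"cost": 9, "reliability": 6, "performance": 8}},
    "Engine C": {"passes_gates": False, "scores": {"cost": 8, "reliability": 8, "performance": 9}},
}

# Criteria weights fixed before the evaluation to emphasize critical requirements.
weights = {"cost": 0.5, "reliability": 0.3, "performance": 0.2}

# Coarse screen: any candidate failing a yes/no gate is eliminated outright.
survivors = {name: c for name, c in candidates.items() if c["passes_gates"]}

# Fine screen: rank the survivors by weighted sum of criterion scores.
ranking = sorted(
    survivors,
    key=lambda name: sum(w * survivors[name]["scores"][c] for c, w in weights.items()),
    reverse=True,
)
print(ranking)  # Engine B (7.9) ranks ahead of Engine A (7.4); Engine C was gated out
```

    The same structure accommodates the recycling option mentioned in the abstract: a revised concept simply re-enters the scoring step with updated scores.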

  20. [Comparative evaluation of six different body regions of the dog using analog and digital radiography].

    PubMed

    Meyer-Lindenberg, Andrea; Ebermaier, Christine; Wolvekamp, Pim; Tellhelm, Bernd; Meutstege, Freek J; Lang, Johann; Hartung, Klaus; Fehr, Michael; Nolte, Ingo

    2008-01-01

    In this study the quality of digital and analog radiography in dogs was compared. For this purpose, three conventional radiographs (varying in exposure) and three digital radiographs (varying in MUSI-contrast [MUSI = MUlti Scale Image Contrast], the main post-processing parameter) of six different body regions of the dog were evaluated (thorax, abdomen, skull, femur, hip joints, elbow). The quality of the radiographs was evaluated by eight veterinary specialists familiar with radiographic images using a questionnaire based on details of each body region significant in obtaining a radiographic diagnosis. In the first part of the study the overall quality of the radiographs was evaluated. Within one region, 89.5% (43/48) chose a digital radiograph as the best image. Divided into analog and digital groups, the digital image with the highest MUSI-contrast was most often considered the best, while the analog image considered the best varied between the one with the medium and the one with the longest exposure time. In the second part of the study, each image was rated for the visibility of specific, diagnostically important details. After summarisation of the scores for each criterion, divided into analog and digital imaging, the digital images were rated considerably superior to conventional images. The results of image comparison revealed that digital radiographs showed better image detail than radiographs taken with the analog technique in all six areas of the body.

  1. Concepts and embodiment design of a reentry recumbent seating system for the NASA Space Shuttle

    NASA Technical Reports Server (NTRS)

    Mcmillan, Scott; Looby, Brent; Devany, Chris; Chudej, Chris; Brooks, Barry

    1993-01-01

    This report deals with the generation of a recumbent seating system which will be used by NASA to shuttle astronauts from the Russian space station Mir. We begin by examining the necessity for designing a special couch for the returning astronauts. Next, we discuss the operating conditions and constraints of the recumbent seating system and provide a detailed function structure. After working through the conceptual design process, we came up with ten alternative designs which are presented in the appendices. These designs were evaluated and weighted to systematically determine the best choice for embodiment design. A detailed discussion of all components of the selected system follows with design calculations for the seat presented in the appendices. The report concludes with an evaluation of the resulting design and recommendations for further development.

  2. The characterization and evaluation of accidental explosions

    NASA Technical Reports Server (NTRS)

    Strehlow, R. A.; Baker, W. E.

    1975-01-01

    Accidental explosions are discussed from a number of viewpoints. First, all accidental explosions, intentional explosions and natural explosions are characterized by type. Second, the nature of the blast wave produced by an ideal (point source or HE) explosion is discussed to form a basis for describing how other explosion processes yield deviations from ideal blast wave behavior. The current status of blast damage mechanism evaluation is also discussed. Third, the current status of our understanding of each different category of accidental explosions is discussed in some detail.

  3. Computer vision system: a tool for evaluating the quality of wheat in a grain tank

    NASA Astrophysics Data System (ADS)

    Minkin, Uryi Igorevish; Panchenko, Aleksei Vladimirovich; Shkanaev, Aleksandr Yurievich; Konovalenko, Ivan Andreevich; Putintsev, Dmitry Nikolaevich; Sadekov, Rinat Nailevish

    2018-04-01

    The paper describes a technology that allows for automating the process of evaluating grain quality in the grain tank of a combine harvester. A special recognition algorithm analyzes photographic images taken by the camera and provides automatic estimates of the total mass fraction of broken grains and the presence of non-grain material. The paper also presents the operating details of the tank prototype and reports the accuracy of the algorithms designed.

  4. Evaluation in a competency-based educational system.

    PubMed

    May, B J

    1977-01-01

    Competency-based curricula have been implemented at the Medical College of Georgia to prepare students to meet entry-level competencies as physical therapists and physical therapist assistants. Criterion-referenced evaluations are used to determine if students have achieved the desired competencies. Performance is measured against the criteria and not against other students. The process of developing the system currently in use at the Medical College of Georgia is delineated in some detail, and some of the advantages of the system for physical therapy education are discussed.

  5. What happened after the evaluation?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, C L

    1999-03-12

    An ergonomics program including ergonomic computer workstation evaluations at a research and development facility was assessed three years after formal implementation. As part of the assessment, 53 employees who had been subjects of computer workstation evaluations were interviewed. The documented reports (ergonomic evaluation forms) of the ergonomic evaluations were used in the process of selecting the interview subjects. The evaluation forms also provided information about the aspects of the computer workstation that were discussed and recommended as part of the evaluation, although the amount of detail and completeness of the forms varied. Although the results were mixed and reflective of the multivariate psychosocial factors affecting employees working in a large organization, the findings led to recommendations for improvements of the program.

  6. Multifaceted academic detailing program to increase pharmacotherapy for alcohol use disorder: interrupted time series evaluation of effectiveness.

    PubMed

    Harris, Alex H S; Bowe, Thomas; Hagedorn, Hildi; Nevedal, Andrea; Finlay, Andrea K; Gidwani, Risha; Rosen, Craig; Kay, Chad; Christopher, Melissa

    2016-09-15

    Active consideration of effective medications to treat alcohol use disorder (AUD) is a consensus standard of care, yet knowledge and use of these medications are very low across diverse settings. This study evaluated the overall effectiveness of a multifaceted academic detailing program to address this persistent quality problem in the US Veterans Health Administration (VHA), as well as the context and process factors that explained variation in effectiveness across sites. An interrupted time series design, analyzed with mixed-effects segmented logistic regression, was used to evaluate changes in level and rate of change in the monthly percent of patients with a clinically documented AUD who received naltrexone, acamprosate, disulfiram, or topiramate. Using data from a 20 month post-implementation period, intervention sites (n = 37) were compared to their own 16 month pre-implementation performance and separately to the rest of VHA. From immediately pre-intervention to the end of the observation period, the percent of patients in the intervention sites with AUD who received medication increased by 3.4 percentage points in absolute terms and 68 % in relative terms (i.e., from 4.9 % to 8.3 %). This change was significant compared to the pre-implementation period in the intervention sites and secular trends in control sites. Sites with lower pre-implementation adoption, more person-hours of detailing, but fewer people detailed, had larger immediate increases in medication receipt after implementation. The average number of detailing encounters per person was associated with steeper increases in slope over time. This study found empirical support for a multifaceted quality improvement strategy aimed at increasing access to and utilization of pharmacotherapy for AUD. Future studies should focus on determining how to enhance the program's effects, especially in non-responsive locations.
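    The interrupted time series design above estimates both an immediate level change and a change in slope at implementation. A minimal sketch of the segmented design matrix follows; the study itself used mixed-effects segmented logistic regression, whereas this illustration uses ordinary least squares on simulated, noiseless monthly percentages with hypothetical coefficients.

```python
import numpy as np

# 16 pre-implementation and 20 post-implementation months, as in the study design.
n_pre, n_post = 16, 20
t = np.arange(n_pre + n_post, dtype=float)
post = (t >= n_pre).astype(float)               # indicator: 1 after implementation
t_since = np.where(post == 1, t - n_pre, 0.0)   # months since implementation

# Simulated outcome: baseline level and trend, plus a level jump and slope change.
b0, b1, b2, b3 = 4.9, 0.02, 1.5, 0.09           # hypothetical coefficients (%)
y = b0 + b1 * t + b2 * post + b3 * t_since

# Segmented-regression design: intercept, time, level change, slope change.
X = np.column_stack([np.ones_like(t), t, post, t_since])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # recovers the four coefficients on noiseless data
```

    The coefficient on the post indicator captures the immediate jump in prescribing, and the coefficient on months-since-implementation captures the change in trend.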

  7. The Use of Modeling Approach for Teaching Exponential Functions

    NASA Astrophysics Data System (ADS)

    Nunes, L. F.; Prates, D. B.; da Silva, J. M.

    2017-12-01

    This work presents a discussion related to the teaching and learning of mathematical content related to the study of exponential functions in a group of freshman students enrolled in the first semester of the Science and Technology Bachelor's program (STB) of the Federal University of Jequitinhonha and Mucuri Valleys (UFVJM). As a contextualization tool strongly mentioned in the literature, the modelling approach was used as an educational teaching tool to contextualize the teaching-learning process of exponential functions for these students. In this sense, some simple models elaborated with the GeoGebra software were used and, to obtain a qualitative evaluation of the investigation and results, Didactic Engineering was used as a research methodology. As a consequence of this detailed research, some interesting aspects of the teaching and learning process were observed, discussed and described.

  8. Rotorblades for large wind turbines

    NASA Astrophysics Data System (ADS)

    Wackerle, P. M.; Hahn, M.

    1981-09-01

    Details of the design work and manufacturing process for prototype production of 25 m long composite rotor blades for wind energy generators are presented. The blades are of the 'integrated spar design' type and consist of a glass fiber skin and a PVC core. A computer program (and its action tree) is used for the analysis of the multi-connected hybrid cross-section, in order to achieve optimal design specifications. Four tools are needed for the production of two blade types, including two molds, and milling, cutting and drilling jigs. The manufacturing processes for the molds, jigs and blades are discussed in detail. The final acceptance of the blade is based on a static test, in which the flexibility of the blade is checked against the magnitude of load and deflection, and a dynamic test evaluating the natural frequencies in bending and torsion.

  9. New Developments in the Technology Readiness Assessment Process in US DOE-EM - 13247

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krahn, Steven; Sutter, Herbert; Johnson, Hoyt

    2013-07-01

    A Technology Readiness Assessment (TRA) is a systematic, metric-based process and accompanying report that evaluates the maturity of the technologies used in systems; it is designed to measure technology maturity using the Technology Readiness Level (TRL) scale pioneered by the National Aeronautics and Space Administration (NASA) in the 1980s. More recently, DoD has adopted and provided systematic guidance for performing TRAs and determining TRLs. In 2007 the GAO recommended that the DOE adopt the NASA/DoD methodology for evaluating technology maturity. Earlier, in 2006-2007, DOE-EM had conducted pilot TRAs on a number of projects at Hanford and Savannah River. In March 2008, DOE-EM issued a process guide, which established TRAs as an integral part of DOE-EM's Project Management Critical Decision Process. Since the development of its detailed TRA guidance in 2008, DOE-EM has continued to accumulate experience in the conduct of TRAs and the process for evaluating technology maturity. DOE has developed guidance on TRAs applicable department-wide. DOE-EM's experience with the TRA process, the evaluations that led to recently proposed revisions to the DOE-EM TRA/TMP Guide, and the content of the proposed changes that incorporate the above lessons learned and insights are described. (authors)

  10. Techno-economic assessment of hybrid extraction and distillation processes for furfural production from lignocellulosic biomass.

    PubMed

    Nhien, Le Cao; Long, Nguyen Van Duc; Kim, Sangyong; Lee, Moonyong

    2017-01-01

    Lignocellulosic biomass is one of the most promising alternatives for replacing mineral resources to overcome global warming, which has become the most important environmental issue in recent years. Furfural was listed by the National Renewable Energy Laboratory as one of the top 30 potential chemicals arising from biomass. However, the current production of furfural is energy intensive and uses inefficient technology. Thus, a hybrid purification process that combines extraction and distillation to produce furfural from lignocellulosic biomass was considered and investigated in detail to improve the process efficiency. This effective hybrid process depends on the extracting solvent, which was selected based on a comprehensive procedure that ranged from solvent screening to complete process design. Various solvents were first evaluated in terms of their extraction ability. Then, the most promising solvents were selected to study the separation feasibility. Eventually, processes that used the three best solvents (toluene, benzene, and butyl chloride) were designed and optimized in detail using Aspen Plus. Sustainability analysis was performed to evaluate these processes in terms of their energy requirements, total annual costs (TAC), and carbon dioxide (CO2) emissions. The results showed that butyl chloride was the most suitable solvent for the hybrid furfural process because it could save 44.7% of the TAC while reducing the CO2 emissions by 45.5% compared to the toluene process. In comparison with the traditional purification process using distillation, the suggested hybrid extraction/distillation process can save up to 19.2% of the TAC and reduce total annual CO2 emissions by 58.3%. Furthermore, a sensitivity analysis of the feed composition and its effect on the performance of the proposed hybrid system was conducted. Butyl chloride was found to be the most suitable solvent for the hybrid extraction/distillation process of furfural production. The proposed hybrid sequence was more favorable than the traditional distillation process when the methanol fraction of the feed stream was <3%, and more benefit could be obtained as that fraction decreased.

  11. Graphite polystyryl pyridine (PSP) structural composites

    NASA Technical Reports Server (NTRS)

    Malassine, B.

    1981-01-01

    PSP 6022 M resin, PSP 6024 M resin and W 133 Thornel T 300 graphite fabric reinforced panels were fabricated and provided to NASA Ames Research Center. PSP 6022 and PSP 6024 characteristics and process specifications for the fabrication of prepregs and of laminates are detailed. Mechanical properties, thermomechanical properties and moisture resistance were evaluated. PSP 6022 and PSP 6024 appear to be high-performance thermostable systems that are very easy to process, being soluble in MEK for prepregging and cured at no more than 250°C, and even 200°C.

  12. A Process Evaluation of Project Developmental Continuity. Interim Report VI: Executive Summary. Recommendations for Continuing the Impact Study.

    ERIC Educational Resources Information Center

    Granville, Arthur C.; Love, John M.

    This brief report summarizes the analysis and conclusions presented in detail in Interim Report VI regarding the feasibility of conducting a longitudinal study of Project Developmental Continuity (PDC). This project is a Head Start demonstration program aimed at providing educational and developmental continuity between children's Head Start and…

  13. Biohorizons: An eConference to Assess Human Biology in Large, First-Year Classes

    ERIC Educational Resources Information Center

    Moni, Roger W.; Moni, Karen B.; Poronnik, Philip; Lluka, Lesley J.

    2007-01-01

    The authors detail the design, implementation and evaluation of an eConference entitled "Biohorizons," using a presage-process-product model to describe the development of an eLearning community. Biohorizons was a summative learning and assessment task aiming to introduce large classes of first-year Human Biology students to the practices of…

  14. Comparing fire spread algorithms using equivalence testing and neutral landscape models

    Treesearch

    Brian R. Miranda; Brian R. Sturtevant; Jian Yang; Eric J. Gustafson

    2009-01-01

    We demonstrate a method to evaluate the degree to which a meta-model approximates spatial disturbance processes represented by a more detailed model across a range of landscape conditions, using neutral landscapes and equivalence testing. We illustrate this approach by comparing burn patterns produced by a relatively simple fire spread algorithm with those generated by...

  15. Double-Higgs boson production in the high-energy limit: planar master integrals

    NASA Astrophysics Data System (ADS)

    Davies, Joshua; Mishima, Go; Steinhauser, Matthias; Wellmann, David

    2018-03-01

    We consider the virtual corrections to the process gg → HH at NLO in the high energy limit and compute the corresponding planar master integrals in an expansion for small top quark mass. We provide details on the evaluation of the boundary conditions and present analytic results expressed in terms of harmonic polylogarithms.

  16. An Evaluation of the Right to Read Inexpensive Book Distribution Program. Final Report.

    ERIC Educational Resources Information Center

    General Research Corp., McLean, VA.

    This report provides details of a study of the Inexpensive Book Distribution Program (IBDP), a federally funded and sponsored program operated by Reading is Fundamental (RIF). The specific objectives of the described study were to determine the effectiveness of the IBDP in generating student reading motivation, and to describe the process by which…

  17. Educating Information Systems Students on Business Process Management (BPM) through Digital Gaming Metaphors of Virtual Reality

    ERIC Educational Resources Information Center

    Lawler, James P.; Joseph, Anthony

    2010-01-01

    Digital gaming continues to be an approach for enhancing methods of pedagogy. The study evaluates the effectiveness of a gaming product of a leading technology firm in engaging graduate students in an information systems course at a major northeast institution. Findings from a detailed perception survey of the students indicate favorable…

  18. Silicon Schottky photovoltaic diodes for solar energy conversion

    NASA Technical Reports Server (NTRS)

    Anderson, W. A.

    1975-01-01

    Various factors in Schottky barrier solar cell fabrication are evaluated in order to improve understanding of the current flow mechanism and to isolate processing variables that improve efficiency. Results of finger design, substrate resistivity, surface finishing and activation energy studies are detailed. An increased fill factor was obtained by baking of the vacuum system to remove moisture.

  19. On-Line Support and Portfolio Assessment for NETS-T Standards In Pre-Service Programs at a Large Southeastern University.

    ERIC Educational Resources Information Center

    Shoffner, Mary B.; Dias, Laurie B.

    This paper details the theoretical underpinnings of one university's approach to technology integration in its pre-service teacher preparation programs, and the results of a continuous, feedback-driven project to evaluate for technology integration through a student portfolio development process. Portfolios are assessed for multiple education and…

  20. Similarities and Contrasts in Quality of Child and Sibling Relationships with Elderly.

    ERIC Educational Resources Information Center

    Cicirelli, Victor G.

    Relationships with family members have been shown to be important in old age, both with adult children and with elderly siblings. Through shared memories such relationships may help with the life review (a process in which there is a detailed reconsideration and evaluation of personal experiences throughout life). To compare the differences in…

  1. 77 FR 19227 - Gulf of Mexico Fishery Management Council; Public Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    ... meeting will be held April 16-19, 2012. ADDRESSES: The meeting will be held at the Omni Bayfront Hotel... responsibilities of the SSC and receive a presentation on Detailed ``Option 3''. 2 p.m.-3 p.m.--Budget/Personnel Committee will review the Executive Director's Evaluation Process, the 2012 Proposed Budget and an Overview...

  2. The Responsive Environmental Assessment for Classroom Teaching (REACT): The Dimensionality of Student Perceptions of the Instructional Environment

    ERIC Educational Resources Information Center

    Nelson, Peter M.; Demers, Joseph A.; Christ, Theodore J.

    2014-01-01

    This study details the initial development of the Responsive Environmental Assessment for Classroom Teachers (REACT). REACT was developed as a questionnaire to evaluate student perceptions of the classroom teaching environment. Researchers engaged in an iterative process to develop, field test, and analyze student responses on 100 rating-scale…

  3. A holistic model for evaluating the impact of individual technology-enhanced learning resources.

    PubMed

    Pickering, James D; Joynes, Viktoria C T

    2016-12-01

    The use of technology within education has now crossed the Rubicon; student expectations, the increasing availability of both hardware and software and the push to fully blended learning environments mean that educational institutions cannot afford to turn their backs on technology-enhanced learning (TEL). The ability to meaningfully evaluate the impact of TEL resources nevertheless remains problematic. This paper aims to establish a robust means of evaluating individual resources and meaningfully measure their impact upon learning within the context of the program in which they are used. Based upon the experience of developing and evaluating a range of mobile and desktop based TEL resources, this paper outlines a new four-stage evaluation process, taking into account learner satisfaction, learner gain, and the impact of a resource on both the individual and the institution in which it has been adopted. A new multi-level model of TEL resource evaluation is proposed, which includes a preliminary evaluation of need, learner satisfaction and gain, learner impact and institutional impact. Each of these levels is discussed in detail, and in relation to existing TEL evaluation frameworks. This paper details a holistic, meaningful evaluation model for individual TEL resources within the specific context in which they are used. It is proposed that this model be adopted to ensure that TEL resources are evaluated in a more meaningful and robust manner than is currently undertaken.

  4. Superconductivity devices: Commercial use of space

    NASA Technical Reports Server (NTRS)

    Haertling, Gene; Furman, Eugene; Hsi, Chi-Shiung; Li, Guang

    1993-01-01

    The processing and screen printing of the superconducting BSCCO and 123 YBCO materials on substrates is described. The resulting superconducting properties and the use of these materials as possible electrode materials for ferroelectrics at 77 K are evaluated. Also, work performed in the development of solid-state electromechanical actuators is reported. Specific details include the fabrication and processing of high strain PBZT and PLZT electrostrictive materials, the development of PSZT and PMN-based ceramics, and the testing and evaluation of these electrostrictive materials. Finally, the results of studies on a new processing technology for preparing piezoelectric and electrostrictive ceramic materials are summarized. The process involves a high temperature chemical reduction which leads to an internal pre-stressing of the oxide wafer. These reduced and internally biased oxide wafers (RAINBOW) can produce bending-mode actuator devices which possess a factor of ten more displacement and load bearing capacity than present-day benders.

  5. Alternative sites for LNG facilities in the Cook Inlet/Kenai Peninsula, Alaska area. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1975-10-02

    The purpose of this study was to analyze alternate LNG sites in the Cook Inlet area, Alaska, with primary emphasis on sites not identified by the El Paso-Alaska LNG Company in Docket No. CP-75-96. The evaluation included a systematic gross elimination process that reduced eleven major subregions of Cook Inlet to eight subregions, based upon considerations of land use and status, proximity of volcanos and other detrimental geological features, unsafe approaches for maneuvering and docking transport vessels, and adverse meteorological and marine conditions. This initial elimination process was followed by a more detailed iterative process of location and evaluation of 26 specific sites in terms of local adverse impacts to biotic communities, human populations, and present land use practices. The analysis and elimination process resulted in the eventual selection and ranking of three sites: (1) Nikiski; (2) Cape Starichkof; (3) Resurrection Bay East. (GRA)

  6. System integration of wind and solar power in integrated assessment models: A cross-model evaluation of new approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pietzcker, Robert C.; Ueckerdt, Falko; Carrara, Samuel

    Mitigation-Process Integrated Assessment Models (MP-IAMs) are used to analyze long-term transformation pathways of the energy system required to achieve stringent climate change mitigation targets. Due to their substantial temporal and spatial aggregation, IAMs cannot explicitly represent all detailed challenges of integrating the variable renewable energies (VRE) wind and solar in power systems, but rather rely on parameterized modeling approaches. In the ADVANCE project, six international modeling teams have developed new approaches to improve the representation of power sector dynamics and VRE integration in IAMs. In this study, we qualitatively and quantitatively evaluate the modeling progress of recent years and study the impact of VRE integration modeling on VRE deployment in IAM scenarios. For a comprehensive and transparent qualitative evaluation, we first develop a framework of 18 features of power sector dynamics and VRE integration. We then apply this framework to the newly-developed modeling approaches to derive a detailed map of strengths and limitations of the different approaches. For the quantitative evaluation, we compare the IAMs to the detailed hourly-resolution power sector model REMIX. We find that the new modeling approaches manage to represent a large number of features of the power sector, and the numerical results are in reasonable agreement with those derived from the detailed power sector model. Updating the power sector representation and the cost and resources of wind and solar substantially increased wind and solar shares across models: under a carbon price of 30$/tCO2 in 2020 (increasing by 5% per year), the model-average cost-minimizing VRE share over the period 2050-2100 is 62% of electricity generation, 24 percentage points higher than with the old model version.

  7. Computer program to perform cost and weight analysis of transport aircraft. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A digital computer program for evaluating the weight and costs of advanced transport designs was developed. The resultant program, intended for use at the preliminary design level, incorporates both batch mode and interactive graphics run capability. The basis of the weight and cost estimation method developed is a unique way of predicting the physical design of each detail part of a vehicle structure at a time when only configuration concept drawings are available. In addition, the technique relies on methods to predict the precise manufacturing processes and the associated material required to produce each detail part. Weight data are generated in four areas of the program. Overall vehicle system weights are derived on a statistical basis as part of the vehicle sizing process. Theoretical weights, actual weights, and the weight of the raw material to be purchased are derived as part of the structural synthesis and part definition processes based on the computed part geometry.

  8. How to become a better clinical teacher: a collaborative peer observation process.

    PubMed

    Finn, Kathleen; Chiappa, Victor; Puig, Alberto; Hunt, Daniel P

    2011-01-01

    Peer observation of teaching (PoT) is most commonly done as a way of evaluating educators in lecture or small group teaching. Teaching in the clinical environment is a complex and hectic endeavor that requires nimble and innovative teaching on a daily basis. Most junior faculty start their careers with little formal training in education and with limited opportunity to be observed or to observe more experienced faculty. Formal PoT would potentially ameliorate these challenges. This article describes a collaborative peer observation process that a group of 11 clinician educators is using as a longitudinal faculty development program. The process described in this article provides detailed and specific teaching feedback for the observed teaching attending while prompting the observing faculty to reflect on their own teaching style and to borrow effective teaching techniques from the observation. This article provides detailed examples from written feedback obtained during collaborative peer observation to emphasize the richness of this combined experience.

  9. The HEAO experience - design through operations

    NASA Technical Reports Server (NTRS)

    Hoffman, D. P.

    1983-01-01

The design process and performance of the NASA High Energy Astronomy Observatories (HEAO-1, 2, and 3) are surveyed from the initiation of the program in 1968 through the end of HEAO-3 operation in May 1981, with a focus on the attitude control and determination subsystem (ACDS). The science objectives, original and revised overall design concepts, final design for each spacecraft, and details of the ACDS designs are discussed, and the stages of the ACDS design process, including redefinition to achieve 50 percent cost reduction, detailed design of common and mission-unique hardware and software, unit qualification, subsystem integration, and observatory-level testing, are described. Overall and ACDS performance is evaluated for each mission and found to meet or exceed design requirements despite some difficulties arising from errors in star-tracker/ACDS interface coordination and from gyroscope failures. These difficulties were resolved by using the flexibility of the software design. The implications of the HEAO experience for the design process of future spacecraft are suggested.

  10. Pan-sharpening algorithm to remove thin cloud via mask dodging and nonsampled shift-invariant shearlet transform

    NASA Astrophysics Data System (ADS)

    Shi, Cheng; Liu, Fang; Li, Ling-Ling; Hao, Hong-Xia

    2014-01-01

The goal of pan-sharpening is to obtain an image with higher spatial resolution and better spectral information. However, the resolution of the pan-sharpened image is seriously degraded by thin clouds. For a single image, filtering algorithms are widely used to remove clouds; such methods remove clouds effectively, but the resulting image also suffers serious detail loss. To solve this problem, a pan-sharpening algorithm that removes thin cloud via mask dodging and the nonsampled shift-invariant shearlet transform (NSST) is proposed. For low-resolution multispectral (LR MS) and high-resolution panchromatic images with thin clouds, a mask dodging method is used to remove the clouds. For the cloud-removed LR MS image, an adaptive principal component analysis transform is proposed to balance spectral information and spatial resolution in the pan-sharpened image. Since the cloud removal process causes detail loss, a weight matrix is designed to enhance the details of the cloud regions during pan-sharpening while leaving noncloud regions unchanged; the image details themselves are obtained by NSST. Experimental results, assessed both visually and with evaluation metrics, demonstrate that the proposed method preserves better spectral information and spatial resolution, especially for images with thin clouds.
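The weight-matrix step can be illustrated with a toy sketch (pure Python, hypothetical function and values; the actual method operates on NSST subband coefficients, which are not modeled here):

```python
def inject_details(base, detail, cloud_mask, gain=1.5):
    """Toy version of the weight-matrix idea described above: amplify the
    high-frequency detail layer inside the (former) cloud regions and leave
    noncloud pixels unchanged. The gain value is illustrative; the paper
    derives its weights and details from NSST subbands."""
    return [[b + (gain if masked else 1.0) * d
             for b, d, masked in zip(brow, drow, mrow)]
            for brow, drow, mrow in zip(base, detail, cloud_mask)]

base   = [[10.0, 10.0], [10.0, 10.0]]    # cloud-removed LR MS band (toy values)
detail = [[ 2.0,  2.0], [ 2.0,  2.0]]    # detail layer (from NSST in the paper)
mask   = [[True, False], [False, True]]  # True = pixel was under thin cloud
print(inject_details(base, detail, mask))  # [[13.0, 12.0], [12.0, 13.0]]
```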

  11. Biosurfactant production by Aureobasidium pullulans in stirred tank bioreactor: New approach to understand the influence of important variables in the process.

    PubMed

    Brumano, Larissa Pereira; Antunes, Felipe Antonio Fernandes; Souto, Sara Galeno; Dos Santos, Júlio Cesar; Venus, Joachim; Schneider, Roland; da Silva, Silvio Silvério

    2017-11-01

Surfactants are amphiphilic molecules with broad industrial applications, currently produced by chemical routes derived mainly from the oil industry. Biotechnological processes, which aim to develop new sustainable process configurations using favorable microorganisms, still require more detailed investigation. Thus, we present a novel approach for biosurfactant production using the promising yeast Aureobasidium pullulans LB 83 in a stirred tank reactor. A central composite face-centered design was carried out to evaluate the effect of aeration rate (0.1-1.1 min^-1) and sucrose concentration (20-80 g L^-1) on maximum biosurfactant tensoactivity and productivity. Statistical analysis showed that the variables at high levels enhanced tensoactivity, yielding 8.05 cm in the oil spread test and a productivity of 0.0838 cm h^-1. This unprecedented investigation of the relevance of aeration rate and sucrose concentration in biosurfactant production by A. pullulans in a stirred tank reactor demonstrates the importance of establishing adequate bioreactor conditions for process scale-up. Copyright © 2017 Elsevier Ltd. All rights reserved.
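The central composite face-centered design mentioned above has a fixed geometry for two factors (four corner points, four face-center points, plus replicated center points); a sketch of generating the run list, where the generator function and the number of center replicates are assumptions, not taken from the study:

```python
from itertools import product

def ccf_design(lows, highs, n_center=3):
    """Face-centered central composite design for k factors.
    Coded levels are -1, 0, +1 (alpha = 1, axial points on the cube faces)."""
    k = len(lows)
    factorial = list(product((-1, 1), repeat=k))  # 2^k corner points
    axial = []
    for i in range(k):                            # 2k face-center points
        for s in (-1, 1):
            pt = [0] * k
            pt[i] = s
            axial.append(tuple(pt))
    center = [(0,) * k] * n_center                # replicated center point
    def decode(pt):  # map coded level to physical units
        return tuple(lo + (c + 1) / 2 * (hi - lo)
                     for c, lo, hi in zip(pt, lows, highs))
    return [decode(p) for p in factorial + axial + center]

# Factors from the abstract: aeration rate (min^-1) and sucrose (g/L)
runs = ccf_design(lows=(0.1, 20.0), highs=(1.1, 80.0))
print(len(runs), runs[0])
```

For two factors this yields 4 + 4 + 3 = 11 runs, enough to fit the quadratic response surface used to analyze tensoactivity and productivity.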

  12. Walk-through survey report: Control technology for integrated circuit fabrication, Xerox Corporation, El Segundo, California

    NASA Astrophysics Data System (ADS)

    Mihlan, G. J.; Ungers, L. J.; Smith, R. K.; Mitchell, R. I.; Jones, J. H.

    1983-05-01

    A preliminary control technology assessment survey was conducted at the facility which manufactures N-channel metal oxide semiconductor (NMOS) integrated circuits. The facility has industrial hygiene review procedures for evaluating all new and existing process equipment. Employees are trained in safety, use of personal protective equipment, and emergency response. Workers potentially exposed to arsenic are monitored for urinary arsenic levels. The facility should be considered a candidate for detailed study based on the diversity of process operations encountered and the use of state-of-the-art technology and process equipment.

  13. GoActive: a protocol for the mixed methods process evaluation of a school-based physical activity promotion programme for 13-14 year old adolescents.

    PubMed

    Jong, Stephanie T; Brown, Helen Elizabeth; Croxson, Caroline H D; Wilkinson, Paul; Corder, Kirsten L; van Sluijs, Esther M F

    2018-05-21

    Process evaluations are critical for interpreting and understanding outcome trial results. By understanding how interventions function across different settings, process evaluations have the capacity to inform future dissemination of interventions. The complexity of Get others Active (GoActive), a 12-week, school-based physical activity intervention implemented in eight schools, highlights the need to investigate how implementation is achieved across a variety of school settings. This paper describes the mixed methods GoActive process evaluation protocol that is embedded within the outcome evaluation. In this detailed process evaluation protocol, we describe the flexible and pragmatic methods that will be used for capturing the process evaluation data. A mixed methods design will be used for the process evaluation, including quantitative data collected in both the control and intervention arms of the GoActive trial, and qualitative data collected in the intervention arm. Data collection methods will include purposively sampled, semi-structured interviews and focus group interviews, direct observation, and participant questionnaires (completed by students, teachers, older adolescent mentors, and local authority-funded facilitators). Data will be analysed thematically within and across datasets. Overall synthesis of findings will address the process of GoActive implementation, and how this process affects outcomes, with careful attention to the context of the school environment. This process evaluation will explore the experience of participating in GoActive from the perspectives of key groups, providing a greater understanding of the acceptability and process of implementation of the intervention across the eight intervention schools. This will allow for appraisal of the intervention's conceptual base, inform potential dissemination, and help optimise post-trial sustainability.
The process evaluation will also assist in contextualising the trial effectiveness results with respect to how the intervention may or may not have worked and, if it was found to be effective, what might be required for it to be sustained in the 'real world'. Furthermore, it will offer suggestions for the development and implementation of future initiatives to promote physical activity within schools. ISRCTN, ISRCTN31583496 . Registered on 18 February 2014.

  14. Operational CryoSat Product Quality Assessment

    NASA Astrophysics Data System (ADS)

    Mannan, Rubinder; Webb, Erica; Hall, Amanda; Bouzinac, Catherine

    2013-12-01

    The performance and quality of the CryoSat data products are routinely assessed by the Instrument Data quality Evaluation and Analysis Service (IDEAS). This information is then conveyed to the scientific and user community in order to allow them to utilise CryoSat data with confidence. This paper presents details of the Quality Control (QC) activities performed for CryoSat products under the IDEAS contract. Details of the different QC procedures and tools deployed by IDEAS to assess the quality of operational data are presented. The latest updates to the Instrument Processing Facility (IPF) for the Fast Delivery Marine (FDM) products and the future update to Baseline-C are discussed.

  15. Techniques to evaluate the importance of common cause degradation on reliability and safety of nuclear weapons.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darby, John L.

    2011-05-01

As the nuclear weapon stockpile ages, there is increased concern about common degradation ultimately leading to common cause failure of multiple weapons that could significantly impact reliability or safety. Current acceptable limits for the reliability and safety of a weapon are based on upper limits on the probability of failure of an individual item, assuming that failures among items are independent. We expanded the current acceptable limits to apply to situations with common cause failure. Then, we developed a simple screening process to quickly assess the importance of observed common degradation for both reliability and safety to determine if further action is necessary. The screening process conservatively assumes that common degradation is common cause failure. For a population with between 100 and 5000 items we applied the screening process and conclude the following. In general, for a reliability requirement specified in the Military Characteristics (MCs) for a specific weapon system, common degradation is of concern if more than 100(1-x)% of the weapons are susceptible to common degradation, where x is the required reliability expressed as a fraction. Common degradation is of concern for the safety of a weapon subsystem if more than 0.1% of the population is susceptible to common degradation. Common degradation is of concern for the safety of a weapon component or overall weapon system if two or more components/weapons in the population are susceptible to degradation. Finally, we developed a technique for detailed evaluation of common degradation leading to common cause failure for situations that are determined to be of concern using the screening process. The detailed evaluation requires that best estimates of common cause and independent failure probabilities be produced. Using these techniques, observed common degradation can be evaluated for effects on reliability and safety.
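The three screening thresholds above are simple enough to state as code; the sketch below is a literal transcription of those rules, with the function name and interface invented for illustration (they are not from the report):

```python
def screen_common_degradation(n_susceptible, population,
                              required_reliability=None,
                              level="system_reliability"):
    """Transcription of the three screening thresholds stated in the abstract
    (for populations of roughly 100 to 5000 items). Conservatively treats
    common degradation as common cause failure.

    level = "system_reliability": of concern if more than 100*(1-x)% of the
        weapons are susceptible, where x is the required reliability fraction.
    level = "subsystem_safety": of concern if more than 0.1% is susceptible.
    level = "component_safety": of concern if two or more are susceptible.
    """
    frac = n_susceptible / population
    if level == "system_reliability":
        return frac > (1.0 - required_reliability)
    if level == "subsystem_safety":
        return frac > 0.001
    if level == "component_safety":
        return n_susceptible >= 2
    raise ValueError("unknown level: " + level)

# Example: 3% of a 1000-item population susceptible, required reliability 0.98:
# 0.03 > 0.02, so the degradation is of concern and warrants detailed evaluation.
print(screen_common_degradation(30, 1000, required_reliability=0.98))
```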

  16. DDR process and materials for novel tone reverse technique

    NASA Astrophysics Data System (ADS)

    Shigaki, Shuhei; Shibayama, Wataru; Takeda, Satoshi; Tamura, Mamoru; Nakajima, Makoto; Sakamoto, Rikimaru

    2018-03-01

We developed a novel process and material that create a reverse-tone pattern without any collapse. The process is the Dry Development Rinse (DDR) process, and the material used in it, the DDR material, contains a siloxane polymer that replaces the space area of the photoresist pattern. The reverse-tone pattern is then obtained by a dry etching process without any pattern collapse issue. The DDR process achieved fine line-and-space patterning below hp14nm without pattern collapse in combination with PTD or NTD photoresists. DDR materials were demonstrated on the latest coater track at imec; the DDR process was fully automated, and good CD uniformity was achieved after dry development. Full-wafer evaluation enabled a detailed study of CD uniformity (CDU). The CDU of the DDR pattern was compared to that of the pre-pattern: lower CDU was achieved, and CDU healing was observed with a special DDR material. Further evaluation showed that the special DDR material had a relatively small E-slope compared to another DDR material; this small E-slope caused the CDU improvement.

  17. Pediatric Intubation by Paramedics in a Large Emergency Medical Services System: Process, Challenges, and Outcomes.

    PubMed

    Prekker, Matthew E; Delgado, Fernanda; Shin, Jenny; Kwok, Heemun; Johnson, Nicholas J; Carlbom, David; Grabinsky, Andreas; Brogan, Thomas V; King, Mary A; Rea, Thomas D

    2016-01-01

    Pediatric intubation is a core paramedic skill in some emergency medical services (EMS) systems. The literature lacks a detailed examination of the challenges and subsequent adjustments made by paramedics when intubating children in the out-of-hospital setting. We undertake a descriptive evaluation of the process of out-of-hospital pediatric intubation, focusing on challenges, adjustments, and outcomes. We performed a retrospective analysis of EMS responses between 2006 and 2012 that involved attempted intubation of children younger than 13 years by paramedics in a large, metropolitan EMS system. We calculated the incidence rate of attempted pediatric intubation with EMS and county census data. To summarize the intubation process, we linked a detailed out-of-hospital airway registry with clinical records from EMS, hospital, or autopsy encounters for each child. The main outcome measures were procedural challenges, procedural success, complications, and patient disposition. Paramedics attempted intubation in 299 cases during 6.3 years, with an incidence of 1 pediatric intubation per 2,198 EMS responses. Less than half of intubations (44%) were for patients in cardiac arrest. Two thirds of patients were intubated on the first attempt (66%), and overall success was 97%. The most prevalent challenge was body fluids obscuring the laryngeal view (33%). After a failed first intubation attempt, corrective actions taken by paramedics included changing equipment (33%), suctioning (32%), and repositioning the patient (27%). Six patients (2%) experienced peri-intubation cardiac arrest and 1 patient had an iatrogenic tracheal injury. No esophageal intubations were observed. Of patients transported to the hospital, 86% were admitted to intensive care and hospital mortality was 27%. Pediatric intubation by paramedics was performed infrequently in this EMS system. 
Although overall intubation success was high, a detailed evaluation of the process of intubation revealed specific challenges and adjustments that can be anticipated by paramedics to improve first-pass success, potentially reduce complications, and ultimately improve clinical outcomes. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  18. Software Quality Assurance and Verification for the MPACT Library Generation Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yuxuan; Williams, Mark L.; Wiarda, Dorothea

This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each software package used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests used to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to (1) ensure that it can be run without user intervention and (2) ensure that the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library’s release is described in detail.

  19. A Serious Games Platform for Cognitive Rehabilitation with Preliminary Evaluation.

    PubMed

    Rego, Paula Alexandra; Rocha, Rui; Faria, Brígida Mónica; Reis, Luís Paulo; Moreira, Pedro Miguel

    2017-01-01

In recent years Serious Games have evolved substantially, solving problems in diverse areas. In particular, Serious Games assume a relevant role in Cognitive Rehabilitation. Traditional cognitive therapies are often considered repetitive and discouraging for patients, and Serious Games can be used to create more dynamic rehabilitation processes, holding patients' attention throughout the process and motivating them on their road to recovery. This paper reviews Serious Games and user interfaces in the rehabilitation area and details a Serious Games platform for Cognitive Rehabilitation that includes a set of features such as natural and multimodal user interfaces and social features (competition, collaboration, and handicapping), which can contribute to augmenting the motivation of patients during the rehabilitation process. The web platform was tested with healthy subjects. Results of this preliminary evaluation show the motivation and interest of the participants in playing the games.

  20. Viewing the functional consequences of traumatic brain injury by using brain SPECT.

    PubMed

    Pavel, D; Jobe, T; Devore-Best, S; Davis, G; Epstein, P; Sinha, S; Kohn, R; Craita, I; Liu, P; Chang, Y

    2006-03-01

High-resolution brain SPECT is increasingly benefiting from improved image processing software and multiple complementary display capabilities. This enables detailed functional mapping of the disturbances in relative perfusion occurring after TBI. The patient population consisted of 26 cases (ages 8-61 years) between 3 months and 6 years after traumatic brain injury. A very strong case can be made for the routine use of brain SPECT in TBI. Indeed, it can provide a detailed evaluation of multiple functional consequences after TBI and is thus capable of supplementing the clinical evaluation and tailoring the therapeutic strategies needed. In so doing it also provides significant additional information beyond that available from MRI/CT. The critical factor for brain SPECT's clinical relevance is a carefully designed technical protocol, including displays which should enable a comprehensive description of the patterns found, in a user-friendly mode.

  1. Program for establishing long-time flight service performance of composite materials in the center wing structure of C-130 aircraft. Phase 4: Ground/flight acceptance tests

    NASA Technical Reports Server (NTRS)

    Harvill, W. E.; Kizer, J. A.

    1976-01-01

    The advantageous structural uses of advanced filamentary composites are demonstrated by design, fabrication, and test of three boron-epoxy reinforced C-130 center wing boxes. The advanced development work necessary to support detailed design of a composite reinforced C-130 center wing box was conducted. Activities included the development of a basis for structural design, selection and verification of materials and processes, manufacturing and tooling development, and fabrication and test of full-scale portions of the center wing box. Detailed design drawings, and necessary analytical structural substantiation including static strength, fatigue endurance, flutter, and weight analyses are considered. Some additional component testing was conducted to verify the design for panel buckling, and to evaluate specific local design areas. Development of the cool tool restraint concept was completed, and bonding capabilities were evaluated using full-length skin panel and stringer specimens.

  2. Advanced composite elevator for Boeing 727 aircraft, volume 2

    NASA Technical Reports Server (NTRS)

    Chovil, D. V.; Grant, W. D.; Jamison, E. S.; Syder, H.; Desper, O. E.; Harvey, S. T.; Mccarty, J. E.

    1980-01-01

    Preliminary design activity consisted of developing and analyzing alternate design concepts and selecting the optimum elevator configuration. This included trade studies in which durability, inspectability, producibility, repairability, and customer acceptance were evaluated. Preliminary development efforts consisted of evaluating and selecting material, identifying ancillary structural development test requirements, and defining full scale ground and flight test requirements necessary to obtain Federal Aviation Administration (FAA) certification. After selection of the optimum elevator configuration, detail design was begun and included basic configuration design improvements resulting from manufacturing verification hardware, the ancillary test program, weight analysis, and structural analysis. Detail and assembly tools were designed and fabricated to support a full-scope production program, rather than a limited run. The producibility development programs were used to verify tooling approaches, fabrication processes, and inspection methods for the production mode. Quality parts were readily fabricated and assembled with a minimum rejection rate, using prior inspection methods.

  3. Process-based modelling of the methane balance in periglacial landscapes (JSBACH-methane)

    NASA Astrophysics Data System (ADS)

    Kaiser, Sonja; Göckede, Mathias; Castro-Morales, Karel; Knoblauch, Christian; Ekici, Altug; Kleinen, Thomas; Zubrzycki, Sebastian; Sachs, Torsten; Wille, Christian; Beer, Christian

    2017-01-01

    A detailed process-based methane module for a global land surface scheme has been developed which is general enough to be applied in permafrost regions as well as wetlands outside permafrost areas. Methane production, oxidation and transport by ebullition, diffusion and plants are represented. In this model, oxygen has been explicitly incorporated into diffusion, transport by plants and two oxidation processes, of which one uses soil oxygen, while the other uses oxygen that is available via roots. Permafrost and wetland soils show special behaviour, such as variable soil pore space due to freezing and thawing or water table depths due to changing soil water content. This has been integrated directly into the methane-related processes. A detailed application at the Samoylov polygonal tundra site, Lena River Delta, Russia, is used for evaluation purposes. The application at Samoylov also shows differences in the importance of the several transport processes and in the methane dynamics under varying soil moisture, ice and temperature conditions during different seasons and on different microsites. These microsites are the elevated moist polygonal rim and the depressed wet polygonal centre. The evaluation shows sufficiently good agreement with field observations despite the fact that the module has not been specifically calibrated to these data. This methane module is designed such that the advanced land surface scheme is able to model recent and future methane fluxes from periglacial landscapes across scales. In addition, the methane contribution to carbon cycle-climate feedback mechanisms can be quantified when running coupled to an atmospheric model.
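The balance of processes listed above (production, oxidation, and transport by ebullition, diffusion and plants) can be caricatured as a single-box rate equation; the sketch below is a toy illustration with invented rate constants, not the JSBACH-methane formulation:

```python
def methane_step(conc, dt, production, k_ox, k_ebul, k_diff, k_plant):
    """One explicit-Euler step of a toy soil methane balance:
    dC/dt = production - oxidation - (ebullition + diffusion + plant transport),
    with each loss treated as first order in concentration. All parameter
    values are illustrative, not JSBACH-methane values."""
    losses = (k_ox + k_ebul + k_diff + k_plant) * conc
    return conc + dt * (production - losses)

c = 0.0
for _ in range(1000):
    c = methane_step(c, dt=0.1, production=1.0,
                     k_ox=0.2, k_ebul=0.05, k_diff=0.1, k_plant=0.15)
print(round(c, 3))  # settles near production / total loss rate = 1.0 / 0.5 = 2.0
```

The actual module differs in essential ways (explicit oxygen limitation, variable pore space and water table, per-microsite conditions), but the steady-state intuition, production balanced against the sum of loss pathways, carries over.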

  4. Performance Evaluation of Target Detection with a Near-Space Vehicle-Borne Radar in Blackout Condition.

    PubMed

    Li, Yanpeng; Li, Xiang; Wang, Hongqiang; Deng, Bin; Qin, Yuliang

    2016-01-06

Radar is a very important sensor in surveillance applications. Near-space vehicle-borne radar (NSVBR) is a novel installation of a radar system, which offers many benefits, like being highly suited to the remote sensing of extremely large areas, having a rapidly deployable capability and having low vulnerability to electronic countermeasures. Unfortunately, a target detection challenge arises because of complicated scenarios, such as nuclear blackout, rain attenuation, etc. In these cases, extra care is needed to evaluate the detection performance in blackout situations, since this is a classical problem accompanying the application of an NSVBR. However, the existing evaluation measures are the probability of detection and the receiver operating curve (ROC), which cannot offer detailed information in such a complicated application. This work focuses on such requirements. We first investigate the effect of blackout on an electromagnetic wave. Performance evaluation indexes are then built: three evaluation indexes on the detection capability and two evaluation indexes on the robustness of the detection process. Simulation results show that the proposed measures offer information on the detailed performance of detection. These measures are therefore very useful in detecting the target of interest in a remote sensing system and are helpful for both NSVBR designers and users.

  5. Performance Evaluation of Target Detection with a Near-Space Vehicle-Borne Radar in Blackout Condition

    PubMed Central

    Li, Yanpeng; Li, Xiang; Wang, Hongqiang; Deng, Bin; Qin, Yuliang

    2016-01-01

Radar is a very important sensor in surveillance applications. Near-space vehicle-borne radar (NSVBR) is a novel installation of a radar system, which offers many benefits, like being highly suited to the remote sensing of extremely large areas, having a rapidly deployable capability and having low vulnerability to electronic countermeasures. Unfortunately, a target detection challenge arises because of complicated scenarios, such as nuclear blackout, rain attenuation, etc. In these cases, extra care is needed to evaluate the detection performance in blackout situations, since this is a classical problem accompanying the application of an NSVBR. However, the existing evaluation measures are the probability of detection and the receiver operating curve (ROC), which cannot offer detailed information in such a complicated application. This work focuses on such requirements. We first investigate the effect of blackout on an electromagnetic wave. Performance evaluation indexes are then built: three evaluation indexes on the detection capability and two evaluation indexes on the robustness of the detection process. Simulation results show that the proposed measures offer information on the detailed performance of detection. These measures are therefore very useful in detecting the target of interest in a remote sensing system and are helpful for both NSVBR designers and users. PMID:26751445

  6. City and County Solar PV Training Program, Module 3: Detailed Site Evaluation, Project Validation, and Permitting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Day, Megan H; Lisell, Lars J

This is the third of five training modules recorded for the City and County Solar PV Training Program. The program is focused on training local government staff in the PV procurement process. This module focuses on siting and permitting for both rooftop and larger, ground-mounted systems and includes a link to a video.

  7. Between Traditions: Stephen Ball and the Critical Sociology of Education

    ERIC Educational Resources Information Center

    Apple, Michael W.

    2013-01-01

    Stephen Ball's work has deservedly received a good deal of attention. In this article, I detail a number of tasks in which the critical sociologist of education--as a "public intellectual"--should engage. I then place Ball's work within these tasks and evaluate his contributions to them. In the process, I show that one of the…

  8. Evaluating plant biodiversity measurements and exotic species detection in National Resources Inventory Sampling protocols using examples from the Northern Great Plains of the USA

    USDA-ARS?s Scientific Manuscript database

    Native plant biodiversity loss and exotic species invasions are threatening the ability of many ecosystems to maintain key functions and processes. We currently lack detailed plant biodiversity data at a national scale with which to make management decisions and recommendations based on current cons...

  9. Holistic Metrics for Assessment of the Greenness of Chemical Reactions in the Context of Chemical Education

    ERIC Educational Resources Information Center

    Ribeiro, M. Gabriela T. C.; Machado, Adelio A. S. C.

    2013-01-01

    Two new semiquantitative green chemistry metrics, the green circle and the green matrix, have been developed for quick assessment of the greenness of a chemical reaction or process, even without performing the experiment from a protocol if enough detail is provided in it. The evaluation is based on the 12 principles of green chemistry. The…

  10. Modeling linear and cyclic PKS intermediates through atom replacement.

    PubMed

    Shakya, Gaurav; Rivera, Heriberto; Lee, D John; Jaremko, Matt J; La Clair, James J; Fox, Daniel T; Haushalter, Robert W; Schaub, Andrew J; Bruegger, Joel; Barajas, Jesus F; White, Alexander R; Kaur, Parminder; Gwozdziowski, Emily R; Wong, Fiona; Tsai, Shiou-Chuan; Burkart, Michael D

    2014-12-03

    The mechanistic details of many polyketide synthases (PKSs) remain elusive due to the instability of transient intermediates that are not accessible via conventional methods. Here we report an atom replacement strategy that enables the rapid preparation of polyketone surrogates by selective atom replacement, thereby providing key substrate mimetics for detailed mechanistic evaluations. Polyketone mimetics are positioned on the actinorhodin acyl carrier protein (actACP) to probe the underpinnings of substrate association upon nascent chain elongation and processivity. Protein NMR is used to visualize substrate interaction with the actACP, where a tetraketide substrate is shown not to bind within the protein, while heptaketide and octaketide substrates show strong association between helix II and IV. To examine the later cyclization stages, we extended this strategy to prepare stabilized cyclic intermediates and evaluate their binding by the actACP. Elongated monocyclic mimics show much longer residence time within actACP than shortened analogs. Taken together, these observations suggest ACP-substrate association occurs both before and after ketoreductase action upon the fully elongated polyketone, indicating a key role played by the ACP within PKS timing and processivity. These atom replacement mimetics offer new tools to study protein and substrate interactions and are applicable to a wide variety of PKSs.

  11. Similarity measure and domain adaptation in multiple mixture model clustering: An application to image processing.

    PubMed

    Leong, Siow Hoo; Ong, Seng Huat

    2017-01-01

    This paper considers three crucial issues in processing scaled-down images: the representation of partial images, similarity measure, and domain adaptation. Two Gaussian mixture model based algorithms are proposed to effectively preserve image details and avoid image degradation. Multiple partial images are clustered separately through Gaussian mixture model clustering with a scan-and-select procedure to enhance the inclusion of small image details. The local image features, represented by maximum likelihood estimates of the mixture components, are classified by using the modified Bayes factor (MBF) as a similarity measure. The detection of novel local features from the MBF will suggest domain adaptation, that is, changing the number of components of the Gaussian mixture model. The performance of the proposed algorithms is evaluated with simulated data and real images, and they are shown to perform much better than existing Gaussian mixture model based algorithms in reproducing images with a higher structural similarity index.
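
The record does not spell out the modified Bayes factor itself; as a generic sketch of the component-level similarity idea, a symmetrised Kullback-Leibler divergence between fitted univariate Gaussian components can play the same role (smaller means more similar; all numbers here are hypothetical, not from the paper):

```python
import math

def kl_gauss(mu_p, sigma_p, mu_q, sigma_q):
    """Closed-form KL divergence KL(p || q) between two univariate Gaussians."""
    return (math.log(sigma_q / sigma_p)
            + (sigma_p**2 + (mu_p - mu_q)**2) / (2 * sigma_q**2)
            - 0.5)

def component_similarity(c1, c2):
    """Symmetrised divergence between two (mu, sigma) components."""
    return kl_gauss(*c1, *c2) + kl_gauss(*c2, *c1)

# Two identical components are maximally similar; a shifted one is not.
same = component_similarity((0.0, 1.0), (0.0, 1.0))   # 0.0
far  = component_similarity((0.0, 1.0), (5.0, 1.0))   # large
```

A threshold on such a score (in the paper, on the MBF) decides whether a newly detected local feature matches an existing component or requires adding one, i.e. domain adaptation.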

  12. Similarity measure and domain adaptation in multiple mixture model clustering: An application to image processing

    PubMed Central

    Leong, Siow Hoo

    2017-01-01

    This paper considers three crucial issues in processing scaled-down images: the representation of partial images, similarity measure, and domain adaptation. Two Gaussian mixture model based algorithms are proposed to effectively preserve image details and avoid image degradation. Multiple partial images are clustered separately through Gaussian mixture model clustering with a scan-and-select procedure to enhance the inclusion of small image details. The local image features, represented by maximum likelihood estimates of the mixture components, are classified by using the modified Bayes factor (MBF) as a similarity measure. The detection of novel local features from the MBF will suggest domain adaptation, that is, changing the number of components of the Gaussian mixture model. The performance of the proposed algorithms is evaluated with simulated data and real images, and they are shown to perform much better than existing Gaussian mixture model based algorithms in reproducing images with a higher structural similarity index. PMID:28686634

  13. High Energy Vibration for Gas Piping

    NASA Astrophysics Data System (ADS)

    Lee, Gary Y. H.; Chan, K. B.; Lee, Aylwin Y. S.; Jia, ShengXiang

    2017-07-01

    In September 2016, a gas compressor offshore Sarawak had its rotor changed out. Prior to this change-out, a pipe vibration study was carried out by the project team to evaluate potential high-energy pipe vibration problems at the compressor’s existing relief-valve downstream pipes due to process-condition changes after the rotor change-out. This paper covers high frequency acoustic excitation (HFAE) vibration, also known as acoustic induced vibration (AIV), and discusses detailed methodologies as a companion to the Energy Institute Guidelines for the avoidance of vibration induced fatigue failure, which are a common industry practice for assessing and mitigating AIV-induced fatigue failure. Such detailed theoretical studies can help minimize or entirely avoid physical pipe modification, so that offshore plant shutdowns are required only to accommodate gas compressor upgrades, reducing cost without compromising process safety.

  14. Ares I-X Flight Data Evaluation: Executive Overview

    NASA Technical Reports Server (NTRS)

    Huebner, Lawrence D.; Waits, David A.; Lewis, Donny L.; Richards, James S.; Coates, R. H., Jr.; Cruit, Wendy D.; Bolte, Elizabeth J.; Bangham, Michal E.; Askins, Bruce R.; Trausch, Ann N.

    2011-01-01

    NASA's Constellation Program (CxP) successfully launched the Ares I-X flight test vehicle on October 28, 2009. The Ares I-X flight was a developmental flight test to demonstrate that this very large, long, and slender vehicle could be controlled successfully. The flight offered a unique opportunity for early engineering data to influence the design and development of the Ares I crew launch vehicle. As the primary customer for flight data from the Ares I-X mission, the Ares Projects Office (APO) established a set of 33 flight evaluation tasks to correlate flight results with prospective design assumptions and models. The flight evaluation tasks used Ares I-X data to partially validate tools and methodologies in technical disciplines that will ultimately influence the design and development of Ares I and future launch vehicles. Included within these tasks were direct comparisons of flight data with preflight predictions and post-flight assessments utilizing models and processes being applied to design and develop Ares I. The benefits of early development flight testing were made evident by results from these flight evaluation tasks. This overview provides summary information from assessment of the Ares I-X flight test data and represents a small subset of the detailed technical results. The Ares Projects Office published a 1,600-plus-page detailed technical report that documents the full set of results. This detailed report is subject to the International Traffic in Arms Regulations (ITAR) and is available in the Ares Projects Office archives files.

  15. High-quality observation of surface imperviousness for urban runoff modelling using UAV imagery

    NASA Astrophysics Data System (ADS)

    Tokarczyk, P.; Leitao, J. P.; Rieckermann, J.; Schindler, K.; Blumensaat, F.

    2015-10-01

    Modelling rainfall-runoff in urban areas is increasingly applied to support flood risk assessment, particularly against the background of a changing climate and increasing urbanization. These models typically rely on high-quality data for rainfall and for the surface characteristics of the catchment area as model input. While recent research in urban drainage has been focusing on providing spatially detailed rainfall data, the technological advances in remote sensing that ease the acquisition of detailed land-use information are less prominently discussed within the community. The relevance of such methods increases because in many parts of the globe accurate land-use information is generally lacking, as detailed image data are often unavailable. Modern unmanned aerial vehicles (UAVs) allow one to acquire high-resolution images on a local level at comparably lower cost, performing on-demand repetitive measurements and obtaining a degree of detail tailored to the purpose of the study. In this study, we investigate for the first time the possibility of deriving high-resolution imperviousness maps for urban areas from UAV imagery and of using this information as input for urban drainage models. To do so, an automatic processing pipeline with a modern classification method is proposed and evaluated in a state-of-the-art urban drainage modelling exercise. In a real-life case study (Lucerne, Switzerland), we compare imperviousness maps generated using a fixed-wing consumer micro-UAV and standard large-format aerial images acquired by the Swiss national mapping agency (swisstopo). After assessing their overall accuracy, we perform an end-to-end comparison, in which they are used as an input for an urban drainage model. Then, we evaluate the influence which different image data sources and their processing methods have on hydrological and hydraulic model performance.
We analyse the surface runoff of the 307 individual subcatchments with respect to relevant attributes, such as peak runoff and runoff volume. Finally, we evaluate the model's channel flow prediction performance through a cross-comparison with reference flow measured at the catchment outlet. We show that imperviousness maps generated from UAV images processed with modern classification methods achieve an accuracy comparable to standard, off-the-shelf aerial imagery. In the examined case study, we find that the different imperviousness maps have only a limited influence on predicted surface runoff and pipe flows when traditional workflows are used. We expect that they will have a substantial influence when more detailed modelling approaches are employed to characterize land use and to predict surface runoff. We conclude that UAV imagery represents a valuable alternative data source for urban drainage model applications, offering the flexibility to acquire up-to-date aerial images at a quality comparable with off-the-shelf image products and at a competitive price. We believe that in the future, urban drainage models representing a higher degree of spatial detail will fully benefit from the strengths of UAV imagery.
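
The imperviousness-map input described above reduces, per subcatchment, to the fraction of pixels classified as impervious. A minimal sketch with hypothetical per-pixel labels (1 = impervious, 0 = pervious), each tagged with a subcatchment ID:

```python
# Hypothetical classified pixels: (subcatchment_id, label).
pixels = [
    (1, 1), (1, 1), (2, 0), (2, 0),
    (1, 1), (1, 0), (2, 0), (2, 0),
    (3, 1), (3, 1), (3, 1), (3, 0),
]

def imperviousness(pixels):
    """Fraction of impervious pixels per subcatchment ID."""
    counts, imperv = {}, {}
    for sid, label in pixels:
        counts[sid] = counts.get(sid, 0) + 1
        imperv[sid] = imperv.get(sid, 0) + label
    return {sid: imperv[sid] / counts[sid] for sid in counts}

frac = imperviousness(pixels)   # e.g. {1: 0.75, 2: 0.0, 3: 0.75}
```

In a real workflow the labels would come from the classified UAV or swisstopo raster and the IDs from the drainage model's subcatchment polygons.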

  16. Better movers and thinkers (BMT): A quasi-experimental study into the impact of physical education on children's cognition-A study protocol.

    PubMed

    Dalziell, Andrew; Boyle, James; Mutrie, Nanette

    2015-01-01

    This study will extend a pilot study and will evaluate the impact of a novel approach to PE, Better Movers and Thinkers (BMT), on students' cognition, physical activity habits, and gross motor coordination (GMC). The study will involve six mainstream state schools with students aged 9-11 years. Three schools will be allocated to the intervention condition and three to the control condition. The design of the study is a 16-week intervention with pre-, post- and 6-month follow-up measurements taken using the 'Cognitive Assessment System (CAS)', GMC tests, and the 'Physical Activity Habits Questionnaire for Children (PAQ-C)'. Qualitative data will be gathered using student focus groups and class teacher interviews in each of the six schools. ANCOVA will be used to evaluate any effect of the intervention, comparing pre-test scores with post-test scores and then pre-test scores with 6-month follow-up scores. Qualitative data will be analysed through an iterative process using grounded theory. This protocol provides the rationale and design of the study and details of the intervention, outcome measures, and the recruitment process. The study will address gaps within current research by evaluating whether a change of approach in the delivery of PE within schools has an effect on children's cognition, PA habits, and GMC within a Scottish setting.

  17. Toward Mixed Method Evaluations of Scientific Visualizations and Design Process as an Evaluation Tool.

    PubMed

    Jackson, Bret; Coffey, Dane; Thorson, Lauren; Schroeder, David; Ellingson, Arin M; Nuckley, David J; Keefe, Daniel F

    2012-10-01

    In this position paper, we discuss successes and limitations of current evaluation strategies for scientific visualizations and argue for embracing a mixed methods strategy of evaluation. The most novel contribution of the approach that we advocate is a new emphasis on employing design processes as practiced in related fields (e.g., graphic design, illustration, architecture) as a formalized mode of evaluation for data visualizations. To motivate this position, we describe a series of recent evaluations of scientific visualization interfaces and computer graphics strategies conducted within our research group. Complementing these more traditional evaluations, our visualization research group also regularly employs sketching, critique, and other design methods that have been formalized over years of practice in design fields. Our experience has convinced us that these activities are invaluable, often providing much more detailed evaluative feedback about our visualization systems than that obtained via more traditional user studies and the like. We believe that if design-based evaluation methodologies (e.g., ideation, sketching, critique) can be taught and embraced within the visualization community, then these may become one of the most effective future strategies for both formative and summative evaluations.

  18. Toward Mixed Method Evaluations of Scientific Visualizations and Design Process as an Evaluation Tool

    PubMed Central

    Jackson, Bret; Coffey, Dane; Thorson, Lauren; Schroeder, David; Ellingson, Arin M.; Nuckley, David J.

    2017-01-01

    In this position paper, we discuss successes and limitations of current evaluation strategies for scientific visualizations and argue for embracing a mixed methods strategy of evaluation. The most novel contribution of the approach that we advocate is a new emphasis on employing design processes as practiced in related fields (e.g., graphic design, illustration, architecture) as a formalized mode of evaluation for data visualizations. To motivate this position, we describe a series of recent evaluations of scientific visualization interfaces and computer graphics strategies conducted within our research group. Complementing these more traditional evaluations, our visualization research group also regularly employs sketching, critique, and other design methods that have been formalized over years of practice in design fields. Our experience has convinced us that these activities are invaluable, often providing much more detailed evaluative feedback about our visualization systems than that obtained via more traditional user studies and the like. We believe that if design-based evaluation methodologies (e.g., ideation, sketching, critique) can be taught and embraced within the visualization community, then these may become one of the most effective future strategies for both formative and summative evaluations. PMID:28944349

  19. Cognitive processing specificity of anxious apprehension: impact on distress and performance during speech exposure.

    PubMed

    Philippot, Pierre; Vrielynck, Nathalie; Muller, Valérie

    2010-12-01

    The present study examined the impact of different modes of processing anxious apprehension on subsequent anxiety and performance in a stressful speech task. Participants were informed that they would have to give a speech on a difficult topic while being videotaped and evaluated on their performance. They were then randomly assigned to one of three conditions. In a specific processing condition, they were encouraged to explore in detail all the specific aspects (thoughts, emotions, sensations) they experienced while anticipating giving the speech; in a general processing condition, they had to focus on the generic aspects that they would typically experience during anxious anticipation; and in a control, no-processing condition, participants were distracted. Results revealed that at the end of the speech, participants in the specific processing condition reported less anxiety than those in the two other conditions. They were also evaluated by judges to have performed better than those in the control condition, who in turn did better than those in the general processing condition. Copyright © 2010. Published by Elsevier Ltd.

  20. Processing, characterization, and in vitro/in vivo evaluations of powder metallurgy processed Ti-13Nb-13Zr alloys.

    PubMed

    Bottino, Marco C; Coelho, Paulo G; Henriques, Vinicius A R; Higa, Olga Z; Bressiani, Ana H A; Bressiani, José C

    2009-03-01

    This article presents details of the processing, characterization, and in vitro as well as in vivo evaluations of powder metallurgy processed Ti-13Nb-13Zr samples with different levels of porosity. Sintered samples were characterized for density, crystalline phases (XRD), and microstructure (SEM and EDX). Samples sintered at 1000 °C showed the highest porosity level (approximately 30%), featuring open and interconnected pores ranging from 50 to 100 µm in diameter, but incomplete densification. In contrast, samples sintered at 1300 and 1500 °C demonstrated high densification with a 10% porosity level distributed in a homogeneous microstructure. The different sintering conditions used in this study demonstrated a coherent trend: increases in temperature led to higher sample densification, even though densification represents a drawback for bone ingrowth. Cytotoxicity tests did not reveal any toxic effects of the starting and processed materials on the percentage of surviving cells. After an 8-week healing period in rabbit tibias, the implants were retrieved, processed for nondecalcified histological evaluation, and then assessed by backscattered electron imaging (BSEI-SEM) and EDX. Bone growth into the microstructure was observed only in samples sintered at 1000 °C. Overall, a close relation between newly formed bone and all processed samples was observed. (c) 2008 Wiley Periodicals, Inc.
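
Porosity levels like those reported are commonly estimated from the measured sintered density relative to the theoretical (fully dense) density. The densities below are assumed illustrative values, not the paper's measurements:

```python
def porosity(measured_density, theoretical_density):
    """Porosity fraction from the relation P = 1 - rho / rho_theoretical."""
    return 1.0 - measured_density / theoretical_density

# Assumed values: ~4.97 g/cm^3 theoretical density for Ti-13Nb-13Zr and a
# hypothetical measured sintered density of 3.5 g/cm^3.
p = porosity(3.5, 4.97)   # roughly the ~30% porosity level of the 1000 °C samples
```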

  1. Findings from a participatory evaluation of a smart home application for older adults.

    PubMed

    Demiris, George; Oliver, Debra Parker; Dickey, Geraldine; Skubic, Marjorie; Rantz, Marilyn

    2008-01-01

    The aim of this paper is to present a participatory evaluation of an actual "smart home" project implemented in an independent retirement facility. Using the participatory evaluation process, residents guided the research team through development and implementation of the initial phase of a smart home project designed to assist residents to remain functionally independent and age in place. We recruited nine residents who provided permission to install the technology in their apartments. We conducted a total of 75 interviews and three observational sessions. Residents expressed overall positive perceptions of the sensor technologies and did not feel that these interfered with their daily activities. The process of adoption and acceptance of the sensors included three phases: familiarization, adjustment and curiosity, and full integration. Residents did not express privacy concerns. They provided detailed feedback and suggestions that were integrated into the redesign of the system. They also reported a sense of control resulting from their active involvement in the evaluation process. Observational sessions confirmed that the sensors were not noticeable and that residents did not change their routines. The participatory evaluation approach not only empowers end-users but also allows for the implementation of smart home systems that address residents' needs.

  2. Designing cost effective water demand management programs in Australia.

    PubMed

    White, S B; Fane, S A

    2002-01-01

    This paper describes recent experience with integrated resource planning (IRP) and the application of least cost planning (LCP) for the evaluation of demand management strategies in urban water. Two Australian case studies, Sydney and Northern New South Wales (NSW), are used in illustration. LCP can determine the most cost effective means of providing water services or, alternatively, the cheapest forms of water conservation. LCP contrasts with a traditional approach of evaluation which looks only at means of increasing supply. Detailed investigation of water usage, known as end-use analysis, is required for LCP. End-use analysis allows both rigorous demand forecasting and the development and evaluation of conservation strategies. Strategies include education campaigns, increasing water use efficiency, and promoting wastewater reuse or rainwater tanks. The optimal mix of conservation strategies and conventional capacity expansion is identified based on levelised unit cost. IRP uses LCP in an iterative process: evaluating and assessing options, investing in selected options, measuring the results, and then re-evaluating options. Key to this process is the design of cost effective demand management programs. IRP, however, includes a range of parameters beyond least economic cost in the planning process and program designs, including uncertainty, benefit partitioning, and implementation considerations.
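
The levelised unit cost by which LCP ranks options can be sketched as the present value of program costs divided by the present value of water saved (or supplied). The program numbers and discount rate below are hypothetical:

```python
def present_value(stream, rate):
    """Discount an annual stream (years 1..n) to present value."""
    return sum(x / (1 + rate) ** t for t, x in enumerate(stream, start=1))

def levelised_unit_cost(costs, water_saved, rate=0.05):
    """$ per kL: PV of program costs over PV of water saved."""
    return present_value(costs, rate) / present_value(water_saved, rate)

# Hypothetical retrofit program: $100k/yr cost, 80,000 kL/yr savings, 10 years.
luc = levelised_unit_cost([100_000] * 10, [80_000] * 10)   # $/kL
```

Options (conservation programs and supply expansions alike) can then be sorted by this unit cost to build the least-cost mix.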

  3. A study for high accuracy measurement of residual stress by deep hole drilling technique

    NASA Astrophysics Data System (ADS)

    Kitano, Houichi; Okano, Shigetaka; Mochizuki, Masahito

    2012-08-01

    The deep hole drilling (DHD) technique has received much attention in recent years as a method for measuring through-thickness residual stresses. However, some accuracy problems occur when residual stress evaluation is performed by the DHD technique. One reason is that the traditional DHD evaluation formula applies to the plane stress condition; another is that the effects of the plastic deformation produced in the drilling process and of the deformation produced in the trepanning process are ignored. In this study, a modified evaluation formula, applicable to the plane strain condition, is proposed. In addition, a new procedure is proposed which can account for the effects of the deformation produced in the DHD process, after investigating those effects in detail by finite element (FE) analysis. The evaluation results obtained by the new procedure are then compared with those obtained by the traditional DHD procedure using FE analysis. As a result, the new procedure evaluates the residual stress fields better than the traditional DHD procedure when the measured object is thick enough that the stress state can be assumed to be plane strain, as in the model used in this study.
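
The full DHD reconstruction maps measured distortions of the reference hole to stresses through compliance matrices, which the record does not give. As a minimal illustration of why the plane stress versus plane strain choice matters, under plane strain the effective modulus in Hooke's law stiffens by the factor 1/(1 - nu^2); the material values below are assumed:

```python
def stress_from_strain(strain, E, nu, plane_strain=True):
    """Uniaxial Hooke's law with the plane-strain effective modulus.
    Plane strain stiffens the response by 1 / (1 - nu**2)."""
    E_eff = E / (1.0 - nu**2) if plane_strain else E
    return E_eff * strain

E, nu, eps = 200e9, 0.3, 1e-4   # assumed: steel, a measured strain of 1e-4
s_plane_stress = stress_from_strain(eps, E, nu, plane_strain=False)
s_plane_strain = stress_from_strain(eps, E, nu, plane_strain=True)
```

For nu = 0.3 the plane-strain evaluation is about 10% higher, which is the order of the bias a plane stress formula introduces on a thick section.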

  4. Technoeconomic study on steam explosion application in biomass processing.

    PubMed

    Zimbardi, Francesco; Ricci, Esmeralda; Braccio, Giacobbe

    2002-01-01

    This work is based on the data collected during trials of a continuous steam explosion (SE) plant, with a treatment capacity of about 350 kg/h, including the biomass fractionation section. The energy and water consumption, equipment cost, and manpower needed to run this plant have been used as the base case for a techno-economic evaluation of productive plants. Three processing plant configurations have been considered: (I) SE pretreatment only; (II) SE followed by the hemicellulose extraction; (III) SE followed by the sequential hemicellulose and lignin extractions. The biomass treatment cost has been evaluated as a function of the plant scale. For each configuration, variable and fixed cost breakdown has been detailed in the case of a 50,000 t/y plant.

  5. Yield Improvement and Energy Savings Using Phosphonates as Additives in Kraft Pulping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ulrike W. Tschirner; Timothy Smith

    2007-03-31

    Project Objective: Develop a commercially viable modification to the Kraft process resulting in energy savings, increased yield, and improved bleachability. Evaluate the feasibility of this technology across a spectrum of wood species used in North America. Develop a detailed fundamental understanding of the mechanism by which phosphonates improve Kappa number and yield. Evaluate the North American market potential for the use of phosphonates in the Kraft pulping process. Examine determinants of customer perceived value and explore organizational and operational factors influencing attitudes and behaviors. Provide an economic feasibility assessment for the supply chain, both suppliers (chemical supply companies) and buyers (Kraft mills). Provide background to most effectively transfer this new technology to commercial mills.

  6. Data assimilation and model evaluation experiment datasets

    NASA Technical Reports Server (NTRS)

    Lai, Chung-Cheng A.; Qian, Wen; Glenn, Scott M.

    1994-01-01

    The Institute for Naval Oceanography, in cooperation with Naval Research Laboratories and universities, executed the Data Assimilation and Model Evaluation Experiment (DAMEE) for the Gulf Stream region during fiscal years 1991-1993. Enormous effort has gone into the preparation of several high-quality and consistent datasets for model initialization and verification. This paper describes the preparation process, the temporal and spatial scopes, the contents, the structure, etc., of these datasets. The goal of DAMEE and the need for data in the four phases of the experiment are briefly stated. The preparation of DAMEE datasets consisted of a series of processes: (1) collection of observational data; (2) analysis and interpretation; (3) interpolation using the Optimum Thermal Interpolation System package; (4) quality control and re-analysis; and (5) data archiving and software documentation. The data products from these processes included a time series of 3D fields of temperature and salinity, 2D fields of surface dynamic height and mixed-layer depth, analysis of the Gulf Stream and rings system, and bathythermograph profiles. To date, these are the most detailed and high-quality data for mesoscale ocean modeling, data assimilation, and forecasting research. Feedback from ocean modeling groups who tested these data was incorporated into their refinement. Suggested DAMEE data usages include (1) ocean modeling and data assimilation studies, (2) diagnosis and theoretical studies, and (3) comparisons with locally detailed observations.

  7. Completion of a Hospital-Wide Comprehensive Image Management and Communication System

    NASA Astrophysics Data System (ADS)

    Mun, Seong K.; Benson, Harold R.; Horii, Steven C.; Elliott, Larry P.; Lo, Shih-Chung B.; Levine, Betty A.; Braudes, Robert E.; Plumlee, Gabriel S.; Garra, Brian S.; Schellinger, Dieter; Majors, Bruce; Goeringer, Fred; Kerlin, Barbara D.; Cerva, John R.; Ingeholm, Mary-Lou; Gore, Tim

    1989-05-01

    A comprehensive image management and communication (IMAC) network has been installed at Georgetown University Hospital for an extensive clinical evaluation. The network is based on the AT&T CommView system and it includes interfaces to 12 imaging devices, 15 workstations (inside and outside of the radiology department), a teleradiology link to an imaging center, an optical jukebox and a number of advanced image display and processing systems such as Sun workstations, PIXAR, and PIXEL. Details of network configuration and its role in the evaluation project are discussed.

  8. Evaluation of streams in selected communities for the application of limited-detail study methods for flood-insurance studies

    USGS Publications Warehouse

    Cobb, Ernest D.

    1986-01-01

    The U.S. Geological Survey evaluated 2,349 communities in 1984 for the application of limited-detail flood-insurance study methods, that is, methods with a reduced effort and cost compared to detailed studies. Limited-detail study methods were found to be appropriate for 1,705 communities, while detailed studies were appropriate for 62 communities and no studies were appropriate for 582 communities. The total length of streams for which limited-detail studies are recommended is 9,327 miles, with a corresponding cost of $23,007,000. This results in an average estimated cost for conducting limited-detail studies of about $2,500 per mile of studied stream length. The purpose of the report is to document the limited-detail study methods and the results of the evaluation. (USGS)
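
The per-mile figure in the record is the total estimated cost divided by the recommended stream mileage, rounded up to a round number; a one-line check:

```python
total_cost_usd = 23_007_000   # estimated cost from the evaluation
stream_miles = 9_327          # recommended limited-detail study mileage

cost_per_mile = total_cost_usd / stream_miles   # ~ $2,467, reported as ~$2,500
```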

  9. Application of remote sensing technology to land evaluation, planning utilization of land resources, and assessment of wetland habitat in eastern South Dakota, parts 1 and 2

    NASA Technical Reports Server (NTRS)

    Myers, V. I. (Principal Investigator); Cox, T. L.; Best, R. G.

    1976-01-01

    The author has identified the following significant results. LANDSAT fulfilled the requirements for general soils and land use information. RB-57 imagery was required to provide the information and detail needed for mapping soils for land evaluation. Soils maps for land evaluation were provided on clear mylar at the scale of the county highway map to aid users in locating mapping units. The resulting mapped data were computer processed to provide a series of interpretive maps (land value, limitations to development, etc.) and area summaries for the users.

  10. Analysis and modeling of wafer-level process variability in 28 nm FD-SOI using split C-V measurements

    NASA Astrophysics Data System (ADS)

    Pradeep, Krishna; Poiroux, Thierry; Scheer, Patrick; Juge, André; Gouget, Gilles; Ghibaudo, Gérard

    2018-07-01

    This work details the analysis of wafer-level global process variability in 28 nm FD-SOI using split C-V measurements. The proposed approach initially evaluates the native on-wafer process variability using efficient extraction methods on split C-V measurements. The on-wafer threshold voltage (VT) variability is first studied and modeled using a simple analytical model. Then, a statistical model based on the Leti-UTSOI compact model is proposed to describe the total C-V variability in different bias conditions. This statistical model is finally used to study the contribution of each process parameter to the total C-V variability.
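
The record does not give the paper's analytical VT-variability model; a common choice for such wafer-level mismatch analysis is the Pelgrom area-scaling law, sketched here with an assumed matching coefficient (not a value from the paper):

```python
import math

def sigma_vt(a_vt, width_um, length_um):
    """Pelgrom law: sigma(VT) = A_VT / sqrt(W * L), VT mismatch vs gate area."""
    return a_vt / math.sqrt(width_um * length_um)

A_VT = 1.5                            # assumed matching coefficient, mV*um
s_small = sigma_vt(A_VT, 1.0, 0.03)   # mV, small device
s_large = sigma_vt(A_VT, 2.0, 0.06)   # mV, 4x the gate area -> half the sigma
```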

  11. Research on the use of data fusion technology to evaluate the state of electromechanical equipment

    NASA Astrophysics Data System (ADS)

    Lin, Lin

    2018-04-01

    Aiming at the problems of heterogeneous testing-information modes and the coexistence of quantitative and qualitative information in the state evaluation of electromechanical equipment, this paper proposes the use of data fusion technology to evaluate the state of electromechanical equipment. The paper introduces the state evaluation process for electromechanical equipment in detail, uses D-S evidence theory to fuse the decision-making layers of the state evaluation, and carries out simulation tests. The simulation results show that it is feasible and effective to apply data fusion technology to the state evaluation of electromechanical equipment. After the multiple decision-making information provided by different evaluation methods is fused repeatedly and the useful information extracted, the fuzziness of the judgment can be reduced and the credibility of the state evaluation improved.
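
The D-S fusion step can be sketched with Dempster's rule of combination. The two mass functions below are hypothetical sensor opinions about an equipment state (N = normal, F = faulty), not values from the paper:

```python
from itertools import product

def dempster(m1, m2):
    """Dempster's rule of combination for two mass functions over frozensets."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y               # mass assigned to disjoint hypotheses
    if conflict >= 1.0:
        raise ValueError("total conflict; sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

N, F = frozenset("N"), frozenset("F")
m1 = {N: 0.7, F: 0.2, N | F: 0.1}       # sensor 1: leans normal
m2 = {N: 0.6, F: 0.3, N | F: 0.1}       # sensor 2: leans normal
fused = dempster(m1, m2)                # belief in N is reinforced
```

Fusing agreeing sources sharpens the verdict, which is the mechanism behind the reduced fuzziness reported in the abstract.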

  12. Design of the storage location based on the ABC analyses

    NASA Astrophysics Data System (ADS)

    Jemelka, Milan; Chramcov, Bronislav; Kříž, Pavel

    2016-06-01

    The paper focuses on process efficiency and saving storage costs. Maintaining inventory through a putaway strategy takes personnel time and costs money. The aim is to control inventory in the best way. The ABC classification, based on Vilfredo Pareto's theory, is used for the design of the warehouse layout. The new design of storage locations reduces the travel distance of forklifts and total costs, and it increases inventory process efficiency. The suggested solutions and the evaluation of achieved results are described in detail. The proposed solutions were realized in a real warehouse operation.
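
An ABC classification in the Pareto spirit described above can be sketched as cumulative-share cut-offs on ranked picking frequencies; the SKUs and cut-offs below are hypothetical:

```python
def abc_classify(demand, a_cut=0.8, b_cut=0.95):
    """Classify SKUs by cumulative share of total demand (Pareto / ABC)."""
    total = sum(demand.values())
    ranked = sorted(demand, key=demand.get, reverse=True)
    classes, cum = {}, 0.0
    for sku in ranked:
        cum += demand[sku] / total
        classes[sku] = "A" if cum <= a_cut else ("B" if cum <= b_cut else "C")
    return classes

picks = {"p1": 600, "p2": 250, "p3": 80, "p4": 40, "p5": 30}
classes = abc_classify(picks)
```

Class A items then get the storage locations closest to dispatch, shortening forklift travel for the bulk of the picks.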

  13. Study of seed reprocessing systems for open cycle coal fired MHD power plants. Task 1: Selection of processes for more detailed study

    NASA Astrophysics Data System (ADS)

    1980-07-01

    In most of the processes, a portion of the potassium seed material is converted to a compound not containing sulfur. The potassium in this form can, when injected upstream of the MHD channel, capture the sulfur released during the combustion of coal and eliminate the need for flue gas desulfurization equipment. Criteria considered in the evaluation included cost, state of development, seed loss, power requirements, availability, durability, key component risk, environmental impact, safety, controllability, and impurities buildup.

  14. Information Presentation in Decision and Risk Analysis: Answered, Partly Answered, and Unanswered Questions.

    PubMed

    Keller, L Robin; Wang, Yitong

    2017-06-01

    For the last 30 years, researchers in risk analysis, decision analysis, and economics have consistently proven that decisionmakers employ different processes for evaluating and combining anticipated and actual losses, gains, delays, and surprises. Although rational models generally prescribe a consistent response, people's heuristic processes will sometimes lead them to be inconsistent in the way they respond to information presented in theoretically equivalent ways. We point out several promising future research directions by listing and detailing a series of answered, partly answered, and unanswered questions. © 2016 Society for Risk Analysis.

  15. Intelligent Work Process Engineering System

    NASA Technical Reports Server (NTRS)

    Williams, Kent E.

    2003-01-01

    Optimizing performance on work activities and processes requires metrics of performance that management can monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability, and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Currently, information systems regarding Space Shuttle maintenance and servicing do not exist to support such timely decisions. The work presented details a system which incorporates various automated and intelligent processes and analysis tools to capture, organize, and analyze work-process-related data and to make the decisions necessary to meet KSC organizational goals. The advantages and disadvantages of design alternatives for the development of such a system will be discussed, including technologies which would need to be designed, prototyped, and evaluated.

  16. Spatio-Temporal Super-Resolution Reconstruction of Remote-Sensing Images Based on Adaptive Multi-Scale Detail Enhancement

    PubMed Central

    Zhu, Hong; Tang, Xinming; Xie, Junfeng; Song, Weidong; Mo, Fan; Gao, Xiaoming

    2018-01-01

    There are many problems in existing reconstruction-based super-resolution algorithms, such as the lack of texture-feature representation and of high-frequency details. Multi-scale detail enhancement can produce more texture information and high-frequency information. Therefore, super-resolution reconstruction of remote-sensing images based on adaptive multi-scale detail enhancement (AMDE-SR) is proposed in this paper. First, the information entropy of each remote-sensing image is calculated, and the image with the maximum entropy value is regarded as the reference image. Subsequently, spatio-temporal remote-sensing images are processed using phase normalization, which reduces the time-phase difference of the image data and enhances the complementarity of information. The multi-scale image information is then decomposed using the L₀ gradient minimization model; the non-redundant information is processed by difference calculation and by expanding the non-redundant layers, and the redundant layer is processed by the iterative back-projection (IBP) technique. The different-scale non-redundant information is adaptively weighted and fused using cross-entropy. Finally, a nonlinear texture-detail-enhancement function is built to improve the scope of small details, and the peak signal-to-noise ratio (PSNR) is used as an iterative constraint. Ultimately, high-resolution remote-sensing images with abundant texture information are obtained by iterative optimization. Real results show an average gain in entropy of up to 0.42 dB for an up-scaling of 2 and a significant gain in the enhancement-measure evaluation for an up-scaling of 2. The experimental results show that the performance of the AMDE-SR method is better than that of existing super-resolution reconstruction methods in terms of visual quality and accuracy. PMID:29414893

  17. Spatio-Temporal Super-Resolution Reconstruction of Remote-Sensing Images Based on Adaptive Multi-Scale Detail Enhancement.

    PubMed

    Zhu, Hong; Tang, Xinming; Xie, Junfeng; Song, Weidong; Mo, Fan; Gao, Xiaoming

    2018-02-07

    There are many problems in existing reconstruction-based super-resolution algorithms, such as the lack of texture-feature representation and of high-frequency details. Multi-scale detail enhancement can produce more texture information and high-frequency information. Therefore, super-resolution reconstruction of remote-sensing images based on adaptive multi-scale detail enhancement (AMDE-SR) is proposed in this paper. First, the information entropy of each remote-sensing image is calculated, and the image with the maximum entropy value is regarded as the reference image. Subsequently, spatio-temporal remote-sensing images are processed using phase normalization, which reduces the time-phase difference of the image data and enhances the complementarity of information. The multi-scale image information is then decomposed using the L₀ gradient minimization model; the non-redundant information is processed by difference calculation and by expanding the non-redundant layers, and the redundant layer is processed by the iterative back-projection (IBP) technique. The different-scale non-redundant information is adaptively weighted and fused using cross-entropy. Finally, a nonlinear texture-detail-enhancement function is built to improve the scope of small details, and the peak signal-to-noise ratio (PSNR) is used as an iterative constraint. Ultimately, high-resolution remote-sensing images with abundant texture information are obtained by iterative optimization. Real results show an average gain in entropy of up to 0.42 dB for an up-scaling of 2 and a significant gain in the enhancement-measure evaluation for an up-scaling of 2. The experimental results show that the performance of the AMDE-SR method is better than that of existing super-resolution reconstruction methods in terms of visual quality and accuracy.
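    The first step of the pipeline, choosing the maximum-entropy image as the reference, can be sketched as follows; the histogram bin count and 8-bit gray-level range are assumptions for illustration, not the paper's exact settings.

    ```python
    import numpy as np

    def image_entropy(img, bins=256):
        """Shannon entropy (bits) of an image's gray-level histogram,
        assuming 8-bit intensities in [0, 256)."""
        hist, _ = np.histogram(img, bins=bins, range=(0, bins))
        p = hist / hist.sum()
        p = p[p > 0]                      # 0 * log(0) is taken as 0
        return float(-(p * np.log2(p)).sum())

    def pick_reference(images):
        """Index of the image with maximum entropy (the reference image)."""
        return int(np.argmax([image_entropy(im) for im in images]))
    ```

    A flat image has zero entropy, while a detail-rich scene spreads its histogram across many gray levels and scores higher, so it is selected as the reference.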

  18. Pahoa geothermal industrial park. Engineering and economic analysis for direct applications of geothermal energy in an industrial park at Pahoa, Hawaii

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreau, J.W.

    1980-12-01

    This engineering and economic study evaluated the potential for developing a geothermal industrial park in the Puna District near Pahoa on the Island of Hawaii. Direct heat industrial applications were analyzed from a marketing, engineering, economic, environmental, and sociological standpoint to determine the most viable industries for the park. An extensive literature search produced 31 existing processes currently using geothermal heat. An additional list was compiled indicating industrial processes that require heat that could be provided by geothermal energy. From this information, 17 possible processes were selected for consideration. Careful scrutiny and analysis of these 17 processes revealed three that justified detailed economic workups. The three processes chosen for detailed analysis were: an ethanol plant using bagasse and wood as feedstock; a cattle feed mill using sugar cane leaf trash as feedstock; and a papaya processing facility providing both fresh and processed fruit. In addition, a research facility to assess and develop other processes was treated as a concept. Consideration was given to the impediments to development, the engineering process requirements and the governmental support for each process. The study describes the geothermal well site chosen, the pipeline to transmit the hydrothermal fluid, and the infrastructure required for the industrial park. A conceptual development plan for the ethanol plant, the feedmill and the papaya processing facility was prepared. The study concluded that a direct heat industrial park in Pahoa, Hawaii, involves considerable risks.

  19. Application of structured analysis to a telerobotic system

    NASA Technical Reports Server (NTRS)

    Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven

    1990-01-01

    The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.

  20. High Accuracy Evaluation of the Finite Fourier Transform Using Sampled Data

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1997-01-01

    Many system identification and signal processing procedures can be done advantageously in the frequency domain. A required preliminary step for this approach is the transformation of sampled time-domain data into the frequency domain. The analytical tool used for this transformation is the finite Fourier transform. Inaccuracy in the transformation can degrade system identification and signal processing results. This work presents a method for evaluating the finite Fourier transform using cubic interpolation of sampled time-domain data for high accuracy, and the chirp z-transform for arbitrary frequency resolution. The accuracy of the technique is demonstrated in example cases where the transformation can be evaluated analytically. Arbitrary frequency resolution is shown to be important for capturing details of the data in the frequency domain. The technique is demonstrated using flight test data from a longitudinal maneuver of the F-18 High Alpha Research Vehicle.
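    The chirp z-transform that provides the arbitrary frequency resolution can be implemented with Bluestein's algorithm, as in the sketch below; `czt` is a hypothetical helper written for illustration (the paper's cubic-interpolation step is omitted). With the default parameters the transform reduces to the ordinary DFT, which makes it easy to check.

    ```python
    import numpy as np

    def czt(x, M=None, W=None, A=1.0):
        """Chirp z-transform X_k = sum_n x[n] * A**(-n) * W**(n*k), k = 0..M-1,
        computed by Bluestein's algorithm (fast convolution). Choosing W to
        span only a narrow frequency band zooms the analysis into that band,
        giving arbitrary frequency resolution."""
        x = np.asarray(x, dtype=complex)
        n = len(x)
        M = n if M is None else M
        W = np.exp(-2j * np.pi / M) if W is None else W
        k = np.arange(max(M, n))
        chirp = W ** (k ** 2 / 2.0)
        L = int(2 ** np.ceil(np.log2(M + n - 1)))   # FFT length for the convolution
        xp = np.zeros(L, dtype=complex)
        xp[:n] = x * A ** -np.arange(n) * chirp[:n]
        ic = np.zeros(L, dtype=complex)             # inverse chirp, negative lags wrapped
        ic[:M] = 1.0 / chirp[:M]
        ic[L - n + 1:] = 1.0 / chirp[n - 1:0:-1]
        y = np.fft.ifft(np.fft.fft(xp) * np.fft.fft(ic))
        return y[:M] * chirp[:M]
    ```

    With `M = len(x)`, `A = 1`, and the default `W`, the output matches `np.fft.fft(x)`; a zoomed band is obtained by setting `W` and `A` to cover only the frequencies of interest.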

  1. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes must be weighed to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes; uses a quantitative method for the cost index and integrated qualitative-quantitative methodologies for the progress, quality, and safety indexes; and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.
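    One common way to combine an index matrix with information entropy, broadly in the spirit of the steps described above, is the entropy-weight scheme sketched below; the normalization and weighting choices are illustrative assumptions, not the paper's exact formulation.

    ```python
    import numpy as np

    def entropy_weights(X):
        """X: m schemes x n indexes, all entries positive benefit-type scores.
        Indexes whose values vary more across schemes carry more information
        (lower entropy) and therefore receive larger weights."""
        m, _ = X.shape
        P = X / X.sum(axis=0)                           # column-wise proportions
        E = -(P * np.log(P)).sum(axis=0) / np.log(m)    # entropy per index, in [0, 1]
        d = 1.0 - E                                     # degree of diversification
        return d / d.sum()

    def synthesis_scores(X):
        """Weighted sum of the normalized index values: one score per scheme,
        used to sort the candidate construction schemes."""
        return (X / X.sum(axis=0)) @ entropy_weights(X)
    ```

    An index that takes the same value for every scheme gets weight zero, since it cannot discriminate between the candidates.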

  2. Intervention mapping: a process for developing theory- and evidence-based health education programs.

    PubMed

    Bartholomew, L K; Parcel, G S; Kok, G

    1998-10-01

    The practice of health education involves three major program-planning activities: needs assessment, program development, and evaluation. Over the past 20 years, significant enhancements have been made to the conceptual base and practice of health education. Models that outline explicit procedures and detailed conceptualization of community assessment and evaluation have been developed. Other advancements include the application of theory to health education and promotion program development and implementation. However, there remains a need for more explicit specification of the processes by which one uses theory and empirical findings to develop interventions. This article presents the origins, purpose, and description of Intervention Mapping, a framework for health education intervention development. Intervention Mapping is composed of five steps: (1) creating a matrix of proximal program objectives, (2) selecting theory-based intervention methods and practical strategies, (3) designing and organizing a program, (4) specifying adoption and implementation plans, and (5) generating program evaluation plans.

  3. Innovative fabrication processing of advanced composite materials concepts for primary aircraft structures

    NASA Technical Reports Server (NTRS)

    Kassapoglou, Christos; Dinicola, Al J.; Chou, Jack C.

    1992-01-01

    The autoclave-based THERM-X® process was evaluated by cocuring complex curved panels with frames and stiffeners. The process was shown to result in composite parts of high quality, with good compaction at sharp-radius regions and corners of intersecting parts. The structural properties of the postbuckled panels fabricated were found to be equivalent to those of conventionally tooled, hand laid-up parts. Significant savings in bagging time over conventional tooling were documented. Structural details such as cocured shear ties and stiffener flanges embedded in the skin were found to suppress failure modes such as failure at corners of intersecting members and skin-stiffener separation.

  4. Seeking consent for research with indigenous communities: a systematic review.

    PubMed

    Fitzpatrick, Emily F M; Martiniuk, Alexandra L C; D'Antoine, Heather; Oscar, June; Carter, Maureen; Elliott, Elizabeth J

    2016-10-22

    When conducting research with Indigenous populations, consent should be sought both from individual participants and from the local community. We aimed to search and summarise the literature on methods for seeking consent for research with Indigenous populations. A systematic literature search was conducted for articles that describe or evaluate the process of seeking informed consent for research with Indigenous participants. Guidelines for ethical research and for seeking consent with Indigenous people are also included in our review. Of 1447 articles found, 1391 were excluded (duplicates, irrelevant, or not in English) and 56 were relevant and included. Articles were categorised into original research that evaluated the consent process (n = 5), publications detailing the process of seeking consent (n = 13), and guidelines for ethical research (n = 38). Guidelines were categorised into international (n = 8), national (n = 20), and state/regional/local (n = 10) guidelines. In five studies based in Australia, Canada, and the United States of America, the consent process with Indigenous people was objectively evaluated. In 13 other studies, interpreters, voice recording, videos, pictures, flipcharts, and "plain language" forms were used to assist in seeking consent, but these processes were not evaluated. Some Indigenous organisations provide examples of community-designed resources for seeking consent and describe methods of community engagement, but none are evaluated. International, national, and local ethical guidelines stress the importance of upholding Indigenous values but fail to specify methods for engaging communities or obtaining individual consent. In the grey literature, concerns about the consent process are identified but no solutions are offered. Consultation with Indigenous communities is needed to determine how consent should be sought from the community and the individual, and how to evaluate this process.

  5. Evaluation of Graphite Fiber/Polyimide PMCs from Hot Melt vs Solution Prepreg

    NASA Technical Reports Server (NTRS)

    Shin, E. Eugene; Sutter, James K.; Eakin, Howard; Inghram, Linda; McCorkle, Linda; Scheiman, Dan; Papadopoulos, Demetrios; Thesken, John; Fink, Jeffrey E.

    2002-01-01

    Carbon fiber reinforced high temperature polymer matrix composites (PMC) have been extensively investigated as potential weight reduction replacements of various metallic components in next generation high performance propulsion rocket engines. The initial phase involves development of comprehensive composite material-process-structure-design-property-in-service performance correlations and database, especially for a high stiffness facesheet of various sandwich structures. Overview of the program plan, technical approaches and current multi-team efforts will be presented. During composite fabrication, it was found that the two large volume commercial prepregging methods (hot-melt vs. solution) resulted in considerably different composite cure behavior. Details of the process-induced physical and chemical modifications in the prepregs, their effects on composite processing, and systematic cure cycle optimization studies will be discussed. The combined effects of prepregging method and cure cycle modification on composite properties and isothermal aging performance were also evaluated.

  6. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)

    2000-01-01

    A new MDO method, BLISS, and two variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and are evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a system optimization problem into several subtask optimizations, which may be executed concurrently, and a system-level optimization that coordinates the subtask optimizations. The BLISS method and its variants are well suited to exploiting the concurrent-processing capabilities of a multiprocessor machine. Several steps, including local sensitivity analysis, local optimization, and response-surface construction and updates, are ideally suited for concurrent processing. Needless to say, algorithms that can effectively exploit the concurrent-processing capabilities of compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.

  7. Wind turbine siting: A summary of the state of the art

    NASA Technical Reports Server (NTRS)

    Hiester, T. R.

    1982-01-01

    The process of siting large wind turbines may be divided into two broad steps: site selection and site evaluation. Site selection is the process of locating windy sites where wind energy development shows promise of economic viability. Site evaluation is the process of determining in detail the economic potential of a given site. The state of the art in the first aspect of siting, site selection, is emphasized. Several techniques for assessing the wind resource were explored or developed in the Federal Wind Energy Program. Local topography and meteorology will determine which of the techniques should be used in locating potential sites. None of the techniques can do the job alone, none are foolproof, and all require considerable knowledge and experience to apply correctly. Therefore, efficient siting requires a strategy founded on the broad-based application of several techniques, without relying solely on one narrow field of expertise.

  8. Evaluation of Graphite Fiber/Polyimide PMCs from Hot Melt versus Solution Prepreg

    NASA Technical Reports Server (NTRS)

    Shin, Eugene E.; Sutter, James K.; Eakin, Howard; Inghram, Linda; McCorkle, Linda; Scheiman, Dan; Papadopoulos, Demetrios; Thesken, John; Fink, Jeffrey E.; Gray, Hugh R. (Technical Monitor)

    2002-01-01

    Carbon fiber reinforced high temperature polymer matrix composites (PMC) have been extensively investigated as potential weight reduction replacements of various metallic components in next generation high performance propulsion rocket engines. The initial phase involves development of comprehensive composite material-process-structure-design-property in-service performance correlations and database, especially for a high stiffness facesheet of various sandwich structures. Overview of the program plan, technical approaches and current multi-team efforts will be presented. During composite fabrication, it was found that the two large volume commercial prepregging methods (hot-melt vs. solution) resulted in considerably different composite cure behavior. Details of the process-induced physical and chemical modifications in the prepregs, their effects on composite processing, and systematic cure cycle optimization studies will be discussed. The combined effects of prepregging method and cure cycle modification on composite properties and isothermal aging performance were also evaluated.

  9. Behavior of the gypsy moth life system model and development of synoptic model formulations

    Treesearch

    J. J. Colbert; Xu Rumei

    1991-01-01

    Aims of the research: The gypsy moth life system model (GMLSM) is a complex model which incorporates numerous components (both biotic and abiotic) and ecological processes. It is a detailed simulation model which has much biological reality. However, it has not yet been tested with life system data. For such complex models, evaluation and testing cannot be adequately...

  10. The Devil Is in the Detail Regarding the Efficacy of Reading Recovery: A Rejoinder to Schwartz, Hobsbaum, Briggs, and Scull

    ERIC Educational Resources Information Center

    Reynolds, Meree; Wheldall, Kevin; Madelaine, Alison

    2009-01-01

    This rejoinder provides comment on issues raised by Schwartz, Hobsbaum, Briggs and Scull (2009) in their article about evidence-based practice and Reading Recovery (RR), written in response to Reynolds and Wheldall (2007). Particular attention is paid to the processes and findings of the What Works Clearinghouse evaluation of RR. The suggestion…

  11. CO₂ laser cutting of MDF. 2. Estimation of power distribution

    NASA Astrophysics Data System (ADS)

    Ng, S. L.; Lum, K. C. P.; Black, I.

    2000-02-01

    Part 2 of this paper details an experimentally-based method to evaluate the power distribution for both CW and PM cutting. Variations in power distribution with different cutting speeds, material thickness and pulse ratios are presented. The paper also provides information on both the cutting efficiency and absorptivity index for MDF, and comments on the beam dispersion characteristics after the cutting process.

  12. Revisions in Natural Gas Monthly Consumption and Price Data, 2004 - 2007

    EIA Publications

    2009-01-01

    This report summarizes the method in which natural gas consumption data are collected and processed for publication and details the most notable revisions in natural gas consumption data for the period 2004 to 2007. It is intended to assist data users in evaluating the quality of the monthly consumption and price data for residential, commercial, and industrial consumers of natural gas.

  13. Fuel quality-processing study. Volume 1: Overview and results

    NASA Technical Reports Server (NTRS)

    Jones, G. E., Jr.

    1982-01-01

    The methods whereby the intermediate results were obtained are outlined, and the evaluation of the feasible paths from liquid fossil fuel sources to generated electricity is presented. The segments from which these paths were built are the results from the fuel upgrading schemes, on-site treatments, and exhaust gas treatments detailed in the subsequent volumes. The salient cost and quality parameters are included.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, S. P.N.; Peterson, G. R.

    Coal beneficiation is a generic term for processes that prepare run-of-mine coal for specific end uses. It is also referred to as coal preparation or coal cleaning and is a means of reducing the sulfur and ash contents of coal. Information is presented regarding current and potential coal beneficiation processes. Several of the processes reviewed, though not yet commercial, are at various stages of experimental development. Process descriptions are provided commensurate with the extent of information and time available to perform the evaluation of these processes. Conceptual process designs, preliminary cost estimates, and economic evaluations are provided for the more advanced (from a process-development hierarchy viewpoint) processes, based on production levels of 1500 and 15,000 tons/day (maf) of cleaned product coal. Economic evaluations of the coal preparation plants are conducted for several project financing schemes and at 12 and 15% annual after-tax rates of return on equity capital. A 9% annual interest rate is used on the debt fraction of the plant capital. Cleaned-product coal prices are determined using the discounted cash flow procedure. The study is intended to provide information on publicly known coal beneficiation processes and to indicate the relative costs of various coal beneficiation processes. Because of severe time constraints, several potential coal beneficiation processes are not evaluated in great detail. It is recommended that an additional study be conducted to complement this study and to more fully assess the potentially significant role of coal beneficiation in the clean burning of coal.
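    The discounted-cash-flow pricing step can be sketched as follows: find the product-coal price at which the project's cash flows, discounted at the target rate of return, just recover the capital. Taxes and the debt/equity split used in the study are omitted, and all numbers are illustrative.

    ```python
    def dcf_price(capex, annual_opex, annual_tons, rate=0.12, years=20):
        """Product price ($/ton) at which the NPV of a level stream of cash
        flows is zero at the target after-tax rate of return. Taxes, debt
        financing, and cost escalation are deliberately omitted."""
        pv_annuity = (1.0 - (1.0 + rate) ** -years) / rate   # PV of $1/yr for `years`
        # price*tons*annuity - opex*annuity - capex = 0  =>  solve for price
        return (capex / pv_annuity + annual_opex) / annual_tons
    ```

    Running the same calculation at 12% and 15% rates of return reproduces the kind of sensitivity the study reports: a higher required return raises the break-even product price.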

  15. Combining Mechanistic Approaches for Studying Eco-Hydro-Geomorphic Coupling

    NASA Astrophysics Data System (ADS)

    Francipane, A.; Ivanov, V.; Akutina, Y.; Noto, V.; Istanbullouglu, E.

    2008-12-01

    Vegetation interacts with the hydrology and the geomorphic form and processes of a river basin in profound ways. Despite recent advances in hydrological modeling, the dynamic coupling between these processes has yet to be adequately captured at the basin scale to elucidate key features of process interaction and their role in the organization of vegetation and landscape morphology. In this study, we present a blueprint for integrating a geomorphic component into the physically based, spatially distributed ecohydrological model tRIBS-VEGGIE, which reproduces essential water and energy processes over the complex topography of a river basin and links them to basic plant-life regulatory processes. We present a preliminary design of the integrated modeling framework in which hillslope and channel erosion processes at the catchment scale will be coupled with vegetation-hydrology dynamics. We evaluate the developed framework by applying the integrated model to Lucky Hills basin, a sub-catchment of the Walnut Gulch Experimental Watershed (Arizona). The evaluation is carried out by comparing sediment yields at the basin outlet, following a detailed verification of the simulated land-surface energy partition, biomass dynamics, and soil moisture states.

  16. Advanced Spacesuit Portable Life Support System Packaging Concept Mock-Up Design & Development

    NASA Technical Reports Server (NTRS)

    O'Connell, Mary K.; Slade, Howard G.; Stinson, Richard G.

    1998-01-01

    A concentrated development effort was begun at NASA Johnson Space Center to create an advanced Portable Life Support System (PLSS) packaging concept. Ease of maintenance, technological flexibility, low weight, and minimal volume are targeted in the design of future micro-gravity and planetary PLSS configurations. Three main design concepts emerged from conceptual design techniques and were carried forward into detailed design and then full-scale mock-up creation. The "Foam", "Motherboard", and "LEGO™" packaging design concepts are described in detail. The evaluation process targeted maintenance, robustness, mass properties, and flexibility as key aspects of a new PLSS packaging configuration. The various design tools used to evolve concepts into high-fidelity mock-ups revealed that no single tool was all-encompassing, that several combinations were complementary, that the devil is in the details, and that, despite best efforts, many lessons were learned only after working with hardware.

  17. High-throughput sequencing: a failure mode analysis.

    PubMed

    Yang, George S; Stott, Jeffery M; Smailus, Duane; Barber, Sarah A; Balasundaram, Miruna; Marra, Marco A; Holt, Robert A

    2005-01-04

    Basic manufacturing principles are becoming increasingly important in high-throughput sequencing facilities where there is a constant drive to increase quality, increase efficiency, and decrease operating costs. While high-throughput centres report failure rates typically on the order of 10%, the causes of sporadic sequencing failures are seldom analyzed in detail and have not, in the past, been formally reported. Here we report the results of a failure mode analysis of our production sequencing facility based on detailed evaluation of 9,216 ESTs generated from two cDNA libraries. Two categories of failures are described; process-related failures (failures due to equipment or sample handling) and template-related failures (failures that are revealed by close inspection of electropherograms and are likely due to properties of the template DNA sequence itself). Preventative action based on a detailed understanding of failure modes is likely to improve the performance of other production sequencing pipelines.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nexant, Inc., San Francisco, California

    The first section (Task 1) of this report by Nexant includes a survey and screening of various acid gas removal processes in order to evaluate their capability to meet the specific design requirements for thermochemical ethanol synthesis in NREL's thermochemical ethanol design report (Phillips et al. 2007, NREL/TP-510-41168). MDEA and Selexol were short-listed as the most promising acid-gas removal agents based on the work described in Task 1. The second section (Task 2) describes a detailed design of an MDEA (methyl diethanolamine) based acid gas removal system for removing CO2 and H2S from biomass-derived syngas. Only MDEA was chosen for detailed study because of the available resources.

  19. Using evaluation theory in priority setting and resource allocation.

    PubMed

    Smith, Neale; Mitton, Craig; Cornelissen, Evelyn; Gibson, Jennifer; Peacock, Stuart

    2012-01-01

    Public sector interest in methods for priority setting and program or policy evaluation has grown considerably over the last several decades, given increased expectations for the accountable and efficient use of resources and an emphasis on evidence-based decision making as a component of good management practice. While there has been occasional effort to evaluate priority setting projects, the literatures on priority setting and evaluation have largely evolved separately. In this paper, the aim is to bring them together. The contention is that evaluation theory is a means by which evaluators reflect upon what they are doing when they do evaluation work. Theories help to organize thinking, sort relevant from irrelevant information, provide transparent grounds for particular implementation choices, and help resolve problematic issues which may arise in the conduct of an evaluation project. A detailed review of three major branches of evaluation theory (methods, utilization, and valuing) identifies how such theories can guide the development of efforts to evaluate priority setting and resource allocation initiatives. Evaluation theories differ in their guiding question, anticipated setting or context, evaluation foci, the perspective from which benefits are calculated, and the methods typically endorsed. Choosing a particular theoretical approach will structure the way in which any priority setting process is evaluated. The paper suggests that explicitly considering evaluation theory makes key aspects of the evaluation process more visible to all stakeholders and can assist in the design of effective evaluation of priority setting processes; this should iteratively serve to improve the understanding of priority setting practices themselves.

  20. Understanding product cost vs. performance through an in-depth system Monte Carlo analysis

    NASA Astrophysics Data System (ADS)

    Sanson, Mark C.

    2017-08-01

    The manner in which an optical system is toleranced and compensated greatly affects the cost to build it. With a detailed understanding of different tolerance and compensation methods, the end user can decide on the balance of cost and performance. A detailed, phased Monte Carlo analysis can be used to demonstrate the tradeoffs between cost and performance. In complex high-performance optical systems, performance is fine-tuned by making adjustments after the systems are initially built. This process enables the best overall system performance without the need to fabricate components to tolerance levels that can be beyond a fabricator's manufacturing capabilities. A good simulation of as-built performance can interrogate different steps of the fabrication and build process. Such a simulation may aid in evaluating whether the measured parameters are within the acceptable range of system performance at that stage of the build process. Finding errors before an optical system progresses further into the build process saves both time and money. Having the appropriate tolerances and compensation strategy tied to a specific performance level will optimize the overall product cost.
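    A toy version of such a phased Monte Carlo analysis might look like the following; the error model, the compensator's effect, and the tolerance values are invented for illustration and are not from the paper.

    ```python
    import numpy as np

    def build_yield(n_trials=10_000, tol_um=5.0, spec_um=4.0,
                    compensated=True, seed=0):
        """Toy as-built performance model: three element decenters (normal,
        3-sigma = tol_um) combine in quadrature into a single image-quality
        error; a post-build compensator is assumed to halve the residual.
        Returns the fraction of simulated builds meeting the spec."""
        rng = np.random.default_rng(seed)
        decenters = rng.normal(0.0, tol_um / 3.0, size=(n_trials, 3))
        error = np.sqrt((decenters ** 2).sum(axis=1))
        if compensated:
            error *= 0.5          # assumed: compensator removes half the error
        return float((error <= spec_um).mean())
    ```

    Sweeping `tol_um` with and without compensation maps the cost/performance trade: looser (cheaper) tolerances can be recovered by a compensator, up to a point.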

  1. Silicon solar cell process development, fabrication and analysis

    NASA Technical Reports Server (NTRS)

    Yoo, H. I.; Iles, P. A.; Leung, D. C.

    1981-01-01

    Solar cells were fabricated from EFG ribbons, dendritic webs, cast ingots grown by the heat exchanger method (HEM), and cast ingots grown by the ubiquitous crystallization process (UCP). Baseline and other process variations were applied to fabricate solar cells. EFG ribbons grown in a carbon-containing gas atmosphere showed significant improvement in silicon quality. Baseline solar cells from dendritic webs of various runs indicated that the quality of the webs under investigation was not as good as that of conventional CZ silicon, showing an average minority carrier diffusion length of about 60 um versus 120 um for CZ wafers. Detailed evaluation of large cast ingots grown by HEM showed ingot reproducibility problems from run to run and uniformity problems in sheet quality within an ingot. Initial evaluation of wafers prepared from the cast polycrystalline ingots by UCP suggested that the quality of wafers from this process is considerably lower than that of conventional CZ wafers. Overall performance was relatively uniform, except for a few cells that showed shunting problems caused by inclusions.

  2. Cost model relationships between textile manufacturing processes and design details for transport fuselage elements

    NASA Technical Reports Server (NTRS)

    Metschan, Stephen L.; Wilden, Kurtis S.; Sharpless, Garrett C.; Andelman, Rich M.

    1993-01-01

    Textile manufacturing processes offer potential cost and weight advantages over traditional composite materials and processes for transport fuselage elements. In the current study, design cost modeling relationships between textile processes and element design details were developed. Such relationships are expected to help future aircraft designers make timely decisions on the effect of design details and overall configurations on textile fabrication costs. The fundamental advantage of a design cost model is to ensure that the element design is cost effective for the intended process. Trade studies on the effects of processing parameters also help to optimize the manufacturing steps for a particular structural element. Two methods of analyzing design detail/process cost relationships developed for the design cost model were pursued in the current study. The first makes use of existing databases and alternative cost modeling methods (e.g., detailed estimating). The second compares design cost model predictions with data collected during the fabrication of seven-foot circumferential frames for ATCAS crown test panels. The process used in this case involves 2D dry braiding and resin transfer molding of curved 'J' cross-section frame members having design details characteristic of the baseline ATCAS crown design.

  3. Partnerships for Policy Development: A Case Study From Uganda’s Costed Implementation Plan for Family Planning

    PubMed Central

    Lipsky, Alyson B; Gribble, James N; Cahaelen, Linda; Sharma, Suneeta

    2016-01-01

    In global health, partnerships between practitioners and policy makers enable stakeholders to jointly address issues that require multiple perspectives when developing, implementing, and evaluating plans, strategies, and programs. For family planning, costed implementation plans (CIPs) are developed through a strategic government-led consultative process that results in a detailed plan for program activities and an estimate of the funding required to achieve an established set of goals. Since 2009, many countries have developed CIPs. Conventionally, the CIP approach has not been defined with partnerships as a focal point; nevertheless, cooperation between key stakeholders is vital to CIP development and execution. Uganda launched a CIP in November 2014, providing an opportunity to examine the process through a partnership lens. This article describes Uganda’s CIP development process in detail, grounded in a framework for assessing partnerships, and provides findings from 22 key informant interviews. Findings reveal strengths in Uganda’s CIP development process, such as willingness to adapt and strong senior management support. However, the evaluation also highlighted challenges, including district health officers (DHOs), a key group of implementers, feeling excluded from the development process, and a lack of planning around long-term partnership practices that could help address anticipated execution challenges. The authors recommend that future CIP development efforts use a long-term partnership strategy that fosters accountability by encompassing both the short-term goal of developing the CIP and the longer-term goal of achieving the CIP objectives. Although this study focused on Uganda’s CIP for family planning, its lessons have implications for any policy or strategy development effort that requires multiple stakeholders to ensure successful execution. PMID:27353621

  4. The use of concept mapping in measurement development and evaluation: Application and future directions.

    PubMed

    Rosas, Scott R; Ridings, John W

    2017-02-01

    The past decade has seen an increase in measurement development research in the social and health sciences featuring the use of concept mapping as a core technique. The purpose, application, and utility of concept mapping have varied across this emerging literature. Despite the variety of uses and range of outputs, little has been done to critically review how researchers have approached the application of concept mapping in the measurement development and evaluation process. This article reviews the current state of practice regarding the use of concept mapping as a methodological tool in this process. We systematically reviewed 23 scale or measure development and evaluation studies, and detail the application of concept mapping in the context of traditional measurement development and psychometric testing processes. Although several limitations surfaced, we also found notable strengths in the contemporary application of the method. We determined that concept mapping (a) provides a solid method for establishing content validity, (b) facilitates researcher decision-making, (c) integrates target population perspectives a priori, and (d) establishes a foundation for analytical and interpretative choices. Based on these results, we outline how concept mapping can be situated in the measurement development and evaluation processes for new instrumentation.

  5. Using the Donabedian framework to examine the quality and safety of nursing service innovation.

    PubMed

    Gardner, Glenn; Gardner, Anne; O'Connell, Jane

    2014-01-01

    To evaluate the safety and quality of nurse practitioner service using the audit framework of Structure, Process and Outcome. Health service and workforce reform are on the agenda of governments and other service providers seeking to contain healthcare costs whilst providing safe and effective health care to communities. The nurse practitioner service is one health workforce innovation that has been adopted globally to improve timely access to clinical care, but there is scant literature reporting evaluation of the quality of this service innovation. A mixed-methods design within the Donabedian evaluation framework was used to evaluate the Structure, Process and Outcome of nurse practitioner service. A range of data collection approaches was used, including a stakeholder survey (n = 36), in-depth interviews (11 patients and 13 nurse practitioners) and health records data on service processes. The study identified that adequate and detailed preparation of Structure and Process is essential for the successful implementation of a service innovation. The multidisciplinary team was accepting of the addition of nurse practitioner service, and nurse practitioner clinical care was shown to be effective, satisfactory and safe from the perspective of the clinician stakeholders and patients. This study demonstrated that the Donabedian framework of Structure, Process and Outcome evaluation is a valuable and validated approach to examining the safety and quality of a service innovation. Furthermore, specific Structure elements were shown to influence the quality of service processes, further validating the framework and the interdependence of the Structure, Process and Outcome components. Understanding the Structure and Process requirements for establishing nursing service innovation lays the foundation for safe, effective and patient-centred clinical care.

  6. Evaluating a voice recognition system: finding the right product for your department.

    PubMed

    Freeh, M; Dewey, M; Brigham, L

    2001-06-01

    The Department of Radiology at the University of Utah Health Sciences Center has been in the process of transitioning from the traditional film-based department to a digital imaging department for the past 2 years. The department is now transitioning from the traditional method of dictating reports (dictation by radiologist to transcription to review and signing by radiologist) to a voice recognition system. The transition to digital operations will not be complete until we have the ability to directly interface the dictation process with the image review process. Voice recognition technology has advanced to the level where it can and should be an integral part of the new way of working in radiology and is an integral part of an efficient digital imaging department. The transition to voice recognition requires the task of identifying the product and the company that will best meet a department's needs. This report introduces the methods we used to evaluate the vendors and the products available as we made our purchasing decision. We discuss our evaluation method and provide a checklist that can be used by other departments to assist with their evaluation process. The criteria used in the evaluation process fall into the following major categories: user operations, technical infrastructure, medical dictionary, system interfaces, service support, cost, and company strength. Conclusions drawn from our evaluation process will be detailed, with the intention being to shorten the process for others as they embark on a similar venture. As more and more organizations investigate the many products and services that are now being offered to enhance the operations of a radiology department, it becomes increasingly important that solid methods are used to most effectively evaluate the new products. This report should help others complete the task of evaluating a voice recognition system and may be adaptable to other products as well.

  7. Learning Evaluation: blending quality improvement and implementation research methods to study healthcare innovations.

    PubMed

    Balasubramanian, Bijal A; Cohen, Deborah J; Davis, Melinda M; Gunn, Rose; Dickinson, L Miriam; Miller, William L; Crabtree, Benjamin F; Stange, Kurt C

    2015-03-10

    In healthcare change interventions, on-the-ground learning about the implementation process is often lost because of a primary focus on outcome improvements. This paper describes the Learning Evaluation, a methodological approach that blends quality improvement and implementation research methods to study healthcare innovations. Learning Evaluation is an approach to multi-organization assessment. Qualitative and quantitative data are collected to conduct real-time assessment of implementation processes while also assessing changes in context, facilitating quality improvement using run charts and audit and feedback, and generating transportable lessons. Five principles are the foundation of this approach: (1) gather data to describe changes made by healthcare organizations and how changes are implemented; (2) collect process and outcome data relevant to healthcare organizations and to the research team; (3) assess multi-level contextual factors that affect implementation, process, outcome, and transportability; (4) assist healthcare organizations in using data for continuous quality improvement; and (5) operationalize common measurement strategies to generate transportable results. Learning Evaluation principles are applied across organizations by the following: (1) establishing a detailed understanding of the baseline implementation plan; (2) identifying target populations and tracking relevant process measures; (3) collecting and analyzing real-time quantitative and qualitative data on important contextual factors; (4) synthesizing data and emerging findings and sharing with stakeholders on an ongoing basis; and (5) harmonizing and fostering learning from process and outcome data. Application to a multi-site program focused on primary care and behavioral health integration shows the feasibility and utility of Learning Evaluation for generating real-time insights into evolving implementation processes. 
Learning Evaluation generates systematic and rigorous cross-organizational findings about implementing healthcare innovations while also enhancing organizational capacity and accelerating translation of findings by facilitating continuous learning within individual sites. Researchers evaluating change initiatives and healthcare organizations implementing improvement initiatives may benefit from a Learning Evaluation approach.
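    One concrete piece of the audit-and-feedback toolkit named above is the run chart. A standard run-chart rule, sketched here under the usual convention and with invented data, flags a "shift" when six or more consecutive points fall on the same side of the baseline median.

```python
# Run-chart "shift" rule used in quality improvement feedback: six or more
# consecutive points on the same side of the baseline median signal a
# non-random change. Data below are invented monthly process measures.

def detect_shift(values, baseline_median, run_length=6):
    """Return True if `run_length` consecutive points sit strictly above
    or strictly below the baseline median. Points exactly on the median
    neither break nor extend a run, per the usual convention."""
    run, side = 0, 0
    for v in values:
        if v == baseline_median:
            continue
        s = 1 if v > baseline_median else -1
        run = run + 1 if s == side else 1
        side = s
        if run >= run_length:
            return True
    return False

monthly_rate = [12, 14, 11, 13, 16, 17, 18, 19, 20, 18]
print(detect_shift(monthly_rate, baseline_median=13.5))
```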

  8. Study of water recovery and solid waste processing for aerospace and domestic applications. Volume 2: Final report

    NASA Technical Reports Server (NTRS)

    Guarneri, C. A.; Reed, A.; Renman, R. E.

    1972-01-01

    The manner in which current and advanced technology can be applied to develop practical solutions to existing and emerging water supply and waste disposal problems is evaluated. An overview of water resource factors as they affect new community planning, and requirements imposed on residential waste treatment systems are presented. The results of equipment surveys contain information describing: commercially available devices and appliances designed to conserve water; devices and techniques for monitoring water quality and controlling back contamination; and advanced water and waste processing equipment. System concepts are developed and compared on the basis of current and projected costs. Economic evaluations are based on community populations of from 2,000 to 250,000. The most promising system concept is defined in sufficient depth to initiate detailed design.

  9. A study of TRIGLYCINE SULFATE (TGS) crystals from the International Microgravity Laboratory Mission (IML-1)

    NASA Technical Reports Server (NTRS)

    Lal, R. B.

    1992-01-01

    Preliminary evaluation of the data was made during the hologram processing procedure. A few representative holograms were selected and reconstructed in the HGS; photographs of sample particle images were made to illustrate the resolution of all three particle sizes. Based on these evaluations, slight modifications were requested in the hologram processing procedure to optimize the hologram exposure in the vicinity of the crystal. Preliminary examination of the data showed that all three sizes of particles can be seen and tracked throughout the chamber. Because of the vast amount of data available in the holograms, it was recommended that a detailed data reduction plan be produced, with prioritization of the different types of data that can be extracted from the holograms.

  10. Assessing the representativeness of wind data for wind turbine site evaluation

    NASA Technical Reports Server (NTRS)

    Renne, D. S.; Corotis, R. B.

    1982-01-01

    Once potential wind turbine sites (either for single installations or clusters) are identified through siting procedures, actual evaluation of the sites must commence. This evaluation is needed to obtain estimates of wind turbine performance and to identify hazards to the machine from the turbulence component of the atmosphere. These estimates allow for more detailed project planning and for preliminary financing arrangements to be secured. The site evaluation process can occur in two stages: (1) utilizing existing nearby data, and (2) establishing and monitoring an onsite measurement program. Since step (2) requires a period of at least a year from the time a potential site has been identified, step (1) is often an essential stage in the preliminary evaluation process. Both the methods that have been developed and the unknowns that still exist in assessing the representativeness of available data for a nearby wind turbine site are discussed. How this assessment of representativeness can be used to develop a more effective onsite meteorological measurement program is also discussed.
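    Step (1), using existing nearby data, is commonly operationalized with a measure-correlate-predict (MCP) style calculation: regress concurrent short-term onsite wind speeds against a nearby reference station, then apply the fit to the station's long-term record. The sketch below uses invented numbers and a simple linear fit; operational MCP methods are typically sector-wise and more elaborate.

```python
# Hypothetical measure-correlate-predict (MCP) sketch: fit onsite vs.
# reference wind speeds over a concurrent period, then map the reference
# station's long-term mean to the candidate site.

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Concurrent short-term records (m/s); values are invented
reference = [4.0, 5.5, 6.0, 7.2, 8.1, 5.0, 6.8, 7.5]
onsite    = [4.8, 6.5, 7.1, 8.6, 9.8, 6.0, 8.2, 9.0]

slope, intercept = linear_fit(reference, onsite)

# Long-term reference climatology (m/s) mapped to the site
long_term_reference_mean = 6.3
site_long_term_mean = slope * long_term_reference_mean + intercept
print(round(site_long_term_mean, 2))
```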

  11. Promising Fuel Cycle Options for R&D – Results, Insights, and Future Directions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wigeland, Roald Arnold

    2015-05-01

    The Fuel Cycle Options (FCO) campaign in the U.S. DOE Fuel Cycle Research & Development Program conducted a detailed evaluation and screening of nuclear fuel cycles. The process for this study was described at the 2014 ICAPP meeting. This paper reports on detailed insights and questions arising from the results of the study. The comprehensive study identified continuous recycle in fast reactors as the most promising option, using either U/Pu or U/TRU recycle, and potentially in combination with thermal reactors, as reported at the ICAPP 2014 meeting. Detailed examination of the results indicated that there was essentially no difference in benefit between U/Pu and U/TRU recycle, prompting questions about the desirability of pursuing the more complex U/TRU approach given the estimated greater challenges for development and deployment. Results will be reported from the current effort that further explores what benefits, if any, TRU recycle (minor actinides in addition to plutonium) may offer, in order to inform decisions on future R&D directions. The study also identified continuous recycle using thorium-based fuel cycles as potentially promising, in either fast or thermal systems, but with lesser benefit. Detailed examination of these results indicated that the lesser benefit was confined to only a few of the evaluation metrics, identifying the conditions under which thorium-based fuel cycles would be promising to pursue. For the most promising fuel cycles, the FCO campaign is also conducting analyses of the potential transition to such fuel cycles to identify the issues, challenges, and timing for critical decisions that would need to be made to avoid unnecessary delay in deployment, including investigation of issues such as the effects of a temporary lack of plutonium fuel resources or supporting infrastructure. These studies are placed in the context of an overall analysis approach designed to provide comprehensive information to the decision-making process.

  12. Modelling biological Cr(VI) reduction in aquifer microcosm column systems.

    PubMed

    Molokwane, Pulane E; Chirwa, Evans M N

    2013-01-01

    Several chrome processing facilities in South Africa release hexavalent chromium (Cr(VI)) into groundwater resources. Pump-and-treat remediation processes have been implemented at some of the sites but have not been successful in reducing contamination levels. The current study is aimed at developing an environmentally friendly, cost-effective and self-sustained biological method to curb the spread of chromium at the contaminated sites. An indigenous Cr(VI)-reducing mixed culture of bacteria was demonstrated to reduce high levels of Cr(VI) in laboratory samples. The effect of Cr(VI) on the removal rate was evaluated at concentrations up to 400 mg/L. Following a detailed evaluation of the fundamental processes for biological Cr(VI) reduction, a predictive model for Cr(VI) breakthrough in aquifer microcosm reactors was developed. The reaction rate in batch followed non-competitive rate kinetics with a Cr(VI) inhibition threshold concentration of approximately 99 mg/L. This study evaluates the application of the kinetic parameters determined in the batch reactors to the continuous flow process. The model developed from advection-reaction kinetics in a porous medium best fitted the effluent Cr(VI) concentration. The model was also used to elucidate the logistic nature of biomass growth in the reactor systems.
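    A rate law of the general kind described, Monod-type kinetics scaled by an inhibition term that vanishes at a threshold concentration, can be sketched as follows. The functional form and all parameter values here are illustrative assumptions, not the fitted values from this study.

```python
# Hypothetical sketch of an inhibited Cr(VI) reduction rate law: Monod
# kinetics multiplied by a linear inhibition factor that reaches zero at a
# threshold concentration, here set near the ~99 mg/L value described.

def crvi_reduction_rate(c, k_max=5.0, k_s=20.0, c_threshold=99.0):
    """Volumetric Cr(VI) reduction rate (mg/L/h) at concentration c (mg/L)."""
    if c >= c_threshold:
        return 0.0  # complete inhibition at/above the threshold
    monod = k_max * c / (k_s + c)
    return monod * (1.0 - c / c_threshold)

# Simple explicit-Euler batch simulation: Cr(VI) declines over time
c, dt, history = 80.0, 0.1, []
for _ in range(200):
    c = max(0.0, c - crvi_reduction_rate(c) * dt)
    history.append(c)
print(history[0], history[-1])
```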

  13. Better movers and thinkers (BMT): A quasi-experimental study into the impact of physical education on children's cognition—A study protocol

    PubMed Central

    Dalziell, Andrew; Boyle, James; Mutrie, Nanette

    2015-01-01

    This study extends a pilot study and will evaluate the impact of a novel approach to PE, Better Movers and Thinkers (BMT), on students' cognition, physical activity (PA) habits, and gross motor coordination (GMC). The study will involve six mainstream state schools with students aged 9–11 years. Three schools will be allocated to the intervention condition and three to the control condition. The design is a 16-week intervention with pre-, post- and 6-month follow-up measurements taken using the ‘Cognitive Assessment System (CAS)’, GMC tests, and the ‘Physical Activity Habits Questionnaire for Children (PAQ-C)’. Qualitative data will be gathered using student focus groups and class teacher interviews in each of the six schools. ANCOVA will be used to evaluate any effect of the intervention, comparing pre-test scores with post-test scores and then pre-test scores with 6-month follow-up scores. Qualitative data will be analysed through an iterative process using grounded theory. This protocol provides the rationale and design of the study and details of the intervention, outcome measures, and the recruitment process. The study will address gaps within current research by evaluating whether a change of approach in the delivery of PE within schools has an effect on children's cognition, PA habits, and GMC within a Scottish setting. PMID:26844172
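    The ANCOVA described in such a protocol, post-test scores regressed on a group indicator with pre-test scores as the covariate, can be sketched with ordinary least squares on invented data; the coefficient on the group indicator estimates the intervention effect adjusted for pre-test differences.

```python
# Minimal ANCOVA sketch (invented data): post ~ b0 + b1*group + b2*pre.
# b1 estimates the intervention effect adjusted for the pre-test covariate.
import numpy as np

pre   = np.array([10, 12, 11, 13,  9, 14, 10, 12, 11, 13], dtype=float)
group = np.array([ 0,  0,  0,  0,  0,  1,  1,  1,  1,  1], dtype=float)
# Post-test: pre-test carried forward, a +3 effect in the intervention arm,
# plus small measurement noise
post  = pre + 3.0 * group + np.array([0.2, -0.1, 0.0, 0.1, -0.2,
                                      0.1, -0.1, 0.2, 0.0, -0.2])

X = np.column_stack([np.ones_like(pre), group, pre])
coef, *_ = np.linalg.lstsq(X, post, rcond=None)
print(coef[1])  # adjusted group effect, close to the built-in +3
```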

  14. Is the work flow model a suitable candidate for an observatory supervisory control infrastructure?

    NASA Astrophysics Data System (ADS)

    Daly, Philip N.; Schumacher, Germán.

    2016-08-01

    This paper reports on an early investigation of using the workflow model for observatory infrastructure software. We researched several workflow engines and identified three for further detailed study: Bonita BPM, Activiti, and Taverna. We discuss the business process model and how it relates to observatory operations, and identify a pathfinder exercise to further evaluate the applicability of these paradigms.

  15. Nighttime images fusion based on Laplacian pyramid

    NASA Astrophysics Data System (ADS)

    Wu, Cong; Zhan, Jinhao; Jin, Jicheng

    2018-02-01

    This paper describes the average weighted fusion, image pyramid fusion, and wavelet transform methods, and applies them to the fusion of multiple-exposure nighttime images. By calculating the information entropy and cross entropy of the fused images, the effect of the different fusion methods can be evaluated. Experiments showed that the Laplacian pyramid image fusion algorithm is well suited to nighttime image fusion: it reduces halo artifacts while preserving image detail.
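    The two evaluation metrics named above can be computed from grayscale histograms. A sketch on small invented arrays, treating images as 8-bit grayscale:

```python
# Information entropy of a fused image and cross entropy against a source
# image, computed from 256-bin histograms. The "images" are invented
# random arrays standing in for under- and over-exposed nighttime frames.
import numpy as np

def entropy(img, bins=256):
    p, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def cross_entropy(src, fused, bins=256, eps=1e-12):
    p, _ = np.histogram(src, bins=bins, range=(0, 256))
    q, _ = np.histogram(fused, bins=bins, range=(0, 256))
    p = p / p.sum()
    q = q / q.sum()
    mask = p > 0
    return -np.sum(p[mask] * np.log2(q[mask] + eps))

rng = np.random.default_rng(0)
dark   = rng.integers(0, 64, (32, 32))      # underexposed frame
bright = rng.integers(192, 256, (32, 32))   # overexposed frame
fused  = (dark + bright) // 2               # trivial average fusion
print(entropy(fused), cross_entropy(dark, fused))
```

Higher entropy indicates a fused image carrying more information; lower cross entropy indicates a histogram closer to the source image's.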

  16. Review of digital holography reconstruction methods

    NASA Astrophysics Data System (ADS)

    Dovhaliuk, Rostyslav Yu.

    2018-01-01

    The development of digital holography has opened new ways to study both transparent and opaque objects non-destructively. In this paper, the digital hologram reconstruction process is investigated. The advantages and limitations of common wave propagation methods are discussed, and the details of a software implementation of digital hologram reconstruction methods are presented. Finally, the performance of each wave propagation method is evaluated, and recommendations are given about possible use cases for each.
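    One widely used wave propagation method in digital hologram reconstruction is the angular spectrum method: Fourier-transform the recorded field, multiply by the free-space transfer function, and transform back. A minimal sketch with illustrative wavelength, pixel pitch, and distance values:

```python
# Angular spectrum propagation of a complex 2-D field. The transfer
# function is exp(i*2*pi*d*sqrt(1/lambda^2 - fx^2 - fy^2)); evanescent
# components (negative argument) are suppressed.
import numpy as np

def angular_spectrum_propagate(field, wavelength, pitch, distance):
    """Propagate a complex 2-D field by `distance` (all units in meters)."""
    n, m = field.shape
    fx = np.fft.fftfreq(m, d=pitch)
    fy = np.fft.fftfreq(n, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2j * np.pi * distance * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(kz) * (arg >= 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Illustrative example: square aperture lit by a plane wave, HeNe laser,
# 10 um pixel pitch, 5 mm propagation distance
aperture = np.zeros((64, 64), dtype=complex)
aperture[24:40, 24:40] = 1.0
out = angular_spectrum_propagate(aperture, 633e-9, 10e-6, 5e-3)
print(out.shape)
```

Because the transfer function has unit modulus for propagating components, total energy is conserved, which makes a convenient sanity check on any implementation.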

  17. Modeling Linear and Cyclic PKS Intermediates through Atom Replacement

    PubMed Central

    2015-01-01

    The mechanistic details of many polyketide synthases (PKSs) remain elusive due to the instability of transient intermediates that are not accessible via conventional methods. Here we report a strategy that enables the rapid preparation of polyketone surrogates by selective atom replacement, thereby providing key substrate mimetics for detailed mechanistic evaluations. Polyketone mimetics are positioned on the actinorhodin acyl carrier protein (actACP) to probe the underpinnings of substrate association upon nascent chain elongation and processivity. Protein NMR is used to visualize substrate interaction with the actACP: a tetraketide substrate is shown not to bind within the protein, while heptaketide and octaketide substrates show strong association between helices II and IV. To examine the later cyclization stages, we extended this strategy to prepare stabilized cyclic intermediates and evaluate their binding by the actACP. Elongated monocyclic mimics show much longer residence times within actACP than shortened analogs. Taken together, these observations suggest ACP-substrate association occurs both before and after ketoreductase action upon the fully elongated polyketone, indicating a key role played by the ACP in PKS timing and processivity. These atom replacement mimetics offer new tools to study protein and substrate interactions and are applicable to a wide variety of PKSs. PMID:25406716

  18. The role of change data in a land use and land cover map updating program

    USGS Publications Warehouse

    Milazzo, Valerie A.

    1981-01-01

    An assessment of current land use and a process for identifying and measuring change are needed to evaluate trends and problems associated with the use of our Nation's land resources. The U. S. Geological Survey is designing a program to maintain the currency of its land use and land cover maps and digital data base and to provide data on changes in our Nation's land use and land cover. Ways to produce and use change data in a map updating program are being evaluated. A dual role for change data is suggested. For users whose applications require specific polygon data on land use change, showing the locations of all individual category changes and detailed statistical data on these changes can be provided as byproducts of the map-revision process. Such products can be produced quickly and inexpensively either by conventional mapmaking methods or as specialized output from a computerized geographic information system. Secondly, spatial data on land use change are used directly for updating existing maps and statistical data. By incorporating only selected change data, maps and digital data can be updated in an efficient and timely manner without the need for complete and costly detailed remapping and redigitization of polygon data.

  19. Strengthening stakeholder-engaged research and research on stakeholder engagement.

    PubMed

    Ray, Kristin N; Miller, Elizabeth

    2017-06-01

    Stakeholder engagement is an emerging field with little evidence to inform best practices. Guidelines are needed to improve the quality of research on stakeholder engagement through more intentional planning, evaluation and reporting. We developed a preliminary framework for planning, evaluating and reporting stakeholder engagement, informed by published conceptual models and recommendations and then refined through our own stakeholder engagement experience. Our proposed exploratory framework highlights contexts and processes to be addressed in planning stakeholder engagement, and potential immediate, intermediate and long-term outcomes that warrant evaluation. We use this framework to illustrate both the minimum information needed for reporting stakeholder-engaged research and the comprehensive detail needed for reporting research on stakeholder engagement.

  20. Strengthening stakeholder-engaged research and research on stakeholder engagement

    PubMed Central

    Ray, Kristin N; Miller, Elizabeth

    2017-01-01

    Stakeholder engagement is an emerging field with little evidence to inform best practices. Guidelines are needed to improve the quality of research on stakeholder engagement through more intentional planning, evaluation and reporting. We developed a preliminary framework for planning, evaluating and reporting stakeholder engagement, informed by published conceptual models and recommendations and then refined through our own stakeholder engagement experience. Our proposed exploratory framework highlights contexts and processes to be addressed in planning stakeholder engagement, and potential immediate, intermediate and long-term outcomes that warrant evaluation. We use this framework to illustrate both the minimum information needed for reporting stakeholder-engaged research and the comprehensive detail needed for reporting research on stakeholder engagement. PMID:28621551

  1. Application of ERTS-1 imagery to land use, forest density and soil investigations in Greece

    NASA Technical Reports Server (NTRS)

    Yassoglou, N. J.; Skordalakis, E.; Koutalos, A.

    1974-01-01

    Photographic and digital imagery received from ERTS-1 was analyzed and evaluated as to its usefulness for the assessment of agricultural and forest land resources. Black and white, and color composite imagery provided spectral and spatial data, which, when matched with temporal land information, provided the basis for a semidetailed land use and forest site evaluation cartography. Color composite photographs have provided some information on the status of irrigation of agricultural lands. Computer processed digital imagery was successfully used for detailed crop classification and semidetailed soil evaluation. The results and techniques of this investigation are applicable to ecological and geological conditions similar to those prevailing in the Eastern Mediterranean.

  2. Application of IPAD to missile design

    NASA Technical Reports Server (NTRS)

    Santa, J. E.; Whiting, T. R.

    1974-01-01

    The application of an integrated program for aerospace-vehicle design (IPAD) to the design of a tactical missile is examined. The feasibility of modifying a proposed IPAD system for aircraft design work for use in missile design is evaluated. The tasks, cost, and schedule for the modification are presented. The basic engineering design process is described, explaining how missile design is achieved through iteration of six logical problem solving functions throughout the system studies, preliminary design, and detailed design phases of a new product. Existing computer codes used in various engineering disciplines are evaluated for their applicability to IPAD in missile design.

  3. Implementation of an Outer Can Welding System for Savannah River Site FB-Line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howard, S.R.

    2003-03-27

    This paper details three phases of testing to confirm use of a Gas Tungsten Arc (GTA) system for closure welding the 3013 outer container used for stabilization/storage of plutonium metals and oxides. The outer container/lid closure joint was originally designed for laser welding, but for this application, the gas tungsten arc (GTA) welding process has been adapted. The testing progressed in three phases: (1) system checkout to evaluate system components for operational readiness, (2) troubleshooting to evaluate high weld failure rates and develop corrective techniques, and (3) pre-installation acceptance testing.

  4. Task 4 supporting technology. Part 2: Detailed test plan for thermal seals. Thermal seals evaluation, improvement and test. CAN8-1, Reusable Launch Vehicle (RLV), advanced technology demonstrator: X-33. Leading edge and seals thermal protection system technology demonstration

    NASA Technical Reports Server (NTRS)

    Hogenson, P. A.; Lu, Tina

    1995-01-01

    The objective is to develop the advanced thermal seals to a technology readiness level (TRL) of 6 to support the rapid turnaround time and low maintenance requirements of the X-33 and the future reusable launch vehicle (RLV). This program is divided into three subtasks: (1) orbiter thermal seals operation history review; (2) material, process, and design improvement; and (3) fabrication and evaluation of the advanced thermal seals.

  5. Comparative Evaluation of Financing Programs: Insights From California’s Experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deason, Jeff

    Berkeley Lab examines criteria for a comparative assessment of multiple financing programs for energy efficiency, developed through a statewide public process in California. The state legislature directed the California Alternative Energy and Advanced Transportation Financing Authority (CAEATFA) to develop these criteria. CAEATFA's report to the legislature, an invaluable reference for other jurisdictions considering these topics, discusses the proposed criteria and the rationales behind them in detail. Berkeley Lab's brief focuses on several salient issues that emerged during the criteria development and discussion process. Many of these issues are likely to arise in other states that plan to evaluate the impacts of energy efficiency financing programs, whether for a single program or multiple programs. Issues discussed in the brief include: the stakeholder process used to develop the proposed assessment criteria; attribution of outcomes, such as energy savings, to financing programs vs. other drivers; choosing the outcome metric of primary interest (program take-up levels vs. savings); the use of net benefits vs. benefit-cost ratios for cost-effectiveness evaluation; non-energy factors; consumer protection factors; market transformation impacts; accommodating varying program goals in a multi-program evaluation; accounting for costs and risks borne by various parties, including taxpayers and utility customers, in cost-effectiveness analysis; and how to account for potential synergies among programs in a multi-program evaluation.

  6. Enabling high-quality observations of surface imperviousness for water runoff modelling from unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Tokarczyk, Piotr; Leitao, Joao Paulo; Rieckermann, Jörg; Schindler, Konrad; Blumensaat, Frank

    2015-04-01

    Modelling rainfall-runoff in urban areas is increasingly applied to support flood risk assessment, particularly against the background of a changing climate and increasing urbanization. These models typically rely on high-quality data for rainfall and the surface characteristics of the area. While recent research in urban drainage has focused on providing spatially detailed rainfall data, the technological advances in remote sensing that ease the acquisition of detailed land-use information are less prominently discussed within the community. The relevance of such methods increases because, in many parts of the globe, accurate land-use information is generally lacking, as detailed image data is unavailable. Modern unmanned aerial vehicles (UAVs) allow acquiring high-resolution images on a local level at comparably low cost, performing on-demand repetitive measurements, and obtaining a degree of detail tailored to the purpose of the study. In this study, we investigate for the first time the possibility of deriving high-resolution imperviousness maps for urban areas from UAV imagery and using this information as input for urban drainage models. To do so, an automatic processing pipeline with a modern classification method is tested and applied in a state-of-the-art urban drainage modelling exercise. In a real-life case study in the area of Lucerne, Switzerland, we compare imperviousness maps generated from a consumer micro-UAV and standard large-format aerial images acquired by the Swiss national mapping agency (swisstopo). After assessing their correctness, we perform an end-to-end comparison in which they are used as input for an urban drainage model. We then evaluate the influence that different image data sources and their processing methods have on hydrological and hydraulic model performance. We analyze the surface runoff of the 307 individual sub-catchments with regard to relevant attributes, such as peak runoff and volume. Finally, we evaluate the model's channel flow prediction performance through a cross-comparison with reference flow measured at the catchment outlet. We show that imperviousness maps generated using UAV imagery processed with modern classification methods achieve accuracy comparable with standard, off-the-shelf aerial imagery. In the examined case study, we find that the different imperviousness maps have only a limited influence on modelled surface runoff and pipe flows. We conclude that UAV imagery represents a valuable alternative data source for urban drainage model applications due to the possibility of flexibly acquiring up-to-date aerial images at superior quality and a competitive price. Our analyses furthermore suggest that spatially more detailed urban drainage models can benefit even more from the full detail of UAV imagery.
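
    The core quantity such an imperviousness map feeds into a drainage model is simple to sketch. Below is a minimal illustration, assuming a per-catchment list of land-cover codes in which code 1 means impervious; the codes and the helper function are invented for illustration, not the authors' actual pipeline.

```python
def imperviousness_fraction(classes, impervious_codes=(1,)):
    """Fraction of a sub-catchment's pixels labelled impervious.

    `classes` holds the land-cover codes of the pixels inside one
    sub-catchment; which codes count as impervious is an assumption.
    """
    if not classes:
        return 0.0
    hits = sum(1 for c in classes if c in impervious_codes)
    return hits / len(classes)

# toy sub-catchment with 6 classified pixels, 3 of them impervious
catchment_pixels = [1, 1, 0, 1, 0, 0]
print(imperviousness_fraction(catchment_pixels))  # 0.5
```

    In a real pipeline this fraction would be computed per sub-catchment from the classified raster and written into the drainage model's catchment attributes.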

  7. High-quality observation of surface imperviousness for urban runoff modelling using UAV imagery

    NASA Astrophysics Data System (ADS)

    Tokarczyk, P.; Leitao, J. P.; Rieckermann, J.; Schindler, K.; Blumensaat, F.

    2015-01-01

    Modelling rainfall-runoff in urban areas is increasingly applied to support flood risk assessment, particularly against the background of a changing climate and increasing urbanization. These models typically rely on high-quality data for rainfall and the surface characteristics of the area. While recent research in urban drainage has focused on providing spatially detailed rainfall data, the technological advances in remote sensing that ease the acquisition of detailed land-use information are less prominently discussed within the community. The relevance of such methods increases because, in many parts of the globe, accurate land-use information is generally lacking, as detailed image data is unavailable. Modern unmanned aerial vehicles (UAVs) allow acquiring high-resolution images on a local level at comparably low cost, performing on-demand repetitive measurements, and obtaining a degree of detail tailored to the purpose of the study. In this study, we investigate for the first time the possibility of deriving high-resolution imperviousness maps for urban areas from UAV imagery and using this information as input for urban drainage models. To do so, an automatic processing pipeline with a modern classification method is tested and applied in a state-of-the-art urban drainage modelling exercise. In a real-life case study in the area of Lucerne, Switzerland, we compare imperviousness maps generated from a consumer micro-UAV and standard large-format aerial images acquired by the Swiss national mapping agency (swisstopo). After assessing their correctness, we perform an end-to-end comparison in which they are used as input for an urban drainage model. We then evaluate the influence that different image data sources and their processing methods have on hydrological and hydraulic model performance. We analyze the surface runoff of the 307 individual subcatchments with regard to relevant attributes, such as peak runoff and volume. Finally, we evaluate the model's channel flow prediction performance through a cross-comparison with reference flow measured at the catchment outlet. We show that imperviousness maps generated using UAV imagery processed with modern classification methods achieve accuracy comparable with standard, off-the-shelf aerial imagery. In the examined case study, we find that the different imperviousness maps have only a limited influence on modelled surface runoff and pipe flows. We conclude that UAV imagery represents a valuable alternative data source for urban drainage model applications due to the possibility of flexibly acquiring up-to-date aerial images at superior quality and a competitive price. Our analyses furthermore suggest that spatially more detailed urban drainage models can benefit even more from the full detail of UAV imagery.

  8. Global cost and weight evaluation of fuselage keel design concepts

    NASA Technical Reports Server (NTRS)

    Flynn, B. W.; Morris, M. R.; Metschan, S. L.; Swanson, G. D.; Smith, P. J.; Griess, K. H.; Schramm, M. R.; Humphrey, R. J.

    1993-01-01

    The Boeing program entitled Advanced Technology Composite Aircraft Structure (ATCAS) is focused on the application of affordable composite technology to pressurized fuselage structure of future aircraft. As part of this effort, a design study was conducted on the keel section of the aft fuselage. A design build team (DBT) approach was used to identify and evaluate several design concepts which incorporated different material systems, fabrication processes, structural configurations, and subassembly details. The design concepts were developed in sufficient detail to accurately assess their potential for cost and weight savings as compared with a metal baseline representing current wide body technology. The cost and weight results, along with an appraisal of performance and producibility risks, are used to identify a globally optimized keel design; one which offers the most promising cost and weight advantages over metal construction. Lastly, an assessment is given of the potential for further cost and weight reductions of the selected keel design during local optimization.

  9. GIS-assisted spatial analysis for urban regulatory detailed planning: designer's dimension in the Chinese code system

    NASA Astrophysics Data System (ADS)

    Yu, Yang; Zeng, Zheng

    2009-10-01

    By discussing the causes behind the high amendment ratio in the implementation of urban regulatory detailed plans in China, despite their law-ensured status, the study aims to reconcile the conflict between the legal authority of regulatory detailed planning and the insufficient scientific support for its decision-making and compilation. It does so by introducing spatial analysis based on GIS technology and 3D modeling into the process, thus presenting a more scientific and flexible approach to regulatory detailed planning in China. The study first points out that the current compilation process of the urban regulatory detailed plan in China employs a mainly empirical approach, which renders it constantly subject to amendments. The study then discusses the need for and current utilization of GIS in the Chinese system and proposes the framework of a GIS-assisted 3D spatial analysis process from the designer's perspective, which can be regarded as an alternating process between the descriptive codes and physical design in the compilation of regulatory detailed planning. With a case study of the processes and results from the application of the framework, the paper concludes that the proposed framework can be an effective instrument, providing more rationality, flexibility, and thus more efficiency to the compilation and decision-making process of the urban regulatory detailed plan in China.

  10. Wash water recovery system

    NASA Technical Reports Server (NTRS)

    Deckman, G.; Rousseau, J. (Editor)

    1973-01-01

    The Wash Water Recovery System (WWRS) is intended for use in processing shower bath water onboard a spacecraft. The WWRS utilizes flash evaporation, vapor compression, and pyrolytic reaction to process the wash water and allow recovery of potable water. Wash water flashing and foaming characteristics are evaluated, physical properties of concentrated wash water are determined, and a long-term feasibility study of the system is performed. In addition, a computer analysis of the system and a detailed design of a 10 lb/hr vortex-type water vapor compressor were completed. The computer analysis also sized the remaining system components on the basis of the new vortex compressor design.

  11. CNES reliability approach for the qualification of MEMS for space

    NASA Astrophysics Data System (ADS)

    Pressecq, Francis; Lafontan, Xavier; Perez, Guy; Fortea, Jean-Pierre

    2001-10-01

    This paper describes the reliability approach used at CNES to evaluate MEMS for space applications. After an introduction and a detailed state of the art on space requirements and on the use of MEMS for space, different approaches for taking MEMS into account in the qualification phases are presented. CNES proposes improvements to these approaches in terms of failure mechanism identification. Our approach is based on a design and test phase deeply linked with a technology study. This workflow is illustrated with an example: the case of a variable capacitor fabricated with the MUMPs process.

  12. Hybrid receiver study

    NASA Technical Reports Server (NTRS)

    Stone, M. S.; Mcadam, P. L.; Saunders, O. W.

    1977-01-01

    The results are presented of a 4 month study to design a hybrid analog/digital receiver for outer planet mission probe communication links. The scope of this study includes functional design of the receiver; comparisons between analog and digital processing; hardware tradeoffs for key components including frequency generators, A/D converters, and digital processors; development and simulation of the processing algorithms for acquisition, tracking, and demodulation; and detailed design of the receiver in order to determine its size, weight, power, reliability, and radiation hardness. In addition, an evaluation was made of the receiver's capabilities to perform accurate measurement of signal strength and frequency for radio science missions.

  13. Performance and techno-economic assessment of several solid-liquid separation technologies for processing dilute-acid pretreated corn stover.

    PubMed

    Sievers, David A; Tao, Ling; Schell, Daniel J

    2014-09-01

    Solid-liquid separation of pretreated lignocellulosic biomass slurries is a critical unit operation employed in several different processes for production of fuels and chemicals. An effective separation process achieves good recovery of solute (sugars) and efficient dewatering of the biomass slurry. Dilute acid pretreated corn stover slurries were subjected to pressure and vacuum filtration and basket centrifugation to evaluate the technical and economic merits of these technologies. Experimental performance results were used to perform detailed process simulations and economic analysis using a 2000 tonne/day biorefinery model to determine differences between the various filtration methods and their process settings. The filtration processes were able to successfully separate pretreated slurries into liquor and solid fractions with estimated sugar recoveries of at least 95% using a cake washing process. A continuous vacuum belt filter produced the most favorable process economics. Copyright © 2014 Elsevier Ltd. All rights reserved.
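
    The roughly 95% sugar recovery quoted above can be illustrated with a simple displacement-washing mass balance: the filtrate carries most of the sugar, the cake retains some liquor, and washing displaces part of what the cake holds. The function and all numbers below are a hedged sketch, not the paper's process model.

```python
def sugar_recovery(liquor_mass, sugar_conc, cake_liquor_mass, wash_efficiency):
    """Fraction of dissolved sugar recovered in filtrate plus wash water.

    liquor_mass       total liquor in the slurry (kg)
    sugar_conc        sugar concentration of the liquor (kg sugar / kg liquor)
    cake_liquor_mass  liquor retained in the filter cake after filtration (kg)
    wash_efficiency   fraction of cake-retained sugar displaced by washing
    """
    total_sugar = liquor_mass * sugar_conc
    filtrate_sugar = (liquor_mass - cake_liquor_mass) * sugar_conc
    washed_sugar = cake_liquor_mass * sugar_conc * wash_efficiency
    return (filtrate_sugar + washed_sugar) / total_sugar

# e.g. 100 kg liquor, 20 kg retained in the cake, 80% wash displacement
print(round(sugar_recovery(100.0, 0.05, 20.0, 0.8), 3))  # 0.96
```

    A techno-economic comparison would then trade this recovery against the capital and operating cost of each separation technology.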

  14. Determinants of business sustainability: an ergonomics perspective.

    PubMed

    Genaidy, Ash M; Sequeira, Reynold; Rinder, Magda M; A-Rehim, Amal D

    2009-03-01

    There is a need to integrate both macro- and micro-ergonomic approaches for the effective implementation of interventions designed to improve the root causes of problems such as work safety, quality and productivity in the enterprise system. The objective of this study was to explore from an ergonomics perspective the concept of business sustainability through optimising the worker-work environment interface. The specific aims were: (a) to assess the working conditions of a production department work process with the goal to jointly optimise work safety, quality and quantity; (b) to evaluate the enterprise-wide work process at the system level as a social entity in an attempt to trace the root causes of ergonomic issues impacting employees throughout the work process. The Work Compatibility Model was deployed to examine the experiences of workers (that is, effort, perceived risk/benefit, performance and satisfaction/dissatisfaction or psychological impact) and their associations with the complex domains of the work environment (task content, physical and non-physical work environment and conditions for learning/growth/development). This was followed by assessment of the enterprise system through detailed interviews with department managers and lead workers. A system diagnostic instrument was also constructed from information derived from the published literature to evaluate the enterprise system performance. The investigation of the production department indicated that the stress and musculoskeletal pain experienced by workers (particularly on the day shift) were derived from sources elsewhere in the work process. The enterprise system evaluation and detailed interviews allowed the research team to chart the feed-forward and feedback stress propagation loops in the work system. System improvement strategies were extracted on the basis of tacit/explicit knowledge obtained from department managers and lead workers. 
In certain situations concerning workplace human performance issues, a combined macro-micro ergonomic methodology is essential to solve the productivity, quality and safety issues impacting employees along the trajectory or path of the enterprise-wide work process. In this study, the symptoms associated with human performance issues in one production department work process had root causes originating in the customer service department work process. In fact, the issues found in the customer service department caused performance problems elsewhere in the enterprise-wide work process such as the traffic department. Sustainable enterprise solutions for workplace human performance require the integration of macro- and micro-ergonomic approaches.

  15. Low void content autoclave molded titanium alloy and polyimide graphite composite structures.

    NASA Technical Reports Server (NTRS)

    Vaughan, R. W.; Jones, R. J.; Creedon, J. F.

    1972-01-01

    This paper discusses a resin developed for use in autoclave molding of polyimide graphite composite stiffened, titanium alloy structures. Both primary and secondary bonded structures were evaluated that were produced by autoclave processing. Details of composite processing, adhesive formulary, and bonding processes are provided in this paper, together with mechanical property data for structures. These data include -65 F, room temperature, and 600 F shear strengths; strength retention after aging; and stress rupture properties at 600 F under various stress levels for up to 1000 hours duration. Typically, shear strengths in excess of 16 ksi at room temperature with over 60% strength retention at 600 F were obtained with titanium alloy substrates.

  16. Prediction and Estimation of Scaffold Strength with different pore size

    NASA Astrophysics Data System (ADS)

    Muthu, P.; Mishra, Shubhanvit; Sri Sai Shilpa, R.; Veerendranath, B.; Latha, S.

    2018-04-01

    This paper emphasizes the significance of predicting and estimating the mechanical strength of 3D functional scaffolds before the manufacturing process. Prior evaluation of the mechanical strength and structural properties of the scaffold reduces fabrication cost and eases the design process. Detailed analysis and investigation of various mechanical properties, including shear stress equivalence, help estimate the effect of porosity and pore size on the functionality of the scaffold. The influence of variation in porosity was examined computationally via finite element analysis (FEA) using the ANSYS application software. The results indicate that this computational method is well suited to regulating and optimizing the intricate engineering design process.
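
    The porosity-stiffness link that such an FEA study probes can be sketched with the textbook Gibson-Ashby scaling for cellular solids, E/E_s ≈ C(1 - φ)^n. This is a stand-in for the paper's simulations, not its results; the prefactor, exponent, base modulus, and units below are illustrative assumptions.

```python
def porosity(pore_volume, total_volume):
    """Void fraction phi of the scaffold."""
    return pore_volume / total_volume

def effective_modulus(E_solid, phi, C=1.0, n=2.0):
    """Gibson-Ashby estimate of scaffold modulus at porosity phi."""
    return E_solid * C * (1.0 - phi) ** n

phi = porosity(0.6, 1.0)  # 60% porous scaffold
# with an assumed bulk modulus of 2000 (MPa): 2000 * 0.4^2
print(round(effective_modulus(2000.0, phi), 1))  # 320.0
```

    A designer could sweep phi over candidate pore sizes with such a relation before committing to a detailed FEA run.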

  17. Green piezoelectric for autonomous smart textile

    NASA Astrophysics Data System (ADS)

    Lemaire, E.; Borsa, C. J.; Briand, D.

    2015-12-01

    In this work, the fabrication of Rochelle salt based piezoelectric textiles is demonstrated. Structures composed of fibers and Rochelle salt are easily produced using green processes. Both the manufacturing and the material itself are highly efficient in terms of environmental impact, considering the fabrication processes and the material resources involved. Additionally, Rochelle salt is biocompatible. Within this green paradigm, active sensing and actuating textiles are developed. The processing method and piezoelectric properties have been studied: (1) pure crystals are used as an acoustic actuator, (2) the fabrication of the textile-based composite is detailed, and (3) the converse effective d33 is evaluated and compared to that of lead zirconate titanate ceramic. The utility of textile-based piezoelectrics merits their use in a wide array of applications.

  18. Operationalizing strategic marketing.

    PubMed

    Chambers, S B

    1989-05-01

    The strategic marketing process, like any administrative practice, is far simpler to conceptualize than operationalize within an organization. It is for this reason that this chapter focused on providing practical techniques and strategies for implementing the strategic marketing process. First and foremost, the marketing effort needs to be marketed to the various publics of the organization. This chapter advocated the need to organize the marketing analysis into organizational, competitive, and market phases, and it provided examples of possible designs of the phases. The importance and techniques for exhausting secondary data sources and conducting efficient primary data collection methods were explained and illustrated. Strategies for determining marketing opportunities and threats, as well as segmenting markets, were detailed. The chapter provided techniques for developing marketing strategies, including considering the five patterns of coverage available; determining competitor's position and the marketing mix; examining the stage of the product life cycle; and employing a consumer decision model. The importance of developing explicit objectives, goals, and detailed action plans was emphasized. Finally, helpful hints for operationalizing the communication variable and evaluating marketing programs were provided.

  19. RootGraph: a graphic optimization tool for automated image analysis of plant roots

    PubMed Central

    Cai, Jinhai; Zeng, Zhanghui; Connor, Jason N.; Huang, Chun Yuan; Melino, Vanessa; Kumar, Pankaj; Miklavcic, Stanley J.

    2015-01-01

    This paper outlines a numerical scheme for accurate, detailed, and high-throughput image analysis of plant roots. In contrast to existing root image analysis tools that focus on root system-average traits, a novel, fully automated and robust approach for the detailed characterization of root traits, based on a graph optimization process is presented. The scheme, firstly, distinguishes primary roots from lateral roots and, secondly, quantifies a broad spectrum of root traits for each identified primary and lateral root. Thirdly, it associates lateral roots and their properties with the specific primary root from which the laterals emerge. The performance of this approach was evaluated through comparisons with other automated and semi-automated software solutions as well as against results based on manual measurements. The comparisons and subsequent application of the algorithm to an array of experimental data demonstrate that this method outperforms existing methods in terms of accuracy, robustness, and the ability to process root images under high-throughput conditions. PMID:26224880
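
    The graph-optimization idea can be sketched in miniature: skeleton points become weighted graph nodes, and root paths are traced as least-cost paths from the root collar. The toy Dijkstra example below is an invented illustration of that general idea, not RootGraph's actual algorithm, and the graph and weights are made up.

```python
import heapq

def dijkstra(adj, start):
    """Least-cost distance from `start` to every node in `adj`.

    `adj` maps each node to a list of (neighbour, edge_weight) pairs.
    """
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# collar -> branch point -> two tips; take the farthest tip's path
# as the "primary root" in this toy model
adj = {
    "collar": [("branch", 2.0)],
    "branch": [("collar", 2.0), ("tip_a", 5.0), ("tip_b", 1.0)],
    "tip_a": [("branch", 5.0)],
    "tip_b": [("branch", 1.0)],
}
dist = dijkstra(adj, "collar")
primary_tip = max(dist, key=dist.get)
print(primary_tip, dist[primary_tip])  # tip_a 7.0
```

    In a real root image, nodes would be skeleton pixels and edge weights could combine geometric distance with image evidence.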

  20. Review of the Global Models Used Within Phase 1 of the Chemistry-Climate Model Initiative (CCMI)

    NASA Technical Reports Server (NTRS)

    Morgenstern, Olaf; Hegglin, Michaela I.; Rozanov, Eugene; O’Connor, Fiona M.; Abraham, N. Luke; Akiyoshi, Hideharu; Archibald, Alexander T.; Bekki, Slimane; Butchart, Neal; Chipperfield, Martyn P.; hide

    2017-01-01

    We present an overview of state-of-the-art chemistry-climate and chemistry transport models that are used within phase 1 of the Chemistry-Climate Model Initiative (CCMI-1). The CCMI aims to conduct a detailed evaluation of participating models using process-oriented diagnostics derived from observations in order to gain confidence in the models' projections of the stratospheric ozone layer, tropospheric composition, air quality, and, where applicable, global climate change, as well as the interactions between them. Interpretation of these diagnostics requires detailed knowledge of the radiative, chemical, dynamical, and physical processes incorporated in the models. An understanding of the degree to which CCMI-1 recommendations for simulations have been followed is also necessary to understand model responses to anthropogenic and natural forcing and to explain inter-model differences. This becomes even more important given the ongoing development and ever-growing complexity of these models. This paper also provides an overview of the available CCMI-1 simulations with the aim of informing CCMI data users.

  1. Experimental study of PAM-4, CAP-16, and DMT for 100 Gb/s short reach optical transmission systems.

    PubMed

    Zhong, Kangping; Zhou, Xian; Gui, Tao; Tao, Li; Gao, Yuliang; Chen, Wei; Man, Jiangwei; Zeng, Li; Lau, Alan Pak Tao; Lu, Chao

    2015-01-26

    Advanced modulation formats combined with digital signal processing and direct detection are a promising way to realize high-capacity, low-cost, and power-efficient short reach optical transmission systems. In this paper, we present a detailed investigation of the performance of three advanced modulation formats for a 100 Gb/s short reach transmission system: PAM-4, CAP-16, and DMT. The digital signal processing required for each modulation format is presented in detail. Comprehensive simulations are carried out to evaluate the performance of each modulation format in terms of received optical power, transmitter bandwidth, relative intensity noise, and thermal noise. The performance of each modulation format is also experimentally studied. To the best of our knowledge, we report the first demonstration of a 112 Gb/s transmission over 10 km of SSMF employing single-band CAP-16 with an EML. Finally, a comparison of the computational complexity of the DSP for the three formats is presented.
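
    Of the three formats, PAM-4 is the simplest to sketch: each pair of bits maps to one of four amplitude levels, conventionally Gray-coded so that neighbouring levels differ in a single bit. The sketch below uses the conventional normalized level set {-3, -1, 1, 3}; the paper's full DSP chain (equalization, noise handling, etc.) is not reproduced.

```python
# Gray mapping: adjacent amplitude levels differ by exactly one bit,
# so a slicing error to a neighbouring level costs only one bit error.
GRAY_TO_LEVEL = {(0, 0): -3, (0, 1): -1, (1, 1): 1, (1, 0): 3}
LEVEL_TO_GRAY = {v: k for k, v in GRAY_TO_LEVEL.items()}

def pam4_modulate(bits):
    """Map an even-length bit sequence to PAM-4 symbols."""
    pairs = zip(bits[0::2], bits[1::2])
    return [GRAY_TO_LEVEL[p] for p in pairs]

def pam4_demodulate(symbols):
    """Slice received symbols back to bits (nearest-level decision)."""
    out = []
    for s in symbols:
        level = min(GRAY_TO_LEVEL.values(), key=lambda l: abs(l - s))
        out.extend(LEVEL_TO_GRAY[level])
    return out

bits = [0, 0, 0, 1, 1, 1, 1, 0]
syms = pam4_modulate(bits)
print(syms)                           # [-3, -1, 1, 3]
print(pam4_demodulate(syms) == bits)  # True
```

    At 2 bits per symbol, a 56 GBd PAM-4 stream carries 112 Gb/s, which is why the format suits short-reach 100 Gb/s links with low-complexity DSP.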

  2. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
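
    The two engineering-economics metrics named above are easy to state concretely: net present value discounts a cash-flow stream, and the levelized cost of energy divides discounted lifetime cost by discounted lifetime energy output. The functions below are standard textbook definitions; the numbers are purely illustrative.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at t = 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def lcoe(rate, capex, annual_cost, annual_energy, years):
    """Levelized cost of energy: discounted cost / discounted output.

    capex is paid at t = 0; operating cost and energy output are
    assumed constant over the project lifetime.
    """
    disc_cost = capex + sum(annual_cost / (1.0 + rate) ** t
                            for t in range(1, years + 1))
    disc_energy = sum(annual_energy / (1.0 + rate) ** t
                      for t in range(1, years + 1))
    return disc_cost / disc_energy

# invest 1000 now, receive 500/yr for 3 years, at an 8% discount rate
print(round(npv(0.08, [-1000.0, 500.0, 500.0, 500.0]), 2))  # 288.55
# 1200 capex, 20/yr O&M, 400 MWh/yr for 20 years, 8% discount rate
print(round(lcoe(0.08, 1200.0, 20.0, 400.0, 20), 3))        # 0.356
```

    Levelized cost is widely used to compare electricity options precisely because it folds capital, operating cost, and output into one per-unit figure.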

  3. NTP comparison process

    NASA Technical Reports Server (NTRS)

    Corban, Robert

    1993-01-01

    The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide comprehensive traceability and verification of top-level requirements down to detailed system specifications, and it provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams who develop alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high-risk NTP subsystems. The maximum possible amount of quantitative data will be developed and/or validated for use in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.
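
    The QFD-style comparison described above can be reduced, in miniature, to a weighted-scoring matrix: each concept is scored against customer-weighted attributes and compared to the reference configuration. The attributes, weights, and scores below are invented for illustration; a full QFD house of quality additionally maps requirements to design features.

```python
# Invented attribute weights standing in for "voice of the customer"
WEIGHTS = {"performance": 0.5, "reliability": 0.3, "cost": 0.2}

def weighted_score(scores):
    """Weighted total of a concept's attribute scores."""
    return sum(WEIGHTS[k] * v for k, v in scores.items())

# Invented 1-10 scores for a reference configuration and one alternative
concepts = {
    "reference": {"performance": 6, "reliability": 7, "cost": 6},
    "concept_A": {"performance": 8, "reliability": 6, "cost": 5},
}
totals = {name: weighted_score(s) for name, s in concepts.items()}
best = max(totals, key=totals.get)
print(best, round(totals[best], 2))  # concept_A 6.8
```

    A concept that outscores the reference on the weighted total would, per the process above, have its winning features folded back into the reference configuration.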

  4. Promoting a smokers' quitline in Ontario, Canada: an evaluation of an academic detailing approach.

    PubMed

    Kirst, Maritt; Schwartz, Robert

    2015-06-01

    This study assesses the impact of an academic detailing quitline promotional outreach program on integration of patient referrals to the quitline by fax in healthcare settings and quitline utilization in Ontario, Canada. The study employed a mixed methods approach for evaluation, with trend analysis of quitline administrative data from the year before program inception (2005) to 2011 and qualitative interviews with quitline stakeholders. Participants in the qualitative interviews included academic detailing program staff, regional tobacco control stakeholders and quitline promotion experts. Quantitative outcomes included the number of fax referral partners and fax referrals received, and quitline reach. Trends in proximal and distal outreach program outcomes were assessed. The qualitative data were analysed through a process of data coding involving the constant comparative technique derived from grounded theory methods. The study identified that the outreach program has had some success in integrating the fax referral program in healthcare settings through evidence of increased fax referrals since program inception. However, organizational barriers to program partner engagement have been encountered. While referral from health professionals through the fax referral programs has increased since the inception of the outreach program, the overall reach of the quitline has not increased. The study findings highlight that an academic detailing approach to quitline promotion can have some success in achieving increased fax referral program integration in healthcare settings. However, findings suggest that investment in a comprehensive promotional strategy, incorporating academic detailing, media and the provision of free cessation medications may be a more effective approach to quitline promotion. © The Author (2013). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Clostridium botulinum and the clinical laboratorian: a detailed review of botulism, including biological warfare ramifications of botulinum toxin.

    PubMed

    Caya, James G; Agni, Rashmi; Miller, Joan E

    2004-06-01

    This review article is designed to thoroughly familiarize all health care professionals with the history, classification, epidemiology, clinical characteristics, differential diagnosis, diagnostic evaluation (including laboratory-based testing), treatment, and prognosis of botulism. It is especially targeted toward clinical laboratorians and includes a detailed enumeration of the important clinical laboratory contributions to the diagnosis, treatment, and monitoring of patients with botulism. Finally, the bioterrorism potential for botulism is discussed, with an emphasis on the clinical laboratory ramifications of this possibility. Sources included medical periodicals and textbooks accessioned through computerized and manual medical literature searches. More than 1000 medical works published from the 1800s through 2003 were retrieved and reviewed in this process. Pertinent data are presented in textual and tabular formats, the latter including 6 tables presenting detailed information regarding the clinical parameters, differential diagnosis, diagnostic studies, laboratory testing, and therapeutic approaches to botulism. Because botulism is such a rare disease, a keen awareness of its manifestations and prompt diagnosis are absolutely crucial for its successful treatment. The bioterrorism potential of botulism adds further urgency to the need for all health care professionals to be familiar with this disease, its proper evaluation, and timely treatment; the need for such urgency clearly includes the clinical laboratory.

  6. How is genetic testing evaluated? A systematic review of the literature.

    PubMed

    Pitini, Erica; De Vito, Corrado; Marzuillo, Carolina; D'Andrea, Elvira; Rosso, Annalisa; Federici, Antonio; Di Maria, Emilio; Villari, Paolo

    2018-05-01

    Given the rapid development of genetic tests, an assessment of their benefits, risks, and limitations is crucial for public health practice. We performed a systematic review aimed at identifying and comparing the existing evaluation frameworks for genetic tests. We searched PUBMED, SCOPUS, ISI Web of Knowledge, Google Scholar, Google, and gray literature sources for any documents describing such frameworks. We identified 29 evaluation frameworks published between 2000 and 2017, mostly based on the ACCE Framework (n = 13 models), or on the HTA process (n = 6), or both (n = 2). Others refer to the Wilson and Jungner screening criteria (n = 3) or to a mixture of different criteria (n = 5). Due to the widespread use of the ACCE Framework, the most frequently used evaluation criteria are analytic and clinical validity, clinical utility and ethical, legal and social implications. Less attention is given to the context of implementation. An economic dimension is always considered, but not in great detail. Consideration of delivery models, organizational aspects, and consumer viewpoint is often lacking. A deeper analysis of such context-related evaluation dimensions may strengthen a comprehensive evaluation of genetic tests and support the decision-making process.

  7. Research and Evaluations of the Health Aspects of Disasters, Part IX: Risk-Reduction Framework.

    PubMed

    Birnbaum, Marvin L; Daily, Elaine K; O'Rourke, Ann P; Loretti, Alessandro

    2016-06-01

    A disaster is a failure of resilience to an event. Mitigating the risks that a hazard will progress into a destructive event, or increasing the resilience of a society-at-risk, requires careful analysis, planning, and execution. The Disaster Logic Model (DLM) is used to define the value (effects, costs, and outcome(s)), impacts, and benefits of interventions directed at risk reduction. A Risk-Reduction Framework, based on the DLM, details the processes involved in hazard mitigation and/or capacity-building interventions to augment the resilience of a community or to decrease the risk that a secondary event will develop. This Framework provides the structure to systematically undertake and evaluate risk-reduction interventions. It applies to all interventions aimed at hazard mitigation and/or increasing the absorbing, buffering, or response capacities of a community-at-risk for a primary or secondary event that could result in a disaster. The Framework utilizes the structure provided by the DLM and consists of 14 steps: (1) hazards and risks identification; (2) historical perspectives and predictions; (3) selection of hazard(s) to address; (4) selection of appropriate indicators; (5) identification of current resilience standards and benchmarks; (6) assessment of the current resilience status; (7) identification of resilience needs; (8) strategic planning; (9) selection of an appropriate intervention; (10) operational planning; (11) implementation; (12) assessments of outputs; (13) synthesis; and (14) feedback. Each of these steps is a transformation process that is described in detail. Emphasis is placed on the role of Coordination and Control during planning, implementation of risk-reduction/capacity building interventions, and evaluation. Birnbaum ML , Daily EK , O'Rourke AP , Loretti A . Research and evaluations of the health aspects of disasters, part IX: Risk-Reduction Framework. Prehosp Disaster Med. 2016;31(3):309-325.

  8. Benchmark Evaluation of Start-Up and Zero-Power Measurements at the High-Temperature Engineering Test Reactor

    DOE PAGES

    Bess, John D.; Fujimoto, Nozomu

    2014-10-09

    Benchmark models were developed to evaluate six cold-critical and two warm-critical, zero-power measurements of the HTTR. Additional measurements of a fully-loaded subcritical configuration, core excess reactivity, shutdown margins, six isothermal temperature coefficients, and axial reaction-rate distributions were also evaluated as acceptable benchmark experiments. Insufficient information is publicly available to develop finely-detailed models of the HTTR as much of the design information is still proprietary. However, the uncertainties in the benchmark models are judged to be of sufficient magnitude to encompass any biases and bias uncertainties incurred through the simplification process used to develop the benchmark models. Dominant uncertainties in the experimental keff for all core configurations come from uncertainties in the impurity content of the various graphite blocks that comprise the HTTR. Monte Carlo calculations of keff are between approximately 0.9% and 2.7% greater than the benchmark values. Reevaluation of the HTTR models as additional information becomes available could improve the quality of this benchmark and possibly reduce the computational biases. High-quality characterization of graphite impurities would significantly improve the quality of the HTTR benchmark assessment. Simulations of the other reactor physics measurements are in good agreement with the benchmark experiment values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.
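The bias figures quoted above are simple relative differences between calculated and benchmark keff values; a minimal sketch of that computation (the keff numbers below are illustrative placeholders, not actual HTTR data):

```python
# Sketch of a calculated-vs-benchmark k_eff bias check.
# All k_eff values here are illustrative placeholders, NOT actual HTTR data.
benchmark = {"cold_core_1": 1.0025, "warm_core_1": 1.0018}
calculated = {"cold_core_1": 1.0152, "warm_core_1": 1.0231}

for config, k_bench in benchmark.items():
    k_calc = calculated[config]
    # Computational bias as a percentage of the benchmark value.
    bias_pct = 100.0 * (k_calc - k_bench) / k_bench
    print(f"{config}: bias = {bias_pct:+.2f} %")
```

A reevaluation that reduces uncertainties in graphite impurity content would shrink the benchmark-side error bars these biases are judged against.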

  9. Program Evaluation: Two Management-Oriented Samples

    ERIC Educational Resources Information Center

    Alford, Kenneth Ray

    2010-01-01

    Two Management-Oriented Samples details two examples of the management-oriented approach to program evaluation. Kenneth Alford, a doctoral candidate at the University of the Cumberlands, details two separate program evaluations conducted in his school district and seeks to compare and contrast the two evaluations based upon the characteristics of…

  10. Effects of agri-environmental schemes on farmland birds: do food availability measurements improve patterns obtained from simple habitat models?

    PubMed Central

    Ponce, Carlos; Bravo, Carolina; Alonso, Juan Carlos

    2014-01-01

    Studies evaluating agri-environmental schemes (AES) usually focus on responses of single species or functional groups. Analyses are generally based on simple habitat measurements but ignore food availability and other important factors. This can limit our understanding of the ultimate causes determining the reactions of birds to AES. We investigated these issues in detail and throughout the main seasons of a bird's annual cycle (mating, postfledging and wintering) in a dry cereal farmland in a Special Protection Area for farmland birds in central Spain. First, we modeled four bird response parameters (abundance, species richness, diversity and “Species of European Conservation Concern” [SPEC]-score), using detailed food availability and vegetation structure measurements (food models). Second, we fitted new models, built using only substrate composition variables (habitat models). Whereas habitat models revealed that both fields included in the AES and fields not included benefited birds, food models went a step further and included seed and arthropod biomass as important predictors in winter and during the postfledging season, respectively. The validation process showed that food models were on average 13% better (up to 20% in some variables) in predicting bird responses. However, the cost of obtaining data for food models was five times higher than for habitat models. This novel approach highlighted the importance of food availability-related causal processes involved in bird responses to AES, which remained undetected when using conventional substrate composition assessment models. Despite their higher costs, measurements of food availability add important details to interpret the reactions of the bird community to AES interventions and thus facilitate evaluating the real efficiency of AES programs. PMID:25165523
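The habitat-vs-food model comparison above amounts to fitting two predictor sets and validating their predictions on held-out data; a hedged sketch on synthetic data (the variables, effect sizes, and split are invented stand-ins for the field measurements, not the study's actual models):

```python
# Sketch: compare a habitat-only model against a habitat+food model on a holdout set.
# All data below are synthetic stand-ins for the field measurements.
import numpy as np

rng = np.random.default_rng(0)
n = 200
habitat = rng.normal(size=(n, 2))   # substrate composition variables (invented)
food = rng.normal(size=(n, 1))      # seed/arthropod biomass (invented)
# Simulated bird abundance with a genuine food-availability effect plus noise.
abundance = habitat @ [0.5, -0.3] + 0.8 * food[:, 0] + rng.normal(scale=0.5, size=n)

def holdout_r2(X, y, split=150):
    """Fit ordinary least squares on the first `split` rows, score R2 on the rest."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1[:split], y[:split], rcond=None)
    resid = y[split:] - X1[split:] @ beta
    return 1 - resid.var() / y[split:].var()

r2_habitat = holdout_r2(habitat, abundance)
r2_food = holdout_r2(np.column_stack([habitat, food]), abundance)
print(f"habitat-only R2={r2_habitat:.2f}, habitat+food R2={r2_food:.2f}")
```

When food availability genuinely drives the response, the richer model predicts better, which is the pattern the study reports; whether the gain justifies the five-fold data-collection cost is the evaluation question.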

  11. Computer generated maps from digital satellite data - A case study in Florida

    NASA Technical Reports Server (NTRS)

    Arvanitis, L. G.; Reich, R. M.; Newburne, R.

    1981-01-01

    Ground cover maps are important tools to a wide array of users. Over the past three decades, much progress has been made in supplementing planimetric and topographic maps with ground cover details obtained from aerial photographs. The present investigation evaluates the feasibility of using computer maps of ground cover from satellite input tapes. Attention is given to the selection of test sites, a satellite data processing system, a multispectral image analyzer, general purpose computer-generated maps, the preliminary evaluation of computer maps, a test for areal correspondence, the preparation of overlays and acreage estimation of land cover types on the Landsat computer maps. There is every indication to suggest that digital multispectral image processing systems based on Landsat input data will play an increasingly important role in pattern recognition and mapping land cover in the years to come.

  12. Modification and fixed-point analysis of a Kalman filter for orientation estimation based on 9D inertial measurement unit data.

    PubMed

    Brückner, Hans-Peter; Spindeldreier, Christian; Blume, Holger

    2013-01-01

    A common approach for high-accuracy sensor fusion based on 9D inertial measurement unit data is Kalman filtering. State-of-the-art floating-point filter algorithms differ in their computational complexity; nevertheless, real-time operation on a low-power microcontroller at high sampling rates is not possible. This work presents algorithmic modifications to reduce the computational demands of a two-step minimum-order Kalman filter. Furthermore, the required bit-width of a fixed-point filter version is explored. For evaluation, real-world data captured using an Xsens MTx inertial sensor are used. Changes in computational latency and orientation estimation accuracy due to the proposed algorithmic modifications and fixed-point number representation are evaluated in detail on a variety of processing platforms enabling on-board processing on wearable sensor platforms.
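The bit-width exploration described above can be illustrated by quantizing a filter coefficient at several fractional bit-widths and measuring the rounding error; a toy sketch (the gain value and Q-formats are assumptions for illustration, not the paper's actual filter):

```python
# Toy illustration of fixed-point bit-width analysis:
# quantize a filter gain to a Q-format with a given number of fractional bits
# and measure the rounding error. The gain value is an illustrative assumption.
def to_fixed(x, frac_bits):
    """Round x to the nearest fixed-point value with frac_bits fractional bits."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

kalman_gain = 0.0376  # illustrative floating-point gain, not from the paper
for frac_bits in (8, 12, 16):
    q = to_fixed(kalman_gain, frac_bits)
    err = abs(q - kalman_gain)
    print(f"Q.{frac_bits}: quantized={q:.6f}, abs error={err:.2e}")
```

A full analysis would propagate such quantization errors through the filter recursion and compare the resulting orientation estimates against a floating-point reference, which is the kind of accuracy-vs-latency tradeoff the paper evaluates.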

  13. GaAs digital dynamic IC's for applications up to 10 GHz

    NASA Astrophysics Data System (ADS)

    Rocchi, M.; Gabillard, B.

    1983-06-01

    To evaluate the potentiality of GaAs MESFET's as transmitting gates, dynamic TT-bar flip-flops have been fabricated using a self-aligned planar process. The maximum operating frequency is 10.2 GHz, which is the best speed performance ever reported for a digital circuit. The performance of the transmitting gates within the circuits is discussed in detail. Speed improvement and topological simplification of fully static LSI subsystems are investigated.

  14. Conceptual design of single turbofan engine powered light aircraft

    NASA Technical Reports Server (NTRS)

    Snyder, F. S.; Voorhees, C. G.; Heinrich, A. M.; Baisden, D. N.

    1977-01-01

    The conceptual design of a four-place, single-turbofan-engine-powered light aircraft was accomplished utilizing contemporary light aircraft conventional design techniques as a means of evaluating the NASA-Ames General Aviation Synthesis Program (GASP) as a preliminary design tool. In certain areas, the results of the conventional design process and GASP were found to disagree, or one of the two excluded considerations the other addressed. A detailed discussion of these points, along with the associated contemporary design methodology, is presented.

  15. Managing to Payroll: An Evaluation of Local Activity Data Management

    DTIC Science & Technology

    1989-06-01

    of the long, complex formulation process from line manager input to receipt of payroll authority - serves only as a starting...information from T/A and labor cards may be input into a locally managed database before these cards are returned to the FIPC at the end of a pay period...support future labor mix and utilization decisions. Data from the detailed reports is manually transferred to the fourth PC. Another operator using

  16. Advanced evaporator technology progress report FY 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamberlain, D.; Hutter, J.C.; Leonard, R.A.

    1995-01-01

    This report summarizes the work that was completed in FY 1992 on the program "Technology Development for Concentrating Process Streams." The purpose of this program is to evaluate and develop evaporator technology for concentrating radioactive waste and product streams such as those generated by the TRUEX process. Concentrating these streams and minimizing the volume of waste generated can significantly reduce disposal costs; however, equipment to concentrate the streams and recycle the decontaminated condensates must be installed. LICON, Inc., is developing an evaporator that shows a great deal of potential for this application. In this report, concepts that need to be incorporated into the design of an evaporator operated in a radioactive environment are discussed. These concepts include criticality safety, remote operation and maintenance, and materials of construction. Both solubility and vapor-liquid equilibrium data are needed to design an effective process for concentrating process streams. Therefore, literature surveys were completed and are summarized in this report. A model that is being developed to predict vapor phase compositions is described. A laboratory-scale evaporator was purchased and installed to study the evaporation process and to collect additional data. This unit is described in detail. Two new LICON evaporators are being designed for installation at Argonne-East in FY 1993 to process low-level radioactive waste generated throughout the laboratory. They will also provide operating data from a full-sized evaporator processing radioactive solutions. Details on these evaporators are included in this report.

  17. Performance evaluation of a digital mammography unit using a contrast-detail phantom

    NASA Astrophysics Data System (ADS)

    Elizalde-Cabrera, J.; Brandan, M.-E.

    2015-01-01

    The relation between image quality and mean glandular dose (MGD) has been studied for a Senographe 2000D mammographic unit used for research in our laboratory. The magnitudes were evaluated for a clinically relevant range of acrylic thicknesses and radiological techniques. The CDMAM phantom was used to determine the contrast-detail curve. Also, an alternative method based on the analysis of signal-to-noise (SNR) and contrast-to-noise (CNR) ratios from the CDMAM image was proposed and applied. A simple numerical model was utilized to successfully interpret the results. Optimum radiological techniques were determined using the figures-of-merit FOM_SNR = SNR²/MGD and FOM_CNR = CNR²/MGD. Main results were: the evaluation of the detector response flattening process (it reduces by about one half the spatial non-homogeneities due to the X-ray field), MGD measurements (the values comply with standards), and verification of the automatic exposure control performance (it is sensitive to fluence attenuation, not to contrast). For 4-5 cm phantom thicknesses, the optimum radiological techniques were Rh/Rh 34 kV to optimize SNR, and Rh/Rh 28 kV to optimize CNR.
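The figures-of-merit above divide squared signal quality by dose, rewarding techniques that deliver image quality per unit MGD; a minimal sketch with placeholder numbers (not the measured Senographe 2000D values):

```python
# Figure-of-merit sketch: FOM_SNR = SNR^2 / MGD and FOM_CNR = CNR^2 / MGD.
# The SNR, CNR, and MGD values below are illustrative, not measured data.
def fom(quality_ratio, mgd_mGy):
    """Squared image-quality ratio per unit mean glandular dose (mGy)."""
    return quality_ratio ** 2 / mgd_mGy

snr, cnr, mgd = 50.0, 4.0, 1.2  # placeholder SNR, CNR, and MGD in mGy
print(f"FOM_SNR = {fom(snr, mgd):.1f}")
print(f"FOM_CNR = {fom(cnr, mgd):.2f}")
```

Because the two figures weight signal and contrast differently, they can favor different beam qualities, consistent with the study finding Rh/Rh 34 kV optimal for SNR but Rh/Rh 28 kV optimal for CNR.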

  18. Developing criteria to establish Trusted Digital Repositories

    USGS Publications Warehouse

    Faundeen, John L.

    2017-01-01

    This paper details the drivers, methods, and outcomes of the U.S. Geological Survey’s quest to establish criteria by which to judge its own digital preservation resources as Trusted Digital Repositories. Drivers included recent U.S. legislation focused on data and asset management conducted by federal agencies spending $100M USD or more annually on research activities. The methods entailed seeking existing evaluation criteria from national and international organizations such as International Standards Organization (ISO), U.S. Library of Congress, and Data Seal of Approval upon which to model USGS repository evaluations. Certification, complexity, cost, and usability of existing evaluation models were key considerations. The selected evaluation method was derived to allow the repository evaluation process to be transparent, understandable, and defensible; factors that are critical for judging competing, internal units. Implementing the chosen evaluation criteria involved establishing a cross-agency, multi-disciplinary team that interfaced across the organization. 

  19. Transdisciplinary Research and Evaluation for Community Health Initiatives

    PubMed Central

    Harper, Gary W.; Neubauer, Leah C.; Bangi, Audrey K.; Francisco, Vincent T.

    2010-01-01

    Transdisciplinary research and evaluation projects provide valuable opportunities to collaborate on interventions to improve the health and well-being of individuals and communities. Given team members’ diverse backgrounds and roles or responsibilities in such projects, members’ perspectives are significant in strengthening a project’s infrastructure and improving its organizational functioning. This article presents an evaluation mechanism that allows team members to express the successes and challenges incurred throughout their involvement in a multisite transdisciplinary research project. Furthermore, their feedback is used to promote future sustainability and growth. Guided by a framework known as organizational development, the evaluative process was conducted by a neutral entity, the Quality Assurance Team. A mixed-methods approach was utilized to garner feedback and clarify how the research project goals could be achieved more effectively and efficiently. The multiple benefits gained by those involved in this evaluation and implications for utilizing transdisciplinary research and evaluation teams for health initiatives are detailed. PMID:18936267

  20. Production of Hydrogen by Superadiabatic Decomposition of Hydrogen Sulfide - Final Technical Report for the Period June 1, 1999 - September 30, 2000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rachid B. Slimane; Francis S. Lau; Javad Abbasian

    2000-10-01

    The objective of this program is to develop an economical process for hydrogen production, with no additional carbon dioxide emission, through the thermal decomposition of hydrogen sulfide (H₂S) in H₂S-rich waste streams to high-purity hydrogen and elemental sulfur. The novel feature of the process being developed is the superadiabatic combustion (SAC) of part of the H₂S in the waste stream to provide the thermal energy required for the decomposition reaction such that no additional energy is required. The program is divided into two phases. In Phase 1, detailed thermochemical and kinetic modeling of the SAC reactor with H₂S-rich fuel gas and air/enriched air feeds is undertaken to evaluate the effects of operating conditions on exit gas products and conversion efficiency, and to identify key process parameters. Preliminary modeling results are used as a basis to conduct a thorough evaluation of SAC process design options, including reactor configuration, operating conditions, and productivity-product separation schemes, with respect to potential product yields, thermal efficiency, capital and operating costs, and reliability, ultimately leading to the preparation of a design package and cost estimate for a bench-scale reactor testing system to be assembled and tested in Phase 2 of the program. A detailed parametric testing plan was also developed for process design optimization and model verification in Phase 2. During Phase 2 of this program, IGT, UIC, and industry advisors UOP and BP Amoco will validate the SAC concept through construction of the bench-scale unit and parametric testing. The computer model developed in Phase 1 will be updated with the experimental data and used in future scale-up efforts. The process design will be refined and the cost estimate updated. Market survey and assessment will continue so that a commercial demonstration project can be identified.

  1. FUSE: a profit maximization approach for functional summarization of biological networks.

    PubMed

    Seah, Boon-Siew; Bhowmick, Sourav S; Dewey, C Forbes; Yu, Hanry

    2012-03-21

    The availability of large-scale curated protein interaction datasets has given rise to the opportunity to investigate higher level organization and modularity within the protein interaction network (PPI) using graph theoretic analysis. Despite the recent progress, systems-level analysis of PPIs remains a daunting task as it is challenging to make sense out of the deluge of high-dimensional interaction data. Specifically, techniques that automatically abstract and summarize PPIs at multiple resolutions to provide high level views of their functional landscape are still lacking. We present a novel data-driven and generic algorithm called FUSE (Functional Summary Generator) that generates functional maps of a PPI at different levels of organization, from broad process-process level interactions to in-depth complex-complex level interactions, through a profit maximization approach that exploits the Minimum Description Length (MDL) principle to maximize information gain of the summary graph while satisfying the level-of-detail constraint. We evaluate the performance of FUSE on several real-world PPIs. We also compare FUSE to state-of-the-art graph clustering methods with GO term enrichment by constructing the biological process landscape of the PPIs. Using the AD network as our case study, we further demonstrate the ability of FUSE to quickly summarize the network and identify many different processes and complexes that regulate it. Finally, we study the higher-order connectivity of the human PPI. By simultaneously evaluating interaction and annotation data, FUSE abstracts higher-order interaction maps by reducing the details of the underlying PPI to form a functional summary graph of interconnected functional clusters. Our results demonstrate its effectiveness and superiority over state-of-the-art graph clustering methods with GO term enrichment.
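The profit-maximization idea above balances the information gain of including a functional cluster in the summary against a level-of-detail budget; a toy MDL-flavored sketch (the greedy selection, gain and cost numbers, and budget are illustrative assumptions, not FUSE's actual objective or data):

```python
# Toy MDL-style summary selection: each candidate functional cluster has an
# information gain and a description cost; greedily pick clusters by gain per
# unit cost until a level-of-detail budget is exhausted.
# All names, gains, costs, and the budget are illustrative assumptions.
candidates = [("ribosome", 9.0, 3), ("proteasome", 7.5, 3), ("misc", 2.0, 4)]

def summarize(candidates, budget):
    chosen, used = [], 0
    # Greedy by gain-to-cost ratio, a common heuristic for budgeted selection.
    for name, gain, cost in sorted(candidates, key=lambda c: c[1] / c[2], reverse=True):
        if used + cost <= budget:
            chosen.append(name)
            used += cost
    return chosen

print(summarize(candidates, budget=6))  # → ['ribosome', 'proteasome']
```

Raising the budget corresponds to moving from the broad process-process view toward the in-depth complex-complex view described in the abstract.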

  2. Computational Biochemistry-Enzyme Mechanisms Explored.

    PubMed

    Culka, Martin; Gisdon, Florian J; Ullmann, G Matthias

    2017-01-01

    Understanding enzyme mechanisms is a major task in comprehending how living cells work. Recent advances in biomolecular research provide a huge amount of data on enzyme kinetics and structure. The analysis of diverse experimental results and their combination into an overall picture is, however, often challenging. Microscopic details of the enzymatic processes are often anticipated based on several hints from macroscopic experimental data. Computational biochemistry aims at creating a computational model of an enzyme in order to explain the microscopic details of the catalytic process and to reproduce or predict macroscopic experimental findings. Results of such computations are in part complementary to experimental data and provide an explanation of a biochemical process at the microscopic level. In order to evaluate the mechanism of an enzyme, a structural model is constructed which can be analyzed by several theoretical approaches. Several simulation methods can and should be combined to get a reliable picture of the process of interest. Furthermore, abstract models of biological systems can be constructed combining computational and experimental data. In this review, we discuss structural computational models of enzymatic systems. We first discuss various models to simulate enzyme catalysis. Furthermore, we review various approaches to characterizing the enzyme mechanism both qualitatively and quantitatively using different modeling approaches. © 2017 Elsevier Inc. All rights reserved.

  3. Residual Ductility and Microstructural Evolution in Continuous-Bending-under-Tension of AA-6022-T4

    PubMed Central

    Zecevic, Milovan; Roemer, Timothy J.; Knezevic, Marko; Korkolis, Yannis P.; Kinsey, Brad L.

    2016-01-01

    A ubiquitous experiment to characterize the formability of sheet metal is the simple tension test. Past research has shown that if the material is repeatedly bent and unbent during this test (i.e., Continuous-Bending-under-Tension, CBT), the percent elongation at failure can significantly increase. In this paper, this phenomenon is evaluated in detail for AA-6022-T4 sheets using a custom-built CBT device. In particular, the residual ductility of specimens that are subjected to CBT processing is investigated. This is achieved by subjecting a specimen to CBT processing and then creating subsize tensile test and microstructural samples from the specimens after varying numbers of CBT cycles. Interestingly, the engineering stress initially increases after CBT processing to a certain number of cycles, but then decreases with less elongation achieved for increasing numbers of CBT cycles. Additionally, a detailed microstructure and texture characterization is performed using standard scanning electron microscopy and electron backscattered diffraction imaging. The results show that the material under CBT preserves high integrity to large plastic strains due to a uniform distribution of damage formation and evolution in the material. The ability to delay ductile fracture during the CBT process to large plastic strains results in the formation of a strong <111> fiber texture throughout the material. PMID:28773257

  4. A combination of HPLC and automated data analysis for monitoring the efficiency of high-pressure homogenization.

    PubMed

    Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver

    2017-08-01

    Cell disruption is a key unit operation to make valuable, intracellular target products accessible for further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, like measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line is not available to date. In the current study we implemented a methodology, which we had originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization and finally investigated this unit operation in more detail by a multivariate approach. A combination of HPLC and automated data analysis provides a valuable, novel tool to monitor and evaluate cell disruption processes. Our methodology, which can be used both in upstream (USP) and downstream processing (DSP), is a valuable tool to evaluate cell disruption processes as it can be implemented at-line, gives results within minutes after sampling and does not need manual intervention.

  5. Structural evaluation of curved stiffened composite panels fabricated using a THERM-Xsm process

    NASA Technical Reports Server (NTRS)

    Kassapoglou, Christos; Dinicola, Albert J.; Chou, Jack C.; Deaton, Jerry W.

    1991-01-01

    The use of composites in aircraft structures is often limited by material and manufacturing costs which, for some designs and applications, are prohibitively high. To increase the frequency of application of composites in primary airframe components alternative manufacturing processes are sought that reduce cost and/or enhance structural efficiency. One alternative process involves the use of THERM-Xsm as the pressure transfer medium during autoclave curing. THERM-Xsm, a silicon-based flowable polymer which behaves like a liquid under autoclave pressure, transmits quasi-hydrostatic pressure to all contacting surfaces of the part to be cured. Once the autoclave pressure is relieved, THERM-Xsm reverts back to the powdery solid state and can be reused many times. The THERM-Xsm process to be evaluated is depicted and consists of (1) enclosing the tool and part to be cured by a set of frames that create a box, (2) pouring THERM-Xsm powder onto the part and filling the box, and (3) placing a vacuum bag over the box assembly. In this program, a separating non-porous film (Teflon) was placed between the part to be cured and THERM-Xsm powder to avoid any contamination. The use of THERM-Xsm has two significant advantages over conventional manufacturing procedures. First, it eliminates complicated hard tooling since it guarantees uniform pressure transfer and thus, good compaction at complex structural details (such as frame-stiffener intersections and corners). Second, it greatly simplifies vacuum bagging, since once the part to be cured is covered by THERM-Xsm powder, the vacuum bag need only conform to a relatively flat shape reducing significantly the number of pleats required. A program is on-going at Sikorsky Aircraft to evaluate the structural performance of complex composite fuselage structures made with this THERM-Xsm process and to quantify the impact of THERM-Xsm on manufacturing labor hours and cost. The program involves fuselage panel optimization analysis, a building block test program where structural details representative of the full-scale article are analyzed and tested, and static and fatigue test/analysis of the full-scale test articles. The main results of this program are reported.

  6. Canada's Deep Geological Repository for Used Nuclear Fuel - Geo-scientific Site Evaluation Process - 13117

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blyth, Alec; Ben Belfadhel, Mahrez; Hirschorn, Sarah

    2013-07-01

    The Nuclear Waste Management Organization (NWMO) is responsible for implementing Adaptive Phased Management (APM), the approach selected by the Government of Canada for long-term management of used nuclear fuel generated by Canadian nuclear reactors. The ultimate objective of APM is the centralized containment and isolation of Canada's used nuclear fuel in a Deep Geological Repository in a suitable rock formation at a depth of approximately 500 meters (m) (1,640 feet [ft]). In May 2010, the NWMO published a nine-step site selection process that serves as the road map to decision-making on the location for the deep geological repository. The safety and appropriateness of any potential site will be assessed against a number of factors, both technical and social in nature. The selected site will be one that can be demonstrated to be able to safely contain and isolate used nuclear fuel, protecting humans and the environment over the very long term. The geo-scientific suitability of potential candidate sites will be assessed in a stepwise manner following a progressive and thorough site evaluation process that addresses a series of geo-scientific factors revolving around five safety functions. The geo-scientific site evaluation process includes: Initial Screenings; Preliminary Assessments; and Detailed Site Evaluations. As of November 2012, 22 communities have entered the site selection process (three in northern Saskatchewan and 18 in northwestern and southwestern Ontario). (authors)

  7. The Future of the Space Age or how to Evaluate Innovative Ideas

    NASA Astrophysics Data System (ADS)

    Vollerthun, A.; Fricke, E.

    2002-05-01

    Based on an initiative of the German Aerospace Industry Association to foster more transparent and structured funding of German commercially oriented space projects, this paper suggests a three-phased approach to stepwise improve and evaluate proposed concepts for space-related innovations. The objective of this concept is to develop a transparent, structured, and reproducible process for selecting the right innovative project, in terms of political, economic, and technical objectives, for funding by e.g. a governmental agency. A stepwise process and related methods that cover technical as well as economic aspects (and related sensitivities) are proposed. Based on the special needs and requirements of the space industry, the proposals are compared to a set of predefined top-level objectives/requirements. Using an initial trade analysis with the criteria company, technology, product, and market, an initial business case is analyzed. In the third process step, the alternative innovative concepts are subjected to a very detailed analysis. The full economic and technical scale of the projects is evaluated, and metrics such as 'Return on Investment' or 'Break-Even Point' are determined to compare the various innovations. Risks related to time, cost, and quality are considered when performing sensitivity analyses by varying the most important factors of the project. Before critical aspects of the proposed process are discussed, space-related examples are presented to show how the process could be applied and how different concepts should be evaluated.
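    The abstract gives no formulas for these metrics; the sketch below illustrates, with invented cash-flow figures, how 'Return on Investment' and 'Break-Even Point' could be computed when comparing concepts in the third process step:

```python
# Illustrative ROI and break-even metrics for comparing innovation concepts.
# All figures are hypothetical; the paper does not publish its formulas.

def roi(total_return, total_cost):
    """Return on investment as a fraction of the invested cost."""
    return (total_return - total_cost) / total_cost

def break_even_year(annual_cash_flows):
    """First year (1-indexed) in which cumulative cash flow turns
    non-negative, or None if the project never breaks even."""
    cumulative = 0.0
    for year, flow in enumerate(annual_cash_flows, start=1):
        cumulative += flow
        if cumulative >= 0.0:
            return year
    return None

# Concept A: heavy up-front investment, stronger later returns (in M EUR).
concept_a = [-10.0, -2.0, 4.0, 5.0, 6.0]
concept_a_roi = roi(sum(f for f in concept_a if f > 0),
                    -sum(f for f in concept_a if f < 0))   # 0.25
concept_a_break_even = break_even_year(concept_a)          # year 5
```

    Sensitivity analysis, as described above, would then vary the most important cash-flow assumptions and observe how both metrics move.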

  8. Evaluating participatory decision processes: which methods inform reflective practice?

    PubMed

    Kaufman, Sanda; Ozawa, Connie P; Shmueli, Deborah F

    2014-02-01

    Evaluating participatory decision processes serves two key purposes: validating the usefulness of specific interventions for stakeholders, interveners and funders of conflict management processes, and improving practice. However, evaluation design remains challenging, partly because when attempting to serve both purposes we may end up serving neither well. In fact, the better we respond to one, the less we may satisfy the other. Evaluations tend to focus on endogenous factors (e.g., stakeholder selection, BATNAs, mutually beneficial tradeoffs, quality of the intervention, etc.), because we believe that the success of participatory decision processes hinges on them, and they also seem to lend themselves to caeteris paribus statistical comparisons across cases. We argue that context matters too and that contextual differences among specific cases are meaningful enough to undermine conclusions derived solely from comparisons of process-endogenous factors implicitly rooted in the caeteris paribus assumption. We illustrate this argument with an environmental mediation case. We compare data collected about it through surveys geared toward comparability across cases to information elicited through in-depth interviews geared toward case specifics. The surveys, designed by the U.S. Institute of Environmental Conflict Resolution, feed a database of environmental conflicts that can help make the (statistical) case for intervention in environmental conflict management. Our interviews elicit case details - including context - that enable interveners to link context specifics and intervention actions to outcomes. We argue that neither approach can "serve both masters." Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Subglacial sedimentary basin characterization of Wilkes Land, East Antarctica via applied aerogeophysical inverse methods

    NASA Astrophysics Data System (ADS)

    Frederick, B. C.; Gooch, B. T.; Richter, T.; Young, D. A.; Blankenship, D. D.; Aitken, A.; Siegert, M. J.

    2013-12-01

    Topography, sediment distribution and heat flux are all key boundary conditions governing the stability of the East Antarctic ice sheet (EAIS). Recent scientific scrutiny has focused on several large, deep, interior EAIS basins, including the submarine basal topography characterizing the Aurora Subglacial Basin (ASB). Numerical ice sheet models require accurate constraints on deformable sediment distribution and lithologic character to estimate overall flow velocities and potential instability. To date, such estimates across the ASB have been derived from low-resolution satellite data or historic aerogeophysical surveys conducted prior to the advent of GPS. These rough basal condition estimates have led to poorly constrained ice sheet stability models for this remote 200,000 sq km expanse of the ASB. Here we present a significantly improved quantitative model characterizing the subglacial lithology and sediment in the ASB region. The product of comprehensive ICECAP (2008-2013) aerogeophysical data processing, this sedimentary basin model details the expanse and thickness of probable Wilkes Land subglacial sedimentary deposits and density contrast boundaries indicative of distinct subglacial lithologic units. As part of the process, BEDMAP2 subglacial topographic results were improved through the incorporation of additional ice-penetrating radar data collected during ICECAP field seasons 2010-2013. Detailed potential-field data pre-processing was completed, along with a comprehensive evaluation of crustal density contrasts based on the gravity power spectrum; a subsequent high-pass filter was applied to remove longer crustal wavelengths from the gravity dataset prior to inversion. Gridded BEDMAP2+ ice and bed radar surfaces were then utilized to establish bounding density models for the 3D gravity inversion process to yield probable sedimentary basin anomalies. Gravity inversion results were iteratively evaluated against radar along-track RMS deviation and gravity and magnetic depth-to-basement results. This geophysical data processing methodology provides a substantial improvement over prior Wilkes Land sedimentary basin estimates, yielding a higher-resolution model based upon concurrent iteration of several aerogeophysical datasets. This more detailed subglacial sedimentary basin model for Wilkes Land, East Antarctica will not only contribute to improvements in EAIS ice sheet model constraints, but will also provide significant quantifiable controls for subglacial hydrologic and geothermal flux estimates that are also sizable contributors to the cold-based, deep interior basal ice dynamics dominant in the Wilkes Land region.
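    The high-pass filtering step described above can be schematized as follows (the cutoff wavelength, sampling, and synthetic profile are illustrative, not the values used in the ICECAP processing):

```python
import numpy as np

def highpass_gravity(profile, dx_km, cutoff_km):
    """Zero out wavelengths longer than cutoff_km in a 1-D gravity profile."""
    n = len(profile)
    spectrum = np.fft.rfft(profile)
    freqs = np.fft.rfftfreq(n, d=dx_km)        # cycles per km
    spectrum[freqs < 1.0 / cutoff_km] = 0.0    # remove long crustal wavelengths
    return np.fft.irfft(spectrum, n)

# Synthetic profile: a short-wavelength basin low riding on a long crustal trend.
x = np.arange(0.0, 400.0, 1.0)                          # 400 km at 1-km spacing
crustal = 30.0 * np.sin(2 * np.pi * x / 400.0)          # 400-km wavelength trend
basin = -5.0 * np.exp(-((x - 200.0) ** 2) / (2 * 15.0 ** 2))  # ~15-km-wide basin
filtered = highpass_gravity(crustal + basin, dx_km=1.0, cutoff_km=100.0)
# The filtered profile suppresses the crustal trend while retaining most of
# the basin anomaly, which can then be passed to the inversion.
```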

  10. Methods for Evaluating Practice Change Toward a Patient-Centered Medical Home

    PubMed Central

    Jaén, Carlos Roberto; Crabtree, Benjamin F.; Palmer, Raymond F.; Ferrer, Robert L.; Nutting, Paul A.; Miller, William L.; Stewart, Elizabeth E.; Wood, Robert; Davila, Marivel; Stange, Kurt C.

    2010-01-01

    PURPOSE Understanding the transformation of primary care practices to patient-centered medical homes (PCMHs) requires making sense of the change process, multilevel outcomes, and context. We describe the methods used to evaluate the country’s first national demonstration project of the PCMH concept, with an emphasis on the quantitative measures and lessons for multimethod evaluation approaches. METHODS The National Demonstration Project (NDP) was a group-randomized clinical trial of facilitated and self-directed implementation strategies for the PCMH. An independent evaluation team developed an integrated package of quantitative and qualitative methods to evaluate the process and outcomes of the NDP for practices and patients. Data were collected by an ethnographic analyst and a research nurse who visited each practice, and from multiple data sources including a medical record audit, patient and staff surveys, direct observation, interviews, and text review. Analyses aimed to provide real-time feedback to the NDP implementation team and lessons that would be transferable to the larger practice, policy, education, and research communities. RESULTS Real-time analyses and feedback appeared to be helpful to the facilitators. Medical record audits provided data on process-of-care outcomes. Patient surveys contributed important information about patient-rated primary care attributes and patient-centered outcomes. Clinician and staff surveys provided important practice experience and organizational data. Ethnographic observations supplied insights about the process of practice development. Most practices were not able to provide detailed financial information. CONCLUSIONS A multimethod approach is challenging, but feasible and vital to understanding the process and outcome of a practice development process. Additional longitudinal follow-up of NDP practices and their patients is needed. PMID:20530398

  11. An Example-Based Super-Resolution Algorithm for Selfie Images

    PubMed Central

    William, Jino Hans; Venkateswaran, N.; Narayanan, Srinath; Ramachandran, Sandeep

    2016-01-01

    A selfie is typically a self-portrait captured using the front camera of a smartphone. Most state-of-the-art smartphones are equipped with a high-resolution (HR) rear camera and a low-resolution (LR) front camera. Because selfies are captured by the front camera with limited pixel resolution, their fine details are lost. This paper aims to improve the resolution of selfies by exploiting the fine details in HR images captured by the rear camera, using an example-based super-resolution (SR) algorithm. HR images captured by the rear camera carry significant fine detail and are used as exemplars to train an optimal matrix-value regression (MVR) operator. The MVR operator serves as an image-pair prior which learns the correspondence between LR-HR patch pairs and is effectively used to super-resolve LR selfie images. The proposed MVR algorithm avoids vectorization of image patch pairs and preserves image-level information during both the learning and recovery processes. The proposed algorithm is evaluated for efficiency and effectiveness, both qualitatively and quantitatively, against other state-of-the-art SR algorithms. The results validate that the proposed algorithm is efficient, requiring less than 3 seconds to super-resolve an LR selfie, and effective, preserving sharp details without introducing counterfeit fine details. PMID:27064500
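    The MVR operator works directly on patch matrices; the sketch below is a deliberately simplified stand-in that learns a single ridge-regression map on vectorized LR-HR patch pairs (unlike the paper's matrix-value formulation), just to illustrate the example-based learning step:

```python
import numpy as np

rng = np.random.default_rng(0)

def downsample(patch):
    """Average-pool a patch by a factor of 2 (a toy LR degradation model)."""
    h, w = patch.shape
    return patch.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Toy exemplar set: HR patches (as if from the rear camera) and LR counterparts.
hr_patches = rng.random((500, 8, 8))
lr_patches = np.array([downsample(p) for p in hr_patches])   # 500 x 4 x 4

# Learn one linear operator W: LR (16-dim) -> HR (64-dim) by ridge regression.
X = lr_patches.reshape(500, -1)
Y = hr_patches.reshape(500, -1)
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(16), X.T @ Y)     # 16 x 64

# Super-resolve a new LR patch with the learned operator.
test_lr = downsample(rng.random((8, 8)))
sr_patch = (test_lr.reshape(1, -1) @ W).reshape(8, 8)
```

    In a full pipeline this map would be applied to overlapping patches of the LR selfie and the outputs blended back into an HR image.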

  12. Interpreted Cooper-Harper for broader use

    NASA Technical Reports Server (NTRS)

    Green, David L.; Andrews, Hal; Gallagher, Donald W.

    1993-01-01

    The current aircraft assessment process typically makes extensive use of operational personnel during simulations and operational evaluations, with increased emphasis on evaluating the many pilot and/or operator/aircraft control loops. The need for a crew assessment in this broader arena has produced a variety of rating scales. The Cooper-Harper Rating Scale is frequently misused and routinely overlooked in the process, for these applications often extend the scale's use beyond its originally intended application. This paper agrees with the broader application of the Cooper-Harper Rating Scale and presents a concept for the development of a 'use unique' Interpreted Cooper-Harper Scale to help achieve this objective. This interpreted scale concept was conceived during efforts to support an FAA evaluation of a night vision enhancement system. It includes descriptive extensions, which are faithful to the intent of the current Cooper-Harper Scale and should provide the kind of detail that has historically been provided by trained test pilots in their explanatory comments.

  13. The Modular Modeling System (MMS): User's Manual

    USGS Publications Warehouse

    Leavesley, G.H.; Restrepo, Pedro J.; Markstrom, S.L.; Dixon, M.; Stannard, L.G.

    1996-01-01

    The Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide the research and operational framework needed to support development, testing, and evaluation of physical-process algorithms and to facilitate integration of user-selected sets of algorithms into operational physical-process models. MMS uses a module library that contains modules for simulating a variety of water, energy, and biogeochemical processes. A model is created by selectively coupling the most appropriate modules from the library to create a 'suitable' model for the desired application. Where existing modules do not provide appropriate process algorithms, new modules can be developed. The MMS user's manual provides installation instructions and a detailed discussion of system concepts, module development, and model development and application using the MMS graphical user interface.
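    As a purely hypothetical illustration of the module-library idea (the module names and process equations below are invented, not taken from MMS):

```python
# Toy sketch of module coupling: each process module reads and updates a
# shared state dictionary, and a model is built by chaining selected modules.

def snowmelt(state):
    """Degree-day snowmelt module (toy coefficient of 0.5 mm/degC/day)."""
    melt = min(max(0.0, state["temp_c"]) * 0.5, state["snowpack_mm"])
    state["snowpack_mm"] -= melt
    state["soil_water_mm"] += melt
    return state

def runoff(state):
    """Linear-reservoir runoff module (toy recession coefficient 0.1/day)."""
    q = 0.1 * state["soil_water_mm"]
    state["soil_water_mm"] -= q
    state["runoff_mm"] = q
    return state

def build_model(modules):
    """Couple user-selected modules into one per-timestep model."""
    def step(state):
        for module in modules:
            state = module(state)
        return state
    return step

model = build_model([snowmelt, runoff])   # a 'suitable' model for one application
state = {"temp_c": 4.0, "snowpack_mm": 10.0, "soil_water_mm": 0.0, "runoff_mm": 0.0}
state = model(state)                      # advance one daily timestep
```

    Where existing modules do not fit, a new function with the same state-in, state-out signature can be added to the library, mirroring the extension mechanism the manual describes.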

  14. Vadose zone transport field study: Detailed test plan for simulated leak tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    AL Ward; GW Gee

    2000-06-23

    The US Department of Energy (DOE) Groundwater/Vadose Zone Integration Project Science and Technology initiative was created in FY 1999 to reduce the uncertainty associated with vadose zone transport processes beneath waste sites at DOE's Hanford Site near Richland, Washington. This information is needed not only to evaluate the risks from transport, but also to support the adoption of measures for minimizing impacts to the groundwater and surrounding environment. The principal uncertainties in vadose zone transport are the current distribution of source contaminants and the natural heterogeneity of the soil in which the contaminants reside. Oversimplified conceptual models resulting from these uncertainties and limited use of hydrologic characterization and monitoring technologies have hampered the understanding of contaminant migration through Hanford's vadose zone. Essential prerequisites for reducing vadose transport uncertainty include the development of accurate conceptual models and the development or adoption of monitoring techniques capable of delineating the current distributions of source contaminants and characterizing natural site heterogeneity. The Vadose Zone Transport Field Study (VZTFS) was conceived as part of the initiative to address the major uncertainties confronting vadose zone fate and transport predictions at the Hanford Site and to overcome the limitations of previous characterization attempts. Pacific Northwest National Laboratory (PNNL) is managing the VZTFS for DOE. The VZTFS will conduct field investigations that will improve the understanding of field-scale transport and lead to the development or identification of efficient and cost-effective characterization methods. Ideally, these methods will capture the extent of contaminant plumes using existing infrastructure (i.e., more than 1,300 steel-cased boreholes). The objectives of the VZTFS are to conduct controlled transport experiments at well-instrumented field sites at Hanford to: identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; reduce uncertainty in conceptual models; develop a detailed and accurate database of hydraulic and transport parameters for validation of three-dimensional numerical models; and identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. This plan provides details for conducting field tests during FY 2000 to accomplish these objectives. Details of additional testing during FY 2001 and FY 2002 will be developed as part of the work planning process implemented by the Integration Project.

  15. Comprehensive system models: Strategies for evaluation

    NASA Technical Reports Server (NTRS)

    Field, Christopher; Kutzbach, John E.; Ramanathan, V.; Maccracken, Michael C.

    1992-01-01

    The task of evaluating comprehensive earth system models is vast, involving validation of every model component at every scale of organization, as well as tests of all the individual linkages. Even the most detailed evaluation of each of the component processes and the individual links among them should not, however, engender confidence in the performance of the whole. The integrated earth system is so rich with complex feedback loops, often involving components of the atmosphere, oceans, biosphere, and cryosphere, that it is certain to exhibit emergent properties very difficult to predict from the perspective of a narrow focus on any individual component of the system. Therefore, a substantial share of the task of evaluating comprehensive earth system models must reside at the level of whole-system evaluations. Since complete, integrated atmosphere/ocean/biosphere/hydrology models are not yet operational, questions of evaluation must be addressed at the level of the kinds of earth system processes that the models should be competent to simulate, rather than at the level of specific performance criteria. Here, we have tried to identify examples of earth system processes that are difficult to simulate with existing models and that involve a rich enough suite of feedbacks that they are unlikely to be satisfactorily described by highly simplified or toy models. Our purpose is not to specify a checklist of evaluation criteria but to introduce characteristics of the earth system that may present useful opportunities for model testing and, of course, improvement.

  16. Fabrication of High Temperature Cermet Materials for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Hickman, Robert; Panda, Binayak; Shah, Sandeep

    2005-01-01

    Processing techniques are being developed to fabricate refractory metal and ceramic cermet materials for Nuclear Thermal Propulsion (NTP). Significant advances have been made in the area of high-temperature cermet fuel processing since Rover/NERVA. Cermet materials offer several advantages such as retention of fission products and fuels, thermal shock resistance, hydrogen compatibility, high conductivity, and high strength. Recent NASA-funded research has demonstrated the net shape fabrication of W-Re-HfC and other refractory metal and ceramic components that are similar to UN/W-Re cermet fuels. This effort is focused on basic research and characterization to identify the most promising compositions and processing techniques. A particular emphasis is being placed on low-cost processes to fabricate near net shape parts of practical size. Several processing methods including Vacuum Plasma Spray (VPS) and conventional PM processes are being evaluated to fabricate material property samples and components. Surrogate W-Re/ZrN cermet fuel materials are being used to develop processing techniques for both coated and uncoated ceramic particles. After process optimization, depleted uranium-based cermets will be fabricated and tested to evaluate mechanical, thermal, and hot H2 erosion properties. This paper provides details on the current results of the project.

  17. Global Environmental Data for Mapping Infectious Disease Distribution

    PubMed Central

    Hay, S.I.; Tatem, A.J.; Graham, A.J.; Goetz, S.J.; Rogers, D.J.

    2011-01-01

    This contribution documents the satellite data archives, data processing methods and temporal Fourier analysis (TFA) techniques used to create the remotely sensed datasets on the DVD distributed with this volume. The aim is to provide a detailed reference guide to the genesis of the data, rather than a standard review. These remotely sensed data cover the entire globe at either 1 × 1 or 8 × 8 km spatial resolution. We briefly evaluate the relationships between the 1 × 1 and 8 × 8 km global TFA products to explore their inter-compatibility. The 8 × 8 km TFA surfaces are used in the mapping procedures detailed in the subsequent disease mapping reviews, since the 1 × 1 km products have been validated less widely. Details are also provided on additional, current and planned sensors that should be able to provide continuity with these environmental variable surfaces, as well as other sources of global data that may be used for mapping infectious disease. PMID:16647967
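    A minimal sketch of the temporal Fourier analysis idea for a single pixel's time series (the harmonic count and synthetic series are illustrative; the actual processing chain is described in the chapter):

```python
import numpy as np

def tfa_descriptors(series, samples_per_year=12, n_harmonics=2):
    """Fit mean plus annual, semiannual, ... harmonics by least squares.

    Returns the series mean and an (amplitude, phase) pair per harmonic,
    the descriptors TFA uses to summarize seasonality."""
    n = len(series)
    t = np.arange(n)
    cols = [np.ones(n)]
    for h in range(1, n_harmonics + 1):
        cols.append(np.cos(2 * np.pi * h * t / samples_per_year))
        cols.append(np.sin(2 * np.pi * h * t / samples_per_year))
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), series, rcond=None)
    mean = coef[0]
    harmonics = []
    for h in range(n_harmonics):
        a, b = coef[1 + 2 * h], coef[2 + 2 * h]
        harmonics.append((np.hypot(a, b), np.arctan2(b, a)))  # amplitude, phase
    return mean, harmonics

# Synthetic 3-year monthly NDVI-like series with a pure annual cycle.
t = np.arange(36)
ndvi = 0.5 + 0.2 * np.cos(2 * np.pi * t / 12)
mean, ((amp1, ph1), (amp2, ph2)) = tfa_descriptors(ndvi)
# mean ~ 0.5, annual amplitude ~ 0.2, semiannual amplitude ~ 0
```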

  18. Identifying climate drivers of infectious disease dynamics: recent advances and challenges ahead

    PubMed Central

    Metcalf, C. Jessica E.; Walter, Katharine S.; Wesolowski, Amy; Buckee, Caroline O.; Shevliakova, Elena; Tatem, Andrew J.; Boos, William R.; Weinberger, Daniel M.; Pitzer, Virginia E.

    2017-01-01

    Climate change is likely to profoundly modulate the burden of infectious diseases. However, attributing health impacts to a changing climate requires being able to associate changes in infectious disease incidence with the potentially complex influences of climate. This aim is further complicated by nonlinear feedbacks inherent in the dynamics of many infections, driven by the processes of immunity and transmission. Here, we detail the mechanisms by which climate drivers can shape infectious disease incidence, from direct effects on vector life history to indirect effects on human susceptibility, and detail the scope of variation available with which to probe these mechanisms. We review approaches used to evaluate and quantify associations between climate and infectious disease incidence, discuss the array of data available to tackle this question, and detail remaining challenges in understanding the implications of climate change for infectious disease incidence. We point to areas where synthesis between approaches used in climate science and infectious disease biology provide potential for progress. PMID:28814655

  19. Identifying climate drivers of infectious disease dynamics: recent advances and challenges ahead.

    PubMed

    Metcalf, C Jessica E; Walter, Katharine S; Wesolowski, Amy; Buckee, Caroline O; Shevliakova, Elena; Tatem, Andrew J; Boos, William R; Weinberger, Daniel M; Pitzer, Virginia E

    2017-08-16

    Climate change is likely to profoundly modulate the burden of infectious diseases. However, attributing health impacts to a changing climate requires being able to associate changes in infectious disease incidence with the potentially complex influences of climate. This aim is further complicated by nonlinear feedbacks inherent in the dynamics of many infections, driven by the processes of immunity and transmission. Here, we detail the mechanisms by which climate drivers can shape infectious disease incidence, from direct effects on vector life history to indirect effects on human susceptibility, and detail the scope of variation available with which to probe these mechanisms. We review approaches used to evaluate and quantify associations between climate and infectious disease incidence, discuss the array of data available to tackle this question, and detail remaining challenges in understanding the implications of climate change for infectious disease incidence. We point to areas where synthesis between approaches used in climate science and infectious disease biology provide potential for progress. © 2017 The Authors.

  20. Using Twitter for Demographic and Social Science Research: Tools for Data Collection and Processing

    PubMed Central

    McCormick, Tyler H.; Lee, Hedwig; Cesare, Nina; Shojaie, Ali; Spiro, Emma S.

    2015-01-01

    Despite recent and growing interest in using Twitter to examine human behavior and attitudes, there is still significant room for growth regarding the ability to leverage Twitter data for social science research. In particular, gleaning demographic information about Twitter users—a key component of much social science research—remains a challenge. This article develops an accurate and reliable data processing approach for social science researchers interested in using Twitter data to examine behaviors and attitudes, as well as the demographic characteristics of the populations expressing or engaging in them. Using information gathered from Twitter users who state an intention to not vote in the 2012 presidential election, we describe and evaluate a method for processing data to retrieve demographic information reported by users that is not encoded as text (e.g., details of images) and evaluate the reliability of these techniques. We end by assessing the challenges of this data collection strategy and discussing how large-scale social media data may benefit demographic researchers. PMID:29033471

  1. Using Twitter for Demographic and Social Science Research: Tools for Data Collection and Processing.

    PubMed

    McCormick, Tyler H; Lee, Hedwig; Cesare, Nina; Shojaie, Ali; Spiro, Emma S

    2017-08-01

    Despite recent and growing interest in using Twitter to examine human behavior and attitudes, there is still significant room for growth regarding the ability to leverage Twitter data for social science research. In particular, gleaning demographic information about Twitter users-a key component of much social science research-remains a challenge. This article develops an accurate and reliable data processing approach for social science researchers interested in using Twitter data to examine behaviors and attitudes, as well as the demographic characteristics of the populations expressing or engaging in them. Using information gathered from Twitter users who state an intention to not vote in the 2012 presidential election, we describe and evaluate a method for processing data to retrieve demographic information reported by users that is not encoded as text (e.g., details of images) and evaluate the reliability of these techniques. We end by assessing the challenges of this data collection strategy and discussing how large-scale social media data may benefit demographic researchers.

  2. TRMM Common Microphysics Products: A Tool for Evaluating Spaceborne Precipitation Retrieval Algorithms

    NASA Technical Reports Server (NTRS)

    Kingsmill, David E.; Yuter, Sandra E.; Hobbs, Peter V.; Rangno, Arthur L.; Heymsfield, Andrew J.; Stith, Jeffrey L.; Bansemer, Aaron; Haggerty, Julie A.; Korolev, Alexei V.

    2004-01-01

    A customized product for analysis of microphysics data collected from aircraft during field campaigns in support of the TRMM program is described. These Common Microphysics Products (CMPs) are designed to aid in evaluation of TRMM spaceborne precipitation retrieval algorithms. Information needed for this purpose (e.g., particle size spectra and habit, liquid and ice water content) was derived using a common processing strategy on the wide variety of microphysical instruments and raw native data formats employed in the field campaigns. The CMPs are organized into an ASCII structure to allow easy access to the data for those less familiar with, and without the tools to accomplish, microphysical data processing. Detailed examples of the CMPs show their potential and some of their limitations. This approach may be a first step toward developing a generalized microphysics format and an associated community-oriented, non-proprietary software package for microphysics data processing, initiatives that would likely broaden community access to and use of microphysics datasets.

  3. MeetingVis: Visual Narratives to Assist in Recalling Meeting Context and Content.

    PubMed

    Shi, Yang; Bryan, Chris; Bhamidipati, Sridatt; Zhao, Ying; Zhang, Yaoxue; Ma, Kwan-Liu

    2018-06-01

    In team-based workplaces, reviewing and reflecting on the content from a previously held meeting can lead to better planning and preparation. However, ineffective meeting summaries can impair this process, especially when participants have difficulty remembering what was said and what its context was. To assist with this process, we introduce MeetingVis, a visual narrative-based approach to meeting summarization. MeetingVis is composed of two primary components: (1) a data pipeline that processes the spoken audio from a group discussion, and (2) a visual-based interface that efficiently displays the summarized content. To design MeetingVis, we create a taxonomy of relevant meeting data points, identifying salient elements to promote recall and reflection. These are mapped to an augmented storyline visualization, which combines the display of participant activities, topic evolutions, and task assignments. For evaluation, we conduct a qualitative user study with five groups. Feedback from the study indicates that MeetingVis effectively triggers the recall of subtle details from prior meetings: all study participants were able to remember new details, points, and tasks compared to an unaided, memory-only baseline. This visual-based approach can also potentially enhance the productivity of both individuals and the whole team.

  4. The Role of the Lateral Intraparietal Area in (the Study of) Decision Making.

    PubMed

    Huk, Alexander C; Katz, Leor N; Yates, Jacob L

    2017-07-25

    Over the past two decades, neurophysiological responses in the lateral intraparietal area (LIP) have received extensive study for insight into decision making. In a parallel manner, inferred cognitive processes have enriched interpretations of LIP activity. Because of this bidirectional interplay between physiology and cognition, LIP has served as fertile ground for developing quantitative models that link neural activity with decision making. These models stand as some of the most important frameworks for linking brain and mind, and they are now mature enough to be evaluated in finer detail and integrated with other lines of investigation of LIP function. Here, we focus on the relationship between LIP responses and known sensory and motor events in perceptual decision-making tasks, as assessed by correlative and causal methods. The resulting sensorimotor-focused approach offers an account of LIP activity as a multiplexed amalgam of sensory, cognitive, and motor-related activity, with a complex and often indirect relationship to decision processes. Our data-driven focus on multiplexing (and de-multiplexing) of various response components can complement decision-focused models and provides more detailed insight into how neural signals might relate to cognitive processes such as decision making.

  5. A surety engineering framework to reduce cognitive systems risks.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caudell, Thomas P.; Peercy, David Eugene; Caldera, Eva O.

    Cognitive science research investigates the advancement of human cognition and neuroscience capabilities. Addressing risks associated with these advancements can counter potential program failures, legal and ethical issues, constraints to scientific research, and product vulnerabilities. Survey results, focus group discussions, cognitive science experts, and surety researchers concur technical risks exist that could impact cognitive science research in areas such as medicine, privacy, human enhancement, law and policy, military applications, and national security (SAND2006-6895). This SAND report documents a surety engineering framework and a process for identifying cognitive system technical, ethical, legal and societal risks and applying appropriate surety methods to reduce such risks. The framework consists of several models: Specification, Design, Evaluation, Risk, and Maturity. Two detailed case studies are included to illustrate the use of the process and framework. Several Appendices provide detailed information on existing cognitive system architectures; ethical, legal, and societal risk research; surety methods and technologies; and educing information research with a case study vignette. The process and framework provide a model for how cognitive systems research and full-scale product development can apply surety engineering to reduce perceived and actual risks.

  6. Process-based Cost Estimation for Ramjet/Scramjet Engines

    NASA Technical Reports Server (NTRS)

    Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John

    2003-01-01

    Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology, and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation that bridges high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As the product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
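The AHP technique named above can be sketched in a few lines; the criteria names and pairwise comparison values below are illustrative assumptions, and the weights use the common geometric-mean approximation rather than the exact principal-eigenvector method:

```python
import math

def ahp_weights(pairwise):
    """Priority weights from a pairwise comparison matrix using the
    geometric-mean approximation of AHP (row geometric means, normalized)."""
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical comparison of three criteria: cost-risk, performance, schedule.
# Entry [i][j] says how strongly criterion i dominates criterion j (1-9 scale).
matrix = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(matrix)  # cost-risk receives the largest weight
```

The resulting weights could then score design alternatives alongside cost-risk, as the abstract describes.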

  7. Entrepreneurship education: A strength-based approach to substance use and suicide prevention for American Indian adolescents.

    PubMed

    Tingey, Lauren; Larzelere-Hinton, Francene; Goklish, Novalene; Ingalls, Allison; Craft, Todd; Sprengeler, Feather; McGuire, Courtney; Barlow, Allison

    2016-01-01

    American Indian (AI) adolescents suffer the largest disparities in substance use and suicide. Predominating prevention models focus primarily on risk and utilize deficit-based approaches. The fields of substance use and suicide prevention research call for positive youth development frameworks that are strength based and target change at individual and community levels. Entrepreneurship education is an innovative approach that addresses the gap in available programs. This paper describes the development and evaluation of a youth entrepreneurship education program in partnership with one AI community. We detail the curriculum, process evaluation results, and the randomized controlled trial evaluating its efficacy for increasing protective factors. Lessons learned may be applicable to other AI communities.

  8. Estimating costs in the economic evaluation of medical technologies.

    PubMed

    Luce, B R; Elixhauser, A

    1990-01-01

    The complexities and nuances of evaluating the costs associated with providing medical technologies are often underestimated by analysts engaged in economic evaluations. This article describes the theoretical underpinnings of cost estimation, emphasizing the importance of accounting for opportunity costs and marginal costs. The various types of costs that should be considered in an analysis are described; a listing of specific cost elements may provide a helpful guide to analysis. The process of identifying and estimating costs is detailed, and practical recommendations for handling the challenges of cost estimation are provided. The roles of sensitivity analysis and discounting are characterized, as are determinants of the types of costs to include in an analysis. Finally, common problems facing the analyst are enumerated with suggestions for managing these problems.
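The discounting and sensitivity analysis described above can be illustrated with a minimal present-value sketch; the cost stream and discount rates below are hypothetical:

```python
def present_value(costs, rate):
    """Discount an annual cost stream (year 0 first) to present value."""
    return sum(c / (1.0 + rate) ** t for t, c in enumerate(costs))

# Hypothetical intervention: up-front cost plus three years of follow-up care.
costs = [10_000.0, 4_000.0, 4_000.0, 4_000.0]

# One-way sensitivity analysis over the discount rate: a higher rate shrinks
# the present value of future costs, so pv_low > pv_high.
pv_low = present_value(costs, 0.03)
pv_high = present_value(costs, 0.07)
```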

  9. Signal processing in urodynamics: towards high definition urethral pressure profilometry.

    PubMed

    Klünder, Mario; Sawodny, Oliver; Amend, Bastian; Ederer, Michael; Kelp, Alexandra; Sievert, Karl-Dietrich; Stenzl, Arnulf; Feuer, Ronny

    2016-03-22

    Urethral pressure profilometry (UPP) is used in the diagnosis of stress urinary incontinence (SUI), which is a significant medical, social, and economic problem. Low spatial pressure resolution, common occurrence of artifacts, and uncertainties in data location limit the diagnostic value of UPP. To overcome these limitations, high definition urethral pressure profilometry (HD-UPP), combining enhanced UPP hardware and signal processing algorithms, has been developed. In this work, we present the different signal processing steps in HD-UPP and show experimental results from female minipigs. We use a special microtip catheter with high angular pressure resolution and an integrated inclination sensor. Signals from the catheter are filtered and time-correlated artifacts removed. A signal reconstruction algorithm processes pressure data into a detailed pressure image on the urethra's inside. Finally, the pressure distribution on the urethra's outside is calculated through deconvolution. A mathematical model of the urethra is contained in a point-spread function (PSF), which is identified depending on geometric and material properties of the urethra. We additionally investigate the PSF's frequency response to determine the relevant frequency band for pressure information on the urinary sphincter. Experimental pressure data are spatially located and processed into high-resolution pressure images. Artifacts are successfully removed from the data without blurring other details. The pressure distribution on the urethra's outside is reconstructed and compared to the one on the inside. Finally, the pressure images are mapped onto the urethral geometry calculated from inclination and position data to provide an integrated image of pressure distribution, anatomical shape, and location. With its advanced sensing capabilities, the novel microtip catheter collects an unprecedented amount of urethral pressure data. Through sequential signal processing steps, physicians are provided with detailed information on the pressure distribution in and around the urethra. HD-UPP thus overcomes many current limitations of conventional UPP and offers the opportunity to evaluate urethral structures, especially the sphincter, in the context of the correct anatomical location. This could enable the development of focal therapy approaches in the treatment of SUI.
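The deconvolution step described above can be illustrated with a toy circular-convolution model; the pressure profile and PSF below are invented, and a naive O(n²) DFT stands in for whatever transform the actual pipeline uses:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (O(n^2), adequate for a toy example)."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)).real / n
            for k in range(n)]

# Invented inside-pressure profile with a sharp sphincter peak, and an
# invented circular point-spread function modeling smoothing by tissue.
inside = [0.0, 0.0, 1.0, 3.0, 1.0, 0.0, 0.0, 0.0]
psf    = [0.6, 0.2, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2]   # sums to 1, no spectral zeros

# Forward model: blurring is multiplication by the PSF spectrum ...
outside = idft([p * h for p, h in zip(dft(inside), dft(psf))])
# ... so deconvolution is division by that spectrum (noise-free toy case).
recovered = idft([o / h for o, h in zip(dft(outside), dft(psf))])
```

A real pipeline needs regularization because noise makes plain spectral division unstable; this sketch only shows the underlying relationship.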

  10. Processing of Digital Plates of the 1.2 m Baldone Observatory Schmidt Telescope

    NASA Astrophysics Data System (ADS)

    Eglitis, Ilgmars; Andruk, Vitaly

    2017-04-01

    The aim of this research is to evaluate the accuracy of the plate processing method and to perform a detailed study of the Epson Expression 10000XL scanner, which was used to digitize plates from the collection of the 1.2 m Schmidt Telescope installed at the Baldone Observatory. Special software developed in the LINUX/MIDAS/ROMAFOT environment was used for processing the scans. Results of the digitized files with grey gradations of 8 and 16 bits were compared, and the accuracy of the developed method for the determination of rectangular coordinates and photometry was estimated. Errors in the instrumental system are ±0.026 pixels and ±0.024m for coordinates and stellar magnitudes, respectively. To evaluate the repeatability of the scanner's astrometric and photometric errors, six consecutive scans of one plate were processed at a spatial resolution of 1200 dpi. The following error estimates are obtained for stars brighter than U < 13.5m: σxy = ±0.021 to 0.027 pixels and σm = ±0.014m to 0.016m for rectangular coordinates and instrumental stellar magnitudes, respectively.
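The repeatability figures quoted above come down to the scatter of repeated measurements of the same object; a minimal sketch with invented coordinates:

```python
from statistics import mean, pstdev

# Invented x-coordinates (pixels) of one star across six repeated scans
# of the same plate.
scans_x = [512.341, 512.318, 512.355, 512.329, 512.347, 512.310]

mean_x = mean(scans_x)      # best estimate of the star's plate coordinate
sigma_x = pstdev(scans_x)   # scatter = astrometric repeatability of the scanner
```

In practice σxy is averaged over many stars per magnitude bin, which is how the per-band figures in the abstract are built up.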

  11. Evaluation of the physical process controlling beach changes adjacent to nearshore dredge pits

    USGS Publications Warehouse

    Benedet, L.; List, J.H.

    2008-01-01

    Numerical modeling of a beach nourishment project is conducted to enable a detailed evaluation of the processes associated with the effects of nearshore dredge pits on nourishment evolution and the formation of erosion hot spots. A process-based numerical model, Delft3D, is used for this purpose. The analysis is based on the modification of existing bathymetry to simulate "what if" scenarios with and without the bathymetric features of interest. Borrow pits dredged about 30 years ago to provide sand for the nourishment project have a significant influence on project performance and the formation of erosional hot spots. It was found that the main processes controlling beach response to these offshore bathymetric features were feedbacks between wave forces (roller force or the alongshore component of the radiation stress), pressure gradients due to differentials in wave set-up/set-down, and bed shear stress. Modeling results also indicated that backfilling of selected borrow sites had a net positive effect within the beach fill limits and caused a reduction in the magnitude of hot spot erosion. © 2008 Elsevier B.V. All rights reserved.

  12. Research and development of low cost processes for integrated solar arrays. Final report, April 15, 1974--January 14, 1976

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graham, C.D.; Kulkarni, S.; Louis, E.

    1976-05-01

    Results of a program to study process routes leading to low-cost, large-area integrated silicon solar array manufacture for terrestrial applications are reported. Potential processes for the production of solar-grade silicon are evaluated from thermodynamic, economic, and technical feasibility points of view. Upgrading of the present arc-furnace process is found most favorable. Experimental studies of the Si/SiF₄ transport and purification process show considerable impurity removal and reasonable transport rates. Silicon deformation experiments indicate that production of silicon sheet by rolling at 1350 °C is feasible. Significant recrystallization by the strain-anneal technique has been observed. Experimental recrystallization studies using an electron beam line source are discussed. A maximum recrystallization velocity of approximately 9 m/hr is calculated for silicon sheet. A comparative process rating technique based on detailed cost analysis is presented.

  13. Investigation about the Chrome Steel Wire Arc Spray Process and the Resulting Coating Properties

    NASA Astrophysics Data System (ADS)

    Wilden, J.; Bergmann, J. P.; Jahn, S.; Knapp, S.; van Rodijnen, F.; Fischer, G.

    2007-12-01

    Nowadays, wire-arc spraying of chromium steel has gained an important market share in corrosion and wear protection applications. However, detailed studies are the basis for further process optimization. In order to optimize the process parameters and to evaluate the effects of the spray parameters, DoE-based experiments were carried out with high-speed camera recordings. In this article, the effects of spray current, voltage, and atomizing gas pressure on the particle jet properties (mean particle velocity, mean particle temperature, and plume width) for X46Cr13 wire are presented, measured using an online process monitoring device. Moreover, the properties of the coatings concerning morphology, composition, and phase formation were investigated using SEM, EDX, and XRD analysis. These detailed investigations allow a well-defined verification of the influence of process parameters on the spray plume and coating properties and are the basis for further process optimization.

  14. Community reporting of ambient air polychlorinated biphenyl concentrations near a Superfund site.

    PubMed

    Tomsho, Kathryn S; Basra, Komal; Rubin, Staci M; Miller, Claire B; Juang, Richard; Broude, Sylvia; Martinez, Andres; Hornbuckle, Keri C; Heiger-Bernays, Wendy; Scammell, Madeleine K

    2017-10-27

    In this manuscript, we describe the process of establishing partnerships for community-based environmental exposure research, the tools and methods implemented for data report-back to community members, and the results of evaluations of these efforts. Data discovery and report-back materials developed by Statistics for Action (SFA) were employed as the framework to communicate the environmental data to community members at workshops. These data communication and research translation efforts are described in detail and evaluated for effectiveness based on feedback provided by community members who attended the workshops. Overall, the methods were largely effective for the intended data communication.

  15. Engineering visualization utilizing advanced animation

    NASA Technical Reports Server (NTRS)

    Sabionski, Gunter R.; Robinson, Thomas L., Jr.

    1989-01-01

    Engineering visualization is the use of computer graphics to depict engineering analysis and simulation in visual form from project planning through documentation. Graphics displays let engineers see data represented dynamically which permits the quick evaluation of results. The current state of graphics hardware and software generally allows the creation of two types of 3D graphics. The use of animated video as an engineering visualization tool is presented. The engineering, animation, and videography aspects of animated video production are each discussed. Specific issues include the integration of staffing expertise, hardware, software, and the various production processes. A detailed explanation of the animation process reveals the capabilities of this unique engineering visualization method. Automation of animation and video production processes are covered and future directions are proposed.

  16. Process and assembly plans for low cost commercial fuselage structure

    NASA Technical Reports Server (NTRS)

    Willden, Kurtis; Metschan, Stephen; Starkey, Val

    1991-01-01

    Cost and weight reduction for a composite structure is a result of selecting design concepts that can be built using efficient low-cost manufacturing and assembly processes. Since design and manufacturing are inherently cost dependent, concurrent engineering in the form of a Design-Build Team (DBT) is essential for low-cost designs. Detailed cost analysis from DBT designs and hardware verification must be performed to identify the cost drivers and the relationships between design and manufacturing processes. Results from the global evaluation are used to quantitatively rank designs, identify cost centers for higher-ranking design concepts, define and prioritize a list of technical/economic issues and barriers, and identify parameters that control concept response. These results are then used for final design optimization.

  17. Design and cost analysis of rapid aquifer restoration systems using flow simulation and quadratic programming.

    USGS Publications Warehouse

    Lefkoff, L.J.; Gorelick, S.M.

    1986-01-01

    Detailed two-dimensional flow simulation of a complex ground-water system is combined with quadratic and linear programming to evaluate design alternatives for rapid aquifer restoration. Results show how treatment and pumping costs depend dynamically on the type of treatment process, the capacity of pumping and injection wells, and the number of wells. The design for an inexpensive treatment process minimizes pumping costs, while an expensive process results in the minimization of treatment costs. Substantial reductions in pumping costs occur with increases in injection capacity or in the number of wells. Treatment costs are reduced by expansions in pumping capacity or injection capacity. The analysis identifies maximum pumping and injection capacities. -from Authors

  18. Influence of spatial frequency and emotion expression on face processing in patients with panic disorder.

    PubMed

    Shim, Miseon; Kim, Do-Won; Yoon, Sunkyung; Park, Gewnhi; Im, Chang-Hwan; Lee, Seung-Hwan

    2016-06-01

    Deficits in facial emotion processing are a major characteristic of patients with panic disorder. It is known that visual stimuli with different spatial frequencies take distinct neural pathways. This study investigated facial emotion processing involving stimuli presented at broad, high, and low spatial frequencies in patients with panic disorder. Eighteen patients with panic disorder and 19 healthy controls were recruited. Seven event-related potential (ERP) components (P100, N170, early posterior negativity (EPN), vertex positive potential (VPP), N250, P300, and late positive potential (LPP)) were evaluated while the participants looked at fearful and neutral facial stimuli presented at three spatial frequencies. When a fearful face was presented, panic disorder patients showed a significantly increased P100 amplitude in response to low spatial frequency (LSF) compared to high spatial frequency (HSF) stimuli, whereas healthy controls demonstrated significant broad spatial frequency (BSF) dependent processing in P100 amplitude. VPP amplitude was significantly increased in HSF and BSF, compared to LSF, in panic disorder. EPN amplitude was significantly different between HSF and BSF, and between LSF and BSF processing, in both groups, regardless of facial expression. The possibly confounding effects of medication could not be controlled. During early visual processing, patients with panic disorder prefer global to detailed information. However, in later processing, panic disorder patients overuse detailed information for the perception of facial expressions. These findings suggest that unique spatial frequency-dependent facial processing could shed light on the neural pathology associated with panic disorder. Copyright © 2016 Elsevier B.V. All rights reserved.
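Spatial-frequency-filtered face stimuli of the kind used above are conventionally produced by Gaussian low-pass filtering (LSF) and subtraction (HSF); a one-dimensional sketch with a toy luminance profile, where the cutoff and values are assumptions rather than the study's actual parameters:

```python
import math

def gaussian_kernel(sigma, radius):
    """Normalized 1-D Gaussian kernel."""
    k = [math.exp(-i * i / (2.0 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def convolve(signal, kernel):
    """1-D convolution with edge clamping."""
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - r, 0), len(signal) - 1)
            acc += w * signal[idx]
        out.append(acc)
    return out

# Toy luminance profile across one row of a face image (invented values):
# a bright band against a darker background.
row = [0.2] * 20 + [0.8] * 20 + [0.2] * 20
low = convolve(row, gaussian_kernel(sigma=3.0, radius=9))  # LSF content
high = [a - b for a, b in zip(row, low)]                   # HSF content (edges)
```

The BSF condition is simply the unfiltered stimulus; the HSF residual concentrates at the sharp edges, which is the detailed information the later ERP components are said to rely on.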

  19. Parametric Evaluation of Interstellar Exploration Mission Concepts

    NASA Technical Reports Server (NTRS)

    Adams, Robert B.

    2017-01-01

    One persistent difficulty in evaluating the myriad advanced propulsion concepts proposed over the last 60 years is a true apples-to-apples comparison of the expected gain in performance. This analysis is complicated by numerous factors, including multiple missions of interest to the advanced propulsion community, the lack of a credible closed-form solution for 'medium thrust' trajectories, and the lack of detailed design data for most proposed concepts that would lend credibility to engine performance estimates. This paper describes a process for making fair comparisons of different propulsion concepts for multiple missions over a wide range of performance values; the figure below illustrates this process. The paper describes the process in detail and outlines the status so far in compiling the required data. Parametric data for several missions are calculated and plotted against specific power-specific impulse scatter plots of expected propulsion system performance. The overlay between required performance as defined by the trajectory parametrics and expected performance as defined in the literature for major categories of propulsion systems clearly defines which propulsion systems are the most apt for a given mission. The application of the Buckingham Pi theorem to general parameters for interstellar exploration (mission time, mass, specific impulse, specific power, distance, propulsion source energy/mass, etc.) yields a number of dimensionless variables. The relationships of these variables can then be explored before application to a particular mission. As in the fields of fluid mechanics and heat transfer, the use of the Buckingham Pi theorem results in new variables that enable apples-to-apples comparisons.
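The Buckingham Pi step reduces to finding exponent combinations whose base dimensions cancel. A minimal sketch, with assumed mission variables and dimensions (not the paper's actual variable set):

```python
from fractions import Fraction

def dim(**exp):
    """A dimension as a map of base units (M, L, T) to rational exponents."""
    return {k: Fraction(v) for k, v in exp.items() if v}

def mul(a, b):
    """Multiply two dimensions (add exponents, drop zeros)."""
    out = dict(a)
    for k, v in b.items():
        out[k] = out.get(k, Fraction(0)) + v
        if out[k] == 0:
            del out[k]
    return out

def power(a, p):
    """Raise a dimension to an integer power."""
    return {k: v * p for k, v in a.items()}

# Assumed mission variables and their dimensions:
distance   = dim(L=1)
time       = dim(T=1)
exhaust_v  = dim(L=1, T=-1)    # effective exhaust velocity (g0 * Isp)
spec_power = dim(L=2, T=-3)    # specific power, W/kg = m^2 s^-3

# Candidate Pi groups: an empty dimension map means "dimensionless".
pi1 = mul(mul(exhaust_v, time), power(distance, -1))    # v * t / d
pi2 = mul(mul(spec_power, time), power(exhaust_v, -2))  # alpha * t / v^2
```

With all exponents canceling, pi1 and pi2 are valid dimensionless groups; the full theorem guarantees n − r independent groups for n variables over r base dimensions.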

  20. Differential flank growth

    NASA Astrophysics Data System (ADS)

    Zieschang, H. E.; Sievers, A.

    1994-08-01

    With the mathematical basis for the precise analysis of developmental processes in plants, the patterns of growth in phototropic and gravitropic responses have become better understood. A detailed temporal and spatial quantification of a growth process is an important tool for evaluating hypotheses about the underlying physiological mechanisms. Studies of growth rates and curvature show that the original Cholodny-Went hypothesis cannot explain the complex growth patterns during tropic responses of shoots and roots. In addition, regulating factors other than the lateral redistribution of hormones must be taken into account. Electrophysiological studies on roots led to a modification of the Cholodny-Went hypothesis in that redistributions of bioelectrical activities are observed.

  1. Survey of hydrogen production and utilization methods. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Gregory, D. P.; Pangborn, J. B.; Gillis, J. C.

    1975-01-01

    The use of hydrogen as a synthetic fuel is considered. Processes for the production of hydrogen are described, along with the present and future industrial uses of hydrogen as a fuel and as a chemical feedstock. Novel and unconventional hydrogen-production techniques are evaluated, with emphasis placed on thermochemical and electrolytic processes. Potential uses for hydrogen as a fuel in industrial and residential applications are identified and reviewed in the context of anticipated U.S. energy supplies and demands. A detailed plan, prepared for the period 1975 to 1980, for research on and development of hydrogen as an energy carrier is included.

  2. Accurate and efficient spin integration for particle accelerators

    DOE PAGES

    Abell, Dan T.; Meiser, Dominic; Ranjbar, Vahid H.; ...

    2015-02-01

    Accurate spin tracking is a valuable tool for understanding spin dynamics in particle accelerators and can help improve the performance of an accelerator. In this paper, we present a detailed discussion of the integrators in the spin tracking code GPUSPINTRACK. We have implemented orbital integrators based on drift-kick, bend-kick, and matrix-kick splits. On top of the orbital integrators, we have implemented various integrators for the spin motion. These integrators use quaternions and Romberg quadratures to accelerate both the computation and the convergence of spin rotations. We evaluate their performance and accuracy in quantitative detail for individual elements as well as for the entire RHIC lattice. We exploit the inherently data-parallel nature of spin tracking to accelerate our algorithms on graphics processing units.
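Quaternion composition of spin rotations, one ingredient the abstract names, can be sketched generically; this is a textbook Hamilton-product implementation, not code from GPUSPINTRACK:

```python
import math

def q_mul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_rot(axis, angle):
    """Unit quaternion for a rotation of `angle` about a unit `axis`."""
    s = math.sin(angle / 2.0)
    return (math.cos(angle / 2.0), axis[0]*s, axis[1]*s, axis[2]*s)

def rotate(q, v):
    """Rotate vector v by unit quaternion q via q * v * q_conjugate."""
    qc = (q[0], -q[1], -q[2], -q[3])
    w, x, y, z = q_mul(q_mul(q, (0.0, *v)), qc)
    return (x, y, z)

# Two successive spin rotations compose by a single quaternion product,
# which stays numerically on the rotation group better than accumulated
# 3x3 matrix products.
q1 = q_rot((0.0, 0.0, 1.0), math.pi / 2)   # 90 degrees about z
q2 = q_rot((1.0, 0.0, 0.0), math.pi / 2)   # then 90 degrees about x
spin = rotate(q_mul(q2, q1), (1.0, 0.0, 0.0))
```

Rotating the x-axis 90° about z and then 90° about x lands it on the z-axis, which the composed quaternion reproduces in one step.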

  3. Space station systems technology study (add-on task). Volume 2: Trade study and technology selection

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The current Space Station Systems Technology Study add-on task was an outgrowth of the Advanced Platform Systems Technology Study (APSTS) completed in April 1983 and the subsequent Space Station Systems Technology Study completed in April 1984. The first APSTS proceeded from the identification of 106 technology topics to the selection of five for detailed trade studies. During the advanced platform study, the technical issues and options were evaluated through detailed trade processes, individual consideration was given to costs and benefits for the technologies identified for advancement, and advancement plans were developed. A similar approach was used in the subsequent study, with emphasis on system definition in four specific technology areas to facilitate a more in-depth analysis of technology issues.

  4. CISN ShakeAlert Earthquake Early Warning System Monitoring Tools

    NASA Astrophysics Data System (ADS)

    Henson, I. H.; Allen, R. M.; Neuhauser, D. S.

    2015-12-01

    CISN ShakeAlert is a prototype earthquake early warning system being developed and tested by the California Integrated Seismic Network. The system has recently been expanded to support redundant data processing and communications. It now runs on six machines at three locations with ten Apache ActiveMQ message brokers linking together 18 waveform processors, 12 event association processes, and 4 Decision Module alert processes. The system ingests waveform data from about 500 stations and generates many thousands of triggers per day, of which a small portion produces earthquake alerts. We have developed interactive web browser system-monitoring tools that display near-real-time state-of-health and performance information. This includes station availability, trigger statistics, and communication and alert latencies. Connections to regional earthquake catalogs provide a rapid assessment of the Decision Module hypocenter accuracy. Historical performance can be evaluated, including statistics for hypocenter and origin time accuracy and alert time latencies for different time periods, magnitude ranges, and geographic regions. For the ElarmS event associator, individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers. Individual station trigger and latency statistics are available. Detailed information about the ElarmS trigger association process for both alerted and rejected events is also available. The Google Web Toolkit and Map API have been used to develop interactive web pages that link tabular and geographic information. Statistical analysis is provided by the R-Statistics System linked to a PostgreSQL database.

  5. Effort and Accuracy in Choice.

    DTIC Science & Technology

    1984-01-01

    Eric J. Johnson; John W. Payne. Contract N00014-80-C-0114. The scanned abstract is OCR-damaged; recoverable fragments: "...concerning human rationality in the absence of a detailed analysis of the sensitivity of the criterion and the cost involved in evaluating the alternatives..."; "...can be thought of as being part of long-term memory. Arguments for the value of production systems as a representation of human cognitive processes..."

  6. The Main Portal of the Cathedral of Monreale: First Geometric Analysis and Interpretive Assessment of Architectural Features

    NASA Astrophysics Data System (ADS)

    Lo Brutto, M.; Dardanelli, G.; Ebolese, D.; Milazzo, G.; Pipitone, C.; Sciortino, R.

    2017-05-01

    Nowadays, 3D documentation of architectural assets is becoming a demanding task for the valorisation of Cultural Heritage, especially after a restoration project. The 3D documentation can be used for detailed analysis of specific elements, for monitoring the state of conservation, and for valorisation actions. The paper describes the results of the 3D close-range photogrammetry survey of the main portal of the Cathedral of Monreale (Palermo, Italy). The Cathedral is one of the most important monumental complexes in Sicily and, for its high historical and artistic importance, has been inscribed on UNESCO's World Heritage List since 2015. The main portal of the Cathedral has recently been restored. The restoration work has given the opportunity to evidence small details of the sculptural decorations and to carry out new interpretative analyses of the bas-reliefs. The main purpose of the work is to obtain a detailed 3D model and a high-resolution orthophoto of the entire portal and of some architectural details. The study was used to evaluate the most appropriate technical solutions for the 3D survey and to define the most suitable parameters for image acquisition and data processing.

  7. Single photon laser altimeter simulator and statistical signal processing

    NASA Astrophysics Data System (ADS)

    Vacek, Michael; Prochazka, Ivan

    2013-05-01

    Spaceborne altimeters are common instruments aboard deep-space rendezvous spacecraft. They provide range and topographic measurements critical in spacecraft navigation. Simultaneously, the receiver part may be utilized for an Earth-to-satellite link, one-way time transfer, and precise optical radiometry. The main advantage of the single photon counting approach is the ability to process signals with a very low signal-to-noise ratio, eliminating the need for large telescopes and a high-power laser source. Extremely small, rugged, and compact microchip lasers can be employed. The major limiting factor, on the other hand, is the acquisition time needed to gather a sufficient volume of data in repetitive measurements in order to process and evaluate the data appropriately. Statistical signal processing is adopted to detect signals with an average strength much lower than one photon per measurement. A comprehensive simulator design and range signal processing algorithm are presented to identify a mission-specific altimeter configuration. Typical mission scenarios (celestial body surface landing and topographical mapping) are simulated and evaluated. The most promising single photon altimeter applications are low-orbit (˜10 km), low-radial-velocity (several m/s) topographical mapping (asteroids, Phobos and Deimos) and landing altimetry (˜10 km), where range evaluation repetition rates of ˜100 Hz and 0.1 m precision may be achieved. Moon landing and asteroid Itokawa topographical mapping scenario simulations are discussed in more detail.
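Detecting a signal weaker than one photon per shot typically works by accumulating many repeated range measurements until the echo bin stands out above the Poisson background; a toy simulation, with all rates and bin counts invented:

```python
import random
from collections import Counter

random.seed(42)
BINS, SHOTS = 200, 2000            # range-gate bins and laser shots (invented)
SIGNAL_BIN, P_ECHO = 57, 0.3       # true target bin, echo probability (invented)

hist = Counter()
for _ in range(SHOTS):
    hist[random.randrange(BINS)] += 1   # ~1 background photon/shot, uniform
    if random.random() < P_ECHO:        # weak echo: well under 1 photon/shot
        hist[SIGNAL_BIN] += 1

background = SHOTS / BINS               # expected background counts per bin
best = max(hist, key=hist.get)          # histogram peak = detected range bin
```

After 2000 shots the echo bin holds roughly 60x the per-bin background, so even a sub-single-photon return is unambiguous; the required number of shots is what drives the acquisition-time limitation the abstract mentions.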

  8. Design, evaluation, and fabrication of low-cost composite blades for intermediate-size wind turbines

    NASA Technical Reports Server (NTRS)

    Weingart, O.

    1981-01-01

    Low cost approaches for production of 60 ft long glass fiber/resin composite rotor blades for the MOD-OA wind turbine were identified and evaluated. The most cost-effective configuration was selected for detailed design. Subelement and subscale specimens were fabricated for testing to confirm physical and mechanical properties of the composite blade materials, to develop and evaluate blade fabrication techniques and processes, and to confirm the structural adequacy of the root end joint. Full-scale blade tooling was constructed and a partial blade for tool and process tryout was built. Then two full scale blades were fabricated and delivered to NASA-LeRC for installation on a MOD-OA wind turbine at Clayton, New Mexico for operational testing. Each blade was 60 ft. long with 4.5 ft. chord at root end and 2575 lbs weight including metal hub adapter. The selected blade configuration was a three cell design constructed using a resin impregnated glass fiber tape winding process that allows rapid wrapping of primarily axially oriented fibers onto a tapered mandrel, with tapered wall thickness. The ring winder/transverse filament tape process combination was used for the first time on this program to produce entire rotor blade structures. This approach permitted the complete blade to be wound on stationary mandrels, an improvement which alleviated some of the tooling and process problems encountered on previous composite blade programs.

  9. Numerical analysis of tailored sheets to improve the quality of components made by SPIF

    NASA Astrophysics Data System (ADS)

    Gagliardi, Francesco; Ambrogio, Giuseppina; Cozza, Anna; Pulice, Diego; Filice, Luigino

    2018-05-01

    In this paper, the authors present a study on the profitable combination of forming techniques. More specifically, attention has been focused on combining single point incremental forming (SPIF) with an additional process that can locally thicken the initial blank, in view of the local thinning the sheets undergo. Focusing on the excessive thinning of parts made by SPIF, a hybrid approach can be seen as a viable solution to reduce the inhomogeneous thickness distribution of the sheet. The basic idea is to work on a blank previously modified by a deformation step performed, for instance, by forming, additive, or subtractive processes. To evaluate the effectiveness of this hybrid solution, an FE numerical model has been defined to analyze the thickness variation on tailored sheets incrementally formed, optimizing the material distribution according to the shape to be manufactured. Simulations based on the explicit formulation were set up for the model implementation. The mechanical properties of the sheet material were taken from the literature, and a frustum of a cone was considered as the benchmark profile for the analysis. The outcomes of the numerical model were evaluated in terms of both maximum thinning and final thickness distribution. The feasibility of the proposed approach is detailed in depth in the paper.
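The local thinning that motivates the tailored blank is commonly estimated with the sine law for incremental forming; a sketch with hypothetical thicknesses and wall angles (the paper's actual geometry is not reproduced here):

```python
import math

def spif_thickness(t0, wall_angle_deg):
    """Sine-law estimate of wall thickness after incremental forming:
    t = t0 * cos(alpha), with alpha the wall angle from the horizontal."""
    return t0 * math.cos(math.radians(wall_angle_deg))

# A tailored (pre-thickened) blank can compensate for the predicted thinning:
target_thickness, wall_angle = 0.8, 50.0   # mm and degrees, hypothetical
t0_needed = target_thickness / math.cos(math.radians(wall_angle))
```

Inverting the sine law this way gives the local pre-thickening a preliminary deformation step would need to deliver, which is the material-redistribution idea the FE model evaluates.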

  10. Drug delivery system innovation and Health Technology Assessment: Upgrading from Clinical to Technological Assessment.

    PubMed

    Panzitta, Michele; Bruno, Giorgio; Giovagnoli, Stefano; Mendicino, Francesca R; Ricci, Maurizio

    2015-11-30

    Health Technology Assessment (HTA) is a multidisciplinary health policy instrument that evaluates the consequences, mainly clinical and economic, of a health care technology; the aim of HTA is to produce and disseminate information on scientific and technological innovation for the health policy decision-making process. Drug delivery systems (DDS), such as nanocarriers, are technologically complex, but they have pivotal relevance in therapeutic innovation. The HTA process, as commonly applied to conventional drug evaluation, should be upgraded to a full pharmaceutical assessment in view of DDS complexity. This is useful to study the clinical outcome in more depth and to broaden the critical assessment toward pharmaceutical issues that affect the patient but are not measured by the current clinical evidence approach. We identify the expertise necessary to perform the pharmaceutical assessment and propose a format to evaluate DDS technological topics such as formulation and mechanism of action, physicochemical characteristics, and the manufacturing process. We integrated these three points into the Evidence Based Medicine approach, which is the data source for any HTA process. In this regard, the introduction of a Pharmaceutics Expert into the HTA could be fundamental to provide a more detailed evaluation of medicinal product characteristics and performance and to help optimize DDS features to overcome R&D drawbacks. Some aspects of product development, such as manufacturing processes, should be part of the HTA, as innovative manufacturing processes allow new products to reach the patient's bedside more effectively. An HTA so upgraded may encourage resource-allocating payers to invest in innovative technologies and providers to focus on innovative material properties and manufacturing processes, thus contributing to bringing more medicines into therapy in a sustainable manner. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Uncertainty in urban flood damage assessment due to urban drainage modelling and depth-damage curve estimation.

    PubMed

    Freni, G; La Loggia, G; Notaro, V

    2010-01-01

    Due to the increased occurrence of flooding events in urban areas, many procedures for flood damage quantification have been defined in recent decades. The lack of large databases in most cases is overcome by combining the output of urban drainage models and damage curves linking flooding to expected damage. The application of advanced hydraulic models as diagnostic, design and decision-making support tools has become a standard practice in hydraulic research and application. Flooding damage functions are usually evaluated by a priori estimation of potential damage (based on the value of exposed goods) or by interpolating real damage data (recorded during historical flooding events). Hydraulic models have undergone continuous advancements, pushed forward by increasing computer capacity. The details of the flooding propagation process on the surface and the details of the interconnections between underground and surface drainage systems have been studied extensively in recent years, resulting in progressively more reliable models. The same level of advancement has not been reached with regard to damage curves, for which improvements are highly connected to data availability; this remains the main bottleneck in expected flooding damage estimation. Such functions are usually affected by significant uncertainty intrinsically related to the collected data and to the simplified structure of the adopted functional relationships. The present paper aimed to evaluate this uncertainty by comparing the intrinsic uncertainty connected to the construction of the damage-depth function to the hydraulic model uncertainty. In this way, the paper sought to evaluate the role of hydraulic model detail level in the wider context of flood damage estimation. This paper demonstrated that the use of detailed hydraulic models might not be justified because of the higher computational cost and the significant uncertainty in damage estimation curves. This uncertainty occurs mainly because a large part of the total uncertainty is dependent on depth-damage curves. Improving the estimation of these curves may provide better results in terms of uncertainty reduction than the adoption of detailed hydraulic models.
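    The abstract's argument can be illustrated with a toy Monte Carlo comparison. The depth-damage function, its parameters, and the spreads below are hypothetical stand-ins, not taken from the paper; they merely show how the variance contributed by an uncertain damage curve can dominate the variance contributed by an uncertain simulated depth:

```python
import random
from statistics import pstdev

random.seed(0)

def damage(depth, a, b):
    # hypothetical power-law depth-damage curve: D = a * depth**b
    return a * max(depth, 0.0) ** b

N = 20000
true_depth, a0, b0 = 1.2, 40.0, 0.8

# uncertainty from the hydraulic model only: simulated depth varies, curve fixed
d_model = [damage(random.gauss(true_depth, 0.10), a0, b0) for _ in range(N)]
# uncertainty from the damage curve only: depth fixed, curve parameters vary
d_curve = [damage(true_depth, random.gauss(a0, 8.0), random.gauss(b0, 0.15))
           for _ in range(N)]

# with these assumed spreads, the damage-curve term dominates the total
# uncertainty, mirroring the paper's conclusion
spread_model, spread_curve = pstdev(d_model), pstdev(d_curve)
```

    When the curve term dominates, refining the hydraulic model shrinks only the smaller contribution, which is the quantitative intuition behind the paper's conclusion.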

  12. NASA Processes and Requirements for Conducting Human-in-the-Loop Closed Chamber Tests

    NASA Technical Reports Server (NTRS)

    Barta, Daniel J.; Montz, Michael E.

    2004-01-01

    NASA has specific processes and requirements that must be followed for tests involving human subjects to be conducted in a safe and effective manner. There are five distinct phases of test operations. Phase one, the test request phase, consists of those activities related to initiating, processing, reviewing, and evaluating the test request. Phase two, the test preparation phase, consists of those activities related to planning, coordinating, documenting, and building up the test. Phase three, the test readiness phase, consists of those activities related to verifying and reviewing the planned test operations. Phase four, the test activity phase, consists of all pretest operations, functional checkouts, emergency drills, and test operations. Phase five, the post test activity phase, consists of those activities performed once the test is completed, including briefings, documentation of anomalies, data reduction and archiving, and reporting. Project management processes must be followed for facility modifications and major test buildup, which include six phases: initiation and assessment, requirements evaluation, preliminary design, detailed design, use readiness review (URR) and acceptance. Compliance with requirements for safety and quality assurance are documented throughout the test buildup and test operation processes. Tests involving human subjects must be reviewed by the applicable Institutional Review Board (IRB).

  13. Improved silicon nitride for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Yeh, Hun C.; Fang, Ho T.

    1987-01-01

    The technology base required to fabricate silicon nitride components with the strength, reliability, and reproducibility necessary for actual heat engine applications is presented. Task 2 was set up to develop test bars with high Weibull slope and greater high temperature strength, and to conduct an initial net shape component fabrication evaluation. Screening experiments were performed in Task 7 on advanced materials and processing for input to Task 2. The technical efforts performed in the second year of a 5-yr program are covered. The first iteration of Task 2 was completed as planned. Two half-replicated, fractional factorial (2^5), statistically designed matrix experiments were conducted. These experiments have identified Denka 9FW Si3N4 as an alternate raw material to GTE SN502 Si3N4 for subsequent process evaluation. A detailed statistical analysis was conducted to correlate processing conditions with as-processed test bar properties. One processing condition produced a material with a 97 ksi average room temperature MOR (100 percent of goal) with 13.2 Weibull slope (83 percent of goal); another condition produced 86 ksi (6 percent over baseline) room temperature strength with a Weibull slope of 20 (125 percent of goal).

  14. Introduction of an all-electronic administrative process for a major international pediatric surgical meeting.

    PubMed

    Applebaum, Harry; Boles, Kay; Atkinson, James B

    2003-12-01

    The administrative process for annual meetings is time consuming and increasingly costly when accomplished by traditional postal, fax, and telephone methods. The Pacific Association of Pediatric Surgeons introduced an all-electronic communication format for its 2002 annual meeting. Attendee acceptance and administrative and financial impact were evaluated. Interested physicians were directed to a Website containing detailed information and electronic forms. E-mail was used for the abstract selection and manuscript submission processes. Attendees were surveyed to evaluate the new format. Administrative costs for the new format were compared with estimated costs for a comparable traditionally managed meeting. Attendance was similar to that at previous US meetings. Eighty-two percent of respondents approved of the all-electronic format, although 48% believed a choice should remain. None suggested a complete return to the traditional format. Abstract and manuscript processing time was reduced substantially as were administrative costs ($79.43 savings per physician registrant). Adoption of an all-electronic annual meeting administrative process was associated with substantial cost reduction, increased efficiency, and excellent attendee satisfaction. This technology can help avoid increased registration fees while easing the burden on physician volunteers.

  15. Criteria-based evaluation of group 3 level memory telefacsimile equipment for interlibrary loan.

    PubMed Central

    Bennett, V M; Wood, M S; Malcom, D L

    1990-01-01

    The Interlibrary Loan, Document Delivery, and Union List Task Force of the Health Sciences Libraries Consortium (HSLC)--with nineteen libraries located in Philadelphia, Pittsburgh, and Hershey, Pennsylvania, and Delaware--accepted the charge of evaluating and recommending for purchase telefacsimile hardware to further interlibrary loan among HSLC members. To allow a thorough and scientific evaluation of group 3 level telefacsimile equipment, the task force identified ninety-six hardware features, which were grouped into nine broad criteria. These features formed the basis of a weighted analysis that identified three final candidates, with one model recommended to the HSLC board. This article details each of the criteria and discusses features in terms of library applications. The evaluation grid developed in the weighted analysis process should aid librarians charged with the selection of level 3 telefacsimile equipment. PMID:2328361
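    The weighted analysis described above is a standard decision matrix. A minimal sketch, with invented criteria, weights, and scores (the actual study used ninety-six features grouped into nine criteria), illustrates the mechanics:

```python
# Hypothetical criteria weights (summing to 1.0) and per-model raw scores;
# none of these values come from the HSLC evaluation itself.
weights = {"image quality": 0.30, "memory": 0.25, "speed": 0.25, "cost": 0.20}
scores = {
    "Model A": {"image quality": 8, "memory": 9, "speed": 7, "cost": 6},
    "Model B": {"image quality": 7, "memory": 6, "speed": 8, "cost": 9},
    "Model C": {"image quality": 9, "memory": 7, "speed": 6, "cost": 5},
}

def weighted_total(model_scores, weights):
    # weighted sum across criteria: the core of the evaluation grid
    return sum(weights[c] * model_scores[c] for c in weights)

# rank candidates by weighted total, best first
ranked = sorted(scores, key=lambda m: weighted_total(scores[m], weights),
                reverse=True)
```

    The task force's "evaluation grid" is this computation carried out over many more features, with one model recommended from the top of the ranking.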

  16. Benchmark matrix and guide: Part III.

    PubMed

    1992-01-01

    The final article in the "Benchmark Matrix and Guide" series developed by Headquarters Air Force Logistics Command completes the discussion of the last three categories that are essential ingredients of a successful total quality management (TQM) program. Detailed behavioral objectives are listed in the areas of recognition, process improvement, and customer focus. These vertical categories are meant to be applied to the levels of the matrix that define the progressive stages of the TQM: business as usual, initiation, implementation, expansion, and integration. By charting the horizontal progress level and the vertical TQM category, the quality management professional can evaluate the current state of TQM in any given organization. As each category is completed, new goals can be defined in order to advance to a higher level. The benchmarking process is integral to quality improvement efforts because it focuses on the highest possible standards to evaluate quality programs.

  17. Speckle reduction in echocardiography by temporal compounding and anisotropic diffusion filtering

    NASA Astrophysics Data System (ADS)

    Giraldo-Guzmán, Jader; Porto-Solano, Oscar; Cadena-Bonfanti, Alberto; Contreras-Ortiz, Sonia H.

    2015-01-01

    Echocardiography is a medical imaging technique based on ultrasound signals that is used to evaluate heart anatomy and physiology. Echocardiographic images are affected by speckle, a type of multiplicative noise that obscures details of the structures, and reduces the overall image quality. This paper shows an approach to enhance echocardiography using two processing techniques: temporal compounding and anisotropic diffusion filtering. We used twenty echocardiographic videos that include one or three cardiac cycles to test the algorithms. Two images from each cycle were aligned in space and averaged to obtain the compound images. These images were then processed using anisotropic diffusion filters to further improve their quality. Resultant images were evaluated using quality metrics and visual assessment by two medical doctors. The average total improvement in signal-to-noise ratio was up to 100.29% for videos with three cycles, and up to 32.57% for videos with one cycle.
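    As a rough one-dimensional illustration of the two processing steps (the paper works on full 2-D frames), the following sketch averages two aligned noisy "scan lines" and then applies a Perona-Malik-style edge-preserving diffusion; all signal and parameter values are invented:

```python
import random

random.seed(1)

# Two "frames" of the same 1-D scan line corrupted by multiplicative speckle
clean = [1.0] * 20 + [3.0] * 20
frames = [[v * (1 + random.gauss(0, 0.25)) for v in clean] for _ in range(2)]

# Temporal compounding: average spatially aligned frames
compound = [sum(col) / len(col) for col in zip(*frames)]

def perona_malik_1d(u, iters=30, k=0.5, lam=0.2):
    """1-D Perona-Malik-style diffusion: smooth where gradients are small,
    preserve large gradients (edges)."""
    u = list(u)
    for _ in range(iters):
        nxt = u[:]
        for i in range(1, len(u) - 1):
            ge, gw = u[i + 1] - u[i], u[i - 1] - u[i]
            ce = 1.0 / (1.0 + (ge / k) ** 2)   # edge-stopping function
            cw = 1.0 / (1.0 + (gw / k) ** 2)
            nxt[i] = u[i] + lam * (ce * ge + cw * gw)
        u = nxt
    return u

filtered = perona_malik_1d(compound)
```

    Compounding reduces the uncorrelated speckle by averaging, while the diffusion step suppresses what remains without blurring the tissue boundary at the middle of the signal.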

  18. Extended performance solar electric propulsion thrust system study. Volume 4: Thruster technology evaluation

    NASA Technical Reports Server (NTRS)

    Poeschel, R. L.; Hawthorne, E. I.; Weisman, Y. C.; Frisman, M.; Benson, G. C.; Mcgrath, R. J.; Martinelli, R. M.; Linsenbardt, T. L.; Beattie, J. R.

    1977-01-01

    Several thrust system design concepts were evaluated and compared using the specifications of the most advanced 30 cm engineering model thruster as the technology base. Emphasis was placed on relatively high power missions (60 to 100 kW) such as a Halley's comet rendezvous. The extensions in thruster performance required for the Halley's comet mission were defined and alternative thrust system concepts were designed in sufficient detail for comparing mass, efficiency, reliability, structure, and thermal characteristics. Confirmation testing and analysis of thruster and power processing components were performed, and the feasibility of satisfying extended performance requirements was verified. A baseline design was selected from the alternatives considered, and the design analysis and documentation were refined. The baseline thrust system design features modular construction, conventional power processing, and a concentrator solar array concept and is designed to interface with the Space Shuttle.

  19. Feature and contrast enhancement of mammographic image based on multiscale analysis and morphology.

    PubMed

    Wu, Shibin; Yu, Shaode; Yang, Yuhan; Xie, Yaoqin

    2013-01-01

    A new algorithm for feature and contrast enhancement of mammographic images is proposed in this paper. The approach is based on multiscale transform and mathematical morphology. First of all, the Laplacian Gaussian pyramid operator is applied to decompose the mammogram into subband images at different scales. The detail (high-frequency) subimages are then equalized by contrast limited adaptive histogram equalization (CLAHE), while the low-pass subimages are processed by mathematical morphology. Finally, the feature- and contrast-enhanced image is reconstructed from the Laplacian Gaussian pyramid coefficients modified at one or more levels by CLAHE and mathematical morphology, respectively, and the result is processed by a global nonlinear operator. The experimental results show that the presented algorithm is effective for feature and contrast enhancement of mammograms. The performance of the proposed algorithm is measured by a contrast evaluation criterion for images, signal-to-noise ratio (SNR), and contrast improvement index (CII).
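    A minimal one-level, one-dimensional analogue of the decomposition step may clarify the pipeline: the signal is split into a low-pass approximation and a detail band, the detail band is boosted (a plain gain stands in for the CLAHE step applied to the detail subimages in the paper), and the result is reconstructed; with unit gain the reconstruction is exact:

```python
def downsample(x):
    # blur-and-decimate: simple 2-tap average, keep every other sample
    return [(x[i] + x[i + 1]) / 2 for i in range(0, len(x) - 1, 2)]

def upsample(x, n):
    # nearest-neighbour expansion back to length n
    out = []
    for v in x:
        out.extend([v, v])
    return out[:n]

def enhance(signal, detail_gain=1.0):
    """One-level Laplacian-style decomposition and reconstruction:
    detail = signal - upsampled(lowpass); amplifying the detail band
    before reconstruction sharpens features."""
    low = downsample(signal)
    approx = upsample(low, len(signal))
    detail = [s - a for s, a in zip(signal, approx)]
    return [a + detail_gain * d for a, d in zip(approx, detail)]
```

    A full Laplacian pyramid repeats this split recursively on the low-pass band, which is what allows different enhancement operators at different scales.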

  20. Feature and Contrast Enhancement of Mammographic Image Based on Multiscale Analysis and Morphology

    PubMed Central

    Wu, Shibin; Xie, Yaoqin

    2013-01-01

    A new algorithm for feature and contrast enhancement of mammographic images is proposed in this paper. The approach is based on multiscale transform and mathematical morphology. First of all, the Laplacian Gaussian pyramid operator is applied to decompose the mammogram into subband images at different scales. The detail (high-frequency) subimages are then equalized by contrast limited adaptive histogram equalization (CLAHE), while the low-pass subimages are processed by mathematical morphology. Finally, the feature- and contrast-enhanced image is reconstructed from the Laplacian Gaussian pyramid coefficients modified at one or more levels by CLAHE and mathematical morphology, respectively, and the result is processed by a global nonlinear operator. The experimental results show that the presented algorithm is effective for feature and contrast enhancement of mammograms. The performance of the proposed algorithm is measured by a contrast evaluation criterion for images, signal-to-noise ratio (SNR), and contrast improvement index (CII). PMID:24416072

  1. Evaluation of the applicability of the dual‐domain mass transfer model in porous media containing connected high‐conductivity channels

    USGS Publications Warehouse

    Liu, Gaisheng; Zheng, Chunmiao; Gorelick, Steven M.

    2007-01-01

    This paper evaluates the dual‐domain mass transfer (DDMT) model to represent transport processes when small‐scale high‐conductivity (K) preferential flow paths (PFPs) are present in a homogeneous porous media matrix. The effects of PFPs upon solute transport were examined through detailed numerical experiments involving different realizations of PFP networks, PFP/matrix conductivity contrasts varying from 10:1 to 200:1, different magnitudes of effective conductivities, and a range of molecular diffusion coefficients. Results suggest that the DDMT model can reproduce both the near‐source peak and the downstream low‐concentration spreading observed in the embedded dendritic network when there are large conductivity contrasts between high‐K PFPs and the low‐K matrix. The accuracy of the DDMT model is also affected by the geometry of PFP networks and by the relative significance of the diffusion process in the network‐matrix system.

  2. Processing and Recall of Seductive Details in Scientific Text

    ERIC Educational Resources Information Center

    Lehman, Stephen; Schraw, Gregory; McCrudden, Matthew T.; Hartley, Kendall

    2007-01-01

    This study examined how seductive details affect on-line processing of a technical, scientific text. In Experiment 1, each sentence from the experimental text was rated for interest and importance. Participants rated seductive details as being more interesting but less important than main ideas. In Experiment 2, we examined the effect of seductive…

  3. Ethnographic process evaluation in primary care: explaining the complexity of implementation.

    PubMed

    Bunce, Arwen E; Gold, Rachel; Davis, James V; McMullen, Carmit K; Jaworski, Victoria; Mercer, MaryBeth; Nelson, Christine

    2014-12-05

    The recent growth of implementation research in care delivery systems has led to a renewed interest in methodological approaches that deliver not only intervention outcome data but also deep understanding of the complex dynamics underlying the implementation process. We suggest that an ethnographic approach to process evaluation, when informed by and integrated with quantitative data, can provide this nuanced insight into intervention outcomes. The specific methods used in such ethnographic process evaluations are rarely presented in detail; our objective is to stimulate a conversation around the successes and challenges of specific data collection methods in health care settings. We use the example of a translational clinical trial among 11 community clinics in Portland, OR that are implementing an evidence-based, health-information technology (HIT)-based intervention focused on patients with diabetes. Our ethnographic process evaluation employed weekly diaries by clinic-based study employees, observation, informal and formal interviews, document review, surveys, and group discussions to identify barriers and facilitators to implementation success, provide insight into the quantitative study outcomes, and uncover lessons potentially transferable to other implementation projects. These methods captured the depth and breadth of factors contributing to intervention uptake, while minimizing disruption to clinic work and supporting mid-stream shifts in implementation strategies. A major challenge is the amount of dedicated researcher time required. The deep understanding of the 'how' and 'why' behind intervention outcomes that can be gained through an ethnographic approach improves the credibility and transferability of study findings. We encourage others to share their own experiences with ethnography in implementation evaluation and health services research, and to consider adapting the methods and tools described here for their own research.

  4. Planetary Defense: Options for Deflection of Near Earth Objects

    NASA Technical Reports Server (NTRS)

    Adams, R. B.; Statham, G.; Hopkins, R.; Chapman, J.; White, S.; Bonometti, J.; Alexander, R.; Fincher, S.; Polsgrove, T.; Kalkstein, M.

    2003-01-01

    Several recent near-miss encounters with asteroids and comets have focused attention on the threat of a catastrophic impact with the Earth. This document reviews the historical impact record and current understanding of the number and location of Near Earth Objects (NEOs) to address their impact probability. Various ongoing projects intended to survey and catalog the NEO population are also reviewed. Details are then given of an MSFC-led study, intended to develop and assess various candidate systems for protection of the Earth against NEOs. An existing program, used to model the NEO threat, was extensively modified and is presented here. Details of various analytical tools, developed to evaluate the performance of proposed technologies for protection against the NEO threat, are also presented. Trajectory tools, developed to model the outbound path a vehicle would take to intercept or rendezvous with a target asteroid or comet, are described. Also, details are given of a tool that was created to model both the un-deflected inbound path of an NEO as well as the modified, post-deflection, path. The number of possible options available for protection against the NEO threat was too numerous for them to all be addressed within the study; instead, a representative selection was modeled and evaluated. The major output from this work was a novel process by which the relative effectiveness of different threat mitigation concepts can be evaluated during future, more detailed, studies. In addition, several new or modified mathematical models were developed to analyze various proposed protection systems. A summary of the major lessons learned during this study is presented, as are recommendations for future work. It is hoped that this study will serve to raise the level of attention about this very real threat and also demonstrate that successful defense is both possible and practicable, provided appropriate steps are taken.

  5. Enabling Dissimilar Material Joining Using Friction Stir Scribe Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hovanski, Yuri; Upadyay, Piyush; Kleinbaum, Sarah

    2017-04-05

    One challenge in adapting welding processes to dissimilar material joining is the diversity of melting temperatures of the different materials. Although the use of mechanical fasteners and adhesives has mostly paved the way for near-term implementation of dissimilar material systems, these processes only accentuate the need for low-cost welding processes capable of joining dissimilar material components regardless of alloy, properties, or melting temperature. Friction stir scribe technology was developed to overcome the challenges of joining dissimilar material components where melting temperatures vary greatly, and properties and/or chemistry are not compatible with more traditional welding processes. Although the friction stir scribe process is capable of joining dissimilar metals and metal/polymer systems, a more detailed evaluation of several aluminum/steel joints is presented herein to demonstrate the ability to both chemically and mechanically join dissimilar materials.

  6. Enabling Dissimilar Material Joining Using Friction Stir Scribe Technology

    DOE PAGES

    Hovanski, Yuri; Upadyay, Piyush; Kleinbaum, Sarah; ...

    2017-04-05

    One challenge in adapting welding processes to dissimilar material joining is the diversity of melting temperatures of the different materials. Although the use of mechanical fasteners and adhesives has mostly paved the way for near-term implementation of dissimilar material systems, these processes only accentuate the need for low-cost welding processes capable of joining dissimilar material components regardless of alloy, properties, or melting temperature. Friction stir scribe technology was developed to overcome the challenges of joining dissimilar material components where melting temperatures vary greatly, and properties and/or chemistry are not compatible with more traditional welding processes. Finally, although the friction stir scribe process is capable of joining dissimilar metals and metal/polymer systems, a more detailed evaluation of several aluminum/steel joints is presented herein to demonstrate the ability to both chemically and mechanically join dissimilar materials.

  7. Scale-up of mild gasification to be a process development unit mildgas 24 ton/day PDU design report. Final report, November 1991--July 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    From November 1991 to April 1996, Kerr McGee Coal Corporation (K-M Coal) led a project to develop the Institute of Gas Technology (IGT) Mild Gasification (MILDGAS) process for near-term commercialization. The specific objectives of the program were to: design, construct, and operate a 24-tons/day adiabatic process development unit (PDU) to obtain process performance data suitable for further design scale-up; obtain large batches of coal-derived co-products for industrial evaluation; prepare a detailed design of a demonstration unit; and develop technical and economic plans for commercialization of the MILDGAS process. The project team for the PDU development program consisted of: K-M Coal, IGT, Bechtel Corporation, Southern Illinois University at Carbondale (SIUC), General Motors (GM), Pellet Technology Corporation (PTC), LTV Steel, Armco Steel, Reilly Industries, and Auto Research.

  8. Evaluation of microbial diversity in the pilot-scale beer brewing process by culture-dependent and culture-independent method.

    PubMed

    Takahashi, M; Kita, Y; Kusaka, K; Mizuno, A; Goto-Yamamoto, N

    2015-02-01

    In the brewing industry, microbial management is very important for stabilizing the quality of the product. We investigated the detailed microbial community of beer during fermentation and maturation in order to manage beer microbiology in more detail. We brewed a beer (all-malt) and two beer-like beverages (half- and low-malt) in pilot-scale fermentations and investigated their microbial communities using a next-generation sequencer (454 GS FLX titanium), quantitative PCR, flow cytometry and a culture-dependent method. From 28 to 88 genera of bacteria and from 9 to 38 genera of eukaryotic micro-organisms were detected in each sample. Almost all micro-organisms died out during the boiling process. However, bacteria belonging to the genera Acidovorax, Bacillus, Brevundimonas, Caulobacter, Chryseobacterium, Methylobacterium, Paenibacillus, Polaromonas, Pseudomonas, Ralstonia, Sphingomonas, Stenotrophomonas, Tepidimonas and Tissierella were detected at the early and middle stages of fermentation, even though their cell densities were low (below approx. 10^3 cells ml^-1), and they were almost undetectable at the end of fermentation. We revealed that the microbial community of beer during fermentation and maturation is very diverse and that several bacteria possibly survive during fermentation. In this study, we revealed the detailed microbial communities of beer using next-generation sequencing. Some of the micro-organisms detected in this study were found in the beer brewing process for the first time. Additionally, the possibility of growth of several bacteria at the early and middle stages of fermentation was suggested. © 2014 The Society for Applied Microbiology.

  9. Absorbable energy monitoring scheme: new design protocol to test vehicle structural crashworthiness.

    PubMed

    Ofochebe, Sunday M; Enibe, Samuel O; Ozoegwu, Chigbogu G

    2016-05-01

    In vehicle crashworthiness design optimization, detailed system evaluations capable of producing reliable results are typically achieved through high-order numerical computational (HNC) models such as dynamic finite element and mesh-free models. However, the application of these models, especially during optimization studies, is challenged by their inherently high demand on computational resources, the conditional stability of the solution process, and the lack of knowledge of viable parameter ranges for detailed optimization studies. The absorbable energy monitoring scheme (AEMS) presented in this paper suggests a new design protocol that attempts to overcome such problems in the evaluation of vehicle structures for crashworthiness. The implementation of the AEMS involves studying the crash performance of vehicle components at various absorbable energy ratios based on a 2DOF lumped-mass-spring (LMS) vehicle impact model. This allows for prompt prediction of useful parameter values in a given design problem. The application of the classical one-dimensional LMS model in vehicle crash analysis is further improved in the present work by developing a critical load matching criterion, which allows for quantitative interpretation of the results of the abstract model in a typical vehicle crash design. The adequacy of the proposed AEMS for preliminary vehicle crashworthiness design is demonstrated in this paper; however, its extension to a full-scale design-optimization problem involving a full vehicle model with greater structural detail requires more theoretical development.
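    A 2DOF lumped-mass-spring impact model of the kind the AEMS builds on can be sketched in a few lines; the masses, stiffnesses, and impact speed below are illustrative, and the linear, non-dissipative springs are a simplification of real crush behavior:

```python
def simulate_2dof(m1, m2, k1, k2, v0, dt=1e-4, t_end=0.2):
    """2-DOF lumped-mass-spring barrier impact:
    barrier -- k1 -- m1 -- k2 -- m2, both masses arriving at speed v0.
    Returns the peak deflection of each spring, a proxy for the energy
    each structural zone must absorb (E = 0.5 * k * d**2).
    Linear springs, semi-implicit Euler integration."""
    x1 = x2 = 0.0          # displacements toward the barrier
    v1 = v2 = v0
    d1_max = d2_max = 0.0
    for _ in range(int(t_end / dt)):
        d1, d2 = x1, x2 - x1            # spring deflections
        f1, f2 = k1 * d1, k2 * d2
        a1 = (-f1 + f2) / m1
        a2 = -f2 / m2
        v1 += a1 * dt
        v2 += a2 * dt
        x1 += v1 * dt
        x2 += v2 * dt
        d1_max = max(d1_max, x1)
        d2_max = max(d2_max, x2 - x1)
    return d1_max, d2_max

# Front structure (m1, crush-zone stiffness k1) and cabin mass (m2)
# hitting a rigid barrier at 10 m/s (~36 km/h); values are illustrative.
d1, d2 = simulate_2dof(m1=300.0, m2=1000.0, k1=400e3, k2=800e3, v0=10.0)
```

    Sweeping the stiffness ratio k1/k2 in such a model is one way to study how the absorbable energy splits between zones, which is the kind of prompt parameter screening the AEMS is intended to support before any HNC model is run.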

  10. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    The quality cost modeling (QCM) tool is intended to be a relatively simple-to-use device for obtaining a first-order assessment of the quality-cost relationship for a given process-material combination. The QCM curve is a plot of cost versus quality (an index indicating microstructural quality), which is unique for a given process-material combination. The QCM curve indicates the tradeoff between cost and performance, thus enabling one to evaluate affordability. Additionally, the effect of changes in process design, raw materials, and process conditions on the cost-quality relationship can be evaluated. Such results might indicate the most efficient means to obtain improved quality at reduced cost by process design refinements, the implementation of sensors and models for closed loop process control, or improvement in the properties of raw materials being fed into the process. QCM also allows alternative processes for producing the same or similar material to be compared in terms of their potential for producing competitively priced, high quality material. Aside from demonstrating the usefulness of the QCM concept, this is one of the main foci of the present research program, namely to compare processes for making continuous fiber reinforced, metal matrix composites (MMCs). Two processes, low pressure plasma spray deposition and tape casting, are considered for QCM development. This document consists of a detailed look at the design of the QCM approach, followed by discussion of the application of QCM to each of the selected MMC manufacturing processes along with results, comparison of processes, and finally, a summary of findings and recommendations.
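    The QCM curve itself is just a plot of unit cost against a quality index. A toy sketch (the functional form and every parameter are invented, not from the report) shows how such a curve could be generated for one process-material combination:

```python
def unit_cost(quality, base_cost=100.0, defect_scale=0.05):
    """Toy cost-quality relationship: as the required quality index rises,
    process yield falls and the cost per acceptable part grows.
    All parameters are illustrative stand-ins."""
    yield_frac = max(1e-6, 1.0 - defect_scale * quality ** 2)
    return base_cost / yield_frac

# QCM-style curve: (quality index, unit cost) pairs for one process
curve = [(q / 10, unit_cost(q / 10)) for q in range(0, 41)]
```

    Comparing two such curves, one per candidate process, is the kind of affordability tradeoff the QCM tool formalizes; refinements to the process model shift the curve rather than a single point.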

  11. SOUTH ELEVATION AND DETAILS OF MAIN PROCESSING BUILDING (CPP601). INL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SOUTH ELEVATION AND DETAILS OF MAIN PROCESSING BUILDING (CPP-601). INL DRAWING NUMBER 200-0601-00-291-103082. ALTERNATE ID NUMBER 542-12-B-76. - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  12. BUILDING DETAILS AND SECTIONS OF MAIN PROCESSING BUILDING (CPP601). INL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    BUILDING DETAILS AND SECTIONS OF MAIN PROCESSING BUILDING (CPP-601). INL DRAWING NUMBER 200-0601-00-291-103080. ALTERNATE ID NUMBER 542-11-B-74. - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  13. STRUCTURAL DETAILS AND SECTIONS OF MAIN PROCESSING BUILDING (CPP601). INL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    STRUCTURAL DETAILS AND SECTIONS OF MAIN PROCESSING BUILDING (CPP-601). INL DRAWING NUMBER 200-0601-00-291-103079. ALTERNATE ID NUMBER 542-11-B-73. - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  14. The Devil is in the Concepts: Lessons Learned from World War II Planning Staffs for Transitioning from Conceptual to Detailed Planning

    DTIC Science & Technology

    2017-05-25

    the planning process. Current US Army doctrine links conceptual planning to the Army Design Methodology and detailed planning to the Military...Decision Making Process. By associating conceptual and detailed planning with doctrinal methodologies, it is easy to regard the transition as a set period...plans into detailed directives resulting in changes to the operational environment. 15. SUBJECT TERMS Design; Army Design Methodology; Conceptual

  15. Laboratory cost control and financial management software.

    PubMed

    Mayer, M

    1998-02-09

    Economic constraints within the health care system advocate the introduction of tighter control of costs in clinical laboratories. Detailed cost information forms the basis for cost control and financial management. Based on the cost information, proper decisions regarding priorities, procedure choices, personnel policies and investments can be made. This presentation outlines some principles of cost analysis, describes common limitations of cost analysis, and exemplifies the use of software to achieve optimized cost control. One commercially available cost analysis software package, LabCost, is described in some detail. In addition to provision of cost information, LabCost also serves as a general management tool for resource handling, accounting, inventory management and billing. The application of LabCost in the selection process of a new high throughput analyzer for a large clinical chemistry service is taken as an example for decisions that can be assisted by cost evaluation. It is concluded that laboratory management that wisely utilizes cost analysis to support the decision-making process will undoubtedly have a clear advantage over those laboratories that fail to employ cost considerations to guide their actions.

  16. Optical coherence tomography angiography monitors human cutaneous wound healing over time.

    PubMed

    Deegan, Anthony J; Wang, Wendy; Men, Shaojie; Li, Yuandong; Song, Shaozhen; Xu, Jingjiang; Wang, Ruikang K

    2018-03-01

    In vivo imaging of the complex cascade of events known to be pivotal elements in the healing of cutaneous wounds is a difficult but essential task. Current techniques are highly invasive, or lack the level of vascular and structural detail required for accurate evaluation, monitoring and treatment. We aimed to use an advanced optical coherence tomography (OCT)-based angiography (OCTA) technique for the non-invasive, high-resolution imaging of cutaneous wound healing. We used a clinical prototype OCTA to image, identify and track key vascular and structural adaptations known to occur throughout the healing process. Specific vascular parameters, such as diameter and density, were measured to aid our interpretations under a spatiotemporal framework. We identified multiple distinct, yet overlapping stages: hemostasis, inflammation, proliferation, and remodeling, and demonstrated the detailed vascularization and anatomical attributes underlying the multifactorial processes of dermatologic wound healing. OCTA provides an opportunity to both qualitatively and quantitatively assess the vascular response to acute cutaneous damage and in the future, may help to ascertain wound severity and possible healing outcomes; thus, enabling more effective treatment options.

  17. Optical coherence tomography angiography monitors human cutaneous wound healing over time

    PubMed Central

    Deegan, Anthony J.; Wang, Wendy; Men, Shaojie; Li, Yuandong; Song, Shaozhen; Xu, Jingjiang

    2018-01-01

    Background In vivo imaging of the complex cascade of events known to be pivotal elements in the healing of cutaneous wounds is a difficult but essential task. Current techniques are highly invasive, or lack the level of vascular and structural detail required for accurate evaluation, monitoring and treatment. We aimed to use an advanced optical coherence tomography (OCT)-based angiography (OCTA) technique for the non-invasive, high-resolution imaging of cutaneous wound healing. Methods We used a clinical prototype OCTA to image, identify and track key vascular and structural adaptations known to occur throughout the healing process. Specific vascular parameters, such as diameter and density, were measured to aid our interpretations under a spatiotemporal framework. Results We identified multiple distinct, yet overlapping stages: hemostasis, inflammation, proliferation, and remodeling, and demonstrated the detailed vascularization and anatomical attributes underlying the multifactorial processes of dermatologic wound healing. Conclusions OCTA provides an opportunity to both qualitatively and quantitatively assess the vascular response to acute cutaneous damage and in the future, may help to ascertain wound severity and possible healing outcomes; thus, enabling more effective treatment options. PMID:29675355

  18. Radio-guided sentinel lymph node identification by lymphoscintigraphy fused with an anatomical vector profile: clinical applications.

    PubMed

    Niccoli Asabella, A; Antonica, F; Renna, M A; Rubini, D; Notaristefano, A; Nicoletti, A; Rubini, G

    2013-12-01

    We developed a method to fuse lymphoscintigraphic images with an adaptable anatomical vector profile and evaluated its role in clinical practice. We used Adobe Illustrator CS6 to create different vector profiles and fused those profiles, using Adobe Photoshop CS6, with lymphoscintigraphic images of the patient. We processed 197 lymphoscintigraphies performed in patients with cutaneous melanomas, breast cancer or delayed lymph drainage. Our models can be adapted to every patient attitude or position and contain different levels of anatomical detail, ranging from external body profiles to internal anatomical structures such as bones, muscles, vessels, and lymph nodes. If needed, new anatomical details can be added and embedded in the profile without redrawing, saving considerable time. Details can also be easily hidden, allowing the physician to view only relevant information and structures. Fusion times are about 85 s. The diagnostic confidence of the observers increased significantly. The validation process showed only a slight shift (mean 4.9 mm). We have created a new, practical, inexpensive digital technique based on commercial software for fusing lymphoscintigraphic images with built-in anatomical reference profiles. It is easily reproducible and does not alter the original scintigraphic image. Our method allows a more meaningful interpretation of lymphoscintigraphies, easier recognition of the anatomical site and better lymph node dissection planning.

  19. Evaluation of biofouling in stainless microfluidic channels for implantable multilayered dialysis device

    NASA Astrophysics Data System (ADS)

    Ota, Takashi; To, Naoya; Kanno, Yoshihiko; Miki, Norihisa

    2017-06-01

    An implantable artificial kidney can markedly improve the quality of life of renal disease patients. Our group has developed an implantable multilayered dialysis system consisting of microfluidic channels and dialysis membranes. Long-term evaluation is necessary for implant devices where biofouling is a critical factor, culminating in the deterioration of dialysis performance. Our previous work revealed that surface conditions, which depend on the manufacturing process, determine the amount of biofouling, and that electrolytic etching is the most suitable technique for forming a channel wall free of biofouling. In this study, we investigated the electrolytic etching conditions in detail. We conducted in vitro experiments for 7 d and evaluated the adhesion of biomaterials by scanning electron microscopy. The experiments revealed that a surface mirror-finished by electrolytic etching effectively prevents biofouling.

  20. Capacity of clinical pathways--a strategic multi-level evaluation tool.

    PubMed

    Cardoen, Brecht; Demeulemeester, Erik

    2008-12-01

    In this paper we strategically evaluate the efficiency of clinical pathways and their complex interdependencies with respect to joint resource usage and patient throughput. We propose a discrete-event simulation approach that allows for the simultaneous evaluation of multiple clinical pathways and the inherent uncertainty (resource, duration and arrival) that accompanies medical processes. Both the consultation suite and the surgery suite may be modeled and examined in detail by means of sensitivity or scenario analyses. Since each medical facility can, in essence, be represented as a combination of clinical pathways (i.e., facilities are conceptually similar), the simulation model is generic in nature. Next to the formulation of the model, we illustrate its applicability by means of a case study conducted in a Belgian hospital.
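
    A minimal discrete-event sketch of the approach, with two clinical pathways sharing a single operating room, might look as follows; the pathway names, arrival rate, and durations are invented for illustration and are not the authors' model.

```python
import heapq
import random

def simulate(n_patients=200, seed=1):
    """Minimal discrete-event sketch: two clinical pathways ('hip', 'knee')
    share one operating room; the consultation suite is uncapacitated.
    All rates and durations are invented for illustration."""
    rng = random.Random(seed)
    events, seq, t = [], 0, 0.0
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / 100.0)          # mean 100 min between arrivals
        pw = rng.choice(("hip", "knee"))
        heapq.heappush(events, (t, seq, "arrive", pw)); seq += 1
    or_free_at = 0.0
    throughput = {"hip": 0, "knee": 0}
    waits = []
    while events:
        now, _, kind, pw = heapq.heappop(events)
        if kind == "arrive":                       # consultation, then queue for OR
            consult = rng.uniform(10.0, 20.0)
            heapq.heappush(events, (now + consult, seq, "need_or", pw)); seq += 1
        else:                                      # first-come-first-served shared OR
            start = max(now, or_free_at)
            waits.append(start - now)
            or_free_at = start + (90.0 if pw == "hip" else 60.0)
            throughput[pw] += 1
    return throughput, sum(waits) / len(waits)
```

    Sensitivity or scenario analyses in this spirit would vary the arrival rates, durations, or resource capacities and compare throughput and waiting times across runs.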

  1. Exposure pathway evaluations for sites that processed asbestos-contaminated vermiculite.

    PubMed

    Anderson, Barbara A; Dearwent, Steve M; Durant, James T; Dyken, Jill J; Freed, Jennifer A; Moore, Susan McAfee; Wheeler, John S

    2005-01-01

    The Agency for Toxic Substances and Disease Registry (ATSDR) is currently evaluating the potential public health impacts associated with the processing of asbestos-contaminated vermiculite at various facilities around the country. Vermiculite ore contaminated with significant levels of asbestos was mined and milled in Libby, Montana, from the early 1920s until 1990. The majority of the Libby ore was then shipped to processing facilities for exfoliation. ATSDR initiated the National Asbestos Exposure Review (NAER) to identify and evaluate exposure pathways associated with these processing facilities. This manuscript details ATSDR's phased approach in addressing exposure potential around these sites. As this is an ongoing project, only the results from a selected set of completed site analyses are presented. Historical occupational exposures are the most significant exposure pathway for the site evaluations completed to date. Former workers also probably brought asbestos fibers home on their clothing, shoes, and hair, and their household contacts may have been exposed. Currently, most site-related worker and community exposure pathways have been eliminated. One community exposure pathway of indeterminate significance is the current exposure of individuals through direct contact with waste rock brought home for personal use as fill material, driveway surfacing, or soil amendment. Trace levels of asbestos are present in soil at many of the sites and buried waste rock has been discovered at a few sites; therefore, future worker and community exposure associated with disturbing on-site soil during construction or redevelopment at these sites is also a potential exposure pathway.

  2. Heuristic Evaluation of Ehealth Interventions: Establishing Standards That Relate to the Therapeutic Process Perspective

    PubMed Central

    Baumel, Amit; Muench, Fred

    2016-01-01

    In recent years, the number of available eHealth interventions aimed at treating behavioral and mental health challenges has been growing. From the perspective of health care providers, there is a need for eHealth interventions to be evaluated prior to clinical trials and for the limited resources allocated to empirical research to be invested in the most promising products. Following a literature review, a gap was found in the availability of eHealth interventions evaluation principles related to the patient experience of the therapeutic process. This paper introduces principles and concepts for the evaluation of eHealth interventions developed as a first step in a process to outline general evaluation guidelines that relate to the clinical context from health care providers’ perspective. Our approach was to conduct a review of literature that relates to the examination of eHealth interventions. We identified the literature that was most relevant to our study and used it to define guidelines that relate to the clinical context. We then compiled a list of heuristics we found to be useful for the evaluation of eHealth intervention products’ suitability for empirical examination. Four heuristics were identified with respect to the therapeutic process: (1) the product’s ease of use (ie, usability), (2) the eHealth intervention’s compatibility with the clinical setting, (3) the presence of tools that make it easier for the user to engage in therapeutic activities, and (4) the provision of a feasible therapeutic pathway to growth. We then used this set of heuristics to conduct a detailed examination of MyFitnessPal. This line of work could help to set the bar higher for product developers and to inform health care providers about preferred eHealth intervention designs. PMID:26764209

  3. Heuristic Evaluation of Ehealth Interventions: Establishing Standards That Relate to the Therapeutic Process Perspective.

    PubMed

    Baumel, Amit; Muench, Fred

    2016-01-13

    In recent years, the number of available eHealth interventions aimed at treating behavioral and mental health challenges has been growing. From the perspective of health care providers, there is a need for eHealth interventions to be evaluated prior to clinical trials and for the limited resources allocated to empirical research to be invested in the most promising products. Following a literature review, a gap was found in the availability of eHealth interventions evaluation principles related to the patient experience of the therapeutic process. This paper introduces principles and concepts for the evaluation of eHealth interventions developed as a first step in a process to outline general evaluation guidelines that relate to the clinical context from health care providers' perspective. Our approach was to conduct a review of literature that relates to the examination of eHealth interventions. We identified the literature that was most relevant to our study and used it to define guidelines that relate to the clinical context. We then compiled a list of heuristics we found to be useful for the evaluation of eHealth intervention products' suitability for empirical examination. Four heuristics were identified with respect to the therapeutic process: (1) the product's ease of use (ie, usability), (2) the eHealth intervention's compatibility with the clinical setting, (3) the presence of tools that make it easier for the user to engage in therapeutic activities, and (4) the provision of a feasible therapeutic pathway to growth. We then used this set of heuristics to conduct a detailed examination of MyFitnessPal. This line of work could help to set the bar higher for product developers and to inform health care providers about preferred eHealth intervention designs.

  4. Analysis and modeling of leakage current sensor under pulsating direct current

    NASA Astrophysics Data System (ADS)

    Li, Kui; Dai, Yihua; Wang, Yao; Niu, Feng; Chen, Zhao; Huang, Shaopo

    2017-05-01

    In this paper, the transformation characteristics of a current sensor under pulsating DC leakage current are investigated. A mathematical model of the current sensor is proposed to accurately describe the secondary-side current and the excitation current. The transformation process of the current sensor is illustrated in detail and the transformation error is analyzed from multiple aspects. A simulation model is built and a sensor prototype is designed for comparative evaluation, and both simulation and experimental results are presented to verify the correctness of the theoretical analysis.

  5. Safety assessment, detection and traceability, and societal aspects of genetically modified foods. European Network on Safety Assessment of Genetically Modified Food Crops (ENTRANSFOOD). Concluding remarks.

    PubMed

    Kuiper, H A; König, A; Kleter, G A; Hammes, W P; Knudsen, I

    2004-07-01

    The most important results from the EU-sponsored ENTRANSFOOD Thematic Network project are reviewed, including the design of a detailed step-wise procedure for the risk assessment of foods derived from genetically modified crops based on the latest scientific developments, evaluation of topical risk assessment issues, and the formulation of proposals for improved risk management and public involvement in the risk analysis process. Copyright 2004 Elsevier Ltd.

  6. A Proposal for an Australian Hydrodynamics Laboratory.

    DTIC Science & Technology

    1981-05-01

    [Table fragment: listing of overseas towing-tank facilities in Spain (Canal de Experiencias Hidrodinámicas, Madrid) and Sweden (Swedish State facility).] ...results of research on topics related to ship design. Currently, detailed design information on ship hydrodynamics available to Naval Architects in... relating to the introduction, continuous updating, and improvement of instrumentation and data acquisition and processing systems. 2. The evaluation and

  7. Status and Trend of Automotive Power Packaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang, Zhenxian

    2012-01-01

    Comprehensive requirements in aspects of cost, reliability, efficiency, form factor, weight, and volume for power electronics modules in modern electric drive vehicles have driven the development of automotive power packaging technology intensively. Innovation in materials, interconnections, and processing techniques is leading to enormous improvements in power modules. In this paper, the technical development of and trends in power module packaging are evaluated by examining technical details with examples of industrial products. The issues and development directions for future automotive power module packaging are also discussed.

  8. Fabrication of fuel pin assemblies, phase 3

    NASA Technical Reports Server (NTRS)

    Keeton, A. R.; Stemann, L. G.

    1972-01-01

    Five full size and eight reduced length fuel pins were fabricated for irradiation testing to evaluate design concepts for a fast spectrum lithium cooled compact space power reactor. These assemblies consisted of uranium mononitride fuel pellets encased in a T-111 (Ta-8W-2Hf) clad with a tungsten barrier separating fuel and clad. Fabrication procedures were fully qualified by process development and assembly qualification tests. Detailed specifications and procedures were written for the fabrication and assembly of prototype fuel pins.

  9. Yaw rate control of an air bearing vehicle

    NASA Technical Reports Server (NTRS)

    Walcott, Bruce L.

    1989-01-01

    The results of a 6-week project which focused on the problem of controlling the yaw (rotational) rate of the air bearing vehicle used on NASA's flat floor facility are summarized. Contained within is a listing of the equipment available for task completion and an evaluation of the suitability of this equipment. The identification (modeling) process for the air bearing vehicle is detailed, as well as the subsequent closed-loop control strategy. The effectiveness of the solution is discussed and further recommendations are included.
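
    A closed-loop yaw-rate controller of the kind described could be sketched as follows, assuming a simple first-order plant I*dw/dt = tau - b*w under proportional-integral control; every parameter value is an illustrative assumption, not the identified model of the air bearing vehicle.

```python
def track_yaw_rate(omega_ref=0.2, inertia=50.0, damping=2.0,
                   kp=400.0, ki=150.0, dt=0.01, t_end=20.0):
    """PI control of yaw rate for a first-order plant I*dw/dt = tau - b*w,
    integrated with forward Euler. All parameters are illustrative, not the
    identified model of the actual air bearing vehicle."""
    omega, integral = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        error = omega_ref - omega            # rad/s tracking error
        integral += error * dt
        tau = kp * error + ki * integral     # commanded torque
        omega += (tau - damping * omega) / inertia * dt
    return omega
```

    With the illustrative gains above, the simulated yaw rate settles to the commanded reference; gain selection in practice would follow from the identified plant model.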

  10. Three-dimensional laser window formation

    NASA Technical Reports Server (NTRS)

    Verhoff, Vincent G.

    1992-01-01

    The NASA Lewis Research Center has developed and implemented a unique process for forming flawless three-dimensional laser windows. These windows represent a major part of specialized, nonintrusive laser data acquisition systems used in a variety of compressor and turbine research test facilities. This report discusses in detail the aspects of three-dimensional laser window formation. It focuses on the unique methodology and the peculiarities associated with the formation of these windows. Included in this discussion are the design criteria, bonding mediums, and evaluation testing for three-dimensional laser windows.

  11. A computational reinvestigation of the formation of N-alkylpyrroles via intermolecular redox amination.

    PubMed

    Xue, Xiaosong; Yu, Ao; Cai, Yu; Cheng, Jin-Pei

    2011-11-18

    A detailed mechanism of N-alkylpyrrole formation from 3-pyrroline and 2-phenylpropanal in the presence of a Brønsted acid catalyst was investigated in depth using the MP2 and DFT theories. The two mechanisms proposed in the recent literature for this internal redox process were evaluated and were found not to account fully for the transition state and the energetic barrier of its formation. Based on the present calculations, a new mechanism was put forth.

  12. RCRA, superfund and EPCRA hotline training module. Introduction to: Hazardous waste identification (40 cfr part 261) updated July 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-07-01

    The module introduces a specific hazardous waste identification process, which involves asking and analyzing a series of questions about any waste being evaluated. It analyzes in detail the Resource Conservation and Recovery Act (RCRA) definition of "hazardous waste." It explains concepts that are essential to identifying a RCRA hazardous waste: hazardous waste listing, hazardous waste characteristics, the "mixture" and "derived-from" rules, the "contained-in" policy, and the hazardous waste identification rules (HWIR).

  13. Applications of advanced transport aircraft in developing countries

    NASA Technical Reports Server (NTRS)

    Gobetz, F. W.; Assarabowski, R. J.; Leshane, A. A.

    1978-01-01

    Four representative market scenarios were studied to evaluate the relative performance of air- and surface-based transportation systems in meeting the needs of two developing countries, Brazil and Indonesia, which were selected for detailed case studies. The market scenarios were: remote mining, low-density transport, tropical forestry, and large cargo aircraft serving processing centers in resource-rich, remote areas. The long-term potential of various aircraft types, together with fleet requirements and necessary technology advances, is determined for each application.

  14. [Current topics on cancer biology and research strategies for anti-cancer traditional Chinese medicine].

    PubMed

    Chen, Xiu-ping; Tang, Zheng-hai; Shi, Zhe; Lu, Jin-jian; Su, Huan-xing; Chen, Xin; Wang, Yi-tao

    2015-09-01

    Cancer, an abnormal cell proliferation resulting from multiple factors, has the highest morbidity and mortality among all serious diseases. Considerable progress has been made in cancer biology in recent years. Tumor immunology, cancer stem cells (CSCs), autophagy, and epithelial-mesenchymal transition (EMT) have become hot topics of interest in this area. Detailed dissection of these biological processes will provide novel directions, targets, and strategies for the pharmacological evaluation, mechanism elucidation, and new drug development of traditional Chinese medicine.

  15. PLAN SECTIONS AND DETAILS OF CELL HATCHES MAIN PROCESSING BUILDING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PLAN SECTIONS AND DETAILS OF CELL HATCHES MAIN PROCESSING BUILDING (CPP-601). INL DRAWING NUMBER 200-0601-00-291-103256. ALTERNATE ID NUMBER 542-11-F-302. - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  16. Appropriate prescribing in nursing homes demonstration project (APDP) study protocol: pragmatic, cluster-randomized trial and mixed methods process evaluation of an Ontario policy-maker initiative to improve appropriate prescribing of antipsychotics.

    PubMed

    Desveaux, Laura; Gomes, Tara; Tadrous, Mina; Jeffs, Lianne; Taljaard, Monica; Rogers, Jess; Bell, Chaim M; Ivers, Noah M

    2016-03-29

    Antipsychotic medications are routinely prescribed in nursing homes to address the behavioral and psychological symptoms of dementia. Unfortunately, inappropriate prescribing of antipsychotic medications is common and associated with increased morbidity, adverse drug events, and hospitalizations. Multifaceted interventions can achieve a 12-20% reduction in antipsychotic prescribing levels in nursing homes. Effective interventions have featured educational outreach and ongoing performance feedback. This pragmatic, cluster-randomized controlled trial and embedded process evaluation seek to determine the effect of adding academic detailing to audit and feedback on prescribing of antipsychotic medications in nursing homes, compared with audit and feedback alone. Nursing homes within pre-determined regions of Ontario, Canada, are eligible if they express an interest in the intervention. The academic detailing intervention will be delivered by registered health professionals following an intensive training program including relevant clinical issues and techniques to support health professional behavior change. Physicians in both groups will have the opportunity to access confidential reports summarizing their prescribing patterns for antipsychotics in comparison to the local and provincial average. Participating homes will be allocated to one of the two arms of the study (active/full intervention versus standard audit and feedback) in two waves, with a 2:1 allocation ratio. Homes will be randomized after stratifying for geography, baseline antipsychotic prescription rates, and size, to ensure a balance of characteristics. The primary outcome is antipsychotic dispensing in nursing homes, measured 6 months after allocation; secondary outcomes include clinical outcomes and healthcare utilization. Policy-makers and the public have taken note that antipsychotics are used in nursing homes in Ontario far more than in other jurisdictions.
    Academic detailing can be an effective technique to address challenges in appropriate prescribing in nursing homes, but effect sizes vary widely. This opportunistic, policy-driven evaluation, embedded within a government-initiated demonstration project, was designed to ensure policy-makers receive the best evidence possible regarding whether and how to scale up the intervention. ClinicalTrials.gov NLM Identifier: NCT02604056.
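
    The stratified 2:1 allocation described above can be sketched as follows; the stratum key and the block scheme are illustrative assumptions, not the trial's exact randomization procedure.

```python
import random

def allocate(homes, seed=42):
    """Stratified 2:1 randomization sketch: within each stratum, homes are
    shuffled and assigned to the full intervention vs audit-and-feedback
    alone in a 2:1 ratio. The stratum key (region, baseline-rate tertile,
    size tertile) is an illustrative assumption, not the trial's scheme."""
    rng = random.Random(seed)
    strata = {}
    for home_id, region, rate_tertile, size_tertile in homes:
        strata.setdefault((region, rate_tertile, size_tertile), []).append(home_id)
    arms = {}
    for members in strata.values():
        rng.shuffle(members)
        for i, hid in enumerate(members):
            arms[hid] = "intervention" if i % 3 < 2 else "control"
    return arms
```

    Stratifying before allocation keeps the two arms balanced on geography, baseline prescribing, and home size even in small waves.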

  17. Evaluating the effectiveness of a practical inquiry-based learning bioinformatics module on undergraduate student engagement and applied skills.

    PubMed

    Brown, James A L

    2016-05-06

    A pedagogic intervention, in the form of an inquiry-based peer-assisted learning project (as a practical student-led bioinformatics module), was assessed for its ability to increase students' engagement, practical bioinformatic skills and process-specific knowledge. Elements assessed were process-specific knowledge following module completion, qualitative student-based module evaluation and the novelty, scientific validity and quality of written student reports. Bioinformatics is often the starting point for laboratory-based research projects; therefore, high importance was placed on allowing students to individually develop and apply processes and methods of scientific research. Students led a bioinformatic inquiry-based project (within a framework of inquiry), discovering, justifying and exploring individually discovered research targets. Detailed assessable reports were produced, displaying data generated and the resources used. Mimicking research settings, undergraduates were divided into small collaborative groups, with distinctive central themes. The module was evaluated by assessing the quality and originality of the students' targets through reports, reflecting students' use and understanding of concepts and tools required to generate their data. Furthermore, evaluation of the bioinformatic module was assessed semi-quantitatively using pre- and post-module quizzes (a non-assessable activity, not contributing to their grade), which incorporated process- and content-specific questions (indicative of their use of the online tools). Qualitative assessment of the teaching intervention was performed using post-module surveys, exploring student satisfaction and other module-specific elements. Overall, a positive experience was found, as was a post-module increase in correct process-specific answers. In conclusion, an inquiry-based peer-assisted learning module increased students' engagement, practical bioinformatic skills and process-specific knowledge.
    © 2016 The International Union of Biochemistry and Molecular Biology, 44:304-313, 2016.

  18. Through the eye of the needle: a review of isotope approaches to quantify microbial processes mediating soil carbon balance.

    PubMed

    Paterson, Eric; Midwood, Andrew J; Millard, Peter

    2009-01-01

    For soils in carbon balance, losses of soil carbon from biological activity are balanced by organic inputs from vegetation. Perturbations, such as climate or land use change, have the potential to disrupt this balance and alter soil-atmosphere carbon exchanges. As the quantification of soil organic matter stocks is an insensitive means of detecting changes, certainly over short timescales, there is a need to apply methods that facilitate a quantitative understanding of the biological processes underlying soil carbon balance. We outline the processes by which plant carbon enters the soil and critically evaluate isotopic methods to quantify them. Then, we consider the balancing CO2 flux from soil and detail the importance of partitioning the sources of this flux into those from recent plant assimilate and those from native soil organic matter. Finally, we consider the interactions between the inputs of carbon to soil and the losses from soil mediated by biological activity. We emphasize the key functional role of the microbiota in the concurrent processing of carbon from recent plant inputs and native soil organic matter. We conclude that quantitative isotope labelling and partitioning methods, coupled to those for the quantification of microbial community substrate use, offer the potential to resolve the functioning of the microbial control point of soil carbon balance in unprecedented detail.
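
    The source-partitioning step described above is commonly expressed as a two-source isotope mixing mass balance; the sketch below applies that standard identity with illustrative delta-13C values (a C4-derived label on a C3 soil).

```python
def partition_flux(delta_total, delta_plant, delta_som):
    """Two-source isotope mixing mass balance: the fraction of total soil CO2
    efflux derived from recent plant assimilate is
        f = (d_total - d_som) / (d_plant - d_som).
    A standard identity; the delta-13C values used below are illustrative."""
    return (delta_total - delta_som) / (delta_plant - delta_som)

# e.g. a C4-derived label (-12 permil) on a C3 soil (-27 permil): a measured
# efflux of -21 permil implies that 40% of the flux is plant-derived.
f = partition_flux(-21.0, -12.0, -27.0)
```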

  19. Robust perception algorithms for road and track autonomous following

    NASA Astrophysics Data System (ADS)

    Marion, Vincent; Lecointe, Olivier; Lewandowski, Cecile; Morillon, Joel G.; Aufrere, Romuald; Marcotegui, Beatrix; Chapuis, Roland; Beucher, Serge

    2004-09-01

    The French Military Robotic Study Program (introduced in Aerosense 2003), sponsored by the French Defense Procurement Agency and managed by Thales Airborne Systems as the prime contractor, focuses on about 15 robotic themes, which can provide an immediate "operational add-on value." The paper details the "road and track following" theme (named AUT2), whose main purpose was to develop a vision-based subsystem to automatically detect roadsides of an extended range of roads and tracks suitable for military missions. To achieve the goal, efforts focused on three main areas: (1) Improvement of image quality at algorithm inputs, thanks to the selection of adapted video cameras, and the development of a THALES patented algorithm: it removes in real time most of the disturbing shadows in images taken in natural environments, enhances contrast and lowers reflection effects due to films of water. (2) Selection and improvement of two complementary algorithms (one is segment oriented, the other region based). (3) Development of a fusion process between both algorithms, which feeds in real time a road model with the best available data. Each previous step has been developed so that the global perception process is reliable and safe: as an example, the process continuously evaluates itself and outputs confidence criteria qualifying roadside detection. The paper presents the processes in detail, along with the results of the military acceptance tests that were passed, which trigger the next step: autonomous track following (named AUT3).

  20. Usability Prediction & Ranking of SDLC Models Using Fuzzy Hierarchical Usability Model

    NASA Astrophysics Data System (ADS)

    Gupta, Deepak; Ahlawat, Anil K.; Sagar, Kalpna

    2017-06-01

    Evaluation of software quality is an important aspect of controlling and managing software; through such evaluation, improvements in the software process can be made. Software quality depends significantly on software usability. Many researchers have proposed usability models; each considers a set of usability factors, but none covers all usability aspects. Practical implementation of these models is still missing, as there is no precise definition of usability, and it is very difficult to integrate these models into current software engineering practices. To overcome these challenges, this paper aims to define the term `usability' using the proposed hierarchical usability model with its detailed taxonomy. The taxonomy considers generic evaluation criteria for identifying the quality components, bringing together factors, attributes and characteristics defined in various HCI and software models. For the first time, the usability model is also implemented to predict more accurate usability values. The proposed system, named the fuzzy hierarchical usability model, can be easily integrated into current software engineering practices. To validate the work, a dataset of six software development life cycle models is created and employed, and these models are ranked according to their predicted usability values. This research also presents a detailed comparison of the proposed model with existing usability models.
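    The ranking idea can be sketched as a hierarchical weighted aggregation of per-factor scores. The factor names, weights, scores, and model list below are hypothetical and deliberately simplified (no fuzzification step); they are not the paper's actual taxonomy.

```python
# Sketch of a hierarchical usability score: aggregate per-factor scores into
# one usability value per SDLC model, then rank models by that value.
# Factors, weights, and scores are hypothetical illustrations only.

WEIGHTS = {"effectiveness": 0.30, "efficiency": 0.25,
           "satisfaction": 0.25, "learnability": 0.20}

def usability(scores):
    """Aggregate per-factor scores (each in 0..1) into a usability value."""
    return sum(WEIGHTS[f] * scores[f] for f in WEIGHTS)

models = {
    "waterfall": {"effectiveness": 0.6, "efficiency": 0.5,
                  "satisfaction": 0.4, "learnability": 0.7},
    "spiral":    {"effectiveness": 0.7, "efficiency": 0.6,
                  "satisfaction": 0.6, "learnability": 0.5},
}
ranking = sorted(models, key=lambda m: usability(models[m]), reverse=True)
print(ranking)  # model with the higher predicted usability first
```

    A fuzzy variant would replace the crisp scores with membership functions and the weighted sum with fuzzy inference, but the ranking step at the end is the same.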

  1. Academic Detailing Has a Positive Effect on Prescribing and Decreasing Prescription Drug Costs: A Health Plan's Perspective

    PubMed Central

    Ndefo, Uche Anadu; Norman, Rolicia; Henry, Andrea

    2017-01-01

    Background When initiated by a health plan, academic detailing can be used to change prescribing practices, which can lead to increased safety and savings. Objective To evaluate the impact of academic detailing on the prescribing and prescription drug costs of cefixime to a health plan. Methods A prospective intervention study evaluated the prescribing practices and prescription drug costs of cefixime. A total of 11 prescribers were detailed by 1 pharmacist between August 2014 and March 2015. Two of the 11 prescribers did not respond to the academic detailing and were not followed up. The physicians' prescribing habits and prescription costs were compared before and after detailing to evaluate the effectiveness of the intervention. Data were collected for approximately 5 months before and after the intervention. Each prescriber served as his or her own control. Results Overall, an approximate 36% reduction in the number of cefixime prescriptions written and an approximate 20% decrease in prescription costs were seen with academic detailing compared with the year before the intervention. In 9 of the 11 (82%) prescribers, the academic detailing intervention was successful and resulted in fewer prescriptions for cefixime during the study period. Conclusion Academic detailing had a positive impact on prescribing by decreasing the number of cefixime prescriptions and lowering the drug costs to the health plan. PMID:28626509

  2. Control of Technology Transfer at JPL

    NASA Technical Reports Server (NTRS)

    Oliver, Ronald

    2006-01-01

    Controlled Technology: 1) Design: preliminary or critical design data, schematics, technical flow charts, SNV code/diagnostics, logic flow diagrams, wirelist, ICDs, detailed specifications or requirements. 2) Development: constraints, computations, configurations, technical analyses, acceptance criteria, anomaly resolution, detailed test plans, detailed technical proposals. 3) Production: process or how-to: assemble, operate, repair, maintain, modify. 4) Manufacturing: technical instructions, specific parts, specific materials, specific qualities, specific processes, specific flow. 5) Operations: how to operate, contingency or standard operating plans, Ops handbooks. 6) Repair: repair instructions, troubleshooting schemes, detailed schematics. 7) Test: specific procedures, data, analysis, detailed test and retest plans, detailed anomaly resolutions, detailed failure causes and corrective actions, troubleshooting, trended test data, flight readiness data. 8) Maintenance: maintenance schedules and plans, methods for regular upkeep, overhaul instructions. 9) Modification: modification instructions, upgrade kit parts, including software

  3. 41 CFR 102-37.230 - What must a letter of intent for obtaining surplus aircraft or vessels include?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... cannibalization, the donee must provide details of the cannibalization process (time to complete the cannibalization process, how recovered parts are to be used, method of accounting for usable parts, disposition of... land, the donee must provide details of the process, including the time to complete the process; and (d...

  4. Enabling Earth Science Through Cloud Computing

    NASA Technical Reports Server (NTRS)

    Hardman, Sean; Riofrio, Andres; Shams, Khawaja; Freeborn, Dana; Springer, Paul; Chafin, Brian

    2012-01-01

    Cloud Computing holds tremendous potential for missions across the National Aeronautics and Space Administration. Several flight missions are already benefiting from an investment in cloud computing for mission critical pipelines and services through faster processing time, higher availability, and drastically lower costs available on cloud systems. However, these processes do not currently extend to general scientific algorithms relevant to earth science missions. The members of the Airborne Cloud Computing Environment task at the Jet Propulsion Laboratory have worked closely with the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to integrate cloud computing into their science data processing pipeline. This paper details the efforts involved in deploying a science data system for the CARVE mission, evaluating and integrating cloud computing solutions with the system and porting their science algorithms for execution in a cloud environment.

  5. Biogasification of Walt Disney World biomass waste blend. Annual report Jan-Dec 82

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biljetina, R.; Chynoweth, D.P.; Janulis, J.

    1983-05-01

    The objective of this research is to develop efficient processes for conversion of biomass-waste blends to methane and other resources. To evaluate the technical and economic feasibility, an experimental test facility (ETU) is being designed and installed at the Reedy Creek Wastewater Treatment Plant at Walt Disney World, Orlando, Florida. The facility will integrate a biomethanogenic conversion process with a wastewater treatment process employing water hyacinth ponds for secondary and tertiary treatment of sewage produced at Walt Disney World. The ETU will be capable of feeding 1 wet ton per day of water hyacinth-sludge blends to the digestion system for production of methane and other byproducts. The detailed design of the facility has been completed and procurement of equipment is in progress.

  6. Description and Evaluation of the Research Ethics Review Process in Japan: Proposed Measures for Improvement.

    PubMed

    Suzuki, Mika; Sato, Keiko

    2016-07-01

    Research Ethics Committees (RECs) are designed to protect human subjects in research. It is essential to recognize whether the RECs are achieving this goal. Several studies have reported on RECs; however, detailed data regarding the quality of research protocols and the review process of RECs have not been reported in Japan. We examine research protocols reviewed by RECs and the review processes at three institutions using a novel checklist we developed. The data show that approximately half of all examined protocols lacked a clearly written "Background" section that defines the study rationale and design. These results reiterate suggestions made in previous research regarding educational programs and support departments that could enhance responsible conduct in clinical research to protect human subjects in Japan. © The Author(s) 2016.

  7. Improving TOGAF ADM 9.1 Migration Planning Phase by ITIL V3 Service Transition

    NASA Astrophysics Data System (ADS)

    Hanum Harani, Nisa; Akhmad Arman, Arry; Maulana Awangga, Rolly

    2018-04-01

    Planning a business transformation that involves new technology requires a systematic transition and migration planning process, and planning the system migration activity is the most important part. The migration process includes complex elements such as business re-engineering, transition scheme mapping, data transformation, application development, individual involvement by computer, and trial interaction. TOGAF ADM is a framework and method for enterprise architecture implementation, and it provides a manual for architecture and migration planning. The planning includes an implementation solution, in this case an IT solution, but when the solution becomes IT operational planning, TOGAF cannot handle it. This paper presents a new framework model that details the transition process by integrating TOGAF and ITIL. We evaluated our model in a field study at a private university.

  8. Short-cut Methods versus Rigorous Methods for Performance-evaluation of Distillation Configurations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramapriya, Gautham Madenoor; Selvarajah, Ajiththaa; Jimenez Cucaita, Luis Eduardo

    Here, this study demonstrates the efficacy of a short-cut method such as the Global Minimization Algorithm (GMA), which uses assumptions of ideal mixtures, constant molar overflow (CMO) and pinched columns, in pruning the search-space of distillation column configurations for zeotropic multicomponent separation, to provide a small subset of attractive configurations with low minimum heat duties. The short-cut method, due to its simplifying assumptions, is computationally efficient, yet reliable in identifying the small subset of useful configurations for further detailed process evaluation. This two-tier approach allows expedient search of the configuration space containing hundreds to thousands of candidate configurations for a given application.
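    The two-tier approach can be sketched as a cheap screening pass that ranks configurations by a short-cut duty estimate and hands only the best few to rigorous simulation. The configuration names and duty values below are hypothetical placeholders, not GMA results.

```python
# Two-tier screening sketch: an inexpensive short-cut estimate prunes the
# large configuration space; only the lowest-duty subset proceeds to detailed
# (rigorous) process evaluation. Duty numbers are hypothetical.

def shortlist(configs, shortcut_duty, keep=3):
    """Rank configurations by their short-cut minimum heat duty estimate
    and return the `keep` most attractive ones for rigorous evaluation."""
    return sorted(configs, key=shortcut_duty)[:keep]

# Hypothetical short-cut minimum heat duties (MW) for five candidates
duties = {"A": 12.4, "B": 9.1, "C": 15.0, "D": 8.7, "E": 11.2}
best = shortlist(duties, shortcut_duty=duties.get, keep=3)
print(best)  # ['D', 'B', 'E']
```

    The value of the screen comes from the short-cut estimate being cheap enough to apply to thousands of candidates while still ordering them reliably near the optimum.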

  9. Short-cut Methods versus Rigorous Methods for Performance-evaluation of Distillation Configurations

    DOE PAGES

    Ramapriya, Gautham Madenoor; Selvarajah, Ajiththaa; Jimenez Cucaita, Luis Eduardo; ...

    2018-05-17

    Here, this study demonstrates the efficacy of a short-cut method such as the Global Minimization Algorithm (GMA), which uses assumptions of ideal mixtures, constant molar overflow (CMO) and pinched columns, in pruning the search-space of distillation column configurations for zeotropic multicomponent separation, to provide a small subset of attractive configurations with low minimum heat duties. The short-cut method, due to its simplifying assumptions, is computationally efficient, yet reliable in identifying the small subset of useful configurations for further detailed process evaluation. This two-tier approach allows expedient search of the configuration space containing hundreds to thousands of candidate configurations for a given application.

  10. Advanced composites structural concepts and materials technologies for primary aircraft structures: Design/manufacturing concept assessment

    NASA Technical Reports Server (NTRS)

    Chu, Robert L.; Bayha, Tom D.; Davis, HU; Ingram, J. ED; Shukla, Jay G.

    1992-01-01

    Composite Wing and Fuselage Structural Design/Manufacturing Concepts have been developed and evaluated. Trade studies were performed to determine how well the concepts satisfy the program goals of 25 percent cost savings, 40 percent weight savings with aircraft resizing, and 50 percent part count reduction as compared to the aluminum Lockheed L-1011 baseline. The concepts, developed using emerging technologies such as large-scale resin transfer molding (RTM), automated tow placement (ATP), braiding, and out-of-autoclave and automated manufacturing processes for both thermoset and thermoplastic materials, were evaluated for possible application in the design concepts. Trade studies were used to determine which concepts would be carried into the detailed design development subtask.

  11. Motivational Interviewing at the Intersections of Depression and Intimate Partner Violence among African American Women

    PubMed Central

    Wahab, Stéphanie; Trimble, Jammie; Mejia, Angie; Mitchell, S. Renee; Thomas, Mary Jo; Timmons, Vanessa; Waters, A. Star; Raymaker, Dora; Nicolaidis, Christina

    2014-01-01

    This article focuses on design, training, and delivery of a culturally-tailored, multi-faceted intervention which used motivational interviewing (MI) and case management to reduce depression severity among African American survivors of intimate partner violence (IPV). We present the details of the intervention and discuss its implementation as a means of creating and providing culturally appropriate depression and violence services to African American women. We used a CBPR approach to develop and evaluate the multi-faceted intervention. As part of the evaluation, we collected process measures about the use of MI, assessed MI fidelity, and interviewed participants about their experiences with the program. PMID:24857557

  12. Digital autopilots: Design considerations and simulator evaluations

    NASA Technical Reports Server (NTRS)

    Osder, S.; Neuman, F.; Foster, J.

    1971-01-01

    The development of a digital autopilot program for a transport aircraft and the evaluation of that system's performance on a transport aircraft simulator are discussed. The digital autopilot includes three-axis attitude stabilization, automatic throttle control, and flight path guidance functions, with emphasis on the mode progression from descent into the terminal area through automatic landing. The study effort involved a sequence of tasks starting with the definition of detailed system block diagrams of the control laws, followed by a flow-charting and programming phase, and concluding with performance verification using the transport aircraft simulation. The autopilot control laws were programmed in FORTRAN IV in order to isolate the design process from requirements peculiar to an individual computer.

  13. FPGA Based Reconfigurable ATM Switch Test Bed

    NASA Technical Reports Server (NTRS)

    Chu, Pong P.; Jones, Robert E.

    1998-01-01

    Various issues associated with "FPGA Based Reconfigurable ATM Switch Test Bed" are presented in viewgraph form. Specific topics include: 1) Network performance evaluation; 2) traditional approaches; 3) software simulation; 4) hardware emulation; 5) test bed highlights; 6) design environment; 7) test bed architecture; 8) abstract shared-memory switch; 9) detailed switch diagram; 10) traffic generator; 11) data collection circuit and user interface; 12) initial results; and 13) the following conclusions: advances in FPGAs make hardware emulation feasible for performance evaluation, and hardware emulation can provide several orders of magnitude of speed-up over software simulation; however, due to the complexity of the hardware synthesis process, development for emulation is much more difficult than for simulation and requires knowledge of both networks and digital design.

  14. Application of a design-build-team approach to low cost and weight composite fuselage structure

    NASA Technical Reports Server (NTRS)

    Ilcewicz, L. B.; Walker, T. H.; Willden, K. S.; Swanson, G. D.; Truslove, G.; Metschan, S. L.; Pfahl, C. L.

    1991-01-01

    Relationships between manufacturing costs and design details must be understood to promote the application of advanced composite technologies to transport fuselage structures. A team approach, integrating the disciplines responsible for aircraft structural design and manufacturing, was developed to perform cost and weight trade studies for a twenty-foot diameter aft fuselage section. Baseline composite design and manufacturing concepts were selected for large quadrant panels in crown, side, and keel areas of the fuselage section. The associated technical issues were also identified. Detailed evaluation of crown panels indicated the potential for large weight savings and costs competitive with aluminum technology in the 1995 timeframe. Different processes and material forms were selected for the various elements that comprise the fuselage structure. Additional cost and weight savings potential was estimated for future advancements.

  15. A rehabilitation intervention to promote physical recovery following intensive care: a detailed description of construct development, rationale and content together with proposed taxonomy to capture processes in a randomised controlled trial.

    PubMed

    Ramsay, Pam; Salisbury, Lisa G; Merriweather, Judith L; Huby, Guro; Rattray, Janice E; Hull, Alastair M; Brett, Stephen J; Mackenzie, Simon J; Murray, Gordon D; Forbes, John F; Walsh, Timothy Simon

    2014-01-29

    Increasing numbers of patients are surviving critical illness, but survival may be associated with a constellation of physical and psychological sequelae that can cause ongoing disability and reduced health-related quality of life. Limited evidence currently exists to guide the optimum structure, timing, and content of rehabilitation programmes. There is a need to both develop and evaluate interventions to support and expedite recovery during the post-ICU discharge period. This paper describes the construct development for a complex rehabilitation intervention intended to promote physical recovery following critical illness. The intervention is currently being evaluated in a randomised trial (ISRCTN09412438; funder Chief Scientists Office, Scotland). The intervention was developed using the Medical Research Council (MRC) framework for developing complex healthcare interventions. We ensured representation from a wide variety of stakeholders including content experts from multiple specialties, methodologists, and patient representation. The intervention construct was initially based on literature review, local observational and audit work, qualitative studies with ICU survivors, and brainstorming activities. Iterative refinement was aided by the publication of a National Institute for Health and Care Excellence guideline (No. 83), publicly available patient stories (Healthtalkonline), a stakeholder event in collaboration with the James Lind Alliance, and local piloting. Modelling and further work involved a feasibility trial and development of a novel generic rehabilitation assistant (GRA) role. Several rounds of external peer review during successive funding applications also contributed to development. 
The final construct for the complex intervention involved a dedicated GRA trained to pre-defined competencies across multiple rehabilitation domains (physiotherapy, dietetics, occupational therapy, and speech/language therapy), with specific training in post-critical illness issues. The intervention was from ICU discharge to 3 months post-discharge, including inpatient and post-hospital discharge elements. Clear strategies to provide information to patients/families were included. A detailed taxonomy was developed to define and describe the processes undertaken, and capture them during the trial. The detailed process measure description, together with a range of patient, health service, and economic outcomes were successfully mapped on to the modified CONSORT recommendations for reporting non-pharmacologic trial interventions. The MRC complex intervention framework was an effective guide to developing a novel post-ICU rehabilitation intervention. Combining a clearly defined new healthcare role with a detailed taxonomy of process and activity enabled the intervention to be clearly described for the purpose of trial delivery and reporting. These data will be useful when interpreting the results of the randomised trial, will increase internal and external trial validity, and help others implement the intervention if the intervention proves clinically and cost effective.

  16. A rehabilitation intervention to promote physical recovery following intensive care: a detailed description of construct development, rationale and content together with proposed taxonomy to capture processes in a randomised controlled trial

    PubMed Central

    2014-01-01

    Background Increasing numbers of patients are surviving critical illness, but survival may be associated with a constellation of physical and psychological sequelae that can cause ongoing disability and reduced health-related quality of life. Limited evidence currently exists to guide the optimum structure, timing, and content of rehabilitation programmes. There is a need to both develop and evaluate interventions to support and expedite recovery during the post-ICU discharge period. This paper describes the construct development for a complex rehabilitation intervention intended to promote physical recovery following critical illness. The intervention is currently being evaluated in a randomised trial (ISRCTN09412438; funder Chief Scientists Office, Scotland). Methods The intervention was developed using the Medical Research Council (MRC) framework for developing complex healthcare interventions. We ensured representation from a wide variety of stakeholders including content experts from multiple specialties, methodologists, and patient representation. The intervention construct was initially based on literature review, local observational and audit work, qualitative studies with ICU survivors, and brainstorming activities. Iterative refinement was aided by the publication of a National Institute for Health and Care Excellence guideline (No. 83), publicly available patient stories (Healthtalkonline), a stakeholder event in collaboration with the James Lind Alliance, and local piloting. Modelling and further work involved a feasibility trial and development of a novel generic rehabilitation assistant (GRA) role. Several rounds of external peer review during successive funding applications also contributed to development. 
Results The final construct for the complex intervention involved a dedicated GRA trained to pre-defined competencies across multiple rehabilitation domains (physiotherapy, dietetics, occupational therapy, and speech/language therapy), with specific training in post-critical illness issues. The intervention was from ICU discharge to 3 months post-discharge, including inpatient and post-hospital discharge elements. Clear strategies to provide information to patients/families were included. A detailed taxonomy was developed to define and describe the processes undertaken, and capture them during the trial. The detailed process measure description, together with a range of patient, health service, and economic outcomes were successfully mapped on to the modified CONSORT recommendations for reporting non-pharmacologic trial interventions. Conclusions The MRC complex intervention framework was an effective guide to developing a novel post-ICU rehabilitation intervention. Combining a clearly defined new healthcare role with a detailed taxonomy of process and activity enabled the intervention to be clearly described for the purpose of trial delivery and reporting. These data will be useful when interpreting the results of the randomised trial, will increase internal and external trial validity, and help others implement the intervention if the intervention proves clinically and cost effective. PMID:24476530

  17. Preliminary results of real-time in-vitro electronic speckle pattern interferometry (ESPI) measurements in otolaryngology

    NASA Astrophysics Data System (ADS)

    Conerty, Michelle D.; Castracane, James; Cacace, Anthony T.; Parnes, Steven M.; Gardner, Glendon M.; Miller, Mitchell B.

    1995-05-01

    Electronic Speckle Pattern Interferometry (ESPI) is a nondestructive optical evaluation technique that is capable of determining surface and subsurface integrity through the quantitative evaluation of static or vibratory motion. By utilizing state of the art developments in the areas of lasers, fiber optics and solid state detector technology, this technique has become applicable in medical research and diagnostics. Based on initial support from NIDCD and continued support from InterScience, Inc., we have been developing a range of instruments for improved diagnostic evaluation in otolaryngological applications based on the technique of ESPI. These compact fiber optic instruments are capable of making real time interferometric measurements of the target tissue. Ongoing development of image post-processing software is currently capable of extracting the desired quantitative results from the acquired interferometric images. The goal of the research is to develop a fully automated system in which the image processing and quantification will be performed in hardware in near real-time. Subsurface details of both the tympanic membrane and vocal cord dynamics could speed the diagnosis of otosclerosis, laryngeal tumors, and aid in the evaluation of surgical procedures.

  18. @neurIST complex information processing toolchain for the integrated management of cerebral aneurysms

    PubMed Central

    Villa-Uriol, M. C.; Berti, G.; Hose, D. R.; Marzo, A.; Chiarini, A.; Penrose, J.; Pozo, J.; Schmidt, J. G.; Singh, P.; Lycett, R.; Larrabide, I.; Frangi, A. F.

    2011-01-01

    Cerebral aneurysms are a multi-factorial disease with severe consequences. A core part of the European project @neurIST was the physical characterization of aneurysms to find candidate risk factors associated with aneurysm rupture. The project investigated measures based on morphological, haemodynamic and aneurysm wall structure analyses for more than 300 cases of ruptured and unruptured aneurysms, extracting descriptors suitable for statistical studies. This paper deals with the unique challenges associated with this task, and the implemented solutions. The consistency of results required by the subsequent statistical analyses, given the heterogeneous image data sources and multiple human operators, was met by a highly automated toolchain combined with training. A testimonial of the successful automation is the positive evaluation of the toolchain by over 260 clinicians during various hands-on workshops. The specification of the analyses required thorough investigations of modelling and processing choices, discussed in a detailed analysis protocol. Finally, an abstract data model governing the management of the simulation-related data provides a framework for data provenance and supports future use of data and toolchain. This is achieved by enabling the easy modification of the modelling approaches and solution details through abstract problem descriptions, removing the need of repetition of manual processing work. PMID:22670202

  19. Pore scale study of multiphase multicomponent reactive transport during CO 2 dissolution trapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Li; Wang, Mengyi; Kang, Qinjun

    Solubility trapping is crucial for permanent CO2 sequestration in deep saline aquifers. For the first time, a pore-scale numerical method is developed to investigate coupled scCO2-water two-phase flow, multicomponent (CO2(aq), H+, HCO3-, CO32- and OH-) mass transport, heterogeneous interfacial dissolution reaction, and homogeneous dissociation reactions. Pore-scale details of evolutions of multiphase distributions and concentration fields are presented and discussed. Time evolutions of several variables, including averaged CO2(aq) concentration, scCO2 saturation, and pH value, are analyzed. Specific interfacial length, an important variable which cannot be determined but is required by continuum models, is investigated in detail. The mass transport coefficient, or effective dissolution rate, is also evaluated. The pore-scale results show strong non-equilibrium characteristics during solubility trapping due to non-uniform distributions of multiphase as well as the slow mass transport process. Complicated coupling mechanisms between multiphase flow, mass transport and chemical reactions are also revealed. Lastly, effects of wettability are also studied. The pore-scale studies provide deep understanding of non-linear non-equilibrium multiple physicochemical processes during CO2 solubility trapping, and also allow quantitative prediction of some important empirical relationships, such as saturation-interfacial surface area, for continuum models.

  20. Pore scale study of multiphase multicomponent reactive transport during CO 2 dissolution trapping

    DOE PAGES

    Chen, Li; Wang, Mengyi; Kang, Qinjun; ...

    2018-04-26

    Solubility trapping is crucial for permanent CO2 sequestration in deep saline aquifers. For the first time, a pore-scale numerical method is developed to investigate coupled scCO2-water two-phase flow, multicomponent (CO2(aq), H+, HCO3-, CO32- and OH-) mass transport, heterogeneous interfacial dissolution reaction, and homogeneous dissociation reactions. Pore-scale details of evolutions of multiphase distributions and concentration fields are presented and discussed. Time evolutions of several variables, including averaged CO2(aq) concentration, scCO2 saturation, and pH value, are analyzed. Specific interfacial length, an important variable which cannot be determined but is required by continuum models, is investigated in detail. The mass transport coefficient, or effective dissolution rate, is also evaluated. The pore-scale results show strong non-equilibrium characteristics during solubility trapping due to non-uniform distributions of multiphase as well as the slow mass transport process. Complicated coupling mechanisms between multiphase flow, mass transport and chemical reactions are also revealed. Lastly, effects of wettability are also studied. The pore-scale studies provide deep understanding of non-linear non-equilibrium multiple physicochemical processes during CO2 solubility trapping, and also allow quantitative prediction of some important empirical relationships, such as saturation-interfacial surface area, for continuum models.

  1. Pore scale study of multiphase multicomponent reactive transport during CO2 dissolution trapping

    NASA Astrophysics Data System (ADS)

    Chen, Li; Wang, Mengyi; Kang, Qinjun; Tao, Wenquan

    2018-06-01

    Solubility trapping is crucial for permanent CO2 sequestration in deep saline aquifers. For the first time, a pore-scale numerical method is developed to investigate coupled scCO2-water two-phase flow, multicomponent (CO2(aq), H+, HCO3-, CO32- and OH-) mass transport, heterogeneous interfacial dissolution reaction, and homogeneous dissociation reactions. Pore-scale details of evolutions of multiphase distributions and concentration fields are presented and discussed. Time evolutions of several variables, including averaged CO2(aq) concentration, scCO2 saturation, and pH value, are analyzed. Specific interfacial length, an important variable which cannot be determined but is required by continuum models, is investigated in detail. The mass transport coefficient, or effective dissolution rate, is also evaluated. The pore-scale results show strong non-equilibrium characteristics during solubility trapping due to non-uniform distributions of multiphase as well as the slow mass transport process. Complicated coupling mechanisms between multiphase flow, mass transport and chemical reactions are also revealed. Finally, effects of wettability are also studied. The pore-scale studies provide deep understanding of non-linear non-equilibrium multiple physicochemical processes during CO2 solubility trapping, and also allow quantitative prediction of some important empirical relationships, such as saturation-interfacial surface area, for continuum models.
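    The homogeneous dissociation chemistry underlying the pH evolution can be illustrated with the first dissociation of dissolved CO2, CO2(aq) + H2O <-> H+ + HCO3-. The sketch below uses a textbook equilibrium constant, assumes [H+] ~ [HCO3-], and neglects the second dissociation; it is an illustration, not the study's reaction model.

```python
import math

# First-dissociation pH sketch for carbonated water:
# with [H+] ~ [HCO3-] and the second dissociation neglected,
# [H+] = sqrt(K1 * [CO2(aq)]).
# K1 and the example concentration are textbook-style values (~25 C),
# not parameters taken from this pore-scale study.

K1 = 4.45e-7  # mol/L, first dissociation constant of carbonic acid

def ph_from_co2(co2_aq):
    """Approximate pH of water equilibrated with dissolved CO2 (mol/L)."""
    h_plus = math.sqrt(K1 * co2_aq)
    return -math.log10(h_plus)

# Roughly CO2-saturated water at 1 atm
print(round(ph_from_co2(0.033), 2))
```

    This is the equilibrium limit; the record's key finding is that pore-scale transport is slow enough that local concentrations, and hence pH, stay far from such equilibrium values.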

  2. A novel process of viral vector barcoding and library preparation enables high-diversity library generation and recombination-free paired-end sequencing

    PubMed Central

    Davidsson, Marcus; Diaz-Fernandez, Paula; Schwich, Oliver D.; Torroba, Marcos; Wang, Gang; Björklund, Tomas

    2016-01-01

    Detailed characterization and mapping of oligonucleotide function in vivo is generally a very time-consuming effort that only allows for hypothesis-driven subsampling of the full sequence to be analysed. Recent advances in deep sequencing, together with highly efficient parallel oligonucleotide synthesis and cloning techniques, have, however, opened up entirely new ways to map genetic function in vivo. Here we present a novel, optimized protocol for the generation of universally applicable, barcode-labelled plasmid libraries. The libraries are designed to enable the production of viral vector preparations assessing coding or non-coding RNA function in vivo. When generating high-diversity libraries, it is a challenge to achieve efficient cloning, unambiguous barcoding, and detailed characterization using low-cost sequencing technologies. With the presented protocol, a diversity of over 3 million uniquely barcoded adeno-associated viral (AAV) plasmids can be achieved in a single reaction through a process achievable in any molecular biology laboratory. This approach opens up a multitude of in vivo assessments, from the evaluation of enhancer and promoter regions to the optimization of genome editing. The generated plasmid libraries are also useful for validation of sequencing clustering algorithms, and we here validate the newly presented message-passing clustering process named Starcode. PMID:27874090

  3. A study on the real-time reliability of on-board equipment of train control system

    NASA Astrophysics Data System (ADS)

    Zhang, Yong; Li, Shiwei

    2018-05-01

    Real-time reliability evaluation is conducive to establishing a condition-based maintenance system for the purpose of guaranteeing continuous train operation. According to the inherent characteristics of the on-board equipment, the connotation of reliability evaluation of on-board equipment is defined and an evaluation index of real-time reliability is provided in this paper. From the perspective of methodology and practical application, the real-time reliability of the on-board equipment is discussed in detail, and a method of evaluating the real-time reliability of on-board equipment at the component level based on a Hidden Markov Model (HMM) is proposed. In this method, the performance degradation data is used directly to realize accurate perception of the hidden state transition process of the on-board equipment, which can achieve a better description of the real-time reliability of the equipment.
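    The abstract does not spell out the HMM machinery, but the filtering idea it describes can be sketched in a few lines. The states, transition matrix, and emission matrix below are hypothetical illustrative values, not ones from the paper; the point is only how a forward filter turns degradation readings into a running state estimate.

    ```python
    import numpy as np

    # Hypothetical 3-state degradation model: 0 = healthy, 1 = degraded, 2 = failed.
    A = np.array([[0.95, 0.04, 0.01],   # hidden-state transition probabilities
                  [0.00, 0.90, 0.10],
                  [0.00, 0.00, 1.00]])
    B = np.array([[0.80, 0.15, 0.05],   # P(observed performance level | hidden state)
                  [0.20, 0.60, 0.20],
                  [0.05, 0.15, 0.80]])
    pi = np.array([1.0, 0.0, 0.0])      # equipment assumed healthy at the start

    def forward_filter(obs):
        """Return P(hidden state | observations so far) at each time step."""
        alpha = pi * B[:, obs[0]]
        alpha /= alpha.sum()
        beliefs = [alpha]
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]   # predict, then weight by the new evidence
            alpha /= alpha.sum()
            beliefs.append(alpha)
        return np.array(beliefs)

    # Observed performance-degradation readings (0 = good, 1 = marginal, 2 = poor).
    obs = [0, 0, 1, 1, 2]
    beliefs = forward_filter(obs)
    # Real-time reliability: probability the equipment is not in the failed state.
    reliability = 1.0 - beliefs[:, 2]
    print(reliability.round(3))
    ```

    As the readings worsen, the filtered probability of the failed state grows, so the reliability estimate declines step by step, which is the qualitative behavior a condition-based maintenance trigger would act on.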

  4. Opening the black box of ethics policy work: evaluating a covert practice.

    PubMed

    Frolic, Andrea; Drolet, Katherine; Bryanton, Kim; Caron, Carole; Cupido, Cynthia; Flaherty, Barb; Fung, Sylvia; McCall, Lori

    2012-01-01

    Hospital ethics committees (HECs) and ethicists generally describe themselves as engaged in four domains of practice: case consultation, research, education, and policy work. Despite the increasing attention to quality indicators, practice standards, and evaluation methods for the other domains, comparatively little is known or published about the policy work of HECs or ethicists. This article attempts to open the "black box" of this health care ethics practice by providing two detailed case examples of ethics policy reviews. We also describe the development and application of an evaluation strategy to assess the quality of ethics policy review work, and to enable continuous improvement of ethics policy review processes. Given the potential for policy work to impact entire patient populations and organizational systems, it is imperative that HECs and ethicists develop clearer roles, responsibilities, procedural standards, and evaluation methods to ensure the delivery of consistent, relevant, and high-quality ethics policy reviews.

  5. Advanced approach to the analysis of a series of in-situ nuclear forward scattering experiments

    NASA Astrophysics Data System (ADS)

    Vrba, Vlastimil; Procházka, Vít; Smrčka, David; Miglierini, Marcel

    2017-03-01

    This study introduces a sequential fitting procedure as a specific approach to nuclear forward scattering (NFS) data evaluation. The principles and usage of this advanced evaluation method are described in detail, and its utilization is demonstrated on NFS in-situ investigations of fast processes. Such experiments frequently consist of hundreds of time spectra which need to be evaluated. The introduced procedure allows the analysis of these experiments and significantly decreases the time needed for data evaluation. The key contributions of the study are the sequential use of the output fitting parameters of a previous data set as the input parameters for the next data set, and the model-suitability crosscheck option of applying the procedure in ascending and descending directions of the data sets. The described fitting methodology is beneficial for checking model validity and the reliability of the obtained results.
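    The seeding-and-crosscheck idea is generic and can be shown on toy data. The sketch below uses a simple exponential model and synthetic data sets with a slowly drifting parameter as stand-ins for NFS time spectra; the model, data, and tolerances are all assumptions for illustration, not the paper's actual evaluation.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def model(t, amplitude, rate):
        """Illustrative exponential decay; a real NFS model would differ."""
        return amplitude * np.exp(-rate * t)

    rng = np.random.default_rng(0)
    t = np.linspace(0, 5, 50)
    # Synthetic "in-situ" series: the decay rate drifts slowly across data sets.
    true_rates = np.linspace(1.0, 2.0, 10)
    data_sets = [model(t, 10.0, r) + rng.normal(0, 0.05, t.size) for r in true_rates]

    def sequential_fit(data_sets, p0):
        """Fit each data set, seeding it with the previous set's best-fit parameters."""
        params = []
        for y in data_sets:
            p0, _ = curve_fit(model, t, y, p0=p0)
            params.append(p0.copy())
        return params

    ascending = sequential_fit(data_sets, p0=[8.0, 0.5])
    # Crosscheck: a descending pass should agree with the ascending one
    # if the model is suitable for the whole series.
    descending = sequential_fit(data_sets[::-1], p0=ascending[-1])[::-1]
    ```

    Because consecutive data sets differ only slightly, the previous optimum is an excellent starting guess, which is what makes evaluating hundreds of spectra tractable; disagreement between the two directions would flag a model problem.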

  6. Space network scheduling benchmark: A proof-of-concept process for technology transfer

    NASA Technical Reports Server (NTRS)

    Moe, Karen; Happell, Nadine; Hayden, B. J.; Barclay, Cathy

    1993-01-01

    This paper describes a detailed proof-of-concept activity to evaluate flexible scheduling technology as implemented in the Request Oriented Scheduling Engine (ROSE) and applied to Space Network (SN) scheduling. The criteria developed for an operational evaluation of a reusable scheduling system is addressed including a methodology to prove that the proposed system performs at least as well as the current system in function and performance. The improvement of the new technology must be demonstrated and evaluated against the cost of making changes. Finally, there is a need to show significant improvement in SN operational procedures. Successful completion of a proof-of-concept would eventually lead to an operational concept and implementation transition plan, which is outside the scope of this paper. However, a high-fidelity benchmark using actual SN scheduling requests has been designed to test the ROSE scheduling tool. The benchmark evaluation methodology, scheduling data, and preliminary results are described.

  7. EVA Development and Verification Testing at NASA's Neutral Buoyancy Laboratory

    NASA Technical Reports Server (NTRS)

    Jairala, Juniper; Durkin, Robert

    2012-01-01

    As an early step in preparing for future EVAs, astronauts perform neutral buoyancy testing to develop and verify EVA hardware and operations. To date, neutral buoyancy demonstrations at NASA JSC’s Sonny Carter Training Facility have primarily evaluated assembly and maintenance tasks associated with several elements of the ISS. With the retirement of the Space Shuttle, completion of ISS assembly, and introduction of commercial participants for human transportation into space, evaluations at the NBL will take on a new focus. In this session, Juniper Jairala briefly discussed the design of the NBL and, in more detail, described the requirements and process for performing a neutral buoyancy test, including typical hardware and support equipment requirements, personnel and administrative resource requirements, examples of ISS systems and operations that are evaluated, and typical operational objectives that are evaluated. Robert Durkin discussed the new and potential types of uses for the NBL, including those by non-NASA external customers.

  9. ClinicalTrials.gov as a data source for semi-automated point-of-care trial eligibility screening.

    PubMed

    Pfiffner, Pascal B; Oh, JiWon; Miller, Timothy A; Mandl, Kenneth D

    2014-01-01

    Implementing semi-automated processes to efficiently match patients to clinical trials at the point of care requires both detailed patient data and authoritative information about open studies. To evaluate the utility of the ClinicalTrials.gov registry as a data source for semi-automated trial eligibility screening, eligibility criteria and metadata for 437 trials open for recruitment in four different clinical domains were identified in ClinicalTrials.gov. Trials were evaluated for up-to-date recruitment status, and eligibility criteria were evaluated for obstacles to automated interpretation. Finally, phone or email outreach to coordinators at a subset of the trials was made to assess the accuracy of contact details and recruitment status. 24% (104 of 437) of trials declaring an open recruitment status list a study completion date in the past, indicating out-of-date records. Substantial barriers to automated eligibility interpretation in free-form text are present in 81% to 94% of all trials. We were unable to contact coordinators at 31% (45 of 146) of the trials in the subset, either by phone or by email. Only 53% (74 of 146) would confirm that they were still recruiting patients. Because ClinicalTrials.gov has entries on most US and many international trials, the registry could be repurposed as a comprehensive trial-matching data source. Semi-automated point-of-care recruitment would be facilitated by matching the registry's eligibility criteria against clinical data from electronic health records, but the current entries fall short. Ultimately, improved techniques in natural language processing will facilitate semi-automated complex matching. As immediate next steps, we recommend augmenting ClinicalTrials.gov data entry forms to capture key eligibility criteria in a simple, structured format.

  10. Development of integrated, zero-G pneumatic transporter/rotating paddle incinerator/catalytic afterburner subsystem for processing human wastes on board spacecraft

    NASA Technical Reports Server (NTRS)

    Fields, S. F.; Labak, L. J.; Honegger, R. J.

    1974-01-01

    A four component system was developed which consists of a particle size reduction mechanism, a pneumatic waste transport system, a rotating-paddle incinerator, and a catalytic afterburner to be integrated into a six-man, zero-g subsystem for processing human wastes on board spacecraft. The study included the development of different concepts or functions, the establishment of operational specifications, and a critical evaluation for each of the four components. A series of laboratory tests was run, and a baseline subsystem design was established. An operational specification was also written in preparation for detailed design and testing of this baseline subsystem.

  11. Fused slurry silicide coatings for columbium alloy reentry heat shields. Volume 2: Experimental and coating process details

    NASA Technical Reports Server (NTRS)

    Fitzgerald, B.

    1973-01-01

    The experimental and coating process details are presented. The process specifications which were developed for the formulation and application of the R-512E fused slurry silicide coating, using either an acrylic or nitrocellulose base slurry system, are also discussed.

  12. The development and evaluation of an alternative powder prepregging technique for use with LaRC-TPI/graphite composites

    NASA Technical Reports Server (NTRS)

    Ogden, Andrea L.; Hyer, Michael W.; Wilkes, Garth L.; Loos, Alfred C.; St.clair, Terry L.

    1991-01-01

    An alternative powder prepregging method for use with LaRC-TPI (a thermoplastic polyimide)/graphite composites is investigated. The alternative method incorporates the idea of moistening the fiber prior to powder coating. Details of the processing parameters are given and discussed. The material was subsequently laminated into small coupons which were evaluated for processing defects using electron microscopy. After the initial evaluation of the material, no major processing defects were encountered but there appeared to be an interfacial adhesion problem. As a result, prepregging efforts were extended to include an additional fiber system, XAS, and a semicrystalline form of the matrix. The semicrystalline form of the matrix was the result of a complex heat treating cycle. Using scanning electron microscopy (SEM), the fiber/matrix adhesion was evaluated in these systems relative to the amorphous/XAS coupons. Based on these results, amorphous and semicrystalline/AS-4 and XAS materials were prepregged and laminated for transverse tensile testing. The results of these tests are presented, and in an effort to obtain more information on the effect of the matrix, remaining semicrystalline transverse tensile coupons were transformed back to the amorphous state and tested. The mechanical properties of the transformed coupons returned to the values observed for the original amorphous coupons, and the interfacial adhesion, as observed by SEM, was better than in any previous sample.

  13. A Spike Cocktail Approach to Improve Microbial Performance ...

    EPA Pesticide Factsheets

    Water reuse, via either centralized treatment of traditional wastewater or decentralized treatment and on-site reuse, is becoming an increasingly important element of sustainable water management. Despite advances in waterborne pathogen detection methods, low and highly variable pathogen levels limit their utility for routine evaluation of health risks in water reuse systems. Therefore, there is a need to improve our understanding of the linkage between pathogens and more readily measured process indicators during treatment. This paper describes an approach for constructing spiking experiments to relate the behavior of viral, bacterial, and protozoan pathogens with relevant process indicators. General issues are reviewed, and the spiking protocol is applied as a case study example to improve microbial performance monitoring and health risk evaluation in a water reuse system. This approach provides a foundation for the development of novel approaches to improve real or near-real-time performance monitoring of water recycling systems. This manuscript details an approach for developing a "spike cocktail," a mixture of microorganisms that can be used to evaluate the performance of engineered and natural systems.

  14. Evaluation of a Tracking System for Patients and Mixed Intravenous Medication Based on RFID Technology.

    PubMed

    Martínez Pérez, María; Vázquez González, Guillermo; Dafonte, Carlos

    2016-11-30

    At present, one of the primary concerns of healthcare professionals is how to increase the safety and quality of the care that patients receive during their stay in hospital. This is particularly important in the administration of expensive and high-risk medicines, with which it is fundamental to minimize the possibility of adverse events in the process of prescription-validation-preparation/dosage-dispensation-administration of intravenous mixes. This work is a detailed analysis of the evaluation, carried out by the health personnel involved, of the Radiofrequency Identification (RFID) system developed in the Day Hospital and Pharmacy services of the Complejo Hospitalario Universitario A Coruña (CHUAC). The RFID system is evaluated by analyzing surveys completed by said health personnel, since their questions represent the key indicators of the patient care process (safety, cost, adequacy with clinical practice). This work allows us to conclude, among other things, that the system tracks the patients satisfactorily and that its cost, though high, is justified in the context of the project (use of dangerous and costly medication).

  15. Evaluation of a Tracking System for Patients and Mixed Intravenous Medication Based on RFID Technology

    PubMed Central

    Martínez Pérez, María; Vázquez González, Guillermo; Dafonte, Carlos

    2016-01-01

    At present, one of the primary concerns of healthcare professionals is how to increase the safety and quality of the care that patients receive during their stay in hospital. This is particularly important in the administration of expensive and high-risk medicines, with which it is fundamental to minimize the possibility of adverse events in the process of prescription-validation-preparation/dosage-dispensation-administration of intravenous mixes. This work is a detailed analysis of the evaluation, carried out by the health personnel involved, of the Radiofrequency Identification (RFID) system developed in the Day Hospital and Pharmacy services of the Complejo Hospitalario Universitario A Coruña (CHUAC). The RFID system is evaluated by analyzing surveys completed by said health personnel, since their questions represent the key indicators of the patient care process (safety, cost, adequacy with clinical practice). This work allows us to conclude, among other things, that the system tracks the patients satisfactorily and that its cost, though high, is justified in the context of the project (use of dangerous and costly medication). PMID:27916915

  16. Selecting the right digital camera for telemedicine-choice for 2009.

    PubMed

    Patricoski, Chris; Ferguson, A Stewart; Brudzinski, Jay; Spargo, Garret

    2010-03-01

    Digital cameras are fundamental tools for store-and-forward telemedicine (electronic consultation). The choice of a camera may significantly impact this consultative process based on the quality of the images, the ability of users to leverage the cameras' features, and other facets of the camera design. The goal of this research was to provide a substantive framework and clearly defined process for reviewing digital cameras and to demonstrate the results obtained when employing this process to review point-and-shoot digital cameras introduced in 2009. The process included a market review, in-house evaluation of features, image reviews, functional testing, and feature prioritization. Seventy-two cameras were identified new on the market in 2009, and 10 were chosen for in-house evaluation. Four cameras scored very high for mechanical functionality and ease-of-use. The final analysis revealed three cameras that had excellent scores for both color accuracy and photographic detail and these represent excellent options for telemedicine: Canon Powershot SD970 IS, Fujifilm FinePix F200EXR, and Panasonic Lumix DMC-ZS3. Additional features of the Canon Powershot SD970 IS make it the camera of choice for our Alaska program.

  17. The systematic review as a research process in music therapy.

    PubMed

    Hanson-Abromeit, Deanna; Sena Moore, Kimberly

    2014-01-01

    Music therapists are challenged to present evidence on the efficacy of music therapy treatment and incorporate the best available research evidence to make informed healthcare and treatment decisions. Higher standards of evidence can come from a variety of sources including systematic reviews. To define and describe a range of research review methods using examples from music therapy and related literature, with emphasis on the systematic review. In addition, the authors provide a detailed overview of methodological processes for conducting and reporting systematic reviews in music therapy. The systematic review process is described in five steps. Step 1 identifies the research plan and operationalized research question(s). Step 2 illustrates the identification and organization of the existing literature related to the question(s). Step 3 details coding of data extracted from the literature. Step 4 explains the synthesis of coded findings and analysis to answer the research question(s). Step 5 describes the strength of evidence evaluation and results presentation for practice recommendations. Music therapists are encouraged to develop and conduct systematic reviews. This methodology contributes to review outcome credibility and can determine how information is interpreted and used by clinicians, clients or patients, and policy makers. A systematic review is a methodologically rigorous research method used to organize and evaluate extant literature related to a clinical problem. Systematic reviews can assist music therapists in managing the ever-increasing literature, making well-informed evidence based practice and research decisions, and translating existing music-based and nonmusic based literature to clinical practice and research development. © the American Music Therapy Association 2014. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. Listed waste determination report. Environmental characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-06-01

    On September 23, 1988, the US Environmental Protection Agency (EPA) published a notice clarifying interim status requirements for the management of radioactive mixed waste, thereby subjecting the Idaho National Engineering Laboratory (INEL) and other applicable Department of Energy (DOE) sites to regulation under the Resource Conservation and Recovery Act (RCRA). Therefore, the DOE was required to submit a Part A Permit application for each treatment, storage, and disposal (TSD) unit within the INEL, defining the waste codes and processes to be regulated under RCRA. The September 1990 revised Part A Permit application, which was approved by the State of Idaho, identified 101 potential acute and toxic hazardous waste codes (F-, P-, and U-listed wastes according to 40 CFR 261.31 and 40 CFR 261.33) for some TSD units at the Idaho Chemical Processing Plant. Most of these wastes were assumed to have been introduced into the High-level Liquid Waste TSD units via laboratory drains connected to the Process Equipment Waste (PEW) evaporator (PEW system). At that time, a detailed and systematic evaluation of hazardous chemical use and disposal practices had not been conducted to determine if F-, P-, or U-listed waste had been disposed to the PEW system. The purpose of this investigation was to perform a systematic and detailed evaluation of the use and disposal of the 101 F-, P-, and U-listed chemicals found in the approved September 1990 Part A Permit application. This investigation was aimed at determining which listed wastes, as defined in 40 CFR 261.31 (F-listed) and 261.33 (P- and U-listed), were discharged to the PEW system. Results of this investigation will be used to support revisions to the RCRA Part A Permit application.

  19. Ongoing quality control in digital radiography: Report of AAPM Imaging Physics Committee Task Group 151.

    PubMed

    Jones, A Kyle; Heintz, Philip; Geiser, William; Goldman, Lee; Jerjian, Khachig; Martin, Melissa; Peck, Donald; Pfeiffer, Douglas; Ranger, Nicole; Yorkston, John

    2015-11-01

    Quality control (QC) in medical imaging is an ongoing process and not just a series of infrequent evaluations of medical imaging equipment. The QC process involves designing and implementing a QC program, collecting and analyzing data, investigating results that are outside the acceptance levels for the QC program, and taking corrective action to bring these results back to an acceptable level. The QC process involves key personnel in the imaging department, including the radiologist, radiologic technologist, and the qualified medical physicist (QMP). The QMP performs detailed equipment evaluations and helps with oversight of the QC program, while the radiologic technologist is responsible for the day-to-day operation of the QC program. The continued need for ongoing QC in digital radiography has been highlighted in the scientific literature. The charge of this task group was to recommend consistency tests designed to be performed by a medical physicist or a radiologic technologist under the direction of a medical physicist to identify problems with an imaging system that need further evaluation by a medical physicist, including a fault tree to define actions that need to be taken when certain fault conditions are identified. The focus of this final report is the ongoing QC process, including rejected image analysis, exposure analysis, and artifact identification. These QC tasks are vital for the optimal operation of a department performing digital radiography.

  20. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    NASA Technical Reports Server (NTRS)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
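    The SPC techniques reviewed in this record center on control charts. As a generic illustration only, not NASA's actual procedure, a textbook X-bar chart over hypothetical calibration readings can be computed as follows; the data, subgroup sizes, and limits are all invented for the sketch.

    ```python
    import numpy as np

    # Hypothetical force-calibration readings: 20 subgroups of 5 measurements each.
    rng = np.random.default_rng(42)
    subgroups = rng.normal(loc=100.0, scale=0.5, size=(20, 5))

    xbar = subgroups.mean(axis=1)            # subgroup means
    rbar = np.ptp(subgroups, axis=1).mean()  # average subgroup range

    # Standard X-bar chart constant for subgroup size n = 5.
    A2 = 0.577
    center = xbar.mean()
    ucl = center + A2 * rbar   # upper control limit
    lcl = center - A2 * rbar   # lower control limit

    # Flag subgroup means that fall outside the control limits.
    out_of_control = (xbar > ucl) | (xbar < lcl)
    print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}  flagged={out_of_control.sum()}")
    ```

    For an in-control process like this simulated one, almost no subgroup means should fall outside the limits; repeated flags would indicate a calibration process drifting out of statistical control.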

  1. Ongoing quality control in digital radiography: Report of AAPM Imaging Physics Committee Task Group 151

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, A. Kyle, E-mail: kyle.jones@mdanderson.org; Geiser, William; Heintz, Philip

    Quality control (QC) in medical imaging is an ongoing process and not just a series of infrequent evaluations of medical imaging equipment. The QC process involves designing and implementing a QC program, collecting and analyzing data, investigating results that are outside the acceptance levels for the QC program, and taking corrective action to bring these results back to an acceptable level. The QC process involves key personnel in the imaging department, including the radiologist, radiologic technologist, and the qualified medical physicist (QMP). The QMP performs detailed equipment evaluations and helps with oversight of the QC program, while the radiologic technologist is responsible for the day-to-day operation of the QC program. The continued need for ongoing QC in digital radiography has been highlighted in the scientific literature. The charge of this task group was to recommend consistency tests designed to be performed by a medical physicist or a radiologic technologist under the direction of a medical physicist to identify problems with an imaging system that need further evaluation by a medical physicist, including a fault tree to define actions that need to be taken when certain fault conditions are identified. The focus of this final report is the ongoing QC process, including rejected image analysis, exposure analysis, and artifact identification. These QC tasks are vital for the optimal operation of a department performing digital radiography.

  2. Characterization of the structural details of residual austenite in the weld metal of a 9Cr1MoNbV welded rotor

    NASA Astrophysics Data System (ADS)

    Liu, Xia; Ji, Hui-jun; Liu, Peng; Wang, Peng; Lu, Feng-gui; Gao, Yu-lai

    2014-06-01

    The existence of residual austenite in weld metal plays an important role in determining the properties and dimensional accuracy of welded rotors. An effective corrosive agent and the metallographic etching process were developed to clearly reveal the characteristics of residual austenite in the weld metal of a 9Cr1MoNbV welded rotor. Moreover, the details of the distribution, shape, length, length-to-width ratio, and the content of residual austenite were systematically characterized using the Image-Pro Plus image analysis software. The results revealed that the area fraction of residual austenite was approximately 6.3% in the observed weld seam; the average area, length, and length-to-width ratio of dispersed residual austenite were quantitatively evaluated to be (5.5 ± 0.1) μm2, (5.0 ± 0.1) μm, and (2.2 ± 0.1), respectively. The newly developed corrosive agent and etching method offer an appropriate approach to characterize residual austenite in the weld metal of welded rotors in detail.

  3. Evaluating All-Metal Valves for Use in a Tritium Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houk, L.; Payton, A.

    In the tritium gas processing system, it is desired to minimize polymer components due to their degradation from tritium exposure (beta decay). One source of polymers in the tritium process is valve components. A vendor has been identified that manufactures a valve marketed as being of all-metal construction. This manufacturer, Ham-Let Group, manufactures a diaphragm valve (3LE series) that is claimed to be made entirely of metal. SRNL procured twelve (12) Ham-Let diaphragm valves for characterization and evaluation. The characterization tests included identification of the maximum pressure of these valves by performing pressure and burst tests. Leak tests were performed to ensure the valves do not exceed the acceptable leak rate for tritium service. These valves were then cycled in a nitrogen gas and/or vacuum environment to ensure they would be durable in a process environment. They were subsequently leak tested per ASTM protocol to ensure that the valves maintained their leak-tight integrity. A detailed material analysis was also conducted to determine hydrogen and tritium compatibility.

  4. Integrating Theory and Practice: Applying the Quality Improvement Paradigm to Product Line Engineering

    NASA Technical Reports Server (NTRS)

    Stark, Michael; Hennessy, Joseph F. (Technical Monitor)

    2002-01-01

    My assertion is that not only are product lines a relevant research topic, but that the tools used by empirical software engineering researchers can address observed practical problems. Our experience at NASA has been that there are often externally proposed solutions available, but that we have had difficulties applying them in our particular context. We have also focused on return-on-investment issues when evaluating product lines, and while these are important, one cannot obtain objective data on success or failure until several applications from a product family have been deployed. The use of the Quality Improvement Paradigm (QIP) can address these issues: (1) planning an adoption path from an organization's current state to a product line approach; (2) constructing a development process to fit the organization's adoption path; (3) evaluating product line development processes as the project is being developed. The QIP consists of the following six steps: (1) characterize the project and its environment; (2) set quantifiable goals for successful project performance; (3) choose the appropriate process models, supporting methods, and tools for the project; (4) execute the process, analyze interim results, and provide real-time feedback for corrective action; (5) analyze the results of completed projects and recommend improvements; and (6) package the lessons learned as updated and refined process models. A figure shows the QIP in detail. The iterative nature of the QIP supports an incremental development approach to product lines, and the project learning and feedback provide the necessary early evaluations.

  5. A compositional framework for Markov processes

    NASA Astrophysics Data System (ADS)

    Baez, John C.; Fong, Brendan; Pollard, Blake S.

    2016-03-01

    We define the concept of an "open" Markov process, or more precisely, continuous-time Markov chain, which is one where probability can flow in or out of certain states called "inputs" and "outputs." One can build up a Markov process from smaller open pieces. This process is formalized by making open Markov processes into the morphisms of a dagger compact category. We show that the behavior of a detailed balanced open Markov process is determined by a principle of minimum dissipation, closely related to Prigogine's principle of minimum entropy production. Using this fact, we set up a functor mapping open detailed balanced Markov processes to open circuits made of linear resistors. We also describe how to "black box" an open Markov process, obtaining the linear relation between input and output data that holds in any steady state, including nonequilibrium steady states with a nonzero flow of probability through the system. We prove that black boxing gives a symmetric monoidal dagger functor sending open detailed balanced Markov processes to Lagrangian relations between symplectic vector spaces. This allows us to compute the steady state behavior of an open detailed balanced Markov process from the behaviors of smaller pieces from which it is built. We relate this black box functor to a previously constructed black box functor for circuits.
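
    As a concrete toy instance of the kind of system described, the following sketch (an assumed example, not the paper's categorical machinery) solves for the steady state of a small open continuous-time Markov chain whose boundary states have clamped probabilities: the net probability flow vanishes at the internal state, but not at the input and output, so the steady state is a nonequilibrium one.

```python
import numpy as np

# A 3-state open Markov chain: states 0 and 2 are boundary ("input"/"output")
# states with clamped probabilities; we solve for the internal state 1 so that
# probability flow balances there. Rates are invented for illustration.

# Infinitesimal generator H: H[i, j] = rate j -> i for i != j, columns sum to 0.
rates = {(0, 1): 1.0, (1, 0): 2.0, (1, 2): 1.0, (2, 1): 3.0}
H = np.zeros((3, 3))
for (i, j), r in rates.items():
    H[i, j] += r
    H[j, j] -= r

p_in, p_out = 0.6, 0.1                    # clamped boundary probabilities
# Steady-state condition at the internal node: sum_j H[1, j] * p_j = 0
p_mid = -(H[1, 0] * p_in + H[1, 2] * p_out) / H[1, 1]
p = np.array([p_in, p_mid, p_out])

print(H[1] @ p)            # ≈ 0: balance at the internal state
print(H[0] @ p, H[2] @ p)  # equal and opposite: probability flows through
```

    The boundary rows of H p need not vanish; their equal-and-opposite values are the steady throughput of probability from input to output, which is what "black boxing" summarizes as a linear input-output relation.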

  6. Evaluating the implementation process of a participatory organizational level occupational health intervention in schools.

    PubMed

    Schelvis, Roosmarijn M C; Wiezer, Noortje M; Blatter, Birgitte M; van Genabeek, Joost A G M; Oude Hengel, Karen M; Bohlmeijer, Ernst T; van der Beek, Allard J

    2016-12-01

    The importance of process evaluations in examining how and why interventions are (un)successful is increasingly recognized. Process evaluations have mainly studied the implementation process and the quality of the implementation (fidelity). However, in adopting this approach for participatory organizational-level occupational health interventions, important aspects such as context and participants' perceptions are missing. Our objective was to systematically describe the implementation process of a participatory organizational-level occupational health intervention aimed at reducing work stress and increasing vitality in two schools, by applying a framework that covers aspects of the intervention and its implementation as well as the context and participants' perceptions. A program theory was developed, describing the requirements for successful implementation. Each requirement was operationalized by making use of the framework, covering: initiation, communication, participation, fidelity, reach, satisfaction, management support, targeting, delivery, exposure, culture, conditions, readiness for change, and perceptions. The requirements were assessed by quantitative and qualitative data, collected at 12 and 24 months after baseline in both schools (questionnaire and interviews) or continuously (logbooks). The intervention consisted of a needs assessment phase and a phase of implementing intervention activities. The needs assessment phase was implemented successfully in school A, but not in school B, where participation and readiness for change were insufficient. In the second phase, several intervention activities were implemented at school A, whereas this was only partly the case in school B (delivery). In both schools, however, participants did not feel involved in the choice of intervention activities (targeting, participation, support), resulting in a negative perception of, and only partial exposure to, the intervention activities.
Conditions, culture, and events hindered the implementation of intervention activities in both schools. The framework helped us to understand why the implementation process was not successful. It is therefore considered of added value for the evaluation of implementation processes in participatory organizational-level interventions, foremost because of the context and perceptions dimensions. However, less demanding methods for conducting detailed process evaluations need to be developed. This can only be done if we know more about the most important process components, and this study contributes to that knowledge base. Netherlands Trial Register NTR3284.

  7. ECSIN's methodological approach for hazard evaluation of engineered nanomaterials

    NASA Astrophysics Data System (ADS)

    Bregoli, Lisa; Benetti, Federico; Venturini, Marco; Sabbioni, Enrico

    2013-04-01

    The increasing production volumes and commercialization of engineered nanomaterials (ENM), together with data on their higher biological reactivity compared with their bulk counterparts and their ability to cross biological barriers, have raised concerns about their potential impacts on the health and safety of both humans and the environment. A multidisciplinary component of the scientific community has been called upon to evaluate the real risks associated with the use of products containing ENM, and is today in the process of developing specific definitions and testing strategies for nanomaterials. At ECSIN we are developing an integrated multidisciplinary methodological approach for the evaluation of the biological effects of ENM on the environment and human health. While our testing strategy agrees with the most widely advanced line of work at the European level, the choice of methods and the optimization of protocols are made with extended attention to detail. Our attention to methodological and technical details is based on the acknowledgment that the innovative characteristics of matter at the nano-size range may influence existing testing methods in a partially unpredictable manner, an aspect frequently recognized at the discussion level but oftentimes disregarded at the laboratory bench. This work outlines the most important steps of our testing approach. In particular, each step is briefly discussed in terms of potential technical and methodological pitfalls that we have encountered, and which are often ignored in nanotoxicology research. The final aim is to draw attention to the need for preliminary studies when developing reliable tests, a crucial step in confirming that the chosen analytical and toxicological methods are suitable for the specific nanoparticle under test, and to express the idea that in nanotoxicology, "the devil is in the detail".

  8. The Australasian Resuscitation in Sepsis Evaluation (ARISE) trial statistical analysis plan.

    PubMed

    Delaney, Anthony P; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-09-01

    The Australasian Resuscitation in Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the emergency department with severe sepsis. In keeping with current practice, and considering aspects of trial design and reporting specific to non-pharmacological interventions, our plan outlines the principles and methods for analysing and reporting the trial results. The document is prepared before completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and before completion of the two related international studies. Our statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. We reviewed the data collected by the research team as specified in the study protocol and detailed in the study case report form. We describe information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation, other related therapies and other relevant data with appropriate comparisons between groups. We define the primary, secondary and tertiary outcomes for the study, with description of the planned statistical analyses. We have developed a statistical analysis plan with a trial profile, mock-up tables and figures. We describe a plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies and adverse events. We describe the primary, secondary and tertiary outcomes with identification of subgroups to be analysed. We have developed a statistical analysis plan for the ARISE study, available in the public domain, before the completion of recruitment into the study. 
This will minimise analytical bias and conforms to current best practice in conducting clinical trials.

  9. A field evaluation of subsurface and surface runoff. II. Runoff processes

    USGS Publications Warehouse

    Pilgrim, D.H.; Huff, D.D.; Steele, T.D.

    1978-01-01

    Combined use of radioisotope tracer, flow rate, specific conductance and suspended-sediment measurements on a large field plot near Stanford, California, has provided more detailed information on surface and subsurface storm runoff processes than would be possible from any single approach used in isolation. Although the plot was surficially uniform, the runoff processes were shown to be grossly nonuniform, both spatially over the plot, and laterally and vertically within the soil. The three types of processes that have been suggested as sources of storm runoff (Horton-type surface runoff, saturated overland flow, and rapid subsurface throughflow) all occurred on the plot. The nonuniformity of the processes supports the partial- and variable-source area concepts. Subsurface storm runoff occurred in a saturated layer above the subsoil horizon, and short travel times resulted from flow through macropores rather than the soil matrix. Consideration of these observations would be necessary for physically realistic modeling of the storm runoff process. © 1978.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The objective of the contract is to consolidate the advances made during the previous contract in the conversion of syngas to motor fuels using Molecular Sieve-containing catalysts and to demonstrate the practical utility and economic value of the new catalyst/process systems with appropriate laboratory runs. Work on the program is divided into the following six tasks: (1) preparation of a detailed work plan covering the entire performance of the contract; (2) preliminary techno-economic assessment of the UCC catalyst/process system; (3) optimization of the most promising catalysts developed under prior contract; (4) optimization of the UCC catalyst system in a manner that will give it the longest possible service life; (5) optimization of a UCC process/catalyst system based upon a tubular reactor with a recycle loop; and (6) economic evaluation of the optimal performance found under Task 5 for the UCC process/catalyst system. Accomplishments are reported for Tasks 2 through 5.

  11. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.

    PubMed

    Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello

    2016-01-01

    The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results showed that, although the process was generally in control, an out-of-control situation occurred at the principal maintenance intervention on the treatment system. In particular, process capability indices showed a decreasing percentage of points in control, which was nevertheless acceptable according to AAPM TG 148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
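
    The two chart types named in the abstract can be sketched in a few lines. The daily output readings below are synthetic, and the constants (d2 = 1.128 for moving ranges of two, lambda = 0.2 for the EWMA) are conventional textbook choices, not the study's settings.

```python
import statistics

# Synthetic daily output readings (relative to nominal); the last three
# simulate a sustained upward shift such as might follow a maintenance event.
outputs = [1.000, 1.002, 0.998, 1.001, 0.999, 1.003, 1.025, 1.024, 1.026]

mean = statistics.fmean(outputs)

# Individual value X-chart: estimate sigma from the average moving range
# (d2 = 1.128 for subgroups of size 2), then set 3-sigma limits.
mr = [abs(b - a) for a, b in zip(outputs, outputs[1:])]
sigma_hat = statistics.fmean(mr) / 1.128
ucl, lcl = mean + 3 * sigma_hat, mean - 3 * sigma_hat

# EWMA chart: the weighted average reacts to small sustained shifts.
lam = 0.2
ewma = [mean]
for x in outputs:
    ewma.append(lam * x + (1 - lam) * ewma[-1])

out_of_control = [i for i, x in enumerate(outputs) if not (lcl <= x <= ucl)]
print("X-chart flags at indices:", out_of_control)
```

    The X-chart flags individual excursions beyond the limits, while the EWMA trace drifts upward through the shifted readings, which is why the two charts are used together.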

  12. Weak ergodicity breaking, irreproducibility, and ageing in anomalous diffusion processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Metzler, Ralf

    2014-01-14

    Single particle traces are typically evaluated in terms of time averages of the second moment of the position time series r(t). For ergodic processes, one can interpret such results in terms of the known theories for the corresponding ensemble averaged quantities. In anomalous diffusion processes, which are widely observed in nature over many orders of magnitude, the equivalence between (long) time and ensemble averages may be broken (weak ergodicity breaking), and these time averages may no longer be interpreted in terms of ensemble theories. Here we detail some recent results on weakly non-ergodic systems with respect to the time averaged mean squared displacement, the inherent irreproducibility of individual measurements, and methods to determine the exact underlying stochastic process. We also address the phenomenon of ageing, the dependence of physical observables on the time span between initial preparation of the system and the start of the measurement.
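
    The time-averaged mean squared displacement discussed here is straightforward to compute for a single trajectory. The sketch below applies the estimator to an ordinary (ergodic) Brownian walk, for which the time average grows linearly in the lag; for a weakly non-ergodic process the same estimator would scatter irreproducibly between realizations.

```python
import random

def ta_msd(x, lag):
    """Time-averaged MSD at a given lag for a single 1-D trajectory:
    average the squared displacement over all windows of length `lag`."""
    disp = [(x[i + lag] - x[i]) ** 2 for i in range(len(x) - lag)]
    return sum(disp) / len(disp)

# Ordinary Brownian trajectory with unit-variance Gaussian steps.
random.seed(1)
x, pos = [0.0], 0.0
for _ in range(10_000):
    pos += random.gauss(0.0, 1.0)
    x.append(pos)

for lag in (1, 10, 100):
    print(lag, ta_msd(x, lag))  # grows roughly linearly in the lag
```

    For subdiffusive continuous-time random walks, repeating this over several trajectories would give widely scattered curves even at long measurement times, which is the irreproducibility the abstract refers to.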

  13. New bone post-processing tools in forensic imaging: a multi-reader feasibility study to evaluate detection time and diagnostic accuracy in rib fracture assessment.

    PubMed

    Glemser, Philip A; Pfleiderer, Michael; Heger, Anna; Tremper, Jan; Krauskopf, Astrid; Schlemmer, Heinz-Peter; Yen, Kathrin; Simons, David

    2017-03-01

    The aim of this multi-reader feasibility study was to evaluate new post-processing CT imaging tools in rib fracture assessment of forensic cases by analyzing detection time and diagnostic accuracy. Thirty autopsy cases (20 with and 10 without rib fractures at autopsy) were randomly selected and included in this study. All cases received a native whole body CT scan prior to the autopsy procedure, which included dissection and careful evaluation of each rib. In addition to standard transverse sections (modality A), CT images were subjected to a reconstruction algorithm to compute axial labelling of the ribs (modality B) as well as "unfolding" visualizations of the rib cage (modality C, "eagle tool"). Three radiologists with different clinical and forensic experience who were blinded to autopsy results evaluated all cases with modality and case order randomized. Rib fracture assessment of each reader was evaluated against the autopsy and a CT consensus read as the radiologic reference. A detailed evaluation of relevant test parameters revealed better agreement with the CT consensus read than with the autopsy. Modality C was significantly the quickest modality for rib fracture detection, despite slightly lower statistical test parameters compared with modalities A and B. Modern CT post-processing software is able to shorten reading time and to increase sensitivity and specificity compared to standard autopsy alone. The eagle tool, being easy to use, is suited for an initial rib fracture screening prior to autopsy and can therefore be beneficial for forensic pathologists.

  14. NOSS Altimeter Detailed Algorithm specifications

    NASA Technical Reports Server (NTRS)

    Hancock, D. W.; Mcmillan, J. D.

    1982-01-01

    The details of the algorithms and data sets required for satellite radar altimeter data processing are documented in a form suitable for (1) development of the benchmark software and (2) coding the operational software. The algorithms reported in detail are those established for altimeter processing. The algorithms which required some additional development before documenting for production were only scoped. The algorithms are divided into two levels of processing. The first level converts the data to engineering units and applies corrections for instrument variations. The second level provides geophysical measurements derived from altimeter parameters for oceanographic users.
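
    The two-level split described above can be sketched as a minimal pipeline. The scale factor, instrument bias, and altitude values below are invented placeholders for illustration, not NOSS calibration constants.

```python
# Hypothetical two-level altimeter processing pipeline mirroring the split
# described: level 1 converts raw counts to engineering units and applies an
# instrument correction; level 2 derives a geophysical quantity. All constants
# are assumptions for illustration only.

RANGE_SCALE_M = 0.05        # metres per raw count (assumed)
INSTRUMENT_BIAS_M = -0.12   # fixed instrument correction in metres (assumed)

def level1(raw_counts):
    """Level 1: raw altimeter counts -> corrected range in metres."""
    return [c * RANGE_SCALE_M + INSTRUMENT_BIAS_M for c in raw_counts]

def level2(ranges_m, satellite_altitude_m):
    """Level 2: corrected ranges -> sea-surface height below the satellite."""
    return [satellite_altitude_m - r for r in ranges_m]

ranges = level1([16_000_010, 16_000_020])
heights = level2(ranges, 800_120.0)
print(ranges, heights)
```

    Keeping the two levels as separate stages matches the document's split between benchmark engineering-unit conversion and the geophysical products delivered to oceanographic users.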

  15. JPEG XS, a new standard for visually lossless low-latency lightweight image compression

    NASA Astrophysics Data System (ADS)

    Descampe, Antonin; Keinert, Joachim; Richter, Thomas; Fößel, Siegfried; Rouvroy, Gaël.

    2017-09-01

    JPEG XS is an upcoming standard from the JPEG Committee (formally known as ISO/IEC SC29 WG1). It aims to provide an interoperable visually lossless low-latency lightweight codec for a wide range of applications including mezzanine compression in broadcast and Pro-AV markets. This requires optimal support of a wide range of implementation technologies such as FPGAs, CPUs and GPUs. Targeted use cases are professional video links, IP transport, Ethernet transport, real-time video storage, video memory buffers, and omnidirectional video capture and rendering. In addition to the evaluation of the visual transparency of the selected technologies, a detailed analysis of the hardware and software complexity as well as the latency has been done to make sure that the new codec meets the requirements of the above-mentioned use cases. In particular, the end-to-end latency has been constrained to a maximum of 32 lines. Concerning the hardware complexity, neither encoder nor decoder should require more than 50% of an FPGA similar to Xilinx Artix 7 or 25% of an FPGA similar to Altera Cyclone 5. This process resulted in a coding scheme made of an optional color transform, a wavelet transform, the entropy coding of the highest magnitude level of groups of coefficients, and the raw inclusion of the truncated wavelet coefficients. This paper presents the details and status of the standardization process, a technical description of the future standard, and the latest performance evaluation results.

  16. Detailed seismic evaluation of bridges along I-24 in Western Kentucky.

    DOT National Transportation Integrated Search

    2006-09-01

    This report presents a seismic rating system and a detailed evaluation procedure for selected highway bridges on/over I-24 in Western Kentucky near the New Madrid Seismic Zone (NMSZ). The rating system, based upon structural vulnerability, seismic an...

  17. Modeling Wood Encroachment in Abandoned Grasslands in the Eifel National Park – Model Description and Testing

    PubMed Central

    Hudjetz, Silvana; Lennartz, Gottfried; Krämer, Klara; Roß-Nickoll, Martina; Gergs, André; Preuss, Thomas G.

    2014-01-01

    The degradation of natural and semi-natural landscapes has become a matter of global concern. In Germany, semi-natural grasslands belong to the most species-rich habitat types but have suffered heavily from changes in land use. After abandonment, the course of succession at a specific site is often difficult to predict because many processes interact. In order to support decision making when managing semi-natural grasslands in the Eifel National Park, we built the WoodS-Model (Woodland Succession Model). A multimodeling approach was used to integrate vegetation dynamics in both the herbaceous and shrub/tree layer. The cover of grasses and herbs was simulated in a compartment model, whereas bushes and trees were modelled in an individual-based manner. Both models worked and interacted in a spatially explicit, raster-based landscape. We present here the model description, parameterization and testing. We show highly detailed projections of the succession of a semi-natural grassland including the influence of initial vegetation composition, neighborhood interactions and ungulate browsing. We carefully weighed the individual processes against each other, and their relevance for landscape development under different scenarios, while explicitly considering specific site conditions. Model evaluation revealed that the model is able to emulate successional patterns as observed in the field as well as plausible results for different population densities of red deer. Important neighborhood interactions, such as seed dispersal, the protection of seedlings from browsing ungulates by thorny bushes, and the inhibition of wood encroachment by the herbaceous layer, have been successfully reproduced. Therefore, not only a detailed model but also detailed initialization turned out to be important for spatially explicit projections of a given site. The advantage of the WoodS-Model is that it integrates these many mutually interacting processes of succession. PMID:25494057

  18. Government Open Systems Interconnection Profile (GOSIP) Transition Strategy

    DTIC Science & Technology

    1993-09-01

    ...it explores some of the GOSIP protocol details and discusses the process by which standards organizations have developed their ... version 1 and 2. ... ORGANIZATION OF STUDY: 1. The Standards Process. Chapter II describes the process whereby standards are developed and adopted by the ISO and how the

  19. Building a Personalized Cancer Treatment System.

    PubMed

    Martinez, Alexandra; López, Gustavo; Bolaños, Constantino; Alvarado, Daniel; Solano, Andrés; López, Mariana; Báez, Andrés; Quirós, Steve; Mora, Rodrigo

    2017-02-01

    This paper reports the process by which a personalized cancer treatment system was built, following a user-centered approach. We give some background on personalized cancer treatment, the particular tumor chemosensitivity assay supported by the system, as well as some quality and legal issues related to such health systems. We describe how Contextual Design was applied when building the system. Contextual Design is a user-centered design technique involving seven steps. We also provide some details about the system implementation. Finally, we explain how the Think-Aloud protocol and Heuristic Evaluation methods were used to evaluate the system and report its results. A qualitative assessment from the users' perspective is also provided. Results from the heuristic evaluation indicate that only one of ten heuristics was missing from the system, while five were partially covered and four were fully covered.

  20. QCScreen: a software tool for data quality control in LC-HRMS based metabolomics.

    PubMed

    Simader, Alexandra Maria; Kluger, Bernhard; Neumann, Nora Katharina Nicole; Bueschl, Christoph; Lemmens, Marc; Lirk, Gerald; Krska, Rudolf; Schuhmacher, Rainer

    2015-10-24

    Metabolomics experiments often comprise large numbers of biological samples resulting in huge amounts of data. This data needs to be inspected for plausibility before data evaluation to detect putative sources of error, e.g. retention time or mass accuracy shifts. Especially in liquid chromatography-high resolution mass spectrometry (LC-HRMS) based metabolomics research, proper quality control checks (e.g. for precision, signal drifts or offsets) are crucial prerequisites to achieve reliable and comparable results within and across experimental measurement sequences. Software tools can support this process. The software tool QCScreen was developed to offer a quick and easy data quality check of LC-HRMS derived data. It allows a flexible investigation and comparison of basic quality-related parameters within user-defined target features and the possibility to automatically evaluate multiple sample types within or across different measurement sequences in a short time. It offers a user-friendly interface that allows an easy selection of processing steps and parameter settings. The generated results include a coloured overview plot of data quality across all analysed samples and targets and, in addition, detailed illustrations of the stability and precision of the chromatographic separation, the mass accuracy and the detector sensitivity. The use of QCScreen is demonstrated with experimental data from metabolomics experiments using selected standard compounds in pure solvent. The application of the software identified problematic features, samples and analytical parameters and suggested which data files or compounds required closer manual inspection. QCScreen is an open source software tool which provides a useful basis for assessing the suitability of LC-HRMS data prior to time-consuming, detailed data processing and subsequent statistical analysis.
It accepts the generic mzXML format and thus can be used with many different LC-HRMS platforms to process both multiple quality control sample types as well as experimental samples in one or more measurement sequences.
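
    A miniature of the kind of per-feature check such a tool automates might look as follows. The target feature, tolerances, and measurements are all invented for illustration and are not QCScreen's actual parameters or defaults.

```python
# Toy per-feature QC check: compare measured m/z and retention time for a
# user-defined target feature across QC injections against tolerances.
# Target, tolerances, and readings below are invented examples.

TARGET = {"name": "reserpine", "mz": 609.2812, "rt_min": 7.50}
MZ_TOL_PPM, RT_TOL_MIN = 5.0, 0.2

measurements = [  # (sample, measured m/z, measured RT in minutes)
    ("QC_01", 609.2815, 7.52),
    ("QC_02", 609.2809, 7.49),
    ("QC_03", 609.2950, 7.51),   # mass error ~23 ppm: should be flagged
    ("QC_04", 609.2813, 7.95),   # RT shift 0.45 min: should be flagged
]

def check(mz, rt):
    """Return a list of QC flags for one measurement of the target feature."""
    ppm = (mz - TARGET["mz"]) / TARGET["mz"] * 1e6
    flags = []
    if abs(ppm) > MZ_TOL_PPM:
        flags.append(f"mass error {ppm:+.1f} ppm")
    if abs(rt - TARGET["rt_min"]) > RT_TOL_MIN:
        flags.append(f"RT shift {rt - TARGET['rt_min']:+.2f} min")
    return flags

for sample, mz, rt in measurements:
    problems = check(mz, rt)
    print(sample, "OK" if not problems else "; ".join(problems))
```

    Running such checks per target and per sample, then summarizing the flags across a sequence, is what separates a quick plausibility screen from full data processing.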

  1. The quality of the evidence base for clinical pathway effectiveness: room for improvement in the design of evaluation trials.

    PubMed

    Rotter, Thomas; Kinsman, Leigh; James, Erica; Machotta, Andreas; Steyerberg, Ewout W

    2012-06-18

    The purpose of this article is to report on the quality of the existing evidence base regarding the effectiveness of clinical pathway (CPW) research in the hospital setting. The analysis is based on a recently published Cochrane review of the effectiveness of CPWs. An integral component of the review process was a rigorous appraisal of the methodological quality of published CPW evaluations. This allowed the identification of strengths and limitations of the evidence base for CPW effectiveness. We followed the validated Cochrane Effective Practice and Organisation of Care Group (EPOC) criteria for randomized and non-randomized clinical pathway evaluations. In addition, we tested the hypothesis that simple pre-post studies tend to overestimate the reported CPW effects. Out of the 260 primary studies meeting CPW content criteria, only 27 studies met the EPOC study design criteria, with the majority of CPW studies (more than 70%) excluded from the review on the basis that they were simple pre-post evaluations, mostly comparing two or more annual patient cohorts. Methodologically poor study designs are often used to evaluate CPWs, and this compromises the quality of the existing evidence base. Cochrane EPOC methodological criteria, including the selection of rigorous study designs along with detailed descriptions of CPW development and implementation processes, are recommended for quantitative evaluations to improve the evidence base for the use of CPWs in hospitals.

  2. Automated extraction of clinical traits of multiple sclerosis in electronic medical records

    PubMed Central

    Davis, Mary F; Sriram, Subramaniam; Bush, William S; Denny, Joshua C; Haines, Jonathan L

    2013-01-01

    Objectives: The clinical course of multiple sclerosis (MS) is highly variable, and research data collection is costly and time consuming. We evaluated natural language processing techniques applied to electronic medical records (EMR) to identify MS patients and the key clinical traits of their disease course. Materials and methods: We used four algorithms based on ICD-9 codes, text keywords, and medications to identify individuals with MS from a de-identified, research version of the EMR at Vanderbilt University. Using a training dataset of the records of 899 individuals, algorithms were constructed to identify and extract detailed information regarding the clinical course of MS from the text of the medical records, including clinical subtype, presence of oligoclonal bands, year of diagnosis, year and origin of first symptom, Expanded Disability Status Scale (EDSS) scores, timed 25-foot walk scores, and MS medications. Algorithms were evaluated on a test set validated by two independent reviewers. Results: We identified 5789 individuals with MS. For all clinical traits extracted, precision was at least 87% and specificity was greater than 80%. Recall values for clinical subtype, EDSS scores, and timed 25-foot walk scores were greater than 80%. Discussion and conclusion: This collection of clinical data represents one of the largest databases of detailed, clinical traits available for research on MS. This work demonstrates that detailed clinical information is recorded in the EMR and can be extracted for research purposes with high reliability. PMID:24148554
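
    A toy version of keyword/regex-based trait extraction in this spirit could look like the sketch below; the patterns and the note text are invented for illustration and are not the Vanderbilt algorithms.

```python
import re

# Invented clinical note and extraction patterns, for illustration only.
NOTE = ("55 yo with relapsing remitting multiple sclerosis. "
        "Oligoclonal bands present in CSF. EDSS 3.5 today. "
        "Started on interferon beta-1a.")

PATTERNS = {
    "subtype": r"\b(relapsing remitting|secondary progressive|primary progressive)\b",
    "oligoclonal_bands": r"\boligoclonal bands?\b.*?\b(present|absent|positive|negative)\b",
    "edss": r"\bEDSS\s*(\d+(?:\.\d+)?)\b",
}

def extract(note):
    """Pull structured trait values out of free-text note content."""
    traits = {}
    for trait, pattern in PATTERNS.items():
        m = re.search(pattern, note, flags=re.IGNORECASE)
        if m:
            traits[trait] = m.group(1).lower()
    return traits

print(extract(NOTE))
```

    Real systems layer negation handling, section detection, and validation against chart review on top of this kind of matching, which is what the reported precision and recall figures measure.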

  3. Error Reduction Program. [combustor performance evaluation codes

    NASA Technical Reports Server (NTRS)

    Syed, S. A.; Chiappetta, L. M.; Gosman, A. D.

    1985-01-01

    The details of a study to select, incorporate and evaluate the best available finite difference scheme to reduce numerical error in combustor performance evaluation codes are described. The combustor performance computer programs chosen were the two dimensional and three dimensional versions of Pratt & Whitney's TEACH code. The criteria used to select schemes required that the difference equations mirror the properties of the governing differential equation, be more accurate than the current hybrid difference scheme, be stable and economical, be compatible with TEACH codes, use only modest amounts of additional storage, and be relatively simple. The methods of assessment used in the selection process consisted of examination of the difference equation, evaluation of the properties of the coefficient matrix, Taylor series analysis, and performance on model problems. Five schemes from the literature and three schemes developed during the course of the study were evaluated. This effort resulted in the incorporation of a scheme in 3D-TEACH which is usually more accurate than the hybrid differencing method and never less accurate.
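
    For context, the hybrid differencing scheme used as the baseline in TEACH-type codes blends central differencing and upwinding according to the cell Peclet number. The sketch below states Spalding's rule for a neighbor coefficient (west face, flow in the positive direction) and shows the switch at Pe = 2; it illustrates the textbook baseline scheme, not the improved scheme selected in the study.

```python
# Hybrid differencing (Spalding): the neighbor coefficient equals the central
# differencing value for cell Peclet number |Pe| < 2 and drops to pure
# upwinding beyond. D is the diffusive conductance, F the convective flux.

def hybrid_coefficient(D, F):
    """a_W for a west face with flow in +x (F >= 0): max of upwind, central, 0."""
    return max(F, D + F / 2.0, 0.0)

def central_coefficient(D, F):
    """Pure central differencing value for the same coefficient."""
    return D + F / 2.0

for Pe in (0.5, 1.9, 2.0, 4.0):
    D = 1.0
    F = Pe * D                     # Peclet number Pe = F / D
    print(Pe, hybrid_coefficient(D, F), central_coefficient(D, F))
```

    The abrupt fall-back to first-order upwinding at |Pe| > 2 is exactly the source of the numerical diffusion that motivated the search for more accurate schemes.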

  4. Proposed Land Conveyance for Construction of Three Facilities at March Air Force Base, California

    DTIC Science & Technology

    1988-09-01

    ...identified would result from future development on the 845-acre parcel after it has been conveyed. Therefore, detailed development review and ... Impact Analysis Process (EIAP) of the Air Force. This detailed development review is within the purview of the state and local government with ... establishes the process under which subsequent detailed environmental review would be conducted. CEQA and its implementing regulations are administered by

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Rij, Jennifer A; Yu, Yi-Hsiang; Guo, Yi

    This study explores and verifies the generalized body-modes method for evaluating the structural loads on a wave energy converter (WEC). Historically, WEC design methodologies have focused primarily on accurately evaluating hydrodynamic loads, while methodologies for evaluating structural loads have yet to be fully considered and incorporated into the WEC design process. As wave energy technologies continue to advance, however, it has become increasingly evident that an accurate evaluation of the structural loads will enable an optimized structural design, as well as the potential utilization of composites and flexible materials, and hence reduce WEC costs. Although there are many computational fluid dynamics, structural analysis, and fluid-structure-interaction (FSI) codes available, the application of these codes is typically too computationally intensive to be practical in the early stages of the WEC design process. The generalized body-modes method, however, is a reduced order, linearized, frequency-domain FSI approach, performed in conjunction with the linear hydrodynamic analysis, with computation times that could realistically be incorporated into the WEC design process. The objective of this study is to verify the generalized body-modes approach in comparison to high-fidelity FSI simulations to accurately predict structural deflections and stress loads in a WEC. Two verification cases are considered, a free-floating barge and a fixed-bottom column. Details for both the generalized body-modes models and FSI models are first provided. Results for each of the models are then compared and discussed. Finally, based on the verification results obtained, future plans for incorporating the generalized body-modes method into the WEC simulation tool, WEC-Sim, and the overall WEC design process are discussed.

  6. A Simulation Study Comparing Incineration and Composting in a Mars-Based Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Hogan, John; Kang, Sukwon; Cavazzoni, Jim; Levri, Julie; Finn, Cory; Luna, Bernadette (Technical Monitor)

    2000-01-01

    The objective of this study is to compare incineration and composting in a Mars-based advanced life support (ALS) system. The variables explored include waste pre-processing requirements, reactor sizing and buffer capacities. The study incorporates detailed mathematical models of biomass production and waste processing into an existing dynamic ALS system model. The ALS system and incineration models (written in MATLAB/SIMULINK(c)) were developed at the NASA Ames Research Center. The composting process is modeled using first-order kinetics, with different degradation rates for individual waste components (carbohydrates, proteins, fats, cellulose and lignin). The biomass waste streams are generated using modified "Energy Cascade" crop models, which use light- and dark-cycle temperatures, irradiance, photoperiod, [CO2], planting density, and relative humidity as model inputs. The study also includes an evaluation of equivalent system mass (ESM).
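    The first-order composting kinetics described above can be sketched in a few lines: each waste component degrades as dC/dt = -k·C, so C(t) = C0·exp(-k·t). The component rate constants and masses below are illustrative assumptions, not values from the study.

```python
import math

# Hypothetical first-order degradation rate constants (1/day), one per
# waste component, as in the composting model described above.
RATES_PER_DAY = {
    "carbohydrates": 0.40,
    "proteins": 0.25,
    "fats": 0.10,
    "cellulose": 0.05,
    "lignin": 0.005,
}

def remaining_mass(initial_kg, t_days):
    """Mass (kg) of each component remaining after t_days of composting,
    using C(t) = C0 * exp(-k * t) for each component independently."""
    return {c: m * math.exp(-RATES_PER_DAY[c] * t_days)
            for c, m in initial_kg.items()}

# Illustrative waste stream: fast-degrading components vanish quickly,
# while recalcitrant lignin persists.
waste = {"carbohydrates": 2.0, "proteins": 1.0, "fats": 0.5,
         "cellulose": 3.0, "lignin": 1.5}
after_30 = remaining_mass(waste, 30)
```

    A buffer-sizing study would then integrate these per-component curves against the crop-model waste generation schedule.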

  7. A Chemical Model of the Coma of Comet C/2009 P1 (Garradd)

    NASA Astrophysics Data System (ADS)

    Boice, Daniel C.; Kawakita, H.; Kobayashi, H.; Naka, C.; Phelps, L.

    2012-10-01

    Modeling is essential to understand the important physical and chemical processes that occur in cometary comae. Photochemistry is a major source of ions and electrons that further initiate key gas-phase reactions, leading to the plethora of molecules and atoms observed in comets. The effects of photoelectrons that react via impacts are important to the overall ionization. We identify the relevant processes within a global modeling framework to understand simultaneous observations in the visible and near-IR of Comet C/2009 (Garradd) and to provide valuable insights into the intrinsic properties of its nucleus. Details of these processes are presented in the collision-dominated, inner coma of the comet to evaluate the relative chemical pathways and the relationship between parent and sibling molecules. Acknowledgements: We appreciate support from the NSF Planetary Astronomy Program.

  8. Importance Of Quality Control in Reducing System Risk, a Lesson Learned From The Shuttle and a Recommendation for Future Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Messer, Bradley P.

    2006-01-01

    This paper presents lessons learned from the Space Shuttle return-to-flight experience and their importance in the development of the new NASA Crew Launch Vehicle (CLV). Specifically, the paper discusses the relationship between process control and system risk, and the importance of process control in improving space vehicle flight safety. It draws on the External Tank (ET) Thermal Protection System (TPS) experience and lessons learned from the redesign and process enhancement activities performed in preparation for Return to Flight after the Columbia accident. The paper also discusses, in some detail, the probabilistic, engineering-physics-based risk assessment performed by the Shuttle program to evaluate the impact of TPS failure on system risk, and the application of the methodology to the CLV.

  9. Evaluation of fatigue-prone details using a low-cost thermoelastic stress analysis system.

    DOT National Transportation Integrated Search

    2016-11-01

    This study was designed to develop a novel approach for in situ evaluation of stress fields in the vicinity of fatigue-prone details on highway bridges using a low-cost microbolometer thermal imager. The method was adapted into a field-deployable i...

  10. Psychophysical evaluation of the image quality of a dynamic flat-panel digital x-ray image detector using the threshold contrast detail detectability (TCDD) technique

    NASA Astrophysics Data System (ADS)

    Davies, Andrew G.; Cowen, Arnold R.; Bruijns, Tom J. C.

    1999-05-01

    We are currently in an era of active development of the digital X-ray imaging detectors that will serve the radiological communities in the new millennium. Rigorous comparative physical evaluations of such devices are therefore becoming increasingly important from both the technical and clinical perspectives. The authors have been actively involved in the evaluation of a clinical demonstration version of a flat-panel dynamic digital X-ray image detector (FDXD). Results of the objective physical evaluation of this device have been presented elsewhere at this conference. The imaging performance of FDXD under radiographic exposure conditions has been previously reported, and in this paper a psychophysical evaluation of the FDXD detector operating under continuous fluoroscopic conditions is presented. The evaluation technique employed was the threshold contrast detail detectability (TCDD) technique, which enables image quality to be measured on devices operating in the clinical environment. This approach addresses image quality in the context of both the image acquisition and display processes, and uses human observers to measure performance. The Leeds test objects TO[10] and TO[10+] were used to obtain comparative measurements of performance on the FDXD and two digital spot fluorography (DSF) systems, one utilizing a Plumbicon camera and the other a state-of-the-art CCD camera. Measurements were taken at a range of detector entrance exposure rates, namely 6, 12, 25 and 50 µR/s. To facilitate comparisons between the systems, all fluoroscopic image processing, such as noise reduction algorithms, was disabled during the experiments. At the highest dose rate FDXD significantly outperformed the DSF comparison systems in the TCDD comparisons. At 25 and 12 µR/s all three systems performed in an equivalent manner, and at the lowest exposure rate FDXD was inferior to the two DSF systems.
    At standard fluoroscopic exposures, FDXD performed in an equivalent manner to the DSF systems for the TCDD comparisons. This suggests that FDXD would perform adequately in a clinical fluoroscopic environment, and our initial clinical experiences support this. Noise reduction processing of the fluoroscopic data acquired on FDXD was also found to further improve TCDD performance. FDXD therefore combines acceptable fluoroscopic performance with excellent radiographic (snapshot) imaging fidelity, allowing the possibility of a universal X-ray detector based on FDXD's technology. It is also envisaged that fluoroscopic performance will be improved by the development of digital image enhancement techniques specifically tailored to the characteristics of the FDXD detector.

  11. Cost-engineering modeling to support rapid concept development of an advanced infrared satellite system

    NASA Astrophysics Data System (ADS)

    Bell, Kevin D.; Dafesh, Philip A.; Hsu, L. A.; Tsuda, A. S.

    1995-12-01

    Current architectural and design trade techniques often carry unaffordable alternatives late into the decision process. Early decisions made during the concept exploration and development (CE&D) phase drive the cost of a program more than any other phase of development; thus, designers must be able to assess both the performance and cost impacts of their early choices. The Space Based Infrared System (SBIRS) cost engineering model (CEM) described in this paper is an end-to-end process integrating engineering and cost expertise through commonly available spreadsheet software, allowing for concurrent design engineering and cost estimation to identify and balance system drivers and reduce acquisition costs. The automated interconnectivity between subsystem models using spreadsheet software allows for quick and consistent assessment of system design impacts and relative cost impacts due to requirement changes. It differs from most past CEM efforts in that it incorporates more detailed spacecraft and sensor payload models, and it has been applied to determine the cost drivers for an advanced infrared satellite system acquisition. The CEM comprises integrated, detailed engineering and cost-estimating relationships describing performance, design, and cost parameters. Detailed models have been developed to evaluate design parameters for the spacecraft bus and sensor; both step-stare and scanner sensor types incorporate models of the focal plane array, optics, processing, thermal control, communications, and mission performance. The current CEM effort has provided visibility into requirements, design, and cost drivers for system architects and decision makers in determining the configuration of an infrared satellite architecture that meets essential requirements cost-effectively. In general, the methodology described in this paper consists of process building blocks that can be tailored to the needs of many applications.
    Descriptions of the spacecraft and payload subsystem models provide insight into The Aerospace Corporation's expertise and the scope of the SBIRS concept development effort.

  12. Detailed Design of a Pulsed Plasma Thrust Stand

    NASA Astrophysics Data System (ADS)

    Verbin, Andrew J.

    This thesis presents a detailed design process for a thrust stand for pulsed thrusters. The stand designed in this work is for a Pulsed Plasma Thruster built by Sun Devil Satellite Laboratory, a student organization at Arizona State University. The thrust stand uses the rotation of a torsional beam to record displacement. This information, together with the impulse-momentum theorem, is applied to find the impulse bit of the thruster, an approach that differs from other designs, which rely on the natural dynamics of their fixtures. The target impulse for this fixture was estimated to be 275 µN-s. Through calibration and experimentation, the fixture was shown to be capable of recording an impulse of 332 ± 14.81 µN-s, close to the target impulse. The error due to noise was characterized and evaluated to be under 5%, which is deemed acceptable.
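    For an undamped torsional thrust stand, the impulse-momentum reduction mentioned above has a standard closed form: an impulse bit J applied at moment arm r imparts angular impulse J·r = I·ω0, and the subsequent oscillation peaks at θ_max = ω0/ω_n, giving J = θ_max·√(k·I)/r. The stand parameters below are illustrative assumptions, not values from the thesis.

```python
import math

# Assumed (hypothetical) torsional-stand parameters:
k = 0.02   # torsional spring constant (N*m/rad)
I = 0.05   # moment of inertia about the pivot (kg*m^2)
r = 0.25   # moment arm from pivot to thruster axis (m)

def impulse_bit(theta_max_rad):
    """Impulse bit (N*s) recovered from peak angular deflection.

    J*r = I*omega0 and theta_max = omega0/omega_n with
    omega_n = sqrt(k/I), hence J = theta_max*sqrt(k*I)/r."""
    return theta_max_rad * math.sqrt(k * I) / r

# With these parameters, a 2.6 mrad peak deflection maps to ~329 uN-s.
J = impulse_bit(2.6e-3)
```

    In practice the measured deflection is corrected for damping, and the k·I product is obtained by calibration with known impulses rather than from nominal dimensions.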

  13. Direct optical detection of protein-ligand interactions.

    PubMed

    Gesellchen, Frank; Zimmermann, Bastian; Herberg, Friedrich W

    2005-01-01

    Direct optical detection provides an excellent means to investigate interactions of molecules in biological systems. The dynamic equilibria inherent to these systems can be described in greater detail by recording the kinetics of a biomolecular interaction. Optical biosensors allow direct detection of interaction patterns without the need for labeling. An overview covering several commercially available biosensors is given, with a focus on instruments based on surface plasmon resonance (SPR) and reflectometric interference spectroscopy (RIFS). Potential assay formats and experimental design, appropriate controls, and calibration procedures, especially when handling low molecular weight substances, are discussed. The single steps of an interaction analysis combined with practical tips for evaluation, data processing, and interpretation of kinetic data are described in detail. In a practical example, a step-by-step procedure for the analysis of a low molecular weight compound interaction with serum protein, determined on a commercial SPR sensor, is presented.
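    The kinetic analysis of SPR sensorgrams mentioned above is most often done with the 1:1 Langmuir interaction model. A minimal sketch, with assumed (not source-provided) rate constants and saturation response:

```python
import math

# 1:1 Langmuir model for an SPR sensorgram:
#   association:  dR/dt = ka*C*(Rmax - R) - kd*R
#   which integrates to R(t) = Req*(1 - exp(-(ka*C + kd)*t)),
#   with Req = ka*C*Rmax/(ka*C + kd).
# All numerical values below are illustrative assumptions.
ka = 1.0e5     # association rate constant (1/(M*s))
kd = 1.0e-2    # dissociation rate constant (1/s)
Rmax = 100.0   # saturation response (RU)

def association(C, t):
    """Response (RU) at time t during injection of analyte at molar conc C."""
    kobs = ka * C + kd
    Req = ka * C * Rmax / kobs
    return Req * (1.0 - math.exp(-kobs * t))

def dissociation(R0, t):
    """Response decay after the injection ends (buffer wash)."""
    return R0 * math.exp(-kd * t)

KD = kd / ka   # equilibrium dissociation constant (M); 100 nM here
```

    Fitting ka and kd to sensorgrams recorded at several concentrations, rather than relying on a single curve, is one of the standard controls the chapter's practical tips address.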

  14. Study of the possibilities of more rational use of energy in the sector of trade and commerce, part 2

    NASA Astrophysics Data System (ADS)

    Ebersbach, K. F.; Fischer, A.; Layer, G.; Steinberger, W.; Wegner, M.; Wiesner, B.

    1982-07-01

    The energy demand in the sector of trade and commerce was registered and analyzed. Measures to improve the energy demand structure are presented. In several typical firms like hotels, office buildings, locksmith's shops, motor vehicle repair shops, butcher's shops, laundries and bakeries, detailed surveys of energy consumption were done and included in a statistic evaluation. Subjects analyzed were: development of the energy supply; technology of energy application; final energy demand broken down into demand for light, power, space heating and process heat as well as the demand for cooling; daily and annual load curves of energy consumption and their dependence on various parameters; and measures to improve the structure of energy demand. Detailed measurement points out negligences in the surveyed firms and shows possibilities for likely energy savings. In addition, standard values for specific energy consumption are obtained.

  15. Cylinder Test Specification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard Catanach; Larry Hill; Herbert Harry

    1999-10-01

    The purpose of the cylinder test is two-fold: (1) to characterize the metal-pushing ability of an explosive relative to that of other explosives, as evaluated by the E19 cylinder energy and the G19 Gurney energy, and (2) to help establish the explosive product equation of state (historically, the Jones-Wilkins-Lee (JWL) equation). This specification details the material requirements and procedures necessary to assemble and fire a typical Los Alamos National Laboratory (LANL) cylinder test. Strict adherence to the cylinder material properties, machining tolerances, material heat-treatment and etching processes, and high-explosive machining tolerances is essential for test-to-test consistency and to maximize radial wall expansions. Assembly and setup of the cylinder test require precise attention to detail, especially when placing intricate pin wires on the cylinder wall. The cylinder test is typically fired outdoors at ambient temperature.
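    The Gurney energy referred to above enters through the standard Gurney cylinder formula, v = √(2E)·(M/C + 1/2)^(-1/2), relating terminal wall velocity to the metal-to-charge mass ratio. A short sketch; the √(2E) value used is a typical literature figure for Composition B, included only as an illustration:

```python
import math

def cylinder_wall_velocity(gurney_velocity_km_s, m_over_c):
    """Terminal wall velocity (km/s) of a Gurney cylinder.

    v = sqrt(2E) * (M/C + 1/2)^(-1/2), where sqrt(2E) is the Gurney
    velocity of the explosive, M the metal mass, C the charge mass."""
    return gurney_velocity_km_s / math.sqrt(m_over_c + 0.5)

# Composition B: sqrt(2E) ~ 2.70 km/s (typical literature value);
# a copper cylinder with M/C = 2 then reaches roughly 1.71 km/s.
v = cylinder_wall_velocity(2.70, 2.0)
```

    The cylinder test works this relation in reverse: measured wall expansion velocities yield the Gurney energy, and the full expansion history constrains the JWL product equation of state.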

  16. Shuttle Processing

    NASA Technical Reports Server (NTRS)

    Guodace, Kimberly A.

    2010-01-01

    This slide presentation details the shuttle processing flow, which starts with wheel stop and ends with launch. After landing, the orbiter is rolled into the Orbiter Processing Facility (OPF), where processing is performed; it is then rolled over to the Vehicle Assembly Building (VAB), where it is mated with the propellant tanks and the payloads are installed. A different flow is detailed for cases in which the weather at Kennedy Space Center requires a landing at Dryden.

  17. Transportation of radionuclides in urban environs: draft environmental assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finley, N.C.; Aldrich, D.C.; Daniel, S.L.

    1980-07-01

    This report assesses the environmental consequences of the transportation of radioactive materials in densely populated urban areas, including estimates of the radiological, nonradiological, and social impacts arising from this process. The chapters of the report and the appendices which follow detail the methodology and results for each of four causative event categories: incident-free transport, vehicular accidents, human errors or deviations from accepted quality assurance practices, and sabotage or malevolent acts. The numerical results are expressed in terms of the expected radiological and economic impacts from each. Following these discussions, alternatives to the current transport practice are considered. Then, the detailed analysis is extended from a limited area of New York City to other urban areas. The appendices contain the data bases and specific models used to evaluate these impacts, as well as discussions of chemical toxicity and the social impacts of radioactive material transport in urban areas. The latter are evaluated for each causative event category in terms of psychological, sociological, political, legal, and organizational impacts. The report is followed by an extensive bibliography covering the many fields of study which were required in performing the analysis.

  18. Comparing Anisotropic Output-Based Grid Adaptation Methods by Decomposition

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Loseille, Adrien; Krakos, Joshua A.; Michal, Todd

    2015-01-01

    Anisotropic grid adaptation is examined by decomposing the steps of flow solution, adjoint solution, error estimation, metric construction, and simplex grid adaptation. Multiple implementations of each of these steps are evaluated by comparison to each other and expected analytic results when available. For example, grids are adapted to analytic metric fields and grid measures are computed to illustrate the properties of multiple independent implementations of grid adaptation mechanics. Different implementations of each step in the adaptation process can be evaluated in a system where the other components of the adaptive cycle are fixed. Detailed examination of these properties allows comparison of different methods to identify the current state of the art and where further development should be targeted.
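    In metric-based anisotropic adaptation, a symmetric positive-definite metric tensor M defines edge length as L_M(e) = √(eᵀ M e), and the adaptation mechanics drive every edge toward unit length in the metric. A generic 2-D illustration of that measure (not code from any of the compared solvers):

```python
import math

def metric_edge_length(e, M):
    """Length of edge vector e = (ex, ey) under the 2x2 symmetric
    metric M = [[a, b], [b, c]]: L_M(e) = sqrt(e^T M e)."""
    ex, ey = e
    (a, b), (_, c) = M
    return math.sqrt(a * ex * ex + 2.0 * b * ex * ey + c * ey * ey)

# An isotropic metric requesting spacing h is M = (1/h^2) * Identity,
# so an edge of physical length h has unit length in the metric.
h = 0.1
M = [[1.0 / h**2, 0.0], [0.0, 1.0 / h**2]]
L = metric_edge_length((h, 0.0), M)
```

    Anisotropy enters through unequal eigenvalues of M: stretching the requested spacing along one eigenvector while refining along the other, which is exactly what the analytic metric fields in the comparison exercise.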

  19. An audit of inpatient case records and suggestions for improvements.

    PubMed

    Arshad, A R; Ganesananthan, S; Ajik, S

    2000-09-01

    A study was carried out in Kuala Lumpur Hospital to review the adequacy of documentation of bio-data and clinical data including clinical examination, progress review, discharge process and doctor's identification in ten of our clinical departments. Twenty criteria were assessed in a retrospective manner to scrutinize the contents of medical notes and subsequently two prospective evaluations were conducted to see improvement in case notes documentation. Deficiencies were revealed in all the criteria selected. However there was a statistically significant improvement in the eleven clinical data criteria in the subsequent two evaluations. Illegibility of case note entries and an excessive usage of abbreviations were noted during this audit. All clinical departments and hospitals should carry out detailed studies into the contents of their medical notes.

  20. Love and sex after 60: how to evaluate and treat the impotent older man. A roundtable discussion: Part 2.

    PubMed

    Butler, R N; Lewis, M I; Hoffman, E; Whitehead, E D

    1994-10-01

    In the medical evaluation of older men with erectile dysfunction, obtain a detailed history to determine whether the dysfunction is organic or psychogenic. Determine if there are underlying pathologic processes--most notably vascular diseases--or other factors responsible for the dysfunction, such as medications or nerve or arterial damage from surgery. Lifestyle changes in mid-life (regular exercise, a low-fat diet, and smoking cessation) increase a man's chances of remaining potent as he grows older. Treatments for impotence include injection therapy, vacuum devices, and implants. Each therapy has advantages and disadvantages, and the informed patient plays an important role in choosing the therapy that is right for him.
