Sample records for baseline development process

  1. Improved silicon carbide for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Whalen, Thomas J.; Mangels, J. A.

    1986-01-01

    The development of silicon carbide materials of high strength was initiated and components of complex shape and high reliability were formed. The approach was to adapt a beta-SiC powder and binder system to the injection molding process and to develop procedures and process parameters capable of providing a sintered silicon carbide material with improved properties. The initial effort was to characterize the baseline precursor materials, develop mixing and injection molding procedures for fabricating test bars, and characterize the properties of the sintered materials. Parallel studies of various mixing, dewaxing, and sintering procedures were performed in order to distinguish process routes for improving material properties. A total of 276 modulus-of-rupture (MOR) bars of the baseline material was molded, and 122 bars were fully processed to a sinter density of approximately 95 percent. Fluid mixing techniques were developed which significantly reduced flaw size and improved the strength of the material. Initial MOR tests indicated that the strength of the fluid-mixed material exceeds the baseline property by more than 33 percent.

  2. Baseliner: an open source, interactive tool for processing sap flux data from thermal dissipation probes.

    Treesearch

    Andrew C. Oishi; David Hawthorne; Ram Oren

    2016-01-01

    Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap...

  3. Enhanced human service transportation models : joint demonstration. Phase 1, System planning and design process evaluation : baseline analysis

    DOT National Transportation Integrated Search

    2007-11-13

    This document presents the findings from the baseline phase of the evaluation of the process being used by eight sites to develop a design for a Travel Management Coordination Center (TMCC) for improved coordination of human service transportation wi...

  4. SU-E-T-468: Implementation of the TG-142 QA Process for Seven Linacs with Enhanced Beam Conformance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woollard, J; Ayan, A; DiCostanzo, D

    2015-06-15

    Purpose: To develop a TG-142 compliant QA process for 7 Varian TrueBeam linear accelerators (linacs) with enhanced beam conformance and dosimetrically matched beam models. To ensure consistent performance of all 7 linacs, the QA process should include a common set of baseline values for use in routine QA on all linacs. Methods: The TG-142 report provides recommended tests, tolerances and frequencies for quality assurance of medical accelerators. Based on the guidance provided in the report, measurement tests were developed to evaluate each of the applicable parameters listed for daily, monthly and annual QA. These tests were then performed on each of our 7 new linacs as they came on line at our institution. Results: The tolerance values specified in TG-142 for each QA test are either absolute tolerances (i.e., ±2 mm) or require a comparison to a baseline value. The results of our QA tests were first used to ensure that all 7 linacs were operating within the suggested tolerance values provided in TG-142 for those tests with absolute tolerances and that the performance of the linacs was adequately matched. The QA test results were then used to develop a set of common baseline values for those QA tests that require comparison to a baseline value at routine monthly and annual QA. The procedures and baseline values were incorporated into spreadsheets for use in monthly and annual QA. Conclusion: We have developed a set of procedures for daily, monthly and annual QA of our linacs that are consistent with the TG-142 report. A common set of baseline values was developed for routine QA tests. The use of this common set of baseline values for comparison at monthly and annual QA will ensure consistent performance of all 7 linacs.
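
    As a concrete illustration of the distinction this record draws between absolute tolerances and baseline comparisons, the following minimal Python sketch checks a routine QA measurement either way. The function name, tolerances, and values are illustrative assumptions, not part of the TG-142 report or the authors' spreadsheets.

      # Minimal sketch: a TG-142-style check is either absolute (e.g. ±2 mm)
      # or relative to a common baseline value shared by matched linacs.
      def check_qa(measured, tolerance, baseline=None):
          """Return (passed, deviation) for one QA test result."""
          reference = 0.0 if baseline is None else baseline
          deviation = measured - reference
          return abs(deviation) <= tolerance, deviation

      # Hypothetical examples: field-size agreement (absolute, mm) and output
      # constancy relative to a common baseline of 1.000 cGy/MU (±2%).
      print(check_qa(measured=1.3, tolerance=2.0))
      print(check_qa(measured=1.012, tolerance=0.02, baseline=1.000))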

  5. Baseline process description for simulating plutonium oxide production for precalc project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pike, J. A.

    Savannah River National Laboratory (SRNL) started a multi-year project, the PreCalc Project, to develop a computational simulation of a plutonium oxide (PuO2) production facility with the objective to study the fundamental relationships between morphological and physicochemical properties. This report provides a detailed baseline process description to be used by SRNL personnel and collaborators to facilitate the initial design and construction of the simulation. The PreCalc Project team selected the HB-Line Plutonium Finishing Facility as the basis for a nominal baseline process since the facility is operational and significant model validation data can be obtained. The process boundary as well as process and facility design details necessary for multi-scale, multi-physics models are provided.

  6. Climate Action Planning Process | Climate Neutral Research Campuses | NREL

    Science.gov Websites

    For research campuses, NREL has developed a five-step process to develop and implement climate action plans: determine baseline energy consumption; analyze technology options; prepare a plan and set priorities; implement the climate action plan; measure and ...

  7. Gaining Control and Predictability of Software-Intensive Systems Development and Sustainment

    DTIC Science & Technology

    2015-02-04

    implementation of the baselines, audits, and technical reviews within an overarching systems engineering process (SEP; Defense Acquisition University...warfighters’ needs. This management and metrics effort supplements and supports the system’s technical development through the baselines, audits and...other areas that could be researched and added into the nine-tier model. Areas including software metrics, quality assurance, software-oriented

  8. Reinforced Carbon Carbon (RCC) oxidation resistant material samples - Baseline coated, and baseline coated with tetraethyl orthosilicate (TEOS) impregnation

    NASA Technical Reports Server (NTRS)

    Gantz, E. E.

    1977-01-01

    Reinforced carbon-carbon material specimens were machined from 19 and 33 ply flat panels which were fabricated and processed in accordance with the specifications and procedures accepted for the fabrication and processing of the leading edge structural subsystem (LESS) elements for the space shuttle orbiter. The specimens were then baseline coated and tetraethyl orthosilicate impregnated, as applicable, in accordance with the procedures and requirements of the appropriate LESS production specifications. Three heater bars were ATJ graphite silicon carbide coated with the Vought 'pack cementation' coating process, and three were Stackpole grade 2020 graphite silicon carbide coated with the chemical vapor deposition process utilized by Vought in coating the LESS shell development program entry heater elements. Nondestructive test results are reported.

  9. Development of the Direct Fabrication Process for Plutonium Immobilization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Congdon, J.W.

    2001-07-10

    The current baseline process for fabricating pucks for the Plutonium Immobilization Program includes granulation of the milled feed prior to compaction. A direct fabrication process was demonstrated that eliminates the need for granulation.

  10. Baseline LAW Glass Formulation Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kruger, Albert A.; Mooers, Cavin; Bazemore, Gina

    2013-06-13

    The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

  11. Development of an integrated, zero-G pneumatic transporter/rotating-paddle incinerator/catalytic afterburner subsystem for processing human waste on board spacecraft

    NASA Technical Reports Server (NTRS)

    Fields, S. F.; Labak, L. J.; Honegger, R. J.

    1974-01-01

    A baseline laboratory prototype of an integrated, six-man, zero-g subsystem for processing human wastes onboard spacecraft was investigated; the work included the development of an operational specification for the baseline subsystem, followed by design and fabrication. The program was concluded by performing a series of six tests over a period of two weeks to evaluate the performance of the subsystem. The results of the tests were satisfactory; however, several changes in the design of the subsystem are required before completely satisfactory performance can be achieved.

  12. Development of integrated, zero-G pneumatic transporter/rotating paddle incinerator/catalytic afterburner subsystem for processing human wastes on board spacecraft

    NASA Technical Reports Server (NTRS)

    Fields, S. F.; Labak, L. J.; Honegger, R. J.

    1974-01-01

    A four component system was developed which consists of a particle size reduction mechanism, a pneumatic waste transport system, a rotating-paddle incinerator, and a catalytic afterburner to be integrated into a six-man, zero-g subsystem for processing human wastes on board spacecraft. The study included the development of different concepts or functions, the establishment of operational specifications, and a critical evaluation for each of the four components. A series of laboratory tests was run, and a baseline subsystem design was established. An operational specification was also written in preparation for detailed design and testing of this baseline subsystem.

  13. Kennedy Space Center Launch and Landing Support

    NASA Technical Reports Server (NTRS)

    Wahlberg, Jennifer

    2010-01-01

    The presentation describes Kennedy Space Center (KSC) payload processing, facilities and capabilities, and research development and life science experience. Topics include launch site processing, payload processing, key launch site processing roles, leveraging KSC experience, Space Station Processing Facility and capabilities, Baseline Data Collection Facility, Space Life Sciences Laboratory and capabilities, research payload development, International Space Station research flight hardware, KSC flight payload history, and KSC life science expertise.

  14. Avionics test bed development plan

    NASA Technical Reports Server (NTRS)

    Harris, L. H.; Parks, J. M.; Murdock, C. R.

    1981-01-01

    A development plan for a proposed avionics test bed facility for the early investigation and evaluation of new concepts for the control of large space structures, orbiter attached flex body experiments, and orbiter enhancements is presented. A distributed data processing facility that utilizes the current laboratory resources for the test bed development is outlined. Future studies required for implementation, the management system for project control, and the baseline system configuration are defined. A background analysis of the specific hardware system for the preliminary baseline avionics test bed system is included.

  15. Final Report. Baseline LAW Glass Formulation Testing, VSL-03R3460-1, Rev. 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muller, Isabelle S.; Pegg, Ian L.; Gan, Hao

    2015-06-18

    The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

  16. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Configuration Management Plan

    NASA Technical Reports Server (NTRS)

    Cavanaugh, J.

    1996-01-01

    The purpose of this plan is to identify the baseline to be established during the development life cycle of the integrated AMSU-A, and to define the methods and procedures which Aerojet will follow in the implementation of configuration control for each established baseline. This plan also establishes the Configuration Management process to be used for the deliverable hardware, software, and firmware of the Integrated AMSU-A during development, design, fabrication, test, and delivery.

  17. Marital Conflict, Allostatic Load, and the Development of Children's Fluid Cognitive Performance

    PubMed Central

    Hinnant, J. Benjamin; El-Sheikh, Mona; Keiley, Margaret; Buckhalt, Joseph A.

    2013-01-01

    Relations between marital conflict, children’s respiratory sinus arrhythmia (RSA), and fluid cognitive performance were examined over three years to assess allostatic processes. Participants were 251 children who reported on marital conflict; baseline RSA and RSA reactivity (RSA-R) to a lab challenge were recorded, and fluid cognitive performance was measured using the Woodcock-Johnson III. A cross-lagged model showed that higher levels of marital conflict at age 8 predicted weaker RSA-R at age 9 for children with lower baseline RSA. A growth model showed that lower baseline RSA in conjunction with weaker RSA-R predicted the slowest development of fluid cognitive performance. Findings suggest that stress may affect development of physiological systems regulating attention, which are tied to the development of fluid cognitive performance. PMID:23534537

  18. A New Statistics-Based Online Baseline Restorer for a High Count-Rate Fully Digital System.

    PubMed

    Li, Hongdi; Wang, Chao; Baghaei, Hossain; Zhang, Yuxuan; Ramirez, Rocio; Liu, Shitao; An, Shaohui; Wong, Wai-Hoi

    2010-04-01

    The goal of this work is to develop a novel, accurate, real-time digital baseline restorer using online statistical processing for a high count-rate digital system such as positron emission tomography (PET). In high count-rate nuclear instrumentation applications, analog signals are DC-coupled for better performance. However, the detectors, pre-amplifiers and other front-end electronics cause a signal baseline drift in a DC-coupled system, which degrades energy resolution and positioning accuracy. Event pileups normally exist in a high count-rate system, and the baseline drift will create errors in the event pileup correction. Hence, a baseline restorer (BLR) is required in a high count-rate system to remove the DC drift ahead of the pileup correction. Many BLR methods have been reported, from classic analog circuits to digital filter solutions. However, a single-channel analog BLR can only work below a 500 kcps count rate, and an analog front-end application-specific integrated circuit (ASIC) is normally required for applications involving hundreds of BLRs, such as a PET camera. We have developed a simple statistics-based online baseline restorer (SOBLR) for a high count-rate fully digital system. In this method, we acquire additional samples, excluding the real gamma pulses, from the existing free-running ADC in the digital system, and perform online statistical processing to generate a baseline value. This baseline value is subtracted from the digitized waveform to retrieve the original pulse with zero baseline drift. This method can self-track the baseline without a micro-controller involved. The circuit consists of two digital counter/timers, one comparator, one register and one subtraction unit. Simulations show a single channel operating at a 30 Mcps count rate under pileup conditions. 336 baseline restorer circuits have been implemented in 12 field-programmable gate arrays (FPGAs) for our new fully digital PET system.
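
    The core idea in this record (collect ADC samples that do not belong to gamma pulses, reduce them to a baseline statistic, and subtract it from the waveform) can be sketched in a few lines of Python. This is a simplified software analogue of the counter/comparator circuit; the threshold and window length are illustrative assumptions.

      import numpy as np

      # Simplified sketch of a statistics-based online baseline restorer:
      # samples far from the running median are treated as pulse samples and
      # excluded, the remaining "quiet" samples are averaged over a window,
      # and that estimate is subtracted from the waveform.
      def restore_baseline(waveform, pulse_threshold=20.0, window=256):
          waveform = np.asarray(waveform, dtype=float)
          quiet = waveform[np.abs(waveform - np.median(waveform)) < pulse_threshold]
          baseline = quiet[-window:].mean() if quiet.size else waveform.mean()
          return waveform - baseline, baseline

      # Example: a slowly drifting DC level with two overlapping (piled-up) pulses.
      t = np.arange(2000)
      signal = 50.0 + 0.01 * t
      signal[500:520] += 400.0
      signal[510:530] += 350.0
      restored, estimate = restore_baseline(signal)
      print(round(estimate, 1))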

  19. Baseline glucose level is an individual trait that is negatively associated with lifespan and increases due to adverse environmental conditions during development and adulthood.

    PubMed

    Montoya, Bibiana; Briga, Michael; Jimeno, Blanca; Moonen, Sander; Verhulst, Simon

    2018-05-01

    High baseline glucose levels are associated with pathologies and shorter lifespan in humans, but little is known about causes and consequences of individual variation in glucose levels in other species. We tested to what extent baseline blood glucose level is a repeatable trait in adult zebra finches, and whether glucose levels were associated with age, manipulated environmental conditions during development (rearing brood size) and adulthood (foraging cost), and lifespan. We found that: (1) repeatability of glucose levels was 30%, both within and between years. (2) Having been reared in a large brood and living with higher foraging costs as an adult were independently associated with higher glucose levels. Furthermore, the finding that baseline glucose was low when ambient temperature was high, and foraging costs were low, indicates that glucose is regulated at a lower level when energy turnover is low. (3) Survival probability decreased with increasing baseline glucose. We conclude that baseline glucose is an individual trait negatively associated with survival, and increases due to adverse environmental conditions during development (rearing brood size) and adulthood (foraging cost). Blood glucose may therefore be part of the physiological processes linking environmental conditions to lifespan.

  20. Application of outlier analysis for baseline-free damage diagnosis

    NASA Astrophysics Data System (ADS)

    Kim, Seung Dae; In, Chi Won; Cronin, Kelly E.; Sohn, Hoon; Harries, Kent

    2006-03-01

    As carbon fiber-reinforced polymer (CFRP) laminates have been widely accepted as valuable materials for retrofitting civil infrastructure systems, an appropriate assessment of bonding conditions between host structures and CFRP laminates becomes a critical issue to guarantee the performance of CFRP strengthened structures. This study attempts to develop a continuous performance monitoring system for CFRP strengthened structures by autonomously inspecting the bonding conditions between the CFRP layers and the host structure. The uniqueness of this study is to develop a new concept and theoretical framework of nondestructive testing (NDT), in which debonding is detected "without using past baseline data." The proposed baseline-free damage diagnosis is achieved in two steps. In the first step, features sensitive to debonding of the CFRP layers but insensitive to loading conditions are extracted based on a concept referred to as a time reversal process. This time reversal process allows extracting damage-sensitive features without direct comparison with past baseline data. Then, a statistical damage classifier will be developed in the second step to make a decision regarding the bonding condition of the CFRP layers. The threshold necessary for decision making will be adaptively determined without predetermined threshold values. Monotonic and fatigue load tests of full-scale CFRP strengthened RC beams are conducted to demonstrate the potential of the proposed reference-free debonding monitoring system.
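
    The time reversal idea behind this baseline-free approach can be illustrated with a short Python sketch: an excitation is propagated through a path, the response is time-reversed and propagated back, and the reconstruction is compared with the original input. The clipping "defect" and the correlation-based damage index below are illustrative assumptions, not the authors' exact formulation.

      import numpy as np

      # Toy time reversal process (TRP): for a linear path the input is
      # approximately recovered after the round trip; a nonlinear defect
      # (here crudely modelled by clipping) breaks the time reversibility.
      def propagate(signal, impulse_response, clip=None):
          out = np.convolve(signal, impulse_response)[: signal.size]
          if clip is not None:                 # stand-in for a nonlinear defect
              out = np.clip(out, -clip, clip)
          return out

      def damage_index(input_sig, impulse_response, clip=None):
          forward = propagate(input_sig, impulse_response, clip)
          back = propagate(forward[::-1], impulse_response, clip)[::-1]
          rho = np.dot(input_sig, back) / (np.linalg.norm(input_sig) * np.linalg.norm(back))
          return 1.0 - rho

      t = np.linspace(0.0, 1.0, 400)
      tone_burst = np.sin(2 * np.pi * 20 * t) * np.hanning(t.size)
      h = np.array([1.0, 0.4, 0.1])            # made-up propagation path
      print(damage_index(tone_burst, h))        # intact: index near zero
      print(damage_index(tone_burst, h, clip=0.3))  # "debonded": larger index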

  1. Studying the Accuracy of Software Process Elicitation: The User Articulated Model

    ERIC Educational Resources Information Center

    Crabtree, Carlton A.

    2010-01-01

    Process models are often the basis for demonstrating improvement and compliance in software engineering organizations. A descriptive model is a type of process model describing the human activities in software development that actually occur. The purpose of a descriptive model is to provide a documented baseline for further process improvement…

  2. Configuration management program plan for Hanford site systems engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kellie, C.L.

    This plan establishes the integrated management program for the evolving technical baseline developed through the systems engineering process. This configuration management program aligns with the criteria identified in the DOE Standard, DOE-STD-1073-93. Included are specific requirements for control of the systems engineering RDD-100 database, and electronic data incorporated in the database that establishes the Hanford Site Technical Baseline.

  3. Development of MCAERO wing design panel method with interactive graphics module

    NASA Technical Reports Server (NTRS)

    Hawk, J. D.; Bristow, D. R.

    1984-01-01

    A reliable and efficient iterative method has been developed for designing wing section contours corresponding to a prescribed subcritical pressure distribution. The design process is initialized by using MCAERO (MCAIR 3-D Subsonic Potential Flow Analysis Code) to analyze a baseline configuration. A second program DMCAERO is then used to calculate a matrix containing the partial derivative of potential at each control point with respect to each unknown geometry parameter by applying a first-order expansion to the baseline equations in MCAERO. This matrix is calculated only once but is used in each iteration cycle to calculate the geometry perturbation and to analyze the perturbed geometry. The potential on the new geometry is calculated by linear extrapolation from the baseline solution. This extrapolated potential is converted to velocity by numerical differentiation, and velocity is converted to pressure by using Bernoulli's equation. There is an interactive graphics option which allows the user to graphically display the results of the design process and to interactively change either the geometry or the prescribed pressure distribution.
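
    The iteration summarized in this record can be sketched compactly: a derivative matrix computed once at the baseline is reused in every design cycle, with potential extrapolated linearly, differentiated numerically to velocity, and converted to pressure via Bernoulli's equation. The Python sketch below uses small made-up arrays in place of MCAERO's panel-method quantities, so all names and values are illustrative assumptions rather than the MCAERO/DMCAERO codes.

      import numpy as np

      # Schematic design loop: J = d(potential)/d(geometry) is built once at the
      # baseline and reused each cycle; pressure is recovered from the linearly
      # extrapolated potential via numerical differentiation and Bernoulli.
      rng = np.random.default_rng(0)
      n = 20
      x = np.linspace(0.0, 1.0, n)                         # chordwise control points
      phi0 = np.sin(np.pi * x)                             # baseline potential (made up)
      J = np.eye(n) + 0.05 * rng.standard_normal((n, n))   # d(phi)/d(geometry), fixed

      def pressure(phi):
          u = 1.0 + np.gradient(phi, x)                    # freestream + perturbation velocity
          return 1.0 - u ** 2                              # Bernoulli: Cp = 1 - (u/U_inf)^2

      # Prescribed pressure distribution, constructed to be attainable in this toy model.
      g_true = 0.1 * rng.standard_normal(n)
      cp_target = pressure(phi0 + J @ g_true)

      g = np.zeros(n)                                      # unknown geometry perturbation
      for cycle in range(10):
          phi = phi0 + J @ g                               # linear extrapolation from baseline
          residual = cp_target - pressure(phi)
          eps = 1e-6                                       # cheap sensitivity reusing fixed J
          dcp_dg = np.column_stack([
              (pressure(phi0 + J @ (g + eps * e)) - pressure(phi)) / eps
              for e in np.eye(n)])
          g += np.linalg.lstsq(dcp_dg, residual, rcond=None)[0]
      print(float(np.max(np.abs(cp_target - pressure(phi0 + J @ g)))))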

  4. Configuration management program plan for Hanford site systems engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, A.G.

    This plan establishes the integrated configuration management program for the evolving technical baseline developed through the systems engineering process. This configuration management program aligns with the criteria identified in the DOE Standard, DOE-STD-1073-93. Included are specific requirements for control of the systems engineering RDD-100 database, and electronic data incorporated in the database that establishes the Hanford site technical baseline.

  5. Development of a S/w System for Relative Positioning Using GPS Carrier Phase

    NASA Astrophysics Data System (ADS)

    Ahn, Yong-Won; Kim, Chun-Hwey; Park, Pil-Ho; Park, Jong-Uk; Jo, Jeong-Ho

    1997-12-01

    We developed a GPS phase data processing S/W system which calculates baseline vectors and distances between two points located on the surface of the Earth. For this development, a Double-Difference method and L1 carrier phase data from GPS (Global Positioning System) were used. This S/W system consists of four main parts: satellite position calculation, Single-Difference equation, Double-Difference equation, and correlation. To verify our S/W, we fixed KAO (N36.37, E127.37, H77.61 m), one of the International GPS Service for Geodynamics stations, which is located at Tae-Jon, and we measured baseline vectors and relative distances with data from observations at approximate baseline distances of 2.7, 42.1, 81.1, and 146.6 km. Then we compared the vectors and distances with the data which we obtained from the GPSurvey S/W system, with the L1/L2 ION-Free method and broadcast ephemeris. From the comparison of the vectors and distances with the data from the GPSurvey S/W system, we found the baseline vectors X, Y, Z and the baseline distances matched to within 50 cm and 10 cm, respectively.
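
    As a compact reminder of the double-difference observable that underpins the processing described above, the Python sketch below forms single and double differences from simulated carrier-phase measurements. The geometry, clock terms, and noise levels are made-up assumptions, not values from the paper.

      import numpy as np

      # Double differencing: single differences between two receivers cancel the
      # satellite clock error, and differencing across two satellites then cancels
      # the receiver clocks, leaving mostly geometry (and integer ambiguities).
      rng = np.random.default_rng(1)

      def phase(range_m, sat_clock, rcv_clock, noise=0.002):
          """Simulated carrier-phase observation, expressed in metres."""
          return range_m + sat_clock + rcv_clock + rng.normal(0.0, noise)

      # Hypothetical ranges (m) from receivers A, B to satellites i, j.
      rho = {("A", "i"): 20_000_100.0, ("B", "i"): 20_000_150.0,
             ("A", "j"): 21_500_300.0, ("B", "j"): 21_500_280.0}
      sat_clk = {"i": 35.0, "j": -12.0}      # satellite clock errors (m)
      rcv_clk = {"A": 4.0, "B": -7.0}        # receiver clock errors (m)

      obs = {k: phase(rho[k], sat_clk[k[1]], rcv_clk[k[0]]) for k in rho}
      single_diff = {s: obs[("A", s)] - obs[("B", s)] for s in ("i", "j")}
      double_diff = single_diff["i"] - single_diff["j"]
      geometry_dd = (rho[("A", "i")] - rho[("B", "i")]) - (rho[("A", "j")] - rho[("B", "j")])
      print(round(double_diff - geometry_dd, 3))   # clock-free residual ~ noise level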

  6. Site systems engineering fiscal year 1999 multi-year work plan (MYWP) update for WBS 1.8.2.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    GRYGIEL, M.L.

    1998-10-08

    Manage the Site Systems Engineering process to provide a traceable, integrated, requirements-driven, and technically defensible baseline. Through the Site Integration Group (SIG), Systems Engineering ensures integration of technical activities across all site projects. Systems Engineering's primary interfaces are with the RL Project Managers, the Project Direction Office and with the Project Major Subcontractors, as well as with the Site Planning organization. Systems Implementation: (1) Develops, maintains, and controls the site integrated technical baseline, ensures the Systems Engineering interfaces between projects are documented, and maintains the Site Environmental Management Specification. (2) Develops and uses dynamic simulation models for verification of the baseline and analysis of alternatives. (3) Performs and documents functional and requirements analyses. (4) Works with projects, technology management, and the SIG to identify and resolve technical issues. (5) Supports technical baseline information for the planning and budgeting of the Accelerated Cleanup Plan, Multi-Year Work Plans, and Project Baseline Summaries, as well as performance measure reporting. (6) Works with projects to ensure the quality of data in the technical baseline. (7) Develops, maintains and implements the site configuration management system.

  7. Evaluation of ERDA-sponsored coal feed system development

    NASA Technical Reports Server (NTRS)

    Phen, R. L.; Luckow, W. K.; Mattson, L.; Otth, D.; Tsou, P.

    1977-01-01

    Coal feeders were evaluated based upon criteria such as technical feasibility, performance (i.e. ability to meet process requirements), projected life cycle costs, and projected development cost. An initial set of feeders was selected based on the feeders' cost savings potential compared with baseline lockhopper systems. Additional feeders were considered for selection based on: (1) increasing the probability of successful feeder development; (2) application to specific processes; and (3) technical merit. A coal feeder development program is outlined.

  8. Application of time-reversal guided waves to field bridge testing for baseline-free damage diagnosis

    NASA Astrophysics Data System (ADS)

    Kim, S. B.; Sohn, H.

    2006-03-01

    There is ongoing research at Carnegie Mellon University to develop a "baseline-free" nondestructive evaluation technique. The uniqueness of this baseline-free diagnosis lies in the fact that certain types of damage can be identified without direct comparison of test signals with previously stored baseline signals. By relaxing dependency on the past baseline data, false positive indications of damage, which might take place due to varying operational and environmental conditions of in-service structures, can be minimized. This baseline-free diagnosis technique is developed based on the concept of a time reversal process (TRP). According to the TRP, an input signal at an original excitation location can be reconstructed if a response signal obtained from another point is emitted back to the original point after being reversed in the time domain. Damage diagnosis rests on the premise that the time reversibility breaks down when a certain type of defect such as nonlinear damage exists along the wave propagation path. Then, the defect can be sensed by examining a reconstructed signal after the TRP. In this paper, the feasibility of the proposed NDT technique is investigated using actual test data obtained from the Buffalo Creek Bridge in Pennsylvania.

  9. Artificial intelligence (AI) based tactical guidance for fighter aircraft

    NASA Technical Reports Server (NTRS)

    Mcmanus, John W.; Goodrich, Kenneth H.

    1990-01-01

    A research program investigating the use of artificial intelligence (AI) techniques to aid in the development of a Tactical Decision Generator (TDG) for Within Visual Range air combat engagements is discussed. The application of AI programming and problem solving methods in the development and implementation of the Computerized Logic For Air-to-Air Warfare Simulations (CLAWS), a second generation TDG, is presented. The knowledge-based systems used by CLAWS to aid in the tactical decision-making process are outlined in detail, and the results of tests to evaluate the performance of CLAWS versus a baseline TDG developed in FORTRAN to run in real time in the Langley Differential Maneuvering Simulator, are presented. To date, these test results have shown significant performance gains with respect to the TDG baseline in one-versus-one air combat engagements, and the AI-based TDG software has proven to be much easier to modify and maintain than the baseline FORTRAN TDG programs.

  10. The association between immune activation and manic symptoms in patients with a depressive disorder.

    PubMed

    Becking, K; Boschloo, L; Vogelzangs, N; Haarman, B C M; Riemersma-van der Lek, R; Penninx, B W J H; Schoevers, R A

    2013-10-22

    Although recent studies have shown that immunological processes play an important role in the pathophysiology of mood disorders, immune activation may only be present in specific subgroups of patients. Our study aimed to examine whether immune activation was associated with (a) the presence of manic symptoms and (b) the onset of manic symptoms during 2 years of follow-up in depressed patients. Patients with a depressive disorder at baseline (N=957) and healthy controls (N=430) were selected from the Netherlands Study of Depression and Anxiety. Assessments included lifetime manic symptoms at baseline and two-year follow up, as well as C-reactive protein (CRP), interleukin-6 (IL-6) and tumor necrosis factor alpha (TNF-α) at baseline. Within depressed patients, immune activation was not related to the presence or absence of lifetime manic symptoms at baseline. However, CRP levels were strongly elevated in depressed men who developed manic symptoms compared with those who did not develop manic symptoms over 2 years (P<0.001, Cohen's d=0.89). IL-6 and TNF-α were also higher in depressed men with an onset of manic symptoms, but this association was not significant. However, we found that the onset of manic symptoms was particularly high in men with multiple elevated levels of inflammatory markers. Depressed men who developed manic symptoms during follow-up had increased immunological activity (especially CRP) compared with depressed men who did not develop manic symptoms. Further research should explore whether a treatment approach focusing on inflammatory processes may be more effective in this specific subgroup of depressed patients.

  11. Master Plan for Data Processing Services, 1988-1993. Report.

    ERIC Educational Resources Information Center

    Connecticut Regional Community Colleges, Hartford. Board of Trustees.

    Developed in accordance with a legislative mandate, this master plan for Connecticut's regional community colleges provides baseline data on the current status of data processing; identifies issues, trends, and constraints; and sets forth specific plans for computing activities and services within the community college system. Introductory…

  12. Linear MALDI-ToF simultaneous spectrum deconvolution and baseline removal.

    PubMed

    Picaud, Vincent; Giovannelli, Jean-Francois; Truntzer, Caroline; Charrier, Jean-Philippe; Giremus, Audrey; Grangeat, Pierre; Mercier, Catherine

    2018-04-05

    Thanks to a reasonable cost and simple sample preparation procedure, linear MALDI-ToF spectrometry is a growing technology for clinical microbiology. With appropriate spectrum databases, this technology can be used for early identification of pathogens in body fluids. However, due to the low resolution of linear MALDI-ToF instruments, robust and accurate peak picking remains a challenging task. In this context, we propose a new peak extraction algorithm from the raw spectrum. With this method the spectrum baseline and spectrum peaks are processed jointly. The approach relies on an additive model consisting of a smooth baseline part plus a sparse peak list convolved with a known peak shape. The model is then fitted under a Gaussian noise model. The proposed method is well suited to processing low-resolution spectra with important baseline and unresolved peaks. We developed a new peak deconvolution procedure. The paper describes the method derivation and discusses some of its interpretations. The algorithm is then described in a pseudo-code form where the required optimization procedure is detailed. For synthetic data the method is compared to a more conventional approach. The new method reduces artifacts caused by the usual two-step procedure of baseline removal followed by peak extraction. Finally, some results on real linear MALDI-ToF spectra are provided. We introduced a new method for peak picking, where peak deconvolution and baseline computation are performed jointly. On simulated data we showed that this global approach performs better than a classical one where baseline and peaks are processed sequentially. A dedicated experiment was conducted on real spectra. In this study a collection of spectra of spiked proteins was acquired and then analyzed. Better performance of the proposed method, in terms of accuracy and reproducibility, has been observed and validated by an extended statistical analysis.
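
    The joint model described above (smooth baseline plus sparse peaks convolved with a known peak shape, fitted under Gaussian noise) can be illustrated with a small quadratic fit in Python. To keep the sketch short, it substitutes a ridge penalty for the paper's sparsity prior and uses synthetic data, so it shows the joint-estimation idea rather than the authors' algorithm.

      import numpy as np

      # Joint baseline + peak fit on a synthetic spectrum: y = baseline + K @ peaks + noise,
      # with a second-difference smoothness penalty on the baseline and a ridge penalty
      # (standing in for sparsity) on the peak amplitudes; one linear solve estimates both.
      rng = np.random.default_rng(0)
      n = 200
      x = np.arange(n)
      sigma = 2.0                                           # known peak width (channels)

      def gaussian_peak(center):
          return np.exp(-0.5 * ((x - center) / sigma) ** 2)

      K = np.column_stack([gaussian_peak(i) for i in range(n)])   # convolution matrix

      true_baseline = 5.0 + 3.0 * np.sin(x / 40.0)
      true_peaks = np.zeros(n)
      true_peaks[[50, 90, 140]] = [8.0, 5.0, 10.0]
      y = true_baseline + K @ true_peaks + rng.normal(0.0, 0.3, n)

      D2 = np.diff(np.eye(n), n=2, axis=0)                  # second-difference operator
      lam, mu = 1e4, 1e-1                                   # smoothness / ridge weights
      A = np.hstack([np.eye(n), K])                         # y ≈ A @ [baseline; peaks]
      P = np.block([[lam * D2.T @ D2, np.zeros((n, n))],
                    [np.zeros((n, n)), mu * np.eye(n)]])
      z = np.linalg.solve(A.T @ A + P, A.T @ y)
      baseline_hat, peaks_hat = z[:n], z[n:]
      print(int(np.argmax(peaks_hat)), round(float(np.max(peaks_hat)), 2))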

  13. Clinical governance is "ACE"--using the EFQM excellence model to support baseline assessment.

    PubMed

    Holland, K; Fennell, S

    2000-01-01

    The introduction of clinical governance in the "new NHS" means that National Health Service (NHS) organisations are now accountable for the quality of the services they provide to their local communities. As part of the implementation of clinical governance in the NHS, Trusts and health authorities had to complete a baseline assessment of their capability and capacity by September 1999. Describes one Trust's approach to developing and implementing its baseline assessment tool, based upon its existing use of the European Foundation for Quality Management (EFQM) Excellence Model. An initial review of the process suggests that the model provides an adaptable framework for the development of a comprehensive and practical assessment tool and that self-assessment ensures ownership of action plans at service level.

  14. Modular space station, phase B extension. Information management advanced development. Volume 4: Data processing assembly

    NASA Technical Reports Server (NTRS)

    Gerber, C. R.

    1972-01-01

    The computation and logical functions which are performed by the data processing assembly of the modular space station are defined. The subjects discussed are: (1) requirements analysis, (2) baseline data processing assembly configuration, (3) information flow study, (4) throughput simulation, (5) redundancy study, (6) memory studies, and (7) design requirements specification.

  15. Improved silicon carbide for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Whalen, T. J.; Winterbottom, W. L.

    1986-01-01

    Work performed to develop silicon carbide materials of high strength and to form components of complex shape and high reliability is described. A beta-SiC powder and binder system was adapted to the injection molding process, and procedures and process parameters were developed capable of providing a sintered silicon carbide material with improved properties. The initial effort has been to characterize the baseline precursor materials (beta silicon carbide powder and boron and carbon sintering aids), develop mixing and injection molding procedures for fabricating test bars, and characterize the properties of the sintered materials. Parallel studies of various mixing, dewaxing, and sintering procedures have been carried out in order to distinguish process routes for improving material properties. A total of 276 MOR bars of the baseline material have been molded, and 122 bars have been fully processed to a sinter density of approximately 95 percent. The material has a mean MOR room temperature strength of 43.31 ksi (299 MPa), a Weibull characteristic strength of 45.8 ksi (315 MPa), and a Weibull modulus of 8.0. Mean values of the MOR strengths at 1000, 1200, and 1400 C are 41.4, 43.2, and 47.2 ksi, respectively. Strength controlling flaws in this material were found to consist of regions of high porosity and were attributed to agglomerates originating in the initial mixing procedures. The mean stress rupture life at 1400 C of five samples tested at 172 MPa (25 ksi) stress was 62 hours and at 207 MPa (30 ksi) stress was 14 hours. New fluid mixing techniques have been developed which significantly reduce flaw size and improve the strength of the material. Initial MOR tests indicate the strength of the fluid-mixed material exceeds the baseline property by more than 33 percent.

  16. Exploring the success of an integrated primary care partnership: a longitudinal study of collaboration processes.

    PubMed

    Valentijn, Pim P; Vrijhoef, Hubertus J M; Ruwaard, Dirk; de Bont, Antoinette; Arends, Rosa Y; Bruijnzeels, Marc A

    2015-01-22

    Forming partnerships is a prominent strategy used to promote integrated service delivery across health and social service systems. Evidence about the collaboration process upon which partnerships evolve has rarely been addressed in an integrated-care setting. This study explores the longitudinal relationship of the collaboration process and the influence on the final perceived success of a partnership in such a setting. The collaboration process through which partnerships evolve is based on a conceptual framework which identifies five themes: shared ambition, interests and mutual gains, relationship dynamics, organisational dynamics and process management. Fifty-nine out of 69 partnerships from a national programme in the Netherlands participated in this survey study. At baseline, 338 steering committee members responded, and they returned 320 questionnaires at follow-up. Multiple-regression-analyses were conducted to explore the relationship between the baseline as well as the change in the collaboration process and the final success of the partnerships. Mutual gains and process management were the most significant baseline predictors for the final success of the partnership. A positive change in the relationship dynamics had a significant effect on the final success of a partnership. Insight into the collaboration process of integrated primary care partnerships offers a potentially powerful way of predicting their success. Our findings underscore the importance of monitoring the collaboration process during the development of the partnerships in order to achieve their full collaborative advantage.

  17. Single baseline GLONASS observations with VLBI: data processing and first results

    NASA Astrophysics Data System (ADS)

    Tornatore, V.; Haas, R.; Duev, D.; Pogrebenko, S.; Casey, S.; Molera Calvés, G.; Keimpema, A.

    2011-07-01

    Several tests to observe signals transmitted by GLONASS (GLObal NAvigation Satellite System) satellites have been performed using the geodetic VLBI (Very Long Baseline Interferometry) technique. The radio telescopes involved in these experiments were Medicina (Italy) and Onsala (Sweden), both equipped with L-band receivers. Observations at the stations were performed using the standard Mark4 VLBI data acquisition rack and Mark5A disk-based recorders. The goals of the observations were to develop and test the scheduling, signal acquisition and processing routines to verify the full tracking pipeline, foreseeing the cross-correlation of the recorded data on the baseline Onsala-Medicina. The natural radio source 3C286 was used as a calibrator before the start of the satellite observation sessions. Delay models, including the tropospheric and ionospheric corrections, which are consistent for both far- and near-field sources, are under development. Correlation of the calibrator signal has been performed using the DiFX software, while the satellite signals have been processed using the narrow-band approach with the Metsaehovi software and analysed with a near-field delay model. Delay models for both the calibrator signals and the satellite signals, using the same geometrical, tropospheric and ionospheric models, are under investigation to make a correlation of the satellite signals possible.

  18. Semiparametric temporal process regression of survival-out-of-hospital.

    PubMed

    Zhan, Tianyu; Schaubel, Douglas E

    2018-05-23

    The recurrent/terminal event data structure has undergone considerable methodological development in the last 10-15 years. An example of the data structure that has arisen with increasing frequency involves the recurrent event being hospitalization and the terminal event being death. We consider the response Survival-Out-of-Hospital, defined as a temporal process (indicator function) taking the value 1 when the subject is currently alive and not hospitalized, and 0 otherwise. Survival-Out-of-Hospital is a useful alternative strategy for the analysis of hospitalization/survival in the chronic disease setting, with the response variate representing a refinement to survival time through the incorporation of an objective quality-of-life component. The semiparametric model we consider assumes multiplicative covariate effects and leaves unspecified the baseline probability of being alive-and-out-of-hospital. Using zero-mean estimating equations, the proposed regression parameter estimator can be computed without estimating the unspecified baseline probability process, although baseline probabilities can subsequently be estimated for any time point within the support of the censoring distribution. We demonstrate that the regression parameter estimator is asymptotically normal, and that the baseline probability function estimator converges to a Gaussian process. Simulation studies are performed to show that our estimating procedures have satisfactory finite sample performances. The proposed methods are applied to the Dialysis Outcomes and Practice Patterns Study (DOPPS), an international end-stage renal disease study.

  19. Best practices for budget-based design.

    DOT National Transportation Integrated Search

    2017-03-01

    State Departments of Transportation (State DOTs) encounter difficulties in establishing feasible and reliable project budgets early in project development. The lack of a systematic process for establishing a baseline budget with the consideratio...

  20. Development of measures to evaluate youth advocacy for obesity prevention.

    PubMed

    Millstein, Rachel A; Woodruff, Susan I; Linton, Leslie S; Edwards, Christine C; Sallis, James F

    2016-07-26

    Youth advocacy has been successfully used in substance use prevention but is a novel strategy in obesity prevention. As a precondition for building an evidence base for youth advocacy for obesity prevention, the present study aimed to develop and evaluate measures of youth advocacy mediator, process, and outcome variables. The Youth Engagement and Action for Health (YEAH!) program (San Diego County, CA) engaged youth and adult group leaders in advocacy for school and neighborhood improvements to nutrition and physical activity environments. Based on a model of youth advocacy, scales were developed to assess mediators, intervention processes, and proximal outcomes of youth advocacy for obesity prevention. Youth (baseline n = 136) and adult group leaders (baseline n = 47) completed surveys before and after advocacy projects. With baseline data, we created youth advocacy and adult leadership subscales using confirmatory factor analysis (CFA) and described their psychometric properties. Youth came from 21 groups, were ages 9-22, and most were female. Most youth were non-White, and the largest ethnic group was Hispanic/Latino (35.6%). The proposed factor structure held for most (14/20 youth and 1/2 adult) subscales. Modifications were necessary for 6 of the originally proposed 20 youth and 1 of the 2 adult multi-item subscales, which involved splitting larger subscales into two components and dropping low-performing items. Internally consistent scales to assess mediators, intervention processes, and proximal outcomes of youth advocacy for obesity prevention were developed. The resulting scales can be used in future studies to evaluate youth advocacy programs.

  1. Baseline Maritime Aerosol: Methodology to Derive the Optical Thickness and Scattering Properties

    NASA Technical Reports Server (NTRS)

    Kaufman, Yoram J.; Smirnov, Alexander; Holben, Brent N.; Dubovik, Oleg; Einaudi, Franco (Technical Monitor)

    2001-01-01

    Satellite measurements of the global distribution of aerosol and their effect on climate should be viewed with respect to a baseline aerosol. In this concept, the concentration of fine mode aerosol particles is elevated above the baseline by man-made activities (smoke or urban pollution), while the coarse mode is elevated by natural processes (e.g., dust or sea-spray). Using 1-3 years of measurements at 10 stations of the Aerosol Robotic Network (AERONET), we develop a methodology and derive the optical thickness and properties of this baseline aerosol for the Pacific and Atlantic Oceans. Defined as the median for periods of stable optical thickness (standard deviation < 0.02) during 2-6 days, the median baseline aerosol optical thickness over the Pacific Ocean is 0.052 at 500 nm with an Angstrom exponent of 0.77, and 0.071 and 1.1, respectively, over the Atlantic Ocean.
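
    A small Python sketch of the selection rule described in this record: screen an optical-thickness time series for stable multi-day windows (standard deviation below 0.02) and take the median over those periods as the baseline value. The window length, sampling, and data below are illustrative assumptions.

      import numpy as np

      # Baseline aerosol optical thickness as the median over "stable" periods,
      # i.e. windows of a few days whose standard deviation stays below 0.02.
      def baseline_aot(daily_aot, window_days=3, max_std=0.02):
          daily_aot = np.asarray(daily_aot, dtype=float)
          stable_values = []
          for start in range(daily_aot.size - window_days + 1):
              window = daily_aot[start:start + window_days]
              if np.std(window) < max_std:
                  stable_values.extend(window)
          return np.median(stable_values) if stable_values else np.nan

      # Made-up daily AOT series: a clean marine background with pollution spikes.
      rng = np.random.default_rng(2)
      series = 0.05 + rng.normal(0.0, 0.005, 120)
      series[[40, 41, 80]] += 0.3
      print(round(float(baseline_aot(series)), 3))   # close to the 0.05 background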

  2. Information processing biases concurrently and prospectively predict depressive symptoms in adolescents: Evidence from a self-referent encoding task.

    PubMed

    Connolly, Samantha L; Abramson, Lyn Y; Alloy, Lauren B

    2016-01-01

    Negative information processing biases have been hypothesised to serve as precursors for the development of depression. The current study examined negative self-referent information processing and depressive symptoms in a community sample of adolescents (N = 291, Mage at baseline = 12.34 ± 0.61, 53% female, 47.4% African-American, 49.5% Caucasian and 3.1% Biracial). Participants completed a computerised self-referent encoding task (SRET) and a measure of depressive symptoms at baseline and completed an additional measure of depressive symptoms nine months later. Several negative information processing biases on the SRET were associated with concurrent depressive symptoms and predicted increases in depressive symptoms at follow-up. Findings partially support the hypothesis that negative information processing biases are associated with depressive symptoms in a nonclinical sample of adolescents, and provide preliminary evidence that these biases prospectively predict increases in depressive symptoms.

  3. EVALUATION OF ALTERNATIVE STRONTIUM AND TRANSURANIC SEPARATION PROCESSES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SMALLEY CS

    2011-04-25

    In order to meet contract requirements on the concentrations of strontium-90 and transuranic isotopes in the immobilized low-activity waste, strontium-90 and transuranics must be removed from the supernate of tanks 241-AN-102 and 241-AN-107. The process currently proposed for this application is an in-tank precipitation process using strontium nitrate and sodium permanganate. Development work on the process has not proceeded since 2005. The purpose of the evaluation is to identify whether any promising alternative processes have been developed since this issue was last examined, evaluate the alternatives and the baseline process, and recommend which process should be carried forward.

  4. Advanced optical sensing and processing technologies for the distributed control of large flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Williams, G. M.; Fraser, J. C.

    1991-01-01

    The objective was to examine state-of-the-art optical sensing and processing technology applied to control the motion of flexible spacecraft. Proposed large flexible space systems, such as optical telescopes and antennas, will require control over vast surfaces. Most likely, distributed control will be necessary, involving many sensors to accurately measure the surface. A similarly large number of actuators must act upon the system. The technical approach included reviewing proposed NASA missions to assess system needs and requirements. A candidate mission was chosen as a baseline study spacecraft for comparison of conventional and optical control components. Control system requirements of the baseline system were used for designing both a control system containing current off-the-shelf components and a system utilizing electro-optical devices for sensing and processing. State-of-the-art surveys of conventional sensor, actuator, and processor technologies were performed. A technology development plan is presented that lays out a logical, effective way to develop and integrate advancing technologies.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clarke, Kester Diederik

    The intent of this report is to document a procedure used at LANL for HIP bonding aluminum cladding to U-10Mo fuel foils using a formed HIP can for the Domestic Reactor Conversion program in the NNSA Office of Material Management and Minimization, and to provide some details that may not have been published elsewhere. The HIP process is based on the procedures that have been used to develop the formed HIP can process, including the baseline process developed at Idaho National Laboratory (INL). The HIP bonding cladding process development is summarized in the listed references. Further iterations with Babcock & Wilcox (B&W) to refine the process to meet production and facility requirements are expected.

  6. An Automatic Baseline Regulation in a Highly Integrated Receiver Chip for JUNO

    NASA Astrophysics Data System (ADS)

    Muralidharan, P.; Zambanini, A.; Karagounis, M.; Grewing, C.; Liebau, D.; Nielinger, D.; Robens, M.; Kruth, A.; Peters, C.; Parkalian, N.; Yegin, U.; van Waasen, S.

    2017-09-01

    This paper describes the data processing unit and an automatic baseline regulation of a highly integrated readout chip (Vulcan) for JUNO. The chip collects data continuously at 1 Gsample/s. The primary data processing performed in the integrated circuit can help reduce the memory and data processing effort in the subsequent stages. In addition, a baseline regulator compensating for a shift in the baseline is described.

  7. Space Station Mission Planning System (MPS) development study. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Klus, W. J.

    1987-01-01

    The basic objective of the Space Station (SS) Mission Planning System (MPS) Development Study was to define a baseline Space Station mission plan and the associated hardware and software requirements for the system. A detailed definition of the Spacelab (SL) payload mission planning process and SL Mission Integration Planning System (MIPS) software was derived. A baseline concept was developed for performing SS manned base payload mission planning, and it was consistent with current Space Station design/operations concepts and philosophies. The SS MPS software requirements were defined. Also, requirements for new software include candidate programs for the application of artificial intelligence techniques to capture and make more effective use of mission planning expertise. A SS MPS Software Development Plan was developed which phases efforts for the development software to implement the SS mission planning concept.

  8. The life cycles of six multi-center adaptive clinical trials focused on neurological emergencies developed for the Advancing Regulatory Science initiative of the National Institutes of Health and US Food and Drug Administration: Case studies from the Adaptive Designs Accelerating Promising Treatments Into Trials Project.

    PubMed

    Guetterman, Timothy C; Fetters, Michael D; Mawocha, Samkeliso; Legocki, Laurie J; Barsan, William G; Lewis, Roger J; Berry, Donald A; Meurer, William J

    2017-01-01

    Clinical trials are complicated, expensive, time-consuming, and frequently do not lead to discoveries that improve the health of patients with disease. Adaptive clinical trials have emerged as a methodology to provide more flexibility in design elements to better answer scientific questions regarding whether new treatments are efficacious. Limited observational data exist that describe the complex process of designing adaptive clinical trials. To address these issues, the Adaptive Designs Accelerating Promising Treatments Into Trials project developed six tailored, flexible, adaptive phase-III clinical trials for neurological emergencies, and investigators prospectively monitored and observed the processes. The objective of this work is to describe the adaptive design development process, the final design, and the current status of the adaptive trial designs that were developed. To observe and reflect upon the trial development process, we employed a rich, mixed methods evaluation that combined quantitative data from a visual analog scale to assess attitudes about adaptive trials, along with in-depth qualitative data about the development process gathered from observations. The Adaptive Designs Accelerating Promising Treatments Into Trials team developed six adaptive clinical trial designs. Across the six designs, 53 attitude surveys were completed at baseline and after the trial planning process was completed. Compared to baseline, the participants believed significantly more strongly that the adaptive designs would be accepted by National Institutes of Health review panels and non-researcher clinicians. In addition, after the trial planning process, the participants more strongly believed that the adaptive design would meet the scientific and medical goals of the studies. Introducing the adaptive design at early conceptualization proved critical to successful adoption and implementation of that trial. Involving key stakeholders from several scientific domains early in the process appears to be associated with improved attitudes towards adaptive designs over the life cycle of clinical trial development.

  9. A demonstration of real-time connected element interferometry for spacecraft navigation

    NASA Technical Reports Server (NTRS)

    Edwards, C.; Rogstad, D.; Fort, D.; White, L.; Iijima, B.

    1992-01-01

    Connected element interferometry is a technique of observing a celestial radio source at two spatially separated antennas, and then interfering the received signals to extract the relative phase of the signal at the two antennas. The high precision of the resulting phase delay data type can provide an accurate determination of the angular position of the radio source relative to the baseline vector between the two stations. A connected element interferometer was developed on a 21-km baseline between two antennas at the Deep Space Network's Goldstone, CA, tracking complex. Fiber optic links are used to transmit the data at 112 Mbit/sec to a common site for processing. A real-time correlator was implemented to process these data. The architecture of the system is described, and observational data are presented to characterize the potential performance of such a system. The real-time processing capability offers potential advantages in terms of increased reliability and improved delivery of navigational data for time-critical operations. Angular accuracies of 50-100 nrad are achievable on this baseline.
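
    The relation between delay precision and angular accuracy quoted in this record can be checked with a simple geometry argument: for a baseline of length B, a small change dθ in source direction changes the geometric delay by roughly (B/c)·dθ, so σθ ≈ c·στ/B. The delay precisions below are illustrative assumptions consistent with the stated 21-km baseline.

      # Rough interferometric sensitivity check: sigma_theta ≈ c * sigma_tau / B.
      C = 299_792_458.0          # speed of light, m/s
      BASELINE_M = 21_000.0      # Goldstone connected-element baseline, m

      def angular_accuracy_nrad(delay_precision_s, baseline_m=BASELINE_M):
          return C * delay_precision_s / baseline_m * 1e9

      # A few-picosecond phase-delay precision maps to tens of nanoradians,
      # consistent with the 50-100 nrad figure quoted in the abstract.
      for sigma_tau in (3e-12, 5e-12, 7e-12):
          print(f"{sigma_tau * 1e12:.0f} ps -> {angular_accuracy_nrad(sigma_tau):.0f} nrad")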

  10. The goldstone real-time connected element interferometer

    NASA Technical Reports Server (NTRS)

    Edwards, C., Jr.; Rogstad, D.; Fort, D.; White, L.; Iijima, B.

    1992-01-01

    Connected element interferometry (CEI) is a technique of observing a celestial radio source at two spatially separated antennas and then interfering the received signals to extract the relative phase of the signal at the two antennas. The high precision of the resulting phase delay data type can provide an accurate determination of the angular position of the radio source relative to the baseline vector between the two stations. This article describes a recently developed connected element interferometer on a 21-km baseline between two antennas at the Deep Space Network's Goldstone, California, tracking complex. Fiber-optic links are used to transmit the data to a common site for processing. The system incorporates a real-time correlator to process these data in real time. The architecture of the system is described, and observational data are presented to characterize the potential performance of such a system. The real-time processing capability offers potential advantages in terms of increased reliability and improved delivery of navigational data for time-critical operations. Angular accuracies of 50-100 nrad are achievable on this baseline.

  11. Significant volume reduction of tank waste by selective crystallization: 1994 Annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herting, D.L.; Lunsford, T.R.

    1994-09-27

    The objective of this technology task plan is to develop and demonstrate a scalable process for reclaiming sodium nitrate (NaNO{sub 3}) from Hanford waste tanks as a clean nonradioactive salt. The purpose of the so-called Clean Salt Process is to reduce the volume of low level waste glass by as much as 70%. During the reporting period of October 1, 1993, through May 31, 1994, progress was made on four fronts: laboratory studies, surrogate waste compositions, contracting for university research, and flowsheet development and modeling. In the laboratory, experiments with simulated waste were done to explore the effects of crystallization parameters on the size and crystal habit of product NaNO{sub 3} crystals. Data were obtained to allow prediction of the decontamination factor as a function of solid/liquid separation parameters. Experiments with actual waste from tank 101-SY were done to determine the extent of contaminant occlusions in NaNO{sub 3} crystals. In preparation for defining surrogate waste compositions, single shell tanks were categorized according to the weight percent NaNO{sub 3} in each tank. A detailed process flowsheet and computer model were created using the ASPENPlus steady state process simulator. This is the same program being used by the Tank Waste Remediation System (TWRS) program for their waste pretreatment and disposal projections. Therefore, evaluations can be made of the effect of the Clean Salt Process on the low level waste volume and composition resulting from the TWRS baseline flowsheet. Calculations, using the same assumptions as used for the TWRS baseline where applicable, indicate that the number of low level glass vaults would be reduced from 44 to 16 if the Clean Salt Process were incorporated into the baseline flowsheet.

  12. Beyond the Baseline: Proceedings of the Space Station Evolution Symposium. Volume 1, Part 2; Space Station Freedom

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This report contains the individual presentations delivered at the Space Station Evolution Symposium in League City, Texas on February 6, 7, 8, 1990. Personnel responsible for Advanced Systems Studies and Advanced Development within the Space Station Freedom Program reported on the results of their work to date. Systems Studies presentations focused on identifying the baseline design provisions (hooks and scars) necessary to enable evolution of the facility to support changing space policy and anticipated user needs. Also emphasized were evolution configuration and operations concepts including on-orbit processing of space transfer vehicles. Advanced Development task managers discussed transitioning advanced technologies to the baseline program, including those near-term technologies which will enhance the safety and productivity of the crew and the reliability of station systems. Special emphasis was placed on applying advanced automation technology to ground and flight systems.

  13. Research and Development in Very Long Baseline Interferometry (VLBI)

    NASA Technical Reports Server (NTRS)

    Himwich, William E.

    2004-01-01

    Contents include the following: 1. Observation coordination. 2. Data acquisition system control software. 3. Station support. 4. Correlation, data processing, and analysis. 5. Data distribution and archiving. 6. Technique improvement and research. 7. Computer support.

  14. Final Report - Testing of Optimized Bubbler Configuration for HLW Melter VSL-13R2950-1, Rev. 0, dated 6/12/2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kruger, Albert A.; Pegg, I. L.; Callow, R. A.

    2013-11-13

    The principal objective of this work was to determine the glass production rate increase and ancillary effects of adding more bubbler outlets to the current WTP HLW melter baseline. This was accomplished through testing on the HLW Pilot Melter (DM1200) at VSL. The DM1200 unit was selected for these tests since it was used previously with several HLW waste streams including the four tank wastes proposed for initial processing at Hanford. This melter system was also used for the development and optimization of the present baseline WTP HLW bubbler configuration for the WTP HLW melter, as well as for MACT testing for both HLW and LAW. Specific objectives of these tests were to: conduct DM1200 melter testing with the baseline WTP bubbling configuration and as augmented with additional bubblers; conduct DM1200 melter testing to differentiate the effects of total bubbler air flow and bubbler distribution on glass production rate and cold cap formation; collect melter operating data, including processing rate, temperatures at a variety of locations within the melter plenum space, melt pool temperature, glass melt density, and melter pressure, with the baseline WTP bubbling configuration and as augmented with additional bubblers; collect melter exhaust samples to compare particulate carryover for different bubbler configurations; and analyze all collected data to determine the effects of adding more bubblers to the WTP HLW melter to inform decisions regarding future lid re-designs. The work used a high-aluminum HLW stream composition defined by ORP, for which an appropriate simulant and high waste loading glass formulation were developed and have been previously processed on the DM1200.

  15. Improving Safe Consumer Transfers in a Day Treatment Setting Using Training and Feedback

    PubMed Central

    Austin, John; Rost, Kristen; Stanley, Leslie

    2011-01-01

    An intervention package that included employee training, supervisory feedback, and graphic feedback was developed to increase employees' safe patient transfers at a day treatment center for adults with disabilities. The intervention was developed based on the center's results from a Performance Diagnostic Checklist (PDC), which focused on antecedents, equipment and processes, knowledge and skills, and consequences related to patient transfers. A multiple baseline (MBL) design across two lifts (pivot and trunk), with one lift (side) remaining in baseline, was used to evaluate the effects of the treatment package on three lifts commonly used by three health-care workers. The results indicated a substantial increase in the overall safe performance of the three lifts. The mean increase for group safety performance following intervention was 34% and 29% over baseline measures for the two target transfers, and 28% over baseline measures for the nontargeted transfer. These findings suggest that in settings where patient transfers are frequent and injuries are likely to occur (e.g., hospitals, day treatment centers), safe lifting and transferring behaviors can improve with an efficient and cost-effective intervention. PMID:22649577

  16. Satisfaction with daily occupations amongst asylum seekers in Denmark.

    PubMed

    Morville, Anne-Le; Erlandsson, Lena-Karin; Danneskiold-Samsøe, Bente; Amris, Kirstine; Eklund, Mona

    2015-05-01

    The aim of this study was to describe asylum seekers' satisfaction with daily occupations and activity level while in a Danish asylum centre, and whether this changed over time. Another aim was to describe whether exposure to torture, self-rated health measures, and ADL ability were related to their satisfaction with daily occupations and activity level. A total of 43 asylum seekers at baseline and 17 at follow-up were included. The questionnaires Satisfaction with Daily Occupations, Major Depression Inventory, WHO-5 Wellbeing, Pain Detect, a questionnaire covering torture, and basic social information were used, as well as the Assessment of Motor and Process Skills (AMPS). The results showed a low level of satisfaction with daily occupations at both baseline and follow-up. There was no statistically significant change in satisfaction or activity level between baseline and follow-up. Associations between AMPS process skills and education, worst pain, and activity level were present at baseline, as was a relationship between AMPS process skills and satisfaction. At follow-up, associations between WHO-5 and satisfaction and activity level and between MDI scores and activity level were found. Asylum seekers experience a low level of satisfaction with daily occupations, both at arrival and after 10 months in an asylum centre. There is a need for further research and development of occupation-focused rehabilitation methods for the asylum seeker population.

  17. In-Space Manufacturing Baseline Property Development

    NASA Technical Reports Server (NTRS)

    Stockman, Tom; Schneider, Judith; Prater, Tracie; Bean, Quincy; Werkheiser, Nicki

    2016-01-01

    The In-Space Manufacturing (ISM) project at NASA Marshall Space Flight Center currently operates a 3D FDM (fused deposition modeling) printer onboard the International Space Station. In order to enable utilization of this capability by designers, the project needs to establish characteristic material properties for materials produced using the process. This is difficult for additive manufacturing since standards and specifications do not yet exist for these technologies. Because crew time is limited, sample sizes are small, which in turn limits the application of traditional design-allowables approaches to developing a materials property database for designers. In this study, various approaches to development of material databases were evaluated for use by designers of space systems who wish to leverage in-space manufacturing capabilities. This study focuses on alternative statistical techniques for baseline property development to support in-space manufacturing.
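
    One traditional design-allowables calculation that degrades quickly at small sample sizes is the one-sided statistical tolerance bound (for example, a B-basis value covering 90% of the population with 95% confidence). The sketch below shows the standard normal-theory version of that calculation as a point of reference; it is not one of the alternative techniques evaluated in the study, and the coupon data are hypothetical.

    ```python
    import numpy as np
    from scipy import stats

    def lower_tolerance_bound(sample, coverage=0.90, confidence=0.95):
        """Normal-theory one-sided lower tolerance bound (B-basis-like)."""
        x = np.asarray(sample, dtype=float)
        n = x.size
        # k-factor from the noncentral t distribution
        z_p = stats.norm.ppf(coverage)
        k = stats.nct.ppf(confidence, df=n - 1, nc=z_p * np.sqrt(n)) / np.sqrt(n)
        return x.mean() - k * x.std(ddof=1)

    # Hypothetical tensile strengths (MPa) from a small set of printed coupons
    coupons = [28.1, 27.4, 29.0, 26.8, 28.6, 27.9]
    print(f"lower tolerance bound: {lower_tolerance_bound(coupons):.1f} MPa")
    ```

    With only six specimens the bound sits well below the sample mean, which is exactly the penalty that motivates the alternative statistical approaches the study evaluates.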

  18. Plans for the development of EOS SAR systems using the Alaska SAR facility. [Earth Observing System (EOS)

    NASA Technical Reports Server (NTRS)

    Carsey, F. D.; Weeks, W.

    1988-01-01

    The Alaska SAR Facility (ASF) program, which acquires and processes data from the ESA ERS-1, the NASDA ERS-1, and Radarsat and carries out a program of science investigations using the data, is introduced. Agreements for data acquisition and analysis are in place, except for the agreement between NASA and Radarsat, which is in negotiation. The ASF baseline system, consisting of the Receiving Ground System, the SAR Processor System, and the Archive and Operations System, passed critical design review and is fully in the implementation phase. Augmentations to the baseline system, adding geophysical processing and processing of J-ERS-1 optical data, are in the design and implementation phase. The ASF provides a very effective vehicle with which to prepare for the Earth Observing System (EOS) in that it will aid the development of systems and technologies for handling the data volumes produced by the systems of the next decades, and it will also supply some of the data types that will be produced by EOS.

  19. Improving Nutritional Status of Older Persons with Dementia Using a National Preventive Care Program.

    PubMed

    Johansson, L; Wijk, H; Christensson, L

    2017-01-01

    The aim of the study was to investigate the outcome of change in body weight associated with use of a structured preventive care process among persons with dementia assessed as at risk of malnutrition or malnourished. The preventive care process is a pedagogical model used in the Senior Alert (SA) quality register, where nutrition is one of the prioritized areas and includes four steps: assessment, analysis of underlying causes, actions performed and outcome. An analysis of data from SA with a pre-post design was performed. The participants were living in ordinary housing or special housing in Sweden. A total of 1,912 persons, 65 years and older, registered in both SA and the dementia quality register SveDem were included. The intervention was a national preventive care program including individualized actions. The Mini Nutritional Assessment-Short Form was used to assess nutritional status at baseline. Body weight was measured during baseline and follow-up (7-106 days after baseline). In total, 74.3% of the persons were malnourished or at risk of malnutrition. Those at risk of malnutrition or malnourished who were registered in all four steps of the preventive care process increased in body weight from baseline (Md 60.0 kg) to follow-up (Md 62.0 kg) (p=0.013). In those with incomplete registration, no increase in body weight was found. Using all steps in the structured preventive care process seems to improve nutritional status of persons with dementia assessed as at risk of malnutrition or malnourished. This study contributes to the development of evidence-based practice regarding malnutrition and persons with dementia.

  20. Artificial Intelligence (AI) Based Tactical Guidance for Fighter Aircraft

    NASA Technical Reports Server (NTRS)

    McManus, John W.; Goodrich, Kenneth H.

    1990-01-01

    A research program investigating the use of Artificial Intelligence (AI) techniques to aid in the development of a Tactical Decision Generator (TDG) for Within Visual Range (WVR) air combat engagements is discussed. The application of AI programming and problem solving methods in the development and implementation of the Computerized Logic For Air-to-Air Warfare Simulations (CLAWS), a second generation TDG, is presented. The Knowledge-Based Systems used by CLAWS to aid in the tactical decision-making process are outlined in detail, and the results of tests evaluating the performance of CLAWS against a baseline TDG, developed in FORTRAN to run in real time in the Langley Differential Maneuvering Simulator (DMS), are presented. To date, these test results have shown significant performance gains with respect to the TDG baseline in one-versus-one air combat engagements, and the AI-based TDG software has proven to be much easier to modify and maintain than the baseline FORTRAN TDG programs. Alternate computing environments and programming approaches, including the use of parallel algorithms and heterogeneous computer networks, are discussed, and the design and performance of a prototype concurrent TDG system are presented.

  1. Modeling Rabbit Responses to Single and Multiple Aerosol ...

    EPA Pesticide Factsheets

    Journal Article Survival models are developed here to predict response and time-to-response for mortality in rabbits following exposures to single or multiple aerosol doses of Bacillus anthracis spores. Hazard function models were developed for a multiple dose dataset to predict the probability of death through specifying dose-response functions and the time between exposure and the time-to-death (TTD). Among the models developed, the best-fitting survival model (baseline model) has an exponential dose-response model with a Weibull TTD distribution. Alternative models assessed employ different underlying dose-response functions and use the assumption that, in a multiple dose scenario, earlier doses affect the hazard functions of each subsequent dose. In addition, published mechanistic models are analyzed and compared with models developed in this paper. None of the alternative models that were assessed provided a statistically significant improvement in fit over the baseline model. The general approach utilizes simple empirical data analysis to develop parsimonious models with limited reliance on mechanistic assumptions. The baseline model predicts TTDs consistent with reported results from three independent high-dose rabbit datasets. More accurate survival models depend upon future development of dose-response datasets specifically designed to assess potential multiple dose effects on response and time-to-response. The process used in this paper to dev
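
    As a point of reference, the baseline model described above (an exponential dose-response function combined with a Weibull time-to-death distribution) can be sketched in a few lines. The parameter values below are placeholders for illustration, not the fitted values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def p_response(dose, k=1e-6):
        """Exponential dose-response: probability of death given an inhaled dose (spores)."""
        return 1.0 - np.exp(-k * dose)

    def simulate_outcomes(dose, n_animals, k=1e-6, weibull_shape=2.0, weibull_scale_days=4.0):
        """Simulate deaths and times-to-death (TTD) for a single-dose exposure."""
        dies = rng.random(n_animals) < p_response(dose, k)
        ttd = weibull_scale_days * rng.weibull(weibull_shape, size=n_animals)
        return dies, np.where(dies, ttd, np.nan)  # NaN time for survivors

    dies, ttd = simulate_outcomes(dose=5e5, n_animals=20)
    print(f"deaths: {dies.sum()}/20, median TTD of non-survivors: {np.nanmedian(ttd):.1f} days")
    ```

    A hazard-function formulation for multiple doses would replace the single Bernoulli draw with dose-specific hazards accumulated over time, as the paper describes.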

  2. The life cycles of six multi-center adaptive clinical trials focused on neurological emergencies developed for the Advancing Regulatory Science initiative of the National Institutes of Health and US Food and Drug Administration: Case studies from the Adaptive Designs Accelerating Promising Treatments Into Trials Project

    PubMed Central

    Guetterman, Timothy C; Fetters, Michael D; Mawocha, Samkeliso; Legocki, Laurie J; Barsan, William G; Lewis, Roger J; Berry, Donald A; Meurer, William J

    2017-01-01

    Objectives: Clinical trials are complicated, expensive, time-consuming, and frequently do not lead to discoveries that improve the health of patients with disease. Adaptive clinical trials have emerged as a methodology to provide more flexibility in design elements to better answer scientific questions regarding whether new treatments are efficacious. Limited observational data exist that describe the complex process of designing adaptive clinical trials. To address these issues, the Adaptive Designs Accelerating Promising Treatments Into Trials project developed six, tailored, flexible, adaptive, phase-III clinical trials for neurological emergencies, and investigators prospectively monitored and observed the processes. The objective of this work is to describe the adaptive design development process, the final design, and the current status of the adaptive trial designs that were developed. Methods: To observe and reflect upon the trial development process, we employed a rich, mixed methods evaluation that combined quantitative data from a visual analog scale to assess attitudes about adaptive trials, along with in-depth qualitative data about the development process gathered from observations. Results: The Adaptive Designs Accelerating Promising Treatments Into Trials team developed six adaptive clinical trial designs. Across the six designs, 53 attitude surveys were completed at baseline and again after the trial planning process was completed. Compared to baseline, the participants believed significantly more strongly that the adaptive designs would be accepted by National Institutes of Health review panels and non-researcher clinicians. In addition, after the trial planning process, the participants more strongly believed that the adaptive design would meet the scientific and medical goals of the studies. Conclusion: Introducing the adaptive design at early conceptualization proved critical to successful adoption and implementation of that trial. Involving key stakeholders from several scientific domains early in the process appears to be associated with improved attitudes towards adaptive designs over the life cycle of clinical trial development. PMID:29085638

  3. Establishing a store baseline during interim storage of waste packages and a review of potential technologies for base-lining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McTeer, Jennifer; Morris, Jenny; Wickham, Stephen

    Interim storage is an essential component of the waste management lifecycle, providing a safe, secure environment for waste packages awaiting final disposal. In order to be able to monitor and detect change or degradation of the waste packages, storage building or equipment, it is necessary to know the original condition of these components (the 'waste storage system'). This paper presents an approach to establishing the baseline for a waste-storage system, and provides guidance on the selection and implementation of potential base-lining technologies. The approach is made up of two sections: assessment of base-lining needs and definition of the base-lining approach. During the assessment of base-lining needs, a review of available monitoring data and store/package records should be undertaken (if the store is operational). Evolutionary processes (affecting safety functions), and their corresponding indicators, that can be measured to provide a baseline for the waste-storage system should then be identified in order for the most suitable indicators to be selected for base-lining. In defining the approach, identification of opportunities to collect data and constraints is undertaken before selecting the techniques for base-lining and developing a base-lining plan. Base-lining data may be used to establish that the state of the packages is consistent with the waste acceptance criteria for the storage facility and to support the interpretation of monitoring and inspection data collected during store operations. Opportunities and constraints are identified for different store and package types. Technologies that could potentially be used to measure baseline indicators are also reviewed. (authors)

  4. Operationalizing clean development mechanism baselines: A case study of China's electrical sector

    NASA Astrophysics Data System (ADS)

    Steenhof, Paul A.

    The global carbon market is rapidly developing as the first commitment period of the Kyoto Protocol draws closer and Parties to the Protocol with greenhouse gas (GHG) emission reduction targets seek alternative ways to reduce their emissions. The Protocol includes the Clean Development Mechanism (CDM), a tool that encourages project-based investments to be made in developing nations that will lead to an additional reduction in emissions. Due to China's economic size and rate of growth, technological characteristics, and its reliance on coal, it contains a large proportion of the global CDM potential. As China's economy modernizes, more technologies and processes are requiring electricity and demand for this energy source is accelerating rapidly. Relatively inefficient technology to generate electricity in China thereby results in the electrical sector having substantial GHG emission reduction opportunities as related to the CDM. In order to ensure the credibility of the CDM in leading to a reduction in GHG emissions, it is important that the baseline method used in the CDM approval process is scientifically sound and accessible for both others to use and for evaluation purposes. Three different methods for assessing CDM baselines and environmental additionality are investigated in the context of China's electrical sector: a method based on a historical perspective of the electrical sector (factor decomposition), a method structured upon a current perspective (operating and build margins), and a simulation of the future (dispatch analysis). Assessing future emission levels for China's electrical sector is a very challenging task given the complexity of the system, its dynamics, and that it is heavily influenced by internal and external forces, but of the different baseline methods investigated, dispatch modelling is best suited for the Chinese context as it is able to consider the important regional and temporal dimensions of its economy and its future development. For China, the most promising options for promoting sustainable development, one of the goals of the Kyoto Protocol, appear to be tied to increasing electrical end-use and generation efficiency, particularly clean coal technology for electricity generation since coal will likely continue to be a dominant primary fuel.
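
    For context, the "operating and build margins" method mentioned above is usually operationalized as a weighted combined-margin grid emission factor, and baseline emission reductions are then the displaced generation multiplied by the difference between the baseline and project emission factors. A minimal sketch with illustrative numbers (not values from the study) follows.

    ```python
    def combined_margin(ef_operating, ef_build, w_om=0.5, w_bm=0.5):
        """Weighted combined-margin grid emission factor (tCO2 per MWh)."""
        return w_om * ef_operating + w_bm * ef_build

    def emission_reductions(mwh_displaced, ef_baseline, ef_project=0.0):
        """Baseline emissions avoided by a project displacing grid electricity (tCO2)."""
        return mwh_displaced * (ef_baseline - ef_project)

    # Illustrative values for a coal-heavy grid
    ef_cm = combined_margin(ef_operating=1.05, ef_build=0.85)
    print(f"combined margin: {ef_cm:.2f} tCO2/MWh")
    print(f"reductions for 100 GWh of zero-emission generation: "
          f"{emission_reductions(100_000, ef_cm):,.0f} tCO2")
    ```

    A dispatch-analysis baseline, by contrast, would simulate which plants are backed off hour by hour, which is why the author argues it better captures the regional and temporal structure of China's grid.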

  5. Silicon solar cell process development, fabrication and analysis

    NASA Technical Reports Server (NTRS)

    Yoo, H. I.; Iles, P. A.; Leung, D. C.

    1981-01-01

    Solar cells were fabricated from EFG ribbons, dendritic webs, cast ingots grown by the heat exchanger method (HEM), and cast ingots grown by the ubiquitous crystallization process (UCP). Baseline and other process variations were applied to fabricate solar cells. EFG ribbons grown in a carbon-containing gas atmosphere showed significant improvement in silicon quality. Baseline solar cells from dendritic webs of various runs indicated that the quality of the webs under investigation was not as good as the conventional CZ silicon, showing an average minority carrier diffusion length of about 60 um versus 120 um for CZ wafers. Detailed evaluation of large cast ingots by HEM showed ingot reproducibility problems from run to run and uniformity problems of sheet quality within an ingot. Initial evaluation of the wafers prepared from the cast polycrystalline ingots by UCP suggested that the quality of the wafers from this process is considerably lower than that of conventional CZ wafers. Overall performance was relatively uniform, except for a few cells which showed shunting problems caused by inclusions.
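
    The reported diffusion lengths map directly onto minority-carrier lifetimes through L = sqrt(D*tau). Assuming an electron diffusivity of roughly 27 cm^2/s in p-type silicon (an assumed value, not one taken from the report), the 60 um web material versus the 120 um CZ reference corresponds to roughly a factor-of-four difference in lifetime, as the short calculation below illustrates.

    ```python
    # Assumed electron diffusivity in p-type silicon (cm^2/s); not from the report
    D_CM2_PER_S = 27.0

    def lifetime_us(diffusion_length_um, diffusivity=D_CM2_PER_S):
        """Minority-carrier lifetime implied by L = sqrt(D * tau)."""
        L_cm = diffusion_length_um * 1e-4
        return (L_cm ** 2 / diffusivity) * 1e6  # seconds -> microseconds

    for label, L in [("dendritic web", 60), ("CZ reference", 120)]:
        print(f"{label}: L = {L} um -> tau ~ {lifetime_us(L):.1f} us")
    ```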

  6. NASA Occupational Health Program FY98 Self-Assessment

    NASA Technical Reports Server (NTRS)

    Brisbin, Steven G.

    1999-01-01

    The NASA Functional Management Review process requires that each NASA Center conduct self-assessments of each functional area. Self-Assessments were completed in June 1998 and results were presented during this conference session. During FY 97 NASA Occupational Health Assessment Team activities, a decision was made to refine the NASA Self-Assessment Process. NASA Centers were involved in the ISO registration process at that time and wanted to use the management systems approach to evaluate their occupational health programs. This approach appeared to be more consistent with NASA's management philosophy and would likely confer status needed by Senior Agency Management for the program. During FY 98 the Agency Occupational Health Program Office developed a revised self-assessment methodology based on the Occupational Health and Safety Management System developed by the American Industrial Hygiene Association. This process was distributed to NASA Centers in March 1998 and completed in June 1998. The Center Self Assessment data will provide an essential baseline on the status of OHP management processes at NASA Centers. That baseline will be presented to Enterprise Associate Administrators and DASHO on September 22, 1998 and used as a basis for discussion during FY 99 visits to NASA Centers. The process surfaced several key management system elements warranting further support from the Lead Center. Input and feedback from NASA Centers will be essential to defining and refining future self assessment efforts.

  7. Extended performance solar electric propulsion thrust system study. Volume 3: Tradeoff studies of alternate thrust system configurations

    NASA Technical Reports Server (NTRS)

    Hawthorne, E. I.

    1977-01-01

    Several thrust system design concepts were evaluated and compared using the specifications of the most advanced 30 cm engineering model thruster as the technology base. Emphasis was placed on relatively high power missions. The extensions in thruster performance required for the Halley's comet mission were defined and alternative thrust system concepts were designed in sufficient detail for comparing mass, efficiency, reliability, structure, and thermal characteristics. Confirmation testing and analysis of thruster and power-processing components were performed. A baseline design was selected from the alternatives considered, and the design analysis and documentation were refined. A program development plan was formulated that outlines the work structure considered necessary for developing, qualifying, and fabricating the flight hardware for the baseline thrust system within the time frame of a project to rendezvous with Halley's comet. An assessment was made of the costs and risks associated with a baseline thrust system as provided to the mission project under this plan. Critical procurements and interfaces were identified and defined.

  8. Extended performance solar electric propulsion thrust system study. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Poeschel, R. L.; Hawthorne, E. I.

    1977-01-01

    Several thrust system design concepts were evaluated and compared using the specifications of the most advanced 30 cm engineering model thruster as the technology base. The extensions in thruster performance required for the Halley's comet mission were defined and alternative thrust system concepts were designed. Confirmation testing and analysis of thruster and power-processing components were performed, and the feasibility of satisfying extended performance requirements was verified. A baseline design was selected from the alternatives considered, and the design analysis and documentation were refined. A program development plan was formulated that outlines the work structure considered necessary for developing, qualifying, and fabricating the flight hardware for the baseline thrust system within the time frame of a project to rendezvous with Halley's comet. An assessment was made of the costs and risks associated with a baseline thrust system as provided to the mission project under this plan. Critical procurements and interfaces were identified and defined. Results are presented.

  9. Respiratory sinus arrhythmia and auditory processing in autism: modifiable deficits of an integrated social engagement system?

    PubMed

    Porges, Stephen W; Macellaio, Matthew; Stanfill, Shannon D; McCue, Kimberly; Lewis, Gregory F; Harden, Emily R; Handelman, Mika; Denver, John; Bazhenova, Olga V; Heilman, Keri J

    2013-06-01

    The current study evaluated processes underlying two common symptoms (i.e., state regulation problems and deficits in auditory processing) associated with a diagnosis of autism spectrum disorders. Although these symptoms have been treated in the literature as unrelated, when informed by the Polyvagal Theory, these symptoms may be viewed as the predictable consequences of depressed neural regulation of an integrated social engagement system, in which there is down regulation of neural influences to the heart (i.e., via the vagus) and to the middle ear muscles (i.e., via the facial and trigeminal cranial nerves). Respiratory sinus arrhythmia (RSA) and heart period were monitored to evaluate state regulation during a baseline and two auditory processing tasks (i.e., the SCAN tests for Filtered Words and Competing Words), which were used to evaluate auditory processing performance. Children with a diagnosis of autism spectrum disorders (ASD) were contrasted with aged matched typically developing children. The current study identified three features that distinguished the ASD group from a group of typically developing children: 1) baseline RSA, 2) direction of RSA reactivity, and 3) auditory processing performance. In the ASD group, the pattern of change in RSA during the attention demanding SCAN tests moderated the relation between performance on the Competing Words test and IQ. In addition, in a subset of ASD participants, auditory processing performance improved and RSA increased following an intervention designed to improve auditory processing. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. Increase of EEG Spectral Theta Power Indicates Higher Risk of the Development of Severe Cognitive Decline in Parkinson’s Disease after 3 Years

    PubMed Central

    Cozac, Vitalii V.; Chaturvedi, Menorca; Hatz, Florian; Meyer, Antonia; Fuhr, Peter; Gschwandtner, Ute

    2016-01-01

    Objective: We investigated quantitative electroencephalography (qEEG) and clinical parameters as potential risk factors for severe cognitive decline in Parkinson’s disease. Methods: We prospectively investigated 37 patients with Parkinson’s disease at baseline and follow-up (after 3 years). Patients had no severe cognitive impairment at baseline. We used a summary score of cognitive tests as the outcome at follow-up. At baseline we assessed motor, cognitive, and psychiatric factors; qEEG variables [global relative median power (GRMP) spectra] were obtained by fully automated processing of high-resolution EEG (256 channels). We used linear regression models with calculation of the explained variance to evaluate the relation of baseline parameters with cognitive deterioration. Results: The following baseline parameters significantly predicted severe cognitive decline: GRMP theta (4–8 Hz) and cognitive task performance in executive functions and working memory. Conclusions: Combination of neurocognitive tests and qEEG improves identification of patients with higher risk of cognitive decline in PD. PMID:27965571
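
    A simplified stand-in for computing a relative theta-band power measure from multichannel EEG is sketched below. The exact GRMP pipeline (preprocessing, median-spectrum computation, and normalization band) is not specified in the abstract, so those details here are assumptions.

    ```python
    import numpy as np
    from scipy import signal

    def global_relative_theta_power(eeg, fs=256, total_band=(1.0, 30.0), theta_band=(4.0, 8.0)):
        """eeg: array of shape (channels, samples). Returns the median over channels of
        theta power relative to total power (an approximation of a GRMP-style measure)."""
        freqs, psd = signal.welch(eeg, fs=fs, nperseg=4 * fs, axis=-1)
        theta = (freqs >= theta_band[0]) & (freqs < theta_band[1])
        total = (freqs >= total_band[0]) & (freqs < total_band[1])
        relative = psd[:, theta].sum(axis=-1) / psd[:, total].sum(axis=-1)
        return float(np.median(relative))

    # Example with synthetic data: 256 channels, 60 s at 256 Hz
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((256, 60 * 256))
    print(f"global relative theta power: {global_relative_theta_power(eeg):.3f}")
    ```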

  11. Beyond Metrics? The Role of Hydrologic Baseline Archetypes in Environmental Water Management.

    PubMed

    Lane, Belize A; Sandoval-Solis, Samuel; Stein, Eric D; Yarnell, Sarah M; Pasternack, Gregory B; Dahlke, Helen E

    2018-06-22

    Balancing ecological and human water needs often requires characterizing key aspects of the natural flow regime and then predicting ecological response to flow alterations. Flow metrics are generally relied upon to characterize long-term average statistical properties of the natural flow regime (hydrologic baseline conditions). However, some key aspects of hydrologic baseline conditions may be better understood through more complete consideration of continuous patterns of daily, seasonal, and inter-annual variability than through summary metrics. Here we propose the additional use of high-resolution dimensionless archetypes of regional stream classes to improve understanding of baseline hydrologic conditions and inform regional environmental flows assessments. In an application to California, we describe the development and analysis of hydrologic baseline archetypes to characterize patterns of flow variability within and between stream classes. We then assess the utility of archetypes to provide context for common flow metrics and improve understanding of linkages between aquatic patterns and processes and their hydrologic controls. Results indicate that these archetypes may offer a distinct and complementary tool for researching mechanistic flow-ecology relationships, assessing regional patterns for streamflow management, or understanding impacts of changing climate.
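
    One simple way to construct such a dimensionless archetype is to scale each year's daily hydrograph by that year's mean flow and then take the median day-of-year pattern across years within a stream class. The sketch below illustrates that generic construction; it is not necessarily the specific procedure used by the authors.

    ```python
    import numpy as np

    def dimensionless_archetype(daily_flows):
        """daily_flows: array of shape (years, 365) of mean daily discharge for one stream class.
        Returns the median dimensionless daily pattern (each year scaled by its own mean flow)."""
        daily_flows = np.asarray(daily_flows, dtype=float)
        scaled = daily_flows / daily_flows.mean(axis=1, keepdims=True)
        return np.median(scaled, axis=0)

    # Example with synthetic snowmelt-like hydrographs for 20 years
    rng = np.random.default_rng(0)
    days = np.arange(365)
    years = np.stack([1.0 + 3.0 * np.exp(-((days - 150 - rng.integers(-15, 16)) / 30.0) ** 2)
                      for _ in range(20)])
    archetype = dimensionless_archetype(years)
    print(f"archetype peaks at day {archetype.argmax()} at {archetype.max():.2f}x mean annual flow")
    ```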

  12. A method to establish seismic noise baselines for automated station assessment

    USGS Publications Warehouse

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic station operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
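
    The heart of such a baseline is a distribution of power spectral densities accumulated over many time windows, from which low-noise, median, and high-noise curves can be drawn for each station and channel. The sketch below is a simplified stand-in for the PQLX/PDF methodology; the window length, averaging, and units are assumptions.

    ```python
    import numpy as np
    from scipy import signal

    def noise_baseline(trace, fs, window_s=3600, percentiles=(5, 50, 95)):
        """Compute percentile PSD curves (in dB) over hour-long windows of a seismic trace."""
        n = int(window_s * fs)
        psds = []
        for start in range(0, len(trace) - n + 1, n):
            freqs, psd = signal.welch(trace[start:start + n], fs=fs, nperseg=4096)
            psds.append(10.0 * np.log10(psd[1:]))  # skip the DC bin
        psds = np.asarray(psds)
        return freqs[1:], {p: np.percentile(psds, p, axis=0) for p in percentiles}

    # Example: one day of synthetic 40 Hz data
    rng = np.random.default_rng(1)
    trace = rng.standard_normal(40 * 86400)
    freqs, curves = noise_baseline(trace, fs=40)
    print(f"{len(curves[50])} frequency bins; median PSD near 1 Hz: "
          f"{curves[50][np.argmin(np.abs(freqs - 1.0))]:.1f} dB")
    ```

    Departures of a new hour's PSD from the station's percentile envelope can then be flagged automatically, which is the automated QC idea described above.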

  13. Water recovery and solid waste processing for aerospace and domestic applications. Volume 1: Final report

    NASA Technical Reports Server (NTRS)

    Murray, R. W.

    1973-01-01

    A comprehensive study of advanced water recovery and solid waste processing techniques employed in both aerospace and domestic or commercial applications is reported. A systems approach was used to synthesize a prototype system design of an advanced water treatment/waste processing system. Household water use characteristics were studied and modified through the use of low water use devices and a limited amount of water reuse. This modified household system was then used as a baseline system for development of several water treatment waste processing systems employing advanced techniques. A hybrid of these systems was next developed and a preliminary design was generated to define system and hardware functions.

  14. Summary of Granulation Matrix Testing for the Plutonium Immobilization Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herman, C.C.

    2001-10-19

    In FY00, a matrix for process development testing was created to identify those items related to the ceramic process that had not been fully developed or tested and to help identify variables that needed to be tested. This matrix, NMTP/IP-99-003, was jointly created between LLNL and SRTC and was issued to all affected individuals. The matrix was also used to gauge the progress of the development activities. As part of this matrix, several series of tests were identified for the granulation process. This summary provides the data and results from the granulation testing. The results of the granulation matrix testing were used to identify the baseline process for testing in the PuCTF with cold surrogates in B241 at LLNL.

  15. Descriptive Study Analyzing Discrepancies in a Software Development Project Change Request (CR) Assessment Process and Recommendations for Process Improvements

    NASA Technical Reports Server (NTRS)

    Cunningham, Kenneth J.

    2002-01-01

    The Change Request (CR) assessment process is essential in the display development cycle. The assessment process is performed to ensure that the changes stated in the description of the CR match the changes in the actual display requirements. If a discrepancy is found between the CR and the requirements, the CR must be returned to the originator for corrections. Data will be gathered from each of the developers to determine the type of discrepancies and the amount of time spent assessing each CR. This study will determine the most common types of discrepancies and the amount of time spent assessing those issues. The results of the study will provide a foundation for future improvements as well as a baseline for future studies.

  16. Eutrophication monitoring for Lake Superior’s Chequamegon Bay before and after large summer storms

    EPA Science Inventory

    A priority for the Lake Superior CSMI was to identify susceptible nearshore eutrophication areas. We developed an integrated sampling design to collect baseline data for Lake Superior’s Chequamegon Bay to understand how nearshore physical processes and tributary loading rel...

  17. DASH (Dietary Approaches to Stop Hypertension) Diet and Risk of Subsequent Kidney Disease

    PubMed Central

    Rebholz, Casey M.; Crews, Deidra C.; Grams, Morgan E.; Steffen, Lyn M.; Levey, Andrew S.; Miller, Edgar R.; Appel, Lawrence J.; Coresh, Josef

    2016-01-01

    Background: There are established guidelines for recommended dietary intake for hypertension treatment and cardiovascular disease prevention. Evidence is lacking for effective dietary patterns for kidney disease prevention. Study Design: Prospective cohort study. Setting & Participants: Atherosclerosis Risk in Communities (ARIC) study participants with baseline estimated glomerular filtration rate (eGFR) ≥60 mL/min/1.73 m2 (N=14,882). Predictor: The Dietary Approaches to Stop Hypertension (DASH) diet score was calculated based on self-reported dietary intake of red and processed meat, sweetened beverages, sodium, fruits, vegetables, whole grains, nuts and legumes, and low-fat dairy products, averaged over two visits. Outcomes: Cases were ascertained based on development of eGFR <60 mL/min/1.73 m2 accompanied by ≥25% eGFR decline from baseline, an ICD-9/10 code for a kidney disease–related hospitalization or death, or end-stage renal disease from baseline through 2012. Results: A total of 3,720 participants developed kidney disease during a median follow-up of 23 years. Participants with a DASH diet score in the lowest tertile were 16% more likely to develop kidney disease than those with the highest score tertile (HR, 1.16; 95% CI, 1.07-1.26; p for trend <0.001), after adjusting for socio-demographics, smoking status, physical activity, total caloric intake, baseline eGFR, overweight/obese status, diabetes status, hypertension status, systolic blood pressure, and anti-hypertensive medication use. Of the individual components of the DASH diet score, high intake of red and processed meat was adversely associated with kidney disease and high intake of nuts, legumes, and low-fat dairy products was associated with reduced risk of kidney disease. Limitations: Potential measurement error due to self-reported dietary intake and lack of data on albuminuria. Conclusions: Consuming a DASH-style diet was associated with lower risk for kidney disease, independent of demographic characteristics, established kidney risk factors, and baseline kidney function. Healthful dietary patterns, such as the DASH diet, may be beneficial for kidney disease prevention. PMID:27519166
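
    A common construction of a DASH diet score (ranking each component's intake into quintiles, reversing the ranking for adverse components, and summing) is sketched below. The component names and coding are illustrative assumptions, not an extract of the study's analysis.

    ```python
    import pandas as pd

    ADVERSE = {"red_processed_meat", "sweetened_beverages", "sodium"}

    def dash_score(intakes: pd.DataFrame) -> pd.Series:
        """intakes: one column per DASH component (servings/day, or mg/day for sodium).
        Higher scores indicate closer adherence to a DASH-style pattern."""
        score = pd.Series(0.0, index=intakes.index)
        for component in intakes.columns:
            quintile = pd.qcut(intakes[component], 5, labels=False, duplicates="drop") + 1
            score += (6 - quintile) if component in ADVERSE else quintile
        return score

    # Toy example with two components and eight participants
    df = pd.DataFrame({
        "fruits": [1, 2, 3, 4, 5, 2, 1, 4],
        "sodium": [3400, 2900, 2300, 1900, 1500, 2100, 3100, 2600],
    })
    print(dash_score(df))
    ```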

  18. DASH (Dietary Approaches to Stop Hypertension) Diet and Risk of Subsequent Kidney Disease.

    PubMed

    Rebholz, Casey M; Crews, Deidra C; Grams, Morgan E; Steffen, Lyn M; Levey, Andrew S; Miller, Edgar R; Appel, Lawrence J; Coresh, Josef

    2016-12-01

    There are established guidelines for recommended dietary intake for hypertension treatment and cardiovascular disease prevention. Evidence is lacking for effective dietary patterns for kidney disease prevention. Prospective cohort study. Atherosclerosis Risk in Communities (ARIC) Study participants with baseline estimated glomerular filtration rate (eGFR) ≥60 mL/min/1.73 m2 (N=14,882). The Dietary Approaches to Stop Hypertension (DASH) diet score was calculated based on self-reported dietary intake of red and processed meat, sweetened beverages, sodium, fruits, vegetables, whole grains, nuts and legumes, and low-fat dairy products, averaged over 2 visits. Cases were ascertained based on the development of eGFR <60 mL/min/1.73 m2 accompanied by ≥25% eGFR decline from baseline, an International Classification of Diseases, Ninth/Tenth Revision code for a kidney disease-related hospitalization or death, or end-stage renal disease from baseline through 2012. A total of 3,720 participants developed kidney disease during a median follow-up of 23 years. Participants with a DASH diet score in the lowest tertile were 16% more likely to develop kidney disease than those with the highest score tertile (HR, 1.16; 95% CI, 1.07-1.26; P for trend < 0.001), after adjusting for sociodemographics, smoking status, physical activity, total caloric intake, baseline eGFR, overweight/obese status, diabetes status, hypertension status, systolic blood pressure, and antihypertensive medication use. Of the individual components of the DASH diet score, high red and processed meat intake was adversely associated with kidney disease and high nuts, legumes, and low-fat dairy products intake was associated with reduced risk for kidney disease. Potential measurement error due to self-reported dietary intake and lack of data for albuminuria. Consuming a DASH-style diet was associated with lower risk for kidney disease independent of demographic characteristics, established kidney risk factors, and baseline kidney function. Healthful dietary patterns such as the DASH diet may be beneficial for kidney disease prevention. Copyright © 2016 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
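
    The reported hazard ratio comes from a Cox proportional hazards model of time to incident kidney disease, with DASH score tertile and covariates as predictors. A minimal sketch of such a fit using the lifelines package is shown below; the variable names and the tiny data frame are hypothetical and serve only to make the call runnable.

    ```python
    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical analysis frame: one row per participant
    df = pd.DataFrame({
        "followup_years":      [23.0, 15.1, 23.0,  9.4, 23.0, 19.7, 23.0, 12.3, 23.0, 21.0],
        "kidney_disease":      [   0,    1,    0,    1,    0,    1,    0,    1,    0,    1],
        "dash_lowest_tertile": [   0,    1,    1,    1,    0,    0,    1,    1,    0,    0],
        "baseline_egfr":       [  96,   74,   68,   95,  102,   77,   91,   70,   85,   88],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="followup_years", event_col="kidney_disease")
    cph.print_summary()  # hazard ratios are exp(coef)
    ```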

  19. On Using SysML, DoDAF 2.0 and UPDM to Model the Architecture for the NOAA's Joint Polar Satellite System (JPSS) Ground System (GS)

    NASA Technical Reports Server (NTRS)

    Hayden, Jeffrey L.; Jeffries, Alan

    2012-01-01

    The JPSS Ground System is a flexible system of systems responsible for telemetry, tracking & command (TT&C), data acquisition, routing and data processing services for a varied fleet of satellites to support weather prediction, modeling and climate modeling. To assist in this engineering effort, architecture modeling tools are being employed to translate the former NPOESS baseline to the new JPSS baseline. The paper will focus on the methodology for the system engineering process and the use of these architecture modeling tools within that process. The Department of Defense Architecture Framework version 2.0 (DoDAF 2.0) viewpoints and views that are being used to describe the JPSS GS architecture are discussed. The Unified Profile for DoDAF and MODAF (UPDM) and Systems Modeling Language (SysML), as provided by extensions to the MagicDraw UML modeling tool, are used to develop the diagrams and tables that make up the architecture model. The model development process and structure are discussed, examples are shown, and details of handling the complexities of a large System of Systems (SoS), such as the JPSS GS, with an equally complex modeling tool, are described.

  20. A participatory evaluation model for Healthier Communities: developing indicators for New Mexico.

    PubMed Central

    Wallerstein, N

    2000-01-01

    Participatory evaluation models that invite community coalitions to take an active role in developing evaluations of their programs are a natural fit with Healthy Communities initiatives. The author describes the development of a participatory evaluation model for New Mexico's Healthier Communities program. She describes evaluation principles, research questions, and baseline findings. The evaluation model shows the links between process, community-level system impacts, and population health changes. PMID:10968754

  1. High Temperature, Slow Strain Rate Forging of Advanced Disk Alloy ME3

    NASA Technical Reports Server (NTRS)

    Gabb, Timothy P.; OConnor, Kenneth

    2001-01-01

    The advanced disk alloy ME3 was designed in the HSR/EPM disk program to have extended durability at 1150 to 1250 F in large disks. This was achieved by designing a disk alloy and process producing balanced monotonic, cyclic, and time-dependent mechanical properties, combined with robust processing and manufacturing characteristics. The resulting baseline alloy, processing, and supersolvus heat treatment produce a uniform, relatively fine mean grain size of about ASTM 7, with an as-large-as (ALA) grain size of about ASTM 3. There is a long-term need for disks with higher rim temperature capabilities than 1250 F. This would allow higher compressor exit (T3) temperatures and allow the full utilization of advanced combustor and airfoil concepts under development. Several approaches are being studied that modify the processing and chemistry of ME3 to possibly improve high temperature properties. Promising approaches would be applied to subscale material for screening the resulting mechanical properties at these high temperatures. An obvious path traditionally employed to improve the high temperature and time-dependent capabilities of disk alloys is to coarsen the grain size. A coarser grain size than ASTM 7 could potentially be achieved by varying the forging conditions and supersolvus heat treatment. The objective of this study was to perform forging and heat treatment experiments ("thermomechanical processing experiments") on small compression test specimens of the baseline ME3 composition, to identify a viable forging process allowing a significantly coarser grain size, targeted at ASTM 3-5, than that of the baseline, ASTM 7.

  2. A pragmatic randomized comparative effectiveness trial of transitional care for a socioeconomically diverse population: Design, rationale and baseline characteristics.

    PubMed

    Schaeffer, Christine; Teter, Caroline; Finch, Emily A; Hurt, Courtney; Keeter, Mary Kate; Liss, David T; Rogers, Angela; Sheth, Avani; Ackermann, Ronald

    2018-02-01

    Transitional care programs have been widely used to reduce readmissions and improve the quality and safety of the handoff process between hospital and outpatient providers. Very little is known about effective transitional care interventions among patients who are uninsured or covered by Medicaid. This paper describes the design and baseline characteristics of a pragmatic randomized comparative effectiveness trial of transitional care. The Northwestern Medical Group-Transitional Care (NMG-TC) care model was developed to address the needs of patients with multiple medical problems that required lifestyle changes and were amenable to office-based management. We present the design, evaluation methods and baseline characteristics of NMG-TC trial patients. Baseline demographic characteristics indicate that our patient population is predominantly male, Medicaid-insured and non-white. This study will evaluate two methods for implementing an effective transitional care model in a medically complex and socioeconomically diverse population. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. 78 FR 58309 - Proposed Information Collection Activity; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-23

    ... initiative, funded by the Children's Bureau (CB) within ACF, will support planning grants to develop a model... for the process evaluation will be used to assess grantees' organizational capacity and readiness to... response burden hours Baseline Telephone Interview of 540 270 1 1.0 270 Organizational Readiness...

  4. Development of a consensus core dataset in juvenile dermatomyositis for clinical use to inform research

    PubMed Central

    McCann, Liza J; Pilkington, Clarissa A; Huber, Adam M; Ravelli, Angelo; Appelbe, Duncan; Kirkham, Jamie J; Williamson, Paula R; Aggarwal, Amita; Christopher-Stine, Lisa; Constantin, Tamas; Feldman, Brian M; Lundberg, Ingrid; Maillard, Sue; Mathiesen, Pernille; Murphy, Ruth; Pachman, Lauren M; Reed, Ann M; Rider, Lisa G; van Royen-Kerkof, Annet; Russo, Ricardo; Spinty, Stefan; Wedderburn, Lucy R

    2018-01-01

    Objectives: This study aimed to develop consensus on an internationally agreed dataset for juvenile dermatomyositis (JDM), designed for clinical use, to enhance collaborative research and allow integration of data between centres. Methods: A prototype dataset was developed through a formal process that included analysing items within existing databases of patients with idiopathic inflammatory myopathies. This template was used to aid a structured multistage consensus process. Exploiting Delphi methodology, two web-based questionnaires were distributed to healthcare professionals caring for patients with JDM identified through email distribution lists of international paediatric rheumatology and myositis research groups. A separate questionnaire was sent to parents of children with JDM and patients with JDM, identified through established research networks and patient support groups. The results of these parallel processes informed a face-to-face nominal group consensus meeting of international myositis experts, tasked with defining the content of the dataset. This developed dataset was tested in routine clinical practice before review and finalisation. Results: A dataset containing 123 items was formulated with an accompanying glossary. Demographic and diagnostic data are contained within form A collected at baseline visit only, disease activity measures are included within form B collected at every visit and disease damage items within form C collected at baseline and annual visits thereafter. Conclusions: Through a robust international process, a consensus dataset for JDM has been formulated that can capture disease activity and damage over time. This dataset can be incorporated into national and international collaborative efforts, including existing clinical research databases. PMID:29084729

  5. The comprehensive health care orientation process indicators explain hospital organisation's attractiveness: a Bayesian analysis of newly hired nurse and physician survey data.

    PubMed

    Peltokoski, Jaana; Vehviläinen-Julkunen, Katri; Pitkäaho, Taina; Mikkonen, Santtu; Miettinen, Merja

    2015-10-01

    To examine the relationship of a comprehensive health care orientation process with a hospital's attractiveness. Little is known about indicators of the employee orientation process that most likely explain a hospital organisation's attractiveness. Empirical data were collected from registered nurses (n = 145) and physicians (n = 37) working in two specialised hospital districts. A Naive Bayes Classification was applied to examine the comprehensive orientation process indicators that predict the hospital's attractiveness. The model was composed of five orientation process indicators: the contribution of the orientation process to nurses' and physicians' intention to stay; the defined responsibilities of the orientation process; interaction between newcomer and colleagues; responsibilities that are adapted for tasks; and newcomers' baseline knowledge assessment that should be done before the orientation phase. The Naive Bayes Classification was used to explore the employee orientation process and related indicators. The model constructed provides insight that can be used in designing and implementing the orientation process to promote the hospital organisation's attractiveness. Managers should focus on developing fluently organised orientation practices based on the indicators that predict the hospital's attractiveness. For the purpose of personalised orientation, employees' baseline knowledge and competence level should be assessed before the orientation phase. © 2014 John Wiley & Sons Ltd.
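
    As an illustration of the classification approach only (not the authors' model, coding, or data), a categorical naive Bayes classifier over five orientation-process indicators might be set up as follows with scikit-learn.

    ```python
    import numpy as np
    from sklearn.naive_bayes import CategoricalNB

    # Hypothetical coding: each column is one orientation-process indicator
    # (0 = not met, 1 = partly met, 2 = fully met); y = 1 means the newcomer
    # rates the hospital as attractive.
    X = np.array([
        [2, 1, 2, 1, 2],
        [1, 0, 1, 1, 0],
        [2, 2, 2, 2, 1],
        [0, 0, 1, 0, 0],
        [2, 2, 1, 2, 2],
        [1, 1, 0, 1, 1],
    ])
    y = np.array([1, 0, 1, 0, 1, 0])

    model = CategoricalNB().fit(X, y)
    print(model.predict_proba(np.array([[2, 1, 2, 2, 1]])))
    ```

    The fitted per-indicator conditional probabilities play the role of the "indicators that predict the hospital's attractiveness" discussed above.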

  6. 78 FR 35056 - Effectiveness of the Reactor Oversight Process Baseline Inspection Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-11

    ... NUCLEAR REGULATORY COMMISSION [NRC-2013-0125] Effectiveness of the Reactor Oversight Process... the effectiveness of the reactor oversight process (ROP) baseline inspection program with members of... Nuclear Reactor Regulations, U.S. Nuclear Regulatory Commission, Washington, DC 20555-0001; telephone: 301...

  7. An approach to knowledge engineering to support knowledge-based simulation of payload ground processing at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Mcmanus, Shawn; Mcdaniel, Michael

    1989-01-01

    Planning for processing payloads has always been difficult and time-consuming. With the advent of Space Station Freedom and its capability to support a myriad of complex payloads, the planning to support this ground processing maze involves thousands of man-hours of often tedious data manipulation. To provide the capability to analyze various processing schedules, an object-oriented knowledge-based simulation environment called the Advanced Generic Accommodations Planning Environment (AGAPE) is being developed. With the baseline system nearly complete, the emphasis in this paper is on rule definition and its relation to model development and simulation. The focus is specifically on the methodologies implemented during knowledge acquisition, analysis, and representation within the AGAPE rule structure. A model is provided to illustrate the concepts presented. The approach demonstrates a framework for AGAPE rule development to assist expert system development.

  8. Beyond total treatment effects in randomised controlled trials: Baseline measurement of intermediate outcomes needed to reduce confounding in mediation investigations.

    PubMed

    Landau, Sabine; Emsley, Richard; Dunn, Graham

    2018-06-01

    Random allocation avoids confounding bias when estimating the average treatment effect. For continuous outcomes measured at post-treatment as well as prior to randomisation (baseline), analyses based on (A) post-treatment outcome alone, (B) change scores over the treatment phase or (C) conditioning on baseline values (analysis of covariance) provide unbiased estimators of the average treatment effect. The decision to include baseline values of the clinical outcome in the analysis is based on precision arguments, with analysis of covariance known to be most precise. Investigators increasingly carry out explanatory analyses to decompose total treatment effects into components that are mediated by an intermediate continuous outcome and a non-mediated part. Traditional mediation analysis might be performed based on (A) post-treatment values of the intermediate and clinical outcomes alone, (B) respective change scores or (C) conditioning on baseline measures of both intermediate and clinical outcomes. Using causal diagrams and Monte Carlo simulation, we investigated the performance of the three competing mediation approaches. We considered a data generating model that included three possible confounding processes involving baseline variables: The first two processes modelled baseline measures of the clinical variable or the intermediate variable as common causes of post-treatment measures of these two variables. The third process allowed the two baseline variables themselves to be correlated due to past common causes. We compared the analysis models implied by the competing mediation approaches with this data generating model to hypothesise likely biases in estimators, and tested these in a simulation study. We applied the methods to a randomised trial of pragmatic rehabilitation in patients with chronic fatigue syndrome, which examined the role of limiting activities as a mediator. Estimates of causal mediation effects derived by approach (A) will be biased if one of the three processes involving baseline measures of intermediate or clinical outcomes is operating. Necessary assumptions for the change score approach (B) to provide unbiased estimates under either process include the independence of baseline measures and change scores of the intermediate variable. Finally, estimates provided by the analysis of covariance approach (C) were found to be unbiased under all the three processes considered here. When applied to the example, there was evidence of mediation under all methods but the estimate of the indirect effect depended on the approach used with the proportion mediated varying from 57% to 86%. Trialists planning mediation analyses should measure baseline values of putative mediators as well as of continuous clinical outcomes. An analysis of covariance approach is recommended to avoid potential biases due to confounding processes involving baseline measures of intermediate or clinical outcomes, and not simply for increased precision.
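
    A toy simulation can make the confounding mechanism concrete. The sketch below generates data in which a baseline mediator is a common cause of the post-treatment mediator and outcome, then contrasts a mediator regression that ignores baselines (approach A) with one that conditions on them (approach C); all coefficients, variable names and the OLS helper are illustrative assumptions, not the trial's actual data or models.

```python
# Minimal sketch of the confounding issue in trial mediation analysis: a baseline
# mediator M0 is a common cause of the post-treatment mediator M1 and outcome Y1.
# Approach (A) ignores baselines; approach (C) conditions on them (ANCOVA-style).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
T = rng.integers(0, 2, n)                                  # randomised treatment
M0 = rng.normal(size=n)                                    # baseline mediator (common cause)
M1 = 0.5 * T + 0.8 * M0 + rng.normal(size=n)               # post-treatment mediator
Y1 = 0.4 * M1 + 0.8 * M0 + 0.3 * T + rng.normal(size=n)    # post-treatment outcome

def ols(y, X):
    """Ordinary least squares with an intercept; returns coefficient vector."""
    X = np.column_stack([np.ones(len(y))] + list(X))
    return np.linalg.lstsq(X, y, rcond=None)[0]

# (A) post-treatment only: the coefficient on M1 absorbs the M0 confounding (biased).
print("A: b_M1 =", ols(Y1, [T, M1])[2])
# (C) ANCOVA-style: conditioning on M0 recovers the true mediator effect (0.4).
print("C: b_M1 =", ols(Y1, [T, M1, M0])[2])
```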

  9. Wide-Field Imaging Interferometry Spatial-Spectral Image Synthesis Algorithms

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Leisawitz, David T.; Rinehart, Stephen A.; Memarsadeghi, Nargess; Sinukoff, Evan J.

    2012-01-01

    An algorithmic approach for wide field-of-view interferometric spatial-spectral image synthesis is developed. The data collected from the interferometer consist of a set of double-Fourier image data cubes, one cube per baseline. Each cube is three-dimensional, consisting of arrays of two-dimensional detector counts versus delay-line position. For each baseline a moving delay line allows collection of a large set of interferograms over the 2D wide-field detector grid: one sampled interferogram per detector pixel per baseline. This aggregate set of interferograms is algorithmically processed to construct a single spatial-spectral cube with angular resolution approaching the ratio of the wavelength to the longest baseline. The wide-field imaging is accomplished by ensuring that the range of motion of the delay line encompasses the zero optical path difference fringe for each detector pixel in the desired field of view. Each baseline cube is incoherent relative to all other baseline cubes and thus has only phase information relative to itself. This lost phase information is recovered by having point, or otherwise known, sources within the field of view. The reference source phase is known and utilized as a constraint to recover the coherent phase relation between the baseline cubes, and is key to the image synthesis. The mathematical formalism with phase referencing is described, and results are shown using data collected from the NASA/GSFC Wide-Field Imaging Interferometry Testbed (WIIT).
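
    The core of the double-Fourier idea, recovering spectral content from intensity versus delay-line position, can be illustrated for a single detector pixel. The sketch below builds a synthetic two-line interferogram and Fourier transforms it over optical path difference; the scan length, sampling and wavelengths are arbitrary assumptions, and the sketch omits the multi-baseline combination and phase referencing that the actual WIIT processing performs.

```python
# Minimal sketch: recovering a spectrum from one detector pixel's interferogram by
# Fourier transforming intensity vs optical path difference (OPD). Illustrative only.
import numpy as np

opd = np.linspace(-2e-3, 2e-3, 4096)       # delay-line scan, metres
wavelengths = np.array([50e-6, 80e-6])     # two far-IR "lines", metres (made up)
weights = np.array([1.0, 0.6])

# Interferogram: sum of cosine fringes, one per spectral component.
interf = sum(w * np.cos(2 * np.pi * opd / lam)
             for w, lam in zip(weights, wavelengths))

# FFT over OPD gives power vs wavenumber (1/lambda); peaks mark the input lines.
spectrum = np.abs(np.fft.rfft(interf - interf.mean()))
wavenumber = np.fft.rfftfreq(opd.size, d=opd[1] - opd[0])   # cycles per metre
peaks = wavenumber[np.argsort(spectrum)[-2:]]
print("recovered wavelengths (m):", np.sort(1.0 / peaks))
```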

  10. A Complete Procedure for Predicting and Improving the Performance of HAWT's

    NASA Astrophysics Data System (ADS)

    Al-Abadi, Ali; Ertunç, Özgür; Sittig, Florian; Delgado, Antonio

    2014-06-01

    A complete procedure for predicting and improving the performance of the horizontal axis wind turbine (HAWT) has been developed. The first process is predicting the power extracted by the turbine and the derived rotor torque, which should be identical to that of the drive unit. The BEM method and a developed post-stall treatment for resolving stall-regulated HAWTs are incorporated in the prediction. For that, a modified stall-regulated prediction model, which can predict the HAWT performance over the operating range of oncoming wind velocity, is derived from existing models. The model involves radius and chord, which makes it more general for predicting the performance of HAWTs of different scales and rotor shapes. The second process is modifying the rotor shape by an optimization process, which can be applied to any existing HAWT, to improve its performance. A gradient-based optimization is used for adjusting the chord and twist angle distribution of the rotor blade to increase the extraction of power while keeping the drive torque constant, so that the same drive unit can be retained. The final process is testing the modified turbine to predict its enhanced performance. The procedure is applied to the 10 kW NREL Phase VI as a baseline turbine. The study has proven the applicability of the developed model in predicting the performance of the baseline as well as the optimized turbine. In addition, the optimization method has shown that the power coefficient can be increased while keeping the same design rotational speed.
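
    To connect the prediction and optimization steps to concrete quantities, the sketch below computes rotor power from torque and rotational speed, plus the resulting power coefficient and tip-speed ratio; the rotor radius, wind speed, rotor speed and torque are placeholder numbers chosen only to be of the right order for a 10 kW machine, not NREL Phase VI data or output of the authors' BEM model.

```python
# Minimal sketch of the quantities the prediction step works with: rotor power from
# torque and rotational speed, and the power coefficient Cp. Numbers are illustrative.
import math

rho = 1.225                       # air density, kg/m^3
R = 5.03                          # rotor radius, m (order of a 10 kW machine)
V = 7.0                           # oncoming wind speed, m/s
omega = 72 * 2 * math.pi / 60     # rotor speed, rad/s (72 rpm, assumed)
Q = 800.0                         # predicted aerodynamic torque, N*m (e.g. from a BEM solve)

P = Q * omega                                  # mechanical power, W
Cp = P / (0.5 * rho * math.pi * R**2 * V**3)   # power coefficient
tsr = omega * R / V                            # tip-speed ratio
print(f"P = {P/1e3:.1f} kW, Cp = {Cp:.3f}, tip-speed ratio = {tsr:.2f}")
```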

  11. An MBSE Approach to Space Suit Development

    NASA Technical Reports Server (NTRS)

    Cordova, Lauren; Kovich, Christine; Sargusingh, Miriam

    2012-01-01

    The EVA/Space Suit Development Office (ESSD) Systems Engineering and Integration (SE&I) team has utilized model-based systems engineering (MBSE) in multiple programs. After developing operational and architectural models, the MBSE framework was expanded to link the requirements space to the system models through functional analysis and interface definitions. By documenting all the connections within the technical baseline, ESSD experienced significant efficiency improvements in analysis and identification of change impacts. One of the biggest challenges presented to the MBSE structure was a program transition and restructuring effort, which was completed successfully in 4 months, culminating in the approval of a new EVA Technical Baseline. During this time three requirements sets spanning multiple DRMs were streamlined into one NASA-owned Systems Requirement Document (SRD) that successfully identified requirements relevant to the current hardware development effort while remaining extensible to support future hardware developments. A capability-based hierarchy was established to provide a more flexible framework for future space suit development that can support multiple programs with minimal rework of basic EVA/Space Suit requirements. This MBSE approach was most recently applied for generation of an EMU Demonstrator technical baseline being developed for an ISS DTO. The relatively quick turnaround of operational concepts, architecture definition, and requirements for this new suit development has allowed us to test and evolve the MBSE process and framework in an extremely different setting while still offering extensibility and traceability throughout ESSD projects. The ESSD MBSE framework continues to be evolved in order to support integration of all products associated with the SE&I engine.

  12. Mindful "Vitality in Practice": an intervention to improve the work engagement and energy balance among workers; the development and design of the randomised controlled trial.

    PubMed

    van Berkel, Jantien; Proper, Karin I; Boot, Cécile R L; Bongers, Paulien M; van der Beek, Allard J

    2011-09-27

    Modern working life has become more mental and less physical in nature, contributing to impaired mental health and a disturbed energy balance. This may result in mental health problems and overweight. Both are significant threats to the health of workers and thus also a financial burden for society, including employers. Targeting work engagement and energy balance could prevent impaired mental health and overweight, respectively. The study population consists of highly educated workers in two Dutch research institutes. The intervention was systematically developed, based on the Intervention Mapping (IM) protocol, involving workers and management in the process. The workers' needs were assessed by combining the results of interviews, focus group discussions and a questionnaire with available literature. Suitable methods and strategies were selected, resulting in an intervention including: eight weeks of customized mindfulness training, followed by eight sessions of e-coaching and supporting elements, such as providing fruit and snack vegetables at the workplace, lunch walking routes, and a buddy system. The effects of the intervention will be evaluated in an RCT, with measurements at baseline, six months (T1) and 12 months (T2). In addition, the cost-effectiveness and process of the intervention will also be evaluated. At baseline the level of work engagement of the sample was "average". Of the study population, 60.1% did not engage in vigorous physical activity at all. An average working day consists of eight sedentary hours. For the Phase II RCT, there were no significant differences between the intervention and the control group at baseline, except for vigorous physical activity. The baseline characteristics of the study population were congruent with the results of the needs assessment. The IM protocol used for the systematic development of the intervention produced an appropriate intervention to test in the planned RCT. Netherlands Trial Register (NTR): NTR2199.

  13. Mindful "Vitality in Practice": an intervention to improve the work engagement and energy balance among workers; the development and design of the randomised controlled trial

    PubMed Central

    2011-01-01

    Background Modern working life has become more mental and less physical in nature, contributing to impaired mental health and a disturbed energy balance. This may result in mental health problems and overweight. Both are significant threats to the health of workers and thus also a financial burden for society, including employers. Targeting work engagement and energy balance could prevent impaired mental health and overweight, respectively. Methods/Design The study population consists of highly educated workers in two Dutch research institutes. The intervention was systematically developed, based on the Intervention Mapping (IM) protocol, involving workers and management in the process. The workers' needs were assessed by combining the results of interviews, focus group discussions and a questionnaire with available literature. Suitable methods and strategies were selected, resulting in an intervention including: eight weeks of customized mindfulness training, followed by eight sessions of e-coaching and supporting elements, such as providing fruit and snack vegetables at the workplace, lunch walking routes, and a buddy system. The effects of the intervention will be evaluated in an RCT, with measurements at baseline, six months (T1) and 12 months (T2). In addition, the cost-effectiveness and process of the intervention will also be evaluated. Discussion At baseline the level of work engagement of the sample was "average". Of the study population, 60.1% did not engage in vigorous physical activity at all. An average working day consists of eight sedentary hours. For the Phase II RCT, there were no significant differences between the intervention and the control group at baseline, except for vigorous physical activity. The baseline characteristics of the study population were congruent with the results of the needs assessment. The IM protocol used for the systematic development of the intervention produced an appropriate intervention to test in the planned RCT. Trial registration number Netherlands Trial Register (NTR): NTR2199 PMID:21951433

  14. Development and fabrication of a graphite polyimide box beam

    NASA Technical Reports Server (NTRS)

    Nadler, M. A.; Darms, F. J.

    1972-01-01

    The state of the art of graphite/polyimide structures was evaluated, and key design and fabrication issues to be considered in future hardware programs are defined. The fabrication and testing at 500 F of a graphite/polyimide center wing box beam using OV-10A aircraft criteria were accomplished. The baseline design of this box was developed in a series of studies of other advanced composite materials: glass/epoxy, boron/epoxy, and boron/polyimide. The use of this basic design permits ready comparison of the performance of graphite/polyimide with these materials. Modifications to the baseline composite design were made only in those areas affected by the change of materials. Processing studies of graphite fiber/polyimide resin systems resulted in the selection of a Modmor II/Gemon L material.

  15. Ensembles of novelty detection classifiers for structural health monitoring using guided waves

    NASA Astrophysics Data System (ADS)

    Dib, Gerges; Karpenko, Oleksii; Koricho, Ermias; Khomenko, Anton; Haq, Mahmoodul; Udpa, Lalita

    2018-01-01

    Guided wave structural health monitoring uses sparse sensor networks embedded in sophisticated structures for defect detection and characterization. The biggest challenge of those sensor networks is developing robust techniques for reliable damage detection under changing environmental and operating conditions (EOC). To address this challenge, we develop a novelty classifier for damage detection based on one class support vector machines. We identify appropriate features for damage detection and introduce a feature aggregation method which quadratically increases the number of available training observations. We adopt a two-level voting scheme by using an ensemble of classifiers and predictions. Each classifier is trained on a different segment of the guided wave signal, and each classifier makes an ensemble of predictions based on a single observation. Using this approach, the classifier can be trained using a small number of baseline signals. We study the performance using Monte-Carlo simulations of an analytical model and data from impact damage experiments on a glass fiber composite plate. We also demonstrate the classifier performance using two types of baseline signals: fixed and rolling baseline training set. The former requires prior knowledge of baseline signals from all EOC, while the latter does not and leverages the fact that EOC vary slowly over time and can be modeled as a Gaussian process.
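
    A compact way to see the core of the approach is a one-class SVM trained only on baseline (undamaged) feature vectors, which then flags departures as novelties. The sketch below uses scikit-learn's OneClassSVM on random placeholder features; the feature extraction, segment-wise ensemble voting and the real guided-wave data are not reproduced here.

```python
# Minimal sketch of a one-class SVM novelty detector trained on baseline guided-wave
# features; features and data are placeholders, not the authors' pipeline.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)

# Baseline (undamaged) feature vectors, e.g. energy/correlation features extracted
# from one time segment of the guided-wave signal per sensor path (synthetic here).
baseline_features = rng.normal(loc=0.0, scale=1.0, size=(40, 4))

clf = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(baseline_features)

# New observations: one similar to baseline, one shifted (possible damage).
new = np.array([[0.1, -0.2, 0.0, 0.3],
                [3.5, 3.0, -2.8, 4.0]])
print(clf.predict(new))   # +1 = consistent with baseline, -1 = novelty (damage flag)
```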

  16. Assessment of pictographs developed through a participatory design process using an online survey tool.

    PubMed

    Kim, Hyeoneui; Nakamura, Carlos; Zeng-Treitler, Qing

    2009-02-24

    Inpatient discharge instructions are a mandatory requirement of the Centers for Medicare and Medicaid Services and the Joint Commission on Accreditation of Healthcare Organizations. The instructions include all the information relevant to post-discharge patient care. Prior studies show that patients often cannot fully understand or remember all the instructions. To address this issue, we have previously conducted a pilot study in which pictographs were created through a participatory design process to facilitate the comprehension and recall of discharge instructions. The main objective of this study was to verify the individual effectiveness of pictographs created through a participatory design process. In this study, we included 20 pictographs developed by our group and 20 pictographs developed by the Robert Wood Johnson Foundation as a reference baseline for pictographic recognition. To assess whether the participants could recognize the meaning of the pictographs, we designed an asymmetrical pictograph-text label-linking test. Data collection lasted for 7 days after the email invitation. A total of 44 people accessed the survey site. We excluded 7 participants who completed less than 50% of the survey. A total of 719 answers from 37 participants were analyzed. The analysis showed that the participants recognized the pictographs developed in-house significantly better than those included in the study as a baseline (P < .001). This trend was true regardless of the participant's gender, age, and education level. The results also revealed that there is a large variance in the quality of the pictographs developed using the same design process: the recognition rate ranged from below 50% to above 90%. This study confirmed that the majority of the pictographs developed in a participatory design process involving a small number of nurses and consumers were recognizable by a larger number of consumers. The variance in recognition rates suggests that pictographs should be assessed individually before being evaluated within the context of an application.

  17. Life support systems analysis and technical trades for a lunar outpost

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Ganapathi, G. B.; Rohatgi, N. K.; Seshan, P. K.

    1994-01-01

    The NASA/JPL life support systems analysis (LiSSA) software tool was used to perform life support system analysis and technology trades for a Lunar Outpost. The life support system was modeled using a chemical process simulation program on a steady-state, one-person, daily basis. Inputs to the LiSSA model include metabolic balance load data, hygiene load data, technology selection, process operational assumptions, and mission parameter assumptions. A baseline set of technologies was used, against which comparisons were made by running twenty-two cases with technology substitutions. System, subsystem, and technology weights and powers are compared for a crew of 4 and missions of 90 and 600 days. By assigning a weight value to power, equivalent system weights are compared. Several less-developed technologies show potential advantages over the baseline. Solid waste treatment technologies show weight and power disadvantages, but one of them could have benefits associated with the reduction of hazardous wastes and with very long missions. Technology development toward reducing the weight of resupplies and lighter materials of construction was recommended. It was also recommended that, as technologies are funded for development, contractors should be required to generate and report data useful for quantitative technology comparisons.
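
    The comparison metric mentioned above, an equivalent system weight that folds power demand and resupply into a single mass figure, can be sketched as below; the power penalty factor, mission length and all hardware numbers are made-up placeholders rather than LiSSA inputs or results.

```python
# Minimal sketch of the equivalent-system-weight idea used to compare technologies:
# convert power demand into an equivalent mass via a mass-per-kilowatt penalty and
# add cumulative resupply mass. All values below are illustrative placeholders.
def equivalent_system_weight(hardware_kg, power_kw, resupply_kg_per_day, mission_days,
                             power_penalty_kg_per_kw=200.0):
    """Hardware mass + mass-equivalent of power + cumulative resupply mass."""
    return (hardware_kg
            + power_penalty_kg_per_kw * power_kw
            + resupply_kg_per_day * mission_days)

baseline = equivalent_system_weight(1200.0, 3.5, 4.0, 600)
candidate = equivalent_system_weight(1500.0, 2.0, 1.5, 600)
print(f"baseline ESW = {baseline:.0f} kg, candidate ESW = {candidate:.0f} kg")
```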

  18. Improved Topographic Mapping Through Multi-Baseline SAR Interferometry with MAP Estimation

    NASA Astrophysics Data System (ADS)

    Dong, Yuting; Jiang, Houjun; Zhang, Lu; Liao, Mingsheng; Shi, Xuguo

    2015-05-01

    There is an inherent contradiction between the sensitivity of height measurement and the accuracy of phase unwrapping for SAR interferometry (InSAR) over rough terrain. This contradiction can be resolved by multi-baseline InSAR analysis, which exploits multiple phase observations with different normal baselines to improve phase unwrapping accuracy, or even avoid phase unwrapping. In this paper we propose a maximum a posteriori (MAP) estimation method assisted by SRTM DEM data for multi-baseline InSAR topographic mapping. Based on our method, a data processing flow is established and applied in processing multi-baseline ALOS/PALSAR dataset. The accuracy of resultant DEMs is evaluated by using a standard Chinese national DEM of scale 1:10,000 as reference. The results show that multi-baseline InSAR can improve DEM accuracy compared with single-baseline case. It is noteworthy that phase unwrapping is avoided and the quality of multi-baseline InSAR DEM can meet the DTED-2 standard.
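
    The way several baselines and a DEM prior can jointly constrain height, without explicitly unwrapping each interferogram, can be illustrated with a one-pixel grid search over a posterior. In the sketch below the height-to-phase factors, noise levels and the prior standard deviation are invented numbers, and the simple grid search stands in for the paper's full MAP estimation over a SAR scene.

```python
# Minimal sketch of a MAP height search over multiple baselines: each baseline
# contributes a wrapped-phase likelihood, and an external DEM supplies a Gaussian
# prior that regularises the search. All geometry constants are illustrative.
import numpy as np

k = [0.05, 0.12, 0.21]                # height-to-phase factors (rad per metre), one per baseline
sigma_phi = 0.3                       # assumed phase noise std (rad)
h_prior, sigma_prior = 230.0, 15.0    # reference DEM height and its assumed uncertainty (m)

true_h = 242.0
rng = np.random.default_rng(2)
wrapped = [np.angle(np.exp(1j * (ki * true_h + rng.normal(0, sigma_phi)))) for ki in k]

h_grid = np.arange(150.0, 350.0, 0.1)
log_post = -0.5 * ((h_grid - h_prior) / sigma_prior) ** 2          # DEM prior term
for ki, phi in zip(k, wrapped):
    resid = np.angle(np.exp(1j * (phi - ki * h_grid)))              # wrapped phase residual
    log_post += -0.5 * (resid / sigma_phi) ** 2                     # per-baseline likelihood

print("MAP height estimate (m):", h_grid[np.argmax(log_post)])
```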

  19. Development of the Hydroecological Integrity Assessment Process for Determining Environmental Flows for New Jersey Streams

    USGS Publications Warehouse

    Kennen, Jonathan G.; Henriksen, James A.; Nieswand, Steven P.

    2007-01-01

    The natural flow regime paradigm and parallel stream ecological concepts and theories have established the benefits of maintaining or restoring the full range of natural hydrologic variation for physiochemical processes, biodiversity, and the evolutionary potential of aquatic and riparian communities. A synthesis of recent advances in hydroecological research coupled with stream classification has resulted in a new process to determine environmental flows and assess hydrologic alteration. This process has national and international applicability. It allows classification of streams into hydrologic stream classes and identification of a set of non-redundant and ecologically relevant hydrologic indices for 10 critical sub-components of flow. Three computer programs have been developed for implementing the Hydroecological Integrity Assessment Process (HIP): (1) the Hydrologic Indices Tool (HIT), which calculates 171 ecologically relevant hydrologic indices on the basis of daily-flow and peak-flow stream-gage data; (2) the New Jersey Hydrologic Assessment Tool (NJHAT), which can be used to establish a hydrologic baseline period, provide options for setting baseline environmental-flow standards, and compare past and proposed streamflow alterations; and (3) the New Jersey Stream Classification Tool (NJSCT), designed for placing unclassified streams into pre-defined stream classes. Biological and multivariate response models including principal-component, cluster, and discriminant-function analyses aided in the development of software and implementation of the HIP for New Jersey. A pilot effort is currently underway by the New Jersey Department of Environmental Protection in which the HIP is being used to evaluate the effects of past and proposed surface-water use, ground-water extraction, and land-use changes on stream ecosystems while determining the most effective way to integrate the process into ongoing regulatory programs. Ultimately, this scientifically defensible process will help to quantify the effects of anthropogenic changes and development on hydrologic variability and help planners and resource managers balance current and future water requirements with ecological needs.
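
    As a flavour of what the Hydrologic Indices Tool computes from daily-flow records, the sketch below derives a magnitude index, a variability index and a low-flow exceedance value from a short synthetic record; the ten-day series is fabricated and the three indices are simplified stand-ins, not the HIT's actual 171-index implementation.

```python
# Minimal sketch of ecologically relevant indices computed from a daily-flow record
# (simplified stand-ins for the kinds of magnitude/variability/low-flow indices used).
import numpy as np

daily_flow_cfs = np.array([12.0, 15.0, 30.0, 120.0, 60.0, 25.0, 18.0, 14.0, 13.0, 12.5])

mean_flow = daily_flow_cfs.mean()                          # magnitude: mean daily flow
cv_pct = daily_flow_cfs.std(ddof=1) / mean_flow * 100.0    # variability: coefficient of variation (%)
q90 = np.percentile(daily_flow_cfs, 10)                    # low flow: flow exceeded ~90% of the time

print(f"mean daily flow = {mean_flow:.1f} cfs, CV = {cv_pct:.0f}%, "
      f"90% exceedance flow = {q90:.1f} cfs")
```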

  20. Spacesuit glove manufacturing enhancements through the use of advanced technologies

    NASA Astrophysics Data System (ADS)

    Cadogan, David; Bradley, David; Kosmo, Joseph

    The success of astronauts performing extravehicular activity (EVA) on orbit is highly dependent upon the performance of their spacesuit gloves. A study has recently been conducted to advance the development and manufacture of spacesuit gloves. The process replaces the manual techniques of spacesuit glove manufacture by utilizing emerging technologies such as laser scanning, Computer Aided Design (CAD), computer generation of two-dimensional patterns from three-dimensional surfaces, rapid prototyping technology, and laser cutting of materials to manufacture the new gloves. Results of the program indicate that the baseline process will not increase the cost of the gloves as compared to the existing styles and, in production, may reduce the cost of the gloves. Perhaps the most important outcome of the Laserscan process is that greater accuracy and design control can be realized. Greater accuracy was achieved in the baseline anthropometric measurement and CAD data measurement, which subsequently improved the design features. This effectively enhances glove performance through better fit and comfort.

  1. SHARP's systems engineering challenge: rectifying integrated product team requirements with performance issues in an evolutionary spiral development acquisition

    NASA Astrophysics Data System (ADS)

    Kuehl, C. Stephen

    2003-08-01

    Completing its final development and early deployment on the Navy's multi-role aircraft, the F/A-18 E/F Super Hornet, the SHAred Reconnaissance Pod (SHARP) provides the war fighter with the latest digital tactical reconnaissance (TAC Recce) Electro-Optical/Infrared (EO/IR) sensor system. The SHARP program is an evolutionary acquisition that used a spiral development process across a prototype development phase tightly coupled into overlapping Engineering and Manufacturing Development (EMD) and Low Rate Initial Production (LRIP) phases. Under a tight budget environment with a highly compressed schedule, SHARP challenged traditional acquisition strategies and systems engineering (SE) processes. Adopting tailored state-of-the-art systems engineering process models allowed the SHARP program to overcome the technical knowledge transition challenges imposed by a compressed program schedule. The program's original goal was the deployment of digital TAC Recce mission capabilities to the fleet customer by summer of 2003. Hardware and software integration technical challenges resulted from requirements definition and analysis activities performed across a government-industry led Integrated Product Team (IPT) involving Navy engineering and test sites, Boeing, and RTSC-EPS (with its subcontracted hardware and government furnished equipment vendors). Requirements development from a bottom-up approach was adopted, using an electronic requirements capture environment to clarify and establish the SHARP EMD product baseline specifications as relevant technical data became available. Applying Earned-Value Management (EVM) against an Integrated Master Schedule (IMS) resulted in efficiently managing SE task assignments and product deliveries in a dynamically evolving customer requirements environment. Application of Six Sigma improvement methodologies resulted in the uncovering of root causes of errors in wiring interconnectivity drawings, pod manufacturing processes, and avionics requirements specifications. Utilizing the draft NAVAIR SE guideline handbook and the ANSI/EIA-632 standard, Processes for Engineering a System, a tailored systems engineering process approach was adopted for the accelerated SHARP EMD program. Tailoring SE processes in this accelerated product delivery environment provided unique opportunities to be technically creative in the establishment of a product performance baseline. This paper provides an historical overview of the systems engineering activities spanning the prototype phase through the EMD SHARP program phase, the performance requirement capture activities and refinement process challenges, and what SE process improvements can be applied to future SHARP-like programs adopting a compressed, evolutionary spiral development acquisition paradigm.

  2. Multiple Acquisition InSAR Analysis: Persistent Scatterer and Small Baseline Approaches

    NASA Astrophysics Data System (ADS)

    Hooper, A.

    2006-12-01

    InSAR techniques that process data from multiple acquisitions enable us to form time series of deformation and also allow us to reduce error terms present in single interferograms. There are currently two broad categories of methods that deal with multiple images: persistent scatterer methods and small baseline methods. The persistent scatterer approach relies on identifying pixels whose scattering properties vary little with time and look angle. Pixels that are dominated by a singular scatterer best meet these criteria; therefore, images are processed at full resolution to both increase the chance of there being only one dominant scatterer present, and to reduce the contribution from other scatterers within each pixel. In images where most pixels contain multiple scatterers of similar strength, even at the highest possible resolution, the persistent scatterer approach is less optimal, as the scattering characteristics of these pixels vary substantially with look angle. In this case, an approach that interferes only pairs of images for which the difference in look angle is small makes better sense, and resolution can be sacrificed to reduce the effects of the look angle difference by band-pass filtering. This is the small baseline approach. Existing small baseline methods depend on forming a series of multilooked interferograms and unwrapping each one individually. This approach fails to take advantage of two of the benefits of processing multiple acquisitions, however, which are usually embodied in persistent scatterer methods: the ability to find and extract the phase for single-look pixels with good signal-to-noise ratio that are surrounded by noisy pixels, and the ability to unwrap more robustly in three dimensions, the third dimension being that of time. We have developed, therefore, a new small baseline method to select individual single-look pixels that behave coherently in time, so that isolated stable pixels may be found. After correction for various error terms, the phase values of the selected pixels are unwrapped using a new three-dimensional algorithm. We apply our small baseline method to an area in southern Iceland that includes Katla and Eyjafjallajökull volcanoes, and retrieve a time series of deformation that shows transient deformation due to intrusion of magma beneath Eyjafjallajökull. We also process the data using the Stanford method for persistent scatterers (StaMPS) for comparison.
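
    One ingredient of the new small baseline method described above, selecting single-look pixels that behave coherently in time, can be mimicked with a temporal-coherence measure on residual phases. In the sketch below the residual-phase stack is synthetic and the 0.8 threshold is an arbitrary assumption; the real selection also involves correction of error terms and filtering steps that are omitted here.

```python
# Minimal sketch: keep only pixels whose residual phase is stable across a stack of
# small-baseline interferograms, measured by temporal coherence. Data are synthetic.
import numpy as np

rng = np.random.default_rng(5)
n_ifgs, n_pixels = 20, 1000

# Residual phase per interferogram and pixel (after model/spatial filtering):
# most pixels are noisy, about 10% are stable (small residual phase).
noise_level = np.where(rng.random(n_pixels) < 0.1, 0.2, 2.0)
phase_resid = rng.normal(0.0, noise_level, size=(n_ifgs, n_pixels))

# Temporal coherence: magnitude of the mean unit phasor over the stack (1 = stable).
gamma = np.abs(np.mean(np.exp(1j * phase_resid), axis=0))
selected = np.where(gamma > 0.8)[0]
print(f"{selected.size} of {n_pixels} pixels selected as temporally coherent")
```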

  3. Parametric study of two planar high power flexible solar array concepts

    NASA Technical Reports Server (NTRS)

    Garba, J. A.; Kudija, D. A.; Zeldin, B.; Costogue, E. N.

    1978-01-01

    The design parameters examined were: frequency, aspect ratio, packaging constraints, and array blanket flatness. Specific power-to-mass ratios for both solar arrays as a function of array frequency and array width were developed and plotted. Summaries of the baseline design data, developed equations, the computer program operation, plots of the parameters, and the process for using the information as a design manual are presented.

  4. A diagnostic prototype of the potable water subsystem of the Space Station Freedom ECLSS

    NASA Technical Reports Server (NTRS)

    Lukefahr, Brenda D.; Rochowiak, Daniel M.; Benson, Brian L.; Rogers, John S.; Mckee, James W.

    1989-01-01

    In analyzing the baseline Environmental Control and Life Support System (ECLSS) command and control architecture, various processes are found which would be enhanced by the use of knowledge-based system methods of implementation. The process most suitable for prototyping using rule-based methods is documented, while domain knowledge resources and other practical considerations are examined. Requirements for a prototype rule-based software system are documented. These requirements reflect Space Station Freedom ECLSS software and hardware development efforts, and knowledge-based system requirements. A quick prototype knowledge-based system environment is researched and developed.

  5. Impact of neurocognition on social and role functioning in individuals at clinical high risk for psychosis.

    PubMed

    Carrión, Ricardo E; Goldberg, Terry E; McLaughlin, Danielle; Auther, Andrea M; Correll, Christoph U; Cornblatt, Barbara A

    2011-08-01

    Cognitive deficits have been well documented in schizophrenia and have been shown to impair quality of life and to compromise everyday functioning. Recent studies of adolescents and young adults at high risk for developing psychosis show that neurocognitive impairments are detectable before the onset of psychotic symptoms. However, it remains unclear how cognitive impairments affect functioning before the onset of psychosis. The authors assessed cognitive impairment in adolescents at clinical high risk for psychosis and examined its impact on social and role functioning. A sample of 127 treatment-seeking patients at clinical high risk for psychosis and a group of 80 healthy comparison subjects were identified and recruited for research in the Recognition and Prevention Program. At baseline, participants were assessed with a comprehensive neurocognitive battery as well as measures of social and role functioning. Relative to healthy comparison subjects, clinical high-risk patients showed significant impairments in the domains of processing speed, verbal memory, executive function, working memory, visuospatial processing, motor speed, sustained attention, and language. Clinical high-risk patients also displayed impaired social and role functioning at baseline. Among patients with attenuated positive symptoms, processing speed was related to social and role functioning at baseline. These findings demonstrate that cognitive and functional impairments are detectable in patients at clinical high risk for psychosis before the onset of psychotic illness and that processing speed appears to be an important cognitive predictor of poor functioning.

  6. Impact of Neurocognition on Social and Role Functioning in Individuals at Clinical High Risk for Psychosis

    PubMed Central

    Carrión, Ricardo E.; Goldberg, Terry E.; McLaughlin, Danielle; Auther, Andrea M.; Correll, Christoph U.; Cornblatt, Barbara A.

    2011-01-01

    Objective Cognitive deficits have been well documented in schizophrenia and have been shown to impair quality of life and to compromise everyday functioning. Recent studies of adolescents and young adults at high risk for developing psychosis show that neurocognitive impairments are detectable before the onset of psychotic symptoms. However, it remains unclear how cognitive impairments affect functioning before the onset of psychosis. The authors assessed cognitive impairment in adolescents at clinical high risk for psychosis and examined its impact on social and role functioning. Method A sample of 127 treatment-seeking patients at clinical high risk for psychosis and a group of 80 healthy comparison subjects were identified and recruited for research in the Recognition and Prevention Program. At baseline, participants were assessed with a comprehensive neurocognitive battery as well as measures of social and role functioning. Results Relative to healthy comparison subjects, clinical high-risk patients showed significant impairments in the domains of processing speed, verbal memory, executive function, working memory, visuospatial processing, motor speed, sustained attention, and language. Clinical high-risk patients also displayed impaired social and role functioning at baseline. Among patients with attenuated positive symptoms, processing speed was related to social and role functioning at baseline. Conclusions These findings demonstrate that cognitive and functional impairments are detectable in patients at clinical high risk for psychosis before the onset of psychotic illness and that processing speed appears to be an important cognitive predictor of poor functioning. PMID:21536691

  7. Development of multichannel analyzer using sound card ADC for nuclear spectroscopy system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibrahim, Maslina Mohd; Yussup, Nolida; Lombigit, Lojius

    This paper describes the development of a Multi-Channel Analyzer (MCA) using a sound card analogue-to-digital converter (ADC) for a nuclear spectroscopy system. The system was divided into a hardware module and a software module. The hardware module consists of a NaI(Tl) 2″ by 2″ detector, a Pulse Shaping Amplifier (PSA) and a built-in ADC chip readily available in any computer's sound system. The software module is divided into two parts: pre-processing of the raw digital input and the development of the MCA software. A band-pass filter and baseline stabilization and correction were implemented for the pre-processing. For the MCA development, the pulse height analysis method was used to process the signal before displaying it using a histogram technique. The development and test results of using the sound card as an MCA are discussed.
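
    The pulse height analysis step amounts to binning baseline-corrected pulse amplitudes into channels. The sketch below histograms synthetic pulse heights into a 1024-channel spectrum and locates the resulting photopeak; the channel count, peak position and the synthetic pulses are illustrative assumptions, not data from the sound-card ADC described in the paper.

```python
# Minimal sketch of the pulse-height analysis step: bin baseline-corrected pulse
# amplitudes from the digitised detector signal into an MCA histogram. Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n_channels = 1024

# Pretend pulse heights (ADC channels) after filtering and baseline restoration:
# a photopeak near channel 662 plus a low-energy continuum.
pulse_heights = np.concatenate([
    rng.normal(662, 15, 2000),        # photopeak
    rng.uniform(50, 600, 3000),       # continuum
]).clip(0, n_channels - 1)

spectrum, _ = np.histogram(pulse_heights, bins=n_channels, range=(0, n_channels))
print("photopeak channel:", int(np.argmax(spectrum)))
```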

  8. Loris Malaguzzi, Reggio Emilia and Democratic Alternatives to Early Childhood Education Assessment

    ERIC Educational Resources Information Center

    Roberts-Holmes, Guy

    2017-01-01

    This article responds to the dangers arising from baseline assessment in reception classes. It contrasts predictive testing which claims to ascertain each child's ability and potential with the processes of observation, documentation and discussion developed in Reggio Emilia. It explores the two very different understandings of children which they…

  9. An analysis of the market potential of water hyacinth-based systems for municipal wastewater treatment

    NASA Technical Reports Server (NTRS)

    Robinson, A. C.; Gorman, H. J.; Hillman, M.; Lawhon, W. T.; Maase, D. L.; Mcclure, T. A.

    1976-01-01

    The potential U.S. market for tertiary municipal wastewater treatment facilities which make use of water hyacinths was investigated. A baseline design was developed which approximates the "typical" or "average" situation under which hyacinth-based systems can be used. The total market size for tertiary treatment was then estimated for those geographical regions in which hyacinths appear to be applicable. Market penetration of the baseline hyacinth system when competing with conventional chemical and physical processing systems was approximated, based primarily on cost differences. A limited analysis was made of the sensitivity of market penetration to individual changes in these assumptions.

  10. CryoSat Level1b SAR/SARin BaselineC: Product Format and Algorithm Improvements

    NASA Astrophysics Data System (ADS)

    Scagliola, Michele; Fornari, Marco; Di Giacinto, Andrea; Bouffard, Jerome; Féménias, Pierre; Parrinello, Tommaso

    2015-04-01

    CryoSat was launched on the 8th April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, thus making the received echoes phase coherent and suitable for azimuth processing. This allows a significantly improved along-track resolution to be reached with respect to traditional pulse-width-limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested and validated in order to improve the quality of the Level1b products. The current IPF, Baseline B, was released into operations in February 2012. A reprocessing campaign followed, in order to reprocess the data acquired since July 2010. After more than 2 years of development, the release into operations of Baseline C is expected in the first half of 2015. Baseline C Level1b products will be distributed in an updated format, including for example the attitude information (roll, pitch and yaw) and, for SAR/SARin, a waveform length doubled with respect to Baseline B. Moreover, various algorithm improvements have been identified: • a datation bias of about -0.5195 ms will be corrected (SAR/SARin) • a range bias of about 0.6730 m will be corrected (SAR/SARin) • a roll bias of 0.1062 deg and a pitch bias of 0.0520 deg will be corrected • surface sample stack weighting to filter out the single-look echoes acquired at the highest look angles, which results in a sharpening of the 20 Hz waveforms. With the operational release of Baseline C, the second CryoSat reprocessing campaign will be initiated, taking advantage of the upgrades implemented in the IPF1 processing chain but also at IPF2 level. The reprocessing campaign will cover the full CryoSat mission starting on 16th July 2010. This poster details the new information that will be added to the CryoSat Baseline C Level1b SAR/SARin products, and the main quality improvements are described.
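
    A minimal sketch of how the listed Baseline C biases could be removed from Baseline B-era fields is given below; only the numerical bias values come from the abstract, while the record field names, units handling and, in particular, the sign convention for removing each bias are assumptions made for illustration.

```python
# Minimal sketch: remove the Baseline C-era biases from hypothetical Level1b record
# fields. Field names and the subtraction sign convention are assumptions; only the
# bias magnitudes are taken from the abstract above.
DATATION_BIAS_S = -0.5195e-3   # seconds
RANGE_BIAS_M = 0.6730          # metres
ROLL_BIAS_DEG = 0.1062
PITCH_BIAS_DEG = 0.0520

def correct_record(rec):
    """rec: dict with 'time_s', 'range_m', 'roll_deg', 'pitch_deg' (hypothetical fields)."""
    out = dict(rec)
    out["time_s"] -= DATATION_BIAS_S
    out["range_m"] -= RANGE_BIAS_M
    out["roll_deg"] -= ROLL_BIAS_DEG
    out["pitch_deg"] -= PITCH_BIAS_DEG
    return out

print(correct_record({"time_s": 0.0, "range_m": 100.0, "roll_deg": 0.0, "pitch_deg": 0.0}))
```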

  11. Encapsulation Processing and Manufacturing Yield Analysis

    NASA Technical Reports Server (NTRS)

    Willis, P.

    1985-01-01

    An evaluation of the ethyl vinyl acetate (EVA) encapsulation system is presented. This work is part of the materials baseline needed to demonstrate a 30-year module lifetime capability. Process and compound variables are both being studied, along with various module materials. Results have shown that EVA should be stored rolled up and enclosed in a plastic bag to retard loss of peroxide curing agents. The TBEC curing agent has a longer shelf life and better processing characteristics than the earlier Lupersol-101 curing agent. Analytical methods were developed to test for peroxide content, and experimental methodologies were formalized.

  12. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Restructuring research objectives from a technical readiness demonstration program to an investigation of high risk, high payoff activities associated with producing photovoltaic modules using non-CZ sheet material is reported. Deletion of the module frame in favor of a frameless design, and modification in cell series parallel electrical interconnect configuration are reviewed. A baseline process sequence was identified for the fabrication of modules using the selected dendritic web sheet material, and economic evaluations of the sequence were completed.

  13. Simple processes drive unpredictable differences in estuarine fish assemblages: Baselines for understanding site-specific ecological and anthropogenic impacts

    NASA Astrophysics Data System (ADS)

    Sheaves, Marcus

    2016-03-01

    Predicting patterns of abundance and composition of biotic assemblages is essential to our understanding of key ecological processes, and our ability to monitor, evaluate and manage assemblages and ecosystems. Fish assemblages often vary from estuary to estuary in apparently unpredictable ways, making it challenging to develop a general understanding of the processes that determine assemblage composition. This makes it problematic to transfer understanding from one estuary situation to another and therefore difficult to assemble effective management plans or to assess the impacts of natural and anthropogenic disturbance. Although system-to-system variability is a common property of ecological systems, rather than being random it is the product of complex interactions of multiple causes and effects at a variety of spatial and temporal scales. I investigate the drivers of differences in estuary fish assemblages, to develop a simple model explaining the diversity and complexity of observed estuary-to-estuary differences, and explore its implications for management and conservation. The model attributes apparently unpredictable differences in fish assemblage composition from estuary to estuary to the interaction of species-specific, life history-specific and scale-specific processes. In explaining innate faunal differences among estuaries without the need to invoke complex ecological or anthropogenic drivers, the model provides a baseline against which the effects of additional natural and anthropogenic factors can be evaluated.

  14. U-10Mo Baseline Fuel Fabrication Process Description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hubbard, Lance R.; Arendt, Christina L.; Dye, Daniel F.

    This document provides a description of the U.S. High Power Research Reactor (USHPRR) low-enriched uranium (LEU) fuel fabrication process. This document is intended to be used in conjunction with the baseline process flow diagram (PFD) presented in Appendix A. The baseline PFD is used to document the fabrication process, communicate gaps in technology or manufacturing capabilities, convey alternatives under consideration, and as the basis for a dynamic simulation model of the fabrication process. The simulation model allows for the assessment of production rates, costs, and manufacturing requirements (manpower, fabrication space, numbers and types of equipment, etc.) throughout the lifecycle of the USHPRR program. This document, along with the accompanying PFD, is updated regularly.

  15. Decreasing Postanesthesia Care Unit to Floor Transfer Times to Facilitate Short Stay Total Joint Replacements.

    PubMed

    Sibia, Udai S; Grover, Jennifer; Turcotte, Justin J; Seanger, Michelle L; England, Kimberly A; King, Jennifer L; King, Paul J

    2018-04-01

    We describe a process for studying and improving baseline postanesthesia care unit (PACU)-to-floor transfer times after total joint replacements. Quality improvement project using lean methodology. Phase I of the investigational process involved collection of baseline data. Phase II involved developing targeted solutions to improve throughput. Phase III measured project sustainability. Phase I investigations revealed that patients spent an additional 62 minutes waiting in the PACU after being designated ready for transfer. Five to 16 telephone calls were needed between the PACU and the unit to facilitate each patient transfer. The most common reason for delay was unavailability of the unit nurse, who was attending to another patient (58%). Phase II interventions resulted in transfer times decreasing to 13 minutes (79% reduction, P < .001). Phase III recorded sustained transfer times at 30 minutes, a net 52% reduction (P < .001) from baseline. Lean methodology resulted in the immediate decrease of PACU-to-floor transfer times by 79%, with a 52% sustained improvement. Our methods can also be used to improve efficiencies of care at other institutions. Copyright © 2016 American Society of PeriAnesthesia Nurses. Published by Elsevier Inc. All rights reserved.

  16. Similarities in error processing establish a link between saccade prediction at baseline and adaptation performance.

    PubMed

    Wong, Aaron L; Shelhamer, Mark

    2014-05-01

    Adaptive processes are crucial in maintaining the accuracy of body movements and rely on error storage and processing mechanisms. Although classically studied with adaptation paradigms, evidence of these ongoing error-correction mechanisms should also be detectable in other movements. Despite this connection, current adaptation models are challenged when forecasting adaptation ability with measures of baseline behavior. On the other hand, we have previously identified an error-correction process present in a particular form of baseline behavior, the generation of predictive saccades. This process exhibits long-term intertrial correlations that decay gradually (as a power law) and are best characterized with the tools of fractal time series analysis. Since this baseline task and adaptation both involve error storage and processing, we sought to find a link between the intertrial correlations of the error-correction process in predictive saccades and the ability of subjects to alter their saccade amplitudes during an adaptation task. Here we find just such a relationship: the stronger the intertrial correlations during prediction, the more rapid the acquisition of adaptation. This reinforces the links found previously between prediction and adaptation in motor control and suggests that current adaptation models are inadequate to capture the complete dynamics of these error-correction processes. A better understanding of the similarities in error processing between prediction and adaptation might provide the means to forecast adaptation ability with a baseline task. This would have many potential uses in physical therapy and the general design of paradigms of motor adaptation. Copyright © 2014 the American Physiological Society.
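
    The long-term intertrial correlations referred to above are typically quantified with fractal time-series tools such as detrended fluctuation analysis (DFA). The sketch below implements a basic first-order DFA and applies it to synthetic white noise, where the scaling exponent should come out near 0.5; the scale choices and the series are illustrative, and this is not the analysis pipeline used in the study.

```python
# Minimal sketch of first-order detrended fluctuation analysis (DFA) for quantifying
# long-range intertrial correlations in a trial series. Synthetic white noise input.
import numpy as np

def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
    y = np.cumsum(x - np.mean(x))                  # integrated (profile) series
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        f2 = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
            f2.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope   # DFA alpha: ~0.5 uncorrelated, >0.5 persistent long-range correlations

rng = np.random.default_rng(4)
print("alpha =", round(dfa_exponent(rng.normal(size=1024)), 2))
```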

  17. Beyond the Baseline: Proceedings of the Space Station Evolution Symposium. Volume 2, Part 2; Space Station Freedom Advanced Development Program

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This report contains the individual presentations delivered at the Space Station Evolution Symposium in League City, Texas on February 6, 7, 8, 1990. Personnel responsible for Advanced Systems Studies and Advanced Development within the Space Station Freedom program reported on the results of their work to date. Systems Studies presentations focused on identifying the baseline design provisions (hooks and scars) necessary to enable evolution of the facility to support changing space policy and anticipated user needs. Also emphasized were evolution configuration and operations concepts including on-orbit processing of space transfer vehicles. Advanced Development task managers discussed transitioning advanced technologies to the baseline program, including those near-term technologies which will enhance the safety and productivity of the crew and the reliability of station systems. Special emphasis was placed on applying advanced automation technology to ground and flight systems. This publication consists of two volumes. Volume 1 contains the results of the advanced system studies with the emphasis on reference evolution configurations, system design requirements and accommodations, and long-range technology projections. Volume 2 reports on advanced development tasks within the Transition Definition Program. Products of these tasks include: engineering fidelity demonstrations and evaluations on Station development testbeds and Shuttle-based flight experiments; detailed requirements and performance specifications which address advanced technology implementation issues; and mature applications and the tools required for the development, implementation, and support of advanced technology within the Space Station Freedom Program.

  18. Development of a consensus core dataset in juvenile dermatomyositis for clinical use to inform research.

    PubMed

    McCann, Liza J; Pilkington, Clarissa A; Huber, Adam M; Ravelli, Angelo; Appelbe, Duncan; Kirkham, Jamie J; Williamson, Paula R; Aggarwal, Amita; Christopher-Stine, Lisa; Constantin, Tamas; Feldman, Brian M; Lundberg, Ingrid; Maillard, Sue; Mathiesen, Pernille; Murphy, Ruth; Pachman, Lauren M; Reed, Ann M; Rider, Lisa G; van Royen-Kerkof, Annet; Russo, Ricardo; Spinty, Stefan; Wedderburn, Lucy R; Beresford, Michael W

    2018-02-01

    This study aimed to develop consensus on an internationally agreed dataset for juvenile dermatomyositis (JDM), designed for clinical use, to enhance collaborative research and allow integration of data between centres. A prototype dataset was developed through a formal process that included analysing items within existing databases of patients with idiopathic inflammatory myopathies. This template was used to aid a structured multistage consensus process. Exploiting Delphi methodology, two web-based questionnaires were distributed to healthcare professionals caring for patients with JDM identified through email distribution lists of international paediatric rheumatology and myositis research groups. A separate questionnaire was sent to parents of children with JDM and patients with JDM, identified through established research networks and patient support groups. The results of these parallel processes informed a face-to-face nominal group consensus meeting of international myositis experts, tasked with defining the content of the dataset. This developed dataset was tested in routine clinical practice before review and finalisation. A dataset containing 123 items was formulated with an accompanying glossary. Demographic and diagnostic data are contained within form A collected at baseline visit only, disease activity measures are included within form B collected at every visit and disease damage items within form C collected at baseline and annual visits thereafter. Through a robust international process, a consensus dataset for JDM has been formulated that can capture disease activity and damage over time. This dataset can be incorporated into national and international collaborative efforts, including existing clinical research databases. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  19. Development Of Regional Climate Mitigation Baseline For A DominantAgro-Ecological Zone Of Karnataka, India

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sudha, P.; Shubhashree, D.; Khan, H.

    2007-06-01

    Setting a baseline for carbon stock changes in forest and land use sector mitigation projects is an essential step for assessing additionality of the project. There are two approaches for setting baselines, namely project-specific and regional baselines. This paper presents the methodology adopted for estimating the land available for mitigation, for developing a regional baseline, the transaction cost involved, and a comparison of project-specific and regional baselines. The study showed that it is possible to estimate the potential land and its suitability for afforestation and reforestation mitigation projects, using existing maps and data, in the dry zone of Karnataka, southern India. The study adopted a three-step approach for developing a regional baseline, namely: i) identification of likely baseline options for land use, ii) estimation of baseline rates of land-use change, and iii) quantification of the baseline carbon profile over time. The analysis showed that carbon stock estimates made for wastelands and fallow lands for the project-specific as well as the regional baseline are comparable. The ratio of wasteland carbon stocks of a project to the regional baseline is 1.02, and that of fallow lands in the project to the regional baseline is 0.97. The cost of conducting field studies for determination of a regional baseline is about a quarter of the cost of developing a project-specific baseline on a per-hectare basis. The study has shown the reliability, feasibility and cost-effectiveness of adopting regional baselines for forestry sector mitigation projects.

  20. Development of stitched/RTM primary structures for transport aircraft

    NASA Technical Reports Server (NTRS)

    Hawley, Arthur V.

    1993-01-01

    This report covers work accomplished in the Innovative Composite Aircraft Primary Structure (ICAPS) program. An account is given of the design criteria and philosophy that guides the development. Wing and fuselage components used as a baseline for development are described. The major thrust of the program is to achieve a major cost breakthrough through development of stitched dry preforms and resin transfer molding (RTM), and progress on these processes is reported. A full description is provided on the fabrication of the stitched RTM wing panels. Test data are presented.

  1. Seeing light at the end of the tunnel: Positive prospective mental imagery and optimism in depression.

    PubMed

    Ji, Julie L; Holmes, Emily A; Blackwell, Simon E

    2017-01-01

    Optimism is associated with positive outcomes across many health domains, from cardiovascular disease to depression. However, we know little about cognitive processes underlying optimism in psychopathology. The present study tested whether the ability to vividly imagine positive events in one's future was associated with dispositional optimism in a sample of depressed adults. Cross-sectional and longitudinal analyses were conducted, using baseline (all participants, N=150) and follow-up data (participants in the control condition only, N=63) from a clinical trial (Blackwell et al., 2015). Vividness of positive prospective imagery, assessed on a laboratory-administered task at baseline, was significantly associated with both current optimism levels at baseline and future (seven months later) optimism levels, including when controlling for potential confounds. Even when depressed, those individuals able to envision a brighter future were more optimistic, and regained optimism more quickly over time, than those less able to do so at baseline. Strategies to increase the vividness of positive prospective imagery may aid development of mental health interventions to boost optimism. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.

  2. Ensembles of novelty detection classifiers for structural health monitoring using guided waves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dib, Gerges; Karpenko, Oleksii; Koricho, Ermias

    Guided wave structural health monitoring uses sparse sensor networks embedded in sophisticated structures for defect detection and characterization. The biggest challenge of those sensor networks is developing robust techniques for reliable damage detection under changing environmental and operating conditions. To address this challenge, we develop a novelty classifier for damage detection based on one-class support vector machines. We identify appropriate features for damage detection and introduce a feature aggregation method which quadratically increases the number of available training observations. We adopt a two-level voting scheme by using an ensemble of classifiers and predictions. Each classifier is trained on a different segment of the guided wave signal, and each classifier makes an ensemble of predictions based on a single observation. Using this approach, the classifier can be trained using a small number of baseline signals. We study the performance using Monte Carlo simulations of an analytical model and data from impact damage experiments on a glass fiber composite plate. We also demonstrate the classifier performance using two types of baseline signals: fixed and rolling baseline training sets. The former requires prior knowledge of baseline signals from all environmental and operating conditions, while the latter does not and leverages the fact that environmental and operating conditions vary slowly over time and can be modeled as a Gaussian process.
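    As a rough illustration of the voting scheme this abstract describes, the sketch below trains one one-class SVM per signal segment on baseline (undamaged) signals and declares damage when enough segments vote "novel". It is a minimal sketch under assumed inputs (equal-length signals, a hypothetical segment count and voting threshold), not the authors' implementation.

    ```python
    # Minimal sketch (assumptions, not the authors' code): an ensemble of one-class
    # SVMs, each trained on a different segment of baseline guided-wave signals,
    # with majority voting across segments to flag a new signal as damaged.
    import numpy as np
    from sklearn.svm import OneClassSVM

    def split_segments(signal, n_segments):
        """Split a 1-D guided-wave signal into segments (equal-length signals assumed)."""
        return np.array_split(np.asarray(signal), n_segments)

    def train_ensemble(baseline_signals, n_segments=8, nu=0.05):
        """Train one one-class SVM per segment index on baseline (undamaged) data."""
        models = []
        for k in range(n_segments):
            X = np.vstack([split_segments(s, n_segments)[k] for s in baseline_signals])
            models.append(OneClassSVM(nu=nu, gamma="scale").fit(X))
        return models

    def predict_damage(models, signal, vote_threshold=0.5):
        """Each segment classifier votes; damage is declared if enough segments are novel."""
        segments = split_segments(signal, len(models))
        votes = [int(m.predict(seg.reshape(1, -1))[0] == -1) for m, seg in zip(models, segments)]
        return sum(votes) / len(votes) >= vote_threshold
    ```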

  3. Marital Conflict, Allostatic Load, and the Development of Children's Fluid Cognitive Performance

    ERIC Educational Resources Information Center

    Hinnant, J. Benjamin; El-Sheikh, Mona; Keiley, Margaret; Buckhalt, Joseph A.

    2013-01-01

    Relations between marital conflict, children's respiratory sinus arrhythmia (RSA), and fluid cognitive performance were examined over 3 years to assess allostatic processes. Participants were 251 children; reports of marital conflict were collected, baseline RSA and RSA reactivity (RSA-R) to a lab challenge were recorded, and fluid cognitive performance…

  4. Process Waste Assessment for the Diana Laser Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, N.M.

    1993-12-01

    This Process Waste Assessment was conducted to evaluate the Diana Laser Laboratory, located in the Combustion Research Facility. It documents the hazardous chemical waste streams generated by the laser process and establishes a baseline for future waste minimization efforts. This Process Waste Assessment will be reevaluated in approximately 18 to 24 months, after enough time has passed to implement recommendations and to compare results with the baseline established in this assessment.

  5. Building a Privacy, Ethics, and Data Access Framework for Real World Computerised Medical Record System Data: A Delphi Study. Contribution of the Primary Health Care Informatics Working Group.

    PubMed

    Liyanage, H; Liaw, S-T; Di Iorio, C T; Kuziemsky, C; Schreiber, R; Terry, A L; de Lusignan, S

    2016-11-10

    Privacy, ethics, and data access issues pose significant challenges to the timely delivery of health research. Whilst the fundamental drivers to ensure that data access is ethical and satisfies privacy requirements are similar, they are often dealt with in varying ways by different approval processes. The objective was to achieve a consensus across an international panel of health care and informatics professionals on an integrated set of privacy and ethics principles that could accelerate health data access in data-driven health research projects. A three-round consensus development process was used. In round one, we developed a baseline framework for privacy, ethics, and data access based on a review of existing literature in the health, informatics, and policy domains. This was further developed using a two-round Delphi consensus building process involving 20 experts who were members of the International Medical Informatics Association (IMIA) and European Federation of Medical Informatics (EFMI) Primary Health Care Informatics Working Groups. To achieve consensus we required an extended Delphi process. The first round involved feedback on and development of the baseline framework. This consisted of four components: (1) ethical principles, (2) ethical guidance questions, (3) privacy and data access principles, and (4) privacy and data access guidance questions. Round two developed consensus in key areas of the revised framework, allowing the building of a new, more detailed and descriptive framework. In the final round, panel experts expressed their opinions, either as agreements or disagreements, on the ethics and privacy statements of the framework, finding some of the previous-round disagreements surprising in view of established ethical principles. This study develops a framework for an integrated approach to ethics and privacy. Privacy breach risk should not be considered in isolation but instead balanced by potential ethical benefit.

  6. GPS-based system for satellite tracking and geodesy

    NASA Technical Reports Server (NTRS)

    Bertiger, Willy I.; Thornton, Catherine L.

    1989-01-01

    High-performance receivers and data processing systems developed for GPS are reviewed. The GPS Inferred Positioning System (GIPSY) and the Orbiter Analysis and Simulation Software (OASIS) are described. The OASIS software is used to assess GPS system performance using GIPSY for data processing. Consideration is given to parameter estimation for multiday arcs, orbit repeatability, orbit prediction, daily baseline repeatability, agreement with VLBI, and ambiguity resolution. Also, the dual-frequency Rogue receiver, which can track up to eight GPS satellites simultaneously, is discussed.

  7. Pretreatment data is highly predictive of liver chemistry signals in clinical trials.

    PubMed

    Cai, Zhaohui; Bresell, Anders; Steinberg, Mark H; Silberg, Debra G; Furlong, Stephen T

    2012-01-01

    The goal of this retrospective analysis was to assess how well predictive models could determine which patients would develop liver chemistry signals during clinical trials based on their pretreatment (baseline) information. Based on data from 24 late-stage clinical trials, classification models were developed to predict liver chemistry outcomes using baseline information, which included demographics, medical history, concomitant medications, and baseline laboratory results. Predictive models using baseline data predicted which patients would develop liver signals during the trials with average validation accuracy around 80%. Baseline levels of individual liver chemistry tests were most important for predicting their own elevations during the trials. High bilirubin levels at baseline were not uncommon and were associated with a high risk of developing biochemical Hy's law cases. Baseline γ-glutamyltransferase (GGT) level appeared to have some predictive value, but did not increase predictability beyond using established liver chemistry tests. It is possible to predict which patients are at a higher risk of developing liver chemistry signals using pretreatment (baseline) data. Derived knowledge from such predictions may allow proactive and targeted risk management, and the type of analysis described here could help determine whether new biomarkers offer improved performance over established ones.
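    The kind of baseline-only prediction described here can be sketched generically as follows; the DataFrame, column names (including the binary label "liver_signal"), and model choice are hypothetical placeholders, not the published models or data.

    ```python
    # Illustrative sketch only (not the published models): predict which patients
    # develop an on-treatment liver chemistry signal from baseline covariates.
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def fit_baseline_model(baseline_df: pd.DataFrame):
        X = baseline_df.drop(columns=["liver_signal"])  # demographics, history, baseline labs
        y = baseline_df["liver_signal"]                 # 1 = liver signal observed during trial
        model = RandomForestClassifier(n_estimators=500, random_state=0)
        accuracy = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
        return model.fit(X, y), accuracy   # the paper reports ~80% validation accuracy
    ```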

  8. Subfoveal choroidal thickness predicts macular atrophy in age-related macular degeneration: results from the TREX-AMD trial.

    PubMed

    Fan, Wenying; Abdelfattah, Nizar Saleh; Uji, Akihito; Lei, Jianqin; Ip, Michael; Sadda, SriniVas R; Wykoff, Charles C

    2018-03-01

    Our purpose was to evaluate the relationship between subfoveal choroidal thickness (SCT) and development of macular atrophy (MA) in eyes with age-related macular degeneration (AMD). This was a prospective, multicenter study. Sixty participants (120 eyes) in the TREX-AMD trial (NCT01648292) with treatment-naïve neovascular AMD (NVAMD) in at least one eye were included. SCT was measured by certified reading center graders at baseline using spectral domain optical coherence tomography (SDOCT). The baseline SCT was correlated with the presence of MA at baseline and development of incident MA by month 18. Generalized estimating equations were used to account for information from both eyes. Baseline SCT in eyes with MA was statistically significantly less than in those without MA in both the dry AMD (DAMD) (P = 0.04) and NVAMD (P = 0.01) groups. Comparison of baseline SCT between MA developers and non-MA developers revealed a statistically significant difference (P = 0.03). Receiver operating characteristic curve (ROC) analysis showed the cut-off threshold of SCT for predicting the development of MA in cases without MA at baseline was 124 μm (AUC = 0.772; Sensitivity = 0.923; Specificity = 0.5). Among eyes without MA at baseline, those with baseline SCT ≤124 μm were 4.3 times more likely to develop MA (Odds ratio: 4.3, 95% confidence interval: 1.6-12, P = 0.005) than those with baseline SCT >124 μm. Eyes with AMD and MA had less SCT than those without MA. Eyes with less baseline SCT also appear to be at higher risk to develop MA within 18 months.
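    For readers unfamiliar with the statistic, the reported odds ratio compares the odds of developing MA between the thin-choroid (SCT ≤124 μm) and thick-choroid groups; the generic 2x2 definition is shown below (the cell counts a, b, c, d are placeholders, not the study's data).

    ```latex
    % Generic 2x2 odds ratio and Wald confidence interval, as used for the
    % SCT <= 124 um threshold; a, b, c, d are placeholder cell counts.
    %                 MA developed   no MA
    %  SCT <= 124 um       a           b
    %  SCT  > 124 um       c           d
    \[
      \mathrm{OR} = \frac{a/b}{c/d} = \frac{a\,d}{b\,c}, \qquad
      95\%\ \mathrm{CI} = \exp\!\left(\ln \mathrm{OR} \pm 1.96\sqrt{\tfrac{1}{a}+\tfrac{1}{b}+\tfrac{1}{c}+\tfrac{1}{d}}\right).
    \]
    ```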

  9. Defense Remote Handled Transuranic Waste Cost/Schedule Optimization Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierce, G.D.; Beaulieu, D.H.; Wolaver, R.W.

    1986-11-01

    The purpose of this study is to provide the DOE information with which it can establish the most efficient program for the long-term management and disposal, in the Waste Isolation Pilot Plant (WIPP), of remote handled (RH) transuranic (TRU) waste. To fulfill this purpose, a comprehensive review of waste characteristics, existing and projected waste inventories, processing and transportation options, and WIPP requirements was made. Cost differences between waste management alternatives were analyzed and compared to an established baseline. The result of this study is an information package that DOE can use as the basis for policy decisions. As part of this study, a comprehensive list of alternatives for each element of the baseline was developed and reviewed with the sites. The principal conclusions of the study follow. A single processing facility for RH TRU waste is both necessary and sufficient. The RH TRU processing facility should be located at Oak Ridge National Laboratory (ORNL). Shielding of RH TRU to contact handled levels is not an economic alternative in general, but is an acceptable alternative for specific waste streams. Compaction is only cost effective at the ORNL processing facility, with a possible exception at Hanford for small compaction of paint cans of newly generated glovebox waste. It is more cost effective to ship certified waste to WIPP in 55-gal drums than in canisters, assuming a suitable drum cask becomes available. Since some waste forms cannot be packaged in drums, a canister/shielded cask capability is also required. To achieve the desired disposal rate, the ORNL processing facility must be operational by 1996. Implementing the conclusions of this study can save approximately $110 million, compared to the baseline, in facility, transportation, and interim storage costs through the year 2013. 10 figs., 28 tabs.

  10. Distributed state machine supervision for long-baseline gravitational-wave detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rollins, Jameson Graef, E-mail: jameson.rollins@ligo.org

    The Laser Interferometer Gravitational-wave Observatory (LIGO) consists of two identical yet independent, widely separated, long-baseline gravitational-wave detectors. Each Advanced LIGO detector consists of complex optical-mechanical systems isolated from the ground by multiple layers of active seismic isolation, all controlled by hundreds of fast, digital, feedback control systems. This article describes a novel state machine-based automation platform developed to handle the automation and supervisory control challenges of these detectors. The platform, called Guardian, consists of distributed, independent, state machine automaton nodes organized hierarchically for full detector control. User code is written in standard Python and the platform is designed to facilitate the fast-paced development process associated with commissioning the complicated Advanced LIGO instruments. While developed specifically for the Advanced LIGO detectors, Guardian is a generic state machine automation platform that is useful for experimental control at all levels, from simple table-top setups to large-scale multi-million dollar facilities.

  11. An application of multiattribute decision analysis to the Space Station Freedom program. Case study: Automation and robotics technology evaluation

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey H.; Levin, Richard R.; Carpenter, Elisabeth J.

    1990-01-01

    The results are described of an application of multiattribute analysis to the evaluation of high leverage prototyping technologies in the automation and robotics (A and R) areas that might contribute to the Space Station (SS) Freedom baseline design. An implication is that high leverage prototyping is beneficial to the SS Freedom Program as a means for transferring technology from the advanced development program to the baseline program. The process also highlights the tradeoffs to be made between subsidizing high value, low risk technology development versus high value, high risk technology developments. Twenty-one A and R technology tasks spanning a diverse array of technical concepts were evaluated using multiattribute decision analysis. Because of large uncertainties associated with characterizing the technologies, the methodology was modified to incorporate uncertainty. Eight attributes affected the rankings, including initial cost, operations cost, crew productivity, safety, resource requirements, growth potential, and spinoff potential. The four attributes of initial cost, operations cost, crew productivity, and safety affected the rankings the most.

  12. Red Plague Control Plan (RPCP)

    NASA Technical Reports Server (NTRS)

    Cooke, Robert W.

    2010-01-01

    SCOPE: Prescribes the minimum requirements for the control of cuprous / cupric oxide corrosion (a.k.a. Red Plague) of silver-coated copper wire, cable, and harness assemblies. PURPOSE: Targeted for applications where exposure to assembly processes, environmental conditions, and contamination may promote the development of cuprous / cupric oxide corrosion (a.k.a. Red Plague) in silver-coated copper wire, cable, and harness assemblies. Does not exclude any alternate or contractor-proprietary documents or processes that meet or exceed the baseline of requirements established by this document. Use of alternate or contractor-proprietary documents or processes shall require review and prior approval of the procuring NASA activity.

  13. 40 CFR 63.1281 - Control equipment requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... dehydration unit baseline operations (as defined in § 63.1271). Records of glycol dehydration unit baseline... the Administrator's satisfaction, the conditions for which glycol dehydration unit baseline operations... emission reduction of 95.0 percent for the glycol dehydration unit process vent. Only modifications in...

  14. 40 CFR 63.1281 - Control equipment requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... dehydration unit baseline operations (as defined in § 63.1271). Records of glycol dehydration unit baseline... the Administrator's satisfaction, the conditions for which glycol dehydration unit baseline operations... emission reduction of 95.0 percent for the glycol dehydration unit process vent. Only modifications in...

  15. Individual differences in the cortisol-awakening response during the first two years of shift work: A longitudinal study in novice police officers.

    PubMed

    Lammers-van der Holst, Heidi M; Kerkhof, Gerard A

    2015-01-01

    Cortisol acts as a critical biological intermediary through which chronic stressors like shift work impact upon multiple physiological, neuro-endocrine and hormonal functions. Therefore, the cortisol awakening response (CAR) is suggested as a prime index of shift work tolerance. Repeated assessments of the CAR (calculated as MnInc) in a group of 25 young novice police officers showed that in the interval between about 4 and 14 months after transitioning from regular day work to rotating shift work, mean values began to rise from baseline to significantly higher levels at about 14 months after they commenced shift work. Visual inspection of the individual trends revealed that a subgroup of 10 subjects followed a monotonically rising trend, whereas another 14 subjects, after an initial rise from about 4-14 months, reverted to a smaller, baseline level cortisol response at about 20 months after the start of shift work. If the initial increase in the cortisol response marks the development of a chronic stress response, the subsequent reversal to baseline levels in the subgroup of 14 participants might be indicative of a process of recovery, possibly the development of shift work tolerance.
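    The abstract does not define how the MnInc index is computed; in the cortisol awakening response literature it is conventionally the mean of the post-awakening samples minus the awakening sample, as in the generic formula below (the symbols are ours, not the authors').

    ```latex
    % Conventional CAR "mean increase" (MnInc); S_0 is the cortisol sample at
    % awakening and S_1..S_n the post-awakening samples (generic definition).
    \[
      \mathrm{MnInc} \;=\; \frac{1}{n}\sum_{i=1}^{n} S_i \;-\; S_0 .
    \]
    ```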

  16. Baseline design of an OTEC pilot plantship. Volume A. Detailed report. [Performance analysis of OTEC power plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George, J. F.; Richards, D.; Perini, L. L.

    1979-05-01

    The Applied Physics Laboratory (APL) of the Johns Hopkins University has engineered a baseline design of an Ocean Thermal Energy Conversion (OTEC) pilot plantship. The work was sponsored jointly by the Department of Energy and the US Maritime Administration of the Department of Commerce. The design, drawings, specifications, supporting calculations, and narrative documentation are available through APL for use by the Government and industry for the acquisition of a pilot OTEC system. The baseline design features a platform that is configured to produce up to 20 MW(e) (net) power, using low-cost folded-tube aluminum heat exchangers, while it grazes slowly in tropical waters where the thermal gradient is greatest and the ocean environment is least severe. The design was developed by a team of contractors whose capabilities provided a systems approach to the design process. The work is documented in three volumes. Volume A is the Detailed report, which develops the design rationale, summarizes important calculations, outlines areas for future work, and presents a study of system costs. Volumes B and C, respectively, contain the engineering drawings and specifications.

  17. Factors Contributing to Disparities in Baseline Neurocognitive Performance and Concussion Symptom Scores Between Black and White Collegiate Athletes.

    PubMed

    Wallace, Jessica; Covassin, Tracey; Moran, Ryan; Deitrick, Jamie McAllister

    2017-11-02

    National Collegiate Athletic Association (NCAA) concussion guidelines state that all NCAA athletes must have a concussion baseline test prior to commencing their competitive season. To date, little research has examined potential racial differences on baseline neurocognitive performance among NCAA athletes. The purpose of this study was to investigate differences between Black and White collegiate athletes on baseline neurocognitive performance and self-reported symptoms. A total of 597 collegiate athletes (400 White, 197 Black) participated in this study. Athletes self-reported their race on the demographic section of their pre-participation physical examination and were administered the Immediate Post-Concussion Assessment and Cognitive Test (ImPACT) neurocognitive battery in a supervised, quiet room. Controlling for sex, data were analyzed using separate one-way analyses of covariance (ANCOVAs) on symptom score, verbal and visual memory, visual motor processing speed, and reaction time composite scores. Results revealed significant differences between White and Black athletes on baseline symptom score (F(1,542) = 5.82, p = .01), visual motor processing speed (F(1,542) = 14.89, p < .001), and reaction time (F(1,542) = 11.50, p < .01). White athletes performed better than Black athletes on baseline visual motor processing speed and reaction time. Black athletes reported higher baseline symptom scores compared to Whites. There was no statistically significant difference between races on verbal memory (p = .08) or visual memory (p = .06). Black athletes demonstrated disparities on some neurocognitive measures at baseline. These results support capturing an individual baseline for each athlete, as normative data comparisons may be inappropriate for athletes of a racial minority.

  18. Deformation Estimation In Non-Urban Areas Exploiting High Resolution SAR Data

    NASA Astrophysics Data System (ADS)

    Goel, Kanika; Adam, Nico

    2012-01-01

    Advanced techniques such as the Small Baseline Subset Algorithm (SBAS) have been developed for terrain motion mapping in non-urban areas with a focus on extracting information from distributed scatterers (DSs). SBAS uses small baseline differential interferograms (to limit the effects of geometric decorrelation), and these are typically multilooked to reduce phase noise, resulting in loss of resolution. Various error sources, e.g. phase unwrapping errors, topographic errors, temporal decorrelation and atmospheric effects, also affect the interferometric phase. The aim of our work is improved deformation monitoring in non-urban areas exploiting high resolution SAR data. The paper provides technical details and a processing example of a newly developed technique which incorporates an adaptive spatial phase filtering algorithm for accurate high-resolution differential interferometric stacking, followed by deformation retrieval via the SBAS approach, where the phase inversion is performed using a more robust L1-norm minimization.
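    The final step above, an L1-norm inversion of the small-baseline interferogram network, can be illustrated with a toy iteratively reweighted least-squares (IRLS) solver; the design matrix, phase vector, and the assumption that the system has full column rank (e.g. the first epoch fixed to zero) are ours, not taken from the paper.

    ```python
    # Toy sketch (assumptions, not the authors' code): L1-norm inversion of unwrapped
    # small-baseline interferogram phases into per-epoch phases for one pixel,
    # using iteratively reweighted least squares (IRLS).
    import numpy as np

    def sbas_l1_inversion(A, dphi, n_iter=30, eps=1e-6):
        """A: (n_ifg, n_epochs) design matrix (+1 at the later epoch, -1 at the earlier,
        with one reference epoch removed so A has full column rank).
        dphi: (n_ifg,) unwrapped interferometric phases for one pixel."""
        x = np.linalg.lstsq(A, dphi, rcond=None)[0]   # ordinary L2 solution as a start
        for _ in range(n_iter):
            r = A @ x - dphi
            w = 1.0 / np.maximum(np.abs(r), eps)      # IRLS weights approximating the L1 norm
            AW = A.T * w                              # equivalent to A.T @ diag(w)
            x = np.linalg.solve(AW @ A, AW @ dphi)
        return x
    ```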

  19. Do compensation processes impair mental health? A meta-analysis.

    PubMed

    Elbers, Nieke A; Hulst, Liesbeth; Cuijpers, Pim; Akkermans, Arno J; Bruinvels, David J

    2013-05-01

    Victims who are involved in a compensation process generally have more health complaints compared to victims who are not involved in a compensation process. Previous research regarding the effect of compensation processes has concentrated on the effect on physical health. This meta-analysis focuses on the effect of compensation processes on mental health. Prospective cohort studies addressing compensation and mental health after traffic accidents, occupational accidents or medical errors were identified using PubMed, EMBASE, PsycInfo, CINAHL, and the Cochrane Library. Relevant studies published between January 1966 and 10 June 2011 were selected for inclusion. Ten studies were included. The first finding was that the compensation group already had higher mental health complaints at baseline compared to the non-compensation group (standardised mean difference (SMD)=-0.38; 95% confidence interval (CI) -0.66 to -0.10; p=.01). The second finding was that mental health between baseline and post measurement improved less in the compensation group compared to the non-compensation group (SMD=-0.35; 95% CI -0.70 to -0.01; p=.05). However, the quality of evidence was limited, mainly because of low quality study design and heterogeneity. Being involved in a compensation process is associated with higher mental health complaints, but three-quarters of the difference appeared to be already present at baseline. The findings of this study should be interpreted with caution because of the limited quality of evidence. The difference at baseline may be explained by a selection bias or more anger and blame about the accident in the compensation group. The difference between baseline and follow-up may be explained by secondary gain and secondary victimisation. Future research should involve assessment of exposure to compensation processes, should analyse and correct for baseline differences, and could examine the effect of time, compensation scheme design, and claim settlement on (mental) health. Copyright © 2011 Elsevier Ltd. All rights reserved.
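    For reference, the standardised mean differences quoted above follow the usual between-group definition; the formula below is generic and not reproduced from the paper.

    ```latex
    % Generic standardised mean difference between the compensation (1) and
    % non-compensation (2) groups, with a pooled standard deviation.
    \[
      \mathrm{SMD} = \frac{\bar{x}_1 - \bar{x}_2}{s_{\mathrm{pooled}}}, \qquad
      s_{\mathrm{pooled}} = \sqrt{\frac{(n_1-1)s_1^2 + (n_2-1)s_2^2}{n_1+n_2-2}} .
    \]
    ```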

  20. Experimental Analysis of Small-Group Performance Effectiveness: Behavioral and Biological Interactions.

    DTIC Science & Technology

    1982-04-01

    processes requiring systematic experimental analysis. Accordingly, group performance effectiveness studies were initiated to assess the effects on...the experiment. Active processes associated with joining the respective established groups, but the absence of baseline levels precludes such an...novitiate in comparison to such values observed during baseline days suggested an active process associated with the joining of the group and emphasized the

  1. GNSS Single Frequency, Single Epoch Reliable Attitude Determination Method with Baseline Vector Constraint.

    PubMed

    Gong, Ang; Zhao, Xiubin; Pang, Chunlei; Duan, Rong; Wang, Yong

    2015-12-02

    For Global Navigation Satellite System (GNSS) single-frequency, single-epoch attitude determination, this paper proposes a new reliable method with a baseline vector constraint. First, prior knowledge of baseline length, heading, and pitch obtained from other navigation equipment or sensors is used to reconstruct the objective function rigorously. Then, the searching strategy is improved: a gradually enlarged ellipsoidal search space is substituted for the non-ellipsoidal search space, ensuring that the correct ambiguity candidates lie within it and allowing the search to be carried out directly by the least-squares ambiguity decorrelation adjustment (LAMBDA) method. Some of the vector candidates are further eliminated by a derived approximate inequality, which accelerates the search. Experimental results show that, compared to the traditional method with only a baseline length constraint, the new method can utilize a priori three-dimensional baseline knowledge to fix ambiguities reliably and achieve a high success rate. Tests also verify that it is not very sensitive to baseline vector error and performs robustly when the angular error is not large.
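    A much-simplified illustration of the constraint idea follows: enumerate integer ambiguity candidates near the float solution, discard those whose implied baseline length disagrees with the known length, and keep the best-fitting survivor. This is a brute-force sketch under assumed inputs (float ambiguities, design matrix, double-differenced phases, wavelength, known length), not the paper's algorithm and not a full LAMBDA implementation.

    ```python
    # Brute-force sketch (not the paper's method, not full LAMBDA): screen integer
    # ambiguity candidates with a known-baseline-length constraint. All inputs
    # (a_float, B, dd_phase, wavelength, known_length) are hypothetical.
    import itertools
    import numpy as np

    def baseline_from_ambiguities(B, dd_phase, a_int, wavelength):
        """Least-squares baseline vector for one fixed integer ambiguity vector."""
        rhs = wavelength * (dd_phase - a_int)
        return np.linalg.lstsq(B, rhs, rcond=None)[0]

    def screen_candidates(a_float, B, dd_phase, wavelength, known_length,
                          search_radius=1, length_tol=0.05):
        best, best_res = None, np.inf
        offsets = itertools.product(range(-search_radius, search_radius + 1),
                                    repeat=len(a_float))
        for off in offsets:
            a_int = np.round(a_float) + np.array(off)
            b = baseline_from_ambiguities(B, dd_phase, a_int, wavelength)
            if abs(np.linalg.norm(b) - known_length) > length_tol:
                continue                  # violates the baseline-length constraint
            res = np.linalg.norm(wavelength * (dd_phase - a_int) - B @ b)
            if res < best_res:
                best, best_res = (a_int, b), res
        return best                       # (fixed ambiguities, baseline) or None
    ```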

  2. Pretreatment data is highly predictive of liver chemistry signals in clinical trials

    PubMed Central

    Cai, Zhaohui; Bresell, Anders; Steinberg, Mark H; Silberg, Debra G; Furlong, Stephen T

    2012-01-01

    Purpose The goal of this retrospective analysis was to assess how well predictive models could determine which patients would develop liver chemistry signals during clinical trials based on their pretreatment (baseline) information. Patients and methods Based on data from 24 late-stage clinical trials, classification models were developed to predict liver chemistry outcomes using baseline information, which included demographics, medical history, concomitant medications, and baseline laboratory results. Results Predictive models using baseline data predicted which patients would develop liver signals during the trials with average validation accuracy around 80%. Baseline levels of individual liver chemistry tests were most important for predicting their own elevations during the trials. High bilirubin levels at baseline were not uncommon and were associated with a high risk of developing biochemical Hy’s law cases. Baseline γ-glutamyltransferase (GGT) level appeared to have some predictive value, but did not increase predictability beyond using established liver chemistry tests. Conclusion It is possible to predict which patients are at a higher risk of developing liver chemistry signals using pretreatment (baseline) data. Derived knowledge from such predictions may allow proactive and targeted risk management, and the type of analysis described here could help determine whether new biomarkers offer improved performance over established ones. PMID:23226004

  3. Meat Consumption and Risk of Developing Type 2 Diabetes in the SUN Project: A Highly Educated Middle-Class Population.

    PubMed

    Mari-Sanchis, A; Gea, A; Basterra-Gortari, F J; Martinez-Gonzalez, M A; Beunza, J J; Bes-Rastrollo, M

    2016-01-01

    Meat consumption has been consistently associated with the risk of diabetes in different populations. The aim of our study was to investigate the incidence of type 2 diabetes according to baseline total meat consumption in a longitudinal assessment of a middle-aged Mediterranean population. We followed 18,527 participants (mean age: 38 years, 61% women) in the SUN Project, an open-enrolment cohort of a highly educated population of middle-class Spanish graduate students. All participants were initially free of diabetes. Diet was assessed at baseline using a previously validated 136-item semi-quantitative food frequency questionnaire. Incident diabetes was defined according to the American Diabetes Association's criteria. We identified 146 incident cases of diabetes after a maximum of 14 years of follow-up (mean: 8.7 years). In the fully adjusted model, the consumption of ≥3 servings/day of all types of meat was significantly associated with a higher risk of diabetes (HR: 1.85; 95% CI: 1.03-3.31; p for trend = 0.031) in comparison with the reference category (<2 servings/day). When we separated processed from non-processed meat, we observed a non-significant higher risk associated with greater consumption of processed meat and a non-significant lower risk associated with non-processed meat consumption (p for trend = 0.123 and 0.487, respectively). No significant difference was found between the two types of meat (p = 0.594). Our results suggest that meat consumption, especially processed meat, was associated with a higher risk of developing diabetes in our young Mediterranean cohort.

  4. Investigation of Central Pain Processing in Post-Operative Shoulder Pain and Disability

    PubMed Central

    Valencia, Carolina; Fillingim, Roger B.; Bishop, Mark; Wu, Samuel S.; Wright, Thomas W.; Moser, Michael; Farmer, Kevin; George, Steven Z.

    2014-01-01

    Measures of central pain processing such as conditioned pain modulation (CPM) and suprathreshold heat pain response (SHPR) have been described to assess different components of central pain modulatory mechanisms. Central pain processing potentially plays a role in the development of postsurgical pain; however, the role of CPM and SHPR in explaining postoperative clinical pain and disability is still unclear. Seventy-eight patients with clinical shoulder pain were included in this study. Patients were examined before shoulder surgery and at 3 and 6 months after surgery. The primary outcome measures were pain intensity and upper extremity disability. Analyses revealed that the change score (baseline to 3 months) of the 5th pain rating of SHPR accounted for a significant amount of variance in 6 month postsurgical clinical pain intensity and disability after age, sex, preoperative pain intensity, and relevant psychological factors were considered. The present study suggests that baseline measures of central pain processing were not predictive of 6 month postoperative pain outcome. Instead, the 3 month change in SHPR might be a relevant factor in the transition to elevated 6-month postoperative pain and disability outcomes. In patients with shoulder pain, the 3 month change in a measure of central pain processing might be a relevant factor in the transition to elevated 6-month postoperative pain and disability scores. PMID:24042347

  5. The ASPRS Remote Sensing Industry Forecast: Phase II & III - Digital Sensor Compilation

    NASA Technical Reports Server (NTRS)

    Mondello, Charles

    2007-01-01

    In August 1999, ASPRS and NASA's (then) Commercial Remote Sensing Program (CRSP) entered into a 5-year Space Act Agreement (SAA), combining resources and expertise to: (a) Baseline the Remote Sensing Industry (RSI) based on GEIA Model; (b) Develop a 10-Year RSI market forecast and attendant processes; and (c) Provide improved information for decision makers.

  6. Sixth-Grade Students' Views of the Nature of Engineering and Images of Engineers

    ERIC Educational Resources Information Center

    Karatas, Faik O.; Micklos, Amy; Bodner, George M.

    2011-01-01

    This study investigated the views of the nature of engineering held by 6th-grade students to provide a baseline upon which activities or curriculum materials might be developed to introduce middle-school students to the work of engineers and the process of engineering design. A phenomenographic framework was used to guide the analysis of data…

  7. Progress since the World Summit for Children: A Statistical Review.

    ERIC Educational Resources Information Center

    United Nations Children's Fund, New York, NY.

    One of the strengths of the 1990 World Summit for Children was its emphasis on goals to drive development and shape actions, and on the need to monitor progress, thereby transforming the way the world collected and processed data on children and women and creating a vital base and baseline for progress. In 2000, an exhaustive end-decade review of…

  8. Improving Social Cognition in People with Schizophrenia with RC2S: Two Single-Case Studies.

    PubMed

    Peyroux, Elodie; Franck, Nicolas

    2016-01-01

    Difficulties in social interactions are a central characteristic of people with schizophrenia, and can be partly explained by impairments of social cognitive processes. New strategies of cognitive remediation have been recently developed to target these deficits. The RC2S therapy is an individualized and partly computerized program through which patients practice social interactions and develop social cognitive abilities with simulation techniques in a realistic environment. Here, we present the results of two case-studies involving two patients with schizophrenia presenting with specific profiles of impaired social cognition. Each patient completed three baseline sessions, 14 treatment sessions, and 3 follow-up sessions at the end of the therapy - and for 1 patient, another 3 sessions 9 months later. We used a multiple baseline design to assess specific components of social cognition according to the patients' profiles. Functioning and symptomatology were also assessed at the end of the treatment and 6 months later. Results highlight significant improvements in the targeted social cognitive processes and positive changes in functioning in the long term. The RC2S program seems, thus, to be a new useful program for social cognitive remediation in schizophrenia.

  9. Improving Social Cognition in People with Schizophrenia with RC2S: Two Single-Case Studies

    PubMed Central

    Peyroux, Elodie; Franck, Nicolas

    2016-01-01

    Difficulties in social interactions are a central characteristic of people with schizophrenia, and can be partly explained by impairments of social cognitive processes. New strategies of cognitive remediation have been recently developed to target these deficits. The RC2S therapy is an individualized and partly computerized program through which patients practice social interactions and develop social cognitive abilities with simulation techniques in a realistic environment. Here, we present the results of two case-studies involving two patients with schizophrenia presenting with specific profiles of impaired social cognition. Each patient completed three baseline sessions, 14 treatment sessions, and 3 follow-up sessions at the end of the therapy – and for 1 patient, another 3 sessions 9 months later. We used a multiple baseline design to assess specific components of social cognition according to the patients’ profiles. Functioning and symptomatology were also assessed at the end of the treatment and 6 months later. Results highlight significant improvements in the targeted social cognitive processes and positive changes in functioning in the long term. The RC2S program seems, thus, to be a new useful program for social cognitive remediation in schizophrenia. PMID:27199776

  10. Do not throw out the baby with the bath water: choosing an effective baseline for a functional localizer of speech processing.

    PubMed

    Stoppelman, Nadav; Harpaz, Tamar; Ben-Shachar, Michal

    2013-05-01

    Speech processing engages multiple cortical regions in the temporal, parietal, and frontal lobes. Isolating speech-sensitive cortex in individual participants is of major clinical and scientific importance. This task is complicated by the fact that responses to sensory and linguistic aspects of speech are tightly packed within the posterior superior temporal cortex. In functional magnetic resonance imaging (fMRI), various baseline conditions are typically used in order to isolate speech-specific from basic auditory responses. Using a short, continuous sampling paradigm, we show that reversed ("backward") speech, a commonly used auditory baseline for speech processing, removes much of the speech responses in frontal and temporal language regions of adult individuals. On the other hand, signal correlated noise (SCN) serves as an effective baseline for removing primary auditory responses while maintaining strong signals in the same language regions. We show that the response to reversed speech in left inferior frontal gyrus decays significantly faster than the response to speech, thus suggesting that this response reflects bottom-up activation of speech analysis followed up by top-down attenuation once the signal is classified as nonspeech. The results overall favor SCN as an auditory baseline for speech processing.

  11. Predicting infant cortical surface development using a 4D varifold-based learning framework and local topography-based shape morphing.

    PubMed

    Rekik, Islem; Li, Gang; Lin, Weili; Shen, Dinggang

    2016-02-01

    Longitudinal neuroimaging analysis methods have remarkably advanced our understanding of early postnatal brain development. However, learning predictive models to trace the evolution trajectories of both normal and abnormal cortical shapes remains broadly absent. To fill this critical gap, we pioneered the first prediction model for longitudinal developing cortical surfaces in infants using a spatiotemporal current-based learning framework solely from the baseline cortical surface. In this paper, we detail this prediction model and further improve its performance by introducing two key variants. First, we use the varifold metric to overcome the limitations of the current metric for surface registration that was used in our preliminary study. We also extend the conventional varifold-based surface registration model for pairwise registration to a spatiotemporal surface regression model. Second, we propose a morphing process of the baseline surface using its topographic attributes such as normal direction and principal curvature sign. Specifically, our method learns from longitudinal data both the geometric (vertex positions) and dynamic (temporal evolution trajectories) features of the infant cortical surface; it comprises a training stage and a prediction stage. In the training stage, we use the proposed varifold-based shape regression model to estimate geodesic cortical shape evolution trajectories for each training subject. We then build an empirical mean spatiotemporal surface atlas. In the prediction stage, given an infant, we select the best learnt features from training subjects to simultaneously predict the cortical surface shapes at all later timepoints, based on similarity metrics between this baseline surface and the learnt baseline population average surface atlas. We used a leave-one-out cross validation method to predict the inner cortical surface shape at 3, 6, 9 and 12 months of age from the baseline cortical surface shape at birth. Our method attained a higher prediction accuracy and better captured the spatiotemporal dynamic change of the highly folded cortical surface than the previously proposed prediction method. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Paleolimnological assessment of riverine and atmospheric pathways and sources of metal deposition at a floodplain lake (Slave River Delta, Northwest Territories, Canada).

    PubMed

    MacDonald, Lauren A; Wiklund, Johan A; Elmes, Matthew C; Wolfe, Brent B; Hall, Roland I

    2016-02-15

    Growth of natural resource development in northern Canada has raised concerns about the effects on downstream aquatic ecosystems, but insufficient knowledge of pre-industrial baseline conditions continues to undermine the ability of monitoring programs to distinguish industrial-derived contaminants from those supplied by natural processes. Here, we apply a novel paleolimnological approach to define pre-industrial baseline concentrations of 13 priority pollutant metals and vanadium and assess temporal changes, pathways and sources of these metals at a flood-prone lake (SD2) in the Slave River Delta (NWT, Canada) located ~500 km north of Alberta's oil sands development and ~140 km south of a former gold mine at Yellowknife, NWT. Results indicate that metal concentrations, normalized to lithium concentration, are not elevated in sediments deposited during intervals of high flood influence or low flood influence since onset of oil sands development (post-1967) relative to the 1920-1967 baseline established at SD2. When compared to a previously defined baseline for the upstream Athabasca River, several metal-Li relations (Cd, Cr, Ni, Zn, V) in post-1967 sediments delivered by floodwaters appear to plot along a different trajectory, suggesting that the Peace and Slave River watersheds are important natural sources of metal deposition at the Slave River Delta. However, analysis revealed unusually high concentrations of As deposited during the 1950s, an interval of very low flood influence at SD2, which corresponded closely with the emission history of the Giant Mine gold smelter, indicating a legacy of far-field atmospheric pollution. Our study demonstrates the potential for paleolimnological characterization of baseline conditions and detection of pollution from multiple pathways in floodplain ecosystems, but shows that knowledge of paleohydrological conditions is essential for interpretation of contaminant profiles. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  13. Post Launch Calibration and Testing of the Advanced Baseline Imager on the GOES-R Satellite

    NASA Technical Reports Server (NTRS)

    Lebair, William; Rollins, C.; Kline, John; Todirita, M.; Kronenwetter, J.

    2016-01-01

    The Geostationary Operational Environmental Satellite R (GOES-R) series is the planned next generation of operational weather satellites for the United States' National Oceanic and Atmospheric Administration. The first launch of the GOES-R series is planned for October 2016. The GOES-R series satellites and instruments are being developed by the National Aeronautics and Space Administration (NASA). One of the key instruments on the GOES-R series is the Advanced Baseline Imager (ABI). The ABI is a multi-channel, visible through infrared, passive imaging radiometer. The ABI will provide moderate spatial and spectral resolution at high temporal and radiometric resolution to accurately monitor rapidly changing weather. Initial on-orbit calibration and performance characterization is crucial to establishing the baseline used to maintain performance throughout mission life. A series of tests has been planned to establish the post-launch performance and the parameters needed to process the data in the Ground Processing Algorithm. The large number of detectors for each channel required to provide the needed temporal coverage presents unique challenges for accurately calibrating ABI and minimizing striping. This paper discusses the planned tests to be performed on ABI over the six-month Post Launch Test period and the expected performance as it relates to ground tests.

  14. Self-Concept Clarity in Adolescents and Parents: A Six-Wave Longitudinal and Multi-Informant Study on Development and Intergenerational Transmission.

    PubMed

    Crocetti, Elisabetta; Rubini, Monica; Branje, Susan; Koot, Hans M; Meeus, Wim

    2016-10-01

    The purpose of this study was twofold: (a) to disentangle patterns of change and stability in self-concept clarity (SCC) in adolescents and in their parents and (b) to examine processes of intergenerational transmission of SCC in families with adolescents. Participants were 497 Dutch families including the father (baseline mean age = 46.74), the mother (baseline mean age = 44.41), and their adolescent child (56.9% males; baseline mean age = 13.03). Each family member completed the SCC scale for six waves, with a one-year interval between each wave. Latent growth curve analyses indicated that adolescent boys reported higher SCC than girls. Furthermore, fathers and mothers reported higher SCC than their children, and it increased over time. Indices of SCC rank-order stability were high and increased from T1 to T2, T2 to T3, etc., for each family member, especially for adolescents. Multivariate latent growth curve analyses and cross-lagged models highlighted a unidirectional transmission process, with fathers' and mothers' SCC influencing adolescents' SCC. This result was not moderated by adolescent gender. These findings indicate that self-concept clarity is transmitted from parents to children. © 2015 Wiley Periodicals, Inc.

  15. Processing speed, attention, and working memory after treatment for medulloblastoma: an international, prospective, and longitudinal study.

    PubMed

    Palmer, Shawna L; Armstrong, Carol; Onar-Thomas, Arzu; Wu, Shengjie; Wallace, Dana; Bonner, Melanie J; Schreiber, Jane; Swain, Michelle; Chapieski, Lynn; Mabbott, Donald; Knight, Sarah; Boyle, Robyn; Gajjar, Amar

    2013-10-01

    The current study prospectively examined processing speed (PS), broad attention (BA), and working memory (WM) ability of patients diagnosed with medulloblastoma over a 5-year period. The study included 126 patients, ages 3 to 21 years at diagnosis, enrolled onto a collaborative protocol for medulloblastoma. Patients were treated with postsurgical risk-adapted craniospinal irradiation (n = 36 high risk [HR]; n = 90 average risk) followed by four cycles of high-dose chemotherapy with stem-cell support. Patients completed 509 neuropsychological evaluations using the Woodcock-Johnson Tests of Cognitive Abilities Third Edition (median of three observations per patient). Linear mixed effects models revealed that younger age at diagnosis, HR classification, and higher baseline scores were significantly associated with poorer outcomes in PS. Patients treated as HR and those with higher baseline scores are estimated to have less favorable outcomes in WM and BA over time. Parent education and marital status were significantly associated with BA and WM baseline scores but not change over time. Of the three key domains, PS was estimated to have the lowest scores at 5 years after diagnosis. Identifying cognitive domains most vulnerable to decline should guide researchers who are aiming to develop efficacious cognitive intervention and rehabilitation programs, thereby improving the quality of survivorship for the pediatric medulloblastoma population.

  16. Post Launch Calibration and Testing of the Advanced Baseline Imager on the GOES-R Satellite

    NASA Technical Reports Server (NTRS)

    Lebair, William; Rollins, C.; Kline, John; Todirita, M.; Kronenwetter, J.

    2016-01-01

    The Geostationary Operational Environmental Satellite R (GOES-R) series is the planned next generation of operational weather satellites for the United States' National Oceanic and Atmospheric Administration. The first launch of the GOES-R series is planned for October 2016. The GOES-R series satellites and instruments are being developed by the National Aeronautics and Space Administration (NASA). One of the key instruments on the GOES-R series is the Advanced Baseline Imager (ABI). The ABI is a multi-channel, visible through infrared, passive imaging radiometer. The ABI will provide moderate spatial and spectral resolution at high temporal and radiometric resolution to accurately monitor rapidly changing weather. Initial on-orbit calibration and performance characterization is crucial to establishing the baseline used to maintain performance throughout mission life. A series of tests has been planned to establish the post-launch performance and the parameters needed to process the data in the Ground Processing Algorithm. The large number of detectors for each channel required to provide the needed temporal coverage presents unique challenges for accurately calibrating ABI and minimizing striping. This paper discusses the planned tests to be performed on ABI over the six-month Post Launch Test period and the expected performance as it relates to ground tests.

  17. Post launch calibration and testing of the Advanced Baseline Imager on the GOES-R satellite

    NASA Astrophysics Data System (ADS)

    Lebair, William; Rollins, C.; Kline, John; Todirita, M.; Kronenwetter, J.

    2016-05-01

    The Geostationary Operational Environmental Satellite R (GOES-R) series is the planned next generation of operational weather satellites for the United States' National Oceanic and Atmospheric Administration. The first launch of the GOES-R series is planned for October 2016. The GOES-R series satellites and instruments are being developed by the National Aeronautics and Space Administration (NASA). One of the key instruments on the GOES-R series is the Advanced Baseline Imager (ABI). The ABI is a multi-channel, visible through infrared, passive imaging radiometer. The ABI will provide moderate spatial and spectral resolution at high temporal and radiometric resolution to accurately monitor rapidly changing weather. Initial on-orbit calibration and performance characterization is crucial to establishing the baseline used to maintain performance throughout mission life. A series of tests has been planned to establish the post-launch performance and the parameters needed to process the data in the Ground Processing Algorithm. The large number of detectors for each channel required to provide the needed temporal coverage presents unique challenges for accurately calibrating ABI and minimizing striping. This paper discusses the planned tests to be performed on ABI over the six-month Post Launch Test period and the expected performance as it relates to ground tests.

  18. Baseline tests for arc melter vitrification of INEL buried wastes. Volume II: Baseline test data appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oden, L.L.; O'Conner, W.K.; Turner, P.C.

    1993-11-19

    This report presents field results and raw data from the Buried Waste Integrated Demonstration (BWID) Arc Melter Vitrification Project Phase 1 baseline test series conducted by the Idaho National Engineering Laboratory (INEL) in cooperation with the U.S. Bureau of Mines (USBM). The baseline test series was conducted using the electric arc melter facility at the USBM Albany Research Center in Albany, Oregon. Five different surrogate waste feed mixtures were tested that simulated thermally-oxidized, buried, TRU-contaminated, mixed wastes and soils present at the INEL. The USBM Arc Furnace Integrated Waste Processing Test Facility includes a continuous feed system, the arc melting furnace, an offgas control system, and utilities. The melter is a sealed, 3-phase alternating current (ac) furnace approximately 2 m high and 1.3 m wide. The furnace has a capacity of 1 metric ton of steel and can process as much as 1,500 lb/h of soil-type waste materials. The surrogate feed materials included five mixtures designed to simulate incinerated TRU-contaminated buried waste materials mixed with INEL soil. Process samples, melter system operations data and offgas composition data were obtained during the baseline tests to evaluate the melter performance and meet test objectives. Samples and data gathered during this program included (a) automatically and manually logged melter systems operations data, (b) process samples of slag, metal and fume solids, and (c) offgas composition, temperature, velocity, flowrate, moisture content, particulate loading and metals content. This report consists of 2 volumes: Volume I summarizes the baseline test operations. It includes an executive summary, system and facility description, review of the surrogate waste mixtures, and a description of the baseline test activities, measurements, and sample collection. Volume II contains the raw test data and sample analyses from samples collected during the baseline tests.

  19. The Emergy Baseline of the Earth: Is it Arbitrary?

    EPA Science Inventory

    The emergy baseline for the Earth is used in determining the transformities of the products of all planetary processes and through these relationships it influences all emergy evaluations. Estimates of the emergy baseline made in the past have changed depending on the number of i...

  20. Active lithium chloride cell for spacecraft power

    NASA Technical Reports Server (NTRS)

    Fleischmann, C. W.; Horning, R. J.

    1988-01-01

    An active thionyl chloride high rate battery is under development for spacecraft operations. It is a 540kC (150 Ah) battery capable of pulses up to 75A. This paper describes the design and initial test data on a 'state-of-the-art' cell that has been selected to be the baseline for the prototype cell for that battery. Initial data indicate that the specification can be met with fresh cells. Data for stored cells and additional environmental test data are in the process of being developed.

  1. Arabic handwritten: pre-processing and segmentation

    NASA Astrophysics Data System (ADS)

    Maliki, Makki; Jassim, Sabah; Al-Jawad, Naseer; Sellahewa, Harin

    2012-06-01

    This paper is concerned with pre-processing and segmentation tasks that influence the performance of Optical Character Recognition (OCR) systems and handwritten/printed text recognition. In Arabic, these tasks are adversely affected by the fact that many words are made up of sub-words, many sub-words have one or more associated diacritics that are not connected to the sub-word's body, and there can be multiple instances of overlap between sub-words. To overcome these problems we investigate and develop segmentation techniques that first segment a document into sub-words, link the diacritics with their sub-words, and remove possible overlapping between words and sub-words. We shall also investigate two approaches for pre-processing tasks to estimate sub-word baselines and to determine parameters that yield appropriate slope correction and slant removal. We shall investigate the use of linear regression on sub-word pixels to determine their central x and y coordinates, as well as their high-density part. We also develop a new incremental rotation procedure to be performed on sub-words that determines the best rotation angle needed to realign baselines. We shall demonstrate the benefits of these proposals by conducting extensive experiments on publicly available databases and in-house created databases. These algorithms help improve character segmentation accuracy by transforming handwritten Arabic text into a form that could benefit from analysis of printed text.
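    The regression-based baseline estimation mentioned above can be sketched as follows: fit a line through the foreground pixels of a sub-word and rotate the image by the fitted angle so the baseline becomes horizontal. This is a minimal sketch under assumed conventions (dark ink on a light background, a hypothetical threshold), not the authors' implementation of their incremental rotation procedure.

    ```python
    # Minimal sketch (assumptions, not the authors' implementation): estimate a
    # sub-word baseline by linear regression over its foreground pixels, then
    # rotate the sub-word so the estimated baseline is horizontal.
    import numpy as np
    from scipy.ndimage import rotate

    def estimate_baseline_angle(subword_img, fg_threshold=128):
        """Fit y = m*x + c through dark (foreground) pixels; return the angle in degrees."""
        ys, xs = np.nonzero(subword_img < fg_threshold)   # assumes dark ink on light paper
        m, _c = np.polyfit(xs, ys, deg=1)                 # least-squares line through pixels
        return np.degrees(np.arctan(m))

    def deskew_subword(subword_img, fg_threshold=128):
        """Rotate so the estimated baseline is horizontal (sign depends on the
        image coordinate convention; adjust if the result tilts the wrong way)."""
        angle = estimate_baseline_angle(subword_img, fg_threshold)
        return rotate(subword_img, angle, reshape=True, order=1, mode="nearest")
    ```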

  2. Visually cued motor synchronization: modulation of fMRI activation patterns by baseline condition.

    PubMed

    Cerasa, Antonio; Hagberg, Gisela E; Bianciardi, Marta; Sabatini, Umberto

    2005-01-03

    A well-known issue in functional neuroimaging studies of motor synchronization is the design of suitable control tasks able to discriminate between the brain structures involved in primary time-keeper functions and those related to other processes such as attentional effort. The aim of this work was to investigate how the predictability of stimulus onsets in the baseline condition modulates the activity in brain structures related to processes involved in time-keeper functions during the performance of a visually cued motor synchronization task (VM). The rationale behind this choice derives from the notion that varying stimulus predictability can change the subject's attention and, consequently, the neural activity. For this purpose, baseline levels of BOLD activity were obtained from 12 subjects during a conventional-baseline condition (maintained fixation of the visual rhythmic stimuli presented in the VM task) and a random-baseline condition (maintained fixation of visual stimuli occurring randomly). fMRI analysis demonstrated that while brain areas with a documented role in basic time processing are detected independent of the baseline condition (right cerebellum, bilateral putamen, left thalamus, left superior temporal gyrus, left sensorimotor cortex, left dorsal premotor cortex and supplementary motor area), the ventral premotor cortex, caudate nucleus, insula and inferior frontal gyrus exhibited a baseline-dependent activation. We conclude that maintained fixation of unpredictable visual stimuli can be employed in order to reduce or eliminate neural activity related to attentional components present in the synchronization task.

  3. CryoSat SAR/SARin Level1b products: assessment of BaselineC and improvements towards BaselineD

    NASA Astrophysics Data System (ADS)

    Scagliola, Michele; Fornari, Marco; Bouffard, Jerome; Parrinello, Tommaso

    2017-04-01

    CryoSat was launched on the 8th April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, making the received echoes phase coherent and suitable for azimuth processing. This allows a significantly improved along-track resolution to be reached with respect to traditional pulse-width-limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested and validated in order to improve the quality of the Level1b products. The current IPF, Baseline C, was released into operation in April 2015 and the second CryoSat reprocessing campaign was jointly initiated, taking advantage of the upgrades implemented in the IPF1 processing chain as well as of some specific configurations for the calibration corrections. In particular, the CryoSat Level1b BaselineC products generated in the framework of the second reprocessing campaign include refined information concerning the mispointing angles and the calibration corrections. This poster will detail the evolutions that are currently planned for the CryoSat BaselineD SAR/SARin Level1b products and the corresponding quality improvements that are expected.

  4. Environmental baselines: preparing for shale gas in the UK

    NASA Astrophysics Data System (ADS)

    Bloomfield, John; Manamsa, Katya; Bell, Rachel; Darling, George; Dochartaigh, Brighid O.; Stuart, Marianne; Ward, Rob

    2014-05-01

    Groundwater is a vital source of freshwater in the UK. It provides almost 30% of public water supply on average, but locally, for example in south-east England, it constitutes nearly 90% of public supply. In addition to public supply, groundwater has a number of other uses including agriculture, industry, and food and drink production. It is also vital for maintaining river flows especially during dry periods and so is essential for maintaining ecosystem health. Recently, there have been concerns expressed about the potential impacts of shale gas development on groundwater. The UK has abundant shales and clays which are currently the focus of considerable interest and there is active research into their characterisation, resource evaluation and exploitation risks. The British Geological Survey (BGS) is undertaking research to provide information to address some of the environmental concerns related to the potential impacts of shale gas development on groundwater resources and quality. The aim of much of this initial work is to establish environmental baselines, such as a baseline survey of methane occurrence in groundwater (National methane baseline study) and the spatial relationships between potential sources and groundwater receptors (iHydrogeology project), prior to any shale gas exploration and development. The poster describes these two baseline studies and presents preliminary findings. The BGS is currently undertaking a national survey of baseline methane concentrations in groundwater across the UK. This work will enable any potential future changes in methane in groundwater associated with shale gas development to be assessed. Measurements of methane in potable water from the Cretaceous, Jurassic and Triassic carbonate and sandstone aquifers are variable and reveal methane concentrations of up to 500 micrograms per litre, but the mean value is relatively low at < 10 micrograms per litre. These values compare with much higher levels of methane in aquicludes and thermal waters, for example from the Carboniferous and Triassic which have concentrations in excess of 1500 micrograms per litre. It is important to understand the spatial relationships between potential shale gas source rocks and overlying aquifers if shale gas is to be developed in a safe and sustainable manner. The BGS and the Environment Agency have undertaken a national-scale study of the UK to assess the vertical separation between potential shale gas source rocks and major aquifers (iHydrogeology project). Aquifer-shale separations have been documented in the range <200m to >2km. The geological modelling process will be presented and discussed along with maps combining the results of the methane baseline study, the distribution of Principal Aquifers and shale/clay units, and aquifer-shale separation maps for the UK.

  5. GPS Attitude Determination Using Deployable-Mounted Antennas

    NASA Technical Reports Server (NTRS)

    Osborne, Michael L.; Tolson, Robert H.

    1996-01-01

    The primary objective of this investigation is to develop a method to solve for spacecraft attitude in the presence of potential incomplete antenna deployment. Most research on the use of the Global Positioning System (GPS) in attitude determination has assumed that the antenna baselines are known to less than 5 centimeters, or one quarter of the GPS signal wavelength. However, if the GPS antennas are mounted on a deployable fixture such as a solar panel, the actual antenna positions will not necessarily be within 5 cm of nominal. Incomplete antenna deployment could cause the baselines to be grossly in error, perhaps by as much as a meter. Overcoming this large uncertainty in order to accurately determine attitude is the focus of this study. To this end, a two-step solution method is proposed. The first step uses a least-squares estimate of the baselines to geometrically calculate the deployment angle errors of the solar panels. For the spacecraft under investigation, the first step determines the baselines to 3-4 cm with 4-8 minutes of data. A Kalman filter is then used to complete the attitude determination process, resulting in typical attitude errors of 0.5°.
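
    A minimal sketch of the first, least-squares step, under the simplifying assumption that carrier-phase ambiguities are already resolved so each measurement is the projection of the unknown baseline onto a satellite line-of-sight unit vector; the names and the NumPy formulation are illustrative, not taken from the paper.

        # Hedged sketch: least-squares recovery of an antenna baseline vector b from
        # resolved single-difference carrier-phase measurements.
        import numpy as np

        def estimate_baseline(los_unit_vectors, phase_range_diffs):
            """los_unit_vectors: (n, 3) unit vectors to GPS satellites.
            phase_range_diffs: (n,) measured range differences in metres.
            Returns the least-squares baseline vector (3,)."""
            S = np.asarray(los_unit_vectors)
            d = np.asarray(phase_range_diffs)
            b, *_ = np.linalg.lstsq(S, d, rcond=None)
            return b

        def deployment_angle_error(b_nominal, b_estimated):
            """Angle (degrees) between nominal and estimated baselines, i.e. the
            deployment error that the second (Kalman filter) step must absorb."""
            cosang = np.dot(b_nominal, b_estimated) / (
                np.linalg.norm(b_nominal) * np.linalg.norm(b_estimated))
            return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))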

  6. Utility of the advanced chronic kidney disease patient management tools: case studies.

    PubMed

    Patwardhan, Meenal B; Matchar, David B; Samsa, Gregory P; Haley, William E

    2008-01-01

    Appropriate management of advanced chronic kidney disease (CKD) delays or limits its progression. The Advanced CKD Patient Management Toolkit was developed using a process-improvement technique to assist patient management and address CKD-specific management issues. We pilot tested the toolkit in 2 community nephrology practices, assessed the utility of individual tools, and evaluated the impact on conformance to an advanced CKD guideline through patient chart abstraction. Tool use was distinct in the 2 sites and depended on the site champion's involvement, the extent of process reconfiguration demanded by a tool, and its perceived value. Baseline conformance varied across guideline recommendations (averaged 54%). Posttrial conformance increased in all clinical areas (averaged 59%). Valuable features of the toolkit in real-world settings were its ability to facilitate tool selection, direct implementation efforts in response to a baseline performance audit, and allow selection and customization of tool versions. Our results suggest that systematically created, multifaceted, and customizable tools can promote guideline conformance.

  7. Uniformly Processed Strong Motion Database for Himalaya and Northeast Region of India

    NASA Astrophysics Data System (ADS)

    Gupta, I. D.

    2018-03-01

    This paper presents the first uniformly processed comprehensive database of strong motion acceleration records for the extensive regions of western Himalaya, northeast India, and the alluvial plains juxtaposing the Himalaya. This includes 146 three-component sets of old analog records corrected for the instrument response and baseline distortions and 471 three-component sets of recent digital records corrected for baseline errors. The paper first provides a background on the evolution of strong motion data in India and the seismotectonics of the areas of recording, then describes the details of the recording stations and the contributing earthquakes, and finally presents the methodology used to obtain baseline corrected data in a uniform and consistent manner. Two different schemes in common use for baseline correction are based on the application of the Ormsby filter without zero pads (Trifunac 1971) and of the Butterworth filter with zero pads at both the start and the end (Converse and Brady 1992). To integrate the advantages of both schemes, the Ormsby filter with zero pads at the start only is used in the present study. A large number of typical example results are presented to illustrate that the methodology adopted is able to provide realistic velocity and displacement records with a much smaller number of zero pads. The present strong motion database of corrected acceleration records will be useful for analyzing the ground motion characteristics of engineering importance, developing prediction equations for various strong motion parameters, and calibrating the seismological source model approach for ground motion simulation for seismically active and risk-prone areas of India.
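
    The sketch below illustrates the general recipe of zero-padding only the start of a record before zero-phase high-pass filtering and then dropping the pad; SciPy's Butterworth filter stands in here for the Ormsby filter actually used in the study, and the corner frequency and pad length are placeholder values.

        # Hedged sketch of baseline correction with zero pads at the start only.
        import numpy as np
        from scipy.signal import butter, filtfilt

        def baseline_correct(accel, dt, corner_hz=0.1, order=4, pad_seconds=30.0):
            npad = int(pad_seconds / dt)
            padded = np.concatenate([np.zeros(npad), accel])   # zero pad at the start only
            b, a = butter(order, corner_hz * 2 * dt, btype='highpass')
            corrected = filtfilt(b, a, padded)                 # zero-phase high-pass filter
            return corrected[npad:]                            # drop the pad

        # Velocity and displacement then follow by numerical integration, e.g.
        # vel = np.cumsum(corrected) * dt ; disp = np.cumsum(vel) * dt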

  8. Predictive value of different prostate-specific antigen-based markers in men with baseline total prostate-specific antigen <2.0 ng/mL.

    PubMed

    Fujizuka, Yuji; Ito, Kazuto; Oki, Ryo; Suzuki, Rie; Sekine, Yoshitaka; Koike, Hidekazu; Matsui, Hiroshi; Shibata, Yasuhiro; Suzuki, Kazuhiro

    2017-08-01

    To investigate the predictive value of various molecular forms of prostate-specific antigen in men with baseline prostate-specific antigen <2.0 ng/mL. The case cohort comprised 150 men with a baseline prostate-specific antigen level <2.0 ng/mL who developed prostate cancer within 10 years. The control cohort comprised 300 baseline prostate-specific antigen- and age-adjusted men who did not develop prostate cancer. Serum prostate-specific antigen, free prostate-specific antigen, and [-2] proenzyme prostate-specific antigen were measured at baseline and at the last screening visit. The predictive impact of baseline prostate-specific antigen- and [-2] proenzyme prostate-specific antigen-related indices on developing prostate cancer was investigated. The predictive impact of those indices at the last screening visit, and of their velocities from baseline to final screening, on tumor aggressiveness was also investigated. The baseline free to total prostate-specific antigen ratio was a significant predictor of prostate cancer development. The odds ratio was 6.08 in the lowest quintile baseline free to total prostate-specific antigen ratio subgroup. No serum indices at diagnosis were associated with tumor aggressiveness. The Prostate Health Index velocity and [-2] proenzyme prostate-specific antigen/free prostate-specific antigen velocity significantly increased in patients in higher D'Amico risk groups and with higher Gleason scores. Free to total prostate-specific antigen ratio in men with low baseline prostate-specific antigen levels seems to predict the risk of developing prostate cancer, and it could be useful for a more effective individualized screening system. Longitudinal changes in [-2] proenzyme prostate-specific antigen-related indices seem to correlate with tumor aggressiveness, and they could be used as a prognostic tool before treatment and during active surveillance. © 2017 The Japanese Urological Association.

  9. Disentangling the developmental trajectories of letter position and letter identity coding using masked priming.

    PubMed

    Kezilas, Yvette; McKague, Meredith; Kohnen, Saskia; Badcock, Nicholas A; Castles, Anne

    2017-02-01

    Masked transposed-letter (TL) priming effects have been used to index letter position processing over the course of reading development. Whereas some studies have reported an increase in TL priming over development, others have reported a decrease. These findings have led to the development of 2 somewhat contradictory accounts of letter position development: the lexical tuning hypothesis and the multiple-route model. One factor that may be contributing to these discrepancies is the use of baseline primes that substitute letters in the target word, which may confound the effect of changes in letter position processing over development with those of letter identity. The present study included an identity prime (e.g., listen-LISTEN), in addition to the standard two-substituted-letter (2SL; e.g., lidfen-LISTEN) and all-letter-different (ALD; e.g., rodfup-LISTEN) baselines, to remove the potential confound between letter position and letter identity information in determining the effect of the TL prime. Priming effects were measured in a lexical decision task administered to children aged 7-12 and a group of university students. Using inverse transformed response times, targets preceded by a TL prime were responded to significantly faster than those preceded by 2SL and ALD primes, and priming remained stable across development. In contrast, targets preceded by a TL prime were responded to significantly slower than those preceded by an ID prime, and this reaction-time cost increased significantly over development, with adults showing the largest cost. These findings are consistent with a lexical tuning account of letter position development, and are inconsistent with the multiple-route model. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. Detailed requirements document for Stowage List and Hardware Tracking System (SLAHTS). [computer based information management system in support of space shuttle orbiter stowage configuration

    NASA Technical Reports Server (NTRS)

    Keltner, D. J.

    1975-01-01

    The stowage list and hardware tracking system, a computer based information management system, used in support of the space shuttle orbiter stowage configuration and the Johnson Space Center hardware tracking is described. The input, processing, and output requirements that serve as a baseline for system development are defined.

  11. Multi-mission space science data processing systems - Past, present, and future

    NASA Technical Reports Server (NTRS)

    Stallings, William H.

    1990-01-01

    Packetized telemetry that is consistent with the international Consultative Committee for Space Data Systems (CCSDS) has been baselined for future NASA missions such as Space Station Freedom. Some experiences from past and present multimission systems are examined, including current experiences in implementing a CCSDS standard packetized data processing system, relative to the effectiveness of the multimission approach in lowering life cycle cost and the complexity of meeting new mission needs. It is shown that the continued effort toward standardization of telemetry and processing support will permit the development of multimission systems needed to meet the increased requirements of future NASA missions.

  12. Civilian Radioactive Waste Management System Requirements Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C.A. Kouts

    2006-05-10

    The CRD addresses the requirements of Department of Energy (DOE) Order 413.3-Change 1, "Program and Project Management for the Acquisition of Capital Assets", by providing the Secretarial Acquisition Executive (Level 0) scope baseline and the Program-level (Level 1) technical baseline. The Secretarial Acquisition Executive approves the Office of Civilian Radioactive Waste Management's (OCRWM) critical decisions and changes against the Level 0 baseline; in turn, the OCRWM Director approves all changes against the Level 1 baseline. This baseline establishes the top-level technical scope of the CRWMS and its three system elements, as described in section 1.3.2. The organizations responsible for design, development, and operation of system elements described in this document must therefore prepare subordinate project-level documents that are consistent with the CRD. Changes to requirements will be managed in accordance with established change and configuration control procedures. The CRD establishes requirements for the design, development, and operation of the CRWMS. It specifically addresses the top-level governing laws and regulations (e.g., "Nuclear Waste Policy Act" (NWPA), 10 Code of Federal Regulations (CFR) Part 63, 10 CFR Part 71, etc.) along with specific policy, performance requirements, interface requirements, and system architecture. The CRD shall be used as a vehicle to incorporate specific changes in technical scope or performance requirements that may have significant program implications. Such changes may include changes to the program mission, changes to operational capability, and high visibility stakeholder issues. The CRD uses a systems approach to: (1) identify key functions that the CRWMS must perform, (2) allocate top-level requirements derived from statutory, regulatory, and programmatic sources, and (3) define the basic elements of the system architecture and operational concept. Project-level documents address CRD requirements by further defining system element functions, decomposing requirements into significantly greater detail, and developing designs of system components, facilities, and equipment. The CRD addresses the identification and control of functional, physical, and operational boundaries between and within CRWMS elements. The CRD establishes requirements regarding key interfaces between the CRWMS and elements external to the CRWMS. Project elements define interfaces between CRWMS program elements. The Program has developed a change management process consistent with DOE Order 413.3-Change 1. Changes to the Secretarial Acquisition Executive and Program-level baselines must be approved by a Program Baseline Change Control Board. Specific thresholds have been established for identifying technical, cost, and schedule changes that require approval. The CRWMS continually evaluates system design and operational concepts to optimize performance and/or cost. The Program has developed systems analysis tools to assess potential enhancements to the physical system and to determine the impacts from cost saving initiatives, scientific and technological improvements, and engineering developments. The results of systems analyses, if appropriate, are factored into revisions to the CRD as revised Programmatic Requirements.

  13. Simulation models and designs for advanced Fischer-Tropsch technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, G.N.; Kramer, S.J.; Tam, S.S.

    1995-12-31

    Process designs and economics were developed for three grass-roots indirect Fischer-Tropsch coal liquefaction facilities. A baseline and an alternate upgrading design were developed for a mine-mouth plant located in southern Illinois using Illinois No. 6 coal, and one for a mine-mouth plant located in Wyoming using Powder River Basin coal. The alternate design used close-coupled ZSM-5 reactors to upgrade the vapor stream leaving the Fischer-Tropsch reactor. ASPEN process simulation models were developed for all three designs. These results have been reported previously. In this study, the ASPEN process simulation model was enhanced to improve the vapor/liquid equilibrium calculations for the products leaving the slurry bed Fischer-Tropsch reactors. This significantly improved the predictions for the alternate ZSM-5 upgrading design. Another model was developed for the Wyoming coal case using ZSM-5 upgrading of the Fischer-Tropsch reactor vapors. To date, this is the best indirect coal liquefaction case. Sensitivity studies showed that additional cost reductions are possible.

  14. CryoSat Ice Processor: Known Processor Anomalies and Potential Future Product Evolutions

    NASA Astrophysics Data System (ADS)

    Mannan, R.; Webb, E.; Hall, A.; Bouffard, J.; Femenias, P.; Parrinello, T.; Bouffard, J.; Brockley, D.; Baker, S.; Scagliola, M.; Urien, S.

    2016-08-01

    Launched in 2010, CryoSat was designed to measure changes in polar sea ice thickness and ice sheet elevation. To reach this goal the CryoSat data products have to meet the highest performance standards and are subjected to a continual cycle of improvement achieved through upgrades to the Instrument Processing Facilities (IPFs). Following the switch to the Baseline-C Ice IPFs there are already planned evolutions for the next processing Baseline, based on recommendations from the Scientific Community, Expert Support Laboratory (ESL), Quality Control (QC) Centres and Validation campaigns. Some of the proposed evolutions, to be discussed with the scientific community, include the activation of freeboard computation in SARin mode, the potential operation of SARin mode over flat-to-slope transitory land ice areas, further tuning of the land ice retracker, the switch to NetCDF format and the resolution of anomalies arising in Baseline-C. This paper describes some of the anomalies known to affect Baseline-C in addition to potential evolutions that are planned and foreseen for Baseline-D.

  15. Model Development and Process Analysis for Lean Cellular Design Planning in Aerospace Assembly and Manufacturing

    NASA Astrophysics Data System (ADS)

    Hilburn, Monty D.

    Successful lean manufacturing and cellular manufacturing execution relies upon a foundation of leadership commitment and strategic planning built upon solid data and robust analysis. The problem for this study was to create and employ a simple lean transformation planning model and review process that could be used to identify functional support staff resources required to plan and execute lean manufacturing cells within aerospace assembly and manufacturing sites. The lean planning model was developed using available literature for lean manufacturing kaizen best practices and validated through a Delphi panel of lean experts. The resulting model and a standardized review process were used to assess the state of lean transformation planning at five sites of an international aerospace manufacturing and assembly company. The results of the three-day, on-site review were compared with baseline plans collected from each of the five sites to determine if there were differences, with a focus on three critical areas of lean planning: the number and type of manufacturing cells identified, the number, type, and duration of planned lean and continuous kaizen events, and the quantity and type of functional staffing resources planned to support the kaizen schedule. Summarized data from the baseline and on-site reviews were analyzed with descriptive statistics. ANOVAs and paired t-tests at the 95% significance level were conducted on the means of the data sets to determine if null hypotheses related to cell, kaizen event, and support resources could be rejected. The research found significant differences between lean transformation plans developed by site leadership and plans developed utilizing the structured, on-site review process and lean transformation planning model. The null hypothesis that there was no difference between the means of pre-review and on-site cell counts was rejected, as was the null hypothesis that there was no significant difference in kaizen event plans. These factors are critical inputs into the support staffing resources calculation used by the lean planning model. The null hypothesis related to functional support staff resources was rejected for most functional groups, indicating that the baseline site plan inadequately provided for cross-functional staff involvement to support the lean transformation plan. The null hypotheses related to total lean transformation staffing could not be rejected, indicating that while total staffing plans were not significantly different from plans developed during the on-site review and through the use of the lean planning model, the allocation of staffing among various functional groups such as engineering, production, and materials planning was an issue. The on-site review process and simple lean transformation plan developed were determined to be useful in identifying shortcomings in lean transformation planning within aerospace manufacturing and assembly sites. It was concluded that the differences uncovered were likely contributing factors affecting the effectiveness of aerospace manufacturing sites' implementation of lean cellular manufacturing.
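
    For the statistical comparison described above, a paired test of pre-review versus on-site cell counts at the 95% level might look like the following sketch; the counts shown are placeholders, not data from the study.

        # Hedged sketch of a paired t-test on hypothetical cell counts for five sites.
        from scipy import stats

        baseline_cells = [8, 12, 10, 15, 9]    # hypothetical pre-review cell counts
        onsite_cells = [11, 14, 13, 18, 12]    # hypothetical counts from the on-site review

        t_stat, p_value = stats.ttest_rel(baseline_cells, onsite_cells)
        reject_null = p_value < 0.05           # reject H0 of equal means at the 95% level
        print(f"t = {t_stat:.2f}, p = {p_value:.3f}, reject H0: {reject_null}")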

  16. Acoustic startle response in rats predicts inter-individual variation in fear extinction.

    PubMed

    Russo, Amanda S; Parsons, Ryan G

    2017-03-01

    Although a large portion of the population is exposed to a traumatic event at some point, only a small percentage of the population develops post-traumatic stress disorder (PTSD), suggesting the presence of predisposing factors. Abnormal acoustic startle response (ASR) has been shown to be associated with PTSD, implicating it as a potential predictor of the development of PTSD-like behavior. Since poor extinction and retention of extinction learning are characteristic of PTSD patients, it is of interest to determine if abnormal ASR is predictive of development of such deficits. To determine whether baseline ASR has utility in predicting the development of PTSD-like behavior, the relationship between baseline ASR and freezing behavior following Pavlovian fear conditioning was examined in a group of adult, male Sprague-Dawley rats. Baseline acoustic startle response (ASR) was assessed preceding exposure to a Pavlovian fear conditioning paradigm where freezing behavior was measured during fear conditioning, extinction training, and extinction testing. Although there was no relationship between baseline ASR and fear memory following conditioning, rats with low baseline ASR had significantly lower magnitude of retention of the extinction memory than rats with high baseline ASR. The results suggest that baseline ASR has value as a predictive index of the development of a PTSD-like phenotype. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Forecasting Sensorimotor Adaptability from Baseline Inter-Trial Correlations

    NASA Technical Reports Server (NTRS)

    Beaton, K. H.; Bloomberg, J. J.

    2014-01-01

    One of the greatest challenges surrounding adaptation to the spaceflight environment is the large variability in symptoms, and corresponding functional impairments, from one crewmember to the next. This renders preflight training and countermeasure development difficult, as a "one-size-fits-all" approach is inappropriate. Therefore, it would be highly advantageous to know ahead of time which crewmembers might have more difficulty adjusting to the novel g-levels inherent to spaceflight. Such knowledge could guide individually customized countermeasures, which would enable more efficient use of crew time, both preflight and inflight, and provide better outcomes. The primary goal of this project is to look for a baseline performance metric that can forecast sensorimotor adaptability without exposure to an adaptive stimulus. We propose a novel hypothesis that considers baseline inter-trial correlations, the trial-to-trial fluctuations in motor performance, as a predictor of individual sensorimotor adaptive capabilities. To date, a strong relationship has been found between baseline inter-trial correlations and adaptability in two oculomotor systems. For this project, we will explore an analogous predictive mechanism in the locomotion system. METHODS: Baseline Inter-trial Correlations: Inter-trial correlations specify the relationships among repeated trials of a given task that transpire as a consequence of correcting for previous performance errors over multiple timescales. We can quantify the strength of inter-trial correlations by measuring the decay of the autocorrelation function (ACF), which describes how rapidly information from past trials is "forgotten." Processes whose ACFs decay more slowly exhibit longer-term inter-trial correlations (longer memory processes), while processes whose ACFs decay more rapidly exhibit shorter-term inter-trial correlations (shorter memory processes). Longer-term correlations reflect low-frequency activity, which is more easily measured in the frequency domain. Therefore, we use the power spectrum (PS), which is the Fourier transform of the ACF, to describe our inter-trial correlations. The decay of the PS yields a straight line on a log-log frequency plot, which we quantify by Beta = - (slope of PS on log-log axes). Hence, Beta is a measure of the strength of inter-trial correlations in the baseline data. Larger Beta values are indicative of longer inter-trial correlations. Experimental Approach: We will begin by performing a retrospective analysis of treadmill-gait adaptation data previously collected by Dr. Bloomberg and colleagues. Specifically, we will quantify the strength of inter-trial correlations in the baseline step cadence and heart rate data and compare it to the locomotor adaptability performance results already described by these investigators. Incorporating these datasets will also allow us to explore the applicability of (and potential limitations surrounding) the use of Beta in forecasting physiological performance. We will also perform a new experiment, in which Beta will be derived from baseline data collected during over-ground (non-treadmill) walking, which will enable us to consider locomotor performance, through the parameter Beta, under the most functionally relevant, natural gait condition. This experiment will incorporate two baseline and five post-training over-ground locomotion tests to explore the consistency and potential adaptability of the Beta values themselves.
HYPOTHESES: We hypothesize that the strength of baseline inter-trial correlations of step cadence and heart rate will relate to locomotor adaptability. Specifically, we anticipate that individuals who show weaker longer-term inter-trial correlations in baseline step cadence data will be the better adaptors, as step cadence can be modified in real-time (i.e., online corrections are an inherent property of the locomotor system; analogous to results observed in the VOR). Conversely, because heart rate is not altered mid-beat, we expect that individuals who demonstrate stronger longer-term correlations in heart rate will be the better adaptors (analogous to results observed in the saccadic system). CONCLUSIONS: At the conclusion of this project we hope to uncover a baseline predictor of locomotor adaptability. If our hypotheses hold true, our results will demonstrate that the temporal structure of baseline behavioral data contains important information that may aid in forecasting adaptive capacities. The ability to predict such adaptability in the sensorimotor system has significant implications for spaceflight, where astronauts must adjust their motor programs following a change in g-level to retain movement accuracy.
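
    A rough sketch of how the Beta metric defined above could be computed from a baseline trial series (e.g., step cadence), assuming a simple periodogram estimate of the power spectrum; this is illustrative, not the project's analysis code.

        # Hedged sketch: Beta = negative slope of the power spectrum on log-log axes.
        import numpy as np
        from scipy.signal import periodogram

        def beta_from_trials(series):
            """series: 1-D array of trial-by-trial measurements (e.g., step cadence)."""
            x = np.asarray(series, dtype=float)
            x = x - x.mean()
            freqs, power = periodogram(x)
            mask = (freqs > 0) & (power > 0)       # drop the zero-frequency bin
            slope, _ = np.polyfit(np.log10(freqs[mask]), np.log10(power[mask]), 1)
            return -slope                          # larger Beta = longer-term correlations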

  18. Overview and evolution of the LeRC PMAD DC test bed

    NASA Technical Reports Server (NTRS)

    Soeder, James F.; Frye, Robert J.

    1992-01-01

    Since the beginning of the Space Station Freedom Program (SSFP), the Lewis Research Center (LeRC) has been developing electrical power system test beds to support the overall design effort. Through this time, the SSFP has changed the design baseline numerous times; however, the test bed effort has endeavored to track these changes. Beginning in August 1989 with the baseline and an all DC system, a test bed was developed to support the design baseline. The LeRC power management and distribution (PMAD) DC test bed and the changes in the restructure are described. The changes included the size reduction of the primary power channel and various power processing elements. A substantial reduction was also made in the amount of flight software with the subsequent migration of these functions to ground control centers. The impact of these changes on the design of the power hardware, the controller algorithms, and the control software is presented, along with a description of their current status. An overview of the testing using the test bed is described, which includes investigation of stability and source impedance, primary and secondary fault protection, and performance of a rotary utility transfer device. Finally, information is presented on the evolution of the test bed to support the verification and operational phases of the SSFP in light of these restructure scrubs.

  19. Overview and evolution of the LeRC PMAD DC Testbed

    NASA Technical Reports Server (NTRS)

    Soeder, James F.; Frye, Robert J.

    1992-01-01

    Since the beginning of the Space Station Freedom Program (SSFP), the Lewis Research Center (LeRC) has been developing electrical power system test beds to support the overall design effort. Through this time, the SSFP has changed the design baseline numerous times; however, the test bed effort has endeavored to track these changes. Beginning in August 1989 with the baseline and an all DC system, a test bed was developed to support the design baseline. The LeRC power management and distribution (PMAD) DC test bed and the changes in the restructure are described. The changes included the size reduction of the primary power channel and various power processing elements. A substantial reduction was also made in the amount of flight software with the subsequent migration of these functions to ground control centers. The impact of these changes on the design of the power hardware, the controller algorithms, and the control software is presented, along with a description of their current status. An overview of the testing using the test bed is described, which includes investigation of stability and source impedance, primary and secondary fault protection, and performance of a rotary utility transfer device. Finally, information is presented on the evolution of the test bed to support the verification and operational phases of the SSFP in light of these restructure scrubs.

  20. Multi-GNSS high-rate RTK, PPP and novel direct phase observation processing method: application to precise dynamic displacement detection

    NASA Astrophysics Data System (ADS)

    Paziewski, Jacek; Sieradzki, Rafal; Baryla, Radoslaw

    2018-03-01

    This paper provides the methodology and performance assessment of multi-GNSS signal processing for the detection of small-scale high-rate dynamic displacements. For this purpose, we used methods of relative (RTK) and absolute positioning (PPP), and a novel direct signal processing approach. The first two methods are recognized as providing accurate information on position in many navigation and surveying applications. The latter is an innovative method for dynamic displacement determination with the use of GNSS phase signal processing. This method is based on the developed functional model with parametrized epoch-wise topocentric relative coordinates derived from filtered GNSS observations. Current regular kinematic PPP positioning, as well as medium/long range RTK, may not offer coordinate estimates with subcentimeter precision. Thus, extended processing strategies of absolute and relative GNSS positioning have been developed and applied for displacement detection. The study also aimed to comparatively analyze the developed methods as well as to analyze the impact of combined GPS and BDS processing and the dependence of the results of the relative methods on the baseline length. All the methods were implemented with in-house developed software allowing for high-rate precise GNSS positioning and signal processing. The phase and pseudorange observations collected with a rate of 50 Hz during the field test served as the experiment’s data set. The displacements at the rover station were triggered in the horizontal plane using a device which was designed and constructed to ensure a periodic motion of GNSS antenna with an amplitude of ~3 cm and a frequency of ~4.5 Hz. Finally, a medium range RTK, PPP, and direct phase observation processing method demonstrated the capability of providing reliable and consistent results with the precision of the determined dynamic displacements at the millimeter level. Specifically, the research shows that the standard deviation of the displacement residuals obtained as the difference between a benchmark-ultra-short baseline RTK solution and selected scenarios ranged between 1.1 and 3.4 mm. At the same time, the differences in the mean amplitude of the oscillations derived from the established scenarios did not exceed 1.3 mm, whereas the frequency of the motion detected with the use of Fourier transformation was the same.
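
    As an illustration of the final frequency-detection step, the sketch below recovers the dominant oscillation frequency of a 50 Hz displacement series with a discrete Fourier transform; it is a simplified stand-in for the processing described, run here on synthetic data.

        # Hedged sketch: dominant frequency of an epoch-wise displacement series.
        import numpy as np

        def dominant_frequency(displacement, fs=50.0):
            """displacement: 1-D array of epoch-wise coordinates (metres), sampled at fs Hz."""
            x = np.asarray(displacement) - np.mean(displacement)
            spectrum = np.abs(np.fft.rfft(x))
            freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
            return freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin

        # Example: a synthetic 3 cm, 4.5 Hz motion should be recovered as ~4.5 Hz.
        t = np.arange(0, 10, 1 / 50.0)
        print(dominant_frequency(0.03 * np.sin(2 * np.pi * 4.5 * t)))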

  1. The influence of socioeconomic factors on cardiovascular disease risk factors in the context of economic development in the Samoan archipelago.

    PubMed

    Ezeamama, Amara E; Viali, Satupaitea; Tuitele, John; McGarvey, Stephen T

    2006-11-01

    Early in economic development there are positive associations between socioeconomic status (SES) and cardiovascular disease (CVD) risk factors, and in the most developed market economy societies there are negative associations. The purpose of this report is to describe cross-sectional and longitudinal associations between indicators of SES and CVD risk factors in a genetically homogenous population of Samoans at different levels of economic development. At baseline, 1289 participants aged 25-58 years were studied, and 963 participants at the 4-year follow-up, in less economically developed Samoa and in more developed American Samoa. SES was assessed by education, occupation, and material lifestyle at baseline. The CVD risk factors obesity, type-2 diabetes, and hypertension were measured at baseline and at the 4-year follow-up, and an index of any incident CVD risk factor at follow-up was calculated. Sex- and location-specific (Samoa and American Samoa) multivariable logistic regression models were used to test for relationships between SES and CVD risk factors at baseline after adjustment for age and the other SES indicators. In addition, an ordinal SES index was constructed for each individual based on all three SES indicators, and used in a multivariable model to estimate the predicted probability of CVD risk factors across the SES index for the two locations. In both the models using specific SES measures and CVD risk factor outcomes, and the models using the ordinal SES index and predicted probabilities of CVD risk factors, we detected a pattern of high SES associated with: (1) elevated odds of CVD risk factors in less developed Samoa, and (2) decreased odds of CVD risk factors in more developed American Samoa. We conclude that the pattern of direct associations between SES and CVD risk factors in Samoa and inverse associations in American Samoa is attributable to the heterogeneity across the Samoas in specific exposures to social processes of economic development and the natural history of individual CVD risk factors. The findings suggest that interventions on non-communicable diseases in the Samoas must be devised based on the level of economic development, the socio-economic context of risk factor exposures, and individual characteristics such as age, sex and education level.
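
    A hedged sketch of the kind of model described above: a multivariable logistic regression of one CVD risk factor on an ordinal SES index, adjusted for age and fitted separately by sex and location. The file and variable names are hypothetical, not those used in the study.

        # Hedged sketch using statsmodels; "samoa_baseline.csv" and its columns are hypothetical.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("samoa_baseline.csv")
        for (sex, location), grp in df.groupby(["sex", "location"]):
            model = smf.logit("obesity ~ ses_index + age", data=grp).fit(disp=False)
            # Sign of the SES coefficient indicates a direct or inverse association.
            print(sex, location, model.params["ses_index"])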

  2. Thermomechanical processing of HAYNES alloy No. 188 sheet to improve creep strength

    NASA Technical Reports Server (NTRS)

    Klarstrom, D. L.

    1978-01-01

    Improvements in the low strain creep strength of HAYNES alloy No. 188 thin gauge sheet by means of thermomechanical processing were developed. Processing methods designed to develop a sheet with strong crystallographic texture after recrystallization and to optimize grain size were principally studied. The effects of thickness-to-grain diameter ratio and prestrain on low strain creep strength were also briefly examined. Results indicate that the most significant improvements were obtained in the sheets having a strong crystallographic texture. The low strain creep strength of the textured sheets was observed to be superior to that of standard production sheets in the 922 K to 1255 K temperature range. Tensile, stress rupture, fabricability, and surface stability properties of the experimental sheets were also measured and compared to property values reported for the baseline production sheets.

  3. Developing services for climate impact and adaptation baseline information and methodologies for the Andes

    NASA Astrophysics Data System (ADS)

    Huggel, C.

    2012-04-01

    Impacts of climate change are observed and projected across a range of ecosystems and economic sectors, and mountain regions rank among the hotspots of climate change. The Andes are considered particularly vulnerable to climate change, not only due to fragile ecosystems but also due to the high vulnerability of the population. Natural resources such as water systems play a critical role and are observed and projected to be seriously affected. Adaptation to climate change impacts is therefore crucial to contain the negative effects on the population. Adaptation projects require information on the climate and affected socio-environmental systems. There is, however, generally a lack of methodological guidelines on how to generate the necessary scientific information and how to communicate it to implementing governmental and non-governmental institutions. This is particularly important in view of the international funds for adaptation such as the Green Climate Fund established and set into process at the UNFCCC Conferences of the Parties in Cancun 2010 and Durban 2011. To facilitate this process, international and regional organizations (World Bank and Andean Community) and a consortium of research institutions have joined forces to develop and define comprehensive methodologies for baseline and climate change impact assessments for the Andes, with an application potential to other mountain regions (AndesPlus project). Considered are the climatological baseline of a region, and the assessment of trends based on ground meteorological stations, reanalysis data, and satellite information. A challenge is the scarcity of climate information in the Andes, and the complex climatology of the mountain terrain. A climate data platform has been developed for the southern Peruvian Andes and is a key element for climate data service and exchange. Water resources are among the key livelihood components for the Andean population, and for the local and national economy, in particular for agriculture and hydropower. The retreat of glaciers, as one of the clearest signals of climate change, represents a problem for water supply during the long dry season. Hydrological modeling, using data from the few gauging stations and complemented by satellite precipitation data, is needed to generate baseline and climate impact information. Food security is often considered threatened due to climate change impacts, in the Andes for instance by droughts and cold spells that seriously affect high-elevation food systems. Finally, methodologies are compiled and developed for analyzing risks from natural hazards and disasters. The vulnerabilities and risks for all types of climate impacts need to be reflected by analyzing the local and regional social, cultural, political and economic context. To provide the necessary references and information, the project AndesPlus has developed a web-based knowledge and information platform. The highly interdisciplinary process of the project should contribute to climate impact and adaptation information services, needed to meet the challenges of adaptation.

  4. Word Sense Disambiguation in Bangla Language Using Supervised Methodology with Necessary Modifications

    NASA Astrophysics Data System (ADS)

    Pal, Alok Ranjan; Saha, Diganta; Dash, Niladri Sekhar; Pal, Antara

    2018-05-01

    An attempt is made in this paper to report how a supervised methodology has been adopted, with necessary modifications, for the task of word sense disambiguation in Bangla. At the initial stage, the Naïve Bayes probabilistic model, adopted as a baseline method for sense classification, yields a moderate result of 81% accuracy when applied to a database of the 19 (nineteen) most frequently used ambiguous Bangla words. On an experimental basis, the baseline method is modified with two extensions: (a) inclusion of a lemmatization process into the system, and (b) bootstrapping of the operational process. As a result, the accuracy of the method is slightly improved to 84%, which is a positive signal for the whole disambiguation process as it opens scope for further modification of the existing method for better results. The data sets that have been used for this experiment include the Bangla POS tagged corpus obtained from the Indian Languages Corpora Initiative, and the Bangla WordNet, an online sense inventory developed at the Indian Statistical Institute, Kolkata. The paper also reports on the challenges and pitfalls of the work that have been closely observed and addressed to achieve the expected level of accuracy.
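
    The following sketch shows the shape of a Naive Bayes baseline sense classifier over bag-of-words context features, in the spirit of the method described; the training examples are English placeholders rather than items from the Bangla corpus, and scikit-learn is used purely for illustration.

        # Hedged sketch of a Naive Bayes word sense disambiguation baseline.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        train_contexts = ["river bank water flow", "bank loan interest account"]
        train_senses = ["RIVER_BANK", "FINANCIAL_BANK"]

        wsd = make_pipeline(CountVectorizer(), MultinomialNB())
        wsd.fit(train_contexts, train_senses)
        print(wsd.predict(["deposit money in the bank account"]))
        # Lemmatizing the context tokens before vectorising (the paper's first
        # modification) typically reduces sparsity and can lift accuracy.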

  5. Low cost solar array project cell and module formation research area: Process research of non-CZ silicon material

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Liquid diffusion masks and liquid applied dopants to replace the CVD Silox masking and gaseous diffusion operations specified for forming junctions in the Westinghouse baseline process sequence for producing solar cells from dendritic web silicon were investigated. The baseline diffusion masking and drive processes were compared with those involving direct liquid applications to the dendritic web silicon strips. Attempts were made to control the number of variables by subjecting dendritic web strips cut from a single web crystal to both types of operations. Data generated reinforced earlier conclusions that efficiency levels at least as high as those achieved with the baseline back junction formation process can be achieved using liquid diffusion masks and liquid dopants. The deliveries of dendritic web sheet material and solar cells specified by the current contract were made as scheduled.

  6. Impact of Active Climate Control Seats on Energy Use, Fuel Use, and CO2 Emissions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kreutzer, Cory J; Rugh, John P; Titov, Eugene V

    A project was developed through collaboration between Gentherm and NREL to determine the impact of climate control seats for light-duty vehicles in the United States. The project used a combination of experimentation and analysis, with experimental results providing critical input to the analysis process. First, outdoor stationary vehicle testing was performed at NREL's facility in Golden, CO using multiple occupants. Two pre-production Ford Focus electric vehicles were used for testing; one containing a standard inactive seat and the second vehicle containing a Gentherm climate control seat. Multiple maximum cool-down and steady-state cooling tests were performed in late summer conditions. The two vehicles were used to determine the increase in cabin temperature when using the climate control seat in comparison to the baseline vehicle cabin temperature with a standard seat at the equivalent occupant whole-body sensation. The experiments estimated that on average, the climate control seats allowed for a 2.61 degrees Celsius increase in vehicle cabin temperature at equivalent occupant body sensation compared to the baseline vehicle. The increased cabin air temperature along with their measured energy usage were then used as inputs to the national analysis process. The national analysis process was constructed from full vehicle cabin, HVAC, and propulsion models previously developed by NREL. In addition, three representative vehicle platforms, vehicle usage patterns, and vehicle registration weighted environmental data were integrated into the analysis process. Both the baseline vehicle and the vehicle with climate control seats were simulated, using the experimentally determined cabin temperature offset of 2.61 degrees Celsius and added seat energy as inputs to the climate control seat vehicle model. The U.S. composite annual fuel use savings for the climate control seats over the baseline A/C system was determined to be 5.1 gallons of gasoline per year per vehicle, corresponding to 4.0 grams of CO2/mile savings. Finally, the potential impact of 100 percent adoption of climate control seats on U.S. light-duty fleet A/C fuel use was calculated to be 1.3 billion gallons of gasoline annually with a corresponding CO2 emissions reduction of 12.7 million tons. Direct comparison of the impact of the CCS to the ventilated seat off-cycle credit was not possible because the NREL analysis calculated a combined car/truck savings and the baseline A/C CO2 emissions were higher than EPA. To enable comparison, the CCS national A/C CO2 emissions were split into car/truck components and the ventilated seat credit was scaled up. The split CO2 emissions savings due to the CCS were 3.5 g/mi for a car and 4.4 g/mi for a truck. The CCS saved an additional 2.0 g/mi and 2.5 g/mi over the adjusted ventilated seat credit for a car and truck, respectively.
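
    As a quick consistency check on the scaling reported above, the per-vehicle saving and the fleet-wide total imply a light-duty fleet size that is not stated in the abstract; the figure below is inferred from those two numbers, not sourced.

        # Back-of-the-envelope check: 5.1 gal/yr per vehicle vs. 1.3 billion gal/yr fleet-wide.
        per_vehicle_gal = 5.1
        fleet_total_gal = 1.3e9
        implied_fleet = fleet_total_gal / per_vehicle_gal
        print(f"Implied fleet size: {implied_fleet / 1e6:.0f} million vehicles")  # ~255 million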

  7. The application of statistically designed experiments to resistance spot welding

    NASA Technical Reports Server (NTRS)

    Hafley, Robert A.; Hales, Stephen J.

    1991-01-01

    State-of-the-art Resistance Spot Welding (RSW) equipment has the potential to permit real-time monitoring of operations through advances in computerized process control. In order to realize adaptive feedback capabilities, it is necessary to establish correlations among process variables, welder outputs, and weldment properties. The initial step toward achieving this goal must involve assessment of the effect of specific process inputs and the interactions among these variables on spot weld characteristics. This investigation evaluated these effects through the application of a statistically designed experiment to the RSW process. A half-factorial, Taguchi L16 design was used to understand and refine an RSW schedule developed for welding dissimilar aluminum-lithium alloys of different thicknesses. The baseline schedule had been established previously by traditional trial-and-error methods based on engineering judgment and one-factor-at-a-time studies. A hierarchy of inputs with respect to each other was established, and the significance of these inputs with respect to experimental noise was determined. Useful insight was gained into the effect of interactions among process variables, particularly with respect to weldment defects. The effects of equipment-related changes associated with disassembly and recalibration were also identified. In spite of an apparent decrease in equipment performance, a significant improvement in the maximum strength for defect-free welds compared to the baseline schedule was achieved.
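
    The sketch below generates a 16-run, five-factor, two-level half-fraction of the kind referred to above, with the fifth column formed as the product of the first four (defining relation I = ABCDE); the factor names are illustrative placeholders, and this is not the exact Taguchi L16 assignment used in the study.

        # Hedged sketch of a 2^(5-1) half-fraction design (16 runs instead of the full 32).
        from itertools import product

        factors = ["current", "force", "weld_time", "hold_time", "electrode_dia"]
        runs = []
        for a, b, c, d in product((-1, 1), repeat=4):
            e = a * b * c * d                  # generator for the half fraction (E = ABCD)
            runs.append(dict(zip(factors, (a, b, c, d, e))))

        for i, run in enumerate(runs, 1):
            print(i, run)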

  8. Baseline tests for arc melter vitrification of INEL buried wastes. Volume 1: Facility description and summary data report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oden, L.L.; O'Connor, W.K.; Turner, P.C.

    1993-11-19

    This report presents field results and raw data from the Buried Waste Integrated Demonstration (BWID) Arc Melter Vitrification Project Phase 1 baseline test series conducted by the Idaho National Engineering Laboratory (INEL) in cooperation with the U.S. Bureau of Mines (USBM). The baseline test series was conducted using the electric arc melter facility at the USBM Albany Research Center in Albany, Oregon. Five different surrogate waste feed mixtures were tested that simulated thermally-oxidized, buried, TRU-contaminated, mixed wastes and soils present at the INEL. The USBM Arc Furnace Integrated Waste Processing Test Facility includes a continuous feed system, the arc melting furnace, an offgas control system, and utilities. The melter is a sealed, 3-phase alternating current (ac) furnace approximately 2 m high and 1.3 m wide. The furnace has a capacity of 1 metric ton of steel and can process as much as 1,500 lb/h of soil-type waste materials. The surrogate feed materials included five mixtures designed to simulate incinerated TRU-contaminated buried waste materials mixed with INEL soil. Process samples, melter system operations data and offgas composition data were obtained during the baseline tests to evaluate the melter performance and meet test objectives. Samples and data gathered during this program included (a) automatically and manually logged melter systems operations data, (b) process samples of slag, metal and fume solids, and (c) offgas composition, temperature, velocity, flowrate, moisture content, particulate loading and metals content. This report consists of 2 volumes: Volume I summarizes the baseline test operations. It includes an executive summary, system and facility description, review of the surrogate waste mixtures, and a description of the baseline test activities, measurements, and sample collection. Volume II contains the raw test data and sample analyses from samples collected during the baseline tests.

  9. Lost ecosystem services as a measure of oil spill damages: a conceptual analysis of the importance of baselines.

    PubMed

    Kennedy, Chris J; Cheong, So-Min

    2013-10-15

    The assessment and quantification of damages resulting from marine oil spills is typically coordinated by NOAA, and has historically utilized Habitat Equivalency Analysis (HEA) to estimate damages. Resource economists and others have called for the damage assessment process to instead estimate injuries through the valuation of lost ecosystem services. Our conceptual analysis explores ecosystem service valuation from the perspective of "baselines," which are a fundamental component of both primary and compensatory restoration activities. In practice, baselines have been defined in ecological terms, with minimal consideration of the socioeconomic side of ecosystem service provision. We argue that, for the purposes of scaling compensatory restoration, it is more appropriate to characterize baselines in value terms, thereby integrating non-market valuation approaches from the onset of the damage assessment process. Benefits and challenges of this approach are discussed, along with guidelines for practitioners to identify circumstances in which socioeconomic variables are likely to be important for baseline characterization. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Informed baseline subtraction of proteomic mass spectrometry data aided by a novel sliding window algorithm.

    PubMed

    Stanford, Tyman E; Bagley, Christopher J; Solomon, Patty J

    2016-01-01

    Proteomic matrix-assisted laser desorption/ionisation (MALDI) linear time-of-flight (TOF) mass spectrometry (MS) may be used to produce protein profiles from biological samples with the aim of discovering biomarkers for disease. However, the raw protein profiles suffer from several sources of bias or systematic variation which need to be removed via pre-processing before meaningful downstream analysis of the data can be undertaken. Baseline subtraction, an early pre-processing step that removes the non-peptide signal from the spectra, is complicated by the following: (i) each spectrum has, on average, wider peaks for peptides with higher mass-to-charge ratios (m/z), and (ii) the time-consuming and error-prone trial-and-error process for optimising the baseline subtraction input arguments. With reference to the aforementioned complications, we present an automated pipeline that includes (i) a novel 'continuous' line segment algorithm that efficiently operates over data with a transformed m/z-axis to remove the relationship between peptide mass and peak width, and (ii) an input-free algorithm to estimate peak widths on the transformed m/z scale. The automated baseline subtraction method was deployed on six publicly available proteomic MS datasets using six different m/z-axis transformations. Optimality of the automated baseline subtraction pipeline was assessed quantitatively using the mean absolute scaled error (MASE) when compared to a gold-standard baseline subtracted signal. Several of the transformations investigated were able to reduce, if not entirely remove, the peak width and peak location relationship resulting in near-optimal baseline subtraction using the automated pipeline. The proposed novel 'continuous' line segment algorithm is shown to far outperform naive sliding window algorithms with regard to the computational time required. The improvement in computational time was at least four-fold on real MALDI TOF-MS data and at least an order of magnitude on many simulated datasets. The advantages of the proposed pipeline include informed and data-specific input arguments for baseline subtraction methods, the avoidance of time-intensive and subjective piecewise baseline subtraction, and the ability to automate baseline subtraction completely. Moreover, individual steps can be adopted as stand-alone routines.
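
    To make the comparison concrete, the sketch below shows a naive sliding-window minimum baseline estimate (the kind of approach the proposed 'continuous' line-segment algorithm is designed to outperform) together with one common form of the mean absolute scaled error; both are simplified stand-ins, not the published pipeline.

        # Hedged sketch: naive sliding-window baseline and a MASE-style comparison metric.
        import numpy as np

        def sliding_window_baseline(intensity, half_width=50):
            """Naive baseline: local minimum in a window around each m/z point."""
            intensity = np.asarray(intensity)
            n = len(intensity)
            return np.array([intensity[max(0, i - half_width):min(n, i + half_width + 1)].min()
                             for i in range(n)])

        def mase(estimate, gold):
            """Mean absolute error against a gold standard, scaled by the mean
            absolute one-step difference of the gold standard."""
            estimate, gold = np.asarray(estimate), np.asarray(gold)
            return np.mean(np.abs(estimate - gold)) / np.mean(np.abs(np.diff(gold)))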

  11. Developing Interprofessional Education at One U.S. Dental School: Establishing a Baseline and Moving Forward.

    PubMed

    Townsend, Janice; Zorek, Joseph A; Andrieu, Sandra C; de Carvalho, Raquel Baroni; Mercante, Donald E; Schiavo, Julie H; Gunaldo, Tina P

    2018-05-01

    Dental schools across the U.S. are in the process of incorporating interprofessional education (IPE) into their curricula. At Louisiana State University Health Sciences Center-New Orleans (LSUHSC), the process of educating competent students fully prepared to maximize patient outcomes through interprofessional care is under way. The aim of this study was to establish baseline data on three years of LSU dental students' perceptions of IPE prior to and as a new two-year IPE curriculum was being introduced. A survey was conducted of dental students in all four years from 2015 to 2017 using the Student Perceptions of Interprofessional Clinical Education-Revised instrument, version 2 (SPICE-R2). In 2015, 120 students participated in the survey for a response rate of 46%, followed by 160 students in 2016 (62%) and 170 in 2017 (67%). The results showed that the first-year students in 2017 had a higher total SPICE-R2 mean score than the first-year students in 2015 and 2016; the difference was statistically significant. Even though the 2017 first-year students had only received an orientation to the curriculum at the time they completed the survey, this change in attitude suggests the new focus on IPE was already having an effect on students. There were no statistically significant differences between mean scores for the three cohorts of second-, third-, and fourth-year students, none of whom had experienced the new IPE curriculum. Data from this study will serve as a baseline from which to evaluate the impact of the new IPE curriculum that is now required of all first- and second-year dental students. Through continued IPE exposure in the curriculum and ongoing faculty development, further improvements in students' attitudes toward IPE can be anticipated.

  12. Automated baseline change detection -- Phases 1 and 2. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byler, E.

    1997-10-31

    The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD), based on image subtraction. Absolute change detection is based on detecting any visible physical changes, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust, the ABCD system can also detect corrosion, leaks, dents, and bulges. The ABCD approach and method rely on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. The ABCD image processing software was installed on a robotic vehicle developed under a related DOE/FETC contract DE-AC21-92MC29112 Intelligent Mobile Sensor System (IMSS) and integrated with the electronics and software. This vehicle was designed especially to navigate in DOE Waste Storage Facilities. Initial system testing was performed at Fernald in June 1996. After some further development and more extensive integration, the prototype integrated system was installed and tested at the Radioactive Waste Management Facility (RWMC) at INEEL beginning in April 1997 through the present (November 1997). The integrated system, composed of ABCD imaging software and IMSS mobility base, is called MISS EVE (Mobile Intelligent Sensor System--Environmental Validation Expert). Evaluation of the integrated system in RWMC Building 628, containing approximately 10,000 drums, demonstrated an easy-to-use system with the ability to properly navigate through the facility, image all the defined drums, and process the results into a report delivered to the operator on a GUI interface and on hard copy. Further work is needed to make the brassboard system more operationally robust.
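
    A minimal sketch of image-subtraction change detection of the kind described above is given below. It assumes the baseline and current images are already registered (i.e., the repositioned camera views the same barrel face); the threshold values are hypothetical and chosen only for illustration.

        import numpy as np

        def detect_change(baseline_img, current_img, diff_threshold=30, min_pixels=50):
            """Flag pixels whose grey-level difference from the archived baseline
            image exceeds a threshold; report a change if enough pixels are
            flagged to rule out sensor noise."""
            diff = np.abs(current_img.astype(np.int16) - baseline_img.astype(np.int16))
            changed = diff > diff_threshold
            return changed, int(changed.sum()) >= min_pixels

        # Toy 8-bit images: a synthetic "dent" appears in the current inspection.
        rng = np.random.default_rng(0)
        baseline = rng.integers(100, 120, (240, 320), dtype=np.uint8)
        current = baseline.copy()
        current[100:120, 150:180] += 60
        mask, drum_changed = detect_change(baseline, current)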

  13. Crew interface specification development study for in-flight maintenance and stowage functions

    NASA Technical Reports Server (NTRS)

    Carl, J. G.

    1971-01-01

    The need and potential solutions for an orderly systems engineering approach to the definition, management, and documentation requirements for in-flight maintenance, assembly, servicing, and stowage process activities of the flight crews of future spacecraft were investigated. These processes were analyzed and described using a new technique (mass/function flow diagramming), developed during the study, to give visibility to crew functions and supporting requirements, including data products. This technique is usable by NASA for specification baselines and can assist the designer in identifying both upper and lower level requirements associated with these processes. These diagrams provide increased visibility into the relationships between functions and the related equipment being utilized and managed, and can serve as a common communicating vehicle between the designer, program management, and the operational planner. The information and data product requirements to support the above processes were identified along with optimum formats and contents of these products. The resulting data product concepts are presented to support these in-flight maintenance and stowage processes.

  14. Detecting and Measuring Land Subsidence in Houston-Galveston, Texas using Interferometric Synthetic Aperture Radar (InSAR) and Global Positioning System Data, 2012-2016

    NASA Astrophysics Data System (ADS)

    Reed, A.; Baker, S.

    2016-12-01

    Several cities in the Houston-Galveston (HG) region in Texas have subsided up to 13 feet over several decades due to natural and anthropogenic processes [Yu et al. 2014]. Land subsidence, a gradual sinking of the Earth's surface, is an often human-induced hazard and a major environmental problem expedited by activities such as mining, oil and gas extraction, urbanization and excessive groundwater pumping. We are able to detect and measure subsidence in HG using interferometric synthetic aperture radar (InSAR) and global positioning systems (GPS). Qu et al. [2015] used ERS, Envisat, and ALOS-1 to characterize subsidence in HG from 1995 to 2011, but a five-year gap in InSAR measurements exists due to a lack of freely available SAR data. We build upon the previous study by comparing subsidence patterns detected by Sentinel-1 data starting in July 2015. We used GMT5SAR to generate a stack of interferograms with perpendicular baselines less than 100 meters and temporal baselines less than 100 days to minimize temporal and spatial decorrelation. We applied the short baseline subset (SBAS) time series processing using GIAnT and compared our results with GPS measurements. The implications of this work will strengthen land subsidence monitoring systems in HG and broadly aid in the development of effective water resource management policies and strategies.
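
    The following sketch illustrates the interferogram pair selection rule stated above (perpendicular baselines under 100 m and temporal baselines under 100 days). It is a generic illustration with hypothetical acquisition records, not part of the GMT5SAR or GIAnT processing chains.

        from datetime import date
        from itertools import combinations

        def select_sbas_pairs(acqs, max_perp_m=100.0, max_days=100):
            """Keep interferometric pairs whose perpendicular and temporal
            baselines are below the thresholds, limiting decorrelation."""
            pairs = []
            for a, b in combinations(sorted(acqs, key=lambda x: x["date"]), 2):
                dt = (b["date"] - a["date"]).days
                dperp = abs(b["perp_m"] - a["perp_m"])
                if dt <= max_days and dperp <= max_perp_m:
                    pairs.append((a["id"], b["id"]))
            return pairs

        # Hypothetical acquisitions: id, date, perpendicular baseline w.r.t. a reference.
        acqs = [
            {"id": "S1_001", "date": date(2015, 7, 10), "perp_m": 0.0},
            {"id": "S1_002", "date": date(2015, 8, 3), "perp_m": 45.0},
            {"id": "S1_003", "date": date(2015, 11, 30), "perp_m": 160.0},
        ]
        print(select_sbas_pairs(acqs))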

  15. Intelligent monitoring and diagnosis systems for the Space Station Freedom ECLSS

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, James R.

    1991-01-01

    Specific activities in NASA's environmental control and life support system (ECLSS) advanced automation project that is designed to minimize the crew and ground manpower needed for operations are discussed. Various analyses and the development of intelligent software for the initial and evolutionary Space Station Freedom (SSF) ECLSS are described. The following are also discussed: (1) intelligent monitoring and diagnostics applications under development for the ECLSS domain; (2) integration into the MSFC ECLSS hardware testbed; and (3) an evolutionary path from the baseline ECLSS automation to the more advanced ECLSS automation processes.

  16. Development of improved high temperature coatings for IN-792 + HF

    NASA Technical Reports Server (NTRS)

    Profant, D. D.; Naik, S. K.

    1981-01-01

    The development, for the T55-L-712 engine, of high temperature coatings for integral turbine nozzles with improved thermal fatigue resistance without sacrificing oxidation/corrosion protection is discussed. The program evaluated ten coating systems, which comprised one baseline plasma spray coating (12% Al-NiCoCrAlY), three aluminide coatings including the baseline aluminide (701), two CoNiCrAlY (6% Al) + aluminide systems, and four NiCoCrY + aluminide coatings. The two-step coating processes were investigated since they offered the advantage of tailoring the composition as well as properly coating surfaces of an integral or segmented nozzle. Cyclic burner rig thermal fatigue and oxidation/corrosion tests were used to evaluate the candidate coating systems. The plasma sprayed 12% Al-NiCoCrAlY was rated the best coating in thermal fatigue resistance and outperformed all coatings by a factor of 1.4 to 2.5 in cycles to crack initiation. However, this coating is not applicable to integral or segmented nozzles due to the line-of-sight limitation of the plasma spray process. The 6% Al-CoNiCrAlY + Mod. 701 aluminide (32 w/o Al) was rated the best coating in oxidation/corrosion resistance and the second best in thermal fatigue resistance.

  17. Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Humphries, W. R.

    2001-01-01

    Engineering design is a challenging activity for any product. Since launch vehicles are highly complex and interconnected and have extreme energy densities, their design represents a challenge of the highest order. The purpose of this document is to delineate and clarify the design process associated with the launch vehicle for space flight transportation. The goal is to define and characterize a baseline for the space transportation design process. This baseline can be used as a basis for improving effectiveness and efficiency of the design process. The baseline characterization is achieved via compartmentalization and technical integration of subsystems, design functions, and discipline functions. First, a global design process overview is provided in order to show responsibility, interactions, and connectivity of overall aspects of the design process. Then design essentials are delineated in order to emphasize necessary features of the design process that are sometimes overlooked. Finally the design process characterization is presented. This is accomplished by considering project technical framework, technical integration, process description (technical integration model, subsystem tree, design/discipline planes, decision gates, and tasks), and the design sequence. Also included in the document are a snapshot relating to process improvements, illustrations of the process, a survey of recommendations from experienced practitioners in aerospace, lessons learned, references, and a bibliography.

  18. National facilities study. Volume 3: Mission and requirements model report

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The National Facility Study (NFS) was initiated in 1992 by Daniel S. Goldin, Administrator of NASA as an initiative to develop a comprehensive and integrated long-term plan for future facilities. The resulting, multi-agency NFS consisted of three Task Groups: Aeronautics, Space Operations, and Space Research and Development (R&D) Task Groups. A fourth group, the Engineering and Cost Analysis Task Group, was subsequently added to provide cross-cutting functions, such as assuring consistency in developing an inventory of space facilities. Space facilities decisions require an assessment of current and future needs. Therefore, the two task groups dealing with space developed a consistent model of future space mission programs, operations and R&D. The model is a middle ground baseline constructed for NFS analytical purposes with excursions to cover potential space program strategies. The model includes three major sectors: DOD, civilian government, and commercial space. The model spans the next 30 years because of the long lead times associated with facilities development and usage. This document, Volume 3 of the final NFS report, is organized along the following lines: Executive Summary -- provides a summary view of the 30-year mission forecast and requirements baseline, an overview of excursions from that baseline that were studied, and organization of the report; Introduction -- provides discussions of the methodology used in this analysis; Baseline Model -- provides the mission and requirements model baseline developed for Space Operations and Space R&D analyses; Excursions from the baseline -- reviews the details of variations or 'excursions' that were developed to test the future program projections captured in the baseline; and a Glossary of Acronyms.

  19. Hybrid Kalman Filter: A New Approach for Aircraft Engine In-Flight Diagnostics

    NASA Technical Reports Server (NTRS)

    Kobayashi, Takahisa; Simon, Donald L.

    2006-01-01

    In this paper, a uniquely structured Kalman filter is developed for its application to in-flight diagnostics of aircraft gas turbine engines. The Kalman filter is a hybrid of a nonlinear on-board engine model (OBEM) and piecewise linear models. The utilization of the nonlinear OBEM allows the reference health baseline of the in-flight diagnostic system to be updated to the degraded health condition of the engines through a relatively simple process. Through this health baseline update, the effectiveness of the in-flight diagnostic algorithm can be maintained as the health of the engine degrades over time. Another significant aspect of the hybrid Kalman filter methodology is its capability to take advantage of conventional linear and nonlinear Kalman filter approaches. Based on the hybrid Kalman filter, an in-flight fault detection system is developed, and its diagnostic capability is evaluated in a simulation environment. Through the evaluation, the suitability of the hybrid Kalman filter technique for aircraft engine in-flight diagnostics is demonstrated.
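
    As a generic illustration of residual-based monitoring against a health baseline, the following scalar Kalman filter sketch tracks the bias between a sensed value and the value predicted by a reference model. It is only an assumed, minimal example; it does not reproduce the paper's hybrid OBEM/piecewise-linear structure, and the noise parameters and threshold are arbitrary.

        import numpy as np

        def residual_fault_monitor(measurements, baseline, q=1e-4, r=0.05, threshold=3.0):
            """Scalar Kalman filter tracking the bias between a sensed value and
            the health-baseline prediction; a large normalized bias flags a fault."""
            x, p = 0.0, 1.0                      # estimated residual bias and its variance
            flags = []
            for z, b in zip(measurements, baseline):
                p += q                           # predict (random-walk bias model)
                k = p / (p + r)                  # Kalman gain
                x += k * ((z - b) - x)           # update with the innovation
                p *= (1.0 - k)
                flags.append(abs(x) / np.sqrt(p + r) > threshold)
            return flags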

  20. Investigation of baseline measurement resolution of a Si plate-based extrinsic Fabry-Perot interferometer

    NASA Astrophysics Data System (ADS)

    Ushakov, Nikolai; Liokumovich, Leonid

    2014-05-01

    Measurement of wafer thickness is of great value for the fabrication and interrogation of MEMS/MOEMS devices, as well as conventional optical fiber sensors. In the current paper we investigate the ability of wavelength-scanning interferometry techniques to register the baseline of an extrinsic fiber Fabry-Perot interferometer (EFPI) with the cavity formed by the two sides of a silicon plate. In order to enhance the resolution, an improved signal processing algorithm was developed. Various experiments, including contact and non-contact measurement of silicon wafer thickness, were performed, with achieved resolutions from 10 to 20 pm. This enables the described approach to be used for high-precision measurement of geometric parameters of micro electro (electro-optic) mechanical systems for their characterization, utilization in sensing tasks, and fabrication control. The ability of a Si plate-based EFPI interrogated by the developed technique to capture temperature variations of about 4 mK was demonstrated.
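
    For intuition, the sketch below shows the standard Fourier-transform approach to wavelength-scanning EFPI interrogation: the reflection spectrum of a low-finesse cavity is periodic in wavenumber with period 2*pi/OPD, so the dominant fringe frequency gives the optical path difference. This is a generic, assumed illustration, not the improved algorithm developed in the paper.

        import numpy as np

        def efpi_optical_thickness(wavelengths_nm, intensity):
            """Estimate the optical thickness n*d of a low-finesse Fabry-Perot
            cavity from its reflection spectrum: resample onto a uniform
            wavenumber grid, FFT, and read off the dominant fringe frequency
            (OPD = 2*n*d)."""
            k = 2.0 * np.pi / (wavelengths_nm * 1e-9)        # wavenumber, rad/m (descending)
            k_uni = np.linspace(k.min(), k.max(), len(k))
            i_uni = np.interp(k_uni, k[::-1], intensity[::-1])
            amp = np.abs(np.fft.rfft(i_uni - i_uni.mean()))
            freqs = np.fft.rfftfreq(len(k_uni), d=k_uni[1] - k_uni[0])
            opd = 2.0 * np.pi * freqs[np.argmax(amp)]        # optical path difference, m
            return opd / 2.0                                 # optical thickness n*d, m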

  1. Hydrologic, vegetation, and soil data collected in selected wetlands of the Big River Management area, Rhode Island, from 2008 through 2010

    USGS Publications Warehouse

    Borenstein, Meredith S.; Golet, Francis C.; Armstrong, David S.; Breault, Robert F.; McCobb, Timothy D.; Weiskel, Peter K.

    2012-01-01

    The Rhode Island Water Resources Board planned to develop public water-supply wells in the Big River Management Area in Kent County, Rhode Island. Research in the United States and abroad indicates that groundwater withdrawal has the potential to affect wetland hydrology and related processes. In May 2008, the Rhode Island Water Resources Board, the U.S. Geological Survey, and the University of Rhode Island formed a partnership to establish baseline conditions at selected Big River wetland study sites and to develop an approach for monitoring potential impacts once pumping begins. In 2008 and 2009, baseline data were collected on the hydrology, vegetation, and soil characteristics at five forested wetland study sites in the Big River Management Area. Four of the sites were located in areas of potential drawdown associated with the projected withdrawals. The fifth site was located outside the area of projected drawdown and served as a control site. The data collected during this study are presented in this report.

  2. Air/Superfund national technical guidance study series, Volume 2. Estimation of baseline air emission at Superfund sites. Interim report(Final)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1989-01-01

    This volume is one in a series of manuals prepared for EPA to assist its Remedial Project Managers in the assessment of the air contaminant pathway and developing input data for risk assessment. The manual provides guidance on developing baseline-emission estimates from hazardous waste sites. Baseline-emission estimates (BEEs) are defined as emission rates estimated for a site in its undisturbed state. Specifically, the manual is intended to: Present a protocol for selecting the appropriate level of effort to characterize baseline air emissions; Assist site managers in designing an approach for BEEs; Describe useful technologies for developing site-specific baseline emission estimates (BEEs); Help site managers select the appropriate technologies for generating site-specific BEEs.

  3. Direct coal liquefaction baseline design and system analysis. Quarterly report, January--March 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-04-01

    The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  4. Direct coal liquefaction baseline design and system analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-04-01

    The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  5. Pinellas Plant Environmental Baseline Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The Pinellas Plant has been part of the Department of Energy's (DOE) nuclear weapons complex since the plant opened in 1957. In March 1995, the DOE sold the Pinellas Plant to the Pinellas County Industry Council (PCIC). DOE has leased back a large portion of the plant site to facilitate transition to alternate use and safe shutdown. The current mission is to achieve a safe transition of the facility from defense production and prepare the site for alternative uses as a community resource for economic development. Toward that effort, the Pinellas Plant Environmental Baseline Report (EBR) discusses the current and past environmental conditions of the plant site. Information for the EBR is obtained from plant records. Historical process and chemical usage information for each area is reviewed during area characterizations.

  6. ANSI/ASHRAE/IES Standard 90.1-2010 Performance Rating Method Reference Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goel, Supriya; Rosenberg, Michael I.

    This document is intended to be a reference manual for the Appendix G Performance Rating Method (PRM) of ANSI/ASHRAE/IES Standard 90.1-2010 (Standard 90.1-2010). The PRM is used for rating the energy efficiency of commercial and high-rise residential buildings with designs that exceed the requirements of Standard 90.1. The procedures and processes described in this manual are designed to provide consistency and accuracy by filling in gaps and providing additional details needed by users of the PRM. It should be noted that this document is created independently from ASHRAE and SSPC 90.1 and is not sanctioned nor approved by either of those entities. Potential users of this manual include energy modelers, software developers and implementers of “beyond code” energy programs. Energy modelers using ASHRAE Standard 90.1-2010 for beyond code programs can use this document as a reference manual for interpreting requirements of the Performance Rating Method. Software developers, developing tools for automated creation of the baseline model, can use this reference manual as a guideline for developing the rules for the baseline model.

  7. Space Station Freedom ECLSS: A step toward autonomous regenerative life support systems

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.

    1990-01-01

    The Environmental Control and Life Support System (ECLSS) is a Freedom Station distributed system with inherent applicability to extensive automation primarily due to its comparatively long control system latencies. These allow longer contemplation times in which to form a more intelligent control strategy and to prevent and diagnose faults. The regenerative nature of the Space Station Freedom ECLSS will contribute closed loop complexities never before encountered in life support systems. A study to determine ECLSS automation approaches has been completed. The ECLSS baseline software and system processes could be augmented with more advanced fault management and regenerative control systems for a more autonomous evolutionary system, as well as serving as a firm foundation for future regenerative life support systems. Emerging advanced software technology and tools can be successfully applied to fault management, but a fully automated life support system will require research and development of regenerative control systems and models. The baseline Environmental Control and Life Support System utilizes ground tests in development of batch chemical and microbial control processes. Long duration regenerative life support systems will require more active chemical and microbial feedback control systems which, in turn, will require advancements in regenerative life support models and tools. These models can be verified using ground and on orbit life support test and operational data, and used in the engineering analysis of proposed intelligent instrumentation feedback and flexible process control technologies for future autonomous regenerative life support systems, including the evolutionary Space Station Freedom ECLSS.

  8. Conditional analysis of mixed Poisson processes with baseline counts: implications for trial design and analysis.

    PubMed

    Cook, Richard J; Wei, Wei

    2003-07-01

    The design of clinical trials is typically based on marginal comparisons of a primary response under two or more treatments. The considerable gains in efficiency afforded by models conditional on one or more baseline responses has been extensively studied for Gaussian models. The purpose of this article is to present methods for the design and analysis of clinical trials in which the response is a count or a point process, and a corresponding baseline count is available prior to randomization. The methods are based on a conditional negative binomial model for the response given the baseline count and can be used to examine the effect of introducing selection criteria on power and sample size requirements. We show that designs based on this approach are more efficient than those proposed by McMahon et al. (1994).
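
    The toy example below shows a negative binomial regression in which the on-study event count is conditioned on the baseline count and the treatment assignment, using statsmodels with a fixed dispersion parameter. The simulated data, the fixed alpha, and the covariate form are assumptions for illustration only; this is not the exact conditional model or design methodology of the paper.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 400
        baseline = rng.poisson(5.0, n)                     # pre-randomization counts
        treat = rng.integers(0, 2, n)                      # treatment arm indicator
        mu = np.exp(0.3 + 0.15 * np.log1p(baseline) - 0.4 * treat)
        y = rng.poisson(mu)                                # on-study event counts (toy data)

        X = sm.add_constant(pd.DataFrame({"log1p_baseline": np.log1p(baseline),
                                          "treat": treat}))
        fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
        print(fit.summary())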

  9. Automatically finding relevant citations for clinical guideline development.

    PubMed

    Bui, Duy Duc An; Jonnalagadda, Siddhartha; Del Fiol, Guilherme

    2015-10-01

    Literature database search is a crucial step in the development of clinical practice guidelines and systematic reviews. In the age of information technology, the process of literature search is still conducted manually, therefore it is costly, slow and subject to human errors. In this research, we sought to improve the traditional search approach using innovative query expansion and citation ranking approaches. We developed a citation retrieval system composed of query expansion and citation ranking methods. The methods are unsupervised and easily integrated over the PubMed search engine. To validate the system, we developed a gold standard consisting of citations that were systematically searched and screened to support the development of cardiovascular clinical practice guidelines. The expansion and ranking methods were evaluated separately and compared with baseline approaches. Compared with the baseline PubMed expansion, the query expansion algorithm improved recall (80.2% vs. 51.5%) with small loss on precision (0.4% vs. 0.6%). The algorithm could find all citations used to support a larger number of guideline recommendations than the baseline approach (64.5% vs. 37.2%, p<0.001). In addition, the citation ranking approach performed better than PubMed's "most recent" ranking (average precision +6.5%, recall@k +21.1%, p<0.001), PubMed's rank by "relevance" (average precision +6.1%, recall@k +14.8%, p<0.001), and the machine learning classifier that identifies scientifically sound studies from MEDLINE citations (average precision +4.9%, recall@k +4.2%, p<0.001). Our unsupervised query expansion and ranking techniques are more flexible and effective than PubMed's default search engine behavior and the machine learning classifier. Automated citation finding is promising to augment the traditional literature search. Copyright © 2015 Elsevier Inc. All rights reserved.
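
    The evaluation metrics quoted above (average precision and recall@k against a gold-standard citation set) can be computed as in the short sketch below. These are generic implementations for illustration, not the authors' code.

        def recall_at_k(ranked_ids, relevant_ids, k):
            """Fraction of relevant (gold-standard) citations found in the top k."""
            hits = sum(1 for doc in ranked_ids[:k] if doc in relevant_ids)
            return hits / len(relevant_ids)

        def average_precision(ranked_ids, relevant_ids):
            """Mean of the precision values at the rank of each relevant citation."""
            hits, precisions = 0, []
            for rank, doc in enumerate(ranked_ids, start=1):
                if doc in relevant_ids:
                    hits += 1
                    precisions.append(hits / rank)
            return sum(precisions) / len(relevant_ids) if relevant_ids else 0.0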

  10. Oscillation Baselining and Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) Project (GM0072 - “Suite of open-source applications and models for advanced synchrophasor analysis”) and it is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
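
    One simple way to extract the mode quantities named above (frequency and damping) from a measured signal is to fit a damped sinusoid, as in the sketch below. OBAT's actual algorithms are not described in this record, so this is only an assumed illustration with synthetic PMU-like data.

        import numpy as np
        from scipy.optimize import curve_fit

        def damped_sine(t, amp, sigma, freq, phase, offset):
            return amp * np.exp(sigma * t) * np.cos(2 * np.pi * freq * t + phase) + offset

        def identify_mode(t, y, f0=0.5):
            """Fit one damped sinusoid; return (frequency in Hz, damping ratio)."""
            p0 = [np.ptp(y) / 2.0, -0.1, f0, 0.0, float(np.mean(y))]
            (amp, sigma, freq, phase, offset), _ = curve_fit(damped_sine, t, y,
                                                             p0=p0, maxfev=20000)
            omega = 2.0 * np.pi * freq
            return freq, -sigma / np.sqrt(sigma ** 2 + omega ** 2)

        # Synthetic inter-area oscillation: 0.4 Hz with 5% damping, plus noise.
        t = np.arange(0.0, 20.0, 1.0 / 30.0)               # 30 samples/s PMU rate
        zeta, f = 0.05, 0.4
        sig = np.exp(-zeta * 2 * np.pi * f * t) * np.cos(2 * np.pi * f * t)
        sig += 0.01 * np.random.default_rng(2).normal(size=t.size)
        print(identify_mode(t, sig, f0=0.5))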

  11. Process Development for Automated Solar Cell and Module Production. Task 4: Automated Array Assembly

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A baseline sequence for the manufacture of solar cell modules was specified. Starting with silicon wafers, the process goes through damage etching, texture etching, junction formation, plasma edge etch, aluminum back surface field formation, and screen printed metallization to produce finished solar cells. The cells were then series connected on a ribbon and bonded into a finished glass tedlar module. A number of steps required additional developmental effort to verify technical and economic feasibility. These steps include texture etching, plasma edge etch, aluminum back surface field formation, array layup and interconnect, and module edge sealing and framing.

  12. Statistical error model for a solar electric propulsion thrust subsystem

    NASA Technical Reports Server (NTRS)

    Bantell, M. H.

    1973-01-01

    The solar electric propulsion thrust subsystem statistical error model was developed as a tool for investigating the effects of thrust subsystem parameter uncertainties on navigation accuracy. The model is currently being used to evaluate the impact of electric engine parameter uncertainties on navigation system performance for a baseline mission to Encke's Comet in the 1980s. The data given represent the next generation in statistical error modeling for low-thrust applications. Principal improvements include the representation of thrust uncertainties and random process modeling in terms of random parametric variations in the thrust vector process for a multi-engine configuration.

  13. Water recovery and solid waste processing for aerospace and domestic applications

    NASA Technical Reports Server (NTRS)

    Murawczyk, C.

    1973-01-01

    The work accomplished in compiling the information needed to establish current water supply and waste water processing requirements for dwellings, and in developing a preliminary design for a waste water to potable water management system, is described. The data generated were used in formulating design criteria for the preliminary design of the waste water to potable water recycling system. The system as defined was sized for a group of 500 dwelling units. Study tasks summarized include: water consumption, nature of domestic water, consumer appliances for low water consumption, water quality monitoring, baseline concept, and current and projected costs.

  14. Microstructures and Mechanical Properties of Two-Phase Alloys Based on NbCr(2)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cady, C.M.; Chen, K.C.; Kotula, P.G.

    A two-phase, Nb-Cr-Ti alloy (bcc + C15 Laves phase) has been developed using several alloy design methodologies. In an effort to understand processing-microstructure-property relationships, different processing routes were employed. The resulting microstructures and mechanical properties are discussed and compared. Plasma arc-melted samples served to establish baseline, as-cast properties. In addition, a novel processing technique, involving decomposition of a supersaturated and metastable precursor phase during hot isostatic pressing (HIP), was used to produce a refined, equilibrium two-phase microstructure. Quasi-static compression tests as a function of temperature were performed on both alloy types. Different deformation mechanisms were encountered based upon temperature and microstructure.

  15. Low cost solar array project production process and equipment task: A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Several major modifications were made to the design presented at the PDR. The frame was deleted in favor of a "frameless" design which will provide a substantially improved cell packing factor. Potential shaded-cell damage resulting from operation into a short circuit can be eliminated by a change in the cell series/parallel electrical interconnect configuration. The baseline process sequence defined for the MEPSDU was refined and equipment design and specification work was completed. SAMICS cost analysis work accelerated, format A's were prepared, and computer simulations were completed. Design work on the automated cell interconnect station was focused on bond technique selection experiments.

  16. Improved Modeling of Finite-Rate Turbulent Combustion Processes in Research Combustors

    NASA Technical Reports Server (NTRS)

    VanOverbeke, Thomas J.

    1998-01-01

    The objective of this thesis is to further develop and test a stochastic model of turbulent combustion in recirculating flows. There is a requirement to increase the accuracy of multi-dimensional combustion predictions. As turbulence affects reaction rates, this interaction must be more accurately evaluated. In this work a more physically correct way of handling the effect of turbulence on combustion is further developed and tested. As turbulence involves randomness, stochastic modeling is used. Averaged values such as temperature and species concentration are found by integrating the probability density function (pdf) over the range of the scalar. The model in this work does not assume the pdf type, but solves for the evolution of the pdf using the Monte Carlo solution technique. The model is further developed by including a more robust reaction solver, by using accurate thermodynamics, and by using more accurate transport elements. The stochastic method is used with the Semi-Implicit Method for Pressure-Linked Equations (SIMPLE). The SIMPLE method is used to solve for velocity, pressure, turbulent kinetic energy and dissipation. The pdf solver solves for temperature and species concentration. Thus, the method is partially familiar to combustor engineers. The method is compared to benchmark experimental data and baseline calculations. The baseline method was tested on isothermal flows, evaporating sprays and combusting sprays. Pdf and baseline predictions were performed for three diffusion flames and one premixed flame. The pdf method predicted lower combustion rates than the baseline method in agreement with the data, except for the premixed flame. The baseline and stochastic predictions bounded the experimental data for the premixed flame. The use of a continuous mixing model or relax-to-mean mixing model had little effect on the prediction of average temperature. Two grids were used in a hydrogen diffusion flame simulation. Grid density did not affect the predictions except for peak temperature and tangential velocity. The hybrid pdf method did take longer and required more memory, but has a theoretical basis to extend to many reaction steps, which cannot be said of current turbulent combustion models.

  17. Physical activity in the elderly is associated with improved executive function and processing speed: the LADIS Study.

    PubMed

    Frederiksen, Kristian Steen; Verdelho, Ana; Madureira, Sofia; Bäzner, Hansjörg; O'Brien, John T; Fazekas, Franz; Scheltens, Philip; Schmidt, Reinhold; Wallin, Anders; Wahlund, Lars-Olof; Erkinjunttii, Timo; Poggesi, Anna; Pantoni, Leonardo; Inzitari, Domenico; Waldemar, Gunhild

    2015-07-01

    Physical activity reduces the risk of cognitive decline but may affect cognitive domains differently. We examined whether physical activity modifies processing speed, executive function and memory in a population of non-dementia elderly subjects with age-related white matter changes (ARWMC). Data from the Leukoaraiosis And DISability (LADIS) study, a multicenter, European prospective cohort study aimed at examining the role of ARWMC in transition to disability, was used. Subjects in the LADIS study were clinically assessed yearly for 3 years including MRI at baseline and 3-year follow-up. Physical activity was assessed at baseline, and cognitive compound scores at baseline and 3-year assessment were used. Two-hundred-eighty-two subjects (age, y (mean (SD)): 73.1 (± 5.1); gender (f/m): 164/118); MMSE (mean (SD)): 28.3 (± 1.7)) who had not progressed to MCI or dementia, were included. Multiple variable linear regression analysis with baseline MMSE, education, gender, age, stroke, diabetes and ARWMC rating as covariates revealed that physical activity was associated with better scores at baseline and 3-year follow-up for executive function (baseline: β: 0.39, 95% CI: 0.13-0.90, p = 0.008; follow-up: β: 0.24, 95% CI: 0.10-0.38, p = 0.001) and processing speed (baseline: β: 0.48, 95% CI: 0.14-0.89, p = 0.005; follow-up: β: 0.15, 95% CI: 0.02-0.29, p = 0.02) but not memory. When including baseline cognitive score as a covariate in the analysis of 3-year follow-up scores, executive function remained significant (β: 0.11, 95% CI: 0-0.22, p = 0.04). Our findings confirm previous findings of a positive effect of physical activity on cognitive functions in elderly subjects, and further extends these by showing that the association is also present in patients with ARWMC. Copyright © 2014 John Wiley & Sons, Ltd.
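
    The following toy statsmodels example illustrates the covariate-adjusted linear regression described above (a cognitive outcome regressed on physical activity with baseline MMSE, education, gender, age and ARWMC rating as covariates). The simulated data and variable names are placeholders, not the LADIS data, and diabetes and stroke covariates are omitted for brevity.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 282
        df = pd.DataFrame({
            "exec_function": rng.normal(0.0, 1.0, n),      # compound cognitive score (toy)
            "physically_active": rng.integers(0, 2, n),
            "baseline_mmse": rng.normal(28.3, 1.7, n),
            "age": rng.normal(73.1, 5.1, n),
            "female": rng.integers(0, 2, n),
            "education_years": rng.normal(10.0, 4.0, n),
            "arwmc_rating": rng.integers(1, 4, n),
        })
        fit = smf.ols("exec_function ~ physically_active + baseline_mmse + age"
                      " + female + education_years + arwmc_rating", data=df).fit()
        print(fit.params["physically_active"], fit.conf_int().loc["physically_active"])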

  18. MRMAide: a mixed resolution modeling aide

    NASA Astrophysics Data System (ADS)

    Treshansky, Allyn; McGraw, Robert M.

    2002-07-01

    The Mixed Resolution Modeling Aide (MRMAide) technology is an effort to semi-automate the implementation of Mixed Resolution Modeling (MRM). MRMAide suggests ways of resolving differences in fidelity and resolution across diverse modeling paradigms. The goal of MRMAide is to provide a technology that will allow developers to incorporate model components into scenarios other than those for which they were designed. Currently, MRM is implemented by hand. This is a tedious, error-prone, and non-portable process. MRMAide, in contrast, will automatically suggest to a developer where and how to connect different components and/or simulations. MRMAide has three phases of operation: pre-processing, data abstraction, and validation. During pre-processing the components to be linked together are evaluated in order to identify appropriate mapping points. During data abstraction those mapping points are linked via data abstraction algorithms. During validation developers receive feedback regarding their newly created models relative to existing baselined models. The current work presents an overview of the various problems encountered during MRM and the various technologies utilized by MRMAide to overcome those problems.

  19. Large-baseline InSAR for precise topographic mapping: a framework for TanDEM-X large-baseline data

    NASA Astrophysics Data System (ADS)

    Pinheiro, Muriel; Reigber, Andreas; Moreira, Alberto

    2017-09-01

    The global Digital Elevation Model (DEM) resulting from the TanDEM-X mission provides information about the world topography with outstanding precision. In fact, performance analysis carried out with the already available data have shown that the global product is well within the requirements of 10 m absolute vertical accuracy and 2 m relative vertical accuracy for flat to moderate terrain. The mission's science phase took place from October 2014 to December 2015. During this phase, bistatic acquisitions with across-track separation between the two satellites up to 3.6 km at the equator were commanded. Since the relative vertical accuracy of InSAR derived elevation models is, in principle, inversely proportional to the system baseline, the TanDEM-X science phase opened the doors for the generation of elevation models with improved quality with respect to the standard product. However, the interferometric processing of the large-baseline data is troublesome due to the increased volume decorrelation and very high frequency of the phase variations. Hence, in order to fully profit from the increased baseline, sophisticated algorithms for the interferometric processing, and, in particular, for the phase unwrapping have to be considered. This paper proposes a novel dual-baseline region-growing framework for the phase unwrapping of the large-baseline interferograms. Results from two experiments with data from the TanDEM-X science phase are discussed, corroborating the expected increased level of detail of the large-baseline DEMs.
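
    To make the stated inverse relation between baseline and vertical sensitivity concrete, the sketch below computes the height of ambiguity (the height change corresponding to one interferometric fringe): larger perpendicular baselines give smaller heights of ambiguity, hence finer height sensitivity but harder phase unwrapping. The geometry factor convention and the illustrative numbers are assumptions, not mission values.

        import numpy as np

        def height_of_ambiguity(wavelength_m, slant_range_m, incidence_deg,
                                perp_baseline_m, bistatic=True):
            """Height change per 2*pi fringe. The geometry factor is commonly
            taken as 1 for single-pass bistatic acquisitions and 2 for
            repeat-pass monostatic ones (assumed convention)."""
            factor = 1.0 if bistatic else 2.0
            return (wavelength_m * slant_range_m * np.sin(np.radians(incidence_deg))
                    / (factor * perp_baseline_m))

        # Illustrative X-band numbers only: ~600 km slant range, 35 deg incidence.
        for b_perp in (150.0, 500.0, 3600.0):
            print(b_perp, height_of_ambiguity(0.031, 600e3, 35.0, b_perp))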

  20. Transient Receptor Potential Vanilloid 2 Regulates Myocardial Response to Exercise

    PubMed Central

    Naticchioni, Mindi; Karani, Rajiv; Smith, Margaret A.; Onusko, Evan; Robbins, Nathan; Jiang, Min; Radzyukevich, Tatiana; Fulford, Logan; Gao, Xu; Apel, Ryan; Heiny, Judith; Rubinstein, Jack; Koch, Sheryl E.

    2015-01-01

    The myocardial response to exercise is an adaptive mechanism that permits the heart to maintain cardiac output via improved cardiac function and development of hypertrophy. There are many overlapping mechanisms via which this occurs, with calcium handling being a crucial component of this process. Our laboratory has previously found that the stretch-sensitive TRPV2 channels are active regulators of calcium handling and cardiac function under baseline conditions, based on our observations that TRPV2-KO mice have impaired cardiac function at baseline. The focus of this study was to determine the cardiac function of TRPV2-KO mice under exercise conditions. We measured skeletal muscle at baseline in WT and TRPV2-KO mice, subjected them to various exercise protocols, and measured the cardiac response using echocardiography and molecular markers. Our results demonstrate that TRPV2-KO mice did not tolerate forced exercise, although they became increasingly exercise tolerant with voluntary exercise. This occurs as the cardiac function deteriorates further with exercise. Thus, our conclusion is that TRPV2-KO mice have an impaired cardiac functional response to exercise. PMID:26356305

  1. Evaluation of a Stirling Solar Dynamic System for Lunar Oxygen Production

    NASA Technical Reports Server (NTRS)

    Colozza, Anthony J.; Wong, Wayne A.

    2006-01-01

    An evaluation of a solar concentrator-based system for producing oxygen from the lunar regolith was performed. The system utilizes a solar concentrator mirror to provide thermal energy for the oxygen production process as well as thermal energy to power a Stirling heat engine for the production of electricity. The electricity produced is utilized to operate the equipment needed in the oxygen production process. The oxygen production method utilized in the analysis was the hydrogen reduction of ilmenite. Utilizing this method of oxygen production, a baseline system design was produced. This baseline system had an oxygen production rate of 0.6 kg/hr with a concentrator mirror size of 5 m. Variations were performed on the baseline design to show how changes in the system size and process rate affected the oxygen production rate.
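
    As a back-of-envelope illustration of how the baseline numbers quoted above (0.6 kg/hr with a 5 m mirror) scale, the sketch below assumes production rate is simply proportional to concentrator area (collected solar power), with all process efficiencies held fixed. This first-order scaling assumption is ours, not the paper's analysis.

        BASELINE_RATE_KG_HR = 0.6      # from the abstract
        BASELINE_MIRROR_D_M = 5.0      # from the abstract

        def o2_rate_for_mirror(diameter_m):
            """First-order scaling: rate proportional to mirror area."""
            area_ratio = (diameter_m / BASELINE_MIRROR_D_M) ** 2
            return BASELINE_RATE_KG_HR * area_ratio

        for d in (4.0, 5.0, 7.0, 10.0):
            print(f"{d:4.1f} m mirror -> ~{o2_rate_for_mirror(d):.2f} kg O2/hr")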

  2. 10 CFR 850.20 - Baseline beryllium inventory.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Baseline beryllium inventory. 850.20 Section 850.20 Energy... Baseline beryllium inventory. (a) The responsible employer must develop a baseline inventory of the locations of beryllium operations and other locations of potential beryllium contamination, and identify the...

  3. Development of a hospital-based patient-reported outcome framework for lung cancer patients: a study protocol.

    PubMed

    Moloczij, Natasha; Gough, Karla; Solomon, Benjamin; Ball, David; Mileshkin, Linda; Duffy, Mary; Krishnasamy, Mei

    2018-01-11

    Patient-reported outcome (PRO) data is central to the delivery of quality health care. Establishing sustainable, reliable and cost-efficient methods for routine collection and integration of PRO data into health information systems is challenging. This protocol paper describes the design and structure of a study to develop and pilot test a PRO framework to systematically and longitudinally collect PRO data from a cohort of lung cancer patients at a comprehensive cancer centre in Australia. Best-practice guidelines for developing registries aimed at collecting PROs informed the development of this PRO framework. Framework components included: achieving consensus on determining the purpose of the framework, the PRO measures to be included, the data collection time points and collection methods (electronic and paper), establishing processes to safeguard the quality of the data collected and to link the PRO framework to an existing hospital-based lung cancer clinical registry. Lung cancer patients will be invited to give feedback on the PRO measures (PROMs) chosen and the data collection time points and methods. Implementation of the framework will be piloted for 12 months. Then a mixed-methods approach used to explore patient and multidisciplinary perspectives on the feasibility of implementing the framework and linking it to the lung cancer clinical registry, its clinical utility, perceptions of data collection burden, and preliminary assessment of resource costs to integrate, implement and sustain the PRO framework. The PRO data set will include: a quality of life questionnaire (EORTC-QLQ-C30) and the EORTC lung cancer specific module (QLQC-LC-13). These will be collected pre-treatment (baseline), 2, 6 and 12 months post-baseline. Also, four social isolation questions (PROMIS) will be collected at baseline. Identifying and deciding on the overall purpose, clinical utility of data and which PROs to collect from patients requires careful consideration. Our study will explore how PRO data collection processes that link to a clinical data set can be developed and integrated; how PRO systems that are easy for patients to complete and professionals to use in practice can be achieved, and will provide indicative costs of developing and integrating a longitudinal PRO framework into routine hospital data collection systems. This study is not a clinical trial and is therefore not registered in any trial registry. However, it has received human research ethics approval (LNR/16/PMCC/45).

  4. Machine health prognostics using the Bayesian-inference-based probabilistic indication and high-order particle filtering framework

    NASA Astrophysics Data System (ADS)

    Yu, Jianbo

    2015-12-01

    Prognostics is an efficient means of achieving zero-downtime performance, maximum productivity and proactive maintenance of machines. Prognostics intends to assess and predict the time evolution of machine health degradation so that machine failures can be predicted and prevented. A novel prognostics system is developed based on the data-model-fusion scheme using the Bayesian inference-based self-organizing map (SOM) and an integration of logistic regression (LR) and high-order particle filtering (HOPF). In this prognostics system, a baseline SOM is constructed to model the data distribution space of the healthy machine under the assumption that predictable fault patterns are not available. A Bayesian inference-based probability (BIP) derived from the baseline SOM is developed as a quantitative indication of machine health degradation. BIP is capable of offering a failure probability for the monitored machine, which has an intuitive interpretation related to the health degradation state. Based on those historic BIPs, the constructed LR and its modeling noise constitute a high-order Markov process (HOMP) to describe machine health propagation. HOPF is used to solve the HOMP estimation to predict the evolution of the machine health in the form of a probability density function (PDF). An on-line model update scheme is developed to quickly adapt the Markov process to changes in machine health dynamics. The experimental results on a bearing test-bed illustrate the potential applications of the proposed system as an effective and simple tool for machine health prognostics.
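
    A minimal bootstrap particle filter for a scalar degradation indicator is sketched below, to illustrate the general idea of propagating a health state and representing its prediction as a particle cloud (a sampled PDF). It is not the paper's SOM/LR/high-order formulation, and the random-walk-with-drift model and all parameters are assumptions.

        import numpy as np

        def particle_filter_prognosis(observations, n_particles=2000, drift=0.01,
                                      process_std=0.005, obs_std=0.05, seed=0):
            """Track a scalar health-degradation indicator; return the particle
            cloud after the last observation, from which a remaining-useful-life
            distribution could be extrapolated."""
            rng = np.random.default_rng(seed)
            particles = rng.normal(0.0, 0.01, n_particles)       # initial health indicator
            for z in observations:
                particles += drift + rng.normal(0.0, process_std, n_particles)  # propagate
                weights = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)       # likelihood
                weights /= weights.sum()
                idx = rng.choice(n_particles, n_particles, p=weights)           # resample
                particles = particles[idx]
            return particles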

  5. Directly connecting the Very Long Baseline Array

    NASA Astrophysics Data System (ADS)

    Hunt, Gareth; Romney, Jonathan D.; Walker, R. Craig

    2002-11-01

    At present, the signals received by the 10 antennas of the Very Long Baseline Array (VLBA) are recorded on instrumentation tapes. These tapes are then shipped from the antenna locations - distributed across the mainland USA, the US Virgin Islands, and Hawaii - to the processing center in Socorro, New Mexico. The Array operates today at a mean sustained data rate of 128 Mbps per antenna, but peak rates of 256 Mbps and 512 Mbps are also used. Transported tapes provide the cheapest method of attaining these bit rates. The present tape system derives from wideband recording techniques dating back to the late 1960s, and has been in use since the commissioning of the VLBA in 1993. It is in need of replacement on a time scale of a few years. Further, plans are being developed which would increase the required data rates to 1 Gbps in 5 years and 100 Gbps in 10 years. With the advent of higher performance networks, it should be possible to transmit the data directly to the processing center. However, achieving this connectivity is complicated by the remoteness of the antennas -

  6. A short baseline strainmeter using fiber-optic Bragg-Grating (FBG) sensor and a nano-optic interferometer

    NASA Astrophysics Data System (ADS)

    Coutant, O.; Demengin, M.; Le Coarer, E.; Gaffet, S.

    2013-12-01

    Strain recordings from tiltmeters or borehole volumetric strainmeters on volcanoes reveal extremely rich signals of deformation associated with eruptive processes. The ability to detect and record signals of the order of a few tens of nanostrain is complementary to other monitoring techniques, and of great interest for monitoring and modeling volcanic processes. Strain recording remains, however, a challenge from both the instrumental and the installation points of view. We present in this study the first results of strain recordings using a new fiber-optic Bragg-grating (FBG) sensor. FBG sensors have been known for many years and are used as strain gauges in civil engineering. They are, however, limited in this case to microstrain capability. We use here a newly developed interferometer named SWIFTS whose main characteristics are i) an extremely high optical wavelength precision and ii) a small design and low power requirements allowing easy field deployment. Our FBG sensor uses a short baseline, 3 cm long Bragg grating. We show preliminary results obtained from several months of recordings in the low-noise underground laboratory at Rustrel (LSBB), south of France.

  7. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    NASA Astrophysics Data System (ADS)

    Han, G.; Lin, B.; Xu, Z.

    2017-03-01

    The electrocardiogram (ECG) signal is a nonlinear, non-stationary weak signal which reflects whether the heart is functioning normally or abnormally. The ECG signal is susceptible to various kinds of noise such as high/low frequency noise, powerline interference and baseline wander. Hence, the removal of noise from the ECG signal becomes a vital link in ECG signal processing and plays a significant role in the detection and diagnosis of heart diseases. The review describes recent developments in ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the EEMD technique. The EMD technique is a promising, though not perfect, method for processing nonlinear and non-stationary signals like the ECG signal. EMD combined with other algorithms is a good solution to improve the performance of noise cancellation. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, future work and challenges in ECG signal denoising based on the EMD technique are clarified.
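
    A sketch of EMD-based baseline wander removal is shown below: the ECG is decomposed into intrinsic mode functions (IMFs) and rebuilt without the slowest components, which carry the wander. It assumes the third-party PyEMD package and that EMD().emd() returns IMFs ordered from fastest to slowest; the number of discarded IMFs and the toy signal are arbitrary choices, not taken from the review.

        import numpy as np
        from PyEMD import EMD   # assumes the third-party PyEMD package is installed

        def remove_baseline_wander(ecg, n_slow_imfs=2):
            """Decompose the ECG into IMFs and rebuild it without the slowest
            components, which carry the baseline wander."""
            imfs = EMD().emd(ecg)                          # rows: IMFs, fast to slow
            keep = imfs[:max(1, len(imfs) - n_slow_imfs)]
            return keep.sum(axis=0)

        # Toy signal: spiky "heartbeats" riding on a slow 0.2 Hz drift.
        fs = 250
        t = np.arange(0.0, 10.0, 1.0 / fs)
        ecg = np.sin(2 * np.pi * 0.2 * t) + 0.8 * np.sin(2 * np.pi * 1.2 * t) ** 20
        clean = remove_baseline_wander(ecg)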

  8. The ALMA Science Pipeline: Current Status

    NASA Astrophysics Data System (ADS)

    Humphreys, Elizabeth; Miura, Rie; Brogan, Crystal L.; Hibbard, John; Hunter, Todd R.; Indebetouw, Remy

    2016-09-01

    The ALMA Science Pipeline is being developed for the automated calibration and imaging of ALMA interferometric and single-dish data. The calibration Pipeline for interferometric data was accepted for use by ALMA Science Operations in 2014, and for single-dish data end-to-end processing in 2015. However, work is ongoing to expand the use cases for which the Pipeline can be used, e.g. for higher frequency and lower signal-to-noise datasets, and for new observing modes. A current focus includes the commissioning of science target imaging for interferometric data. For the Single Dish Pipeline, the line-finding algorithm used in baseline subtraction and the baseline flagging heuristics have been greatly improved since the prototype used for data from the previous cycle. These algorithms, unique to the Pipeline, produce better results than standard manual processing in many cases. In this poster, we report on the current status of the Pipeline capabilities, present initial results from the Imaging Pipeline, and describe the smart line-finding and flagging algorithm used in the Single Dish Pipeline. The Pipeline is released as part of CASA (the Common Astronomy Software Applications package).

  9. Integration of safety engineering into a cost optimized development program.

    NASA Technical Reports Server (NTRS)

    Ball, L. W.

    1972-01-01

    A six-segment management model is presented, each segment of which represents a major area in a new product development program. The first segment of the model covers integration of specialist engineers into 'systems requirement definition' or the system engineering documentation process. The second covers preparation of five basic types of 'development program plans.' The third segment covers integration of system requirements, scheduling, and funding of specialist engineering activities into 'work breakdown structures,' 'cost accounts,' and 'work packages.' The fourth covers 'requirement communication' by line organizations. The fifth covers 'performance measurement' based on work package data. The sixth covers 'baseline requirements achievement tracking.'

  10. Daily Goals Formulation and Enhanced Visualization of Mechanical Ventilation Variance Improves Mechanical Ventilation Score.

    PubMed

    Walsh, Brian K; Smallwood, Craig; Rettig, Jordan; Kacmarek, Robert M; Thompson, John; Arnold, John H

    2017-03-01

    The systematic implementation of evidence-based practice through the use of guidelines, checklists, and protocols mitigates the risks associated with mechanical ventilation, yet variation in practice remains prevalent. Recent advances in software and hardware have allowed for the development and deployment of an enhanced visualization tool that identifies mechanical ventilation goal variance. Our aim was to assess the utility of daily goal establishment and a computer-aided visualization of variance. This study was composed of 3 phases: a retrospective observational phase (baseline) followed by 2 prospective sequential interventions. The phase I intervention comprised daily establishment of mechanical ventilation goals. The phase II intervention was the setting and monitoring of daily goals of mechanical ventilation with a web-based data visualization system (T3). A single mechanical ventilation score was developed to evaluate the outcome. The baseline phase evaluated 130 subjects, phase I enrolled 31 subjects, and phase II enrolled 36 subjects. There were no differences in demographic characteristics between cohorts. A total of 171 verbalizations of goals of mechanical ventilation were completed in phase I. The use of T3 increased by 87% from phase I. The mechanical ventilation score improved by 8.4% in phase I and 11.3% in phase II from baseline (P = .032). The largest effect was in the low-risk tidal volume (VT) category, with a 40.3% improvement from baseline in phase I, which was maintained at a 39% improvement from baseline in phase II (P = .01). The mechanical ventilation score was 9% higher on average in those who survived. Daily goal formation and computer-enhanced visualization of mechanical ventilation variance were associated with an improvement in goal attainment, as evidenced by an improved mechanical ventilation score. Further research is needed to determine whether improvements in the mechanical ventilation score through a targeted, process-oriented intervention will lead to improved patient outcomes. (ClinicalTrials.gov registration NCT02184208.) Copyright © 2017 by Daedalus Enterprises.

  11. Technoeconomic Assessment of an Advanced Aqueous Ammonia-Based Postcombustion Capture Process Integrated with a 650-MW Coal-Fired Power Station.

    PubMed

    Li, Kangkang; Yu, Hai; Yan, Shuiping; Feron, Paul; Wardhaugh, Leigh; Tade, Moses

    2016-10-04

    Using a rigorous, rate-based model and a validated economic model, we investigated the technoeconomic performance of an aqueous NH3-based CO2 capture process integrated with a 650-MW coal-fired power station. First, the baseline NH3 process was explored with a process design of simultaneous capture of CO2 and SO2 to replace the conventional FGD unit. This reduced the capital investment of the power station by US$425/kW (a 13.1% reduction). Integration of this NH3 baseline process with the power station gives it a CO2-avoided cost advantage over the MEA process (US$67.3/tonne vs US$86.4/tonne). We then investigated process modifications of a two-stage absorption, rich-split configuration and interheating stripping to further advance the NH3 process. The modified process reduced energy consumption by 31.7 MW/h (a 20.2% reduction) and capital costs by US$55.4 million (a 6.7% reduction). As a result, the CO2-avoided cost fell to $53.2/tonne: a savings of $14.1 and $21.9/tonne CO2 compared with the NH3 baseline and advanced MEA processes, respectively. The analysis of energy breakdown and cost distribution indicates that the technoeconomic performance of the NH3 process still has great potential to be improved.

  12. Aircraft conceptual design - an adaptable parametric sizing methodology

    NASA Astrophysics Data System (ADS)

    Coleman, Gary John, Jr.

    Aerospace is a maturing industry with successful and refined baselines which work well for traditional baseline missions, markets and technologies. However, when new markets (space tourism), new constraints (environmental) or new technologies (composites, natural laminar flow) emerge, the conventional solution is not necessarily best for the new situation, which raises the question: how does a design team quickly screen and compare novel solutions to conventional solutions for new aerospace challenges? The answer is rapid and flexible conceptual design through parametric sizing. In the product design life-cycle, parametric sizing is the first step in screening the total vehicle in terms of mission, configuration and technology to quickly assess first-order design and mission sensitivities. During this phase, various missions and technologies are assessed, and the designer identifies design solutions of concepts and configurations to meet combinations of mission and technology. This research undertaking contributes to the state of the art in aircraft parametric sizing through (1) development of a dedicated conceptual design process and disciplinary methods library, (2) development of a novel and robust parametric sizing process based on 'best-practice' approaches found in the process and disciplinary methods library, and (3) application of the parametric sizing process to a variety of design missions (transonic, supersonic and hypersonic transports), different configurations (tail-aft, blended wing body, strut-braced wing, hypersonic blended bodies, etc.), and different technologies (composites, natural laminar flow, thrust-vectored control, etc.), in order to demonstrate the robustness of the methodology and unearth first-order design sensitivities to current and future aerospace design problems. This research undertaking demonstrates the importance of this early design step in selecting the correct combination of mission, technologies and configuration to meet current aerospace challenges. The overarching goal is to avoid the recurring situation of optimizing an already ill-fated solution.

  13. Embedded algorithms within an FPGA-based system to process nonlinear time series data

    NASA Astrophysics Data System (ADS)

    Jones, Jonathan D.; Pei, Jin-Song; Tull, Monte P.

    2008-03-01

    This paper presents some preliminary results of an ongoing project in which a pattern classification algorithm is being developed and embedded into a Field-Programmable Gate Array (FPGA) and microprocessor-based data processing core. The goal is to enable and optimize the functionality of onboard data processing of nonlinear, nonstationary data for smart wireless sensing in structural health monitoring. Compared with traditional microprocessor-based systems, fast-growing FPGA technology offers a more powerful, efficient, and flexible hardware platform, including on-site (field-programmable) reconfiguration capability of hardware. An existing nonlinear identification algorithm is used as the baseline in this study. The implementation within a hardware-based system is presented in this paper, detailing the design requirements, validation, tradeoffs, optimization, and challenges in embedding this algorithm. An off-the-shelf high-level abstraction tool along with the Matlab/Simulink environment is utilized to program the FPGA, rather than coding the hardware description language (HDL) manually. The implementation is validated by comparing the simulation results with those from Matlab. In particular, the Hilbert Transform is embedded into the FPGA hardware and applied to the baseline algorithm as the centerpiece in processing nonlinear time histories and extracting instantaneous features of nonstationary dynamic data. The selection of proper numerical methods for the hardware execution of the selected identification algorithm and consideration of the fixed-point representation are elaborated. Other challenges include the issues of the timing in the hardware execution cycle of the design, resource consumption, approximation accuracy, and user flexibility of input data types limited by the simplicity of this preliminary design. Future work includes making an FPGA and microprocessor operate together to embed a further developed algorithm that yields better computational and power efficiency.
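
    For reference, the sketch below illustrates the Hilbert-transform feature extraction mentioned above (instantaneous amplitude and frequency of a nonstationary signal). It is a minimal floating-point illustration in Python/SciPy, not the fixed-point FPGA implementation described in the paper; the chirp test signal and sampling rate are assumptions.

        # A minimal sketch; illustration only, not the FPGA implementation.
        import numpy as np
        from scipy.signal import hilbert

        def instantaneous_features(x, fs):
            """Instantaneous amplitude and frequency of signal x sampled at fs (Hz)."""
            analytic = hilbert(x)                             # analytic signal x + j*H{x}
            amplitude = np.abs(analytic)                      # envelope
            phase = np.unwrap(np.angle(analytic))             # unwrapped phase
            frequency = np.diff(phase) * fs / (2.0 * np.pi)   # instantaneous frequency
            return amplitude, frequency

        # Example: a chirp whose instantaneous frequency rises linearly from 5 Hz
        fs = 1000.0
        t = np.arange(0.0, 2.0, 1.0 / fs)
        x = np.sin(2.0 * np.pi * (5.0 * t + 7.5 * t**2))
        amp, freq = instantaneous_features(x, fs)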

  14. Systems Engineering Provides Successful High Temperature Steam Electrolysis Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles V. Park; Emmanuel Ohene Opare, Jr.

    2011-06-01

    This paper describes two Systems Engineering Studies completed at the Idaho National Laboratory (INL) to support development of the High Temperature Steam Electrolysis (HTSE) process. HTSE produces hydrogen from water using nuclear power and was selected by the Department of Energy (DOE) for integration with the Next Generation Nuclear Plant (NGNP). The first study was a reliability, availability and maintainability (RAM) analysis to identify critical areas for technology development based on available information regarding expected component performance. An HTSE process baseline flowsheet at commercial scale was used as a basis. The NGNP project also established a process and capability to perform future RAM analyses. The analysis identified which components had the greatest impact on HTSE process availability and indicated that the HTSE process could achieve over 90% availability. The second study developed a series of life-cycle cost estimates for the various scale-ups required to demonstrate the HTSE process. Both studies were useful in identifying near- and long-term efforts necessary for successful HTSE process deployment. The size of demonstrations to support scale-up was refined, which is essential to estimate near- and long-term cost and schedule. The life-cycle funding profile, with high-level allocations, was identified as the program transitions from experiment-scale R&D to engineering-scale demonstration.

  15. Work-related heat stress concerns in automotive industries: a case study from Chennai, India

    PubMed Central

    Ayyappan, Ramalingam; Sankar, Sambandam; Rajkumar, Paramasivan; Balakrishnan, Kalpana

    2009-01-01

    Background Work-related heat stress assessments, the quantification of thermal loads and their physiological consequences have mostly been performed in non-tropical developed country settings. In many developing countries (many of which are also tropical), limited attempts have been made to create detailed job-exposure profiles for various sectors. We present here a case study from Chennai in southern India that illustrates the prevalence of work-related heat stress in multiple processes of automotive industries and the efficacy of relatively simple controls in reducing prevalence of the risk through longitudinal assessments. Methods We conducted workplace heat stress assessments in automotive and automotive parts manufacturing units according to the protocols recommended by NIOSH, USA. Sites for measurements included indoor locations with process-generated heat exposure, indoor locations without direct process-generated heat exposure and outdoor locations. Nearly 400 measurements of heat stress were made over a four-year period at more than 100 locations within eight units involved with automotive or automotive parts manufacturing in greater Chennai metropolitan area. In addition, cross-sectional measurements were made in select processes of glass manufacturing and textiles to estimate relative prevalence of heat stress. Results Results indicate that many processes even in organised large-scale industries have yet to control heat stress-related hazards adequately. Upwards of 28% of workers employed in multiple processes were at risk of heat stress-related health impairment in the sectors assessed. Implications of longitudinal baseline data for assessing efficacy of interventions as well as modelling potential future impacts from climate change (through contributions from worker health and productivity impairments consequent to increases in ambient temperature) are described. Conclusions The study re-emphasises the need for recognising heat stress as an important occupational health risk in both formal and informal sectors in India. Making available good baseline data is critical for estimating future impacts. PMID:20052426

  16. Process development of a New Haemophilus influenzae type b conjugate vaccine and the use of mathematical modeling to identify process optimization possibilities.

    PubMed

    Hamidi, Ahd; Kreeftenberg, Hans; V D Pol, Leo; Ghimire, Saroj; V D Wielen, Luuk A M; Ottens, Marcel

    2016-05-01

    Vaccination is one of the most successful public health interventions and a cost-effective tool in preventing deaths among young children. The earliest vaccines were developed following empirical methods, creating vaccines by trial and error. New process development tools, for example mathematical modeling, as well as new regulatory initiatives requiring better understanding of both the product and the process, are being applied to well-characterized biopharmaceuticals (for example recombinant proteins). The vaccine industry is still lagging behind these industries. A production process for a new Haemophilus influenzae type b (Hib) conjugate vaccine, including related quality control (QC) tests, was developed and transferred to a number of emerging vaccine manufacturers. This contributed to a sustainable global supply of affordable Hib conjugate vaccines, as illustrated by the market launch of the first Hib vaccine based on this technology in 2007 and the concomitant price reduction of Hib vaccines. This paper describes the development approach followed for this Hib conjugate vaccine as well as the mathematical modeling tool applied recently in order to indicate options for further improvements of the initial Hib process. The strategy followed during the process development of this Hib conjugate vaccine was a targeted and integrated approach based on prior knowledge and experience with similar products, using multi-disciplinary expertise. Mathematical modeling was used to develop a predictive model for the initial Hib process (the 'baseline' model) as well as an 'optimized' model, by proposing a number of process changes which could lead to further reduction in price. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:568-580, 2016. © 2016 American Institute of Chemical Engineers.

  17. Proliferation resistance design of a plutonium cycle (Proliferation Resistance Engineering Program: PREP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sorenson, R.J.; Roberts, F.P.; Clark, R.G.

    1979-01-19

    This document describes the proliferation resistance engineering concepts developed to counter the threat of proliferation of nuclear weapons in an International Fuel Service Center (IFSC). The basic elements of an International Fuel Service Center are described. Possible methods for resisting proliferation such as processing alternatives, close-coupling of facilities, process equipment layout, maintenance philosophy, process control, and process monitoring are discussed. Political and institutional issues in providing proliferation resistance for an International Fuel Service Center are analyzed. The conclusions drawn are (1) use-denial can provide time for international response in the event of a host nation takeover. Passive use-denial is more acceptable than active use-denial, and acceptability of active-denial concepts is highly dependent on sovereignty, energy dependence and economic considerations; (2) multinational presence can enhance proliferation resistance; and (3) use-denial must be nonprejudicial with balanced interests for governments and/or private corporations being served. Comparisons between an IFSC as a national facility, an IFSC with minimum multinational effect, and an IFSC with maximum multinational effect show incremental design costs to be less than 2% of total cost of the baseline non-PRE concept facility. The total equipment acquisition cost increment is estimated to be less than 2% of total baseline facility costs. Personnel costs are estimated to increase by less than 10% due to maximum international presence. 46 figures, 9 tables.

  18. The effect of stochastic modeling of ionospheric effect on the various lengths of baseline determination

    NASA Astrophysics Data System (ADS)

    Kwon, J.; Yang, H.

    2006-12-01

    Although GPS provides continuous and accurate position information, there is still room for improvement in its positional accuracy, especially in medium- and long-range baseline determination. In general, for baseline lengths of more than 50 km, the effect of ionospheric delay is the one causing the largest degradation in positional accuracy. For example, the ionospheric delay in double-differenced mode easily reaches 10 cm with a baseline length of 101 km. Therefore, many researchers have tried to mitigate or reduce the effect using various modeling methods. In this paper, the optimal stochastic modeling of the ionospheric delay in terms of baseline length is presented. The data processing has been performed by constructing a Kalman filter with states of positions, ambiguities, and the ionospheric delays in the double-differenced mode. Considering the long baseline length, both double-differenced GPS phase and code observations are used as observables, and LAMBDA has been applied to fix the ambiguities. Here, the ionospheric delay is stochastically modeled by the well-known Gaussian, 1st-order, and 3rd-order Gauss-Markov processes. The parameters required in those models, such as correlation distance and time, are determined by least-squares adjustment using ionosphere-only observables. The results and analysis from this study show the effect of stochastic models of the ionospheric delay in terms of the baseline length, models, and parameters used. In the above example with a 101 km baseline length, it was found that the positional accuracy with appropriate ionospheric modeling (Gaussian) was about ±2 cm, whereas it reaches about ±15 cm with no stochastic modeling. It is expected that the approach in this study contributes to improved positional accuracy, especially in medium- and long-range baseline determination.
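
    To illustrate the stochastic modeling described above, the sketch below propagates a scalar first-order Gauss-Markov state (e.g., a double-differenced ionospheric delay) through a Kalman-filter time update. It is a minimal sketch, not the authors' processing software; the correlation time and steady-state sigma are assumed values.

        # A minimal sketch; tau and sigma are assumptions, not values from the paper.
        import numpy as np

        def gm1_time_update(x, P, dt, tau, sigma):
            """Propagate a scalar Gauss-Markov state x with variance P over dt seconds."""
            phi = np.exp(-dt / tau)                          # state transition
            q = sigma**2 * (1.0 - np.exp(-2.0 * dt / tau))   # discrete process-noise variance
            return phi * x, phi * P * phi + q

        # Example: 30 s epochs, 600 s correlation time, 0.10 m steady-state sigma
        x, P = 0.05, 0.02**2
        x, P = gm1_time_update(x, P, dt=30.0, tau=600.0, sigma=0.10)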

  19. Evolution of the Power Processing Units Architecture for Electric Propulsion at CRISA

    NASA Astrophysics Data System (ADS)

    Palencia, J.; de la Cruz, F.; Wallace, N.

    2008-09-01

    Since 2002, the team formed by EADS Astrium CRISA, Astrium GmbH Friedrichshafen, and QinetiQ has participated in several flight programs where Electric Propulsion based on Kaufman-type Ion Thrusters is the baseline concept. In 2002, CRISA won the contract for the development of the Ion Propulsion Control Unit (IPCU) for GOCE. This unit, together with the T5 thruster by QinetiQ, provides near-perfect atmospheric drag compensation, offering thrust levels in the range of 1 to 20 mN. By the end of 2003, CRISA started the adaptation of the IPCU concept to the QinetiQ T6 Ion Thruster for the Alphabus program. This paper shows how the Power Processing Unit design evolved over time, including the current developments.

  20. Increased Visceral Adipose Tissue Is an Independent Predictor for Future Development of Atherogenic Dyslipidemia.

    PubMed

    Hwang, You-Cheol; Fujimoto, Wilfred Y; Hayashi, Tomoshige; Kahn, Steven E; Leonetti, Donna L; Boyko, Edward J

    2016-02-01

    Atherogenic dyslipidemia is frequently observed in persons with a greater amount of visceral adipose tissue (VAT). However, it is still uncertain whether VAT is independently associated with the future development of atherogenic dyslipidemia. The aim of this study was to determine whether baseline and changes in VAT and subcutaneous adipose tissue (SAT) are associated with future development of atherogenic dyslipidemia independent of baseline lipid levels and standard anthropometric indices. Community-based prospective cohort study with 5 years of follow-up. A total of 452 Japanese Americans (240 men, 212 women), aged 34-75 years were assessed at baseline and after 5 years of follow-up. Abdominal fat areas were measured by computed tomography. Atherogenic dyslipidemia was defined as one or more abnormalities in high-density lipoprotein (HDL) cholesterol, triglycerides, or non-HDL cholesterol levels. Baseline VAT and change in VAT over 5 years were independently associated with log-transformed HDL cholesterol, log-transformed triglyceride, and non-HDL cholesterol after 5 years (standardized β = -0.126, 0.277, and 0.066 for baseline VAT, respectively, and -0.095, 0.223, and 0.090 for change in VAT, respectively). However, baseline and change in SAT were not associated with any future atherogenic lipid level. In multivariate logistic regression analysis, incremental change in VAT (odds ratio [95% confidence interval], 1.73 [1.20-2.48]; P = .003), triglycerides (4.01 [1.72-9.33]; P = .001), HDL cholesterol (0.32 [0.18-0.58]; P < .001), and non-HDL cholesterol (7.58 [4.43-12.95]; P < .001) were significantly associated with the future development of atherogenic dyslipidemia independent of age, sex, diastolic blood pressure, homeostasis model assessment insulin resistance, body mass index (BMI), change in BMI, SAT, and baseline atherogenic lipid levels. Baseline and change in VAT were independent predictors for future development of atherogenic dyslipidemia. However, BMI, waist circumference, and SAT were not associated with future development of atherogenic dyslipidemia.
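
    As a rough illustration of the final multivariate step described above, the sketch below fits a logistic regression for 5-year incident atherogenic dyslipidemia against change in VAT and covariates. The data frame and column names are hypothetical; this is not the study's analysis code.

        # A minimal sketch with hypothetical column names; illustration only.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # df (assumed) has one row per participant with columns: dyslipidemia_5y (0/1),
        # delta_vat, age, sex, dbp, homa_ir, bmi, delta_bmi, sat, baseline_nonhdl
        def fit_dyslipidemia_model(df: pd.DataFrame):
            model = smf.logit(
                "dyslipidemia_5y ~ delta_vat + age + sex + dbp + homa_ir"
                " + bmi + delta_bmi + sat + baseline_nonhdl",
                data=df,
            ).fit()
            return np.exp(model.params), np.exp(model.conf_int())  # odds ratios and 95% CIs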

  1. Space Station needs, attributes and architectural options. Volume 2, book 1, part 1: Mission requirements

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The baseline mission model used to develop the space station mission-related requirements is described, as well as the 90 civil missions that were evaluated (including the 62 missions that formed the baseline model). Mission-related requirements for the space station baseline are defined and related to space station architectural development. Mission-related sensitivity analyses are discussed.

  2. Cost model relationships between textile manufacturing processes and design details for transport fuselage elements

    NASA Technical Reports Server (NTRS)

    Metschan, Stephen L.; Wilden, Kurtis S.; Sharpless, Garrett C.; Andelman, Rich M.

    1993-01-01

    Textile manufacturing processes offer potential cost and weight advantages over traditional composite materials and processes for transport fuselage elements. In the current study, design cost modeling relationships between textile processes and element design details were developed. Such relationships are expected to help future aircraft designers to make timely decisions on the effect of design details and overall configurations on textile fabrication costs. The fundamental advantage of a design cost model is to insure that the element design is cost effective for the intended process. Trade studies on the effects of processing parameters also help to optimize the manufacturing steps for a particular structural element. Two methods of analyzing design detail/process cost relationships developed for the design cost model were pursued in the current study. The first makes use of existing databases and alternative cost modeling methods (e.g. detailed estimating). The second compares design cost model predictions with data collected during the fabrication of seven foot circumferential frames for ATCAS crown test panels. The process used in this case involves 2D dry braiding and resin transfer molding of curved 'J' cross section frame members having design details characteristic of the baseline ATCAS crown design.

  3. Advanced automation for in-space vehicle processing

    NASA Technical Reports Server (NTRS)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower requirements for the required processing tasks far exceed the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and for automated machine processing.

  4. Processing Science of Epoxy Resin Composites

    DTIC Science & Technology

    1984-01-15

    Record text consists of table-of-contents and figure-list fragments: 2.2 Laminate Fabrication; 2.2.1 Baseline Laminate Fabrication; 2.2.2 Large Laminate Fabrication; 2.3 Diffusivity and Solubility; ... Thick Laminate; Baseline Cure Cycle With Specimen Advancement Levels; Composite Panel Fabrication; together with an excerpt noting that the "...first change was the elimination of the different resin formulations and concentration on the normal or baseline 5208/T300 prepreg as produced by..."

  5. Increase in relative skeletal muscle mass over time and its inverse association with metabolic syndrome development: a 7-year retrospective cohort study.

    PubMed

    Kim, Gyuri; Lee, Seung-Eun; Jun, Ji Eun; Lee, You-Bin; Ahn, Jiyeon; Bae, Ji Cheol; Jin, Sang-Man; Hur, Kyu Yeon; Jee, Jae Hwan; Lee, Moon-Kyu; Kim, Jae Hyeon

    2018-02-05

    Skeletal muscle mass was negatively associated with metabolic syndrome prevalence in previous cross-sectional studies. The aim of this study was to investigate the impact of baseline skeletal muscle mass and changes in skeletal muscle mass over time on the development of metabolic syndrome in a large population-based 7-year cohort study. A total of 14,830 and 11,639 individuals who underwent health examinations at the Health Promotion Center at Samsung Medical Center, Seoul, Korea were included in the analyses of baseline skeletal muscle mass and of its change from baseline over 1 year, respectively. Skeletal muscle mass was estimated by bioelectrical impedance analysis and was presented as a skeletal muscle mass index (SMI), a body weight-adjusted appendicular skeletal muscle mass value. Using Cox regression models, hazard ratios for developing metabolic syndrome associated with SMI values at baseline or with changes in SMI over a year were analyzed. During 7 years of follow-up, 20.1% of subjects developed metabolic syndrome. Compared to the lowest sex-specific SMI tertile at baseline, the highest sex-specific SMI tertile showed a significant inverse association with metabolic syndrome risk (adjusted hazard ratio [AHR] = 0.61, 95% confidence interval [CI] 0.54-0.68). Furthermore, compared with SMI changes < 0% over a year, multivariate-AHRs for metabolic syndrome development were 0.87 (95% CI 0.78-0.97) for 0-1% changes and 0.67 (0.56-0.79) for > 1% changes in SMI over 1 year after additionally adjusting for baseline SMI and glycometabolic parameters. An increase in relative skeletal muscle mass over time has a potential preventive effect on developing metabolic syndrome, independently of baseline skeletal muscle mass and glycometabolic parameters.
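
    A minimal sketch of the Cox modeling step described above is given below, using the lifelines package; the data frame and column names are hypothetical, and this is not the authors' analysis code.

        # A minimal sketch with hypothetical column names; illustration only.
        import pandas as pd
        from lifelines import CoxPHFitter

        # df (assumed): years_to_event, mets_event (0/1), smi_tertile_mid, smi_tertile_high, age, sex
        def fit_mets_cox(df: pd.DataFrame) -> CoxPHFitter:
            cols = ["years_to_event", "mets_event",
                    "smi_tertile_mid", "smi_tertile_high", "age", "sex"]
            cph = CoxPHFitter()
            cph.fit(df[cols], duration_col="years_to_event", event_col="mets_event")
            return cph  # cph.hazard_ratios_ gives the adjusted hazard ratios per covariate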

  6. Baseline estimation in flame's spectra by using neural networks and robust statistics

    NASA Astrophysics Data System (ADS)

    Garces, Hugo; Arias, Luis; Rojas, Alejandro

    2014-09-01

    This work presents a baseline estimation method for flame spectra based on an artificial intelligence structure, a neural network, combining robust statistics with multivariate analysis to automatically discriminate the measured wavelengths belonging to the continuous feature for model adaptation, overcoming the restriction of having to measure the target baseline for training. The main contributions of this paper are: to analyze a flame spectra database, computing Jolliffe statistics from Principal Components Analysis to detect wavelengths not correlated with most of the measured data and therefore corresponding to baseline; to systematically determine the optimal number of neurons in hidden layers based on Akaike's Final Prediction Error; to estimate the baseline over the full wavelength range of the sampled measured spectra; and to train an artificial intelligence structure, a neural network, which generalizes the relation between measured and baseline spectra. The main application of our research is to compute total radiation with baseline information, allowing the combustion process state to be diagnosed for optimization in early stages.
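
    A much-simplified sketch of the PCA-based discrimination idea is shown below: wavelengths weakly represented by the leading principal components are treated as continuum (baseline) anchors and interpolated. This is an illustration only, not the authors' method; the array shapes and the anchor threshold are assumptions.

        # A simplified sketch; not the authors' algorithm. Assumes spectra is an
        # (n_samples, n_wavelengths) array and wavelengths is sorted ascending.
        import numpy as np
        from sklearn.decomposition import PCA

        def estimate_baseline(spectra, wavelengths, n_components=3, quantile=0.2):
            pca = PCA(n_components=n_components).fit(spectra)
            loadings = pca.components_ * np.sqrt(pca.explained_variance_)[:, None]
            communality = (loadings**2).sum(axis=0)         # variance explained per wavelength
            anchors = communality < np.quantile(communality, quantile)  # weakly correlated points
            mean_spectrum = spectra.mean(axis=0)
            return np.interp(wavelengths, wavelengths[anchors], mean_spectrum[anchors])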

  7. Development of advanced manufacturing technologies for low cost hydrogen storage vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leavitt, Mark; Lam, Patrick

    2014-12-29

    The U.S. Department of Energy (DOE) defined a need for low-cost gaseous hydrogen storage vessels at 700 bar to support cost goals aimed at 500,000 units per year. Existing filament winding processes produce a pressure vessel that is structurally inefficient, requiring more carbon fiber for manufacturing reasons than would otherwise be necessary. Carbon fiber is the greatest cost driver in building a hydrogen pressure vessel. The objective of this project is to develop new methods for manufacturing Type IV pressure vessels for hydrogen storage with the purpose of lowering the overall product cost through an innovative hybrid process of optimizing composite usage by combining traditional filament winding (FW) and advanced fiber placement (AFP) techniques. A number of vessels were manufactured in this project. The latest vessel design passed all the critical tests on the hybrid design per the European Commission (EC) 79-2009 standard except the extreme temperature cycle test. The tests passed include the burst test, cycle test, accelerated stress rupture test and drop test. It was discovered that the location where AFP and FW overlap for load transfer could be weakened during hydraulic cycling at 85°C. To design a vessel that passed these tests, the in-house modeling software was updated to add the capability to start and stop fiber layers to simulate the AFP process. The original in-house software was developed for filament winding only. Alternative fiber was also investigated in this project, but the added mass impacted the vessel cost negatively due to the lower performance of the alternative fiber. Overall the project was a success in showing that the hybrid design is a viable solution to reduce fiber usage, thus driving down the cost of fuel storage vessels. Based on DOE's baseline vessel size of 147.3 L and 91 kg, the 129 L vessel (scaled to the DOE baseline) in this project shows a 32% composite savings and a 20% cost savings when comparing the Vessel 15 hybrid design and the Quantum baseline all-filament-wound vessel. Due to project timing, there was no additional time available to fine-tune the design to improve the load transfer between AFP and FW. Further design modifications will likely help pass the extreme temperature cycle test, the remaining test that is critical to the hybrid design.

  8. [Educative strategy evaluation to improve critical reading skills on clinical research texts in second year gyneco-obstetrics residents].

    PubMed

    Carranza Lira, Sebastián; Arce Herrera, Rosa María; González González, Patricia

    2007-11-01

    There is a wide variety of educational models and strategies for achieving meaningful learning. The development of aptitude for the critical reading of clinical research papers plays an important role in keeping physicians up to date and in resident training. The objective was to evaluate the degree of development of the aptitude for reading clinical research articles in second-year residents of the gynecology and obstetrics specialty after an educational strategy. In 16 second-year gynecology and obstetrics residents, a previously validated instrument for the evaluation of critical reading of clinical research articles in general medicine was applied before and after an educational strategy. Statistical analysis was performed with the Kruskal-Wallis analysis of variance; the Wilcoxon test was also used to assess the differences between baseline and final results. The median age was 27 (24-31) years; 56.3% were women and 43.8% men. A statistically significant increase in the global score was observed after the educational strategy, but of the individual indicators only 'to interpret' showed a significant increase. When evaluating the degrees of mastery for the indicator 'to interpret', the very low level predominated at the baseline evaluation, and the very low and low levels at the final evaluation. For the indicator 'to judge', the majority were at the very low level at baseline and at the very low and low levels at the end. For the indicator 'to propose', all were at the level expected by chance at baseline, and only a minimal proportion reached the very low level at the end. These results reflect only a modest improvement in the critical reading process, which calls into question the educational strategy used, since the objective of improving critical reading capacity was not achieved.

  9. Reduced cerebellar brain activity during reward processing in adolescent binge drinkers

    PubMed Central

    Cservenka, Anita; Jones, Scott A.; Nagel, Bonnie J.

    2015-01-01

    Due to ongoing development, adolescence may be a period of heightened vulnerability to the neurotoxic effects of alcohol. Binge drinking may alter reward-driven behavior and neurocircuitry, thereby increasing risk for escalating alcohol use. Therefore, we compared reward processing in adolescents with and without a history of recent binge drinking. At their baseline study visit, all participants (age = 14.86 ± 0.88) were free of heavy alcohol use and completed a modified version of the Wheel of Fortune (WOF) functional magnetic resonance imaging task. Following this visit, 17 youth reported binge drinking on ≥3 occasions within a 90 day period and were matched to 17 youth who remained alcohol and substance-naïve. All participants repeated the WOF task during a second visit (age = 16.83 ± 1.22). No significant effects were found in a region of interest analysis of the ventral striatum, but whole-brain analyses showed significant group differences in reward response at the second study visit in the left cerebellum, controlling for baseline visit brain activity (p/α<0.05), which was negatively correlated with mean number of drinks consumed/drinking day in the last 90 days. These findings suggest that binge drinking during adolescence may alter brain activity during reward processing in a dose-dependent manner. PMID:26190276

  10. Mechanisms underlying syntactic comprehension deficits in vascular aphasia: new evidence from self-paced listening.

    PubMed

    Caplan, David; Michaud, Jennifer; Hufford, Rebecca

    2015-01-01

    Sixty-one people with aphasia (pwa) and 41 matched controls were tested for the ability to understand sentences that required the ability to process particular syntactic elements and assign particular syntactic structures. Participants paced themselves word-by-word through 20 examples of 11 spoken sentence types and indicated which of two pictures corresponded to the meaning of each sentence. Sentences were developed in pairs such that comprehension of the experimental version of a pair required an aspect of syntactic processing not required in the corresponding baseline sentence. The need for the syntactic operations required only in the experimental version was triggered at a "critical word" in the experimental sentence. Listening times for critical words in experimental sentences were compared to those for corresponding words in the corresponding baseline sentences. The results were consistent with several models of syntactic comprehension deficits in pwa: resource reduction, slowed lexical and/or syntactic processing, abnormal susceptibility to interference from thematic roles generated non-syntactically. They suggest that a previously unidentified disturbance limiting the duration of parsing and interpretation may lead to these deficits, and that this mechanism may lead to structure-specific deficits in pwa. The results thus point to more than one mechanism underlying syntactic comprehension disorders both across and within pwa.

  11. Improved silicon carbide for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Whalen, Thomas J.

    1987-01-01

    This is the second annual technical report entitled, Improved Silicon Carbide for Advanced Heat Engines, and includes work performed during the period February 16, 1986 to February 15, 1987. The program is conducted for NASA under contract NAS3-24384. The objective is the development of high strength, high reliability silicon carbide parts with complex shapes suitable for use in advanced heat engines. The fabrication methods used are to be adaptable for mass production of such parts on an economically sound basis. Injection molding is the forming method selected. This objective is to be accomplished in a two-phase program: (1) to achieve a 20 percent improvement in strength and a 100 percent increase in Weibull modulus of the baseline material; and (2) to produce a complex shaped part, a gas turbine rotor, for example, with the improved mechanical properties attained in the first phase. Eight tasks are included in the first phase covering the characterization of the properties of a baseline material, the improvement of those properties and the fabrication of complex shaped parts. Activities during the first contract year concentrated on two of these areas: fabrication and characterization of the baseline material (Task 1) and improvement of material and processes (Task 7). Activities during the second contract year included an MOR bar matrix study to improve mechanical properties (Task 2), materials and process improvements (Task 7), and a Ford-funded task to mold a turbocharger rotor with an improved material (Task 8).

  12. Not the Same Old Thing: Establishing the Unique Contribution of Drinking Identity as a Predictor of Alcohol Consumption and Problems Over Time

    PubMed Central

    Lindgren, Kristen P.; Ramirez, Jason J.; Olin, Cecilia C.; Neighbors, Clayton

    2016-01-01

    Drinking identity – how much individuals view themselves as drinkers – is a promising cognitive factor that predicts problem drinking. Implicit and explicit measures of drinking identity have been developed (the former assesses more reflexive/automatic cognitive processes; the latter more reflective/controlled cognitive processes): each predicts unique variance in alcohol consumption and problems. However, implicit and explicit identity's utility and uniqueness as predictors relative to cognitive factors important for problem drinking screening and intervention has not been evaluated. Thus, the current study evaluated implicit and explicit drinking identity as predictors of consumption and problems over time. Baseline measures of drinking identity, social norms, alcohol expectancies, and drinking motives were evaluated as predictors of consumption and problems (evaluated every three months over two academic years) in a sample of 506 students (57% female) in their first or second year of college. Results found that baseline identity measures predicted unique variance in consumption and problems over time. Further, when compared to each set of cognitive factors, the identity measures predicted unique variance in consumption and problems over time. Findings were more robust for explicit versus implicit identity and in models that did not control for baseline drinking. Drinking identity appears to be a unique predictor of problem drinking relative to social norms, alcohol expectancies, and drinking motives. Intervention and theory could benefit from including and considering drinking identity. PMID:27428756

  13. Not the same old thing: Establishing the unique contribution of drinking identity as a predictor of alcohol consumption and problems over time.

    PubMed

    Lindgren, Kristen P; Ramirez, Jason J; Olin, Cecilia C; Neighbors, Clayton

    2016-09-01

    Drinking identity-how much individuals view themselves as drinkers-is a promising cognitive factor that predicts problem drinking. Implicit and explicit measures of drinking identity have been developed (the former assesses more reflexive/automatic cognitive processes; the latter more reflective/controlled cognitive processes): each predicts unique variance in alcohol consumption and problems. However, implicit and explicit identity's utility and uniqueness as predictors relative to cognitive factors important for problem drinking screening and intervention has not been evaluated. Thus, the current study evaluated implicit and explicit drinking identity as predictors of consumption and problems over time. Baseline measures of drinking identity, social norms, alcohol expectancies, and drinking motives were evaluated as predictors of consumption and problems (evaluated every 3 months over 2 academic years) in a sample of 506 students (57% female) in their first or second year of college. Results found that baseline identity measures predicted unique variance in consumption and problems over time. Further, when compared to each set of cognitive factors, the identity measures predicted unique variance in consumption and problems over time. Findings were more robust for explicit versus implicit identity and in models that did not control for baseline drinking. Drinking identity appears to be a unique predictor of problem drinking relative to social norms, alcohol expectancies, and drinking motives. Intervention and theory could benefit from including and considering drinking identity. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  14. Measuring precise sea level from a buoy using the global positioning system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rocken, C.; Kelecy, T.M.; Born, G.H.

    1990-11-01

    High-accuracy sea surface positioning is required for sea floor geodesy, satellite altimeter verification, and the study of sea level. An experiment to study the feasibility of using the Global Positioning System (GPS) for accurate sea surface positioning was conducted. A GPS-equipped buoy (floater) was deployed off the Scripps pier at La Jolla, California during December 13-15, 1989. Two reference GPS receivers were placed on land, one within approximately 100 m of the floater, and the other about 80 km inland at the laser ranging site on Monument Peak. The position of the floater was determined relative to the land-fixed receivers using: (a) kinematic GPS processing software developed at the National Geodetic Survey (NGS), and (b) the Jet Propulsion Laboratory's GIPSY (GPS Inferred Positioning SYstem) software. Sea level and ocean wave spectra were calculated from the GPS measurements. These results were compared to measurements made with a NOAA tide gauge and a Paros™ pressure transducer (PPT). GPS sea level for the short 100-m baseline agrees with the PPT sea level at the 1-cm level and has an rms variation of 5 mm over a period of 4 hours. Agreement between results with the two independent GPS analyses is on the order of a few millimeters. Processing of the longer Monument Peak - floater baseline is in progress and will require orbit adjustments and tropospheric modeling to obtain results comparable to the short baseline.

  15. Delirium as a Predictor of Physical and Cognitive Function in Individuals Aged 80 and Older After Transcatheter Aortic Valve Implantation or Surgical Aortic Valve Replacement.

    PubMed

    Eide, Leslie S P; Ranhoff, Anette H; Fridlund, Bengt; Haaverstad, Rune; Hufthammer, Karl Ove; Kuiper, Karel K J; Nordrehaug, Jan E; Norekvål, Tone M

    2016-06-01

    To determine how development of delirium after surgical aortic valve replacement (SAVR) or transcatheter aortic valve implantation (TAVI) could predict activity of daily living (ADL) and instrumental ADLs (IADL) disability, cognitive function, and self-reported health in individuals aged 80 and older. Prospective cohort study. Tertiary university hospital. Individuals aged 80 and older undergoing elective SAVR or TAVI (N = 136). Delirium was assessed for 5 days using the Confusion Assessment Method. The Barthel Index, Nottingham Extended ADL Scale, and SF-12 were used to determine ADL and IADL ability and self-reported health at baseline and 1- and 6-month follow-up. Cognition was assessed using the Mini-Mental State Examination at baseline and 6-month follow-up. Participants had lower IADL scores 1 month after SAVR than at baseline (baseline 58, 1 month: delirium 42, no delirium 50, P ≤ .02), but scores had returned to baseline levels at 6 months. The Medical Outcomes Study 12-item Short-Form Health Survey (SF-12) Physical Component Summary (PCS) score was higher at 6-month follow-up (48) than at baseline (39), especially in participants who did not develop delirium (P < .001). No differences in other outcomes were found. Regression models suggest that delirium may help predict IADL disability 1 month after baseline (P ≤ .07) but does not predict large differences in ADL disability, cognitive function, or SF-12-scores. Individuals who underwent TAVI and developed delirium had lower ADL (baseline 19, 1-month 16, P < .001) and IADL (baseline 49, 1-month 40, P = .003) scores at 1-month follow-up. SF-12 PCS score (baseline 30) increased from baseline to 1- (35, P = .04) and 6- (35, P = .02) month follow-up in individuals who underwent TAVI and did not develop delirium. Delirium after TAVI predicted greater ADL and IADL disability at 1-month but not at 6-month follow-up. Individuals who develop delirium after SAVR and TAVI have poorer short-term IADL function but do not seem to have long-term reductions in physical, mental, or self-reported health. © 2016 The Authors. The Journal of the American Geriatrics Society published by Wiley Periodicals, Inc. on behalf of The American Geriatrics Society.

  16. 48 CFR 34.202 - Integrated Baseline Reviews.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... inherent risks in offerors'/contractors' performance plans and the underlying management control systems...) The degree to which the management process provides effective and integrated technical/schedule/cost... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Integrated Baseline...

  17. 48 CFR 1034.202 - Integrated Baseline Reviews.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... which the management process provides effective and integrated technical/schedule/cost planning and... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Integrated Baseline... SPECIAL CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 1034.202...

  18. Associations between social vulnerabilities and dietary patterns in European children: the Identification and prevention of Dietary- and lifestyle-induced health EFfects In Children and infantS (IDEFICS) study.

    PubMed

    Iguacel, Isabel; Fernández-Alvira, Juan M; Bammann, Karin; De Clercq, Bart; Eiben, Gabriele; Gwozdz, Wencke; Molnar, Dénes; Pala, Valeria; Papoutsou, Stalo; Russo, Paola; Veidebaum, Toomas; Wolters, Maike; Börnhorst, Claudia; Moreno, Luis A

    2016-10-01

    Socio-economic inequalities in childhood can determine dietary patterns, and therefore future health. This study aimed to explore associations between social vulnerabilities and dietary patterns assessed at two time points, and to investigate the association between accumulation of vulnerabilities and dietary patterns. A total of 9301 children aged 2-9 years participated at baseline and 2-year follow-up examinations of the Identification and prevention of Dietary- and lifestyle-induced health EFfects In Children and infantS study. In all, three dietary patterns were identified at baseline and follow-up by applying the K-means clustering algorithm based on a higher frequency of consumption of snacks and fast food (processed), sweet foods and drinks (sweet), and fruits and vegetables (healthy). Vulnerable groups were defined at baseline as follows: children whose parents lacked a social network, children from single-parent families, children of migrant origin and children with unemployed parents. Multinomial mixed models were used to assess the associations between social vulnerabilities and children's dietary patterns at baseline and follow-up. Children whose parents lacked a social network (OR 1·31; 99 % CI 1·01, 1·70) and migrants (OR 1·45; 99 % CI 1·15, 1·83) were more likely to be in the processed cluster at baseline and follow-up. Children whose parents were homemakers (OR 0·74; 99 % CI 0·60, 0·92) were less likely to be in the processed cluster at baseline. A higher number of vulnerabilities was associated with a higher probability of children being in the processed cluster (OR 1·78; 99 % CI 1·21, 2·62). Therefore, special attention should be paid to children of vulnerable groups as they present unhealthier dietary patterns.
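
    For illustration, a minimal sketch of the K-means clustering step described above is given below; the input matrix of food-group consumption frequencies and the variable names are hypothetical, and this is not the IDEFICS analysis code.

        # A minimal sketch; freq is an assumed (n_children, n_food_groups) frequency matrix.
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        def dietary_clusters(freq, n_clusters=3, seed=0):
            z = StandardScaler().fit_transform(freq)      # put food groups on a common scale
            km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(z)
            return km.labels_, km.cluster_centers_        # cluster label per child, pattern centroids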

  19. An Automated Baseline Correction Method Based on Iterative Morphological Operations.

    PubMed

    Chen, Yunliang; Dai, Liankui

    2018-05-01

    Raman spectra usually suffer from baseline drift caused by fluorescence or other reasons. Therefore, baseline correction is a necessary and crucial step that must be performed before subsequent processing and analysis of Raman spectra. An automated baseline correction method based on iterative morphological operations is proposed in this work. The method can adaptively determine the structuring element first and then gradually remove the spectral peaks during iteration to get an estimated baseline. Experiments on simulated data and real-world Raman data show that the proposed method is accurate, fast, and flexible for handling different kinds of baselines in various practical situations. The comparison of the proposed method with some state-of-the-art baseline correction methods demonstrates its advantages over the existing methods in terms of accuracy, adaptability, and flexibility. Although only Raman spectra are investigated in this paper, the proposed method can hopefully be used for the baseline correction of other analytical instrumental signals, such as IR spectra and chromatograms.
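
    The sketch below illustrates the general idea of morphological baseline estimation: a 1-D spectrum is opened with a growing structuring element until the estimate stabilizes. It is a simplified illustration, not the adaptive algorithm proposed in the paper; the window limits and tolerance are assumptions.

        # A simplified sketch; not the paper's adaptive algorithm.
        import numpy as np
        from scipy.ndimage import grey_opening

        def morphological_baseline(spectrum, max_window=201, tol=1e-3):
            """Estimate a slowly varying baseline under positive peaks of a 1-D spectrum."""
            y = np.asarray(spectrum, dtype=float)
            prev = y.copy()
            for window in range(3, max_window, 2):
                opened = grey_opening(y, size=window)     # removes peaks narrower than window
                if np.max(np.abs(opened - prev)) < tol * np.ptp(y):
                    return opened                         # estimate has stopped changing
                prev = opened
            return prev

        # corrected_spectrum = spectrum - morphological_baseline(spectrum)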

  20. Demonstration of an N7 integrated fab process for metal oxide EUV photoresist

    NASA Astrophysics Data System (ADS)

    De Simone, Danilo; Mao, Ming; Kocsis, Michael; De Schepper, Peter; Lazzarino, Frederic; Vandenberghe, Geert; Stowers, Jason; Meyers, Stephen; Clark, Benjamin L.; Grenville, Andrew; Luong, Vinh; Yamashita, Fumiko; Parnell, Doni

    2016-03-01

    Inpria has developed a directly patternable metal oxide hard-mask as a robust, high-resolution photoresist for EUV lithography. In this paper we demonstrate the full integration of a baseline Inpria resist into an imec N7 BEOL block mask process module. We examine in detail both the lithography and etch patterning results. By leveraging the high differential etch resistance of metal oxide photoresists, we explore opportunities for process simplification and cost reduction. We review the imaging results from the imec N7 block mask patterns and their process windows, as well as routes to maximize the process latitude, underlayer integration, etch transfer, cross sections, etch equipment integration from a cross-metal-contamination standpoint, and the selective resist strip process. Finally, initial results from a higher-sensitivity Inpria resist are also reported. A dose to size of 19 mJ/cm2 was achieved to print pillars as small as 21 nm.

  1. Modelling fluid accumulation in the neck using simple baseline fluid metrics: implications for sleep apnea.

    PubMed

    Vena, Daniel; Yadollahi, A; Bradley, T Douglas

    2014-01-01

    Obstructive sleep apnea (OSA) is a common respiratory disorder among adults. Recently we have shown that a sedentary lifestyle causes an increase in diurnal leg fluid volume (LFV), which can shift into the neck at night when lying down to sleep and increase OSA severity. The purpose of this work was to investigate various metrics that represent baseline fluid retention in the legs, examine their correlation with neck fluid volume (NFV), and develop a robust model for predicting fluid accumulation in the neck. In 13 healthy, awake, non-obese men, LFV and NFV were recorded continuously and simultaneously while standing for 5 minutes and then lying supine for 90 minutes. Simple regression was used to examine correlations between baseline LFV, baseline neck circumference (NC) and change in LFV with the outcome variables: change in NC (ΔNC) and in NFV (ΔNFV90) after lying supine for 90 minutes. An exhaustive grid search was implemented to find combinations of input variables which best modeled the outcomes. We found strong positive correlations between baseline LFV (supine and standing) and ΔNFV90. Models developed for predicting ΔNFV90 included baseline standing LFV and baseline NC combined with change in LFV after lying supine for 90 minutes. These correlations and the developed models suggest that a greater baseline LFV might contribute to increased fluid accumulation in the neck. These results provide further evidence that a sedentary lifestyle might play a role in the pathogenesis of OSA by increasing the baseline LFV. The best models for predicting ΔNC include baseline LFV and NC; they improved the accuracy of estimating ΔNC over individual predictors, suggesting that a combination of baseline fluid metrics is a good predictor of the change in NC while lying supine. Future work is aimed at adding baseline demographic features to improve model accuracy and eventually using the model as a screening tool to predict severity of OSA prior to sleep.
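
    A minimal sketch of the exhaustive predictor-combination search described above is given below; the data frame and column names are hypothetical, leave-one-out cross-validated error is used as the selection criterion, and this is not the authors' code.

        # A minimal sketch with hypothetical column names; illustration only.
        from itertools import combinations
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        def best_predictor_subset(df, candidates, outcome):
            """Return the candidate-column subset with the best cross-validated score."""
            best_score, best_subset = -np.inf, None
            for k in range(1, len(candidates) + 1):
                for subset in combinations(candidates, k):
                    X, y = df[list(subset)].values, df[outcome].values
                    score = cross_val_score(LinearRegression(), X, y, cv=LeaveOneOut(),
                                            scoring="neg_mean_squared_error").mean()
                    if score > best_score:
                        best_score, best_subset = score, subset
            return best_subset, best_score

        # e.g.: best_predictor_subset(df, ["lfv_standing", "neck_circ", "delta_lfv"], "delta_nfv90")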

  2. Quality process measures for rheumatoid arthritis: performance from members enrolled in a national health plan.

    PubMed

    Tkacz, Joseph; Ellis, Lorie A; Meyer, Roxanne; Bolge, Susan C; Brady, Brenna L; Ruetsch, Charles

    2015-02-01

    Health care quality problems are reflected in the underuse, overuse, and misuse of health care services. There is evidence suggesting that the quality of rheumatoid arthritis (RA) patient care is suboptimal, which has spurred the development of a number of systematic quality improvement metrics. To investigate a quality process measurement set in a sample of commercially insured RA patients. Medical, pharmacy, and laboratory claims for members with an RA diagnosis (ICD-9-CM 714.x) during calendar years 2008 through 2012 were extracted from the Optum Clinformatics Data Mart database. Eight process quality measures focused on RA patient response and tolerance to therapy were examined in the claims database. Measures were calculated for individual calendar years from 2009 to 2012, inclusive. The majority of adult RA patients received at least 1 prescription for a disease-modifying antirheumatic drug (DMARD) across the 4 measurement years: range = 78.5%-81.6%. Erythrocyte sedimentation rate and C-reactive protein testing were also evident in the majority of the sample, with 67.1%-72.2% of newly diagnosed RA patients receiving baseline testing, and 56.0%-58.7% of existing RA patients receiving annual testing. Among methotrexate users, liver function tests were performed in 74.5%-75.7% of treated patients, serum creatinine tests in 70.1%-72.6% of patients, and complete blood count tests in 74.5%-76.0% of patients. Additionally, most patients initiating a new DMARD had a claim for a baseline serum creatinine test (68.0%-70.3%) and baseline liver function test (69.3%-71.0%). Findings suggest that a majority of RA patients are attaining patient quality process measures, although a considerable proportion of patients (approximately 25%) may be receiving suboptimal care. Further studies are warranted to understand whether attainment of these measures translates into better outcomes.

  3. Hippocampal perfusion predicts impending neurodegeneration in REM sleep behavior disorder.

    PubMed

    Dang-Vu, Thien Thanh; Gagnon, Jean-François; Vendette, Mélanie; Soucy, Jean-Paul; Postuma, Ronald B; Montplaisir, Jacques

    2012-12-11

    Patients with idiopathic REM sleep behavior disorder (IRBD) are at risk for developing Parkinson disease (PD) and dementia with Lewy bodies (DLB). We aimed to identify functional brain imaging patterns predicting the emergence of PD and DLB in patients with IRBD, using SPECT with (99m)Tc-ethylene cysteinate dimer (ECD). Twenty patients with IRBD were scanned at baseline during wakefulness using (99m)Tc-ECD SPECT. After a follow-up of 3 years on average, patients were divided into 2 groups according to whether or not they developed defined neurodegenerative disease (PD, DLB). SPECT data analysis comparing regional cerebral blood flow (rCBF) between groups assessed whether specific brain perfusion patterns were associated with subsequent clinical evolution. Regression analysis between rCBF and clinical markers of neurodegeneration (motor, color vision, olfaction) looked for neural structures involved in this process. Of the 20 patients with IRBD recruited for this study, 10 converted to PD or DLB during the follow-up. rCBF at baseline was increased in the hippocampus of patients who would later convert compared with those who would not (p < 0.05 corrected). Hippocampal perfusion was correlated with motor and color vision scores across all IRBD patients. (99m)Tc-ECD SPECT identifies patients with IRBD at risk for conversion to other neurodegenerative disorders such as PD or DLB; disease progression in IRBD is predicted by abnormal perfusion in the hippocampus at baseline. Perfusion within this structure is correlated with clinical markers of neurodegeneration, further suggesting its involvement in the development of presumed synucleinopathies.

  4. Removal of gadolinium nitrate from heavy water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilde, E.W.

    2000-03-22

    Work was conducted to develop a cost-effective process to purify 181 55-gallon drums containing spent heavy water moderator (D2O) contaminated with high concentrations of gadolinium nitrate, a chemical used as a neutron poison during former nuclear reactor operations at the Savannah River Site (SRS). These drums also contain low level radioactive contamination, including tritium, which complicates treatment options. Presently, the drums of degraded moderator are being stored on site. It was suggested that a process utilizing biological mechanisms could potentially lower the total cost of heavy water purification by allowing the use of smaller equipment with less product loss and a reduction in the quantity of secondary waste materials produced by the current baseline process (ion exchange).

  5. Biomarkers and taphonomic processes in fresh and fossil biosignatures from Hot Spring silica deposits in El Tatio Chile, as a Mars Analogue

    NASA Astrophysics Data System (ADS)

    Carrizo, D.; Sánchez-García, L.; Parro, V.; Cady, S. L.; Cabrol, N. A.

    2017-09-01

    Characterization of biomarkers and taphonomic processes affecting recent and fossil biosignatures in extreme environments with analogies to Mars is essential to understanding how life could develop and survive under such conditions. Siliceous sinter deposits on Mars are similar to those found in the hydrothermal hot springs and geysers of El Tatio, Chile. Organic preservation has been demonstrated in this study. Many different labile functional groups (i.e., carboxylic acids, alcohols, aldehydes, etc.) were found in samples of both ages. A shift in the congener pattern of the different lipid families was found and is discussed. These results give insight into the taphonomic processes acting in this extreme environment, which could be used as a baseline for Mars exploration.

  6. Automated array assembly task, phase 1

    NASA Technical Reports Server (NTRS)

    Carbajal, B. G.

    1977-01-01

    An assessment of state-of-the-art technologies that are applicable to silicon solar cell and solar cell module fabrication is provided. The assessment consists of a technical feasibility evaluation and a cost projection for high-volume production of silicon solar cell modules. The cost projection was approached from two directions: a design-to-cost analysis assigned cost goals to each major process element in the fabrication scheme, and a cost analysis built up projected costs for alternate technologies for each process element. A technical evaluation was used in combination with the cost analysis to identify a baseline low-cost process. A novel approach to metal pattern design based on minimum power loss was developed. The resulting design equations were used as a tool in the evaluation of metallization technologies.

  7. The challenge of logistics facilities development

    NASA Technical Reports Server (NTRS)

    Davis, James R.

    1987-01-01

    The paper discusses the experiences of a group of engineers and logisticians at John F. Kennedy Space Center in the design, construction and activation of a consolidated logistics facility for support of Space Transportation System ground operations and maintenance. The planning, methodology and processes are covered, with emphasis placed on unique aspects and lessons learned. The project utilized a progressive design, baseline and build concept for each phase of construction, with the Government exercising funding and configuration oversight.

  8. Data Collection Procedures and Descriptive Statistics for the Grade One Achievement Monitoring Tests (Baseline, S-1, S-2, and S-3), Coordinated Study No. 1. Working Paper 316. Report from the Project on Studies in Mathematics.

    ERIC Educational Resources Information Center

    Buchanan, Anne E.; Romberg, Thomas A.

    As part of a 3-year study of arithmetic problem-solving skills in young children, pretests were administered to 180 middle class first grade students. Following each of three instructional units, another achievement test was administered. The three first grade units corresponded to the Developing Mathematical Processes curriculum and involved…

  9. The revolution in data gathering systems

    NASA Technical Reports Server (NTRS)

    Cambra, J. M.; Trover, W. F.

    1975-01-01

    Data acquisition systems used in NASA's wind tunnels from the 1950's through the present time are summarized as a baseline for assessing the impact of minicomputers and microcomputers on data acquisition and data processing. Emphasis is placed on the cyclic evolution in computer technology that led from the central computer system to, ultimately, the distributed computer system. Other developments discussed include medium-scale integration, large-scale integration, the combining of data acquisition and control functions, and micro- and minicomputers.

  10. [Problems in the individual adaptation of working women].

    PubMed

    Grebeneva, O V; Balaeva, E A

    2008-01-01

    The mechanisms underlying the development of dysadaptive changes in factory workers were revealed in relation to congenital personality traits, and the schemes of individual adaptation strategies were defined. At the same time, increased anxiety, which leads to accelerated aging, preceded impairment of adaptive processes. The differences in female adaptive patterns were determined by the degree of emotional stability, the baseline energy capacities of the cardiorespiratory system, and the involvement of a mental component in adaptation.

  11. A quality improvement project to increase self-administration of medicines in an acute hospital.

    PubMed

    Garfield, S; Bell, H; Nathan, C; Randall, S; Husson, F; Boucher, C; Taylor, A; Lloyd, J; Backhouse, A; Ritchie, L; Franklin, B D

    2018-03-24

    A patient survey found significantly fewer patients reported they had self-administered their medicines while in hospital (20% of 100 patients) than reported that they would like to (44% of 100). We aimed to make self-administration more easily available to patients who wanted it. We conducted a failure modes and effects analysis, collected baseline data on four wards and carried out observations. Our initial assessment suggested that the main areas we should focus on were raising patient awareness of self-administration, changing the patient assessment process and creating a storage solution for medicines being self-administered. We developed new patient information leaflets and posters and a doctor's assessment form using Plan-Do-Study-Act cycles. We developed initial designs for a storage solution. We piloted the new materials on three wards; the fourth withdrew due to staff shortages. Following collection of baseline data, we continued to collect weekly data. We found that the proportion of patients who wished to self-administer who reported that they were able to do so significantly increased from 41% (of 155 patients) to 66% (of 118 patients) during the study, despite a period when the hospital was over capacity. Raising and maintaining healthcare professionals' awareness of self-administration can greatly increase the proportion of patients who wish to self-administer who actually do so. Healthcare professionals prefer multi-disciplinary input into the assessment process.

  12. The role of Space Station Freedom in the Human Exploration Initiative

    NASA Technical Reports Server (NTRS)

    Ahlf, P. R.; Saucillo, R. J.; Meredith, B. D.; Peach, L. L.

    1990-01-01

    Exploration accommodation requirements for Space Station Freedom (SSF) and mission-supporting capabilities have been studied. For supporting the Human Exploration Initiative (HEI), SSF will accommodate two functions with augmentations to the baseline Assembly Complete configuration. First, it will be an earth-orbiting transportation node providing facilities and resources (crew, power, communications) for space vehicle assembly, testing, processing and postflight servicing. Second, it will be an in-space laboratory for science research and technology development. The evolutionary design of SSF will allow the on-orbit addition of pressurized laboratory and habitation modules, power generation equipment, truss structure, and unpressurized vehicle processing platforms.

  13. Characterization of PMR polyimide resin and prepreg

    NASA Technical Reports Server (NTRS)

    Lindenmeyer, P. H.; Sheppard, C. H.

    1984-01-01

    Procedures for the chemical characterization of PMR-15 resin solutions and graphite-reinforced prepregs were developed, and a chemical data base was established. In addition, a basic understanding of PMR-15 resin chemistry was gained; this was translated into effective processing procedures for the production of high quality graphite composites. During the program the PMR monomers and selected model compounds representative of postulated PMR-15 solution chemistry were acquired and characterized. Based on these data, a baseline PMR-15 resin was formulated and evaluated for processing characteristics and composite properties. Commercially available PMR-15 resins were then obtained and chemically characterized. Composite panels were fabricated and evaluated.

  14. Verbal short-term memory development and spoken language outcomes in deaf children with cochlear implants.

    PubMed

    Harris, Michael S; Kronenberger, William G; Gao, Sujuan; Hoen, Helena M; Miyamoto, Richard T; Pisoni, David B

    2013-01-01

    Cochlear implants (CIs) help many deaf children achieve near-normal speech and language (S/L) milestones. Nevertheless, high levels of unexplained variability in S/L outcomes are limiting factors in improving the effectiveness of CIs in deaf children. The objective of this study was to longitudinally assess the role of verbal short-term memory (STM) and working memory (WM) capacity as a progress-limiting source of variability in S/L outcomes after CI in children. Longitudinal study of 66 children with CIs for prelingual severe-to-profound hearing loss. Outcome measures included performance on digit span forward (DSF), digit span backward (DSB), and four conventional S/L measures that examined spoken-word recognition (Phonetically Balanced Kindergarten word test), receptive vocabulary (Peabody Picture Vocabulary Test), sentence-recognition skills (Hearing in Noise Test), and receptive and expressive language functioning (Clinical Evaluation of Language Fundamentals Fourth Edition Core Language Score; CELF). Growth curves for DSF and DSB in the CI sample over time were comparable in slope, but consistently lagged in magnitude relative to norms for normal-hearing peers of the same age. For DSF and DSB, 50.5% and 44.0%, respectively, of the CI sample scored more than 1 SD below the normative mean for raw scores across all ages. The first (baseline) DSF score significantly predicted all endpoint scores for the four S/L measures, and DSF slope (growth) over time predicted CELF scores. DSF baseline and slope accounted for an additional 13 to 31% of variance in S/L scores after controlling for conventional predictor variables such as chronological age at time of testing, age at time of implantation, communication mode (auditory-oral communication versus total communication), and maternal education. Only DSB baseline scores predicted endpoint language scores on the Peabody Picture Vocabulary Test and CELF. DSB slopes were not significantly related to any endpoint S/L measures. DSB baseline scores and slopes taken together accounted for an additional 4 to 19% of variance in S/L endpoint measures after controlling for the conventional predictor variables. Verbal STM/WM scores, process measures of information capacity, develop at an average rate in the years after cochlear implantation, but were found to consistently lag in absolute magnitude behind those reported for normal-hearing peers. Baseline verbal STM/WM predicted long-term endpoint S/L outcomes, but verbal STM slopes predicted only endpoint language outcomes. Verbal STM/WM processing skills reflect important underlying core elementary neurocognitive functions and represent potential intervention targets for improving endpoint S/L outcomes in pediatric CI users.
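
    The incremental-variance logic described above (additional variance explained by digit-span baseline and slope after controlling for conventional predictors) can be illustrated with a hierarchical regression sketch. The following is a minimal, hypothetical Python/statsmodels example; the data frame and column names are assumptions for illustration only, not the study's actual variables or analysis (which used growth-curve models).

      import pandas as pd
      import statsmodels.formula.api as smf

      def incremental_r2(df: pd.DataFrame) -> float:
          """Additional variance in a hypothetical S/L endpoint explained by digit-span
          baseline and slope after controlling for conventional predictors."""
          covariates = "age_at_test + age_at_implant + comm_mode + maternal_edu"
          base = smf.ols(f"celf_endpoint ~ {covariates}", data=df).fit()
          full = smf.ols(f"celf_endpoint ~ {covariates} + dsf_baseline + dsf_slope", data=df).fit()
          return full.rsquared - base.rsquared  # compare with the ~13-31% reported in the abstract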

  15. Comparing the neural bases of self-referential processing in typically developing and 22q11.2 adolescents.

    PubMed

    Schneider, Maude; Debbané, Martin; Lagioia, Annalaura; Salomon, Roy; d'Argembeau, Arnaud; Eliez, Stephan

    2012-04-01

    The investigation of self-reflective processing during adolescence is relevant, as this period is characterized by deep reorganization of the self-concept. It may be the case that an atypical development of brain regions underlying self-reflective processing increases the risk for psychological disorders and impaired social functioning. In this study, we investigated the neural bases of self- and other-related processing in typically developing adolescents and youths with 22q11.2 deletion syndrome (22q11DS), a rare neurogenetic condition associated with difficulties in social interactions and increased risk for schizophrenia. The fMRI paradigm consisted of judging whether a series of adjectives applied to the participant himself/herself (self), to his/her best friend or to a fictional character (Harry Potter). In control adolescents, we observed that self- and other-related processing elicited strong activation in cortical midline structures (CMS) when contrasted with a semantic baseline condition. Participants with 22q11DS exhibited hypoactivation in the CMS and the striatum during the processing of self-related information when compared to the control group. Finally, the hypoactivation in the anterior cingulate cortex was associated with the severity of prodromal positive symptoms of schizophrenia. The findings are discussed in a developmental framework and in light of their implication for the development of schizophrenia in this at-risk population. Copyright © 2011 Elsevier Ltd. All rights reserved.

  16. Cognitive and affective trait and state factors influencing the long-term symptom course in remitted depressed patients.

    PubMed

    Timm, Christina; Ubl, Bettina; Zamoscik, Vera; Ebner-Priemer, Ulrich; Reinhard, Iris; Huffziger, Silke; Kirsch, Peter; Kuehner, Christine

    2017-01-01

    Major depressive disorder (MDD) is characterized by a high risk for relapses and chronic developments. Clinical characteristics such as residual symptoms have been shown to negatively affect the long-term course of MDD. However, it is unclear so far how trait repetitive negative thinking (RNT) as well as cognitive and affective momentary states, the latter experienced during daily-life, affect the long-term course of MDD. We followed up 57 remitted depressed (rMDD) individuals six (T2) and 36 (T3) months after baseline. Clinical outcomes were time to relapse, time spent with significant symptoms as a marker of chronicity, and levels of depressive symptoms at T2 and T3. Predictors assessed at baseline included residual symptoms and trait RNT. Furthermore, momentary daily life affect and momentary rumination, and their variation over the day were assessed at baseline using ambulatory assessment (AA). In multiple models, residual symptoms and instability of daily-life affect at baseline independently predicted a faster time to relapse, while chronicity was significantly predicted by trait RNT. Multilevel models revealed that depressive symptom levels during follow-up were predicted by baseline residual symptom levels and by instability of daily-life rumination. Both instability features were linked to a higher number of anamnestic MDD episodes. Our findings indicate that trait RNT, but also affective and cognitive processes during daily life impact the longer-term course of MDD. Future longitudinal research on the role of respective AA-phenotypes as potential transdiagnostic course-modifiers is warranted.

  17. Training tomorrow's clinicians today--managed care essentials: a process for curriculum development.

    PubMed

    Colenda, C C; Wadland, W; Hayes, O; Anderson, W; Priester, F; Pearson, R; Keefe, C; Fleck, L

    2000-05-01

    To develop a managed care curriculum for primary care residents. This article outlines a 4-stage curriculum development process focusing on concepts of managed care organization and finance. The stages consist of: (1) identifying the curriculum development work group and framing the scope of the curriculum, (2) identifying stakeholder buy-in and expectations, (3) choosing curricular topics and delivery mechanisms, and (4) outlining the evaluation process. Key elements of building a curriculum development team, content objectives of the curriculum, the rationale for using problem-based learning, and finally, lessons learned from the partnership among the stakeholders are reviewed. The curriculum was delivered to an entering group of postgraduate-year 1 primary care residents. Attitudes among residents toward managed care remained relatively negative and stable over the yearlong curriculum, especially over issues relating to finance, quality of care, control and autonomy of practitioners, time spent with patients, and managed care's impact on the doctor-patient relationship. Residents' baseline knowledge of core concepts about managed care organization and finance improved during the year that the curriculum was delivered. Satisfaction with a problem-based learning approach was high. Problem-based learning, using real-life clinical examples, is a successful approach to resident instruction about managed care.

  18. Different antidiabetic regimens and the development of renal dysfunction in US Veterans with type 2 diabetes mellitus.

    PubMed

    Gosmanova, Elvira O; Canada, Robert B; Wan, Jim; Mangold, Therese A; Wall, Barry M

    2012-10-01

    The aim of this study was to evaluate the development of renal dysfunction in veterans with type 2 diabetes mellitus (T2DM) treated with different antidiabetic regimens. This was a retrospective cohort study involving 1715 patients with T2DM and baseline serum creatinine (SCr) of 1.5 mg/dL or less. The development of renal dysfunction, defined as an increase of 0.5 mg/dL or greater from baseline SCr during 4.8 years of follow-up, in metformin monotherapy users (M) and in 2 combination therapy groups, metformin + insulin (MI) and metformin + sulfonylurea (MS), was compared with that observed in sulfonylurea monotherapy users (S). Both MI and MS groups had higher mean baseline hemoglobin A1C (HbA1C) (9.0 and 8.6%, respectively) and higher rates of baseline macroalbuminuria (17.3 and 12.1%, respectively) as compared with the M and S groups (mean HbA1C 7.7% in both groups; proteinuria, M 5.1% and S 7.4%). In unadjusted analysis, the development of renal dysfunction was more frequent in the MI and MS groups but not in the M group as compared with sulfonylurea monotherapy (unadjusted HRs [95% confidence interval (CI)]: 2.1 [1.4-3.0], 1.4 [1.1-1.9], and 1.0 [0.6-1.7], respectively). However, differences in the development of renal dysfunction were not significant between the 4 groups after adjusting for baseline variables. Baseline macroalbuminuria was a strong predictor of SCr elevation of 0.5 mg/dL or greater during follow-up (adjusted HR, 3.1 [1.9-4.7]). Unexpectedly, baseline use of renin-angiotensin-aldosterone system (RAAS) blockers was also associated with the development of renal dysfunction (adjusted HR, 1.9 [1.3-2.8]). In this retrospective cohort study of predominantly male US veterans with T2DM, baseline macroalbuminuria and use of RAAS blockers were associated with increased risk of development of renal dysfunction, whereas different antidiabetic regimens were not.

  19. Predictors of Exercise Relapse in a College Population.

    ERIC Educational Resources Information Center

    Sullum, Julie; Clark, Matthew M.; King, Teresa K.

    2000-01-01

    Investigated factors that predicted exercise relapse among college students. Physically active undergraduates completed questionnaires measuring Prochaska's 10 processes for change of exercise, self-efficacy, and decisional balance. Exercise levels were assessed at baseline and 8 weeks later. At baseline, relapsers had significantly lower…

  20. Space Shuttle SRM development. [Solid Rocket Motors

    NASA Technical Reports Server (NTRS)

    Brinton, B. C.; Kilminster, J. C.

    1979-01-01

    The successful static test of the fourth Development Space Shuttle Solid Rocket Motor (SRM) in February 1979 concluded the development testing phase of the SRM Project. Qualification and flight motors are currently being fabricated, with the first qualification motor to be static tested. Delivered thrust-time traces on all development motors were very close to predicted values, and both specific and total impulse exceeded specification requirements. 'All-up' static tests conducted with solid rocket booster equipment on development motors achieved all test objectives. Transportation and support equipment concepts have been proven, baselining is complete, and component reusability has been demonstrated. The evolution of SRM transportation and support equipment and special test equipment designs is reviewed, and development activities are discussed. Handling and processing aspects of large, heavy components are described.

  1. Food purchasing selection among low-income, Spanish-speaking Latinos.

    PubMed

    Cortés, Dharma E; Millán-Ferro, Andreina; Schneider, Karen; Vega, Rodolfo R; Caballero, A Enrique

    2013-03-01

    In the U.S., poverty has been linked to both obesity and disease burden. Latinos in the U.S. are disproportionately affected by poverty, and over the past 10 years, the percentage of overweight U.S. Latino youth has approximately doubled. Buying low-cost food that is calorie-dense and filling has been linked to obesity. Low-income individuals tend to favor energy-dense foods because of their low cost, and economic decisions made during food purchasing have physiologic repercussions. Diets based on energy-dense foods tend to be high in processed staples, such as refined grains, added sugars, and added fats. These diets have been linked to a higher risk of obesity, type 2 diabetes, and cardiovascular disease. This pilot study conducted ethnographic qualitative analyses combined with quantitative analyses to understand grocery shopping practices among 20 Spanish-speaking, low-income Latino families. The purpose was to analyze food selection practices in order to determine the effect of nutrition education on changes in shopping practices to later develop educational tools to promote selection of healthier food options. Participants received tailored, interactive, nutrition education during three to five home visits and a supermarket tour. Grocery store receipts for grocery purchases collected at baseline and at the end of the project were analyzed for each family to extract nutritional content of purchased foods. Nutritional content was measured with these factors in mind: quantity, calories, fats, carbohydrates, fiber, protein, and percentage of sugary beverages and processed food. Data were collected in 2010-2011 and analyzed in 2011-2012. After receiving between three and five home-based nutrition education sessions and a supermarket tour over a 6-month period, many families adopted instructions on buying budget-friendly, healthier alternative foods. Findings indicate that participating families decreased the total number of calories and calories per dollar purchased from baseline to post-education (median total calories: baseline, 20,191; post-education, 15,991; p=0.008; median calories per dollar: baseline, 404; post-education, 320; p=0.008). The median grams of carbohydrates per dollar (baseline, 66; post-education, 45) and median calories from processed food (baseline, 11,000; post-education, 7845) were not reduced (p=0.06). This pilot study demonstrated that grocery shopping practices are an important factor to address in nutrition education among Spanish-speaking, low-income individuals, and that there may be ways to encourage low-income, Latino families to purchase healthier foods. Findings challenged arguments suggesting that such an approach is not possible because of the high cost of healthier foods. Copyright © 2013 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  2. Stack Characterization in CryoSat Level1b SAR/SARin Baseline C

    NASA Astrophysics Data System (ADS)

    Scagliola, Michele; Fornari, Marco; Di Giacinto, Andrea; Bouffard, Jerome; Féménias, Pierre; Parrinello, Tommaso

    2015-04-01

    CryoSat was launched on the 8th April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat is the first altimetry mission operating in SAR mode and it carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, thus making the received echoes phase coherent and suitable for azimuth processing. The current CryoSat IPF (Instrument Processing Facility), Baseline B, was released into operation in February 2012. After more than 2 years of development, the release into operations of Baseline C is expected in the first half of 2015. It is worth recalling here that the CryoSat SAR/SARin IPF1 generates 20 Hz waveforms at an approximately equally spaced set of ground locations on the Earth surface, i.e. surface samples, and that a surface sample gathers a collection of single-look echoes coming from the processed bursts during the time of visibility. Thus, for a given surface sample, the stack can be defined as the collection of all the single-look echoes pointing to the current surface sample, after applying all the necessary range corrections. The L1B product contains the power average of all the single-look echoes in the stack: the multi-looked L1B waveform. This reduces the data volume, while removing some information contained in the single looks that is useful for characterizing the surface and modelling the L1B waveform. To recover such information, a set of parameters has been added to the L1B product: the stack characterization or beam behaviour parameters. The stack characterization, already included in previous Baselines, has been reviewed and expanded in Baseline C. This poster describes all the stack characterization parameters, detailing what they represent and how they have been computed. In detail, such parameters can be summarized as: stack statistical parameters, such as skewness and kurtosis; the look angle (i.e. the angle at which the surface sample is seen with respect to the nadir direction of the satellite) and the Doppler angle (i.e. the angle at which the surface sample is seen with respect to the normal to the velocity vector) for the first and the last single-look echoes in the stack; and the number of single-looks averaged in the stack (in Baseline C a stack weighting has been applied that reduces the number of looks). With the correct use of these parameters, users will be able to retrieve some of the 'lost' information contained within the stack and fully exploit the L1B product.
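
    As a rough illustration of the quantities described above, the sketch below shows how a multi-looked waveform (the power average over the stack) and simple stack statistics such as skewness and kurtosis might be computed from a stack of single-look echo powers. The array layout and the choice of computing statistics on per-look integrated power are assumptions for illustration only; this is not the CryoSat IPF implementation.

      import numpy as np
      from scipy.stats import skew, kurtosis

      def stack_summary(stack_power):
          """stack_power: hypothetical array of shape (n_looks, n_range_bins) for one surface sample."""
          multilooked = stack_power.mean(axis=0)        # power average over all single-look echoes
          per_look_power = stack_power.sum(axis=1)      # one integrated-power value per look
          return {
              "multilooked_waveform": multilooked,
              "n_looks": stack_power.shape[0],
              "stack_skewness": skew(per_look_power),   # illustrative stack statistics
              "stack_kurtosis": kurtosis(per_look_power),
          }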

  3. Developing and Piloting a Baselining Tool for Education for Sustainable Development and Global Citizenship (ESDGC) in Welsh Higher Education

    ERIC Educational Resources Information Center

    Glover, Alison; Jones, Yvonne; Claricoates, Jane; Morgan, Jan; Peters, Carl

    2013-01-01

    Mainstreaming Education for Sustainable Development in higher education is vital if graduates are to possess the abilities, skills, and knowledge needed to tackle the sustainability issues of the future. In this article we explain the development and piloting of a baselining tool, the Education for Sustainable Development and Global Citizenship…

  4. Advanced Research Deposition System (ARDS) for processing CdTe solar cells

    NASA Astrophysics Data System (ADS)

    Barricklow, Keegan Corey

    CdTe solar cells have been commercialized at the gigawatt-per-year level. The development of volume manufacturing processes for next-generation CdTe photovoltaics (PV) with higher efficiencies requires research systems with flexibility, scalability, repeatability and automation. The Advanced Research Deposition System (ARDS) developed by the Materials Engineering Laboratory (MEL) provides such a platform for the investigation of materials and manufacturing processes necessary to produce the next generation of CdTe PV. Because previous research systems were limiting, the ARDS was developed to provide process and hardware flexibility, accommodate advanced processing techniques, and produce device-quality films. The ARDS is a unique, in-line process tool with nine processing stations. The system was designed, built and assembled at the Materials Engineering Laboratory. Final assembly, startup, characterization and process development are the focus of this research. Many technical challenges encountered during the startup of the ARDS were addressed in this research. In this study, several hardware modifications needed for the reliable operation of the ARDS were designed, constructed and successfully incorporated into the ARDS. The effect of process conditions on film properties for each process step was quantified. Process development to achieve a 12% efficient baseline solar cell required investigation of discrete processing steps, troubleshooting process variation, and developing performance correlations. Subsequent to this research, many advances have been demonstrated with the ARDS. The ARDS consistently produces devices of 12% ± 0.5% efficiency by the process of record (POR). The champion cell produced to date utilizing the ARDS has an efficiency of 16.2% on low-cost commercial soda-lime glass and utilizes advanced films. The ARDS has enabled investigation of advanced concepts for processing CdTe devices including plasma cleaning, plasma-enhanced closed space sublimation (PECSS), an electron reflector (ER) using a Cd1-xMgxTe (CMT) structure, and alternative device structures. The ARDS has been instrumental in collaborative research with many institutions.

  5. Using Correlative Properties of Neighboring Pixels to Enhance Contrast-to-Noise Ratio of Abnormal Hippocampus in Patients With Intractable Epilepsy and Mesial Temporal Sclerosis.

    PubMed

    Parsons, Matthew S; Sharma, Aseem; Hildebolt, Charles

    2018-06-12

    To test whether an image-processing algorithm can aid in visualization of mesial temporal sclerosis on magnetic resonance imaging by selectively increasing contrast-to-noise ratio (CNR) between abnormal hippocampus and normal brain. In this Institutional Review Board-approved and Health Insurance Portability and Accountability Act-compliant study, baseline coronal fluid-attenuated inversion recovery images of 18 adults (10 females, eight males; mean age 41.2 years) with proven mesial temporal sclerosis were processed using a custom algorithm to produce corresponding enhanced images. Average (Hmean) and maximum (Hmax) CNR for abnormal hippocampus were calculated relative to normal ipsilateral white matter. CNR values for normal gray matter (GM) were similarly calculated using ipsilateral cingulate gyrus as the internal control. To evaluate effect of image processing on visual conspicuity of hippocampal signal alteration, a neuroradiologist masked to the side of hippocampal abnormality rated signal intensity (SI) of hippocampi on baseline and enhanced images using a five-point scale (definitely abnormal to definitely normal). Differences in Hmean, Hmax, GM, and SI ratings for abnormal hippocampi on baseline and enhanced images were assessed for statistical significance. Both Hmean and Hmax were significantly higher in enhanced images as compared to baseline images (p < 0.0001 for both). There was no significant difference in the GM between baseline and enhanced images (p = 0.9375). SI ratings showed a more confident identification of abnormality on enhanced images (p = 0.0001). Image-processing resulted in increased CNR of abnormal hippocampus without affecting the CNR of normal gray matter. This selective increase in conspicuity of abnormal hippocampus was associated with more confident identification of hippocampal signal alteration. Copyright © 2018 Academic Radiology. Published by Elsevier Inc. All rights reserved.
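
    A minimal sketch of the contrast-to-noise idea used above (contrast between a hippocampal region and normal ipsilateral white matter, normalized by a noise estimate) is given below. The masks and the use of the reference region's standard deviation as the noise term are illustrative assumptions; the authors' exact CNR definition may differ.

      import numpy as np

      def contrast_to_noise(image, roi_mask, ref_mask):
          """CNR between an ROI (e.g., hippocampus) and a reference region (e.g., white matter)."""
          roi = image[roi_mask]
          ref = image[ref_mask]
          noise = ref.std()                      # noise proxy taken from the reference region
          return float((roi.mean() - ref.mean()) / noise)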

  6. Petroleum hydrocarbons in water from a Brazilian tropical estuary facing industrial and port development.

    PubMed

    Lemos, Rafael Thompson de Oliveira; de Carvalho, Paulo Sérgio Martins; Zanardi-Lamardo, Eliete

    2014-05-15

    Fast-paced industrial and port development has occurred at the Suape Estuary, Northeast Brazil, but no information about hydrocarbon concentrations in its waters is available. Considering that, the contamination level of Suape was determined by UV-fluorescence in terms of dissolved and/or dispersed petroleum hydrocarbons (DDPHs) during wet and dry seasons. DDPHs ranged between 0.05 and 4.59 μg L(-1) Carmópolis oil equivalents and 0.01-1.39 μg L(-1) chrysene equivalents, indicating DDPHs close to a baseline contamination level. Some relatively high concentrations (>1 μg L(-1)) were probably associated with shipyard operations (hull painting and ship docking), pollutant remobilization by dredging operations, occasional industrial discharges and oil derivatives released by vessels. DDPH concentrations were lower in the wet season, suggesting that the increased dilution caused by rainfall dominated over the wet deposition of atmospheric combustion-derived PAHs. Results of this study may be used as a baseline for further studies in this area. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Management of the baseline shift using a new and simple method for respiratory-gated radiation therapy: Detectability and effectiveness of a flexible monitoring system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tachibana, Hidenobu; Kitamura, Nozomi; Ito, Yasushi

    2011-07-15

    Purpose: In respiratory-gated radiation therapy, a baseline shift decreases the accuracy of target coverage and organs at risk (OAR) sparing. The effectiveness of audio-feedback and audio-visual feedback in correcting the baseline shift in the breathing pattern of the patient has been demonstrated previously. However, the baseline shift derived from the intrafraction motion of the patient's body cannot be corrected by these methods. In the present study, the authors designed and developed a simple and flexible system. Methods: The system consisted of a web camera and a computer running our in-house software. The in-house software performed template matching and required no preimage processing. The system was capable of monitoring the baseline shift in the intrafraction motion of the patient's body. Another marker box was used to monitor the baseline shift due to the flexible setups required of a marker box for gated signals. The system accuracy was evaluated by employing a respiratory motion phantom and was found to be within AAPM Task Group 142 tolerance (positional accuracy <2 mm and temporal accuracy <100 ms) for respiratory-gated radiation therapy. Additionally, the effectiveness of this flexible and independent system in gated treatment was investigated in healthy volunteers, in terms of the differences in the baseline shift detectable between the marker positions, which the authors evaluated statistically. Results: The movement of the marker on the sternum [1.599 ± 0.622 mm (1 SD)] was substantially decreased as compared with the abdomen [6.547 ± 0.962 mm (1 SD)]. Additionally, in all of the volunteers, the baseline shifts for the sternum [-0.136 ± 0.868 mm (2 SD)] were in better agreement with the nominal baseline shifts than was the case for the abdomen [-0.722 ± 1.56 mm (2 SD)]. The baseline shifts could be accurately measured and detected using the monitoring system, which could acquire the movement of the marker on the sternum. The baseline shift-monitoring system with the displacement-based methods should be used to make the most of displacement-based gating for highly accurate respiratory-gated treatments. Conclusions: The advent of intensity modulated radiation therapy and volumetric modulated radiation therapy facilitates margin reduction for the planning target volumes and the OARs, but highly accurate irradiation is needed to achieve target coverage and OAR sparing with a small margin. Baseline shifts can affect treatment not only with the respiratory gating system but also without the system. Our system can manage the baseline shift and also enables treatment irradiation to be undertaken with high accuracy.
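
    The abstract describes in-house software that tracks external markers with a web camera using template matching. The sketch below illustrates one common way to do this with OpenCV; it is a generic example under assumed inputs, not the authors' software.

      import cv2

      def marker_displacement(frame, template, baseline_y):
          """Vertical marker displacement (pixels) relative to a stored baseline position."""
          result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
          _, _, _, max_loc = cv2.minMaxLoc(result)          # (x, y) of the best-matching position
          current_y = max_loc[1] + template.shape[0] / 2.0  # approximate marker centre row
          return current_y - baseline_y                     # a persistent offset indicates a baseline shift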

  8. SEL's Software Process-Improvement Program

    NASA Technical Reports Server (NTRS)

    Basili, Victor; Zelkowitz, Marvin; McGarry, Frank; Page, Jerry; Waligora, Sharon; Pajerski, Rose

    1995-01-01

    The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. The results of studying over 125 FDD projects have guided the standards, management practices, technologies, and training within the division, yielding a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified and are now stated as: (1) understand baseline processes and product characteristics, (2) assess improvements that have been incorporated into the development projects, and (3) package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement, and feedback to projects within the FDD environment. The SEL supports understanding of the process by studying several process characteristics, including effort distribution and error detection rates. The SEL assesses and refines the processes. Once the assessment and refinement of a process is completed, the SEL packages the process by capturing it in standards, tools, and training.

  9. NLS propulsion - Government view

    NASA Technical Reports Server (NTRS)

    Smelser, Jerry W.

    1992-01-01

    The paper discusses the technology development for the Space Transportation Main Engine (STME). The STME is a liquid oxygen/liquid hydrogen engine with 650,000 pounds of thrust, which may be flown in single-engine or multiple-engine configurations, depending upon the payload and mission requirements. The technological developments completed so far include a vacuum plasma spray process, liquid interface diffusion bonding, and a thin-membrane platelet technology for combustion chamber fabrication; baseline designs for the hydrogen turbopump and the oxygen pump; and the engine control system. The family of spacecraft for which this engine is being developed includes vehicles delivering 20,000-pound and 150,000-pound payloads to LEO.

  10. Advanced composites structural concepts and materials technologies for primary aircraft structures: Design/manufacturing concept assessment

    NASA Technical Reports Server (NTRS)

    Chu, Robert L.; Bayha, Tom D.; Davis, HU; Ingram, J. ED; Shukla, Jay G.

    1992-01-01

    Composite Wing and Fuselage Structural Design/Manufacturing Concepts have been developed and evaluated. Trade studies were performed to determine how well the concepts satisfy the program goals of 25 percent cost savings, 40 percent weight savings with aircraft resizing, and 50 percent part count reduction as compared to the aluminum Lockheed L-1011 baseline. Concepts developed using emerging technologies such as large-scale resin transfer molding (RTM), automated tow placement (ATP), braiding, and out-of-autoclave and automated manufacturing processes for both thermoset and thermoplastic materials were evaluated for possible application in the design concepts. Trade studies were used to determine which concepts carry forward into the detailed design development subtask.

  11. Cognitive predictors of everyday functioning in older adults: results from the ACTIVE Cognitive Intervention Trial.

    PubMed

    Gross, Alden L; Rebok, George W; Unverzagt, Frederick W; Willis, Sherry L; Brandt, Jason

    2011-09-01

    The present study sought to predict changes in everyday functioning using cognitive tests. Data from the Advanced Cognitive Training for Independent and Vital Elderly trial were used to examine the extent to which competence in different cognitive domains--memory, inductive reasoning, processing speed, and global mental status--predicts prospectively measured everyday functioning among older adults. Coefficients of determination for baseline levels and trajectories of everyday functioning were estimated using parallel process latent growth models. Each cognitive domain independently predicts a significant proportion of the variance in baseline and trajectory change of everyday functioning, with inductive reasoning explaining the most variance (R2 = .175) in baseline functioning and memory explaining the most variance (R2 = .057) in changes in everyday functioning. Inductive reasoning is an important determinant of current everyday functioning in community-dwelling older adults, suggesting that successful performance in daily tasks is critically dependent on executive cognitive function. On the other hand, baseline memory function is more important in determining change over time in everyday functioning, suggesting that some participants with low baseline memory function may reflect a subgroup with incipient progressive neurologic disease.

  12. Process evaluation of a patient-centred, patient-directed, group-based education program for the management of type 2 diabetes mellitus.

    PubMed

    Odgers-Jewell, Kate; Isenring, Elisabeth; Thomas, Rae; Reidlinger, Dianne P

    2017-07-01

    The present study developed and evaluated a patient-centred, patient-directed, group-based education program for the management of type 2 diabetes mellitus. Two frameworks, the Medical Research Council (MRC) framework for developing and evaluating complex interventions and the RE-AIM framework, were followed. Data to develop the intervention were sourced from scoping of the literature and formative evaluation. Program evaluation comprised analysis of primary recruitment of participants through general practitioners, baseline and end-point measures of anthropometry, four validated questionnaires, contemporaneous facilitator notes and telephone interviews with participants. A total of 16 participants enrolled in the intervention. Post-intervention results were obtained from 13 participants, with an estimated mean change from baseline in weight of -0.72 kg (95% CI -1.44 to -0.01), body mass index of -0.25 kg/m2 (95% CI -0.49 to -0.01) and waist circumference of -1.04 cm (95% CI -4.52 to 2.44). The group education program was acceptable to participants. The results suggest that recruitment through general practitioners is ineffective, and alternative recruitment strategies are required. This patient-centred, patient-directed, group-based intervention for the management of type 2 diabetes mellitus was both feasible and acceptable to patients. Health professionals should consider the combined use of the MRC and RE-AIM frameworks in the development of interventions to ensure a rigorous design process and to enable the evaluation of all phases of the intervention, which will facilitate translation to other settings. Further research with a larger sample trialling additional recruitment strategies, evaluating further measures of effectiveness and utilising lengthier follow-up periods is required. © 2016 Dietitians Association of Australia.

  13. Performance assessment of multi-frequency processing of ICU chest images for enhanced visualization of tubes and catheters

    NASA Astrophysics Data System (ADS)

    Wang, Xiaohui; Couwenhoven, Mary E.; Foos, David H.; Doran, James; Yankelevitz, David F.; Henschke, Claudia I.

    2008-03-01

    An image-processing method has been developed to improve the visibility of tube and catheter features in portable chest x-ray (CXR) images captured in the intensive care unit (ICU). The image-processing method is based on a multi-frequency approach, wherein the input image is decomposed into different spatial frequency bands, and those bands that contain the tube and catheter signals are individually enhanced by nonlinear boosting functions. Using a random sampling strategy, 50 cases were retrospectively selected for the study from a large database of portable CXR images that had been collected from multiple institutions over a two-year period. All images used in the study were captured using photo-stimulable, storage phosphor computed radiography (CR) systems. Each image was processed two ways: with default image processing parameters such as those used in clinical settings (control), and separately with the new tube and catheter enhancement algorithm (test). Three board-certified radiologists participated in a reader study to assess differences in both detection-confidence performance and diagnostic efficiency between the control and test images. Images were evaluated on a diagnostic-quality, 3-megapixel monochrome monitor. Two scenarios were studied: the baseline scenario, representative of today's workflow (a single control image presented with window/level adjustments enabled), vs. the test scenario (a control/test image pair presented with a toggle enabled and the window/level settings disabled). The radiologists were asked to read the images in each scenario as they normally would for clinical diagnosis. Trend analysis indicates that the test scenario offers improved reading efficiency while providing detection capability as good as or better than that of the baseline scenario.
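
    The multi-frequency idea described above (decompose the image into spatial frequency bands and nonlinearly boost the bands carrying thin tube and catheter detail) can be sketched as follows. The band sigmas, the choice of which bands to boost, and the boosting function are illustrative assumptions, not the published algorithm's parameters.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def enhance_tubes(image, sigmas=(1.0, 2.0, 4.0, 8.0), gain=2.0):
          img = image.astype(float)
          blurred = [img] + [gaussian_filter(img, s) for s in sigmas]
          bands = [blurred[i] - blurred[i + 1] for i in range(len(sigmas))]  # difference-of-Gaussian bands
          out = blurred[-1].copy()                                           # low-frequency residual
          for i, band in enumerate(bands):
              boost = gain if i in (1, 2) else 1.0          # assume mid-frequency bands carry tube edges
              scale = band.std() + 1e-6
              out += boost * np.tanh(band / scale) * scale  # soft, saturating nonlinear boost
          return out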

  14. Long-Baseline Comparisons of the Brazilian National Time Scale to UTC (NIST) Using Near Real-Time and Postprocessed Solutions

    DTIC Science & Technology

    2007-11-01

    The Brazilian national time scale and UTC(NIST) are separated by a long baseline of ~8600 km. The comparisons between the two time scales were made with measurement systems developed for the Sistema Interamericano de Metrologia (SIM), a regional metrology organization, and the near real-time and postprocessed measurements are compared and summarized.

  15. Application of the Hydroecological Integrity Assessment Process for Missouri Streams

    USGS Publications Warehouse

    Kennen, Jonathan G.; Henriksen, James A.; Heasley, John; Cade, Brian S.; Terrell, James W.

    2009-01-01

    Natural flow regime concepts and theories have established the justification for maintaining or restoring the range of natural hydrologic variability so that physiochemical processes, native biodiversity, and the evolutionary potential of aquatic and riparian assemblages can be sustained. A synthesis of recent research advances in hydroecology, coupled with stream classification using hydroecologically relevant indices, has produced the Hydroecological Integrity Assessment Process (HIP). HIP consists of (1) a regional classification of streams into hydrologic stream types based on flow data from long-term gaging-station records for relatively unmodified streams, (2) an identification of stream-type specific indices that address 11 subcomponents of the flow regime, (3) an ability to establish environmental flow standards, (4) an evaluation of hydrologic alteration, and (5) a capacity to conduct alternative analyses. The process starts with the identification of a hydrologic baseline (reference condition) for selected locations, uses flow data from a stream-gage network, and proceeds to classify streams into hydrologic stream types. Concurrently, the analysis identifies a set of non-redundant and ecologically relevant hydrologic indices for 11 subcomponents of flow for each stream type. Furthermore, regional hydrologic models for synthesizing flow conditions across a region and the development of flow-ecology response relations for each stream type can be added to further enhance the process. The application of HIP to Missouri streams identified five stream types ((1) intermittent, (2) perennial runoff-flashy, (3) perennial runoff-moderate baseflow, (4) perennial groundwater-stable, and (5) perennial groundwater-super stable). Two Missouri-specific computer software programs were developed: (1) a Missouri Hydrologic Assessment Tool (MOHAT) which is used to establish a hydrologic baseline, provide options for setting environmental flow standards, and compare past and proposed hydrologic alterations; and (2) a Missouri Stream Classification Tool (MOSCT) designed for placing previously unclassified streams into one of the five pre-defined stream types.
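
    To make the notion of hydroecologically relevant indices concrete, the sketch below computes a few generic flow-regime statistics from a daily discharge record. These are illustrative examples only; they are not the specific non-redundant indices selected by the HIP or implemented in MOHAT/MOSCT.

      import pandas as pd

      def flow_indices(daily_q: pd.Series) -> dict:
          """daily_q: daily mean discharge indexed by a DatetimeIndex (hypothetical input)."""
          years = daily_q.index.year
          annual_min_7day = daily_q.rolling(7).mean().groupby(years).min()   # low-flow magnitude
          annual_mean = daily_q.groupby(years).mean()
          high_flow_days = (daily_q > 3 * daily_q.median()).groupby(years).sum()
          return {
              "cv_daily_flow": daily_q.std() / daily_q.mean(),        # overall flow variability
              "baseflow_index": (annual_min_7day / annual_mean).mean(),
              "mean_high_flow_days_per_year": high_flow_days.mean(),  # high-flow frequency
          }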

  16. Clinical and neurocognitive course in early-onset psychosis: a longitudinal study of adolescents with schizophrenia-spectrum disorders*

    PubMed Central

    Wozniak, Jeffrey R.; Block, Erin E.; White, Tonya; Jensen, Jonathan B.; Schulz, S. Charles

    2017-01-01

    Aim: Adolescents with psychotic disorders show deficits in IQ, attention, learning and memory, executive functioning, and processing speed that are related to important clinical variables including negative symptoms, adaptive functioning and academics. Previous studies have reported relatively consistent deficits with varying relationships to illness status and symptoms. The goals of this study were to examine these relationships in a larger sample at baseline, and also to examine the longitudinal course of these deficits in a smaller subset of adolescents. Method: Thirty-six subjects, aged 10 to 17 years, were included at baseline. All had Diagnostic and Statistical Manual-Fourth Edition diagnoses of schizophrenia, schizoaffective disorder, schizophreniform disorder and psychosis – not otherwise specified, as determined by Kiddie-Schedule for Affective Disorders and Schizophrenia for School-Age Children structured interviews. Patients were administered a neuropsychological battery, and Positive and Negative Syndrome Scale ratings were completed at baseline and again at 1 year (n = 14). Most participants were inpatients at baseline, and 13 of 14 were on atypical antipsychotic medication during both sessions. Results: At baseline, the patients demonstrated impairments in working memory, processing speed, executive function and verbal learning. No significant cognitive change was detected at 1-year follow-up. In contrast, clinical symptoms were variable across 1 year, with an improvement in positive symptoms at 1 year. No relationships between clinical and cognitive symptoms were observed, with the exception of baseline IQ predicting negative symptoms at 1 year. Conclusions: Young patients with schizophrenia-spectrum disorders displayed neurocognitive impairments at baseline. Despite measurable fluctuations in clinical symptoms over the year, no significant changes were measured in cognition. Lower IQ at baseline was predictive of more negative symptoms at 1 year. PMID:21352150

  17. Development of ceramic matrix composites for application in the ceramic technology for Advanced Heat Engines Project: Phase 2a, Development of in-situ toughened silicon nitride. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollinger, J.; Newson, D.; Yeh, H.

    1992-06-01

    The objective of this program was to develop a net shape forming process for an in-situ reinforced Si3N4 (AS-700). AS-700 was initially developed using cold isostatic pressing (CIP) of alcohol milled powders. The CIP'ed AS-700 material exhibited a moderate strength (690 MPa) and high toughness (9 MPa√m) at room temperature. In addition to net-shape process development, optimization of AS-700 properties was also investigated through the refinement of densification processes, and evaluation of the effect of Si3N4 powder properties on resulting microstructure and mechanical properties. Slip casting was chosen as the net-shape forming process. A slip casting process was successfully developed for forming green parts ranging from thin plates to thick cylinders, and to large complex shaped turbine rotors. The densification cycle was optimized to achieve full density parts without any cracks or warpage, and with comparable properties and microstructure to the CIP'ed baseline AS-700 material. The evaluation of six (6) alternate Si3N4 powders indicated that Si3N4 powders have a very strong influence on the development of resulting AS-700 in-situ microstructures and mechanical properties. The AS-700 slip casting process and optimized densification process were then combined and a number of test specimens were fabricated. The mechanical properties and microstructure of the optimized slip cast AS-700 Si3N4 were then fully characterized. The key property values are: 695 MPa room-temperature flexural strength, 446 MPa flexural strength at 1370°C, and 8.25 MPa√m toughness.

  18. Using the Safer Clinical Systems approach and Model for Improvement methodology to decrease Venous Thrombo-Embolism in Elective Surgical Patients.

    PubMed

    Humphries, Angela; Peden, Carol; Jordan, Lesley; Crowe, Josephine; Peden, Carol

    2016-01-01

    A significant incidence of post-procedural deep vein thrombosis (DVT) and pulmonary embolus (PE) was identified in patients undergoing surgery at our hospital. Investigation showed an unreliable peri-operative process leading to patients receiving incorrect or missed venous thromboembolism (VTE) prophylaxis. The Trust had previously participated in a project funded by the Health Foundation using the "Safer Clinical Systems" methodology to assess, diagnose, appraise options, and implement interventions to improve a high risk medication pathway. We applied the methodology from that study to this cohort of patients demonstrating that the same approach could be applied in a different context. Interventions were linked to the greatest hazards and risks identified during the diagnostic phase. This showed that many surgical elective patients had no VTE risk assessment completed pre-operatively, leading to missed or delayed doses of VTE prophylaxis post-operatively. Collaborative work with stakeholders led to the development of a new process to ensure completion of the VTE risk assessment prior to surgery, which was implemented using the Model for Improvement methodology. The process was supported by the inclusion of a VTE check in the Sign Out element of the WHO Surgical Safety Checklist at the end of surgery, which also ensured that appropriate prophylaxis was prescribed. A standardised operation note including the post-operative VTE plan will be implemented in the near future. At the end of the project VTE risk assessments were completed for 100% of elective surgical patients on admission, compared with 40% in the baseline data. Baseline data also revealed that processes for chemical and mechanical prophylaxis were not reliable. Hospital wide interventions included standardisation of mechanical prophylaxis devices and anti-thromboembolic stockings (resulting in a cost saving of £52,000), and a Trust wide awareness and education programme. The education included increased emphasis on use of mechanical prophylaxis when chemical prophylaxis was contraindicated. VTE guidelines were also included in the existing junior Doctor guideline App. and a "CLOTS" anticoagulation webpage was developed and published on the hospital intranet. The improvement in VTE processes resulted in an 80% reduction in hospital associated thrombosis following surgery from 0.2% in January 2014 to 0.04% in December 2015 and a reduction in the number of all hospital associated VTE from a baseline median of 9 per month as of January 2014 to a median of 1 per month by December 2015.

  19. Presentation Of The Small Baseline NSBAS Processing Chain On A Case Example: The ETNA Deformation Monitoring From 2003 to 2010 Using ENVISAT Data

    NASA Astrophysics Data System (ADS)

    Doin, Marie-Pierre; Lodge, Felicity; Guillaso, Stephane; Jolivet, Romain; Lasserre, Cecile; Ducret, Gabriel; Grandin, Raphael; Pathier, Erwan; Pinel, Virginie

    2012-01-01

    We assemble a processing chain that handles InSAR computation from raw data to time series analysis. A large part of the chain (from raw data to geocoded unwrapped interferograms) is based on ROI PAC modules (Rosen et al., 2004), with original routines rearranged and combined with new routines to process in series and in a common radar geometry all SAR images and interferograms. A new feature of the software is the range-dependent spectral filtering to improve coherence in interferograms with long spatial baselines. Additional components include a module to estimate and remove digital elevation model errors before unwrapping, a module to mitigate the effects of the atmospheric phase delay and remove residual orbit errors, and a module to construct the phase change time series from small baseline interferograms (Berardino et al. 2002). This paper describes the main elements of the processing chain and presents an example of application of the software using a data set from the ENVISAT mission covering the Etna volcano.
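
    The final step mentioned above, constructing the phase-change time series from a network of small baseline interferograms, amounts to a least-squares inversion per pixel. The sketch below is a generic SBAS-style illustration with assumed inputs; it is not the NSBAS code and omits the chain's DEM-error, atmospheric, and orbital corrections.

      import numpy as np

      def sbas_invert(pairs, phases, n_dates):
          """pairs: (i, j) acquisition indices per interferogram (j > i);
          phases: unwrapped interferometric phase per pair; returns phase relative to the first date."""
          A = np.zeros((len(pairs), n_dates - 1))       # unknowns are phase increments between dates
          for row, (i, j) in enumerate(pairs):
              A[row, i:j] = 1.0                         # each interferogram sums increments i..j-1
          increments, *_ = np.linalg.lstsq(A, np.asarray(phases, dtype=float), rcond=None)
          return np.concatenate(([0.0], np.cumsum(increments)))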

  20. Development of deployable structures for large space platform systems. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Greenberg, H. S.

    1983-01-01

    The preponderance of study effort was devoted to the deployable platform systems study, which culminated in the detailed design of a ground test article for future development testing. This design is representative of a prototype square-truss, single-fold building-block design from which deployable platform structures can be constructed. This prototype design was selected through a comprehensive and traceable selection process applied to eight competitive designs. The selection process compared the competitive designs according to seven major selection criteria, i.e., design versatility, cost, thermal stability, meteoroid impact significance, reliability, performance predictability, and orbiter integration suitability. In support of the foregoing, a materials data base and platform systems technology development needs were established. An erectable design of an OTV hangar was selected and recommended for further design development. This design was selected from five study-developed competitive single-fold and double-fold designs, including hard-shell and inflatable designs. Also, two deployable manned module configurations, i.e., a hard-shell and an inflatable design, were each developed to the same requirements as the composite of two Space Station baseline habitat modules.

  1. Application of the HARDMAN methodology to the single channel ground-airborne radio system (SINCGARS)

    NASA Astrophysics Data System (ADS)

    Balcom, J.; Park, J.; Toomer, L.; Feng, T.

    1984-12-01

    The HARDMAN methodology is designed to assess the human resource requirements early in the weapon system acquisition process. In this case, the methodology was applied to the family of radios known as SINCGARS (Single Channel Ground-Airborne Radio System). At the time of the study, SINCGARS was approaching the Full-Scale Development phase, with 2 contractors in competition. Their proposed systems were compared with a composite baseline comparison (reference) system. The systems' manpower, personnel and training requirements were compared. Based on RAM data, the contractors' MPT figures showed a significant reduction from the figures derived for the baseline comparison system. Differences between the two contractors were relatively small. Impact and some tradeoff analyses were hindered by data access problems. Tactical radios, manpower and personnel requirements analysis, impact and tradeoff analysis, human resource sensitivity, training requirements analysis, human resources in LCSMM, and logistics analyses are discussed.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz-Fellenz, Emily S.

    A portion of LANL’s FY15 SPE objectives includes initial ground-based or ground-proximal investigations at the SPE Phase 2 site. The area of interest is the U2ez location in Yucca Flat. This collection serves as a baseline for discrimination of surface features and acquisition of topographic signatures prior to any development or pre-shot activities associated with SPE Phase 2. Our team originally intended to perform our field investigations using previously vetted ground-based (GB) LIDAR methodologies. However, the extended proposed time frame of the GB LIDAR data collection, and associated data processing time and delivery date, were unacceptable. After technical consultation and careful literature research, LANL identified an alternative methodology to achieve our technical objectives and fully support critical model parameterization. Very-low-altitude unmanned aerial systems (UAS) photogrammetry appeared to satisfy our objectives in lieu of GB LIDAR. The SPE Phase 2 baseline collection was used as a test of this UAS photogrammetric methodology.

  3. Low gravity synthesis of polymers with controlled molecular configuration

    NASA Technical Reports Server (NTRS)

    Heimbuch, A. H.; Parker, J. A.; Schindler, A.; Olf, H. G.

    1975-01-01

    Heterogeneous chemical systems have been studied for the synthesis of isotactic polypropylene in order to establish baseline parameters for the reaction process and to develop sensitive and accurate methods of analysis. These parameters and analytical methods may be used to make a comparison between the polypropylene obtained at one g with that of zero g (gravity). Baseline reaction parameters have been established for the slurry (liquid monomer in heptane/solid catalyst) polymerization of propylene to yield high purity, 98% isotactic polypropylene. Kinetic data for the slurry reaction showed that a sufficient quantity of polymer for complete characterization can be produced in a reaction time of 5 min; this time is compatible with that available on a sounding rocket for a zero-g simulation experiment. The preformed (activated) catalyst was found to be more reproducible in its activity than the in situ formed catalyst.

  4. Financial gains and risks in pay-for-performance bonus algorithms.

    PubMed

    Cromwell, Jerry; Drozd, Edward M; Smith, Kevin; Trisolini, Michael

    2007-01-01

    Considerable attention has been given to evidence-based process indicators associated with quality of care, while much less attention has been given to the structure and key parameters of the various pay-for-performance (P4P) bonus and penalty arrangements using such measures. In this article we develop a general model of quality payment arrangements and discuss the advantages and disadvantages of the key parameters. We then conduct simulation analyses of four general P4P payment algorithms by varying seven parameters, including indicator weights, indicator intercorrelation, degree of uncertainty regarding intervention effectiveness, and initial baseline rates. Bonuses averaged over several indicators appear insensitive to weighting, correlation, and the number of indicators. The bonuses are sensitive to disease managers' perceptions of intervention effectiveness, to facing challenging targets, and to the use of actual-to-target quality levels versus rates of improvement over baseline.
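
    As a rough illustration of the kind of simulation described above, the sketch below draws correlated quality-indicator outcomes for a provider, forms a weighted composite, and compares a bonus paid on attainment of targets with one paid on improvement over baseline. All parameter values (weights, correlation, baseline rates, targets, effectiveness, bonus pool) are illustrative assumptions, not the parameters used in the article.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    n_ind = 4                                       # number of quality indicators
    weights = np.full(n_ind, 1.0 / n_ind)           # equal indicator weights (assumption)
    baseline = np.array([0.55, 0.60, 0.40, 0.70])   # baseline compliance rates (assumption)
    target = baseline + 0.10                        # challenging targets: +10 points each
    rho, sigma = 0.3, 0.05                          # indicator intercorrelation and noise (assumptions)
    cov = np.full((n_ind, n_ind), rho * sigma**2) + np.eye(n_ind) * (sigma**2 * (1 - rho))
    bonus_pool = 100_000.0                          # dollars at stake (illustrative)

    def simulate_bonus(effectiveness=0.08, n_draws=10_000):
        """Average payout under two common algorithms: attainment vs. improvement over baseline."""
        # Post-intervention rates: baseline plus an uncertain intervention effect, with correlated noise
        post = rng.multivariate_normal(baseline + effectiveness, cov, size=n_draws).clip(0, 1)
        composite = post @ weights
        attainment = (composite / (target @ weights)).clip(0, 1) * bonus_pool
        improvement = ((composite - baseline @ weights) / 0.10).clip(0, 1) * bonus_pool
        return attainment.mean(), improvement.mean()

    print("attainment-based bonus: $%.0f, improvement-based bonus: $%.0f" % simulate_bonus())
    ```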

  5. Improved silicon nitride for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Yeh, Harry C.; Fang, Ho T.

    1991-01-01

    The results of a four-year program to improve the strength and reliability of injection-molded silicon nitride are summarized. Statistically designed processing experiments were performed to identify and optimize critical processing parameters and compositions. Process improvements were monitored by strength testing at room and elevated temperatures, and by microstructural characterization using optical and scanning electron microscopy and scanning transmission electron microscopy. Processing modifications resulted in a 20 percent strength and 72 percent Weibull slope improvement over the baseline material. Additional sintering aid screening and optimization experiments succeeded in developing a new composition (GN-10) capable of 581.2 MPa at 1399 C. A SiC whisker toughened composite using this material as a matrix achieved a room temperature toughness of 6.9 MPa m(exp .5) by the Chevron notched bar technique. Exploratory experiments were conducted on injection molding of turbocharger rotors.

  6. The effects of Young Adults Eating and Active for Health (YEAH): a theory-based Web-delivered intervention.

    PubMed

    Kattelmann, Kendra K; Bredbenner, Carol Byrd; White, Adrienne A; Greene, Geoffrey W; Hoerr, Sharon L; Kidd, Tandalayo; Colby, Sarah; Horacek, Tanya M; Phillips, Beatrice W; Koenings, Mallory M; Brown, Onikia N; Olfert, Melissa D; Shelnutt, Karla P; Morrell, Jesse Stabile

    2014-01-01

    To assess the effectiveness of a tailored theory-based, Web-delivered intervention (Young Adults Eating and Active for Health) developed using a community-based participatory research process. A 15-month (10-week intensive intervention with a 12-month follow-up) randomized, controlled trial delivered via Internet and e-mail. Thirteen college campuses. A total of 1,639 college students. Twenty-one mini-educational lessons and e-mail messages (called nudges) developed with the non-diet approach and focusing on eating behavior, physical activity, stress management, and healthy weight management. Nudges were short, frequent, entertaining, and stage-tailored to each behavior, and reinforced lesson content. All participants were assessed at baseline, postintervention (3 months from baseline), and follow-up (15 months from baseline) for primary outcomes of weight, body mass index (BMI), fruit and vegetable intake (FVI), physical activity (PA), and perceived stress; and secondary outcomes of waist circumference, percent dietary fat, energy from sugar-sweetened beverages, servings of whole grains, self-instruction and regulation for mealtime behavior, hours of sleep, and stage of readiness for change for consuming 5 cups of FVI, completing 150 minutes of PA/wk, and managing stress on most days of the week. Demographics were collected at baseline. Chi-square analysis and mixed-models repeated measures analysis were performed to determine differences between experimental and control outcomes. There were no differences between experimental and control participants in BMI, weight, and waist circumference. There were small improvements in FVI (P = .001), vigorous PA in females (P = .05), fat intake (P = .002), self-instruction (P = .001), and regulation (P = .004) for mealtime behavior, and hours of sleep (P = .05) at postintervention, but improvements were not maintained at follow-up. At postintervention, a greater proportion of experimental participants were in the action/maintenance stages for FVI (P = .019) and PA (P = .002) than controls. Young Adults Eating and Active for Health is one of the first studies to use the community-based participatory research process of PRECEDE-PROCEED to develop a non-diet approach intervention. Although there were no differences between experimental and control participants in weight change or BMI, the intervention supported positive change in behaviors that may mediate excessive weight gain, such as increasing FVI and more healthful self-regulation mealtime behaviors immediately postintervention. Additional strategies to maintain the behavior changes need to be explored. Copyright © 2014 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  7. Profile of software engineering within the National Aeronautics and Space Administration (NASA)

    NASA Technical Reports Server (NTRS)

    Sinclair, Craig C.; Jeletic, Kellyann F.

    1994-01-01

    This paper presents findings of baselining activities being performed to characterize software practices within the National Aeronautics and Space Administration. It describes how such baseline findings might be used to focus software process improvement activities. Finally, based on the findings to date, it presents specific recommendations in focusing future NASA software process improvement efforts. The findings presented in this paper are based on data gathered and analyzed to date. As such, the quantitative data presented in this paper are preliminary in nature.

  8. Clinical audit of diabetes management can improve the quality of care in a resource-limited primary care setting.

    PubMed

    Govender, Indira; Ehrlich, Rodney; Van Vuuren, Unita; De Vries, Elma; Namane, Mosedi; De Sa, Angela; Murie, Katy; Schlemmer, Arina; Govender, Strini; Isaacs, Abdul; Martell, Rob

    2012-12-01

    To determine whether clinical audit improved the performance of diabetic clinical processes in the health district in which it was implemented. Patient folders were systematically sampled annually for review. Primary health-care facilities in the Metro health district of the Western Cape Province in South Africa. Health-care workers involved in diabetes management. Clinical audit and feedback. The Skillings-Mack test was applied to median values of pooled audit results for nine diabetic clinical processes to measure whether there were statistically significant differences between annual audits performed in 2005, 2007, 2008 and 2009. Descriptive statistics were used to illustrate the order of values per process. A total of 40 community health centres participated in the baseline audit of 2005, decreasing to 30 in 2009. Except for two routine processes, baseline medians for six out of nine processes were below 50%. Pooled audit results showed statistically significant improvements in seven out of nine clinical processes. The findings indicate an association between the application of clinical audit and quality improvement in resource-limited settings. Co-interventions introduced after the baseline audit are likely to have contributed to improved outcomes. In addition, support from the relevant government health programmes and commitment of managers and frontline staff contributed to the audit's success.

  9. Automatic Coregistration and orthorectification (ACRO) and subsequent mosaicing of NASA high-resolution imagery over the Mars MC11 quadrangle, using HRSC as a baseline

    NASA Astrophysics Data System (ADS)

    Sidiropoulos, Panagiotis; Muller, Jan-Peter; Watson, Gillian; Michael, Gregory; Walter, Sebastian

    2018-02-01

    This work presents the coregistered, orthorectified and mosaiced high-resolution products of the MC11 quadrangle of Mars, which have been processed using novel, fully automatic techniques. We discuss the development of a pipeline that achieves fully automatic and parameter-independent geometric alignment of high-resolution planetary images, starting from raw input images in NASA PDS format and following all required steps to produce a coregistered geotiff image, a corresponding footprint and useful metadata. Additionally, we describe the development of a radiometric calibration technique that post-processes coregistered images to make them radiometrically consistent. Finally, we present a batch-mode application of the developed techniques over the MC11 quadrangle to validate their potential, as well as to generate end products, which are released to the planetary science community, thus assisting in the analysis of static and dynamic features on Mars. This case study is a step towards the full automation of signal processing tasks that are essential to increase the usability of planetary data but currently require the extensive use of human resources.

  10. Development of Integrated Programs for Aerospace-Vehicle Design (IPAD)

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Calvery, A. L.; Davis, D. A.; Dickmann, L.; Folger, D. H.; Jochem, E. N.; Kitto, C. M.; Vonlimbach, G.

    1977-01-01

    Integrated Programs for Aerospace Vehicle Design (IPAD) system design requirements are given. The information is based on the IPAD User Requirements Document (D6-IPAD-70013-D) and the Integrated Information Processing Requirements Document (D6-IPAD-70012-D). General information about IPAD and a list of the system design requirements that are to be satisfied by the IPAD system are given. The system design requirements definition is to be considered as a baseline definition of the IPAD system design requirements.

  11. Validating the Novel Method of Measuring Cortisol Levels in Cetacean Skin by use of an ACTH Challenge in Bottlenose Dolphins

    DTIC Science & Technology

    2015-09-30

    e.g. blubber biopsies). This process has shown to significantly raise both cortisol and aldosterone above baseline conditions and thus equals an...opening up a new avenue of research in physiological response studies following exposure to stressors. The current study will provide the validation... Physiology , 3: doi:10.1093/conphys/cov016 PUBLICATIONS Bechshøft TØ, Wright AJ, Teilmann J, Dietz R, Hansen M, Weisser JJ & Styrishave B. Developing a

  12. Site planning and integration fiscal year 1999 multi-year work plan (MYWP) update for WBS 1.8.2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SCHULTZ, E.A.

    The primary mission of the Site Planning and Integration (SP and I) project is to assist Fluor Daniel Project Direction in ensuring that all work performed under the Project Hanford Management Contract (PHMC) is adequately planned, executed, and controlled, and that performance is measured and reported in an integrated fashion. Furthermore, SP and I is responsible for the development, implementation, and management of systems and processes that integrate technical, schedule, and cost baselines for PHMC work.

  13. Association of Baseline Anterior Segment Parameters With the Development of Incident Gonioscopic Angle Closure.

    PubMed

    Nongpiur, Monisha E; Aboobakar, Inas F; Baskaran, Mani; Narayanaswamy, Arun; Sakata, Lisandro M; Wu, Renyi; Atalay, Eray; Friedman, David S; Aung, Tin

    2017-03-01

    Baseline anterior segment imaging parameters associated with incident gonioscopic angle closure, to our knowledge, are unknown. To identify baseline quantitative anterior segment optical coherence tomography parameters associated with the development of incident gonioscopic angle closure after 4 years among participants with gonioscopically open angles at baseline. Three hundred forty-two participants aged 50 years or older were recruited to participate in this prospective, community-based observational study. Participants underwent gonioscopy and anterior segment optical coherence tomography imaging at baseline and after 4 years. Custom image analysis software was used to quantify anterior chamber parameters from anterior segment optical coherence tomography images. Baseline anterior segment optical coherence tomography measurements among participants with gonioscopically open vs closed angles at follow-up. Of the 342 participants, 187 (55%) were women and 297 (87%) were Chinese. The response rate was 62.4%. Forty-nine participants (14.3%) developed gonioscopic angle closure after 4 years. The mean (SD) age at baseline of the 49 participants was 62.9 (8.0) years, 15 (30.6%) were men, and 43 (87.8%) were Chinese. These participants had a smaller baseline angle opening distance at 750 µm (AOD750) (0.15 mm; 95% CI, 0.12-0.18), trabecular iris surface area at 750 µm (0.07 mm2; 95% CI, 0.05-0.08), anterior chamber area (3.0 mm2; 95% CI, 2.27-3.74), and anterior chamber volume (24.32 mm3; 95% CI, 18.20-30.44) (all P < .001). Baseline iris curvature (-0.08; 95% CI, -0.12 to -0.04) and lens vault (LV) measurements (-0.29 mm; 95% CI, -0.37 to -0.21) were larger among these participants (all P < .001). A model consisting of the LV and AOD750 measurements explained 38% of the variance in gonioscopic angle closure occurring at 4 years, with LV accounting for 28% of this variance. For every 0.1 mm increase in LV and 0.1 mm decrease in AOD750, the odds of developing gonioscopic angle closure were 1.29 (95% CI, 1.07-1.57) and 3.27 (95% CI, 1.87-5.69), respectively. In terms of per SD change in LV and AOD750, this translates to odds ratios of 2.14 (95% CI, 1.22-3.77) and 5.53 (95% CI, 2.48-12.34), respectively. A baseline LV cut-off value of >0.56 mm had 64.6% sensitivity and 84.0% specificity for identifying participants who developed angle closure. These findings suggest that smaller AOD750 and larger LV measurements are associated with the development of incident gonioscopic angle closure after 4 years among participants with gonioscopically open angles at baseline.
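
    As a numerical aside, the per-0.1-mm and per-SD odds ratios reported above can be related through the underlying logistic regression coefficients. The sketch below back-calculates coefficients from the per-0.1-mm odds ratios and applies assumed standard deviations (roughly 0.30 mm for LV and 0.14 mm for AOD750, chosen for illustration rather than taken from the study) to recover per-SD odds ratios of a similar magnitude.

    ```python
    import math

    def odds_ratio(beta_per_unit, delta):
        """Odds ratio for a change of `delta` in a predictor with logistic coefficient beta."""
        return math.exp(beta_per_unit * delta)

    # Coefficients back-calculated from the reported per-0.1-mm odds ratios (illustrative).
    beta_lv = math.log(1.29) / 0.1        # lens vault: OR 1.29 per +0.1 mm
    beta_aod = -math.log(3.27) / 0.1      # AOD750: OR 3.27 per -0.1 mm

    sd_lv, sd_aod = 0.30, 0.14            # assumed standard deviations (mm), illustration only

    print(f"LV:     OR per +0.1 mm = {odds_ratio(beta_lv, 0.1):.2f}, "
          f"OR per +1 SD = {odds_ratio(beta_lv, sd_lv):.2f}")
    print(f"AOD750: OR per -0.1 mm = {odds_ratio(beta_aod, -0.1):.2f}, "
          f"OR per -1 SD = {odds_ratio(beta_aod, -sd_aod):.2f}")
    ```

    With these assumed SDs the per-SD odds ratios come out close to the values quoted above, which serves as a rough internal consistency check rather than a reproduction of the study analysis.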

  14. Expanding lean thinking to the product and process design and development within the framework of sustainability

    NASA Astrophysics Data System (ADS)

    Sorli, M.; Sopelana, A.; Salgado, M.; Pelaez, G.; Ares, E.

    2012-04-01

    Companies require tools to change towards a new way of developing and producing innovative products that are manufactured considering the economic, social and environmental impact along the product life cycle. By translating Lean principles into Product Development (PD) from the design stage onward and along the entire product life cycle, the work aims to address both sustainability and environmental issues. The drivers of a sustainable culture within lean PD have been identified and a baseline for future research on the development of appropriate tools and techniques has been provided. This research provides industry with a framework that balances environmental and sustainability factors with lean principles, to be considered and incorporated from the beginning of product design and development and covering the entire product life cycle.

  15. Problem gambling symptomatology and alcohol misuse among adolescents: A parallel-process latent growth curve model.

    PubMed

    Mutti-Packer, Seema; Hodgins, David C; El-Guebaly, Nady; Casey, David M; Currie, Shawn R; Williams, Robert J; Smith, Garry J; Schopflocher, Don P

    2017-06-01

    The objective of the current study was to examine the possible temporal associations between alcohol misuse and problem gambling symptomatology from adolescence through to young adulthood. Parallel-process latent growth curve modeling was used to examine the trajectories of alcohol misuse and symptoms of problem gambling over time. Data were from a sample of adolescents recruited for the Leisure, Lifestyle, and Lifecycle Project in Alberta, Canada (n = 436), which included 4 assessments over 5 years. There was an average decline in problem gambling symptoms followed by an accelerating upward trend as the sample reached the legal age to gamble. There was significant variation in the rate of change in problem gambling symptoms over time; not all respondents followed the same trajectory. There was an average increase in alcohol misuse over time, with significant variability in baseline levels of use and the rate of change over time. The unconditional parallel process model indicated that higher baseline levels of alcohol misuse were associated with higher baseline levels of problem gambling symptoms. In addition, higher baseline levels of alcohol misuse were associated with steeper declines in problem gambling symptoms over time. However, these between-process correlations did not retain significance when covariates were added to the model, indicating that one behavior was not a risk factor for the other. The lack of mutual influence between the problem gambling symptomatology and alcohol misuse processes suggests that there are common risk factors underlying these two behaviors, supporting the notion of a syndrome model of addiction. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  16. Developing Teaching Expertise in Dental Education

    ERIC Educational Resources Information Center

    Lyon, Lucinda J.

    2009-01-01

    This exploratory study was designed to develop a baseline model of expertise in dental education utilizing the Dreyfus and Dreyfus continuum of skill acquisition. The goal was the development of a baseline model of expertise, which will contribute to the body of knowledge about dental faculty skill acquisition and may enable dental schools to…

  17. Intrafractional baseline drift during free breathing breast cancer radiation therapy.

    PubMed

    Jensen, Christer Andre; Acosta Roa, Ana María; Lund, Jo-Åsmund; Frengen, Jomar

    2017-06-01

    Intrafraction motion in breast cancer radiation therapy (BCRT) has not yet been thoroughly described in the literature. It has been observed that baseline drift occurs as part of the intrafraction motion. This study aims to measure baseline drift and its incidence in free-breathing BCRT patients using an in-house developed laser system for tracking the position of the sternum. Baseline drift was monitored in 20 right-sided breast cancer patients receiving free-breathing 3D-conformal RT, using an in-house developed laser system that measures one-dimensional distance in the AP direction. A total of 357 patient respiratory traces from treatment sessions were logged and analysed. Baseline drift was compared to patient positioning error measured from in-field portal imaging. The mean overall baseline drift at the end of treatment sessions was -1.3 mm for the patient population. Relatively small baseline drift was observed during the first fraction; however, it was clearly detected already at the second fraction. Over 90% of the baseline drift occurs during the first 3 min of each treatment session. The baseline drift rate for the population was -0.5 ± 0.2 mm/min in the posterior direction during the first minute after localization. Only 4% of the treatment sessions had a 5 mm or larger baseline drift at 5 min, all towards the posterior direction. Mean baseline drift in the posterior direction in free-breathing BCRT was observed in 18 of 20 patients over all treatment sessions. This study shows that there is a substantial baseline drift in free-breathing BCRT patients. No clear baseline drift was observed during the first treatment session; however, baseline drift was markedly present at the rest of the sessions. Intrafraction motion due to baseline drift should be accounted for in margin calculations.
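
    One way to extract the baseline drift reported here from a raw AP position trace is to average out the breathing oscillation with a window spanning a few respiratory cycles and read off the averaged position at fixed times after localization. The sketch below applies this to a synthetic trace; the sampling rate, breathing period and drift magnitude are assumptions for illustration, not the measured patient data.

    ```python
    import numpy as np

    fs = 20.0                                # sampling rate (Hz), assumed
    t = np.arange(0.0, 300.0, 1.0 / fs)      # one 5-minute treatment session
    # Synthetic AP trace: breathing (4 s period, 2 mm amplitude) plus a posterior
    # drift settling near -1.5 mm (illustrative numbers, not patient data).
    trace = 2.0 * np.sin(2 * np.pi * t / 4.0) - 1.5 * (1.0 - np.exp(-t / 60.0))

    def baseline_at(trace, t, t0, fs, window_s=12.0):
        """Mean position in a window centred on t0: averages out ~3 breathing cycles."""
        i = np.searchsorted(t, t0)
        half = int(window_s * fs / 2)
        lo, hi = max(0, i - half), min(len(trace), i + half)
        return trace[lo:hi].mean()

    b0 = baseline_at(trace, t, 6.0, fs)                    # position just after localization
    rate_1min = (baseline_at(trace, t, 60.0, fs) - b0) / ((60.0 - 6.0) / 60.0)
    drift_5min = baseline_at(trace, t, 294.0, fs) - b0
    print(f"drift rate in first minute: {rate_1min:.2f} mm/min")
    print(f"baseline drift at 5 min:    {drift_5min:.2f} mm")
    ```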

  18. Treatment of biodiversity issues in impact assessment of electricity power transmission lines: A Finnish case review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soederman, Tarja

    The Environmental Impact Assessment (EIA) process concerning the route of a 400 kV power transmission line between Loviisa and Hikiae in southern Finland was reviewed in order to assess how biodiversity issues are treated and to provide suggestions on how to improve the effectiveness of treatment of biodiversity issues in impact assessment of linear development projects. The review covered the whole assessment process, including interviews of stakeholders, participation in the interest group meetings and review of all documents from the project. The baseline studies and assessment of direct impacts in the case study were detailed but the documentation, both the assessment programme and the assessment report, only gave a partial picture of the assessment process. All existing information, baseline survey and assessment methods should be addressed in the scoping phase in order to promote interaction between all stakeholders. In contrast to the assessment of the direct effects, which first emphasized impacts on the nationally important and protected flying squirrel but later expanded to deal with the assessment of impacts on ecologically important sites, the indirect and cumulative impacts of the power line were poorly addressed. The public was given the opportunity to become involved in the EIA process. However, they were more concerned with impacts on their properties and less so on biodiversity and species protection issues. This suggests that the public needs to become more informed about locally important features of biodiversity.

  19. Crossfit training changes brain-derived neurotrophic factor and irisin levels at rest, after wingate and progressive tests, and improves aerobic capacity and body composition of young physically active men and women.

    PubMed

    Murawska-Cialowicz, E; Wojna, J; Zuwala-Jagiello, J

    2015-12-01

    Brain-derived neurotrophic factor (BDNF) is a protein that stimulates neurogenesis, supports the survival of neurons and microglia, stimulates neuroplasticity, and takes part in the differentiation of cells developing in the hippocampus. BDNF is also released from skeletal muscles during exercise and can facilitate cross-talk between the nervous and muscular systems. Irisin, the exercise hormone, is also released from skeletal muscles and is involved in oxidation processes in the organism. This is a vital issue from the point of view of the prevention and treatment, through exercise, of age-related diseases (e.g. senile dementia), obesity, and type 2 diabetes. The aim of the study was to assess the changes in BDNF and irisin levels in young people after a 3-month CrossFit training program. At baseline and after the training, levels of BDNF and irisin were assayed before and after Wingate and progressive tests. Physical performance, body mass and composition, and muscle circumferences were also measured. The following were noted: an improvement in aerobic capacity, an increase in VO2max, a reduction in adipose tissue percentage in women, and an increase in LBM in all subjects. After CrossFit training the resting BDNF level increased significantly in all subjects, while the resting level of irisin decreased in women, without changes in men. The resting level of BDNF at baseline was higher in men than in women. At baseline we observed an increased level of BDNF in women after the Wingate and progressive tests, but in men only after the progressive test. After 3 months of CrossFit training the level of BDNF increased in all subjects, and was also higher in men than in women. In women, we did not observe significant differences after either test in comparison to rest. After the training, BDNF in men was lower after the Wingate and progressive tests than at rest. At baseline, the irisin level decreased in women after the Wingate and progressive tests; no changes were observed in men after either test. There were no differences in irisin levels after the Wingate and progressive tests between baseline and 3 months of training. The beneficial influence of CrossFit training on the subjects' body composition, anaerobic capacity and cardiovascular fitness, as well as the increase in BDNF, suggests that this type of training could have high application value, especially in therapeutic processes aimed at improving a patient's wellbeing.

  20. [Baseline correction of spectrum for the inversion of chlorophyll-a concentration in the turbidity water].

    PubMed

    Wei, Yu-Chun; Wang, Guo-Xiang; Cheng, Chun-Mei; Zhang, Jing; Sun, Xiao-Peng

    2012-09-01

    Suspended particulate material is the main factor affecting remote sensing inversion of chlorophyll-a concentration (Chla) in turbid water. Based on the optical properties of suspended material in water, the present paper proposes a linear baseline correction method to weaken the suspended particle contribution in the spectrum measured above the turbid water surface. The linear baseline is defined as the line connecting the reflectance at 450 and 750 nm, and baseline correction consists of subtracting this baseline from the spectral reflectance. Analysis of in situ field data from Meiliangwan, Taihu Lake, collected in April 2011 and March 2010 shows that linear baseline correction of the spectrum can improve the inversion precision of Chla and produce better model diagnostics. For the March 2010 data, the RMSE of the band ratio model built from the original spectrum is 4.11 mg x m(-3), while that built from the baseline-corrected spectrum is 3.58 mg x m(-3). Meanwhile, the residual distribution and homoscedasticity of the model built from the baseline-corrected spectrum are clearly improved. The model RMSE for April 2011 shows a similar result. The authors suggest using linear baseline correction as the spectrum pre-processing method to improve Chla inversion accuracy in turbid water without algal blooms.
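
    The correction described above amounts to subtracting, from each reflectance spectrum, the straight line joining its values at 450 nm and 750 nm before any band-ratio model is fitted. A minimal sketch is shown below; the synthetic spectrum and the band pair used in the example ratio are placeholders, not the bands or data used in the paper.

    ```python
    import numpy as np

    def linear_baseline_correction(wavelengths, reflectance, lo=450.0, hi=750.0):
        """Subtract the straight line joining the reflectance values at `lo` and `hi` nm."""
        r_lo = np.interp(lo, wavelengths, reflectance)
        r_hi = np.interp(hi, wavelengths, reflectance)
        baseline = r_lo + (r_hi - r_lo) * (wavelengths - lo) / (hi - lo)
        return reflectance - baseline

    # Made-up spectrum: a broad suspended-sediment background plus a weak
    # chlorophyll-a reflectance peak near 700 nm (illustrative only).
    wl = np.arange(400.0, 901.0, 1.0)
    spectrum = 0.02 + 1e-5 * (wl - 400.0) + 0.004 * np.exp(-((wl - 700.0) / 15.0) ** 2)

    corrected = linear_baseline_correction(wl, spectrum)

    # Example band-ratio predictor applied after correction (band choice is illustrative).
    ratio = np.interp(700.0, wl, corrected) / max(np.interp(675.0, wl, corrected), 1e-6)
    print(f"band ratio R(700)/R(675) after baseline correction: {ratio:.2f}")
    ```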

  1. Quantifying Process-Based Mitigation Strategies in Historical Context: Separating Multiple Cumulative Effects on River Meander Migration

    PubMed Central

    Fremier, Alexander K.; Girvetz, Evan H.; Greco, Steven E.; Larsen, Eric W.

    2014-01-01

    Environmental legislation in the US (i.e. NEPA) requires baseline conditions to be defined on current rather than historical ecosystem conditions. For ecosystems with long histories of multiple environmental impacts, this baseline method can subsequently lead to a significantly altered environment; this has been termed a ‘sliding baseline’. In river systems, cumulative effects caused by flow regulation, channel revetment and riparian vegetation removal significantly impact floodplain ecosystems by altering channel dynamics and precluding subsequent ecosystem processes, such as primary succession. To quantify these impacts on floodplain development processes, we used a model of river channel meander migration to illustrate the degree to which flow regulation and riprap impact migration rates, independently and synergistically, on the Sacramento River in California, USA. From pre-dam conditions, the cumulative effect of flow regulation alone on channel migration is a reduction of 38%, and of 42–44% with four proposed water diversion project scenarios. In terms of depositional area, the proposed water project would reduce channel migration by 51–71 ha in 130 years without current riprap in place, and by 17–25 ha with riprap. Our results illustrate the utility of a modeling approach for quantifying cumulative impacts. Model-based quantification of environmental impacts allows scientists to separate cumulative and synergistic effects to analytically define mitigation measures. Additionally, by selecting an ecosystem process that is affected by multiple impacts, it is possible to consider process-based mitigation scenarios, such as the removal of riprap, to allow meander migration, create new floodplains and allow for riparian vegetation recruitment. PMID:24964145

  2. Planetary Protection Knowledge Gaps for Human Extraterrestrial Missions Workshop Booklet - 2015

    NASA Technical Reports Server (NTRS)

    Fonda, Mark L.

    2015-01-01

    Although NASA's preparations for the Apollo lunar missions allowed only a limited time to consider issues associated with the protection of the Moon from biological contamination and the quarantine of the astronauts returning to Earth, many valuable lessons (both positive and negative) were learned in the process. As such, those efforts represent the baseline of planetary protection preparations for sending humans to Mars. Neither the post-Apollo experience nor the Shuttle and other follow-on missions of either the US or Russian human spaceflight programs could add many additional insights to that baseline. Current mission designers have had the intervening four decades for their consideration, and in that time much has been learned about human-associated microbes, about Mars, and about humans in space that has helped prepare us for a broad spectrum of considerations regarding potential biological contamination in human Mars missions and how to control it. This paper will review the approaches used in getting this far, and highlight some implications of this history for the future development of planetary protection provisions for human missions to Mars. The roles of NASA's and ESA's planetary protection offices, and the aegis of COSPAR, have been particularly important in the ongoing process.

  3. Synchronized LES for acoustic near-field analysis of a supersonic jet

    NASA Astrophysics Data System (ADS)

    S, Unnikrishnan; Gaitonde, Datta; The Ohio State University Team

    2014-11-01

    We develop a novel method using simultaneous, synchronized Large Eddy Simulations (LES) to examine the manner in which the plume of a supersonic jet generates the near acoustic field. Starting from a statistically stationary state, at each time-step, the first LES (Baseline) is used to obtain native perturbations, which are then localized in space, scaled to small values and injected into the second LES (Twin). At any subsequent time, the difference between the two simulations can be processed to discern how disturbances from any particular zone in the jet are modulated and filtered by the non-linear core to form the combined hydrodynamic and acoustic near field and the fully acoustic farfield. Unlike inverse techniques that use correlations between jet turbulence and far-field signals to infer causality, the current forward analysis effectively tags and tracks native perturbations as they are processed by the jet. Results are presented for a Mach 1.3 cold jet. Statistical analysis of the baseline and perturbation boost provides insight into different mechanisms of disturbance propagation, amplification, directivity, generation of intermittent wave-packet like events and the direct and indirect effect of different parts of the jet on the acoustic field. Office of Naval Research.

  4. Correlation Between Corneal Button Size and Intraocular Pressure During Femtosecond Laser-Assisted Keratoplasty.

    PubMed

    Choi, Mihyun; Lee, Yong Eun; Whang, Woong-Joo; Yoo, Young-Sik; Na, Kyung-Sun; Joo, Choun-Ki

    2016-03-01

    To evaluate changes in intraocular pressure (IOP) in recipient and donor eyes during femtosecond laser-assisted keratoplasty (FLAK) and to assess for differences in the diameter of trephinated corneal buttons according to changes in pressure. Twenty porcine whole eyes (recipient model) and 20 porcine-corneoscleral rims (donor model) were prepared, and anterior chamber pressures were measured using a fiberoptic sensing device (Opsens, Quebec, Canada) during the femtosecond laser corneal cutting process. To determine the diameter of corneal buttons, 10 porcine whole eyes (recipient model) and 12 corneoscleral rims (donor model) of each baseline IOP were cut with the femtosecond laser programmed to the following pattern: "vertical side cut"; 1200 μm (depth), 8 mm (diameter). Digital photographs were obtained using microscopy and subsequently analyzed. The IOP (mean ± SD) for the recipient model was 10.2 (±0.9) mm Hg at baseline and ranged from 96.6 (±4.5) to ∼138.4 (±3.8) mm Hg during the corneal cutting process. This shows that the maximum IOP during FLAK increased 13.5 times compared with baseline. In the donor model, the mean pressure elevation from baseline artificial anterior chamber (AAC) pressure to corneal cutting was 15.8 (±5.4) mm Hg. This showed a positive correlation with baseline IOP [correlation coefficient (CC) = 0.827, P = 0.006]. As the baseline IOP in the recipient eye increased, trephinated corneal button size was reduced by up to 3.9% in diameter (CC = -0.945, P = 0.015). In addition, in donor eyes, the diameter was decreased by up to 11.7% as the baseline AAC pressure increased (CC = -0.934, P = 0.006). During the FLAK procedure, the IOP increases in both recipient and donor eyes. The diameter of the trephinated donor and recipient corneal buttons was decreased as the initial baseline IOP increased. Ophthalmic surgeons can determine the AAC pressure based on the baseline IOP in the recipient patient.

  5. Evolutionary process development towards next generation crystalline silicon solar cells : a semiconductor process toolbox application

    NASA Astrophysics Data System (ADS)

    John, J.; Prajapati, V.; Vermang, B.; Lorenz, A.; Allebe, C.; Rothschild, A.; Tous, L.; Uruena, A.; Baert, K.; Poortmans, J.

    2012-08-01

    Bulk crystalline silicon solar cells covered more than 85% of the world's rooftop module installations in 2010. With a growth rate of over 30% in the last 10 years, this technology remains the workhorse of the solar cell industry. The full aluminum back-side field (Al BSF) technology was developed in the 1990s and has followed a production learning curve with a constant average reduction of 20% in module price. The main reason for the decrease of module prices with increasing production capacity is the effect of up-scaling industrial production. For a further decrease of the price per watt-peak, silicon consumption has to be reduced and efficiency has to be improved. In this paper we describe a successive, efficiency-improving process development starting from the existing full Al BSF cell concept. The proposed evolutionary development includes all parts of the solar cell process: optical enhancement (texturing, polishing, anti-reflection coating), junction formation and contacting. Novel processes are benchmarked on industrial-like baseline flows using high-efficiency cell concepts such as i-PERC (Passivated Emitter and Rear Cell). While the full Al BSF crystalline silicon solar cell technology provides efficiencies of up to 18% (on cz-Si) in production, we are achieving up to 19.4% conversion efficiency for industrially fabricated, large-area solar cells with copper-based front side metallization and a local Al BSF, applying the semiconductor toolbox.

  6. Utilizing the "Plan, Do, Study, Act" Framework to Explore the Process of Curricular Assessment and Redesign in a Physical Therapy Education Program in Suriname.

    PubMed

    Audette, Jennifer Gail; Baldew, Se-Sergio; Chang, Tony C M S; de Vries, Jessica; Ho A Tham, Nancy; Janssen, Johanna; Vyt, Andre

    2017-01-01

    To describe how a multinational team worked together to transition a physical therapy (PT) educational program in Paramaribo, Suriname, from a Bachelor level to a Master of Science in Physical Therapy (MSPT) level. The team was made up of PT faculty from Anton De Kom Universiteit van Suriname (AdeKUS), the Flemish Interuniversity Council University Development Cooperation (VLIR-UOS) leadership, and Health Volunteers Overseas volunteers. In this case study, the process for curricular assessment, redesign, and upgrade is described retrospectively using a Plan, Do, Study, Act (PDSA) framework. PT educational programs in developing countries are eager to upgrade to meet international expectations and to better meet community health-care needs. An ongoing process which included baseline assessment of all aspects of the existing bachelor's program in PT, development of a plan for a MSPT, implementation of the master's program, and evaluation following implementation is described. Curricular assessment and upgrade in resource-limited countries require the implementation of process-oriented methods. The PDSA process is a useful tool to explore curricular development. The international collaboration described in this paper provides an example of the diligence, consistency, and dedication required to see a project through and achieve success while providing adequate support to the host site. This project might provide valuable insights for those involved in curricular redesign in similar settings.

  7. Neurocognitive predictors of financial capacity in traumatic brain injury.

    PubMed

    Martin, Roy C; Triebel, Kristen; Dreer, Laura E; Novack, Thomas A; Turner, Crystal; Marson, Daniel C

    2012-01-01

    To develop cognitive models of financial capacity (FC) in patients with traumatic brain injury (TBI). Longitudinal design. Inpatient brain injury rehabilitation unit. Twenty healthy controls and 24 adults with moderate-to-severe TBI were assessed at baseline (30 days postinjury) and 6 months postinjury. The FC instrument (FCI) and a neuropsychological test battery. Univariate correlation and multiple regression procedures were employed to develop cognitive models of FCI performance in the TBI group at baseline and at 6-month follow-up. Three cognitive predictor models of FC were developed. At baseline, measures of mental arithmetic/working memory and immediate verbal memory predicted baseline FCI performance (R = 0.72). At 6-month follow-up, measures of executive function and mental arithmetic/working memory predicted 6-month FCI performance (R = 0.79), and a third model found that these 2 measures at baseline predicted 6-month FCI performance (R = 0.71). Multiple cognitive functions are associated with initial impairment and partial recovery of FC in moderate-to-severe TBI patients. In particular, arithmetic, working memory, and executive function skills appear critical to recovery of FC in TBI. The study results represent an initial step toward developing a neurocognitive model of FC in patients with TBI.

  8. The power of event-driven analytics in Large Scale Data Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sebastiao, Nuno; Marques, Paulo

    2011-02-24

    FeedZai is a software company specialized in creating high-throughput, low-latency data processing solutions. FeedZai develops a product called "FeedZai Pulse" for continuous event-driven analytics that makes application development easier for end users. It automatically calculates key performance indicators and baselines, showing how current performance differs from previous history, creating timely business intelligence updated to the second. The tool does predictive analytics and trend analysis, displaying data on real-time web-based graphics. In 2010 FeedZai won the European EBN Smart Entrepreneurship Competition, in the Digital Models category, being considered one of the "top-20 smart companies in Europe". The main objective of this seminar/workshop is to explore the topic of large-scale data processing using Complex Event Processing and, in particular, the possible uses of Pulse in the scope of the data processing needs of CERN. Pulse is available as open source and can be licensed both for non-commercial and commercial applications. FeedZai is interested in exploring possible synergies with CERN in high-volume, low-latency data processing applications. The seminar will be structured in two sessions, the first one aimed at presenting the general scope of FeedZai's activities, and the second focused on Pulse itself: 10:00-11:00 FeedZai and Large Scale Data Processing; Introduction to FeedZai; FeedZai Pulse and Complex Event Processing; Demonstration; Use-Cases and Applications; Conclusion and Q&A. 11:00-11:15 Coffee break. 11:15-12:30 FeedZai Pulse Under the Hood; A First FeedZai Pulse Application; PulseQL overview; Defining KPIs and Baselines; Conclusion and Q&A. About the speakers: Nuno Sebastião is the CEO of FeedZai. Having worked for many years for the European Space Agency (ESA), he was responsible for the overall design and development of the agency's Satellite Simulation Infrastructure. Having left ESA to found FeedZai, Nuno is currently responsible for the whole operations of the company. Nuno holds an M.Eng. in Informatics Engineering from the University of Coimbra, and an MBA from the London Business School. Paulo Marques is the CTO of FeedZai, being responsible for product development. Paulo is an Assistant Professor at the University of Coimbra, in the area of Distributed Data Processing, and an Adjunct Associate Professor at Carnegie Mellon, in the US. In the past Paulo led a large number of projects for institutions such as ESA, Microsoft Research, SciSys, and Siemens, and is now fully dedicated to FeedZai. Paulo holds a Ph.D. in Distributed Systems from the University of Coimbra.

  9. The power of event-driven analytics in Large Scale Data Processing

    ScienceCinema

    None

    2017-12-09

    FeedZai is a software company specialized in creating high-throughput, low-latency data processing solutions. FeedZai develops a product called "FeedZai Pulse" for continuous event-driven analytics that makes application development easier for end users. It automatically calculates key performance indicators and baselines, showing how current performance differs from previous history, creating timely business intelligence updated to the second. The tool does predictive analytics and trend analysis, displaying data on real-time web-based graphics. In 2010 FeedZai won the European EBN Smart Entrepreneurship Competition, in the Digital Models category, being considered one of the "top-20 smart companies in Europe". The main objective of this seminar/workshop is to explore the topic of large-scale data processing using Complex Event Processing and, in particular, the possible uses of Pulse in the scope of the data processing needs of CERN. Pulse is available as open source and can be licensed both for non-commercial and commercial applications. FeedZai is interested in exploring possible synergies with CERN in high-volume, low-latency data processing applications. The seminar will be structured in two sessions, the first one aimed at presenting the general scope of FeedZai's activities, and the second focused on Pulse itself: 10:00-11:00 FeedZai and Large Scale Data Processing; Introduction to FeedZai; FeedZai Pulse and Complex Event Processing; Demonstration; Use-Cases and Applications; Conclusion and Q&A. 11:00-11:15 Coffee break. 11:15-12:30 FeedZai Pulse Under the Hood; A First FeedZai Pulse Application; PulseQL overview; Defining KPIs and Baselines; Conclusion and Q&A. About the speakers: Nuno Sebastião is the CEO of FeedZai. Having worked for many years for the European Space Agency (ESA), he was responsible for the overall design and development of the agency's Satellite Simulation Infrastructure. Having left ESA to found FeedZai, Nuno is currently responsible for the whole operations of the company. Nuno holds an M.Eng. in Informatics Engineering from the University of Coimbra, and an MBA from the London Business School. Paulo Marques is the CTO of FeedZai, being responsible for product development. Paulo is an Assistant Professor at the University of Coimbra, in the area of Distributed Data Processing, and an Adjunct Associate Professor at Carnegie Mellon, in the US. In the past Paulo led a large number of projects for institutions such as ESA, Microsoft Research, SciSys, and Siemens, and is now fully dedicated to FeedZai. Paulo holds a Ph.D. in Distributed Systems from the University of Coimbra.

  10. Configuration Management Plan for the Tank Farm Contractor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    WEIR, W.R.

    The Configuration Management Plan for the Tank Farm Contractor describes the configuration management the contractor uses to manage and integrate its technical baseline with the programmatic and functional operations to perform work. The Configuration Management Plan for the Tank Farm Contractor supports the management of the project baseline by providing the mechanisms to identify, document, and control the technical characteristics of the products, processes, and structures, systems, and components (SSC). This plan is one of the tools used to identify and provide controls for the technical baseline of the Tank Farm Contractor (TFC). The configuration management plan is listed in the management process documents for TFC as depicted in Attachment 1, TFC Document Structure. The configuration management plan is an integrated approach for control of technical, schedule, cost, and administrative processes necessary to manage the mission of the TFC. Configuration management encompasses the five functional elements of: (1) configuration management administration, (2) configuration identification, (3) configuration status accounting, (4) change control, and (5) configuration management assessments.

  11. Development of forming and joining technology for TD-NiCr sheet

    NASA Technical Reports Server (NTRS)

    Torgerson, R. T.

    1973-01-01

    Forming and joining techniques and properties data were developed for thin-gage TD-NiCr sheet in the recrystallized and unrecrystallized conditions. Theoretical and actual forming limit data are presented for several gages of each type of material for five forming processes: brake forming, corrugation forming, joggling, dimpling and beading. Recrystallized sheet can best be formed at room temperature, but unrecrystallized sheet requires forming at elevated temperature. Formability is satisfactory with most processes for the longitudinal orientation but poor for the transverse orientation. Dimpling techniques require further development for both material conditions. Data on joining techniques and joint properties are presented for four joining processes: resistance seam welding (solid-state), resistance spot welding (solid-state), resistance spot welding (fusion) and brazing. Resistance seam welded (solid-state) joints with 5t overlap were stronger than the parent material for both material conditions when tested in tensile-shear and stress-rupture. Brazing studies resulted in development of the NASA 18 braze alloy (Ni-16Cr-15Mo-8Al-4Si), which has several properties superior to the baseline TD-6 braze alloy, including lower brazing temperature, reduced reaction with TD-NiCr, and higher stress-rupture properties.

  12. Soft x-ray imager (SXI) onboard the NeXT satellite

    NASA Astrophysics Data System (ADS)

    Tsuru, Takeshi Go; Takagi, Shin-Ichiro; Matsumoto, Hironori; Inui, Tatsuya; Ozawa, Midori; Koyama, Katsuji; Tsunemi, Hiroshi; Hayashida, Kiyoshi; Miyata, Emi; Ozawa, Hideki; Touhiguchi, Masakuni; Matsuura, Daisuke; Dotani, Tadayasu; Ozaki, Masanobu; Murakami, Hiroshi; Kohmura, Takayoshi; Kitamoto, Shunji; Awaki, Hisamitsu

    2006-06-01

    We give an overview and the current status of the development of the Soft X-ray Imager (SXI) onboard the NeXT satellite. SXI is an X-ray CCD camera placed at the focal plane of the Soft X-ray Telescopes for Imaging (SXT-I) onboard NeXT. The pixel size and the format of the CCD are 24 x 24 μm (IA) and 2048 x 2048 x 2 (IA+FS). Currently, we have been developing two types of CCD as candidates for SXI in parallel. One is a front-illuminated CCD with a moderate depletion layer thickness (70 ~ 100 μm) as the baseline plan. The other is the goal plan, in which we develop a back-illuminated CCD with a thick depletion layer (200 ~ 300 μm). For the baseline plan, we successfully developed the prototype model 'CCD-NeXT1' with a pixel size of 12 μm x 12 μm and a CCD size of 24 mm x 48 mm. The depletion layer of the CCD has reached 75 ~ 85 μm. The goal plan is realized by the introduction of a new type of CCD, the 'P-channel CCD', which collects holes instead of the electrons collected in the common 'N-channel CCD'. By processing a test model of the P-channel CCD we have confirmed high quantum efficiency above 10 keV with an equivalent depletion layer of 300 μm. A back-illuminated P-channel CCD with a depletion layer of 200 μm and an aluminum coating for optical blocking has also been successfully developed. We have also been developing a thermo-electric cooler (TEC) that mechanically supports the CCD wafer without standoff insulators, in order to reduce the thermal input to the CCD through the standoffs. We have also been considering the sensor housing and the onboard electronics for CCD clocking, readout and digital processing of the frame data.

  13. Development of Very Long Baseline Interferometry (VLBI) techniques in New Zealand: Array simulation, image synthesis and analysis

    NASA Astrophysics Data System (ADS)

    Weston, S. D.

    2008-04-01

    This thesis presents the design and development of a process to model Very Long Baseline Interferometry (VLBI) aperture synthesis antenna arrays. In line with the aims of the Auckland University of Technology (AUT) Institute for Radiophysics and Space Research (IRSR) to develop the knowledge, skills and experience within New Zealand, extensive use of existing radio astronomical software has been incorporated into the process, namely AIPS (Astronomical Imaging Processing System), MIRIAD (a radio interferometry data reduction package) and DIFMAP (a program for synthesis imaging of visibility data from interferometer arrays of radio telescopes). This process has been used to model various antenna array configurations for two proposed New Zealand antenna sites in a VLBI array configuration with existing Australian facilities and a possible antenna at Scott Base in Antarctica; the results are presented in an attempt to demonstrate the improvement to be gained by joint trans-Tasman VLBI observation. It is hoped these results and the process will assist the planning and placement of proposed New Zealand radio telescopes for cooperation with groups such as the Australian Long Baseline Array (LBA), others in the Pacific Rim and possibly globally, as well as potential future involvement of New Zealand with the SKA. The developed process has also been used to model a phased building schedule for the SKA in Australia and the addition of two antennas in New Zealand. This has been presented to the wider astronomical community via the Royal Astronomical Society of New Zealand Journal, and is summarized in this thesis with some additional material. A new measure of quality ("figure of merit") for comparing the original model image and final CLEAN images by utilizing normalized 2-D cross correlation is evaluated as an alternative to the subjective visual image comparison by an operator undertaken to date by other groups. This new unit of measure is then used in the presentation of the results to provide a quantitative comparison of the different array configurations modelled. Included in the process is the development of a new antenna array visibility program, based on a Perl script written by Prof Steven Tingay to plot antenna visibilities for the Australian Square Kilometre Array (SKA) proposal. This has been expanded and improved, removing the hard-coded assumptions for the SKA configuration and providing a new, useful and flexible program for the wider astronomical community. A prototype user interface using html/cgi/perl was developed for the process so that the underlying software packages can be served over the web to a user via an internet browser. This was used to demonstrate how easy it is to provide a friendlier interface compared to the existing cumbersome and difficult command-line-driven interfaces (although the command line can be retained for more experienced users).
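
    The normalized 2-D cross-correlation figure of merit amounts to a zero-mean, unit-norm inner product between the model image and the CLEAN image, so that identical images score 1. The sketch below computes the zero-lag coefficient for two synthetic placeholder images; the point-source model and the crude box blur standing in for the CLEAN beam are assumptions for illustration only, not the thesis data.

    ```python
    import numpy as np

    def normalized_cross_correlation(model, cleaned):
        """Zero-lag normalized 2-D cross-correlation; 1.0 means identical structure."""
        a = model - model.mean()
        b = cleaned - cleaned.mean()
        denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
        return float((a * b).sum() / denom) if denom > 0 else 0.0

    # Synthetic placeholders: a two-point-source "model" sky and a noisy,
    # box-blurred "CLEAN" reconstruction of it (illustrative only).
    rng = np.random.default_rng(1)
    model = np.zeros((128, 128))
    model[40, 40], model[80, 90] = 1.0, 0.6

    blurred = np.zeros_like(model)
    for dy in (-1, 0, 1):                  # crude 3x3 box blur standing in for the CLEAN beam
        for dx in (-1, 0, 1):
            blurred += np.roll(np.roll(model, dy, axis=0), dx, axis=1)
    cleaned = blurred / 9.0 + rng.normal(0.0, 1e-3, model.shape)

    print(f"figure of merit: {normalized_cross_correlation(model, cleaned):.3f}")
    ```

    In practice the model image would normally be convolved with the restoring beam before the comparison, so that values close to 1 indicate a faithful reconstruction rather than penalizing the finite resolution of the array.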

  14. Development of the hybrid sulfur cycle for use with concentrated solar heat. I. Conceptual design

    DOE PAGES

    Gorensek, Maximilian B.; Corgnale, Claudio; Summers, William A.

    2017-07-27

    We propose a detailed conceptual design of a solar hybrid sulfur (HyS) cycle. Numerous design tradeoffs, including process operating conditions and strategies, methods of integration with solar energy sources, and solar design options were considered. A baseline design was selected, and process flowsheets were developed. Pinch analyses were performed to establish the limiting energy efficiency. Detailed material and energy balances were completed, and a full stream table prepared. Design assumptions include use of: location in the southwest US desert, falling particle concentrated solar receiver, indirect heat transfer via pressurized helium, continuous operation with thermal energy storage, liquid-fed electrolyzer with PBI membrane, and bayonet-type acid decomposer. Thermochemical cycle efficiency for the HyS process was estimated to be 35.0%, LHV basis. The solar-to-hydrogen (STH) energy conversion ratio was 16.9%. This exceeds the Year 2015 DOE STCH target of STH >10%, and shows promise for meeting the Year 2020 target of 20%.

  15. Error modeling for differential GPS. M.S. Thesis - MIT, 12 May 1995

    NASA Technical Reports Server (NTRS)

    Blerman, Gregory S.

    1995-01-01

    Differential Global Positioning System (DGPS) positioning is used to accurately locate a GPS receiver based upon the well-known position of a reference site. In utilizing this technique, several error sources contribute to position inaccuracy. This thesis investigates the error in DGPS operation and attempts to develop a statistical model for the behavior of this error. The model for DGPS error is developed using GPS data collected by Draper Laboratory. The Marquardt method for nonlinear curve-fitting is used to find the parameters of a first order Markov process that models the average errors from the collected data. The results show that a first order Markov process can be used to model the DGPS error as a function of baseline distance and time delay. The model's time correlation constant is 3847.1 seconds (1.07 hours) for the mean square error. The distance correlation constant is 122.8 kilometers. The total process variance for the DGPS model is 3.73 sq meters.
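
    A minimal sketch of how the reported constants can be combined into a first-order Gauss-Markov error model is shown below. The separable exponential decay in time delay and baseline distance is an assumed functional form for illustration; the thesis's exact parameterization may differ.

```python
import numpy as np

# Constants reported in the abstract
SIGMA2_M2 = 3.73   # total process variance (m^2)
TAU_S = 3847.1     # time correlation constant (s)
D0_KM = 122.8      # distance correlation constant (km)

def dgps_error_correlation(delta_t_s, baseline_km):
    """Correlation of the error between reference and user, assuming
    separable exponential decay in time delay and baseline distance
    (a simple first-order Gauss-Markov form)."""
    return np.exp(-delta_t_s / TAU_S) * np.exp(-baseline_km / D0_KM)

def residual_error_variance(delta_t_s, baseline_km):
    """Variance of the differential error remaining after applying
    corrections that are delta_t_s old over a baseline of baseline_km."""
    rho = dgps_error_correlation(delta_t_s, baseline_km)
    return SIGMA2_M2 * (1.0 - rho)

print(residual_error_variance(60.0, 10.0))     # short delay, short baseline
print(residual_error_variance(3600.0, 200.0))  # long delay, long baseline
```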

  16. Development of the hybrid sulfur cycle for use with concentrated solar heat. I. Conceptual design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorensek, Maximilian B.; Corgnale, Claudio; Summers, William A.

    We propose a detailed conceptual design of a solar hybrid sulfur (HyS) cycle. Numerous design tradeoffs, including process operating conditions and strategies, methods of integration with solar energy sources, and solar design options were considered. A baseline design was selected, and process flowsheets were developed. Pinch analyses were performed to establish the limiting energy efficiency. Detailed material and energy balances were completed, and a full stream table prepared. Design assumptions include use of: location in the southwest US desert, falling particle concentrated solar receiver, indirect heat transfer via pressurized helium, continuous operation with thermal energy storage, liquid-fed electrolyzer with PBI membrane, and bayonet-type acid decomposer. Thermochemical cycle efficiency for the HyS process was estimated to be 35.0%, LHV basis. The solar-to-hydrogen (STH) energy conversion ratio was 16.9%. This exceeds the Year 2015 DOE STCH target of STH >10%, and shows promise for meeting the Year 2020 target of 20%.

  17. [Pregnancy in the context of general adaptation syndrome].

    PubMed

    Gur'ianov, V A; Pyregov, A V; Tolmachev, G N; Volodin, A V

    2007-01-01

    Based on their own findings and the data available in the literature on pregnancy, including pregnancy complicated by gestosis, the authors consider these conditions in the context of Selye's general adaptation syndrome. They identify its basic links (the autonomic nervous and cardiovascular systems), whose function is affected by all the physiological and pathophysiological processes involved in its development. There is a high likelihood of impaired baseline adaptation processes in these links, which may lead to an inability to accommodate (dysadaptation) by the time of delivery. The paper gives a current interpretation of the functional disorders described as Zangemeister's triad in 1913, from the present-day view of pregnancy as a systemic inflammatory response syndrome and, probably, a disease of adaptation. Based on an analysis of the data available in the literature, the authors outline physiologically grounded approaches to modulating impaired development of the general adaptation syndrome toward the completion of pregnancy and surgical delivery.

  18. Arc-Heater Facility for Hot Hydrogen Exposure of Nuclear Thermal Rocket Materials

    NASA Technical Reports Server (NTRS)

    Litchford, Ron J.; Foote, John P.; Wang, Ten-See; Hickman, Robert; Panda, Binayak; Dobson, Chris; Osborne, Robin; Clifton, Scooter

    2006-01-01

    A hyper-thermal environment simulator is described for hot hydrogen exposure of nuclear thermal rocket material specimens and component development. This newly established testing capability uses a high-power, multi-gas, segmented arc-heater to produce high-temperature pressurized hydrogen flows representative of practical reactor core environments and is intended to serve as a low-cost test facility for investigating and characterizing candidate fuel/structural materials and improving associated processing/fabrication techniques. Design and development efforts are thoroughly summarized, including thermal hydraulics analysis and simulation results, and facility operating characteristics are reported, as determined from a series of baseline performance mapping tests.

  19. Image Navigation and Registration Performance Assessment Evaluation Tools for GOES-R ABI and GLM

    NASA Technical Reports Server (NTRS)

    Houchin, Scott; Porter, Brian; Graybill, Justin; Slingerland, Philip

    2017-01-01

    The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24 hour evaluation period. This paper describes the software design and implementation of IPATS and provides preliminary test results.
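
    As an illustration of the kind of sub-image registration that underlies INR metrics, the sketch below estimates the integer-pixel offset between two image chips using FFT phase correlation. This is a generic technique, not the IPATS algorithm itself, and the scene and shift values are hypothetical.

```python
import numpy as np

def phase_correlation_shift(ref, img):
    """Estimate the integer-pixel (row, col) shift of ref relative to img
    using FFT phase correlation; one standard building block for
    image-to-image registration (illustrative, not the IPATS code)."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(img)
    cross_power = F1 * np.conj(F2)
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peak coordinates into signed shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# Toy usage: shift a random scene by (3, -2) pixels and recover the offset
rng = np.random.default_rng(0)
scene = rng.random((128, 128))
shifted = np.roll(scene, shift=(3, -2), axis=(0, 1))
print(phase_correlation_shift(shifted, scene))  # expected (3, -2)
```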

  20. Gaining control: changing relations between executive control and processing speed and their relevance for mathematics achievement over course of the preschool period

    PubMed Central

    Clark, Caron A. C.; Nelson, Jennifer Mize; Garza, John; Sheffield, Tiffany D.; Wiebe, Sandra A.; Espy, Kimberly Andrews

    2014-01-01

    Early executive control (EC) predicts a range of academic outcomes and shows particularly strong associations with children's mathematics achievement. Nonetheless, a major challenge for EC research lies in distinguishing EC from related cognitive constructs that also are linked to achievement outcomes. Developmental cascade models suggest that children's information processing speed is a driving mechanism in cognitive development that supports gains in working memory, inhibitory control and associated cognitive abilities. Accordingly, individual differences in early executive task performance and their relation to mathematics may reflect, at least in part, underlying variation in children's processing speed. The aims of this study were to: (1) examine the degree of overlap between EC and processing speed at different preschool age points; and (2) determine whether EC uniquely predicts children's mathematics achievement after accounting for individual differences in processing speed. As part of a longitudinal, cohort-sequential study, 388 children (50% boys; 44% from low income households) completed the same battery of EC tasks at ages 3, 3.75, 4.5, and 5.25 years. Several of the tasks incorporated baseline speeded naming conditions with minimal EC demands. Multidimensional latent models were used to isolate the variance in executive task performance that did not overlap with baseline processing speed, covarying for child language proficiency. Models for separate age points showed that, while EC did not form a coherent latent factor independent of processing speed at age 3 years, it did emerge as a distinct factor by age 5.25. Although EC at age 3 showed no distinct relation with mathematics achievement independent of processing speed, EC at ages 3.75, 4.5, and 5.25 showed independent, prospective links with mathematics achievement. Findings suggest that EC and processing speed are tightly intertwined in early childhood. As EC becomes progressively decoupled from processing speed with age, it begins to take on unique, discriminative importance for children's mathematics achievement. PMID:24596563

  1. Passive perception system for day/night autonomous off-road navigation

    NASA Astrophysics Data System (ADS)

    Rankin, Arturo L.; Bergh, Charles F.; Goldberg, Steven B.; Bellutta, Paolo; Huertas, Andres; Matthies, Larry H.

    2005-05-01

    Passive perception of terrain features is a vital requirement for military-related unmanned autonomous vehicle operations, especially under electromagnetic signature management conditions. As a member of Team Raptor, the Jet Propulsion Laboratory developed a self-contained passive perception system under the DARPA-funded PerceptOR program. An environmentally protected forward-looking sensor head was designed and fabricated in-house to straddle an off-the-shelf pan-tilt unit. The sensor head contained three color cameras for multi-baseline daytime stereo ranging, a pair of cooled mid-wave infrared cameras for nighttime stereo ranging, and supporting electronics to synchronize captured imagery. Narrow-baseline stereo provided improved range data density in cluttered terrain, while wide-baseline stereo provided more accurate ranging for operation at higher speeds in relatively open areas. The passive perception system processed stereo images and output terrain maps containing elevation, terrain type, and detected hazards over a local area network. A novel software architecture was designed and implemented to distribute the data processing on a 533 MHz quad 7410 PowerPC single board computer under the VxWorks real-time operating system. This architecture, which is general enough to operate on N processors, has been subsequently tested on Pentium-based processors under Windows and Linux, and a Sparc-based processor under Unix. The passive perception system was operated during FY04 PerceptOR program evaluations at Fort A. P. Hill, Virginia, and Yuma Proving Ground, Arizona. This paper discusses the Team Raptor passive perception system hardware and software design, implementation, and performance, and describes a road map to faster and improved passive perception.
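
    The narrow- versus wide-baseline trade-off follows from standard stereo geometry: for a given disparity error, range error grows as Z^2/(fB). The sketch below illustrates this scaling; the focal length, baselines, and disparity error are assumed values, not the PerceptOR sensor-head parameters.

```python
def stereo_range_error(range_m, baseline_m, focal_px, disparity_err_px=0.25):
    """First-order stereo range error: dZ ~ Z^2 * d_disp / (f * B).
    Illustrates why a wide-baseline pair ranges more accurately at distance,
    while a narrow-baseline pair is preferred in cluttered terrain
    (all parameter values are assumed, not from the PerceptOR hardware)."""
    return (range_m ** 2) * disparity_err_px / (focal_px * baseline_m)

focal_px = 800.0  # assumed focal length in pixels
for Z in (10.0, 30.0, 60.0):
    narrow = stereo_range_error(Z, 0.3, focal_px)  # assumed 30 cm baseline
    wide = stereo_range_error(Z, 1.0, focal_px)    # assumed 1 m baseline
    print(f"Z={Z:4.0f} m: narrow {narrow:5.2f} m, wide {wide:5.2f} m error")
```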

  2. A Positive Deviance Approach to Understanding Key Features to Improving Diabetes Care in the Medical Home

    PubMed Central

    Gabbay, Robert A.; Friedberg, Mark W.; Miller-Day, Michelle; Cronholm, Peter F.; Adelman, Alan; Schneider, Eric C.

    2013-01-01

    PURPOSE The medical home has gained national attention as a model to reorganize primary care to improve health outcomes. Pennsylvania has undertaken one of the largest state-based, multipayer medical home pilot projects. We used a positive deviance approach to identify and compare factors driving the care models of practices showing the greatest and least improvement in diabetes care in a sample of 25 primary care practices in southeast Pennsylvania. METHODS We ranked practices into improvement quintiles on the basis of the average absolute percentage point improvement from baseline to 18 months in 3 registry-based measures of performance related to diabetes care: glycated hemoglobin concentration, blood pressure, and low-density lipoprotein cholesterol level. We then conducted surveys and key informant interviews with leaders and staff in the 5 most and least improved practices, and compared their responses. RESULTS The most improved/higher-performing practices tended to have greater structural capabilities (eg, electronic health records) than the least improved/lower-performing practices at baseline. Interviews revealed striking differences between the groups in terms of leadership styles and shared vision; sense, use, and development of teams; processes for monitoring progress and obtaining feedback; and presence of technologic and financial distractions. CONCLUSIONS Positive deviance analysis suggests that primary care practices’ baseline structural capabilities and abilities to buffer the stresses of change may be key facilitators of performance improvement in medical home transformations. Attention to the practices’ structural capabilities and factors shaping successful change, especially early in the process, will be necessary to improve the likelihood of successful medical home transformation and better care. PMID:23690393

  3. A targeted noise reduction observational study for reducing noise in a neonatal intensive unit.

    PubMed

    Chawla, S; Barach, P; Dwaihy, M; Kamat, D; Shankaran, S; Panaitescu, B; Wang, B; Natarajan, G

    2017-09-01

    Excessive noise in neonatal intensive care units (NICUs) can interfere with infants' growth, development and healing. Local problem: sound levels in our NICUs exceeded the levels recommended by the World Health Organization. We implemented a noise reduction strategy in an urban, tertiary academic medical center NICU that included baseline noise measurements. We conducted a survey involving staff and visitors regarding their opinions and perceptions of noise levels in the NICU. Ongoing feedback to staff after each measurement cycle was provided to improve awareness, engagement and adherence with noise reduction strategies. After widespread discussion with active clinician involvement, consensus building and iterative testing, changes were implemented including lowering of equipment alarm sounds, designated 'quiet times' and a customized education program for staff. The aim was a multiphase noise reduction quality improvement (QI) intervention to reduce ambient sound levels in a patient care room of our NICUs by 3 dB (20%) over 18 months. The noise in the NICU was reduced by 3 dB from baseline. Mean (s.d.) baseline, phase 2, 3 and 4 noise levels in the two NICUs were: LAeq: 57.0 (0.84), 56.8 (1.6), 55.3 (1.9) and 54.5 (2.6) dB, respectively (P<0.01). Adherence with the planned process measure of 'quiet times' was >90%. Implementing a multipronged QI initiative resulted in significant noise level reduction in two multipod NICUs. It is feasible to reduce noise levels if QI interventions are coupled with active engagement of the clinical staff and continuous process improvement methods, measurements and protocols.
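
    For readers unfamiliar with LAeq, the reported levels are energy-averaged rather than arithmetically averaged. A minimal sketch of that calculation, using hypothetical hourly values rather than the study's measurements, is given below.

```python
import numpy as np

def laeq(levels_db):
    """Equivalent continuous sound level: the energy average of A-weighted
    levels, LAeq = 10*log10(mean(10^(L/10))). A plain arithmetic mean would
    understate the contribution of loud episodes."""
    levels_db = np.asarray(levels_db, dtype=float)
    return 10.0 * np.log10(np.mean(10.0 ** (levels_db / 10.0)))

# Hypothetical hourly levels before and after quiet-time interventions
before = [58, 57, 60, 56, 59, 62]
after = [55, 54, 56, 53, 55, 58]
print(f"LAeq before: {laeq(before):.1f} dB, after: {laeq(after):.1f} dB")
```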

  4. Application of Fault Management Theory to the Quantitive Selection of a Launch Vehicle Abort Trigger Suite

    NASA Technical Reports Server (NTRS)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    SHM/FM theory has been successfully applied to the selection of the baseline set of abort triggers for the NASA SLS, and quantitative assessment played a useful role in the decision process. M&FM, which is new within NASA MSFC, required the most new work, as this quantitative analysis had never been done before; it required development of the methodology and a tool to mechanize the process, and it established new relationships with the other groups. The process is now an accepted part of the SLS design process and will likely be applied to similar programs in the future at NASA MSFC. Future improvements include improving technical accuracy (differentiating crew survivability due to an abort from survivability when no immediate abort occurs, such as a small explosion with little debris; accounting for the contingent dependence of secondary triggers on primary triggers; and allocating the incremental loss-of-crew benefit of each trigger when added to the previously selected triggers) and reducing future costs through the development of a specialized tool. The methodology can be applied to any manned or unmanned vehicle, in space or terrestrial.

  5. Current status of solar cell performance of unconventional silicon sheets

    NASA Technical Reports Server (NTRS)

    Yoo, H. I.; Liu, J. K.

    1981-01-01

    It is pointed out that activities in recent years directed towards reducing the cost of silicon solar cells for terrestrial photovoltaic applications have resulted in impressive advances in silicon sheet formation from the melt. The techniques used for sheet formation fall into two general categories: approaches in the first category require subsequent ingot wafering, whereas procedures in the second category produce silicon directly in sheet form. The performance of baseline solar cells is discussed. The baseline process included identification marking, slicing to size, and surface treatment (etch-polishing) when needed. Attention is also given to the performance of cells with process variations, and the effects of sheet quality on performance and processing.

  6. Healthy Immigrant Families: Participatory Development and Baseline Characteristics of a Community-Based Physical Activity and Nutrition Intervention

    PubMed Central

    Wieland, Mark L.; Weis, Jennifer A.; Hanza, Marcelo M.K.; Meiers, Sonja J.; Patten, Christi A.; Clark, Matthew M.; Sloan, Jeff A.; Novotny, Paul J.; Njeru, Jane W.; Abbenyi, Adeline; Levine, James A.; Goodson, Miriam; Capetillo, Maria Graciela D. Porraz; Osman, Ahmed; Hared, Abdullah; Nigon, Julie A.; Sia, Irene G.

    2015-01-01

    Background US immigrants often have escalating cardiovascular risk. Barriers to optimal physical activity and diet have a significant role in this risk accumulation. Methods We developed a physical activity and nutrition intervention with immigrant and refugee families through a community-based participatory research approach. Work groups of community members and health scientists developed an intervention manual with 12 content modules that were based on social-learning theory. Family health promoters from the participating communities (Hispanic, Somali, Sudanese) were trained to deliver the intervention through 12 home visits during the first 6 months and up to 12 phone calls during the second 6 months. The intervention was tested through a randomized community-based trial with a delayed-intervention control group, with measurements at baseline, 6, 12, and 24 months. Primary measurements included accelerometer-based assessment of physical activity and 24-hour dietary recall. Secondary measures included biometrics and theory-based instruments. Results One hundred fifty-one individuals (81 adolescents, 70 adults; 44 families) were randomized. At baseline, mean (SD) time spent in moderate-to-vigorous physical activity was 64.7 (30.2) minutes/day for adolescents and 43.1 (35.4) minutes/day for adults. Moderate dietary quality was observed in both age groups. Biometric measures showed that 45.7% of adolescents and 80.0% of adults were overweight or obese. Moderate levels of self-efficacy and social support were reported for physical activity and nutrition. Discussion Processes and products from this program are relevant to other communities aiming to reduce cardiovascular risk and negative health behaviors among immigrants and refugees. Trial Registration This trial was registered at Clinicaltrials.gov (NCT01952808). PMID:26655431

  7. Large arrays of dual-polarized multichroic TES detectors for CMB measurements with the SPT-3G receiver

    DOE PAGES

    Holland, Wayne S.; Zmuidzinas, Jonas; Posada, Chrystian M.; ...

    2016-07-19

    Detectors for cosmic microwave background (CMB) experiments are now background limited, so a straightforward alternative to improve sensitivity is to increase the number of detectors. Large arrays of multichroic pixels constitute an economical approach to increasing the number of detectors within a given focal plane area. We present the fabrication of large arrays of dual-polarized multichroic transition-edge-sensor (TES) bolometers for the South Pole Telescope third-generation CMB receiver (SPT-3G). The complete SPT-3G receiver will have 2690 pixels, each with six detectors, allowing for individual measurement of three spectral bands (centered at 95 GHz, 150 GHz and 220 GHz) in two orthogonal polarizations. In total, the SPT-3G focal plane will have 16140 detectors. Each pixel is comprised of a broad-band sinuous antenna coupled to a niobium microstrip transmission line. In-line filters are used to define the different band-passes before the millimeter-wavelength signal is fed to the respective Ti/Au TES sensors. Detectors are read out using a 64x frequency domain multiplexing (fMux) scheme. The microfabrication of the SPT-3G detector arrays involves a total of 18 processes, including 13 lithography steps. Together with the fabrication process, the effect of processing on the Ti/Au TES's Tc is discussed. In addition, detectors fabricated with Ti/Au TES films with Tc between 400 mK and 560 mK are presented and their thermal characteristics are evaluated. Optical characterization of the arrays is presented as well, indicating that the response of the detectors is in good agreement with the design values for all three spectral bands (95 GHz, 150 GHz, and 220 GHz). The measured optical efficiency of the detectors is between 0.3 and 0.8. The results discussed here are extracted from a batch of research and development wafers used to develop the baseline process for fabrication of the detector arrays to be deployed with the SPT-3G receiver; findings from these wafers have been incorporated into the baseline fabrication process presented here. SPT-3G is scheduled to deploy to the South Pole Telescope in late 2016.

  8. Large arrays of dual-polarized multichroic TES detectors for CMB measurements with the SPT-3G receiver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Wayne S.; Zmuidzinas, Jonas; Posada, Chrystian M.

    Detectors for cosmic microwave background (CMB) experiments are now background limited, so a straightforward alternative to improve sensitivity is to increase the number of detectors. Large arrays of multichroic pixels constitute an economical approach to increasing the number of detectors within a given focal plane area. We present the fabrication of large arrays of dual-polarized multichroic transition-edge-sensor (TES) bolometers for the South Pole Telescope third-generation CMB receiver (SPT-3G). The complete SPT-3G receiver will have 2690 pixels, each with six detectors, allowing for individual measurement of three spectral bands (centered at 95 GHz, 150 GHz and 220 GHz) in two orthogonal polarizations. In total, the SPT-3G focal plane will have 16140 detectors. Each pixel is comprised of a broad-band sinuous antenna coupled to a niobium microstrip transmission line. In-line filters are used to define the different band-passes before the millimeter-wavelength signal is fed to the respective Ti/Au TES sensors. Detectors are read out using a 64x frequency domain multiplexing (fMux) scheme. The microfabrication of the SPT-3G detector arrays involves a total of 18 processes, including 13 lithography steps. Together with the fabrication process, the effect of processing on the Ti/Au TES's Tc is discussed. In addition, detectors fabricated with Ti/Au TES films with Tc between 400 mK and 560 mK are presented and their thermal characteristics are evaluated. Optical characterization of the arrays is presented as well, indicating that the response of the detectors is in good agreement with the design values for all three spectral bands (95 GHz, 150 GHz, and 220 GHz). The measured optical efficiency of the detectors is between 0.3 and 0.8. The results discussed here are extracted from a batch of research and development wafers used to develop the baseline process for fabrication of the detector arrays to be deployed with the SPT-3G receiver; findings from these wafers have been incorporated into the baseline fabrication process presented here. SPT-3G is scheduled to deploy to the South Pole Telescope in late 2016.

  9. Large arrays of dual-polarized multichroic TES detectors for CMB measurements with the SPT-3G receiver

    NASA Astrophysics Data System (ADS)

    Posada, Chrystian M.; Ade, Peter A. R.; Anderson, Adam J.; Avva, Jessica; Ahmed, Zeeshan; Arnold, Kam S.; Austermann, Jason; Bender, Amy N.; Benson, Bradford A.; Bleem, Lindsey; Byrum, Karen; Carlstrom, John E.; Carter, Faustin W.; Chang, Clarence; Cho, Hsiao-Mei; Cukierman, Ari; Czaplewski, David A.; Ding, Junjia; Divan, Ralu N. S.; de Haan, Tijmen; Dobbs, Matt; Dutcher, Daniel; Everett, Wenderline; Gannon, Renae N.; Guyser, Robert J.; Halverson, Nils W.; Harrington, Nicholas L.; Hattori, Kaori; Henning, Jason W.; Hilton, Gene C.; Holzapfel, William L.; Huang, Nicholas; Irwin, Kent D.; Jeong, Oliver; Khaire, Trupti; Korman, Milo; Kubik, Donna L.; Kuo, Chao-Lin; Lee, Adrian T.; Leitch, Erik M.; Lendinez Escudero, Sergi; Meyer, Stephan S.; Miller, Christina S.; Montgomery, Joshua; Nadolski, Andrew; Natoli, Tyler J.; Nguyen, Hogan; Novosad, Valentyn; Padin, Stephen; Pan, Zhaodi; Pearson, John E.; Rahlin, Alexandra; Reichardt, Christian L.; Ruhl, John E.; Saliwanchik, Benjamin; Shirley, Ian; Sayre, James T.; Shariff, Jamil A.; Shirokoff, Erik D.; Stan, Liliana; Stark, Antony A.; Sobrin, Joshua; Story, Kyle; Suzuki, Aritoki; Tang, Qing Yang; Thakur, Ritoban B.; Thompson, Keith L.; Tucker, Carole E.; Vanderlinde, Keith; Vieira, Joaquin D.; Wang, Gensheng; Whitehorn, Nathan; Yefremenko, Volodymyr; Yoon, Ki Won

    2016-07-01

    Detectors for cosmic microwave background (CMB) experiments are now essentially background limited, so a straightforward alternative to improve sensitivity is to increase the number of detectors. Large arrays of multichroic pixels constitute an economical approach to increasing the number of detectors within a given focal plane area. Here, we present the fabrication of large arrays of dual-polarized multichroic transition-edge-sensor (TES) bolometers for the South Pole Telescope third-generation CMB receiver (SPT-3G). The complete SPT-3G receiver will have 2690 pixels, each with six detectors, allowing for individual measurement of three spectral bands (centered at 95 GHz, 150 GHz and 220 GHz) in two orthogonal polarizations. In total, the SPT-3G focal plane will have 16140 detectors. Each pixel is comprised of a broad-band sinuous antenna coupled to a niobium microstrip transmission line. In-line filters are used to define the different band-passes before the millimeter-wavelength signal is fed to the respective Ti/Au TES sensors. Detectors are read out using a 64x frequency domain multiplexing (fMux) scheme. The microfabrication of the SPT-3G detector arrays involves a total of 18 processes, including 13 lithography steps. Together with the fabrication process, the effect of processing on the Ti/Au TES's Tc is discussed. In addition, detectors fabricated with Ti/Au TES films with Tc between 400 mK and 560 mK are presented and their thermal characteristics are evaluated. Optical characterization of the arrays is presented as well, indicating that the response of the detectors is in good agreement with the design values for all three spectral bands (95 GHz, 150 GHz, and 220 GHz). The measured optical efficiency of the detectors is between 0.3 and 0.8. Results discussed here are extracted from a batch of research and development wafers used to develop the baseline process for the fabrication of the arrays of detectors to be deployed with the SPT-3G receiver. Results from these research and development wafers have been incorporated into the fabrication process to get the baseline fabrication process presented here. SPT-3G is scheduled to deploy to the South Pole Telescope in late 2016.

  10. The environmental control and life support system advanced automation project. Phase 1: Application evaluation

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.

    1990-01-01

    The Environmental Control and Life Support System (ECLSS) is a Freedom Station distributed system with inherent applicability to advanced automation primarily due to the comparatively large reaction times of its subsystem processes. This allows longer contemplation times in which to form a more intelligent control strategy and to detect or prevent faults. The objective of the ECLSS Advanced Automation Project is to reduce the flight and ground manpower needed to support the initial and evolutionary ECLS system. The approach is to search out and make apparent those processes in the baseline system which are in need of more automatic control and fault detection strategies, to influence the ECLSS design by suggesting software hooks and hardware scars which will allow easy adaptation to advanced algorithms, and to develop complex software prototypes which fit into the ECLSS software architecture and will be shown in an ECLSS hardware testbed to increase the autonomy of the system. Covered here are the preliminary investigation and evaluation process, aimed at searching the ECLSS for candidate functions for automation and providing a software hooks and hardware scars analysis. This analysis shows changes needed in the baselined system for easy accommodation of knowledge-based or other complex implementations which, when integrated in flight or ground sustaining engineering architectures, will produce a more autonomous and fault tolerant Environmental Control and Life Support System.

  11. Physical activity (PA) and the disablement process: a 14-year follow-up study of older non-disabled women and men.

    PubMed

    Schultz-Larsen, Kirsten; Rahmanfard, Naghmeh; Holst, Claus

    2012-01-01

    Few studies have explored the associations of reported PA (RPA) with the processes underlying the development of disability. The present study was performed to explore RPA among older persons and its association with onset of functional dependence and mortality. Among a probability sample of 1782 community-living persons, aged 75-83 years, we evaluated the 1021 who reported no disability in basic activities of daily living. Participants were followed for a median of 8.34 years in public registers to determine onset of disability and mortality. RPA predicted mortality in older women (HR=1.77, 95%CI=1.42-2.19) and men (HR=1.65, 95%CI=1.27-2.14) over long time intervals. The effect of RPA persisted among permanently disabled older women, after adjusting for age, baseline vulnerability and grade of disability. Low RPA was independently associated with risk of incident disability (HR=1.56, 95%CI=1.10-2.23) in men. Among older women, the association between RPA and incidence of disability was attenuated in analyses that controlled for baseline mobility function. Thus, the association between physical activity and mortality reflected processes different from those underlying a simple relation between physical activity, disability and mortality. Physical activity was a ubiquitous predictor of longevity, but only for women. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  12. Sugar-sweetened product consumption alters glucose homeostasis compared with dairy product consumption in men and women at risk of type 2 diabetes mellitus.

    PubMed

    Maki, Kevin C; Nieman, Kristin M; Schild, Arianne L; Kaden, Valerie N; Lawless, Andrea L; Kelley, Kathleen M; Rains, Tia M

    2015-03-01

    Dietary patterns characterized by high intakes of fruits and vegetables, whole grains, low-fat dairy products, and low glycemic load have been associated with lower type 2 diabetes mellitus (T2DM) risk. In contrast, dietary patterns that include high intakes of refined grains, processed meats, and high amounts of added sugars have been associated with increased T2DM risk. This randomized, 2-period crossover trial compared the effects of dairy and sugar-sweetened product (SSP) consumption on insulin sensitivity and pancreatic β-cell function in men and women at risk of the development of T2DM who habitually consume sugar-sweetened beverages. In a randomized, controlled crossover trial, participants consumed dairy products (474 mL/d 2% milk and 170 g/d low-fat yogurt) and SSPs (710 mL/d nondiet soda and 108 g/d nondairy pudding), each for 6 wk, with a 2-wk washout between treatments. A liquid meal tolerance test (LMTT) was administered at baseline and the end of each period. Participants were 50% female with a mean age and body mass index of 53.8 y and 32.2 kg/m(2), respectively. Changes from baseline were significantly different between dairy product and SSP conditions for median homeostasis model assessment 2-insulin sensitivity (HOMA2-%S) (1.3 vs. -21.3%, respectively, P = 0.009; baseline = 118%), mean LMTT disposition index (-0.03 vs. -0.36, respectively, P = 0.011; baseline = 2.59), mean HDL cholesterol (0.8 vs. -4.2%, respectively, P = 0.015; baseline = 44.3 mg/dL), and mean serum 25-hydroxyvitamin D [25(OH)D] (11.7 vs. -3.3, respectively, P = 0.022; baseline = 24.5 μg/L). Changes from baseline in LMTT Matsuda insulin sensitivity index (-0.10 vs. -0.49, respectively; baseline = 4.16) and mean HOMA2-β-cell function (-2.0 vs. 5.3%, respectively; baseline = 72.6%) did not differ significantly between treatments. These results suggest that SSP consumption is associated with less favorable values for HOMA2-%S, LMTT disposition index, HDL cholesterol, and serum 25(OH)D in men and women at risk of T2DM vs. baseline values and values during dairy product consumption. This trial was registered at clinicaltrials.gov as NCT01936935. © 2015 American Society for Nutrition.

  13. Very long baseline interferometry applied to polar motion, relativity, and geodesy. Ph. D. thesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, C.

    1978-01-01

    The causes and effects of diurnal polar motion are described. An algorithm was developed for modeling the effects on very long baseline interferometry observables. A selection was made between two three-station networks for monitoring polar motion. The effects of scheduling and the number of sources observed on estimated baseline errors are discussed. New hardware and software techniques in very long baseline interferometry are described.

  14. Clinical time series prediction: Toward a hierarchical dynamical system framework.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2015-09-01

    Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient condition, the dynamics of a disease, the effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.
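
    A minimal sketch of the lower-level building block, Gaussian-process regression over irregularly sampled observation times, is shown below. The kernel, hyperparameters, and lab values are illustrative assumptions; the paper's hierarchical framework additionally ties such GP sequences together with a linear dynamical system, which is not shown.

```python
import numpy as np

def gp_posterior_mean(t_obs, y_obs, t_query, length_scale=5.0,
                      signal_var=1.0, noise_var=0.1):
    """Posterior mean of Gaussian-process regression with a squared-
    exponential kernel, evaluated at arbitrary query times. This is only
    the lower-level building block of a hierarchical model; the linear
    dynamical system tying GP sequences together is not shown."""
    def k(a, b):
        d = a[:, None] - b[None, :]
        return signal_var * np.exp(-0.5 * (d / length_scale) ** 2)

    K = k(t_obs, t_obs) + noise_var * np.eye(len(t_obs))
    K_star = k(t_query, t_obs)
    return K_star @ np.linalg.solve(K, y_obs)

# Irregularly sampled lab values (hypothetical CBC-like series)
t_obs = np.array([0.0, 6.0, 7.5, 20.0, 26.0, 48.0])   # hours
y_obs = np.array([11.2, 10.8, 10.9, 9.7, 9.9, 10.4])  # e.g. hemoglobin
t_query = np.linspace(0.0, 48.0, 5)
print(gp_posterior_mean(t_obs, y_obs, t_query))
```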

  15. Improving Communication During Cardiac ICU Multidisciplinary Rounds Through Visual Display of Patient Daily Goals.

    PubMed

    Justice, Lindsey B; Cooper, David S; Henderson, Carla; Brown, James; Simon, Katherine; Clark, Lindsey; Fleckenstein, Elizabeth; Benscoter, Alexis; Nelson, David P

    2016-07-01

    To improve communication during daily cardiac ICU multidisciplinary rounds. Quality improvement methodology. Twenty-five-bed cardiac ICUs in an academic free-standing pediatric hospital. All patients admitted to the cardiac ICU. Implementation of visual display of patient daily goals through a write-down and read-back process. The Rounds Effectiveness Assessment and Communication Tool was developed based on the previously validated Patient Knowledge Assessment Tool to evaluate comprehension of patient daily goals. Rounds were assessed for each patient by the bedside nurse, nurse practitioner or fellow, and attending physician, and answers were compared to determine percent agreement per day. At baseline, percent agreement for patient goals was only 62%. After initial implementation of the daily goal write-down/read-back process, which was written on paper by the bedside nurse, the Rounds Effectiveness Assessment and Communication Tool survey revealed no improvement. With adaptation of the intervention so goals were written on whiteboards for visual display during rounds, the percent agreement improved to 85%. Families were also asked to complete a survey (1-6 Likert scale) of their satisfaction with rounds and understanding of daily goals before and after the intervention. Family survey results improved from a mean of 4.6-5.7. Parent selection of the best possible score for each question was 19% at baseline and 75% after the intervention. Visual display of patient daily goals via a write-down/read-back process improves comprehension of goals by all team members and improves parent satisfaction. The daily goal whiteboard facilitates consistent development of a comprehensive plan of care for each patient, fosters goal-directed care, and provides a checklist for providers and parents to review throughout the day.

  16. Impact of a multidimensional infection control approach on central line-associated bloodstream infections rates in adult intensive care units of 8 cities of Turkey: findings of the International Nosocomial Infection Control Consortium (INICC)

    PubMed Central

    2013-01-01

    Background Central line-associated bloodstream infections (CLABs) have long been associated with excess lengths of stay, increased hospital costs and mortality attributable to them. Different studies from developed countries have shown that practice bundles reduce the incidence of CLAB in intensive care units. However, the impact of the bundle strategy has not been systematically analyzed in the adult intensive care unit (ICU) setting in developing countries, such as Turkey. The aim of this study is to analyze the impact of the International Nosocomial Infection Control Consortium (INICC) multidimensional infection control approach to reduce the rates of CLAB in 13 ICUs of 13 INICC member hospitals from 8 cities of Turkey. Methods We conducted an active, prospective, before-after surveillance study to determine CLAB rates in a cohort of 4,017 adults hospitalized in ICUs. We applied the definitions of the CDC/NHSN and INICC surveillance methods. The study was divided into baseline and intervention periods. During baseline, active outcome surveillance of CLAB rates was performed. During intervention, the INICC multidimensional approach for CLAB reduction was implemented and included the following measures: 1- bundle of infection control interventions, 2- education, 3- outcome surveillance, 4- process surveillance, 5- feedback of CLAB rates, and 6- performance feedback on infection control practices. CLAB rates obtained in baseline were compared with CLAB rates obtained during intervention. Results During baseline, 3,129 central line (CL) days were recorded, and during intervention, we recorded 23,463 CL-days. We used random effects Poisson regression to account for clustering of CLAB rates within hospital across time periods. The baseline CLAB rate was 22.7 per 1000 CL days, which was decreased during the intervention period to 12.0 CLABs per 1000 CL days (IRR 0.613; 95% CI 0.43 – 0.87; P = 0.007). This amounted to a 39% reduction in the incidence rate of CLAB. Conclusions The implementation of the multidimensional infection control approach was associated with a significant reduction in the CLAB rates in adult ICUs of Turkey, and thus should be widely implemented. PMID:23641950

  17. Participation in the Analysis of the Far-Infrared/Submillimeter Interferometer

    NASA Technical Reports Server (NTRS)

    Lorenzini, Enrico C.

    2005-01-01

    We have contributed to the development of the Submillimeter Probe of the Evolution of Cosmic Structure (SPECS) by analyzing various aspects related to the tethers that connect the spacecraft of this space interferometer. We have focused our analysis on the following key topics: (a) helping in the configuration selection; (b) computing the system eigenfrequencies as a function of baseline length; (c) developing techniques and conceptual designs of devices for damping the tether oscillations; (d) carrying out numerical simulations of the tethered formation to assess the effects of environmental perturbations upon the baseline length variation; (e) developing control laws for reconfiguring the baseline length; (f) devising control laws for fast retargeting of the interferometer at moderate baseline lengths; (g) estimating the survivability to micrometeoroid impacts of a tether at L2; and (h) developing a conceptual design of a high-strength and survivable tether. The work was conducted for NASA Goddard Space Flight Center under Grant NNG04GQ21G with William Danchi as technical monitor.

  18. Operations research methods improve chemotherapy patient appointment scheduling.

    PubMed

    Santibáñez, Pablo; Aristizabal, Ruben; Puterman, Martin L; Chow, Vincent S; Huang, Wenhai; Kollmannsberger, Christian; Nordin, Travis; Runzer, Nancy; Tyldesley, Scott

    2012-12-01

    Clinical complexity, scheduling restrictions, and outdated manual booking processes resulted in frequent clerical rework, long waitlists for treatment, and late appointment notification for patients at a chemotherapy clinic in a large cancer center in British Columbia, Canada. A 17-month study was conducted to address booking, scheduling and workload issues and to develop, implement, and evaluate solutions. A review of scheduling practices included process observation and mapping, analysis of historical appointment data, creation of a new performance metric (final appointment notification lead time), and a baseline patient satisfaction survey. Process improvement involved discrete event simulation to evaluate alternative booking practice scenarios, development of an optimization-based scheduling tool to improve scheduling efficiency, and change management for implementation of process changes. Results were evaluated through analysis of appointment data, a follow-up patient survey, and staff surveys. Process review revealed a two-stage scheduling process. Long waitlists and late notification resulted from an inflexible first-stage process. The second-stage process was time consuming and tedious. After a revised, more flexible first-stage process and an automated second-stage process were implemented, the median percentage of appointments exceeding the final appointment notification lead time target of one week was reduced by 57% and median waitlist size decreased by 83%. Patient surveys confirmed increased satisfaction while staff feedback reported reduced stress levels. Significant operational improvements can be achieved through process redesign combined with operations research methods.

  19. An advanced algorithm for deformation estimation in non-urban areas

    NASA Astrophysics Data System (ADS)

    Goel, Kanika; Adam, Nico

    2012-09-01

    This paper presents an advanced differential SAR interferometry stacking algorithm for high resolution deformation monitoring in non-urban areas with a focus on distributed scatterers (DSs). Techniques such as the Small Baseline Subset Algorithm (SBAS) have been proposed for processing DSs. SBAS makes use of small baseline differential interferogram subsets. Singular value decomposition (SVD), i.e. L2 norm minimization is applied to link independent subsets separated by large baselines. However, the interferograms used in SBAS are multilooked using a rectangular window to reduce phase noise caused for instance by temporal decorrelation, resulting in a loss of resolution and the superposition of topography and deformation signals from different objects. Moreover, these have to be individually phase unwrapped and this can be especially difficult in natural terrains. An improved deformation estimation technique is presented here which exploits high resolution SAR data and is suitable for rural areas. The implemented method makes use of small baseline differential interferograms and incorporates an object adaptive spatial phase filtering and residual topography removal for an accurate phase and coherence estimation, while preserving the high resolution provided by modern satellites. This is followed by retrieval of deformation via the SBAS approach, wherein, the phase inversion is performed using an L1 norm minimization which is more robust to the typical phase unwrapping errors encountered in non-urban areas. Meter resolution TerraSAR-X data of an underground gas storage reservoir in Germany is used for demonstrating the effectiveness of this newly developed technique in rural areas.
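
    The contrast between L2 and L1 phase inversion can be sketched on a toy small-baseline network. The example below uses minimum-norm least squares (as in SBAS) versus an iteratively reweighted least-squares approximation of the L1 norm; the network, phase values, and IRLS scheme are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def build_design_matrix(pairs, n_epochs):
    """Each interferogram is the phase difference between two acquisitions:
    row i has +1 at the later epoch and -1 at the earlier one (epoch 0 is
    taken as the reference and omitted)."""
    A = np.zeros((len(pairs), n_epochs - 1))
    for i, (early, late) in enumerate(pairs):
        if late > 0:
            A[i, late - 1] += 1.0
        if early > 0:
            A[i, early - 1] -= 1.0
    return A

def invert_l2(A, dphi):
    """Minimum-norm least-squares (SVD) solution, as used in SBAS."""
    return np.linalg.lstsq(A, dphi, rcond=None)[0]

def invert_l1(A, dphi, n_iter=50, eps=1e-6):
    """Approximate L1-norm inversion by iteratively reweighted least
    squares; more robust to an occasional unwrapping error."""
    x = invert_l2(A, dphi)
    for _ in range(n_iter):
        w = 1.0 / np.maximum(np.abs(dphi - A @ x), eps)
        W = np.diag(w)
        x = np.linalg.solve(A.T @ W @ A, A.T @ W @ dphi)
    return x

# Toy network: 5 acquisitions, 6 small-baseline interferograms
pairs = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 2), (2, 4)]
truth = np.array([1.0, 2.2, 2.9, 4.1])  # phase at epochs 1..4
A = build_design_matrix(pairs, 5)
dphi = A @ truth
dphi[4] += 3.0                           # simulate one unwrapping error
print("L2:", invert_l2(A, dphi))
print("L1:", invert_l1(A, dphi))
```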

  20. Serial Echocardiographic Characteristics, Novel Biomarkers and Cachexia Development in Patients with Stable Chronic Heart Failure.

    PubMed

    Gaggin, Hanna K; Belcher, Arianna M; Gandhi, Parul U; Ibrahim, Nasrien E; Januzzi, James L

    2016-12-01

    Little is known regarding objective predictors of cachexia affecting patients with heart failure (HF). We studied 108 stable chronic systolic HF patients with serial echocardiography and biomarker measurements over 10 months. Cachexia was defined as weight loss ≥5% from baseline or final BMI <20 kg/m2; 18.5% developed cachexia. While there were no significant differences in baseline or serial echocardiographic measures in those developing cachexia, we found significant differences in baseline amino-terminal pro-B type natriuretic peptide (NT-proBNP), highly sensitive troponin I, sST2, and endothelin-1. Baseline log NT-proBNP (hazard ratio (HR) = 2.57, p = 0.004) and edema (HR = 3.36, p = 0.04) were predictive of cachexia in an adjusted analysis. When serial measurement of biomarkers was considered, only percent time with NT-proBNP ≥1000 pg/mL was predictive of cachexia. Thus, a close association exists between baseline and serial measurement of NT-proBNP and HF cachexia.

  1. Development and implementation of the Baltimore healthy carry-outs feasibility trial: process evaluation results

    PubMed Central

    2013-01-01

    Background Prepared food sources, including fast food restaurants and carry-outs, are common in low-income urban areas. These establishments provide foods high in calories, sugar, fat, and sodium. The aims of the study were to (1) describe the development and implementation of a carry-out intervention to provide and promote healthy food choices in prepared food sources, and (2) to assess its feasibility through a process evaluation. Methods To promote healthy eating in this setting, a culturally appropriate intervention was developed based on formative research from direct observation, interviews and focus groups. We implemented a 7-month feasibility trial in 8 carry-outs (4 intervention and 4 comparison) in low-income neighborhoods in Baltimore, MD. The trial included three phases: 1) Improving menu boards and labeling to promote healthier items; 2) Promoting healthy sides and beverages and introducing new items; and 3) Introducing affordable healthier combo meals and improving food preparation methods. A process evaluation was conducted to assess intervention reach, dose received, and fidelity using sales receipts, carry-out visit observations, and an intervention exposure assessment. Results On average, Baltimore Healthy Carry-outs (BHC) increased customer reach at intervention carry-outs; purchases increased by 36.8% at the end of the study compared to baseline. Additionally, menu boards and labels were seen by 100.0% and 84.2% of individuals (n = 101), respectively, at study completion compared to baseline. Customers reported purchasing specific foods due to the presence of a photo on the menu board (65.3%) or menu labeling (42.6%), suggesting moderate to high dose received. Promoted entrée availability and revised menu and poster presence all demonstrated high fidelity and feasibility. Conclusions The results suggest that BHC is a culturally acceptable intervention. The program was also immediately adopted by the Baltimore City Food Policy Initiative as a city-wide intervention in its public markets. PMID:23837722

  2. Development and implementation of the Baltimore healthy carry-outs feasibility trial: process evaluation results.

    PubMed

    Lee-Kwan, Seung Hee; Goedkoop, Sonja; Yong, Rachel; Batorsky, Benjamin; Hoffman, Vanessa; Jeffries, Jayne; Hamouda, Mohamed; Gittelsohn, Joel

    2013-07-09

    Prepared food sources, including fast food restaurants and carry-outs, are common in low-income urban areas. These establishments provide foods high in calories, sugar, fat, and sodium. The aims of the study were to (1) describe the development and implementation of a carry-out intervention to provide and promote healthy food choices in prepared food sources, and (2) to assess its feasibility through a process evaluation. To promote healthy eating in this setting, a culturally appropriate intervention was developed based on formative research from direct observation, interviews and focus groups. We implemented a 7-month feasibility trial in 8 carry-outs (4 intervention and 4 comparison) in low-income neighborhoods in Baltimore, MD. The trial included three phases: 1) Improving menu boards and labeling to promote healthier items; 2) Promoting healthy sides and beverages and introducing new items; and 3) Introducing affordable healthier combo meals and improving food preparation methods. A process evaluation was conducted to assess intervention reach, dose received, and fidelity using sales receipts, carry-out visit observations, and an intervention exposure assessment. On average, Baltimore Healthy Carry-outs (BHC) increased customer reach at intervention carry-outs; purchases increased by 36.8% at the end of the study compared to baseline. Additionally, menu boards and labels were seen by 100.0% and 84.2% of individuals (n = 101), respectively, at study completion compared to baseline. Customers reported purchasing specific foods due to the presence of a photo on the menu board (65.3%) or menu labeling (42.6%), suggesting moderate to high dose received. Promoted entrée availability and revised menu and poster presence all demonstrated high fidelity and feasibility. The results suggest that BHC is a culturally acceptable intervention. The program was also immediately adopted by the Baltimore City Food Policy Initiative as a city-wide intervention in its public markets.

  3. Development of Metric for Measuring the Impact of RD&D Funding on GTO's Geothermal Exploration Goals (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenne, S.; Young, K. R.; Thorsteinsson, H.

    The Department of Energy's Geothermal Technologies Office (GTO) provides RD&D funding for geothermal exploration technologies with the goal of lowering the risks and costs of geothermal development and exploration. In 2012, NREL was tasked with developing a metric to measure the impacts of this RD&D funding on the cost and time required for exploration activities. The development of this metric included collecting cost and time data for exploration techniques, creating a baseline suite of exploration techniques to which future exploration and cost and time improvements could be compared, and developing an online tool for graphically showing potential project impacts (all available at http://en.openei.org/wiki/Gateway:Geothermal). The conference paper describes the methodology used to define the baseline exploration suite of techniques (baseline), as well as the approach that was used to create the cost and time data set that populates the baseline. The resulting product, an online tool for measuring impact, and the aggregated cost and time data are available on the Open EI website for public access (http://en.openei.org).

  4. Development of the Orion Crew Module Static Aerodynamic Database. Part 2: Supersonic/Subsonic

    NASA Technical Reports Server (NTRS)

    Bibb, Karen L.; Walker, Eric L.; Brauckmann, Gregory J.; Robinson, Phil

    2011-01-01

    This work describes the process of developing the nominal static aerodynamic coefficients and associated uncertainties for the Orion Crew Module for Mach 8 and below. The database was developed from wind tunnel test data and computational simulations of the smooth Crew Module geometry, with no asymmetries or protuberances. The database covers the full range of Reynolds numbers seen in both entry and ascent abort scenarios. The basic uncertainties were developed as functions of Mach number and total angle of attack from variations in the primary data as well as computations at lower Reynolds numbers, on the baseline geometry, and using different flow solvers. The resulting aerodynamic database represents the Crew Exploration Vehicle Aerosciences Project's best estimate of the nominal aerodynamics for the current Crew Module vehicle.

  5. Emergent HIV-1 Drug Resistance Mutations Were Not Present at Low-Frequency at Baseline in Non-Nucleoside Reverse Transcriptase Inhibitor-Treated Subjects in the STaR Study

    PubMed Central

    Porter, Danielle P.; Daeumer, Martin; Thielen, Alexander; Chang, Silvia; Martin, Ross; Cohen, Cal; Miller, Michael D.; White, Kirsten L.

    2015-01-01

    At Week 96 of the Single-Tablet Regimen (STaR) study, more treatment-naïve subjects that received rilpivirine/emtricitabine/tenofovir DF (RPV/FTC/TDF) developed resistance mutations compared to those treated with efavirenz (EFV)/FTC/TDF by population sequencing. Furthermore, more RPV/FTC/TDF-treated subjects with baseline HIV-1 RNA >100,000 copies/mL developed resistance compared to subjects with baseline HIV-1 RNA ≤100,000 copies/mL. Here, deep sequencing was utilized to assess the presence of pre-existing low-frequency variants in subjects with and without resistance development in the STaR study. Deep sequencing (Illumina MiSeq) was performed on baseline and virologic failure samples for all subjects analyzed for resistance by population sequencing during the clinical study (n = 33), as well as baseline samples from control subjects with virologic response (n = 118). Primary NRTI or NNRTI drug resistance mutations present at low frequency (≥2% to 20%) were detected in 6.6% of baseline samples by deep sequencing, all of which occurred in control subjects. Deep sequencing results were generally consistent with population sequencing but detected additional primary NNRTI and NRTI resistance mutations at virologic failure in seven samples. HIV-1 drug resistance mutations emerging while on RPV/FTC/TDF or EFV/FTC/TDF treatment were not present at low frequency at baseline in the STaR study. PMID:26690199

  6. Emergent HIV-1 Drug Resistance Mutations Were Not Present at Low-Frequency at Baseline in Non-Nucleoside Reverse Transcriptase Inhibitor-Treated Subjects in the STaR Study.

    PubMed

    Porter, Danielle P; Daeumer, Martin; Thielen, Alexander; Chang, Silvia; Martin, Ross; Cohen, Cal; Miller, Michael D; White, Kirsten L

    2015-12-07

    At Week 96 of the Single-Tablet Regimen (STaR) study, more treatment-naïve subjects that received rilpivirine/emtricitabine/tenofovir DF (RPV/FTC/TDF) developed resistance mutations compared to those treated with efavirenz (EFV)/FTC/TDF by population sequencing. Furthermore, more RPV/FTC/TDF-treated subjects with baseline HIV-1 RNA >100,000 copies/mL developed resistance compared to subjects with baseline HIV-1 RNA ≤100,000 copies/mL. Here, deep sequencing was utilized to assess the presence of pre-existing low-frequency variants in subjects with and without resistance development in the STaR study. Deep sequencing (Illumina MiSeq) was performed on baseline and virologic failure samples for all subjects analyzed for resistance by population sequencing during the clinical study (n = 33), as well as baseline samples from control subjects with virologic response (n = 118). Primary NRTI or NNRTI drug resistance mutations present at low frequency (≥2% to 20%) were detected in 6.6% of baseline samples by deep sequencing, all of which occurred in control subjects. Deep sequencing results were generally consistent with population sequencing but detected additional primary NNRTI and NRTI resistance mutations at virologic failure in seven samples. HIV-1 drug resistance mutations emerging while on RPV/FTC/TDF or EFV/FTC/TDF treatment were not present at low frequency at baseline in the STaR study.

  7. Assessing usefulness and researcher satisfaction with consent form templates.

    PubMed

    Larson, Elaine L; Teller, Alan; Aguirre, Alejandra N; Jackson, Jhia; Meyer, Dodi

    2017-08-01

    We aimed to improve the research consenting process by developing and evaluating simplified consent forms. Four templates written at the eighth-tenth grade reading level were developed and trialed by a group of experts in clinical research, health literacy, national regulatory requirements, and end users. Researchers from protocols which had received expedited review were surveyed at 2 time points regarding their use and assessment of the templates. At baseline, 18/86 (20.9%) responding researchers had heard of the templates and 5 (5.8%) reported that they had used them; 2 years later, 54.2% (32/59) had heard of the templates and 87.5% (28/32) had used them (p < 0.001). Consent form templates may be one mechanism to improve patient comprehension of research protocols as well as efficiency of the review process, but require considerable time for development and implementation, and one key to their success is involvement and support from the IRB and technical staff.

  8. 10 CFR 70.64 - Requirements for new facilities or new processes at existing facilities.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... behavior of items relied on for safety. (b) Facility and system design and facility layout must be based on... existing facilities. (a) Baseline design criteria. Each prospective applicant or licensee shall address the following baseline design criteria in the design of new facilities. Each existing licensee shall address the...

  9. 10 CFR 70.64 - Requirements for new facilities or new processes at existing facilities.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... behavior of items relied on for safety. (b) Facility and system design and facility layout must be based on... existing facilities. (a) Baseline design criteria. Each prospective applicant or licensee shall address the following baseline design criteria in the design of new facilities. Each existing licensee shall address the...

  10. 10 CFR 70.64 - Requirements for new facilities or new processes at existing facilities.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... behavior of items relied on for safety. (b) Facility and system design and facility layout must be based on... existing facilities. (a) Baseline design criteria. Each prospective applicant or licensee shall address the following baseline design criteria in the design of new facilities. Each existing licensee shall address the...

  11. 10 CFR 70.64 - Requirements for new facilities or new processes at existing facilities.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... behavior of items relied on for safety. (b) Facility and system design and facility layout must be based on... existing facilities. (a) Baseline design criteria. Each prospective applicant or licensee shall address the following baseline design criteria in the design of new facilities. Each existing licensee shall address the...

  12. VLBI geodesy - 2 parts-per-billion precision in length determinations for transcontinental baselines

    NASA Technical Reports Server (NTRS)

    Davis, J. L.; Herring, T. A.; Shapiro, I. I.

    1988-01-01

    VLBI was used to make twenty-two independent measurements, between September 1984 and December 1986, of the length of the 3900-km baseline between the Mojave site in California and the Haystack/Westford site in Massachusetts. These experiments differ from the typical geodetic VLBI experiments in that a large fraction of observations is obtained at elevation angles between 4 and 10 deg. Data from these low elevation angles allow the vertical coordinate of site position, and hence the baseline length, to be estimated with greater precision. For the sixteen experiments processed thus far, the weighted root-mean-square scatter of the estimates of the baseline length is 8 mm.
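    The quoted summary statistic can in principle be reproduced from the individual length estimates. The following is a minimal sketch, using purely illustrative numbers (the paper's individual estimates and formal errors are not given here), of how a weighted root-mean-square scatter about the weighted mean might be computed:

      import numpy as np

      def weighted_rms_scatter(lengths_mm, sigmas_mm):
          # Weighted RMS scatter of repeated estimates about their weighted mean,
          # with weights 1/sigma^2.
          lengths = np.asarray(lengths_mm, dtype=float)
          weights = 1.0 / np.asarray(sigmas_mm, dtype=float) ** 2
          wmean = np.sum(weights * lengths) / np.sum(weights)
          return np.sqrt(np.sum(weights * (lengths - wmean) ** 2) / np.sum(weights))

      # Sixteen illustrative estimates of a ~3900-km baseline, as offsets in mm
      # from a nominal length, each with an assumed 7 mm formal error.
      offsets = np.array([2, -5, 7, -8, 3, 11, -6, 4, -9, 1, 6, -3, 8, -7, 5, -2], float)
      sigmas = np.full(offsets.size, 7.0)
      print(f"wrms scatter: {weighted_rms_scatter(3.9e9 + offsets, sigmas):.1f} mm")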

  13. Project resource reallocation algorithm

    NASA Technical Reports Server (NTRS)

    Myers, J. E.

    1981-01-01

    A methodology for adjusting baseline cost estimates according to project schedule changes is described. An algorithm which performs a linear expansion or contraction of the baseline project resource distribution in proportion to the project schedule expansion or contraction is presented. Input to the algorithm consists of the deck of cards (PACE input data) prepared for the baseline project schedule as well as a specification of the nature of the baseline schedule change. Output of the algorithm is a new deck of cards with all work breakdown structure block and element of cost estimates redistributed for the new project schedule. This new deck can be processed through PACE to produce a detailed cost estimate for the new schedule.
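    As a rough illustration of the algorithm's core idea (not of the actual PACE card format, which is not reproduced here), the sketch below linearly stretches or compresses a per-period baseline cost profile to a new schedule length while conserving the total estimated cost; the data layout and function name are hypothetical:

      import numpy as np

      def rescale_cost_profile(baseline_costs, new_periods):
          # Linearly expand/contract the per-period cost distribution in proportion
          # to the schedule change, then renormalize so the total cost is unchanged.
          old_t = np.linspace(0.0, 1.0, len(baseline_costs))
          new_t = np.linspace(0.0, 1.0, new_periods)
          rescaled = np.interp(new_t, old_t, baseline_costs)
          return rescaled * np.sum(baseline_costs) / np.sum(rescaled)

      baseline = [10.0, 30.0, 50.0, 30.0, 10.0]        # 5-period baseline estimate
      stretched = rescale_cost_profile(baseline, 8)    # schedule slips to 8 periods
      print(np.round(stretched, 2), round(float(np.sum(stretched)), 2))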

  14. Brain tissue volumes in relation to cognitive function and risk of dementia.

    PubMed

    Ikram, M Arfan; Vrooman, Henri A; Vernooij, Meike W; den Heijer, Tom; Hofman, Albert; Niessen, Wiro J; van der Lugt, Aad; Koudstaal, Peter J; Breteler, Monique M B

    2010-03-01

    We investigated in a population-based cohort study the association of global and lobar brain tissue volumes with specific cognitive domains and risk of dementia. Participants (n=490; 60-90 years) were non-demented at baseline (1995-1996). From baseline brain MRI-scans we obtained global and lobar volumes of CSF, GM, normal WM, white matter lesions and hippocampus. We performed neuropsychological testing at baseline to assess information processing speed, executive function, memory function and global cognitive function. Participants were followed for incident dementia until January 1, 2005. Larger volumes of CSF and WML were associated with worse performance on all neuropsychological tests, and an increased risk of dementia. Smaller WM volume was related to poorer information processing speed and executive function. In contrast, smaller GM volume was associated with worse memory function and increased risk of dementia. When investigating lobar GM volumes, we found that hippocampal volume and temporal GM volume were most strongly associated with risk of dementia, even in persons without objective and subjective cognitive deficits at baseline, followed by frontal and parietal GM volumes. Copyright 2008 Elsevier Inc. All rights reserved.

  15. GPS-Based Precision Baseline Reconstruction for the TanDEM-X SAR-Formation

    NASA Technical Reports Server (NTRS)

    Montenbruck, O.; vanBarneveld, P. W. L.; Yoon, Y.; Visser, P. N. A. M.

    2007-01-01

    The TanDEM-X formation employs two separate spacecraft to collect interferometric Synthetic Aperture Radar (SAR) measurements over baselines of about 1 km. These will allow the generation of a global Digital Elevation Model (DEM) with a relative vertical accuracy of 2-4 m and a 10 m ground resolution. As part of the ground processing, the separation of the SAR antennas at the time of each data take must be reconstructed with a 1 mm accuracy using measurements from two geodetic grade GPS receivers. The paper discusses the TanDEM-X mission as well as the methods employed for determining the interferometric baseline with utmost precision. Measurements collected during the close fly-by of the two GRACE satellites serve as a reference case to illustrate the processing concept, expected accuracy and quality control strategies.

  16. Using multiple-accumulator CMACs to improve efficiency of the X part of an input-buffered FX correlator

    NASA Astrophysics Data System (ADS)

    Lapshev, Stepan; Hasan, S. M. Rezaul

    2017-04-01

    This paper presents the approach of using complex multiplier-accumulators (CMACs) with multiple accumulators to reduce the total number of memory operations in an input-buffered architecture for the X part of an FX correlator. A processing unit of this architecture uses an array of CMACs that are reused for different groups of baselines. The disadvantage of processing correlations in this way is that each input data sample has to be read multiple times from the memory because each input signal is used in many of these baseline groups. While a one-accumulator CMAC cannot switch to a different baseline until it is finished integrating the current one, a multiple-accumulator CMAC can. Thus, the array of multiple-accumulator CMACs can switch between processing different baselines that share some input signals at any moment to reuse the current data in the processing buffers. In this way significant reductions in the number of memory read operations are achieved with only a few accumulators per CMAC. For example, for a large number of input signals three-accumulator CMACs reduce the total number of memory operations by more than a third. Simulated energy measurements of four VLSI designs in a high-performance 28 nm CMOS technology are presented in this paper to demonstrate that using multiple accumulators can also lead to reduced power dissipation of the processing array. Using three accumulators as opposed to one has been found to reduce the overall energy of 8-bit CMACs by 1.4% through the reduction of the switching activity within their circuits, which is in addition to a more than 30% reduction in the memory.

  17. One-year incidence of carpal tunnel syndrome in Latino poultry processing workers and other Latino manual workers.

    PubMed

    Cartwright, Michael S; Walker, Francis O; Newman, Jill C; Schulz, Mark R; Arcury, Thomas A; Grzywacz, Joseph G; Mora, Dana C; Chen, Haiying; Eaton, Bethany; Quandt, Sara A

    2014-03-01

    To determine the incidence of carpal tunnel syndrome (CTS) over 1 year in Latino poultry processing workers. Symptoms and nerve conduction studies were used to identify Latino poultry processing workers (106 wrists) and Latinos in other manual labor occupations (257 wrists) that did not have CTS at baseline, and these individuals were then evaluated in the same manner 1 year later. Based on wrists, the 1-year incidence of CTS was higher in poultry processing workers than non-poultry manual workers (19.8% vs. 11.7%, P = 0.022). Poultry workers had a higher odds (1.89; P = 0.089) of developing CTS over 1 year compared to non-poultry manual workers. Latino poultry processing workers have an incidence of CTS that is possibly higher than Latinos in other manual labor positions. Latino poultry workers' high absolute and relative risk of CTS likely results from the repetitive and strenuous nature of poultry processing work. © 2013 Wiley Periodicals, Inc.

  18. Baseline Muscle Mass Is a Poor Predictor of Functional Overload-Induced Gain in the Mouse Model

    PubMed Central

    Kilikevicius, Audrius; Bunger, Lutz; Lionikas, Arimantas

    2016-01-01

    Genetic background contributes substantially to individual variability in muscle mass. Muscle hypertrophy in response to resistance training can also vary extensively. However, it is less clear if muscle mass at baseline is predictive of the hypertrophic response. The aim of this study was to examine the effect of genetic background on variability in muscle mass at baseline and in the adaptive response of the mouse fast- and slow-twitch muscles to overload. Males of eight laboratory mouse strains: C57BL/6J (B6, n = 17), BALB/cByJ (n = 7), DBA/2J (D2, n = 12), B6.A-(rs3676616-D10Utsw1)/Kjn (B6.A, n = 9), C57BL/6J-Chr10A/J/NaJ (B6.A10, n = 8), BEH+/+ (n = 11), BEH (n = 12), and DUHi (n = 12), were studied. Compensatory growth of soleus and plantaris muscles was triggered by a 4-week overload induced by synergist unilateral ablation. Muscle weight in the control leg (baseline) varied from 5.2 ± 0.7 mg soleus and 11.4 ± 1.3 mg plantaris in D2 mice to 18.0 ± 1.7 mg soleus in DUHi and 43.7 ± 2.6 mg plantaris in BEH (p < 0.001 for both muscles). In addition, soleus in the B6.A10 strain was ~40% larger (p < 0.001) compared to the B6. Functional overload increased muscle weight; however, the extent of gain was strain-dependent for both soleus (p < 0.01) and plantaris (p < 0.02) even after accounting for the baseline differences. For the soleus muscle, the BEH strain emerged as the least responsive, with a 1.3-fold increase, compared to a 1.7-fold gain in the most responsive D2 strain, and there was no difference in the gain between the B6.A10 and B6 strains. The BEH strain appeared the least responsive in the gain of plantaris as well, 1.3-fold, compared to ~1.5-fold gain in the remaining strains. We conclude that variation in muscle mass at baseline is not a reliable predictor of that in the overload-induced gain. This suggests that a different set of genes influence variability in muscle mass acquired in the process of normal development, growth, and maintenance, and in the process of adaptive growth of the muscle challenged by overload. PMID:27895593

  19. Predicting Coronary Artery Aneurysms in Kawasaki Disease at a North American Center: An Assessment of Baseline z Scores.

    PubMed

    Son, Mary Beth F; Gauvreau, Kimberlee; Kim, Susan; Tang, Alexander; Dedeoglu, Fatma; Fulton, David R; Lo, Mindy S; Baker, Annette L; Sundel, Robert P; Newburger, Jane W

    2017-05-31

    Accurate risk prediction of coronary artery aneurysms (CAAs) in North American children with Kawasaki disease remains a clinical challenge. We sought to determine the predictive utility of baseline coronary dimensions adjusted for body surface area ( z scores) for future CAAs in Kawasaki disease and explored the extent to which addition of established Japanese risk scores to baseline coronary artery z scores improved discrimination for CAA development. We explored the relationships of CAA with baseline z scores; with Kobayashi, Sano, Egami, and Harada risk scores; and with the combination of baseline z scores and risk scores. We defined CAA as a maximum z score (zMax) ≥2.5 of the left anterior descending or right coronary artery at 4 to 8 weeks of illness. Of 261 patients, 77 patients (29%) had a baseline zMax ≥2.0. CAAs occurred in 15 patients (6%). CAAs were strongly associated with baseline zMax ≥2.0 versus <2.0 (12 [16%] versus 3 [2%], respectively, P <0.001). Baseline zMax ≥2.0 had a C statistic of 0.77, good sensitivity (80%), and excellent negative predictive value (98%). None of the risk scores alone had adequate discrimination. When high-risk status per the Japanese risk scores was added to models containing baseline zMax ≥2.0, none were significantly better than baseline zMax ≥2.0 alone. In a North American center, baseline zMax ≥2.0 in children with Kawasaki disease demonstrated high predictive utility for later development of CAA. Future studies should validate the utility of our findings. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
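    The screening statistics quoted above follow directly from the counts given in the abstract. A small sketch reconstructing the implied 2x2 table (no patient-level data are used, only the abstract's totals) is:

      def screening_metrics(tp, fp, fn, tn):
          # Sensitivity, specificity, PPV and NPV from a 2x2 classification table.
          return tp / (tp + fn), tn / (tn + fp), tp / (tp + fp), tn / (tn + fn)

      # Counts implied by the abstract: 261 patients, 77 with baseline zMax >= 2.0,
      # 15 CAAs in total, 12 of which occurred in the zMax >= 2.0 group.
      tp, fn = 12, 3
      fp = 77 - tp           # zMax >= 2.0 but no CAA
      tn = 261 - 77 - fn     # zMax < 2.0 and no CAA
      sens, spec, ppv, npv = screening_metrics(tp, fp, fn, tn)
      print(f"sensitivity {sens:.0%}, specificity {spec:.0%}, PPV {ppv:.0%}, NPV {npv:.1%}")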

  20. The Flight Optimization System Weights Estimation Method

    NASA Technical Reports Server (NTRS)

    Wells, Douglas P.; Horvath, Bryce L.; McCullers, Linwood A.

    2017-01-01

    FLOPS has been the primary aircraft synthesis software used by the Aeronautics Systems Analysis Branch at NASA Langley Research Center. It was created for rapid conceptual aircraft design and advanced technology impact assessments. FLOPS is a single computer program that includes weights estimation, aerodynamics estimation, engine cycle analysis, propulsion data scaling and interpolation, detailed mission performance analysis, takeoff and landing performance analysis, noise footprint estimation, and cost analysis. It is well known as a baseline and common denominator for aircraft design studies. FLOPS is capable of calibrating a model to known aircraft data, making it useful for new aircraft and modifications to existing aircraft. The weight estimation method in FLOPS is known to be of high fidelity for conventional tube with wing aircraft and a substantial amount of effort went into its development. This report serves as a comprehensive documentation of the FLOPS weight estimation method. The development process is presented with the weight estimation process.

  1. Propellant development for the Advanced Solid Rocket Motor

    NASA Technical Reports Server (NTRS)

    Landers, L. C.; Stanley, C. B.; Ricks, D. W.

    1991-01-01

    The properties of a propellant developed for the NASA Advanced Solid Rocket Motor (ASRM) are described in terms of its composition, performance, and compliance with NASA specifications. The class 1.3 HTPB/AP/Al propellant employs an ester plasticizer and the content of ballistic solids is set at 88 percent. Ammonia evolution is prevented by the utilization of a neutral bonding agent which allows continuous mixing. The propellant also comprises a bimodal AP blend with one ground fraction, ground AP of at least 20 microns, and ferric oxide to control the burning rate. The propellant's characteristics are discussed in terms of tradeoffs in AP particle size and the types of Al powder, bonding agent, and HTPB polymer. The size and shape of the ballistic solids affect the processability, ballistic properties, and structural properties of the propellant. The revised baseline composition is based on maximizing the robustness of in-process viscosity, structural integrity, and burning-rate tailoring range.

  2. Study protocol: the Childhood to Adolescence Transition Study (CATS)

    PubMed Central

    2013-01-01

    Background Puberty is a multifaceted developmental process that begins in late-childhood with a cascade of endocrine changes that ultimately lead to sexual maturation and reproductive capability. The transition through puberty is marked by an increased risk for the onset of a range of health problems, particularly those related to the control of behaviour and emotion. Early onset puberty is associated with a greater risk of cancers of the reproductive tract and cardiovascular disease. Previous studies have had methodological limitations and have tended to view puberty as a unitary process, with little distinction between adrenarche, gonadarche and linear growth. The Childhood to Adolescence Transition Study (CATS) aims to prospectively examine associations between the timing and stage of the different hormonally-mediated changes, as well as the onset and course of common health and behavioural problems that emerge in the transition from childhood to adolescence. The initial focus of CATS is on adrenarche, the first hormonal process in the pubertal cascade, which begins for most children at around 8 years of age. Methods/Design CATS is a longitudinal population-based cohort study. All Grade 3 students (8–9 years of age) from a stratified cluster sample of schools in Melbourne, Australia were invited to take part. In total, 1239 students and a parent/guardian were recruited to participate in the study. Measures are repeated annually and comprise student, parent and teacher questionnaires, and student anthropometric measurements. A saliva sample was collected from students at baseline and will be repeated at later waves, with the primary purpose of measuring hormonal indices of adrenarche and gonadarche. Discussion CATS is uniquely placed to capture biological and phenotypic indices of the pubertal process from its earliest manifestations, together with anthropometric measures and assessment of child health and development. The cohort will provide rich detail of the development, lifestyle, external circumstances and health of children during the transition from childhood through to adolescence. Baseline associations between the hormonal measures and measures of mental health and behaviour will initially be examined cross-sectionally, and then in later waves longitudinally. CATS will make a unique contribution to the understanding of adrenarche and puberty in children’s health and development. PMID:24103080

  3. CryoSat Ice Processor: High-Level Overview of Baseline-C Data and Quality-Control

    NASA Astrophysics Data System (ADS)

    Mannan, R.; Webb, E.; Hall, A.; Bouffard, J.; Femenias, P.; Parrinello, T.; Bouffard, J.; Brockley, D.; Baker, S.; Scagliola, M.; Urien, S.

    2016-08-01

    Since April 2015, the CryoSat ice products have been generated with the new Baseline-C Instrument Processing Facilities (IPFs). This represents a major upgrade to the CryoSat ice IPFs and is the baseline for the second CryoSat Reprocessing Campaign. Baseline-C introduces major evolutions with respect to Baseline-B, most notably the release of freeboard data within the L2 SAR products, following optimisation of the SAR retracker. Additional L2 improvements include a new Arctic Mean Sea Surface (MSS) in SAR; a new tuneable land ice retracker in LRM; and a new Digital Elevation Model (DEM) in SARIn. At L1B new attitude fields have been introduced and existing datation and range biases reduced. This paper provides a high level overview of the changes and evolutions implemented at Baseline-C in order to improve CryoSat L1B and L2 data characteristics and exploitation over polar regions. An overview of the main Quality Control (QC) activities performed on operational Baseline-C products is also presented.

  4. Analysis of Seasonal Signal in GPS Short-Baseline Time Series

    NASA Astrophysics Data System (ADS)

    Wang, Kaihua; Jiang, Weiping; Chen, Hua; An, Xiangdong; Zhou, Xiaohui; Yuan, Peng; Chen, Qusen

    2018-04-01

    Proper modeling of seasonal signals and their quantitative analysis are of interest in geoscience applications, which are based on position time series of permanent GPS stations. Seasonal signals in GPS short-baseline (< 2 km) time series, if they exist, are mainly related to site-specific effects, such as thermal expansion of the monument (TEM). However, only part of the seasonal signal can be explained by known factors due to the limited data span, the GPS processing strategy and/or the adoption of an imperfect TEM model. In this paper, to better understand the seasonal signal in GPS short-baseline time series, we adopted and processed six different short-baselines with data span that varies from 2 to 14 years and baseline length that varies from 6 to 1100 m. To avoid seasonal signals that are overwhelmed by noise, each of the station pairs is chosen with significant differences in their height (> 5 m) or type of the monument. For comparison, we also processed an approximately zero baseline with a distance of < 1 m and identical monuments. The daily solutions show that there are apparent annual signals with annual amplitude of 1 mm (maximum amplitude of 1.86 ± 0.17 mm) on almost all of the components, which are consistent with the results from previous studies. Semi-annual signal with a maximum amplitude of 0.97 ± 0.25 mm is also present. The analysis of time-correlated noise indicates that instead of flicker (FL) or random walk (RW) noise, band-pass-filtered (BP) noise is valid for approximately 40% of the baseline components, and another 20% of the components can be best modeled by a combination of the first-order Gauss-Markov (FOGM) process plus white noise (WN). The TEM displacements are then modeled by considering the monument height of the building structure beneath the GPS antenna. The median contributions of TEM to the annual amplitude in the vertical direction are 84% and 46% with and without additional parts of the monument, respectively. Obvious annual signals with amplitude > 0.4 mm in the horizontal direction are observed in five short-baselines, and the amplitudes exceed 1 mm in four of them. These horizontal seasonal signals are likely related to the propagation of daily/sub-daily TEM displacement or other signals related to the site environment. Mismodeling of the tropospheric delay may also introduce spurious seasonal signals with annual amplitudes of 5 and 2 mm, respectively, for two short-baselines with elevation differences greater than 100 m. The results suggest that the monument height of the additional part of a typical GPS station should be considered when estimating the TEM displacement and that the tropospheric delay should be modeled cautiously, especially with station pairs with apparent elevation differences. The scheme adopted in this paper is expected to explicate more seasonal signals in GPS coordinate time series, particularly in the vertical direction.
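    Annual and semi-annual amplitudes of the kind reported above are typically estimated by least-squares fitting of harmonic terms to the daily coordinate time series. A minimal sketch on synthetic data (the study's series, noise models, and TEM corrections are not reproduced here) is:

      import numpy as np

      def fit_seasonal(t_years, y_mm):
          # Least-squares fit of offset, trend, and annual + semi-annual harmonics:
          # y = a + b*t + sum_k [c_k cos(2 pi k t) + s_k sin(2 pi k t)], k = 1, 2.
          cols = [np.ones_like(t_years), t_years]
          for k in (1, 2):
              cols += [np.cos(2 * np.pi * k * t_years), np.sin(2 * np.pi * k * t_years)]
          coef, *_ = np.linalg.lstsq(np.column_stack(cols), y_mm, rcond=None)
          return np.hypot(coef[2], coef[3]), np.hypot(coef[4], coef[5])

      # Synthetic 6-year daily series: 1.5 mm annual + 0.5 mm semi-annual signal + noise.
      rng = np.random.default_rng(0)
      t = np.arange(0.0, 6.0, 1.0 / 365.25)
      y = 1.5 * np.cos(2 * np.pi * t) + 0.5 * np.sin(4 * np.pi * t) + rng.normal(0, 1.0, t.size)
      annual, semiannual = fit_seasonal(t, y)
      print(f"annual amplitude ~ {annual:.2f} mm, semi-annual ~ {semiannual:.2f} mm")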

  5. Trauma Quality Improvement: Reducing Triage Errors by Automating the Level Assignment Process.

    PubMed

    Stonko, David P; O Neill, Dillon C; Dennis, Bradley M; Smith, Melissa; Gray, Jeffrey; Guillamondegui, Oscar D

    2018-04-12

    Trauma patients are triaged by the severity of their injury or need for intervention while en route to the trauma center according to trauma activation protocols that are institution specific. Significant research has been aimed at improving these protocols in order to optimize patient outcomes while striving for efficiency in care. However, it is known that patients are often undertriaged or overtriaged because protocol adherence remains imperfect. The goal of this quality improvement (QI) project was to improve this adherence, and thereby reduce the triage error. It was conducted as part of the formal undergraduate medical education curriculum at this institution. A QI team was assembled and baseline data were collected, then 2 Plan-Do-Study-Act (PDSA) cycles were implemented sequentially. During the first cycle, a novel web tool was developed and implemented in order to automate the level assignment process (it takes EMS-provided data and automatically determines the level); the tool was based on the existing trauma activation protocol. The second PDSA cycle focused on improving triage accuracy in isolated, less than 10% total body surface area burns, which we identified to be a point of common error. Traumas were reviewed and tabulated at the end of each PDSA cycle, and triage accuracy was followed with a run chart. This study was performed at Vanderbilt University Medical Center and Medical School, which has a large level 1 trauma center covering over 75,000 square miles, and which sees urban, suburban, and rural trauma. The baseline assessment period and each PDSA cycle lasted 2 weeks. During this time, all activated, adult, direct traumas were reviewed. There were 180 patients during the baseline period, 189 after the first test of change, and 150 after the second test of change. All were included in analysis. Of 180 patients, 30 were inappropriately triaged during baseline analysis (3 undertriaged and 27 overtriaged) versus 16 of 189 (3 undertriaged and 13 overtriaged) following implementation of the web tool (p = 0.017 for combined errors). Overtriage dropped further from baseline to 10/150 after the second test of change (p = 0.005). The total number of triaged patients dropped from 92.3/week to 75.5/week after the second test of change. There was no statistically significant change in the undertriage rate. The combination of web tool implementation and protocol refinement decreased the combined triage error rate by over 50% (from 16.7%-7.9%). We developed and tested a web tool that improved triage accuracy, and provided a sustainable method to enact future quality improvement. This web tool and QI framework would be easily expandable to other hospitals. Copyright © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
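    The web tool described above maps EMS-provided fields to an activation level according to the institution's protocol. The abstract does not give the actual criteria, so the rule set in the sketch below is entirely hypothetical and only illustrates the automation pattern:

      def assign_activation_level(ems):
          # Illustrative rule-based level assignment from EMS-provided data.
          # The thresholds below are hypothetical, not the institutional protocol.
          if ems.get("sbp", 120) < 90 or ems.get("gcs", 15) <= 8 or ems.get("intubated", False):
              return 1                                   # full activation
          if ems.get("penetrating_torso", False) or ems.get("tbsa_burn_pct", 0) >= 10:
              return 1
          if ems.get("long_bone_fractures", 0) >= 2 or ems.get("tbsa_burn_pct", 0) > 0:
              return 2                                   # partial activation
          return 3                                       # consult / routine evaluation

      print(assign_activation_level({"sbp": 85, "gcs": 14}))     # -> 1
      print(assign_activation_level({"tbsa_burn_pct": 5}))       # -> 2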

  6. Development and Validation of an Immunoassay for Quantification of Topoisomerase I in Solid Tumor Tissues

    PubMed Central

    Pfister, Thomas D.; Hollingshead, Melinda; Kinders, Robert J.; Zhang, Yiping; Evrard, Yvonne A.; Ji, Jiuping; Khin, Sonny A.; Borgel, Suzanne; Stotler, Howard; Carter, John; Divelbiss, Raymond; Kummar, Shivaani; Pommier, Yves; Parchment, Ralph E.; Tomaszewski, Joseph E.; Doroshow, James H.

    2012-01-01

    Background Topoisomerase I (Top1) is a proven target for cancer therapeutics. Recent data from the Fluorouracil, Oxaliplatin, CPT-11: Use and Sequencing (FOCUS) trial demonstrated that nuclear staining of Top1 correlates with chemotherapeutic efficacy. Such a correlation may help identify patients likely to respond to Top1 inhibitors and illuminate their mechanism of action. Cellular response to Top1 inhibitors is complex, but Top1 target engagement is a necessary first step in this process. This paper reports the development and validation of a quantitative immunoassay for Top1 in tumors. Methodology/Principal Findings We have developed and validated a two-site enzyme chemiluminescent immunoassay for quantifying Top1 levels in tumor biopsies. Analytical validation of the assay established the inter-day coefficient of variation at 9.3%±3.4% and a 96.5%±7.3% assay accuracy. Preclinical fit-for-purpose modeling of topotecan time- and dose-effects was performed using topotecan-responsive and -nonresponsive xenografts in athymic nude mice. Higher baseline levels of Top1 were observed in topotecan-responsive than -nonresponsive tumors. Top1 levels reached a maximal decrease 4 to 7 hours following treatment of engrafted mice with topotecan and the indenoisoquinoline NSC 724998. Conclusions/Significance Our analysis of Top1 levels in control and treated tumors supports the previously proposed mechanism of action for Top1 inhibitor efficacy, wherein higher baseline Top1 levels lead to formation of more covalent-complex-dependent double-strand break damage and, ultimately, cell death. In contrast, xenografts with lower baseline Top1 levels accumulate fewer double-stand breaks, and may be more resistant to Top1 inhibitors. Our results support further investigation into the use of Top1 levels in tumors as a potential predictive biomarker. The Top1 immunoassay described in this paper has been incorporated into a Phase I clinical trial at the National Cancer Institute to assess pharmacodynamic response in tumor biopsies and determine whether baseline Top1 levels are predictive of response to indenoisoquinoline Top1 inhibitors. PMID:23284638

  7. Development of a general baseline toxicity QSAR model for the fish embryo acute toxicity test.

    PubMed

    Klüver, Nils; Vogs, Carolina; Altenburger, Rolf; Escher, Beate I; Scholz, Stefan

    2016-12-01

    Fish embryos have become a popular model in ecotoxicology and toxicology. The fish embryo acute toxicity test (FET) with the zebrafish embryo was recently adopted by the OECD as technical guideline TG 236 and a large database of concentrations causing 50% lethality (LC50) is available in the literature. Quantitative Structure-Activity Relationships (QSARs) of baseline toxicity (also called narcosis) are helpful to estimate the minimum toxicity of chemicals to be tested and to identify excess toxicity in existing data sets. Here, we analyzed an existing fish embryo toxicity database and established a QSAR for fish embryo LC50 using chemicals that were independently classified to act according to the non-specific mode of action of baseline toxicity. The octanol-water partition coefficient Kow is commonly applied to discriminate between non-polar and polar narcotics. Replacing the Kow by the liposome-water partition coefficient Klipw yielded a common QSAR for polar and non-polar baseline toxicants. This developed baseline toxicity QSAR was applied to compare the final mode of action (MOA) assignment of 132 chemicals. Further, we included the analysis of internal lethal concentration (ILC50) and chemical activity (La50) as complementary approaches to evaluate the robustness of the FET baseline toxicity. The analysis of the FET dataset revealed that specifically acting and reactive chemicals converged towards the baseline toxicity QSAR with increasing hydrophobicity. The developed FET baseline toxicity QSAR can be used to identify specifically acting or reactive compounds by determination of the toxic ratio and in combination with appropriate endpoints to infer the MOA for chemicals. Copyright © 2016 Elsevier Ltd. All rights reserved.
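    The toxic-ratio screening mentioned in the abstract compares an observed LC50 with the LC50 predicted by the baseline-toxicity QSAR. The sketch below shows the arithmetic only; the slope and intercept are placeholder values, not the fitted coefficients from this study:

      def predicted_baseline_log_lc50(log_klipw, slope=-0.86, intercept=-1.4):
          # Hypothetical baseline-toxicity QSAR: log10 LC50 [mol/L] as a linear
          # function of log10 Klipw (placeholder coefficients, not this study's fit).
          return slope * log_klipw + intercept

      def toxic_ratio(log_klipw, observed_lc50_mol_per_l):
          # TR = baseline-predicted LC50 / observed LC50; TR >> 1 flags excess toxicity.
          return 10 ** predicted_baseline_log_lc50(log_klipw) / observed_lc50_mol_per_l

      # A chemical with log Klipw = 3.0 observed at LC50 = 1e-6 mol/L:
      print(f"toxic ratio = {toxic_ratio(3.0, 1e-6):.0f}")   # ~100, suggesting excess toxicity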

  8. Developing a culturally competent health network: a planning framework and guide.

    PubMed

    Gertner, Eric J; Sabino, Judith N; Mahady, Erica; Deitrich, Lynn M; Patton, Jarret R; Grim, Mary Kay; Geiger, James F; Salas-Lopez, Debbie

    2010-01-01

    The number of cultural competency initiatives in healthcare is increasing due to many factors, including changing demographics, quality improvement and regulatory requirements, equitable care missions, and accreditation standards. To facilitate organization-wide transformation, a hospital or healthcare system must establish strategic goals, objectives, and implementation tasks for culturally competent provision of care. This article reports the largely successful results of a cultural competency program instituted at a large system in eastern Pennsylvania. Prior to the development of its cultural competency initiative, Lehigh Valley Health Network, Allentown, Pennsylvania, saw isolated activities producing innovative solutions to diversity and culture issues in the provision of equitable care. But it took a transformational event to support an organization-wide program in cultural competency by strengthening leadership buy-in and providing a sense of urgency, excitement, and shared vision among multiple stakeholders. A multidisciplinary task force, including senior leaders and a diverse group of employees, was created with the authority and responsibility to enact changes. Through a well-organized strategic planning process, existing patient and community demographic data were reviewed to describe existing disparities, a baseline assessment was completed, a mission statement was created, and clear metrics were developed. The strategic plan, which focused on five key areas (demographics, language-appropriate services, employees, training, and education/communication), was approved by the network's chief executive officer and senior managers to demonstrate commitment prior to implementation. Strategic plan implementation proceeded through a project structure consisting of subproject teams charged with achieving the following specific objectives: develop a cultural material repository, enhance employee recruitment/retention, establish a baseline assessment, standardize data collection, provide language-appropriate services, and develop an education program. Change management and project management methodologies; defined roles and responsibilities; and specific, measurable, attainable, realistic, and time-bound goals were used in the implementation. This process has supported organizational change, thereby promoting high-quality, safe, and equitable care through widespread expectations of culturally competent care delivery across the entire network. Using this "ecologic approach" will ensure long-term success.

  9. Quantification of baseline pupillary response and task-evoked pupillary response during constant and incremental task load.

    PubMed

    Mosaly, Prithima R; Mazur, Lukasz M; Marks, Lawrence B

    2017-10-01

    The methods employed to quantify the baseline pupil size and task-evoked pupillary response (TEPR) may affect the overall study results. To test this hypothesis, the objective of this study was to assess variability in baseline pupil size and TEPR during two basic working memory tasks: constant load of 3-letters memorisation-recall (10 trials), and incremental load memorisation-recall (two trials of each load level), using two commonly used methods: (1) change from trial/load-specific baseline, (2) change from constant baseline. Results indicated that there was a significant shift in baseline between the trials for constant load, and between the load levels for incremental load. The TEPR was independent of shifts in baseline using method 1 only for constant load, and method 2 only for higher levels of incremental load condition. These important findings suggest that the assessment of both the baseline and methods to quantify TEPR are critical in ergonomics application, especially in studies with a small number of trials per subject per condition. Practitioner Summary: Quantification of TEPR can be affected by shifts in baseline pupil size that are most likely affected by non-cognitive factors when other external factors are kept constant. Therefore, quantification methods employed to compute both baseline and TEPR are critical in understanding the information processing of humans in practical ergonomics settings.
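    To make the two quantification methods contrasted above concrete, the sketch below computes TEPR both ways on illustrative pupil diameters; the numbers are invented and only demonstrate how a drifting baseline affects the constant-baseline method:

      import numpy as np

      # Illustrative pupil diameters (mm) for five memorisation-recall trials;
      # the pre-trial baseline drifts upward across the session.
      pre_trial = np.array([3.10, 3.18, 3.25, 3.31, 3.40])
      task      = np.array([3.55, 3.60, 3.68, 3.74, 3.82])

      tepr_method1 = task - pre_trial        # change from each trial's own baseline
      tepr_method2 = task - pre_trial[0]     # change from one constant baseline
      print("method 1 (trial-specific):", np.round(tepr_method1, 2))
      print("method 2 (constant):      ", np.round(tepr_method2, 2))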

  10. Aircraft Engine On-Line Diagnostics Through Dual-Channel Sensor Measurements: Development of a Baseline System

    NASA Technical Reports Server (NTRS)

    Kobayashi, Takahisa; Simon, Donald L.

    2008-01-01

    In this paper, a baseline system which utilizes dual-channel sensor measurements for aircraft engine on-line diagnostics is developed. This system is composed of a linear on-board engine model (LOBEM) and fault detection and isolation (FDI) logic. The LOBEM provides the analytical third channel against which the dual-channel measurements are compared. When the discrepancy among the triplex channels exceeds a tolerance level, the FDI logic determines the cause of the discrepancy. Through this approach, the baseline system achieves the following objectives: (1) anomaly detection, (2) component fault detection, and (3) sensor fault detection and isolation. The performance of the baseline system is evaluated in a simulation environment using faults in sensors and components.
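    The triplex comparison can be illustrated with a simple decision rule: the two measured channels are compared with each other and with the analytical channel from the on-board model, and the source whose disagreement exceeds the tolerance is isolated. The logic below is a schematic sketch, not the paper's actual FDI implementation:

      def triplex_fdi(ch_a, ch_b, model, tol):
          # Compare two measured channels and the model-based analytical channel;
          # isolate the source whose disagreement pattern exceeds the tolerance.
          d_ab, d_am, d_bm = abs(ch_a - ch_b), abs(ch_a - model), abs(ch_b - model)
          if max(d_ab, d_am, d_bm) <= tol:
              return "no anomaly"
          if d_ab <= tol:                    # sensors agree, model disagrees
              return "component fault (or model error)"
          if d_am <= tol:                    # channel A agrees with model -> B suspect
              return "sensor B fault"
          if d_bm <= tol:                    # channel B agrees with model -> A suspect
              return "sensor A fault"
          return "unresolved anomaly"

      print(triplex_fdi(100.0, 100.2, 100.1, tol=0.5))   # no anomaly
      print(triplex_fdi(100.0, 104.0, 100.1, tol=0.5))   # sensor B fault
      print(triplex_fdi(103.8, 104.0, 100.1, tol=0.5))   # component fault (or model error)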

  11. Intima-Media Thickness and Cognitive Function in Stroke-Free Middle-Aged Adults: Findings From the Coronary Artery Risk Development in Young Adults Study.

    PubMed

    Zeki Al Hazzouri, Adina; Vittinghoff, Eric; Sidney, Stephen; Reis, Jared P; Jacobs, David R; Yaffe, Kristine

    2015-08-01

    The relationship between carotid artery intima-media thickness (IMT) and cognitive function in midlife remains relatively unexplored. We examined the association between IMT and cognitive function in a middle-aged epidemiological cohort of 2618 stroke-free participants. At the year 20 visit (our study baseline), participants from the Coronary Artery Risk Development in Young Adults study had IMT measured by ultrasound at the common carotid artery. Five years later, participants completed a cognitive battery consisting of the Rey Auditory-Verbal Learning Test of verbal memory, the Digit Symbol Substitution Test of processing speed, and the Stroop test of executive function. We transformed cognitive scores into standardized z scores, with negative values indicating worse performance. Mean age at baseline was 45.3 years (SD, 3.6). Greater IMT (per 1 SD difference of 0.12 mm) was significantly associated with worse performance on all cognitive tests (z scores) in unadjusted linear regression models (verbal memory, -0.16; 95% confidence interval [CI], -0.20 to -0.13; processing speed, -0.23; 95% CI, -0.27 to -0.19; and executive function, -0.17; 95% CI, -0.20 to -0.13). In models adjusted for sociodemographics and vascular risk factors that lie earlier in the causal pathway, greater IMT remained negatively associated with processing speed (-0.06; 95% CI, -0.09 to -0.02; P=0.003) and borderline associated with executive function (-0.03; 95% CI, -0.07 to 0.00; P=0.07) but not with verbal memory. We observed an association between greater IMT and worse processing speed, a key component of cognitive functioning, at middle age above and beyond traditional vascular risk factors. Efforts targeted at preventing early stages of atherosclerosis may modify the course of cognitive aging. © 2015 American Heart Association, Inc.

  12. Flourishing With Psychosis: A Prospective Examination on the Interactions Between Clinical, Functional, and Personal Recovery Processes on Well-being Among Individuals with Schizophrenia Spectrum Disorders.

    PubMed

    Chan, Randolph C H; Mak, Winnie W S; Chio, Floria H N; Tong, Alan C Y

    2017-09-08

    Well-being is not just the absence of mental disorder but also involves positive feelings and contentment (emotional well-being), meaningful engagement (psychological well-being), and contribution to one's community or society (social well-being). Recovery processes, which encompass mitigation of clinical symptomatology (clinical recovery), improvement in occupational, social, and adaptive functioning (functional recovery), and development of personally valued goals and identity (personal recovery), have been shown to be important markers of well-being. This study examined the relative contribution of clinical, functional, and personal recovery processes on well-being among individuals with schizophrenia and explored the effect of personal recovery on people with varying levels of symptom severity and functional ability. A longitudinal quantitative research design was used in which 181 people with schizophrenia spectrum disorders were assessed at baseline and 6 months. At baseline, 28.2% of the participants were considered as flourishing. Around half of the participants (52.5%) were moderately mentally healthy, while 19.3% were identified as languishing. Results showed that clinical recovery was predictive of better well-being at 6-month postbaseline. Personal recovery was found to positively predict well-being, above and beyond the effects of clinical and functional recovery. Moderation analysis showed that the effect of personal recovery on well-being did not depend on clinical and functional recovery, which implied that people with schizophrenia can participate in the process of personal recovery and enjoy positive well-being regardless of their clinical stability and functional competence. Given the robust salutogenic effect of personal recovery, greater emphasis should be placed on developing person-centered, strength-based, recovery-oriented services. © The Author 2017. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  13. Detecting molecular features of spectra mainly associated with structural and non-structural carbohydrates in co-products from bioEthanol production using DRIFT with uni- and multivariate molecular spectral analyses.

    PubMed

    Yu, Peiqiang; Damiran, Daalkhaijav; Azarfar, Arash; Niu, Zhiyuan

    2011-01-01

    The objective of this study was to use DRIFT spectroscopy with uni- and multivariate molecular spectral analyses as a novel approach to detect molecular features of spectra mainly associated with carbohydrate in the co-products (wheat DDGS, corn DDGS, blend DDGS) from bioethanol processing in comparison with original feedstock (wheat (Triticum), corn (Zea mays)). The carbohydrates related molecular spectral bands included: A_Cell (structural carbohydrates, peaks area region and baseline: ca. 1485-1188 cm(-1)), A_1240 (structural carbohydrates, peak area centered at ca. 1240 cm(-1) with region and baseline: ca. 1292-1198 cm(-1)), A_CHO (total carbohydrates, peaks region and baseline: ca. 1187-950 cm(-1)), A_928 (non-structural carbohydrates, peak area centered at ca. 928 cm(-1) with region and baseline: ca. 952-910 cm(-1)), A_860 (non-structural carbohydrates, peak area centered at ca. 860 cm(-1) with region and baseline: ca. 880-827 cm(-1)), H_1415 (structural carbohydrate, peak height centered at ca. 1415 cm(-1) with baseline: ca. 1485-1188 cm(-1)), H_1370 (structural carbohydrate, peak height at ca. 1370 cm(-1) with a baseline: ca. 1485-1188 cm(-1)). The study shows that the grains had lower spectral intensity (KM Unit) of the cellulosic compounds of A_1240 (8.5 vs. 36.6, P < 0.05), higher (P < 0.05) intensities of the non-structural carbohydrate of A_928 (17.3 vs. 2.0) and A_860 (20.7 vs. 7.6) than their co-products from bioethanol processing. There were no differences (P > 0.05) in the peak area intensities of A_Cell (structural CHO) at 1292-1198 cm(-1) and A_CHO (total CHO) at 1187-950 cm(-1) with average molecular infrared intensity KM unit of 226.8 and 508.1, respectively. There were no differences (P > 0.05) in the peak height intensities of H_1415 and H_1370 (structural CHOs) with average intensities 1.35 and 1.15, respectively. The multivariate molecular spectral analyses were able to discriminate and classify between the corn and corn DDGS molecular spectra, but not wheat and wheat DDGS. This study indicated that the bioethanol processing changes carbohydrate molecular structural profiles, compared with the original grains. However, the sensitivities of different types of carbohydrates and different grains (corn and wheat) to the processing differ. In general, the bioethanol processing increases the molecular spectral intensities for the structural carbohydrates and decreases the intensities for the non-structural carbohydrates. Further study is needed to quantify carbohydrate related molecular spectral features of the bioethanol co-products in relation to nutrient supply and availability of carbohydrates.
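    Each spectral parameter listed above is a peak area or height measured against a local baseline over a stated wavenumber region. A minimal sketch of the peak-area calculation on a synthetic band (not the study's spectra; region limits and band shape are illustrative) is:

      import numpy as np

      def peak_area(wavenumbers, absorbance, lo, hi):
          # Band area between lo and hi cm-1, measured above a straight baseline
          # drawn between the two endpoints of the region (trapezoidal integration).
          mask = (wavenumbers >= lo) & (wavenumbers <= hi)
          x, y = wavenumbers[mask], absorbance[mask]
          baseline = np.interp(x, [x[0], x[-1]], [y[0], y[-1]])
          h = np.clip(y - baseline, 0.0, None)
          return float(np.sum(0.5 * (h[1:] + h[:-1]) * np.diff(x)))

      # Synthetic spectrum with a Gaussian band centred near 1240 cm-1.
      wn = np.linspace(1000, 1500, 1001)
      spec = 0.05 + 0.30 * np.exp(-0.5 * ((wn - 1240) / 20.0) ** 2)
      print(f"A_1240 (ca. 1198-1292 cm-1): {peak_area(wn, spec, 1198, 1292):.2f}")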

  14. Impact of Process Optimization and Quality Improvement Measures on Neonatal Feeding Outcomes at an All-Referral Neonatal Intensive Care Unit.

    PubMed

    Jadcherla, Sudarshan R; Dail, James; Malkar, Manish B; McClead, Richard; Kelleher, Kelly; Nelin, Leif

    2016-07-01

    We hypothesized that the implementation of a feeding quality improvement (QI) program among premature neonates accelerates feeding milestones, safely lowering hospital length of stay (LOS) compared with the baseline period. Baseline data were collected for 15 months (N = 92) prior to initiating the program, which involved development and implementation of a standardized feeding strategy in eligible premature neonates. Process optimization, implementation of feeding strategy, monitoring compliance, multidisciplinary feeding rounds, and continuous education strategies were employed. The main outcomes included the ability and duration to reach enteral feeds-120 (mL/kg/d), oral feeds-120 (mL/kg/d), and ad lib oral feeding. Balancing measures included growth velocities, comorbidities, and LOS. Comparing baseline versus feeding program (N = 92) groups, respectively, the feeding program improved the number of infants receiving trophic feeds (34% vs 80%, P < .002), trophic feeding duration (14.8 ± 10.3 days vs 7.6 ± 8.1 days, P < .0001), time to enteral feeds-120 (16.3 ± 15.4 days vs 11.4 ± 10.4 days, P < .04), time from oral feeding onset to oral feeds-120 (13.2 ± 16.7 days vs 19.5 ± 15.3 days, P < .0001), time from oral feeds-120 to ad lib feeds at discharge (22.4 ± 27.2 days vs 18.6 ± 21.3 days, P < .01), weight velocity (24 ± 6 g/d vs 27 ± 11 g/d, P < .03), and LOS (104.2 ± 51.8 vs 89.3 ± 46.0, P = .02). Mortality, readmissions within 30 days, and comorbidities were similar. Process optimization and the implementation of a standardized feeding strategy minimize practice variability, accelerating the attainment of enteral and oral feeding milestones and decreasing LOS without increasing adverse morbidities. © 2015 American Society for Parenteral and Enteral Nutrition.

  15. Potential for Integrating Entry Guidance into the Multi-Disciplinary Entry Vehicle Optimization Environment

    NASA Technical Reports Server (NTRS)

    D'souza, Sarah N.; Kinney, David J.; Garcia, Joseph A.; Sarigul-Klijn, Nesrin

    2014-01-01

    The state-of-the-art in vehicle design decouples flight feasible trajectory generation from the optimization process of an entry spacecraft shape. The disadvantage to this decoupled process is seen when a particular aeroshell does not meet in-flight requirements when integrated into Guidance, Navigation, and Control simulations. It is postulated that the integration of a guidance algorithm into the design process will provide a real-time, rapid trajectory generation technique to enhance the robustness of vehicle design solutions. The potential benefit of this integration is a reduction in design cycles (possible cost savings) and increased accuracy in the aerothermal environment (possible mass savings). This work examines two aspects: 1) the performance of a reference tracking guidance algorithm for five different geometries with the same reference trajectory and 2) the potential of mass savings from improved aerothermal predictions. An Apollo Derived Guidance (ADG) algorithm is used in this study. The baseline geometry and five test case geometries were flown using the same baseline trajectory. The guided trajectory results are compared to separate trajectories determined in a vehicle optimization study conducted for NASA's Mars Entry, Descent, and Landing System Analysis. This study revealed several aspects regarding the potential gains and required developments for integrating a guidance algorithm into the vehicle optimization environment. First, the generation of flight feasible trajectories is only as good as the robustness of the guidance algorithm. The set of dispersed geometries modelled aerodynamic dispersions that ranged from +/-1% to +/-17% and a single extreme case was modelled where the aerodynamics were approximately 80% less than the baseline geometry. The ADG, as expected, was able to guide the vehicle into the aeroshell separation box at the target location for dispersions up to 17%, but failed for the 80% dispersion cases. Finally, the results revealed that including flight feasible trajectories for a set of dispersed geometries has the potential to save mass up to 430 kg.

  16. Multi-project baselines for potential clean development mechanism projects in the electricity sector in South Africa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winkler, H.; Spalding-Fecher, R.; Sathaye, J.

    2002-06-26

    The United Nations Framework Convention on Climate Change (UNFCCC) aims to reduce emissions of greenhouse gases (GHGs) in order to "prevent dangerous anthropogenic interference with the climate system" and promote sustainable development. The Kyoto Protocol, which was adopted in 1997 and appears likely to be ratified by 2002 despite the US withdrawing, aims to provide means to achieve this objective. The Clean Development Mechanism (CDM) is one of three "flexibility mechanisms" in the Protocol, the other two being Joint Implementation (JI) and Emissions Trading (ET). These mechanisms allow flexibility for Annex I Parties (industrialized countries) to achieve reductions by extra-territorial as well as domestic activities. The underlying concept is that trade and transfer of credits will allow emissions reductions at least cost. Since the atmosphere is a global, well-mixed system, it does not matter where greenhouse gas emissions are reduced. The CDM allows Annex I Parties to meet part of their emissions reductions targets by investing in developing countries. CDM projects must also meet the sustainable development objectives of the developing country. Further criteria are that Parties must participate voluntarily, that emissions reductions are "real, measurable and long-term", and that they are additional to those that would have occurred anyway. The last requirement makes it essential to define an accurate baseline. The remaining parts of section 1 outline the theory of baselines, emphasizing the balance needed between environmental integrity and reducing transaction costs. Section 2 develops an approach to multi-project baseline for the South African electricity sector, comparing primarily to near future capacity, but also considering recent plants. Five potential CDM projects are briefly characterized in section 3, and compared to the baseline in section 4. Section 5 concludes with a discussion of options and choices for South Africa regarding electricity sector baselines.

  17. LITERATURE REVIEWS TO SUPPORT ION EXCHANGE TECHNOLOGY SELECTION FOR MODULAR SALT PROCESSING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, W

    2007-11-30

    This report summarizes the results of literature reviews conducted to support the selection of a cesium removal technology for application in a small column ion exchange (SCIX) unit supported within a high level waste tank. SCIX is being considered as a technology for the treatment of radioactive salt solutions in order to accelerate closure of waste tanks at the Savannah River Site (SRS) as part of the Modular Salt Processing (MSP) technology development program. Two ion exchange materials, spherical Resorcinol-Formaldehyde (RF) and engineered Crystalline Silicotitanate (CST), are being considered for use within the SCIX unit. Both ion exchange materials have been studied extensively and are known to have high affinities for cesium ions in caustic tank waste supernates. RF is an elutable organic resin and CST is a non-elutable inorganic material. Waste treatment processes developed for the two technologies will differ with regard to solutions processed, secondary waste streams generated, optimum column size, and waste throughput. Pertinent references, anticipated processing sequences for utilization in waste treatment, gaps in the available data, and technical comparisons will be provided for the two ion exchange materials to assist in technology selection for SCIX. The engineered, granular form of CST (UOP IE-911) was the baseline ion exchange material used for the initial development and design of the SRS SCIX process (McCabe, 2005). To date, in-tank SCIX has not been implemented for treatment of radioactive waste solutions at SRS. Since initial development and consideration of SCIX for SRS waste treatment, an alternative technology has been developed as part of the River Protection Project Waste Treatment Plant (RPP-WTP) Research and Technology program (Thorson, 2006). Spherical RF resin is the baseline media for cesium removal in the RPP-WTP, which was designed for the treatment of radioactive waste supernates and is currently under construction in Hanford, WA. Application of RF for cesium removal in the Hanford WTP does not involve in-riser columns but does utilize the resin in large scale column configurations in a waste treatment facility. The basic conceptual design for SCIX involves the dissolution of saltcake in SRS Tanks 1-3 to give approximately 6 M sodium solutions and the treatment of these solutions for cesium removal using one or two columns supported within a high level waste tank. Prior to ion exchange treatment, the solutions will be filtered for removal of entrained solids. In addition to Tanks 1-3, solutions in two other tanks (37 and 41) will require treatment for cesium removal in the SCIX unit. The previous SCIX design (McCabe, 2005) utilized CST for cesium removal with downflow supernate processing and included a CST grinder following cesium loading. Grinding of CST was necessary to make the cesium-loaded material suitable for vitrification in the SRS Defense Waste Processing Facility (DWPF). Because RF resin is elutable (and reusable) and processing requires conversion between sodium and hydrogen forms using caustic and acidic solutions, more liquid processing steps are involved. The WTP baseline process involves a series of caustic and acidic solutions (downflow processing) with water washes between pH transitions across neutral. In addition, due to resin swelling during conversion from hydrogen to sodium form, an upflow caustic regeneration step is required.
    Presumably, one of these basic processes (or some variation) will be utilized for MSP for the appropriate ion exchange technology selected. CST processing involves two primary waste products: loaded CST and decontaminated salt solution (DSS). RF processing involves three primary waste products: spent RF resin, DSS, and acidic cesium eluate, although the resin is reusable and typically does not require replacement until completion of multiple treatment cycles. CST processing requires grinding of the ion exchange media, handling of solids with high cesium loading, and handling of liquid wash and conditioning solutions. RF processing requires handling and evaporation of cesium eluates, disposal of spent organic resin, and handling of the various liquid wash and regenerate solutions used. In both cases, the DSS will be immobilized in a low activity waste form. It appears that both technologies are mature, well studied, and generally suitable for this application. Technology selection will likely be based on downstream impacts or preferences between the various processing options for the two materials rather than on some unacceptable performance property identified for one material. As a result, the following detailed technical review and summary of the two technologies should be useful to assist in technology selection for SCIX.

  18. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: BASELINE QUESTIONNAIRE (HOUSEHOLD) (UA-D-7.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Baseline Questionnaire. This questionnaire was developed for use in the Arizona NHEXAS project and the "Border" study. Household and individual data were combined in a single Baseline Questionnaire data file. Key...

  19. Elevated serum uric acid increases risks for developing high LDL cholesterol and hypertriglyceridemia: A five-year cohort study in Japan.

    PubMed

    Kuwabara, Masanari; Borghi, Claudio; Cicero, Arrigo F G; Hisatome, Ichiro; Niwa, Koichiro; Ohno, Minoru; Johnson, Richard J; Lanaspa, Miguel A

    2018-06-15

    High serum uric acid (SUA) is associated with dyslipidemia, but whether hyperuricemia predicts an increase in serum low-density lipoprotein (LDL) cholesterol is unknown. This study evaluated whether an elevated SUA predicts the development of high LDL cholesterol as well as hypertriglyceridemia. This is a retrospective 5-year cohort study of 6476 healthy Japanese adults (age, 45.7 ± 10.1 years; 2,243 men) who underwent health examinations in 2004 and were reevaluated in 2009 at St. Luke's International Hospital, Tokyo, Japan. Subjects were included if, at their baseline examination, they did not have hypertension, diabetes mellitus, dyslipidemia, or chronic kidney disease and were not on medication for hyperuricemia and/or gout. The analysis was adjusted for age, body mass index (BMI), smoking and drinking habits, baseline estimated glomerular filtration rate (eGFR), baseline SUA, and SUA change over the 5 years. High baseline SUA was an independent risk factor for developing high LDL cholesterol both in men (OR: 1.159 per 1 mg/dL increase, 95% CI: 1.009-1.331) and women (OR: 1.215, 95% CI: 1.061-1.390). Other risk factors included higher baseline LDL cholesterol, higher BMI, and higher baseline eGFR (the latter two in women only). An increase in SUA over the 5 years was also an independent risk factor for developing high LDL cholesterol and hypertriglyceridemia, but not for low high-density lipoprotein (HDL) cholesterol. This is the first study to report that an elevated SUA increases the risk for developing high LDL cholesterol, as well as hypertriglyceridemia. This may shed light on the role of SUA in cardiovascular disease. Copyright © 2018 Elsevier B.V. All rights reserved.
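
    The adjusted odds ratios quoted above (e.g., an OR per 1 mg/dL increase in SUA with a 95% CI) are the exponentiated coefficients of a multivariable logistic regression. The following is a minimal sketch of that calculation on simulated data with hypothetical column names, not the authors' actual analysis:

        # Hedged sketch: an adjusted logistic regression yielding odds ratios per
        # unit of a predictor, in the spirit of the SUA/LDL analysis above.
        # Data are simulated and column names are hypothetical, not the study's.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 500
        df = pd.DataFrame({
            "sua": rng.normal(5.5, 1.2, n),   # baseline serum uric acid, mg/dL
            "age": rng.normal(46, 10, n),
            "bmi": rng.normal(23, 3, n),
        })
        logit_p = -6 + 0.15 * df["sua"] + 0.05 * df["age"] + 0.05 * df["bmi"]
        df["high_ldl"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        X = sm.add_constant(df[["sua", "age", "bmi"]])
        fit = sm.Logit(df["high_ldl"], X).fit(disp=0)

        odds_ratios = np.exp(fit.params)      # OR per 1-unit increase of each predictor
        ci = np.exp(fit.conf_int())           # 95% confidence intervals on the OR scale
        print(odds_ratios["sua"], ci.loc["sua"].values)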

  20. EFPI sensor utilizing optical spectrum analyzer with tunable laser: detection of baseline oscillations faster than spectrum acquisition rate

    NASA Astrophysics Data System (ADS)

    Ushakov, Nikolai; Liokumovich, Leonid

    2014-05-01

    A novel approach for extrinsic Fabry-Perot interferometer baseline measurement has been developed. The principles of frequency-scanning interferometry are used to register the interferometer spectral function, from which the baseline is demodulated. The proposed approach makes it possible to capture absolute baseline variations at frequencies much higher than the spectral acquisition rate. Unlike conventional approaches, which associate a single baseline value with each registered spectrum, the proposed method applies a modified frequency detection procedure to the spectrum, capturing baseline variations that occur during spectrum acquisition. Limits on the parameters of the baseline variations that can be registered are formulated, and the approach is verified experimentally for different perturbations.
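
    As a minimal illustration of the underlying demodulation idea, and not the paper's modified frequency detection procedure, the sketch below simulates an idealized two-beam EFPI spectral function recorded by a tunable-laser scan and recovers the baseline (cavity length) from the dominant fringe frequency:

        # Hedged sketch: recover an EFPI baseline (cavity length) from a spectral
        # function recorded over a tunable-laser frequency scan, using the dominant
        # fringe frequency. Idealized two-beam model; the paper's modified
        # frequency detection procedure is more involved.
        import numpy as np

        c = 3.0e8                                  # speed of light, m/s
        L_true = 150e-6                            # baseline (cavity length), m
        nu = np.linspace(193e12, 194e12, 8192)     # scanned optical frequency, Hz
        spectrum = 1 + np.cos(4 * np.pi * L_true * nu / c)  # ideal EFPI fringes

        # fringe period along the frequency axis is c / (2L); locate it by FFT
        spec_ac = spectrum - spectrum.mean()
        freqs = np.fft.rfftfreq(nu.size, d=nu[1] - nu[0])
        peak = np.argmax(np.abs(np.fft.rfft(spec_ac))[1:]) + 1
        L_est = freqs[peak] * c / 2

        print(f"estimated baseline: {L_est * 1e6:.1f} um (true {L_true * 1e6:.1f} um)")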

  1. Association of vascular risk factors with cognition in a multiethnic sample.

    PubMed

    Schneider, Brooke C; Gross, Alden L; Bangen, Katherine J; Skinner, Jeannine C; Benitez, Andreana; Glymour, M Maria; Sachs, Bonnie C; Shih, Regina A; Sisco, Shannon; Manly, Jennifer J; Luchsinger, José A

    2015-07-01

    To examine the relationship between cardiovascular risk factors (CVRFs) and cognitive performance in a multiethnic sample of older adults. We used longitudinal data from the Washington Heights-Inwood Columbia Aging Project. A composite score including smoking, stroke, heart disease, diabetes, hypertension, and central obesity represented CVRFs. Multiple group parallel process multivariate random effects regression models were used to model cognitive functioning and examine the contribution of CVRFs to baseline performance and change in general cognitive processing, memory, and executive functioning. Presence of each CVRF was associated with a 0.1 SD lower score in general cognitive processing, memory, and executive functioning in black and Hispanic participants relative to whites. Baseline CVRFs were associated with poorer baseline cognitive performances among black women and Hispanic men. CVRF increase was related to baseline cognitive performance only among Hispanics. CVRFs were not related to cognitive decline. After adjustment for medications, CVRFs were not associated with cognition in Hispanic participants. CVRFs are associated with poorer cognitive functioning, but not cognitive decline, among minority older adults. These relationships vary by gender and medication use. Consideration of unique racial, ethnic, and cultural factors is needed when examining relationships between CVRFs and cognition. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. Improving 130nm node patterning using inverse lithography techniques for an analog process

    NASA Astrophysics Data System (ADS)

    Duan, Can; Jessen, Scott; Ziger, David; Watanabe, Mizuki; Prins, Steve; Ho, Chi-Chien; Shu, Jing

    2018-03-01

    Developing a new lithographic process routinely requires lithographic toolsets and substantial engineering time for data analysis. Process transfers between fabs occur quite often, and a key assumption is that lithographic settings are equivalent from one fab to another and that the transfer is seamless. In some cases, that is far from the truth. Differences in tools can change the proximity effect seen in low-k1 imaging processes, so with model-based optical proximity correction (MBOPC), a model built in one fab will not work under the same conditions at another fab. This results in many wafers being patterned to try to match a baseline response, and even if matching is achieved, there is no guarantee that optimal lithographic responses are met. In this paper, we discuss the approach used to transfer and develop new lithographic processes and define MBOPC builds for the new lithographic process in Fab B, which was transferred from a similar lithographic process in Fab A. By using PROLITH(TM) simulations to match OPC models for each level, minimal downtime in wafer processing was observed. Source Mask Optimization (SMO) was also used to optimize lithographic processes using novel inverse lithography techniques (ILT) to simultaneously optimize mask bias, depth of focus (DOF), exposure latitude (EL), and mask error enhancement factor (MEEF) for critical designs at each level.

  3. Patterns and comparisons of human-induced changes in river flood impacts in cities

    NASA Astrophysics Data System (ADS)

    Clark, Stephanie; Sharma, Ashish; Sisson, Scott A.

    2018-03-01

    In this study, information extracted from the first global urban fluvial flood risk data set (Aqueduct) is investigated and visualized to explore current and projected city-level flood impacts driven by urbanization and climate change. We use a novel adaption of the self-organizing map (SOM) method, an artificial neural network proficient at clustering, pattern extraction, and visualization of large, multi-dimensional data sets. Prevalent patterns of current relationships and anticipated changes over time in the nonlinearly-related environmental and social variables are presented, relating urban river flood impacts to socioeconomic development and changing hydrologic conditions. Comparisons are provided between 98 individual cities. Output visualizations compare baseline and changing trends of city-specific exposures of population and property to river flooding, revealing relationships between the cities based on their relative map placements. Cities experiencing high (or low) baseline flood impacts on population and/or property that are expected to improve (or worsen), as a result of anticipated climate change and development, are identified and compared. This paper condenses and conveys large amounts of information through visual communication to accelerate the understanding of relationships between local urban conditions and global processes.
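
    A self-organizing map is a compact algorithm at heart: repeatedly pick a sample, find the best-matching node, and pull that node and its grid neighbours toward the sample while shrinking the learning rate and neighbourhood radius. The sketch below is a from-scratch NumPy illustration on random placeholder features, not the study's Aqueduct-derived variables or SOM configuration:

        # Hedged sketch: a minimal self-organizing map trainer in NumPy, of the
        # kind used to cluster multi-dimensional city flood-impact indicators.
        # Features below are random placeholders, not the Aqueduct variables.
        import numpy as np

        rng = np.random.default_rng(1)
        data = rng.normal(size=(98, 6))            # 98 "cities" x 6 indicators
        rows, cols, dim = 7, 7, data.shape[1]
        weights = rng.normal(size=(rows, cols, dim))
        grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                    indexing="ij"), axis=-1)

        n_iter, lr0, sigma0 = 2000, 0.5, 3.0
        for t in range(n_iter):
            x = data[rng.integers(len(data))]
            # best-matching unit: node whose weight vector is closest to the sample
            d2 = ((weights - x) ** 2).sum(axis=-1)
            bmu = np.unravel_index(np.argmin(d2), d2.shape)
            # shrink learning rate and neighbourhood radius as training proceeds
            frac = t / n_iter
            lr, sigma = lr0 * (1 - frac), sigma0 * np.exp(-3 * frac)
            dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=-1)
            h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
            weights += lr * h * (x - weights)

        # each city's winning node gives its placement on the map for comparison
        placements = [np.unravel_index(np.argmin(((weights - x) ** 2).sum(axis=-1)),
                                       (rows, cols)) for x in data]
        print(placements[:5])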

  4. Persistence of psychosis spectrum symptoms in the Philadelphia Neurodevelopmental Cohort: a prospective two‐year follow‐up

    PubMed Central

    Calkins, Monica E.; Moore, Tyler M.; Satterthwaite, Theodore D.; Wolf, Daniel H.; Turetsky, Bruce I.; Roalf, David R.; Merikangas, Kathleen R.; Ruparel, Kosha; Kohler, Christian G.; Gur, Ruben C.; Gur, Raquel E.

    2017-01-01

    Prospective evaluation of youths with early psychotic‐like experiences can enrich our knowledge of clinical, biobehavioral and environmental risk and protective factors associated with the development of psychotic disorders. We aimed to investigate the predictors of persistence or worsening of psychosis spectrum features among US youth through the first large systematic study to evaluate subclinical symptoms in the community. Based on Time 1 screen of 9,498 youth (age 8‐21) from the Philadelphia Neurodevelopmental Cohort, a subsample of participants was enrolled based on the presence (N=249) or absence (N=254) of baseline psychosis spectrum symptoms, prior participation in neuroimaging, and current neuroimaging eligibility. They were invited to participate in a Time 2 assessment two years on average following Time 1. Participants were administered the Structured Interview for Prodromal Syndromes, conducted blind to initial screen status, along with the Schizotypal Personality Questionnaire and other clinical measures, computerized neurocognitive testing, and neuroimaging. Clinical and demographic predictors of symptom persistence were examined using logistic regression. At Time 2, psychosis spectrum features persisted or worsened in 51.4% of youths. Symptom persistence was predicted by higher severity of subclinical psychosis, lower global functioning, and prior psychiatric medication at baseline. Youths classified as having psychosis spectrum symptoms at baseline but not at follow‐up nonetheless exhibited comparatively higher symptom levels and lower functioning at both baseline and follow‐up than typically developing youths. In addition, psychosis spectrum features emerged in a small number of young people who previously had not reported significant symptoms but who had exhibited early clinical warning signs. Together, our findings indicate that varying courses of psychosis spectrum symptoms are evident early in US youth, supporting the importance of investigating psychosis risk as a dynamic developmental process. Neurocognition, brain structure and function, and genomics may be integrated with clinical data to provide early indices of symptom persistence and worsening in youths at risk for psychosis. PMID:28127907

  5. Baseline Risk Factors that Predict the Development of Open-angle Glaucoma in a Population: The Los Angeles Latino Eye Study

    PubMed Central

    Jiang, Xuejuan; Varma, Rohit; Wu, Shuang; Torres, Mina; Azen, Stanley P; Francis, Brian A.; Chopra, Vikas; Nguyen, Betsy Bao-Thu

    2012-01-01

    Objective To determine which baseline socio-demographic, lifestyle, anthropometric, clinical, and ocular risk factors predict the development of open-angle glaucoma (OAG) in an adult population. Design A population-based, prospective cohort study. Participants A total of 3,772 self-identified Latinos aged 40 years and older from Los Angeles, California who were free of OAG at baseline. Methods Participants from the Los Angeles Latino Eye Study had standardized study visits at baseline and 4-year follow-up with structured interviews and a comprehensive ophthalmologic examination. OAG was defined as the presence of an open angle and a glaucomatous visual field abnormality and/or evidence of glaucomatous optic nerve damage in at least one eye. Multivariate logistic regression with stepwise selection was performed to determine which potential baseline risk factors independently predict the development of OAG. Main Outcome Measure Odds ratios for various risk factors. Results Over the 4-year follow-up, 87 participants developed OAG. The baseline risk factors that predict the development of OAG include: older age (odds ratio [OR] per decade, 2.19; 95% confidence intervals [CI], 1.74-2.75; P<0.001), higher intraocular pressure (OR per mmHg, 1.18; 95% CI, 1.10-1.26; P<0.001), longer axial length (OR per mm, 1.48; 95% CI, 1.22-1.80; P<0.001), thinner central cornea (OR per 40 μm thinner, 1.30; 95% CI, 1.00-1.70; P=0.050), higher waist-to-hip ratio (OR per 0.05 higher, 1.21; 95% CI, 1.05-1.39; P=0.007), and lack of vision insurance (OR, 2.08; 95% CI, 1.26-3.41; P=0.004). Conclusions Despite a mean baseline IOP of 14 mmHg in Latinos, higher intraocular pressure is an important risk factor for developing OAG. Biometric measures suggestive of less structural support, such as longer axial length and thin CCT, were identified as important risk factors. Lack of health insurance reduces access to eye care and increases the burden of OAG by reducing the likelihood of early detection and treatment of OAG. PMID:22796305

  6. Early patterning and blastodermal fate map of the head in the milkweed bug Oncopeltus fasciatus.

    PubMed

    Birkan, Michael; Schaeper, Nina D; Chipman, Ariel D

    2011-01-01

    The process of head development in insects utilizes a set of widely conserved genes, but this process and its evolution are not well understood. Recent data from Tribolium castaneum have provided a baseline for an understanding of insect head development. However, work on a wider range of insect species, including members of the hemimetabolous orders, is needed in order to draw general conclusions about the evolution of head differentiation and regionalization. We have cloned and studied the expression and function of a number of candidate genes for head development in the hemipteran Oncopeltus fasciatus. These include orthodenticle, empty spiracles, collier, cap 'n' collar, and crocodile. The expression patterns of these genes show a broad conservation relative to Tribolium, as well as differences from Drosophila indicating that Tribolium + Oncopeltus represent a more ancestral pattern. In addition, our data provide a blastodermal fate map for different head regions in later developmental stages and supply us with a "roadmap" for future studies on head development in this species. © 2011 Wiley Periodicals, Inc.

  7. Rotary wave-ejector enhanced pulse detonation engine

    NASA Astrophysics Data System (ADS)

    Nalim, M. R.; Izzy, Z. A.; Akbari, P.

    2012-01-01

    The use of a non-steady ejector based on wave rotor technology is modeled for pulse detonation engine performance improvement and for compatibility with turbomachinery components in hybrid propulsion systems. The rotary wave ejector device integrates a pulse detonation process with an efficient momentum transfer process in specially shaped channels of a single wave-rotor component. In this paper, a quasi-one-dimensional numerical model is developed to help design the basic geometry and operating parameters of the device. The unsteady combustion and flow processes are simulated and compared with a baseline PDE without ejector enhancement. A preliminary performance assessment is presented for the wave ejector configuration, considering the effect of key geometric parameters, which are selected for high specific impulse. It is shown that the rotary wave ejector concept has significant potential for thrust augmentation relative to a basic pulse detonation engine.

  8. The healthy options for nutrition environments in schools (Healthy ONES) group randomized trial: using implementation models to change nutrition policy and environments in low income schools

    PubMed Central

    2012-01-01

    Background The Healthy Options for Nutrition Environments in Schools (Healthy ONES) study was an evidence-based public health (EBPH) randomized group trial that adapted the Institute for Healthcare Improvement’s (IHI) rapid improvement process model to implement school nutrition policy and environmental change. Methods A low-income school district volunteered for participation in the study. All schools in the district agreed to participate (elementary = 6, middle school = 2) and were randomly assigned within school type to intervention (n = 4) and control (n =4) conditions following a baseline environmental audit year. Intervention goals were to 1) eliminate unhealthy foods and beverages on campus, 2) develop nutrition services as the main source on campus for healthful eating (HE), and 3) promote school staff modeling of HE. Schools were followed across a baseline year and two intervention years. Longitudinal assessment of height and weight was conducted with second, third, and sixth grade children. Behavioral observation of the nutrition environment was used to index the amount of outside foods and beverages on campuses. Observations were made monthly in each targeted school environment and findings were presented as items per child per week. Results From an eligible 827 second, third, and sixth grade students, baseline height and weight were collected for 444 second and third grade and 135 sixth grade students (51% reach). Data were available for 73% of these enrolled students at the end of three years. Intervention school outside food and beverage items per child per week decreased over time and control school outside food and beverage items increased over time. The effects were especially pronounced for unhealthy foods and beverage items. Changes in rates of obesity for intervention school (28% baseline, 27% year 1, 30% year 2) were similar to those seen for control school (22% baseline, 22% year 1, 25% year 2) children. Conclusions Healthy ONES adaptation of IHI’s rapid improvement process provided a promising model for implementing nutrition policy and environmental changes that can be used in a variety of school settings. This approach may be especially effective in assisting schools to implement the current federally-mandated wellness policies. PMID:22734945

  9. Automated target classification in high resolution dual frequency sonar imagery

    NASA Astrophysics Data System (ADS)

    Aridgides, Tom; Fernández, Manuel

    2007-04-01

    An improved computer-aided-detection / computer-aided-classification (CAD/CAC) processing string has been developed. The classified objects of 2 distinct strings are fused using the classification confidence values and their expansions as features, and using "summing" or log-likelihood-ratio-test (LLRT) based fusion rules. The utility of the overall processing strings and their fusion was demonstrated with new high-resolution dual frequency sonar imagery. Three significant fusion algorithm improvements were made. First, a nonlinear 2nd order (Volterra) feature LLRT fusion algorithm was developed. Second, a Box-Cox nonlinear feature LLRT fusion algorithm was developed. The Box-Cox transformation consists of raising the features to a to-be-determined power. Third, a repeated application of a subset feature selection / feature orthogonalization / Volterra feature LLRT fusion block was utilized. It was shown that cascaded Volterra feature LLRT fusion of the CAD/CAC processing strings outperforms summing, baseline single-stage Volterra and Box-Cox feature LLRT algorithms, yielding significant improvements over the best single CAD/CAC processing string results, and providing the capability to correctly call the majority of targets while maintaining a very low false alarm rate. Additionally, the robustness of cascaded Volterra feature fusion was demonstrated, by showing that the algorithm yields similar performance with the training and test sets.
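
    Two of the named ingredients, the Box-Cox power transform of confidence features and a log-likelihood-ratio-test fusion rule, can be illustrated compactly. The sketch below is a simplified stand-in (simulated scores, a fixed Box-Cox exponent, Gaussian class-conditional densities) rather than the paper's Volterra-expansion or cascaded fusion stages:

        # Hedged sketch: Box-Cox transformation of classifier confidence features
        # followed by a Gaussian log-likelihood-ratio-test (LLRT) fusion rule.
        # Scores are simulated and the Box-Cox exponent is fixed a priori; the
        # paper's Volterra expansion and cascaded stages are not reproduced.
        import numpy as np

        rng = np.random.default_rng(2)
        n = 400
        labels = rng.integers(0, 2, n)            # 1 = target, 0 = clutter
        # confidence features from two CAD/CAC processing strings
        scores = rng.gamma(shape=2 + 3 * labels[:, None], scale=1.0, size=(n, 2))

        lam = 0.5                                 # Box-Cox exponent (assumed here)
        z = (scores ** lam - 1) / lam             # Box-Cox power transform (scores > 0)

        def loglik(x, mu, var):
            # independent Gaussian log-likelihood, summed over the two features
            return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var).sum(axis=1)

        mu1, var1 = z[labels == 1].mean(0), z[labels == 1].var(0)
        mu0, var0 = z[labels == 0].mean(0), z[labels == 0].var(0)
        llr = loglik(z, mu1, var1) - loglik(z, mu0, var0)

        decision = llr > 0.0                      # threshold trades detections vs false alarms
        print("detection rate:", (decision & (labels == 1)).sum() / (labels == 1).sum())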

  10. Highly Reusable Space Transportation (HRST) Baseline Concepts and Analysis: Rocket/RBCC Options. Part 2; A Comparative Study

    NASA Technical Reports Server (NTRS)

    Woodcock, Gordon

    1997-01-01

    This study is an extension of a previous effort by the Principal Investigator to develop baseline data to support comparative analysis of Highly Reusable Space Transportation (HRST) concepts. The analyses presented herein develop baseline databases for two two-stage-to-orbit (TSTO) concepts: (1) Assisted horizontal take-off all rocket (assisted HTOHL); and (2) Assisted vertical take-off rocket-based combined cycle (RBCC). The study objectives were to: (1) Provide configuration definitions and illustrations for assisted HTOHL and assisted RBCC; (2) Develop a rationalization approach and compare these concepts with the HRST reference; and (3) Analyze TSTO configurations which try to maintain SSTO benefits while reducing inert weight sensitivity.

  11. Development and Characterization of Improved NiTiPd High-Temperature Shape-Memory Alloys by Solid-Solution Strengthening and Thermomechanical Processing

    NASA Technical Reports Server (NTRS)

    Bigelow, Glen; Noebe, Ronald; Padula, Santo, II; Garg, Anita; Olson, David

    2006-01-01

    The need for compact, solid-state actuation systems for use in the aerospace, automotive, and other transportation industries is currently motivating research in high-temperature shape-memory alloys (HTSMA) with transformation temperatures greater than 100 C. One of the basic high-temperature alloys investigated to fill this need is Ni(19.5)Ti(50.5)Pd30. Initial testing has indicated that this alloy, while having acceptable work characteristics, suffers from significant permanent deformation (or ratcheting) during thermal cycling under load. In an effort to overcome this deficiency, various solid-solution alloying and thermomechanical processing schemes were investigated. Solid-solution strengthening was achieved by substituting 5at% gold or platinum for palladium in Ni(19.5)Ti(50.5)Pd30, the so-called baseline alloy, to strengthen the martensite and austenite phases against slip processes and improve thermomechanical behavior. Tensile properties, work behavior, and dimensional stability during repeated thermal cycling under load for the ternary and quaternary alloys were compared. The relative difference in yield strength between the martensite and austenite phases and the dimensional stability of the alloy were improved by the quaternary additions, while work output was only minimally impacted. The three alloys were also thermomechanically processed by cycling repeatedly through the transformation range under a constant stress. This so-called training process dramatically improved the dimensional stability in these samples and also recovered the slight decrease in work output caused by quaternary alloying. An added benefit of the solid-solution strengthening was maintenance of enhanced dimensional stability of the trained material to higher temperatures compared to the baseline alloy, providing a greater measure of over-temperature capability.

  12. National Geospatial Data Asset Lifecycle Baseline Maturity Assessment for the Federal Geographic Data Committee

    NASA Astrophysics Data System (ADS)

    Peltz-Lewis, L. A.; Blake-Coleman, W.; Johnston, J.; DeLoatch, I. B.

    2014-12-01

    The Federal Geographic Data Committee (FGDC) is designing a portfolio management process for 193 geospatial datasets contained within the 16 topical National Spatial Data Infrastructure themes managed under OMB Circular A-16, "Coordination of Geographic Information and Related Spatial Data Activities." The 193 datasets are designated as National Geospatial Data Assets (NGDA) because of their significance to the missions of multiple levels of government, partners, and stakeholders. As a starting point, the data managers of these NGDAs will conduct a baseline maturity assessment of the dataset(s) for which they are responsible. Maturity is measured against benchmarks for each of the seven stages of the data lifecycle management framework promulgated within the OMB Circular A-16 Supplemental Guidance issued by OMB in November 2010. This framework was developed by the interagency Lifecycle Management Work Group (LMWG), consisting of 16 Federal agencies, under the 2004 Presidential Initiative, the Geospatial Line of Business, using OMB Circular A-130, "Management of Federal Information Resources," as guidance. The seven lifecycle stages are: Define, Inventory/Evaluate, Obtain, Access, Maintain, Use/Evaluate, and Archive. This paper will focus on the Lifecycle Baseline Maturity Assessment and on efforts to integrate the FGDC approach with other data maturity assessments.
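
    A baseline maturity assessment of this kind is naturally represented as a per-dataset record scored against the seven lifecycle stages. The sketch below is a hypothetical data structure for such a record; the 0-3 maturity scale and the example dataset and theme names are placeholders, not the FGDC's actual benchmarks:

        # Hedged sketch: a hypothetical record for scoring one NGDA dataset against
        # the seven A-16 lifecycle stages. The 0-3 maturity scale and the example
        # dataset/theme names are placeholders, not the FGDC's actual benchmarks.
        from dataclasses import dataclass, field

        STAGES = ["Define", "Inventory/Evaluate", "Obtain", "Access",
                  "Maintain", "Use/Evaluate", "Archive"]

        @dataclass
        class LifecycleBaseline:
            dataset: str
            theme: str
            scores: dict = field(default_factory=dict)   # stage -> 0..3 maturity level

            def summary(self):
                rated = {s: self.scores.get(s, 0) for s in STAGES}
                return self.dataset, rated, sum(rated.values()) / len(STAGES)

        report = LifecycleBaseline("Example Hydrography Dataset", "Water - Inland",
                                   {"Define": 3, "Access": 2, "Archive": 1})
        print(report.summary())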

  13. Functional Improvement Following Diastasis Rectus Abdominus Repair in an Active Duty Navy Female.

    PubMed

    Gallus, Katerina M; Golberg, Kathy F; Field, Robert

    2016-08-01

    Return to physical activity following childbirth can be a difficult process complicated by structural changes during pregnancy. A common problem is the development of a diastasis of the rectus abdominus (DRA), defined as a horizontal separation of the abdominus muscles at the linea alba. Recent data indicate that the greater the distance of separation of the muscle, the worse the functional ability. We describe a 24-year-old active duty U.S. Navy female G1P2 with a diagnosis of DRA. At 2 months postpartum, she was referred to physical therapy because of back pain and inability to meet baseline activities of daily living. After 4 months of physical therapy, she was unable to complete curl ups as required by U.S. Navy physical fitness standards. Abdominoplasty with imbrication of the abdominal wall diastasis was performed followed by additional physical therapy, after which she returned to baseline functioning. The restoration of functional ability postoperatively suggests there is a therapeutic indication for surgical correction of DRA. In high-functioning military patients with DRA who fail to return to baseline level of activity following a trial of physical therapy, surgical intervention should be considered to obtain the optimal functional ability. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.

  14. Systemic inflammation as a predictor of brain aging: Contributions of physical activity, metabolic risk, and genetic risk.

    PubMed

    Corlier, Fabian; Hafzalla, George; Faskowitz, Joshua; Kuller, Lewis H; Becker, James T; Lopez, Oscar L; Thompson, Paul M; Braskie, Meredith N

    2018-05-15

    Inflammatory processes may contribute to risk for Alzheimer's disease (AD) and age-related brain degeneration. Metabolic and genetic risk factors, and physical activity may, in turn, influence these inflammatory processes. Some of these risk factors are modifiable, and interact with each other. Understanding how these processes together relate to brain aging will help to inform future interventions to treat or prevent cognitive decline. We used brain magnetic resonance imaging (MRI) to scan 335 older adult humans (mean age 77.3 ± 3.4 years) who remained non-demented for the duration of the 9-year longitudinal study. We used structural equation modeling (SEM) in a subset of 226 adults to evaluate whether measures of baseline peripheral inflammation (serum C-reactive protein levels; CRP), mediated the baseline contributions of genetic and metabolic risk, and physical activity, to regional cortical thickness in AD-relevant brain regions at study year 9. We found that both baseline metabolic risk and AD risk variant apolipoprotein E ε4 (APOE4), modulated baseline serum CRP. Higher baseline CRP levels, in turn, predicted thinner regional cortex at year 9, and mediated an effect between higher metabolic risk and thinner cortex in those regions. A higher polygenic risk score composed of variants in immune-associated AD risk genes (other than APOE) was associated with thinner regional cortex. However, CRP levels did not mediate this effect, suggesting that other mechanisms may be responsible for the elevated AD risk. We found interactions between genetic and environmental factors and structural brain health. Our findings support the role of metabolic risk and peripheral inflammation in age-related brain decline. Copyright © 2018 Elsevier Inc. All rights reserved.
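
    The mediation logic (metabolic risk acting on year-9 cortical thickness through baseline CRP) can be illustrated with a simple product-of-coefficients calculation, a simplification of the structural equation models the study actually used. Variables below are simulated placeholders:

        # Hedged sketch: a product-of-coefficients mediation check (metabolic risk
        # -> baseline CRP -> year-9 cortical thickness), a simplification of the
        # study's structural equation models. All values are simulated placeholders.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 226
        met_risk = rng.normal(size=n)                          # baseline metabolic risk
        crp = 0.4 * met_risk + rng.normal(size=n)              # baseline inflammation
        thickness = -0.3 * crp - 0.1 * met_risk + rng.normal(size=n)  # year-9 cortex

        # path a: risk -> mediator; path b: mediator -> outcome, adjusting for risk
        a = sm.OLS(crp, sm.add_constant(met_risk)).fit().params[1]
        b = sm.OLS(thickness,
                   sm.add_constant(np.column_stack([crp, met_risk]))).fit().params[1]
        print("indirect (mediated) effect a*b:", a * b)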

  15. Caregiver Input to Optimize the Design of a Pediatric Care Planning Guide for Rehabilitation: Descriptive Study

    PubMed Central

    Lim, Heather K; Corden, Marya E

    2017-01-01

    Background Caregiver input has informed the design of a valid electronic patient-reported outcome (PRO) measure for use in pediatric rehabilitation. This proxy assessment may be further developed to expedite and enhance patient-centered care planning processes, but user input is first needed to finalize the core requirements that will guide its design. Objective The objective of this study was to examine the feasibility of a stepwise process for building on a baseline assessment of young children's participation in activities to develop a care plan relevant to pediatric rehabilitation. Methods A cross-sectional descriptive study design was employed using qualitative methods. Data were collected via Web-based technology and by telephone. Twenty-five caregivers of young children (9 with developmental delays, 16 without delays) aged between 1 and 7 years were recruited from a subsample of parents who had previously enrolled in a Web-based validation of a PRO on children’s participation and provided consent for future contact. Each caregiver completed a demographic questionnaire and Young Children’s Participation and Environment Measure (YC-PEM) online, followed by a 20- to 60-min semistructured and audiotaped phone interview to review and build upon PRO results as summarized in an electronic report. Interview data were content coded to the interview guide and reviewed by multiple research staff to estimate feasibility according to stepwise completion rates, perceptions of difficulty in step completion, and perceptions of overall utility. Results Half of the participants in the final study sample (N=25) fully completed a stepwise process of building on their baseline PRO assessment to develop an initial care plan for their child. In most cases, similar stepwise completion rates and trends in the approaches taken for step completion were found regardless of the child’s disability status. However, more parents of children with disabilities reported difficulties in rank ordering their priorities for change and identified child-focused strategies for goal attainment. Overall, 76% (19/25) of users were willing to use the process to develop and communicate intervention priorities and strategies with professionals, family, and friends. Conclusions Results informed revisions to the care planning guide before usability and feasibility testing of an initial Web-based prototype that is now underway. PMID:29066421

  16. Flat-plate solar array project process development area, process research of non-CZ silicon material

    NASA Technical Reports Server (NTRS)

    Campbell, R. B.

    1984-01-01

    The program is designed to investigate the fabrication of solar cells on N-type base material by a simultaneous diffusion of N-type and P-type dopants to form a P(+)NN(+) structure. The results of simultaneous diffusion experiments are being compared to cells fabricated using sequential diffusion of dopants into N-base material in the same resistivity range. The process used for the fabrication of the simultaneously diffused P(+)NN(+) cells follows the standard Westinghouse baseline sequence for P-base material except that the two diffusion processes (boron and phosphorus) are replaced by a single diffusion step. All experiments are carried out on N-type dendritic web grown in the Westinghouse pre-pilot facility. The resistivities vary from 0.5 Ω-cm to 5 Ω-cm. The dopant sources used for both the simultaneous and sequential diffusion experiments are commercial metallorganic solutions with phosphorus or boron components. After these liquids are applied to the web surface, they are baked to form a hard glass which acts as a diffusion source at elevated temperatures. In experiments performed thus far, cells produced in sequential diffusion tests have properties essentially equal to the baseline N(+)PP(+) cells. However, the simultaneous diffusions have produced cells with much lower IV characteristics, mainly due to cross-doping of the sources at the diffusion temperature. This cross-doping is due to the high vapor pressure phosphorus (applied as a metallorganic to the back surface) diffusing through the SiO2 mask and then acting as a diffusant source for the front surface.

  17. Evaluation of feedback interventions for improving the quality assurance of cancer screening in Japan: study design and report of the baseline survey.

    PubMed

    Machii, Ryoko; Saika, Kumiko; Higashi, Takahiro; Aoki, Ayako; Hamashima, Chisato; Saito, Hiroshi

    2012-02-01

    The importance of quality assurance in cancer screening has recently gained increasing attention in Japan. To evaluate and improve quality, checklists and process indicators have been developed. To explore effective methods of enhancing quality in cancer screening, we started a randomized control study of the methods of evaluation and feedback for cancer control from 2009 to 2014. We randomly assigned 1270 municipal governments, equivalent to 71% of all Japanese municipal governments that performed screening programs, into three groups. The high-intensity intervention groups (n = 425) were individually evaluated using both checklist performance and process indicator values, while the low-intensity intervention groups (n= 421) were individually evaluated on the basis of only checklist performance. The control group (n = 424) received only a basic report that included the national average of checklist performance scores. We repeated the survey for each municipality's quality assurance activity performance using checklists and process indicators. In this paper, we report our study design and the result of the baseline survey. The checklist adherence rates were especially low in the checklist elements related to invitation of individuals, detailed monitoring of process indicators such as cancer detection rates according to screening histories and appropriate selection of screening facilities. Screening rate and percentage of examinees who underwent detailed examination tended to be lower for large cities when compared with smaller cities for all cancer sites. The performance of the Japanese cancer screening program in 2009 was identified for the first time.

  18. Estimating error statistics for Chambon-la-Forêt observatory definitive data

    NASA Astrophysics Data System (ADS)

    Lesur, Vincent; Heumez, Benoît; Telali, Abdelkader; Lalanne, Xavier; Soloviev, Anatoly

    2017-08-01

    We propose a new algorithm for calibrating definitive observatory data with the goal of providing users with estimates of the data error standard deviations (SDs). The algorithm has been implemented and tested using Chambon-la-Forêt observatory (CLF) data. The calibration process uses all available data. It is set as a large, weakly non-linear, inverse problem that ultimately provides estimates of baseline values in three orthogonal directions, together with their expected standard deviations. For this inverse problem, absolute data error statistics are estimated from two series of absolute measurements made within a day. Similarly, variometer data error statistics are derived by comparing variometer data time series between different pairs of instruments over few years. The comparisons of these time series led us to use an autoregressive process of order 1 (AR1 process) as a prior for the baselines. Therefore the obtained baselines do not vary smoothly in time. They have relatively small SDs, well below 300 pT when absolute data are recorded twice a week - i.e. within the daily to weekly measures recommended by INTERMAGNET. The algorithm was tested against the process traditionally used to derive baselines at CLF observatory, suggesting that statistics are less favourable when this latter process is used. Finally, two sets of definitive data were calibrated using the new algorithm. Their comparison shows that the definitive data SDs are less than 400 pT and may be slightly overestimated by our process: an indication that more work is required to have proper estimates of absolute data error statistics. For magnetic field modelling, the results show that even on isolated sites like CLF observatory, there are very localised signals over a large span of temporal frequencies that can be as large as 1 nT. The SDs reported here encompass signals of a few hundred metres and less than a day wavelengths.
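
    The key modelling choice, an AR(1) prior on the baselines, can be sketched in a toy one-component linear-Gaussian analogue: sparse absolute measurements constrain a baseline series whose prior precision matrix has the tridiagonal AR(1) form. This illustrates the idea only, not the observatory's full weakly non-linear inversion:

        # Hedged sketch: estimate a slowly varying baseline from sparse, noisy
        # absolute measurements under a first-order autoregressive (AR1) prior,
        # solved as a linear Gaussian MAP problem. A toy one-component analogue of
        # the calibration; the paper's full weakly non-linear inversion is omitted.
        import numpy as np

        rng = np.random.default_rng(4)
        n_epochs = 200
        phi, sigma_proc = 0.98, 0.05               # AR1 coefficient and innovation SD
        truth = np.zeros(n_epochs)
        for k in range(1, n_epochs):
            truth[k] = phi * truth[k - 1] + sigma_proc * rng.normal()

        obs_idx = np.arange(0, n_epochs, 7)        # sparse absolute measurements
        sigma_obs = 0.3
        y = truth[obs_idx] + sigma_obs * rng.normal(size=obs_idx.size)

        # the AR1 prior has a tridiagonal precision (inverse covariance) matrix
        Q = np.zeros((n_epochs, n_epochs))
        for k in range(1, n_epochs):
            Q[k, k] += 1.0
            Q[k - 1, k - 1] += phi ** 2
            Q[k, k - 1] -= phi
            Q[k - 1, k] -= phi
        Q /= sigma_proc ** 2

        H = np.zeros((obs_idx.size, n_epochs))     # picks out the observed epochs
        H[np.arange(obs_idx.size), obs_idx] = 1.0

        A = Q + H.T @ H / sigma_obs ** 2           # posterior precision
        baseline_hat = np.linalg.solve(A, H.T @ y / sigma_obs ** 2)
        post_sd = np.sqrt(np.diag(np.linalg.inv(A)))   # per-epoch baseline SDs
        print(baseline_hat[:5], post_sd[:5])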

  19. The Relation between Severity of Autism and Caregiver-Child Interaction: a Study in the Context of Relationship Development Intervention.

    PubMed

    Hobson, Jessica A; Tarver, Laura; Beurkens, Nicole; Hobson, R Peter

    2016-05-01

    The aim of this study was to examine the relations between severity of children's autism and qualities of parent-child interaction. We studied these variables at two points of time in children receiving a treatment that has a focus on social engagement, Relationship Development Intervention (RDI; Gutstein 2009). Participants were 18 parent-child dyads where the child (16 boys, 2 girls) had a diagnosis of autism and was between the ages of 2 and 12 years. The severity of the children's autism was assessed at baseline and later in treatment using the autism severity metric of the Autism Diagnostic Observation Schedule (ADOS; Gotham et al. Journal of Autism and Developmental Disorders, 39, 693-705 2009). Although the ADOS was designed as a diagnostic measure, ADOS calibrated severity scores (CSS) are increasingly used as one index of change (e.g., Locke et al. Autism, 18, 370-375 2014). Videotapes of parent-child interaction at baseline and later in treatment were rated by independent coders, for a) overall qualities of interpersonal relatedness using the Dyadic Coding Scales (DCS; Humber and Moss The American Journal of Orthopsychiatry, 75, 128-141 2005), and b) second-by-second parent-child Co-Regulation and Intersubjective Engagement (processes targeted by the treatment approach of RDI). Severity of autism was correlated with lower quality of parent-child interaction. Ratings on each of these variables changed over the course of treatment, and there was evidence that improvement was specifically related to the quality of parent-child interaction at baseline.

  20. Carotid intima-media thickness is a novel predictor of new onset of hypertension in normotensive subjects.

    PubMed

    Takase, Hiroyuki; Sugiura, Tonomori; Murai, Shunsuke; Yamashita, Sumiyo; Ohte, Nobuyuki; Dohi, Yasuaki

    2017-08-01

    Increased carotid intima-media thickness (IMT) in individuals without hypertension might indicate other factors promoting the atherosclerotic process that are often simultaneously clustered in individuals. The present study tested the hypothesis that carotid IMT predicts new onset of hypertension in the normotensive subjects. A total of 867 participants were enrolled from our yearly physical checkup program and their carotid IMT was measured. After a baseline examination, the subjects were followed up for a median of 1091 days with the endpoint being the development of hypertension. At baseline, the carotid IMT value was 0.75 ± 0.16 mm. Hypertension developed in 184 subjects during the follow-up (76.9/1000 person-years). The incidence of hypertension was increased across the tertiles of the carotid IMT value (39.6, 70.0, and 134.5/1000 person-years in the first, second, and third tertiles, respectively, P < .001 by log-rank test). Multivariate Cox-hazard analysis after adjustment identified carotid IMT, taken as a continuous variable, as a significant predictor of new-onset hypertension (hazard ratio = 7.08, 95% confidence interval = 3.06-15.39). Furthermore, multivariate linear regression analyses indicated a significant correlation between the carotid IMT at baseline and yearly increases in systolic blood pressure during the follow-up period (β = 0.189, P < .001). Carotid IMT is an independent predictor of hypertension onset in normotensive subjects. The findings also suggested a close association between increased carotid IMT and blood pressure.
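
    The reported hazard ratio per unit IMT is the exponentiated coefficient of a Cox proportional-hazards model. The sketch below shows that computation on simulated data, assuming the lifelines package; the column names, effect sizes, and values are placeholders, not the study data:

        # Hedged sketch: the hazard ratio per unit IMT as the exponentiated
        # coefficient of a Cox proportional-hazards model, assuming the lifelines
        # package. Columns, effect sizes, and data are simulated placeholders.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(5)
        n = 867
        imt = np.clip(rng.normal(0.75, 0.16, n), 0.4, 1.5)    # carotid IMT, mm
        age = rng.normal(55, 10, n)
        rate = 0.02 * np.exp(2.0 * (imt - 0.75) + 0.02 * (age - 55))  # events per year
        time_to_event = rng.exponential(1 / rate)
        follow_up = rng.uniform(2.0, 4.0, n)                   # years of observation

        df = pd.DataFrame({
            "imt": imt,
            "age": age,
            "duration": np.minimum(time_to_event, follow_up),
            "hypertension": (time_to_event <= follow_up).astype(int),
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="duration", event_col="hypertension")
        print(np.exp(cph.params_["imt"]))   # hazard ratio per 1 mm increase in IMT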

  1. Carotid intima-media thickness is a novel predictor of new onset of hypertension in normotensive subjects

    PubMed Central

    Takase, Hiroyuki; Sugiura, Tonomori; Murai, Shunsuke; Yamashita, Sumiyo; Ohte, Nobuyuki; Dohi, Yasuaki

    2017-01-01

    Abstract Increased carotid intima-media thickness (IMT) in individuals without hypertension might indicate other factors promoting the atherosclerotic process that are often simultaneously clustered in individuals. The present study tested the hypothesis that carotid IMT predicts new onset of hypertension in the normotensive subjects. A total of 867 participants were enrolled from our yearly physical checkup program and their carotid IMT was measured. After a baseline examination, the subjects were followed up for a median of 1091 days with the endpoint being the development of hypertension. At baseline, the carotid IMT value was 0.75 ± 0.16 mm. Hypertension developed in 184 subjects during the follow-up (76.9/1000 person-years). The incidence of hypertension was increased across the tertiles of the carotid IMT value (39.6, 70.0, and 134.5/1000 person-years in the first, second, and third tertiles, respectively, P < .001 by log-rank test). Multivariate Cox-hazard analysis after adjustment identified carotid IMT, taken as a continuous variable, as a significant predictor of new-onset hypertension (hazard ratio = 7.08, 95% confidence interval = 3.06–15.39). Furthermore, multivariate linear regression analyses indicated a significant correlation between the carotid IMT at baseline and yearly increases in systolic blood pressure during the follow-up period (β = 0.189, P < .001). Carotid IMT is an independent predictor of hypertension onset in normotensive subjects. The findings also suggested a close association between increased carotid IMT and blood pressure. PMID:28767608

  2. Shuttle Mission STS-50: Orbital Processing of High-Quality CdTe Compound Semiconductors Experiment: Final Flight Sample Characterization Report

    NASA Technical Reports Server (NTRS)

    Larson, David J.; Casagrande, Luis G.; DiMarzio, Don; Alexander, J. Iwan D.; Carlson, Fred; Lee, Taipo; Dudley, Michael; Raghathamachar, Balaji

    1998-01-01

    The Orbital Processing of High-Quality Doped and Alloyed CdTe Compound Semiconductors program was initiated to investigate, quantitatively, the influences of gravitationally dependent phenomena on the growth and quality of bulk compound semiconductors. The objective was to improve crystal quality (both structural and compositional) and to better understand and control the variables within the crystal growth production process. The empirical effort entailed the development of a terrestrial (one-g) experiment baseline for quantitative comparison with microgravity (mu-g) results. This effort was supported by the development of high-fidelity process models of heat transfer, fluid flow and solute redistribution, and thermo-mechanical stress occurring in the furnace, safety cartridge, ampoule, and crystal throughout the melting, seeding, crystal growth, and post-solidification processing. In addition, the sensitivity of the orbital experiments was analyzed with respect to the residual microgravity (mu-g) environment, both steady state and g-jitter. CdZnTe crystals were grown in one-g and in mu-g. Crystals processed terrestrially were grown at the NASA Ground Control Experiments Laboratory (GCEL) and at Grumman Aerospace Corporation (now Northrop Grumman Corporation). Two mu-g crystals were grown in the Crystal Growth Furnace (CGF) during the First United States Microgravity Laboratory Mission (USML-1), STS-50, June 24 - July 9, 1992.

  3. MetAlign 3.0: performance enhancement by efficient use of advances in computer hardware.

    PubMed

    Lommen, Arjen; Kools, Harrie J

    2012-08-01

    A new, multi-threaded version of the GC-MS and LC-MS data processing software, metAlign, has been developed which is able to utilize multiple cores on one PC. This new version was tested using three different multi-core PCs with different operating systems. The performance of noise reduction, baseline correction and peak-picking was 8-19 fold faster compared to the previous version on a single core machine from 2008. The alignment was 5-10 fold faster. Factors influencing the performance enhancement are discussed. Our observations show that performance scales with the increase in processor core numbers we currently see in consumer PC hardware development.
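
    The performance gain comes from distributing per-trace work (noise reduction, baseline correction, peak picking) across cores. The sketch below illustrates the same pattern with Python's multiprocessing rather than metAlign's own implementation; the toy trace processing is illustrative only:

        # Hedged sketch: per-trace parallelism across CPU cores, shown with
        # Python's multiprocessing rather than metAlign's own implementation.
        # The toy baseline correction and peak picking are illustrative only.
        import numpy as np
        from multiprocessing import Pool

        def process_trace(trace):
            baseline = np.median(trace)            # crude baseline estimate
            corrected = trace - baseline           # baseline correction
            noise = corrected.std()
            # simple peak picking: local maxima above a 3-sigma noise threshold
            return [i for i in range(1, len(corrected) - 1)
                    if corrected[i] > corrected[i - 1]
                    and corrected[i] > corrected[i + 1]
                    and corrected[i] > 3 * noise]

        if __name__ == "__main__":
            rng = np.random.default_rng(6)
            traces = [rng.normal(100, 5, 2000) for _ in range(64)]  # simulated traces
            with Pool() as pool:                   # one worker per available core
                all_peaks = pool.map(process_trace, traces)
            print(sum(len(p) for p in all_peaks), "peaks found")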

  4. Long Duration Hot Hydrogen Exposure of Nuclear Thermal Rocket Materials

    NASA Technical Reports Server (NTRS)

    Litchford, Ron J.; Foote, John P.; Hickman, Robert; Dobson, Chris; Clifton, Scooter

    2007-01-01

    An arc-heater driven hyper-thermal convective environments simulator was recently developed and commissioned for long duration hot hydrogen exposure of nuclear thermal rocket materials. This newly established non-nuclear testing capability uses a high-power, multi-gas, wall-stabilized constricted arc-heater to produce high-temperature pressurized hydrogen flows representative of nuclear reactor core environments, excepting radiation effects, and is intended to serve as a low cost test facility for the purpose of investigating and characterizing candidate fuel/structural materials and improving associated processing/fabrication techniques. Design and engineering development efforts are fully summarized, and facility operating characteristics are reported as determined from a series of baseline performance mapping runs and long duration capability demonstration tests.

  5. Design of a vehicle based system to prevent ozone loss

    NASA Technical Reports Server (NTRS)

    Talbot, Matthew D.; Eby, Steven C.; Ireland, Glen J.; Mcwithey, Michael C.; Schneider, Mark S.; Youngblood, Daniel L.; Johnson, Matt; Taylor, Chris

    1994-01-01

    This project is designed to be completed over a three year period. Overall project goals are: (1) to understand the processes that contribute to stratospheric ozone loss; (2) to determine the best scheme to prevent ozone loss; and (3) to design a vehicle based system to carry out the prevention scheme. The 1993/1994 design objectives included: (1) to review the results of the 1992/1993 design team, including a reevaluation of the key assumptions used; (2) to develop a matrix of baseline vehicle concepts as candidates for the delivery vehicle; and (3) to develop a selection criteria and perform quantitative trade studies to use in the selection of the specific vehicle concept.

  6. Improved silicon nitride for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Yeh, H. C.; Wimmer, J. M.

    1986-01-01

    Silicon nitride is a high temperature material currently under consideration for heat engine and other applications. The objective is to improve the net shape fabrication technology of Si3N4 by injection molding. This is to be accomplished by optimizing the process through a series of statistically designed matrix experiments. To provide input to the matrix experiments, a wide range of alternate materials and processing parameters was investigated throughout the whole program. The improvement in the processing is to be demonstrated by a 20 percent increase in strength and a 100 percent increase in the Weibull modulus over that of the baseline material. A full characterization of the baseline process was completed. Material properties were found to be highly dependent on each step of the process. Several important parameters identified thus far are the starting raw materials, sinter/hot isostatic pressing cycle, powder bed, mixing methods, and sintering aid levels.
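
    The Weibull modulus used as the program's reliability target is estimated from a set of strength measurements, commonly via the linearized Weibull plot. The sketch below shows that calculation on simulated strengths, not measured Si3N4 data:

        # Hedged sketch: estimating the Weibull modulus m from a set of flexural
        # strength measurements via the linearized Weibull plot. Strengths are
        # simulated, not measured Si3N4 data.
        import numpy as np

        rng = np.random.default_rng(7)
        strengths = np.sort(rng.weibull(10.0, 30) * 700.0)   # MPa, true m = 10
        n = strengths.size

        # median-rank probability estimator, then fit ln(-ln(1-F)) = m*ln(s) - m*ln(s0)
        F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
        y = np.log(-np.log(1.0 - F))
        x = np.log(strengths)
        m, intercept = np.polyfit(x, y, 1)
        s0 = np.exp(-intercept / m)                          # characteristic strength

        print(f"Weibull modulus m = {m:.1f}, characteristic strength = {s0:.0f} MPa")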

  7. Memory plasticity in older adults: Cognitive predictors of training response and maintenance following learning of number-consonant mnemonic.

    PubMed

    Sandberg, Petra; Rönnlund, Michael; Derwinger-Hallberg, Anna; Stigsdotter Neely, Anna

    2016-10-01

    The study investigated the relationship between cognitive factors and gains in number recall following training in a number-consonant mnemonic in a sample of 112 older adults (M = 70.9 years). The cognitive factors examined included baseline episodic memory, working memory, processing speed, and verbal knowledge. In addition, predictors of maintenance of gains to a follow-up assessment, eight months later, were examined. Whereas working memory was a prominent predictor of baseline recall, the magnitude of gains in recall from pre- to post-test assessments were predicted by baseline episodic memory, processing speed, and verbal knowledge. Verbal knowledge was the only significant predictor of maintenance. Collectively, the results indicate the need to consider multiple factors to account for individual differences in memory plasticity. The potential contribution of additional factors to individual differences in memory plasticity is discussed.

  8. Extended performance solar electric propulsion thrust system study. Volume 2: Baseline thrust system

    NASA Technical Reports Server (NTRS)

    Poeschel, R. L.; Hawthorne, E. I.

    1977-01-01

    Several thrust system design concepts were evaluated and compared using the specifications of the most advanced 30-cm engineering model thruster as the technology base. Emphasis was placed on relatively high-power missions (60 to 100 kW) such as a Halley's comet rendezvous. The extensions in thruster performance required for the Halley's comet mission were defined and alternative thrust system concepts were designed in sufficient detail for comparing mass, efficiency, reliability, structure, and thermal characteristics. Confirmation testing and analysis of thruster and power-processing components were performed, and the feasibility of satisfying extended performance requirements was verified. A baseline design was selected from the alternatives considered, and the design analysis and documentation were refined. The baseline thrust system design features modular construction, conventional power processing, and a concentrator solar array concept and is designed to interface with the space shuttle.

  9. Absolute versus Relative Difference Measures of Priming: Which Is Appropriate when Baseline Scores Change with Age?

    ERIC Educational Resources Information Center

    Murphy, Kristina; McKone, Elinor; Slee, Judith

    2006-01-01

    It is often of theoretical interest to know if implicit memory (repetition priming) develops across childhood under a given circumstance. Methodologically, however, it is difficult to determine whether development is present when baseline performance for unstudied items improves with age. Calculation of priming in absolute…
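
    The methodological contrast is between an absolute priming score (baseline minus primed latency) and a relative score (that difference as a proportion of baseline). The made-up numbers below illustrate how the two measures can disagree when baseline performance changes with age:

        # Hedged sketch: the two priming measures contrasted above, computed from
        # made-up naming latencies (ms); the numbers are illustrative only.
        unstudied_rt = {"younger": 1200.0, "older": 800.0}   # baseline (unstudied items)
        studied_rt = {"younger": 1080.0, "older": 720.0}     # primed (studied items)

        for group in unstudied_rt:
            absolute = unstudied_rt[group] - studied_rt[group]   # ms saved by priming
            relative = absolute / unstudied_rt[group]            # proportion of baseline
            print(group, f"absolute = {absolute:.0f} ms", f"relative = {relative:.1%}")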

  10. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR CODING: BASELINE QUESTIONNAIRE (HOUSEHOLD) (UA-D-7.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Baseline Questionnaire. This questionnaire was developed for use in the Arizona NHEXAS project and the Border study. Household and individual data were combined in a single Baseline Questionnaire data file. Keywo...

  11. Integrated process modeling for the laser inertial fusion energy (LIFE) generation system

    NASA Astrophysics Data System (ADS)

    Meier, W. R.; Anklam, T. M.; Erlandson, A. C.; Miles, R. R.; Simon, A. J.; Sawicki, R.; Storm, E.

    2010-08-01

    A concept for a new fusion-fission hybrid technology is being developed at Lawrence Livermore National Laboratory. The primary application of this technology is base-load electrical power generation. However, variants of the baseline technology can be used to "burn" spent nuclear fuel from light water reactors or to perform selective transmutation of problematic fission products. The use of a fusion driver allows very high burn-up of the fission fuel, limited only by the radiation resistance of the fuel form and system structures. As a part of this process, integrated process models have been developed to aid in concept definition. Several models have been developed. A cost scaling model allows quick assessment of the impact of design changes or technology improvements on the cost of electricity. System design models are being used to better understand system interactions and to do design trade-off and optimization studies. Here we describe the different systems models and present systems analysis results. Different market entry strategies are discussed along with potential benefits to US energy security and nuclear waste disposal. Advanced technology options are evaluated and potential benefits from additional R&D targeted at the different options are quantified.

  12. Materials and Process Activities for NASA's Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Polis, Daniel L.

    2012-01-01

    In January 2007, the NASA Administrator and Associate Administrator for the Exploration Systems Mission Directorate chartered the NASA Engineering and Safety Center (NESC) to design, build, and test a full-scale Composite Crew Module (CCM). The overall goal of the CCM project was to develop a team from the NASA family with hands-on experience in composite design, manufacturing, and testing in anticipation of future space exploration systems being made of composite materials. The CCM project was planned to run concurrently with the Orion project's baseline metallic design within the Constellation Program so that features could be compared and discussed without inducing risk to the overall Program. The materials and process activities were prioritized based on a rapid prototype approach. This approach focused developmental activities on design details with greater risk and uncertainty, such as out-of-autoclave joining, over some of the more traditional lamina and laminate building block levels. While process development and associated building block testing were performed, several anomalies were still observed at the full-scale level due to interactions between process robustness and manufacturing scale-up. This paper describes the process anomalies that were encountered during the CCM development and the subsequent root cause investigations that led to the final design solutions. These investigations highlight the importance of full-scale developmental work early in the schedule of a complex composite design/build project.

  13. Current Status of the Development of a Transportable and Compact VLBI System by NICT and GSI

    NASA Technical Reports Server (NTRS)

    Ishii, Atsutoshi; Ichikawa, Ryuichi; Takiguchi, Hiroshi; Takefuji, Kazuhiro; Ujihara, Hideki; Koyama, Yasuhiro; Kondo, Tetsuro; Kurihara, Shinobu; Miura, Yuji; Matsuzaka, Shigeru; hide

    2010-01-01

    MARBLE (Multiple Antenna Radio-interferometer for Baseline Length Evaluation) is under development by NICT and GSI. The main part of MARBLE is a transportable VLBI system with a compact antenna. The aim of this system is to provide precise baseline lengths over distances of about 10 km for calibration baselines. The calibration baselines are used to check and validate surveying instruments such as GPS receivers and EDMs (Electro-optical Distance Meters). It is necessary to examine the calibration baselines regularly to maintain the quality of the validation, and the VLBI technique can examine and evaluate them. A compact VLBI antenna is also expected to play the following role in the VLBI2010 project. To achieve the challenging measurement precision of VLBI2010, it is necessary to deal with the problem of thermal and gravitational deformation of the antenna. One promising approach may be connected-element interferometry between a compact antenna and a VLBI2010 antenna. By repeatedly measuring the baseline between the small stable antenna and the VLBI2010 antenna, the deformation of the primary antenna can be measured, and thermal and gravitational deformation models of the primary antenna can then be constructed. We built two prototypes of a transportable and compact VLBI system between 2007 and 2009. We performed VLBI experiments using these prototypes and obtained the baseline length between the two prototypes. The formal error of the measured baseline length was 2.7 mm. We expect that the baseline length error will be reduced by using a high-speed A/D sampler.

  14. Space Station Mission Planning System (MPS) development study. Volume 2

    NASA Technical Reports Server (NTRS)

    Klus, W. J.

    1987-01-01

    The process and existing software used for Spacelab payload mission planning were studied. A complete baseline definition of the Spacelab payload mission planning process was established, along with a definition of existing software capabilities for potential extrapolation to the Space Station. This information was used as a basis for defining system requirements to support Space Station mission planning. The Space Station mission planning concept was reviewed for the purpose of identifying areas where artificial intelligence concepts might offer substantially improved capability. Three specific artificial intelligence concepts were to be investigated for applicability: natural language interfaces; expert systems; and automatic programming. The advantages and disadvantages of interfacing an artificial intelligence language with existing FORTRAN programs or of converting totally to a new programming language were identified.

  15. Customized Molecular Phenotyping by Quantitative Gene Expression and Pattern Recognition Analysis

    PubMed Central

    Akilesh, Shreeram; Shaffer, Daniel J.; Roopenian, Derry

    2003-01-01

    Description of the molecular phenotypes of pathobiological processes in vivo is a pressing need in genomic biology. We have implemented a high-throughput real-time PCR strategy to establish quantitative expression profiles of a customized set of target genes. It enables rapid, reproducible data acquisition from limited quantities of RNA, permitting serial sampling of mouse blood during disease progression. We developed an easy to use statistical algorithm—Global Pattern Recognition—to readily identify genes whose expression has changed significantly from healthy baseline profiles. This approach provides unique molecular signatures for rheumatoid arthritis, systemic lupus erythematosus, and graft versus host disease, and can also be applied to defining the molecular phenotype of a variety of other normal and pathological processes. PMID:12840047
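
    The Global Pattern Recognition algorithm itself is not described in this record. As a simplified, hedged illustration of the underlying idea, the Python sketch below flags genes whose delta-Ct values (target Ct minus reference-gene Ct) differ significantly from a healthy baseline profile using an ordinary t-test; the gene names, replicate values, and significance threshold are all invented.

      import numpy as np
      from scipy import stats

      # Hypothetical delta-Ct values (lower delta-Ct = higher expression); arrays hold replicates.
      baseline = {"geneA": np.array([6.1, 6.3, 5.9, 6.0]),
                  "geneB": np.array([3.2, 3.0, 3.4, 3.1])}
      disease = {"geneA": np.array([4.0, 4.2, 3.9, 4.1]),   # up-regulated in disease
                 "geneB": np.array([3.1, 3.3, 3.0, 3.2])}   # unchanged

      for gene in baseline:
          t, p = stats.ttest_ind(disease[gene], baseline[gene])
          fold_change = 2.0 ** (baseline[gene].mean() - disease[gene].mean())
          flag = "changed from baseline" if p < 0.05 else "not changed"
          print(f"{gene}: fold change {fold_change:.1f}x, p = {p:.3g} -> {flag}")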

  16. An analytic technique for statistically modeling random atomic clock errors in estimation

    NASA Technical Reports Server (NTRS)

    Fell, P. J.

    1981-01-01

    Minimum variance estimation requires that the statistics of random observation errors be modeled properly. If measurements are derived through the use of atomic frequency standards, then one source of error affecting the observable is random fluctuation in frequency. This is the case, for example, with range and integrated Doppler measurements from Global Positioning System satellites and with baseline determination for geodynamic applications. An analytic method is presented which approximates the statistics of this random process. The procedure starts with a model of the Allan variance for a particular oscillator and develops the statistics of range and integrated Doppler measurements. A series of five first-order Markov processes is used to approximate the power spectral density obtained from the Allan variance.
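
    The abstract summarizes the approach without equations. As a hedged illustration, the sketch below sums the power spectral densities of five first-order (exponentially correlated) Markov processes, each with PSD S(f) = 2*sigma^2*tau / (1 + (2*pi*f*tau)^2); in the paper's procedure the component variances and time constants would be chosen so that the summed PSD matches the one implied by the oscillator's Allan variance, whereas the values here are placeholders.

      import numpy as np

      def first_order_markov_psd(f, sigma2, tau):
          """PSD of a first-order (exponentially correlated) Markov process."""
          return 2.0 * sigma2 * tau / (1.0 + (2.0 * np.pi * f * tau) ** 2)

      # Placeholder (variance, time constant in seconds) pairs for the five component processes.
      components = [(1e-24, 1.0), (5e-25, 10.0), (2e-25, 100.0), (1e-25, 1e3), (5e-26, 1e4)]

      f = np.logspace(-5, 0, 200)                       # frequency grid, Hz
      total_psd = sum(first_order_markov_psd(f, s2, tau) for s2, tau in components)
      print(total_psd[:3])                              # summed approximation to the target PSD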

  17. Artifact Correction in Temperature-Dependent Attenuated Total Reflection Infrared (ATR-IR) Spectra.

    PubMed

    Sobieski, Brian; Chase, Bruce; Noda, Isao; Rabolt, John

    2017-08-01

    A spectral processing method was developed and tested for analyzing temperature-dependent attenuated total reflection infrared (ATR-IR) spectra of aliphatic polyesters. Spectra of a bio-based, biodegradable polymer, 3.9 mol% 3HHx poly[(R)-3-hydroxybutyrate-co-(R)-3-hydroxyhexanoate] (PHBHx), were corrected prior to analysis using two-dimensional correlation spectroscopy (2D-COS). Removal of the temperature variation of diamond absorbance, correction of the baseline, ATR correction, and appropriate normalization were key to generating more reliable data. Both the processing steps and their order were important. A comparison to differential scanning calorimetry (DSC) analysis indicated that the normalization method should be chosen with caution to avoid unintentional trends and distortions of the crystalline-sensitive bands.

  18. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan

    2015-10-05

    Accurate solar power forecasting allows utilities to get the most out of the solar resources on their systems. To truly measure the improvements that any new solar forecasting methods can provide, it is important to first develop (or determine) baseline and target solar forecasting metrics at different spatial and temporal scales. This paper aims to develop baseline and target values for solar forecasting metrics, informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of solar power output.
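
    The record names persistence models as one ingredient of the baseline. The Python sketch below shows the simplest form of persistence forecasting (the forecast for the next hour equals the current observation) together with an RMSE score that a candidate method would need to beat; the synthetic data and the choice of RMSE are assumptions, and the paper's baselines also draw on numerical weather prediction.

      import numpy as np

      rng = np.random.default_rng(0)
      clear = np.clip(np.sin(np.linspace(0, 3 * np.pi, 72)), 0, None) * 50   # synthetic PV, MW
      power = np.clip(clear + rng.normal(0, 2, clear.size), 0, None)         # noisy "observations"

      persistence = power[:-1]     # forecast for hour t+1 is simply the value observed at hour t
      actual = power[1:]
      rmse = float(np.sqrt(np.mean((persistence - actual) ** 2)))
      print(f"persistence baseline RMSE: {rmse:.2f} MW")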

  19. Development of realtime connected element interferometry at the Goldstone Deep Space Communications Complex

    NASA Technical Reports Server (NTRS)

    Edwards, C. D.

    1990-01-01

    Connected-element interferometry (CEI) has the potential to provide high-accuracy angular spacecraft tracking on short baselines by making use of the very precise phase delay observable. Within the Goldstone Deep Space Communications Complex (DSCC), one of three tracking complexes in the NASA Deep Space Network, baselines of up to 21 km in length are available. Analysis of data from a series of short-baseline phase-delay interferometry experiments is presented to demonstrate the potential tracking accuracy on these baselines. Repeated differential observations of pairs of angularly close extragalactic radio sources were made to simulate differential spacecraft-quasar measurements. Fiber-optic data links and a correlation processor are currently being developed and installed at Goldstone for a demonstration of real-time CEI in 1990.

  20. A new estimator for VLBI baseline length repeatability

    NASA Astrophysics Data System (ADS)

    Titov, O.

    2009-11-01

    The goal of this paper is to introduce a more effective technique to approximate the “repeatability-baseline length” relationship that is used to evaluate the quality of geodetic VLBI results. Traditionally, this relationship is approximated by a quadratic function of baseline length over all baselines. The new model incorporates the mean number of observed group delays of the reference radio sources (i.e. estimated as global parameters) used in the estimation of each baseline. It is shown that the new method provides a better approximation of the “repeatability-baseline length” relationship than the traditional model. Further development of the new approach comes down to modeling the repeatability as a function of two parameters: baseline length and baseline slewing rate. Within the framework of this new approach the station vertical and horizontal uncertainties can be treated as a function of baseline length. While the previous relationship indicated that the station vertical uncertainties are generally 4-5 times larger than the horizontal uncertainties, the vertical uncertainties as determined by the new method are only larger by a factor of 1.44 over all baseline lengths.
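
    As a hedged numerical illustration of the traditional model described above, the sketch below fits repeatability as a quadratic function of baseline length by ordinary least squares and then adds one extra regressor loosely standing in for the mean number of observed group delays; the data are synthetic and the extra 1/sqrt(N) term is an assumption, not the paper's actual estimator.

      import numpy as np

      length = np.array([400., 1200., 3000., 5500., 8000., 10000., 12000.])   # baseline length, km
      repeat = np.array([2.1, 3.0, 5.2, 8.0, 11.5, 14.0, 17.5])               # repeatability, mm
      n_obs = np.array([900., 1500., 1200., 800., 600., 500., 400.])          # mean delays per baseline

      # Traditional model: repeatability ~ a + b*L + c*L^2.
      A_trad = np.column_stack([np.ones_like(length), length, length ** 2])
      coef_trad, *_ = np.linalg.lstsq(A_trad, repeat, rcond=None)

      # Extended model: add an observation-count term (illustrative only).
      A_ext = np.column_stack([A_trad, 1.0 / np.sqrt(n_obs)])
      coef_ext, *_ = np.linalg.lstsq(A_ext, repeat, rcond=None)

      for name, A, c in (("quadratic", A_trad, coef_trad), ("extended", A_ext, coef_ext)):
          rms = float(np.sqrt(np.mean((A @ c - repeat) ** 2)))
          print(f"{name} model RMS residual: {rms:.2f} mm")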

  1. The Impact of Cardiorespiratory Fitness Levels on the Risk of Developing Atherogenic Dyslipidemia.

    PubMed

    Breneman, Charity B; Polinski, Kristen; Sarzynski, Mark A; Lavie, Carl J; Kokkinos, Peter F; Ahmed, Ali; Sui, Xuemei

    2016-10-01

    Low cardiorespiratory fitness has been established as a risk factor for cardiovascular-related morbidity. However, research about the impact of fitness on lipid abnormalities, including atherogenic dyslipidemia, has produced mixed results. The purpose of this investigation is to examine the influence of baseline fitness and changes in fitness on the development of atherogenic dyslipidemia. All participants completed at least 3 comprehensive medical examinations performed by a physician that included a maximal treadmill test between 1976 and 2006 at the Cooper Clinic in Dallas, Texas. Atherogenic dyslipidemia was defined as a triad of lipid abnormalities: low high-density-lipoprotein cholesterol ([HDL-C] <40 mg/dL), high triglycerides ([TGs] ≥200 mg/dL), and high low-density-lipoprotein cholesterol ([LDL-C] ≥160 mg/dL). A total of 193 participants developed atherogenic dyslipidemia during an average of 8.85 years of follow-up. High baseline fitness was protective against the development of atherogenic dyslipidemia in comparison with those with low fitness (odds ratio [OR] 0.57; 95% confidence interval [CI], 0.37-0.89); however, this relationship became nonsignificant after controlling for baseline HDL-C, LDL-C, and TG levels. Participants who maintained fitness over time had lower odds of developing atherogenic dyslipidemia than those with a reduction in fitness (OR 0.56; 95% CI, 0.34-0.91) after adjusting for baseline confounders and changes in known risk factors. High fitness at baseline and maintenance of fitness over time are protective against the development of atherogenic dyslipidemia. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Leadership Identity Development Through Reflection and Feedback in Team-Based Learning Medical Student Teams.

    PubMed

    Alizadeh, Maryam; Mirzazadeh, Azim; Parmelee, Dean X; Peyton, Elizabeth; Mehrdad, Neda; Janani, Leila; Shahsavari, Hooman

    2018-01-01

    Studies on leadership identity development through reflection with Team-Based Learning (TBL) in medical student education are rare. We assumed that reflection and feedback on the team leadership process would advance the progression through leadership identity development stages in medical students within the context of classes using TBL. This study is a quasi-experimental design with a pretest-posttest control group. The pretest and posttest were reflection papers written by medical students about their experience of leadership during their TBL sessions. In the intervention group, TBL and a team-based, guided reflection and feedback on the team leadership process were performed at the end of all TBL sessions. In the other group, only TBL was used. The Stata 12 software was used. Leadership identity was treated as both a categorical and a quantitative variable to control for differences in baseline and gender variables. Chi-square, t tests, and linear regression analysis were performed. The population was a cohort of 2015-2016 medical students in a TBL setting at Tehran University of Medical Sciences, School of Medicine. Teams of four to seven students were formed by random sorting at the beginning of the academic year (intervention group n = 20 teams, control group n = 19 teams). At baseline, most students in both groups were categorized in the Awareness and Exploration stages of leadership identity: 51 (52%) in the intervention group and 59 (55%) in the control group; uncorrected χ²(3) = 15.6, design-based F(2.83, 108) = 4.87, p = .003. In the posttest intervention group, 36 (36%) were in Exploration, 33 (33%) were in Leader Identified, 20 (20%) were in Leadership Differentiated, and 10 (10%) were in Generativity; none were in the Awareness or Integration stages. In the control group, 3 (20%) were in Awareness, 56 (53%) were in Exploration, 35 (33%) were in Leader Identified, and 13 (12%) were in Leadership Differentiated; none were in the Generativity or Integration stages. Our hypothesis was supported by the data: uncorrected χ²(4) = 18.6, design-based F(3.77, 143) = 4.46, p = .002. The mean leadership identity score in the pretest intervention group was 1.93 (SD = 0.85) and the pretest control group mean was 2.36 (SD = 0.86), p = .004. The posttest intervention group mean was 3.04 (SD = 0.98) and the posttest control group mean was 2.54 (SD = 0.74), T = -4.00, design df = 38, p < .001; adjusted for baseline and gender, T = -8.97, design df = 38, p < .001. Reflection and feedback on the team leadership process in TBL advances the progression through stages of leadership identity development in medical students. Although the TBL strategy itself could have an impact on leadership identity development, this study demonstrates that when a reflection and feedback on leadership intervention is added, there is much greater impact.

  3. Effect of abdominal visceral fat on the development of new erosive oesophagitis: a prospective cohort study.

    PubMed

    Nam, Su Youn; Kim, Young-Woo; Park, Bum Joon; Ryu, Kum Hei; Choi, Il Ju; Nam, Byung-Ho; Kim, Hyun Boem

    2017-04-01

    Although abdominal visceral fat has been associated with erosive oesophagitis in cross-sectional studies, there are no data that describe its longitudinal effects. We aimed to evaluate the longitudinal effects of abdominal visceral fat on the development of new erosive oesophagitis in patients who did not have erosive oesophagitis at baseline. This was a single-centre prospective study. A total of 1503 participants without erosive oesophagitis at baseline were followed up for 34 months, and they underwent oesophagogastroduodenoscopy and computed tomography at both baseline and follow-up. The longitudinal effects of abdominal visceral fat on the development of new erosive oesophagitis were evaluated using odds ratios (ORs) and 95% confidence intervals (CIs). New oesophagitis developed in 83 patients. Compared with the first quartile, the third (OR=3.96, 95% CI: 1.54-10.18) and fourth (OR=4.67, 95% CI: 1.79-12.23) baseline visceral fat quartiles, the third (OR=3.03, 95% CI: 1.14-8.04) and fourth (OR=7.50, 95% CI: 2.92-19.25) follow-up visceral fat quartiles, and the fourth visceral fat change quartile (OR=2.76, 95% CI: 1.47-5.21) were associated with increased development of new erosive oesophagitis, and the P value for each trend was less than 0.001. New erosive oesophagitis was inversely related to the follow-up Helicobacter pylori status and was positively associated with the presence of a hiatal hernia and smoking during follow-up, but it was not associated with reflux symptoms, the H. pylori status, presence of a hiatal hernia, or smoking at baseline. Higher visceral fat levels at baseline and follow-up, and greater changes in visceral fat, were linearly associated with the development of new erosive oesophagitis in this longitudinal study.

  4. Baselining the New GSFC Information Systems Center: The Foundation for Verifiable Software Process Improvement

    NASA Technical Reports Server (NTRS)

    Parra, A.; Schultz, D.; Boger, J.; Condon, S.; Webby, R.; Morisio, M.; Yakimovich, D.; Carver, J.; Stark, M.; Basili, V.; hide

    1999-01-01

    This paper describes a study performed at the Information Systems Center (ISC) in NASA Goddard Space Flight Center. The ISC was set up in 1998 as a core competence center in information technology. The study aims at characterizing the people, processes, and products of the new center, to provide a basis for proposing improvement actions and for comparing the center before and after these actions have been performed. The paper presents the ISC, the goals and methods of the study, and the results and suggestions for improvement from the branch-level portion of this baselining effort.

  5. Carotid intima-media thickness is associated with the progression of cognitive impairment in older adults.

    PubMed

    Moon, Jae Hoon; Lim, Soo; Han, Ji Won; Kim, Kyoung Min; Choi, Sung Hee; Park, Kyong Soo; Kim, Ki Woong; Jang, Hak Chul

    2015-04-01

    We investigated the association between cardiovascular risk factors, including carotid intima-media thickness (CIMT), and future risk of mild cognitive impairment (MCI) and dementia in elderly subjects. We conducted a population-based prospective study as a part of the Korean Longitudinal Study on Health and Aging. Our study included 348 participants who were nondemented at baseline (mean age, 71.7±6.3 years) and underwent cognitive evaluation at the 5-year follow-up. Baseline cardiovascular risk factors were compared according to the development of MCI or dementia during the study period. At the baseline evaluation, 278 subjects were cognitively normal and 70 subjects had MCI. Diagnoses of cognitive function either remained unchanged or improved during the study period in 292 subjects (nonprogression group), whereas 56 subjects showed progression of cognitive impairment to MCI or dementia (progression group). The progression group exhibited a higher prevalence of hypertension and greater CIMT compared with the nonprogression group. Other baseline cardiovascular risk factors, including sex, body mass index, diabetes mellitus, insulin resistance, total cholesterol, waist-to-hip ratio, visceral fat, pulse wave velocity, and ankle-brachial index, were not significantly different between the 2 groups. The association between greater baseline CIMT and the progression of cognitive impairment was maintained after adjustment for conventional baseline risk factors of cognitive impairment. Greater baseline CIMT was also independently associated with the development of MCI in the subjects whose baseline cognitive function was normal. Greater baseline CIMT was independently associated with the risk of cognitive impairment, such as MCI and dementia, in elderly subjects. © 2015 American Heart Association, Inc.

  6. Comparison of DNQ/novolac resists for e-beam exposure

    NASA Astrophysics Data System (ADS)

    Fedynyshyn, Theodore H.; Doran, Scott P.; Lind, Michele L.; Lyszczarz, Theodore M.; DiNatale, William F.; Lennon, Donna; Sauer, Charles A.; Meute, Jeff

    1999-12-01

    We have surveyed the commercial resist market with the dual purpose of identifying diazoquinone/novolac based resists that have potential for use as e-beam mask making resists and baselining these resists for comparison against future mask making resist candidates. For completeness, this survey would require that each resist be compared with an optimized developer and development process. To accomplish this task in an acceptable time period, e-beam lithography modeling was employed to quickly identify the resist and developer combinations that lead to superior resist performance. We describe the verification of a method to quickly screen commercial i-line resists with different developers, by determining modeling parameters for i-line resists from e-beam exposures, modeling the resist performance, and comparing predicted performance versus actual performance. We determined the lithographic performance of several DNQ/novolac resists whose modeled performance suggests that sensitivities of less than 40 µC/cm² coupled with less than 10-nm CD change per percent change in dose are possible for target 600-nm features. This was accomplished by performing a series of statistically designed experiments on the leading resist candidates to optimize processing variables, followed by comparing experimentally determined resist sensitivities, latitudes, and profiles of the DNQ/novolac resists at their optimized processes.

  7. Conceptual Modeling via Logic Programming

    DTIC Science & Technology

    1990-01-01

    Recoverable task headings from this scanned record include: Define User Interface and Query Language; Define Procedures for Specifying Output; Select Logic Programming Language; Develop Methodology for C3I Users. The remainder of the abstract text is too garbled in the source excerpt to reconstruct.

  8. Modeling Nonlinear Errors in Surface Electromyography Due To Baseline Noise: A New Methodology

    PubMed Central

    Law, Laura Frey; Krishnan, Chandramouli; Avin, Keith

    2010-01-01

    The surface electromyographic (EMG) signal is often contaminated by some degree of baseline noise. It is customary for scientists to subtract baseline noise from the measured EMG signal prior to further analyses, based on the assumption that baseline noise adds linearly to the observed EMG signal. The stochastic nature of both the baseline and EMG signal, however, may invalidate this assumption. Alternately, “true” EMG signals may be either minimally or nonlinearly affected by baseline noise. This information is particularly relevant at low contraction intensities, when signal-to-noise ratios (SNR) may be lowest. Thus, the purpose of this simulation study was to investigate the influence of varying levels of baseline noise (approximately 2-40% of maximum EMG amplitude) on mean EMG burst amplitude and to assess the best means to account for signal noise. The simulations indicated that baseline noise had minimal effects on mean EMG activity for maximum contractions, but that its effects increased nonlinearly with increasing noise levels and decreasing signal amplitudes. Thus, simple baseline noise subtraction resulted in substantial error when estimating mean activity during low-intensity EMG bursts. Conversely, correcting the EMG signal as a nonlinear function of both baseline and measured signal amplitude provided highly accurate estimates of EMG amplitude. This novel nonlinear error modeling approach has potential implications for EMG signal processing, particularly when assessing co-activation of antagonist muscles or small-amplitude contractions where the SNR can be low. PMID:20869716
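
    The record does not reproduce the authors' correction function. The Python sketch below only illustrates why simple baseline subtraction misbehaves at low signal-to-noise ratios and shows one commonly assumed alternative, quadrature (root-sum-of-squares) correction, which is exact when the noise is additive and uncorrelated with the signal; the amplitudes are synthetic and this is not the paper's specific model.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 2000                                        # samples per simulated burst
      noise_rms = 0.05                                # baseline noise level (arbitrary units)

      for true_rms in (0.05, 0.2, 1.0):               # low, medium, high contraction intensity
          signal = rng.normal(0, true_rms, n)         # crude stand-in for an EMG burst
          noise = rng.normal(0, noise_rms, n)
          measured_rms = np.sqrt(np.mean((signal + noise) ** 2))
          subtracted = measured_rms - noise_rms                                # simple subtraction
          quadrature = np.sqrt(max(measured_rms ** 2 - noise_rms ** 2, 0.0))   # RSS correction
          print(f"true {true_rms:.2f}: subtraction {subtracted:.3f}, quadrature {quadrature:.3f}")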

  9. Safety of Rural Nursing Home-to-Emergency Department Transfers: Improving Communication and Patient Information Sharing Across Settings.

    PubMed

    Tupper, Judith B; Gray, Carolyn E; Pearson, Karen B; Coburn, Andrew F

    2015-01-01

    The "siloed" approach to healthcare delivery contributes to communication challenges and to potential patient harm when patients transfer between settings. This article reports on the evaluation of a demonstration in 10 rural communities to improve the safety of nursing facility (NF) transfers to hospital emergency departments by forming interprofessional teams of hospital, emergency medical service, and NF staff to develop and implement tools and protocols for standardizing critical interfacility communication pathways and information sharing. We worked with each of the 10 teams to document current communication processes and information sharing tools and to design, implement, and evaluate strategies/tools to increase effective communication and sharing of patient information across settings. A mixed methods approach was used to evaluate changes from baseline in documentation of patient information shared across settings during the transfer process. Study findings showed significant improvement in key areas across the three settings, including infection status and baseline mental functioning. Improvement strategies and performance varied across settings; however, accurate and consistent information sharing of advance directives and medication lists remains a challenge. Study results demonstrate that with neutral facilitation and technical support, collaborative interfacility teams can assess and effectively address communication and information sharing problems that threaten patient safety.

  10. Psychometric evaluation of the Questionnaire about the Process of Recovery (QPR).

    PubMed

    Williams, Julie; Leamy, Mary; Pesola, Francesca; Bird, Victoria; Le Boutillier, Clair; Slade, Mike

    2015-12-01

    Supporting recovery is the aim of national mental health policy in many countries. However, only one measure of recovery has been developed in England: the Questionnaire about the Process of Recovery (QPR), which measures recovery from the perspective of adult mental health service users with a psychosis diagnosis. To independently evaluate the psychometric properties of the 15- and 22-item versions of the QPR. Two samples were used: data-set 1 (n = 88) involved assessment of the QPR at baseline, 2 weeks and 3 months. Data-set 2 (n = 399; trial registration: ISRCTN02507940) involved assessment of the QPR at baseline and 1 year. For the 15-item version, internal consistency was 0.89, convergent validity was 0.73, test-retest reliability was 0.74 and sensitivity to change was 0.40. Confirmatory factor analysis showed the 15-item version offered a good fit. For the 22-item version, the interpersonal subscale was found to underperform and the intrapersonal subscale overlaps substantially with the 15-item version. Both the 15-item and the intrapersonal subscale of the 22-item versions of the QPR demonstrated satisfactory psychometric properties. The 15-item version is slightly more robust and also less burdensome, so it can be recommended for use in research and clinical practice. © The Royal College of Psychiatrists 2015.

  11. Directly Connecting the Very Long Baseline Array

    NASA Astrophysics Data System (ADS)

    Hunt, Gareth; Romney, Jonathan D.; Walker, R. Craig

    At present, the signals received by the 10 antennas of the Very Long Baseline Array (VLBA) are recorded on instrumentation tapes. These tapes are then shipped from the antenna locations --- distributed across the mainland USA, the US Virgin Islands, and Hawaii --- to the processing center in Socorro, New Mexico. The Array operates today at a mean sustained data rate of 128 Mbps per antenna, but peak rates of 256 Mbps and 512 Mbps are also used. Transported tapes provide the cheapest method of attaining these bit rates. The present tape system derives from wideband recording techniques dating back to the late 1970s, and has been in use since the commissioning of the VLBA in 1993. It is in need of replacement on a time scale of a few years. Further, plans are being developed which would increase the required data rates to 1 Gbps in five years and 100 Gbps in ten years. With the advent of higher performance networks, it should be possible to transmit the data directly to the processing center. However, achieving this connectivity is severely complicated by the remoteness of the antennas --- ``the last mile problem.'' In addition, it is not clear that the data rates involved can be obtained at a reasonable cost.

  12. Behavior Change without Behavior Change Communication: Nudging Handwashing among Primary School Students in Bangladesh.

    PubMed

    Dreibelbis, Robert; Kroeger, Anne; Hossain, Kamal; Venkatesh, Mohini; Ram, Pavani K

    2016-01-14

    Behavior change communication for improving handwashing with soap can be labor and resource intensive, yet quality results are difficult to achieve. Nudges are environmental cues engaging unconscious decision-making processes to prompt behavior change. In this proof-of-concept study, we developed an inexpensive set of nudges to encourage handwashing with soap after toilet use in two primary schools in rural Bangladesh. We completed direct observation of behaviors at baseline, after providing traditional handwashing infrastructure, and at multiple time periods following targeted handwashing nudges (1 day, 2 weeks, and 6 weeks). No additional handwashing education or motivational messages were completed. Handwashing with soap among school children was low at baseline (4%), increasing to 68% the day after nudges were completed and 74% at both 2 weeks and 6 weeks post intervention. Results indicate that nudge-based interventions have the potential to improve handwashing with soap among school-aged children in Bangladesh and specific areas of further inquiry are discussed.

  13. Behavior Change without Behavior Change Communication: Nudging Handwashing among Primary School Students in Bangladesh

    PubMed Central

    Dreibelbis, Robert; Kroeger, Anne; Hossain, Kamal; Venkatesh, Mohini; Ram, Pavani K.

    2016-01-01

    Behavior change communication for improving handwashing with soap can be labor and resource intensive, yet quality results are difficult to achieve. Nudges are environmental cues engaging unconscious decision-making processes to prompt behavior change. In this proof-of-concept study, we developed an inexpensive set of nudges to encourage handwashing with soap after toilet use in two primary schools in rural Bangladesh. We completed direct observation of behaviors at baseline, after providing traditional handwashing infrastructure, and at multiple time periods following targeted handwashing nudges (1 day, 2 weeks, and 6 weeks). No additional handwashing education or motivational messages were completed. Handwashing with soap among school children was low at baseline (4%), increasing to 68% the day after nudges were completed and 74% at both 2 weeks and 6 weeks post intervention. Results indicate that nudge-based interventions have the potential to improve handwashing with soap among school-aged children in Bangladesh and specific areas of further inquiry are discussed. PMID:26784210

  14. Development of strain tolerant thermal barrier coating systems, tasks 1 - 3

    NASA Technical Reports Server (NTRS)

    Anderson, N. P.; Sheffler, K. D.

    1983-01-01

    Insulating ceramic thermal barrier coatings can reduce gas turbine airfoil metal temperatures as much as 170 C (about 300 F), providing fuel efficiency improvements greater than one percent and durability improvements of 2 to 3X. The objective was to increase the spalling resistance of zirconia based ceramic turbine coatings. To accomplish this, two baseline and 30 candidate duplex (layered MCrAlY/zirconia based ceramic) coatings were iteratively evaluated microstructurally and in four series of laboratory burner rig tests. This led to the selection of two candidate optimized 0.25 mm (0.010 inch) thick plasma sprayed partially stabilized zirconia ceramics containing six weight percent yttria and applied with two different sets of process parameters over a 0.13 mm (0.005 inch) thick low pressure chamber sprayed MCrAlY bond coat. Both of these coatings demonstrated at least 3X laboratory cyclic spalling life improvement over the baseline systems, as well as cyclic oxidation life equivalent to 15,000 commercial engine flight hours.

  15. Development of a Land Use Database for the Little Blackwater Watershed, Dorchester County, Maryland

    USGS Publications Warehouse

    Milheim, Lesley E.; Jones, John W.; Barlow, Roger A.

    2007-01-01

    Many agricultural and forested areas in proximity to National Wildlife Refuges (NWR) are under increasing economic pressure for commercial or residential development. The upper portion of the Little Blackwater River watershed - a 27 square mile area within largely low-lying Dorchester County, Maryland, on the eastern shore of the Chesapeake Bay - is important to the U.S. Fish and Wildlife Service (USFWS) because it flows toward the Blackwater National Wildlife Refuge (BNWR), and the developmental impacts of areas upstream from the BNWR are unknown. One of the primary concerns for the refuge is how storm-water runoff may affect living resources downstream. The Egypt Road project (fig. 1), for which approximately 600 residential units have been approved, has the potential to markedly change the land use and land cover on the west bank of the Little Blackwater River. In an effort to limit anticipated impacts, the Maryland Department of Natural Resources (Maryland DNR) recently decided to purchase some of the lands previously slated for development. Local topography, a high water table (typically 1 foot or less below the land surface), and hydric soils present a challenge for the best management of storm-water flow from developed surfaces. A spatial data coordination group was formed by the Dorchester County Soil and Conservation District to collect data to aid decisionmakers in watershed management and on the possible impacts of development on this watershed. Determination of streamflow combined with land cover and impervious-surface baselines will allow linking of hydrologic and geologic factors that influence the land surface. This baseline information will help planners, refuge managers, and developers discuss issues and formulate best management practices to mitigate development impacts on the refuge. In consultation with the Eastern Region Geospatial Information Office, the dataset selected as the baseline land cover source was the June-July 2005 National Agricultural Imagery Program (NAIP) 1-meter resolution orthoimagery of Maryland. This publicly available, statewide dataset provided imagery closest in time to the installation of a U.S. Geological Survey (USGS) Water Resources Discipline gaging station on the Little Blackwater River. It also captures land cover status just before major residential development occurs. This document describes the process used to create a land use database for the Little Blackwater watershed.

  16. Comparative Analysis of a Principal Component Analysis-Based and an Artificial Neural Network-Based Method for Baseline Removal.

    PubMed

    Carvajal, Roberto C; Arias, Luis E; Garces, Hugo O; Sbarbaro, Daniel G

    2016-04-01

    This work presents a non-parametric method based on principal component analysis (PCA) and a parametric one based on artificial neural networks (ANN) to remove continuous baseline features from spectra. The non-parametric method estimates the baseline based on a set of sampled basis vectors obtained from PCA applied over a previously composed continuous-spectra learning matrix. The parametric method, however, uses an ANN to filter out the baseline; previous studies have demonstrated that this method is one of the most effective for baseline removal. The evaluation of both methods was carried out by using a synthetic database designed for benchmarking baseline removal algorithms, containing 100 synthetic composed spectra at different signal-to-baseline ratios (SBR), signal-to-noise ratios (SNR), and baseline slopes. To demonstrate the utility of the proposed methods and to compare them in a real application, a spectral data set measured from a flame radiation process was also used. Several performance metrics such as the correlation coefficient, chi-square value, and goodness-of-fit coefficient were calculated to quantify and compare both algorithms. Results demonstrate that the PCA-based method outperforms the one based on ANN both in terms of performance and simplicity. © The Author(s) 2016.
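
    The construction of the learning matrix and the exact PCA procedure are detailed in the paper, not here. Under stated assumptions, the Python sketch below shows the general mechanics of such an approach: principal components are learned from a matrix of baseline-only spectra, and a measured spectrum is corrected by subtracting its least-squares projection onto those components plus a constant offset. All spectra are synthetic.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(2)
      wl = np.linspace(400, 800, 300)                                  # wavelength grid, nm

      # Learning matrix: 50 synthetic continuous baselines (slowly varying curves).
      slopes = rng.uniform(-0.002, 0.002, 50)
      curves = rng.uniform(-1e-5, 1e-5, 50)
      baselines = 1.0 + slopes[:, None] * (wl - 600) + curves[:, None] * (wl - 600) ** 2

      pca = PCA(n_components=3).fit(baselines)
      basis = np.vstack([np.ones_like(wl), pca.components_])           # offset + baseline components

      peak = 0.8 * np.exp(-0.5 * ((wl - 650) / 10) ** 2)               # "true" spectral feature
      measured = peak + 1.0 + 0.0015 * (wl - 600)                      # feature on an unknown baseline

      coef, *_ = np.linalg.lstsq(basis.T, measured, rcond=None)        # project onto baseline basis
      corrected = measured - basis.T @ coef
      # Residual vs. the known feature is nonzero; the projection absorbs a bit of the peak.
      print("max deviation from true feature:", float(np.abs(corrected - peak).max()))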

  17. Machine Learning EEG to Predict Cognitive Functioning and Processing Speed Over a 2-Year Period in Multiple Sclerosis Patients and Controls.

    PubMed

    Kiiski, Hanni; Jollans, Lee; Donnchadha, Seán Ó; Nolan, Hugh; Lonergan, Róisín; Kelly, Siobhán; O'Brien, Marie Claire; Kinsella, Katie; Bramham, Jessica; Burke, Teresa; Hutchinson, Michael; Tubridy, Niall; Reilly, Richard B; Whelan, Robert

    2018-05-01

    Event-related potentials (ERPs) show promise as objective indicators of cognitive functioning. The aim of the study was to examine whether ERPs recorded during an oddball task would predict cognitive functioning and information processing speed in Multiple Sclerosis (MS) patients and controls at the individual level. Seventy-eight participants (35 MS patients, 43 healthy age-matched controls) completed visual and auditory 2- and 3-stimulus oddball tasks with 128-channel EEG, and a neuropsychological battery, at baseline (month 0) and at months 13 and 26. ERPs from 0 to 700 ms and across the whole scalp were transformed into 1728 individual spatio-temporal datapoints per participant. A machine learning method that included penalized linear regression used the entire spatio-temporal ERP to predict composite scores of both cognitive functioning and processing speed at baseline (month 0) and at months 13 and 26. The results showed that ERPs recorded during the visual oddball tasks could predict cognitive functioning and information processing speed at baseline and a year later in a sample of MS patients and healthy controls. In contrast, ERPs recorded during auditory tasks were not predictive of cognitive performance. These objective neurophysiological indicators of cognitive functioning and processing speed, and machine learning methods that can interrogate high-dimensional data, show promise in outcome prediction.
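
    The study used penalized linear regression on 1728 spatio-temporal ERP datapoints per participant. The sketch below shows a generic cross-validated penalized fit of that shape with scikit-learn's ElasticNetCV; the random feature matrix, the elastic-net penalty, and the 5-fold scheme are stand-ins and assumptions, not the authors' exact pipeline.

      import numpy as np
      from sklearn.linear_model import ElasticNetCV
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      n_participants, n_features = 78, 1728                      # 78 participants, 1728 ERP datapoints
      X = rng.normal(size=(n_participants, n_features))          # stand-in for ERP amplitudes
      beta = np.zeros(n_features)
      beta[:20] = 0.5                                             # only a few features carry signal
      y = X @ beta + rng.normal(scale=1.0, size=n_participants)   # stand-in composite score

      model = make_pipeline(StandardScaler(),
                            ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5, max_iter=5000))
      scores = cross_val_score(model, X, y, cv=5, scoring="r2")
      print("cross-validated R^2 per fold:", np.round(scores, 2))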

  18. Development of a solar-powered residential air conditioner: System optimization preliminary specification

    NASA Technical Reports Server (NTRS)

    Rousseau, J.; Hwang, K. C.

    1975-01-01

    Investigations aimed at the optimization of a baseline Rankine cycle solar powered air conditioner and the development of a preliminary system specification were conducted. Efforts encompassed the following: (1) investigations of the use of recuperators/regenerators to enhance the performance of the baseline system, (2) development of an off-design computer program for system performance prediction, (3) optimization of the turbocompressor design to cover a broad range of conditions and permit operation at low heat source water temperatures, (4) generation of parametric data describing system performance (COP and capacity), (5) development and evaluation of candidate system augmentation concepts and selection of the optimum approach, (6) generation of auxiliary power requirement data, (7) development of a complete solar collector-thermal storage-air conditioner computer program, (8) evaluation of the baseline Rankine air conditioner over a five day period simulating the NASA solar house operation, and (9) evaluation of the air conditioner as a heat pump.

  19. Baseline Vascular Cognitive Impairment Predicts the Course of Apathetic Symptoms After Stroke: The CASPER Study.

    PubMed

    Douven, Elles; Köhler, Sebastian; Schievink, Syenna H J; van Oostenbrugge, Robert J; Staals, Julie; Verhey, Frans R J; Aalten, Pauline

    2018-03-01

    To examine the influence of vascular cognitive impairment (VCI) on the course of poststroke depression (PSD) and poststroke apathy (PSA). Included were 250 stroke patients who underwent neuropsychological and neuropsychiatric assessment 3 months after stroke (baseline) and at a 6- and 12-month follow-up after baseline. Linear mixed models tested the influence of VCI in at least one cognitive domain (any VCI) or multidomain VCI (VCI in multiple cognitive domains) at baseline and domain-specific VCI at baseline on levels of depression and apathy over time, with random effects for intercept and slope. Almost half of the patients showed any VCI at baseline, and any VCI was associated with increasing apathy levels from baseline to the 12-month follow-up. Patients with multidomain VCI had higher apathy scores at the 6- and 12-month follow-up compared with patients with VCI in a single cognitive domain. Domain-specific analyses showed that impaired executive function and slowed information processing speed went together with increasing apathy levels from baseline to 6- and 12-month follow-up. None of the cognitive variables predicted the course of depressive symptoms. Baseline VCI is associated with increasing apathy levels from baseline to the chronic stroke phase, whereas no association was found between baseline VCI and the course of depressive symptoms. Health professionals should be aware that apathy might be absent early after stroke but may evolve over time in patients with VCI. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Unraveling the insight paradox: One-year longitudinal study on the relationships between insight, self-stigma, and life satisfaction among people with schizophrenia spectrum disorders.

    PubMed

    Chio, Floria H N; Mak, Winnie W S; Chan, Randolph C H; Tong, Alan C Y

    2018-01-30

    The promotion of insight among people with schizophrenia spectrum disorders has posed a dilemma to service providers, as higher insight has been linked to positive clinical outcomes but negative psychological outcomes. The negative meaning that people attach to the illness (self-stigma content) and the recurrence of such stigmatizing thoughts (self-stigma process) may explain why increased insight is associated with negative outcomes. The present study examined how the presence of high self-stigma content and self-stigma process may contribute to the negative association between insight and life satisfaction. A total of 181 people with schizophrenia spectrum disorders were assessed at baseline; 130 and 110 participants were retained and completed questionnaires at the 6-month and 1-year follow-ups, respectively. Results showed that baseline insight was associated with lower life satisfaction at the 6-month follow-up when self-stigma process or self-stigma content was high. Furthermore, baseline insight was predictive of better life satisfaction at the 1-year follow-up when self-stigma process was low. Findings suggested that the detrimental effects of insight can result from both the presence of self-stigma content and the habitual process of self-stigma. Future insight-promotion interventions should also address self-stigma content and process among people with schizophrenia spectrum disorders so as to maximize the beneficial effects of insight. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. Promoting dietary change among state health employees in Arkansas through a worksite wellness program: the Healthy Employee Lifestyle Program (HELP).

    PubMed

    Perez, Amanda Philyaw; Phillips, Martha M; Cornell, Carol E; Mays, Glen; Adams, Becky

    2009-10-01

    Maintaining a healthy and productive workforce is essential for employers in public and private sectors. Poor nutrition and obesity contribute to chronic diseases and influence health care costs and productivity. Research indicates that eating a healthy diet is associated with lower body mass index and reduced risk for developing chronic disease. The Arkansas Department of Health implemented the Healthy Employee Lifestyle Program to encourage wellness among state health employees. During the pilot year, participants completed a health risk assessment at baseline and again after 1 year that assessed diet and physical activity, other health risk factors, and readiness to make behavioral changes. Participants were encouraged to eat healthfully, participate in regular exercise, report health behaviors using a Web-based reporting system, accumulate points for healthy behaviors, and redeem points for incentives. Differences in participants' (n = 214) reported dietary behaviors between baseline and follow-up were assessed using chi-square analyses and tests of symmetry. Consumption of sweets/desserts, fats, protein, grains, processed meats, and dairy did not differ significantly from baseline to follow-up. However, at follow-up more participants reported eating 3 or more fruits and vegetables per day than at baseline and being in the action and maintenance stages of readiness to change for eating 5 or more fruits and vegetables per day and for eating a diet low in fat. Further study is needed to examine physical activity and other health risk factors to determine whether the program merits a broader dissemination.

  2. TAPIR--Finnish national geochemical baseline database.

    PubMed

    Jarva, Jaana; Tarvainen, Timo; Reinikainen, Jussi; Eklund, Mikael

    2010-09-15

    In Finland, a Government Decree on the Assessment of Soil Contamination and Remediation Needs has generated a need for reliable and readily accessible data on geochemical baseline concentrations in Finnish soils. According to the Decree, baseline concentrations, referring both to the natural geological background concentrations and the diffuse anthropogenic input of substances, shall be taken into account in the soil contamination assessment process. This baseline information is provided in a national geochemical baseline database, TAPIR, that is publicly available via the Internet. Geochemical provinces with elevated baseline concentrations were delineated to provide regional geochemical baseline values. The nationwide geochemical datasets were used to divide Finland into geochemical provinces. Several metals (Co, Cr, Cu, Ni, V, and Zn) showed anomalous concentrations in seven regions that were defined as metal provinces. Arsenic did not follow a similar distribution to any other elements, and four arsenic provinces were separately determined. Nationwide geochemical datasets were not available for some other important elements such as Cd and Pb. Although these elements are included in the TAPIR system, their distribution does not necessarily follow the ones pre-defined for metal and arsenic provinces. Regional geochemical baseline values, presented as the upper limit of geochemical variation within the region, can be used as trigger values to assess potential soil contamination. Baseline values have also been used to determine upper and lower guideline values that must be taken into account as a tool in basic risk assessment. If regional geochemical baseline values are available, the national guideline values prescribed in the Decree based on ecological risks can be modified accordingly. The national geochemical baseline database provides scientifically sound, easily accessible and generally accepted information on the baseline values, and it can be used in various environmental applications. Copyright 2010 Elsevier B.V. All rights reserved.
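
    The Decree's statistical definition of the regional baseline is not spelled out in this record. As a hedged illustration of how an "upper limit of geochemical variation" can serve as a trigger value, the Python sketch below takes an upper percentile of the concentrations measured in each province and flags new samples that exceed it; the province names, element, percentile, and values are hypothetical.

      import numpy as np

      # Hypothetical arsenic concentrations (mg/kg) for two geochemical provinces.
      provinces = {"province_A": np.array([2.1, 3.4, 1.8, 2.9, 4.0, 2.5, 3.1, 2.2]),
                   "province_B": np.array([8.5, 12.0, 9.3, 15.2, 10.1, 11.7, 9.8, 13.4])}

      # Regional baseline taken as an upper percentile of the regional distribution (assumed 95th).
      trigger = {name: float(np.percentile(vals, 95)) for name, vals in provinces.items()}

      new_samples = {"province_A": 5.2, "province_B": 11.0}
      for name, value in new_samples.items():
          status = "exceeds baseline" if value > trigger[name] else "within baseline"
          print(f"{name}: {value} mg/kg vs trigger {trigger[name]:.1f} mg/kg -> {status}")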

  3. Explaining adolescent exercise behavior change: a longitudinal application of the transtheoretical model.

    PubMed

    Nigg, C R

    2001-01-01

    The developmental decline and benefits of exercise are documented; however, relatively little is known about the mechanisms and motivations underlying adolescent exercise behavior. This project investigates which variables drive exercise, or are a consequence thereof, within the Transtheoretical Model (TTM). Baseline questionnaires (N = 819) were collected through 5 Canadian high schools. For this longitudinal investigation, all baseline participants were approached for a 3-year follow-up. Follow-up questionnaire completers (n = 400; mean baseline age = 14.89 years, SD = 1.15; mean follow-up age = 17.62 years, SD = 1.18) did not differ from noncompleters (n = 419) on any baseline variable except sex (54.75% and 43.68% females, respectively; p < .003). Stages, processes, self-efficacy, and pros and cons of exercise from the TTM, and self-reported exercise, were assessed. Panel analyses revealed that although the directions of the relations were as hypothesized, the processes did not significantly lead to exercise or vice versa. As hypothesized, exercise leads to self-efficacy and to pros and cons, showing that the TTM can serve as a framework to understand adolescent exercise behavior. Future research needs to incorporate shorter assessment intervals and use larger samples to be able to look at adjacent stage transitions.

  4. Sexual-perception processes in acquaintance-targeted sexual aggression.

    PubMed

    Treat, Teresa A; Viken, Richard J

    2018-05-01

    This study analyzes data from seven published studies to examine whether three performance-based indices of men's misperception of women's sexual interest (MSI), derived from a self-report questionnaire, are associated with sexual-aggression history, rape-supportive attitudes, sociosexuality, problem drinking, and self-reported MSI. Almost 2000 undergraduate men judged the justifiability of a man's increasingly unwanted advances toward a woman on the Heterosocial Perception Survey-Revised. Participants self-reported any sexual-aggression history, and some completed questionnaires assessing rape-supportive attitudes, sociosexuality, problem drinking, and self-reported MSI. A three-parameter logistic function was fitted to participants' justifiability ratings within a non-linear mixed-effects framework, which provided precise participant-specific estimates of three sexual-perception processes (baseline justifiability, bias, and sensitivity). Sexual-aggression history and rape-supportive attitudes predicted: (a) reduced sensitivity to women's affect; (b) more liberal biases, such that the woman's affect had to be more negative before justifiability ratings dropped substantially; and (c) greater baseline justifiability of continued advances after a positive response. Sexual-aggression history and attitudes correlated more strongly with sensitivity than baseline justifiability; remaining variables showed the opposite pattern. This work underscores the role of sexual-perception processes in sexual aggression and illustrates the derivation of performance-based estimates of sexual-perception processes from questionnaire responses. © 2018 Wiley Periodicals, Inc.
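
    The full analysis was a non-linear mixed-effects model across participants. As a simpler, hedged illustration, the Python sketch below fits a three-parameter logistic curve to a single hypothetical participant's justifiability ratings as a function of the woman's affect, yielding participant-level baseline (upper asymptote), bias (midpoint), and sensitivity (slope) estimates analogous to those described; the data and the exact parameterization are invented.

      import numpy as np
      from scipy.optimize import curve_fit

      def three_param_logistic(affect, baseline, bias, sensitivity):
          """Rises toward `baseline` as affect becomes more positive; `bias` marks the midpoint."""
          return baseline / (1.0 + np.exp(-sensitivity * (affect - bias)))

      # Hypothetical data: affect coded from -3 (clearly negative) to +3 (clearly positive),
      # and one participant's justifiability ratings on a 0-6 scale.
      affect = np.array([-3.0, -2.0, -1.0, 0.0, 1.0, 2.0, 3.0])
      rating = np.array([0.4, 0.8, 2.0, 4.1, 5.2, 5.6, 5.8])

      (b0, bias, sens), _ = curve_fit(three_param_logistic, affect, rating, p0=[6.0, 0.0, 1.0])
      print(f"baseline={b0:.2f}, bias={bias:.2f}, sensitivity={sens:.2f}")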

  5. The Great East Japan Earthquake: a need to plan for post-disaster surveillance in developed countries

    PubMed Central

    Matsui, Tamano; Partridge, Jeffrey; Kasai, Takeshi

    2011-01-01

    After a devastating earthquake and tsunami struck north-eastern Japan in March 2011, the public health system, including the infectious disease surveillance system, was severely compromised. While models for post-disaster surveillance exist, they focus predominantly on developing countries during the early recovery phase. Such models do not necessarily apply to developed countries, which differ considerably in their baseline surveillance systems. Furthermore, there is a need to consider the process by which a surveillance system recovers post-disaster. The event in Japan has highlighted a need to address these concerns surrounding post-disaster surveillance in developed countries. In May 2011, the World Health Organization convened a meeting where post-disaster surveillance was discussed by experts and public health practitioners. In this paper, we describe a post-disaster surveillance approach that was discussed at the meeting, based on what had actually occurred and what may have been, or would be, ideal. Briefly, we describe the evolution of a surveillance system as it returns to the pre-existing system, starting from an event-based approach during the emergency relief phase, a syndromic approach during the early recovery phase, an enhanced sentinel approach during the late recovery phase and a return to baseline during the development phase. Our aim is not to recommend a specific model but to encourage other developed countries to initiate their own discussions on post-disaster surveillance and develop plans according to their needs and capacities. As natural disasters will continue to occur, we hope that developing such plans during the “inter-disaster” period will help mitigate the surveillance challenges that will arise post-disaster. PMID:23908893

  6. The ankle brachial index and change in lower extremity functioning over time: the Women's Health and Aging Study.

    PubMed

    McDermott, Mary McGrae; Ferrucci, Luigi; Simonsick, Eleanor M; Balfour, Jennifer; Fried, Linda; Ling, Shari; Gibson, Daniel; Guralnik, Jack M

    2002-02-01

    To define the association between baseline ankle brachial index (ABI) level and subsequent onset of severe disability. Prospective cohort study. Baltimore community. Eight hundred forty-seven disabled women aged 65 and older participating in the Women's Health and Aging Study. At baseline, participants underwent measurement of ABI and lower extremity functioning. Measures of lower extremity functioning included patient's report of their ability to walk one-quarter of a mile, number of city blocks walked last week, number of stair flights climbed last week, and performance-based measures including walking speed over 4 meters, five repeated chair stands, and a summary performance score. Functioning was remeasured every 6 months for 3 years. Definitions of severe disability were developed a priori, and participants who met these definitions at baseline were excluded from subsequent analyses. Participants with an ABI of less than 0.60 at baseline had significantly higher cumulative probabilities of developing severe disability than participants with a baseline ABI of 0.90 to 1.50 for walking-specific outcomes (ability to walk a quarter of a mile, number of city blocks walked last week, and walking velocity) but not for the remaining functional outcomes. In age-adjusted Cox proportional hazards analyses, hazard ratios for participants with a baseline ABI of less than 0.60 were 1.63 for becoming unable to walk a quarter of a mile (P = .044), 2.00 for developing severe disability in the number of blocks walked last week (P = .004), and 1.61 for developing severe disability in walking speed (P = .041), compared with participants with a baseline ABI of 0.90 to 1.50. Adjusting for age, race, baseline performance, and comorbidities, an ABI of less than 0.60 remained associated with becoming severely disabled in the number of blocks walked last week (hazard ratio = 1.97, P = .009) and nearly significantly associated with becoming unable to walk a quarter of a mile (hazard ratio = 1.54, P = .09). In fully adjusted random effects models, a baseline ABI of less than 0.60 was associated with significantly greater decline in walking speed per year (P = .019) and nearly significantly greater decline in number of blocks walked last week per year (P = .053) compared with a baseline ABI of 0.90 to 1.50. In community-dwelling disabled older women, a low ABI is associated with a greater incidence of severe disability in walking-specific but not other lower extremity functional outcomes, compared with persons with a normal ABI over 3 years.

  7. Development and pilot testing of a culturally sensitive multimedia program to improve breast cancer screening in Latina women.

    PubMed

    Goel, Mita Sanghavi; Gracia, Gaby; Baker, David W

    2011-07-01

    Our study goal was to assess the effects of a brief patient video on breast cancer knowledge and attitudes among Latina women at a community health center. We conducted pre- and post-testing of knowledge and attitudes in women aged 40 years or older with active screening referrals (n=91). We compared pre- and post-test knowledge and attitudes overall and by baseline values. Mean knowledge increased from 5.8/10 to 6.9/10 (p<0.05), with the greatest increases in those with low baseline knowledge (p<.001). There were no changes in mean attitudes, which were high at baseline (3.8/5); however, among the 16 women with negative/neutral attitudes, 50% developed positive attitudes after watching the video (p<0.05). Baseline intention to complete screening was high at 98%. Although the overall effects were modest, the greatest improvements were in those with low baseline knowledge scores and negative/neutral baseline attitudes. Future testing should examine the effects in a community-based sample. A brief patient video has promise for influencing patient knowledge and perhaps attitudes while being amenable to integration into clinical flow. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  8. Hydratools, a MATLAB® based data processing package for Sontek Hydra data

    USGS Publications Warehouse

    Martini, M.; Lightsom, F.L.; Sherwood, C.R.; Xu, Jie; Lacy, J.R.; Ramsey, A.; Horwitz, R.

    2005-01-01

    The U.S. Geological Survey (USGS) has developed a set of MATLAB tools to process and convert data collected by Sontek Hydra instruments to netCDF, which is a format used by the USGS to process and archive oceanographic time-series data. The USGS makes high-resolution current measurements within 1.5 meters of the bottom. These data are used in combination with other instrument data from sediment transport studies to develop sediment transport models. Instrument manufacturers provide software which outputs unique binary data formats. Multiple data formats are cumbersome. The USGS solution is to translate data streams into a common data format: netCDF. The Hydratools toolbox is written to create netCDF format files following EPIC conventions, complete with embedded metadata. Data are accepted from both the ADV and the PCADP. The toolbox will detect and remove bad data, substitute other sources of heading and tilt measurements if necessary, apply ambiguity corrections, calculate statistics, return information about data quality, and organize metadata. Standardized processing and archiving makes these data more easily and routinely accessible locally and over the Internet. In addition, documentation of the techniques used in the toolbox provides a baseline reference for others utilizing the data.
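
    For readers unfamiliar with the target format, the short Python sketch below writes a minimal time-series netCDF file with embedded metadata in the spirit of the toolbox's output; Hydratools itself is written in MATLAB, and the variable names, attributes, and values here are illustrative assumptions rather than the toolbox's actual EPIC conventions.

    import numpy as np
    from netCDF4 import Dataset

    ds = Dataset("hydra_timeseries.nc", "w", format="NETCDF4")
    ds.title = "Near-bottom current time series"       # embedded global metadata
    ds.instrument = "Sontek Hydra ADV"

    ds.createDimension("time", None)                   # unlimited record dimension
    time = ds.createVariable("time", "f8", ("time",))
    time.units = "minutes since 2005-01-01 00:00:00"

    vel = ds.createVariable("u_east", "f4", ("time",), fill_value=1e35)
    vel.units = "cm/s"
    vel.long_name = "eastward current velocity"

    time[:] = np.arange(0, 60, 10)
    vel[:] = np.array([3.2, 3.4, 1e35, 3.1, 3.3, 3.0])   # bad sample replaced by fill value
    ds.close()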

  9. Scale-up and process integration of sugar production by acidolysis of municipal solid waste/corn stover blends in ionic liquids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Chenlin; Liang, Ling; Sun, Ning

    The study presents the successful scale-up demonstration of acid-assisted IL deconstruction of feedstock blends of municipal solid waste and agricultural residues (corn stover) by 30-fold relative to the bench scale (6 L vs 0.2 L), at 10% solid loading. By integrating IL pretreatment and acid hydrolysis with subsequent centrifugation and extraction, the sugar and lignin products can be recovered efficiently. This scale-up development at the Advanced Biofuels/Bioproducts Process Demonstration Unit (ABPDU) leverages synergistic efforts toward a cost-effective IL-based deconstruction technology by eliminating enzymes, reducing water usage, and simplifying downstream sugar/lignin recovery and IL recycling. Results indicate that MSW blends are a viable and valuable resource to consider when assessing biomass availability and affordability for lignocellulosic biorefineries. This scale-up evaluation demonstrates that acid-assisted IL deconstruction can be effectively scaled up to larger operations, and the current study established a baseline of scaling parameters for the process.

  10. Scale-up and process integration of sugar production by acidolysis of municipal solid waste/corn stover blends in ionic liquids

    DOE PAGES

    Li, Chenlin; Liang, Ling; Sun, Ning; ...

    2017-01-05

    The study presents the successful scale-up demonstration of acid-assisted IL deconstruction of feedstock blends of municipal solid waste and agricultural residues (corn stover) by 30-fold relative to the bench scale (6 L vs 0.2 L), at 10% solid loading. By integrating IL pretreatment and acid hydrolysis with subsequent centrifugation and extraction, the sugar and lignin products can be recovered efficiently. This scale-up development at the Advanced Biofuels/Bioproducts Process Demonstration Unit (ABPDU) leverages synergistic efforts toward a cost-effective IL-based deconstruction technology by eliminating enzymes, reducing water usage, and simplifying downstream sugar/lignin recovery and IL recycling. Results indicate that MSW blends are a viable and valuable resource to consider when assessing biomass availability and affordability for lignocellulosic biorefineries. This scale-up evaluation demonstrates that acid-assisted IL deconstruction can be effectively scaled up to larger operations, and the current study established a baseline of scaling parameters for the process.

  11. The design of a community-based health education intervention for the control of Aedes aegypti.

    PubMed

    Lloyd, L S; Winch, P; Ortega-Canto, J; Kendall, C

    1994-04-01

    This report describes the process used to develop locally appropriate educational materials and to implement the education component of a community-based Aedes aegypti control program in Merida, Yucatan, Mexico. The process is broken into five stages: formative research, developing recommendations for behavior change, development of educational messages, development and production of educational materials, and distribution of the materials. Appropriate terminology and taxonomies for dengue were obtained from open in-depth interviews; baseline data from a knowledge, beliefs, and practices questionnaire served to confirm this information. A larval survey of house lots was carried out to identify the Ae. aegypti larval production sites found on individual house lots. This enabled the program to target the most important larval habitats. Community groups were organized to work on the development of messages and production of the educational materials to be used. The education intervention was successful in stimulating changes in both knowledge and behavior, which were measured in the evaluations of the intervention. To be successful, community-based strategies must be flexible and adapted to the local setting because of ecologic, cultural, and social differences between localities.

  12. Co-development of Problem Gambling and Depression Symptoms in Emerging Adults: A Parallel-Process Latent Class Growth Model.

    PubMed

    Edgerton, Jason D; Keough, Matthew T; Roberts, Lance W

    2018-02-21

    This study examines whether there are multiple joint trajectories of depression and problem gambling co-development in a sample of emerging adults. Data were from the Manitoba Longitudinal Study of Young Adults (n = 679), which was collected in 4 waves across 5 years (age 18-20 at baseline). Parallel process latent class growth modeling was used to identify 5 joint trajectory classes: low decreasing gambling, low increasing depression (81%); low stable gambling, moderate decreasing depression (9%); low stable gambling, high decreasing depression (5%); low stable gambling, moderate stable depression (3%); moderate stable problem gambling, no depression (2%). There was no evidence of reciprocal growth in problem gambling and depression in any of the joint classes. Multinomial logistic regression analyses of baseline risk and protective factors found that only neuroticism, escape-avoidance coping, and perceived level of family social support were significant predictors of joint trajectory class membership. Consistent with the pathways model framework, we observed that individuals in the problem gambling only class were more likely to be using gambling as a stable way to cope with negative emotions. Similarly, high levels of neuroticism and low levels of family support were associated with increased odds of being in a class with moderate to high levels of depressive symptoms (but low gambling problems). The results suggest that interventions for problem gambling and/or depression need to focus on promoting more adaptive coping skills among more "at-risk" young adults, and such interventions should be tailored in relation to specific subtypes of comorbid mental illness.

  13. The challenges of quantitative evaluation of a multi-setting, multi-strategy community-based childhood obesity prevention programme: lessons learnt from the eat well be active Community Programs in South Australia.

    PubMed

    Wilson, Annabelle M; Magarey, Anthea M; Dollman, James; Jones, Michelle; Mastersson, Nadia

    2010-08-01

    To describe the rationale, development and implementation of the quantitative component of evaluation of a multi-setting, multi-strategy, community-based childhood obesity prevention project (the eat well be active (ewba) Community Programs) and the challenges associated with this process and some potential solutions. ewba has a quasi-experimental design with intervention and comparison communities. Baseline data were collected in 2006 and post-intervention measures will be taken from a non-matched cohort in 2009. Schoolchildren aged 10-12 years were chosen as one litmus group for evaluation purposes. Thirty-nine primary schools in two metropolitan and two rural communities in South Australia participated. A total of 1732 10-12-year-old school students completed a nutrition and/or a physical activity questionnaire and 1637 had anthropometric measures taken; 983 parents, 286 teachers, thirty-six principals, twenty-six canteen and thirteen out-of-school-hours care (OSHC) workers completed Program-specific questionnaires developed for each of these target groups. The overall child response rate for the study was 49%. Sixty-five per cent, 43%, 90%, 90% and 68% of parents, teachers, principals, canteen and OSHC workers, respectively, completed and returned questionnaires. A number of practical, logistical and methodological challenges were experienced when undertaking this data collection. Lessons learnt from the process of quantitative baseline data collection for the ewba Community Programs can provide insights for other researchers planning similar studies with similar methods, particularly those evaluating multi-strategy programmes across multiple settings.

  14. Baseline study and risk analysis of landfill leachate - Current state-of-the-science of computer aided approaches.

    PubMed

    Butt, T E; Alam, A; Gouda, H M; Paul, P; Mair, N

    2017-02-15

    For the successful completion of a risk analysis process, its foundation (i.e. a baseline study) has to be well established. For this purpose, a baseline study needs to be more integrated than ever, particularly when environmental legislation is increasingly becoming stringent and integrated. This research investigates and concludes that no clear evidence of computer models for baseline study has been found in a whole-system and integrated format, which risk assessors could readily and effectively use to underpin risk analyses holistically and yet specifically for landfill leachate. This is established on the basis of investigation of software packages that are particularly closely related to landfills. Holistic baseline study is also defined along with its implications and in the context of risk assessment of landfill leachate. The study also indicates a number of factors and features that need to be added to baseline study in order to render it more integrated thereby enhancing risk analyses for landfill leachate. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Detecting Molecular Features of Spectra Mainly Associated with Structural and Non-Structural Carbohydrates in Co-Products from BioEthanol Production Using DRIFT with Uni- and Multivariate Molecular Spectral Analyses

    PubMed Central

    Yu, Peiqiang; Damiran, Daalkhaijav; Azarfar, Arash; Niu, Zhiyuan

    2011-01-01

    The objective of this study was to use DRIFT spectroscopy with uni- and multivariate molecular spectral analyses as a novel approach to detect molecular features of spectra mainly associated with carbohydrates in the co-products (wheat DDGS, corn DDGS, blend DDGS) from bioethanol processing in comparison with original feedstock (wheat (Triticum), corn (Zea mays)). The carbohydrate-related molecular spectral bands included: A_Cell (structural carbohydrates, peak area region and baseline: ca. 1485–1188 cm−1), A_1240 (structural carbohydrates, peak area centered at ca. 1240 cm−1 with region and baseline: ca. 1292–1198 cm−1), A_CHO (total carbohydrates, peak region and baseline: ca. 1187–950 cm−1), A_928 (non-structural carbohydrates, peak area centered at ca. 928 cm−1 with region and baseline: ca. 952–910 cm−1), A_860 (non-structural carbohydrates, peak area centered at ca. 860 cm−1 with region and baseline: ca. 880–827 cm−1), H_1415 (structural carbohydrate, peak height centered at ca. 1415 cm−1 with baseline: ca. 1485–1188 cm−1), H_1370 (structural carbohydrate, peak height at ca. 1370 cm−1 with a baseline: ca. 1485–1188 cm−1). The study shows that the grains had lower spectral intensity (KM Unit) of the cellulosic compounds of A_1240 (8.5 vs. 36.6, P < 0.05) and higher (P < 0.05) intensities of the non-structural carbohydrates A_928 (17.3 vs. 2.0) and A_860 (20.7 vs. 7.6) than their co-products from bioethanol processing. There were no differences (P > 0.05) in the peak area intensities of A_Cell (structural CHO) at 1292–1198 cm−1 and A_CHO (total CHO) at 1187–950 cm−1, with average molecular infrared intensities (KM unit) of 226.8 and 508.1, respectively. There were no differences (P > 0.05) in the peak height intensities of H_1415 and H_1370 (structural CHOs), with average intensities of 1.35 and 1.15, respectively. The multivariate molecular spectral analyses were able to discriminate and classify between the corn and corn DDGS molecular spectra, but not between wheat and wheat DDGS. This study indicated that bioethanol processing changes carbohydrate molecular structural profiles compared with the original grains. However, the sensitivities of different types of carbohydrates and different grains (corn and wheat) to the processing differ. In general, the bioethanol processing increases the molecular spectral intensities for the structural carbohydrates and decreases the intensities for the non-structural carbohydrates. Further study is needed to quantify carbohydrate-related molecular spectral features of the bioethanol co-products in relation to nutrient supply and availability of carbohydrates. PMID:21673931
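
    As a schematic illustration of how a band-area measure such as A_1240 can be computed, the Python sketch below integrates a peak above a straight baseline drawn across its wavenumber window; the spectrum is synthetic and only the window endpoints are taken from the definition above, so none of the numbers correspond to the study's data.

    import numpy as np

    def band_area(wavenumber, absorbance, lo=1198.0, hi=1292.0):
        """Peak area above a straight baseline over the [lo, hi] cm^-1 window."""
        mask = (wavenumber >= lo) & (wavenumber <= hi)
        x, y = wavenumber[mask], absorbance[mask]
        # Linear baseline joining the two endpoints of the window.
        baseline = np.interp(x, [x[0], x[-1]], [y[0], y[-1]])
        net = y - baseline
        # Trapezoidal integration of the baseline-corrected signal.
        return float(np.sum((net[1:] + net[:-1]) * np.diff(x)) / 2.0)

    # Synthetic spectrum with a band near 1240 cm^-1 (not measured data).
    wn = np.linspace(1190.0, 1300.0, 221)
    spec = 0.02 + 0.3 * np.exp(-((wn - 1240.0) ** 2) / (2.0 * 12.0 ** 2))
    print(f"A_1240 (illustrative) = {band_area(wn, spec):.2f}")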

  16. Semantic Memory in the Clinical Progression of Alzheimer Disease.

    PubMed

    Tchakoute, Christophe T; Sainani, Kristin L; Henderson, Victor W

    2017-09-01

    Semantic memory measures may be useful in tracking and predicting progression of Alzheimer disease. We investigated relationships among semantic memory tasks and their 1-year predictive value in women with Alzheimer disease. We conducted secondary analyses of a randomized clinical trial of raloxifene in 42 women with late-onset mild-to-moderate Alzheimer disease. We assessed semantic memory with tests of oral confrontation naming, category fluency, semantic recognition and semantic naming, and semantic density in written narrative discourse. We measured global cognition (Alzheimer Disease Assessment Scale, cognitive subscale), dementia severity (Clinical Dementia Rating sum of boxes), and daily function (Activities of Daily Living Inventory) at baseline and 1 year. At baseline and 1 year, most semantic memory scores correlated highly or moderately with each other and with global cognition, dementia severity, and daily function. Semantic memory task performance at 1 year had worsened one-third to one-half standard deviation. Factor analysis of baseline test scores distinguished processes in semantic and lexical retrieval (semantic recognition, semantic naming, confrontation naming) from processes in lexical search (semantic density, category fluency). The semantic-lexical retrieval factor predicted global cognition at 1 year. Considered separately, baseline confrontation naming and category fluency predicted dementia severity, while semantic recognition and a composite of semantic recognition and semantic naming predicted global cognition. No individual semantic memory test predicted daily function. Semantic-lexical retrieval and lexical search may represent distinct aspects of semantic memory. Semantic memory processes are sensitive to cognitive decline and dementia severity in Alzheimer disease.

  17. Impact of body fat percentage change on future diabetes in subjects with normal glucose tolerance.

    PubMed

    Zhao, Tianxue; Lin, Ziwei; Zhu, Hui; Wang, Chen; Jia, Weiping

    2017-12-01

    The aim of this work was to determine the effect of body fat change on the risk of diabetes in a normal glucose tolerance (NGT) population. A total of 1,857 NGT subjects were included and followed up for an average period of 44.57 months. Body fat percentage (BF%) was measured by bioelectrical impedance analysis. Subjects were grouped based on BF% and/or body mass index (BMI) status. Among all subjects, 28 developed diabetes during follow-up. Compared with subjects with stable normal BF% (control), subjects who became obese at follow-up had defects in insulin secretion and a higher risk of developing diabetes (7.102, 95% confidence interval [CI] 1.740-28.993), while no difference in diabetes risk was observed between control subjects and subjects with abnormal BF% at baseline but normal BF% at the end of follow-up, after adjustment for confounding factors. Moreover, compared with those with normal BF% and BMI at both baseline and follow-up, subjects who had normal BMI at baseline and follow-up but abnormal BF% at baseline and/or follow-up still had a higher risk of developing diabetes (4.790, 95% CI 1.061-21.621), while those with normal BF% at baseline and follow-up but abnormal BMI at baseline and/or follow-up did not. Progression from normal BF% at baseline to obesity at follow-up was associated with an increased risk of diabetes. Maintaining normal body fat is more relevant than BMI in preventing diabetes. © 2017 IUBMB Life, 69(12):947-955, 2017. © 2017 International Union of Biochemistry and Molecular Biology.

  18. Association between Blood Omega-3 Index and Cognition in Typically Developing Dutch Adolescents

    PubMed Central

    van der Wurff, Inge S. M.; von Schacky, Clemens; Berge, Kjetil; Zeegers, Maurice P.; Kirschner, Paul A.; de Groot, Renate H. M.

    2016-01-01

    The impact of omega-3 long-chain polyunsaturated fatty acids (LCPUFAs) on cognition is heavily debated. In the current study, the possible association between omega-3 LCPUFAs in blood and cognitive performance of 266 typically developing adolescents aged 13–15 years is investigated. Baseline data from Food2Learn, a double-blind, randomized, placebo-controlled krill oil supplementation trial in typically developing adolescents, were used for the current study. The Omega-3 Index was determined with blood from a finger prick. At baseline, participants completed a neuropsychological test battery consisting of the Letter Digit Substitution Test (LDST), D2 test of attention, Digit Span Forward and Backward, Concept Shifting Test and Stroop test. Data were analyzed with multiple regression analyses with correction for covariates. The average Omega-3 Index was 3.83% (SD 0.60). Regression analyses between the Omega-3 Index and the outcome parameters revealed significant associations with scores on two of the nine parameters: the LDST score (β = 0.136, p = 0.039) and the number of errors of omission on the D2 (β = −0.053, p = 0.007). This possibly indicates higher information processing speed and less impulsivity in those with a higher Omega-3 Index. PMID:26729157
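
    A covariate-adjusted regression of the kind reported above can be sketched in Python with statsmodels as follows; the outcome, covariates, and values are invented placeholders, not Food2Learn data.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Invented placeholder data (not Food2Learn measurements).
    df = pd.DataFrame({
        "ldst": [38, 42, 45, 40, 47, 44, 39, 48],           # LDST items correct
        "omega3_index": [3.1, 3.6, 4.2, 3.4, 4.5, 4.0, 3.2, 4.8],
        "age": [13, 14, 15, 13, 14, 15, 14, 13],
        "sex": [0, 1, 0, 1, 0, 1, 0, 1],
    })

    # Outcome regressed on the Omega-3 Index with correction for covariates.
    fit = smf.ols("ldst ~ omega3_index + age + sex", data=df).fit()
    print(fit.params["omega3_index"], fit.pvalues["omega3_index"])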

  19. One year outcome of boys with Duchenne muscular dystrophy using the Bayley-III scales of infant and toddler development.

    PubMed

    Connolly, Anne M; Florence, Julaine M; Cradock, Mary M; Eagle, Michelle; Flanigan, Kevin M; McDonald, Craig M; Karachunski, Peter I; Darras, Basil T; Bushby, Kate; Malkus, Elizabeth C; Golumbek, Paul T; Zaidman, Craig M; Miller, J Philip; Mendell, Jerry R

    2014-06-01

    The pathogenesis of Duchenne muscular dystrophy starts before birth. Despite this, clinical trials exclude young boys because traditional outcome measures rely on cooperation. We recently used the Bayley-III Scales of Infant and Toddler Development to study 24 infants and boys with Duchenne muscular dystrophy. Clinical evaluators at six centers were trained and certified to perform the Bayley-III. Here, we report 6- and 12-month follow-up of two subsets of these boys. Nineteen boys (1.9 ± 0.8 years) were assessed at baseline and 6 months. Twelve boys (1.5 ± 0.8 years) were assessed at baseline, 6, and 12 months. Gross motor scores were lower at baseline compared with published controls (6.2 ± 1.7; normal 10 ± 3; P < 0.0001) and revealed a further declining trend to 5.7 ± 1.7 (P = 0.20) at 6 months. Repeated measures analysis of the 12 boys monitored for 12 months revealed that gross motor scores, again low at baseline (6.6 ± 1.7; P < 0.0001), declined at 6 months (5.9 ± 1.8) and further at 12 months (5.3 ± 2.0) (P = 0.11). Cognitive and language scores were lower at baseline compared with normal children (range, P = 0.002-<0.0001) and did not change significantly at 6 or 12 months (range, P = 0.89-0.09). Fine motor skills, also low at baseline, improved >1 year (P = 0.05). Development can reliably be measured in infants and young boys with Duchenne muscular dystrophy across time using the Bayley-III. Power calculations using these data reveal that motor development may be used as an outcome measure. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. UNIX programmer's environment and configuration control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnold, T.R.; Wyatt, P.W.

    1993-12-31

    A package of UNIX utilities has been developed which unifies the advantages of the public domain utility "imake" and a configuration control system. The "imake" utility is portable: it allows a user to make Makefiles on a wide variety of platforms without worrying about the machine-dependent idiosyncrasies of the UNIX utility "make". Makefiles are a labor-saving device for compiling and linking complicated programs, and "imake" is a labor-saving device for making Makefiles, as well as other useful software (like a program's internal dependencies on included files). This "Environment", which has been developed around "imake", allows a programmer to manage a complicated project consisting of multiple executables, each of which may link with multiple user-created libraries. The configuration control aspect consists of a directory hierarchy (a baseline) which is mirrored in a developer's workspace. The workspace includes a minimum of files copied from the baseline; it employs soft links into the baseline wherever possible. The utilities are a multi-tiered suite of Bourne shell scripts to copy or check out sources, check them back in, import new sources (sources which are not in the baseline) and link them appropriately, create new low-level directories and link them, compare with the baseline, update Makefiles with minimal effort, and handle dependencies. The directory hierarchy utilizes a single source repository, which is mirrored in the baseline and in a workspace for several platform architectures. The system was originally written to support C code on Sun-4s and RS6000s. It has now been extended to support FORTRAN as well as C on SGI and Cray YMP platforms as well as Sun-4s and RS6000s.
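
    The workspace/baseline mirroring idea lends itself to a compact sketch. The fragment below, written in Python rather than the Bourne shell scripts the package actually uses, mirrors a baseline directory tree into a workspace with soft links and replaces a link with a private copy on check-out; the paths and function names are hypothetical.

    from pathlib import Path

    def make_workspace(baseline: Path, workspace: Path) -> None:
        """Mirror the baseline tree; soft-link every file not checked out."""
        for src in baseline.rglob("*"):
            dest = workspace / src.relative_to(baseline)
            if src.is_dir():
                dest.mkdir(parents=True, exist_ok=True)
            elif not dest.exists():
                dest.parent.mkdir(parents=True, exist_ok=True)
                dest.symlink_to(src)          # soft link into the baseline

    def check_out(baseline: Path, workspace: Path, rel_path: str) -> None:
        """Replace the soft link with a private, writable copy for editing."""
        dest = workspace / rel_path
        if dest.is_symlink():
            dest.unlink()
        dest.write_bytes((baseline / rel_path).read_bytes())

    # Hypothetical usage:
    # make_workspace(Path("/projects/baseline"), Path.home() / "work/myproj")
    # check_out(Path("/projects/baseline"), Path.home() / "work/myproj", "src/main.c")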

  1. Epstein-Barr and other herpesvirus infections in patients with early onset type 1 diabetes treated with daclizumab and mycophenolate mofetil.

    PubMed

    Loechelt, Brett J; Boulware, David; Green, Michael; Baden, Lindsey R; Gottlieb, Peter; Krause-Steinrauf, Heidi; Weinberg, Adriana

    2013-01-01

    We assessed the morbidity of herpesviruses in patients with type 1 diabetes mellitus (T1D) enrolled in immunosuppressive treatment studies. Epstein-Barr virus (EBV), cytomegalovirus (CMV), herpes simplex virus (HSV), and varicella zoster virus (VZV) infections were monitored in 126 participants of a randomized, double-blind, placebo-controlled study of daclizumab (DZB) and mycophenolate mofetil (MMF) including DZB(+)MMF(+), DZB(-)MMF(+), DZB(+)MMF(-), and DZB(-)MMF(-). During the 2-year follow-up, herpesviral infections were monitored clinically, by serology and blood DNA polymerase chain reaction. Among 57 baseline EBV-seronegative participants, 9 developed EBV primary infections, including 2 with infectious mononucleosis syndrome. There were no appreciable differences in the course of the primary EBV infections across treatment groups. Among 69 baseline EBV-seropositive participants, 22 had virologic reactivations, including 1 symptomatic DZB(-)MMF(+) subject. Compared with 7 DZB(-)MMF(-) EBV reactivators, the 9 DZB(+)MMF(+) reactivators tended to have more prolonged viremia (11.4 vs 4.4 months; P = .06) and higher cumulative viral burden (14.2 vs 12.5 log EBV copies/mL; P = .06). Four of 85 baseline CMV-seronegative subjects developed asymptomatic primary CMV infections. There were no CMV reactivations. Of 30 baseline HSV-seropositive subjects, 8 developed ≥1 episode of herpes labialis; 1 subject had a primary HSV infection; and 1 subject without baseline serology information had a new diagnosis of genital HSV. There were no significant differences in the incidence of HSV recurrences across treatment groups. Of 100 baseline VZV-seropositive subjects, 1 DZB(-)MMF(-) subject developed herpes zoster and 1 DZB(-)MMF(+) subject had Bell's palsy possibly related to VZV. The use of DZB alone or in combination with MMF was not associated with increased morbidity due to herpesviruses. NCT00100178.

  2. Development of an Impervious-Surface Database for the Little Blackwater River Watershed, Dorchester County, Maryland

    USGS Publications Warehouse

    Milheim, Lesley E.; Jones, John W.; Barlow, Roger A.

    2007-01-01

    Many agricultural and forested areas in proximity to National Wildlife Refuges (NWR) are under increasing economic pressure for commercial or residential development. The upper portion of the Little Blackwater River watershed - a 27 square mile area within largely low-lying Dorchester County, Maryland, on the eastern shore of the Chesapeake Bay - is important to the U.S. Fish and Wildlife Service (USFWS) because it flows toward the Blackwater National Wildlife Refuge (BNWR), and developmental impacts of areas upstream from the BNWR are unknown. One of the primary concerns for the Refuge is how storm-water runoff may affect living resources downstream. The Egypt Road project (fig. 1), for which approximately 600 residential units have been approved, has the potential to markedly change the land use and land cover on the west bank of the Little Blackwater River. In an effort to limit anticipated impacts, the Maryland Department of Natural Resources (Maryland DNR) recently decided to purchase some of the lands previously slated for development. Local topography, a high water table (typically 1 foot or less below the land surface), and hydric soils present a challenge for the best management of storm-water flow from developed surfaces. A spatial data coordination group was formed by the Dorchester County Soil and Conservation District to collect data to aid decisionmakers in watershed management and on the possible impacts of development on this watershed. Determination of streamflow combined with land cover and impervious-surface baselines will allow linking of hydrologic and geologic factors that influence the land surface. This baseline information will help planners, refuge managers, and developers discuss issues and formulate best management practices to mitigate development impacts on the refuge. In consultation with the Eastern Region Geospatial Information Office, the dataset selected to be that baseline land cover source was the June-July 2005 National Agricultural Imagery Program (NAIP) 1-meter resolution orthoimagery of Maryland. This publicly available, statewide dataset provided imagery corresponding to the closest in time to the installation of a U.S. Geological Survey (USGS) Water Resources Discipline gaging station on the Little Blackwater River. It also captures land cover status just before major residential development occurs. This document describes the process used to create a database of impervious surfaces for the Little Blackwater watershed.

  3. A novel approach for baseline correction in 1H-MRS signals based on ensemble empirical mode decomposition.

    PubMed

    Parto Dezfouli, Mohammad Ali; Dezfouli, Mohsen Parto; Rad, Hamidreza Saligheh

    2014-01-01

    Proton magnetic resonance spectroscopy ((1)H-MRS) is a non-invasive diagnostic tool for measuring biochemical changes in the human body. Acquired (1)H-MRS signals may be corrupted by a wideband baseline signal generated by macromolecules. Recently, several methods have been developed for the correction of such baseline signals; however, most of them are not able to estimate the baseline in complex, overlapped signals. In this study, a novel automatic baseline correction method based on ensemble empirical mode decomposition (EEMD) is proposed for (1)H-MRS spectra. The method was applied to both simulated data and in-vivo (1)H-MRS signals of the human brain. The results demonstrate the efficiency of the proposed method in removing the baseline from (1)H-MRS signals.
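
    To make the approach concrete, the sketch below decomposes a synthetic spectrum with EEMD and subtracts the slowest components as a baseline estimate; it assumes the PyEMD package (distributed as EMD-signal), and treating the last two modes as the baseline is an illustrative choice, not the selection rule proposed in the paper.

    import numpy as np
    from PyEMD import EEMD   # from the EMD-signal package (assumed available)

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 1024)
    peaks = np.exp(-((x - 0.3) ** 2) / 2e-4) + 0.6 * np.exp(-((x - 0.6) ** 2) / 1e-4)
    true_baseline = 0.5 * np.sin(2 * np.pi * 0.7 * x) + 0.3 * x   # slow, broad component
    signal = peaks + true_baseline + 0.02 * rng.standard_normal(x.size)

    eemd = EEMD(trials=50)
    imfs = eemd.eemd(signal, x)            # rows: fastest IMFs first, slowest trends last
    baseline_est = imfs[-2:].sum(axis=0)   # crude estimate: sum of the two slowest modes
    corrected = signal - baseline_est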

  4. Environmental baseline conditions for impact assessment of unconventional gas exploitation: the G-Baseline project

    NASA Astrophysics Data System (ADS)

    Kloppmann, Wolfram; Mayer, Berhard; Millot, Romain; Parker, Beth L.; Gaucher, Eric; Clarkson, Christopher R.; Cherry, John A.; Humez, Pauline; Cahill, Aaron

    2015-04-01

    A major scientific challenge and an indispensable prerequisite for environmental impact assessment in the context of unconventional gas development is the determination of the baseline conditions against which potential environmental impacts on shallow freshwater resources can be accurately and quantitatively tested. Groundwater and surface water resources overlying the low-permeability hydrocarbon host rocks containing shale gas may be impacted to different extents by naturally occurring saline fluids and by natural gas emanations. Baseline assessments in areas of previous conventional hydrocarbon production may also reveal anthropogenic impacts from these activities not related to unconventional gas development. Once unconventional gas exploitation has started, the baseline may be irrevocably lost by the intricate superposition of geogenic and potential anthropogenic contamination by stray gas, formation waters and chemicals used during hydraulic fracturing. The objective of the Franco-Canadian NSERC-ANR project G-Baseline is to develop an innovative and comprehensive methodology of geochemical and isotopic characterization of the environmental baseline for water and gas samples from all three essential zones: (1) the production zone, including flowback waters, (2) the intermediate zone composed of overlying formations, and (3) shallow aquifers and surface water systems where contamination may result from diverse natural or human impacts. The outcome will be the establishment of a methodology based on innovative tracer and monitoring techniques, including traditional and non-traditional isotopes (C, H, O, S, B, Sr, Cl, Br, N, U, Li, Cu, Zn, CSIA...) for detecting, quantifying and modeling potential leakage of stray gas and of saline formation water mixed with flowback fluids into fresh groundwater resources and surface waters, taking into account the pathways and mechanisms of fluid and gas migration. Here we present an outline of the project as well as first results from chemical and isotopic analyses on gas, fluid and solid samples collected during a baseline monitoring program at the Carbon Management Canada field research site in south-eastern Alberta, Canada.

  5. Association Between Coronary Artery Calcification and the Hemoglobin Glycation Index: The Kangbuk Samsung Health Study.

    PubMed

    Rhee, Eun-Jung; Cho, Jung-Hwan; Kwon, Hyemi; Park, Se Eun; Park, Cheol-Young; Oh, Ki-Won; Park, Sung-Woo; Lee, Won-Young

    2017-12-01

    The hemoglobin glycation index (HGI) is known to be correlated with the risk for cardiovascular disease. To analyze the association between incident coronary artery calcification (CAC) and the changes in HGI among participants without diabetes, over 4 years. A retrospective study of 2052 nondiabetic participants in whom the coronary artery calcium score was measured repeatedly over 4 years, as part of a health checkup program in Kangbuk Samsung Hospital in Korea, and who had no CAC at baseline. The HGI was defined as the difference between the measured and predicted hemoglobin A1c (HbA1c) levels. A total of 201 participants developed CAC after 4 years, and the mean baseline HGI was significantly higher in those patients. The incidence of CAC gradually increased from the first to the fourth quartile groups of baseline HGI. The odds ratio (OR) for incident CAC was the highest among the four groups divided by the quartiles of the baseline HGI and was significant after adjustment for confounding variables (vs first quartile group: OR, 1.632; 95% confidence interval, 1.024 to 2.601). The incidence of and risk for CAC development were significantly higher than in other groups compared with the low-to-low group after adjustment for confounding factors; however, when baseline HbA1c level was included in the model, only participants with a low-to-high HGI over 4 years showed a significantly increased OR for CAC development compared with the low-to-low group (OR, 1.722; 95% confidence interval, 1.046 to 2.833). The participants with a high baseline HGI and consistently high HGI showed a higher risk for incident CAC than those with a low baseline HGI. An increased HGI over 4 years significantly increased the risk for CAC regardless of the baseline HbA1c levels. Copyright © 2017 Endocrine Society
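
    The HGI computation itself is straightforward to sketch: fit a within-cohort linear regression predicting HbA1c (here, assumed to be from fasting glucose, the usual predictor) and take the difference between measured and predicted values. The numbers below are made up.

    import numpy as np

    # Invented example values (not study data).
    fasting_glucose = np.array([88.0, 95.0, 102.0, 110.0, 97.0, 121.0])   # mg/dL
    hba1c = np.array([5.4, 5.6, 5.9, 5.8, 5.7, 6.3])                      # %

    # Within-cohort linear regression of HbA1c on fasting glucose.
    slope, intercept = np.polyfit(fasting_glucose, hba1c, deg=1)
    predicted_hba1c = slope * fasting_glucose + intercept

    # HGI = measured HbA1c - predicted HbA1c (positive: higher than predicted).
    hgi = hba1c - predicted_hba1c
    print(np.round(hgi, 3))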

  6. Advancing the application of systems thinking in health: analysing the contextual and social network factors influencing the use of sustainability indicators in a health system--a comparative study in Nepal and Somaliland.

    PubMed

    Blanchet, Karl; Palmer, Jennifer; Palanchowke, Raju; Boggs, Dorothy; Jama, Ali; Girois, Susan

    2014-08-26

    Health systems strengthening is becoming a key component of development agendas for low-income countries worldwide. Systems thinking emphasizes the role of diverse stakeholders in designing solutions to system problems, including sustainability. The objective of this paper is to compare the definition and use of sustainability indicators developed through the Sustainability Analysis Process in two rehabilitation sectors, one in Nepal and one in Somaliland, and analyse the contextual factors (including the characteristics of system stakeholder networks) influencing the use of sustainability data. Using the Sustainability Analysis Process, participants collectively clarified the boundaries of their respective systems, defined sustainability, and identified sustainability indicators. Baseline indicator data was gathered, where possible, and then researched again 2 years later. As part of the exercise, system stakeholder networks were mapped at baseline and at the 2-year follow-up. We compared stakeholder networks and interrelationships with baseline and 2-year progress toward self-defined sustainability goals. Using in-depth interviews and observations, additional contextual factors affecting the use of sustainability data were identified. Differences in the selection of sustainability indicators selected by local stakeholders from Nepal and Somaliland reflected differences in the governance and structure of the present rehabilitation system. At 2 years, differences in the structure of social networks were more marked. In Nepal, the system stakeholder network had become more dense and decentralized. Financial support by an international organization facilitated advancement toward self-identified sustainability goals. In Somaliland, the small, centralised stakeholder network suffered a critical rupture between the system's two main information brokers due to competing priorities and withdrawal of international support to one of these. Progress toward self-defined sustainability was nil. The structure of the rehabilitation system stakeholder network characteristics in Nepal and Somaliland evolved over time and helped understand the changing nature of relationships between actors and their capacity to work as a system rather than a sum of actors. Creating consensus on a common vision of sustainability requires additional system-level interventions such as identification of and support to stakeholders who promote systems thinking above individual interests.

  7. Application of electrochemical methods in corrosion and battery research

    NASA Astrophysics Data System (ADS)

    Sun, Zhaoli

    Various electrochemical methods have been applied in the development of corrosion protection methods for ammonia/water absorption heat pumps and the evaluation of the stability of metallic materials in Li-ion battery electrolyte. Rare earth metal salts (REMSs) and organic inhibitors have been evaluated for corrosion protection of mild steel in the baseline solution of 5 wt% NH3 + 0.2 wt% NaOH to replace the conventionally used toxic chromate salt inhibitors. Cerium nitrate provided at least comparable corrosion inhibition efficiency to dichromate in the baseline solution at 100°C. The cerium (IV) oxide formed on mild steel through the cerating process exhibited increasing corrosion protection for mild steel with prolonged exposure time in the hot baseline solution. The optimum cerating process was found to be first cerating in a solution of 2.3 g/L CeCl3 + 4.4 wt% H2O2 + appropriate additives for 20 minutes at pH 2.2 at room temperature with 30 minutes solution aging prior to use, then sealing in 10% sodium (meta) silicate or sodium molybdate at 50°C for 30 minutes. Yttrium salts provided less corrosion protection for mild steel in the baseline solution than cerium salts. Glycerophosphate was found to be a promising chromate-free organic inhibitor for mild steel; however, its thermostability in hot ammonia/water solutions has not been confirmed yet. The stability of six metallic materials used in Li-ion batteries has been evaluated in 1M lithium hexafluorophosphate (LiPF6) dissolved in a 1:1 volume mixture of ethylene carbonate and diethyl carbonate at 37°C in a dry-box. Aluminum is the most stable material, while copper is active under anodic potentials and susceptible to localized corrosion and galvanic corrosion. The higher the concentration of the alloying elements Al and/or V in a titanium alloy, the higher the stability of the titanium alloy in the battery electrolyte. 90Pt-10Ir can cause decomposition of the electrolyte, resulting in a low stable potential window.

  8. Life Support Baseline Values and Assumptions Document

    NASA Technical Reports Server (NTRS)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.

    2018-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.

  9. Life Support Baseline Values and Assumptions Document

    NASA Technical Reports Server (NTRS)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.; Wagner, Sandra A.

    2015-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. With the ability to accurately compare different technologies' performance for the same function, managers will be able to make better decisions regarding technology development.

  10. Genetic Modulation of Training and Transfer in Older Adults: BDNF Val66Met Polymorphism is Associated with Wider Useful Field of View

    PubMed Central

    Colzato, Lorenza S.; van Muijden, Jesse; Band, Guido P. H.; Hommel, Bernhard

    2011-01-01

    Western society has an increasing proportion of older adults. Increasing age is associated with a general decrease in the control over task-relevant mental processes. In the present study we investigated the possibility that successful transfer of game-based cognitive improvements to untrained tasks in elderly people is modulated by preexisting neuro-developmental factors as genetic variability related to levels of the brain-derived neurotrophic factor (BDNF), an important neuromodulator underlying cognitive processes. We trained participants, genotyped for the BDNF Val66Met polymorphism, on cognitive tasks developed to improve dynamic attention. Pre-training (baseline) and post-training measures of attentional processes (divided and selective attention) were acquired by means of the useful field of view task. As expected, Val/Val homozygous individuals showed larger beneficial transfer effects than Met/-carriers. Our findings support the idea that genetic predisposition modulates transfer effects. PMID:21909331

  11. Toddler parasympathetic regulation and fear: Links to maternal appraisal and behavior

    PubMed Central

    Cho, Sunghye; Buss, Kristin A.

    2017-01-01

    There is a growing recognition that parental socialization influences interact with young children’s emerging capacity for physiological regulation and shape children’s developmental trajectories. Nevertheless, the transactional processes linking parental socialization and physiological regulatory processes remain not well understood, particularly for fear-prone toddlers. To address this gap in the literature, the present study investigated the biopsychosocial processes that underlie toddlers’ fear regulation by examining the relations among toddler parasympathetic regulation, maternal appraisal, and parenting behaviors. Participants included 124 mothers and their toddlers (mean age = 24.43 months), who participated in a longitudinal study of temperament and socio-emotional development. Toddlers’ parasympathetic reactivity was found to moderate the links between maternal anticipatory appraisal of child fearfulness and (a) maternal provision of physical comfort and (b) preschool-age child inhibition. Additionally, maternal comforting behaviors during the low-threat task predicted preschool-age separation distress, specifically for toddlers demonstrating a low baseline RSA. PMID:27785806

  12. Mechanical Design of a Performance Test Rig for the Turbine Air-Flow Task (TAFT)

    NASA Technical Reports Server (NTRS)

    Forbes, John C.; Xenofos, George D.; Farrow, John L.; Tyler, Tom; Williams, Robert; Sargent, Scott; Moharos, Jozsef

    2004-01-01

    To support development of the Boeing-Rocketdyne RS84 rocket engine, a full-flow, reaction turbine geometry was integrated into the NASA-MSFC turbine air-flow test facility. A mechanical design was generated which minimized the amount of new hardware while incorporating all test and instrumentation requirements. This paper provides details of the mechanical design for this Turbine Air-Flow Task (TAFT) test rig. The mechanical design process utilized for this task included the following basic stages: Conceptual Design. Preliminary Design. Detailed Design. Baseline of Design (including Configuration Control and Drawing Revision). Fabrication. Assembly. During the design process, many lessons were learned that should benefit future test rig design projects. Of primary importance are well-defined requirements early in the design process, a thorough detailed design package, and effective communication with both the customer and the fabrication contractors.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickerson, Patricia O'Donnell; Summa, Deborah Ann; Liu, Cheng

    The goals of this project were to demonstrate reliable, reproducible solid-state bonding of aluminum 6061 alloy plates together to encapsulate DU-10 wt% Mo surrogate fuel foils. This was done as part of the CONVERT Fuel Fabrication Capability effort in Process Baseline Development. Bonding was done using Hot Isostatic Pressing (HIP) of evacuated stainless steel cans (a.k.a. HIP cans) containing fuel plate components and strongbacks. Gross macroscopic measurements of HIP cans before and after HIP were used as part of this demonstration and to determine the accuracy of a finite element model of the HIP bonding process. The quality of the bonding was measured by controlled miniature bulge testing of Al-Al, Al-Zr, and Zr-DU bonds. A special objective was to determine whether the HIP process consistently produces good quality bonding and to determine the best characterization techniques for technology transfer.

  14. Silicon solar cell process development, fabrication, and analysis

    NASA Technical Reports Server (NTRS)

    Yoo, H. I.; Iles, P. A.; Leung, D. C.

    1981-01-01

    Work has progressed in fabrication and characterization of solar cells from ubiquitous crystallization process (UCP) wafers and LASS ribbons. Gettering tests applied to UCP wafers made little change in their performance compared with corresponding baseline data. Advanced processes such as shallow junction (SJ), back surface field (BSF), and multilayer antireflection (MLAR) coatings were also applied. While BSF by Al paste had shunting problems, cells with SJ, BSF by evaporated Al, and MLAR did achieve 14.1% AM1 on UCP silicon. The study of LASS material was very preliminary. Only a few cells with SJ, BSR (no BSF), and MLAR were completed due to mechanical yield problems after lapping the material. Average efficiency was 10.7% AM1, with 13.4% AM1 for CZ controls. Relatively high minority carrier diffusion lengths were obtained. The lower than expected Jsc could be partially explained by low active area due to irregular sizes.

  15. Children's binge eating and development of metabolic syndrome.

    PubMed

    Tanofsky-Kraff, M; Shomaker, L B; Stern, E A; Miller, R; Sebring, N; Dellavalle, D; Yanovski, S Z; Hubbard, V S; Yanovski, J A

    2012-07-01

    Binge eating predisposes children to excessive weight gain. However, it is unknown if pediatric binge eating predicts other obesity-associated adverse health outcomes. The objective of this study was to investigate the relationship between binge eating and metabolic syndrome (MetS) in children. Children aged 5-12 years at high risk for adult obesity, either because they were overweight/obese when first examined or because their parents were overweight/obese, were recruited from Washington, DC and its suburbs. Children completed a questionnaire assessment of binge eating at baseline and underwent measurements of MetS components at baseline and at a follow-up visit approximately 5 years later. Magnetic resonance imaging was used to measure the visceral adipose tissue (VAT) in a subset. In all, 180 children were studied between July 1996 and August 2010. Baseline self-reported binge eating presence was associated with a 5.33 greater odds of having MetS at follow-up (95% confidence interval (CI): 1.47, 19.27, P=0.01). The association between binge eating and body mass index (BMI) only partially explained changes in MetS components: baseline binge eating predicted higher follow-up triglycerides, even after accounting for baseline triglycerides, baseline BMI, BMI change, sex, race, baseline age and time in study (P = 0.05). Also, adjusting for baseline VAT and demographics, baseline binge eating predicted greater follow-up L(2-3) VAT (P = 0.01). Children's reports of binge eating predicted development of MetS, worsening triglycerides and increased VAT. The excessive weight gain associated with children's binge eating partly explained its adverse metabolic health outcomes. Reported binge eating may represent an early behavioral marker upon which to focus interventions for obesity and MetS.

  16. Assessment of psychosocial risk factors for the development of non-specific chronic disabling low back pain in Japanese workers-findings from the Japan Epidemiological Research of Occupation-related Back Pain (JOB) study.

    PubMed

    Matsudaira, Ko; Kawaguchi, Mika; Isomura, Tatsuya; Inuzuka, Kyoko; Koga, Tadashi; Miyoshi, Kota; Konishi, Hiroaki

    2015-01-01

    To investigate the associations between psychosocial factors and the development of chronic disabling low back pain (LBP) in Japanese workers. A 1 yr prospective cohort of the Japan Epidemiological Research of Occupation-related Back Pain (JOB) study was used. The participants were office workers, nurses, sales/marketing personnel, and manufacturing engineers. Self-administered questionnaires were distributed twice: at baseline and 1 yr after baseline. The outcome of interest was the development of chronic disabling LBP during the 1 yr follow-up period. Incidence was calculated for the participants who experienced disabling LBP during the month prior to baseline. Logistic regression was used to assess risk factors for chronic disabling LBP. Of 5,310 participants responding at baseline (response rate: 86.5%), 3,811 completed the questionnaire at follow-up. Among 171 eligible participants who experienced disabling back pain during the month prior to baseline, 29 (17.0%) developed chronic disabling LBP during the follow-up period. Multivariate logistic regression analysis implied reward to work (not feeling rewarded, OR: 3.62, 95%CI: 1.17-11.19), anxiety (anxious, OR: 2.89, 95%CI: 0.97-8.57), and daily-life satisfaction (not satisfied, ORs: 4.14, 95%CI: 1.18-14.58) were significant. Psychosocial factors are key to the development of chronic disabling LBP in Japanese workers. Psychosocial interventions may reduce the impact of LBP in the workplace.

  17. The relationship between psychological distress and baseline sports-related concussion testing.

    PubMed

    Bailey, Christopher M; Samples, Hillary L; Broshek, Donna K; Freeman, Jason R; Barth, Jeffrey T

    2010-07-01

    This study examined the effect of psychological distress on neurocognitive performance measured during baseline concussion testing. Archival data were utilized to examine correlations between personality testing and computerized baseline concussion testing. Significantly correlated personality measures were entered into linear regression analyses, predicting baseline concussion testing performance. Suicidal ideation was examined categorically. Athletes underwent testing and screening at a university athletic training facility. Participants included 47 collegiate football players 17 to 19 years old, the majority of whom were in their first year of college. Participants were administered the Concussion Resolution Index (CRI), an internet-based neurocognitive test designed to monitor and manage both at-risk and concussed athletes. Participants took the Personality Assessment Inventory (PAI), a self-administered inventory designed to measure clinical syndromes, treatment considerations, and interpersonal style. Scales and subscales from the PAI were utilized to determine the influence psychological distress had on the CRI indices: simple reaction time, complex reaction time, and processing speed. Analyses revealed several significant correlations among aspects of somatic concern, depression, anxiety, substance abuse, and suicidal ideation and CRI performance, each with at least a moderate effect. When entered into a linear regression, the block of combined psychological symptoms accounted for a significant amount of baseline CRI performance, with moderate to large effects (r = 0.23-0.30). When examined categorically, participants with suicidal ideation showed significantly slower simple reaction time and complex reaction time, with a similar trend on processing speed. Given the possibility of obscured concussion deficits after injury, implications for premature return to play, and the need to target psychological distress outright, these findings heighten the clinical importance of screening for psychological distress during baseline and post-injury concussion evaluations.

  18. [Environmental geochemical baseline of heavy metals in soils of the Ili river basin and pollution evaluation].

    PubMed

    Zhao, Xin-Ru; Nasier, Telajin; Cheng, Yong-Yi; Zhan, Jiang-Yu; Yang, Jian-Hong

    2014-06-01

    Environmental geochemical baseline models of Cu, Zn, Pb, As and Hg were established by a standardized method in the chernozem, chestnut soil, sierozem and saline soil from the Ili river valley region, and theoretical baseline values were calculated. A baseline factor pollution index evaluation method, an environmental background value evaluation method and a heavy metal cleanliness evaluation method were used to compare soil pollution degrees. The baseline factor pollution index evaluation showed that As pollution was the most prominent among the four typical soil types within the river basin, with 7.14%, 9.76% and 7.50% of sampling points in chernozem, chestnut soil and sierozem, respectively, reaching heavy pollution. In chestnut soil, 7.32% of sampling points reached the permitted heavy metal Pb pollution index. The variation extent of As and Pb was the largest, indicating large human disturbance. The environmental background value evaluation showed that As was the main pollution element, followed by Cu, Zn and Pb. The heavy metal cleanliness evaluation showed that Cu, Zn and Pb were better than cleanliness level 2 and Hg was of cleanliness level 1 in all four soil types. As showed moderate pollution in sierozem, and was of cleanliness level 2 or better in chernozem, chestnut soil and saline-alkali soil. Comparing the three evaluation systems, the baseline factor pollution index evaluation more comprehensively reflected the geochemical migration characteristics of elements and the soil formation processes, and its pollution assessment could be specific to individual sampling points. The environmental background value evaluation neglected the natural migration of heavy metals and the deposition process in the soil, since it was based on regional background values. The main purpose of the heavy metal cleanliness evaluation was to evaluate the safety degree of the soil environment.
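
    As a minimal illustration of a baseline-factor pollution index of the kind compared above, the sketch below divides each measured concentration by its geochemical baseline value and bins the ratio into pollution classes; both the baseline values and the class boundaries are invented for the example.

    # Hypothetical baseline values and measured concentrations (mg/kg).
    baseline = {"Cu": 26.0, "Zn": 68.0, "Pb": 19.0, "As": 11.0, "Hg": 0.02}
    sample = {"Cu": 30.0, "Zn": 75.0, "Pb": 41.0, "As": 35.0, "Hg": 0.03}

    def pollution_class(ratio: float) -> str:
        """Illustrative class boundaries, not the paper's thresholds."""
        if ratio <= 1.0:
            return "clean"
        if ratio <= 2.0:
            return "light pollution"
        if ratio <= 3.0:
            return "moderate pollution"
        return "heavy pollution"

    for element, conc in sample.items():
        p = conc / baseline[element]          # baseline factor pollution index
        print(f"{element}: P = {p:.2f} ({pollution_class(p)})")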

  19. An Approach for Measuring Reductions in Operations, Maintenance, and Energy Costs: Baseline Measures of Construction Industry Practices for the National Construction Goals.

    ERIC Educational Resources Information Center

    Chapman, Robert E.; Rennison, Roderick

    The Construction and Building Subcommittee of the National Science and Technology Council (NSTC) has established seven National Construction Goals for the construction industry and is developing baseline measures for current practices and progress with respect to each goal. This document provides a detailed set of baseline measures for the NSTC…

  20. Hydrology and water quality in two mountain basins of the northeastern US: Assessing baseline conditions and effects of ski area development

    USGS Publications Warehouse

    Wemple, B.; Shanley, J.; Denner, J.; Ross, D.; Mills, K.

    2007-01-01

    Mountain regions throughout the world face intense development pressures associated with recreational and tourism uses. Despite these pressures, much of the research on bio-geophysical impacts of humans in mountain regions has focused on the effects of natural resource extraction. This paper describes findings from the first 3 years of a study examining high elevation watershed processes in a region undergoing alpine resort development. Our study is designed as a paired-watershed experiment. The Ranch Brook watershed (9.6 km2) is a relatively pristine, forested watershed and serves as the undeveloped 'control' basin. West Branch (11.7 km2) encompasses an existing alpine ski resort, with approximately 17% of the basin occupied by ski trails and impervious surfaces, and an additional 7% slated for clearing and development. Here, we report results for water years 2001-2003 of streamflow and water quality dynamics for these watersheds. Precipitation increases significantly with elevation in the watersheds, and winter precipitation represents 36-46% of annual precipitation. Artificial snowmaking from water within West Branch watershed currently augments annual precipitation by only 3-4%. Water yield in the developed basin exceeded that in the control by 18-36%. Suspended sediment yield was more than two and a half times greater and fluxes of all major solutes were higher in the developed basin. Our study is the first to document the effects of existing ski area development on hydrology and water quality in the northeastern US and will serve as an important baseline for evaluating the effects of planned resort expansion activities in this area.
