Sample records for processing unit based

  1. Acceleration of integral imaging based incoherent Fourier hologram capture using graphic processing unit.

    PubMed

    Jeong, Kyeong-Min; Kim, Hee-Seung; Hong, Sung-In; Lee, Sung-Keun; Jo, Na-Young; Kim, Yong-Soo; Lim, Hong-Gi; Park, Jae-Hyeung

    2012-10-08

    Speed enhancement of integral imaging based incoherent Fourier hologram capture using a graphic processing unit is reported. The integral imaging based method enables exact hologram capture of real, physically existing three-dimensional objects under regular incoherent illumination. In our implementation, we apply a parallel computation scheme using the graphic processing unit, accelerating the processing speed. Using the enhanced capture speed, we also implement a pseudo real-time hologram capture and optical reconstruction system. The overall operation speed is measured to be 1 frame per second.
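
As a rough illustration of the parallel-computation step described in this record, the sketch below accumulates a Fourier hologram from a stack of elemental images with NumPy FFTs; the function name and the summation scheme are simplifications for illustration, not the authors' implementation. Swapping `numpy` for `cupy` would run the same FFT calls on a GPU.

```python
import numpy as np

def fourier_hologram(elemental_images):
    """Accumulate the centred 2D FFTs of a stack of elemental images
    into a single complex hologram plane (simplified sketch)."""
    holo = np.zeros(elemental_images.shape[1:], dtype=complex)
    for img in elemental_images:
        holo += np.fft.fftshift(np.fft.fft2(img))
    return holo

# Tiny synthetic stack: 4 elemental images of 8x8 pixels.
stack = np.random.default_rng(0).random((4, 8, 8))
h = fourier_hologram(stack)
print(h.shape)
```

Because each elemental image's FFT is independent, the loop maps directly onto per-image parallel work, which is what makes the GPU acceleration reported above possible.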

  2. Device and method to enhance availability of cluster-based processing systems

    NASA Technical Reports Server (NTRS)

    Lupia, David J. (Inventor); Ramos, Jeremy (Inventor); Samson, Jr., John R. (Inventor)

    2010-01-01

    An electronic computing device including at least one processing unit that issues a specific fault signal upon experiencing an associated fault, a control unit that generates a specific recovery signal upon receiving the fault signal from the at least one processing unit, and at least one input memory unit. The recovery signal initiates specific recovery processes in the at least one processing unit. During the recovery period, the input memory unit buffers the data signals input to the at least one processing unit that experienced the fault.

  3. Estimating Missing Unit Process Data in Life Cycle Assessment Using a Similarity-Based Approach.

    PubMed

    Hou, Ping; Cai, Jiarui; Qu, Shen; Xu, Ming

    2018-05-01

    In life cycle assessment (LCA), collecting unit process data from the empirical sources (i.e., meter readings, operation logs/journals) is often costly and time-consuming. We propose a new computational approach to estimate missing unit process data solely relying on limited known data based on a similarity-based link prediction method. The intuition is that similar processes in a unit process network tend to have similar material/energy inputs and waste/emission outputs. We use the ecoinvent 3.1 unit process data sets to test our method in four steps: (1) dividing the data sets into a training set and a test set; (2) randomly removing certain numbers of data in the test set indicated as missing; (3) using similarity-weighted means of various numbers of most similar processes in the training set to estimate the missing data in the test set; and (4) comparing estimated data with the original values to determine the performance of the estimation. The results show that missing data can be accurately estimated when less than 5% data are missing in one process. The estimation performance decreases as the percentage of missing data increases. This study provides a new approach to compile unit process data and demonstrates a promising potential of using computational approaches for LCA data compilation.
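
A minimal sketch of the similarity-weighted estimation in step (3) above, assuming cosine similarity computed on the known entries; the function and variable names are hypothetical, not from the paper:

```python
import numpy as np

def estimate_missing(target, known_mask, others, k=3):
    """Fill the unknown entries of `target` with similarity-weighted
    means over the k rows of `others` most similar to it, where
    similarity is cosine similarity on the known entries."""
    obs = np.flatnonzero(known_mask)
    sims = np.array([
        np.dot(target[obs], row[obs])
        / (np.linalg.norm(target[obs]) * np.linalg.norm(row[obs]) + 1e-12)
        for row in others
    ])
    top = np.argsort(sims)[-k:]                  # k most similar processes
    weights = sims[top] / sims[top].sum()
    est = target.copy()
    miss = np.flatnonzero(~known_mask)
    est[miss] = (weights[:, None] * others[top][:, miss]).sum(axis=0)
    return est

# Demo: three reference processes that match the target's known inputs.
others = np.array([[1.0, 2.0, 3.0]] * 3)
target = np.array([1.0, 2.0, 0.0])               # third entry is missing
mask = np.array([True, True, False])
print(estimate_missing(target, mask, others))
```

This encodes the paper's intuition directly: processes whose known inputs resemble the target's contribute more weight to the estimate of its missing flows.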

  4. Reliability and performance of a system-on-a-chip by predictive wear-out based activation of functional components

    DOEpatents

    Cher, Chen-Yong; Coteus, Paul W; Gara, Alan; Kursun, Eren; Paulsen, David P; Schuelke, Brian A; Sheets, II, John E; Tian, Shurong

    2013-10-01

    A processor-implemented method for determining aging of a processing unit in a processor, the method comprising: calculating an effective aging profile for the processing unit, wherein the effective aging profile quantifies the effects of aging on the processing unit; combining the effective aging profile with process variation data, actual workload data, and operating conditions data for the processing unit; and determining aging through an aging sensor of the processing unit using the effective aging profile, the process variation data, the actual workload data, the architectural characteristics and redundancy data, and the operating conditions data for the processing unit.

  5. Technical and Energy Performance of an Advanced, Aqueous Ammonia-Based CO2 Capture Technology for a 500 MW Coal-Fired Power Station.

    PubMed

    Li, Kangkang; Yu, Hai; Feron, Paul; Tade, Moses; Wardhaugh, Leigh

    2015-08-18

    Using a rate-based model, we assessed the technical feasibility and energy performance of an advanced aqueous-ammonia-based postcombustion capture process integrated with a coal-fired power station. The capture process consists of three identical process trains in parallel, each containing a CO2 capture unit, an NH3 recycling unit, a water separation unit, and a CO2 compressor. A sensitivity study of important parameters, such as NH3 concentration, lean CO2 loading, and stripper pressure, was performed to minimize the energy consumption involved in the CO2 capture process. Process modifications of the rich-split process and the interheating process were investigated to further reduce the solvent regeneration energy. The integrated capture system was then evaluated in terms of the mass balance and the energy consumption of each unit. The results show that our advanced ammonia process is technically feasible and energy-competitive, with a low net power-plant efficiency penalty of 7.7%.

  6. Assessing the influence of component processing and donor characteristics on quality of red cell concentrates using quality control data.

    PubMed

    Jordan, A; Chen, D; Yi, Q-L; Kanias, T; Gladwin, M T; Acker, J P

    2016-07-01

    Quality control (QC) data collected by blood services are used to monitor production and to ensure compliance with regulatory standards. We demonstrate how analysis of quality control data can be used to highlight the sources of variability within red cell concentrates (RCCs). We merged Canadian Blood Services QC data with manufacturing and donor records for 28 227 RCCs produced between June 2011 and October 2014. Units were categorized based on processing method, bag manufacturer, donor age and donor sex, then assessed based on product characteristics: haemolysis and haemoglobin levels, unit volume, leucocyte count and haematocrit. Buffy-coat method (top/bottom)-processed units exhibited lower haemolysis than units processed using the whole-blood filtration method (top/top). Units from female donors exhibited lower haemolysis than units from male donors. Processing method influenced unit volume and the ratio of additive solution to residual plasma. Stored red blood cell characteristics are influenced by prestorage processing and donor factors. Understanding the relationship between processing, donors and RCC quality will help blood services to ensure the safety of transfused products. © 2016 International Society of Blood Transfusion.
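
The kind of categorical QC comparison described here can be sketched with a pandas group-by; the column names and values below are illustrative only, not the Canadian Blood Services schema:

```python
import pandas as pd

# Illustrative QC records; the columns mirror the factors analysed above
# (processing method, donor sex) plus one product characteristic.
qc = pd.DataFrame({
    "method": ["buffy_coat", "buffy_coat", "whole_blood", "whole_blood"],
    "donor_sex": ["F", "M", "F", "M"],
    "haemolysis_pct": [0.21, 0.28, 0.33, 0.41],
})

# Mean haemolysis per processing method and donor sex.
summary = qc.groupby(["method", "donor_sex"])["haemolysis_pct"].mean()
print(summary)
```

With real QC data, each method/sex cell would aggregate thousands of units, and the same pattern extends to the other characteristics (unit volume, leucocyte count, haematocrit).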

  7. An Examination of the USAF (Q,R) Policies for Managing Depot-Base Inventories.

    DTIC Science & Technology

    1976-10-15

    [Garbled OCR extract of the report's EOQ cost equations. Recoverable definitions: base order processing cost = $5; depot order processing cost = $270.16; the unit acquisition cost of the given item; and the annual cost to hold each unit of the item at the depot, expressed as a fraction of its unit cost.]

  8. A Science and Risk-Based Pragmatic Methodology for Blend and Content Uniformity Assessment.

    PubMed

    Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Doshi, Chetan

    2018-04-01

    This paper describes a pragmatic approach that can be applied in assessing powder blend and unit dosage uniformity of solid dose products at Process Design, Process Performance Qualification, and Continued/Ongoing Process Verification stages of the Process Validation lifecycle. The statistically based sampling, testing, and assessment plan was developed due to the withdrawal of the FDA draft guidance for industry "Powder Blends and Finished Dosage Units-Stratified In-Process Dosage Unit Sampling and Assessment." This paper compares the proposed Grouped Area Variance Estimate (GAVE) method with an alternate approach outlining the practicality and statistical rationalization using traditional sampling and analytical methods. The approach is designed to fit solid dose processes assuring high statistical confidence in both powder blend uniformity and dosage unit uniformity during all three stages of the lifecycle complying with ASTM standards as recommended by the US FDA.

  9. Model for mapping settlements

    DOEpatents

    Vatsavai, Ranga Raju; Graesser, Jordan B.; Bhaduri, Budhendra L.

    2016-07-05

    A programmable media includes a graphical processing unit in communication with a memory element. The graphical processing unit is configured to detect one or more settlement regions from a high resolution remote sensed image based on the execution of programming code. The graphical processing unit identifies one or more settlements through the execution of the programming code that executes a multi-instance learning algorithm that models portions of the high resolution remote sensed image. The identification is based on spectral bands transmitted by a satellite and on selected designations of the image patches.

  10. Modeling of yield and environmental impact categories in tea processing units based on artificial neural networks.

    PubMed

    Khanali, Majid; Mobli, Hossein; Hosseinzadeh-Bandbafha, Homa

    2017-12-01

    In this study, an artificial neural network (ANN) model was developed for predicting the yield and life cycle environmental impacts based on the energy inputs required in processing black tea, green tea, and oolong tea in Guilan province of Iran. A life cycle assessment (LCA) approach was used to investigate the environmental impact categories of processed tea using the cradle-to-gate approach, i.e., from the production of input materials to the gate of the tea processing units, i.e., packaged tea. Thus, all the tea processing operations, such as withering, rolling, fermentation, drying, and packaging, were considered in the analysis. The initial data were obtained from tea processing units, while the required data about the background system were extracted from the EcoInvent 2.2 database. LCA results indicated that the diesel fuel and corrugated paper box used in the drying and packaging operations, respectively, were the main hotspots. The black tea processing unit caused the highest pollution among the three processing units. Three feed-forward back-propagation ANN models, based on the Levenberg-Marquardt training algorithm with two hidden layers accompanied by sigmoid activation functions and a linear transfer function in the output layer, were applied for the three types of processed tea. The neural networks were developed based on energy equivalents of eight different input parameters (energy equivalents of fresh tea leaves, human labor, diesel fuel, electricity, adhesive, carton, corrugated paper box, and transportation) and 11 output parameters (yield, global warming, abiotic depletion, acidification, eutrophication, ozone layer depletion, human toxicity, freshwater aquatic ecotoxicity, marine aquatic ecotoxicity, terrestrial ecotoxicity, and photochemical oxidation). The results showed that the developed ANN models, with R² values in the range of 0.878 to 0.990, had excellent performance in predicting all the output variables based on the inputs. Energy consumption for processing green tea, oolong tea, and black tea was calculated as 58,182, 60,947, and 66,301 MJ per ton of dry tea, respectively.
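
A forward pass of the network architecture described above (8 energy-equivalent inputs, two sigmoid hidden layers, 11 linear outputs) can be sketched as follows; the hidden-layer widths and random weights are placeholder assumptions, since the paper trains its weights with Levenberg-Marquardt:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, params):
    """8 energy-equivalent inputs -> two sigmoid hidden layers ->
    11 linear outputs (yield plus ten impact categories)."""
    W1, b1, W2, b2, W3, b3 = params
    h1 = sigmoid(W1 @ x + b1)
    h2 = sigmoid(W2 @ h1 + b2)
    return W3 @ h2 + b3                      # linear output layer

rng = np.random.default_rng(0)
shapes = [(16, 8), (16,), (16, 16), (16,), (11, 16), (11,)]  # widths assumed
params = [rng.normal(scale=0.1, size=s) for s in shapes]
y = forward(rng.random(8), params)
print(y.shape)
```

The linear output layer matches the paper's stated choice of a linear transfer function for the 11 predicted quantities.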

  11. Performance Testing of GPU-Based Approximate Matching Algorithm on Network Traffic

    DTIC Science & Technology

    2015-03-01

    [Front-matter residue (title page and table of contents) rather than an abstract. Recoverable acronym definitions: GPU = Graphical Processing Unit; GPGPU = General-Purpose Graphic Processing Unit; HBSS = Host-Based Security System; HIPS = Host Intrusion (truncated).]

  12. Efficient utilization of greenhouse gases in a gas-to-liquids process combined with CO2/steam-mixed reforming and Fe-based Fischer-Tropsch synthesis.

    PubMed

    Zhang, Chundong; Jun, Ki-Won; Ha, Kyoung-Su; Lee, Yun-Jo; Kang, Seok Chang

    2014-07-15

    Two process models for carbon dioxide utilized gas-to-liquids (GTL) process (CUGP) mainly producing light olefins and Fischer-Tropsch (F-T) synthetic oils were developed by Aspen Plus software. Both models are mainly composed of a reforming unit, an F-T synthesis unit and a recycle unit, while the main difference is the feeding point of fresh CO2. In the reforming unit, CO2 reforming and steam reforming of methane are combined together to produce syngas in flexible composition. Meanwhile, CO2 hydrogenation is conducted via reverse water gas shift on the Fe-based catalysts in the F-T synthesis unit to produce hydrocarbons. After F-T synthesis, the unreacted syngas is recycled to F-T synthesis and reforming units to enhance process efficiency. From the simulation results, it was found that the carbon efficiencies of both CUGP options were successfully improved, and total CO2 emissions were significantly reduced, compared with the conventional GTL processes. The process efficiency was sensitive to recycle ratio and more recycle seemed to be beneficial for improving process efficiency and reducing CO2 emission. However, the process efficiency was rather insensitive to split ratio (recycle to reforming unit/total recycle), and the optimum split ratio was determined to be zero.

  13. Effects of the Scientific Argumentation Based Learning Process on Teaching the Unit of Cell Division and Inheritance to Eighth Grade Students

    ERIC Educational Resources Information Center

    Balci, Ceyda; Yenice, Nilgun

    2016-01-01

    The aim of this study is to analyse the effects of scientific argumentation based learning process on the eighth grade students' achievement in the unit of "cell division and inheritance". It also deals with the effects of this process on their comprehension about the nature of scientific knowledge, their willingness to take part in…

  14. Rail inspection system based on iGPS

    NASA Astrophysics Data System (ADS)

    Fu, Xiaoyan; Wang, Mulan; Wen, Xiuping

    2018-05-01

    Track parameters include gauge, superelevation, cross level and so on, which can be calculated from the three-dimensional coordinates of the track. The rail inspection system based on iGPS (indoor/infrared GPS) is composed of a base station, a receiver, a rail inspection frame, a wireless communication unit, a display and control unit, and a data processing unit. As the inspection frame moves continuously along the track, the system can accurately measure the coordinates of the rail, realizing intelligent detection and precision measurement. Based on the principle of angle-intersection measurement, the inspection model was constructed and the detection process was described.
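
The angle-intersection principle mentioned above can be sketched as follows, assuming two base stations with known coordinates that each measure a bearing toward the same target point; the function is an illustrative reconstruction, not the system's actual model:

```python
import numpy as np

def intersect(p1, theta1, p2, theta2):
    """Locate a target from two base stations p1 and p2, given the
    bearing (radians from the +x axis) each station measures to it."""
    d1 = np.array([np.cos(theta1), np.sin(theta1)])
    d2 = np.array([np.cos(theta2), np.sin(theta2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters t1, t2.
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * d1

# Target at (5, 5) seen from stations at (0, 0) and (10, 0).
p = intersect((0, 0), np.arctan2(5, 5), (10, 0), np.arctan2(5, -5))
print(np.round(p, 6))
```

Each pair of measured angles pins down one track point; repeating this as the frame moves yields the three-dimensional coordinates from which the track parameters are derived.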

  15. Toward a formal verification of a floating-point coprocessor and its composition with a central processing unit

    NASA Technical Reports Server (NTRS)

    Pan, Jing; Levitt, Karl N.; Cohen, Gerald C.

    1991-01-01

    Discussed here is work to formally specify and verify a floating-point coprocessor based on the MC68881. The HOL verification system developed at Cambridge University was used. The coprocessor consists of two independent units: the bus interface unit, used to communicate with the CPU, and the arithmetic processing unit, used to perform the actual calculations. Reasoning about the interaction and synchronization among processes using higher-order logic is demonstrated.

  16. Exploring a Framework for Professional Development in Curriculum Innovation: Empowering Teachers for Designing Context-Based Chemistry Education

    NASA Astrophysics Data System (ADS)

    Stolk, Machiel J.; de Jong, Onno; Bulte, Astrid M. W.; Pilot, Albert

    2011-05-01

    Involving teachers in early stages of context-based curriculum innovations requires a professional development programme that actively engages teachers in the design of new context-based units. This study considers the implementation of a teacher professional development framework aiming to investigate processes of professional development. The framework is based on Galperin's theory of the internalisation of actions and it is operationalised into a professional development programme to empower chemistry teachers for designing new context-based units. The programme consists of the teaching of an educative context-based unit, followed by the designing of an outline of a new context-based unit. Six experienced chemistry teachers participated in the instructional meetings and practical teaching in their respective classrooms. Data were obtained from meetings, classroom discussions, and observations. The findings indicated that teachers became only partially empowered for designing a new context-based chemistry unit. Moreover, the process of professional development leading to teachers' empowerment was not carried out as intended. It is concluded that the elaboration of the framework needs improvement. The implications for a new programme are discussed.

  17. 76 FR 34031 - United States Standards for Grades of Processed Raisins

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-10

    ...The Agricultural Marketing Service (AMS), of the United States Department of Agriculture (USDA) is withdrawing a notice soliciting comments on its proposed revision to the United States Standards for Grades of Processed Raisins. Based on the petitioner's request to withdraw their petition, the agency has decided not to proceed with this action.

  18. Change champions at the grassroots level: practice innovation using team process.

    PubMed

    Scott, J; Rantz, M

    1994-01-01

    A nursing administrative group recognized the critical value of staff participation in the formulation of a restructuring project and guidance throughout the project. Using a team approach, a task force of three staff nurses, two assistant nurse managers, a nurse clinician, a nursing practice specialist, and a representative from nursing administration came together. They were given responsibility for researching and setting the course for restructuring change. A unit-based team including a unit secretary, a nursing attendant, licensed practical nurse (LPN), and six staff nurses was formed from volunteers from the 40-bed medicine unit to develop that unit's plan for restructuring. The unit-based team analyzed patient care needs and staff member roles. They created a new patient care technician role as well as a nurse care coordinator role. The role of the LPN was envisioned as providing technical support. Staffing mix was also determined by the unit-based team. Both the task force and the unit-based team continue to evaluate, troubleshoot, and take every opportunity to sell their vision to solidify it further as the foundation for the future of patient care services at the hospital. The process will soon move forward to a large surgical unit.

  19. ISO 9001 in a neonatal intensive care unit (NICU).

    PubMed

    Vitner, Gad; Nadir, Erez; Feldman, Michael; Yurman, Shmuel

    2011-01-01

    The aim of this paper is to present the process for approving and certifying a neonatal intensive care unit to ISO 9001 standards. The process started with the department head's decision to improve services quality before deciding to achieve ISO 9001 certification. Department processes were mapped and quality management mechanisms were developed. Process control and performance measurements were defined and implemented to monitor the daily work. A service satisfaction review was conducted to get feedback from families. In total, 28 processes and related work instructions were defined. Process yields showed service improvements. Family satisfaction improved. The paper is based on preparing only one neonatal intensive care unit to the ISO 9001 standard. The case study should act as an incentive for hospital managers aiming to improve service quality based on the ISO 9001 standard. ISO 9001 is becoming a recommended tool to improve clinical service quality.

  20. Moon Munchies: Human Exploration Project Engineering Design Challenge--A Standards-Based Elementary School Model Unit Guide--Design, Build, and Evaluate (Lessons 1-6). Engineering By Design: Advancing Technological Literacy--A Standards-Based Program Series. EP-2007-08-92-MSFC

    ERIC Educational Resources Information Center

    Weaver, Kim M.

    2005-01-01

    In this unit, elementary students design and build a lunar plant growth chamber using the Engineering Design Process. The purpose of the unit is to help students understand and apply the design process as it relates to plant growth on the moon. This guide includes six lessons, which meet a number of national standards and benchmarks in…

  1. A Shipping Container-Based Sterile Processing Unit for Low Resources Settings

    PubMed Central

    2016-01-01

    Deficiencies in the sterile processing of medical instruments contribute to poor outcomes for patients, such as surgical site infections, longer hospital stays, and deaths. In low resources settings, such as some rural and semi-rural areas and secondary and tertiary cities of developing countries, deficiencies in sterile processing are accentuated due to the lack of access to sterilization equipment, improperly maintained and malfunctioning equipment, lack of power to operate equipment, poor protocols, and inadequate quality control over inventory. Inspired by our sterile processing fieldwork at a district hospital in Sierra Leone in 2013, we built an autonomous, shipping-container-based sterile processing unit to address these deficiencies. The sterile processing unit, dubbed “the sterile box,” is a full suite capable of handling instruments from the moment they leave the operating room to the point they are sterile and ready to be reused for the next surgery. The sterile processing unit is self-sufficient in power and water and features an intake for contaminated instruments, decontamination, sterilization via non-electric steam sterilizers, and secure inventory storage. To validate efficacy, we ran tests of decontamination and sterilization performance. Results of 61 trials validate convincingly that our sterile processing unit achieves satisfactory outcomes for decontamination and sterilization and as such holds promise to support healthcare facilities in low resources settings. PMID:27007568

  2. Conceptual design of distillation-based hybrid separation processes.

    PubMed

    Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang

    2013-01-01

    Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.

  3. Electrodialysis-based separation process for salt recovery and recycling from waste water

    DOEpatents

    Tsai, S.P.

    1997-07-08

    A method for recovering salt from a process stream containing organic contaminants is provided, comprising directing the waste stream to a desalting electrodialysis unit so as to create a concentrated and purified salt permeate and an organic contaminants-containing stream, and contacting said concentrated salt permeate to a water-splitting electrodialysis unit so as to convert the salt to its corresponding base and acid. 6 figs.

  4. Electrodialysis-based separation process for salt recovery and recycling from waste water

    DOEpatents

    Tsai, Shih-Perng

    1997-01-01

    A method for recovering salt from a process stream containing organic contaminants is provided, comprising directing the waste stream to a desalting electrodialysis unit so as to create a concentrated and purified salt permeate and an organic contaminants containing stream, and contacting said concentrated salt permeate to a water-splitting electrodialysis unit so as to convert the salt to its corresponding base and acid.

  5. Business Process Improvement Applied to Written Temporary Duty Travel Orders within the United States Air Force

    DTIC Science & Technology

    1993-12-01

    [Fragmented keyword-in-context extract. Recoverable points: neither DoD directives nor USAF regulations specify exact mandatory TDY order processing methods, though most USAF units follow a generally accepted process; TDY order processing functional experts at Hanscom, Los Angeles, and McClellan AFBs provided inputs based on their experiences; and the DFAS-initiated DTPS action to standardize TDY order processing throughout DoD was under way at the time.]

  6. Graphics processing unit based computation for NDE applications

    NASA Astrophysics Data System (ADS)

    Nahas, C. A.; Rajagopal, Prabhu; Balasubramaniam, Krishnan; Krishnamurthy, C. V.

    2012-05-01

    Advances in parallel processing in recent years are helping to reduce the cost of numerical simulation. Breakthroughs in Graphical Processing Unit (GPU) based computation now offer the prospect of further drastic improvements. The introduction of the 'compute unified device architecture' (CUDA) by NVIDIA (the global technology company based in Santa Clara, California, USA) has made programming GPUs for general-purpose computing accessible to the average programmer. Here we use CUDA to develop parallel finite difference schemes as applicable to two problems of interest to the NDE community, namely heat diffusion and elastic wave propagation. The implementations are two-dimensional. Performance improvement of the GPU implementation against a serial CPU implementation is then discussed.
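
For reference, one explicit finite-difference step of the 2D heat diffusion problem mentioned here looks like the serial sketch below; the paper's contribution is running such updates in parallel with CUDA, and the stability bound stated in the comment is a property of the explicit scheme, not a value from the paper:

```python
import numpy as np

def heat_step(u, r=0.1):
    """One explicit finite-difference update of 2D heat diffusion with
    fixed (Dirichlet) boundaries; r = alpha*dt/dx**2 must stay below
    0.25 for the explicit scheme to be stable."""
    v = u.copy()
    v[1:-1, 1:-1] = u[1:-1, 1:-1] + r * (
        u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
        - 4.0 * u[1:-1, 1:-1]
    )
    return v

# A hot spot in the middle of a cold plate diffuses outward.
u = np.zeros((5, 5))
u[2, 2] = 1.0
u = heat_step(u)
print(u[2, 2], u[1, 2])  # centre cools, four neighbours warm up
```

Because every grid point's update depends only on the previous time level, each point can be computed by an independent GPU thread, which is what makes this stencil a natural fit for CUDA.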

  7. A Real-Time High Performance Computation Architecture for Multiple Moving Target Tracking Based on Wide-Area Motion Imagery via Cloud and Graphic Processing Units

    PubMed Central

    Liu, Kui; Wei, Sixiao; Chen, Zhijiang; Jia, Bin; Chen, Genshe; Ling, Haibin; Sheaff, Carolyn; Blasch, Erik

    2017-01-01

    This paper presents the first attempt at combining Cloud with Graphic Processing Units (GPUs) in a complementary manner within the framework of a real-time high performance computation architecture for the application of detecting and tracking multiple moving targets based on Wide Area Motion Imagery (WAMI). More specifically, the GPU and Cloud Moving Target Tracking (GC-MTT) system applied a front-end web based server to perform the interaction with Hadoop and highly parallelized computation functions based on the Compute Unified Device Architecture (CUDA©). The introduced multiple moving target detection and tracking method can be extended to other applications such as pedestrian tracking, group tracking, and Patterns of Life (PoL) analysis. The cloud and GPUs based computing provides an efficient real-time target recognition and tracking approach as compared to methods when the work flow is applied using only central processing units (CPUs). The simultaneous tracking and recognition results demonstrate that a GC-MTT based approach provides drastically improved tracking with low frame rates over realistic conditions. PMID:28208684

  8. A Real-Time High Performance Computation Architecture for Multiple Moving Target Tracking Based on Wide-Area Motion Imagery via Cloud and Graphic Processing Units.

    PubMed

    Liu, Kui; Wei, Sixiao; Chen, Zhijiang; Jia, Bin; Chen, Genshe; Ling, Haibin; Sheaff, Carolyn; Blasch, Erik

    2017-02-12

    This paper presents the first attempt at combining Cloud with Graphic Processing Units (GPUs) in a complementary manner within the framework of a real-time high performance computation architecture for the application of detecting and tracking multiple moving targets based on Wide Area Motion Imagery (WAMI). More specifically, the GPU and Cloud Moving Target Tracking (GC-MTT) system applied a front-end web based server to perform the interaction with Hadoop and highly parallelized computation functions based on the Compute Unified Device Architecture (CUDA©). The introduced multiple moving target detection and tracking method can be extended to other applications such as pedestrian tracking, group tracking, and Patterns of Life (PoL) analysis. The cloud and GPUs based computing provides an efficient real-time target recognition and tracking approach as compared to methods when the work flow is applied using only central processing units (CPUs). The simultaneous tracking and recognition results demonstrate that a GC-MTT based approach provides drastically improved tracking with low frame rates over realistic conditions.

  9. Quality Assurance in American and British Higher Education: A Comparison.

    ERIC Educational Resources Information Center

    Stanley, Elizabeth C.; Patrick, William J.

    1998-01-01

    Compares quality improvement and accountability processes in the United States and United Kingdom. For the United Kingdom, looks at quality audits, institutional assessment, standards-based quality assurance, and research assessment; in the United States, looks at regional and specialized accreditation, performance indicator systems, academic…

  10. Method of up-front load balancing for local memory parallel processors

    NASA Technical Reports Server (NTRS)

    Baffes, Paul Thomas (Inventor)

    1990-01-01

    In a parallel processing computer system with multiple processing units and shared memory, a method is disclosed for uniformly balancing the aggregate computational load in, and utilizing minimal memory by, a network having identical computations to be executed at each connection therein. Read-only and read-write memory are subdivided into a plurality of process sets, which function like artificial processing units. Said plurality of process sets is iteratively merged and reduced to the number of processing units without exceeding the balance load. Said merger is based upon the value of a partition threshold, which is a measure of the memory utilization. The turnaround time and memory savings of the instant method are functions of the number of processing units available and the number of partitions into which the memory is subdivided. Typical results of the preferred embodiment yielded memory savings of from sixty to seventy five percent.
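
The iterative merge-and-reduce idea can be sketched as a greedy merge of the lightest process sets until only as many sets as processing units remain; this simplification ignores the patent's memory partition threshold and is not the claimed method itself:

```python
import heapq

def balance(loads, units):
    """Merge process sets until only `units` remain, always combining
    the two currently lightest sets; returns (load, members) pairs."""
    heap = [(w, [i]) for i, w in enumerate(loads)]
    heapq.heapify(heap)
    while len(heap) > units:
        w1, m1 = heapq.heappop(heap)
        w2, m2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, m1 + m2))
    return sorted(heap)

# Five process sets reduced onto two processing units.
print(balance([5, 3, 2, 7, 1], 2))
```

Always pairing the lightest sets keeps the merged loads as even as possible, mirroring the goal of uniformly balancing the aggregate computational load across the available processing units.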

  11. Cost calculator methods for estimating casework time in child welfare services: A promising approach for use in implementation of evidence-based practices and other service innovations.

    PubMed

    Holmes, Lisa; Landsverk, John; Ward, Harriet; Rolls-Reutz, Jennifer; Saldana, Lisa; Wulczyn, Fred; Chamberlain, Patricia

    2014-04-01

    Estimating costs in child welfare services is critical as new service models are incorporated into routine practice. This paper describes a unit costing estimation system developed in England (cost calculator) together with a pilot test of its utility in the United States where unit costs are routinely available for health services but not for child welfare services. The cost calculator approach uses a unified conceptual model that focuses on eight core child welfare processes. Comparison of these core processes in England and in four counties in the United States suggests that the underlying child welfare processes generated from England were perceived as very similar by child welfare staff in California county systems with some exceptions in the review and legal processes. Overall, the adaptation of the cost calculator for use in the United States child welfare systems appears promising. The paper also compares the cost calculator approach to the workload approach widely used in the United States and concludes that there are distinct differences between the two approaches with some possible advantages to the use of the cost calculator approach, especially in the use of this method for estimating child welfare costs in relation to the incorporation of evidence-based interventions into routine practice.

  12. Design and implementation of the monitoring system for underground coal fires in Xinjiang region, China

    NASA Astrophysics Data System (ADS)

    Li-bo, Dang; Jia-chun, Wu; Yue-xing, Liu; Yuan, Chang; Bin, Peng

    2017-04-01

    Underground coal fires (UCF) are a serious problem in the Xinjiang region of China. To deal with this problem efficiently, a UCF monitoring system based on wireless communication technology and remote sensing images was designed and implemented by the Xinjiang Coal Fire Fighting Bureau. The system consists of three parts: the data collecting unit, the data processing unit, and the data output unit. For the data collecting unit, temperature sensors and gas sensors were installed together at sites 1.5 meters below the surface of the coal fire zone. Temperature and gas readings from these sites were transferred immediately to the data processing unit. The processing unit was developed on top of GIS software. The processed data were saved on the computer in table format and can be displayed on screen as curves. A remote sensing image of each coal fire was saved in the system as the background for each monitoring site. From the monitoring data, changes in the coal fires were displayed directly, providing a solid basis for analyzing the combustion status of a coal fire, its gas emission, and the likely dominant direction of fire propagation, which supports decision-making for coal fire extinction.

  13. Plant Puzzles, An Environmental Investigation.

    ERIC Educational Resources Information Center

    National Wildlife Federation, Washington, DC.

    This environmental unit is one of a series designed for integration within an existing curriculum. The unit is self-contained and requires minimal teacher preparation. The philosophy of the units is based on an experience-oriented process that encourages self-paced independent student work. The purpose of this unit is to familiarize students with…

  14. Shadows, An Environmental Investigation.

    ERIC Educational Resources Information Center

    National Wildlife Federation, Washington, DC.

    This environmental unit is one of a series designed for integration within an existing curriculum. The units are self-contained and require minimal teacher preparation. The philosophy behind the units is based on an experience-oriented process that encourages self-paced independent work. This unit on shadows is designed for all elementary levels,…

  15. Optimized Laplacian image sharpening algorithm based on graphic processing unit

    NASA Astrophysics Data System (ADS)

    Ma, Tinghuai; Li, Lu; Ji, Sai; Wang, Xin; Tian, Yuan; Al-Dhelaan, Abdullah; Al-Rodhaan, Mznah

    2014-12-01

    In classical Laplacian image sharpening, all pixels are processed one by one, which requires a large amount of computation. Traditional Laplacian sharpening on a CPU is therefore considerably time-consuming, especially for large images. In this paper, we propose a parallel implementation of Laplacian sharpening based on the Compute Unified Device Architecture (CUDA), a computing platform for Graphic Processing Units (GPUs), and analyze the impact of image size on performance as well as the relationship between data transfer time and parallel computing time. Further, exploiting the different characteristics of the GPU memory types, an improved scheme of our method is developed that uses shared memory instead of global memory and further increases efficiency. Experimental results show that the two novel algorithms outperform the traditional sequential method based on OpenCV in computing speed.
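    A CPU reference of the operation being parallelised may help: classical Laplacian sharpening adds the Laplacian response back onto the image, amplifying edges. This NumPy sketch uses the common 4-neighbour kernel; the paper's exact kernel, scaling, and CUDA kernel layout are not specified here.

```python
import numpy as np

# 4-neighbour Laplacian kernel (one standard choice; an assumption here)
KERNEL = np.array([[0, -1,  0],
                   [-1, 4, -1],
                   [0, -1,  0]], dtype=np.float64)

def laplacian_sharpen(img):
    """sharpened = img + Laplacian(img), clipped to the uint8 range."""
    padded = np.pad(img.astype(np.float64), 1, mode="edge")
    lap = np.zeros(img.shape, dtype=np.float64)
    for dy in range(3):                       # correlate with the 3x3 kernel
        for dx in range(3):
            lap += KERNEL[dy, dx] * padded[dy:dy + img.shape[0],
                                           dx:dx + img.shape[1]]
    return np.clip(img + lap, 0, 255).astype(np.uint8)

img = np.full((8, 8), 100, dtype=np.uint8)
img[4, 4] = 200                 # a bright pixel: amplified, neighbours darkened
out = laplacian_sharpen(img)
```

    The GPU version maps each output pixel to one thread; the shared-memory variant additionally stages each image tile once per thread block instead of re-reading neighbours from global memory.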

  16. From Empiricism to Total Quality Management in Greek Education

    NASA Astrophysics Data System (ADS)

    Karavasilis, Ioannis; Samoladas, Ioannis; Nedos, Apostolos

    Nowadays the education system in Greece is moving towards democratization and decentralization. The school unit is the cell and the base of the education system. The principal's role is highly demanding, multi-dimensional, and a critical determinant of school performance and effectiveness. The paper proposes an effective organizational plan for school units in primary education based on basic administration processes and Total Quality Management. Using the theory of emotional intelligence and the Blake-Mouton grid, it emphasizes the impact of the principal's leadership on democratizing the school unit, creating a safe and secure environment and a positive school climate, and motivating the teachers' committee to participate in the decision-making process.

  17. Flat-plate solar array project: Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The process technology for the manufacture of semiconductor-grade silicon in a large commercial plant by 1986, at a price less than $14 per kilogram of silicon (in 1975 dollars), is discussed. The engineering design, installation, checkout, and operation of an Experimental Process System Development Unit are described. Quality control in scaling up the process and an economic analysis of product and production costs are also discussed.

  18. Visemic Processing in Audiovisual Discrimination of Natural Speech: A Simultaneous fMRI-EEG Study

    ERIC Educational Resources Information Center

    Dubois, Cyril; Otzenberger, Helene; Gounot, Daniel; Sock, Rudolph; Metz-Lutz, Marie-Noelle

    2012-01-01

    In a noisy environment, visual perception of articulatory movements improves natural speech intelligibility. Parallel to phonemic processing based on auditory signal, visemic processing constitutes a counterpart based on "visemes", the distinctive visual units of speech. Aiming at investigating the neural substrates of visemic processing in a…

  19. Oaks, Acorns, Climate and Squirrels, An Environmental Investigation.

    ERIC Educational Resources Information Center

    National Wildlife Federation, Washington, DC.

    This environmental unit is one of a series designed for integration within an existing curriculum. The unit is self-contained and requires minimal teacher preparation. The philosophy of the units is based on an experience-oriented process that encourages self-paced independent student work. In this particular unit, oaks and acorns are the vehicle…

  20. Restructuring a Large IT Organization: Theory, Model, Process, and Initial Results.

    ERIC Educational Resources Information Center

    Luker, Mark; And Others

    1995-01-01

    Recently the University of Wisconsin-Madison merged three existing but disparate technology-related units into a single division reporting to a chief information officer. The new division faced many challenges, beginning with the need to restructure the old units into a cohesive new organization. The restructuring process, based on structural…

  1. The Years Alone: A Reading Comprehension Unit (7-9).

    ERIC Educational Resources Information Center

    Blair-Broeker, Lynn

    Based on Bloom's Taxonomy of thought, this thematic reading comprehension unit on "loneliness" is intended for teachers of grades 7-9. The thinking process is broken into six categories: (1) recall; (2) inference; (3) application; (4) analysis; (5) synthesis; and (6) evaluation. A short description is given for each of these processes.…

  2. A portable liquid crystal-based polarized light system for the detection of organophosphorus nerve gas.

    PubMed

    He, Feng Jie; Liu, Hui Long; Chen, Long Cong; Xiong, Xing Liang

    2018-03-01

    Liquid crystal (LC)-based sensors have the advantageous properties of being fast, sensitive, and label-free, the results of which can be accessed directly only through the naked eye. However, the inherent disadvantages possessed by LC sensors, such as relying heavily on polarizing microscopes and the difficulty to quantify, have limited the possibility of field applications. Herein, we have addressed these issues by constructing a portable polarized detection system with constant temperature control. This system is mainly composed of four parts: the LC cell, the optics unit, the automatic temperature control unit, and the image processing unit. The LC cell was based on the ordering transitions of LCs in the presence of analytes. The optics unit based on the imaging principle of LCs was designed to substitute the polarizing microscope for the real-time observation. The image processing unit is expected to quantify the concentration of analytes. The results have shown that the presented system can detect dimethyl methyl phosphonate (a stimulant for organophosphorus nerve gas) within 25 s, and the limit of detection is about 10 ppb. In all, our portable system has potential in field applications.

  3. A portable liquid crystal-based polarized light system for the detection of organophosphorus nerve gas

    NASA Astrophysics Data System (ADS)

    He, Feng Jie; Liu, Hui Long; Chen, Long Cong; Xiong, Xing Liang

    2018-03-01

    Liquid crystal (LC)-based sensors have the advantageous properties of being fast, sensitive, and label-free, the results of which can be accessed directly only through the naked eye. However, the inherent disadvantages possessed by LC sensors, such as relying heavily on polarizing microscopes and the difficulty to quantify, have limited the possibility of field applications. Herein, we have addressed these issues by constructing a portable polarized detection system with constant temperature control. This system is mainly composed of four parts: the LC cell, the optics unit, the automatic temperature control unit, and the image processing unit. The LC cell was based on the ordering transitions of LCs in the presence of analytes. The optics unit based on the imaging principle of LCs was designed to substitute the polarizing microscope for the real-time observation. The image processing unit is expected to quantify the concentration of analytes. The results have shown that the presented system can detect dimethyl methyl phosphonate (a stimulant for organophosphorus nerve gas) within 25 s, and the limit of detection is about 10 ppb. In all, our portable system has potential in field applications.

  4. Object-based neglect in number processing

    PubMed Central

    2013-01-01

    Recent evidence suggests that neglect patients have particular problems representing relatively smaller numbers, corresponding to the left part of the mental number line. However, while this indicates space-based neglect for representational number space, little is known about whether and, if so, how object-based neglect influences number processing. To evaluate influences of object-based neglect in numerical cognition, a group of neglect patients and two control groups had to compare two-digit numbers to an internally represented standard. Conceptualizing two-digit numbers as objects of which the left part (i.e., the tens digit) should be specifically neglected, we were able to evaluate object-based neglect for number magnitude processing. Object-based neglect was indicated by a larger unit-decade compatibility effect, reflecting impaired processing of the leftward tens digits. Additionally, faster processing of within-decade as compared to between-decade items provided further evidence of particular difficulties in integrating tens and units into the place-value structure of the Arabic number system. In summary, the present study indicates that, in addition to the spatial representation of number magnitude, the processing of place-value information of multi-digit numbers also seems specifically impaired in neglect patients. PMID:23343126
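    The unit-decade compatibility effect mentioned above concerns whether the decade and unit comparisons of a two-digit pair point the same way. A small illustrative helper (not from the paper) makes the classification concrete: 42 vs 57 is compatible (4 < 5 and 2 < 7), while 47 vs 62 is incompatible (4 < 6 but 7 > 2).

```python
# Illustrative classification of two-digit comparison pairs; the labels
# and function name are assumptions for demonstration only.
def compatibility(a, b):
    """Classify a pair of two-digit numbers by unit-decade compatibility."""
    tens_cmp = (a // 10 > b // 10) - (a // 10 < b // 10)   # -1, 0, or 1
    units_cmp = (a % 10 > b % 10) - (a % 10 < b % 10)
    if tens_cmp == 0 or units_cmp == 0:
        return "within-decade or tie"
    return "compatible" if tens_cmp == units_cmp else "incompatible"

print(compatibility(42, 57))  # compatible
print(compatibility(47, 62))  # incompatible
```

    Incompatible pairs are normally answered more slowly; in the patients above that slowdown was reduced because the leftward tens digit itself was under-processed.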

  5. Recycle Requirements for NASA's 30 cm Xenon Ion Thruster

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Rawlin, Vincent K.

    1994-01-01

    Electrical breakdowns have been observed during ion thruster operation. These breakdowns, or arcs, can be caused by several conditions. In flight systems, the power processing unit must be designed to handle these faults autonomously. This has a strong impact on power processor requirements and must be understood fully for the power processing unit being designed for the NASA Solar Electric Propulsion Technology Application Readiness program. In this study, fault conditions were investigated using a NASA 30 cm ion thruster and a power console. Power processing unit output specifications were defined based on the breakdown phenomena identified and characterized.

  6. Brine Shrimp and Their Habitat, An Environmental Investigation.

    ERIC Educational Resources Information Center

    National Wildlife Federation, Washington, DC.

    This environmental unit is one of a series designed for integration within the existing curriculum. The unit is self-contained and students are encouraged to work at their own speed. The philosophy of the unit is based on an experience-oriented process that encourages independent student work. This unit explores the life cycle of brine shrimp and…

  7. Web-Based Honorarium Confirmation System Prototype

    NASA Astrophysics Data System (ADS)

    Wisswani, N. W.; Catur Bawa, I. G. N. B.

    2018-01-01

    Services in an academic environment can be improved by regulating the salary payment process for all employees. As a form of control to maintain financial transparency, employees should have information concerning the salary payment process. Currently, notification of committee honoraria reaches employees manually: the payment is deposited to the employee's bank account, and to learn its details the employee must go to the accounting unit for further information. Even employees who visit the accounting unit still find it difficult to obtain detailed information about the honoraria received in their accounts, because of the large amount of data to be collected and managed. To address this issue, this research designs a prototype of a web-based system for the accounting unit that provides detailed confirmation of the financial transactions credited to employee bank accounts, complementing the notifications delivered through the mobile banking system. The prototype will be developed with the Waterfall method, implemented in PHP with MySQL as the DBMS, and tested with end users.

  8. Low cost solar array project: Experimental process system development unit for producing semiconductor-grade silicon using silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The design, fabrication, and installation of an experimental process system development unit (EPSDU) were analyzed. Supporting research and development were performed to provide an information data base usable for the EPSDU and for technological design and economical analysis for potential scale-up of the process. Iterative economic analyses were conducted for the estimated product cost for the production of semiconductor grade silicon in a facility capable of producing 1000-MT/Yr.

  9. Suggestions for a competency-based orientation for an orthopaedic unit.

    PubMed

    Bryant, G A

    1997-01-01

    Effective orientation programs should provide new RN and LPN employees with very specific performance expectations. Competency-based orientation provides such a structure. This approach not only decreases the orientee's anxiety, but it also acts as a basis for establishing competencies specific to that unit. Because the existing staff members are intimately involved in the process, socialization within the unit and cohesiveness of purpose are enhanced. Adult learning theory, educational principles, self-paced learning modules, and the use of preceptors and check-off lists are employed in this Competency-Based Orientation (CBO) program for an adult orthopaedic unit. Samples of various aspects of a CBO are included.

  10. A GPU-Based Wide-Band Radio Spectrometer

    NASA Astrophysics Data System (ADS)

    Chennamangalam, Jayanth; Scott, Simon; Jones, Glenn; Chen, Hong; Ford, John; Kepley, Amanda; Lorimer, D. R.; Nie, Jun; Prestage, Richard; Roshi, D. Anish; Wagner, Mark; Werthimer, Dan

    2014-12-01

    The graphics processing unit has become an integral part of astronomical instrumentation, enabling high-performance online data reduction and accelerated online signal processing. In this paper, we describe a wide-band reconfigurable spectrometer built using an off-the-shelf graphics processing unit card. This spectrometer, when configured as a polyphase filter bank, supports a dual-polarisation bandwidth of up to 1.1 GHz (or a single-polarisation bandwidth of up to 2.2 GHz) on the latest generation of graphics processing units. On the other hand, when configured as a direct fast Fourier transform, the spectrometer supports a dual-polarisation bandwidth of up to 1.4 GHz (or a single-polarisation bandwidth of up to 2.8 GHz).
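    The core computation such a spectrometer performs, in its direct-FFT configuration, is block-wise power spectra accumulated over time. This CPU/NumPy sketch is illustrative only (the instrument runs on a GPU with a polyphase filter bank option); the channel count, sample rate, and test tone are arbitrary choices.

```python
import numpy as np

def accumulate_spectrum(samples, n_chan):
    """Average power spectra over consecutive n_chan-sample blocks."""
    n_blocks = len(samples) // n_chan
    blocks = samples[:n_blocks * n_chan].reshape(n_blocks, n_chan)
    spectra = np.abs(np.fft.fft(blocks, axis=1)) ** 2   # per-block power
    return spectra.mean(axis=0)                          # integrate in time

rng = np.random.default_rng(0)
fs, n_chan = 1024.0, 256                  # assumed sample rate and channels
t = np.arange(8 * n_chan) / fs
tone = np.sin(2 * np.pi * 128.0 * t)      # test tone at fs/8
spec = accumulate_spectrum(tone + 0.1 * rng.standard_normal(t.size), n_chan)
peak = int(np.argmax(spec[:n_chan // 2])) # brightest channel
```

    With a 1024 Hz sample rate and 256 channels, the bin spacing is 4 Hz, so the 128 Hz tone lands in channel 32. A polyphase filter bank adds a windowed pre-filter stage before the FFT to reduce spectral leakage between channels.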

  11. Modelling and simulation of a robotic work cell

    NASA Astrophysics Data System (ADS)

    Sękala, A.; Gwiazda, A.; Kost, G.; Banaś, W.

    2017-08-01

    This work concerns the design and simulation of a robotic work cell. The design of robotic cells is a process of synergistically combining components into groups, combining these groups into specific larger work units, or dividing large work units into smaller ones. Combinations or divisions are carried out according to the needs of the objectives to be realized in these units. The design process is based on an integrated approach, which allows all needed elements of the process to be taken into consideration. Each element of the design process can be an independent design agent pursuing its own objectives.

  12. Enhanced teaching and student learning through a simulator-based course in chemical unit operations design

    NASA Astrophysics Data System (ADS)

    Ghasem, Nayef

    2016-07-01

    This paper illustrates a teaching technique used in a computer applications in chemical engineering course for designing various unit operation processes, where the students learn about unit operations by designing them. The aim of the course is not to teach design, but rather to teach the fundamentals and the function of unit operation processes through simulators. A case study presenting the teaching method was evaluated using student surveys and faculty assessments designed to measure the quality and effectiveness of the teaching method. The results of the questionnaire conclusively demonstrate that this method is an extremely efficient way of teaching a simulator-based course. In addition, this teaching method can easily be generalised and used in other courses. A student's final mark is determined by a combination of in-class assessments based on cooperative and peer learning, progress tests, and a final exam. Results revealed that peer learning can improve the overall quality of student learning and enhance student understanding.

  13. 48 CFR 9904.418-60 - Illustrations.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... of which perform various functions on units of the work-in-process of multiple final cost objectives... assembly overhead cost pool. The business unit finds it impractical to use an allocation measure based on... occasionally does significant amounts of work for other activities of the business unit. The labor used in...

  14. 48 CFR 9904.418-60 - Illustrations.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... of which perform various functions on units of the work-in-process of multiple final cost objectives... assembly overhead cost pool. The business unit finds it impractical to use an allocation measure based on... occasionally does significant amounts of work for other activities of the business unit. The labor used in...

  15. 48 CFR 9904.418-60 - Illustrations.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... of which perform various functions on units of the work-in-process of multiple final cost objectives... assembly overhead cost pool. The business unit finds it impractical to use an allocation measure based on... occasionally does significant amounts of work for other activities of the business unit. The labor used in...

  16. Fish and Water Temperature, An Environmental Investigation.

    ERIC Educational Resources Information Center

    National Wildlife Federation, Washington, DC.

    This environmental unit is one of a series designed for integration within an existing curriculum. The unit is self-contained and requires minimal teacher preparation. The philosophy of this series is based on an experience-oriented process that encourages self-paced independent student work. This particular unit illustrates the interrelationship…

  17. Soil, An Environmental Investigation.

    ERIC Educational Resources Information Center

    National Wildlife Federation, Washington, DC.

    This environmental unit is one of a series designed for integration within an existing curriculum. The unit is self-contained and requires minimal teacher preparation. The philosophy of the series is based on an experience-oriented process that encourages self-paced independent student work. This particular unit investigates soil in relation to…

  18. General Purpose Graphics Processing Unit Based High-Rate Rice Decompression and Reed-Solomon Decoding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loughry, Thomas A.

    As the volume of data acquired by space-based sensors increases, mission data compression/decompression and forward error correction code processing performance must likewise scale. This competency development effort explored using the General Purpose Graphics Processing Unit (GPGPU) to accomplish high-rate Rice Decompression and high-rate Reed-Solomon (RS) decoding at the satellite mission ground station. Each algorithm was implemented and benchmarked on a single GPGPU. Distributed processing across one to four GPGPUs was also investigated. The results show that the GPGPU has considerable potential for performing satellite communication Data Signal Processing, with three times or better performance improvements and up to ten times reduction in cost over custom hardware, at least in the case of Rice Decompression and Reed-Solomon Decoding.
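    For orientation, Rice coding represents each value as a unary quotient followed by a k-bit remainder. The toy round-trip below is not the OSTI implementation: k is fixed here, whereas operational (CCSDS-style) Rice coding adapts k per block and adds preprocessing.

```python
# Toy Rice coder/decoder over a list of bits; fixed k is an assumption.
def rice_encode(values, k):
    bits = []
    for v in values:
        q, r = v >> k, v & ((1 << k) - 1)
        bits += [1] * q + [0]                                 # unary quotient
        bits += [(r >> i) & 1 for i in range(k - 1, -1, -1)]  # k remainder bits
    return bits

def rice_decode(bits, k, count):
    out, pos = [], 0
    for _ in range(count):
        q = 0
        while bits[pos] == 1:          # count 1s: the unary quotient
            q += 1
            pos += 1
        pos += 1                       # skip the terminating 0
        r = 0
        for _ in range(k):             # read the k-bit remainder
            r = (r << 1) | bits[pos]
            pos += 1
        out.append((q << k) | r)
    return out

data = [3, 17, 0, 9]
assert rice_decode(rice_encode(data, 3), 3, 4) == data
```

    The decoder's bit-serial structure is what makes high-rate decompression a good fit for massively parallel hardware: many independent coded blocks can be decoded concurrently, one per thread.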

  19. Graphics processing unit accelerated intensity-based optical coherence tomography angiography using differential frames with real-time motion correction.

    PubMed

    Watanabe, Yuuki; Takahashi, Yuhei; Numazawa, Hiroshi

    2014-02-01

    We demonstrate intensity-based optical coherence tomography (OCT) angiography using the squared difference of two sequential frames with bulk-tissue-motion (BTM) correction. This motion correction was performed by minimization of the sum of the pixel values using axial- and lateral-pixel-shifted structural OCT images. We extract the BTM-corrected image from a total of 25 calculated OCT angiographic images. Image processing was accelerated by a graphics processing unit (GPU) with many stream processors to optimize the parallel processing procedure. The GPU processing rate was faster than that of a line scan camera (46.9 kHz). Our OCT system provides the means of displaying structural OCT images and BTM-corrected OCT angiographic images in real time.
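    The bulk-tissue-motion correction described above can be sketched as a small shift search: squared frame differences are computed over a grid of axial and lateral pixel shifts, and the shift minimising the residual sum is kept. This NumPy illustration is an assumption-level sketch, not the authors' GPU code (which evaluates 25 shifted candidates in parallel).

```python
import numpy as np

def btm_corrected_angiogram(f1, f2, max_shift=2):
    """Squared-difference angiogram using the shift that minimises the residual."""
    best, best_img = None, None
    for dy in range(-max_shift, max_shift + 1):       # axial pixel shifts
        for dx in range(-max_shift, max_shift + 1):   # lateral pixel shifts
            shifted = np.roll(np.roll(f2, dy, axis=0), dx, axis=1)
            diff = (f1.astype(np.float64) - shifted) ** 2
            s = diff.sum()
            if best is None or s < best:
                best, best_img = s, diff
    return best_img

rng = np.random.default_rng(1)
frame = rng.random((32, 32))
moved = np.roll(frame, 1, axis=0)     # simulate 1-pixel bulk axial motion
angio = btm_corrected_angiogram(frame, moved)
```

    With pure bulk motion and no flow, the best-shift difference image is (near) zero; real flow signal survives the correction because it decorrelates between frames regardless of shift.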

  20. Numerical Study on Wake Flow Field Characteristic of the Base-Bleed Unit under Fast Depressurization Process

    NASA Astrophysics Data System (ADS)

    Xue, Xiaochun; Yu, Yonggang

    2017-04-01

    Numerical analyses have been performed to study the influence of fast depressurization on the wake flow field of the base-bleed unit (BBU) with secondary combustion when the base-bleed projectile is propelled out of the muzzle. Two-dimensional axisymmetric Navier-Stokes equations for a multi-component chemically reactive system are solved by a Fortran program to calculate the coupling of the internal and wake flow fields, taking into account the combustion of the base-bleed propellant and the secondary combustion effect. Based on comparison with experiments, the unsteady variation mechanism and secondary combustion characteristics of the wake flow field under fast depressurization are obtained numerically. The results show that in the fast depressurization process, the base pressure of the BBU varies most strongly in the first 0.9 ms, the variation then decreases gradually, and after 1.5 ms it remains basically stable. The pressure and temperature of the base-bleed combustion chamber decrease and then recover. Moreover, after the pressure and temperature reach their lowest point, external gases flow back into the base-bleed combustion chamber. Also, with decreasing initial pressure, the unsteady process becomes shorter and the temperature gradient in the base-bleed combustion chamber declines, which benefits the combustion of the base-bleed propellant.

  1. Enhanced Teaching and Student Learning through a Simulator-Based Course in Chemical Unit Operations Design

    ERIC Educational Resources Information Center

    Ghasem, Nayef

    2016-01-01

    This paper illustrates a teaching technique used in computer applications in chemical engineering employed for designing various unit operation processes, where the students learn about unit operations by designing them. The aim of the course is not to teach design, but rather to teach the fundamentals and the function of unit operation processes…

  2. A numerical investigation of the scale-up effects on flow, heat transfer, and kinetics processes of FCC units.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, S. L.

    1998-08-25

    Fluid Catalytic Cracking (FCC) technology is the most important process used by the refinery industry to convert crude oil to valuable lighter products such as gasoline. Process development is generally very time consuming, especially when a small pilot unit is being scaled up to a large commercial unit, because of the lack of information to aid in the design of scaled-up units. Such information can now be obtained by analysis based on pilot-scale measurements and computer simulation that includes the controlling physics of the FCC system. A computational fluid dynamics (CFD) code, ICRKFLO, has been developed at Argonne National Laboratory (ANL) and has been successfully applied to the simulation of catalytic petroleum cracking risers. It employs hybrid hydrodynamic-chemical kinetic coupling techniques, enabling the analysis of an FCC unit with complex chemical reaction sets containing tens or hundreds of subspecies. The code has been continuously validated against pilot-scale experimental data. It is now being used to investigate the effects of scaled-up FCC units. Among FCC operating conditions, the feed injection conditions are found to have a strong impact on the product yields of scaled-up FCC units. The feed injection conditions appear to affect flow and heat transfer patterns, and the interaction of hydrodynamics and cracking kinetics causes the product yields to change accordingly.

  3. Chemical interaction matrix between reagents in a Purex based process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brahman, R.K.; Hennessy, W.P.; Paviet-Hartmann, P.

    2008-07-01

    The United States Department of Energy (DOE) is the responsible entity for the disposal of the United States excess weapons grade plutonium. DOE selected a PUREX-based process to convert plutonium to low-enriched mixed oxide fuel for use in commercial nuclear power plants. To initiate this process in the United States, a Mixed Oxide (MOX) Fuel Fabrication Facility (MFFF) is under construction and will be operated by Shaw AREVA MOX Services at the Savannah River Site. This facility will be licensed and regulated by the U.S. Nuclear Regulatory Commission (NRC). A PUREX process, similar to the one used at La Hague, France, will purify plutonium feedstock through solvent extraction. MFFF employs two major process operations to manufacture MOX fuel assemblies: (1) the Aqueous Polishing (AP) process to remove gallium and other impurities from plutonium feedstock and (2) the MOX fuel fabrication process (MP), which processes the oxides into pellets and manufactures the MOX fuel assemblies. The AP process consists of three major steps, dissolution, purification, and conversion, and is the center of the primary chemical processing. A study of process hazards controls has been initiated that will provide knowledge and protection against the chemical risks associated with mixing of reagents over the lifetime of the process. This paper presents a comprehensive chemical interaction matrix evaluation for the reagents used in the PUREX-based process. The chemical interaction matrix supplements the process conditions by providing a checklist of any potential inadvertent chemical reactions that may take place. It also identifies the chemical compatibility/incompatibility of the reagents if mixed by failure of operations or equipment within the process itself or mixed inadvertently by a technician in the laboratories.

  4. Low cost solar array project: Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The results of the free space reactor experimental work are summarized. Overall, the objectives were achieved and the unit can be confidently scaled to the EPSDU size based on the experimental work and supporting theoretical analyses. The piping and instrumentation of the fluidized bed reactor was completed.

  5. Microchannel Distillation of JP-8 Jet Fuel for Sulfur Content Reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Feng; Stenkamp, Victoria S.; TeGrotenhuis, Ward E.

    2006-09-16

    In microchannel based distillation processes, thin vapor and liquid films are contacted in small channels where mass transfer is diffusion-limited. The microchannel architecture enables improvements in distillation processes. A shorter height equivalent of a theoretical plate (HETP) and therefore a more compact distillation unit can be achieved. A microchannel distillation unit was used to produce a light fraction of JP-8 fuel with reduced sulfur content for use as feed to produce fuel-cell grade hydrogen. The HETP of the microchannel unit is discussed, as well as the effects of process conditions such as feed temperature, flow rate, and reflux ratio.
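    The HETP figure of merit mentioned above relates column length to separation performance: HETP = L / N, where N is the number of theoretical plates, so a shorter HETP yields a more compact unit for the same number of stages. The numbers below are illustrative only, not values from the study.

```python
# HETP = column length / number of theoretical plates (illustrative values).
def hetp(column_length_cm, theoretical_plates):
    """Height equivalent of a theoretical plate, in cm per stage."""
    return column_length_cm / theoretical_plates

conventional = hetp(300.0, 10)   # 30 cm per stage
microchannel = hetp(30.0, 10)    # 3 cm per stage: 10x more compact column
```

    Equivalently, a fixed column length with a smaller HETP delivers more theoretical stages, i.e., a sharper cut between the light and heavy fractions.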

  6. Plants in the Classroom, An Environmental Investigation.

    ERIC Educational Resources Information Center

    National Wildlife Federation, Washington, DC.

    This environmental unit is one of a series designed for integration within the existing curriculum. The unit is self-contained and requires minimal teacher preparation. The philosophy of this series is based on an experience-oriented process that encourages self-paced independent student work. This particular unit, designed for the primary grades,…

  7. Establishment of an unrelated umbilical cord blood bank qualification program: ensuring quality while meeting Food and Drug Administration vendor qualification requirements.

    PubMed

    Rabe, Fran; Kadidlo, Diane; Van Orsow, Lisa; McKenna, David

    2013-10-01

    Qualification of a cord blood bank (CBB) is a complex process that includes evaluation of multiple aspects of donor screening and testing, processing, accreditation and approval by professional cell therapy groups, and results of received cord blood units. The University of Minnesota Medical Center Cell Therapy Laboratory has established a CBB vendor qualification process to ensure the CBB meets established regulatory and quality requirements. The deployed qualification of CBBs is based on retrospective and prospective review of the CBB. Forty-one CBBs were evaluated retrospectively: seven CBBs were disqualified based on failed quality control (QC) results. Eight CBBs did not meet the criteria for retrospective qualification because fewer than 3 cord blood units were received and the CBB was not accredited. As of March 2012, three US and one non-US CBBs have been qualified prospectively. One CBB withdrew from the qualification process after successful completion of the comprehensive survey and subsequent failure of the provided QC unit to pass the minimum criteria. One CBB failed the prospective qualification process based on processing methods that were revealed during the paper portion of the evaluation. A CBB qualification process is necessary for a transplant center to manage the qualification of the large number of CBBs needed to support an umbilical cord blood transplantation program. A transplant center that has utilized cord blood for a number of years before implementation of a qualification process should use a retrospective qualification process along with a prospective process. © 2013 American Association of Blood Banks.

  8. Speaking the right language: the scientific method as a framework for a continuous quality improvement program within academic medical research compliance units.

    PubMed

    Nolte, Kurt B; Stewart, Douglas M; O'Hair, Kevin C; Gannon, William L; Briggs, Michael S; Barron, A Marie; Pointer, Judy; Larson, Richard S

    2008-10-01

    The authors developed a novel continuous quality improvement (CQI) process for academic biomedical research compliance administration. A challenge in developing a quality improvement program in a nonbusiness environment is that the terminology and processes are often foreign. Rather than training staff in an existing quality improvement process, the authors opted to develop a novel process based on the scientific method--a paradigm familiar to all team members. The CQI process included our research compliance units. Unit leaders identified problems in compliance administration where a resolution would have a positive impact and which could be resolved or improved with current resources. They then generated testable hypotheses about a change to standard practice expected to improve the problem, and they developed methods and metrics to assess the impact of the change. The CQI process was managed in a "peer review" environment. The program included processes to reduce the incidence of infections in animal colonies, decrease research protocol-approval times, improve compliance and protection of animal and human research subjects, and improve research protocol quality. This novel CQI approach is well suited to the needs and the unique processes of research compliance administration. Using the scientific method as the improvement paradigm fostered acceptance of the project by unit leaders and facilitated the development of specific improvement projects. These quality initiatives will allow us to improve support for investigators while ensuring that compliance standards continue to be met. We believe that our CQI process can readily be used in other academically based offices of research.

  9. GUIDING PRINCIPLES FOR GOOD PRACTICES IN HOSPITAL-BASED HEALTH TECHNOLOGY ASSESSMENT UNITS.

    PubMed

    Sampietro-Colom, Laura; Lach, Krzysztof; Pasternack, Iris; Wasserfallen, Jean-Blaise; Cicchetti, Americo; Marchetti, Marco; Kidholm, Kristian; Arentz-Hansen, Helene; Rosenmöller, Magdalene; Wild, Claudia; Kahveci, Rabia; Ulst, Margus

    2015-01-01

    Health technology assessment (HTA) carried out for policy decision making has well-established principles, unlike hospital-based HTA (HB-HTA), which differs from the former in its context characteristics and ways of operation. This study proposes principles for good practices in HB-HTA units. A framework for good practice criteria was built inspired by the EFQM excellence business model and information from six literature reviews, 107 face-to-face interviews, forty case studies, a large-scale survey, a focus group, and a Delphi survey, as well as local and international validation. In total, 385 people from twenty countries have participated in defining the principles for good practices in HB-HTA units. Fifteen guiding principles for good practices in HB-HTA units are grouped in four dimensions. Dimension 1 deals with principles of the assessment process aimed at providing contextualized information for hospital decision makers. Dimension 2 describes leadership, strategy and partnerships of HB-HTA units which govern and facilitate the assessment process. Dimension 3 focuses on adequate resources that ensure the operation of HB-HTA units. Dimension 4 deals with measuring the short- and long-term impact of the overall performance of HB-HTA units. Finally, nine core guiding principles were selected as essential requirements for HB-HTA units based on the expertise of the HB-HTA units participating in the project. Guiding principles for good practices set up a benchmark for HB-HTA because they represent the ideal performance of HB-HTA units; nevertheless, when performing HTA at hospital level, context also matters; therefore, they should be adapted to ensure their applicability in the local context.

  10. Use of a graphics processing unit (GPU) to facilitate real-time 3D graphic presentation of the patient skin-dose distribution during fluoroscopic interventional procedures

    PubMed Central

    Rana, Vijay; Rudin, Stephen; Bednarek, Daniel R.

    2012-01-01

    We have developed a dose-tracking system (DTS) that calculates the radiation dose to the patient’s skin in real-time by acquiring exposure parameters and imaging-system-geometry from the digital bus on a Toshiba Infinix C-arm unit. The cumulative dose values are then displayed as a color map on an OpenGL-based 3D graphic of the patient for immediate feedback to the interventionalist. Determination of those elements on the surface of the patient 3D-graphic that intersect the beam and calculation of the dose for these elements in real time demands fast computation. Reducing the size of the elements results in more computation load on the computer processor and therefore a tradeoff occurs between the resolution of the patient graphic and the real-time performance of the DTS. The speed of the DTS for calculating dose to the skin is limited by the central processing unit (CPU) and can be improved by using the parallel processing power of a graphics processing unit (GPU). Here, we compare the performance speed of GPU-based DTS software to that of the current CPU-based software as a function of the resolution of the patient graphics. Results show a tremendous improvement in speed using the GPU. While an increase in the spatial resolution of the patient graphics resulted in slowing down the computational speed of the DTS on the CPU, the speed of the GPU-based DTS was hardly affected. This GPU-based DTS can be a powerful tool for providing accurate, real-time feedback about patient skin-dose to physicians while performing interventional procedures. PMID:24027616
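
    The per-element beam-intersection and dose computation described above is embarrassingly parallel, which is why a GPU helps. A vectorized NumPy sketch of the idea (the conical-beam geometry and inverse-square dose model are simplifications for illustration, not the DTS implementation):

```python
import numpy as np

def accumulate_dose(points, dose_map, source, direction, half_angle, frame_dose):
    """Add dose to surface elements inside a conical beam (illustrative model).

    points:    (N, 3) surface-element positions
    dose_map:  (N,) cumulative dose per element, updated in place
    direction: unit vector along the beam axis
    """
    v = points - source                        # rays from focal spot to each element
    dist = np.linalg.norm(v, axis=1)
    cosang = (v @ direction) / dist            # cosine of angle to the beam axis
    in_beam = cosang > np.cos(half_angle)      # elements intersected by the beam
    # inverse-square falloff from the source (simplified dose model)
    dose_map[in_beam] += frame_dose / dist[in_beam] ** 2
    return in_beam

# toy example: 10 elements along the beam axis, beam pointing in +z
pts = np.column_stack([np.zeros(10), np.zeros(10), np.linspace(1.0, 2.0, 10)])
dose = np.zeros(10)
hit = accumulate_dose(pts, dose, source=np.array([0.0, 0.0, 0.0]),
                      direction=np.array([0.0, 0.0, 1.0]),
                      half_angle=np.radians(5), frame_dose=1.0)
```

    All the work is element-wise array arithmetic, so each surface element maps naturally to one GPU thread; finer patient graphics only widen the arrays.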

  11. Use of a graphics processing unit (GPU) to facilitate real-time 3D graphic presentation of the patient skin-dose distribution during fluoroscopic interventional procedures.

    PubMed

    Rana, Vijay; Rudin, Stephen; Bednarek, Daniel R

    2012-02-23

    We have developed a dose-tracking system (DTS) that calculates the radiation dose to the patient's skin in real-time by acquiring exposure parameters and imaging-system-geometry from the digital bus on a Toshiba Infinix C-arm unit. The cumulative dose values are then displayed as a color map on an OpenGL-based 3D graphic of the patient for immediate feedback to the interventionalist. Determination of those elements on the surface of the patient 3D-graphic that intersect the beam and calculation of the dose for these elements in real time demands fast computation. Reducing the size of the elements results in more computation load on the computer processor and therefore a tradeoff occurs between the resolution of the patient graphic and the real-time performance of the DTS. The speed of the DTS for calculating dose to the skin is limited by the central processing unit (CPU) and can be improved by using the parallel processing power of a graphics processing unit (GPU). Here, we compare the performance speed of GPU-based DTS software to that of the current CPU-based software as a function of the resolution of the patient graphics. Results show a tremendous improvement in speed using the GPU. While an increase in the spatial resolution of the patient graphics resulted in slowing down the computational speed of the DTS on the CPU, the speed of the GPU-based DTS was hardly affected. This GPU-based DTS can be a powerful tool for providing accurate, real-time feedback about patient skin-dose to physicians while performing interventional procedures.

  12. Incremental terrain processing for large digital elevation models

    NASA Astrophysics Data System (ADS)

    Ye, Z.

    2012-12-01

    Incremental terrain processing for large digital elevation models Zichuan Ye, Dean Djokic, Lori Armstrong Esri, 380 New York Street, Redlands, CA 92373, USA (E-mail: zye@esri.com, ddjokic@esri.com , larmstrong@esri.com) Efficient analyses of large digital elevation models (DEM) require generation of additional DEM artifacts such as flow direction, flow accumulation and other DEM derivatives. When the DEMs to analyze have a large number of grid cells (usually > 1,000,000,000) the generation of these DEM derivatives is either impractical (it takes too long) or impossible (software is incapable of processing such a large number of cells). Different strategies and algorithms can be put in place to alleviate this situation. This paper describes an approach where the overall DEM is partitioned in smaller processing units that can be efficiently processed. The processed DEM derivatives for each partition can then be either mosaicked back into a single large entity or managed on partition level. For dendritic terrain morphologies, the way in which partitions are to be derived and the order in which they are to be processed depend on the river and catchment patterns. These patterns are not available until flow pattern of the whole region is created, which in turn cannot be established upfront due to the size issues. This paper describes a procedure that solves this problem: (1) Resample the original large DEM grid so that the total number of cells is reduced to a level for which the drainage pattern can be established. (2) Run standard terrain preprocessing operations on the resampled DEM to generate the river and catchment system. (3) Define the processing units and their processing order based on the river and catchment system created in step (2). (4) Based on the processing order, apply the analysis, i.e., flow accumulation operation to each of the processing units, at the full resolution DEM. 
(5) As each processing unit is processed based on the processing order defined in (3), compare the resulting drainage pattern with the drainage pattern established at the coarser scale and adjust the drainage boundaries and rivers if necessary.
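
    The processing order defined in steps (3)-(4) amounts to visiting catchments in upstream-to-downstream (topological) order, passing accumulated flow across partition outlets. A minimal sketch on a toy catchment graph (the network and flow values are invented for illustration):

```python
from collections import defaultdict, deque

def processing_order(downstream):
    """Order catchments so each unit is processed after all its upstream units.

    downstream: dict mapping catchment -> downstream catchment (None at outlet)
    """
    indeg = defaultdict(int)
    for c, d in downstream.items():
        indeg.setdefault(c, 0)
        if d is not None:
            indeg[d] += 1                      # count upstream contributors
    order = []
    ready = deque(c for c, n in indeg.items() if n == 0)  # headwaters first
    while ready:
        c = ready.popleft()
        order.append(c)
        d = downstream[c]
        if d is not None:
            indeg[d] -= 1
            if indeg[d] == 0:
                ready.append(d)
    return order

def accumulate(downstream, local_flow):
    """Flow accumulation per catchment, processed in the topological order."""
    total = dict(local_flow)
    for c in processing_order(downstream):
        d = downstream[c]
        if d is not None:
            total[d] += total[c]              # pass accumulated flow downstream
    return total

# headwaters A and B drain into C, which drains into outlet D
net = {"A": "C", "B": "C", "C": "D", "D": None}
flows = accumulate(net, {"A": 1.0, "B": 2.0, "C": 0.5, "D": 0.25})
```

    In the paper's setting each node would be a full-resolution DEM partition rather than a scalar, but the ordering logic is the same.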

  13. Effect of Cord Blood Processing on Transplant Outcomes after Single Myeloablative Umbilical Cord Blood Transplantation

    PubMed Central

    Ballen, Karen K.; Logan, Brent R.; Laughlin, Mary J.; He, Wensheng; Ambruso, Daniel R.; Armitage, Susan E.; Beddard, Rachel L.; Bhatla, Deepika; Hwang, William Y.K.; Kiss, Joseph E.; Koegler, Gesine; Kurtzberg, Joanne; Nagler, Arnon; Oh, David; Petz, Lawrence D.; Price, Thomas H.; Quinones, Ralph R.; Ratanatharathorn, Voravit; Rizzo, J. Douglas; Sazama, Kathleen; Scaradavou, Andromachi; Schuster, Michael W.; Sender, Leonard S.; Shpall, Elizabeth J.; Spellman, Stephen R.; Sutton, Millicent; Weitekamp, Lee Ann; Wingard, John R.; Eapen, Mary

    2015-01-01

    Variations in cord blood manufacturing and administration are common, and the optimal practice is not known. We compared processing and banking practices at 16 public cord blood banks (CBB) in the United States, and assessed transplant outcomes on 530 single umbilical cord blood (UCB) myeloablative transplantations for hematologic malignancies, facilitated by these banks. UCB banking practices were separated into three mutually exclusive groups based on whether processing was automated or manual and whether units were plasma and red blood cell reduced (buffy coat production method) or plasma reduced. Compared to the automated processing system for units, the day-28 neutrophil recovery was significantly lower after transplantation of units that were manually processed and plasma reduced (red cell replete) (odds ratio [OR] 0.19, p=0.001) or plasma and red cell reduced (OR 0.54, p=0.05). Day-100 survival did not differ by CBB. However, day-100 survival was better with units that were thawed with the dextran-albumin wash method compared to the “no wash” or “dilution only” techniques (OR 1.82, p=0.04). In conclusion, CBB processing has no significant effect on early (day 100) survival despite differences in kinetics of neutrophil recovery. PMID:25543094

  14. A System for the Individualization and Optimization of Learning Through Computer Management of the Educational Process. Final Report.

    ERIC Educational Resources Information Center

    Schure, Alexander

    A computer-based system model for the monitoring and management of the instructional process was conceived, developed and refined through the techniques of systems analysis. This report describes the various aspects and components of this project in a series of independent and self-contained units. The first unit provides an overview of the entire…

  15. Computer-Based Training Starter Kit.

    ERIC Educational Resources Information Center

    Federal Interagency Group for Computer-Based Training, Washington, DC.

    Intended for use by training professionals with little or no background in the application of automated data processing (ADP) systems, processes, or procurement requirements, this reference manual provides guidelines for establishing a computer based training (CBT) program within a federal agency of the United States government. The manual covers:…

  16. Development of a web-based learning medium on mechanism of labour for nursing students.

    PubMed

    Gerdprasert, Sailom; Pruksacheva, Tassanee; Panijpan, Bhinyo; Ruenwongsa, Pintip

    2010-07-01

    This study aimed to develop a web-based learning medium on the process and mechanism of labour for third-year university nursing and midwifery students. The medium was developed by integrating principles of the mechanism of labour with the 5Es inquiry cycle and interactive features of information technology. In this study, the web-based learning unit was used to supplement the conventional lecture used in traditional teaching. Students' achievements were assessed using pre- and post-tests of factual knowledge and semi-structured interviews on attitudes toward the unit. Supplementation with this learning unit made learning significantly more effective than the traditional lecture alone. The students also showed a positive attitude toward the learning unit. Copyright 2009 Elsevier Ltd. All rights reserved.

  17. Creating a Knowledge-Based Economy in the United Arab Emirates: Realising the Unfulfilled Potential of Women in the Science, Technology and Engineering Fields

    ERIC Educational Resources Information Center

    Aswad, Noor Ghazal; Vidican, Georgeta; Samulewicz, Diana

    2011-01-01

    As the United Arab Emirates (UAE) moves towards a knowledge-based economy, maximising the participation of the national workforce, especially women, in the transformation process is crucial. Using survey methods and semi-structured interviews, this paper examines the factors that influence women's decisions regarding their degree programme and…

  18. Reducing Operating Costs by Optimizing Space in Facilities

    DTIC Science & Technology

    2012-03-01

    Base level 5 engineering units will provide facility floor plans, furniture layouts, and staffing documentation as necessary. One obstacle...due to the quantity and diverse locations. Base level engineering units provided facility floor plans, furniture layouts, and staffing documentation... furniture purchases and placement 5. Follow a quality systematic process in all decisions The per person costs can be better understood with a real

  19. Plasma Processing of Model Residential Solid Waste

    NASA Astrophysics Data System (ADS)

    Messerle, V. E.; Mossé, A. L.; Nikonchuk, A. N.; Ustimenko, A. B.; Baimuldin, R. V.

    2017-09-01

    The authors have tested a technology for processing model residential solid waste. They developed a pilot plasma unit based on a plasma chamber incinerator. The waste processing technology has been tested and prepared for commercialization.

  20. Concept of Smart Cyberspace for Smart Grid Implementation

    NASA Astrophysics Data System (ADS)

    Zhukovskiy, Y.; Malov, D.

    2018-05-01

    The concept of Smart Cyberspace for Smart Grid (SG) implementation is presented in the paper. Three classifications are proposed: of electromechanical units based on the amount of data to be analysed; of electromechanical units based on data processing speed; and of the organization of computational networks based on required resources. The combination of these classifications is formalized and can be used in the organization and planning of SG.

  1. mizuRoute version 1: A river network routing tool for a continental domain water resources applications

    USGS Publications Warehouse

    Mizukami, Naoki; Clark, Martyn P.; Sampson, Kevin; Nijssen, Bart; Mao, Yixin; McMillan, Hilary; Viger, Roland; Markstrom, Steven; Hay, Lauren E.; Woods, Ross; Arnold, Jeffrey R.; Brekke, Levi D.

    2016-01-01

    This paper describes the first version of a stand-alone runoff routing tool, mizuRoute. The mizuRoute tool post-processes runoff outputs from any distributed hydrologic model or land surface model to produce spatially distributed streamflow at various spatial scales from headwater basins to continental-wide river systems. The tool can utilize both traditional grid-based river network and vector-based river network data. Both types of river network include river segment lines and the associated drainage basin polygons, but the vector-based river network can represent finer-scale river lines than the grid-based network. Streamflow estimates at any desired location in the river network can be easily extracted from the output of mizuRoute. The routing process is simulated as two separate steps. First, hillslope routing is performed with a gamma-distribution-based unit-hydrograph to transport runoff from a hillslope to a catchment outlet. The second step is river channel routing, which is performed with one of two routing scheme options: (1) a kinematic wave tracking (KWT) routing procedure; and (2) an impulse response function – unit-hydrograph (IRF-UH) routing procedure. The mizuRoute tool also includes scripts (python, NetCDF operators) to pre-process spatial river network data. This paper demonstrates mizuRoute's capabilities to produce spatially distributed streamflow simulations based on river networks from the United States Geological Survey (USGS) Geospatial Fabric (GF) data set in which over 54 000 river segments and their contributing areas are mapped across the contiguous United States (CONUS). A brief analysis of model parameter sensitivity is also provided. The mizuRoute tool can assist model-based water resources assessments including studies of the impacts of climate change on streamflow.
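
    The hillslope routing step described above is a convolution of runoff with a gamma-distribution unit hydrograph. A minimal sketch (the shape and scale values are illustrative, not mizuRoute defaults):

```python
import math

def gamma_unit_hydrograph(shape, scale, n_steps, dt=1.0):
    """Discrete gamma-distribution unit hydrograph, normalized to sum to 1."""
    t = [(i + 0.5) * dt for i in range(n_steps)]          # bin mid-points
    pdf = [x ** (shape - 1) * math.exp(-x / scale) /
           (math.gamma(shape) * scale ** shape) for x in t]
    s = sum(pdf)
    return [p / s for p in pdf]                           # normalize (mass balance)

def route_hillslope(runoff, uh):
    """Convolve a runoff series with the unit hydrograph (flow at the outlet)."""
    q = [0.0] * (len(runoff) + len(uh) - 1)
    for i, r in enumerate(runoff):
        for j, w in enumerate(uh):
            q[i + j] += r * w
    return q

uh = gamma_unit_hydrograph(shape=2.5, scale=1.0, n_steps=10)
q = route_hillslope([0.0, 5.0, 0.0, 0.0], uh)  # a single runoff pulse
```

    Because the hydrograph is normalized, routing conserves mass: the outlet series sums to the total runoff, only delayed and attenuated.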

  2. A practical approach for calculating the settlement and storage capacity of landfills based on the space and time discretization of the landfilling process.

    PubMed

    Gao, Wu; Xu, Wenjie; Bian, Xuecheng; Chen, Yunmin

    2017-11-01

    The settlement of any position of the municipal solid waste (MSW) body during the landfilling process and after its closure has effects on the integrity of the internal structure and storage capacity of the landfill. This paper proposes a practical approach for calculating the settlement and storage capacity of landfills based on the space and time discretization of the landfilling process. The MSW body in the landfill was divided into independent column units, and the filling process of each column unit was determined by a simplified complete landfilling process. The settlement of a position in the landfill was calculated with the compression of each MSW layer in every column unit. Then, the simultaneous settlement of all the column units was integrated to obtain the settlement of the landfill and storage capacity of all the column units; this allowed to obtain the storage capacity of the landfill based on the layer-wise summation method. When the compression of each MSW layer was calculated, the effects of the fluctuation of the main leachate level and variation in the unit weight of the MSW on the overburdened effective stress were taken into consideration by introducing the main leachate level's proportion and the unit weight and buried depth curve. This approach is especially significant for MSW with a high kitchen waste content and landfills in developing countries. The stress-biodegradation compression model was used to calculate the compression of each MSW layer. A software program, Settlement and Storage Capacity Calculation System for Landfills, was developed by integrating the space and time discretization of the landfilling process and the settlement and storage capacity algorithms. The landfilling process of the phase IV of Shanghai Laogang Landfill was simulated using this software. 
The error between the calculated and measured maximum geometric volume of the landfill is only 2.02%, and the error between the calculated and measured accumulated filling weight is less than 5%. These results show that the approach calculates settlement and storage capacity satisfactorily and reliably. In addition, the development of the elevation lines in the landfill sections created with the software demonstrates that the optimization of the design of the structures should be based on the settlement of the landfill. Since this practical approach can reasonably calculate the storage capacity of landfills and efficiently track the development of settlement at each landfilling stage, it can be used for the optimization of landfilling schemes and structural designs. Copyright © 2017 Elsevier Ltd. All rights reserved.
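
    The layer-wise summation underlying the approach can be sketched for a single column unit: settlement is the summed compression of the layers, each loaded by the overburden stress at its mid-depth. Here a linear stress-strain model stands in for the paper's stress-biodegradation compression model, and all values are invented:

```python
def column_settlement(layer_thickness, unit_weight, compressibility):
    """Settlement of one MSW column unit via layer-wise summation.

    Each layer compresses in proportion to the overburden stress at its
    mid-depth (linear model as a stand-in for stress-biodegradation).
    Layers are listed top to bottom; units: m, kN/m3, 1/kPa.
    """
    settlement = 0.0
    overburden = 0.0                               # stress from layers above, kPa
    for h, gamma, mv in zip(layer_thickness, unit_weight, compressibility):
        stress_mid = overburden + gamma * h / 2.0  # stress at layer mid-depth
        settlement += mv * stress_mid * h          # compression of this layer
        overburden += gamma * h                    # add this layer's full weight
    return settlement

# three 5 m layers, unit weight 10 kN/m3, compressibility 1e-3 1/kPa
s = column_settlement([5.0, 5.0, 5.0], [10.0, 10.0, 10.0], [1e-3, 1e-3, 1e-3])
```

    Summing this per-column calculation over all column units, stage by stage, gives the landfill-wide settlement and storage capacity that the software computes.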

  3. The Research and Test of Fast Radio Burst Real-time Search Algorithm Based on GPU Acceleration

    NASA Astrophysics Data System (ADS)

    Wang, J.; Chen, M. Z.; Pei, X.; Wang, Z. Q.

    2017-03-01

    In order to satisfy the research needs of the Nanshan 25 m radio telescope of Xinjiang Astronomical Observatory (XAO) and study key technology for the planned QiTai radio Telescope (QTT), the receiver group of XAO developed a GPU (Graphics Processing Unit) based real-time FRB search algorithm from the original CPU (Central Processing Unit) based search algorithm, and built an FRB real-time search system. Comparison of the GPU and CPU systems shows that, while maintaining search accuracy, the GPU-accelerated algorithm is 35-45 times faster than the CPU algorithm.
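
    The computational core of such a search is incoherent dedispersion: each frequency channel is shifted by the cold-plasma dispersion delay before summing, repeated over many trial dispersion measures (DMs). A minimal NumPy sketch (band, sampling, and DM values are illustrative; a GPU version parallelizes over channels and DM trials):

```python
import numpy as np

def dedisperse(data, freqs_mhz, dm, dt_s):
    """Sum channels after removing the cold-plasma dispersion delay.

    data: (n_chan, n_time) filterbank block; freqs_mhz: channel centres.
    Delay relative to the top of the band: 4.149e3 * DM * (f^-2 - f_top^-2) s.
    """
    f_top = freqs_mhz.max()
    delays = 4.149e3 * dm * (freqs_mhz ** -2.0 - f_top ** -2.0)
    shifts = np.round(delays / dt_s).astype(int)
    out = np.zeros(data.shape[1])
    for chan, s in zip(data, shifts):
        out += np.roll(chan, -s)          # align each channel to the band top
    return out

# synthetic dispersed pulse across 64 channels
rng = np.random.default_rng(0)
freqs = np.linspace(1200.0, 1500.0, 64)
dt = 1e-3
data = rng.normal(0, 0.1, (64, 512))
true_dm = 100.0
lag = 4.149e3 * true_dm * (freqs ** -2.0 - freqs.max() ** -2.0)
for i in range(64):
    data[i, 100 + int(round(lag[i] / dt))] += 5.0   # inject the pulse
profile = dedisperse(data, freqs, true_dm, dt)       # pulse re-aligns at sample 100
```

    Each channel shift-and-add is independent, which is what makes the 35-45x GPU speedup plausible.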

  4. Speedup for quantum optimal control from automatic differentiation based on graphics processing units

    NASA Astrophysics Data System (ADS)

    Leung, Nelson; Abdelhafez, Mohamed; Koch, Jens; Schuster, David

    2017-04-01

    We implement a quantum optimal control algorithm based on automatic differentiation and harness the acceleration afforded by graphics processing units (GPUs). Automatic differentiation allows us to specify advanced optimization criteria and incorporate them in the optimization process with ease. We show that the use of GPUs can speedup calculations by more than an order of magnitude. Our strategy facilitates efficient numerical simulations on affordable desktop computers and exploration of a host of optimization constraints and system parameters relevant to real-life experiments. We demonstrate optimization of quantum evolution based on fine-grained evaluation of performance at each intermediate time step, thus enabling more intricate control on the evolution path, suppression of departures from the truncated model subspace, as well as minimization of the physical time needed to perform high-fidelity state preparation and unitary gates.
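
    The optimization loop described (piecewise-constant controls, fidelity evaluated by propagating the state, gradients used to update every time slice) can be sketched for a single qubit. Finite-difference gradients stand in here for automatic differentiation, and the Hamiltonian, target state, and hyperparameters are illustrative:

```python
import numpy as np

SX = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli X (control axis)
SZ = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli Z (drift)

def step(u, dt, delta=1.0):
    """Propagator exp(-i*(delta*SZ + u*SX)*dt) for one piecewise-constant slice."""
    H = delta * SZ + u * SX
    w = np.sqrt(delta ** 2 + u ** 2)   # eigenfrequency of the traceless H
    return np.cos(w * dt) * np.eye(2) - 1j * np.sin(w * dt) * H / w

def infidelity(controls, dt=0.1):
    """1 - |<1|psi(T)>|^2 for a |0> -> |1> state transfer."""
    psi = np.array([1.0, 0.0], dtype=complex)
    for u in controls:
        psi = step(u, dt) @ psi
    return 1.0 - abs(psi[1]) ** 2

def optimize(n_slices=20, iters=150, lr=0.5, eps=1e-6):
    """Gradient descent on the per-slice control amplitudes; finite differences
    stand in for the automatic differentiation used in the paper."""
    u = np.full(n_slices, 0.3)
    for _ in range(iters):
        base = infidelity(u)
        grad = np.empty(n_slices)
        for k in range(n_slices):
            up = u.copy()
            up[k] += eps
            grad[k] = (infidelity(up) - base) / eps
        u -= lr * grad
    return u, infidelity(u)

controls, err = optimize()
```

    Automatic differentiation replaces the inner finite-difference loop with one backward pass, which is where the GPU speedup and the flexibility in cost functions come from.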

  5. Optimizing Web-Based Instruction: A Case Study Using Poultry Processing Unit Operations

    ERIC Educational Resources Information Center

    O' Bryan, Corliss A.; Crandall, Philip G.; Shores-Ellis, Katrina; Johnson, Donald M.; Ricke, Steven C.; Marcy, John

    2009-01-01

    Food companies and supporting industries need inexpensive, revisable training methods for large numbers of hourly employees due to continuing improvements in Hazard Analysis Critical Control Point (HACCP) programs, new processing equipment, and high employee turnover. HACCP-based food safety programs have demonstrated their value by reducing the…

  6. An electron transporting unit linked multifunctional Ir(III) complex: a promising strategy to improve the performance of solution-processed phosphorescent organic light-emitting diodes.

    PubMed

    Giridhar, Thota; Saravanan, Chinnusamy; Cho, Woosum; Park, Young Geun; Lee, Jin Yong; Jin, Sung-Ho

    2014-04-18

    An oxadiazole-based electron transporting (ET) unit was glued to a heteroleptic Ir(III) complex (TPQIr-ET) and used as a dopant for phosphorescent organic light-emitting diodes (PhOLEDs). It shows superior device performance compared with the dopant lacking the ET unit (TPQIr), owing to the balanced charge carrier injection provided by the ET unit.

  7. The componential processing of fractions in adults and children: effects of stimuli variability and contextual interference

    PubMed Central

    Gabriel, Florence C.; Szücs, Dénes

    2014-01-01

    Recent studies have indicated that people have a strong tendency to compare fractions based on constituent numerators or denominators. This is called componential processing. This study explored whether componential processing was preferred in tasks involving high stimuli variability and high contextual interference, when fractions could be compared based either on the holistic values of fractions or on their denominators. Here, stimuli variability referred to the fact that fractions were not monotonous but diversiform. Contextual interference referred to the fact that the processing of fractions was interfered with by other stimuli. To these ends, three tasks were used. In Task 1, participants compared a standard fraction 1/5 to unit fractions. This task was used as a low stimuli variability and low contextual interference task. In Task 2, stimuli variability was increased by mixing unit and non-unit fractions. In Task 3, high contextual interference was created by incorporating decimals into fractions. The RT results showed that the processing patterns of fractions were very similar for adults and children. In Task 1 and Task 3, only componential processing was utilized. In contrast, both holistic processing and componential processing were utilized in Task 2. These results suggest that, if individuals are presented with the opportunity to perform componential processing, both adults and children will tend to do so, even if they are faced with high variability of fractions or high contextual interference. PMID:25249995

  8. The componential processing of fractions in adults and children: effects of stimuli variability and contextual interference.

    PubMed

    Zhang, Li; Fang, Qiaochu; Gabriel, Florence C; Szücs, Dénes

    2014-01-01

    Recent studies have indicated that people have a strong tendency to compare fractions based on constituent numerators or denominators. This is called componential processing. This study explored whether componential processing was preferred in tasks involving high stimuli variability and high contextual interference, when fractions could be compared based either on the holistic values of fractions or on their denominators. Here, stimuli variability referred to the fact that fractions were not monotonous but diversiform. Contextual interference referred to the fact that the processing of fractions was interfered with by other stimuli. To these ends, three tasks were used. In Task 1, participants compared a standard fraction 1/5 to unit fractions. This task was used as a low stimuli variability and low contextual interference task. In Task 2, stimuli variability was increased by mixing unit and non-unit fractions. In Task 3, high contextual interference was created by incorporating decimals into fractions. The RT results showed that the processing patterns of fractions were very similar for adults and children. In Task 1 and Task 3, only componential processing was utilized. In contrast, both holistic processing and componential processing were utilized in Task 2. These results suggest that, if individuals are presented with the opportunity to perform componential processing, both adults and children will tend to do so, even if they are faced with high variability of fractions or high contextual interference.
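
    The componential-versus-holistic contrast in these tasks can be made concrete: for unit fractions, comparing denominators alone reproduces the holistic (magnitude) ordering, while for non-unit fractions the shortcut can fail. A small sketch (the stimuli are invented):

```python
from fractions import Fraction

def holistic_compare(f1, f2):
    """Compare by true magnitude: +1, 0, or -1."""
    return (f1 > f2) - (f1 < f2)

def componential_compare(f1, f2):
    """Denominator-only (componential) strategy: the fraction with the
    smaller denominator is taken to be the larger one."""
    return (f2.denominator > f1.denominator) - (f2.denominator < f1.denominator)

# unit-fraction pairs, as in Task 1: the shortcut gives the right answer
pairs = [(Fraction(1, 5), Fraction(1, 7)), (Fraction(1, 9), Fraction(1, 2))]
agree = all(holistic_compare(a, b) == componential_compare(a, b) for a, b in pairs)

# a non-unit pair, as in Task 2: the shortcut mis-orders 3/4 vs 1/2
mismatch = (holistic_compare(Fraction(3, 4), Fraction(1, 2))
            != componential_compare(Fraction(3, 4), Fraction(1, 2)))
```
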

  9. Risk-based process safety assessment and control measures design for offshore process facilities.

    PubMed

    Khan, Faisal I; Sadiq, Rehan; Husain, Tahir

    2002-09-02

    Process operation is the most hazardous activity next to the transportation and drilling operation on an offshore oil and gas (OOG) platform. Past experiences of onshore and offshore oil and gas activities have revealed that a small mishap in the process operation might escalate into a catastrophe. This is of special concern on an OOG platform due to the limited space and compact geometry of the process area, poor ventilation, and difficult escape routes. On an OOG platform, each extra control measure implemented not only occupies space on the platform and increases congestion but also adds extra load to the platform. Eventualities in the OOG platform process operation can be avoided by incorporating appropriate control measures at the early design stage. In this paper, the authors describe a methodology for risk-based process safety decision making for OOG activities. The methodology is applied to various offshore process units, that is, the compressor, separators, flash drum and driers of an OOG platform. Based on the risk potential, appropriate safety measures are designed for each unit. This paper also illustrates that implementation of the designed safety measures reduces high fatal accident rate (FAR) values to an acceptable level.
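
    The FAR metric used for the acceptability judgment is conventionally defined as expected fatalities per 10^8 person-hours of exposure; a minimal sketch (the input values are invented, not taken from the study):

```python
def fatal_accident_rate(expected_fatalities_per_year, persons_exposed,
                        hours_per_person_year):
    """FAR: expected fatalities per 10**8 person-hours of exposure."""
    exposure_hours = persons_exposed * hours_per_person_year
    return expected_fatalities_per_year / exposure_hours * 1e8

# e.g. 50 workers exposed year-round on the platform
far = fatal_accident_rate(0.002, 50, 8760)
```

    A control measure that halves the expected fatality frequency halves the FAR, which is how designed safety measures are credited against an acceptability threshold.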

  10. Accreditation of specialized asthma units for adults in Spain: an applicable experience for the management of difficult-to-control asthma

    PubMed Central

    Cisneros, Carolina; Díaz-Campos, Rocío Magdalena; Marina, Núria; Melero, Carlos; Padilla, Alicia; Pascual, Silvia; Pinedo, Celia; Trisán, Andrea

    2017-01-01

    This paper, developed by consensus of staff physicians of accredited asthma units for the management of severe asthma, presents information on the process and requirements for already-existing asthma units to achieve official accreditation by the Spanish Society of Pneumology and Thoracic Surgery (SEPAR). Three levels of specialized asthma care have been established based on available resources, which include specialized units for highly complex asthma, specialized asthma units, and basic asthma units. Regardless of the level of accreditation obtained, the distinction of “excellence” could be granted when more requirements in the areas of provision of care, technical and human resources, training in asthma, and teaching and research activities were met at each level. The Spanish experience in the process of accreditation of specialized asthma units, particularly for the care of patients with difficult-to-control asthma, may be applicable to other health care settings. PMID:28533690

  11. Cost unit accounting based on a clinical pathway: a practical tool for DRG implementation.

    PubMed

    Feyrer, R; Rösch, J; Weyand, M; Kunzmann, U

    2005-10-01

    Setting up a reliable cost unit accounting system in a hospital is a fundamental necessity for economic survival, given the current general conditions in the healthcare system. Definition of a suitable cost unit is a crucial factor for success. We present here the development and use of a clinical pathway as a cost unit as an alternative to the DRG. Elective coronary artery bypass grafting was selected as an example. Development of the clinical pathway was conducted according to a modular concept that mirrored all the treatment processes across various levels and modules. Using service records and analyses, the process algorithms of the clinical pathway were developed and visualized with Corel iGrafx Process 2003. A detailed process cost record constituted the basis of the pathway costing, in which financial evaluation of the treatment processes was performed. The result of this study was a structured clinical pathway for coronary artery bypass grafting together with a cost calculation in the form of cost unit accounting. The use of a clinical pathway as a cost unit offers considerable advantages compared to the DRG or clinical case. The variance in the diagnoses and procedures within a pathway is minimal, so the consumption of resources is homogeneous. This leads to a considerable improvement in the value of cost unit accounting as a strategic control instrument in hospitals.
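
    Cost unit accounting over a modular pathway reduces to summing the costed process modules attached to one pathway instance; a minimal sketch (the module names and costs are invented, not taken from the study):

```python
def pathway_cost(modules, cost_per_module):
    """Total cost of one pathway instance plus its per-module breakdown.

    modules: ordered module names making up the clinical pathway
    cost_per_module: dict name -> cost from the process cost record
    """
    breakdown = {m: cost_per_module[m] for m in modules}
    return sum(breakdown.values()), breakdown

# hypothetical modular pathway for elective CABG, costs in EUR
cabg = ["admission", "diagnostics", "surgery", "icu", "ward", "discharge"]
costs = {"admission": 120.0, "diagnostics": 850.0, "surgery": 7200.0,
         "icu": 2600.0, "ward": 1400.0, "discharge": 90.0}
total, detail = pathway_cost(cabg, costs)
```

    Because every case on the pathway traverses the same modules, the per-case totals are homogeneous, which is the study's argument for the pathway over the DRG as cost unit.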

  12. General purpose graphic processing unit implementation of adaptive pulse compression algorithms

    NASA Astrophysics Data System (ADS)

    Cai, Jingxiao; Zhang, Yan

    2017-07-01

    This study introduces a practical approach to implement real-time signal processing algorithms for general surveillance radar based on NVIDIA graphics processing units (GPUs). The pulse compression algorithms are implemented using compute unified device architecture (CUDA) libraries such as CUDA basic linear algebra subroutines and the CUDA fast Fourier transform library, which are adopted from open source libraries and optimized for NVIDIA GPUs. For more advanced, adaptive processing algorithms such as adaptive pulse compression, customized kernel optimization is needed and investigated. A statistical optimization approach is developed for this purpose without requiring detailed knowledge of the physical configurations of the kernels. The kernel optimization approach was found to significantly improve performance. Benchmark performance is compared with CPU performance in terms of processing acceleration. The proposed implementation framework can be used in various radar systems including ground-based phased array radar, airborne sense and avoid radar, and aerospace surveillance radar.
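    The core pulse-compression step described above is a matched filter evaluated in the frequency domain, which is exactly what cuFFT accelerates on the GPU. A minimal CPU sketch with NumPy follows (a GPU port would swap in a CUDA array library; the chirp parameters and target delay are illustrative, not taken from the paper):

```python
import numpy as np

def pulse_compress(rx, tx):
    """Frequency-domain matched filter: cross-correlate the received
    signal with the transmitted pulse via FFTs (the step cuFFT would
    accelerate on a GPU)."""
    n = len(rx) + len(tx) - 1
    nfft = 1 << (n - 1).bit_length()          # next power of two
    RX = np.fft.fft(rx, nfft)
    TX = np.fft.fft(tx, nfft)
    y = np.fft.ifft(RX * np.conj(TX))         # correlation theorem
    return np.abs(y[:n])

# Linear-FM (chirp) pulse echoed by a single point target:
t = np.linspace(0, 1, 128, endpoint=False)
tx = np.exp(1j * np.pi * 40 * t**2)           # chirp waveform
rx = np.zeros(512, dtype=complex)
rx[200:200 + len(tx)] = tx                    # echo delayed by 200 samples
out = pulse_compress(rx, tx)
print(out.argmax())                           # compressed peak at lag 200
```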

  13. How Differences Between Manager and Clinician Perceptions of Safety Culture Impact Hospital Processes of Care.

    PubMed

    Richter, Jason; Mazurenko, Olena; Kazley, Abby Swanson; Ford, Eric W

    2017-11-04

    Evidence-based processes of care improve patient outcomes, yet universal compliance is lacking, and perceptions of the quality of care are highly variable. The purpose of this study is to examine how differences in clinician and management perceptions of teamwork and communication relate to adherence to hospital processes of care. Hospitals submitted identifiable data for the 2012 Hospital Survey on Patient Safety Culture and the Centers for Medicare and Medicaid Services' Hospital Compare. The dependent variable was a composite developed from the scores on adherence to acute myocardial infarction, heart failure, and pneumonia process-of-care measures. The primary independent variables reflected 4 safety culture domains: communication openness, feedback about errors, teamwork within units, and teamwork between units. We assigned each hospital to one of 4 groups based on agreement between managers and clinicians on each domain. Each hospital was categorized as "high" (above the median) or "low" (below) for clinicians and managers in communication and teamwork. We found a positive relationship between perceived teamwork and communication climate and process-of-care measures. If managers and clinicians perceived communication openness as high, the hospital was more likely to adhere to processes of care. Similarly, if clinicians perceived teamwork across units as high, the hospital was more likely to adhere to processes of care. Manager and staff perceptions about teamwork and communication impact adherence to processes of care. Policies should recognize the importance of the perceptions of both clinicians and managers on teamwork and communication and seek to improve organizational climate and practices. Clinician perceptions of teamwork across units are more closely linked to processes of care, so managers should be cognizant of these perceptions and try to improve them.

  14. Trinary arithmetic and logic unit (TALU) using savart plate and spatial light modulator (SLM) suitable for optical computation in multivalued logic

    NASA Astrophysics Data System (ADS)

    Ghosh, Amal K.; Bhattacharya, Animesh; Raul, Moumita; Basuray, Amitabha

    2012-07-01

    The arithmetic logic unit (ALU) is the most important unit in any computing system. Optical computing is becoming increasingly popular because of its ultrahigh processing speed and huge data-handling capability. Fast processing therefore calls for an optical TALU compatible with multivalued logic. In this regard, we present a trinary arithmetic and logic unit (TALU) in the modified trinary number (MTN) system, which is suitable for optical computation and other applications in multivalued logic systems. Here, savart plate and spatial light modulator (SLM) based optoelectronic circuits have been used to exploit the optical tree architecture (OTA) in an optical interconnection network.
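    For context, one common digit encoding used for carry-free trinary arithmetic is balanced ternary with the digit set {-1, 0, 1}; the abstract does not specify whether the MTN system matches this exactly, so the sketch below is a generic illustration rather than the paper's encoding:

```python
def to_balanced_ternary(n):
    """Decimal -> balanced-ternary digits (most significant first),
    using the digit set {-1, 0, 1}."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:              # 2 = 3 - 1: emit -1 and carry one trit
            digits.append(-1)
            n = n // 3 + 1
        else:
            digits.append(r)
            n //= 3
    return digits[::-1]

def from_balanced_ternary(digits):
    val = 0
    for d in digits:
        val = val * 3 + d
    return val

print(to_balanced_ternary(5))     # [1, -1, -1]: 9 - 3 - 1 = 5
print(from_balanced_ternary(to_balanced_ternary(-7)))   # -7
```

    Balanced ternary represents negative numbers without a sign bit, which is one reason multivalued optical logic schemes favor trinary digit sets.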

  15. Synthesis, Properties, Calculations and Applications of Small Molecular Host Materials Containing Oxadiazole Units with Different Nitrogen and Oxygen Atom Orientations for Solution-Processable Blue Phosphorescent OLEDs

    NASA Astrophysics Data System (ADS)

    Ye, Hua; Wu, Hongyu; Chen, Liangyuan; Ma, Songhua; Zhou, Kaifeng; Yan, Guobing; Shen, Jiazhong; Chen, Dongcheng; Su, Shi-Jian

    2018-03-01

    A series of new small molecules based on the symmetric electron acceptor 1,3,4-oxadiazole or its asymmetric isomer 1,2,4-oxadiazole were successfully synthesized and applied to solution-processable blue phosphorescent organic light-emitting diodes for the first time, and their thermal, photophysical, and electrochemical properties and density functional theory calculations were studied thoroughly. Owing to the high triplet energy levels (ET, 2.82-2.85 eV), energy transfer from the phosphorescent emitter iridium(III) bis[(4,6-difluorophenyl)-pyridinate-N,C2']picolinate (FIrpic) to the host molecules could be effectively suppressed, ensuring that device emission originated solely from FIrpic. In comparison with the para-mode conjugation of the five-membered 1,3,4-oxadiazole substitution in 134OXD, the meta-linkages of the 1,2,4-isomer appended with two phenyl rings result in a lower degree of conjugation, weaker electron delocalization, and lower electron-withdrawing ability for the other 1,2,4-oxadiazole-based materials. Notably, the solution-processed device based on 134OXD containing 1,3,4-oxadiazole units, without extra vacuum thermal-deposited hole/exciton-blocking and electron-transporting layers, showed the highest maximum current efficiency (CEmax) of 8.75 cd/A due to the excellent charge-transporting ability of 134OXD, far surpassing similar devices based on the other host materials containing 1,2,4-oxadiazole units. Moreover, the device based on 134OXD presented small efficiency roll-off, with a current efficiency (CE) of 6.26 cd/A at high brightness up to 100 cd/m2. This work demonstrates that different nitrogen and oxygen atom orientations of the oxadiazole-based host materials have a major impact on the optoelectronic characteristics of the solution-processable devices.

  16. An investigation of collisions between fiber positioning units in LAMOST

    NASA Astrophysics Data System (ADS)

    Liu, Xiao-Jie; Wang, Gang

    2016-04-01

    The arrangement of fiber positioning units in the LAMOST focal plane may lead to collisions during the fiber allocation process. To avoid these collisions, a software-based protection system has to abandon some targets located in the overlapping field of adjacent fiber units. In this paper, we first analyze the probability of collisions between fibers and infer their likely causes. Solving the problem of collisions among fiber positioning units is useful for improving the efficiency of LAMOST. Based on this analysis, a collision handling system is designed using a master-slave control structure between the micro control unit and the microcomputer. Simulated experiments validate that the system can provide real-time inspection and exchange information between the fiber unit controllers and the main controller.

  17. 40 CFR 63.1104 - Process vents from continuous unit operations: applicability assessment procedures and methods.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... permit limit applicable to the process vent. (iv) Design analysis based on accepted chemical engineering... rates, halogenated process vent determinations, process vent TRE index values, and engineering... corrected to 2.3 percent moisture; or (2) The engineering assessment procedures in paragraph (k) of this...

  18. 40 CFR 63.1104 - Process vents from continuous unit operations: applicability assessment procedures and methods.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... permit limit applicable to the process vent. (iv) Design analysis based on accepted chemical engineering... rates, halogenated process vent determinations, process vent TRE index values, and engineering... corrected to 2.3 percent moisture; or (2) The engineering assessment procedures in paragraph (k) of this...

  19. 40 CFR 63.1104 - Process vents from continuous unit operations: applicability assessment procedures and methods.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... permit limit applicable to the process vent. (iv) Design analysis based on accepted chemical engineering... rates, halogenated process vent determinations, process vent TRE index values, and engineering... corrected to 2.3 percent moisture; or (2) The engineering assessment procedures in paragraph (k) of this...

  20. 40 CFR 63.1104 - Process vents from continuous unit operations: applicability assessment procedures and methods.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... permit limit applicable to the process vent. (iv) Design analysis based on accepted chemical engineering... rates, halogenated process vent determinations, process vent TRE index values, and engineering... corrected to 2.3 percent moisture; or (2) The engineering assessment procedures in paragraph (k) of this...

  1. INDIVIDUALIZING UNIVERSITY INSTRUCTION, EXPLORING COMPUTER POTENTIAL TO AID COLLEGE TEACHERS BY DIRECTING THE LEARNING PROCESS. INTER-UNIVERSITY PROJECT ONE, PUBLICATIONS SERIES.

    ERIC Educational Resources Information Center

    FALL, CHARLES R.

    THIS DOCUMENT CONCLUDES THAT INSTRUCTION BY COMPUTER-BASED RESOURCE UNITS CAN FACILITATE LEARNING AND PROVIDE THE INSTRUCTOR WITH VALUABLE ASSISTANCE. BY PRE-PLANNING THE TEACHING-LEARNING SITUATION, RESOURCE UNITS CAN FREE THE INSTRUCTOR FOR DECISION-MAKING TASKS. RESOURCE UNITS CAN ALSO PROVIDE APPROPRIATE LEARNING GOALS AND STUDY GUIDES TO EACH…

  2. Information processing in an urban fire department communication system.

    PubMed

    Siegel, J; Weitzman, D O

    1975-09-01

    One of the most important functions of any fire department is to provide selective contact with fire fighting units and to dispatch these units based on information gathered from street alarm boxes and telephones. This paper is concerned with the problem of dispatching tactical response information to remote fire fighting units and with the effect of workload on the dispatch function.

  3. Quantification of the resist dissolution process: an in situ analysis using high speed atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Santillan, Julius Joseph; Shichiri, Motoharu; Itani, Toshiro

    2016-03-01

    This work focuses on the application of a high speed atomic force microscope (HS-AFM) for the in situ visualization and quantification of the resist dissolution process. This technique, as reported in the past, has provided useful pointers on the formation of resist patterns during dissolution. This paper discusses an investigation into the quantification of what we refer to as the "dissolution unit size," or the basic units of patterning material dissolution. This was done through an originally developed analysis method that extracts the difference between two succeeding temporal states of the material film surface (images) to indicate the amount of change occurring in the material film over a specific span of time. Preliminary experiments with actual patterning materials were done using a positive-tone EUV model resist composed only of polyhydroxystyrene (PHS)-based polymer with a molecular weight of 2,500 and a polydispersity index of 1.2. In the absence of a protecting group, the material was utilized at a 50 nm film thickness with a post-application bake of 90°C/60 s. The resulting film is soluble in the alkali-based developer even without exposure. Results have shown that the dissolution components (dissolution unit sizes) of the PHS-based material are not of fixed size. Instead, it was found that aside from one constantly dissolving unit size, another, much larger dissolution unit size trend also occurs during material dissolution. The presence of this larger dissolution unit size suggests the occurrence of "polymer clustering". Such polymer clustering was not significantly present during the initial stages of dissolution (near the original film surface) but becomes persistently obvious after the dissolution process reaches a certain film thickness below the initial surface.
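    The analysis method described, differencing two succeeding AFM frames to measure the material removed per time interval, can be sketched as follows (toy height maps; `dissolution_increments` is a hypothetical function name, not from the paper):

```python
import numpy as np

def dissolution_increments(frames):
    """Difference successive AFM height maps; height lost (negative
    change) is summed as the amount dissolved in each time interval."""
    deltas = []
    for prev, curr in zip(frames, frames[1:]):
        diff = curr - prev
        deltas.append(-diff[diff < 0].sum())   # total height lost, e.g. nm*px
    return deltas

# Toy stack of three 4x4 height maps with stepwise material loss:
f0 = np.full((4, 4), 50.0)
f1 = f0.copy(); f1[:2, :] -= 5.0               # 8 pixels lose 5 nm each
f2 = f1.copy(); f2 -= 1.0                      # uniform 1 nm loss everywhere
print(dissolution_increments([f0, f1, f2]))    # [40.0, 16.0]
```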

  4. Advancing perinatal patient safety through application of safety science principles using health IT.

    PubMed

    Webb, Jennifer; Sorensen, Asta; Sommerness, Samantha; Lasater, Beth; Mistry, Kamila; Kahwati, Leila

    2017-12-19

    The use of health information technology (IT) has been shown to promote patient safety in Labor and Delivery (L&D) units. The use of health IT to apply safety science principles (e.g., standardization) to L&D unit processes may further advance perinatal safety. Semi-structured interviews were conducted with L&D units participating in the Agency for Healthcare Research and Quality's (AHRQ's) Safety Program for Perinatal Care (SPPC) to assess units' experience with program implementation. Analysis of interview transcripts was used to characterize the process and experience of using health IT for applying safety science principles to L&D unit processes. Forty-six L&D units from 10 states completed participation in SPPC program implementation; thirty-two (70%) reported the use of health IT as an enabling strategy for their local implementation. Health IT was used to improve standardization of processes, use of independent checks, and to facilitate learning from defects. L&D units standardized care processes through use of electronic health record (EHR)-based order sets and use of smart pumps and other technology to improve medication safety. Units also standardized EHR documentation, particularly related to electronic fetal monitoring (EFM) and shoulder dystocia. Cognitive aids and tools were integrated into EHR and care workflows to create independent checks such as checklists, risk assessments, and communication handoff tools. Units also used data from EHRs to monitor processes of care to learn from defects. Units experienced several challenges incorporating health IT, including obtaining organization approval, working with their busy IT departments, and retrieving standardized data from health IT systems. Use of health IT played an integral part in the planning and implementation of SPPC for participating L&D units. 
Use of health IT is an encouraging approach for incorporating safety science principles into care to improve perinatal safety and should be incorporated into materials to facilitate the implementation of perinatal safety initiatives.

  5. Computational Investigations of Multiword Chunks in Language Learning.

    PubMed

    McCauley, Stewart M; Christiansen, Morten H

    2017-07-01

    Second-language learners rarely arrive at native proficiency in a number of linguistic domains, including morphological and syntactic processing. Previous approaches to understanding the different outcomes of first- versus second-language learning have focused on cognitive and neural factors. In contrast, we explore the possibility that children and adults may rely on different linguistic units throughout the course of language learning, with specific focus on the granularity of those units. Following recent psycholinguistic evidence for the role of multiword chunks in online language processing, we explore the hypothesis that children rely more heavily on multiword units in language learning than do adults learning a second language. To this end, we take an initial step toward using large-scale, corpus-based computational modeling as a tool for exploring the granularity of speakers' linguistic units. Employing a computational model of language learning, the Chunk-Based Learner, we compare the usefulness of chunk-based knowledge in accounting for the speech of second-language learners versus children and adults speaking their first language. Our findings suggest that while multiword units are likely to play a role in second-language learning, adults may learn less useful chunks, rely on them to a lesser extent, and arrive at them through different means than children learning a first language. Copyright © 2017 Cognitive Science Society, Inc.
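    The chunking idea can be illustrated with a much-simplified sketch: group adjacent words into a multiword unit whenever the forward transitional probability between them is high. The actual Chunk-Based Learner is more elaborate; the corpus and threshold here are purely illustrative:

```python
from collections import Counter

def chunk_utterances(corpus, threshold=0.5):
    """Greedy multiword chunking: merge adjacent words into one chunk
    whenever the forward transitional probability P(w2 | w1) exceeds
    the threshold, estimated from simple bigram/unigram counts."""
    unigrams, bigrams = Counter(), Counter()
    for utt in corpus:
        unigrams.update(utt)
        bigrams.update(zip(utt, utt[1:]))
    chunked = []
    for utt in corpus:
        chunk, out = [utt[0]], []
        for w1, w2 in zip(utt, utt[1:]):
            if bigrams[w1, w2] / unigrams[w1] > threshold:
                chunk.append(w2)               # high TP: extend the chunk
            else:
                out.append(" ".join(chunk))    # low TP: close the chunk
                chunk = [w2]
        out.append(" ".join(chunk))
        chunked.append(out)
    return chunked

corpus = [["i", "want", "to", "go"],
          ["i", "want", "to", "play"],
          ["you", "want", "it"]]
print(chunk_utterances(corpus)[0])    # ['i want to', 'go']
```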

  6. Potential of Using Solar Energy for Drinking Water Treatment Plant

    NASA Astrophysics Data System (ADS)

    Bukhary, S. S.; Batista, J.; Ahmad, S.

    2016-12-01

    Whereas water is essential to energy generation, energy usage is integral to the life cycle processes of water extraction, treatment, distribution, and disposal. Increasing population, climate change, and greenhouse gas production challenge the water industry to conserve energy across its various water-related operations and to limit the associated carbon emissions. One way to accomplish this is by incorporating renewable energy into the water sector. Treatment of drinking water, an important part of the water life cycle, is vital for the health of any community. This study explores the feasibility of using solar energy for a drinking water treatment plant (DWTP) with the long-term goal of energy independence and sustainability. A 10 MGD groundwater DWTP in the southwestern US was selected, using the treatment processes of coagulation, filtration, and chlorination. Energy consumption, in units of kWh/day and kWh/MG, was determined separately for each unit process using industry-accepted design criteria. Associated carbon emissions were evaluated in units of CO2 eq/MG. Based on the energy consumption and the existing real estate holdings, the DWTP was sized for distributed solar. Results showed that, overall, the motors used to operate the pumps, including the groundwater intake pumps, were the largest consumers of energy. Enough land was available around the DWTP to deploy distributed solar. Results also showed that solar photovoltaics could potentially meet the energy demands of the selected DWTP but would require a large storage capacity, and thus increased costs. Carbon emissions for the solar-based design were negligible compared to the original case. In the future, this study can be used to analyze unit processes of other DWTPs based on energy consumption, as well as to incorporate sustainability into DWTP design.
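    The per-unit-process accounting the study performs reduces to simple bookkeeping in kWh/MG plus an emission factor. A sketch with hypothetical intensity values (the paper's actual design-criteria numbers are not reproduced here, and the emission factor is assumed):

```python
# Hypothetical per-process energy intensities in kWh per million gallons
# (kWh/MG); the paper's actual design-criteria values are not reproduced.
unit_processes = {
    "groundwater pumping": 900.0,
    "coagulation":          30.0,
    "filtration":          120.0,
    "chlorination":         20.0,
}
flow_mgd = 10.0                    # 10 MGD plant, as in the abstract
grid_kgco2_per_kwh = 0.4           # assumed grid emission factor

total_kwh_per_mg = sum(unit_processes.values())
daily_kwh = total_kwh_per_mg * flow_mgd
co2_per_mg = total_kwh_per_mg * grid_kgco2_per_kwh
print(f"{total_kwh_per_mg:.0f} kWh/MG, {daily_kwh:.0f} kWh/day, "
      f"{co2_per_mg:.0f} kg CO2-eq/MG")
```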

  7. U.S. Global Defense Posture, 1783-2011

    DTIC Science & Technology

    2012-01-01

    in global affairs to further the nation's interests. Second, as a consequence of its victory in the Spanish-American War of 1898, the United States...planning process. Despite their importance, the Pentagon needs to ensure that its global defense posture is developed from a top-down, not a bottom-up...to the continental United States (CONUS) or leaving it, both of which have implications for the base realignment and closure process. Despite this

  8. Identification of different geologic units using fuzzy constrained resistivity tomography

    NASA Astrophysics Data System (ADS)

    Singh, Anand; Sharma, S. P.

    2018-01-01

    Different geophysical inversion strategies are utilized as a component of an interpretation process that tries to separate geologic units based on the resistivity distribution. In the present study, we present the results of separating different geologic units using fuzzy constrained resistivity tomography. This was accomplished using fuzzy c-means, a clustering procedure, to improve the 2D resistivity image and geologic separation within the iterative minimization through inversion. First, we developed a Matlab-based inversion technique to obtain a reliable resistivity image using different geophysical data sets (electrical resistivity and electromagnetic data). Following this, the recovered resistivity model was converted into a fuzzy constrained resistivity model by assigning the highest probability value of each model cell to a cluster using the fuzzy c-means clustering procedure during the iterative process. The efficacy of the algorithm is demonstrated using three synthetic plane-wave electromagnetic data sets and one electrical resistivity field data set. The presented approach improves on the conventional inversion approach in differentiating geologic units, provided the correct number of geologic units is identified. Further, fuzzy constrained resistivity tomography was performed to examine the augmentation of uranium mineralization in the Beldih open cast mine as a case study. We also compared geologic units identified by fuzzy constrained resistivity tomography with geologic units interpreted from borehole information.
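    The fuzzy c-means step, giving each model cell a membership in every cluster and updating cluster centers from membership-weighted means, can be sketched on 1-D resistivity values (a generic FCM implementation, not the authors' Matlab code; the resistivity populations are illustrative):

```python
import numpy as np

def fuzzy_c_means(x, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means on 1-D data (e.g. log10 resistivity of model
    cells); returns cluster centers and the c-by-N membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, len(x)))
    u /= u.sum(axis=0)                        # memberships sum to 1 per cell
    for _ in range(iters):
        um = u ** m
        centers = um @ x / um.sum(axis=1)     # membership-weighted means
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = d ** (-2.0 / (m - 1.0))           # closer center -> higher u
        u /= u.sum(axis=0)
    return centers, u

# Two populations, e.g. host rock (~10 ohm-m) and a resistive unit
# (~1000 ohm-m), expressed in log10 units:
rng = np.random.default_rng(1)
x = np.r_[rng.normal(1.0, 0.1, 50), rng.normal(3.0, 0.1, 50)]
centers, u = fuzzy_c_means(x, c=2)
labels = u.argmax(axis=0)                     # hard assignment per cell
print(np.sort(centers).round(1))              # close to [1., 3.]
```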

  9. Bench-Scale Silicone Process for Low-Cost CO2 Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vipperla, Ravikumar; Yee, Michael; Steele, Ray

    This report presents system and economic analysis for a carbon capture unit which uses an amino-silicone solvent for CO2 capture and sequestration (CCS) in a pulverized coal (PC) boiler. The amino-silicone solvent is based on GAP-1 with Tri-Ethylene Glycol (TEG) as a co-solvent. The report also shows results for a CCS unit based on a conventional approach using mono-ethanol amine (MEA). Models were developed for both processes and used to calculate mass and energy balances. Capital costs and energy penalty were calculated for both systems, as well as the increase in cost of electricity. The amino-silicone solvent based system demonstrates significant advantages compared to the MEA system.

  10. Architecture and data processing alternatives for Tse computer. Volume 1: Tse logic design concepts and the development of image processing machine architectures

    NASA Technical Reports Server (NTRS)

    Rickard, D. A.; Bodenheimer, R. E.

    1976-01-01

    Digital computer components which perform two dimensional array logic operations (Tse logic) on binary data arrays are described. The properties of Golay transforms which make them useful in image processing are reviewed, and several architectures for Golay transform processors are presented with emphasis on the skeletonizing algorithm. Conventional logic control units developed for the Golay transform processors are described. One is a unique microprogrammable control unit that uses a microprocessor to control the Tse computer. The remaining control units are based on programmable logic arrays. Performance criteria are established and utilized to compare the various Golay transform machines developed. A critique of Tse logic is presented, and recommendations for additional research are included.

  11. FINAL REPORT FOR VERIFICATION OF THE METAL FINISHING FACILITY POLLUTION PREVENTION TOOL (MFFPPT)

    EPA Science Inventory

    The United States Environmental Protection Agency (USEPA) has prepared a computer process simulation package for the metal finishing industry that enables users to predict process outputs based upon process inputs and other operating conditions. This report documents the developm...

  12. A General Accelerated Degradation Model Based on the Wiener Process.

    PubMed

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-12-06

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.
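    The model class described, a Wiener process on a nonlinear time scale with unit-to-unit variation, is straightforward to simulate. A sketch under assumed parameter values (the drift, diffusion, and time-scale exponent below are illustrative; the paper's exact formulation may differ):

```python
import numpy as np

def simulate_adt_paths(n_units, t, mu=1.0, sigma=0.2, b=0.8,
                       unit_sd=0.1, seed=0):
    """Degradation X(t) = eta * t**b + sigma * B(t**b): a Wiener process
    on the nonlinear time scale t**b, with unit-to-unit variation in the
    drift eta ~ N(mu, unit_sd**2) and temporal variation from the
    Brownian motion B."""
    rng = np.random.default_rng(seed)
    lam = t ** b                                 # transformed time scale
    dlam = np.diff(lam, prepend=0.0)
    eta = rng.normal(mu, unit_sd, size=(n_units, 1))
    dB = rng.normal(0.0, np.sqrt(dlam), size=(n_units, len(t)))
    return eta * lam + sigma * np.cumsum(dB, axis=1)

t = np.linspace(0.01, 10, 200)
paths = simulate_adt_paths(500, t)
print(round(paths[:, -1].mean(), 1))   # near mu * 10**0.8, i.e. about 6.3
```

    Setting b = 1 recovers the linear case, so the same simulator covers both the linear and nonlinear analyses the abstract mentions.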

  13. A General Accelerated Degradation Model Based on the Wiener Process

    PubMed Central

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-01-01

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses. PMID:28774107

  14. Sequential microfluidic droplet processing for rapid DNA extraction.

    PubMed

    Pan, Xiaoyan; Zeng, Shaojiang; Zhang, Qingquan; Lin, Bingcheng; Qin, Jianhua

    2011-11-01

    This work describes a novel droplet-based microfluidic device, which enables sequential droplet processing for rapid DNA extraction. The microdevice consists of a droplet generation unit, two reagent addition units and three droplet splitting units. The loading/washing/elution steps required for DNA extraction were carried out by sequential microfluidic droplet processing. The movement of superparamagnetic beads, which were used as extraction supports, was controlled with magnetic field. The microdevice could generate about 100 droplets per min, and it took about 1 min for each droplet to perform the whole extraction process. The extraction efficiency was measured to be 46% for λ-DNA, and the extracted DNA could be used in subsequent genetic analysis such as PCR, demonstrating the potential of the device for fast DNA extraction. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Implementation of GPU accelerated SPECT reconstruction with Monte Carlo-based scatter correction.

    PubMed

    Bexelius, Tobias; Sohlberg, Antti

    2018-06-01

    Statistical SPECT reconstruction can be very time-consuming, especially when compensations for collimator and detector response, attenuation, and scatter are included in the reconstruction. This work proposes an accelerated SPECT reconstruction algorithm based on graphics processing unit (GPU) processing. An ordered subset expectation maximization (OSEM) algorithm with CT-based attenuation modelling, depth-dependent Gaussian convolution-based collimator-detector response modelling, and Monte Carlo-based scatter compensation was implemented using OpenCL. The OpenCL implementation was compared against the existing multi-threaded OSEM implementation running on a central processing unit (CPU) in terms of scatter-to-primary ratios, standardized uptake values (SUVs), and processing speed using mathematical phantoms and clinical multi-bed bone SPECT/CT studies. The difference in scatter-to-primary ratios, visual appearance, and SUVs between the GPU and CPU implementations was minor. On the other hand, at its best, the GPU implementation was 24 times faster than the multi-threaded CPU version on a normal 128 × 128 matrix size, 3-bed bone SPECT/CT data set when compensations for collimator and detector response, attenuation, and scatter were included. GPU SPECT reconstruction shows great promise as an everyday clinical reconstruction tool.
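    The OSEM update at the heart of such reconstructions multiplies the current image by the back-projected ratio of measured to estimated projections, one subset at a time. A minimal dense-matrix sketch (real SPECT systems use sparse or on-the-fly projectors plus the compensation models listed above, which are omitted here):

```python
import numpy as np

def osem(y, A, n_subsets=4, n_iter=50):
    """Ordered-subset EM: y = measured projections, A = system matrix
    (rows = projection bins, cols = voxels). Each update is the classic
    MLEM step restricted to one subset of projection rows."""
    x = np.ones(A.shape[1])
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iter):
        for s in subsets:
            As = A[s]
            est = As @ x                               # forward projection
            ratio = y[s] / np.maximum(est, 1e-12)
            sens = np.maximum(As.sum(axis=0), 1e-12)   # subset sensitivity
            x *= (As.T @ ratio) / sens                 # multiplicative update
    return x

# Tiny toy problem: 4 voxels observed through 8 projection bins
rng = np.random.default_rng(0)
A = rng.random((8, 4))
x_true = np.array([1.0, 4.0, 2.0, 0.5])
y = A @ x_true                                         # noise-free data
x_hat = osem(y, A)
print(np.round(x_hat, 2))
```

    Subset updates touch only a block of projection rows, which is what makes OSEM both faster per effective iteration and a good fit for GPU parallelization.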

  16. Unified Ecoregions of Alaska: 2001

    USGS Publications Warehouse

    Nowacki, Gregory J.; Spencer, Page; Fleming, Michael; Brock, Terry; Jorgenson, Torre

    2003-01-01

    Major ecosystems have been mapped and described for the State of Alaska and nearby areas. Ecoregion units are based on newly available datasets and the field experience of ecologists, biologists, geologists, and regional experts. Recently derived datasets for Alaska included climate parameters, vegetation, surficial geology, and topography. Additional datasets incorporated in the mapping process were lithology, soils, permafrost, hydrography, fire regime, and glaciation. Thirty-two units are mapped using a combination of the approaches of Bailey (hierarchical) and Omernik (integrated). The ecoregions are grouped into two higher levels using a 'tri-archy' based on climate parameters, vegetation response, and disturbance processes. The ecoregions are described with text, photos, and tables on the published map.

  17. Establishing an Intellectual and Theoretical Foundation for the After Action Review Process - A Literature Review

    DTIC Science & Technology

    2011-04-01

    Research Institute, Technology-Based Training Research Unit, Stephen L. Goldberg, Chief, April 2011, United States Army...U.S. Army Research Institute for the Behavioral and Social Sciences, 2511 Jefferson Davis Highway...statements of approval voiced by command elements. Rather, researchers must complete a program of transfer-of-training studies to show that variations in

  18. Optimal placement of fast cut back units based on the theory of cellular automata and agent

    NASA Astrophysics Data System (ADS)

    Yan, Jun; Yan, Feng

    2017-06-01

    Thermal power generation units with the fast cut back (FCB) function can supply power to the auxiliary system and maintain island operation after a major blackout, making them excellent substitutes for traditional black-start power sources. Different placement schemes for FCB units have different influences on the subsequent restoration process. Considering the locality of emergency dispatching rules, the unpredictability of specific dispatching instructions, and unexpected situations such as failure of transmission line energization, a novel deduction model for network reconfiguration based on the theory of cellular automata and agents is established. Several indexes are then defined for evaluating the placement schemes for FCB units. An attribute-weight determination method based on subjective and objective integration and grey relational analysis are used in combination to determine the optimal placement scheme for FCB units. The effectiveness of the proposed method is validated by test results on the New England 10-unit 39-bus power system.
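    The grey relational analysis used for scheme ranking compares each candidate against an ideal reference and weights the per-index relational coefficients. A generic sketch with hypothetical scores and weights (not the paper's data; all indexes are treated as benefit-type):

```python
import numpy as np

def grey_relational_grades(scores, weights, rho=0.5):
    """Grey relational analysis: normalize benefit-type indexes to [0, 1],
    measure each scheme's deviation from the ideal (all-ones) reference,
    and return the weighted grey relational grade per scheme."""
    x = (scores - scores.min(0)) / (scores.max(0) - scores.min(0))
    delta = np.abs(1.0 - x)                          # deviation from ideal
    coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return coeff @ weights

# Three hypothetical FCB placement schemes scored on three indexes
scores = np.array([[0.8, 0.6, 0.9],
                   [0.5, 0.9, 0.7],
                   [0.9, 0.4, 0.6]])
weights = np.array([0.5, 0.3, 0.2])   # combined subjective-objective weights
grades = grey_relational_grades(scores, weights)
print(grades.argmax())                # scheme 0 ranks highest for these scores
```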

  19. Application of agent-based system for bioprocess description and process improvement.

    PubMed

    Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J

    2010-01-01

    Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making, either to maintain process consistency or to identify optimal operating conditions. To predict whole-bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze bioprocesses based on a whole-process understanding, taking into account the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases enabling fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components cooperate with each other in performing their tasks: describing the whole process behavior, evaluating process operating conditions, monitoring the operating processes, predicting critical process performance, and providing guidance for decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, it can be applied to identify abnormal process operation events and then to suggest how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable better integration of process operations by providing a plant-wide process description to facilitate process improvement. Copyright 2009 American Institute of Chemical Engineers
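
    The plant-wide integration idea can be sketched in a few lines: independent unit agents each wrap a process model and report deviations to a coordinating agent. This is an illustrative toy, not the authors' system; the unit models, yield thresholds, and agent roles below are invented.

```python
class UnitAgent:
    """Wraps one process-unit model and flags deviations in its own yield."""
    def __init__(self, name, model, min_yield):
        self.name = name
        self.model = model          # maps an input stream quantity to an output
        self.min_yield = min_yield  # alarm threshold on fractional yield

    def step(self, stream):
        out = self.model(stream)
        deviated = stream > 0 and (out / stream) < self.min_yield
        return out, deviated


class CoordinatorAgent:
    """Chains unit agents and collects deviation alerts for decision support."""
    def __init__(self, agents):
        self.agents = agents

    def run(self, feed):
        stream, alerts = feed, []
        for agent in self.agents:
            stream, deviated = agent.step(stream)
            if deviated:
                alerts.append(agent.name)
        return stream, alerts


# Hypothetical two-unit sequence: fermentation followed by chromatography.
line = CoordinatorAgent([
    UnitAgent("fermentation", lambda x: 0.9 * x, min_yield=0.8),
    UnitAgent("chromatography", lambda x: 0.7 * x, min_yield=0.75),
])
product, alerts = line.run(100.0)   # chromatography yield 0.70 < 0.75
```

    A real system would replace the lambdas with mechanistic or data-driven unit models and route the alerts to the decision-support agents described above.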

  20. Development of a wearable measurement and control unit for personal customizing machine-supported exercise.

    PubMed

    Wang, Zhihui; Tamura, Naoki; Kiryu, Tohru

    2005-01-01

    Wearable technology has been used in various health-related fields to develop advanced monitoring solutions. However, the monitoring function alone cannot meet all the requirements of personally customized, machine-supported exercise with biosignal-based control. In this paper, we propose a new wearable unit design equipped with measurement and control functions to support the personal customization process. The wearable unit can measure heart rate and electromyogram signals during exercise and output workload control commands to the exercise machines. We then applied a prototype of the wearable unit to an Internet-based cycle ergometer system. The wearable unit was tested with twelve young participants to check its feasibility. The results verified that the unit could successfully control the workload and was effective for continuously supporting gradual changes in physical activity.
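
    As a rough illustration of the biosignal-based control loop, a proportional rule can map the heart-rate error to a workload command; the gain and limits below are hypothetical, not the values used in the prototype.

```python
def workload_command(hr_measured, hr_target, current_load,
                     gain=0.5, load_min=0.0, load_max=200.0):
    """Proportional workload (watts) update from heart-rate error.

    Hypothetical gain and limits; a real controller would also rate-limit
    changes and could fold in EMG-based fatigue cues.
    """
    new_load = current_load - gain * (hr_measured - hr_target)
    return max(load_min, min(load_max, new_load))
```

    For example, a measured heart rate 10 bpm above target reduces a 100 W load to 95 W with the default gain.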

  1. An Infinite Game in a Finite Setting: Visualizing Foreign Language Teaching and Learning in America.

    ERIC Educational Resources Information Center

    Mantero, Miguel

    Drawing on contemporary thought and foundational research, this paper characterizes various elements of the foreign language teaching profession and the language learning environment in the United States as either product-driven or process-based. It is argued that a process-based approach to language teaching and learning benefits not only second…

  2. Three-dimensional photoacoustic tomography based on graphics-processing-unit-accelerated finite element method.

    PubMed

    Peng, Kuan; He, Ling; Zhu, Ziqiang; Tang, Jingtian; Xiao, Jiaying

    2013-12-01

    Compared with commonly used analytical reconstruction methods, the frequency-domain finite element method (FEM) based approach has proven to be an accurate and flexible algorithm for photoacoustic tomography. However, the FEM-based algorithm is computationally demanding, especially for three-dimensional cases. To enhance the algorithm's efficiency, in this work a parallel computational strategy is implemented within the FEM-based reconstruction algorithm using the graphics-processing-unit parallel programming framework known as the "compute unified device architecture" (CUDA). A series of simulation experiments is carried out to test the accuracy and acceleration of the improved method. The results indicate that the parallel calculation does not change the accuracy of the reconstruction algorithm, while its computational cost is reduced by a factor of 38.9 on a GTX 580 graphics card.
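
    The kind of data parallelism CUDA exploits in such a solver can be illustrated with a Jacobi iteration, in which every row update is independent of the others and could map onto one GPU thread each. The pure-Python sketch below shows the pattern only; the matrix is a toy, not the photoacoustic FEM system.

```python
def jacobi_step(A, b, x):
    """One Jacobi sweep: every entry of the new x depends only on the old x,
    so each row update is independent -- the pattern a GPU kernel exploits."""
    n = len(b)
    return [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
            for i in range(n)]

def jacobi_solve(A, b, iters=100):
    x = [0.0] * len(b)
    for _ in range(iters):
        x = jacobi_step(A, b, x)
    return x

# Toy diagonally dominant system, not the photoacoustic FEM matrix.
x = jacobi_solve([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
```

    On a GPU, each entry of the comprehension would be computed by its own thread; the frequency-domain FEM solve in the paper parallelizes analogous element-wise and row-wise work.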

  3. Solutions for acceleration measurement in vehicle crash tests

    NASA Astrophysics Data System (ADS)

    Dima, D. S.; Covaciu, D.

    2017-10-01

    Crash tests are useful for validating computer simulations of road traffic accidents. One of the most important parameters measured is the acceleration. The evolution of acceleration versus time during a crash test forms the crash pulse. The correctness of the crash pulse determination depends on the data acquisition system used. Recommendations regarding the instrumentation for impact tests are given in standards, which focus on the use of accelerometers as impact sensors. The goal of this paper is to present the device and software developed by the authors for data acquisition and processing. The system includes two accelerometers with different input ranges, a processing unit based on a 32-bit microcontroller, and a data logging unit with an SD card. Data collected on the card as text files are processed with dedicated software running on personal computers. The processing is based on diagrams and includes the digital filters recommended in standards.
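
    Once filtered per the recommended channel classes (e.g. SAE J211 CFC filters), a logged crash pulse is typically integrated to obtain the velocity change. A minimal sketch, assuming already-filtered samples:

```python
def delta_v(times_s, accel_ms2):
    """Trapezoidal integration of a crash pulse to velocity change (m/s).

    Assumes the samples are already filtered per the relevant channel
    class (e.g. an SAE J211 CFC filter), which this sketch does not apply.
    """
    dv = 0.0
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        dv += 0.5 * (accel_ms2[i] + accel_ms2[i - 1]) * dt
    return dv
```

    A constant -10 m/s² pulse held for 0.1 s, for instance, integrates to a velocity change of -1 m/s.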

  4. Development of Product Availability Monitoring System In Production Unit In Automotive Component Industry

    NASA Astrophysics Data System (ADS)

    Hartono, Rachmad; Raharno, Sri; Yuwana Martawirya, Yatna; Arthaya, Bagus

    2018-03-01

    This paper describes a methodology for monitoring the availability of products in a production unit in the automotive component industry. The components considered are made through sheet metal working. Raw material arrives at the production unit as plate pieces of specified sizes and is stored in the warehouse. Data for each raw material in the warehouse are recorded and stored in a database system. The material then undergoes several production processes in the production unit. When material is taken from the warehouse, its data are also recorded and stored in the database; the recorded data are the amount of material, the material type, and the date the material left the warehouse. Material leaving the warehouse is labeled with information on the production processes it must pass through, and represents a product to be made. Completed products are stored in the product warehouse; when a product enters it, the product data are recorded by scanning the barcode on the label. By recording the condition of the product at each stage of production, the availability of products in a production unit can be known in the form of raw material, product being processed, and finished product.
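
    The recording scheme described above amounts to summing signed stock movements per item and production stage. A minimal sketch (the item names and stage labels are illustrative):

```python
from collections import defaultdict

def stock_levels(movements):
    """Net availability per (item, stage) from recorded warehouse events.

    Each movement is (item, stage, quantity): positive for receipts,
    negative for issues. Items and stages here are illustrative.
    """
    levels = defaultdict(int)
    for item, stage, qty in movements:
        levels[(item, stage)] += qty
    return dict(levels)

levels = stock_levels([
    ("bracket", "raw", 100),       # plates received into the warehouse
    ("bracket", "raw", -40),       # plates issued to production
    ("bracket", "finished", 25),   # finished products scanned back in
])
```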

  5. Psychiatry training in the United Kingdom--part 2: the training process.

    PubMed

    Christodoulou, N; Kasiakogia, K

    2015-01-01

    In the second part of this diptych, we shall deal with psychiatric training in the United Kingdom in detail, and we will compare it--wherever this is meaningful--with the equivalent system in Greece. As explained in the first part of the paper, due to the recently increased emigration of Greek psychiatrists and psychiatric trainees, and the fact that the United Kingdom is a popular destination, it has become necessary to inform those aspiring to train in the United Kingdom about the system and the circumstances they should expect to encounter. This paper principally describes the structure of the United Kingdom's psychiatric training system, including the different stages trainees progress through and their respective requirements and processes. Specifically, specialty and subspecialty options are described and explained, special paths in training are analysed, and the notions of the "special interest day" and the optional "out of programme experience" schemes are explained. Furthermore, detailed information is offered on the pivotal points of each stage of the training process, with special care to explain the important differences and similarities between the systems in Greece and the United Kingdom. Special attention is given to the Royal College of Psychiatrists' Membership Exams (MRCPsych) because they are the only exams towards completing specialisation in psychiatry in the United Kingdom. Also stressed is the educational culture of progressing according to a set curriculum, utilising diverse means of professional development, empowering trainees' autonomy by allowing initiative-based development, and applying peer supervision as a tool for professional development. We conclude that psychiatric training in the United Kingdom differs substantially from that of Greece in both structure and process.
There are various differences, such as pure psychiatric training in the United Kingdom versus neurological and medical modules in Greece, in-training exams in the United Kingdom versus an exit exam in Greece, and of course the three years of higher training, which prepare trainees to function as consultants. However, perhaps the most important difference is one of mentality; namely, a culture of competency-based training progression in the United Kingdom, which extends beyond training into professional revalidation. We believe that, with careful cultural adaptation, the systems of psychiatric training in the United Kingdom and Greece may benefit from sharing some of their features. Lastly, as previously clarified, this diptych paper is meant to be informative, not advisory.

  6. Experience of Data Handling with IPPM Payload

    NASA Astrophysics Data System (ADS)

    Errico, Walter; Tosi, Pietro; Ilstad, Jorgen; Jameux, David; Viviani, Riccardo; Collantoni, Daniele

    2010-08-01

    A simplified On-Board Data Handling system has been developed by CAEN AURELIA SPACE and ABSTRAQT as a PUS-over-SpaceWire demonstration platform for the Onboard Payload Data Processing laboratory at ESTEC. The system is composed of three Leon2-based IPPM (Integrated Payload Processing Module) computers that play the roles of Instrument, Payload Data Handling Unit and Satellite Management Unit. Two PCs complete the test set-up, simulating an external Memory Management Unit and the Ground Control Unit. Communication among units takes place primarily through SpaceWire links; the RMAP[2] protocol is used for configuration and housekeeping. A limited implementation of the ECSS-E-70-41B Packet Utilisation Standard (PUS)[1] over CANbus and MIL-STD-1553B has also been realized. The open-source RTEMS runs on the IPPM AT697E CPU as the real-time operating system.

  7. A software package for interactive motor unit potential classification using fuzzy k-NN classifier.

    PubMed

    Rasheed, Sarbast; Stashuk, Daniel; Kamel, Mohamed

    2008-01-01

    We present an interactive software package for implementing the supervised classification task during the electromyographic (EMG) signal decomposition process, using a fuzzy k-NN classifier and utilizing the MATLAB high-level programming language and its interactive environment. The method employs assertion-based classification that takes into account a combination of motor unit potential (MUP) shapes and two modes of use of motor unit firing pattern information: the passive and the active modes. The developed package consists of several graphical user interfaces used to detect individual MUP waveforms from a raw EMG signal, extract relevant features, and classify the MUPs into motor unit potential trains (MUPTs) using assertion-based classifiers.
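
    A fuzzy k-NN classifier of the kind used in the package can be sketched as follows, with class memberships weighted by inverse distance in the spirit of Keller's algorithm; the feature vectors and labels are hypothetical, not the authors' MUP shape or firing-pattern features.

```python
def fuzzy_knn(train, query, k=3, m=2.0, eps=1e-9):
    """Fuzzy k-NN in the spirit of Keller et al.: class memberships are
    inverse-distance weighted over the k nearest training samples."""
    dist = lambda a, b: sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    nearest = sorted(train, key=lambda t: dist(t[0], query))[:k]
    weights, total = {}, 0.0
    for vec, label in nearest:
        w = 1.0 / (dist(vec, query) ** (2.0 / (m - 1.0)) + eps)
        weights[label] = weights.get(label, 0.0) + w
        total += w
    return {label: w / total for label, w in weights.items()}

# Hypothetical 2-D features, not the authors' MUP features.
memberships = fuzzy_knn(
    [([0.0, 0.0], "A"), ([0.0, 1.0], "A"), ([5.0, 5.0], "B")],
    [0.0, 0.5])
```

    Returning graded memberships rather than a hard label is what lets an assertion-based scheme defer or reject uncertain MUP assignments.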

  8. [Organization of clinical emergency units. Mission and environmental factors determine the organizational concept].

    PubMed

    Genewein, U; Jakob, M; Bingisser, R; Burla, S; Heberer, M

    2009-02-01

    Mission and organization of emergency units were analysed to understand the underlying principles and concepts. The recent literature (2000-2007) on organizational structures and functional concepts of clinical emergency units was reviewed. An organizational portfolio based on the criteria of specialization (presence of medical specialists in the emergency unit) and integration (integration of the emergency unit into the hospital structure) was established. The resulting organizational archetypes were comparatively assessed against established efficiency criteria (efficiency of resource utilization, process efficiency, market efficiency). Clinical emergency units differ with regard to autonomy (within the hospital structure), range of services and service depth (horizontal and vertical integration). The "specialization"-"integration" portfolio enabled the definition of typical organizational patterns (so-called archetypes): profit centres primarily driven by economic objectives, service centres operating on the basis of agreements with the hospital board, functional clinical units integrated into medical specialty units (e.g., surgery, gynaecology), and modular organizations characterized by small emergency teams that call in specialists immediately after triage and initial diagnostics. There is no "one size fits all" concept for the organization of clinical emergency units. Instead, a number of well-characterized organizational concepts are available, enabling a rational choice based on a hospital's mission and demand.

  9. A new window of opportunity to reject process-based biotechnology regulation

    PubMed Central

    Marchant, Gary E; Stevens, Yvonne A

    2015-01-01

    The question of whether biotechnology regulation should be based on the process or the product has long been debated, with different jurisdictions adopting different approaches. The European Union has adopted a process-based approach, Canada has adopted a product-based approach, and the United States has implemented a hybrid system. With the recent proliferation of new methods of genetic modification, such as gene editing, process-based regulatory systems, which are premised on a binary system of transgenic and conventional approaches, will become increasingly obsolete and unsustainable. To avoid unreasonable, unfair and arbitrary results, nations that have adopted process-based approaches will need to migrate to a product-based approach that considers the novelty and risks of the individual trait, rather than the process by which that trait was produced. This commentary suggests some approaches for the design of such a product-based approach. PMID:26930116

  10. A new window of opportunity to reject process-based biotechnology regulation.

    PubMed

    Marchant, Gary E; Stevens, Yvonne A

    2015-01-01

    The question of whether biotechnology regulation should be based on the process or the product has long been debated, with different jurisdictions adopting different approaches. The European Union has adopted a process-based approach, Canada has adopted a product-based approach, and the United States has implemented a hybrid system. With the recent proliferation of new methods of genetic modification, such as gene editing, process-based regulatory systems, which are premised on a binary system of transgenic and conventional approaches, will become increasingly obsolete and unsustainable. To avoid unreasonable, unfair and arbitrary results, nations that have adopted process-based approaches will need to migrate to a product-based approach that considers the novelty and risks of the individual trait, rather than the process by which that trait was produced. This commentary suggests some approaches for the design of such a product-based approach.

  11. Assessment of mammographic film processor performance in a hospital and mobile screening unit.

    PubMed

    Murray, J G; Dowsett, D J; Laird, O; Ennis, J T

    1992-12-01

    In contrast to the majority of mammographic breast screening programmes, film processing at this centre occurs on site in both hospital and mobile trailer units. Initial (1989) quality control (QC) sensitometric tests revealed a large variation in film processor performance in the mobile unit. The clinical significance of these variations was assessed and acceptance limits for processor performance were determined. Abnormal mammograms were used as reference material and copied using high-definition 35 mm film over a range of exposure settings. The copies were then matched with the QC film density variations from the mobile unit. All films were subsequently ranked for spatial and contrast resolution. Optimal values of 2 min for processing time (equivalent to a film transit time of 3 min and a developer time of 46 s) and 36 degrees C for temperature were obtained. The widespread anomaly of reporting film transit time as processing time is highlighted. Use of mammogram copies as a means of measuring the influence of film processor variation is advocated. Careful monitoring of the mobile unit's film processor performance has produced stable quality comparable with that of the hospital-based unit. The advantages of on-site film processing are outlined. The addition of a sensitometric step wedge to all mammography film stock as a means of assessing image quality is recommended.

  12. Towards marine seismological Network: real time small aperture seismic array

    NASA Astrophysics Data System (ADS)

    Ilinskiy, Dmitry

    2017-04-01

    Most powerful and dangerous seismic events are generated in underwater subduction zones. Existing seismological networks are based on land stations. Increased demands on the accuracy of location, magnitude, and rupture-process characterization of earthquakes, together with the need to reduce data processing time, require information from seabed stations located near the earthquake generation area. Marine stations make an important contribution to clarifying the tectonic settings in the most active subduction zones of the world. An early warning system for a subduction zone is based on a marine seabed array located near the most hazardous seismic zone in the region. Fast-track processing to locate the earthquake hypocenter and estimate its energy takes place in the buoy surface unit. Information about a detected and located earthquake reaches the onshore seismological center earlier than the first-break waves from the same earthquake reach the nearest onshore seismological station. Implementation of the small-aperture array builds on existing, well-proven, and cost-effective solutions such as moored weather buoys and self-pop-up autonomous seabed seismic nodes. A permanent seabed system for real-time operation has to be installed in deep waters far from the coast. The seabed array consists of several self-pop-up seismological stations which continuously acquire data, detect events of a certain energy class, and send detected event parameters to the surface buoy via an acoustic link. The surface buoy unit determines the earthquake location from the event parameters received from the seabed units and sends this information in semi-real time to the onshore seismological center via a narrow-band satellite link. Upon request from the coast, the system can send waveforms of events of a certain energy class, the battery status of the bottom seismic stations, and other environmental parameters. When the battery of a particular seabed unit is nearly depleted, the unit switches into sleep mode and reports this to the surface buoy and onward to the onshore data center. The seabed unit can then wait for a vessel of opportunity to recover it to the sea surface and replace it with another station with fresh batteries. All continuously recorded seismic data held by the seabed unit can then be downloaded for further processing and analysis. In our presentation we demonstrate several working prototypes of the proposed system, including a real-time cabled broadband seismological station and a real-time buoy-linked seabed seismological station.
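
    Event detection of the sort the seabed nodes perform is commonly implemented as a short-term-average/long-term-average (STA/LTA) trigger on signal energy; this sketch assumes that approach, with illustrative window lengths and threshold rather than the system's actual settings.

```python
def sta_lta_trigger(samples, sta_len=5, lta_len=20, threshold=3.0):
    """Flag indices where the short-term/long-term average energy ratio
    exceeds the threshold. Window lengths and threshold are illustrative,
    not the deployed system's settings."""
    energy = [s * s for s in samples]
    triggers = []
    for i in range(lta_len, len(samples)):
        sta = sum(energy[i - sta_len:i]) / sta_len
        lta = sum(energy[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Quiet background followed by a burst of stronger ground motion.
picks = sta_lta_trigger([0.1] * 30 + [2.0] * 5 + [0.1] * 10)
```

    Only the trigger times and summary parameters would cross the acoustic link; waveforms stay on the node until requested from shore.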

  13. Evaluation of virus reduction efficiency in wastewater treatment unit processes as a credit value in the multiple-barrier system for wastewater reclamation and reuse.

    PubMed

    Ito, Toshihiro; Kato, Tsuyoshi; Hasegawa, Makoto; Katayama, Hiroyuki; Ishii, Satoshi; Okabe, Satoshi; Sano, Daisuke

    2016-12-01

    The virus reduction efficiency of each unit process is commonly determined from the ratio of the virus concentration in the influent to that in the effluent of a unit, but virus concentrations in wastewater often fall below the analytical quantification limit, which prevents calculation of the concentration ratio at each sampling event. In this study, left-censored datasets of norovirus (genogroups I and II) and adenovirus were used to calculate the virus reduction efficiency in the unit processes of secondary biological treatment and chlorine disinfection. Virus concentrations in the influent, the effluent from secondary treatment, and the chlorine-disinfected effluent of four municipal wastewater treatment plants were analyzed by a quantitative polymerase chain reaction (PCR) approach, and the probabilistic distributions of log reduction (LR) were estimated by a Bayesian estimation algorithm. The mean values of LR in the secondary treatment units ranged from 0.9 to 2.2, whereas those in the free chlorine disinfection units ranged from -0.1 to 0.5. The LR value in secondary treatment was virus-type and unit-process dependent, which underscores the importance of accumulating virus LR data applicable to the multiple-barrier system, a global concept of microbial risk management in wastewater reclamation and reuse.
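
    For samples above the quantification limit, the log reduction of a unit process is simply the base-10 logarithm of the influent-to-effluent concentration ratio; the paper's Bayesian treatment of censored data is beyond this sketch.

```python
import math

def log_reduction(influent, effluent):
    """Log10 reduction across a unit process, valid when both
    concentrations are above the quantification limit (censored values
    need the Bayesian treatment described in the paper)."""
    return math.log10(influent / effluent)

# e.g. 1e6 copies/L entering and 1e4 copies/L leaving is a 2-log reduction
```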

  14. A portable device for rapid nondestructive detection of fresh meat quality

    NASA Astrophysics Data System (ADS)

    Lin, Wan; Peng, Yankun

    2014-05-01

    Quality attributes of fresh meat influence its nutritional value and consumers' purchasing decisions. To meet the demand of inspection departments for a portable device, a rapid and nondestructive detection device for fresh meat quality based on an ARM (Advanced RISC Machines) processor and VIS/NIR technology was designed. Its working principle, hardware composition, software system and functional tests are introduced. The hardware system consists of an ARM processing unit, a light source unit, a detection probe unit, a spectral data acquisition unit, an LCD (Liquid Crystal Display) touch screen display unit, a power unit and a cooling unit. A Linux operating system and a quality-parameter acquisition and processing application were developed. The system integrates spectral signal collection, storage, display and processing, and weighs 3.5 kg. Forty beef samples were used in experiments to validate its stability and reliability. The results indicated that the prediction model developed with PLSR, using SNV as the pre-processing method, performed well, with validation-set correlation coefficients and root mean square errors of 0.90 and 1.56 for L*, 0.95 and 1.74 for a*, 0.94 and 0.59 for b*, 0.88 and 0.13 for pH, 0.79 and 12.46 for tenderness, and 0.89 and 0.91 for water content, respectively. The experimental results show that this device can be a useful tool for detecting meat quality.
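
    The SNV pre-processing step mentioned above is a per-spectrum scatter correction: each spectrum is centred and scaled to zero mean and unit standard deviation before PLSR modeling. A minimal sketch:

```python
def snv(spectrum):
    """Standard normal variate correction: centre and scale one spectrum
    to zero mean and unit (population) standard deviation."""
    n = len(spectrum)
    mean = sum(spectrum) / n
    sd = (sum((x - mean) ** 2 for x in spectrum) / n) ** 0.5
    return [(x - mean) / sd for x in spectrum]

# Toy three-point "spectrum"; real VIS/NIR spectra have hundreds of bands.
corrected = snv([1.0, 2.0, 3.0])
```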

  15. Patterns in Nature Forming Patterns in Minds: An Evaluation of an Introductory Physics Unit

    NASA Astrophysics Data System (ADS)

    Sheaffer, Christopher Ryan

    Educators are increasingly focused on process over content. In science especially, teachers want students to understand the nature of science and investigation. The emergence of scientific inquiry and engineering design teaching methods has led to the development of new teaching and evaluation methods that concentrate on steps in a process rather than facts in a topic. Research supports the notion that an explicit focus on the scientific process can lead to gains in student science knowledge. In response to new research and standards, many teachers have developed teaching methods that seem to work well in their classrooms, but lack the time and resources to test them in other classroom environments. A high school physics teacher (Bradford Hill) has developed a unit called Patterns in Nature (PIN) with objectives relating mathematical modeling to the scientific process. Designed for use in his large public school classroom, the unit was taken and used in a charter school with small classes. This study looks specifically at whether or not the PIN unit effectively teaches students how to graph the data they gather and fit an appropriate mathematical pattern, using that model to predict future measurements. Additionally, the study looks at students' knowledge and views about the nature of science and the process of scientific investigation as they are affected by the PIN unit. Findings show that students are able to identify and apply patterns to data, but have difficulty explaining the meaning of the math. Students show increases in their knowledge of the process of science, and the majority develop positive views about science in general. A major goal of this study is to place this unit in the cyclical process of Design-Based Research and allow for Patterns in Nature's continuous improvement, development and evaluation. Design-Based Research (DBR) is an approach that can be applied to the implementation and evaluation of classroom materials.
This method incorporates the complexities of different contexts and changing treatments into the research methods and analysis. Through DBR, teachers can understand more about how the designed materials affect students. Others may be able to use the development and analysis of the PIN study as a guide for examining similar aspects of science units developed elsewhere.

  16. 20 CFR 655.166 - Requests for determinations based on nonavailability of U.S. workers.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... nonavailability of U.S. workers. 655.166 Section 655.166 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR TEMPORARY EMPLOYMENT OF FOREIGN WORKERS IN THE UNITED STATES Labor Certification Process for Temporary Agricultural Employment in the United States (H-2A Workers) Labor...

  17. 20 CFR 655.166 - Requests for determinations based on nonavailability of U.S. workers.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... nonavailability of U.S. workers. 655.166 Section 655.166 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR TEMPORARY EMPLOYMENT OF FOREIGN WORKERS IN THE UNITED STATES Labor Certification Process for Temporary Agricultural Employment in the United States (H-2A Workers) Labor...

  18. Correlating Computer Database Programs with Social Studies Instruction.

    ERIC Educational Resources Information Center

    Northwest Regional Educational Lab., Portland, OR.

    This unit emphasizes the integration of software in a focus on the classroom instruction process. Student activities are based on plans and ideas for instructional units presented by a teacher who describes and demonstrates the activities. Integration has occurred when computer applications are included in an instructional activity. This guide…

  19. SEPARATIONS RESEARCH AT THE UNITED STATES ENVIRONMENTAL PROTECTION AGENCY - TOWARDS RECOVERY OF VOCS AND METALS USING MEMBRANES AND ADSORPTION PROCESSES

    EPA Science Inventory

    The USEPA's National Risk Management Research Laboratory is investigating new separations materials and processes for removal and recovery of volatile organic compounds (VOCs) and toxic metals from wastestreams and industrial process streams. Research applying membrane-based perv...

  20. Applying the Theory of Constraints to a Base Civil Engineering Operations Branch

    DTIC Science & Technology

    1991-09-01

    (Extraction residue from the report's list of figures and Figure 1. Recoverable titles: Figure 1, Typical Work Order Processing; Figure 2, Typical Job Order Processing; Figure 3, Typical Simplified In-Service Work Plan. Figure 1 traces a work order from the customer request through the Service Planning Unit, Production Control Center, Material Control and Scheduling to the CE Shops.)

  1. High-efficiency CRISPR/Cas9 multiplex gene editing using the glycine tRNA-processing system-based strategy in maize.

    PubMed

    Qi, Weiwei; Zhu, Tong; Tian, Zhongrui; Li, Chaobin; Zhang, Wei; Song, Rentao

    2016-08-11

    The CRISPR/Cas9 genome editing strategy has been applied to a variety of species, and the tRNA-processing system has been used to compact multiple gRNAs into one synthetic gene for manipulating multiple genes in rice. We optimized and introduced this multiplex gene editing strategy, based on the tRNA-processing system, into maize. Maize glycine-tRNA was selected to design multiple tRNA-gRNA units for the simultaneous production of numerous gRNAs under the control of one maize U6 promoter. We designed three gRNAs for simplex editing and three multiple tRNA-gRNA units for multiplex editing. The results indicate that this system not only increased the number of targeted sites but also enhanced mutagenesis efficiency in maize. Additionally, we propose an advanced sequence selection of gRNA spacers for relatively more efficient and accurate chromosomal fragment deletion, which is important for complete abolishment of gene function, especially for long non-coding RNAs (lncRNAs). Our results also indicate that up to four tRNA-gRNA units in one expression cassette design can still work in maize. The examples reported here demonstrate the utility of the tRNA-processing-system-based strategy as an efficient multiplex genome editing tool to enhance maize genetic research and breeding.

  2. Motor unit action potential conduction velocity estimated from surface electromyographic signals using image processing techniques.

    PubMed

    Soares, Fabiano Araujo; Carvalho, João Luiz Azevedo; Miosso, Cristiano Jacques; de Andrade, Marcelino Monteiro; da Rocha, Adson Ferreira

    2015-09-17

    In surface electromyography (surface EMG, or S-EMG), conduction velocity (CV) refers to the velocity at which the motor unit action potentials (MUAPs) propagate along the muscle fibers during contractions. The CV is related to the type and diameter of the muscle fibers, ion concentration, pH, and firing rate of the motor units (MUs). The CV can be used in the evaluation of contractile properties of MUs and of muscle fatigue. The most popular methods for CV estimation are those based on maximum likelihood estimation (MLE). This work proposes an algorithm for estimating CV from S-EMG signals using digital image processing techniques. The proposed approach is demonstrated and evaluated using both simulated and experimentally acquired multichannel S-EMG signals. We show that the proposed algorithm is as precise and accurate as the MLE method in typical conditions of noise and CV. The proposed method is not susceptible to errors associated with MUAP propagation direction or inadequate initialization parameters, which are common with the MLE algorithm. Image-processing-based approaches may be useful in S-EMG analysis for extracting different physiological parameters from multichannel S-EMG signals. Other new methods based on image processing could also be developed to help solve other tasks in EMG analysis, such as estimation of the CV of individual MUs, localization and tracking of innervation zones, and study of MU recruitment strategies.
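
    A simple baseline for CV estimation, given two channels separated by a known inter-electrode distance, is to find the delay maximising their cross-correlation and divide distance by delay. This stand-in is neither the MLE nor the image-processing estimator discussed above; the signals are assumed identical apart from the propagation delay.

```python
def conduction_velocity(ch1, ch2, fs_hz, spacing_m):
    """CV from the inter-channel delay maximising the cross-correlation.

    Assumes ch2 is a delayed copy of ch1 (propagation from electrode 1
    toward electrode 2) and searches positive integer-sample lags only.
    """
    n = len(ch1)
    best_lag, best_score = 1, float("-inf")
    for lag in range(1, n // 2):
        score = sum(ch1[i] * ch2[i + lag] for i in range(n - lag))
        if score > best_score:
            best_lag, best_score = lag, score
    return spacing_m / (best_lag / fs_hz)

# Synthetic spike propagating 4 samples (4 ms at 1 kHz) between
# electrodes 10 mm apart -> CV = 0.01 m / 0.004 s = 2.5 m/s.
ch1 = [0.0] * 30; ch1[10] = 1.0
ch2 = [0.0] * 30; ch2[14] = 1.0
cv = conduction_velocity(ch1, ch2, 1000.0, 0.01)
```

    Integer-sample lags quantize the delay coarsely; both the MLE and image-based estimators in the paper resolve sub-sample delays.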

  3. Development of an Integrated Multi-Contaminant Removal Process Applied to Warm Syngas Cleanup for Coal-Based Advanced Gasification Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Howard

    2010-11-30

    This project met the objective to further the development of an integrated multi-contaminant removal process in which H2S, NH3, HCl and heavy metals including Hg, As, Se and Cd present in the coal-derived syngas can be removed to specified levels in a single/integrated process step. The process supports the mission and goals of the Department of Energy's Gasification Technologies Program, namely to enhance the performance of gasification systems, thus enabling U.S. industry to improve the competitiveness of gasification-based processes. The gasification program will reduce equipment costs, improve process environmental performance, and increase process reliability and flexibility. Two sulfur conversion concepts were tested in the laboratory under this project, i.e., the solvent-based, high-pressure University of California Sulfur Recovery Process High Pressure (UCSRP-HP) and the catalytic-based, direct oxidation (DO) section of the CrystaSulf-DO process. Each process required a polishing unit to meet the ultra-clean sulfur content goals of <50 ppbv (parts per billion by volume) as may be necessary for fuel cells or chemical production applications. UCSRP-HP was also tested for the removal of trace, non-sulfur contaminants, including ammonia, hydrogen chloride, and heavy metals. A bench-scale unit was commissioned and limited testing was performed with simulated syngas. Aspen-Plus®-based computer simulation models were prepared and the economics of the UCSRP-HP and CrystaSulf-DO processes were evaluated for a nominal 500 MWe, coal-based, IGCC power plant with carbon capture. This report covers the progress on the UCSRP-HP technology development and the CrystaSulf-DO technology.

  4. Databases and coordinated research projects at the IAEA on atomic processes in plasmas

    NASA Astrophysics Data System (ADS)

    Braams, Bastiaan J.; Chung, Hyun-Kyung

    2012-05-01

    The Atomic and Molecular Data Unit at the IAEA works with a network of national data centres to encourage and coordinate production and dissemination of fundamental data for atomic, molecular and plasma-material interaction (A+M/PMI) processes that are relevant to the realization of fusion energy. The Unit maintains numerical and bibliographical databases and has started a Wiki-style knowledge base. The Unit also contributes to A+M database interface standards and provides a search engine that offers a common interface to multiple numerical A+M/PMI databases. Coordinated Research Projects (CRPs) bring together fusion energy researchers and atomic, molecular and surface physicists for joint work towards the development of new data and new methods. The databases and current CRPs on A+M/PMI processes are briefly described here.

  5. Opcode counting for performance measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gara, Alan; Satterfield, David L.; Walkup, Robert E.

    Methods, systems and computer program products are disclosed for measuring a performance of a program running on a processing unit of a processing system. In one embodiment, the method comprises informing a logic unit of each instruction in the program that is executed by the processing unit, assigning a weight to each instruction, assigning the instructions to a plurality of groups, and analyzing the plurality of groups to measure one or more metrics. In one embodiment, each instruction includes an operating code portion, and the assigning includes assigning the instructions to the groups based on the operating code portions of the instructions. In an embodiment, each type of instruction is assigned to a respective one of the plurality of groups. These groups may be combined into a plurality of sets of the groups.
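
    The weighting-and-grouping scheme described in the abstract can be illustrated with a small sketch. The opcode names, group map and weights below are entirely hypothetical, chosen only to show how weighted per-group metrics accumulate over an executed instruction stream:

```python
from collections import Counter

# Hypothetical opcode -> group map and per-instruction weights.
OPCODE_GROUP = {"fadd": "fpu", "fmul": "fpu", "ld": "mem", "st": "mem", "br": "branch"}
WEIGHT = {"fadd": 1.0, "fmul": 2.0, "ld": 1.0, "st": 1.0, "br": 0.5}

def group_metrics(instruction_stream):
    """Accumulate weighted counts per opcode group, as each executed
    instruction is reported to the counting logic."""
    totals = Counter()
    for op in instruction_stream:
        totals[OPCODE_GROUP[op]] += WEIGHT[op]
    return dict(totals)

trace = ["ld", "fadd", "fmul", "st", "br", "ld", "fadd"]
print(group_metrics(trace))  # {'mem': 3.0, 'fpu': 4.0, 'branch': 0.5}
```

In the patented hardware this bookkeeping is done by a logic unit alongside the processor rather than in software; the sketch only mirrors the grouping arithmetic.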

  6. Opcode counting for performance measurement

    DOEpatents

    Gara, Alan; Satterfield, David L; Walkup, Robert E

    2013-10-29

    Methods, systems and computer program products are disclosed for measuring a performance of a program running on a processing unit of a processing system. In one embodiment, the method comprises informing a logic unit of each instruction in the program that is executed by the processing unit, assigning a weight to each instruction, assigning the instructions to a plurality of groups, and analyzing the plurality of groups to measure one or more metrics. In one embodiment, each instruction includes an operating code portion, and the assigning includes assigning the instructions to the groups based on the operating code portions of the instructions. In an embodiment, each type of instruction is assigned to a respective one of the plurality of groups. These groups may be combined into a plurality of sets of the groups.

  7. Opcode counting for performance measurement

    DOEpatents

    Gara, Alan; Satterfield, David L.; Walkup, Robert E.

    2015-08-11

    Methods, systems and computer program products are disclosed for measuring a performance of a program running on a processing unit of a processing system. In one embodiment, the method comprises informing a logic unit of each instruction in the program that is executed by the processing unit, assigning a weight to each instruction, assigning the instructions to a plurality of groups, and analyzing the plurality of groups to measure one or more metrics. In one embodiment, each instruction includes an operating code portion, and the assigning includes assigning the instructions to the groups based on the operating code portions of the instructions. In an embodiment, each type of instruction is assigned to a respective one of the plurality of groups. These groups may be combined into a plurality of sets of the groups.

  8. Opcode counting for performance measurement

    DOEpatents

    Gara, Alan; Satterfield, David L.; Walkup, Robert E.

    2016-10-18

    Methods, systems and computer program products are disclosed for measuring a performance of a program running on a processing unit of a processing system. In one embodiment, the method comprises informing a logic unit of each instruction in the program that is executed by the processing unit, assigning a weight to each instruction, assigning the instructions to a plurality of groups, and analyzing the plurality of groups to measure one or more metrics. In one embodiment, each instruction includes an operating code portion, and the assigning includes assigning the instructions to the groups based on the operating code portions of the instructions. In an embodiment, each type of instruction is assigned to a respective one of the plurality of groups. These groups may be combined into a plurality of sets of the groups.

  9. Continuous Manufacturing of Recombinant Therapeutic Proteins: Upstream and Downstream Technologies.

    PubMed

    Patil, Rohan; Walther, Jason

    2017-03-07

    Continuous biomanufacturing of recombinant therapeutic proteins offers several potential advantages over conventional batch processing, including reduced cost of goods, more flexible and responsive manufacturing facilities, and improved and consistent product quality. Although continuous approaches to various upstream and downstream unit operations have been considered and studied for decades, in recent years interest and application have accelerated. Researchers have achieved increasingly higher levels of process intensification, and have also begun to integrate different continuous unit operations into larger, holistically continuous processes. This review first discusses approaches for continuous cell culture, with a focus on perfusion-enabling cell separation technologies including gravitational, centrifugal, and acoustic settling, as well as filtration-based techniques. We follow with a review of various continuous downstream unit operations, covering categories such as clarification, chromatography, formulation, and viral inactivation and filtration. The review ends by summarizing case studies of integrated and continuous processing as reported in the literature.

  10. Set a Structure of Objects with a Help of Grouping to Ten Strategy to Understand the Idea of Unitizing

    ERIC Educational Resources Information Center

    Assiti, Saliza Safta; Zulkardi; Darmawijoyo

    2013-01-01

    The intention of the present study is to understand how pupils can learn to make groups of ten in order to grasp the idea of unitizing. The pupils were given a contextual problem, "Counting the Beads", in order to promote their understanding of the idea of unitizing. The process of designing the problem was based on the 5 tenets of…

  11. United States transportation fuel economics (1975 - 1995)

    NASA Technical Reports Server (NTRS)

    Alexander, A. D., III

    1975-01-01

    United States transportation fuel economics for the period 1975 to 1995 are evaluated in terms of fuel resource options, processing alternatives, and attendant costs. The U.S. energy resource base is reviewed, portable fuel-processing alternatives are assessed, and selected future aircraft fuel options - JP fuel, liquid methane, and liquid hydrogen - are evaluated economically. Primary emphasis is placed on evaluating future aircraft fuel options and economics to provide guidance for NASA's future strategy in the development of aviation and air transportation research and technology.

  12. Quantum memory on a charge qubit in an optical microresonator

    NASA Astrophysics Data System (ADS)

    Tsukanov, A. V.

    2017-10-01

    A quantum-memory unit scheme based on a semiconductor structure with quantum dots is proposed. The unit includes a microresonator with single and double quantum dots performing frequency-converter and charge-qubit functions, respectively. The writing process is carried out in several stages and is controlled by the optical fields of the resonator and laser. It is shown that, to achieve a high writing probability, it is necessary to use high-Q resonators and to be able to suppress relaxation processes in the quantum dots.

  13. Collaboration processes and perceived effectiveness of integrated care projects in primary care: a longitudinal mixed-methods study.

    PubMed

    Valentijn, Pim P; Ruwaard, Dirk; Vrijhoef, Hubertus J M; de Bont, Antoinette; Arends, Rosa Y; Bruijnzeels, Marc A

    2015-10-09

    Collaborative partnerships are considered an essential strategy for integrating disjointed local health and social services. Currently, little evidence is available on how integrated care arrangements between professionals and organisations are achieved through the evolution of collaboration processes over time. The first aim was to develop a typology of integrated care projects (ICPs) based on the final degree of integration as perceived by multiple stakeholders. The second aim was to study how types of integration differ in changes of collaboration processes over time and in final perceived effectiveness. A longitudinal mixed-methods study design based on two data sources (surveys and interviews) was used to identify the perceived degree of integration and patterns in collaboration among 42 ICPs in primary care in the Netherlands. We used cluster analysis to identify distinct subgroups of ICPs based on the final perceived degree of integration from a professional, organisational and system perspective. With the use of ANOVAs, the subgroups were contrasted based on: 1) changes in collaboration processes over time (shared ambition, interests and mutual gains, relationship dynamics, organisational dynamics and process management) and 2) final perceived effectiveness (i.e. rated success) at the professional, organisational and system levels. The ICPs were classified into three subgroups: 'United Integration Perspectives (UIP)', 'Disunited Integration Perspectives (DIP)' and 'Professional-oriented Integration Perspectives (PIP)'. ICPs within the UIP subgroup showed the strongest increase in trust-based (mutual gains and relationship dynamics) as well as control-based (organisational dynamics and process management) collaboration processes and had the highest overall effectiveness rates. By contrast, ICPs within the DIP subgroup decreased in collaboration processes and had the lowest overall effectiveness rates. 
ICPs within the PIP subgroup increased in control-based collaboration processes (organisational dynamics and process management) and had the highest effectiveness rates at the professional level. The differences across the three subgroups in terms of the development of collaboration processes and the final perceived effectiveness provide evidence that united stakeholder perspectives are achieved through a constructive collaboration process over time. Disunited perspectives at the professional, organisational and system levels can be aligned by both trust-based and control-based collaboration processes.

  14. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit.

    PubMed

    Badal, Andreu; Badano, Aldo

    2009-11-01

    Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: the use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA™ programming model (NVIDIA Corporation, Santa Clara, CA). An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A 27-fold speed-up was obtained using a GPU compared to a single-core CPU. The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
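
    The reason photon transport maps so well onto GPUs is that each photon history is statistically independent, so histories can run one per thread. A toy Monte Carlo sketch (not the PENELOPE physics; a single homogeneous slab with a made-up attenuation coefficient) illustrates the per-photon sampling involved:

```python
import numpy as np

def transmitted_fraction(n_photons, mu, thickness, seed=0):
    """Toy Monte Carlo: photons enter a homogeneous slab and travel an
    exponentially distributed free path (mean 1/mu) to their first
    interaction. Photons whose free path exceeds the slab thickness are
    transmitted without interacting. Every history is independent, which
    is exactly what a GPU exploits by running one history per thread."""
    rng = np.random.default_rng(seed)
    free_paths = rng.exponential(scale=1.0 / mu, size=n_photons)
    return np.mean(free_paths > thickness)

# Analytic uncollided fraction is exp(-mu * L) = exp(-1) ~ 0.368.
est = transmitted_fraction(200_000, mu=2.0, thickness=0.5)
print(round(est, 3))
```

The vectorized NumPy call plays the role of the parallel threads; a CUDA kernel would perform the same independent sampling per thread, with scoring done via atomic tallies.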

  15. RTD-based Material Tracking in a Fully-Continuous Dry Granulation Tableting Line.

    PubMed

    Martinetz, M C; Karttunen, A-P; Sacher, S; Wahl, P; Ketolainen, J; Khinast, J G; Korhonen, O

    2018-06-06

    Continuous manufacturing (CM) offers quality and cost-effectiveness benefits over the currently dominant batch processing. One challenge that needs to be addressed when implementing CM is traceability of materials through the process, which is needed for the batch/lot definition and control strategy. In this work, the residence time distributions (RTDs) of the single unit operations (blender, roller compactor and tablet press) of a continuous dry granulation tableting line were captured with NIR-based methods at selected mass flow rates to create training data. RTD models for the continuously operated unit operations and for the entire line were developed based on transfer functions. For the semi-continuously operated bucket conveyor and pneumatic transport, an assumption based on the operation frequency was used. For validation of the parametrized process model, a pre-defined API step change and its propagation through the manufacturing line were computed and compared to multi-scale experimental runs conducted with the fully assembled, continuously operated manufacturing line. This novel approach showed very good prediction power at the selected mass flow rates for a complete continuous dry granulation line. Furthermore, it demonstrates the capabilities of process simulation as a tool to support the development and control of pharmaceutical manufacturing processes. Copyright © 2018. Published by Elsevier B.V.
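
    The transfer-function approach can be sketched numerically: for units in series, transfer functions multiply, so their impulse responses (the RTDs) convolve, and the line's response to an inlet step change is the running integral of the line RTD. The tanks-in-series parameters below are invented for illustration and are not the paper's fitted values:

```python
import numpy as np
from math import factorial

def tanks_in_series_rtd(t, tau, n):
    """RTD of an n-tanks-in-series unit with mean residence time tau:
    E(t) = t**(n-1) * exp(-n*t/tau) / ((n-1)! * (tau/n)**n)."""
    return t ** (n - 1) * np.exp(-n * t / tau) / (factorial(n - 1) * (tau / n) ** n)

dt = 0.01
t = np.arange(0, 60, dt)
# Hypothetical RTDs for two units in series (say, blender and tablet press).
E_blender = tanks_in_series_rtd(t, tau=5.0, n=3)
E_press = tanks_in_series_rtd(t, tau=2.0, n=5)

# RTD of the line = convolution of the unit RTDs.
E_line = np.convolve(E_blender, E_press)[: len(t)] * dt

# Outlet response to an API step change at the inlet = running integral of E.
step_response = np.cumsum(E_line) * dt

print(round(E_line.sum() * dt, 3))   # ~ 1.0 (mass balance check)
print(round(step_response[-1], 3))   # ~ 1.0 (step has fully propagated)
```

Delaying and scaling such step responses is how a parametrized line model predicts where a disturbance entering one unit shows up in the tablets, which underpins material tracking and batch definition.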

  16. Evolution of the Power Processing Units Architecture for Electric Propulsion at CRISA

    NASA Astrophysics Data System (ADS)

    Palencia, J.; de la Cruz, F.; Wallace, N.

    2008-09-01

    Since 2002, the team formed by EADS Astrium CRISA, Astrium GmbH Friedrichshafen, and QinetiQ has participated in several flight programs where Electric Propulsion based on Kaufman-type ion thrusters is the baseline concept. In 2002, CRISA won the contract for the development of the Ion Propulsion Control Unit (IPCU) for GOCE. This unit, together with the T5 thruster by QinetiQ, provides near-perfect atmospheric drag compensation, offering thrust levels in the range of 1 to 20 mN. By the end of 2003, CRISA started the adaptation of the IPCU concept to the QinetiQ T6 ion thruster for the Alphabus program. This paper shows how the Power Processing Unit design evolved over time, including the current developments.

  17. 32 CFR 807.1 - General requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Information Management will set up procedures to meet these needs and will make available Master Publications... chief, base information management, for processing. (c) Units will process requests under the Foreign... Technical Information Service (NTIS), Defense Publication Section, US Department of Commerce, 4285 Port...

  18. 32 CFR 807.1 - General requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Information Management will set up procedures to meet these needs and will make available Master Publications... chief, base information management, for processing. (c) Units will process requests under the Foreign... Technical Information Service (NTIS), Defense Publication Section, US Department of Commerce, 4285 Port...

  19. 32 CFR 807.1 - General requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Information Management will set up procedures to meet these needs and will make available Master Publications... chief, base information management, for processing. (c) Units will process requests under the Foreign... Technical Information Service (NTIS), Defense Publication Section, US Department of Commerce, 4285 Port...

  20. 32 CFR 807.1 - General requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Information Management will set up procedures to meet these needs and will make available Master Publications... chief, base information management, for processing. (c) Units will process requests under the Foreign... Technical Information Service (NTIS), Defense Publication Section, US Department of Commerce, 4285 Port...

  1. Increasing relational memory in childhood with unitization strategies.

    PubMed

    Robey, Alison; Riggins, Tracy

    2018-01-01

    Young children often experience relational memory failures, which are thought to result from immaturity of the recollection processes presumed to be required for these tasks. However, research in adults has suggested that relational memory tasks can be accomplished using familiarity, a process thought to be mature by the end of early childhood. The goal of the present study was to determine whether relational memory performance could be improved in childhood by teaching young children memory strategies that have been shown to increase the contribution of familiarity in adults (i.e., unitization). Groups of 6- and 8-year-old children were taught to use visualization strategies that either unitized or did not unitize pictures and colored borders. Estimates of familiarity and recollection were extracted by fitting receiver operating characteristic curves (Yonelinas, Journal of Experimental Psychology: Learning, Memory, and Cognition 20, 1341-1354, 1994; Yonelinas, Memory & Cognition 25, 747-763, 1997) based on dual-process models of recognition. Bayesian analysis revealed that strategies involving unitization improved memory performance and increased the contribution of familiarity in both age groups.
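
    The dual-process model behind these estimates predicts an asymmetric ROC: recollection contributes a threshold component that raises the y-intercept, while familiarity contributes a curvilinear signal-detection component. A minimal sketch of the Yonelinas prediction, with made-up parameter values (this generates predicted ROC points; the study itself fits the model in the opposite direction, from observed points to R and d'):

```python
from math import erf, sqrt

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def dual_process_roc(R, d_prime, criteria):
    """Yonelinas dual-process prediction: recollection is a threshold
    process (probability R), familiarity is equal-variance signal
    detection with sensitivity d'. Returns (false-alarm, hit) pairs,
    one per response criterion c."""
    points = []
    for c in criteria:
        hit = R + (1.0 - R) * Phi(d_prime - c)
        fa = Phi(-c)
        points.append((fa, hit))
    return points

# More liberal criteria sweep the curve toward (1, 1); a nonzero R lifts
# the left end of the curve, producing the characteristic asymmetry.
for fa, hit in dual_process_roc(R=0.3, d_prime=1.0, criteria=[1.5, 1.0, 0.5, 0.0]):
    print(f"FA={fa:.3f}  hit={hit:.3f}")
```

Unitization is hypothesized to raise the familiarity parameter d' for relational information, which is why fitted ROCs can separate the two contributions.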

  2. Process Improvement to Enhance Quality in a Large Volume Labor and Birth Unit.

    PubMed

    Bell, Ashley M; Bohannon, Jessica; Porthouse, Lisa; Thompson, Heather; Vago, Tony

    The goal of the perinatal team at Mercy Hospital St. Louis is to provide a quality patient experience during labor and birth. After the move to a new labor and birth unit in 2013, the team recognized that many of the routines and practices needed to be modified based on different demands. The Lean process was used to plan and implement required changes. This technique was chosen because it is based on feedback from clinicians, teamwork, strategizing, and immediate evaluation and implementation of common-sense solutions. Through rapid improvement events, presence of leaders in the work environment, and daily huddles, team member engagement and communication were enhanced. The process allowed for team members to offer ideas, test these ideas, and evaluate results, all within a rapid time frame. For 9 months, frontline clinicians met monthly for a weeklong rapid improvement event to create better experiences for childbearing women and those who provide their care, using Lean concepts. At the end of each week, an implementation plan and metrics were developed to help ensure sustainment. The issues that were the focus of these process improvements included on-time initiation of scheduled cases such as induction of labor and cesarean birth, timely and efficient assessment and triage disposition, postanesthesia care and immediate newborn care completed within approximately 2 hours, transfer from the labor unit to the mother baby unit, and emergency transfers to the main operating room and intensive care unit. On-time case initiation for labor induction and cesarean birth improved, length of stay in obstetric triage decreased, postanesthesia recovery care was reorganized to be completed within the expected 2-hour standard time frame, and emergency transfers to the main hospital operating room and intensive care units were standardized and enhanced for efficiency and safety. Participants were pleased with the process improvements and quality outcomes. 
Working together as a team using the Lean process, frontline clinicians identified areas that needed improvement, developed and implemented successful strategies that addressed each gap, and enhanced the quality and safety of care for a large volume perinatal service.

  3. Educating fellows in practice-based learning and improvement and systems-based practice: The value of quality improvement in clinical practice.

    PubMed

    Carey, William A; Colby, Christopher E

    2013-02-01

    In 1999, the Accreditation Council for Graduate Medical Education identified 6 general competencies in which all residents must receive training. In the decade since these requirements went into effect, practice-based learning and improvement (PBLI) and systems-based practice (SBP) have proven to be the most challenging competencies to teach and assess. Because PBLI and SBP both are related to quality improvement (QI) principles and processes, we developed a QI-based curriculum to teach these competencies to our fellows. This experiential curriculum engaged our fellows in our neonatal intensive care unit's (NICU's) structured QI process. After identifying specific patient outcomes in need of improvement, our fellows applied validated QI methods to develop evidence-based treatment protocols for our neonatal intensive care unit. These projects led to immediate and meaningful improvements in patient care and also afforded our fellows various means by which to demonstrate their competence in PBLI and SBP. Our use of portfolios enabled us to document our fellows' performance in these competencies quite easily and comprehensively. Given the clinical and educational structures common to most intensive care unit-based training programs, we believe that a QI-based curriculum such as ours could be adapted by others to teach and assess PBLI and SBP. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. Current knowledge and potential applications of cavitation technologies for the petroleum industry.

    PubMed

    Avvaru, Balasubrahmanyam; Venkateswaran, Natarajan; Uppara, Parasuveera; Iyengar, Suresh B; Katti, Sanjeev S

    2018-04-01

    Technologies based on cavitation, produced by either ultrasonic or hydrodynamic means, are part of a growing literature on individual refinery unit processes. In this review, we explain the mechanisms through which these cavitation technologies intensify individual unit processes such as enhanced oil recovery, demulsification of water-in-oil emulsions during the desalting stage, crude oil viscosity reduction, oxidative desulphurisation/demetallization, and crude oil upgrading. Apart from these refinery processes, applications of this technology are also mentioned for other potential crude oil sources such as oil shale and oil sand extraction. The relative advantages and current situation of each application/process at commercial scale are explained. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Degradation data analysis based on a generalized Wiener process subject to measurement error

    NASA Astrophysics Data System (ADS)

    Li, Junxing; Wang, Zhihua; Zhang, Yongbo; Fu, Huimin; Liu, Chengrui; Krishnaswamy, Sridhar

    2017-09-01

    Wiener processes have received considerable attention in degradation modeling over the last two decades. In this paper, we propose a generalized Wiener process degradation model that takes unit-to-unit variation, time-correlated structure and measurement error into consideration simultaneously. The constructed methodology subsumes a series of models studied in the literature as limiting cases. A simple method is given to determine the transformed time scale forms of the Wiener process degradation model. Model parameters can then be estimated based on a maximum likelihood estimation (MLE) method. The cumulative distribution function (CDF) and the probability density function (PDF) of the Wiener process with measurement errors are given based on the concept of the first hitting time (FHT). The percentiles of performance degradation (PD) and failure time distribution (FTD) are also obtained. Finally, a comprehensive simulation study is carried out to demonstrate the necessity of incorporating measurement errors in the degradation model and the efficiency of the proposed model. Two illustrative real applications involving the degradation of carbon-film resistors and the wear of sliding metal are given. The comparative results show that the constructed approach yields reasonable results with enhanced inference precision.
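
    A generalized Wiener degradation model of this kind is straightforward to simulate. The sketch below (all parameter values are invented, and the power-law time transformation is one common parametric choice, not necessarily the paper's) generates latent paths with unit-to-unit drift variation, Brownian time-correlated structure, and additive measurement error, then estimates the first hitting time of a failure threshold empirically:

```python
import numpy as np

def simulate_degradation(n_units, n_steps, dt, drift, drift_sd, sigma_b,
                         sigma_eps, time_power=1.0, seed=0):
    """Simulate a generalized Wiener degradation process
        X_i(t) = eta_i * Lambda(t) + sigma_b * B(Lambda(t)),
    with unit-to-unit drift eta_i ~ N(drift, drift_sd**2), transformed
    time scale Lambda(t) = t**time_power, and noisy observations
        Y_i(t) = X_i(t) + eps,  eps ~ N(0, sigma_eps**2).
    """
    rng = np.random.default_rng(seed)
    t = np.arange(1, n_steps + 1) * dt
    lam = t ** time_power
    d_lam = np.diff(np.concatenate(([0.0], lam)))
    eta = rng.normal(drift, drift_sd, size=(n_units, 1))  # unit-to-unit drift
    # Brownian increments scale with the transformed-time increments.
    inc = eta * d_lam + sigma_b * rng.normal(0.0, np.sqrt(d_lam),
                                             size=(n_units, n_steps))
    x = np.cumsum(inc, axis=1)                        # latent degradation
    y = x + rng.normal(0.0, sigma_eps, size=x.shape)  # noisy observations
    return t, x, y

t, x, y = simulate_degradation(n_units=500, n_steps=200, dt=0.1, drift=1.0,
                               drift_sd=0.05, sigma_b=0.3, sigma_eps=0.1)
# Empirical first-hitting-time estimate for a failure threshold of 10; with
# mean drift 1 and Lambda(t) = t, the mean FHT should be close to 10.
fht = t[np.argmax(x >= 10.0, axis=1)]
print(round(fht.mean(), 2))
```

Note the FHT is defined on the latent paths x, not the noisy observations y; the paper's point is precisely that ignoring the measurement error in y biases inference about x.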

  6. Bio-inspired multi-mode optic flow sensors for micro air vehicles

    NASA Astrophysics Data System (ADS)

    Park, Seokjun; Choi, Jaehyuk; Cho, Jihyun; Yoon, Euisik

    2013-06-01

    Monitoring wide-field surrounding information is essential for vision-based autonomous navigation in micro air vehicles (MAVs). Our image-cube (iCube) module, which consists of multiple sensors facing different angles in 3-D space, can be applied to wide-field-of-view optic flow estimation (μ-Compound eyes) and to attitude control (μ-Ocelli) in the Micro Autonomous Systems and Technology (MAST) platforms. In this paper, we report an analog/digital (A/D) mixed-mode optic-flow sensor, which generates both optic flows and normal images in different modes for μ-Compound eyes and μ-Ocelli applications. The sensor employs a time-stamp based optic flow algorithm which is modified from the conventional EMD (Elementary Motion Detector) algorithm to give an optimum partitioning of hardware blocks in the analog and digital domains as well as adequate allocation of pixel-level, column-parallel, and chip-level signal processing. Temporal filtering, which may require huge hardware resources if implemented in the digital domain, remains in a pixel-level analog processing unit. The rest of the blocks, including feature detection and time-stamp latching, are implemented using digital circuits in a column-parallel processing unit. Finally, time-stamp information is decoded into velocity using look-up tables, multiplications, and simple subtraction circuits in a chip-level processing unit, thus significantly reducing core digital processing power consumption. In the normal image mode, the sensor generates 8-b digital images using single-slope ADCs in the column unit. In the optic flow mode, the sensor estimates 8-b 1-D optic flows from the integrated mixed-mode algorithm core and 2-D optic flows with external time-stamp processing, respectively.

  7. Temporal Processing in the Visual Cortex of the Awake and Anesthetized Rat.

    PubMed

    Aasebø, Ida E J; Lepperød, Mikkel E; Stavrinou, Maria; Nøkkevangen, Sandra; Einevoll, Gaute; Hafting, Torkel; Fyhn, Marianne

    2017-01-01

    The activity pattern and temporal dynamics within and between neuron ensembles are essential features of information processing and are believed to be profoundly affected by anesthesia. Much of our general understanding of sensory information processing, including computational models aimed at mathematically simulating sensory information processing, relies on parameters derived from recordings conducted on animals under anesthesia. Due to the high variety of neuronal subtypes in the brain, population-based estimates of the impact of anesthesia may conceal unit- or ensemble-specific effects of the transition between states. Using tetrodes chronically implanted in the primary visual cortex (V1) of rats, we conducted extracellular recordings of single units and followed the same cell ensembles in the awake and anesthetized states. We found that the transition from wakefulness to anesthesia involves unpredictable changes in temporal response characteristics. The latency of single-unit responses to visual stimulation was delayed in anesthesia, with large individual variations between units. Pair-wise correlations between units increased under anesthesia, indicating more synchronized activity. Further, the units within an ensemble show reproducible temporal activity patterns in response to visual stimuli that change between states, suggesting state-dependent sequences of activity. The current dataset, with recordings from the same neural ensembles across states, is well suited for validating and testing computational network models. This can lead to testable predictions, bring a deeper understanding of the experimental findings and improve models of neural information processing. Here, we exemplify such a workflow using a Brunel network model.
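
    Pair-wise correlation between simultaneously recorded units, one of the synchrony measures reported above, is commonly computed from binned spike counts. A self-contained sketch with synthetic spike trains (two units driven by a shared input and one independent unit; all timings, rates and bin sizes are invented, not taken from the recordings):

```python
import numpy as np

def pairwise_correlations(spike_times, t_max, bin_size):
    """Bin each unit's spike times and return the Pearson correlation
    matrix of the binned counts, a standard synchrony measure."""
    edges = np.arange(0.0, t_max + bin_size, bin_size)
    counts = np.array([np.histogram(st, bins=edges)[0] for st in spike_times])
    return np.corrcoef(counts)

rng = np.random.default_rng(1)
# Two units sharing a common drive (so their spikes co-occur, up to a small
# jitter) and one unit firing independently at the same overall rate.
common = np.sort(rng.uniform(0, 10, 200))
unit_a = common + rng.normal(0, 0.002, common.size)
unit_b = common + rng.normal(0, 0.002, common.size)
unit_c = np.sort(rng.uniform(0, 10, 200))

corr = pairwise_correlations([unit_a, unit_b, unit_c], t_max=10.0, bin_size=0.05)
print(round(corr[0, 1], 2), round(corr[0, 2], 2))  # high vs. near-zero
```

An anesthesia-induced increase in shared drive would show up exactly as the first kind of pairing: higher off-diagonal entries of this matrix for the same pairs of units.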

  8. Temporal Processing in the Visual Cortex of the Awake and Anesthetized Rat

    PubMed Central

    Aasebø, Ida E. J.; Stavrinou, Maria; Nøkkevangen, Sandra; Einevoll, Gaute

    2017-01-01

    Abstract The activity pattern and temporal dynamics within and between neuron ensembles are essential features of information processing and are believed to be profoundly affected by anesthesia. Much of our general understanding of sensory information processing, including computational models aimed at mathematically simulating sensory information processing, relies on parameters derived from recordings conducted on animals under anesthesia. Due to the high variety of neuronal subtypes in the brain, population-based estimates of the impact of anesthesia may conceal unit- or ensemble-specific effects of the transition between states. Using tetrodes chronically implanted in the primary visual cortex (V1) of rats, we conducted extracellular recordings of single units and followed the same cell ensembles in the awake and anesthetized states. We found that the transition from wakefulness to anesthesia involves unpredictable changes in temporal response characteristics. The latency of single-unit responses to visual stimulation was delayed in anesthesia, with large individual variations between units. Pair-wise correlations between units increased under anesthesia, indicating more synchronized activity. Further, the units within an ensemble show reproducible temporal activity patterns in response to visual stimuli that change between states, suggesting state-dependent sequences of activity. The current dataset, with recordings from the same neural ensembles across states, is well suited for validating and testing computational network models. This can lead to testable predictions, bring a deeper understanding of the experimental findings and improve models of neural information processing. Here, we exemplify such a workflow using a Brunel network model. PMID:28791331

  9. United States Air Force Computer-Aided Acquisition and Logistics Support (CALS) Evolution of Computer Integrated Manufacturing (CIM) Technologies

    DTIC Science & Technology

    1988-11-01

    Manufacturing System 22 4. Similar Parts Based on Shape or Manufacturing Process 24 5. Projected Annual Unit Robot Sales and Installed Base Through 1992 30 6. U.S...effort needed to perform personnel, product design, marketing, advertising, and finance tasks of the firm. Level III controls the resource planning and accounting functions of the firm. Systems at this level support purchasing, accounts payable, accounts receivable, master scheduling and sales

  10. A GPS-based Real-time Road Traffic Monitoring System

    NASA Astrophysics Data System (ADS)

    Tanti, Kamal Kumar

    In recent years, monitoring systems have tended towards ever more automatic, reliably interconnected, distributed and autonomous operation. Specifically, the measurement, logging, data processing and interpretation activities may be carried out by separate units at different locations in near real-time. The recent evolution of mobile communication devices and communication technologies has fostered a growing interest in GIS- and GPS-based location-aware systems and services. This paper describes a real-time road traffic monitoring system based on integrated mobile field devices (GPS/GSM/IOs) working in tandem with advanced GIS-based application software providing on-the-fly authentication for real-time monitoring and security enhancement. The described system is a fully automated, continuous, real-time monitoring system that employs GPS sensors; Ethernet and/or serial port communication is used to transfer data between GPS receivers at target points and a central processing computer. The data can be processed locally or remotely based on the client's requirements. Due to the modular architecture of the system, other sensor types may be supported with minimal effort. Measurements from the distributed network are transmitted via cellular SIM cards to a Control Unit, which provides for post-processing and network management. The Control Unit may be remotely accessed via an Internet connection. The new system will not only provide more consistent data about road traffic conditions but will also provide methods for integrating with other Intelligent Transportation Systems (ITS). GSM technology is used for communication between the mobile devices and the central monitoring service. The resulting system is characterized by autonomy, reliability and a high degree of automation.

  11. Electrochemical and ab initio investigations to design a new phenothiazine based organic redox polymeric material for metal-ion battery cathodes.

    PubMed

    Godet-Bar, T; Leprêtre, J-C; Le Bacq, O; Sanchez, J-Y; Deronzier, A; Pasturel, A

    2015-10-14

    Different N-substituted phenothiazines have been synthesized and their electrochemical behavior has been investigated in CH3CN in order to design the best polyphenothiazine based cathodic material candidate for lithium batteries. These compounds exhibit two successive reversible one-electron oxidation processes. Ab initio calculations demonstrate that the potential of the first process is a result of both the hybridization effects between the substituent and the phenothiazine unit as well as the change of conformation of the phenothiazine heterocycle during the oxidation process. More specifically, we show that an asymmetric molecular orbital spreading throughout an external cycle of the phenothiazine unit and the alkyl fragment is formed only if the alkyl fragment is long enough (from the methyl moiety onwards) and is at the origin of the bent conformation for N-substituted phenothiazines during oxidation. Electrochemical investigations supported by ab initio calculations allow the selection of a phenothiazinyl unit which is then polymerized by a Suzuki coupling strategy to avoid the common solubilization issue in carbonate-based liquid electrolytes of lithium cells. The first electrochemical measurements performed show that phenothiazine derivatives pave the way for a promising family of redox polymers intended to be used as organic positives for lithium batteries.

  12. Scalable and responsive event processing in the cloud

    PubMed Central

    Suresh, Visalakshmi; Ezhilchelvan, Paul; Watson, Paul

    2013-01-01

    Event processing involves continuous evaluation of queries over streams of events. Response-time optimization is traditionally done over a fixed set of nodes and/or by using metrics measured at query-operator levels. Cloud computing makes it easy to acquire and release computing nodes as required. Leveraging this flexibility, we propose a novel, queueing-theory-based approach for meeting specified response-time targets against fluctuating event arrival rates by drawing only the necessary amount of computing resources from a cloud platform. In the proposed approach, the entire processing engine of a distinct query is modelled as an atomic unit for predicting response times. Several such units hosted on a single node are modelled as a multiple class M/G/1 system. These aspects eliminate intrusive, low-level performance measurements at run-time, and also offer portability and scalability. Using model-based predictions, cloud resources are efficiently used to meet response-time targets. The efficacy of the approach is demonstrated through cloud-based experiments. PMID:23230164
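
    The multiple-class M/G/1 prediction described above can be sketched with the Pollaczek–Khinchine formula. This is a minimal illustration of the queueing idea only, not the authors' processing engine; the function name and all rates are hypothetical:

    ```python
    def mg1_response_times(arrival_rates, mean_service, second_moment_service):
        """Mean response time per query class in a multi-class M/G/1 FCFS queue.

        arrival_rates[i]         -- Poisson arrival rate of class-i events
        mean_service[i]          -- E[S_i], mean service time of class i
        second_moment_service[i] -- E[S_i^2], second moment of service time
        """
        # Total utilization must stay below 1 for the node to be stable;
        # otherwise another cloud node must be acquired.
        rho = sum(lam * s for lam, s in zip(arrival_rates, mean_service))
        assert rho < 1, "node overloaded; draw another node from the cloud"
        # Pollaczek-Khinchine: under FCFS all classes share one mean waiting time.
        wait = sum(lam * m2 for lam, m2 in
                   zip(arrival_rates, second_moment_service)) / (2.0 * (1.0 - rho))
        return [wait + s for s in mean_service]
    ```

    For a single class with exponential service (E[S] = 1, E[S²] = 2) and arrival rate 0.5, this reduces to the familiar M/M/1 mean response time 1/(μ − λ) = 2.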

  13. Modeling of the radiation belt magnetosphere in decisional timeframes

    DOEpatents

    Koller, Josef; Reeves, Geoffrey D; Friedel, Reiner H.W.

    2013-04-23

    Systems and methods for calculating L* in the magnetosphere with essentially the same accuracy as a physics-based model, at many times the speed, by training a surrogate model to reproduce the physics-based model. The trained model can then beneficially process input data falling within its training range. The surrogate model can be a feedforward neural network and the physics-based model can be the TSK03 model. Operatively, the surrogate model can use the parameters on which the physics-based model was based, and/or spatial data for the location where L* is to be calculated. Surrogate models should be provided for each of a plurality of pitch angles. Accordingly, a surrogate model having a closed drift shell can be used from the plurality of models. The feedforward neural network can have a plurality of input-layer units, there being at least one input-layer unit for each physics-based model parameter, a plurality of hidden-layer units and at least one output unit for the value of L*.
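
    The forward pass of the kind of surrogate the patent describes — model parameters in, a single L* value out, one hidden layer — can be sketched in a few lines. The weights below are placeholders, not a trained TSK03 surrogate:

    ```python
    import math

    def surrogate_forward(x, W1, b1, W2, b2):
        """One-hidden-layer feedforward network: inputs -> tanh hidden -> linear L*.

        x  : list of input-layer values (one per physics-based model parameter)
        W1 : input-to-hidden weights, W1[i][j] connects input i to hidden unit j
        b1 : hidden-unit biases; W2, b2 : hidden-to-output weights and output bias
        """
        h = [math.tanh(sum(xi * W1[i][j] for i, xi in enumerate(x)) + b1[j])
             for j in range(len(b1))]
        return sum(hj * wj for hj, wj in zip(h, W2)) + b2
    ```

    With zero weights the network simply outputs its bias, which makes the plumbing easy to check before training.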

  14. Delta II JPSS-1 First Stage Transport to SLC-1 for Processing

    NASA Image and Video Library

    2016-04-05

    The first stage of United Launch Alliance Delta II rocket for the Joint Polar Satellite System, or JPSS-1, is transported from NASA Hangar 836 to the Horizontal Processing Facility, located at Vandenberg Air Force Base in California.

  15. Writing Editorials.

    ERIC Educational Resources Information Center

    Pappas, Marjorie L.

    2003-01-01

    Presents a thematic unit for middle schools on editorial writing, or persuasive writing, based on the Pathways Model for information skills lessons. Includes assessing other editorials; student research process journals; information literacy and process skills; and two lesson plans that involve library media specialists as well as teachers. (LRW)

  16. Process evaluation to explore internal and external validity of the "Act in Case of Depression" care program in nursing homes.

    PubMed

    Leontjevas, Ruslan; Gerritsen, Debby L; Koopmans, Raymond T C M; Smalbrugge, Martin; Vernooij-Dassen, Myrra J F J

    2012-06-01

    A multidisciplinary, evidence-based care program to improve the management of depression in nursing home residents was implemented and tested using a stepped-wedge design in 23 nursing homes (NHs): "Act in case of Depression" (AiD). Before effect analyses, the aim was to evaluate AiD process data on sampling quality (recruitment and randomization, reach) and intervention quality (relevance and feasibility, extent to which AiD was performed), which can be used for understanding internal and external validity. In this article, a model is presented that divides process evaluation data into first- and second-order process data. Qualitative and quantitative data based on personal files of residents, interviews of nursing home professionals, and a research database were analyzed according to the following process evaluation components: sampling quality and intervention quality. The setting was nursing homes. The pattern of residents' informed consent rates differed for dementia special care units and somatic units during the study. The nursing home staff was satisfied with the AiD program and reported that the program was feasible and relevant. With the exception of the first screening step (nursing staff members using a short observer-based depression scale), AiD components were not performed fully by NH staff as prescribed in the AiD protocol. Although NH staff found the program relevant and feasible and was satisfied with the program content, individual AiD components may have different feasibility. The results on sampling quality implied that statistical analyses of AiD effectiveness should account for the type of unit, whereas the findings on intervention quality implied that, next to the type of unit, analyses should account for the extent to which individual AiD program components were performed. In general, our first-order process data evaluation confirmed internal and external validity of the AiD trial, and this evaluation enabled further statistical fine-tuning. The importance of evaluating the first-order process data before executing statistical effect analyses is thus underlined. Copyright © 2012 American Medical Directors Association, Inc. Published by Elsevier Inc. All rights reserved.

  17. Databases and coordinated research projects at the IAEA on atomic processes in plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braams, Bastiaan J.; Chung, Hyun-Kyung

    2012-05-25

    The Atomic and Molecular Data Unit at the IAEA works with a network of national data centres to encourage and coordinate production and dissemination of fundamental data for atomic, molecular and plasma-material interaction (A+M/PMI) processes that are relevant to the realization of fusion energy. The Unit maintains numerical and bibliographical databases and has started a Wiki-style knowledge base. The Unit also contributes to A+M database interface standards and provides a search engine that offers a common interface to multiple numerical A+M/PMI databases. Coordinated Research Projects (CRPs) bring together fusion energy researchers and atomic, molecular and surface physicists for joint work towards the development of new data and new methods. The databases and current CRPs on A+M/PMI processes are briefly described here.

  18. System and method for motor speed estimation of an electric motor

    DOEpatents

    Lu, Bin [Kenosha, WI; Yan, Ting [Brookfield, WI; Luebke, Charles John [Sussex, WI; Sharma, Santosh Kumar [Viman Nagar, IN

    2012-06-19

    A system and method for a motor management system includes a computer readable storage medium and a processing unit. The processing unit is configured to determine a voltage value of a voltage input to an alternating current (AC) motor, determine a frequency value of at least one of a voltage input and a current input to the AC motor, determine a load value from the AC motor, and access a set of motor nameplate data, where the set of motor nameplate data includes a rated power, a rated speed, a rated frequency, and a rated voltage of the AC motor. The processing unit is also configured to estimate a motor speed based on the voltage value, the frequency value, the load value, and the set of nameplate data, and to store the motor speed on the computer readable storage medium.
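
    A common textbook simplification of nameplate-based speed estimation for an induction motor is to scale the rated slip by load fraction and by the square of the voltage ratio. The sketch below illustrates only that idea, not the patented method; the pole-pair count is inferred from the nameplate, and rated power is not needed in this particular simplification:

    ```python
    def estimate_motor_speed(voltage, frequency, load_fraction,
                             rated_speed, rated_frequency, rated_voltage):
        """Estimate induction-motor speed (rpm) from nameplate data and operating point."""
        # Rated speed sits just below synchronous speed, so the pole-pair
        # count is the nearest integer to 60 * f_rated / n_rated.
        pole_pairs = round(60.0 * rated_frequency / rated_speed)
        sync_rated = 60.0 * rated_frequency / pole_pairs   # rpm at rated frequency
        rated_slip = (sync_rated - rated_speed) / sync_rated
        # Slip is roughly proportional to load and to (V_rated / V)^2.
        slip = rated_slip * load_fraction * (rated_voltage / voltage) ** 2
        sync_actual = 60.0 * frequency / pole_pairs
        return sync_actual * (1.0 - slip)
    ```

    For a typical 460 V, 60 Hz, 1750 rpm nameplate, the sketch returns 1750 rpm at rated conditions and about 1775 rpm at half load.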

  19. Activity-based costing in radiology. Application in a pediatric radiological unit.

    PubMed

    Laurila, J; Suramo, I; Brommels, M; Tolppanen, E M; Koivukangas, P; Lanning, P; Standertskjöld-Nordenstam, G

    2000-03-01

    To get an informative and detailed picture of resource utilization in a radiology department in order to support its pricing and management. A system based mainly on the theoretical foundations of activity-based costing (ABC) was designed, tested and compared with conventional costing. The study was performed at the Pediatric Unit of the Department of Radiology, Oulu University Hospital. The material consisted of all 7,452 radiological procedures done in the unit during the first half of 1994, when both methods of costing were in use. Detailed cost data were obtained from the hospital financial and personnel systems and then related to activity data captured in the radiology information system. The allocation of overhead costs was greatly reduced by the introduction of ABC compared to conventional costing. The overhead cost as a percentage of total costs dropped to roughly one-fourth of its previous share, from 57% to 16%. The change in unit costs of radiological procedures varied from -42% to +82%. Costing is much more detailed and precise, and the percentage of unspecified allocated overhead costs diminishes drastically when ABC is used. The new information enhances effective departmental management, as the whole process of radiological procedures is identifiable by single activities, amenable to corrective actions and process improvement.
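
    The contrast the study draws can be reduced to a few lines: conventional costing marks up direct cost by a blanket overhead rate, while ABC traces overhead to each procedure through the activities it actually consumes. A toy sketch — all activity names and figures below are hypothetical, not the Oulu data:

    ```python
    def conventional_cost(direct_cost, overhead_rate):
        # Blanket markup: overhead allocated in proportion to direct cost.
        return direct_cost * (1.0 + overhead_rate)

    def abc_cost(direct_cost, driver_quantities, driver_rates):
        # Overhead traced activity by activity:
        # quantity of each cost driver consumed times the cost per driver unit.
        return direct_cost + sum(qty * driver_rates[activity]
                                 for activity, qty in driver_quantities.items())
    ```

    Two procedures with the same direct cost can then receive very different total costs under ABC, depending on how many scans, reports, or other activities they consume — exactly the source of the -42% to +82% shifts reported above.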

  20. Prioritization of Stockpile Maintenance with Layered Pareto Fronts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, Sarah E.; Anderson-Cook, Christine M.; Lu, Lu

    Difficult choices are required for a decision-making process where resources and budgets are increasingly constrained. This study demonstrates a structured decision-making approach using layered Pareto fronts to identify priorities about how to allocate funds between munitions stockpiles based on their estimated reliability, the urgency of needing available units, and the consequences if adequate numbers of units are not available. This case study, while specific to the characteristics of a group of munitions stockpiles, illustrates the general process of structured decision-making based on first identifying appropriate metrics that summarize the important dimensions of the decision, and then objectively eliminating non-contenders from further consideration. Finally, the final subjective stage incorporates user priorities to select the four stockpiles to receive additional maintenance and surveillance funds based on understanding the trade-offs and robustness to various user priorities.

  1. Prioritization of Stockpile Maintenance with Layered Pareto Fronts

    DOE PAGES

    Burke, Sarah E.; Anderson-Cook, Christine M.; Lu, Lu; ...

    2017-10-11

    Difficult choices are required for a decision-making process where resources and budgets are increasingly constrained. This study demonstrates a structured decision-making approach using layered Pareto fronts to identify priorities about how to allocate funds between munitions stockpiles based on their estimated reliability, the urgency of needing available units, and the consequences if adequate numbers of units are not available. This case study, while specific to the characteristics of a group of munitions stockpiles, illustrates the general process of structured decision-making based on first identifying appropriate metrics that summarize the important dimensions of the decision, and then objectively eliminating non-contenders from further consideration. Finally, the final subjective stage incorporates user priorities to select the four stockpiles to receive additional maintenance and surveillance funds based on understanding the trade-offs and robustness to various user priorities.
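
    Layered Pareto fronts are computed by repeatedly peeling off the current set of non-dominated alternatives. A compact sketch for distinct criterion tuples where every coordinate is "larger is better" (illustrative only; the stockpile metrics themselves are not reproduced here):

    ```python
    def pareto_layers(points, n_layers):
        """Peel successive Pareto fronts from a list of distinct criterion tuples."""
        def dominates(q, p):
            # q dominates p: at least as good in every criterion, strictly better in one.
            return (all(b >= a for a, b in zip(p, q))
                    and any(b > a for a, b in zip(p, q)))

        remaining, layers = list(points), []
        while remaining and len(layers) < n_layers:
            front = [p for p in remaining
                     if not any(dominates(q, p) for q in remaining if q != p)]
            layers.append(front)                             # current non-dominated set
            remaining = [p for p in remaining if p not in front]  # peel it off
        return layers
    ```

    Eliminating everything outside the first few layers is the "objective" stage; choosing among the survivors with user priorities is the subjective stage described above.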

  2. Development Of Simulation Model For Fluid Catalytic Cracking

    NASA Astrophysics Data System (ADS)

    Ghosh, Sobhan

    2010-10-01

    Fluid Catalytic Cracking (FCC) is the most widely used secondary conversion process in the refining industry, for producing gasoline, olefins, and middle distillate from heavier petroleum fractions. There are more than 500 units in the world with a total processing capacity of about 17 to 20% of the crude capacity. FCC catalyst is the most heavily consumed catalyst in the process industry. On one hand, FCC is quite flexible with respect to its ability to process a wide variety of crudes with a flexible product yield pattern; on the other hand, the interdependence of the major operating parameters makes the process extremely complex. An operating unit is self-balancing, and some fluctuations in the independent parameters are automatically adjusted by changing the temperatures and flow rates at different sections. However, a good simulation model is very useful to the refiner to get the best out of the process: selecting the best catalyst, and coping with day-to-day changes in feed quality and in the demand for different products from the FCC unit. In addition, a good model is of great help in designing the process units and peripherals. A simple empirical model is often adequate to monitor day-to-day operations, but it is of no use in handling other problems such as catalyst selection or design/modification of the plant. For this, a rigorous kinetics-based model is required. Considering the complexity of the process, with a large number of chemical species undergoing many parallel and consecutive reactions, it is virtually impossible to develop a simulation model based on the full set of kinetic parameters. The most common approach is to settle for a semi-empirical model. We shall take up the key issues in developing an FCC model and the contribution of such models to the optimum operation of the plant.
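
    The semi-empirical route mentioned above is typified by lumped kinetic schemes such as Weekman's three-lump model (gas oil → gasoline → coke + light gas), which can be integrated in a few lines. The rate constants and step size below are arbitrary illustrative values, not fitted FCC parameters:

    ```python
    def three_lump_fcc(k1, k2, k3, dt, steps):
        """Weekman-style three-lump FCC kinetics, explicit Euler integration.

        a: gas oil (cracks second order), g: gasoline, c: coke + light gas.
        k1: gas oil -> gasoline, k3: gas oil -> coke+gas, k2: gasoline -> coke+gas.
        """
        a, g, c = 1.0, 0.0, 0.0            # pure gas-oil feed (mass fractions)
        for _ in range(steps):
            ra = -(k1 + k3) * a * a        # gas oil consumed by both cracking paths
            rg = k1 * a * a - k2 * g       # gasoline formed, then overcracked
            rc = k3 * a * a + k2 * g       # coke + gas from both routes
            a, g, c = a + ra * dt, g + rg * dt, c + rc * dt
        return a, g, c
    ```

    The rates sum to zero by construction, so total mass is conserved at every step — a quick sanity check for any lumped model before fitting it to plant data.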

  3. The importance of organizational climate and implementation strategy at the introduction of a new working tool in primary health care.

    PubMed

    Carlfjord, S; Andersson, A; Nilsen, P; Bendtsen, P; Lindberg, M

    2010-12-01

    The transmission of research findings into routine care is a slow and unpredictable process. Important factors predicting receptivity for innovations within organizations have been identified, but there is a need for further research in this area. The aim of this study was to describe contextual factors and evaluate if organizational climate and implementation strategy influenced outcome, when a computer-based concept for lifestyle intervention was introduced in primary health care (PHC). The study was conducted using a prospective intervention design. The computer-based concept was implemented at six PHC units. Contextual factors in terms of size, leadership, organizational climate and political environment at the units included in the study were assessed before implementation. Organizational climate was measured using the Creative Climate Questionnaire (CCQ). Two different implementation strategies were used: one explicit strategy, based on Rogers' theories about the innovation-decision process, and one implicit strategy. After 6 months, implementation outcome in terms of the proportion of patients who had been referred to the test, was measured. The CCQ questionnaire response rates among staff ranged from 67% to 91% at the six units. Organizational climate differed substantially between the units. Managers scored higher on CCQ than staff at the same unit. A combination of high CCQ scores and explicit implementation strategy was associated with a positive implementation outcome. Organizational climate varies substantially between different PHC units. High CCQ scores in combination with an explicit implementation strategy predict a positive implementation outcome when a new working tool is introduced in PHC. © 2010 Blackwell Publishing Ltd.

  4. A binary-decision-diagram-based two-bit arithmetic logic unit on a GaAs-based regular nanowire network with hexagonal topology.

    PubMed

    Zhao, Hong-Quan; Kasai, Seiya; Shiratori, Yuta; Hashizume, Tamotsu

    2009-06-17

    A two-bit arithmetic logic unit (ALU) was successfully fabricated on a GaAs-based regular nanowire network with hexagonal topology. This fundamental building block of central processing units can be implemented on a regular nanowire network structure with simple circuit architecture based on graphical representation of logic functions using a binary decision diagram and topology control of the graph. The four-instruction ALU was designed by integrating subgraphs representing each instruction, and the circuitry was implemented by transferring the logical graph structure to a GaAs-based nanowire network formed by electron beam lithography and wet chemical etching. A path switching function was implemented in nodes by Schottky wrap gate control of nanowires. The fabricated circuit integrating 32 node devices exhibits the correct output waveforms at room temperature allowing for threshold voltage variation.
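
    The path-switching principle behind the circuit can be mimicked in software: evaluating a BDD means walking a single root-to-terminal path, choosing the high or low branch at each node just as a Schottky wrap gate selects one nanowire branch. A minimal sketch (the node layout below is a hypothetical two-input AND, not the fabricated ALU's graph):

    ```python
    def eval_bdd(nodes, root, assignment):
        """Evaluate a binary decision diagram by path traversal.

        nodes      : node id -> (variable name, low child id, high child id)
        terminals  : the string ids '0' and '1'
        assignment : variable name -> truth value
        """
        node = root
        while node not in ('0', '1'):
            var, low, high = nodes[node]
            node = high if assignment[var] else low   # follow exactly one branch
        return node == '1'

    # Two-input AND as a BDD: the x-node routes to terminal 0 or to the y-node.
    AND_BDD = {'n_x': ('x', '0', 'n_y'), 'n_y': ('y', '0', '1')}
    ```

    Because only one path is active per evaluation, the hardware analogue needs just one "messenger" signal per query, which is what makes the hexagonal nanowire implementation so compact.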

  5. Mi-STAR: Designing Integrated Science Curriculum to Address the Next Generation Science Standards and Their Foundations

    NASA Astrophysics Data System (ADS)

    Gochis, E. E.; Huntoon, J. E.

    2015-12-01

    Mi-STAR (Michigan Science Teaching and Assessment Reform, http://mi-star.mtu.edu/) was funded by the Herbert H. and Grace A. Dow Foundation to reform K-12 science education to present science as an integrated body of knowledge that is applied to address societal issues. To achieve this goal, Mi-STAR is developing an integrated science curriculum for the middle grades that will be aligned with the Next Generation Science Standards (NGSS). Similar to the geosciences, the curriculum requires the integration of science, engineering and math content to explore 21st-century issues and demonstrates how these concepts can be used in service of society. The curriculum is based on the Mi-STAR Unit Specification Chart, which pairs interdisciplinary themes with bundled NGSS Performance Expectations. Each unit is developed by a collaborative team of K-12 teachers, university STEM content experts and science education experts. Prior to developing a unit, each member of the team attends the on-line Mi-STAR Academy, completing 18+ hours of professional development (PD). This on-line PD program familiarizes teachers and experts with necessary pedagogical and content background knowledge, including NGSS and three-dimensional learning. With this background, teams use a staged, backwards design process to craft a multi-week unit based on a series of performance-based tasks, or 'challenges', that engage students in actively doing science and engineering. Each unit includes Disciplinary Core Ideas from multiple disciplines, which focus on local and familiar examples that demonstrate the relevance of science in students' lives. Performance-based assessments are interwoven throughout the unit. Mi-STAR units will go through extensive pilot testing in several school districts across the state of Michigan. 
Additionally, the Mi-STAR program will develop teacher professional development programs to support implementation of the curriculum and design a pre-service teacher program in integrated science. We will share preliminary results on the collaborative Mi-STAR process of designing integrated science curriculum to address NGSS.

  6. Efficient Acceleration of the Pair-HMMs Forward Algorithm for GATK HaplotypeCaller on Graphics Processing Units.

    PubMed

    Ren, Shanshan; Bertels, Koen; Al-Ars, Zaid

    2018-01-01

    GATK HaplotypeCaller (HC) is a popular variant caller, which is widely used to identify variants in complex genomes. However, due to its high variant-detection accuracy, it suffers from long execution times. In GATK HC, the pair-HMMs forward algorithm accounts for a large percentage of the total execution time. This article proposes to accelerate the pair-HMMs forward algorithm on graphics processing units (GPUs) to improve the performance of GATK HC. This article presents several GPU-based implementations of the pair-HMMs forward algorithm. It also analyzes the performance bottlenecks of the implementations on an NVIDIA Tesla K40 card with various data sets. Based on these results and the characteristics of GATK HC, we are able to identify the GPU-based implementations with the highest performance for the various analyzed data sets. Experimental results show that the GPU-based implementations of the pair-HMMs forward algorithm achieve a speedup of up to 5.47× over existing GPU-based implementations.
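
    For orientation, the pair-HMM forward recurrence that dominates HC's runtime can be written in a few lines of NumPy. This is a simplified textbook version — uniform emission and transition probabilities, free start anywhere along the haplotype — not GATK's per-base quality-score-aware kernel:

    ```python
    import numpy as np

    def pair_hmm_forward(read, hap, p_match=0.99, p_gap_open=0.01, p_gap_ext=0.1):
        """Forward likelihood P(read | haplotype) under a simplified pair-HMM."""
        n, m = len(read), len(hap)
        M = np.zeros((n + 1, m + 1))   # match/mismatch state
        I = np.zeros((n + 1, m + 1))   # insertion in the read
        D = np.zeros((n + 1, m + 1))   # deletion from the haplotype
        D[0, :] = 1.0 / m              # free start anywhere along the haplotype
        t_mm = 1.0 - 2.0 * p_gap_open  # stay in match
        t_cont, t_close = p_gap_ext, 1.0 - p_gap_ext
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                emit = p_match if read[i-1] == hap[j-1] else (1.0 - p_match) / 3.0
                M[i, j] = emit * (t_mm * M[i-1, j-1]
                                  + t_close * (I[i-1, j-1] + D[i-1, j-1]))
                I[i, j] = p_gap_open * M[i-1, j] + t_cont * I[i-1, j]
                D[i, j] = p_gap_open * M[i, j-1] + t_cont * D[i, j-1]
        return float((M[n, :] + I[n, :]).sum())
    ```

    Each cell depends only on its left, upper, and upper-left neighbours, so anti-diagonals of the matrix can be computed in parallel — the property the GPU implementations above exploit.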

  7. Pilot-Scale Silicone Process for Low-Cost Carbon Dioxide Capture Preliminary Techno-Economic Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Surinder; Spiry, Irina; Wood, Benjamin

    This report presents system and economic analysis for a carbon-capture unit which uses an aminosilicone-based solvent for CO₂ capture in a pulverized coal (PC) boiler. The aminosilicone solvent is a 60/40 wt/wt mixture of 3-aminopropyl end-capped polydimethylsiloxane (GAP-1m) with tri-ethylene glycol (TEG) as a co-solvent. For comparison purposes, the report also shows results for a carbon-capture unit based on a conventional approach using mono-ethanol amine (MEA). The first-year removal cost of CO₂ for the aminosilicone-based carbon-capture process is $46.04/ton of CO₂ as compared to $60.25/ton of CO₂ when MEA is used. The aminosilicone-based process has <77% of the CAPEX of a system using MEA solvent. The lower CAPEX is due to several factors, including the higher working capacity of the aminosilicone solvent compared to MEA, which reduces the required solvent flow rate, reducing equipment sizes. If it is determined that carbon steel can be used in the rich-lean heat exchanger in the carbon capture unit, the first-year removal cost of CO₂ decreases to $44.12/ton. The aminosilicone-based solvent has a higher thermal stability than MEA, allowing desorption to be conducted at higher temperatures and pressures, decreasing the number of compressor stages needed. The aminosilicone-based solvent also has a lower vapor pressure, allowing the desorption to be conducted in a continuous-stirred tank reactor versus a more expensive packed column. The aminosilicone-based solvent has a lower heat capacity, which decreases the heat load on the desorber. In summary, the aminosilicone solvent has significant advantages over conventional systems using MEA.

  8. Pilot-Scale Silicone Process for Low-Cost Carbon Dioxide Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Surinder; Spiry, Irina; Wood, Benjamin

    This report presents system and economic analysis for a carbon-capture unit which uses an aminosilicone-based solvent for CO₂ capture in a pulverized coal (PC) boiler. The aminosilicone solvent is a 60/40 wt/wt mixture of 3-aminopropyl end-capped polydimethylsiloxane (GAP-1m) with tri-ethylene glycol (TEG) as a co-solvent. For comparison purposes, the report also shows results for a carbon-capture unit based on a conventional approach using mono-ethanol amine (MEA). The first-year removal cost of CO₂ for the aminosilicone-based carbon-capture process is $46.04/ton of CO₂ as compared to $60.25/ton of CO₂ when MEA is used. The aminosilicone-based process has <77% of the CAPEX of a system using MEA solvent. The lower CAPEX is due to several factors, including the higher working capacity of the aminosilicone solvent compared to MEA, which reduces the required solvent flow rate, reducing equipment sizes. If it is determined that carbon steel can be used in the rich-lean heat exchanger in the carbon capture unit, the first-year removal cost of CO₂ decreases to $44.12/ton. The aminosilicone-based solvent has a higher thermal stability than MEA, allowing desorption to be conducted at higher temperatures and pressures, decreasing the number of compressor stages needed. The aminosilicone-based solvent also has a lower vapor pressure, allowing the desorption to be conducted in a continuous-stirred tank reactor versus a more expensive packed column. The aminosilicone-based solvent has a lower heat capacity, which decreases the heat load on the desorber. In summary, the aminosilicone solvent has significant advantages over conventional systems using MEA.

  9. FASEA: A FPGA Acquisition System and Software Event Analysis for liquid scintillation counting

    NASA Astrophysics Data System (ADS)

    Steele, T.; Mo, L.; Bignell, L.; Smith, M.; Alexiev, D.

    2009-10-01

    The FASEA (FPGA-based Acquisition and Software Event Analysis) system has been developed to replace the MAC3 for coincidence pulse processing. The system uses a National Instruments Virtex-5 FPGA card (PXI-7842R) for data acquisition and purpose-developed data analysis software. Initial comparisons to the MAC3 unit, based on measurements of 89Sr and 3H, confirm that the system is able to accurately emulate the behaviour of the MAC3 unit.

  10. Improved preconditioned conjugate gradient algorithm and application in 3D inversion of gravity-gradiometry data

    NASA Astrophysics Data System (ADS)

    Wang, Tai-Han; Huang, Da-Nian; Ma, Guo-Qing; Meng, Zhao-Hai; Li, Ye

    2017-06-01

    With the continuous development of full tensor gradiometer (FTG) measurement techniques, three-dimensional (3D) inversion of FTG data is becoming increasingly used in oil and gas exploration. In the fast processing and interpretation of large-scale high-precision data, the use of the graphics processing unit (GPU) and of preconditioning methods is very important in the data inversion. In this paper, an improved preconditioned conjugate gradient algorithm is proposed by combining the symmetric successive over-relaxation (SSOR) technique and the incomplete Cholesky decomposition conjugate gradient algorithm (ICCG). Since preparing the preconditioner requires extra time, a parallel implementation based on the GPU is proposed. The improved method is then applied in the inversion of noise-contaminated synthetic data to prove its adaptability in the inversion of 3D FTG data. Results show that the parallel SSOR-ICCG algorithm based on an NVIDIA Tesla C2050 GPU achieves a speedup of approximately 25 times that of a serial program using a 2.0 GHz central processing unit (CPU). Real airborne gravity-gradiometry data from the Vinton salt dome (southwest Louisiana, USA) are also considered. Good results are obtained, which verifies the efficiency and feasibility of the proposed parallel method in fast inversion of 3D FTG data.
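
    The preconditioned CG skeleton can be condensed as follows: each iteration applies the SSOR preconditioner inverse, M⁻¹ = ω(2−ω)(D+ωU)⁻¹ D (D+ωL)⁻¹, in place of the incomplete-Cholesky factors. This is a small dense NumPy sketch for illustration only; the paper's GPU ICCG kernels are not reproduced:

    ```python
    import numpy as np

    def ssor_apply(A, r, omega=1.0):
        """Apply M^{-1} r for the SSOR preconditioner of symmetric A."""
        D = np.diag(np.diag(A))
        L = np.tril(A, -1)
        U = np.triu(A, 1)
        y = np.linalg.solve(D + omega * L, r)                    # forward sweep
        return omega * (2.0 - omega) * np.linalg.solve(D + omega * U, D @ y)  # backward

    def pcg(A, b, omega=1.0, tol=1e-10, max_iter=200):
        """Preconditioned conjugate gradients for symmetric positive definite A."""
        x = np.zeros_like(b)
        r = b - A @ x
        z = ssor_apply(A, r, omega)
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x = x + alpha * p
            r = r - alpha * Ap
            if np.linalg.norm(r) < tol:
                break
            z = ssor_apply(A, r, omega)
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x
    ```

    In the paper's setting the expensive step is preparing and applying the preconditioner, which is why it is offloaded to the GPU; the triangular sweeps above stand in for that work.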

  11. Improving Initial Assessment in Work-Based Learning.

    ERIC Educational Resources Information Center

    Green, Muriel

    This document, which is designed to assist managers, trainers, or assessors in work-based provision across the United Kingdom, shares the experiences of five work-based learning providers that sought to improve their initial assessment processes. Section 1 explains the purpose of initial assessment and presents guidelines for evaluating intake…

  12. Research plan for lands administered by the U.S. Department of the Interior in the Interior Columbia Basin and Snake River Plateau

    USGS Publications Warehouse

    Beever, Erik A.; Pyke, David A.

    2002-01-01

    The research strategy focuses on disturbance processes and events that have been the primary drivers of change, to provide a predictive model for future changes. These drivers include fire, nonnative plants, herbivory, roads and associated human influences, and climate change. Whereas management in the western United States has striven to move from an inefficient species-based approach to a habitat-based approach, the plan focuses on ecosystem function and ecological processes as critical measures of habitat response. Because of the large amount and contiguity of public lands in the western United States, the region presents both a compelling opportunity to implement landscape-level science and a challenge to underst

  13. Assessing LiDAR elevation data for KDOT applications.

    DOT National Transportation Integrated Search

    2013-02-01

    LiDAR-based elevation surveys are a cost-effective means for mapping topography over large areas. LiDAR surveys use an airplane-mounted or ground-based laser radar unit to scan terrain. Post-processing techniques are applied to remove vegetation ...

  14. Developing criterion-based competencies for tele-intensive care unit.

    PubMed

    Schleifer, Sarah Joy; Carroll, Karen; Moseley, Marthe J

    2014-01-01

    Over the last 5 years, telemedicine has developed nursing roles that differ from traditional bedside care. In the midst of this transition, current competency development models focused on task completion may not be the most effective form of proficiency validation. The procedure of competency creation for the role of tele-intensive care unit registered nurse requires a thoughtful process using stakeholders from institutional leadership to frontline staff. The process must include stakeholder approval to ensure appropriate buy-in and follow-through on the agreed-upon criteria. This can be achieved using a standardized method of concept stimulation related to the behaviors, not a memorized list of tasks, expected of a telemedicine registered nurse. This process serves as the foundation for the development of criterion-based competency statements that then allows for clearer expectations. Continually reviewing the written competencies, ensuring current applicability, and revising as needed are necessities for maintaining competence and, therefore, patient safety.

  15. Women in the Progressive Era. A Unit of Study for Grades 9-12.

    ERIC Educational Resources Information Center

    Jason, Alli; Strickland, Louise; McMillen, Margaret

    A specific dramatic episode in history that allows students to delve into the deeper meanings of selected landmark events and explore a wider context of historical narrative is represented within this supplementary teaching unit. This approach helps students see history as an ongoing, open-ended process that is based upon decisions made in…

  16. Constructing Knowledge about the Trigonometric Functions and Their Geometric Meaning on the Unit Circle

    ERIC Educational Resources Information Center

    Altman, Renana; Kidron, Ivy

    2016-01-01

    Processes of knowledge construction are investigated. A learner is constructing knowledge about the trigonometric functions and their geometric meaning on the unit circle. The analysis is based on the dynamically nested epistemic action model for abstraction in context. Different tasks are offered to the learner. In his effort to perform the…

  17. Experiments with a Knowledge-Based System on a Multiprocessor

    DTIC Science & Technology

    1987-10-01

    variables on the node. Functions have access to those instance variables. Gajski et al. [Gajski 82] summarize the principles underlying pure data flow... Consider the following numerical example from Gajski et al. [Gajski 82]. The pseudo-code representation of the problem is as follows: ... Gajski et al. They assume that division takes three processing units, multiplication takes two units, and addition takes one unit. As noted in their paper

  18. COLA: Optimizing Stream Processing Applications via Graph Partitioning

    NASA Astrophysics Data System (ADS)

    Khandekar, Rohit; Hildrum, Kirsten; Parekh, Sujay; Rajan, Deepak; Wolf, Joel; Wu, Kun-Lung; Andrade, Henrique; Gedik, Buğra

    In this paper, we describe an optimization scheme for fusing compile-time operators into reasonably-sized run-time software units called processing elements (PEs). Such PEs are the basic deployable units in System S, a highly scalable distributed stream processing middleware system. Finding a high quality fusion significantly benefits the performance of streaming jobs. In order to maximize throughput, our solution approach attempts to minimize the processing cost associated with inter-PE stream traffic while simultaneously balancing load across the processing hosts. Our algorithm computes a hierarchical partitioning of the operator graph based on a minimum-ratio cut subroutine. We also incorporate several fusion constraints in order to support real-world System S jobs. We experimentally compare our algorithm with several other reasonable alternative schemes, highlighting the effectiveness of our approach.
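
    The fusion objective described above (minimize inter-PE stream traffic while balancing load across hosts) can be illustrated with a toy greedy partitioner. This is only a sketch with invented operator costs and traffic weights, not COLA's hierarchical minimum-ratio-cut algorithm:

    ```python
    # Toy operator fusion: assign operators to PEs, preferring the PE that
    # already hosts the operator's heaviest-traffic neighbors, subject to a
    # load cap. Illustrative only; COLA uses a minimum-ratio-cut subroutine.

    def fuse_operators(costs, traffic, n_pes=2):
        cap = sum(costs.values()) / n_pes * 1.2   # 20% imbalance tolerance
        pes = [set() for _ in range(n_pes)]
        loads = [0.0] * n_pes
        # Visit operators in decreasing processing cost.
        for op in sorted(costs, key=costs.get, reverse=True):
            best, best_gain = None, -1.0
            for i, pe in enumerate(pes):
                if loads[i] + costs[op] > cap:
                    continue
                # Traffic saved by co-locating op with neighbors on PE i.
                gain = sum(w for (u, v), w in traffic.items()
                           if (u == op and v in pe) or (v == op and u in pe))
                if gain > best_gain:
                    best, best_gain = i, gain
            if best is None:                      # no PE fits; least-loaded wins
                best = loads.index(min(loads))
            pes[best].add(op)
            loads[best] += costs[op]
        return pes

    costs = {"src": 1, "parse": 3, "join": 4, "agg": 2, "sink": 1}
    traffic = {("src", "parse"): 10, ("parse", "join"): 8,
               ("join", "agg"): 6, ("agg", "sink"): 2}
    print(fuse_operators(costs, traffic))
    ```

    With these weights the greedy pass keeps the heavy src-parse and join-agg edges inside a PE, cutting only the parse-join and agg-sink streams.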

  19. A new intuitionistic fuzzy rule-based decision-making system for an operating system process scheduler.

    PubMed

    Butt, Muhammad Arif; Akram, Muhammad

    2016-01-01

    We present a new intuitionistic fuzzy rule-based decision-making system based on intuitionistic fuzzy sets for a process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm inputs the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of every process in the ready queue. Once the dp of every process is calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
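
    The dp computation and queue ordering can be sketched as follows. The membership functions and the single averaging rule are invented for illustration; the paper's intuitionistic fuzzy sets additionally carry non-membership and hesitation degrees, omitted here:

    ```python
    # Toy fuzzy dynamic priority: degree to which nice is "low" (range
    # -20..19) averaged with the degree to which the burst is "short"
    # (range 1..100 time units). Illustrative membership functions only.

    def dynamic_priority(nice, burst):
        nice_low = max(0.0, min(1.0, (19 - nice) / 39))
        burst_short = max(0.0, min(1.0, (100 - burst) / 99))
        return (nice_low + burst_short) / 2

    # Ready queue entries: (name, nice, burst time).
    ready_queue = [("p1", 0, 40), ("p2", -10, 80), ("p3", 5, 10)]
    by_dp = sorted(ready_queue,
                   key=lambda p: dynamic_priority(p[1], p[2]), reverse=True)
    print([name for name, _, _ in by_dp])   # → ['p3', 'p1', 'p2']
    ```

    The short-burst process p3 wins despite its mediocre nice value, which is the shortest-job-favoring behavior such schedulers aim for.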

  20. Learning Computers, Speaking English: Cooperative Activities for Learning English and Basic Word Processing.

    ERIC Educational Resources Information Center

    Quann, Steve; Satin, Diana

    This textbook leads high-beginning and intermediate English-as-a-Second-Language (ESL) students through cooperative computer-based activities that combine language learning with training in basic computer skills and word processing. Each unit concentrates on a basic concept of word processing while also focusing on a grammar topic. Skills are…

  1. People detection method using graphics processing units for a mobile robot with an omnidirectional camera

    NASA Astrophysics Data System (ADS)

    Kang, Sungil; Roh, Annah; Nam, Bodam; Hong, Hyunki

    2011-12-01

    This paper presents a novel vision system for people detection using an omnidirectional camera mounted on a mobile robot. In order to determine regions of interest (ROI), we compute a dense optical flow map using graphics processing units, which enable us to examine compliance with the ego-motion of the robot in a dynamic environment. Shape-based classification algorithms are employed to sort ROIs into human beings and nonhumans. The experimental results show that the proposed system detects people more precisely than previous methods.
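
    The ego-motion compliance test can be illustrated with a small synthetic flow field. The arrays and threshold below are invented; a real system computes dense optical flow on the GPU and predicts the ego-motion flow from the robot's odometry:

    ```python
    # Pixels whose measured flow deviates from the flow predicted by the
    # robot's own motion are candidate regions of interest (ROI).
    import numpy as np

    predicted = np.ones((4, 4, 2)) * 0.5      # flow induced by ego-motion
    measured = predicted.copy()
    measured[1:3, 1:3] += 2.0                 # an independently moving object

    deviation = np.linalg.norm(measured - predicted, axis=2)
    roi_mask = deviation > 1.0                # threshold on flow residual
    print(roi_mask.astype(int))               # the moving 2x2 block lights up
    ```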

  2. A Novel Reconfigurable Logic Unit Based on the DNA-Templated Potassium-Concentration-Dependent Supramolecular Assembly.

    PubMed

    Yang, Chunrong; Zou, Dan; Chen, Jianchi; Zhang, Linyan; Miao, Jiarong; Huang, Dan; Du, Yuanyuan; Yang, Shu; Yang, Qianfan; Tang, Yalin

    2018-03-15

    Plenty of molecular circuits with specific functions have been developed; however, logic units with reconfigurability, which could simplify circuits and speed up information processing, are rarely reported. In this work, we designed a novel reconfigurable logic unit based on a DNA-templated, potassium-concentration-dependent supramolecular assembly, which responds to the input stimuli of H+ and K+. By inputting different concentrations of K+, the logic unit could implement three significant functions: a half adder, a half subtractor, and a 2-to-4 decoder. Considering its reconfigurability and good performance, the prototype developed here may serve as a promising proof of principle for molecular computers. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
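
    For reference, the three Boolean functions the unit is reported to implement can be written out in plain code (a and b stand for the two input stimuli); this is an abstraction of the logic only, not the DNA chemistry:

    ```python
    # Standard definitions of the three logic functions.

    def half_adder(a, b):
        return a ^ b, a & b                     # (sum, carry)

    def half_subtractor(a, b):
        return a ^ b, int(not a) & b            # (difference, borrow for a - b)

    def decoder_2to4(a, b):
        return [int(not a and not b), int(not a and b),
                int(a and not b), int(a and b)]

    print(half_adder(1, 1))        # → (0, 1)
    print(half_subtractor(0, 1))   # → (1, 1)
    print(decoder_2to4(1, 0))      # → [0, 0, 1, 0]
    ```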

  3. Nursing unit leaders' influence on the long-term sustainability of evidence-based practice improvements.

    PubMed

    Fleiszer, Andrea R; Semenic, Sonia E; Ritchie, Judith A; Richer, Marie-Claire; Denis, Jean-Louis

    2016-04-01

    To describe how actions of nursing unit leaders influenced the long-term sustainability of a best practice guidelines (BPG) program on inpatient units. Several factors influence the initial implementation of evidence-based practice improvements in nursing, with leadership recognized as essential. However, there is limited knowledge about enduring change, including how frontline nursing leaders influence the sustainability of practice improvements over the long term. A qualitative descriptive case study included 39 in-depth interviews, observations, and document reviews. Four embedded nursing unit subcases had differing levels of program sustainability at 7 years (average) following implementation. Higher levels of BPG sustainability occurred on units where formal leadership teams used an integrated set of strategies and activities. Two key strategies were maintaining priorities and reinforcing expectations. The coordinated use of six activities (e.g., discussing, evaluating, integrating) promoted the continuation of BPG practices among staff. These leadership processes, fostering exchange and learning, contributed to sustainability-promoting environments characterized by teamwork and accountability. Unit leaders are required to strategically orchestrate several overlapping and synergistic efforts to achieve long-term sustainability of BPG-based practice improvements. As part of managing overall unit performance, unit leaders may influence practice improvement sustainability by aligning vision, strategies, and activities. © 2015 John Wiley & Sons Ltd.

  4. Personal customizing exercise with a wearable measurement and control unit.

    PubMed

    Wang, Zhihui; Kiryu, Tohru; Tamura, Naoki

    2005-06-28

    Recently, wearable technology has been used in various health-related fields to develop advanced monitoring solutions. However, the monitoring function alone cannot meet all the requirements of customizing machine-based exercise on an individual basis by relying on biosignal-based controls. We propose a new wearable unit design equipped with measurement and control functions to support the customization process. The wearable unit can measure the heart rate and electromyogram signals during exercise performance and output workload control commands to the exercise machines. The workload is continuously tracked with exercise programs set according to personally customized workload patterns and estimation results from the measured biosignals by a fuzzy control method. Exercise programs are adapted by relying on a computer workstation, which communicates with the wearable unit via wireless connections. A prototype of the wearable unit was tested together with an Internet-based cycle ergometer system to demonstrate that it is possible to customize exercise on an individual basis. We tested the wearable unit in nine people to assess its suitability to control cycle ergometer exercise. The results confirmed that the unit could successfully control the ergometer workload and continuously support gradual changes in physical activities. The design of wearable units equipped with measurement and control functions is an important step towards establishing a convenient and continuously supported wellness environment.

  5. Analysis of the policymaking process in Burkina Faso's health sector: case studies of the creation of two health system support units.

    PubMed

    Zida, Andre; Lavis, John N; Sewankambo, Nelson K; Kouyate, Bocar; Moat, Kaelan; Shearer, Jessica

    2017-02-13

    Burkina Faso has made a number of health system policy decisions to improve performance on health indicators and strengthen responsiveness to health-related challenges. These included the creation of a General Directorate of Health Information and Statistics (DGISS) and a technical unit to coordinate performance-based financing (CT-FBR). We analysed the policymaking processes associated with the establishment of these units, and documented the factors that influenced this process. We used a multiple-case study design based on Kingdon's agenda-setting model to investigate the DGISS and CT-FBR policymaking processes. Data were collected from interviews with key informants (n = 28), published literature, policy documents (including two strategic and 230 action plans), and 55 legal/regulatory texts. Interviews were analysed using thematic qualitative analysis. Data from the documentary analysis were triangulated with the qualitative interview data. Key factors influencing the policymaking processes associated with the two units involved the 'problem' (problem identification), 'policy' (formation of policy proposals), and 'politics' (political climate/change) streams, which came together in a way that resulted in proposals being placed on the decision agenda. A number of problems with Burkina Faso's health information and financing systems were identified. Policy proposals for the DGISS and CT-FBR units were developed in response to these problems, emerging from several sources including development partners. Changes in political and public service administrations (specifically the 2008 appointment of a new Minister of Health and the establishment of a new budget allocation system), with corresponding changes in the actors and interests involved, appeared key in elevating the proposals to the decision agenda. 
Efforts to improve performance on health indicators and strengthen responsiveness to health-related challenges need to focus on bringing together a compelling problem, a viable policy, and conducive politics in order for a proposal to make it onto the decision agenda.

  6. Modeling the Hydrologic Processes of a Permeable Pavement ...

    EPA Pesticide Factsheets

    A permeable pavement system can capture stormwater to reduce runoff volume and flow rate, improve onsite groundwater recharge, and enhance pollutant controls within the site. A new unit process model for evaluating the hydrologic performance of a permeable pavement system has been developed in this study. The developed model can continuously simulate infiltration through the permeable pavement surface, exfiltration from the storage to the surrounding in situ soils, and clogging impacts on infiltration/exfiltration capacity at the pavement surface and the bottom of the subsurface storage unit. The exfiltration modeling component simulates vertical and horizontal exfiltration independently based on Darcy’s formula with the Green-Ampt approximation. The developed model is parameterized with physically based modeling parameters, such as hydraulic conductivity, Manning’s friction flow parameters, saturated and field capacity volumetric water contents, porosity, and density. The developed model was calibrated using high-frequency observed data. The modeled water depths match the observed values well (R2 = 0.90). The modeling results show that horizontal exfiltration through the side walls of the subsurface storage unit is a prevailing factor in determining the hydrologic performance of the system, especially where the storage unit is developed in a long, narrow shape, or with a high risk of bottom compaction and clogging. This paper presents unit ...
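
    The Green-Ampt relation underlying such exfiltration calculations can be illustrated with a short numerical sketch. Under continuous ponding, cumulative infiltration F satisfies F - S·ln(1 + F/S) = K·t with S = ψ·Δθ. The soil parameters below are generic textbook values for sandy loam, not the paper's calibrated ones:

    ```python
    # Solve the implicit ponded Green-Ampt equation for cumulative
    # infiltration F (cm) by bisection.
    import math

    def green_ampt_cumulative(K, psi, dtheta, t):
        """K: sat. hydraulic conductivity (cm/hr), psi: wetting-front
        suction head (cm), dtheta: moisture deficit (-), t: hours."""
        S = psi * dtheta
        lo, hi = K * t, K * t + 10.0 * S        # F always exceeds K*t
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if mid - S * math.log(1.0 + mid / S) < K * t:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # Sandy loam (illustrative): K = 1.09 cm/hr, psi = 11.01 cm, dtheta = 0.3.
    F2 = green_ampt_cumulative(1.09, 11.01, 0.3, 2.0)
    print(round(F2, 2), "cm after 2 hours")
    ```

    The infiltration capacity K·(1 + S/F) decays toward K as F grows, which is why clogging (reduced effective K) dominates long-term performance.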

  7. Accelerated rescaling of single Monte Carlo simulation runs with the Graphics Processing Unit (GPU).

    PubMed

    Yang, Owen; Choi, Bernard

    2013-01-01

    To interpret fiber-based and camera-based measurements of remitted light from biological tissues, researchers typically use analytical models, such as the diffusion approximation to light transport theory, or stochastic models, such as Monte Carlo modeling. To achieve rapid (ideally real-time) measurement of tissue optical properties, especially in clinical situations, there is a critical need to accelerate Monte Carlo simulation runs. In this manuscript, we report on our approach using the Graphics Processing Unit (GPU) to accelerate rescaling of single Monte Carlo runs in order to rapidly calculate diffuse reflectance values for different sets of tissue optical properties. We selected MATLAB to enable non-specialists in C and CUDA-based programming to use the generated open-source code. We developed a software package with four abstraction layers. To calculate a set of diffuse reflectance values from a simulated tissue with homogeneous optical properties, our rescaling GPU-based approach achieves a reduction in computation time of several orders of magnitude as compared to other GPU-based approaches. Specifically, our GPU-based approach generated a diffuse reflectance value in 0.08 ms. The transfer time from CPU to GPU memory currently is a limiting factor with GPU-based calculations. However, for calculation of multiple diffuse reflectance values, our GPU-based approach still can lead to processing that is ~3400 times faster than other GPU-based approaches.
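
    The single-run rescaling idea itself can be sketched on the CPU. The pathlengths below are synthetic stand-ins for what one baseline Monte Carlo run would record per detected photon; the paper's GPU implementation parallelizes this reweighting:

    ```python
    # Rescaling a stored Monte Carlo run: reweight each detected photon by
    # Beer-Lambert absorption exp(-mu_a * L) to get reflectance for any mu_a.
    import math, random

    random.seed(0)
    # Stand-in for the baseline run's output: total pathlength (cm) of each
    # detected photon (here drawn from an exponential for illustration).
    pathlengths = [random.expovariate(1.0) for _ in range(100_000)]

    def diffuse_reflectance(mu_a, paths):
        return sum(math.exp(-mu_a * L) for L in paths) / len(paths)

    # One stored run now yields reflectance for many absorption coefficients.
    for mu_a in (0.1, 0.5, 1.0):                 # 1/cm
        print(mu_a, round(diffuse_reflectance(mu_a, pathlengths), 3))
    ```

    Because only the weights change between absorption values, no new photon trajectories need to be simulated, which is where the orders-of-magnitude savings come from.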

  8. [Quality control of laser imagers].

    PubMed

    Winkelbauer, F; Ammann, M; Gerstner, N; Imhof, H

    1992-11-01

    Multiformat imagers based on laser systems are used for documentation in an increasing number of investigations. The specific problems of quality control are explained, and the constancy of film processing is investigated in imager systems of different configurations, with (Machine 1: 3M Laser Imager Plus M952 with connected 3M film processor, 3M IRB film, 3M XPM X-ray chemical mixer, 3M developer and fixer) or without (Machine 2: 3M Laser Imager Plus M952 with separate DuPont Cronex film processor, Kodak IR film, Kodak automixer, Kodak developer and fixer) a connected film processing unit. In our tests based on DIN 6868 and ONORM S 5240, we found film-processing constancy in accordance with DIN and ONORM for the equipment with the directly adapted film processing unit. The constancy checks demanded by DIN 6868 could therefore be performed at longer intervals for this equipment. Systems with conventional darkroom processing show clearly increased fluctuation by comparison, so the demanded daily control is essential to guarantee appropriate reaction and constant documentation quality.

  9. AN OVERVIEW OF THE INTEROPERABILITY ROADMAP FOR COM/.NET-BASED CAPE-OPEN

    EPA Science Inventory

    The CAPE-OPEN standard interfaces have been designed to permit flexibility and modularization of process modeling environments (PMEs) in order to use process modeling components, such as unit operation or thermodynamic property models, across a range of tools employed in the life...

  10. A POLLUTION REDUCTION METHODOLOGY FOR CHEMICAL PROCESS SIMULATORS

    EPA Science Inventory

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has be...

  11. Multivariate statistical process control of a continuous pharmaceutical twin-screw granulation and fluid bed drying process.

    PubMed

    Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A

    2017-08-07

    A multivariate statistical process control (MSPC) strategy was developed for monitoring the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables encompassing three major units, namely a twin-screw high-shear granulator, a fluid bed dryer, and a product control unit, were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions using a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed, and liquid mass flow, and in the powder dosing unit mass flow were utilized to evaluate the model's monitoring performance. The impact of the imposed deviations on process continuity was also evaluated using Hotelling's T2 and Q residuals control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
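
    The two chart statistics used in such PCA-based monitoring can be sketched with synthetic data; the dimensions and the injected fault below are invented for illustration, standing in for the logged process variables:

    ```python
    # Hotelling's T^2 measures variation inside the PCA model plane;
    # the Q residual measures variation off the plane.
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 8))              # normal-operation training data
    X = (X - X.mean(axis=0)) / X.std(axis=0)   # autoscale
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    k = 3                                      # retained principal components
    P = Vt[:k].T                               # loadings, shape (8, k)
    lam = (s[:k] ** 2) / (len(X) - 1)          # variance of each retained PC

    def t2_q(x):
        t = x @ P                              # scores of one observation
        T2 = float(np.sum(t ** 2 / lam))
        resid = x - t @ P.T
        Q = float(resid @ resid)
        return T2, Q

    x = rng.normal(size=8)
    # A disturbance aligned with the first loading stays inside the model
    # plane: it changes T^2 while leaving Q exactly unchanged.
    x_fault = x + 6.0 * P[:, 0]
    print(t2_q(x), t2_q(x_fault))
    ```

    In practice each statistic is compared against a control limit derived from the training data, and contribution plots break a violation down by variable.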

  12. A web-based tree crown condition training and evaluation tool for urban and community forestry

    Treesearch

    Matthew F. Winn; Neil A. Clark; Philip A. Araman; Sang-Mook Lee

    2007-01-01

    Training personnel for natural resource related field work can be a costly and time-consuming process. For that reason, web-based training is considered by many to be a more attractive alternative to on-site training. The U.S. Forest Service Southern Research Station unit, with Virginia Tech cooperators in Blacksburg, Va., is in the process of constructing a web site...

  13. Regional variation of flow duration curves in the eastern United States: Process-based analyses of the interaction between climate and landscape properties

    Treesearch

    Wafa Chouaib; Peter V. Caldwell; Younes Alila

    2018-01-01

    This paper advances the physical understanding of the regional variation of flow duration curves (FDCs). It provides a process-based analysis of the interaction between climate and landscape properties to explain disparities in FDC shapes. We used (i) long-term measured flow and precipitation data over 73 catchments from the eastern US and (ii) calibrated the...

  14. The impact of intensivists' base specialty of training on care process and outcomes of critically ill trauma patients.

    PubMed

    Matsushima, Kazuhide; Goldwasser, Eleanor R; Schaefer, Eric W; Armen, Scott B; Indeck, Matthew C

    2013-09-01

    The care of critically ill trauma patients is provided by intensivists with various base specialties of training. The purpose of this study was to investigate the impact of intensivists' base specialty of training on disparities in care processes and patient outcomes. We performed a retrospective review of an institutional trauma registry at an academic level 1 trauma center. Two intensive care unit teams staffed by either board-certified surgery or anesthesiology intensivists were assigned to manage critically ill trauma patients. Both teams provided care, collaborating with a trauma surgeon in house. We compared patient characteristics, care processes, and outcomes between the surgery and anesthesiology groups using Wilcoxon tests or chi-square tests, as appropriate. We identified a total of 620 patients. Patient baseline characteristics including age, sex, transfer status, injury type, injury severity score, and Glasgow coma scale were similar between groups. We found no significant difference in care processes and outcomes between groups. In a logistic regression model, intensivists' base specialty of training was not a significant factor for mortality (odds ratio, 1.46; 95% confidence interval, 0.79-2.80; P = 0.22) or major complications (odds ratio, 1.11; 95% confidence interval, 0.73-1.67; P = 0.63). Intensive care unit teams collaborating with trauma surgeons had minimal disparity in care processes and similar patient outcomes regardless of intensivists' base specialty of training. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. [Work process and workers' health in a food and nutrition unit: prescribed versus actual work].

    PubMed

    Colares, Luciléia Granhen Tavares; Freitas, Carlos Machado de

    2007-12-01

    This study focuses on the relationship between the work process in a food and nutrition unit and workers' health, in the words of the participants themselves. Direct observation, a semi-structured interview, and focus groups were used to collect the data. The reference was the dialogue between human ergonomics and work psychodynamics. The results showed that work organization in the study unit represents a routine activity, the requirements of which in terms of the work situation are based on criteria set by the institution. Variability in the activities is influenced mainly by the available equipment, instruments, and materials, thereby generating improvisation in meal production that produces both a physical and psychological cost for workers. Dissatisfaction during the performance of tasks results mainly from the supervisory style and relationship to immediate superiors. Workers themselves proposed changes in the work organization, based on greater dialogue and trust between supervisors and the workforce. Finally, the study identifies the need for an intervention that encourages workers' participation as agents of change.

  16. Keyphrase based Evaluation of Automatic Text Summarization

    NASA Astrophysics Data System (ADS)

    Elghannam, Fatma; El-Shishtawy, Tarek

    2015-05-01

    The development of methods to deal with the informative contents of text units in the matching process is a major challenge for automatic summary evaluation systems that use fixed n-gram matching. This limitation causes inaccurate matching between units in peer and reference summaries. The present study introduces a new keyphrase-based summary evaluator, KpEval, for evaluating automatic summaries. KpEval relies on keyphrases, since they convey the most important concepts of a text. In the evaluation process, the keyphrases are used in their lemma form as the matching text unit. The system was applied to evaluate different summaries of the Arabic multi-document data set presented at TAC2011. The results showed that the new evaluation technique correlates well with the known evaluation systems Rouge1, Rouge2, RougeSU4, and AutoSummENG MeMoG. KpEval has the strongest correlation with AutoSummENG MeMoG; the Pearson and Spearman correlation coefficients are 0.8840 and 0.9667, respectively.
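
    The core matching step, scoring a peer summary by the fraction of reference keyphrases it recovers, can be sketched as follows. The keyphrase lists are invented, and a real system would extract and lemmatize them automatically:

    ```python
    # Toy keyphrase-recall score in the spirit of keyphrase-based evaluation:
    # keyphrases (assumed already lemmatized) are the matching text unit.

    def kp_score(peer_kps, ref_kps):
        """Fraction of reference keyphrases recovered by the peer summary."""
        peer, ref = set(peer_kps), set(ref_kps)
        return len(peer & ref) / len(ref)

    reference = ["climate change", "sea level", "emission target", "coral reef"]
    peer_a = ["climate change", "sea level", "tourism"]
    peer_b = ["tourism", "hotel price"]
    print(kp_score(peer_a, reference))   # → 0.5
    print(kp_score(peer_b, reference))   # → 0.0
    ```

    Matching whole concepts rather than fixed n-grams is what lets such a score stay robust to surface rewording between peer and reference summaries.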

  17. Assessing LiDAR elevation data for KDOT applications : [technical summary].

    DOT National Transportation Integrated Search

    2013-02-01

    LiDAR-based elevation surveys are a cost-effective means for mapping topography over large areas. LiDAR surveys use an airplane-mounted or ground-based laser radar unit to scan terrain. Post-processing techniques are applied to remove v...

  18. Understanding the development of minimum unit pricing of alcohol in Scotland: a qualitative study of the policy process.

    PubMed

    Katikireddi, Srinivasa Vittal; Hilton, Shona; Bonell, Chris; Bond, Lyndal

    2014-01-01

    Minimum unit pricing of alcohol is a novel public health policy with the potential to improve population health and reduce health inequalities. Theories of the policy process may help to understand the development of policy innovation and in turn identify lessons for future public health research and practice. This study aims to explain minimum unit pricing's development by taking a 'multiple-lenses' approach to understanding the policy process. In particular, we apply three perspectives of the policy process (Kingdon's multiple streams, Punctuated-Equilibrium Theory, Multi-Level Governance) to understand how and why minimum unit pricing has developed in Scotland and describe implications for efforts to develop evidence-informed policymaking. Semi-structured interviews were conducted with policy actors (politicians, civil servants, academics, advocates, industry representatives) involved in the development of MUP (n = 36). Interviewees were asked about the policy process and the role of evidence in policy development. Data from two other sources (a review of policy documents and an analysis of evidence submission documents to the Scottish Parliament) were used for triangulation. The three perspectives provide complementary understandings of the policy process. Evidence has played an important role in presenting the policy issue of alcohol as a problem requiring action. Scotland-specific data and a change in the policy 'image' to a population-based problem contributed to making alcohol-related harms a priority for action. The limited powers of Scottish Government help explain the type of price intervention pursued while distinct aspects of the Scottish political climate favoured the pursuit of price-based interventions. Evidence has played a crucial but complex role in the development of an innovative policy. 
Utilising different political science theories helps explain different aspects of the policy process, with Multi-Level Governance particularly useful for highlighting important lessons for the future of public health policy.

  19. Understanding the Development of Minimum Unit Pricing of Alcohol in Scotland: A Qualitative Study of the Policy Process

    PubMed Central

    Katikireddi, Srinivasa Vittal; Hilton, Shona; Bonell, Chris; Bond, Lyndal

    2014-01-01

    Background Minimum unit pricing of alcohol is a novel public health policy with the potential to improve population health and reduce health inequalities. Theories of the policy process may help to understand the development of policy innovation and in turn identify lessons for future public health research and practice. This study aims to explain minimum unit pricing’s development by taking a ‘multiple-lenses’ approach to understanding the policy process. In particular, we apply three perspectives of the policy process (Kingdon’s multiple streams, Punctuated-Equilibrium Theory, Multi-Level Governance) to understand how and why minimum unit pricing has developed in Scotland and describe implications for efforts to develop evidence-informed policymaking. Methods Semi-structured interviews were conducted with policy actors (politicians, civil servants, academics, advocates, industry representatives) involved in the development of MUP (n = 36). Interviewees were asked about the policy process and the role of evidence in policy development. Data from two other sources (a review of policy documents and an analysis of evidence submission documents to the Scottish Parliament) were used for triangulation. Findings The three perspectives provide complementary understandings of the policy process. Evidence has played an important role in presenting the policy issue of alcohol as a problem requiring action. Scotland-specific data and a change in the policy ‘image’ to a population-based problem contributed to making alcohol-related harms a priority for action. The limited powers of Scottish Government help explain the type of price intervention pursued while distinct aspects of the Scottish political climate favoured the pursuit of price-based interventions. Conclusions Evidence has played a crucial but complex role in the development of an innovative policy. 
Utilising different political science theories helps explain different aspects of the policy process, with Multi-Level Governance particularly useful for highlighting important lessons for the future of public health policy. PMID:24670519

  20. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Badal, Andreu; Badano, Aldo

    Purpose: It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: the use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). Methods: A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). Results: An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speedup factor was obtained using a GPU compared to a single core CPU. Conclusions: The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
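
    The per-photon kernel that such GPU codes run in parallel, sample a free path, move, interact, can be sketched on the CPU. This toy version uses a purely absorbing homogeneous slab rather than PENELOPE's physics models and voxelized geometry:

    ```python
    # Minimal photon-transport Monte Carlo: count photons that traverse a
    # purely absorbing slab. GPU codes launch one thread per photon history.
    import math, random

    random.seed(2)
    mu_t = 2.0          # total attenuation coefficient (1/cm)
    depth = 1.0         # slab thickness (cm)
    N = 200_000         # photon histories

    transmitted = 0
    for _ in range(N):
        # Sample a free-path length from the exponential distribution.
        s = -math.log(1.0 - random.random()) / mu_t
        if s > depth:   # the photon crosses the slab without interacting
            transmitted += 1

    print(round(transmitted / N, 3))   # ≈ exp(-mu_t * depth) ≈ 0.135
    ```

    Because each history is independent, the loop parallelizes trivially, which is why photon transport maps so well onto GPUs.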

  1. The Effects of Science Models on Students' Understanding of Scientific Processes

    NASA Astrophysics Data System (ADS)

    Berglin, Riki Susan

    This action research study investigated how the use of science models affected fifth-grade students' ability to transfer their science curriculum to a deeper understanding of scientific processes. This study implemented a variety of science models into a chemistry unit throughout a 6-week study. The research question addressed was: In what ways do using models to learn and teach science help students transfer classroom knowledge to a deeper understanding of the scientific processes? Qualitative and quantitative data were collected through pre- and post-science interest inventories, observation field notes, student work samples, focus group interviews, and chemistry unit tests. These data collection tools assessed students' attitudes, engagement, and content knowledge throughout their chemistry unit. The results of the data indicate that the model-based instruction program helped with students' engagement in the lessons and understanding of chemistry content. The results also showed that students displayed positive attitudes toward using science models.

  2. Cord Blood Banking and Transplantation in China: A Ten Years Experience of a Single Public Bank.

    PubMed

    Liu, Jinhui; He, Ji; Chen, Shu; Qin, Fei; Wang, Fang; Xu, Gang; Zhu, Faming; Lv, Hangjun; Yan, Lixing

    2012-02-01

    BACKGROUND: Umbilical cord blood (UCB) has been used successfully in transplantation to treat hematologic malignancies and genetic diseases. Herein, we describe the experience generated in a single public UCB bank in Zhejiang Province, China. METHODS: Good manufacturing practice and standard operating procedures were used to address donor selection as well as UCB collection, processing, and cryopreservation. Total nucleated cells (TNCs), cellular viability, CD34+ cells, and colony-forming units were determined, and infectious disease screening, sterility testing, and HLA typing of UCB units were performed. RESULTS: Only 18.51% of all collected UCB units met storage criteria, and 7,056 UCB units were cryopreserved over 10 years. The volume of UCB units was 95.0 ± 22.0 ml. The number of TNCs before and after processing was 13.32 ± 3.63 × 10(8) and 10.63 ± 2.80 × 10(8), respectively, and the recovery rate was 80.71 ± 11.26%. CD34+ cells accounted for 0.4344 ± 0.1874% of the TNCs. The CFU-GM count was 32.1 ± 28.0 colonies per 1 × 10(5) nucleated cells. Based mainly on HLA match and nucleated cell content, 26 UCB units were released for transplantation. CONCLUSIONS: A public UCB bank was successfully established in China; collection and processing of UCB units should be optimized to maximize volume and cell count.
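    The recovery rate follows directly from the TNC counts before and after processing. Note that the ratio of the reported mean counts (about 79.8%) differs slightly from the abstract's 80.71%, which is presumably the mean of per-unit recovery rates rather than the ratio of means. A minimal sketch:

```python
def tnc_recovery_rate(tnc_before: float, tnc_after: float) -> float:
    """Post-processing TNC recovery as a percentage of the pre-processing count."""
    return 100.0 * tnc_after / tnc_before

# Mean TNC counts from the abstract, in units of 1e8 cells
print(round(tnc_recovery_rate(13.32, 10.63), 1))  # ratio of the reported means, ~79.8
```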

  3. Design and process integration of organic Rankine cycle utilizing biomass for power generation

    NASA Astrophysics Data System (ADS)

    Ependi, S.; Nur, T. B.

    2018-02-01

    Indonesia has high-potential biomass energy sources from palm oil mill industry activities. Interest in applying the Organic Rankine Cycle (ORC) to produce electricity from biomass energy sources is growing, because the ORC has been used successfully to generate electricity from waste heat otherwise rejected to the environment in industrial processes. In this study, palm oil empty fruit bunch and wood chip were used as biomass fuels to generate electricity with a combustion-driven ORC. The heat from the combustion burner was transferred by a thermal oil heater to evaporate the ORC working fluid in the evaporator unit. Syltherm XLT thermal oil was used as the heat carrier from the combustion burner, while R245fa was used as the working fluid for the ORC unit. An appropriate design integration from the biomass combustion unit to the ORC unit has been analyzed and proposed to generate expander shaft work. Moreover, the effect of the recuperator on total system efficiency has also been investigated. It was observed that fuel consumption increased when the recuperator-equipped ORC unit operated up to a certain pressure and decreased when it operated at high pressure.

  2. ConnectX-2 InfiniBand Management Queues: New Support for Network Offloaded

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graham, Richard L; Poole, Stephen W; Shamis, Pavel

    2010-01-01

    This paper introduces the newly developed InfiniBand (IB) Management Queue capability, used by the Host Channel Adapter (HCA) to manage network task data flow dependencies and progress the communications associated with such flows. These tasks include sends, receives, and the newly supported wait task, and are scheduled by the HCA based on a data dependency description provided by the user. This functionality is supported by the ConnectX-2 HCA, and provides the means for delegating collective communication management and progress to the HCA, also known as collective communication offload. This provides a means for overlapping collective communications managed by the HCA and computation on the Central Processing Unit (CPU), thus making it possible to reduce the impact of system noise on parallel applications using collective operations. This paper further describes how this new capability can be used to implement scalable Message Passing Interface (MPI) collective operations, describing the high-level details of how this new capability is used to implement the MPI Barrier collective operation, focusing on the latency-sensitive performance aspects of this new capability. This paper concludes with small-scale benchmark experiments comparing implementations of the barrier collective operation, using the new network offload capabilities, with established point-to-point based implementations of these same algorithms, which manage the data flow using the central processing unit. These early results demonstrate the promise this new capability provides to improve the scalability of high-performance applications using collective communications. The latency of the HCA-based implementation of the barrier is similar to that of the best-performing point-to-point based implementation managed by the central processing unit, and begins to outperform it as the number of processes involved in the collective operation increases.

  5. ConnectX-2 InfiniBand Management Queues: First Investigation of the New Support for Network Offloaded Collective Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graham, Richard L; Poole, Stephen W; Shamis, Pavel

    2010-01-01

    This paper introduces the newly developed InfiniBand (IB) Management Queue capability, used by the Host Channel Adapter (HCA) to manage network task data flow dependencies and progress the communications associated with such flows. These tasks include sends, receives, and the newly supported wait task, and are scheduled by the HCA based on a data dependency description provided by the user. This functionality is supported by the ConnectX-2 HCA, and provides the means for delegating collective communication management and progress to the HCA, also known as collective communication offload. This provides a means for overlapping collective communications managed by the HCA and computation on the Central Processing Unit (CPU), thus making it possible to reduce the impact of system noise on parallel applications using collective operations. This paper further describes how this new capability can be used to implement scalable Message Passing Interface (MPI) collective operations, describing the high-level details of how this new capability is used to implement the MPI Barrier collective operation, focusing on the latency-sensitive performance aspects of this new capability. This paper concludes with small-scale benchmark experiments comparing implementations of the barrier collective operation, using the new network offload capabilities, with established point-to-point based implementations of these same algorithms, which manage the data flow using the central processing unit. These early results demonstrate the promise this new capability provides to improve the scalability of high-performance applications using collective communications. The latency of the HCA-based implementation of the barrier is similar to that of the best-performing point-to-point based implementation managed by the central processing unit, and begins to outperform it as the number of processes involved in the collective operation increases.

  6. Risk-based Prioritization of Facility Decommissioning and Environmental Restoration Projects in the National Nuclear Legacy Liabilities Program at the Chalk River Laboratory - 13564

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Jerel G.; Kruzic, Michael; Castillo, Carlos

    2013-07-01

    Chalk River Laboratory (CRL), located in Ontario, Canada, has a large number of remediation projects currently in the Nuclear Legacy Liabilities Program (NLLP), including hundreds of facility decommissioning projects and over one hundred environmental remediation projects, all to be executed over the next 70 years. Atomic Energy of Canada Limited (AECL) engaged WorleyParsons to prioritize the NLLP projects at the CRL through a risk-based prioritization and ranking process, using the WorleyParsons Sequencing Unit Prioritization and Estimating Risk Model (SUPERmodel). The prioritization project made use of the SUPERmodel, which has previously been used for other large-scale site prioritization and sequencing of facilities at nuclear laboratories in the United States. The process included development and vetting of risk parameter matrices as well as confirmation/validation of project risks. Detailed sensitivity studies were also conducted to understand the impacts that risk parameter weighting and scoring had on prioritization. The repeatable prioritization process yielded an objective, risk-based, and technically defensible process for prioritization that gained concurrence from all stakeholders, including Natural Resources Canada (NRCan), which is responsible for the oversight of the NLLP. (authors)

  7. Unequal Bargaining? Australia's Aviation Trade Relations with the United States

    NASA Technical Reports Server (NTRS)

    Solomon, Russell

    2001-01-01

    International aviation trade bargaining is distinguished by its use of a formal process of bilateral bargaining based on the reciprocal exchange of rights by states. Australia-United States aviation trade relations are currently without rancour, but this has not always been the case and in the late 1980s and early 1990s, their formal bilateral aviation negotiations were a forum for a bitter conflict between two competing international aviation policies. In seeking to explain the bilateral aviation outcomes between Australia and the United States and how Australia has sought to improve upon these, analytical frameworks derived from international political economy were considered, along with the bilateral bargaining process itself. The paper adopts a modified neorealist model and concludes that to understand how Australia has sought to improve upon these aviation outcomes, neorealist assumptions that relative power capabilities determine outcomes must be qualified by reference to the formal bilateral bargaining process. In particular, Australia's use of this process and its application of certain bargaining tactics within that process remain critical to understanding bilateral outcomes.

  8. Very large scale monoclonal antibody purification: the case for conventional unit operations.

    PubMed

    Kelley, Brian

    2007-01-01

    Technology development initiatives targeted for monoclonal antibody purification may be motivated by manufacturing limitations and are often aimed at solving current and future process bottlenecks. A subject under debate in many biotechnology companies is whether conventional unit operations such as chromatography will eventually become limiting for the production of recombinant protein therapeutics. An evaluation of the potential limitations of process chromatography and filtration using today's commercially available resins and membranes was conducted for a conceptual process scaled to produce 10 tons of monoclonal antibody per year from a single manufacturing plant, a scale representing one of the world's largest single-plant capacities for cGMP protein production. The process employs a simple, efficient purification train using only two chromatographic and two ultrafiltration steps, modeled after a platform antibody purification train that has generated 10 kg batches in clinical production. Based on analyses of cost of goods and the production capacity of this very large scale purification process, it is unlikely that non-conventional downstream unit operations would be needed to replace conventional chromatographic and filtration separation steps, at least for recombinant antibodies.

  9. Sample Size for Tablet Compression and Capsule Filling Events During Process Validation.

    PubMed

    Charoo, Naseem Ahmad; Durivage, Mark; Rahman, Ziyaur; Ayad, Mohamad Haitham

    2017-12-01

    During solid dosage form manufacturing, the uniformity of dosage units (UDU) is ensured by testing samples at 2 stages, that is, blend stage and tablet compression or capsule/powder filling stage. The aim of this work is to propose a sample size selection approach based on quality risk management principles for process performance qualification (PPQ) and continued process verification (CPV) stages by linking UDU to potential formulation and process risk factors. Bayes success run theorem appeared to be the most appropriate approach among various methods considered in this work for computing sample size for PPQ. The sample sizes for high-risk (reliability level of 99%), medium-risk (reliability level of 95%), and low-risk factors (reliability level of 90%) were estimated to be 299, 59, and 29, respectively. Risk-based assignment of reliability levels was supported by the fact that at low defect rate, the confidence to detect out-of-specification units would decrease which must be supplemented with an increase in sample size to enhance the confidence in estimation. Based on level of knowledge acquired during PPQ and the level of knowledge further required to comprehend process, sample size for CPV was calculated using Bayesian statistics to accomplish reduced sampling design for CPV. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
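    The quoted sample sizes are consistent with the standard success-run relation n = ln(1 - C) / ln(R), assuming a 95% confidence level C, which reproduces the 299/59/29 figures (the abstract itself does not state the confidence level used). A minimal sketch:

```python
import math

def success_run_sample_size(reliability: float, confidence: float = 0.95) -> int:
    """Smallest n such that n consecutive passing units demonstrate the given
    reliability at the given confidence (success-run theorem)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# Reliability levels for high-, medium-, and low-risk factors from the abstract
for r in (0.99, 0.95, 0.90):
    print(r, success_run_sample_size(r))  # 299, 59, 29
```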

  10. Ventilation-Perfusion Relationships Following Experimental Pulmonary Contusion

    DTIC Science & Technology

    2007-06-14

    696.7 ± 6.1 to 565.0 ± 24.3 Hounsfield units), as did VOL (4.3 ± 0.5 to 33.5 ± 3.2%). Multivariate linear regression of MGSD, VOL, VD/VT, and QS vs. PaO2...parenchyma was separated into four regions based on the Hounsfield unit (HU) ranges reported by Gattinoni et al. (23) via a segmentation process executed...determined by repeated measures ANOVA. CT, computed tomography; MGSD, mean gray-scale density of the entire lung by CT scan; HU, Hounsfield units

  11. Reproducing (Dis)advantage: The Role of Family-Based, School-Based, and Cumulative-Based Processes

    ERIC Educational Resources Information Center

    Conner, Sonya

    2012-01-01

    Pierre Bourdieu's theory of cultural and social reproduction (Bourdieu 1973; Bourdieu and Passeron 1977) offers a model that can be used to explain the existence of persistent educational stratification in the United States, which contributes to perpetuation of social inequality, more generally. This theoretical model purports three…

  12. Comprehensive cost analysis of sentinel node biopsy in solid head and neck tumors using a time-driven activity-based costing approach.

    PubMed

    Crott, Ralph; Lawson, Georges; Nollevaux, Marie-Cécile; Castiaux, Annick; Krug, Bruno

    2016-09-01

    Head and neck cancer (HNC) is predominantly a locoregional disease. Sentinel lymph node (SLN) biopsy offers a minimally invasive means of accurately staging the neck. Value in healthcare is determined by both outcomes and the costs associated with achieving them. Time-driven activity-based costing (TDABC) may offer more precise estimates of the true cost. Process maps were developed for the nuclear medicine, operating room, and pathology phases of care. TDABC estimates costs by combining information about the process with the unit cost of each resource used. Resource utilization is based on observation of care and staff interviews. Unit costs are calculated as a capacity cost rate, measured in euros per minute (2014), for each resource consumed. Multiplying the unit costs by the resource quantities and summing across all resources used produces the average cost for each phase of care. Three time equations with six different scenarios were modeled based on the type of camera, the number of SLNs, and the type of staining used. Total times for the different SLN scenarios vary between 284 and 307 min, with a total cost between 2794 and 3541€. The unit costs vary between 788€/h for the intraoperative evaluation with a gamma probe and 889€/h for preoperative imaging with SPECT/CT. The unit costs for the lymphadenectomy and the pathological examination are, respectively, 560 and 713€/h. A 10% increase of time per individual activity generates only a 1% change in the total cost. TDABC evaluates the cost of SLN biopsy in HNC. The total cost across all phases varied between 2761 and 3744€ per standard case.
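    The TDABC arithmetic itself is a rate-times-time sum. The sketch below uses the €/h capacity cost rates quoted in the abstract, but the per-phase minutes are illustrative placeholders (the abstract reports only total times), so the resulting total is not a figure from the study.

```python
# Hypothetical TDABC sketch: cost = sum over phases of (capacity cost rate x time).
# Rates are the €/h figures quoted in the abstract; the per-phase minutes are
# assumed for illustration only.
phases = {
    "preoperative imaging (SPECT/CT)": (889, 45),   # (€/h, assumed minutes)
    "intraoperative gamma-probe":      (788, 30),
    "lymphadenectomy":                 (560, 60),
    "pathological examination":        (713, 90),
}

total = sum(rate / 60.0 * minutes for rate, minutes in phases.values())
print(round(total, 2))
```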

  13. When high achievers and low achievers work in the same group: the roles of group heterogeneity and processes in project-based learning.

    PubMed

    Cheng, Rebecca Wing-yi; Lam, Shui-fong; Chan, Joanne Chung-yan

    2008-06-01

    There has been an ongoing debate about the inconsistent effects of heterogeneous ability grouping on students in small group work such as project-based learning. The present research investigated the roles of group heterogeneity and processes in project-based learning. At the student level, we examined the interaction effect between students' within-group achievement and group processes on their self- and collective efficacy. At the group level, we examined how group heterogeneity was associated with the average self- and collective efficacy reported by the groups. The participants were 1,921 Hong Kong secondary students in 367 project-based learning groups. Student achievement was determined by school examination marks. Group processes, self-efficacy and collective efficacy were measured by a student-report questionnaire. Hierarchical linear modelling was used to analyse the nested data. When individual students in each group were taken as the unit of analysis, results indicated an interaction effect of group processes and students' within-group achievement on the discrepancy between collective- and self-efficacy. When compared with low achievers, high achievers reported lower collective efficacy than self-efficacy when group processes were of low quality. However, both low and high achievers reported higher collective efficacy than self-efficacy when group processes were of high quality. With 367 groups taken as the unit of analysis, the results showed that group heterogeneity, group gender composition and group size were not related to the discrepancy between collective- and self-efficacy reported by the students. Group heterogeneity was not a determinant factor in students' learning efficacy. Instead, the quality of group processes played a pivotal role because both high and low achievers were able to benefit when group processes were of high quality.

  14. Object Interpolation in Three Dimensions

    ERIC Educational Resources Information Center

    Kellman, Philip J.; Garrigan, Patrick; Shipley, Thomas F.

    2005-01-01

    Perception of objects in ordinary scenes requires interpolation processes connecting visible areas across spatial gaps. Most research has focused on 2-D displays, and models have been based on 2-D, orientation-sensitive units. The authors present a view of interpolation processes as intrinsically 3-D and producing representations of contours and…

  15. POLLUTION BALANCE: A NEW METHODOLOGY FOR MINIMIZING WASTE PRODUCTION IN MANUFACTURING PROCESSES.

    EPA Science Inventory

    A new methodology, based on a generic pollution balance equation, has been developed for minimizing waste production in manufacturing processes. A "pollution index," defined as the mass of waste produced per unit mass of product, has been introduced to provide a quantitative meas...
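    The pollution index as defined lends itself to a one-line sketch; the masses below are illustrative values, not data from the report.

```python
def pollution_index(waste_mass: float, product_mass: float) -> float:
    """Mass of waste produced per unit mass of product, per the definition above."""
    return waste_mass / product_mass

# Illustrative numbers: 250 kg of waste generated per 1000 kg of product
print(pollution_index(250.0, 1000.0))  # 0.25
```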

  16. Investigation of Copper Sorption by Sugar Beet Processing Lime Waste

    EPA Science Inventory

    In the western United States, sugar beet processing for sugar recovery generates a lime-based waste product (~250,000 Mg yr-1) that has little liming value in the region’s calcareous soils. This area has recently experienced an increase in dairy production, with dairi...

  17. Engaging Frontline Staff in Performance Improvement: The American Organization of Nurse Executives Implementation of Transforming Care at the Bedside Collaborative.

    PubMed

    Needleman, Jack; Pearson, Marjorie L; Upenieks, Valda V; Yee, Tracy; Wolstein, Joelle; Parkerton, Melissa

    2016-02-01

    Process improvement stresses the importance of engaging frontline staff in implementing new processes and methods. Yet questions remain on how to incorporate these activities into the workday of hospital staff and how to create and maintain staff commitment. In a 15-month American Organization of Nurse Executives collaborative involving frontline medical/surgical staff from 67 hospitals, Transforming Care at the Bedside (TCAB) was evaluated to assess whether participating units successfully implemented recommended change processes, engaged staff, implemented innovations, and generated support from hospital leadership and staff. In a mixed-methods analysis, multiple data sources were used, including leader surveys, unit staff surveys, administrative data, time study data, and collaborative documents. All units reported establishing unit-based teams, of which >90% succeeded in conducting tests of change, with unit staff selecting topics and making decisions on adoption. Fifty-five percent of unit staff reported participating in unit meetings, and 64% in tests of change. Unit managers reported a substantial increase in staff support for the initiative. An average of 36 tests of change were conducted per unit, with 46% of tested innovations sustained and 20% spread to other units. Some 95% of managers and 97% of chief nursing officers believed that the program had made unit staff more likely to initiate change. Among staff, 83% would encourage adoption of the initiative. Given the strong positive assessment of TCAB, evidence of substantial engagement of staff in the work, and the high volume of innovations tested, implemented, and sustained, TCAB appears to be a productive model for organizing and implementing a program of frontline-led improvement.

  18. Forest conditions and trends in the northern United States

    Treesearch

    Stephen R. Shifley; Francisco X. Aguilar; Nianfu Song; Susan I. Stewart; David J. Nowak; Dale D. Gormanson; W. Keith Moser; Sherri Wormstead; Eric J. Greenfield

    2012-01-01

    This section describes current conditions and trends for the 20 Northern States by focusing on selected characteristics associated with forest sustainability. Its format is based upon a set of 64 indicators within 7 broad criteria that the United States and 11 other countries have adopted under the auspices of the Montréal Process Working Group on Criteria and...

  19. Conserving Our Health. Seychelles Integrated Science. [Teacher and Pupil Booklets]. Unit 12.

    ERIC Educational Resources Information Center

    Brophy, M.; Fryars, M.

    Seychelles Integrated Science (SIS), a 3-year laboratory-based science program for students (ages 11-15) in upper primary grades 7, 8, and 9, was developed from an extensive evaluation and modification of previous P7-P9 materials. This P9 SIS unit deals with conserving health, focusing on such body processes as breathing, digestion, excretion,…

  20. Realisation of all 16 Boolean logic functions in a single magnetoresistance memory cell

    NASA Astrophysics Data System (ADS)

    Gao, Shuang; Yang, Guang; Cui, Bin; Wang, Shouguo; Zeng, Fei; Song, Cheng; Pan, Feng

    2016-06-01

    Stateful logic circuits based on next-generation nonvolatile memories, such as magnetoresistance random access memory (MRAM), promise to break the long-standing von Neumann bottleneck in state-of-the-art data processing devices. For the successful commercialisation of stateful logic circuits, a critical step is realizing the best use of a single memory cell to perform logic functions. In this work, we propose a method for implementing all 16 Boolean logic functions in a single MRAM cell, namely a magnetoresistance (MR) unit. Based on our experimental results, we conclude that this method is applicable to any MR unit with a double-hump-like hysteresis loop, especially pseudo-spin-valve magnetic tunnel junctions with a high MR ratio. Moreover, after simply reversing the correspondence between voltage signals and output logic values, this method could also be applicable to any MR unit with a double-pit-like hysteresis loop. These results may provide a helpful solution for the final commercialisation of MRAM-based stateful logic circuits in the near future. Electronic supplementary information (ESI) available. See DOI: 10.1039/c6nr03169b
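    The "all 16 Boolean logic functions" claim refers to the complete space of two-input truth tables, which can be enumerated generically as shown below. This sketch covers the logical space only; the mapping of each function to voltage pulses and resistance states is device-specific and not represented here.

```python
# Enumerate all 16 Boolean functions of two inputs as 4-bit truth tables.
from itertools import product

inputs = list(product((0, 1), repeat=2))  # (p, q) in order 00, 01, 10, 11

functions = {}
for code in range(16):
    # Bit i of `code` gives the output for input pattern inputs[i]
    table = tuple((code >> i) & 1 for i in range(4))
    functions[code] = dict(zip(inputs, table))

# e.g. code 0b1000 assigns 1 only to (1, 1): the AND function
assert functions[0b1000][(1, 1)] == 1 and functions[0b1000][(0, 1)] == 0
print(len(functions))  # 16
```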

  1. Visions of success and achievement in recreation-related USDA Forest Service NEPA processes

    Treesearch

    Mac J. Stern; Dale J. Blahna; Lee K. Cerveny; Michael J. Mortimer

    2009-01-01

    The National Environmental Policy Act (NEPA) is incorporated into the planning and decisionmaking culture of all natural resource agencies in the United States. Yet, we know little about how the attitudes and internal interactions of interdisciplinary (ID) teams engaged in NEPA processes influence process outcomes. We conducted a Web-based survey of 106 ID team leaders...

  2. Electrophysiological Evidence for the Influence of Unitization on the Processes Engaged During Episodic Retrieval: Enhancing Familiarity Based Remembering

    ERIC Educational Resources Information Center

    Rhodes, Sinead M.; Donaldson, David I.

    2007-01-01

    Episodic memory depends upon multiple dissociable retrieval processes. Here we investigated the degree to which the processes engaged during successful retrieval are dependent on the properties of the representations that underlie memory for an event. Specifically we examined whether the individual elements of an event can, under some conditions,…

  3. Comparative assessment of several post-processing methods for correcting evapotranspiration forecasts derived from TIGGE datasets.

    NASA Astrophysics Data System (ADS)

    Tian, D.; Medina, H.

    2017-12-01

    Post-processing of medium-range reference evapotranspiration (ETo) forecasts based on numerical weather prediction (NWP) models has the potential to improve the quality and utility of these forecasts. This work compares the performance of several post-processing methods for correcting ETo forecasts over the continental U.S. generated from The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) database using data from Europe (EC), the United Kingdom (MO), and the United States (NCEP). The post-processing techniques considered are: simple bias correction, the use of multimodels, Ensemble Model Output Statistics (EMOS, Gneiting et al., 2005), and Bayesian Model Averaging (BMA, Raftery et al., 2005). ETo estimates based on quality-controlled U.S. Regional Climate Reference Network measurements, computed with the FAO 56 Penman-Monteith equation, are adopted as the baseline. EMOS and BMA are generally the most efficient post-processing techniques for the ETo forecasts. Nevertheless, a simple bias correction of the best model is commonly much more rewarding than using multimodel raw forecasts. Our results demonstrate the potential of different forecasting and post-processing frameworks in operational evapotranspiration and irrigation advisory systems at the national scale.
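    The simplest technique in the comparison, additive bias correction, can be sketched as follows. The station values and forecasts below are synthetic placeholders, not TIGGE or USCRN data, and real ETo post-processing would work on much longer training series.

```python
import numpy as np

# Minimal additive bias correction of an ETo forecast against observations:
# estimate the mean forecast error on a training period, then subtract it
# from new forecasts. All numbers here are synthetic placeholders.
obs_train  = np.array([4.1, 5.0, 3.8, 4.6])   # mm/day, training "observations"
fcst_train = np.array([4.6, 5.4, 4.5, 5.1])   # mm/day, matching raw forecasts

bias = float(np.mean(fcst_train - obs_train))  # mean forecast error
fcst_new = np.array([4.9, 4.2])
corrected = fcst_new - bias
print(np.round(corrected, 2))
```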

  4. Rapid Parallel Calculation of shell Element Based On GPU

    NASA Astrophysics Data System (ADS)

    Wanga, Jian Hua; Lia, Guang Yao; Lib, Sheng; Li, Guang Yao

    2010-06-01

    Long computing times have bottlenecked the application of the finite element method. In this paper, an effective method to speed up FEM calculation using a modern graphics processing unit and programmable rendering tools is put forward: the representation of element information is devised in accordance with the features of the GPU, all element calculations are converted into a rendering process, the internal-force calculation for all elements is carried out in the simulation, and the low level of parallelism previously seen when running on a single computer is overcome. Studies show that this method can greatly improve efficiency and shorten computing time. The results of simulation calculations for an elasticity problem with a large number of cells in sheet metal proved that GPU parallel simulation was faster than the CPU equivalent. It is a useful and efficient way to solve engineering problems.

  5. GPU-based streaming architectures for fast cone-beam CT image reconstruction and demons deformable registration.

    PubMed

    Sharp, G C; Kandasamy, N; Singh, H; Folkert, M

    2007-10-07

    This paper shows how to significantly accelerate cone-beam CT reconstruction and 3D deformable image registration using the stream-processing model. We describe data-parallel designs for the Feldkamp, Davis and Kress (FDK) reconstruction algorithm, and the demons deformable registration algorithm, suitable for use on a commodity graphics processing unit. The streaming versions of these algorithms are implemented using the Brook programming environment and executed on an NVidia 8800 GPU. Performance results using CT data of a preserved swine lung indicate that the GPU-based implementations of the FDK and demons algorithms achieve a substantial speedup--up to 80 times for FDK and 70 times for demons when compared to an optimized reference implementation on a 2.8 GHz Intel processor. In addition, the accuracy of the GPU-based implementations was found to be excellent. Compared with CPU-based implementations, the RMS differences were less than 0.1 Hounsfield unit for reconstruction and less than 0.1 mm for deformable registration.

  6. Physical Modeling of Contact Processes on the Cutting Tools Surfaces of STM When Turning

    NASA Astrophysics Data System (ADS)

    Belozerov, V. A.; Uteshev, M. H.

    2016-08-01

    This article describes how to create an optimization model of the fine turning of superalloys and steels with STM cutting tools on CNC machines, flexible manufacturing units (GPM), and machining centers. The optimization model links (unites) the contact processes occurring simultaneously on the front and back surfaces of the STM tool, making it possible to manage the contact processes and the dynamic strength of the cutting tool at the STM tip. The established optimization model for managing the dynamic strength of STM cutters in fine turning is based on a previously developed thermomechanical (physical, heat) model, which enables a systematic thermomechanical approach to choosing STM grades (domestic and foreign) for cutting tools intended for the fine turning of heat-resistant alloys and steels.

  7. Last night I had the strangest dream: Varieties of rational thought processes in dream reports.

    PubMed

    Wolman, Richard N; Kozmová, Miloslava

    2007-12-01

    From the neurophysiological perspective, thinking in dreaming and the quality of dream thought have been considered hallucinatory, bizarre, illogical, improbable, or even impossible. This empirical phenomenological research concentrates on testing whether dream thought can be defined as rational in the sense of an intervening mental process between sensory perception and the creation of meaning, leading to a conclusion or to taking action. From 10 individual dream journals of male participants aged 22-59 years and female participants aged 25-49 years, we delimited four dreams per journal and randomly selected five thought units from each dream for scoring. The units provided a base for testing a hypothesis that the thought processes of dream construction are rational. The results support the hypothesis and demonstrate that eight fundamental rational thought processes can be applied to the dreaming process.

  8. The Research and Implementation of MUSER CLEAN Algorithm Based on OpenCL

    NASA Astrophysics Data System (ADS)

    Feng, Y.; Chen, K.; Deng, H.; Wang, F.; Mei, Y.; Wei, S. L.; Dai, W.; Yang, Q. P.; Liu, Y. B.; Wu, J. P.

    2017-03-01

There is an urgent need for high-performance data processing on a single machine in astronomical software development. However, because machine configurations differ, traditional techniques such as multi-threading and CUDA (Compute Unified Device Architecture) + GPU (Graphics Processing Unit) programming have clear limitations in portability across operating systems. This paper introduces the use of OpenCL (Open Computing Language) in the development of the MUSER (MingantU SpEctral Radioheliograph) data processing system. The Högbom CLEAN algorithm is re-implemented as a parallel CLEAN algorithm in Python with the PyOpenCL extension package. Experimental results show that the OpenCL-based CLEAN algorithm runs with approximately the same efficiency as the earlier CUDA-based implementation. More importantly, the system also achieves high performance in a CPU-only environment, which solves the environmental dependence of CUDA+GPU. Overall, the work improves the adaptability of the system while emphasizing the performance of MUSER image cleaning, and the realization of OpenCL in MUSER demonstrates its suitability for scientific data processing. Given its high-performance computing capability in heterogeneous environments, OpenCL may well become a preferred technology for future high-performance astronomical software development.
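The serial Högbom loop that the paper parallelizes can be sketched in plain NumPy; this is a minimal illustration under our own assumptions (function name, array sizes, and parameters are ours, not MUSER's), with the peak search and PSF subtraction being the steps a PyOpenCL version would map onto kernels:

```python
import numpy as np

def hogbom_clean(dirty, psf, gain=0.2, threshold=0.05, max_iter=500):
    """Serial Högbom CLEAN: repeatedly find the brightest residual
    pixel and subtract a gain-scaled copy of the PSF centred there."""
    residual = dirty.astype(float).copy()
    model = np.zeros_like(residual)
    py, px = np.unravel_index(np.argmax(psf), psf.shape)  # PSF peak
    for _ in range(max_iter):
        y, x = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
        peak = residual[y, x]
        if abs(peak) < threshold:
            break
        model[y, x] += gain * peak
        for dy in range(psf.shape[0]):          # shift-and-subtract
            for dx in range(psf.shape[1]):
                iy, ix = y + dy - py, x + dx - px
                if 0 <= iy < residual.shape[0] and 0 <= ix < residual.shape[1]:
                    residual[iy, ix] -= gain * peak * psf[dy, dx]
    return model, residual

# Toy image: one point source, delta-function PSF.
dirty = np.zeros((16, 16)); dirty[8, 8] = 1.0
psf = np.zeros((5, 5)); psf[2, 2] = 1.0
model, residual = hogbom_clean(dirty, psf)
```

In the parallel versions discussed above, the inner peak search and subtraction are the data-parallel hot spots; the outer loop remains sequential.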

  9. Interim results of quality-control sampling of surface water for the Upper Colorado River National Water-Quality Assessment Study Unit, water years 1995-96

    USGS Publications Warehouse

    Spahr, N.E.; Boulger, R.W.

    1997-01-01

Quality-control samples provide part of the information needed to estimate the bias and variability that result from sample collection, processing, and analysis. Quality-control samples of surface water collected for the Upper Colorado River National Water-Quality Assessment study unit for water years 1995–96 are presented and analyzed in this report. The types of quality-control samples collected include pre-processing split replicates, concurrent replicates, sequential replicates, post-processing split replicates, and field blanks. Analysis of the pre-processing split replicates, concurrent replicates, sequential replicates, and post-processing split replicates is based on differences between analytical results of the environmental samples and analytical results of the quality-control samples. Results of these comparisons indicate that variability introduced by sample collection, processing, and handling is low and will not affect interpretation of the environmental data. The differences for most water-quality constituents are on the order of plus or minus 1 or 2 lowest rounding units. A lowest rounding unit is equivalent to the magnitude of the least significant figure reported for analytical results. The use of lowest rounding units avoids some of the difficulty in comparing differences between pairs of samples when concentrations span orders of magnitude and provides a measure of the practical significance of the effect of variability. Analysis of field-blank quality-control samples indicates that, with the exception of chloride and silica, no systematic contamination of samples is apparent. Chloride contamination probably was the result of incomplete rinsing of the dilute cleaning solution from the outlet ports of the decaport sample splitter. Silica contamination seems to have been introduced by the blank water. Sampling and processing procedures for water year 1997 have been modified as a result of these analyses.
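The lowest-rounding-unit comparison described above is a simple normalization; a sketch with hypothetical replicate pairs (values and reporting levels are ours, purely for illustration):

```python
import numpy as np

def diff_in_lru(env, qc, lru):
    """Paired environmental-minus-QC differences expressed in lowest
    rounding units (LRU): the magnitude of the least significant
    figure reported for each constituent."""
    env, qc, lru = (np.asarray(v, dtype=float) for v in (env, qc, lru))
    return np.round((env - qc) / lru).astype(int)

# Hypothetical replicate pairs (mg/L), reported to 0.1 and 1 mg/L.
diffs = diff_in_lru(env=[2.3, 2.4, 41.0],
                    qc=[2.2, 2.5, 43.0],
                    lru=[0.1, 0.1, 1.0])
```

Expressing differences this way keeps a 0.1 mg/L discrepancy at low concentration and a 2 mg/L discrepancy at high concentration on the same practical-significance scale.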

  10. Widespread Refreezing of Both Surface and Basal Melt Water Beneath the Greenland Ice Sheet

    NASA Astrophysics Data System (ADS)

    Bell, R. E.; Tinto, K. J.; Das, I.; Wolovick, M.; Chu, W.; Creyts, T. T.; Frearson, N.

    2013-12-01

The isotopically and chemically distinct, bubble-free ice observed along the Greenland Ice Sheet margin both in the Russell Glacier and north of Jacobshavn must have formed when water froze from subglacial networks. Where this refreezing occurs and what impact it has on ice sheet processes remain unclear. We use airborne radar data to demonstrate that freeze-on to the ice sheet base and associated deformation produce large ice units up to 700 m thick throughout northern Greenland. Along the ice sheet margin, in the ablation zone, surface meltwater, delivered via moulins, refreezes to the ice sheet base over rugged topography. In the interior, water melted from the ice sheet base is refrozen and surrounded by folded ice. A significant fraction of the ice sheet is modified by basal freeze-on and associated deformation. For the Eqip and Petermann catchments, representing the ice sheet margin and interior respectively, extensive airborne radar datasets show that 10%-13% of the base of the ice sheet and up to a third of the catchment width is modified by basal freeze-on. The interior units develop over relatively subdued topography with modest water flux from basal melt where conductive cooling likely dominates. Steps in the bed topography associated with subglacial valley networks may foster glaciohydraulic supercooling. The ablation zone units develop where both surface melt and crevassing are widespread and large volumes of surface meltwater will reach the base of the ice sheet. The relatively steep topography at the upslope edge of the ablation zone units combined with the larger water flux suggests that supercooling plays a greater role in their formation. The ice qualities of the ablation zone units should reflect the relatively fresh surface melt whereas the chemistry of the interior units should reflect solute-rich basal melt. Changes in basal conditions such as the presence of till patches may contribute to the formation of the large basal units near the Northeast Ice Stream. The contrasting rheology of glacial and interglacial ice may also enhance the deformation associated with freeze-on beneath large ice sheets. The occurrence of basal units both in the ice sheet interior and in the thermally very different ablation zone indicates refreezing is widespread and can occur in many environments beneath an ice sheet. This process appears to influence the morphology and behavior of the ice sheet from top to bottom.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watkins, B.E.; Kanna, R.L.; Chambers, R.D.

    There is a great need for alternatives to open burn/open detonation of explosives and propellants from dismantled munitions. LANL has investigated the use of base hydrolysis for the demilitarization of explosives. Hydrolysates of Comp B, Octol, Tritonal, and PBXN-109 were processed in the pilot molten salt unit (in building 191). NOx and CO emissions were found to be low, except for CO from PBXN-109 processing. This report describes experimental results of the destruction of the base hydrolysates.

  12. Continuous-flow hydrogenation of carbon dioxide to pure formic acid using an integrated scCO2 process with immobilized catalyst and base.

    PubMed

    Wesselbaum, Sebastian; Hintermair, Ulrich; Leitner, Walter

    2012-08-20

    Dual role for CO(2): Pure formic acid can be obtained continuously by hydrogenation of CO(2) in a single processing unit. An immobilized ruthenium organometallic catalyst and a nonvolatile base in an ionic liquid (IL) are combined with supercritical CO(2) as both reactant and extractive phase. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Reducing intraoperative red blood cell unit wastage in a large academic medical center.

    PubMed

    Whitney, Gina M; Woods, Marcella C; France, Daniel J; Austin, Thomas M; Deegan, Robert J; Paroskie, Allison; Booth, Garrett S; Young, Pampee P; Dmochowski, Roger R; Sandberg, Warren S; Pilla, Michael A

    2015-11-01

The wastage of red blood cell (RBC) units within the operative setting results in significant direct costs to health care organizations. Previous education-based efforts to reduce wastage were unsuccessful at our institution. We hypothesized that a quality and process improvement approach would result in sustained reductions in intraoperative RBC wastage in a large academic medical center. Utilizing a failure mode and effects analysis supplemented with time and temperature data, key drivers of perioperative RBC wastage were identified and targeted for process improvement. Multiple contributing factors, including improper storage and transport and a lack of accurate, locally relevant RBC wastage event data, were identified as significant contributors to ongoing intraoperative RBC unit wastage. Testing and implementation of improvements to the process of transport and storage of RBC units occurred in liver transplant and adult cardiac surgical areas due to their history of disproportionately high RBC wastage rates. Process interventions targeting local drivers of RBC wastage resulted in a significant reduction in RBC wastage (p < 0.0001; adjusted odds ratio, 0.24; 95% confidence interval, 0.15-0.39), despite an increase in operative case volume over the period of the study. Studied process interventions were then introduced incrementally in the remainder of the perioperative areas. These results show that a multidisciplinary team focused on the process of blood product ordering, transport, and storage was able to significantly reduce operative RBC wastage and its associated costs using quality and process improvement methods. © 2015 AABB.
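The adjusted odds ratio reported above comes from a regression model; as a simpler illustration of the statistic, an unadjusted odds ratio with a Woolf log-interval can be computed from a 2x2 table. The counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Woolf (log-method) 95% CI from a
    2x2 table: rows = after/before intervention, cols = wasted/not."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Hypothetical counts: 12 of 2000 units wasted after the intervention,
# 48 of 1900 before.
or_, lo, hi = odds_ratio_ci(12, 1988, 48, 1852)
```

An odds ratio well below 1 with a confidence interval excluding 1, as in the study, indicates the intervention is associated with reduced wastage odds.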

  15. Reliability modelling and analysis of a multi-state element based on a dynamic Bayesian network

    NASA Astrophysics Data System (ADS)

    Li, Zhiqiang; Xu, Tingxue; Gu, Junyuan; Dong, Qi; Fu, Linyu

    2018-04-01

This paper presents a quantitative reliability modelling and analysis method for multi-state elements based on a combination of the Markov process and a dynamic Bayesian network (DBN), taking perfect repair, imperfect repair and condition-based maintenance (CBM) into consideration. The Markov models of elements without repair and under CBM are established, and an absorbing set is introduced to determine the reliability of the repairable element. According to the state-transition relations between the states determined by the Markov process, a DBN model is built. In addition, its parameters for series and parallel systems, namely the conditional probability tables, can be calculated from the conditional degradation probabilities. Finally, the power control unit of a failure model is used as an example: a dynamic fault tree (DFT) is translated into a Bayesian network model and subsequently extended to a DBN. The results show the state probabilities of an element and of the system without repair, with perfect and imperfect repair, and under CBM with an absorbing set, plotted from the differential equations and verified. Through forward inference, the reliability of the control unit is determined under different modes, and weak nodes in the control unit are identified.
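A minimal sketch, assuming a hypothetical three-state element, of how state probabilities evolve under a Markov transition matrix whose failed state is absorbing (the matrix values are invented for illustration):

```python
import numpy as np

# Hypothetical three-state element: intact -> degraded -> failed.
# Rows give per-interval transition probabilities; the failed state
# is absorbing, playing the role of the paper's absorbing set.
P = np.array([[0.90, 0.08, 0.02],
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])

def state_probs(p0, P, steps):
    """Propagate an initial state distribution through `steps` intervals."""
    p = np.asarray(p0, dtype=float)
    for _ in range(steps):
        p = p @ P
    return p

p10 = state_probs([1.0, 0.0, 0.0], P, 10)
reliability_10 = p10[:2].sum()   # probability the element has not failed
```

Reliability here is the probability mass not yet absorbed; repair and CBM in the paper amount to adding transitions back toward the better states.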

  16. Optical Sensor of Thermal Gas Flow Based on Fiber Bragg Grating.

    PubMed

    Jiang, Xu; Wang, Keda; Li, Junqing; Zhan, Hui; Song, Zhenan; Che, Guohang; Lyu, Guohui

    2017-02-15

This paper addresses the problem of explosion-proofing the measurement of thermal gas flow with electronic sensors by presenting a new type of flow sensor heated via optical fiber. A fiber Bragg grating (FBG) unit for measuring fluid temperature and a heat-dissipation unit are designed to replace the traditional electronic sensors. Light in the C band from an amplified spontaneous emission (ASE) source is split, with one part used to heat the absorbing coating and the other part used in the signal processing unit. In the heating unit, an absorbing coating replaces the traditional resistance-heating module to minimize the risk of explosion. The measurement results demonstrate good consistency between the flow and the simulated temperature difference. A method to enhance the flow measurement resolution is also discussed.
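The FBG temperature readout rests on two standard relations: the Bragg condition lambda_B = 2 * n_eff * Lambda, and the thermal shift d(lambda)/lambda_B = (alpha + xi) * dT. A sketch using typical silica-fibre coefficients (ours, not values from the paper):

```python
# Typical silica-fibre coefficients; illustrative, not from the paper.
ALPHA = 0.55e-6   # thermal-expansion coefficient of silica, 1/K
XI = 8.6e-6       # thermo-optic coefficient, 1/K

def bragg_wavelength(n_eff, period_nm):
    """Bragg condition: lambda_B = 2 * n_eff * Lambda."""
    return 2.0 * n_eff * period_nm

def temperature_change(lambda_b_nm, shift_nm):
    """Temperature change inferred from a measured wavelength shift."""
    return shift_nm / (lambda_b_nm * (ALPHA + XI))

lam = bragg_wavelength(1.447, 535.0)   # ~1548 nm, in the C band
dT = temperature_change(lam, 0.014)    # a 14 pm shift -> about 1 K
```

The roughly 10 pm/K sensitivity this implies is why the temperature-difference signal in a thermal flow sensor is resolvable with a standard FBG interrogator.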

  17. Fully chip-embedded automation of a multi-step lab-on-a-chip process using a modularized timer circuit.

    PubMed

    Kang, Junsu; Lee, Donghyeon; Heo, Young Jin; Chung, Wan Kyun

    2017-11-07

    For highly-integrated microfluidic systems, an actuation system is necessary to control the flow; however, the bulk of actuation devices including pumps or valves has impeded the broad application of integrated microfluidic systems. Here, we suggest a microfluidic process control method based on built-in microfluidic circuits. The circuit is composed of a fluidic timer circuit and a pneumatic logic circuit. The fluidic timer circuit is a serial connection of modularized timer units, which sequentially pass high pressure to the pneumatic logic circuit. The pneumatic logic circuit is a NOR gate array designed to control the liquid-controlling process. By using the timer circuit as a built-in signal generator, multi-step processes could be done totally inside the microchip without any external controller. The timer circuit uses only two valves per unit, and the number of process steps can be extended without limitation by adding timer units. As a demonstration, an automation chip has been designed for a six-step droplet treatment, which entails 1) loading, 2) separation, 3) reagent injection, 4) incubation, 5) clearing and 6) unloading. Each process was successfully performed for a pre-defined step-time without any external control device.
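The serial pass-along behaviour of the timer chain can be caricatured as a schedule computation. The six step names follow the paper; the dwell times and function names are invented for illustration:

```python
# Toy schedule of the serial timer chain: each modularized timer unit
# holds for its dwell time, then passes high pressure downstream to
# trigger the next step via the pneumatic logic circuit.
STEPS = ["loading", "separation", "reagent injection",
         "incubation", "clearing", "unloading"]
DWELL = [5, 10, 8, 60, 6, 5]   # seconds per timer unit (hypothetical)

def run_sequence(steps, dwell):
    """Return (start_time_s, step) pairs produced by the chain."""
    t, schedule = 0, []
    for step, d in zip(steps, dwell):
        schedule.append((t, step))
        t += d   # unit times out; pressure passes to the next unit
    return schedule

schedule = run_sequence(STEPS, DWELL)
```

Extending the process is just appending another (step, dwell) pair, mirroring the paper's point that steps can be added without limit by adding timer units.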

  18. Continued investigation of solid propulsion economics. Task 1B: Large solid rocket motor case fabrication methods - Supplement process complexity factor cost technique

    NASA Technical Reports Server (NTRS)

    Baird, J.

    1967-01-01

This supplement to Task 1B, Large Solid Rocket Motor Case Fabrication Methods, supplies additional supporting cost data and discusses in detail the methodology applied to the task. For the case elements studied, cost was found to be directly proportional to the Process Complexity Factor (PCF). The PCF was obtained for each element by identifying unit processes common to the elements and their alternative manufacturing routes, assigning a weight to each unit process, and summing the weighted counts. In three instances of actual manufacture, the actual cost per pound equaled the cost estimate based on PCF per pound, but this supplement recognizes that the methodology is of limited, rather than general, application.
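The PCF technique described above is a weighted count; a sketch with hypothetical unit processes, weights, and cost rate (none of these values are from the report):

```python
# Sketch of the Process Complexity Factor technique: identify unit
# processes on a manufacturing route, weight each, and sum the
# weighted counts; estimated cost is then proportional to the PCF.
WEIGHTS = {"forming": 3.0, "welding": 4.0, "machining": 2.5,
           "heat_treat": 2.0, "inspection": 1.0}   # hypothetical weights

def pcf(route):
    """route maps unit process -> count; returns the weighted sum."""
    return sum(WEIGHTS[p] * n for p, n in route.items())

def cost_per_pound(route, dollars_per_pcf):
    """Cost estimate, with the $/PCF rate fitted from actual builds."""
    return pcf(route) * dollars_per_pcf

route = {"forming": 2, "welding": 4, "machining": 3, "inspection": 5}
estimate = cost_per_pound(route, 0.42)   # $/PCF rate is invented
```

The report's validation amounted to checking that this proportionality held for three actual builds.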

  19. A Collective Case Study of Secondary Students' Model-Based Inquiry on Natural Selection through Programming in an Agent-Based Modeling Environment

    ERIC Educational Resources Information Center

    Xiang, Lin

    2011-01-01

    This is a collective case study seeking to develop detailed descriptions of how programming an agent-based simulation influences a group of 8th grade students' model-based inquiry (MBI) by examining students' agent-based programmable modeling (ABPM) processes and the learning outcomes. The context of the present study was a biology unit on…

  20. Tomato seeds maturity detection system based on chlorophyll fluorescence

    NASA Astrophysics Data System (ADS)

    Li, Cuiling; Wang, Xiu; Meng, Zhijun

    2016-10-01

Chlorophyll fluorescence intensity can be used as an indicator of seed maturity and quality. The chlorophyll fluorescence intensity of the seed coat is measured to judge the chlorophyll content of the seed, and thereby its maturity and quality. This research developed a tomato seed maturity detection system based on chlorophyll fluorescence spectroscopy; the system includes an excitation light source unit, a fluorescence signal acquisition unit, and a data processing unit. The excitation light source unit consists of two high-power LEDs, two radiators, and two constant-current power supplies, and is designed to excite chlorophyll fluorescence in tomato seeds. The fluorescence signal acquisition unit is made up of a fluorescence spectrometer, an optical fiber, an optical fiber scaffold, and a narrowband filter. The data processing unit mainly comprises a computer. Tomato fruits at the green ripe, discoloration, firm ripe, and full ripe stages were harvested, and their seeds were collected directly. The developed system was used to collect fluorescence spectra of tomato seeds of different maturities. Principal component analysis (PCA) was used to reduce the dimensionality of the spectral data and extract principal components, and PCA was combined with linear discriminant analysis (LDA) to establish a discriminant model of tomato seed maturity; the discriminant accuracy exceeded 90%. The results show that chlorophyll fluorescence spectroscopy is feasible for seed maturity detection, and the developed system has high detection accuracy.
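The PCA step of the pipeline can be sketched with an SVD of the mean-centred spectra; here random numbers stand in for real fluorescence data, and the LDA stage is only noted (shapes and names are ours):

```python
import numpy as np

def pca_scores(X, n_components=3):
    """Project spectra (one row per seed) onto the leading principal
    components via SVD of the mean-centred data matrix."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Random numbers standing in for fluorescence spectra:
# 20 seeds x 50 wavelength bins.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 50))
scores = pca_scores(X)   # these scores would feed the LDA classifier
```

Reducing 50 correlated wavelength bins to a few scores is what makes the downstream LDA model tractable with only a handful of seeds per maturity class.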

  1. Cognitive Requirements for Small Unit Leaders in Military Operations in Urban Terrain

    DTIC Science & Technology

    1998-09-01

operations specifically. A cognitive task analysis, based on in-depth interviews with subject matter experts (n=7), was conducted to expose the...process. The findings of the cognitive task analysis guided the development of training recommendations, particularly the need for a scenario-based

  2. EVALUATION OF THE FULL-SCALE BASE CATALYZED DECOMPOSITION PROCESS (BCDP) UNIT LOCATED IN GUAM

    EPA Science Inventory

    This report summarizes performance data collected in February 1997 on the removal of polychlorinated biphenyls (PCBs), polychlorinated dibenzo-p-dioxins (PCDDs), and polychlorinated dibenzofurans (PCDFs) from soil fed to a first-stage rotary kiln reactor of the Base Catalyzed Dec...

  3. Using Graphical Processing Units to Accelerate Orthorectification, Atmospheric Correction and Transformations for Big Data

    NASA Astrophysics Data System (ADS)

    O'Connor, A. S.; Justice, B.; Harris, A. T.

    2013-12-01

Graphics Processing Units (GPUs) are high-performance multiple-core processors capable of very high computational speeds and large data throughput. Modern GPUs are inexpensive and widely available commercially. These are general-purpose parallel processors with support for a variety of programming interfaces, including industry standard languages such as C. GPU implementations of algorithms that are well suited for parallel processing can often achieve speedups of several orders of magnitude over optimized CPU codes. Significant improvements in speeds for imagery orthorectification, atmospheric correction, target detection and image transformations like Independent Components Analysis (ICA) have been achieved using GPU-based implementations. Additional optimizations, when factored in with GPU processing capabilities, can provide a 50x - 100x reduction in the time required to process large imagery. Exelis Visual Information Solutions (VIS) has implemented a CUDA-based GPU processing framework for accelerating ENVI and IDL processes that can best take advantage of parallelization. Testing performed by Exelis VIS shows that orthorectification can take as long as two hours with a WorldView-1 35,000 x 35,000 pixel image. With GPU orthorectification, the same process takes three minutes. By speeding up image processing, imagery can successfully be used by first responders and by scientists making rapid discoveries with near real time data, and data centers gain an operational means to quickly process and disseminate data.

  4. Retrofit concept for small safety related stationary machines

    NASA Astrophysics Data System (ADS)

    Epple, S.; Jalba, C. K.; Muminovic, A.; Jung, R.

    2017-05-01

More and more old machines face the problem that the lifecycle of their control electronics reaches its intended end of life while the mechanics themselves and the process capability are still in very good condition. This article shows the example of a reactive ion etcher originally built in 1988 that was refitted with a new control concept. The original control unit had been repaired several times under the manufacturer's obsolescence management. At the start of the retrofit project, the integrated circuits were no longer available for further repair of the original control unit. Safety, repeatability and stability of the process were greatly improved.

  5. Soil Functional Mapping: A Geospatial Framework for Scaling Soil Carbon Cycling

    NASA Astrophysics Data System (ADS)

    Lawrence, C. R.

    2017-12-01

    Climate change is dramatically altering biogeochemical cycles in most terrestrial ecosystems, particularly the cycles of water and carbon (C). These changes will affect myriad ecosystem processes of importance, including plant productivity, C exports to aquatic systems, and terrestrial C storage. Soil C storage represents a critical feedback to climate change as soils store more C than the atmosphere and aboveground plant biomass combined. While we know plant and soil C cycling are strongly coupled with soil moisture, substantial unknowns remain regarding how these relationships can be scaled up from soil profiles to ecosystems. This greatly limits our ability to build a process-based understanding of the controls on and consequences of climate change at regional scales. In an effort to address this limitation we: (1) describe an approach to classifying soils that is based on underlying differences in soil functional characteristics and (2) examine the utility of this approach as a scaling tool that honors the underlying soil processes. First, geospatial datasets are analyzed in the context of our current understanding of soil C and water cycling in order to predict soil functional units that can be mapped at the scale of ecosystems or watersheds. Next, the integrity of each soil functional unit is evaluated using available soil C data and mapping units are refined as needed. Finally, targeted sampling is conducted to further differentiate functional units or fill in any data gaps that are identified. Completion of this workflow provides new geospatial datasets that are based on specific soil functions, in this case the coupling of soil C and water cycling, and are well suited for integration with regional-scale soil models. Preliminary results from this effort highlight the advantages of a scaling approach that balances theory, measurement, and modeling.

  6. Personal customizing exercise with a wearable measurement and control unit

    PubMed Central

    Wang, Zhihui; Kiryu, Tohru; Tamura, Naoki

    2005-01-01

    Background Recently, wearable technology has been used in various health-related fields to develop advanced monitoring solutions. However, the monitoring function alone cannot meet all the requirements of customizing machine-based exercise on an individual basis by relying on biosignal-based controls. We propose a new wearable unit design equipped with measurement and control functions to support the customization process. Methods The wearable unit can measure the heart rate and electromyogram signals during exercise performance and output workload control commands to the exercise machines. The workload is continuously tracked with exercise programs set according to personally customized workload patterns and estimation results from the measured biosignals by a fuzzy control method. Exercise programs are adapted by relying on a computer workstation, which communicates with the wearable unit via wireless connections. A prototype of the wearable unit was tested together with an Internet-based cycle ergometer system to demonstrate that it is possible to customize exercise on an individual basis. Results We tested the wearable unit in nine people to assess its suitability to control cycle ergometer exercise. The results confirmed that the unit could successfully control the ergometer workload and continuously support gradual changes in physical activities. Conclusion The design of wearable units equipped with measurement and control functions is an important step towards establishing a convenient and continuously supported wellness environment. PMID:15982425
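The paper's fuzzy workload control is not specified in this abstract; as a crude, hedged stand-in, a piecewise rule can illustrate the idea of mapping heart-rate error through membership degrees to a workload correction (all names and parameters are invented):

```python
def workload_adjust(hr_bpm, target_bpm, band=5.0, step_w=10.0):
    """Crude piecewise stand-in for the unit's fuzzy workload control:
    the degree of membership in 'too high'/'too low' scales a workload
    correction in watts. Parameters are invented for illustration."""
    error = hr_bpm - target_bpm
    if abs(error) <= band:          # 'on target': leave workload alone
        return 0.0
    # Membership grows linearly outside the dead band, capped at 1.
    degree = min((abs(error) - band) / band, 1.0)
    return -step_w * degree if error > 0 else step_w * degree

delta = workload_adjust(152, 140)   # heart rate too high -> ease off
```

In the actual system such a correction would be sent from the wearable unit to the ergometer as a workload control command each measurement cycle.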

  7. United States-Philippines bases agreements: prospect for its renewal. Research report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahimer, S.M.

    1988-05-01

Remarks on the problems and issues related to the United States-Philippines Bases Agreement and prospects for its renewal are included, namely: analysis of the provisions of the new Philippine Constitution; the ASEAN perspective on the bases; US policy on nuclear weapons and its interests and options; Philippine interests and priorities, including alternate plans to compensate for the possible withdrawal of the US from the Philippines; and an assessment of the effects of these factors on the renewal of the Bases Agreement. There are difficulties and barriers to the renewal of the said Agreement posed by conflicting policies of both parties and also due to divergent views on priorities, constitutional processes of both countries, and time constraints for concluding an agreement. However, there are options for the United States regarding the problem, depending upon the desired level of its presence in the Asia/Pacific region and how central the Philippine bases are to US national security interests.

  8. A New Parallel Approach for Accelerating the GPU-Based Execution of Edge Detection Algorithms

    PubMed Central

    Emrani, Zahra; Bateni, Soroosh; Rabbani, Hossein

    2017-01-01

Real-time image processing is used in a wide variety of applications like those in medical care and industrial processes. This technique in medical care has the ability to display important patient information graphically, which can supplement and help the treatment process. Medical decisions made based on real-time images are more accurate and reliable. According to recent research, graphic processing unit (GPU) programming is a useful method for improving the speed and quality of medical image processing and is one of the ways of achieving real-time image processing. Edge detection is an early stage in most image processing methods for the extraction of features and object segments from a raw image. The Canny method, Sobel and Prewitt filters, and the Roberts' Cross technique are some examples of edge detection algorithms that are widely used in image processing and machine vision. In this work, these algorithms are implemented using the Compute Unified Device Architecture (CUDA), Open Source Computer Vision (OpenCV), and Matrix Laboratory (MATLAB) platforms. An existing parallel method for the Canny approach has been modified to run in a fully parallel manner by replacing the breadth-first search procedure with a parallel method. These algorithms have been compared by testing them on a database of optical coherence tomography images. The comparison of results shows that the proposed implementation of the Canny method on GPU using the CUDA platform improves the speed of execution by 2–100× compared to the central processing unit-based implementation using the OpenCV and MATLAB platforms. PMID:28487831
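Of the filters listed, Sobel is the easiest to sketch; a plain NumPy version makes visible the per-pixel independence that GPU implementations exploit (the loop body maps one-to-one onto a GPU thread). Sizes and the test image are illustrative:

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def filter2d(img, kernel):
    """Naive sliding-window filter ('valid' region). Every output
    pixel is computed independently, which is exactly what a GPU
    implementation assigns to one thread each."""
    kh, kw = kernel.shape
    h, w = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            out[y, x] = np.sum(img[y:y + kh, x:x + kw] * kernel)
    return out

def sobel_magnitude(img):
    """Gradient magnitude from horizontal and vertical Sobel responses."""
    return np.hypot(filter2d(img, SOBEL_X), filter2d(img, SOBEL_Y))

# A vertical step edge: strong response along the boundary only.
img = np.zeros((8, 8)); img[:, 4:] = 1.0
edges = sobel_magnitude(img)
```

The Canny pipeline discussed above adds smoothing, non-maximum suppression, and hysteresis on top of this gradient stage; hysteresis is the breadth-first-search step the authors parallelized.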

  10. A Framework for Distributed Problem Solving

    NASA Astrophysics Data System (ADS)

    Leone, Joseph; Shin, Don G.

    1989-03-01

This work explores a distributed problem solving (DPS) approach, namely the AM/AG model, to cooperative memory recall. The AM/AG model is a hierarchic social system metaphor for DPS based on Mintzberg's model of organizations. At the core of the model are information flow mechanisms, named amplification and aggregation. Amplification is a process of expounding a given task, called an agenda, into a set of subtasks with a magnified degree of specificity and distributing them to multiple processing units downward in the hierarchy. Aggregation is a process of combining the results reported from multiple processing units into a unified view, called a resolution, and promoting the conclusion upward in the hierarchy. The combination of amplification and aggregation can account for a memory recall process which primarily relies on the ability of making associations between vast amounts of related concepts, sorting out the combined results, and promoting the most plausible ones. The amplification process is discussed in detail, and an implementation of the amplification process is presented and illustrated by an example.
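The amplification/aggregation flow can be caricatured in a few lines; the task, fan-out, and scoring below are invented purely to show the shape of the mechanism:

```python
# Toy caricature of the AM/AG information flow: an agenda is amplified
# into more specific subtasks for lower-level units, and their results
# are aggregated into a single resolution promoted back up.
def amplify(agenda, fanout=3):
    """Expound an agenda into subtasks of greater specificity."""
    return [f"{agenda}/part-{i}" for i in range(fanout)]

def unit_solve(subtask):
    """One processing unit's local result for a subtask."""
    return (subtask, len(subtask))        # (answer, plausibility score)

def aggregate(results):
    """Combine unit results into a unified view; promote the best."""
    return max(results, key=lambda r: r[1])

resolution = aggregate(unit_solve(s) for s in amplify("recall-memory"))
```

In the full model this pattern recurses down a multi-level hierarchy, with each level amplifying agendas further and aggregating resolutions on the way back up.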

  11. VLTI auxiliary telescopes: a full object-oriented approach

    NASA Astrophysics Data System (ADS)

    Chiozzi, Gianluca; Duhoux, Philippe; Karban, Robert

    2000-06-01

    The Very Large Telescope (VLT) Telescope Control Software (TCS) is a portable system. It is now in use, or will be used, in a whole family of ESO telescopes (VLT Unit Telescopes, VLTI Auxiliary Telescopes, NTT, La Silla 3.6, VLT Survey Telescope and Astronomical Site Monitors in Paranal and La Silla). Although it has been developed making extensive use of Object Oriented (OO) methodologies, the overall development process chosen at the beginning of the project used traditional methods. In order to guarantee the system a longer lifetime (improving documentation and maintainability) and to prepare for future projects, we have introduced a full OO process. We have taken as a basis the Unified Software Development Process with the Unified Modeling Language (UML) and have adapted the process to our specific needs. This paper describes how the process has been applied to the VLTI Auxiliary Telescopes Control Software (ATCS). The ATCS is based on the portable VLT TCS, but some subsystems are new or have specific characteristics. The complete process has been applied to the new subsystems, while reused code has been integrated into the UML models. We have used the ATCS on one side to tune the process and train the team members, and on the other side to provide a UML and WWW based documentation for the portable VLT TCS.

  12. Use of lignocellulosic materials as sorbents for pesticide and phosphate residues

    Treesearch

    Mandla A. Tshabalala

    2006-01-01

    We previously reported results from limited field trials of a bark-based filtration unit designed to remove phosphorus from cranberry bog tail water. In that report we also identified some barriers that needed to be overcome to improve the performance of such a filtration unit. One barrier was lack of a cost effective process for large-scale conversion of bark to an...

  13. The Future of the Book. Part III. New Technologies in Book Distribution: The United States Experience. Studies on Books and Reading No. 18.

    ERIC Educational Resources Information Center

    Paul, Sandra K.; Kranberg, Susan

    The third report from a comprehensive Unesco study, this document traces the history of the application of computer-based technology to the book distribution process in the United States and indicates functional areas currently showing the effects of using this technology. Ways in which computer use is altering book distribution management…

  14. A disturbance-based ecosystem approach to maintaining and restoring freshwater habitats of evolutionarily significant units of anadromous salmonids in the Pacific Northwest.

    Treesearch

    G.H. Reeves; L.E. Benda; K.M. Burnett; P.A. Bisson; J.R. Sedell

    1995-01-01

    To preserve and recover evolutionarily significant units (ESUs) of anadromous salmonids Oncorhynchus spp. in the Pacific Northwest, long-term and short-term ecological processes that create and maintain freshwater habitats must be restored and protected. Aquatic ecosystems through- out the region are dynamic in space and time, and lack of...

  15. Toward inventory-based estimates of soil organic carbon in forests of the United States

    Treesearch

    G.M. Domke; C.H. Perry; B.F. Walters; L.E. Nave; C.W. Woodall; C.W. Swanston

    2017-01-01

    Soil organic carbon (SOC) is the largest terrestrial carbon (C) sink on Earth; this pool plays a critical role in ecosystem processes and climate change. Given the cost and time required to measure SOC, and particularly changes in SOC, many signatory nations to the United Nations Framework Convention on Climate Change report estimates of SOC stocks and stock changes...

  16. Comparative Assessment of Gasification Based Coal Power Plants with Various CO2 Capture Technologies Producing Electricity and Hydrogen

    PubMed Central

    2014-01-01

    Seven different types of gasification-based coal conversion processes for producing mainly electricity and in some cases hydrogen (H2), with and without carbon dioxide (CO2) capture, were compared on a consistent basis through simulation studies. The flowsheet for each process was developed in a chemical process simulation tool “Aspen Plus”. The pressure swing adsorption (PSA), physical absorption (Selexol), and chemical looping combustion (CLC) technologies were separately analyzed for processes with CO2 capture. The performances of the above three capture technologies were compared with respect to energetic and exergetic efficiencies, and the level of CO2 emission. The effect of air separation unit (ASU) and gas turbine (GT) integration on the power output of all the CO2 capture cases is assessed. Sensitivity analysis was carried out for the CLC process (electricity-only case) to examine the effect of temperature and water-cooling of the air reactor on the overall efficiency of the process. The results show that, when only electricity production is considered, the case using CLC technology has an electrical efficiency 1.3% and 2.3% higher than the PSA and Selexol based cases, respectively. The CLC based process achieves an overall CO2 capture efficiency of 99.9% in contrast to 89.9% for PSA and 93.5% for Selexol based processes. The overall efficiency of the CLC case for combined electricity and H2 production is marginally higher (by 0.3%) than Selexol and lower (by 0.6%) than PSA cases. The integration between the ASU and GT units benefits all three technologies in terms of electrical efficiency. Furthermore, our results suggest that it is favorable to operate the air reactor of the CLC process at higher temperatures with excess air supply in order to achieve higher power efficiency. PMID:24578590

  17. Comparative Assessment of Gasification Based Coal Power Plants with Various CO2 Capture Technologies Producing Electricity and Hydrogen.

    PubMed

    Mukherjee, Sanjay; Kumar, Prashant; Hosseini, Ali; Yang, Aidong; Fennell, Paul

    2014-02-20

    Seven different types of gasification-based coal conversion processes for producing mainly electricity and in some cases hydrogen (H2), with and without carbon dioxide (CO2) capture, were compared on a consistent basis through simulation studies. The flowsheet for each process was developed in a chemical process simulation tool "Aspen Plus". The pressure swing adsorption (PSA), physical absorption (Selexol), and chemical looping combustion (CLC) technologies were separately analyzed for processes with CO2 capture. The performances of the above three capture technologies were compared with respect to energetic and exergetic efficiencies, and the level of CO2 emission. The effect of air separation unit (ASU) and gas turbine (GT) integration on the power output of all the CO2 capture cases is assessed. Sensitivity analysis was carried out for the CLC process (electricity-only case) to examine the effect of temperature and water-cooling of the air reactor on the overall efficiency of the process. The results show that, when only electricity production is considered, the case using CLC technology has an electrical efficiency 1.3% and 2.3% higher than the PSA and Selexol based cases, respectively. The CLC based process achieves an overall CO2 capture efficiency of 99.9% in contrast to 89.9% for PSA and 93.5% for Selexol based processes. The overall efficiency of the CLC case for combined electricity and H2 production is marginally higher (by 0.3%) than Selexol and lower (by 0.6%) than PSA cases. The integration between the ASU and GT units benefits all three technologies in terms of electrical efficiency. Furthermore, our results suggest that it is favorable to operate the air reactor of the CLC process at higher temperatures with excess air supply in order to achieve higher power efficiency.

  18. Evaluating MC&A effectiveness to verify the presence of nuclear materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawson, P. G.; Morzinski, J. A.; Ostenak, Carl A.

    Traditional materials accounting is focused exclusively on the material balance area (MBA), and involves periodically closing a material balance based on accountability measurements conducted during a physical inventory. In contrast, the physical inventory for Los Alamos National Laboratory's near-real-time accounting system is established around processes and looks more like an item inventory. That is, the intent is not to measure material for accounting purposes, since materials have already been measured in the normal course of daily operations. A given unit process operates many times over the course of a material balance period. The product of a given unit process may move for processing within another unit process in the same MBA or may be transferred out of the MBA. Since few materials are unmeasured, the physical inventory for a near-real-time process area looks more like an item inventory. Thus, the intent of the physical inventory is to locate the materials on the books and verify information about the materials contained in the books. Closing a materials balance for such an area is a matter of summing all the individual mass balances for the batches processed by all unit processes in the MBA. Additionally, performance parameters are established to measure the program's effectiveness. Program effectiveness for verifying the presence of nuclear material is required to be equal to or greater than a prescribed performance level, process measurements must be within established precision and accuracy values, physical inventory results must meet or exceed performance requirements, and inventory differences must be less than a target/goal quantity. This approach exceeds DOE established accounting and physical inventory program requirements. Hence, LANL is committed to this approach and to seeking opportunities for further improvement through integrated technologies. This paper will provide a detailed description of this evaluation process.

  19. Appendix A: Ecoprovinces of the Central North American Cordillera and adjacent plains

    Treesearch

    Dennis A. Demarchi

    1994-01-01

    The fundamental difference between the map presented here and other regional ecosystem classifications is that this map's ecological units are based on climatic processes rather than vegetation communities (map appears at the end of this appendix). Macroclimatic processes are the physical and thermodynamic interaction between climatic controls, or the relatively...

  20. WEPP Model applications for evaluations of best management practices

    Treesearch

    D. C. Flanagan; W. J. Elliott; J. R. Frankenberger; C. Huang

    2010-01-01

    The Water Erosion Prediction Project (WEPP) model is a process-based erosion prediction technology for application to small watersheds and hillslope profiles, under agricultural, forested, rangeland, and other land management conditions. Developed by the United States Department of Agriculture (USDA) over the past 25 years, WEPP simulates many of the physical processes...

  1. A Neuroconstructivist Model of Past Tense Development and Processing

    ERIC Educational Resources Information Center

    Westermann, Gert; Ruh, Nicolas

    2012-01-01

    We present a neural network model of learning and processing the English past tense that is based on the notion that experience-dependent cortical development is a core aspect of cognitive development. During learning the model adds and removes units and connections to develop a task-specific final architecture. The model provides an integrated…

  2. Fracturing Writing Spaces: Multimodal Storytelling Ignites Process Writing

    ERIC Educational Resources Information Center

    Lenters, Kimberly; Winters, Kari-Lynn

    2013-01-01

    In this paper, we explore the affordances of literature-based, arts-infused and digital media processes for students, as multimodal practices take centre stage in an English Language Arts unit on fractured fairy tales. The study takes up the challenge of addressing multimodal literacy instruction and research in ways that utilize a range of…

  3. Performance and scalability of Fourier domain optical coherence tomography acceleration using graphics processing units.

    PubMed

    Li, Jian; Bloch, Pavel; Xu, Jing; Sarunic, Marinko V; Shannon, Lesley

    2011-05-01

    Fourier domain optical coherence tomography (FD-OCT) provides faster line rates, better resolution, and higher sensitivity for noninvasive, in vivo biomedical imaging compared to traditional time domain OCT (TD-OCT). However, because the signal processing for FD-OCT is computationally intensive, real-time FD-OCT applications demand powerful computing platforms to deliver acceptable performance. Graphics processing units (GPUs) have been used as coprocessors to accelerate FD-OCT by leveraging their relatively simple programming model to exploit thread-level parallelism. Unfortunately, GPUs do not "share" memory with their host processors, requiring additional data transfers between the GPU and CPU. In this paper, we implement a complete FD-OCT accelerator on a consumer grade GPU/CPU platform. Our data acquisition system uses spectrometer-based detection and a dual-arm interferometer topology with numerical dispersion compensation for retinal imaging. We demonstrate that the maximum line rate is dictated by the memory transfer time and not the processing time due to the GPU platform's memory model. Finally, we discuss how the performance trends of GPU-based accelerators compare to the expected future requirements of FD-OCT data rates.
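The transfer-bound behaviour reported above can be illustrated with a back-of-envelope throughput model. All of the numbers below (spectrum size, bus bandwidth, kernel time) are assumptions for illustration, not values from the paper.

```python
# Illustrative throughput model for a GPU FD-OCT pipeline (all numbers are
# assumptions, not from the paper): raw spectra must cross the PCIe bus
# before any processing can start, so the slower of transfer and compute
# sets the achievable A-line rate.

BYTES_PER_SPECTRUM = 2048 * 2      # e.g. 2048-pixel spectra as 16-bit words
PCIE_BANDWIDTH = 3e9               # ~3 GB/s effective host-to-device rate
KERNEL_TIME_PER_LINE = 0.5e-6      # FFT + resampling, per A-line (assumed)

transfer_time = BYTES_PER_SPECTRUM / PCIE_BANDWIDTH
# Without transfer/compute overlap, the slower stage dictates the line rate:
line_rate = 1.0 / max(transfer_time, KERNEL_TIME_PER_LINE)
```

With these assumed numbers the transfer takes longer than the kernel, so the line rate is memory-transfer limited, which is the qualitative conclusion of the paper.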

  4. A real-time GNSS-R system based on software-defined radio and graphics processing units

    NASA Astrophysics Data System (ADS)

    Hobiger, Thomas; Amagai, Jun; Aida, Masanori; Narita, Hideki

    2012-04-01

    Reflected signals of the Global Navigation Satellite System (GNSS) from the sea or land surface can be utilized to deduce and monitor physical and geophysical parameters of the reflecting area. Unlike most other remote sensing techniques, GNSS-Reflectometry (GNSS-R) operates as a passive radar that takes advantage of the increasing number of navigation satellites that broadcast their L-band signals. Most GNSS-R receiver architectures, however, are based on dedicated hardware solutions. Software-defined radio (SDR) technology has advanced in recent years and enabled signal processing in real-time, which makes it an ideal candidate for the realization of a flexible GNSS-R system. Additionally, modern commodity graphics cards, which offer massive parallel computing performance, allow the whole signal processing chain to be handled without interfering with the PC's CPU. Thus, this paper describes a GNSS-R system which has been developed on the principles of software-defined radio supported by General Purpose Graphics Processing Units (GPGPUs), and presents results from initial field tests which confirm the anticipated capability of the system.

  5. Development of microcontroller-based acquisition and processing unit for fiber optic vibration sensor

    NASA Astrophysics Data System (ADS)

    Suryadi; Puranto, P.; Adinanta, H.; Waluyo, T. B.; Priambodo, P. S.

    2017-04-01

    A microcontroller-based acquisition and processing unit (MAPU) has been developed to measure vibration signals from a fiber optic vibration sensor. The MAPU utilizes a 32-bit ARM microcontroller to perform acquisition and processing of the input signal. The input signal is acquired with a 12-bit ADC and processed using the FFT method to extract frequency information. Stability of the MAPU is characterized by supplying a constant input signal at 500 Hz for 29 hours, showing stable operation. To characterize the frequency response, the input signal is swept from 20 to 1000 Hz at 20 Hz intervals. The characterization result shows that the MAPU can detect input signals from 20 to 1000 Hz with a minimum signal of 4 mV RMS. An experiment has been set up that utilizes the MAPU with a singlemode-multimode-singlemode (SMS) fiber optic sensor to detect vibration induced by a transducer in a wooden platform. The experimental result indicates that vibration signals from 20 to 600 Hz have been successfully detected. Due to the limitations of the vibration source used in the experiment, vibration signals above 600 Hz were not detected.
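The FFT-based frequency-extraction step can be sketched as follows. The sample rate and record length are illustrative assumptions, since the abstract does not state the MAPU's acquisition parameters.

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the strongest frequency component of a real-valued signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    spectrum[0] = 0.0                       # ignore the DC component
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

# Simulated 500 Hz vibration sampled at 8 kHz (parameters assumed, not from
# the abstract)
fs = 8000
t = np.arange(4096) / fs
freq = dominant_frequency(np.sin(2 * np.pi * 500.0 * t), fs)
```

With 4096 samples at 8 kHz the bin spacing is about 2 Hz, comfortably finer than the 20 Hz sweep interval described in the abstract.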

  6. Lebedev acceleration and comparison of different photometric models in the inversion of lightcurves for asteroids

    NASA Astrophysics Data System (ADS)

    Lu, Xiao-Ping; Huang, Xiang-Jie; Ip, Wing-Huen; Hsia, Chi-Hao

    2018-04-01

    In the lightcurve inversion process, where an asteroid's physical parameters such as rotational period, pole orientation and overall shape are searched for, numerical calculations of the synthetic photometric brightness based on different shape models are frequently implemented. Lebedev quadrature is an efficient method to numerically calculate a surface integral on the unit sphere. By transforming the surface integral on the Cellinoid shape model to one on the unit sphere, the lightcurve inversion process based on the Cellinoid shape model can be remarkably accelerated. Furthermore, MATLAB code for the lightcurve inversion process based on the Cellinoid shape model is available on GitHub for free download. The photometric models, i.e., the scattering laws, also play an important role in the lightcurve inversion process, although the shape variations of asteroids dominate the morphologies of the lightcurves. Derived from radiative transfer theory, the Hapke model can describe light reflectance behavior from the viewpoint of physics, while there are also many empirical models in numerical applications. Numerical simulations are implemented for the comparison of the Hapke model with three other numerical models: the Lommel-Seeliger, Minnaert, and Kaasalainen models. The results show that the numerical models with simple function expressions can fit the synthetic lightcurves generated from the Hapke model well; this good fit implies that they can be adopted in the lightcurve inversion process for asteroids to improve numerical efficiency and derive results similar to those of the Hapke model.
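The idea of Lebedev quadrature can be illustrated with its smallest rule: the six octahedron vertices with equal weights. This toy rule (not the higher-order rules an actual inversion would use) already integrates polynomials up to degree 3 exactly over the unit sphere.

```python
import numpy as np

# Smallest Lebedev rule: the 6 octahedron vertices with equal weights 1/6.
# It reproduces the mean of any polynomial of degree <= 3 over the unit
# sphere exactly (multiply by 4*pi to get the surface integral).
POINTS = np.array([[1, 0, 0], [-1, 0, 0],
                   [0, 1, 0], [0, -1, 0],
                   [0, 0, 1], [0, 0, -1]], dtype=float)
WEIGHTS = np.full(6, 1.0 / 6.0)

def sphere_mean(f):
    """Quadrature estimate of the mean value of f over the unit sphere."""
    return sum(w * f(p) for w, p in zip(WEIGHTS, POINTS))

mean_x2 = sphere_mean(lambda p: p[0] ** 2)   # exact value is 1/3
```

Higher-order Lebedev rules follow the same pattern, just with more points and non-uniform weights, which is why replacing a shape-model surface integral with one on the unit sphere pays off.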

  7. Gasification of land-based biomass. Final report July 78-December 82

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chynoweth, D.P.; Jerger, D.E.; Conrad, J.R.

    1983-06-01

    The objective of this research was to develop efficient processes for conversion of land-based biomass to methane and other resources. One task was to determine the relative suitability of selected species or feedstocks for biological and thermal gasification processes. The second task was to narrow options for design and operation of the experimental test unit (ETU) on water hyacinth and sludge at Walt Disney World (WDW) and to provide a scientific base for understanding rate- and yield-limiting reactions for biogasification of these feedstocks, (separately and as blends).

  8. Implementation of the HbA1c IFCC unit --from the laboratory to the consumer: The New Zealand experience.

    PubMed

    Florkowski, Christopher; Crooke, Michael; Reed, Maxine

    2014-05-15

    In 2007, an international consensus statement recommended that HbA1c results should be reported world-wide in IFCC units (mmol/mol) and also the more familiar derived percentage units using a master equation. In New Zealand, the HbA1c IFCC units have been successfully implemented and used exclusively since 3rd October 2011 (following a 2 year period of reporting both units) for both patient monitoring and the diagnosis of diabetes, with a diagnostic cut-off of ≥50 mmol/mol. The consultation process in New Zealand dates back to 2003, well before the international recommendations were made. It reflects the close cooperation between the clinical and laboratory communities in New Zealand, particularly through the agency of the New Zealand Society for the Study of Diabetes (NZSSD), a key organisation in New Zealand open to all those involved in the care of people with diabetes and the national advisory body on scientific and clinical diabetes care and standards. There was a phased process of consultation designed to increase familiarity and comfort with the new units and the final step was coupled with the adoption of HbA1c as a diagnostic test with some evidence-based pragmatism around using the rounded cut-off. Genuine clinical engagement is vital in such a process. © 2013.
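The master equation mentioned above is, in its commonly published form, NGSP(%) = 0.09148 × IFCC(mmol/mol) + 2.152. A sketch of the reporting logic, using that equation and the ≥50 mmol/mol diagnostic cut-off from the abstract:

```python
# IFCC -> NGSP master equation (commonly published coefficients), plus the
# New Zealand diagnostic cut-off of >= 50 mmol/mol from the abstract.

def ifcc_to_ngsp(ifcc_mmol_mol):
    """Convert HbA1c from IFCC units (mmol/mol) to NGSP-derived percent."""
    return 0.09148 * ifcc_mmol_mol + 2.152

def meets_diagnostic_cutoff(ifcc_mmol_mol):
    """Apply the New Zealand diagnostic cut-off of >= 50 mmol/mol."""
    return ifcc_mmol_mol >= 50

# The rounded cut-off of 50 mmol/mol corresponds to roughly 6.7% in the
# familiar derived units:
cutoff_percent = round(ifcc_to_ngsp(50), 1)
```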

  9. PID Controller Settings Based on a Transient Response Experiment

    ERIC Educational Resources Information Center

    Silva, Carlos M.; Lito, Patricia F.; Neves, Patricia S.; Da Silva, Francisco A.

    2008-01-01

    An experimental work on controller tuning for chemical engineering undergraduate students is proposed using a small heat exchange unit. Based upon process reaction curves in open-loop configuration, system gain and time constant are determined for first order model with time delay with excellent accuracy. Afterwards students calculate PID…
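One common way to turn the reaction-curve fit into controller settings is the Ziegler-Nichols open-loop rules. The abstract does not say which tuning rules the students apply, so the rules and the example numbers below are illustrative.

```python
# Ziegler-Nichols open-loop PID settings from a first-order-plus-dead-time
# (FOPDT) model fitted to the process reaction curve. The tuning rule and
# the example parameter values are illustrative, not from the paper.

def zn_open_loop_pid(K, tau, theta):
    """PID settings from process gain K, time constant tau, dead time theta."""
    Kp = 1.2 * tau / (K * theta)   # proportional gain
    Ti = 2.0 * theta               # integral (reset) time
    Td = 0.5 * theta               # derivative time
    return Kp, Ti, Td

# Hypothetical heat-exchanger fit: K = 2.0 degC per unit input,
# tau = 60 s, dead time theta = 5 s
Kp, Ti, Td = zn_open_loop_pid(K=2.0, tau=60.0, theta=5.0)
```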

  10. Process Evaluation of Healthy Bodies, Healthy Souls: A Church-Based Health Intervention Program in Baltimore City

    ERIC Educational Resources Information Center

    Wang, H. Echo; Lee, Matthew; Hart, Adante; Summers, Amber C.; Steeves, Elizabeth Anderson; Gittelsohn, Joel

    2013-01-01

    Soaring obesity rates in the United States demand comprehensive health intervention strategies that simultaneously address dietary patterns, physical activity, psychosocial factors and the food environment. Healthy Bodies, Healthy Souls (HBHS) is a church-based, community-participatory, cluster-randomized health intervention trial conducted in…

  11. Understanding Ecology Content Knowledge and Acquiring Science Process Skills through Project-Based Science Instruction

    ERIC Educational Resources Information Center

    Colley, Kabba E.

    2006-01-01

    This activity discusses a two-day unit on ecology implemented during the summer of 2004 using the project-based science instructional (PBSI) approach. Through collaborative fieldwork, group discussions, presentations, and reflections, students planned, implemented, and reported their own scientific investigations on the environmental health of…

  12. Manipulatives-Based Laboratory for Majors Biology – a Hands-On Approach to Understanding Respiration and Photosynthesis †

    PubMed Central

    Boomer, Sarah M.; Latham, Kristin L.

    2011-01-01

    The first course in our year-long introductory series for Biology majors encompasses four learning units: biological molecules and cells, metabolism, genetics, and evolution. Of these, the metabolism unit, which includes respiration and photosynthesis, has shown the lowest student exam scores, least interest, and lowest laboratory ratings. Consequently, we hypothesized that modeling metabolic processes in the laboratory would improve student content learning during this course unit. Specifically, we developed manipulatives-based laboratory exercises that combined paper cutouts, movable blocks, and large diagrams of the cell. In particular, our novel use of connecting LEGO blocks allowed students to move model electrons and phosphates between molecules and within defined spaces of the cell. We assessed student learning using both formal (content indicators and attitude surveys) and informal (the identification of misconceptions or discussions with students) approaches. On the metabolism unit content exam, student performance improved by 46% over pretest scores and by the end of the course, the majority of students rated metabolism as their most-improved (43%) and favorite (33%) subject as compared with other unit topics. The majority of students rated manipulatives-based labs as very helpful, as compared to non-manipulatives-based labs. In this report, we will demonstrate that students made learning gains across all content areas, but most notably in the unit that covered respiration and photosynthesis. PMID:23653756

  13. A dual-channel fusion system of visual and infrared images based on color transfer

    NASA Astrophysics Data System (ADS)

    Pei, Chuang; Jiang, Xiao-yu; Zhang, Peng-wei; Liang, Hao-cong

    2013-09-01

    The increasing availability and deployment of imaging sensors operating in multiple spectra has led to a large research effort in image fusion, resulting in a plethora of pixel-level image fusion algorithms. However, most of these algorithms produce gray or false-color fusion results that are not adapted to human vision. Transferring color from a daytime reference image to obtain a natural-color fusion result is an effective way to solve this problem, but the computational cost of color transfer is high and cannot meet the requirements of real-time image processing. We developed a dual-channel infrared and visual image fusion system based on a TMS320DM642 digital signal processing chip. The system is divided into an image acquisition and registration unit, an image fusion processing unit, a system control unit and an image fusion output unit. The image registration of the dual-channel images is realized by combining hardware and software methods in the system. A false-color image fusion algorithm in RGB color space is used to obtain an R-G fused image, then the system chooses a reference image to transfer color to the fusion result. A color lookup table based on statistical properties of images is proposed to solve the computational complexity problem in color transfer. The mapping calculation between the standard lookup table and the improved color lookup table is simple and needs to be performed only once for a fixed scene. Real-time fusion and natural colorization of infrared and visual images are realized by this system. The experimental result shows that the color-transferred images have a natural color perception to human eyes, and can highlight targets effectively with clear background details. Human observers with this system will be able to interpret the image better and faster, thereby improving situational awareness and reducing target detection time.
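A common baseline for the colour-transfer step is global per-channel statistics matching, in the spirit of Reinhard-style color transfer. The sketch below is that baseline, not the paper's lookup-table implementation; the lookup table effectively precomputes an equivalent mapping once per fixed scene.

```python
import numpy as np

def color_transfer(source, reference):
    """Match each channel's mean and standard deviation of `source` to
    `reference`. Both are (H, W, 3) arrays. This is a global statistics
    transfer; per-channel-in-RGB is a simplification of methods that work
    in a decorrelated color space."""
    src = source.astype(float)
    ref = reference.astype(float)
    out = np.empty(src.shape, dtype=float)
    for c in range(3):
        s, r = src[..., c], ref[..., c]
        scale = r.std() / max(s.std(), 1e-9)   # guard against flat channels
        out[..., c] = (s - s.mean()) * scale + r.mean()
    return out
```

Because the mapping per channel is affine (a scale and an offset), it can be baked into a lookup table over the 0-255 input range, which is what makes the real-time variant cheap.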

  14. ZBB--a new skill for the financial manager.

    PubMed

    Thompson, G B; Pyhrr, P A

    1979-03-01

    Zero-based budgeting (ZBB) is a management decision-making tool currently gaining wide acceptance. ZBB is a budgeting approach which is useful for planning, controlling and coordinating financial and human resources. It involves the re-evaluation of all budgeted activities in terms of priorities established by the management. The traditional process of incremental budgeting differs from ZBB in that only the planned changes are evaluated in the former. In incremental budgeting, the base budget is considered authorized and requires little attention. The ZBB process focuses on the whole budget. This is accomplished by: (1) identifying decision units; (2) evaluating each decision unit in terms of performance, costs, benefits, and alternate means of accomplishing the objectives; (3) ranking the decision packages; and (4) preparing a budget for the highest priority decision packages. The effect of the ZBB approach is that new high-priority programs may be funded by eliminating or reducing existing lower-priority programs. ZBB is viewed as a logical process which can combine many of the elements of good management.

  15. Design of video interface conversion system based on FPGA

    NASA Astrophysics Data System (ADS)

    Zhao, Heng; Wang, Xiang-jun

    2014-11-01

    This paper presents an FPGA-based video interface conversion system that enables inter-conversion between digital and analog video. A Cyclone IV series EP4CE22F17C chip from Altera Corporation is used as the main video processing chip, and a single-chip microcontroller is used as the information interaction control unit between the FPGA and PC. The system is able to encode/decode messages from the PC. Technologies including video decoding/encoding circuits, the bus communication protocol, data stream de-interleaving and de-interlacing, color space conversion and the Camera Link timing generator module of the FPGA are introduced. The system converts the Composite Video Broadcast Signal (CVBS) from the CCD camera into Low Voltage Differential Signaling (LVDS), which is collected by the video processing unit through the Camera Link interface. The processed video signals are then input to the system output board and displayed on the monitor. The current experiment shows that it can achieve high-quality video conversion with minimal board size.

  16. Development of a sterilizing in-place application for a production machine using Vaporized Hydrogen Peroxide.

    PubMed

    Mau, T; Hartmann, V; Burmeister, J; Langguth, P; Häusler, H

    2004-01-01

    The use of steam in sterilization processes is limited by the implementation of heat-sensitive components inside the machines to be sterilized. Alternative low-temperature sterilization methods need to be found and their suitability evaluated. Vaporized Hydrogen Peroxide (VHP) technology was adapted for a production machine consisting of highly sensitive pressure sensors and thermo-labile air tube systems. This new kind of "cold" surface sterilization, known from the Barrier Isolator Technology, is based on the controlled release of hydrogen peroxide vapour into sealed enclosures. A mobile VHP generator was used to generate the hydrogen peroxide vapour. The unit was combined with the air conduction system of the production machine. Terminal vacuum pumps were installed to distribute the gas within the production machine and for its elimination. In order to control the sterilization process, different physical process monitors were incorporated. The validation of the process was based on biological indicators (Geobacillus stearothermophilus). The Limited Spearman Karber Method (LSKM) was used to statistically evaluate the sterilization process. The results show that it is possible to sterilize surfaces in a complex tube system with the use of gaseous hydrogen peroxide. A total microbial reduction of 6 log units was reached.

  17. Video rate morphological processor based on a redundant number representation

    NASA Astrophysics Data System (ADS)

    Kuczborski, Wojciech; Attikiouzel, Yianni; Crebbin, Gregory A.

    1992-03-01

    This paper presents a video rate morphological processor for automated visual inspection of printed circuit boards, integrated circuit masks, and other complex objects. Inspection algorithms are based on gray-scale mathematical morphology. The hardware complexity of the known methods of real-time implementation of gray-scale morphology (the umbra transform and the threshold decomposition) has prompted us to propose a novel technique which applies an arithmetic system without carry propagation. After considering several arithmetic systems, a redundant number representation has been selected for implementation. Two options are analyzed here. The first is a pure signed digit number representation (SDNR) with the base of 4. The second option is a combination of the base-2 SDNR (to represent gray levels of images) and the conventional twos complement code (to represent gray levels of structuring elements). The operating principle of the morphological processor is based on the concept of the digit-level systolic array. Individual processing units and small memory elements create a pipeline. The memory elements store current image windows (kernels). All operation primitives of the processing units apply a unified direction of digit processing: most significant digit first (MSDF). The implementation technology is based on field programmable gate arrays by Xilinx. This paper justifies the rationale for a new approach to logic design, which is the decomposition of Boolean functions instead of Boolean minimization.
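For reference, the gray-scale morphology operations themselves (the primitives the processor pipelines digit-serially) can be sketched for the 1-D case. This NumPy version is purely illustrative and contains none of the SDNR arithmetic the paper is about.

```python
import numpy as np

def gray_dilate(signal, se):
    """1-D gray-scale dilation: (f (+) b)(x) = max_s [ f(x - s) + b(s) ]."""
    n, m = len(signal), len(se)
    pad = m // 2
    f = np.pad(np.asarray(signal, dtype=float), pad, constant_values=-np.inf)
    # Reverse each window to realize f(x - s), then add the structuring element.
    return np.array([np.max(f[i:i + m][::-1] + se) for i in range(n)])

def gray_erode(signal, se):
    """1-D gray-scale erosion: (f (-) b)(x) = min_s [ f(x + s) - b(s) ]."""
    n, m = len(signal), len(se)
    pad = m // 2
    f = np.pad(np.asarray(signal, dtype=float), pad, constant_values=np.inf)
    return np.array([np.min(f[i:i + m] - se) for i in range(n)])
```

With a flat (all-zero) structuring element these reduce to a sliding local maximum and minimum, which is the case most inspection pipelines use.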

  18. Low-Energy, Low-Cost Production of Ethylene by Low- Temperature Oxidative Coupling of Methane

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radaelli, Guido; Chachra, Gaurav; Jonnavittula, Divya

    In this project, we develop a catalytic process technology for distributed small-scale production of ethylene by oxidative coupling of methane at low temperatures using an advanced catalyst. The Low Temperature Oxidative Coupling of Methane (LT-OCM) catalyst system is enabled by a novel chemical catalyst and process pioneered by Siluria, at private expense, over the last six years. Herein, we develop the LT-OCM catalyst system for distributed small-scale production of ethylene by identifying and addressing necessary process schemes, unit operations and process parameters that limit the economic viability and mass penetration of this technology to manufacture ethylene at small-scales. The output of this program is process concepts for small-scale LT-OCM catalyst based ethylene production, lab-scale verification of the novel unit operations adopted in the proposed concept, and an analysis to validate the feasibility of the proposed concepts.

  19. Spreading a medication administration intervention organizationwide in six hospitals.

    PubMed

    Kliger, Julie; Singer, Sara; Hoffman, Frank; O'Neil, Edward

    2012-02-01

    Six hospitals from the San Francisco Bay Area participated in a 12-month quality improvement project conducted by the Integrated Nurse Leadership Program (INLP). A quality improvement intervention that focused on improving medication administration accuracy was spread from two pilot units to all inpatient units in the hospitals. INLP developed a 12-month curriculum, presented in a combination of off-site training sessions and hospital-based training and consultant-led meetings, to teach clinicians the key skills needed to drive organizationwide change. Each hospital established a nurse-led project team, as well as unit teams to address six safety processes designed to improve medication administration accuracy: compare medication to the medication administration record; keep medication labeled throughout; check two patient identifications; explain drug to patient (if applicable); chart immediately after administration; and protect process from distractions and interruptions. From baseline until one year after project completion, the six hospitals improved their medication accuracy rates, on average, from 83.4% to 98.0% in the spread units. The spread units also improved safety processes overall from 83.1% to 97.2%. During the same time, the initial pilot units also continued to improve accuracy from 94.0% to 96.8% and safety processes overall from 95.3% to 97.2%. With thoughtful planning, engaging those doing the work early and focusing on the "human side of change" along with technical knowledge of improvement methodologies, organizations can spread initiatives enterprisewide. This program required significant training of frontline workers in problem-solving skills, leading change, team management, data tracking, and communication.

  20. Improved soft-agar colony assay in a fluid processing apparatus.

    PubMed

    Forsman, A D; Herpich, A R; Chapes, S K

    1999-01-01

    The standard method for quantitating bone marrow precursor cells has been to count the number of colony-forming units that form in semisolid (0.3%) agar. Recently we adapted this assay for use in hardware, the Fluid Processing Apparatus, that is flown in standard payload lockers of the space shuttle. When mouse or rat macrophage colony-forming units were measured with this hardware in ground-based assays, we found significantly more colony growth than that seen in standard plate assays. The improved growth correlates with increased agar thickness but also appears to be due to properties inherent to the Fluid Processing Apparatus. This paper describes an improved method for determining bone marrow macrophage precursor numbers in semisolid agar.

  1. Model of a programmable quantum processing unit based on a quantum transistor effect

    NASA Astrophysics Data System (ADS)

    Ablayev, Farid; Andrianov, Sergey; Fetisov, Danila; Moiseev, Sergey; Terentyev, Alexandr; Urmanchev, Andrey; Vasiliev, Alexander

    2018-02-01

    In this paper we propose a model of a programmable quantum processing device realizable with existing nano-photonic technologies. It can be viewed as a basis for new high-performance hardware architectures. Protocols for the physical implementation of the device, based on controlled photon transfer and atomic transitions, are presented. These protocols are designed for executing basic single-qubit and multi-qubit gates forming a universal set. We analyze the possible operation of this quantum computer scheme. We then formalize the physical architecture by a mathematical model of a Quantum Processing Unit (QPU), which we use as a basis for the Quantum Programming Framework. This framework makes it possible to perform universal quantum computations in a multitasking environment.

  2. MODFLOW-2000, the U.S. Geological Survey Modular Ground-Water Model -Documentation of the Hydrogeologic-Unit Flow (HUF) Package

    USGS Publications Warehouse

    Anderman, E.R.; Hill, M.C.

    2000-01-01

    This report documents the Hydrogeologic-Unit Flow (HUF) Package for the groundwater modeling computer program MODFLOW-2000. The HUF Package is an alternative internal flow package that allows the vertical geometry of the system hydrogeology to be defined explicitly within the model using hydrogeologic units that can be different from the definition of the model layers. The HUF Package works with all the processes of MODFLOW-2000. For the Ground-Water Flow Process, the HUF Package calculates effective hydraulic properties for the model layers based on the hydraulic properties of the hydrogeologic units, which are defined by the user using parameters. The hydraulic properties are used to calculate the conductance coefficients and other terms needed to solve the ground-water flow equation. The sensitivity of the model to the parameters defined within the HUF Package input file can be calculated using the Sensitivity Process, using observations defined with the Observation Process. Optimal values of the parameters can be estimated by using the Parameter-Estimation Process. The HUF Package is nearly identical to the Layer-Property Flow (LPF) Package, the major difference being the definition of the vertical geometry of the system hydrogeology. Use of the HUF Package is illustrated in two test cases, which also serve to verify the performance of the package by showing that the Parameter-Estimation Process produces the true parameter values when exact observations are used.
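
    The exact upscaling equations are given in the report itself; as a generic illustration of the idea (standard hydrogeology, not the HUF code), the effective horizontal conductivity of a model layer spanning several units is typically a thickness-weighted arithmetic mean, while the effective vertical conductivity is a harmonic mean (all numbers below are hypothetical):

```python
def effective_kh(thicknesses, k_values):
    """Thickness-weighted arithmetic mean: horizontal flow passes
    through stacked hydrogeologic units like resistors in parallel."""
    total = sum(thicknesses)
    return sum(b * k for b, k in zip(thicknesses, k_values)) / total

def effective_kv(thicknesses, k_values):
    """Harmonic mean: vertical flow must cross the units in series,
    so the least-permeable unit dominates."""
    total = sum(thicknesses)
    return total / sum(b / k for b, k in zip(thicknesses, k_values))

# Two hypothetical units intersecting one model layer:
# 3 m of sand (K = 10 m/d) over 1 m of clay (K = 0.01 m/d)
b, k = [3.0, 1.0], [10.0, 0.01]
print(effective_kh(b, k))   # dominated by the sand (~7.5 m/d)
print(effective_kv(b, k))   # dominated by the clay (~0.04 m/d)
```

    The contrast between the two means is why defining units independently of model layers matters: a thin clay unit barely changes the layer's horizontal conductance but controls its vertical conductance.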

  3. Do Chinese Readers Follow the National Standard Rules for Word Segmentation during Reading?

    PubMed Central

    Liu, Ping-Ping; Li, Wei-Jun; Lin, Nan; Li, Xing-Shan

    2013-01-01

    We conducted a preliminary study to examine whether Chinese readers’ spontaneous word segmentation processing is consistent with the national standard rules of word segmentation based on the Contemporary Chinese language word segmentation specification for information processing (CCLWSSIP). Participants were asked to segment Chinese sentences into individual words according to their prior knowledge of words. The results showed that Chinese readers did not follow the segmentation rules of the CCLWSSIP, and their word segmentation processing was influenced by the syntactic categories of consecutive words. In many cases, the participants did not consider the auxiliary words, adverbs, adjectives, nouns, verbs, numerals and quantifiers as single word units. Generally, Chinese readers tended to combine function words with content words to form single word units, indicating they were inclined to chunk single words into large information units during word segmentation. Additionally, the “overextension of monosyllable words” hypothesis was tested and it might need to be corrected to some degree, implying that word length have an implicit influence on Chinese readers’ segmentation processing. Implications of these results for models of word recognition and eye movement control are discussed. PMID:23408981

  4. Allograft update: the current status of tissue regulation, procurement, processing, and sterilization.

    PubMed

    McAllister, David R; Joyce, Michael J; Mann, Barton J; Vangsness, C Thomas

    2007-12-01

    Allografts are commonly used during sports medicine surgical procedures in the United States, and their frequency of use is increasing. Based on surgeon reports, it is estimated that more than 60 000 allografts were used in knee surgeries by members of the American Orthopaedic Society for Sports Medicine in 2005. In the United States, there are governmental agencies and other regulatory bodies involved in the oversight of tissue banks. In 2005, the Food and Drug Administration finalized its requirements for current good tissue practice and has mandated new rules regarding the "manufacture" of allogenic tissue. In response to well-publicized infections associated with the implantation of allograft tissue, some tissue banks have developed methods to sterilize allograft tissue. Although many surgeons have significant concerns about the safety of allografts, the majority believe that sterilized allografts are safe but that the sterilization process negatively affects tissue biology and biomechanics. However, most know very little about the principles of sterilization and the proprietary processes currently used in tissue banking. This article will review the current status of allograft tissue regulation, procurement, processing, and sterilization in the United States.

  5. Programmable partitioning for high-performance coherence domains in a multiprocessor system

    DOEpatents

    Blumrich, Matthias A [Ridgefield, CT; Salapura, Valentina [Chappaqua, NY

    2011-01-25

    A multiprocessor computing system and a method of logically partitioning a multiprocessor computing system are disclosed. The multiprocessor computing system comprises a multitude of processing units, and a multitude of snoop units. Each of the processing units includes a local cache, and the snoop units are provided for supporting cache coherency in the multiprocessor system. Each of the snoop units is connected to a respective one of the processing units and to all of the other snoop units. The multiprocessor computing system further includes a partitioning system for using the snoop units to partition the multitude of processing units into a plurality of independent, memory-consistent, adjustable-size processing groups. Preferably, when the processor units are partitioned into these processing groups, the partitioning system also configures the snoop units to maintain cache coherency within each of said groups.
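
    The patent text stays at the block-diagram level; a toy software model of the idea (purely illustrative, not the patented hardware) shows how snoop units confined to a coherence group keep partitions memory-consistent internally while isolating them from each other:

```python
class SnoopUnit:
    """Toy snoop unit: forwards cache invalidations only to snoop
    units assigned to the same coherence group, so each partition is
    memory-consistent internally and isolated from the others."""
    def __init__(self, cpu_id):
        self.cpu_id = cpu_id
        self.group = None
        self.invalidations = []   # addresses this CPU must re-fetch

    def broadcast_invalidate(self, address, peers):
        for peer in peers:
            if peer is not self and peer.group == self.group:
                peer.invalidations.append(address)

def partition(snoops, groups):
    """Assign each snoop unit to a coherence group; group sizes are
    adjustable simply by re-running with a different mapping."""
    for snoop, group in zip(snoops, groups):
        snoop.group = group

snoops = [SnoopUnit(i) for i in range(4)]
partition(snoops, [0, 0, 1, 1])          # two 2-CPU coherence domains
snoops[0].broadcast_invalidate(0x1000, snoops)
# Only CPU 1 (same group) records the invalidation; CPUs 2-3 do not.
```

    Re-running `partition` with, say, `[0, 0, 0, 0]` merges everything back into one coherence domain, which mirrors the "adjustable-size processing groups" of the claim.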

  6. Design and research of sun sensor based on technology of optical fiber

    NASA Astrophysics Data System (ADS)

    Li, Ye; Zhou, Wang; Li, Dan

    2010-08-01

    A kind of sun sensor is designed based on optical fiber. The design consists of three parts: an optical head, a photoelectric sensor and a signal processing unit. The innovation of this design lies in the improvement of the traditional sun sensor: multiple fibers, used as light guides, are symmetrically distributed on the surface of a spacecraft. To determine the attitude of a spacecraft, the sun sensor must measure the direction of the sun. Because the fiber length can be adjusted as needed, the photoelectric sensor can be placed deep inside the spacecraft to protect it against damage by high-energy particles from outer space. The processing unit calculates the difference in sun energy collected by each pair of opposite optical fibers so as to obtain the angle and orientation between the spacecraft and the sun. This sun sensor can accommodate both small and large fields of view. It improves the accuracy for small fields of view and increases the precision of locating a spacecraft. This paper briefly introduces the design of the processing unit. The sun sensor is applicable to detecting the attitude of a spacecraft; in addition, it can also be used in solar tracking systems for PV technology.
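
    The paper's processing algorithm is only summarized here; under the simple assumption (mine, not the authors') that each fiber's collected energy follows a cosine law and the two fibers of a pair are tilted symmetrically from the surface normal, the difference-over-sum of one opposite pair inverts cleanly to the in-plane sun angle:

```python
import math

def sun_angle(i_plus, i_minus, tilt):
    """Recover the in-plane sun angle from one opposite fiber pair.
    Assumed model: I± ∝ cos(theta ∓ tilt), so
        (I+ - I-) / (I+ + I-) = tan(theta) * tan(tilt)
    and theta follows by an arctangent."""
    ratio = (i_plus - i_minus) / (i_plus + i_minus)
    return math.atan(ratio / math.tan(tilt))

# Simulate a sun direction and check the inversion (values illustrative)
tilt = math.radians(30)
theta_true = math.radians(12)
i_plus = math.cos(theta_true - tilt)
i_minus = math.cos(theta_true + tilt)
theta_est = sun_angle(i_plus, i_minus, tilt)
print(round(math.degrees(theta_est), 3))   # recovers ~12 degrees
```

    Because the estimate uses a ratio, a common intensity scale factor (solar distance, fiber losses) cancels, which is one motivation for working with paired fibers.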

  7. Lean Six Sigma to Reduce Intensive Care Unit Length of Stay and Costs in Prolonged Mechanical Ventilation.

    PubMed

    Trzeciak, Stephen; Mercincavage, Michael; Angelini, Cory; Cogliano, William; Damuth, Emily; Roberts, Brian W; Zanotti, Sergio; Mazzarelli, Anthony J

    Patients with prolonged mechanical ventilation (PMV) represent important "outliers" of hospital length of stay (LOS) and costs (∼$26 billion annually in the United States). We tested the hypothesis that a Lean Six Sigma (LSS) approach for process improvement could reduce hospital LOS and the associated costs of care for patients with PMV. Before-and-after cohort study. Multidisciplinary intensive care unit (ICU) in an academic medical center. Adult patients admitted to the ICU and treated with PMV, as defined by diagnosis-related group (DRG). We implemented a clinical redesign intervention based on LSS principles. We identified eight distinct processes in preparing patients with PMV for post-acute care. Our clinical redesign included reengineering daily patient care rounds ("Lean ICU rounds") to reduce variation and waste in these processes. We compared hospital LOS and direct cost per case in patients with PMV before (2013) and after (2014) our LSS intervention. Among 259 patients with PMV (131 preintervention; 128 postintervention), median hospital LOS decreased by 24% during the intervention period (29 vs. 22 days, p < .001). Accordingly, median hospital direct cost per case decreased by 27% ($66,335 vs. $48,370, p < .001). We found that a LSS-based clinical redesign reduced hospital LOS and the costs of care for patients with PMV.

  8. A new state evaluation method of oil pump unit based on AHP and FCE

    NASA Astrophysics Data System (ADS)

    Lin, Yang; Liang, Wei; Qiu, Zeyang; Zhang, Meng; Lu, Wenqing

    2017-05-01

    In order to make an accurate state evaluation of an oil pump unit, a comprehensive evaluation index should be established. A multi-parameter state evaluation method for oil pump units is proposed in this paper. The oil pump unit is analyzed by Failure Mode and Effect Analysis (FMEA), so the evaluation index can be obtained from the FMEA conclusions. The weights of different parameters in the evaluation index are determined using the Analytic Hierarchy Process (AHP) with expert experience. According to the evaluation index and the weight of each parameter, the state evaluation is carried out by Fuzzy Comprehensive Evaluation (FCE), and the state is divided into five levels depending on the status value, by analogy with grades of human health. To verify the effectiveness and feasibility of the proposed method, a state evaluation of an oil pump used in a pump station is taken as an example.
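
    The paper's index and weights are specific to its pump station; a generic sketch of the AHP-plus-FCE pipeline (every number below is hypothetical) shows the mechanics:

```python
def prod(xs):
    out = 1.0
    for x in xs:
        out *= x
    return out

def ahp_weights(pairwise):
    """Approximate the AHP priority vector by the normalized geometric
    means of the rows of the pairwise comparison matrix."""
    n = len(pairwise)
    gmeans = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

def fuzzy_evaluate(weights, membership):
    """FCE: weighted composition of the membership matrix yields the
    degree of membership in each state level."""
    levels = len(membership[0])
    return [sum(w * row[j] for w, row in zip(weights, membership))
            for j in range(levels)]

# Hypothetical 3-parameter index: vibration, temperature, pressure
pairwise = [[1, 3, 5],
            [1 / 3, 1, 2],
            [1 / 5, 1 / 2, 1]]
w = ahp_weights(pairwise)

# Membership of each parameter in 5 state levels (good ... failed)
membership = [[0.1, 0.2, 0.4, 0.2, 0.1],
              [0.5, 0.3, 0.2, 0.0, 0.0],
              [0.3, 0.4, 0.2, 0.1, 0.0]]
scores = fuzzy_evaluate(w, membership)
state = scores.index(max(scores))   # max-membership principle
print(state)                        # the unit's state level (0 = good)
```

    A full AHP application would also check the consistency ratio of the pairwise matrix before trusting the weights; that step is omitted here for brevity.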

  9. Multiscale analysis of the correlation of processing parameters on viscidity of composites fabricated by automated fiber placement

    NASA Astrophysics Data System (ADS)

    Han, Zhenyu; Sun, Shouzheng; Fu, Yunzhong; Fu, Hongya

    2017-10-01

    Viscidity is an important physical indicator for assessing the fluidity of resin, which helps the resin contact the fibers effectively and reduces manufacturing defects during the automated fiber placement (AFP) process. However, the effect of processing parameters on viscidity evolution during AFP has rarely been studied. In this paper, viscidity is analyzed at different scales using a multi-scale analysis method. Firstly, the viscous dissipation energy (VDE) within a meso-unit under different processing parameters is assessed using the finite element method (FEM). According to a multi-scale energy transfer model, the meso-unit energy is used as the boundary condition for microscopic analysis. Furthermore, the molecular structure of the micro-system is built by the molecular dynamics (MD) method, and viscosity curves are then obtained by integrating the stress autocorrelation function (SACF) over time. Finally, the correlation of processing parameters with viscosity is revealed using the grey relational analysis method (GRAM). A group of processing parameters is identified that achieves stable viscosity and better resin fluidity.

  10. Parallel computing method for simulating hydrological processes of large rivers under climate change

    NASA Astrophysics Data System (ADS)

    Wang, H.; Chen, Y.

    2016-12-01

    Climate change is one of the most widely recognized global environmental problems. It has altered the spatial and temporal distribution of watershed hydrological processes, especially in the world's large rivers. Watershed hydrological process simulation based on physically based distributed hydrological models can yield better results than lumped models. However, such simulation involves a large amount of computation, especially for large rivers, and thus needs huge computing resources that may not be steadily available to researchers, or only at high expense; this has seriously restricted research and application. To address this problem, current parallel methods mostly parallelize in the space and time dimensions: based on a distributed hydrological model, they calculate the natural features in order, grid by grid (unit by unit, basin by basin) from upstream to downstream. This article proposes a high-performance computing method for hydrological process simulation with a high speedup ratio and parallel efficiency. It combines the temporal and spatial runoff characteristics of the distributed hydrological model with distributed data storage, an in-memory database, distributed computing, and parallel computing based on computing-power units. The method has strong adaptability and extensibility, meaning it can make full use of computing and storage resources even when resources are limited, and computing efficiency improves linearly as computing resources increase. The method can satisfy the parallel computing requirements of hydrological process simulation in small, medium and large rivers.
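
    The article's own framework is not reproduced in this record; a minimal sketch of the underlying dependency pattern (my simplification, not the authors' system) processes all sub-basins whose upstream results are ready concurrently, level by level from headwaters to outlet:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical river network: each sub-basin lists its upstream parents
upstream = {
    "A": [], "B": [],        # headwater sub-basins, mutually independent
    "C": ["A", "B"],         # confluence
    "D": ["C"],              # outlet
}

def local_runoff(basin):
    """Stand-in for the per-unit hydrological computation."""
    return {"A": 1.0, "B": 2.0, "C": 0.5, "D": 0.25}[basin]

def simulate(network):
    """Level-by-level scheduling: all basins whose upstream results are
    available run in parallel; routed flow is local runoff plus the
    inflow from the parents."""
    flows, done = {}, set()
    with ThreadPoolExecutor() as pool:
        while len(done) < len(network):
            ready = [b for b in network
                     if b not in done and all(p in done for p in network[b])]
            results = pool.map(
                lambda b: (b, local_runoff(b) + sum(flows[p] for p in network[b])),
                ready)
            for b, q in results:
                flows[b] = q
                done.add(b)
    return flows

print(simulate(upstream))   # the outlet "D" accumulates the whole network
```

    In a real large-river model the "ready" sets are large (thousands of independent grid cells or sub-basins per level), which is where the parallel speedup comes from.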

  11. Modification and fixed-point analysis of a Kalman filter for orientation estimation based on 9D inertial measurement unit data.

    PubMed

    Brückner, Hans-Peter; Spindeldreier, Christian; Blume, Holger

    2013-01-01

    A common approach for high accuracy sensor fusion based on 9D inertial measurement unit data is Kalman filtering. State-of-the-art floating-point filter algorithms differ in their computational complexity; nevertheless, real-time operation on a low-power microcontroller at high sampling rates is not possible. This work presents algorithmic modifications to reduce the computational demands of a two-step minimum order Kalman filter. Furthermore, the required bit-width of a fixed-point filter version is explored. For evaluation, real-world data captured using an Xsens MTx inertial sensor are used. Changes in computational latency and orientation estimation accuracy due to the proposed algorithmic modifications and fixed-point number representation are evaluated in detail on a variety of processing platforms, enabling on-board processing on wearable sensor platforms.
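
    The paper's chosen bit-widths are its contribution and are not repeated here; the basic fixed-point machinery such an analysis rests on (a hypothetical Q-format, shown for illustration only) can be sketched as:

```python
def to_fixed(x, frac_bits):
    """Quantize a float to a signed fixed-point integer with
    `frac_bits` fractional bits (round to nearest)."""
    return int(round(x * (1 << frac_bits)))

def fixed_mul(a, b, frac_bits):
    """Multiply two fixed-point values; the raw product carries
    2*frac_bits fractional bits, so shift back down."""
    return (a * b) >> frac_bits

def to_float(x, frac_bits):
    return x / (1 << frac_bits)

FRAC = 15                        # a Q1.15-style format (illustrative)
a = to_fixed(0.70710678, FRAC)   # e.g. a quaternion component
b = to_fixed(0.5, FRAC)
prod = fixed_mul(a, b, FRAC)
err = abs(to_float(prod, FRAC) - 0.70710678 * 0.5)
print(err < 1e-4)                # quantization error stays small
```

    A bit-width exploration like the paper's repeats this for the whole filter state across candidate `FRAC` values and compares the orientation error against the floating-point reference.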

  12. The implementation of a postoperative care process on a neurosurgical unit.

    PubMed

    Douglas, Mary; Rowed, Sheila

    2005-12-01

    The postoperative phase is a critical time for any neurosurgical patient. Historically, certain patients having neurosurgical procedures, such as craniotomies and other more complex surgeries, have been nursed postoperatively in the intensive care unit (ICU) for an overnight stay, prior to transfer to a neurosurgical floor. At the Hospital for Sick Children in Toronto, because of challenges with access to ICU beds and the cancellation of surgeries because of lack of available nurses for the ICU setting, this practice was reexamined. A set of criteria was developed to identify which postoperative patients should come directly to the neurosurgical unit immediately following their anesthetic recovery. The criteria were based on patient diagnosis, preoperative condition, comorbidities, the surgical procedure, intraoperative complications, and postoperative status. A detailed process was then outlined that allowed the optimum patients to be selected for this process to ensure patient safety. Included in this process was a postoperative protocol addressing details such as standard physician orders and the levels of monitoring required. Outcomes of this new process include fewer surgical cancellations for patients and families, equally safe, or better patient care, and the conservation of limited ICU resources. The program has currently been expanded to include patients who have undergone endovascular therapies.

  13. Recent developments in membrane-based separations in biotechnology processes: review.

    PubMed

    Rathore, A S; Shirke, A

    2011-01-01

    Membrane-based separations are the most ubiquitous unit operations in biotech processes. There are several key reasons for this. First, they can be used with a large variety of applications including clarification, concentration, buffer exchange, purification, and sterilization. Second, they are available in a variety of formats, such as depth filtration, ultrafiltration, diafiltration, nanofiltration, reverse osmosis, and microfiltration. Third, they are simple to operate and are generally robust toward normal variations in feed material and operating parameters. Fourth, membrane-based separations typically require lower capital cost when compared to other processing options. As a result of these advantages, a typical biotech process has anywhere from 10 to 20 membrane-based separation steps. In this article we review the major developments that have occurred on this topic with a focus on developments in the last 5 years.

  14. New methods of multimode fiber interferometer signal processing

    NASA Astrophysics Data System (ADS)

    Vitrik, Oleg B.; Kulchin, Yuri N.; Maxaev, Oleg G.; Kirichenko, Oleg V.; Kamenev, Oleg T.; Petrov, Yuri S.

    1995-06-01

    New methods of multimode fiber interferometer signal processing are suggested. For the scheme of a single-fiber multimode interferometer with two excited modes, a method based on a special fiber unit is developed. This unit provides mode interaction and subsequent filtering of the summed optical field. As a result, the amplitude of the output signal is modulated by the external influence on the interferometer. Stabilization of the interferometer sensitivity is achieved by an additional special modulation of the output signal. For the scheme of a single-fiber multimode interferometer with excitation of a wide mode spectrum, the intermode interference signal is registered by a photodiode matrix, and a special electronic unit then performs correlation processing. To eliminate temperature destabilization, the registered signal is adapted to temperature-induced changes of the multimode interferometer's optical signal. The achieved parameters for the double-mode scheme are: temporal stability, 0.6% per hour; sensitivity to interferometer length deviations, 3.2 nm. For the multimode scheme: temperature stability, 0.5%/K; temporal instability, 0.2% per hour; sensitivity to interferometer length deviations, 20 nm; dynamic range, 35 dB.

  15. The integrated design and archive of space-borne signal processing and compression coding

    NASA Astrophysics Data System (ADS)

    He, Qiang-min; Su, Hao-hang; Wu, Wen-bo

    2017-10-01

    With users' increasing demand for extracting information from remote sensing images, it is urgent to significantly enhance the imaging quality and imaging capability of the whole system through integrated design, achieving a compact structure, low mass and higher attitude-maneuvering ability. At present, the remote sensing camera's video signal processing unit and its image compression and coding unit are distributed across different devices; the volume, weight and power consumption of these two units are relatively large, which cannot meet the requirements of a high-mobility remote sensing camera. According to the technical requirements of the high-mobility remote sensing camera, this paper designs a space-borne integrated signal processing and compression circuit by drawing on a variety of technologies, such as high-speed, high-density analog-digital mixed PCB design, embedded DSP technology, and image compression based on special-purpose chips. This circuit lays a solid foundation for research on the high-mobility remote sensing camera.

  16. A mobile unit for memory retrieval in daily life based on image and sensor processing

    NASA Astrophysics Data System (ADS)

    Takesumi, Ryuji; Ueda, Yasuhiro; Nakanishi, Hidenobu; Nakamura, Atsuyoshi; Kakimori, Nobuaki

    2003-10-01

    We developed a mobile unit whose purpose is to support memory retrieval in daily life. In this paper, we describe two characteristic features of this unit: (1) behavior classification with an acceleration sensor, and (2) extraction of environmental differences with image processing technology. In (1), by analyzing the power and frequency of an acceleration sensor oriented along the direction of gravity, the user's activities can be classified into walking, staying, and so on. In (2), by extracting the difference between the beginning and ending scenes of a stay with image processing, the user's action is recognized as a change in the environment. Using these two techniques, specific scenes of daily life can be extracted, and important information at scene changes can be recorded. In particular, we describe the effectiveness of the unit in supporting the retrieval of important items, such as a thing left behind or the state of unfinished work.
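
    The paper's classifier details are not given in this record; a toy version of feature (1), with hypothetical thresholds of my own choosing, classifies a window of gravity-axis acceleration as walking or staying from its AC power alone:

```python
import math

def classify(accel, power_threshold=0.05):
    """Classify a window of gravity-axis acceleration (in g) as
    'walk' or 'stay' from its AC power: walking superimposes a strong
    periodic component on the constant 1 g gravity offset."""
    mean = sum(accel) / len(accel)
    ac_power = sum((a - mean) ** 2 for a in accel) / len(accel)
    return "walk" if ac_power > power_threshold else "stay"

fs = 50                                        # Hz, hypothetical rate
t = [i / fs for i in range(fs * 2)]            # a 2-second window
stay = [1.0 + 0.01 * math.sin(2 * math.pi * 0.3 * ti) for ti in t]
walk = [1.0 + 0.4 * math.sin(2 * math.pi * 2.0 * ti) for ti in t]

print(classify(stay))   # low AC power  -> stay
print(classify(walk))   # high AC power -> walk
```

    A fuller implementation would also inspect the dominant frequency (e.g. via an FFT) to separate walking from other rhythmic activities, as the abstract's mention of frequency analysis suggests.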

  17. Assessment of DoD Enterprise Resource Planning Business Systems

    DTIC Science & Technology

    2011-02-01

    …activities, and processes to the organizational units that execute them
    • Architecture standards, such as application of the BPMN
    • Recommended… Recommendation:
    • Create and use style guides for the development of BPMN-based process models. The style guide would probably include specifications such as:
      o All processes must have ‘entry points’ or ‘triggers’ in the form of BPMN Events
      o All processes must have ‘outcomes’, also in the form of BPMN…

  18. Capture of Fluorescence Decay Times by Flow Cytometry

    PubMed Central

    Naivar, Mark A.; Jenkins, Patrick; Freyer, James P.

    2012-01-01

    In flow cytometry, the fluorescence decay time of an excitable species has been largely underutilized and is not likely found as a standard parameter on any imaging cytometer, sorting, or analyzing system. Most cytometers lack fluorescence lifetime hardware mainly owing to two central issues. Foremost, research and development with lifetime techniques has lacked proper exploitation of modern laser systems, data acquisition boards, and signal processing techniques. Secondly, a lack of enthusiasm for fluorescence lifetime applications in cells and with bead-based assays has persisted among the greater cytometry community. In this unit, we describe new approaches that address these issues and demonstrate the simplicity of digitally acquiring fluorescence relaxation rates in flow. The unit is divided into protocol and commentary sections in order to provide a most comprehensive discourse on acquiring the fluorescence lifetime with frequency-domain methods. The unit covers (i) standard fluorescence lifetime acquisition (protocol-based) with frequency-modulated laser excitation, (ii) digital frequency-domain cytometry analyses, and (iii) interfacing fluorescence lifetime measurements onto sorting systems. Within the unit is also a discussion on how digital methods are used for aliasing in order to harness higher frequency ranges. Also, a final discussion is provided on heterodyning and processing of waveforms for multi-exponential decay extraction. PMID:25419263
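
    The protocols themselves are in the cited unit; the core frequency-domain relations they rest on are standard (the symbols below are the conventional ones, not taken from the unit): a mono-exponential decay with lifetime τ shifts the phase of a modulated excitation by φ = arctan(ωτ) and demodulates it by m = 1/√(1 + (ωτ)²), both of which invert directly:

```python
import math

def lifetime_from_phase(phase_rad, mod_freq_hz):
    """Mono-exponential frequency-domain relation phi = arctan(omega*tau)
    inverted: tau = tan(phi) / omega, with omega = 2*pi*f."""
    omega = 2 * math.pi * mod_freq_hz
    return math.tan(phase_rad) / omega

def lifetime_from_modulation(m, mod_freq_hz):
    """The demodulation ratio m = 1/sqrt(1 + (omega*tau)^2) gives an
    independent estimate; agreement of the two estimates is a classic
    check for a single-exponential decay."""
    omega = 2 * math.pi * mod_freq_hz
    return math.sqrt(1.0 / m ** 2 - 1.0) / omega

f = 10e6                  # 10 MHz modulation frequency (illustrative)
tau_true = 4e-9           # 4 ns, a typical organic-fluorophore lifetime
omega = 2 * math.pi * f
phase = math.atan(omega * tau_true)
m = 1.0 / math.sqrt(1.0 + (omega * tau_true) ** 2)

print(lifetime_from_phase(phase, f))       # recovers ~4e-9 s
print(lifetime_from_modulation(m, f))      # recovers ~4e-9 s
```

    The digital acquisition described in the unit estimates φ and m from the sampled waveforms; multi-exponential decays make the two estimates diverge, which is where the unit's heterodyning and waveform-processing discussion comes in.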

  19. Synthesis and characterization of donor-acceptor copolymers carrying triphenylamine units for photovoltaic applications

    NASA Astrophysics Data System (ADS)

    Neumann, Katharina; Thelakkat, Mukundan

    2012-09-01

    The synthesis and analysis of solution-processable polymers for organic solar cells is crucial for innovative solar cell technologies such as printing processes. In the field of donor materials for photovoltaic applications, polymers based on triphenylamine (TPA) are well-known hole-conducting materials. Here, we synthesized two conjugated TPA-containing copolymers via Suzuki polycondensation. We investigated the tuning of the energy levels of the TPA-based polymers by two different concepts. Firstly, we introduced an acceptor unit in the side chain; the main chain of this copolymer was built from TPA units. The resulting copolymer 2-(4-((4'-((4-(2-ethylhexyloxy)phenyl)(paratolyl) amino)biphenyl-4-yl)(para-tolyl)amino)benzylidene) malononitrile P1 showed a broader absorption up to 550 nm. Secondly, we used a donor-acceptor concept by synthesizing a copolymer with alternating electron-donating TPA and electron-withdrawing thieno[3,4-b]thiophene ester units. Consequently, the absorption maximum in the copolymer octyl-6-(4-((4-(2-ethylhexyloxy)phenyl)(p-tolyl)amino)phenyl)-4-methylthieno[3,4-b]thiophene-2-carboxylate P2 was red-shifted to 580 nm. All three polymers showed high thermal stability. The optical and electrochemical properties of the polymers were analyzed by UV-vis spectroscopy and cyclic voltammetry.

  20. Developing an experimental case in aluminium foils 1100 to determine the maximum angle of formability in a piece by Dieless-SPIF process

    NASA Astrophysics Data System (ADS)

    Gabriel, Paramo; Adrian, Benitez

    2014-07-01

    Incremental sheet forming by the method of single point incremental forming (Dieless-SPIF) is a widely studied process, experimented with and developed in countries with advanced manufacturing technologies, and cost-friendly when the production system is configured around small production batches. The United States, the United Kingdom and France lead this type of study, developing various trials with experimental geometries, in contrast to national environments such as Colombia, Bolivia, Chile, Ecuador and Peru, where this process has been only discretely studied. In view of the above, this work develops an experimental case on a particular geometry in 1100 aluminium foils, identifying the maximum permissible formability angle of the material for forming a piece in one pass, together with an analysis of the forming limit curve (FLC). The objectives are to highlight this innovative method based on CAD-CAM technologies, to compare it with other analogous sheet-metal deformation processes such as embossing, to make sound decisions about the viability and applicability of this (dieless) process to a particular industrial piece responding to the needs of the production configurations mentioned, and to position it as a manufacturing alternative to conventional sheet-metal forming processes such as embossing for systems with small production batches.
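
    The maximum formability angle itself is this paper's experimental output; as background only (standard SPIF theory, not the paper's data), the sine law predicts how wall thickness thins with wall angle, which is why a single-pass angle limit exists at all:

```python
import math

def wall_thickness(t0, wall_angle_deg):
    """Sine law for single-pass incremental forming:
    t = t0 * sin(90 - alpha) = t0 * cos(alpha),
    where alpha is the wall angle from the horizontal."""
    return t0 * math.cos(math.radians(wall_angle_deg))

t0 = 0.5  # mm, an illustrative 1100 aluminium foil thickness
for alpha in (30, 45, 60, 70):
    print(alpha, round(wall_thickness(t0, alpha), 3))
# Thinning accelerates sharply toward steep angles, so each
# material/thickness combination has a maximum one-pass wall angle
# before the sheet fractures.
```

    Experiments like the one described locate that limit empirically, since the sine law ignores material-specific fracture behavior captured by the FLC.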

  1. A Novel Framework Based on FastICA for High Density Surface EMG Decomposition

    PubMed Central

    Chen, Maoqi; Zhou, Ping

    2015-01-01

    This study presents a progressive FastICA peel-off (PFP) framework for high density surface electromyogram (EMG) decomposition. The novel framework is based on a shift-invariant model for describing surface EMG. The decomposition process can be viewed as progressively expanding the set of motor unit spike trains, which is primarily based on FastICA. To overcome the local convergence of FastICA, a “peel off” strategy (i.e. removal of the estimated motor unit action potential (MUAP) trains from the previous step) is used to mitigate the effects of the already identified motor units, so more motor units can be extracted. Moreover, a constrained FastICA is applied to assess the extracted spike trains and correct possible erroneous or missed spikes. These procedures work together to improve the decomposition performance. The proposed framework was validated using simulated surface EMG signals with different motor unit numbers (30, 70, 91) and signal to noise ratios (SNRs) (20, 10, 0 dB). The results demonstrated relatively large numbers of extracted motor units and high accuracies (high F1-scores). The framework was also tested with 111 trials of 64-channel electrode array experimental surface EMG signals during the first dorsal interosseous (FDI) muscle contraction at different intensities. On average 14.1 ± 5.0 motor units were identified from each trial of experimental surface EMG signals. PMID:25775496
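
    The full PFP pipeline is beyond an abstract; the "peel off" step itself — removing an already-identified unit's contribution so weaker units surface — can be illustrated with a deterministic least-squares stand-in for FastICA (all signals below are synthetic and the MUAP shapes are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
muap_a = np.array([0.0, 1.0, -1.0, 0.3])    # large unit's action potential
muap_b = np.array([0.0, 0.4, 0.2, -0.2])    # smaller unit, initially masked

def contribution(times, muap, n):
    """MUAP train: impulse spike train convolved with the MUAP shape."""
    s = np.zeros(n)
    s[list(times)] = 1.0
    return np.convolve(s, muap)[:n]

sig_a = contribution([10, 60, 120, 170], muap_a, n)
sig_b = contribution([30, 90, 150], muap_b, n)
emg = sig_a + sig_b + 0.01 * rng.standard_normal(n)

# Peel-off step: once unit A's spike train and MUAP estimate are known
# (FastICA's job in the real framework), fit its amplitude by least
# squares and subtract its contribution from the recording.
scale = float(np.dot(emg, sig_a) / np.dot(sig_a, sig_a))
residual = emg - scale * sig_a

# The residual is now dominated by the weaker unit B, which a further
# FastICA pass (plus the constrained-FastICA spike check) could extract.
corr_before = abs(np.corrcoef(emg, sig_b)[0, 1])
corr_after = abs(np.corrcoef(residual, sig_b)[0, 1])
print(corr_after > corr_before)   # peeling exposes the masked unit
```

    In the actual framework this loop repeats — extract, validate with constrained FastICA, peel off — progressively expanding the set of identified motor unit spike trains.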

  2. Knowledge Integration in Global R&D Networks

    NASA Astrophysics Data System (ADS)

    Erkelens, Rose; van den Hooff, Bart; Vlaar, Paul; Huysman, Marleen

    This paper reports a qualitative study conducted at multinational organizations' R&D departments about their process of knowledge integration. Taking into account the knowledge based view (KBV) of the firm and the practice-based view of knowledge, and building on the literatures concerning specialization and integration of knowledge in organizations, we explore which factors may have a significant influence on the integration process of knowledge between R&D units. The findings indicate (1) the relevant factors influencing knowledge integration processes and (2) that a thoughtful balance between engineering and emergent approaches is helpful in understanding and overcoming knowledge integration issues.

  3. Teaching physics using project-based engineering curriculum with a theme of alternative energy

    NASA Astrophysics Data System (ADS)

    Tasior, Bryan

    The Next Generation Science Standards (NGSS) provide a new set of science standards that, if adopted, shift the focus from content knowledge-based to skill-based education. Students will be expected to use science to investigate the natural world and solve problems using the engineering design process. The world is also facing an impending crisis related to climate, energy supply and use, and alternative energy development. Education has an opportunity to help provide the much-needed paradigm shift away from our current methods of providing for the energy needs of society. The purpose of this research was to measure the effectiveness of a unit that accomplishes the following objectives: uses project-based learning to teach the engineering process and standards of the NGSS, addresses required content expectations of energy and electricity from the HSCEs, and provides students with the scientific evidence behind issues (both environmental and social/economic) relating to the energy crisis and the current dependence on fossil fuels as our primary energy source. The results of the research indicate that a physics unit can be designed to accomplish these objectives. The unit that was designed, implemented, and reported here also proved highly effective at improving students' science content knowledge and implementing the engineering design standards of the NGSS, while raising awareness, knowledge, and motivation relating to climate and the energy crisis.

  4. Spatially explicit land-use and land-cover scenarios for the Great Plains of the United States

    USGS Publications Warehouse

    Sohl, Terry L.; Sleeter, Benjamin M.; Sayler, Kristi L.; Bouchard, Michelle A.; Reker, Ryan R.; Bennett, Stacie L.; Sleeter, Rachel R.; Kanengieter, Ronald L.; Zhu, Zhi-Liang

    2012-01-01

    The Great Plains of the United States has undergone extensive land-use and land-cover change in the past 150 years, with much of the once vast native grasslands and wetlands converted to agricultural crops, and much of the unbroken prairie now heavily grazed. Future land-use change in the region could have dramatic impacts on ecological resources and processes. A scenario-based modeling framework is needed to support the analysis of potential land-use change in an uncertain future, and to mitigate potentially negative future impacts on ecosystem processes. We developed a scenario-based modeling framework to analyze potential future land-use change in the Great Plains. A unique scenario construction process, using an integrated modeling framework, historical data, workshops, and expert knowledge, was used to develop quantitative demand for future land-use change for four IPCC scenarios at the ecoregion level. The FORE-SCE model ingested the scenario information and produced spatially explicit land-use maps for the region at relatively fine spatial and thematic resolutions. Spatial modeling of the four scenarios provided spatial patterns of land-use change consistent with underlying assumptions and processes associated with each scenario. Economically oriented scenarios were characterized by significant loss of natural land covers and expansion of agricultural and urban land uses. Environmentally oriented scenarios ranged from modest declines to slight increases in natural land covers. Model results were assessed for quantity and allocation disagreement between each scenario pair. In conjunction with the U.S. Geological Survey's Biological Carbon Sequestration project, the scenario-based modeling framework used for the Great Plains is now being applied to the entire United States.

  5. Science Teaching as Educational Interrogation of Scientific Research

    ERIC Educational Resources Information Center

    Ginev, Dimitri

    2013-01-01

    The main argument of this article is that science teaching based on a pedagogy of questions is to be modeled on a hermeneutic conception of scientific research as a process of the constitution of texts. This process is spelled out in terms of hermeneutic phenomenology. A text constituted by scientific practices is at once united by a hermeneutic…

  6. Self-Employment for People with Disabilities in the United States: A Recommended Process for Vocational Rehabilitation Agencies.

    ERIC Educational Resources Information Center

    Arnold, Nancy; Seekins, Tom; Ipsen, Catherine; Colling, Kyle

    2003-01-01

    Recommends a research-based process for rehabilitation agencies assisting clients with self-employment. Steps include counselor-client dialog about self-employment, use of assessment tools and resources, education/training, development of a business plan, start-up funding from the agency and other sources, business start-up, and evaluation of…

  7. Digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.; Stanley, W. D.; Harrington, R. F.

    1980-01-01

    A microprocessor based digital signal processing unit has been proposed to replace analog sections of a microwave radiometer. A brief introduction to the radiometer system involved and a description of problems encountered in the use of digital techniques in radiometer design are discussed. An analysis of the digital signal processor as part of the radiometer is then presented.
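The abstract does not detail the processor design, but the analog sections typically replaced in such a radiometer (square-law detector and integrator) reduce to simple arithmetic on digitized samples. A hedged sketch of a generic Dicke-style digital back-end, purely illustrative and not the NASA design:

```python
import numpy as np

def dicke_output(antenna_samples, reference_samples):
    """Square-law detect and integrate in software: mean power during the
    antenna half-cycle minus mean power during the reference half-cycle."""
    a = np.asarray(antenna_samples, dtype=float)
    r = np.asarray(reference_samples, dtype=float)
    return (a ** 2).mean() - (r ** 2).mean()
```

Differencing against the reference load cancels receiver gain fluctuations, which is the point of Dicke switching; the digital version simply makes the detector and integrator exact arithmetic.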

  8. Using ArcObjects for automating fireshed assessments and analyzing wildfire risk

    Treesearch

    Alan A. Ager; Bernhard Bahro; Mark Finney

    2006-01-01

    Firesheds are geographic units used by the Forest Service to delineate areas with similar fire regimes, fire history, and wildland fire risk issues. Fireshed assessment is a collaborative process where specialists design fuel treatments to mitigate wildfire risk. Fireshed assessments are an iterative process where fuel treatments are proposed for specific stands based...

  9. Friction Stir Weld System for Welding and Weld Repair

    NASA Technical Reports Server (NTRS)

    Ding, R. Jeffrey (Inventor); Romine, Peter L. (Inventor); Oelgoetz, Peter A. (Inventor)

    2001-01-01

    A friction stir weld system for welding and weld repair has a base foundation unit connected to a hydraulically controlled elevation platform and a hydraulically adjustable pin tool. The base foundation unit may be fixably connected to a horizontal surface or may be connected to a mobile support in order to provide mobility to the friction stir welding system. The elevation platform may be utilized to raise and lower the adjustable pin tool about a particular axis. Additional components which may be necessary for the friction stir welding process include back plate tooling, fixturing and/or a roller mechanism.

  10. Real-time track-less Cherenkov ring fitting trigger system based on Graphics Processing Units

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Biagioni, A.; Chiozzi, S.; Cretaro, P.; Cotta Ramusino, A.; Di Lorenzo, S.; Fantechi, R.; Fiorini, M.; Frezza, O.; Gianoli, A.; Lamanna, G.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Neri, I.; Paolucci, P. S.; Pastorelli, E.; Piandani, R.; Piccini, M.; Pontisso, L.; Rossetti, D.; Simula, F.; Sozzi, M.; Vicini, P.

    2017-12-01

    The parallel computing power of commercial Graphics Processing Units (GPUs) is exploited to perform real-time ring fitting at the lowest trigger level using information coming from the Ring Imaging Cherenkov (RICH) detector of the NA62 experiment at CERN. To this purpose, direct GPU communication with a custom FPGA-based board has been used to reduce the data transmission latency. The GPU-based trigger system is currently integrated in the experimental setup of the RICH detector of the NA62 experiment, in order to reconstruct ring-shaped hit patterns. The ring-fitting algorithm running on GPU is fed with raw RICH data only, with no information coming from other detectors, and is able to provide more complex trigger primitives with respect to the simple photodetector hit multiplicity, resulting in a higher selection efficiency. The performance of the system for multi-ring Cherenkov online reconstruction obtained during the NA62 physics run is presented.
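Track-less ring reconstruction of the kind described can be illustrated with an algebraic circle fit over photodetector hit coordinates. The NA62 GPU trigger uses its own multi-ring algorithms, so the single-ring Kåsa fit below is only a stand-in for the idea:

```python
import numpy as np

def fit_ring(x, y):
    """Kasa algebraic circle fit: rewrite (x-a)^2 + (y-b)^2 = r^2 as the
    linear model x^2 + y^2 = 2ax + 2by + c and solve by least squares,
    where c = r^2 - a^2 - b^2."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    sol, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    a, b, c = sol
    return a, b, np.sqrt(c + a ** 2 + b ** 2)
```

Because the fit is a small linear solve per candidate ring, it maps naturally onto one GPU thread (or thread block) per event, which is what makes this class of algorithm attractive at the lowest trigger level.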

  11. Low-SWaP coincidence processing for Geiger-mode LIDAR video

    NASA Astrophysics Data System (ADS)

    Schultz, Steven E.; Cervino, Noel P.; Kurtz, Zachary D.; Brown, Myron Z.

    2015-05-01

    Photon-counting Geiger-mode lidar detector arrays provide a promising approach for producing three-dimensional (3D) video at full motion video (FMV) data rates, resolution, and image size from long ranges. However, coincidence processing required to filter raw photon counts is computationally expensive, generally requiring significant size, weight, and power (SWaP) and also time. In this paper, we describe a laboratory test-bed developed to assess the feasibility of low-SWaP, real-time processing for 3D FMV based on Geiger-mode lidar. First, we examine a design based on field programmable gate arrays (FPGA) and demonstrate proof-of-concept results. Then we examine a design based on a first-of-its-kind embedded graphical processing unit (GPU) and compare performance with the FPGA. Results indicate feasibility of real-time Geiger-mode lidar processing for 3D FMV and also suggest utility for real-time onboard processing for mapping lidar systems.
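Coincidence processing here means rejecting isolated Geiger-mode dark counts by keeping only returns where several photons agree in range. A toy histogram-based filter sketches the idea (an illustrative assumption, not the test-bed's FPGA/GPU algorithm):

```python
import numpy as np

def coincidence_filter(ranges, bin_width, min_count):
    """Histogram photon range returns into bins of bin_width; keep only bins
    where at least min_count photons coincide (likely surface returns),
    rejecting isolated background/dark counts."""
    bins = np.floor(np.asarray(ranges, dtype=float) / bin_width).astype(int)
    vals, counts = np.unique(bins, return_counts=True)
    keep = vals[counts >= min_count]
    return keep * bin_width  # start range of each surviving bin
```

The expensive part in a real system is doing this per pixel per frame at FMV rates, which is why the paper compares FPGA and embedded-GPU implementations.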

  12. Technology development for lunar base water recycling

    NASA Technical Reports Server (NTRS)

    Schultz, John R.; Sauer, Richard L.

    1992-01-01

    This paper will review previous and ongoing work in aerospace water recycling and identify research activities required to support development of a lunar base. The development of a water recycle system for use in the life support systems envisioned for a lunar base will require considerable research work. A review of previous work on aerospace water recycle systems indicates that more efficient physical and chemical processes are needed to reduce expendable and power requirements. Development work on biological processes that can be applied to microgravity and lunar environments also needs to be initiated. Biological processes are inherently more efficient than physical and chemical processes and may be used to minimize resupply and waste disposal requirements. Processes for recovering and recycling nutrients such as nitrogen, phosphorus, and sulfur also need to be developed to support plant growth units. The development of efficient water quality monitors to be used for process control and environmental monitoring also needs to be initiated.

  13. Development studies for a novel wet oxidation process. Phase 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-07-01

    DETOX{sup SM} is a catalyzed wet oxidation process which destroys organic materials in an acidic water solution of iron at 373 to 473 K. The solution can be used repeatedly to destroy great amounts of organic materials. Since the process is conducted in a contained vessel, air emissions from the process can be well controlled. The solution is also capable of dissolving and concentrating many heavy and radioactive metals for eventual stabilization and disposal. The Phase 2 effort for this project is site selection and engineering design for a DETOX demonstration unit. Site selection was made using a set of site selection criteria and evaluation factors. A survey of mixed wastes at DOE sites was conducted using the Interim Mixed Waste Inventory Report. Sites with likely suitable waste types were identified. Potential demonstration sites were ranked based on waste types, interest, regulatory needs, scheduling, ability to provide support, and available facilities. Engineering design for the demonstration unit is in progress and is being performed by Jacobs Applied Technology. The engineering design proceeded through preliminary process flow diagrams (PFDs), calculation of mass and energy balances for representative waste types, process and instrumentation diagrams (P and IDs), preparation of component specifications, and a firm cost estimate for fabrication of the demonstration unit.

  14. Computer-supported weight-based drug infusion concentrations in the neonatal intensive care unit.

    PubMed

    Giannone, Gay

    2005-01-01

    This article addresses the development of a computerized provider order entry (CPOE)-embedded solution for weight-based neonatal drug infusion, created during the transition from a legacy CPOE system to a customized neonatal CPOE product as part of a hospital-wide information system transition. The importance of accurate fluid management in the neonate is reviewed. The process of tailoring the system that eventually resulted in the successful development of a computer application enabling weight-based medication infusion calculation for neonates within the CPOE information system is explored. In addition, the article provides guidelines on how to customize a vendor solution for hospitals with a neonatal intensive care unit.

  15. Pharmaceutical residues in the drinking water supply: modeling residue concentrations in surface waters of drugs prescribed in the United States.

    PubMed

    Guerrero-Preston, Rafael; Brandt-Rauf, Paul

    2008-09-01

    Pharmaceutical residues and other organic wastewater contaminants (OWC) have been shown to survive conventional water-treatment processes and persist in potable water supplies. The objective was to estimate the geographical distribution of the Predicted Environmental Concentration (PEC) of selected drugs prescribed by office-based physicians in the United States (US), after non-metabolized residues have been excreted and processed in wastewater treatment plants. The geographical distribution of the PEC in surface waters of pharmaceutical residues was calculated in four regions of the US. Prescription drug data were obtained from the National Ambulatory Medical Care Survey (NAMCS). The PEC of three drugs prescribed by office-based physicians in the US between 1998 and 2000 was compared to the concentrations of these pharmaceuticals found in a surface water characterization project conducted by the United States Geological Survey between 1999 and 2000. There were 803,185,420 medications prescribed by office-based physicians in the US between 1998 and 2000. Relief-of-pain, hormonal, cardiovascular, and antimicrobial medications followed very similar prescription patterns, both in terms of quantity and geographical distribution. Together these four types of medications account for more than half of the medications prescribed between 1998 and 2000. The concentration of pharmaceutical residues found in the drinking water supply was not significantly correlated with the PEC of pharmaceuticals prescribed by office-based physicians. The geographical distribution of medications prescribed by office-based physicians in the US underlines the need to implement effective public health strategies.

  16. Micromagnetics on high-performance workstation and mobile computational platforms

    NASA Astrophysics Data System (ADS)

    Fu, S.; Chang, R.; Couture, S.; Menarini, M.; Escobar, M. A.; Kuteifan, M.; Lubarda, M.; Gabay, D.; Lomakin, V.

    2015-05-01

    The feasibility of using high-performance desktop and embedded mobile computational platforms is presented, including a multi-core Intel central processing unit, Nvidia desktop graphics processing units, and the Nvidia Jetson TK1 platform. The FastMag finite element method-based micromagnetic simulator is used as a testbed, showing high efficiency on all the platforms. Optimization aspects of improving the performance of the mobile systems are discussed. The high performance, low cost, low power consumption, and rapid performance increase of embedded mobile systems make them a promising candidate for micromagnetic simulations. Such architectures can be used as standalone systems or can be built into low-power computing clusters.

  17. Adaptive real-time methodology for optimizing energy-efficient computing

    DOEpatents

    Hsu, Chung-Hsing [Los Alamos, NM; Feng, Wu-Chun [Blacksburg, VA

    2011-06-28

    Dynamic voltage and frequency scaling (DVFS) is an effective way to reduce energy and power consumption in microprocessor units. Current implementations of DVFS suffer from inaccurate modeling of power requirements and usage, and from inaccurate characterization of the relationships between the applicable variables. A system and method is proposed that adjusts CPU frequency and voltage based on run-time calculations of the workload processing time, as well as a calculation of performance sensitivity with respect to CPU frequency. The system and method are processor independent, and can be applied to either an entire system as a unit, or individually to each process running on a system.
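The patent adjusts frequency and voltage from run-time measurements of workload processing time and of performance sensitivity to CPU frequency. A simplified sketch of the frequency-selection step, under an assumed linear slowdown model (this is an illustration, not the patented method): if a fraction `beta` of run time scales with CPU frequency, then T(f)/T(fmax) ≈ 1 + beta·(fmax/f − 1), and we pick the lowest frequency whose predicted slowdown stays within a budget.

```python
def pick_frequency(freqs, beta, max_slowdown):
    """Lowest available frequency whose predicted slowdown stays within budget,
    under the assumed model T(f)/T(fmax) = 1 + beta * (fmax/f - 1).
    beta is the measured sensitivity of run time to CPU frequency."""
    fmax = max(freqs)
    feasible = [f for f in freqs if beta * (fmax / f - 1) <= max_slowdown]
    return min(feasible)  # fmax is always feasible (slowdown 0)
```

A memory-bound workload (small `beta`) can be run slowly with little penalty; a CPU-bound one (`beta` near 1) must stay near `fmax`.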

  18. Employing OpenCL to Accelerate Ab Initio Calculations on Graphics Processing Units.

    PubMed

    Kussmann, Jörg; Ochsenfeld, Christian

    2017-06-13

    We present an extension of our graphics processing units (GPU)-accelerated quantum chemistry package to employ OpenCL compute kernels, which can be executed on a wide range of computing devices like CPUs, Intel Xeon Phi, and AMD GPUs. Here, we focus on the use of AMD GPUs and discuss differences as compared to CUDA-based calculations on NVIDIA GPUs. First illustrative timings are presented for hybrid density functional theory calculations using serial as well as parallel compute environments. The results show that AMD GPUs are as fast or faster than comparable NVIDIA GPUs and provide a viable alternative for quantum chemical applications.

  19. pKa shifting in double-stranded RNA is highly dependent upon nearest neighbors and bulge positioning.

    PubMed

    Wilcox, Jennifer L; Bevilacqua, Philip C

    2013-10-22

    Shifting of pKa's in RNA is important for many biological processes; however, the driving forces responsible for shifting are not well understood. Herein, we determine how structural environments surrounding protonated bases affect pKa shifting in double-stranded RNA (dsRNA). Using (31)P NMR, we determined the pKa of the adenine in an A(+)·C base pair in various sequence and structural environments. We found a significant dependence of pKa on the base pairing strength of nearest neighbors and the location of a nearby bulge. Increasing nearest neighbor base pairing strength shifted the pKa of the adenine in an A(+)·C base pair higher by an additional 1.6 pKa units, from 6.5 to 8.1, which is well above neutrality. The addition of a bulge two base pairs away from a protonated A(+)·C base pair shifted the pKa by only ~0.5 units less than a perfectly base paired hairpin; however, positioning the bulge just one base pair away from the A(+)·C base pair prohibited formation of the protonated base pair as well as several flanking base pairs. Comparison of data collected at 25 °C and 100 mM KCl to biological temperature and Mg(2+) concentration revealed only slight pKa changes, suggesting that similar sequence contexts in biological systems have the potential to be protonated at biological pH. We present a general model to aid in the determination of the roles protonated bases may play in various dsRNA-mediated processes including ADAR editing, miRNA processing, programmed ribosomal frameshifting, and general acid-base catalysis in ribozymes.
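The practical meaning of shifting a pKa from 6.5 to 8.1 follows from the Henderson-Hasselbalch relation: at biological pH the A(+)·C adenine flips from mostly neutral to mostly protonated. A small worked example:

```python
def protonated_fraction(pKa, pH):
    """Henderson-Hasselbalch: fraction of the base in the protonated form,
    1 / (1 + 10**(pH - pKa))."""
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

# At biological pH 7.4: the shifted pKa of 8.1 leaves the adenine ~83%
# protonated, while the unshifted pKa of 6.5 leaves it only ~11% protonated.
shifted = protonated_fraction(8.1, 7.4)
unshifted = protonated_fraction(6.5, 7.4)
```

This is why nearest-neighbor-driven shifts above neutrality matter for the dsRNA-mediated processes the abstract lists.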

  20. Daily and seasonal trends of electricity and water use on pasture-based automatic milking dairy farms.

    PubMed

    Shortall, J; O'Brien, B; Sleator, R D; Upton, J

    2018-02-01

    The objective of this study was to identify the major electricity and water-consuming components of a pasture-based automatic milking (AM) system and to establish the daily and seasonal consumption trends. Electricity and water meters were installed on 7 seasonal calving pasture-based AM farms across Ireland. Electricity-consuming processes and equipment that were metered for consumption included milk cooling components, air compressors, AM unit(s), auxiliary water heaters, water pumps, lights, sockets, automatic manure scrapers, and so on. On-farm direct water-consuming processes and equipment were metered and included AM unit(s), auxiliary water heaters, tubular coolers, wash-down water pumps, livestock drinking water supply, and miscellaneous water taps. Data were collected and analyzed for the 12-mo period of 2015. The average AM farm examined had 114 cows, milking with 1.85 robots, performing a total of 105 milkings/AM unit per day. Total electricity consumption and costs were 62.6 Wh/L of milk produced and 0.91 cents/L, respectively. Milking (vacuum and milk pumping, within-AM unit water heating) had the largest electrical consumption at 33%, followed by air compressing (26%), milk cooling (18%), auxiliary water heating (8%), water pumping (4%), and other electricity-consuming processes (11%). Electricity costs followed a similar trend to that of consumption, with the milking process and water pumping accounting for the highest and lowest cost, respectively. The pattern of daily electricity consumption was similar across the lactation periods, with peak consumption occurring at 0100, 0800, and between 1300 and 1600 h. The trends in seasonal electricity consumption followed the seasonal milk production curve. Total water consumption was 3.7 L of water/L of milk produced. Water consumption associated with the dairy herd at the milking shed represented 42% of total water consumed on the farm. 
Daily water consumption trends indicated consumption to be lowest in the early morning period (0300-0600 h), followed by spikes in consumption between 1100 and 1400 h. Seasonal water trends followed the seasonal milk production curve, except for the month of May, when water consumption was reduced due to above-average rainfall. This study provides a useful insight into the consumption of electricity and water on pasture-based AM farms, while also facilitating the development of future strategies and technologies likely to increase the sustainability of AM systems. The Authors. Published by the Federation of Animal Science Societies and Elsevier Inc. on behalf of the American Dairy Science Association®. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/3.0/).
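The reported percentage shares can be turned into absolute electricity intensities by simple arithmetic on the study's total of 62.6 Wh per litre of milk (the shares below are the figures quoted in the abstract):

```python
# Reported total electricity intensity and its breakdown by process.
total_wh_per_litre = 62.6
shares = {
    "milking": 0.33,                 # vacuum/milk pumping, within-AM-unit heating
    "air compressing": 0.26,
    "milk cooling": 0.18,
    "auxiliary water heating": 0.08,
    "water pumping": 0.04,
    "other": 0.11,
}
wh_per_litre = {k: total_wh_per_litre * v for k, v in shares.items()}
```

For example, the milking process alone works out to roughly 20.7 Wh per litre of milk produced.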

  1. A GIS based method for soil mapping in Sardinia, Italy: a geomatic approach.

    PubMed

    Vacca, A; Loddo, S; Melis, M T; Funedda, A; Puddu, R; Verona, M; Fanni, S; Fantola, F; Madrau, S; Marrone, V A; Serra, G; Tore, C; Manca, D; Pasci, S; Puddu, M R; Schirru, P

    2014-06-01

    A new project was recently initiated for the realization of the "Land Unit and Soil Capability Map of Sardinia" at a scale of 1:50,000 to support land use planning. In this study, we outline the general structure of the project and the methods used in the activities that have been thus far conducted. A GIS approach was used. We used the soil-landscape paradigm for the prediction of soil classes and their spatial distribution, or the prediction of soil properties, based on landscape features. The work is divided into two main phases. In the first phase, the available digital data on land cover, geology and topography were processed and classified according to their influence on weathering processes and soil properties. The methods used in the interpretation are based on consolidated and generalized knowledge about the influence of geology, topography and land cover on soil properties. The existing soil data (areal and point data) were collected, reviewed, validated and standardized according to international and national guidelines. Point data considered to be usable were input into a specific database created for the project. Using expert interpretation, all digital data were merged to produce a first draft of the Land Unit Map. During the second phase, this map will be integrated with the existing soil data and verified in the field, collecting new soil data where needed, and the final Land Unit Map will be produced. The Land Unit and Soil Capability Map will be produced by classifying the land units using a reference matching table of land capability classes created for this project. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Students Advise Fortune 500 Company: Designing a Problem-Based Learning Community

    ERIC Educational Resources Information Center

    Brzovic, Kathy; Matz, S. Irene

    2009-01-01

    This article describes the process of planning and implementing a problem-based learning community. Business and communication students from a large university in the Western United States competed in teams to solve an authentic business problem posed by a Fortune 500 company. The company's willingness to adopt some of their recommendations…

  3. Social processes underlying acculturation: a study of drinking behavior among immigrant Latinos in the Northeast United States

    PubMed Central

    LEE, CHRISTINA S.; LÓPEZ, STEVEN REGESER; COBLY, SUZANNE M.; TEJADA, MONICA; GARCÍA-COLL, CYNTHIA; SMITH, MARCIA

    2010-01-01

    Study Goals To identify social processes that underlie the relationship of acculturation and heavy drinking behavior among Latinos who have immigrated to the Northeast United States of America (USA). Method Community-based recruitment strategies were used to identify 36 Latinos who reported heavy drinking. Participants were 48% female, 23 to 56 years of age, and were from South or Central America (39%) and the Caribbean (24%). Six focus groups were audiotaped and transcribed. Results Content analyses indicated that the social context of drinking is different in the participants’ countries of origin and in the United States. In Latin America, alcohol consumption was part of everyday living (being with friends and family). Nostalgia and isolation reflected some of the reasons for drinking in the USA. Results suggest that drinking in the Northeastern United States (US) is related to Latinos’ adaptation to a new sociocultural environment. Knowledge of the shifting social contexts of drinking can inform health interventions. PMID:20376331

  4. Hydration-induced crystalline transformation of starch polymer under ambient conditions.

    PubMed

    Qiao, Dongling; Zhang, Binjia; Huang, Jing; Xie, Fengwei; Wang, David K; Jiang, Fatang; Zhao, Siming; Zhu, Jie

    2017-10-01

    With synchrotron small/wide-angle X-ray scattering (SAXS/WAXS), we revealed that post-harvest hydration at ambient conditions can further alter the starch crystalline structure. The hydration process induced the alignment of starch helices into crystalline lamellae, irrespective of the starch type (A- or B-). In this process, non-crystalline helices were probably packed with water molecules to form new crystal units, thereby enhancing the overall degree of starch crystallinity. In particular, a fraction of the monoclinic crystal units of the A-type starches encapsulated water molecules during hydration, leading to the outward movement of starch helices. Such movement resulted in the transformation of monoclinic units into hexagonal units, which are associated with B-type crystallites. Hence, hydration under ambient conditions could enhance the B-polymorphic features of both A-type and B-type starches. The new knowledge obtained here may guide the design of biopolymer-based liquid crystal materials with controlled lattice regularity and desired features. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Linam Ranch cryogenic gas plant: A design and operating retrospective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harwell, L.J.; Kuscinski, J.

    1999-07-01

    GPM Gas Corporation's Linam Ranch Gas Plant is the processing hub of their southeastern New Mexico gathering system, producing a y-grade NGL product which is pipelined primarily to the Phillips petrochemical complex at Sweeney, Texas. GPM acquired the facility near Hobbs, N.M. late in 1994 when it was still operating as a refrigerated lean oil plant, renamed it, and commenced an upgrade project culminating in its conversion to a high recovery cryogenic facility in early 1996 with a processing capacity of 150 MMscfd. Facilities that were upgraded included inlet liquids receiving and handling, the amine system, mol sieve dehydration, the sulfur recovery unit, inlet compression, and the propane refrigeration system. A Foxboro I/A DCS was also placed into operation. The lean oil system was replaced with a high recovery turboexpander unit supplied by KTI Fish based on their Flash Vapor Reflux (FVR) process. Resulting ethane recovery was greater than 95% for the new facilities. New residue compression units were installed including steam generators on the turbine exhausts, which complemented the existing plant steam system. During the three years since conversion to cryogenic operation, GPM has steadily improved plant operations. Expansion of the mol sieve dehydration system and retrofit of evaporation combustion air cooling on gas turbines have expanded nameplate capacity to 170 MMscfd while maintaining ethane recovery at 95%. Future expansion to 200 MMscfd with high recovery is achievable. In addition, creative use of the Foxboro DCS has been employed to implement advanced control schemes for handling inlet liquid slugs, gas and amine balancing for parallel amine contactors, improved sulfur recovery unit (SRU) trim air control, and constraint-based process optimization to maximize horsepower utilization and ethane recovery. Some challenges remain, leaving room for additional improvements.
However, GPM's progress so far has resulted in a current ethane recovery level in excess of 97% when processing gas at the original design throughput of 150 MMscfd.

  6. CLABSI Conversations: Lessons From Peer-to-Peer Assessments to Reduce Central Line-Associated Bloodstream Infections.

    PubMed

    Pham, Julius Cuong; Goeschel, Christine A; Berenholtz, Sean M; Demski, Renee; Lubomski, Lisa H; Rosen, Michael A; Sawyer, Melinda D; Thompson, David A; Trexler, Polly; Weaver, Sallie J; Weeks, Kristina R; Pronovost, Peter J

    2016-01-01

    A national collaborative helped many hospitals dramatically reduce central line-associated bloodstream infections (CLABSIs), but some hospitals struggled to reduce infection rates. This article describes the development of a peer-to-peer assessment process (CLABSI Conversations) and the practical, actionable practices we discovered that helped intensive care unit teams achieve a CLABSI rate of less than 1 infection per 1000 catheter-days for at least 1 year. CLABSI Conversations was designed as a learning-oriented process, in which a team of peers visited hospitals to surface barriers to infection prevention and to share best practices and insights from successful intensive care units. Common practices led to 10 recommendations: executive and board leaders communicate the goal of zero CLABSI throughout the hospital; senior and unit-level leaders hold themselves accountable for CLABSI rates; unit physicians and nurse leaders own the problem; clinical leaders and infection preventionists build infection prevention training and simulation programs; infection preventionists participate in unit-based CLABSI reduction efforts; hospital managers make compliance with best practices easy; clinical leaders standardize the hospital's catheter insertion and maintenance practices and empower nurses to stop any potentially harmful acts; unit leaders and infection preventionists investigate CLABSIs to identify root causes; and unit nurses and staff audit catheter maintenance policies and practices.

  7. Reliability modelling and analysis of a multi-state element based on a dynamic Bayesian network

    PubMed Central

    Xu, Tingxue; Gu, Junyuan; Dong, Qi; Fu, Linyu

    2018-01-01

This paper presents a quantitative reliability modelling and analysis method for multi-state elements based on a combination of the Markov process and a dynamic Bayesian network (DBN), taking perfect repair, imperfect repair and condition-based maintenance (CBM) into consideration. The Markov models of elements without repair and under CBM are established, and an absorbing set is introduced to determine the reliability of the repairable element. According to the state-transition relations between the states determined by the Markov process, a DBN model is built. In addition, its parameters for series and parallel systems, namely, conditional probability tables, can be calculated by referring to the conditional degradation probabilities. Finally, the power of a control unit in a failure model is used as an example. A dynamic fault tree (DFT) is translated into a Bayesian network model, and subsequently extended to a DBN. The results show the state probabilities of an element and of the system without repair, with perfect and imperfect repair, and under CBM; the absorbing-set results are obtained from the differential equations and verified. Using forward inference, the reliability of the control unit is determined under the different maintenance modes, and the weak nodes of the control unit are identified. PMID:29765629
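The Markov-with-absorbing-set idea in the abstract can be illustrated with a minimal sketch (not the paper's exact model): a three-state degrading element whose failure state is absorbing, with reliability at each step being the probability of not yet having been absorbed. All transition probabilities here are hypothetical.

```python
import numpy as np

# Illustrative 3-state element: 0 = good, 1 = degraded, 2 = failed (absorbing).
# Rows of P are the per-step transition probabilities (hypothetical values).
P = np.array([
    [0.95, 0.04, 0.01],   # good     -> good / degraded / failed
    [0.00, 0.90, 0.10],   # degraded -> degraded / failed (no repair)
    [0.00, 0.00, 1.00],   # failed is absorbing
])

def reliability(p0, P, steps):
    """Propagate the state distribution; reliability = 1 - P(failed)."""
    dist = np.asarray(p0, dtype=float)
    rel = []
    for _ in range(steps):
        dist = dist @ P
        rel.append(1.0 - dist[2])
    return rel

rel = reliability([1.0, 0.0, 0.0], P, 10)
```

Because the failure state is absorbing and no repair transition exists, the reliability sequence is non-increasing; adding repair transitions (perfect, imperfect, or CBM-triggered) changes that behaviour, which is what the paper's DBN extension captures.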

  8. Simultaneous Semi-Distributed Model Calibration Guided by ...

    EPA Pesticide Factsheets

Modelling approaches to transfer hydrologically-relevant information from locations with streamflow measurements to locations without such measurements continue to be an active field of research for hydrologists. The Pacific Northwest Hydrologic Landscapes (PNW HL) provide a solid conceptual classification framework based on our understanding of dominant processes. A Hydrologic Landscape code (a 5-letter descriptor based on physical and climatic properties) describes each assessment unit, and these units average 60 km² in area. The core function of these HL codes is to relate and transfer hydrologically meaningful information between watersheds without the need for streamflow time series. We present a novel approach based on the HL framework to answer the question "How can we calibrate models across separate watersheds simultaneously, guided by our understanding of dominant processes?". We should be able to apply the same parameterizations to assessment units of common HL codes if 1) the Hydrologic Landscapes contain hydrologic information transferable between watersheds at a sub-watershed scale and 2) we use a conceptual hydrologic model and parameters that reflect the hydrologic behavior of a watershed. This work specifically tests the ability to use HL codes to inform and share model parameters across watersheds in the Pacific Northwest. EPA's Western Ecology Division has published and is refining a framework for defining la

  9. A neural circuit transforming temporal periodicity information into a rate-based representation in the mammalian auditory system.

    PubMed

    Dicke, Ulrike; Ewert, Stephan D; Dau, Torsten; Kollmeier, Birger

    2007-01-01

    Periodic amplitude modulations (AMs) of an acoustic stimulus are presumed to be encoded in temporal activity patterns of neurons in the cochlear nucleus. Physiological recordings indicate that this temporal AM code is transformed into a rate-based periodicity code along the ascending auditory pathway. The present study suggests a neural circuit for the transformation from the temporal to the rate-based code. Due to the neural connectivity of the circuit, bandpass shaped rate modulation transfer functions are obtained that correspond to recorded functions of inferior colliculus (IC) neurons. In contrast to previous modeling studies, the present circuit does not employ a continuously changing temporal parameter to obtain different best modulation frequencies (BMFs) of the IC bandpass units. Instead, different BMFs are yielded from varying the number of input units projecting onto different bandpass units. In order to investigate the compatibility of the neural circuit with a linear modulation filterbank analysis as proposed in psychophysical studies, complex stimuli such as tones modulated by the sum of two sinusoids, narrowband noise, and iterated rippled noise were processed by the model. The model accounts for the encoding of AM depth over a large dynamic range and for modulation frequency selective processing of complex sounds.

  10. Lessons Learned in Promoting Evidence-Based Public Health: Perspectives from Managers in State Public Health Departments.

    PubMed

    Allen, Peg; Jacob, Rebekah R; Lakshman, Meenakshi; Best, Leslie A; Bass, Kathryn; Brownson, Ross C

    2018-03-02

    Evidence-based public health (EBPH) practice, also called evidence-informed public health, can improve population health and reduce disease burden in populations. Organizational structures and processes can facilitate capacity-building for EBPH in public health agencies. This study involved 51 structured interviews with leaders and program managers in 12 state health department chronic disease prevention units to identify factors that facilitate the implementation of EBPH. Verbatim transcripts of the de-identified interviews were consensus coded in NVIVO qualitative software. Content analyses of coded texts were used to identify themes and illustrative quotes. Facilitator themes included leadership support within the chronic disease prevention unit and division, unit processes to enhance information sharing across program areas and recruitment and retention of qualified personnel, training and technical assistance to build skills, and the ability to provide support to external partners. Chronic disease prevention leaders' role modeling of EBPH processes and expectations for staff to justify proposed plans and approaches were key aspects of leadership support. Leaders protected staff time in order to identify and digest evidence to address the common barrier of lack of time for EBPH. Funding uncertainties or budget cuts, lack of political will for EBPH, and staff turnover remained challenges. In conclusion, leadership support is a key facilitator of EBPH capacity building and practice. Section and division leaders in public health agencies with authority and skills can institute management practices to help staff learn and apply EBPH processes and spread EBPH with partners.

  11. Impact of an oil-based lubricant on the effectiveness of the sterilization processes .

    PubMed

    Rutala, William A; Gergen, Maria F; Weber, David J

    2008-01-01

    Surgical instruments, including hinged instruments, were inoculated with test microorganisms (ie, methicillin-resistant Staphylococcus aureus, approximately 2 x 10(6) colony-forming units [cfu]; Pseudomonas aeruginosa, approximately 3 x 10(6) cfu; Escherichia coli, approximately 2 x 10(5) cfu; vancomycin-resistant enterococci, 1 x 10(5) cfu; Geobacillus stearothermophilus spores, 2 x 10(5) cfu or more; or Bacillus atrophaeus spores, 9 x 10(4) cfu or more), coated with an oil-based lubricant (hydraulic fluid), subjected to a sterilization process, and then samples from the instruments were cultured. We found that the oil-based lubricant did not alter the effectiveness of the sterilization process because high numbers of clinically relevant bacteria and standard test spores (which are relatively resistant to the sterilization process) were inactivated.

  12. Improve the Efficiency of the Service Process as a Result of the Muda Ideology

    NASA Astrophysics Data System (ADS)

    Lorenc, Augustyn; Przyłuski, Krzysztof

    2018-06-01

The aim of the paper was to improve service processes carried out by Knorr-Bremse Systemy Kolejowe Polska sp. z o.o., particularly to reduce unnecessary movements and physical effort by employees. The indirect goal was to find a solution in the simplest possible way using the Muda ideology. To improve the service process, process mapping was first carried out for the devices to be repaired, i.e. brake callipers, electro-hydraulic units and auxiliary release units. The processes were assessed with a Pareto-Lorenz analysis in order to determine the most time-consuming process. Based on the obtained results, the use of a column crane with an articulated arm was proposed to facilitate the transfer of heavy components between areas. The final step was to assess the effectiveness of the proposed solution in terms of time saving. The results of the analysis are important from the company's perspective: the proposed solution not only reduces total service time but also improves employees' working comfort.

  13. Methane and Hydrogen Production from Anaerobic Fermentation of Municipal Solid Wastes

    NASA Astrophysics Data System (ADS)

    Kobayashi, Takuro; Lee, Dong-Yeol; Xu, Kaiqin; Li, Yu-You; Inamori, Yuhei

Methane and hydrogen production was investigated in batch experiments of thermophilic methane and hydrogen fermentation, using domestic garbage and food processing waste classified by fat/carbohydrate balance as the base material. Methane production per unit of VS added was significantly positively correlated with fat content and negatively correlated with carbohydrate content in the substrate, and the average methane production per unit of VS added from fat-rich materials was twice as large as that from carbohydrate-rich materials. By contrast, hydrogen production per unit of VS added was significantly positively correlated with carbohydrate content and negatively correlated with fat content. Principal component analysis using the results obtained in this study enabled an evaluation of substrates for methane and hydrogen fermentation based on nutrient composition.

  14. Massively Parallel Signal Processing using the Graphics Processing Unit for Real-Time Brain-Computer Interface Feature Extraction.

    PubMed

    Wilson, J Adam; Williams, Justin C

    2009-01-01

The clock speeds of modern computer processors have nearly plateaued in the past 5 years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card [graphics processing unit (GPU)] was developed for real-time neural signal processing of a brain-computer interface (BCI). The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter), followed by calculating the power spectral density on every channel using an auto-regressive method, and finally classifying appropriate features for control. In this study, the first two computationally intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a central processing unit-based implementation that uses multi-threading. Significant performance gains were obtained with GPU processing: the current implementation processed 250 ms of data from 1000 channels in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.
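The two offloaded steps of the chain can be sketched on the CPU with NumPy for clarity; in the paper they run on the GPU via CUDA, and the autoregressive spectral estimator is replaced here by a plain FFT periodogram. All sizes and the choice of spatial filter are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ch, n_samp = 16, 250                 # channels x samples (e.g. 250 ms at 1 kHz)
raw = rng.standard_normal((n_ch, n_samp))

# Step 1: spatial filter as a matrix-matrix product
# (here a common-average-reference filter, as one hypothetical choice).
W = np.eye(n_ch) - np.full((n_ch, n_ch), 1.0 / n_ch)
filtered = W @ raw

# Step 2: per-channel power spectral density estimate
# (FFT periodogram standing in for the paper's autoregressive method).
psd = np.abs(np.fft.rfft(filtered, axis=1)) ** 2 / n_samp
```

Both steps are embarrassingly parallel across channels and frequency bins, which is why moving them to the GPU yields the large speedups reported.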

  15. Maritime Domain Awareness: C4I for the 1000 Ship Navy

    DTIC Science & Technology

    2009-12-04


  16. Recent progress in continuous and semi-continuous processing of solid oral dosage forms: a review.

    PubMed

    Teżyk, Michał; Milanowski, Bartłomiej; Ernst, Andrzej; Lulek, Janina

    2016-08-01

Continuous processing is an innovative production concept well known and successfully used in other industries for many years. The modern pharmaceutical industry is facing the challenge of transitioning from a traditional manufacturing approach based on batch-wise production to a continuous manufacturing model. The aim of this article is to present technological progress in manufacturing based on continuous and semi-continuous processing of solid oral dosage forms. Single unit processes that offer an alternative processing pathway to batch-wise technology, or that with some modification may run continuously and thus switch seamlessly to continuous manufacturing, are briefly presented. Furthermore, the concept of semi-continuous processing is discussed. Subsequently, more sophisticated production systems created by coupling single unit processes and comprising all the steps of production, from powder to final dosage form, are reviewed. Finally, attempts at an end-to-end production approach, meaning the linking of continuous synthesis of the API from intermediates with the production of the final dosage form, are described. A growing number of scientific articles show an increasing interest in changing the approach to the production of pharmaceuticals in recent years. These publications are a source of information on the progress of knowledge and achievements of continuous processing, often dealing with how to modify or replace unit processes in order to enable seamlessly switching them to continuous processing. A growing number of research papers concentrate on integrated continuous manufacturing lines in which the production concept of "from powder to tablet" is realized.
Four main domains are under investigation: influence of process parameters on intermediates or final dosage forms properties, implementation of process analytical tools, control-managing system responsible for keeping continuous materials flow through the whole manufacturing process and the development of new computational methods to assess or simulate these new manufacturing techniques. The attempt to connect the primary and secondary production steps proves that development of continuously operating lines is possible. A mind-set change is needed to be able to face, and fully assess, the advantages and disadvantages of switching from batch to continuous mode production.

  17. USGS Geospatial Fabric and Geo Data Portal for Continental Scale Hydrology Simulations

    NASA Astrophysics Data System (ADS)

    Sampson, K. M.; Newman, A. J.; Blodgett, D. L.; Viger, R.; Hay, L.; Clark, M. P.

    2013-12-01

    This presentation describes use of United States Geological Survey (USGS) data products and server-based resources for continental-scale hydrologic simulations. The USGS Modeling of Watershed Systems (MoWS) group provides a consistent national geospatial fabric built on NHDPlus. They have defined more than 100,000 hydrologic response units (HRUs) over the continental United States based on points of interest (POIs) and split into left and right bank based on the corresponding stream segment. Geophysical attributes are calculated for each HRU that can be used to define parameters in hydrologic and land-surface models. The Geo Data Portal (GDP) project at the USGS Center for Integrated Data Analytics (CIDA) provides access to downscaled climate datasets and processing services via web-interface and python modules for creating forcing datasets for any polygon (such as an HRU). These resources greatly reduce the labor required for creating model-ready data in-house, contributing to efficient and effective modeling applications. We will present an application of this USGS cyber-infrastructure for assessments of impacts of climate change on hydrology over the continental United States.

  18. Hydrologic classification of rivers based on cluster analysis of dimensionless hydrologic signatures: Applications for environmental instream flows

    NASA Astrophysics Data System (ADS)

    Praskievicz, S. J.; Luo, C.

    2017-12-01

    Classification of rivers is useful for a variety of purposes, such as generating and testing hypotheses about watershed controls on hydrology, predicting hydrologic variables for ungaged rivers, and setting goals for river management. In this research, we present a bottom-up (based on machine learning) river classification designed to investigate the underlying physical processes governing rivers' hydrologic regimes. The classification was developed for the entire state of Alabama, based on 248 United States Geological Survey (USGS) stream gages that met criteria for length and completeness of records. Five dimensionless hydrologic signatures were derived for each gage: slope of the flow duration curve (indicator of flow variability), baseflow index (ratio of baseflow to average streamflow), rising limb density (number of rising limbs per unit time), runoff ratio (ratio of long-term average streamflow to long-term average precipitation), and streamflow elasticity (sensitivity of streamflow to precipitation). We used a Bayesian clustering algorithm to classify the gages, based on the five hydrologic signatures, into distinct hydrologic regimes. We then used classification and regression trees (CART) to predict each gaged river's membership in different hydrologic regimes based on climatic and watershed variables. Using existing geospatial data, we applied the CART analysis to classify ungaged streams in Alabama, with the National Hydrography Dataset Plus (NHDPlus) catchment (average area 3 km2) as the unit of classification. The results of the classification can be used for meeting management and conservation objectives in Alabama, such as developing statewide standards for environmental instream flows. Such hydrologic classification approaches are promising for contributing to process-based understanding of river systems.
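Two of the five dimensionless signatures named in the abstract can be computed with a minimal sketch on a synthetic daily series; the baseflow separation used here (a 7-day running minimum) is a simplified stand-in, not the study's exact definition, and the data are random.

```python
import numpy as np

rng = np.random.default_rng(1)
precip = rng.gamma(0.5, 8.0, 365)              # daily precipitation (mm)
flow = 0.4 * precip + 1.0 + rng.random(365)    # synthetic daily streamflow (mm)

# Runoff ratio: long-term average streamflow over long-term average precipitation.
runoff_ratio = flow.mean() / precip.mean()

# Baseflow index: ratio of baseflow to total streamflow, with baseflow
# crudely estimated as the running minimum over a 7-day window.
base = np.array([flow[max(0, i - 3): i + 4].min() for i in range(365)])
baseflow_index = base.sum() / flow.sum()
```

Because both signatures are ratios, they are dimensionless and comparable across watersheds of different sizes, which is what makes them usable as clustering features.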

  19. Interactive brain shift compensation using GPU based programming

    NASA Astrophysics Data System (ADS)

    van der Steen, Sander; Noordmans, Herke Jan; Verdaasdonk, Rudolf

    2009-02-01

Processing large image files or real-time video streams requires intense computational power. Driven by the gaming industry, the processing power of graphics processing units (GPUs) has increased significantly. With pixel shader model 4.0, the GPU can be used for image processing about 10x faster than the CPU. Dedicated software was developed to deform 3D MR and CT image sets for real-time brain shift correction during navigated neurosurgery, using landmarks or cortical surface traces defined by the navigation pointer. Feedback was given using orthogonal slices and an interactively raytraced 3D brain image. GPU-based programming enables real-time processing of high-definition image datasets, and various applications can be developed in medicine, optics and the image sciences.

  20. Integration of the Anammox process to the rejection water and main stream lines of WWTPs.

    PubMed

    Morales, Nicolás; Val Del Río, Ángeles; Vázquez-Padín, José Ramón; Méndez, Ramón; Mosquera-Corral, Anuska; Campos, José Luis

    2015-12-01

Nowadays the application of Anammox-based processes in wastewater treatment plants has taken a step forward. The new goal consists of removing the nitrogen present in the main stream of the WWTPs to improve their energetic efficiencies. This new approach aims not only to remove the nitrogen but also to make better use of the energy contained in the organic matter. The organic matter will be removed either by an anaerobic psychrophilic membrane reactor or by an aerobic stage operated at low solids retention time followed by anaerobic digestion of the generated sludge. The ammonia coming from these units will then be removed in an Anammox-based process in a single-unit system. The second strategy provides the best results in terms of operational costs and would allow reductions of about 28%. Recent research works on Anammox-based processes operated at relatively low temperatures and/or low ammonia concentrations were carried out in single-stage systems using biofilms, granules or a mixture of flocculent nitrifying and granular Anammox biomasses. These systems allowed the appropriate retention of Anammox and ammonia-oxidizing bacteria but also the proliferation of nitrite-oxidizing bacteria, which seems to be the main drawback to achieving the required effluent quality for disposal. Therefore, prior to the implementation of Anammox-based processes at full scale in the water line, a reliable strategy to avoid nitrite oxidation should be defined in order to maintain process stability and to obtain the desired effluent quality. If not, the application of a post-denitrification step would be necessary. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Processing device with self-scrubbing logic

    DOEpatents

    Wojahn, Christopher K.

    2016-03-01

    An apparatus includes a processing unit including a configuration memory and self-scrubber logic coupled to read the configuration memory to detect compromised data stored in the configuration memory. The apparatus also includes a watchdog unit external to the processing unit and coupled to the self-scrubber logic to detect a failure in the self-scrubber logic. The watchdog unit is coupled to the processing unit to selectively reset the processing unit in response to detecting the failure in the self-scrubber logic. The apparatus also includes an external memory external to the processing unit and coupled to send configuration data to the configuration memory in response to a data feed signal outputted by the self-scrubber logic.
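The claimed arrangement can be summarized in a minimal behavioral sketch (class and attribute names are hypothetical, invented for illustration): a watchdog external to the processing unit resets it when the self-scrubber logic is detected to have failed, restoring the scrubber in the process.

```python
# Behavioral sketch of the patent's watchdog arrangement; not the actual
# hardware logic. Names (ProcessingUnit, Watchdog, scrubber_alive) are
# hypothetical.
class ProcessingUnit:
    def __init__(self):
        self.resets = 0
        self.scrubber_alive = True   # self-scrubber logic status

    def reset(self):
        self.resets += 1
        self.scrubber_alive = True   # a reset restores the scrubber

class Watchdog:
    """External to the processing unit; monitors the self-scrubber."""
    def check(self, unit):
        if not unit.scrubber_alive:  # self-scrubber failure detected
            unit.reset()             # selectively reset the processing unit

unit, dog = ProcessingUnit(), Watchdog()
unit.scrubber_alive = False          # inject a scrubber failure
dog.check(unit)
```

The layering matters: the scrubber guards the configuration memory against compromised data, while the watchdog guards against failure of the scrubber itself, so no single fault goes undetected.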

  2. Replication of clinical innovations in multiple medical practices.

    PubMed

    Henley, N S; Pearce, J; Phillips, L A; Weir, S

    1998-11-01

    Many clinical innovations had been successfully developed and piloted in individual medical practice units of Kaiser Permanente in North Carolina during 1995 and 1996. Difficulty in replicating these clinical innovations consistently throughout all 21 medical practice units led to development of the interdisciplinary Clinical Innovation Implementation Team, which was formed by using existing resources from various departments across the region. REPLICATION MODEL: Based on a model of transfer of best practices, the implementation team developed a process and tools (master schedule and activity matrix) to quickly replicate successful pilot projects throughout all medical practice units. The process involved the following steps: identifying a practice and delineating its characteristics and measures (source identification); identifying a team to receive the (new) practice; piloting the practice; and standardizing, including the incorporation of learnings. The model includes the following components for each innovation: sending and receiving teams, an innovation coordinator role, an innovation expert role, a location expert role, a master schedule, and a project activity matrix. Communication depended on a partnership among the location experts (local knowledge and credibility), the innovation coordinator (process expertise), and the innovation experts (content expertise). Results after 12 months of working with the 21 medical practice units include integration of diabetes care team services into the practices, training of more than 120 providers in the use of personal computers and an icon-based clinical information system, and integration of a planwide self-care program into the medical practices--all with measurable improved outcomes. The model for sequential replication and the implementation team structure and function should be successful in other organizational settings.

  3. Fabrication of flexible grating sensing waveguide based on nano-imprint lithography and micro-replication process

    NASA Astrophysics Data System (ADS)

    Liu, Yueming; Tian, Weijian; Zhang, Shaojun

    2009-05-01

Soft and flexible grating sensing waveguides are urgently demanded in applications such as micro-bending sensing and surface distortion sensing in medical catheters and smart-skin sensing units. Based on nano-imprint lithography and a micro-replication process, polymer grating waveguides with a core size of 4μm×20μm and a pitch of 0.75μm are fabricated successfully in this paper. These novel grating waveguides are soft and flexible enough for the related applications and are biomedically safe when used in human-body catheters. The fabrication processes are presented, including the fabrication of the micro mould and the UV-replication process, and related techniques are also discussed in this paper.

  4. Techniques for blade tip clearance measurements with capacitive probes

    NASA Astrophysics Data System (ADS)

    Steiner, Alexander

    2000-07-01

This article presents a proven and advantageous concept for blade tip clearance evaluation in turbomachinery. The system is based on heavy-duty probes and a high-frequency (HF) amplifying electronic unit followed by a signal processing unit. Measurements are taken under high temperature and other severe conditions such as ionization. Every single blade can be observed. The signals are digitally filtered and linearized in real time. The electronic set-up is highly integrated, and miniaturized versions of the electronic units exist. The small and robust units can be used in turbo engines in flight. With several probes at different angles in one radial plane, further information is available: shaft eccentricity or blade oscillations can be calculated.

  5. Post-construction monitoring of a Core-Loc™ breakwater using tripod-based LiDAR

    USGS Publications Warehouse

    Podoski, Jessica H.; Bawden, Gerald W.; Bond, Sandra; Smith, Thomas D.; Foster, James

    2010-01-01

    The goal of the technology application described herein is to determine whether breakwater monitoring data collected using Tripod (or Terrestrial) Light Detection and Ranging (T-LiDAR) can give insight into processes such as how Core-Loc™ concrete armour units nest following construction, and in turn how settlement affects armour layer stability, concrete cap performance, and armour unit breakage.  A further objective is that this information can then be incorporated into the design of future projects using concrete armour units.  The results of this application of T-LiDAR, including the challenges encountered and the conclusions drawn regarding initial concrete armour unit movement will be presented in this paper.

  6. Processing and Applications of Depleted Uranium Alloy Products

    DTIC Science & Technology

    1976-09-01

temperature at the wheel-metal interface, thus tending to produce surface cracks and in some cases to burn the metal. Data on speeds and feeds in ... comprehensive current resource of technical information on the development and utilization of advanced metal- or ceramic-base materials. The Center is operated ... under the sponsorship of the Department of Defense. Neither the United States Government nor any person acting on behalf of the United States Government

  7. Optimized 4-bit Quantum Reversible Arithmetic Logic Unit

    NASA Astrophysics Data System (ADS)

    Ayyoub, Slimani; Achour, Benslama

    2017-08-01

Reversible logic has received great attention in recent years due to its ability to reduce power dissipation. The main purposes of designing reversible logic are to decrease the quantum cost, the depth of the circuits and the number of garbage outputs. The arithmetic logic unit (ALU) is an important part of the central processing unit (CPU), serving as its execution unit. This paper presents a complete design of a new reversible arithmetic logic unit (ALU) that can be part of a programmable reversible computing device such as a quantum computer. The proposed ALU is based on a reversible low-power control unit and a full adder with small performance parameters built from double Peres gates. The presented ALU can produce the largest number (28) of arithmetic and logic functions and has the smallest quantum cost and delay compared with existing designs.
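The reversibility property at the heart of the abstract can be checked concretely for a single Peres gate, the building block named above: inputs (A, B, C) map to (A, A⊕B, (A·B)⊕C), and reversibility means this 3-bit mapping is a bijection, so no information is lost.

```python
from itertools import product

# One Peres gate: (A, B, C) -> (A, A xor B, (A and B) xor C).
def peres(a, b, c):
    return a, a ^ b, (a & b) ^ c

# Reversibility check: all 8 input patterns must map to 8 distinct outputs.
outputs = {peres(*bits) for bits in product((0, 1), repeat=3)}
```

With C = 0 the third output is simply A·B and the second is A⊕B, i.e. the carry and sum of a half adder, which is why cascaded Peres gates make a natural reversible full adder.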

  8. Design, processing, and testing of LSI arrays for space station

    NASA Technical Reports Server (NTRS)

    Schneider, W. C.

    1974-01-01

At wafer probe, units of the TA6567 circuit, a beam-leaded COS/MOS/SOS 256-bit RAM, were demonstrated to be functionally perfect. An aluminum-gate current-sense version and a silicon-gate voltage-sense version of this memory were developed. Initial baseline data for the beam-lead SOS process using the TA5388 circuit show the stability of the dc device characteristics through the beam-lead processing.

  9. Acceleration of GPU-based Krylov solvers via data transfer reduction

    DOE PAGES

    Anzt, Hartwig; Tomov, Stanimire; Luszczek, Piotr; ...

    2015-04-08

Krylov subspace iterative solvers are often the method of choice when solving large sparse linear systems. At the same time, hardware accelerators such as graphics processing units continue to offer significant floating-point performance gains for matrix and vector computations through easy-to-use libraries of computational kernels. However, as these libraries are usually composed of a well-optimized but limited set of linear algebra operations, applications that use them often fail to reduce certain data communications, and hence fail to leverage the full potential of the accelerator. In this study, we target the acceleration of Krylov subspace iterative methods for graphics processing units, and in particular the Biconjugate Gradient Stabilized solver. We show that significant improvement can be achieved by reformulating the method to reduce data communications through application-specific kernels instead of using the generic BLAS kernels, e.g. as provided by NVIDIA's cuBLAS library, and by designing a graphics processing unit specific sparse matrix-vector product kernel that is able to more efficiently use the graphics processing unit's computing power. Furthermore, we derive a model estimating the performance improvement, and use experimental data to validate the expected runtime savings. Finally, considering that the derived implementation achieves significantly higher performance, we assert that similar optimizations addressing algorithm structure, as well as sparse matrix-vector products, are crucial for the subsequent development of high-performance graphics-processing-unit-accelerated Krylov subspace iterative methods.
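The data-transfer reduction the abstract describes can be illustrated with a conceptual sketch: generic BLAS performs an AXPY update and a subsequent norm as two separate passes over the vector (two reads plus a write of global memory on a GPU), whereas an application-specific fused kernel does both in a single pass. The Python below only models the arithmetic, not the GPU memory system; the function names are hypothetical.

```python
import numpy as np

def generic_blas(alpha, x, y):
    """Two passes over y, as with separate cuBLAS axpy + dot calls."""
    y = y + alpha * x          # pass 1: AXPY
    return y, float(y @ y)     # pass 2: dot product (squared norm)

def fused_kernel(alpha, x, y):
    """One pass: update each element and accumulate the dot product together."""
    acc, out = 0.0, np.empty_like(y)
    for i in range(len(y)):
        out[i] = y[i] + alpha * x[i]
        acc += out[i] * out[i]
    return out, acc

x, y = np.arange(4.0), np.ones(4)
assert generic_blas(2.0, x, y)[1] == fused_kernel(2.0, x, y)[1]
```

On a memory-bound GPU the fused version halves the traffic for this pair of operations while producing identical results, which is the essence of the kernel-fusion argument in the paper.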

  10. Proposal for a new CAPE-OPEN Object Model

    EPA Science Inventory

    Process simulation applications require the exchange of significant amounts of data between the flowsheet environment, unit operation model, and thermodynamic server. Packing and unpacking various data types and exchanging data using structured text-based architectures, including...

  11. Uniting of NuSTAR Spacecraft and Rocket

    NASA Image and Video Library

    2012-02-23

Inside an environmental enclosure at the Vandenberg Air Force Base processing facility in California, solar panels line the sides of NASA's Nuclear Spectroscopic Telescope Array (NuSTAR), which was just joined to the Orbital Sciences Pegasus XL rocket.

  12. USING THE HEALTH TECHNOLOGY ASSESSMENT TOOLBOX TO FACILITATE PROCUREMENT: THE CASE OF SMART PUMPS IN A CANADIAN HOSPITAL.

    PubMed

    Poder, Thomas G

    2017-01-01

    The aim of this study was to present the experience of a Canadian hospital-based health technology assessment (HTA) unit that performed the traditional functions of the HTA process along with many other activities to facilitate the choice of smart pumps. A rapid literature review was initiated, but little evidence was found, and the evidence available was too far removed from our hospital context. To help our decision makers, we offered them a list of various services based on the skills of our HTA unit staff. Involving our HTA unit in the choice of the new smart pumps led to a strong collaboration between hospital services. After a rapid review on smart pumps, we proceeded to establish the clinical needs, followed by an evaluation of technical features. To ascertain clinical needs, we participated in the establishment of a conformity list for the tender, a failure mode and effects analysis, an audit on the use of the current smart pumps, and simulation exercises with nurses and doctors to evaluate ease of use and ergonomics. The technical tests were mainly conducted to identify potential dysfunction and to assess the efficiency of the pump. This experience with smart pumps was useful for evidence-based procurement and led to the formulation of a nine-step process to guide future work. HTA units and agencies are faced with the rapid development of new technologies that may not be supported by a sufficient amount of pertinent published evidence. Under these circumstances, approaches other than evidence-based selection might provide useful information. Because these activities may differ from those of classic HTA, this widens the scope of what can be done in HTA to support decision making.

  13. Rolling scheduling of electric power system with wind power based on improved NNIA algorithm

    NASA Astrophysics Data System (ADS)

    Xu, Q. S.; Luo, C. J.; Yang, D. J.; Fan, Y. H.; Sang, Z. X.; Lei, H.

    2017-11-01

    This paper puts forth a rolling modification strategy for day-ahead scheduling of electric power systems with wind power, which takes the operation cost increment of units and the curtailed wind power of the grid as its two modification functions. Additionally, an improved Nondominated Neighbor Immune Algorithm (NNIA) is proposed for the solution. The proposed rolling scheduling model further reduces the system operation cost in the intra-day generation process, enhances the system’s capacity to accommodate wind power, and modifies the power flow of key transmission sections in a rolling manner to satisfy the security constraints of the power grid. The improved NNIA defines an antibody preference relation model based on the equal incremental rate, regulation deviation constraints, and the maximum and minimum technical outputs of units. The model noticeably guides the direction of antibody evolution, significantly speeds up convergence to the final solution, and enhances the local search capability.
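    The equal-incremental-rate criterion mentioned above can be sketched for a simple quadratic-cost dispatch. The cost coefficients, unit limits, and demand below are made-up illustrative values, not data from the paper, and bisection on lambda stands in for the paper's immune-algorithm machinery.

```python
# Hypothetical quadratic cost C_i(P) = a_i + b_i*P + c_i*P^2, so the
# incremental rate is dC_i/dP = b_i + 2*c_i*P.  At the optimum, all
# units away from their limits run at the same incremental rate (lambda).
def dispatch(units, demand, iters=60):
    """units: list of (b, c, p_min, p_max); bisection on lambda."""
    lo = min(b + 2 * c * pmin for b, c, pmin, pmax in units)
    hi = max(b + 2 * c * pmax for b, c, pmin, pmax in units)
    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        # each unit's output at incremental rate lam, clipped to its limits
        p = [min(max((lam - b) / (2 * c), pmin), pmax)
             for b, c, pmin, pmax in units]
        if sum(p) < demand:
            lo = lam
        else:
            hi = lam
    return p

units = [(2.0, 0.010, 50, 300),   # (b, c, p_min, p_max): illustrative
         (1.5, 0.015, 40, 250),
         (3.0, 0.020, 30, 200)]
outputs = dispatch(units, demand=500)
```

    For this instance all three units land strictly inside their limits, so their incremental rates converge to a common lambda, which is the property the antibody preference relation exploits.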

  14. KENNEDY SPACE CENTER, FLA. - On a tour of the Orbiter Processing Facility, Center Director Jim Kennedy and Deputy Director Woodrow Whitlow Jr. (center, left and right) talk with Kathy Laufenberg, Orbiter Airframe Engineering ground area manager, and Tom Roberts, Airframe Engineering System specialist, both with United Space Alliance. At far right is Bruce Buckingham, assistant to Dr. Whitlow. They are standing in front of the aft base heatshield of Endeavour, which is in its Orbiter Major Modification period that began in December 2003.

    NASA Image and Video Library

    2004-02-25

    KENNEDY SPACE CENTER, FLA. - On a tour of the Orbiter Processing Facility, Center Director Jim Kennedy and Deputy Director Woodrow Whitlow Jr. (center, left and right) talk with Kathy Laufenberg, Orbiter Airframe Engineering ground area manager, and Tom Roberts, Airframe Engineering System specialist, both with United Space Alliance. At far right is Bruce Buckingham, assistant to Dr. Whitlow. They are standing in front of the aft base heatshield of Endeavour, which is in its Orbiter Major Modification period that began in December 2003.

  15. KENNEDY SPACE CENTER, FLA. - On a tour of the Orbiter Processing Facility, Center Director Jim Kennedy and Deputy Director Woodrow Whitlow Jr. (center, left and right) talk with Kathy Laufenberg, Orbiter Airframe Engineering ground area manager, and Tom Roberts, Airframe Engineering System specialist, both with United Space Alliance. At far right is Bruce Buckingham, assistant to Dr. Whitlow. They are standing in front of the aft base heatshield of Endeavour, which is in its Orbiter Major Modification period that began in December 2003.

    NASA Image and Video Library

    2004-02-25

    KENNEDY SPACE CENTER, FLA. - On a tour of the Orbiter Processing Facility, Center Director Jim Kennedy and Deputy Director Woodrow Whitlow Jr. (center, left and right) talk with Kathy Laufenberg, Orbiter Airframe Engineering ground area manager, and Tom Roberts, Airframe Engineering System specialist, both with United Space Alliance. At far right is Bruce Buckingham, assistant to Dr. Whitlow. They are standing in front of the aft base heatshield of Endeavour, which is in its Orbiter Major Modification period that began in December 2003.

  16. Optimizing Maintenance of Constraint-Based Database Caches

    NASA Astrophysics Data System (ADS)

    Klein, Joachim; Braun, Susanne

    Caching data reduces user-perceived latency and often enhances availability in case of server crashes or network failures. DB caching aims at local processing of declarative queries in a DBMS-managed cache close to the application. Query evaluation must produce the same results as if done at the remote database backend, which implies that all data records needed to process such a query must be present in and controlled by the cache, i.e., the cache must support “predicate-specific” loading and unloading of such record sets. Hence, cache maintenance must be based on cache constraints such that “predicate completeness” of the caching units currently present can be guaranteed at any point in time. We explore how cache groups can be maintained to provide the data currently needed. Moreover, we design and optimize loading and unloading algorithms for sets of records that keep the caching units complete, before we empirically identify the costs involved in cache maintenance.
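    A minimal sketch of the predicate-completeness idea, using an in-memory dict as a stand-in for the cache and a list of dicts as the backend; the class name, keys, and data model are invented for illustration and are not the paper's cache-group design.

```python
# Backend table (stand-in for the remote database).
backend = [
    {"id": 1, "region": "EU", "qty": 5},
    {"id": 2, "region": "EU", "qty": 9},
    {"id": 3, "region": "US", "qty": 7},
]

class PredicateCache:
    """Caching unit = the full set of records matching a predicate.
    A query on a predicate is answered locally only if that whole set
    was loaded, so local evaluation matches backend evaluation."""
    def __init__(self, backend):
        self.backend = backend
        self.units = {}          # predicate key -> list of records

    def load(self, key, pred):
        # Load *all* matching records: this keeps the unit complete.
        self.units[key] = [r for r in self.backend if pred(r)]

    def query(self, key):
        if key not in self.units:
            raise KeyError("predicate not cached; evaluate at backend")
        return self.units[key]

    def unload(self, key):
        self.units.pop(key, None)  # drop the whole unit, never a part

cache = PredicateCache(backend)
cache.load("region=EU", lambda r: r["region"] == "EU")
eu = cache.query("region=EU")
```

    Loading and unloading whole predicate-defined sets is what makes local query answers provably equal to backend answers, at the cost of the maintenance overhead the paper measures.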

  17. Empty tracks optimization based on Z-Map model

    NASA Astrophysics Data System (ADS)

    Liu, Le; Yan, Guangrong; Wang, Zaijun; Zang, Genao

    2017-12-01

    For parts with many features, there are many empty tracks during machining, and if these tracks are not optimized, machining efficiency is seriously affected. In this paper, the characteristics of empty tracks are studied in detail, and, building on an existing optimization algorithm, a new track optimization method based on the Z-Map model is proposed. In this method, the tool tracks are divided into unit processing segments, and Z-Map model simulation is used to analyze the order constraints between the unit segments. The empty-track optimization problem is thus transformed into a TSP with sequential constraints, which is then solved with a genetic algorithm. This optimization method can handle both simple and complex structural parts, effectively planning the empty tracks and greatly improving processing efficiency.
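    The reformulation can be illustrated on a toy instance: find the cheapest visiting order of unit segments subject to precedence constraints. Exhaustive search stands in here for the paper's genetic algorithm, and the distances and constraints are invented for illustration.

```python
from itertools import permutations

# Symmetric empty-travel costs between four unit segments (made up).
cost = [[0, 4, 9, 5],
        [4, 0, 3, 8],
        [9, 3, 0, 2],
        [5, 8, 2, 0]]
# Precedence from the machining order: 0 before 2, and 1 before 3.
before = [(0, 2), (1, 3)]

def tour_cost(order):
    # Open path: sum of empty-travel costs between consecutive segments.
    return sum(cost[a][b] for a, b in zip(order, order[1:]))

def feasible(order):
    pos = {seg: i for i, seg in enumerate(order)}
    return all(pos[a] < pos[b] for a, b in before)

# Brute force over feasible orders; a GA replaces this at real sizes.
best = min((p for p in permutations(range(4)) if feasible(p)),
           key=tour_cost)
```

    For realistic part counts the feasible-permutation space explodes, which is why the paper turns to a genetic algorithm with the same feasibility filter.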

  18. Role of comparative effectiveness research in cancer funding decisions in Ontario, Canada.

    PubMed

    Hoch, Jeffrey S; Hodgson, David C; Earle, Craig C

    2012-12-01

    Recently, the evidence-based drug funding process in Ontario, Canada, was challenged by a young mother with a breast tumor too small, based on the evidence that existed at the time, to qualify for an expensive drug. In reality, this is only the latest in a number of challenges the publicly funded health care system has had to deal with in the face of an evolving drug policy landscape. This article defines comparative effectiveness research (CER), considering how it is viewed differently in the United States and Canada. It also reviews the role CER now plays in the Ontario drug funding process and concludes with a review of the challenges and opportunities of using observational data to conduct CER and incorporate it into policy making within a universal health care system. Many of the issues faced by Ontario are relevant beyond Canada, including in the United States during this period of health care reform.

  19. The evaluation of a web-based incident reporting system.

    PubMed

    Kuo, Ya-Hui; Lee, Ting-Ting; Mills, Mary Etta; Lin, Kuan-Chia

    2012-07-01

    A Web-based reporting system is essential to report incident events anonymously and confidentially. The purpose of this study was to evaluate a Web-based reporting system in Taiwan. User satisfaction and impact of system use were evaluated through a survey answered by 249 nurses. Incident events reported in paper and electronic systems were collected for comparison purposes. Study variables included system user satisfaction, willingness to report, number of reports, severity of the events, and efficiency of the reporting process. Results revealed that senior nurses were less willing to report events, nurses on internal medicine units had higher satisfaction than others, and lowest satisfaction was related to the time it took to file a report. In addition, the Web-based reporting system was used more often than the paper system. The percentages of events reported were significantly higher in the Web-based system in laboratory, environment/device, and incidents occurring in other units, whereas the proportions of reports involving bedsores and dislocation of endotracheal tubes were decreased. Finally, moderate injury event reporting decreased, whereas minor or minimal injury event reporting increased. The study recommends that the data entry process be simplified and the network system be improved to increase user satisfaction and reporting rates.

  20. Tracking Down Batholith Construction Through Time Using Lobes From The Tuolumne Batholith, Sierra Nevada, CA.

    NASA Astrophysics Data System (ADS)

    Memeti, V.; Paterson, S. R.

    2006-12-01

    Data gained using various geologic tools from large, composite batholiths, such as the 95-85 Ma old Tuolumne Batholith (TB), Sierra Nevada, CA, indicate complex batholithic processes at the chamber construction site, in part because they record different increments of batholith construction through time. Large structural and compositional complexity occurs throughout the main batholith, recorded in (1) geochemistry, (2) internal contacts between different units (Bateman, 1992; Zak & Paterson, 2005), (3) batholith/host rock contacts, (4) geochronology (Coleman et al., 2004; Matzel et al., 2005, 2006), and (5) internal structures such as schlieren layering and fabrics (Bateman, 1992; Zak et al., 2006), leading to controversies regarding batholith construction models. By using magmatic lobes, tongues of individual batholithic units that extend into the host rock away from the main batholith, we avoid some of the complexity that evolved over longer times within the main batholith. Magmatic lobes are "simpler" systems, because they are spatially separated from other units of the batholith and thus ideally represent processes in just one unit at the time of emplacement. Furthermore, they are shorter lived than the main batholith since they are surrounded by relatively cold host rock and "freeze in" (1) "snapshots" of batholith construction, and (2) relatively short-lived internal processes and the resulting structures and composition in each individual unit. Thus, data from lobes of all batholith units, representing different stages of a batholith's lifetime, help us to understand internal magmatic and external host rock processes during batholith construction. 
Based on field and analytic data from magmatic lobes of the Kuna Crest, Half Dome, and the Cathedral Peak granodiorites, we conclude that (1) the significance of internal processes in the lobes (fractionation versus mixing versus source heterogeneity) is unique for each individual TB unit; (2) emplacement mechanisms such as stoping, downward flow or ductile deformation of host rock act in a very short period of time (only a few 100,000 yrs); and (3) a variety of different magmatic fabrics, formed by strain caused by magma flow, marginal effects, or regional stress, can be found in each lobe. These data lead to the conclusion that the size of the studied lobes indicate the minimum pulse size for TB construction and that fractionation crystallization, even though slightly varying in its magnitude, is an important internal process in each individual TB unit.

  1. Effect of costing methods on unit cost of hospital medical services.

    PubMed

    Riewpaiboon, Arthorn; Malaroje, Saranya; Kongsawatt, Sukalaya

    2007-04-01

    To explore the variance of unit costs of hospital medical services due to different costing methods employed in the analysis. Retrospective and descriptive study at Kaengkhoi District Hospital, Saraburi Province, Thailand, in the fiscal year 2002. The process started with a calculation of unit costs of medical services as a base case. After that, the unit costs were re-calculated using various methods, and the variations of the results against the base case were computed and compared. The total annualized capital cost of buildings and capital items calculated by the accounting-based approach (averaging the capital purchase prices throughout their useful life) was 13.02% lower than that calculated by the economic-based approach (a combination of depreciation cost and interest on the undepreciated portion over the useful life). A change of discount rate from 3% to 6% resulted in a 4.76% increase in the hospital's total annualized capital cost. When the useful life of durable goods was changed from 5 to 10 years, the total annualized capital cost of the hospital decreased by 17.28% from that of the base case. Regarding alternative criteria for indirect cost allocation, the unit cost of medical services changed by a range of -6.99% to +4.05%. We also explored the effect on the unit cost of medical services in one department: various costing methods, including departmental allocation methods, produced unit costs ranging between -85% and +32% against those of the base case. Based on the variation analysis, the economic-based approach was suitable for capital cost calculation. For the useful life of capital items, an appropriate duration should be studied and standardized. Regarding allocation criteria, single-output criteria might be more efficient than combined-output and complicated ones. For the departmental allocation methods, the micro-costing method was the most suitable at the time of study. 
These different costing methods should be standardized and developed as guidelines since they could affect implementation of the national health insurance scheme and health financing management.
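    The two capital-costing approaches compared above can be sketched numerically; the purchase price, useful life, and discount rates are illustrative values, not the hospital's data.

```python
def accounting_annual_cost(price, years):
    # Accounting approach: spread the purchase price evenly over life.
    return price / years

def economic_annual_cost(price, years, rate):
    # Economic approach: depreciation plus interest on the
    # undepreciated portion, which equals the standard capital
    # recovery (annuity) factor applied to the purchase price.
    return price * rate / (1 - (1 + rate) ** -years)

price, years = 100_000, 5
acct = accounting_annual_cost(price, years)
econ3 = economic_annual_cost(price, years, 0.03)   # 3% discount rate
econ6 = economic_annual_cost(price, years, 0.06)   # 6% discount rate
```

    As in the study, the accounting-based figure is lower than the economic-based one for any positive discount rate, and raising the rate from 3% to 6% raises the annualized capital cost further.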

  2. Electronic Health Record for Intensive Care based on Usual Windows Based Software.

    PubMed

    Reper, Arnaud; Reper, Pascal

    2015-08-01

    In Intensive Care Units, the amount of data to be processed for patient care, the turnover of patients, and the need for reliability and for review processes indicate the use of Patient Data Management Systems (PDMS) and electronic health records (EHR). To respond to the needs of an Intensive Care Unit without being locked into proprietary software, we developed an EHR based on common software and components. The software was designed as a client-server architecture running on the Windows operating system and powered by the Access database system. The client software was developed using the Visual Basic interface library. The application offers the users the following functions: capture of medical notes, observations and treatments, nursing charts with administration of medications, scoring systems for classification, and the possibility to encode medical activities for billing processes. Since its deployment in September 2004, the EHR has been used to care for more than five thousand patients, with the expected software reliability, and has facilitated data management and review processes. Communications with other medical software were not developed from the start and are realized through a basic communication engine. Further upgrades of the system will include multi-platform support, use of a typed language with static analysis, and a configurable interface. The developed system, based on common software components, was able to respond to the medical needs of the local ICU environment. The use of Windows for development allowed us to customize the software to the preexisting organization and contributed to the acceptability of the whole system.

  3. RTOS kernel in portable electrocardiograph

    NASA Astrophysics Data System (ADS)

    Centeno, C. A.; Voos, J. A.; Riva, G. G.; Zerbini, C.; Gonzalez, E. A.

    2011-12-01

    This paper presents the use of a Real Time Operating System (RTOS) on a portable electrocardiograph based on a microcontroller platform. All of the medical device's digital functions are performed by the microcontroller. The electrocardiograph CPU is based on the 18F4550 microcontroller, in which the uC/OS-II RTOS can be embedded. The decision to use the kernel is based on its benefits, its license for educational use, and its intrinsic time control and peripherals management. The feasibility of its use on the electrocardiograph is evaluated against the minimum memory requirements imposed by the kernel structure. The kernel's own tools were used for time estimation and for evaluating the resources used by each process. After this feasibility analysis, the cyclic code was migrated to a structure based on separate processes, or tasks, able to synchronize on events, resulting in an electrocardiograph running on a single central processing unit (CPU) under the RTOS.

  4. Using terrestrial light detection and ranging (lidar) technology for land-surface analysis in the Southwest

    USGS Publications Warehouse

    Soulard, Christopher E.; Bogle, Rian

    2011-01-01

    Emerging technologies provide scientists with methods to measure Earth processes in new ways. One of these technologies--ultra-high-resolution, ground-based light detection and ranging (lidar)--is being used by USGS Western Geographic Science Center scientists to characterize the role of wind and fire processes in shaping desert landscapes of the Southwest United States.

  5. Communication and Collaboration in Library Technical Services: A Case Study of New York University in Abu Dhabi

    ERIC Educational Resources Information Center

    Parrott, Justin

    2016-01-01

    New York University Abu Dhabi Library has developed new strategies to increase efficiency in technical services processing between units based in New York and Abu Dhabi. This case study discusses the challenges specific to the international context and the methods used to overcome them, increase processing speed, and ultimately improve patron…

  6. Process and Outcome Evaluation of an Art Therapy Program for People Living with HIV/AIDS

    ERIC Educational Resources Information Center

    Feldman, Matthew B.; Betts, Donna J.; Blausey, Daniel

    2014-01-01

    Program evaluation offers an opportunity for improving the implementation and impact of art therapy. This article describes a process and outcomes evaluation of an art therapy program within the mental health services unit of a community-based organization for people living with HIV/AIDS. The aims were to assess utilization patterns and program…

  7. The Bologna Club: What U.S. Higher Education Can Learn from a Decade of European Reconstruction

    ERIC Educational Resources Information Center

    Adelman, Clifford

    2008-01-01

    This report examines the efforts of 46 European nations to harmonize (not "standardize") their higher education systems and indicates that the United States higher education system needs to adopt some of the features of the Bologna Process. Based on what can be learned from the Bologna Process, this report makes concrete suggestions for…

  8. Water Use in the United States Energy System: A National Assessment and Unit Process Inventory of Water Consumption and Withdrawals.

    PubMed

    Grubert, Emily; Sanders, Kelly T

    2018-06-05

    The United States (US) energy system is a large water user, but the nature of that use is poorly understood. To support resource comanagement and fill this noted gap in the literature, this work presents detailed estimates for US-based water consumption and withdrawals for the US energy system as of 2014, including both intensity values and the first known estimate of total water consumption and withdrawal by the US energy system. We address 126 unit processes, many of which are new additions to the literature, differentiated among 17 fuel cycles, five life cycle stages, three water source categories, and four levels of water quality. Overall coverage is about 99% of commercially traded US primary energy consumption with detailed energy flows by unit process. Energy-related water consumption, or water removed from its source and not directly returned, accounts for about 10% of both total and freshwater US water consumption. Major consumers include biofuels (via irrigation), oil (via deep well injection, usually of nonfreshwater), and hydropower (via evaporation and seepage). The US energy system also accounts for about 40% of both total and freshwater US water withdrawals, i.e., water removed from its source regardless of fate. About 70% of withdrawals are associated with the once-through cooling systems of approximately 300 steam cycle power plants that produce about 25% of US electricity.

  9. Spatiotemporal variability of snow depletion curves derived from SNODAS for the conterminous United States, 2004-2013

    USGS Publications Warehouse

    Driscoll, Jessica; Hay, Lauren E.; Bock, Andrew R.

    2017-01-01

    Assessment of water resources at a national scale is critical for understanding their vulnerability to future changes in policy and climate. Representation of the spatiotemporal variability in snowmelt processes in continental-scale hydrologic models is critical for assessing water resource response to continued climate change. Continental-extent hydrologic models such as the U.S. Geological Survey National Hydrologic Model (NHM) represent snowmelt processes through the application of snow depletion curves (SDCs). SDCs relate normalized snow water equivalent (SWE) to normalized snow-covered area (SCA) over a snowmelt season for a given modeling unit. SDCs were derived using output from the operational Snow Data Assimilation System (SNODAS) snow model as daily 1-km gridded SWE over the conterminous United States. Daily SNODAS output was aggregated to a predefined watershed-scale geospatial fabric and also used to calculate SCA from October 1, 2004 to September 30, 2013. The spatiotemporal variability in SNODAS output at the watershed scale was evaluated through the spatial distribution of the median and standard deviation for the time period. Representative SDCs for each watershed-scale modeling unit over the conterminous United States (n = 54,104) were selected using a consistent methodology and used to create categories of snowmelt based on SDC shape. The relation of SDC categories to topographic and climatic variables allows for national-scale categorization of snowmelt processes.
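    A snow depletion curve of the kind described can be derived from daily SWE and SCA series by normalizing each to its seasonal maximum and pairing them; the toy melt-season series below is invented for illustration, not SNODAS output.

```python
# Daily SWE (mm) and SCA (km^2) over an invented melt season.
swe = [120.0, 110.0, 90.0, 60.0, 30.0, 10.0, 0.0]
sca = [50.0, 50.0, 45.0, 35.0, 20.0, 8.0, 0.0]

def depletion_curve(swe, sca):
    """Normalize each series by its seasonal maximum and pair them:
    the SDC maps normalized SWE to normalized SCA for one unit."""
    swe_max, sca_max = max(swe), max(sca)
    return [(w / swe_max, a / sca_max) for w, a in zip(swe, sca)]

sdc = depletion_curve(swe, sca)
```

    The shape of this normalized curve (e.g. how quickly SCA drops as SWE depletes) is the feature the study uses to group modeling units into snowmelt categories.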

  10. Geomorphology and Geology of the Southwestern Margaritifer Sinus and Argyre Regions of Mars. Part 1: Geological and Geomorphological Overview

    NASA Technical Reports Server (NTRS)

    Parker, T. J.; Pieri, D. C.

    1985-01-01

    Based upon Viking Orbiter 1 images of the southwestern portion of the Margaritifer Sinus Quadrangle, the northwestern portion of the Argyre Quadrangle, and a small portion of the southeastern Coprates Quadrangle, three major mountainous or plateau units, seven plains units, and six units related to valley forming processes were identified. The photomosaic is oriented such that it provides good areal coverage of the upper Chryse Trough from Argyre Planitia to just above Margaritifer Chaos as well as of plains units on either side of the Trough. The photomosaic was compiled from Viking Orbiter 1 images ranging in resolution from approximately 150 to 300 meters per pixel, printed at a scale of about 1:2,000,000. The characteristics of each geomorphic unit are outlined.

  11. Processing device with self-scrubbing logic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wojahn, Christopher K.

    An apparatus includes a processing unit including a configuration memory and self-scrubber logic coupled to read the configuration memory to detect compromised data stored in the configuration memory. The apparatus also includes a watchdog unit external to the processing unit and coupled to the self-scrubber logic to detect a failure in the self-scrubber logic. The watchdog unit is coupled to the processing unit to selectively reset the processing unit in response to detecting the failure in the self-scrubber logic. The apparatus also includes an external memory external to the processing unit and coupled to send configuration data to the configuration memory in response to a data feed signal outputted by the self-scrubber logic.
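    The scrub-and-watchdog arrangement can be sketched in Python; the CRC checksum scheme, class names, and return strings are invented stand-ins for the hardware logic the patent describes.

```python
import zlib

class ConfigMemory:
    def __init__(self, data: bytes):
        self.data = bytearray(data)
        self.golden_crc = zlib.crc32(data)   # reference checksum

class SelfScrubber:
    """Reads configuration memory and repairs compromised data."""
    def __init__(self, mem, golden: bytes):
        self.mem, self.golden = mem, golden
        self.alive = True                    # watchdog liveness flag

    def scrub(self):
        self.alive = True                    # "kick" the watchdog
        if zlib.crc32(bytes(self.mem.data)) != self.mem.golden_crc:
            self.mem.data[:] = self.golden   # data feed: reload config
            return "repaired"
        return "clean"

class Watchdog:
    """External to the processing unit: flags a reset if the
    scrubber itself stops running between checks."""
    def check(self, scrubber):
        if not scrubber.alive:
            return "reset processing unit"
        scrubber.alive = False               # must be re-set each cycle
        return "ok"

mem = ConfigMemory(b"bitstream")
scrubber = SelfScrubber(mem, b"bitstream")
mem.data[0] ^= 0xFF                          # inject an upset
status = scrubber.scrub()                    # detects and repairs it
```

    The two layers cover different faults: the scrubber handles corrupted configuration data, while the watchdog handles the scrubber itself failing.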

  12. [Ecological security of wastewater treatment processes: a review].

    PubMed

    Yang, Sai; Hua, Tao

    2013-05-01

    Though the regular indicators of treated wastewater can meet discharge requirements and reuse standards, this does not mean the effluent is harmless. From a sustainability point of view, to ensure ecological and human security, comprehensive toxicity should be considered when discharge standards are set up. In order to improve the ecological security of wastewater treatment processes, toxicity reduction should be considered when selecting and optimizing the treatment processes. This paper reviewed the research on the ecological security of wastewater treatment processes, with a focus on the purposes of various treatment processes, including processes for special wastewater treatment, for wastewater reuse, and for the safety of receiving waters. Conventional biological treatment combined with advanced oxidation technologies can enhance toxicity reduction on the basis of pollutant removal, which is worthy of further study. For processes aimed at wastewater reuse, the integration of different process units can combine the advantages of conventional pollutant removal and toxicity reduction. For processes aimed at the ecological security of receiving waters, the emphasis should be put on toxicity reduction through the optimization of process parameters and process unit selection. Some suggestions regarding problems in the current research and future research directions are put forward.

  13. High performance photovoltaic applications using solution-processed small molecules.

    PubMed

    Chen, Yongsheng; Wan, Xiangjian; Long, Guankui

    2013-11-19

    Energy remains a critical issue for the survival and prosperity of human civilization. Many experts believe that the eventual solution for sustainable energy is the use of direct solar energy as the main energy source. Among the options for renewable energy, photovoltaic technologies that harness solar energy offer a way to exploit an unlimited resource with minimal environmental impact, in contrast with other alternatives such as water, nuclear, and wind energy. Currently, almost all commercial photovoltaic technologies use Si-based technology, which has a number of disadvantages, including high cost, lack of flexibility, and the serious environmental impact of the Si industry. Other technologies, such as organic photovoltaic (OPV) cells, can overcome some of these issues. Today, polymer-based OPV (P-OPV) devices have achieved power conversion efficiencies (PCEs) that exceed 9%. Compared with P-OPV, small-molecule-based OPV (SM-OPV) offers further advantages, including a defined structure for more reproducible performance, higher mobility and open circuit voltage, and easier synthetic control that leads to more diversified structures. Therefore, while largely undeveloped, SM-OPV is an important emerging technology with performance comparable to P-OPV. In this Account, we summarize our recent results on solution-processed SM-OPV. We believe that solution processing is essential for taking full advantage of OPV technologies. Our work started with the synthesis of oligothiophene derivatives with an acceptor-donor-acceptor (A-D-A) structure. Both the backbone conjugation length and the electron-withdrawing terminal groups play an important role in the light absorption, energy levels, and performance of the devices. Among these molecules, devices using a 7-thiophene-unit backbone and a 3-ethylrhodanine (RD) terminal unit produced a 6.1% PCE. With the optimized conjugation length and terminal unit, we borrowed from the results with P-OPV devices to optimize the backbone. 
Thus we selected BDT (benzo[1,2-b:4,5-b']dithiophene) and DTS (dithienosilole) to replace the central thiophene unit, leading to a PCE of 8.12%. In addition to our molecules, Bazan and co-workers have developed another excellent system using DTS as the core unit that has also achieved a PCE greater than 8%.

  14. Design-Based Learning for Biology: Genetic Engineering Experience Improves Understanding of Gene Expression

    ERIC Educational Resources Information Center

    Ellefson, Michelle R.; Brinker, Rebecca A.; Vernacchio, Vincent J.; Schunn, Christian D.

    2008-01-01

    Gene expression is a difficult topic for students to learn and comprehend, at least partially because it involves various biochemical structures and processes occurring at the microscopic level. Designer Bacteria, a design-based learning (DBL) unit for high-school students, applies principles of DBL to the teaching of gene expression. Throughout…

  15. A PREA-Based Educational Module to Improve and Enhance BSN Student Forensic Environment Orientation and Preparedness

    ERIC Educational Resources Information Center

    Priano, Staci J.

    2017-01-01

    With the largest incarcerated population in the world, the United States has an urgent but unmet need for a correctional nurse workforce that is prepared with effective, evidence-based (EBP) care-strategies especially focused towards violence and sexual abuse mitigation. Standardized, proactive orientation and education processes within the world…

  16. A Description of a Student-Staffed, Competency-Based Laboratory for the Assessment of Interpersonal Communication Skills.

    ERIC Educational Resources Information Center

    Ratliffe, Sharon A.; Hudson, David D.

    A competency-based skill development and assessment procedure is used in an interpersonal communication course (SpCom 100) at Golden West College in California. SpCom 100, which offers 18 to 24 sections each semester, includes eight units: the interpersonal process, conversation, self-concept and disclosure, perception, verbal language, nonverbal…

  17. 40 CFR Table 1a to Subpart Dddd of... - Production-Based Compliance Options

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the following process units . . . You must meet the following production-based compliance option . . .
    (...) Primary tube dryers: 0.26 lb/ODT.
    (7) Reconstituted wood product board coolers (at new affected sources...
    (...) ... dryer heated zones: 0.022 lb/MSF 3/8″.
    (10) Rotary strand dryers: 0.18 lb/ODT.
    (11) Secondary tube dryers...

  18. Multiple constraint analysis of regional land-surface carbon flux

    Treesearch

    D.P. Turner; M. Göckede; B.E. Law; W.D. Ritts; W.B. Cohen; Z. Yang; T. Hudiburg; R. Kennedy; M. Duane

    2011-01-01

We applied and compared bottom-up (process model-based) and top-down (atmospheric inversion-based) scaling approaches to evaluate the spatial and temporal patterns of net ecosystem production (NEP) over a 2.5 × 10⁵ km² area (the state of Oregon) in the western United States. Both approaches indicated a carbon sink over this...

  19. Collection of process data after cardiac surgery: initial implementation with a Java-based intranet applet.

    PubMed

    Ratcliffe, M B; Khan, J H; Magee, K M; McElhinney, D B; Hubner, C

    2000-06-01

Using a Java-based intranet program (applet), we collected postoperative process data after coronary artery bypass grafting. A Java-based applet was developed and deployed on a hospital intranet. Briefly, the nurse entered patient process data using a point-and-click interface. The applet generated a nursing note, and process data were saved in a Microsoft Access database. In 10 patients, this method was validated by comparison with a retrospective chart review. In 45 consecutive patients, weekly control charts were generated from the data. When aberrations from the pathway occurred, feedback was initiated to restore the goals of the critical pathway. The intranet process data collection method was verified by a manual chart review with 98% sensitivity. The control charts for time to extubation, intensive care unit stay, and hospital stay showed a deviation from critical pathway goals after the first 20 patients. Feedback modulation was associated with a return to critical pathway goals. Java-based applets are inexpensive and can collect accurate postoperative process data, identify critical pathway deviations, and allow timely feedback of process data.
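
The control-chart monitoring described above can be sketched in a few lines of Python. This is a generic Shewhart individuals-chart calculation, not the authors' applet; the function names and the 3-sigma limits are illustrative assumptions.

```python
def control_limits(values):
    """Shewhart-style individuals-chart limits: centerline at the mean,
    upper/lower control limits at +/- 3 sample standard deviations."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    sd = var ** 0.5
    return mean - 3 * sd, mean, mean + 3 * sd

def flag_deviations(values, lower, upper):
    """Indices of observations outside the control limits, i.e. the
    'aberrations from the pathway' that would trigger feedback."""
    return [i for i, v in enumerate(values) if not lower <= v <= upper]
```

For example, limits computed from a baseline of extubation times (in hours) would flag a later out-of-control observation for feedback.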

  20. Mars: the evolutionary history of the northern lowlands based on crater counting and geologic mapping

    USGS Publications Warehouse

    Werner, S.C.; Tanaka, K.L.; Skinner, J.A.

    2011-01-01

The geologic history of planetary surfaces is most effectively determined by joining geologic mapping and crater counting, which together provide an iterative, qualitative and quantitative method for defining relative ages and absolute model ages. Based on this approach, we present spatial and temporal details regarding the evolution of the Martian northern plains and surrounding regions. The highland–lowland boundary (HLB) formed during the pre-Noachian and was subsequently modified through various processes. The Nepenthes Mensae unit, along the northern margins of the cratered highlands, was formed by HLB scarp erosion, deposition of sedimentary and volcanic materials, and dissection by surface runoff between 3.81 and 3.65 Ga. Ages for giant polygons in Utopia and Acidalia Planitiae are ~3.75 Ga and likely reflect the age of buried basement rocks. These buried lowland surfaces are comparable in age to those located closer to the HLB, where a much thinner, post-HLB deposit is mapped. The emplacement of the most extensive lowland surfaces ended between 3.75 and 3.4 Ga, based on densities of craters generally ≥3 km in diameter. Results from the polygonal terrain support the existence of a major lowland depocenter shortly after the pre-Noachian formation of the northern lowlands. In general, northern plains surfaces show gradually younger ages at lower elevations, consistent with local to regional unit emplacement and resurfacing between 3.6 and 2.6 Ga. Elevation levels and morphology are not necessarily related, and variations in ages within the mapped units are found, especially in units formed and modified by multiple geological processes. Regardless, most of the youngest units in the northern lowlands are considered to be lavas, polar ice, or thick mantle deposits, arguing against the ocean theory during the Amazonian Period (younger than about 3.15 Ga).
All ages measured in the closest vicinity of the steep dichotomy escarpment are also 3.7 Ga or older. The formation ages of volcanic flanks at the HLB (e.g., Alba Mons (3.6–3.4 Ga) and the last fan at Apollinaris Mons, 3.71 Ga) may give additional temporal constraint for the possible existence of any kind of Martian ocean before about 3.7 Ga. It seems to reflect the termination of a large-scale, precipitation-based hydrological cycle and major geologic processes related to such cycling.

  1. Enhancing surveillance for hepatitis C through public health informatics.

    PubMed

    Heisey-Grove, Dawn M; Church, Daniel R; Haney, Gillian A; Demaria, Alfred

    2011-01-01

    Disease surveillance for hepatitis C in the United States is limited by the occult nature of many of these infections, the large volume of cases, and limited public health resources. Through a series of discrete processes, the Massachusetts Department of Public Health modified its surveillance system in an attempt to improve timeliness and completeness of reporting and case follow-up of hepatitis C. These processes included clinician-based reporting, electronic laboratory reporting, deployment of a Web-based disease surveillance system, automated triage of pertinent data, and automated character recognition software for case-report processing. These changes have resulted in an increase in the timeliness of reporting.

  2. Can improved quality of care explain the success of orthogeriatric units? A population-based cohort study.

    PubMed

    Kristensen, Pia Kjær; Thillemann, Theis Muncholm; Søballe, Kjeld; Johnsen, Søren Paaske

    2016-01-01

Admission to orthogeriatric units improves clinical outcomes for patients with hip fracture; however, little is known about the underlying mechanisms. The objective was to compare quality of in-hospital care, 30-day mortality, time to surgery (TTS) and length of hospital stay (LOS) among patients with hip fracture admitted to orthogeriatric and ordinary orthopaedic units, respectively. The design was a population-based cohort study. Using prospectively collected data from the Danish Multidisciplinary Hip Fracture Registry, we identified 11,461 patients aged ≥65 years admitted with a hip fracture between 1 March 2010 and 30 November 2011. The patients were divided into two groups: (i) those treated at an orthogeriatric unit, where the geriatrician is an integrated part of the multidisciplinary team, and (ii) those treated at an ordinary orthopaedic unit, where geriatric or medical consultant services are available on request. Outcome measures were the quality of care as reflected by six process performance measures, 30-day mortality, the TTS and the LOS. Data were analysed using log-binomial, linear and logistic regression controlling for potential confounders. Admittance to orthogeriatric units was associated with a higher chance of fulfilling five out of six process performance measures. Patients who were admitted to an orthogeriatric unit experienced a lower 30-day mortality (adjusted odds ratio (aOR) 0.69; 95% CI 0.54-0.88), whereas the LOS (adjusted relative time (aRT) of 1.18; 95% CI 0.92-1.52) and the TTS (aRT 1.06; 95% CI 0.89-1.26) were similar. Admittance to an orthogeriatric unit was associated with improved quality of care and lower 30-day mortality among patients with hip fracture. © The Author 2015. Published by Oxford University Press on behalf of the British Geriatrics Society. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  3. A Large Scale, High Resolution Agent-Based Insurgency Model

    DTIC Science & Technology

    2013-09-30

Compute Unified Device Architecture (CUDA) is NVIDIA Corporation's software development model for General Purpose Programming on Graphics Processing Units (GPGPU). … Conference. Argonne National Laboratory, Argonne, IL, October 2005. NVIDIA Corporation. NVIDIA CUDA Programming Guide 2.0 [Online]. NVIDIA Corporation

  4. Risk-based evaluation of commercial motor vehicle roadside violations : process and results

    DOT National Transportation Integrated Search

    2010-09-01

    This report provides an analytic framework for evaluating the Atlanta Congestion Reduction Demonstration (CRD) under the United States Department of Transportation (U.S. DOT) Urban Partnership Agreement (UPA) and CRD Programs. The Atlanta CRD project...

  5. Structural Vulnerability and Health: Latino Migrant Laborers in the United States

    PubMed Central

    Quesada, James; Hart, Laurie K.; Bourgois, Philippe

    2011-01-01

    Latino immigrants in the United States constitute a paradigmatic case of a population group subject to structural violence. Their subordinated location in the global economy and their culturally depreciated status in the United States are exacerbated by legal persecution. Medical Anthropology Volume 30, issues 4 and 5, include a series of ethnographic analyses of the processes that render undocumented Latino immigrants structurally vulnerable to ill-health. We hope to extend the social science concept of ‘structural vulnerability’ to make it a useful tool for health care. Defined as a positionality that imposes physical/emotional suffering on specific population groups and individuals in patterned ways, structural vulnerability is a product of two complementary forces: (1) class-based economic exploitation and cultural, gender/sexual, and racialized discrimination; and (2) processes of symbolic violence and subjectivity formation that have increasingly legitimized punitive neoliberal discourses of individual unworthiness. PMID:21777121

  6. EVALUATING MC AND A EFFECTIVENESS TO VERIFY THE PRESENCE OF NUCLEAR MATERIALS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Dawson, P. G.; Morzinski, J. A.; et al.

Traditional materials accounting is focused exclusively on the material balance area (MBA), and involves periodically closing a material balance based on accountability measurements conducted during a physical inventory. In contrast, the physical inventory for Los Alamos National Laboratory's near-real-time accounting system is established around processes and looks more like an item inventory. That is, the intent is not to measure material for accounting purposes, since materials have already been measured in the normal course of daily operations. A given unit process operates many times over the course of a material balance period. The product of a given unit process may move for processing within another unit process in the same MBA or may be transferred out of the MBA. Since few materials are unmeasured, the physical inventory for a near-real-time process area looks more like an item inventory. Thus, the intent of the physical inventory is to locate the materials on the books and verify information about the materials contained in the books. Closing a materials balance for such an area is a matter of summing all the individual mass balances for the batches processed by all unit processes in the MBA. Additionally, performance parameters are established to measure the program's effectiveness. Program effectiveness for verifying the presence of nuclear material is required to be equal to or greater than a prescribed performance level, process measurements must be within established precision and accuracy values, physical inventory results must meet or exceed performance requirements, and inventory differences must be less than a target/goal quantity. This approach exceeds DOE established accounting and physical inventory program requirements. Hence, LANL is committed to this approach and to seeking opportunities for further improvement through integrated technologies. This paper will provide a detailed description of this evaluation process.
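
The balance-closing arithmetic described above, summing individual batch mass balances and comparing the resulting inventory difference to a target/goal quantity, can be illustrated with a minimal Python sketch. The batch fields and the simple absolute-value comparison are assumptions for illustration, not LANL's actual accounting system.

```python
def close_material_balance(batches, target_quantity):
    """Close an MBA balance near-real-time style: sum each batch's mass
    balance (input - output - measured holdup) over all unit processes,
    then check the inventory difference against a target/goal quantity."""
    inventory_difference = sum(
        b['input'] - b['output'] - b['holdup'] for b in batches
    )
    within_goal = abs(inventory_difference) <= target_quantity
    return inventory_difference, within_goal
```

A usage example: two batches with a combined inventory difference of 0.05 kg close successfully against a 0.1 kg goal quantity.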

  7. A Decentralized Framework for Multi-Agent Robotic Systems

    PubMed Central

    2018-01-01

    Over the past few years, decentralization of multi-agent robotic systems has become an important research area. These systems do not depend on a central control unit, which enables the control and assignment of distributed, asynchronous and robust tasks. However, in some cases, the network communication process between robotic agents is overlooked, and this creates a dependency for each agent to maintain a permanent link with nearby units to be able to fulfill its goals. This article describes a communication framework, where each agent in the system can leave the network or accept new connections, sending its information based on the transfer history of all nodes in the network. To this end, each agent needs to comply with four processes to participate in the system, plus a fifth process for data transfer to the nearest nodes that is based on Received Signal Strength Indicator (RSSI) and data history. To validate this framework, we use differential robotic agents and a monitoring agent to generate a topological map of an environment with the presence of obstacles. PMID:29389849

  8. Parallel Computer System for 3D Visualization Stereo on GPU

    NASA Astrophysics Data System (ADS)

    Al-Oraiqat, Anas M.; Zori, Sergii A.

    2018-03-01

This paper proposes the organization of a parallel computer system based on the Graphics Processing Unit (GPU) for 3D stereo image synthesis. The development is based on the modified ray tracing method developed by the authors for fast search of tracing-ray intersections with scene objects. The system allows a significant increase in productivity for 3D stereo synthesis of photorealistic quality. A generalized procedure for 3D stereo image synthesis on the Graphics Processing Unit/Graphics Processing Clusters (GPU/GPC) is proposed. The efficiency of the proposed solutions in the GPU implementation is compared with single-threaded and multithreaded implementations on the CPU. The achieved average acceleration in multi-thread implementation on the test GPU and CPU is about 7.5 and 1.6 times, respectively. Studying the influence of the size and configuration of the computational Compute Unified Device Architecture (CUDA) network on the computational speed shows the importance of their correct selection. The obtained experimental estimates can be significantly improved by new GPUs with a larger number of processing cores and multiprocessors, as well as an optimized configuration of the computing CUDA network.
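
The property that makes ray tracing map well onto GPUs is that each ray's intersection test is independent, so one CUDA thread can handle one ray. A small NumPy sketch of a batched ray-sphere intersection test shows that data-parallel formulation (CPU-side illustration only, not the authors' CUDA code; unit-length ray directions are assumed).

```python
import numpy as np

def ray_sphere_hits(origins, directions, center, radius):
    """Vectorized ray-sphere intersection test over N rays at once.
    Each row is independent -- exactly the per-ray parallelism a GPU
    exploits. Returns a boolean array, True where the ray hits."""
    oc = origins - center                       # (N, 3) origin-to-center
    b = np.einsum('ij,ij->i', oc, directions)   # per-ray dot(oc, d)
    c = np.einsum('ij,ij->i', oc, oc) - radius ** 2
    disc = b * b - c                            # discriminant (unit d)
    return disc >= 0.0
```

For example, a ray aimed at a sphere 5 units down the z-axis hits, while a perpendicular ray misses.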

  9. Petroleum mineral oil refining and evaluation of cancer hazard.

    PubMed

    Mackerer, Carl R; Griffis, Larry C; Grabowski, John S; Reitman, Fred A

    2003-11-01

    Petroleum base oils (petroleum mineral oils) are manufactured from crude oils by vacuum distillation to produce several distillates and a residual oil that are then further refined. Aromatics including alkylated polycyclic aromatic compounds (PAC) are undesirable constituents of base oils because they are deleterious to product performance and are potentially carcinogenic. In modern base oil refining, aromatics are reduced by solvent extraction, catalytic hydrotreating, or hydrocracking. Chronic exposure to poorly refined base oils has the potential to cause skin cancer. A chronic mouse dermal bioassay has been the standard test for estimating carcinogenic potential of mineral oils. The level of alkylated 3-7-ring PAC in raw streams from the vacuum tower must be greatly reduced to render the base oil noncarcinogenic. The processes that can reduce PAC levels are known, but the operating conditions for the processing units (e.g., temperature, pressure, catalyst type, residence time in the unit, unit engineering design, etc.) needed to achieve adequate PAC reduction are refinery specific. Chronic dermal bioassays provide information about whether conditions applied can make a noncarcinogenic oil, but cannot be used to monitor current production for quality control or for conducting research or developing new processes since this test takes at least 78 weeks to conduct. Three short-term, non-animal assays all involving extraction of oil with dimethylsulfoxide (DMSO) have been validated for predicting potential carcinogenic activity of petroleum base oils: a modified Ames assay of a DMSO extract, a gravimetric assay (IP 346) for wt. percent of oil extracted into DMSO, and a GC-FID assay measuring 3-7-ring PAC content in a DMSO extract of oil, expressed as percent of the oil. Extraction with DMSO concentrates PAC in a manner that mimics the extraction method used in the solvent refining of noncarcinogenic oils. 
The three assays are described, data demonstrating the validation of the assays are shown, and test results of currently manufactured base oils are summarized to illustrate the general lack of cancer hazard for the base oils now being manufactured.

  10. A dual-process perspective on fluency-based aesthetics: the pleasure-interest model of aesthetic liking.

    PubMed

    Graf, Laura K M; Landwehr, Jan R

    2015-11-01

    In this article, we develop an account of how aesthetic preferences can be formed as a result of two hierarchical, fluency-based processes. Our model suggests that processing performed immediately upon encountering an aesthetic object is stimulus driven, and aesthetic preferences that accrue from this processing reflect aesthetic evaluations of pleasure or displeasure. When sufficient processing motivation is provided by a perceiver's need for cognitive enrichment and/or the stimulus' processing affordance, elaborate perceiver-driven processing can emerge, which gives rise to fluency-based aesthetic evaluations of interest, boredom, or confusion. Because the positive outcomes in our model are pleasure and interest, we call it the Pleasure-Interest Model of Aesthetic Liking (PIA Model). Theoretically, this model integrates a dual-process perspective and ideas from lay epistemology into processing fluency theory, and it provides a parsimonious framework to embed and unite a wealth of aesthetic phenomena, including contradictory preference patterns for easy versus difficult-to-process aesthetic stimuli. © 2015 by the Society for Personality and Social Psychology, Inc.

  11. Children and young people with diabetes in Yorkshire: a population-based clinical audit of patient data 2005/2006.

    PubMed

    McKinney, P A; Feltbower, R G; Stephenson, C R; Reynolds, C

    2008-11-01

To provide a population-based clinical audit of children and young people with diabetes, reporting outcomes, including glycaemic control, for named individual units. Clinical audit data on care processes and glycated haemoglobin (HbA(1c)) were collected for 1742 children and young people treated in 16 paediatric units in Yorkshire, from January 2005 to March 2006. The Yorkshire Register of Diabetes in Children and Young People provided information technology support and validation that enhanced data quality. Multi-level linear regression modelling investigated factors affecting glycaemic control. An HbA(1c) measure was recorded for 91.6% of patients. The National Institute for Clinical Excellence-recommended target level for HbA(1c) of < 7.5% was achieved for 14.7% of patients. HbA(1c) was positively associated with duration of diabetes and later age at diagnosis. Patients living in deprived areas had significantly poorer control compared with those from affluent areas. Significant between-unit variation in HbA(1c) was not reflected by any association with unit size. Our population-based clinical audit of children with diabetes is the product of an effective collaboration between those who deliver care and health services researchers. High levels of recording of the key care process measuring diabetes control, compared with national figures, suggest that collaboration has translated into improved services. The interesting association between poor diabetes control and higher deprivation is noteworthy and requires further investigation. Future audits require recording of clinical management and clinic structures, in addition to resources to record, assemble and analyse data.

  12. Produced Water Treatment Using the Switchable Polarity Solvent Forward Osmosis (SPS FO) Desalination Process: Preliminary Engineering Design Basis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Daniel; Adhikari, Birendra; Orme, Christopher

Switchable Polarity Solvent Forward Osmosis (SPS FO) is a semi-permeable membrane-based water treatment technology. INL is currently advancing SPS FO technology such that a prototype unit can be designed and demonstrated for the purification of produced water from oil and gas production operations. The SPS FO prototype unit will use the thermal energy in the produced water as a source of process heat, thereby reducing the external process energy demands. Treatment of the produced water stream will reduce the volume of saline wastewater requiring disposal via injection, an activity that is correlated with undesirable seismic events, as well as generate a purified product water stream with potential beneficial uses. This paper summarizes experimental data that has been collected in support of the SPS FO scale-up effort, and describes how this data will be used in the sizing of SPS FO process equipment. An estimate of produced water treatment costs using the SPS FO process is also provided.

  13. Process Simulation of Gas Metal Arc Welding Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murray, Paul E.

    2005-09-06

ARCWELDER is a Windows-based application that simulates gas metal arc welding (GMAW) of steel and aluminum. The software simulates the welding process in an accurate and efficient manner, provides menu items for process parameter selection, and includes a graphical user interface with the option to animate the process. The user enters the base and electrode material, open circuit voltage, wire diameter, wire feed speed, welding speed, and standoff distance. The program computes the size and shape of a square-groove or V-groove weld in the flat position. The program also computes the current, arc voltage, arc length, electrode extension, transfer of droplets, heat input, filler metal deposition, base metal dilution, and centerline cooling rate, in English or SI units. The simulation may be used to select welding parameters that lead to desired operation conditions.

  14. A Lithology Based Map Unit Schema For Onegeology Regional Geologic Map Integration

    NASA Astrophysics Data System (ADS)

    Moosdorf, N.; Richard, S. M.

    2012-12-01

A system of lithogenetic categories for a global lithological map (GLiM, http://www.ifbm.zmaw.de/index.php?id=6460&L=3) has been compiled based on analysis of lithology/genesis categories for regional geologic maps for the entire globe. The scheme is presented for discussion and comment. Analysis of units on a variety of regional geologic maps indicates that units are defined based on assemblages of rock types, as well as their genetic type. In this compilation of continental geology, outcropping surface materials are dominantly sediment/sedimentary rock; major subdivisions of the sedimentary category include clastic sediment, carbonate sedimentary rocks, clastic sedimentary rocks, mixed carbonate and clastic sedimentary rock, colluvium and residuum. Significant areas of mixed igneous and metamorphic rock are also present. A system of global categories to characterize the lithology of regional geologic units is important for Earth System models of matter fluxes to soils, ecosystems, rivers and oceans, and for regional analysis of Earth surface processes at global scale. Because different applications of the classification scheme will focus on different lithologic constituents in mixed units, an ontology-type representation of the scheme that assigns properties to the units in an analyzable manner will be pursued. The OneGeology project is promoting deployment of geologic map services at million scale for all nations. Although initial efforts are commonly simple scanned-map WMS services, the intention is to move towards data-based map services that categorize map units with standard vocabularies to allow use of a common map legend for better visual integration of the maps (e.g., see OneGeology Europe, http://onegeology-europe.brgm.fr/geoportal/viewer.jsp).
Current categorization of regional units with a single lithology from the CGI SimpleLithology vocabulary (http://resource.geosciml.org/201202/Vocab2012html/SimpleLithology201012.html) poorly captures the lithologic character of such units in a meaningful way. A lithogenetic unit category scheme accessible as a GeoSciML-Portrayal-based OGC Styled Layer Descriptor resource is key to enabling OneGeology (http://oneGeology.org) geologic map services to achieve a high degree of visual harmonization.

  15. A 1.2 Gb/s Data Transmission Unit in CMOS 0.18 μm technology for the ALICE Inner Tracking System front-end ASIC

    NASA Astrophysics Data System (ADS)

    Mazza, G.; Aglieri Rinella, G.; Benotto, F.; Corrales Morales, Y.; Kugathasan, T.; Lattuca, A.; Lupi, M.; Ravasenga, I.

    2017-02-01

The upgrade of the ALICE Inner Tracking System is based on a Monolithic Active Pixel Sensor and ASIC designed in a CMOS 0.18 μm process. In order to provide the required output bandwidth (1.2 Gb/s for the inner layers and 400 Mb/s for the outer ones) on a single high-speed serial link, a custom Data Transmission Unit (DTU) has been developed in the same process. The DTU includes a clock multiplier PLL, a double data rate serializer and a pseudo-LVDS driver with pre-emphasis, and is designed to be SEU tolerant.

  16. Design for application of the DETOX{sup SM} wet oxidation process to mixed wastes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bell, R.A.; Dhooge, P.M.

    1994-04-01

Conceptual engineering has been performed for application of the DETOX{sup SM} wet oxidation process to treatment of specific mixed waste types. Chemical compositions, mass balances, energy balances, temperatures, pressures, and flows have been used to define design parameters for treatment units capable of destroying 5 kg per hour of polychlorinated biphenyls and 25 kg per hour of tributyl phosphate. Equipment for the units has been sized and materials of construction have been specified. Secondary waste streams have been defined. Environmental safety and health issues in design have been addressed. Capital and operating costs have been estimated based on the conceptual designs.

  17. Adaptive real-time methodology for optimizing energy-efficient computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsu, Chung-Hsing; Feng, Wu-Chun

Dynamic voltage and frequency scaling (DVFS) is an effective way to reduce energy and power consumption in microprocessor units. Current implementations of DVFS suffer from inaccurate modeling of power requirements and usage, and from inaccurate characterization of the relationships between the applicable variables. A system and method is proposed that adjusts CPU frequency and voltage based on run-time calculations of the workload processing time, as well as a calculation of performance sensitivity with respect to CPU frequency. The system and method are processor independent, and can be applied either to an entire system as a unit, or individually to each process running on a system.
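
The decision logic this abstract describes, lowering frequency when the workload's performance sensitivity to CPU frequency is small, can be sketched as follows. The interval-style slowdown model, the parameter names, and the 5% slowdown limit are all assumptions for illustration, not the patented method itself.

```python
def choose_frequency(freqs, busy_time, total_time, sensitivity,
                     slowdown_limit=0.05):
    """Pick the lowest frequency whose predicted slowdown stays within
    a user-set limit. Only the CPU-bound fraction of the workload is
    assumed to scale with frequency, weighted by the measured
    sensitivity in [0, 1] (0 = memory/I/O bound, 1 = fully CPU bound)."""
    f_max = max(freqs)
    cpu_fraction = busy_time / total_time
    for f in sorted(freqs):
        # predicted relative slowdown if we run at frequency f
        slowdown = sensitivity * cpu_fraction * (f_max / f - 1.0)
        if slowdown <= slowdown_limit:
            return f
    return f_max
```

A workload that is only 20% CPU-busy with sensitivity 0.5 can safely drop to a mid-range frequency, while a fully CPU-bound one stays at maximum.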

  18. Upgrading the fuel-handling machine of the Novovoronezh nuclear power plant unit no. 5

    NASA Astrophysics Data System (ADS)

    Terekhov, D. V.; Dunaev, V. I.

    2014-02-01

    The calculation of safety parameters was carried out in the process of upgrading the fuel-handling machine (FHM) of the Novovoronezh nuclear power plant (NPP) unit no. 5 based on the results of quantitative safety analysis of nuclear fuel transfer operations using a dynamic logical-and-probabilistic model of the processing procedure. Specific engineering and design concepts that made it possible to reduce the probability of damaging the fuel assemblies (FAs) when performing various technological operations by an order of magnitude and introduce more flexible algorithms into the modernized FHM control system were developed. The results of pilot operation during two refueling campaigns prove that the total reactor shutdown time is lowered.

  19. Sex offender risk assessment: the need to place recidivism research in the context of attrition in the criminal justice system.

    PubMed

    Larcombe, Wendy

    2012-04-01

    Jurisdictions in the United States, United Kingdom, and Australia now have laws that enable preventive detention of post-sentence sex offenders based on an assessment of the offender's likely recidivism. Measures of recidivism, or risk assessments, rely on the criminal justice process to produce the "pool" of sex offenders studied. This article argues that recidivism research needs to be placed in the context of attrition studies that document the disproportionate and patterned attrition of sexual offenses and sexual offenders from the criminal justice process. Understanding the common biases that affect criminal prosecution of sex offenses would improve sexual violence prevention policies.

  20. GPU Accelerated Prognostics

    NASA Technical Reports Server (NTRS)

    Gorospe, George E., Jr.; Daigle, Matthew J.; Sankararaman, Shankar; Kulkarni, Chetan S.; Ng, Eley

    2017-01-01

    Prognostic methods enable operators and maintainers to predict the future performance for critical systems. However, these methods can be computationally expensive and may need to be performed each time new information about the system becomes available. In light of these computational requirements, we have investigated the application of graphics processing units (GPUs) as a computational platform for real-time prognostics. Recent advances in GPU technology have reduced cost and increased the computational capability of these highly parallel processing units, making them more attractive for the deployment of prognostic software. We present a survey of model-based prognostic algorithms with considerations for leveraging the parallel architecture of the GPU and a case study of GPU-accelerated battery prognostics with computational performance results.

  1. MSuPDA: A Memory Efficient Algorithm for Sequence Alignment.

    PubMed

    Khan, Mohammad Ibrahim; Kamal, Md Sarwar; Chowdhury, Linkon

    2016-03-01

Space complexity is a million-dollar question in DNA sequence alignment. In this regard, memory saving under pushdown automata can help reduce the space occupied in computer memory. In our proposed process, an anchor seed (AS) is selected from a given data set of nucleotide base pairs for local sequence alignment. Quick splitting techniques separate the AS from all the DNA genome segments. The selected AS is placed in the pushdown automaton's (PDA) input unit, while the whole set of DNA genome segments is placed in the PDA's stack. The AS from the input unit is matched against the DNA genome segments from the stack of the PDA. Matches, mismatches and indels of nucleotides are popped from the stack under the control unit of the pushdown automaton. Each POP operation on the stack frees the memory cell occupied by the nucleotide base pair.
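
The stack discipline described above, seed in the input unit, segment on the stack, bases popped as they are compared, can be sketched in Python. This is a simplified illustration of the idea rather than the MSuPDA implementation; in Python the pop only models the immediate freeing of a memory cell.

```python
def pda_align(anchor_seed, genome_segment):
    """PDA-style comparison of an anchor seed (input unit) against a
    genome segment (stack). Each base is popped as it is compared, so
    its stack cell is released immediately -- the memory-saving idea."""
    # Push the segment so that its first base sits on top of the stack.
    stack = list(reversed(genome_segment))
    results = []  # per-position outcome: 'match', 'mismatch', or 'indel'
    for base in anchor_seed:
        if not stack:
            results.append('indel')   # seed runs past the segment end
            continue
        top = stack.pop()             # POP frees the occupied cell
        results.append('match' if top == base else 'mismatch')
    return results
```

For example, aligning seed `ACGT` against segment `ACGA` yields three matches and one mismatch.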

  2. Processes of early stroke care and hospital costs.

    PubMed

    Svendsen, Marie Louise; Ehlers, Lars H; Hundborg, Heidi H; Ingeman, Annette; Johnsen, Søren P

    2014-08-01

The relationship between processes of early stroke care and hospital costs remains unclear. We therefore examined the association in a population-based cohort study. We identified 5909 stroke patients who were admitted to stroke units in a Danish county between 2005 and 2010. The examined recommended processes of care included early admission to a stroke unit, early initiation of antiplatelet or anticoagulant therapy, early computed tomography/magnetic resonance imaging (CT/MRI) scan, early physiotherapy and occupational therapy, early assessment of nutritional risk, constipation risk and of swallowing function, early mobilization, early catheterization, and early thromboembolism prophylaxis. Hospital costs were assessed for each patient based on the number of days spent in different in-hospital facilities using local hospital charges. The mean cost of hospitalization was $23,352 (standard deviation $27,827). The relationship between receiving more relevant processes of early stroke care and lower hospital costs followed a dose-response pattern. The adjusted costs were $24,566 (95% confidence interval $19,364–$29,769) lower for patients who received 75–100% of the relevant processes of care compared with patients receiving 0–24%. All processes of care were associated with potential cost savings, except for early catheterization and early thromboembolism prophylaxis. Early care in agreement with key guideline recommendations for the management of patients with stroke may be associated with hospital savings.
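
The cost-assessment rule in the abstract (days spent in each in-hospital facility multiplied by that facility's local daily charge, summed over facilities) reduces to a one-line computation. The facility names and charges below are made-up illustrations, not the study's actual charge schedule.

```python
def patient_cost(days_by_facility, charge_per_day):
    """Per-patient hospital cost as described in the abstract: number of
    days in each facility times that facility's local daily charge."""
    return sum(days * charge_per_day[facility]
               for facility, days in days_by_facility.items())
```

For example, 5 days in a stroke unit at $1000/day plus 3 days on a general ward at $400/day gives a total of $6200.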

  3. Medical Negligence Determinations, the "Right to Try," and Expanded Access to Innovative Treatments.

    PubMed

    Meyerson, Denise

    2017-09-01

    This article considers the issue of expanded access to innovative treatments in the context of recent legislative initiatives in the United Kingdom and the United States. In the United Kingdom, the supporters of legislative change argued that the common law principles governing medical negligence are a barrier to innovation. In an attempt to remove this perceived impediment, two bills proposed that innovating doctors sued for negligence should be able to rely in their defence on the fact that their decision to innovate was "responsible." A decision to innovate would be regarded as responsible if it followed a specified process. Although these changes to the law of medical negligence were not passed, this article argues that the idea of a process-based approach was sound. In the United States, a number of states have passed "Right to Try" laws that permit doctors to prescribe and companies to provide investigational products without the need for FDA approval. These laws do not purport to and nor are they able to alter the obligations of individuals and companies under federal law. They are consequently unlikely to achieve their stated aim of expanding access to investigational products. This article argues that they nevertheless have a cogent rationale in so far as they highlight the need for rights-based reform to federal regulations governing access.

  4. Laser-assisted chemical vapor deposition setup for fast synthesis of graphene patterns

    NASA Astrophysics Data System (ADS)

    Zhang, Chentao; Zhang, Jianhuan; Lin, Kun; Huang, Yuanqing

    2017-05-01

    An automatic setup based on the laser-assisted chemical vapor deposition method has been developed for the rapid synthesis of graphene patterns. The key components of this setup are a laser beam control and focusing unit, a laser spot monitoring unit, and a vacuum and flow control unit. A laser beam with precisely controlled power is focused on the surface of a nickel foil substrate by the laser beam control and focusing unit for localized heating. Relative movement between the focused laser spot and the nickel foil substrate induces rapid heating and cooling of the localized region, which decomposes the gaseous hydrocarbon and drives out-diffusion of excess carbon atoms to form graphene patterns along the laser scanning path. All the fabrication parameters that affect the quality and number of graphene layers, such as laser power, laser spot size, laser scanning speed, vacuum chamber pressure, and gas flow rates, can be precisely controlled and monitored during the preparation of graphene patterns. A simulation of the temperature distribution was carried out via the finite element method, providing scientific guidance for regulating the temperature distribution during experiments. A multi-layer graphene ribbon with few defects was synthesized to verify the setup's ability to rapidly grow high-quality graphene patterns. Furthermore, this setup has potential applications in other laser-based graphene synthesis and processing.

  5. Phenothiazine-based small-molecule organic solar cells with power conversion efficiency over 7% and open circuit voltage of about 1.0 V using solvent vapor annealing.

    PubMed

    Rout, Yogajivan; Misra, Rajneesh; Singhal, Rahul; Biswas, Subhayan; Sharma, Ganesh D

    2018-02-28

    We have used two unsymmetrical small molecules, phenothiazines 1 and 2, with a D-A-D-π-D configuration, in which phenothiazine serves as the central unit, triphenylamine as the terminal unit, and TCBD or cyclohexa-2,5-diene-1,4-diylidene-expanded TCBD as the acceptor between the phenothiazine and triphenylamine units. These were employed as small-molecule donors along with PC71BM as the acceptor in solution-processed bulk heterojunction solar cells. Varying the acceptor in the phenothiazine derivatives markedly changes the photophysical and electrochemical properties and the hole mobility, and therefore the photovoltaic performance. The optimized device based on phenothiazine 2 exhibited a high power conversion efficiency of 7.35% (Jsc = 11.98 mA cm^-2, Voc = 0.99 V and FF = 0.62), while the device based on phenothiazine 1 showed a lower PCE of 4.81% (Jsc = 8.73 mA cm^-2, Voc = 0.95 V and FF = 0.58) after solvent vapour annealing (SVA) treatment. The higher power conversion efficiency of the 2-based devices, irrespective of the processing conditions, may be related to the broader absorption and lower band gap of 2 compared with 1. The improvement in the SVA-treated active layer may be related to enhanced crystallinity, molecular ordering and aggregation, and a shorter π-π stacking distance of the small-molecule donors.

  6. High efficient optical remote sensing images acquisition for nano-satellite-framework

    NASA Astrophysics Data System (ADS)

    Li, Feng; Xin, Lei; Liu, Yang; Fu, Jie; Liu, Yuhong; Guo, Yi

    2017-09-01

    Nano-satellite (NanoSat) based optical Earth observation missions are more difficult and challenging to implement than those of conventional satellites because of limits on volume, weight and power consumption. In general, an image compression unit is a necessary onboard module that saves data transmission bandwidth and disk space by removing redundant information from the captured images. In this paper, a new image acquisition framework is proposed for NanoSat-based optical Earth observation applications. The entire image acquisition and compression process is integrated into the photodetector array chip, so the output data of the chip are already compressed. A separate image compression unit is therefore no longer needed, and the power, volume and weight that a common onboard compression unit would consume are largely saved. The advantages of the proposed framework are: image acquisition and compression are combined into a single step; it can easily be built in a CMOS architecture; a quick view can be provided without reconstruction; and, for a given compression ratio, the reconstructed image quality is much better than that of CS-based methods. The framework holds promise for wide use in the future.

  7. Space imaging infrared optical guidance for autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Akiyama, Akira; Kobayashi, Nobuaki; Mutoh, Eiichiro; Kumagai, Hideo; Yamada, Hirofumi; Ishii, Hiromitsu

    2008-08-01

    We have developed Space Imaging Infrared Optical Guidance for an Autonomous Ground Vehicle, based on an uncooled infrared camera and a focusing technique, to detect objects to be evaded and to set the drive path. For this purpose we built a servomotor drive system to control the focus of the infrared camera lens. To determine the best focus position we use auto-focus image processing based on the four-term Daubechies wavelet transform. The determined best focus position is then converted into the distance to the object. We built an aluminium-frame ground vehicle, 900 mm long and 800 mm wide, to carry the auto-focus infrared unit. The vehicle mounts an Ackermann front steering system and a rear motor drive system. To confirm the guidance ability of the system, we carried out experiments on the detection of an actual car on the road and of the roadside wall by the infrared auto-focus unit. As a result, the auto-focus image processing based on the Daubechies wavelet transform detects the best-focus image clearly and gives the depth of the object from the infrared camera unit.
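The focus criterion above can be illustrated with a minimal sketch: the energy of the four-term Daubechies (D4) detail coefficients of an image line rises as the image sharpens, so the lens position maximizing that energy approximates best focus. The filter taps below are the standard D4 coefficients; the signals, the scoring and the comparison are illustrative assumptions, not the authors' implementation.

```python
import math

# Standard four-tap Daubechies (D4) low-pass coefficients.
_S3 = math.sqrt(3.0)
_H = [(1 + _S3) / (4 * math.sqrt(2.0)),
      (3 + _S3) / (4 * math.sqrt(2.0)),
      (3 - _S3) / (4 * math.sqrt(2.0)),
      (1 - _S3) / (4 * math.sqrt(2.0))]
# Matching high-pass (wavelet) filter: g[k] = (-1)^k * h[3 - k].
_G = [_H[3], -_H[2], _H[1], -_H[0]]

def d4_detail_energy(line):
    """Sharpness score: energy of the D4 detail coefficients along one line."""
    energy = 0.0
    for i in range(0, len(line) - 3, 2):          # stride-2 filtering
        d = sum(_G[k] * line[i + k] for k in range(4))
        energy += d * d
    return energy
```

Because D4 annihilates constant and linear trends, a defocused (smeared) edge yields far less detail energy than a sharp one, which is what makes the score usable as an auto-focus criterion.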

  8. KSC-2013-4439

    NASA Image and Video Library

    2013-12-19

    VANDENBERG AIR FORCE BASE, Calif. -- A solid rocket motor is rolled into the Solid Rocket Motor Processing Facility at Vandenberg Air Force Base in California. The motor will be attached to the United Launch Alliance Delta II rocket slated to launch NASA's Orbiting Carbon Observatory-2, or OCO-2, spacecraft in July 2014. OCO-2 will collect precise global measurements of carbon dioxide in the Earth's atmosphere. Scientists will analyze this data to improve our understanding of the natural processes and human activities that regulate the abundance and distribution of this important atmospheric gas. Photo credit: NASA/Randy Beaudoin

  9. Consumer preferences and values as an integral key to evidence-based practice.

    PubMed

    Melnyk, Bernadette Mazurek; Fineout-Overholt, Ellen

    2006-01-01

    Although evidence-based practice (EBP) integrates the best evidence from well-designed studies with a clinician's expertise and patient preferences and values, most of what is emphasized in books and reports on EBP is the 5-step EBP process. However, the consideration of patient values and preferences in making clinical decisions is essential to deliver the highest quality of care. This article briefly reviews the status of EBP in the United States, describes the ARCC mentorship model, and highlights how to engage consumers in the EBP process.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodenbeck, Christopher T.; Young, Derek; Chou, Tina

    A combined radar and telemetry system is described. The combined radar and telemetry system includes a processing unit that executes instructions, where the instructions define a radar waveform and a telemetry waveform. The processor outputs a digital baseband signal based upon the instructions, where the digital baseband signal is based upon the radar waveform and the telemetry waveform. A radar and telemetry circuit transmits, simultaneously, a radar signal and telemetry signal based upon the digital baseband signal.
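The processing unit's role can be illustrated with a toy baseband generator. Everything below is an assumption for illustration: a linear-FM chirp stands in for the radar waveform, BPSK for the telemetry waveform, and a 50/50 amplitude split combines them; the patent's actual waveforms and combining rule are not specified here.

```python
import math

def combined_baseband(bits, sps, f0, bw, fs):
    """Digital baseband mixing a linear-FM radar chirp with BPSK telemetry.

    bits : telemetry bit sequence; sps : samples per bit
    f0, bw : chirp start frequency and sweep bandwidth (Hz); fs : sample rate
    """
    n = len(bits) * sps
    duration = n / fs
    samples = []
    for i in range(n):
        t = i / fs
        # Instantaneous chirp phase sweeps f0 -> f0 + bw over the burst.
        chirp = math.cos(2 * math.pi * (f0 * t + (bw / (2 * duration)) * t * t))
        symbol = 1.0 if bits[i // sps] else -1.0   # BPSK telemetry symbol
        samples.append(0.5 * chirp + 0.5 * symbol)  # assumed 50/50 split
    return samples
```

A real implementation would upconvert this digital baseband in the radar and telemetry circuit, as the abstract describes; the sketch only shows how one sample stream can carry both waveforms.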

  11. Implementation of a Portable Personal EKG Signal Monitoring System

    NASA Astrophysics Data System (ADS)

    Tan, Tan-Hsu; Chang, Ching-Su; Chen, Yung-Fu; Lee, Cheng

    This research develops a portable personal EKG signal monitoring system to help patients monitor their EKG signals in real time and avoid the occurrence of tragedies. The system is built from two main units: a signal processing unit and a monitoring and evaluation unit. The first unit consists of an EKG signal sensor, a signal amplifier, a digitalization circuit, and related control circuits. The second unit is a software tool developed on an embedded Linux platform (called CSA). Experimental results indicate that the proposed system has practical potential for users in health monitoring. It is demonstrated to be more convenient and more portable than conventional PC-based EKG signal monitoring systems. Furthermore, all the application units embedded in the system are built from open source code, so no license fees are required for the operating system or applications. Thus, the building cost is much lower than that of traditional systems.

  12. Computer-aided boundary delineation of agricultural lands

    NASA Technical Reports Server (NTRS)

    Cheng, Thomas D.; Angelici, Gary L.; Slye, Robert E.; Ma, Matt

    1989-01-01

    The National Agricultural Statistics Service of the United States Department of Agriculture (USDA) presently uses labor-intensive aerial photographic interpretation techniques to divide large geographical areas into manageable-sized units for estimating domestic crop and livestock production. Prototype software, the computer-aided stratification (CAS) system, was developed to automate the procedure, and currently runs on a Sun-based image processing system. With a background display of LANDSAT Thematic Mapper and United States Geological Survey Digital Line Graph data, the operator uses a cursor to delineate agricultural areas, called sampling units, which are assigned to strata of land-use and land-cover types. The resultant stratified sampling units are used as input into subsequent USDA sampling procedures. As a test, three counties in Missouri were chosen for application of the CAS procedures. Subsequent analysis indicates that CAS was five times faster in creating sampling units than the manual techniques were.

  13. 15 CFR 971.209 - Processing outside the United States.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Processing outside the United States... Applications Contents § 971.209 Processing outside the United States. (a) Except as provided in this section... contravenes the overriding national interests of the United States. (b) If foreign processing is proposed, the...

  14. 15 CFR 971.209 - Processing outside the United States.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Processing outside the United States... Applications Contents § 971.209 Processing outside the United States. (a) Except as provided in this section... contravenes the overriding national interests of the United States. (b) If foreign processing is proposed, the...

  15. Reflective practices at the Security Council: Children and armed conflict and the three United Nations

    PubMed Central

    Bode, Ingvild

    2017-01-01

    The United Nations Security Council passed its first resolution on children in armed conflict in 1999, making it one of the oldest examples of Security Council engagement with a thematic mandate and leading to the creation of a dedicated working group in 2005. Existing theoretical accounts of the Security Council cannot account for the developing substance of the children and armed conflict agenda as they are macro-oriented and focus exclusively on states. I argue that Security Council decision-making on thematic mandates is a productive process whose outcomes are created by and through practices of actors across the three United Nations: member states (the first United Nations), United Nations officials (the second United Nations) and non-governmental organizations (the third United Nations). In presenting a practice-based, micro-oriented analysis of the children and armed conflict agenda, the article aims to deliver on the empirical promise of practice theories in International Relations. I make two contributions to practice-based understandings: first, I argue that actors across the three United Nations engage in reflective practices of a strategic or tactical nature to manage, arrange or create space in Security Council decision-making. Portraying practices as reflective rather than as only based on tacit knowledge highlights how actors may creatively adapt their practices to social situations. Second, I argue that particular individuals from the three United Nations are more likely to become recognized as competent performers of practices because of their personality, understood as plural socialization experiences. This adds varied individual agency to practice theories that, despite their micro-level interests, have focused on how agency is relationally constituted. PMID:29782586

  16. Reflective practices at the Security Council: Children and armed conflict and the three United Nations.

    PubMed

    Bode, Ingvild

    2018-06-01

    The United Nations Security Council passed its first resolution on children in armed conflict in 1999, making it one of the oldest examples of Security Council engagement with a thematic mandate and leading to the creation of a dedicated working group in 2005. Existing theoretical accounts of the Security Council cannot account for the developing substance of the children and armed conflict agenda as they are macro-oriented and focus exclusively on states. I argue that Security Council decision-making on thematic mandates is a productive process whose outcomes are created by and through practices of actors across the three United Nations: member states (the first United Nations), United Nations officials (the second United Nations) and non-governmental organizations (the third United Nations). In presenting a practice-based, micro-oriented analysis of the children and armed conflict agenda, the article aims to deliver on the empirical promise of practice theories in International Relations. I make two contributions to practice-based understandings: first, I argue that actors across the three United Nations engage in reflective practices of a strategic or tactical nature to manage, arrange or create space in Security Council decision-making. Portraying practices as reflective rather than as only based on tacit knowledge highlights how actors may creatively adapt their practices to social situations. Second, I argue that particular individuals from the three United Nations are more likely to become recognized as competent performers of practices because of their personality, understood as plural socialization experiences. This adds varied individual agency to practice theories that, despite their micro-level interests, have focused on how agency is relationally constituted.

  17. Design of experiments applications in bioprocessing: concepts and approach.

    PubMed

    Kumar, Vijesh; Bhalla, Akriti; Rathore, Anurag S

    2014-01-01

    Most biotechnology unit operations are complex in nature, with numerous process variables, feed material attributes, and raw material attributes that can have a significant impact on the performance of the process. A design of experiments (DOE)-based approach offers a solution to this conundrum and allows for efficient estimation of the main effects and the interactions with a minimal number of experiments. Numerous publications illustrate the application of DOE to the development of different bioprocessing unit operations. However, a systematic approach for evaluating the different DOE designs and choosing the optimal design for a given application has not yet been published. Through this work we have compared the I-optimal and D-optimal designs to the commonly used central composite and Box-Behnken designs for bioprocess applications. A systematic methodology is proposed for construction of the model and for precise prediction of the responses for three case studies involving some of the commonly used unit operations in downstream processing. Use of the Akaike information criterion for model selection has been examined and found to be suitable for the applications under consideration. © 2013 American Institute of Chemical Engineers.
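As a concrete reference point, a central composite design (one of the classical designs the study compares against I- and D-optimal designs) can be generated in a few lines. The run counts and the default axial distance below are illustrative choices, not values from the paper.

```python
from itertools import product

def central_composite(k, alpha=1.414, n_center=3):
    """Central composite design in coded units: 2^k factorial corner points,
    2k axial (star) points at +/- alpha, plus replicated center points."""
    runs = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    for axis in range(k):
        for a in (-alpha, alpha):
            point = [0.0] * k
            point[axis] = a
            runs.append(point)
    runs.extend([0.0] * k for _ in range(n_center))
    return runs
```

For k = 3 factors this gives 8 + 6 + 3 = 17 runs, enough to fit a full quadratic response surface; optimal (I-/D-) designs instead pick runs algorithmically to minimize prediction or parameter variance.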

  18. Graphics processing units in bioinformatics, computational biology and systems biology.

    PubMed

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

    Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software, and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools here reviewed is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.

  19. Complex clinical reasoning in the critical care unit - difficulties, pitfalls and adaptive strategies.

    PubMed

    Shaw, M; Singh, S

    2015-04-01

    Diagnostic error has implications for both clinical outcome and resource utilisation, and may often be traced to impaired data gathering, processing or synthesis because of the influence of cognitive bias. Factors inherent to the intensive/acute care environment afford multiple additional opportunities for such errors to occur. This article illustrates many of these with reference to a case encountered on our intensive care unit. Strategies to improve completeness of data gathering, processing and synthesis in the acute care environment are critically appraised in the context of early detection and amelioration of cognitive bias. These include reflection, targeted simulation training and the integration of social media and IT based aids in complex diagnostic processes. A framework which can be quickly and easily employed in a variety of clinical environments is then presented. © 2015 John Wiley & Sons Ltd.

  20. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    PubMed

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the life cycle concept. These applications are shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
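The MC-over-IPM idea can be sketched as sampling process parameters and chaining unit-operation transfer functions to estimate the probability of meeting the specification. The two transfer functions, the parameter distributions and the specification limit below are invented placeholders, not the CMO's model.

```python
import random

# Hypothetical transfer functions for two stacked unit operations.
def fermentation(titer_feed, ph):
    # Assumed: yield degrades linearly with pH deviation from 7.0.
    return titer_feed * (1.0 - 0.05 * abs(ph - 7.0))

def chromatography(load):
    return 0.9 * load            # assumed constant step yield

def mc_capability(n=10000, spec_min=0.8, seed=1):
    """Fraction of simulated batches whose final CQA meets the spec limit."""
    random.seed(seed)
    ok = 0
    for _ in range(n):
        titer = random.gauss(1.0, 0.05)   # sampled process parameters
        ph = random.gauss(7.0, 0.2)
        cqa = chromatography(fermentation(titer, ph))
        ok += cqa >= spec_min
    return ok / n
```

Because the unit operations are chained, parameter variation in an early step propagates to the final CQA, which is exactly the interaction effect the IPM is meant to capture; the estimated in-spec fraction is a simple capability proxy.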

  1. Index cost estimate based BIM method - Computational example for sports fields

    NASA Astrophysics Data System (ADS)

    Zima, Krzysztof

    2017-07-01

    The paper presents an example of cost estimation in the early phase of a project. A fragment of a relational database containing solutions, descriptions, construction-object geometry and unit costs of sports facilities is shown. Calculations by the Index Cost Estimate Based BIM method, using Case Based Reasoning, are presented as well. The article presents local and global similarity measurement and an example of a BIM-based quantity takeoff process. The outcome of the cost calculations based on the CBR method is presented as the final result.
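The local/global similarity step of Case Based Reasoning can be sketched as a weighted nearest-case lookup. The case base, the single "area" attribute, its weight and its range are all invented for illustration; the paper's database holds richer solution descriptions and geometry.

```python
def local_sim(a, b, value_range):
    """Local similarity of one numeric attribute, scaled to [0, 1]."""
    return 1.0 - min(abs(a - b) / value_range, 1.0)

def global_sim(case, query, weights, ranges):
    """Global similarity: weighted average of the local similarities."""
    total = sum(weights.values())
    return sum(w * local_sim(case[k], query[k], ranges[k])
               for k, w in weights.items()) / total

# Hypothetical past sports-field cases: area (m2) and unit cost per m2.
cases = [
    {"area": 800.0,  "cost": 120.0},
    {"area": 1500.0, "cost": 95.0},
    {"area": 5000.0, "cost": 70.0},
]
weights = {"area": 1.0}
ranges = {"area": 5000.0}

def estimate_cost(query):
    """Retrieve the most similar past case and reuse its unit cost."""
    best = max(cases, key=lambda c: global_sim(c, query, weights, ranges))
    return best["cost"] * query["area"]
```

In the paper's workflow the query attributes would come from a BIM-based quantity takeoff; here a query is just a dictionary of the same coded attributes.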

  2. Decontamination of Nuclear Liquid Wastes Status of CEA and AREVA R and D: Application to Fukushima Waste Waters - 12312

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fournel, B.; Barre, Y.; Lepeytre, C.

    2012-07-01

    Liquid waste decontamination processes are mainly based on two techniques: bulk processes and the so-called cartridge processes. The first technique has been developed for the French nuclear fuel reprocessing industry since the 1960s in Marcoule and La Hague. It is a proven and mature technology which has been successfully and quickly implemented by AREVA at the Fukushima site for the processing of contaminated waters. The second technique, involving cartridge processes, offers new opportunities for the use of innovative adsorbents. The AREVA process developed for Fukushima and some results obtained on site are presented, as well as laboratory-scale results obtained in CEA laboratories. Examples of new adsorbent developments for liquid waste decontamination are also given. A chemical process unit based on the co-precipitation technique has been successfully and quickly implemented by AREVA at the Fukushima site for the processing of contaminated waters. The asset of this technique is its ability to process large volumes in a continuous mode. Several chemical products can be used to address specific radioelements such as Cs, Sr and Ru. Its drawback is the production of sludge (about 1% of the initial liquid volume). CEA has developed strategies to model the co-precipitation phenomena in order, first, to minimize the quantity of added chemical reactants and, second, to minimize the size of the co-precipitation units. We are on the way to designing compact units that could be mobilized very quickly and efficiently in an accidental situation. Addressing the problem of sludge conditioning, cementation appears to be a very attractive solution. The Fukushima accident has focused attention on optimizations that should be taken into account in future studies: to better account for non-typical aqueous matrices like seawater; to enlarge the spectrum of radioelements that can be efficiently processed, especially short-lived radioelements that are usually less present in standard effluents from nuclear activities; and to develop reversible solid adsorbents for cartridge-type applications in order to minimize wastes. (authors)

  3. Realisation of all 16 Boolean logic functions in a single magnetoresistance memory cell.

    PubMed

    Gao, Shuang; Yang, Guang; Cui, Bin; Wang, Shouguo; Zeng, Fei; Song, Cheng; Pan, Feng

    2016-07-07

    Stateful logic circuits based on next-generation nonvolatile memories, such as magnetoresistance random access memory (MRAM), promise to break the long-standing von Neumann bottleneck in state-of-the-art data processing devices. For the successful commercialisation of stateful logic circuits, a critical step is realizing the best use of a single memory cell to perform logic functions. In this work, we propose a method for implementing all 16 Boolean logic functions in a single MRAM cell, namely a magnetoresistance (MR) unit. Based on our experimental results, we conclude that this method is applicable to any MR unit with a double-hump-like hysteresis loop, especially pseudo-spin-valve magnetic tunnel junctions with a high MR ratio. Moreover, after simply reversing the correspondence between voltage signals and output logic values, this method could also be applicable to any MR unit with a double-pit-like hysteresis loop. These results may provide a helpful solution for the final commercialisation of MRAM-based stateful logic circuits in the near future.
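Logically, "all 16 Boolean logic functions" means the 16 possible 4-bit truth tables over two inputs, which can be enumerated compactly. The encoding below is a generic software illustration of that count, not the voltage-sequence scheme used to program the MR cell.

```python
def boolean_fn(index):
    """Return the two-input Boolean function whose 4-bit truth table is
    `index`: bit position a*2 + b holds the output for inputs (a, b)."""
    return lambda a, b: (index >> (a * 2 + b)) & 1

# All 16 functions, from the constant-0 table (0b0000) to constant-1 (0b1111).
ALL_16 = [boolean_fn(i) for i in range(16)]

# Familiar members, named by their truth tables.
AND = boolean_fn(0b1000)   # output 1 only for (1, 1)
OR  = boolean_fn(0b1110)   # output 0 only for (0, 0)
XOR = boolean_fn(0b0110)   # output 1 for (0, 1) and (1, 0)
```

In the MRAM cell the same 16 tables are realized physically, by mapping input logic values to write voltages and reading the resulting resistance state; the enumeration above only shows why exactly 16 functions exist.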

  4. Process Features in Writing: Internal Structure and Incremental Value over Product Features. Research Report. ETS RR-15-27

    ERIC Educational Resources Information Center

    Zhang, Mo; Deane, Paul

    2015-01-01

    In educational measurement contexts, essays have been evaluated and formative feedback has been given based on the end product. In this study, we used a large sample collected from middle school students in the United States to investigate the factor structure of the writing process features gathered from keystroke logs and the association of that…

  5. National Standards for United States History: Exploring the American Experience. Grades 5-12. Expanded Edition. Including Examples of Student Achievement.

    ERIC Educational Resources Information Center

    Crabtree, Charlotte; Nash, Gary B.

    Developed through a broad-based national consensus building process, the National History Standards project has involved working toward agreement both on the larger purposes of history in the school curriculum and on the more specific history understandings and thinking processes that all students should have equal opportunity to acquire over 12…

  6. Execution of a parallel edge-based Navier-Stokes solver on commodity graphics processor units

    NASA Astrophysics Data System (ADS)

    Corral, Roque; Gisbert, Fernando; Pueblas, Jesus

    2017-02-01

    The implementation of an edge-based three-dimensional Reynolds-averaged Navier-Stokes solver for unstructured grids able to run on multiple graphics processing units (GPUs) is presented. Loops over edges, which are the most time-consuming part of the solver, have been written to exploit the massively parallel capabilities of GPUs. Non-blocking communications between parallel processes and between the GPU and the central processing unit (CPU) have been used to enhance code scalability. The code is written in a mixture of C++ and OpenCL to allow the execution of the source code on GPUs. The Message Passing Interface (MPI) library is used to allow the parallel execution of the solver on multiple GPUs. A comparative study of the solver's parallel performance is carried out using a cluster of CPUs and another of GPUs. It is shown that a single GPU is up to 64 times faster than a single CPU core. The parallel scalability of the solver is mainly degraded by the loss of computing efficiency of the GPU when the size of the case decreases. However, for large enough grid sizes, the scalability is strongly improved. A cluster featuring commodity GPUs and a high-bandwidth network is ten times less costly and consumes 33% less energy than a CPU-based cluster with equivalent computational power.
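The edge loop that dominates the solver's run time has a simple serial form, sketched below with a placeholder flux. The scatter into both end nodes is exactly what a GPU port must handle carefully (for example via edge colouring or atomics) to avoid write conflicts; the flux expression and data layout are illustrative assumptions, not the paper's scheme.

```python
def edge_based_residual(edges, weights, u):
    """Edge-based flux accumulation: visit each edge once and scatter the
    flux to its two end nodes (the loop structure offloaded to the GPU)."""
    res = [0.0] * len(u)
    for (i, j), w in zip(edges, weights):
        flux = w * (u[j] - u[i])   # placeholder symmetric flux
        res[i] += flux             # scatter: concurrent GPU threads must
        res[j] -= flux             # not write the same node simultaneously
    return res
```

Because every flux is added to one node and subtracted from the other, the scheme is conservative: the residuals sum to zero over the whole mesh, a useful sanity check for any parallel port.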

  7. Elimination of waste: creation of a successful Lean colonoscopy program at an academic medical center.

    PubMed

    Damle, Aneel; Andrew, Nathan; Kaur, Shubjeet; Orquiola, Alan; Alavi, Karim; Steele, Scott R; Maykel, Justin

    2016-07-01

    Lean processes involve streamlining methods and maximizing efficiency. Well established in the manufacturing industry, they are increasingly being applied to health care. The objective of this study was to determine the feasibility and effectiveness of applying Lean principles to an academic medical center colonoscopy unit. Lean process improvement involved training endoscopy personnel, observing patients, mapping the value stream, analyzing patient flow, designing and implementing new processes, and finally re-observing the process. Our primary endpoint was total colonoscopy time (minutes from check-in to discharge), with secondary endpoints of individual segment times and unit colonoscopy capacity. A total of 217 patients were included (November 2013-May 2014), with 107 pre-Lean and 110 post-Lean intervention. Pre-Lean total colonoscopy time was 134 min. After implementation of the Lean process, mean colonoscopy time decreased by 10% to 121 min (p = 0.01). The three steps of the process affected by the Lean intervention (time to achieve adequate sedation, time to recovery, and time to discharge) decreased from 3.7 to 2.4 min (p < 0.01), 4.0 to 3.4 min (p = 0.09), and 41.2 to 35.4 min (p = 0.05), respectively. Overall, unit capacity increased from 39.6 colonoscopies per day to 43.6. Post-Lean patient satisfaction surveys showed an average score of 4.5/5.0 (n = 73) regarding waiting time, 4.9/5.0 (n = 60) regarding how favorably this experience compared with prior colonoscopies, and 4.9/5.0 (n = 74) regarding professionalism of staff. One hundred percent of respondents (n = 69) stated they would recommend our institution to a friend for colonoscopy. With no additional resources, a single Lean process improvement cycle increased the productivity and capacity of our colonoscopy unit. We expect this to result in increased patient access and revenue while maintaining patient satisfaction. We believe these results are widely generalizable to other colonoscopy units as well as to other process-based interventions in health care.

  8. A noninvasive technique for real-time detection of bruises in apple surface based on machine vision

    NASA Astrophysics Data System (ADS)

    Zhao, Juan; Peng, Yankun; Dhakal, Sagar; Zhang, Leilei; Sasao, Akira

    2013-05-01

Apples are among the most widely consumed fruits in daily life. However, because apples bruise easily, and because bruising strongly affects taste and export value, apple quality has to be assessed before the fruit reaches the consumer. This study aimed to develop a hardware and software unit for real-time detection of apple bruises based on machine vision technology. The hardware unit consisted of a light shield with two monochrome cameras installed at different angles, an LED light source to illuminate the sample, and sensors at the entrance of the box to signal the position of the sample. A Graphical User Interface (GUI) was developed on the VS2010 platform to control the overall hardware and display the image processing results. The hardware-software system was developed to acquire images of three samples from each camera and display the image processing results in real time. An image processing algorithm was developed in OpenCV and C++. The software is able to control the hardware system and to classify each apple into one of two grades based on the presence or absence of surface bruises 5 mm in size. The experimental results are promising, and with further modification the system can be applied to industrial production in the near future.
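The grading step this record describes can be sketched in a few lines. This is a hedged illustration, not the authors' OpenCV/C++ code: the threshold values, minimum bruise area, function names, and the tiny synthetic images are all hypothetical, and a real system would operate on camera frames.

```python
# Illustrative sketch of threshold-based bruise grading (hypothetical
# values; not the authors' implementation). A grayscale image is a 2D
# list of 0-255 intensities; bruises appear as dark regions.

def bruise_pixels(image, dark_threshold=60):
    """Collect coordinates of pixels darker than the threshold."""
    return [(r, c)
            for r, row in enumerate(image)
            for c, val in enumerate(row)
            if val < dark_threshold]

def classify_apple(image, dark_threshold=60, min_area=4):
    """Grade 'bruised' if enough dark pixels are found, else 'clean'."""
    dark = bruise_pixels(image, dark_threshold)
    return "bruised" if len(dark) >= min_area else "clean"

# Tiny synthetic frames: a uniformly bright apple vs. one with a dark patch.
clean = [[200] * 5 for _ in range(5)]
bruised = [row[:] for row in clean]
for r in range(1, 3):
    for c in range(1, 4):
        bruised[r][c] = 30  # 2x3 dark patch simulating a bruise
```

A production system would add connected-component analysis so that scattered dark pixels (e.g. the calyx or stem well) are not counted as one bruise.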

  9. Design of overload vehicle monitoring and response system based on DSP

    NASA Astrophysics Data System (ADS)

    Yu, Yan; Liu, Yiheng; Zhao, Xuefeng

    2014-03-01

Overloaded vehicles cause far more damage to road surfaces than regular ones. Many roads and bridges are equipped with structural health monitoring (SHM) systems to provide early warning of such damage and to evaluate the safety of the road or bridge. However, because of the complex nature of SHM systems, they are expensive to manufacture, difficult to install, and not well suited for ordinary bridges and roads. Against this background, this paper designs a compact DSP-based structural health monitoring system that is highly integrated, low-power, easy to install, and inexpensive to manufacture. The designed system is made up of sensor arrays, a charge amplifier module, a DSP processing unit, an overload alarm system, and a module for estimating damage to the road and bridge structure. The signals from the sensor arrays pass through the charge amplifier. The DSP processing unit receives the amplified signals, determines whether they indicate an overload, and converts the analog variables into digital ones compatible with the back-end digital circuit for further processing. The system also acts against overweight vehicles by capturing an image of the license plate, raising an alarm, and transferring the collected pressure data to a remote data center for further monitoring analysis using the rain-flow counting method.
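The overload decision the DSP makes can be illustrated with a minimal sketch. This is an assumption for illustration only, not the deployed firmware: the calibration gain, the legal weight limit, and the function names are hypothetical.

```python
# Illustrative overload check (hypothetical calibration; not the paper's
# DSP code): convert amplified axle-pressure ADC readings to a weight
# estimate and compare it against a legal limit.

ADC_GAIN = 0.05       # tonnes per ADC count (hypothetical charge-amp calibration)
LEGAL_LIMIT_T = 40.0  # hypothetical gross-weight limit in tonnes

def estimate_weight(adc_counts):
    """Sum per-axle ADC readings and scale to an estimated gross weight."""
    return sum(adc_counts) * ADC_GAIN

def is_overload(adc_counts):
    """True when the weight estimate exceeds the legal limit."""
    return estimate_weight(adc_counts) > LEGAL_LIMIT_T
```

In the described system a positive result would trigger the camera, the alarm, and the upload of the pressure record to the data center.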

  10. Analysis and optimization of solid oxide fuel cell-based auxiliary power units using a generic zero-dimensional fuel cell model

    NASA Astrophysics Data System (ADS)

    Göll, S.; Samsun, R. C.; Peters, R.

    Fuel-cell-based auxiliary power units can help to reduce fuel consumption and emissions in transportation. For this application, the combination of solid oxide fuel cells (SOFCs) with upstream fuel processing by autothermal reforming (ATR) is seen as a highly favorable configuration. Notwithstanding the necessity to improve each single component, an optimized architecture of the fuel cell system as a whole must be achieved. To enable model-based analyses, a system-level approach is proposed in which the fuel cell system is modeled as a multi-stage thermo-chemical process using the "flowsheeting" environment PRO/II™. Therein, the SOFC stack and the ATR are characterized entirely by corresponding thermodynamic processes together with global performance parameters. The developed model is then used to achieve an optimal system layout by comparing different system architectures. A system with anode and cathode off-gas recycling was identified to have the highest electric system efficiency. Taking this system as a basis, the potential for further performance enhancement was evaluated by varying four parameters characterizing different system components. Using methods from the design and analysis of experiments, the effects of these parameters and of their interactions were quantified, leading to an overall optimized system with encouraging performance data.

  11. GRay: A Massively Parallel GPU-based Code for Ray Tracing in Relativistic Spacetimes

    NASA Astrophysics Data System (ADS)

    Chan, Chi-kwan; Psaltis, Dimitrios; Özel, Feryal

    2013-11-01

    We introduce GRay, a massively parallel integrator designed to trace the trajectories of billions of photons in a curved spacetime. This graphics-processing-unit (GPU)-based integrator employs the stream processing paradigm, is implemented in CUDA C/C++, and runs on nVidia graphics cards. The peak performance of GRay using single-precision floating-point arithmetic on a single GPU exceeds 300 GFLOP (or 1 ns per photon per time step). For a realistic problem, where the peak performance cannot be reached, GRay is two orders of magnitude faster than existing central-processing-unit-based ray-tracing codes. This performance enhancement allows more effective searches of large parameter spaces when comparing theoretical predictions of images, spectra, and light curves from the vicinities of compact objects to observations. GRay can also perform on-the-fly ray tracing within general relativistic magnetohydrodynamic algorithms that simulate accretion flows around compact objects. Making use of this algorithm, we calculate the properties of the shadows of Kerr black holes and the photon rings that surround them. We also provide accurate fitting formulae of their dependencies on black hole spin and observer inclination, which can be used to interpret upcoming observations of the black holes at the center of the Milky Way, as well as M87, with the Event Horizon Telescope.

  12. GBOOST: a GPU-based tool for detecting gene-gene interactions in genome-wide case control studies.

    PubMed

    Yung, Ling Sing; Yang, Can; Wan, Xiang; Yu, Weichuan

    2011-05-01

Collecting millions of genetic variations is feasible with advanced genotyping technology. With a huge amount of genetic variation data in hand, developing efficient algorithms to carry out gene-gene interaction analysis in a timely manner has become one of the key problems in genome-wide association studies (GWAS). Boolean operation-based screening and testing (BOOST), a recent method in GWAS, completes gene-gene interaction analysis in 2.5 days on a desktop computer. Compared with central processing units (CPUs), graphics processing units (GPUs) are highly parallel hardware and provide massive computing resources. We were, therefore, motivated to use GPUs to further speed up the analysis of gene-gene interactions. We implement the BOOST method on a GPU framework and name it GBOOST. GBOOST achieves a 40-fold speedup compared with BOOST. It completes the analysis of the Wellcome Trust Case Control Consortium Type 2 Diabetes (WTCCC T2D) genome data within 1.34 h on a desktop computer equipped with an Nvidia GeForce GTX 285 display card. The GBOOST code is available at http://bioinformatics.ust.hk/BOOST.html#GBOOST.
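The Boolean-operation trick behind BOOST/GBOOST can be shown in miniature. This sketch is inferred from the method's description, not the released code: genotypes are packed into bitsets, one bit per sample, so a joint genotype count for a SNP pair reduces to a bitwise AND plus a population count, an operation that parallelizes well on GPUs.

```python
# Illustrative sketch of Boolean genotype counting (assumed from the
# BOOST description, not the GBOOST source). Genotype codes are 0/1/2.

def to_bitsets(genotypes):
    """Pack a list of genotype codes into three per-genotype bitsets."""
    bitsets = [0, 0, 0]
    for i, g in enumerate(genotypes):
        bitsets[g] |= 1 << i
    return bitsets

def joint_counts(snp_a, snp_b):
    """3x3 contingency table of joint genotype counts for two SNPs."""
    a, b = to_bitsets(snp_a), to_bitsets(snp_b)
    return [[bin(a[i] & b[j]).count("1") for j in range(3)]
            for i in range(3)]

# Five samples genotyped at two SNPs.
table = joint_counts([0, 1, 2, 1, 0], [1, 1, 2, 0, 0])
```

The contingency table is then fed to the interaction test; the speedup comes from replacing per-sample loops with word-wide bit operations.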

  13. Conceptual design of the X-IFU Instrument Control Unit on board the ESA Athena mission

    NASA Astrophysics Data System (ADS)

    Corcione, L.; Ligori, S.; Capobianco, V.; Bonino, D.; Valenziano, L.; Guizzo, G. P.

    2016-07-01

Athena is one of the L-class missions selected in the ESA Cosmic Vision 2015-2025 program for the science theme of the Hot and Energetic Universe. The Athena model payload includes the X-ray Integral Field Unit (X-IFU), an advanced actively shielded X-ray microcalorimeter spectrometer for high-spectral-resolution imaging, utilizing cooled Transition Edge Sensors. This paper describes the preliminary architecture of the Instrument Control Unit (ICU), which is aimed at operating all of the X-IFU's subsystems, as well as at implementing the main functional interfaces of the instrument with the S/C control unit. The ICU functions include TC/TM management with the S/C, science data formatting and transmission to the S/C Mass Memory, housekeeping data handling, time distribution for synchronous operations, and the management of the X-IFU components (i.e., CryoCoolers, Filter Wheel, Detector Readout Electronics Event Processor, Power Distribution Unit). The baseline ICU implementation for the phase-A study foresees the use of standard, space-qualified components from the heritage of past and current space missions (e.g., Gaia, Euclid), currently encompassing a Leon2/Leon3-based CPU board and standard space-qualified interfaces for the exchange of commands and data between the ICU and the X-IFU subsystems. An alternative architecture, arranged around a powerful PowerPC-based CPU, is also briefly presented, with the aim of endowing the system with enhanced hardware resources and processing power for handling control and science data processing tasks not yet defined at this stage of the mission study.

  14. Task-Based Learning and Content and Language Integrated Learning Materials Design: Process and Product

    ERIC Educational Resources Information Center

    Moore, Pat; Lorenzo, Francisco

    2015-01-01

    Content and language integrated learning (CLIL) represents an increasingly popular approach to bilingual education in Europe. In this article, we describe and discuss a project which, in response to teachers' pleas for materials, led to the production of a significant bank of task-based primary and secondary CLIL units for three L2s (English,…

  15. Sexual Health Transformation among College Student Educators in an Arts-Based HIV Prevention Intervention: A Qualitative Cross-Site Analysis

    ERIC Educational Resources Information Center

    Dunlap, Shannon L.; Taboada, Arianna; Merino, Yesenia; Heitfeld, Suzanne; Gordon, Robert J.; Gere, David; Lightfoot, Alexandra F.

    2017-01-01

    We examined the sexual health change process experienced by 26 college student sexual health educators from three geographic regions of the United States who participated in a multisite arts-based sexual health prevention program. We conducted eight focus groups and used a phenomenological approach to analyze data. We drew from social cognitive…

  16. Human Rights Literacy: Moving towards Rights-Based Education and Transformative Action through Understandings of Dignity, Equality and Freedom

    ERIC Educational Resources Information Center

    Becker, Anne; de Wet, Annamagriet; van Vollenhoven, Willie

    2015-01-01

    The twentieth century has been characterised by the proliferation of human rights in the discursive practices of the United Nations (Baxi, 1997). In this article, we explore the continual process of rights-based education towards transformative action, and an open and democratic society, as dependent upon the facilitation of human rights literacy…

  17. An Investigation of Spoken Brazilian Portuguese: Part I, Technical Report. Final Report.

    ERIC Educational Resources Information Center

    Hutchins, John A.

    This final report of a study which developed a working corpus of spoken and written Portuguese from which syntactical studies could be conducted includes computer-processed data on which the findings and analysis are based. A data base, obtained by taping some 487 conversations between Brazil and the United States, serves as the corpus from which…

  18. Logistics Force Planner Assistant (Log Planner)

    DTIC Science & Technology

    1989-09-01

The system is implemented on an MS-DOS based microcomputer, using the "Knowledge Pro" software tool. ... A microcomputer-based knowledge system was developed and successfully demonstrated. Four modules of information are ... combat service support (CSS) units planning process to Army Staff logistics planners. Personnel newly assigned to logistics planning need an

  19. GPU-based acceleration of computations in nonlinear finite element deformation analysis.

    PubMed

    Mafi, Ramin; Sirouspour, Shahin

    2014-03-01

The physics of deformation for biological soft tissue is best described by nonlinear continuum mechanics-based models, which can then be discretized by the FEM for a numerical solution. However, the computational complexity of such models has limited their use in applications requiring real-time or fast response. In this work, we propose a graphics processing unit-based implementation of the FEM using implicit time integration for dynamic nonlinear deformation analysis. This is the most general formulation of the deformation analysis: it is valid for large deformations and strains and can account for material nonlinearities. The data-parallel nature and the intense arithmetic computations of nonlinear FEM equations make them particularly suitable for implementation on a parallel computing platform such as the graphics processing unit. In this work, we present and compare two different designs based on the matrix-free and conventional preconditioned conjugate gradients algorithms for solving the FEM equations arising in deformation analysis. The speedup achieved with the proposed parallel implementations of the algorithms will be instrumental in the development of advanced surgical simulators and medical image registration methods involving soft-tissue deformation. Copyright © 2013 John Wiley & Sons, Ltd.
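The conjugate gradients family mentioned above can be sketched (without preconditioning) as follows. This is a generic illustration of the solver, under the assumption of a symmetric positive-definite system supplied as a matrix-vector product, not the paper's GPU implementation.

```python
# Minimal conjugate-gradient sketch for the linear systems arising in
# implicit FEM time integration (illustrative only; the paper's designs
# add preconditioning and a GPU matrix-free operator).

def cg(matvec, b, tol=1e-10, max_iter=100):
    """Solve A x = b for SPD A given only as matvec(x)."""
    x = [0.0] * len(b)
    r = b[:]                      # residual b - A x, with x = 0
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        ap = matvec(p)
        alpha = rs / sum(pi * ai for pi, ai in zip(p, ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * ai for ri, ai in zip(r, ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# Small SPD test system: A = [[4, 1], [1, 3]], b = [1, 2].
A = [[4.0, 1.0], [1.0, 3.0]]
x = cg(lambda v: [sum(a * vi for a, vi in zip(row, v)) for row in A],
       [1.0, 2.0])
```

The matrix-free variant in the paper avoids assembling A at all, evaluating the product element-by-element on the GPU instead.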

  20. Distributed GPU Computing in GIScience

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.

    2013-12-01

Geoscientists strive to discover the principles and patterns hidden inside ever-growing Big Data. To achieve this objective, more capable computing resources are required to process, analyze, and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges posed by the increasing volume of datasets from domains such as social media, earth observation, and environmental sensing (Li et al., 2013). Meanwhile, CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, as GPU technology has matured in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to the traditional microprocessor, the modern GPU is a compelling alternative with outstanding parallel processing capability, cost-effectiveness, and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) on each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be combined into a virtual supercomputer to support CPU-based and GPU-based computing in a distributed computing environment; and 3) GPUs, as graphics-targeted devices, are used to greatly improve rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Key words: Geovisualization, GIScience, Spatiotemporal Studies. References: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. IEEE Transactions on Visualization and Computer Graphics, 9(3), 378-394. 2. Li, J., Jiang, Y., Yang, C., Huang, Q., & Rice, M. (2013). Visualizing 3D/4D environmental data using many-core graphics processing units (GPUs) and multi-core central processing units (CPUs). Computers & Geosciences, 59(9), 78-89. 3. Owens, J. D., Houston, M., Luebke, D., Green, S., Stone, J. E., & Phillips, J. C. (2008). GPU computing. Proceedings of the IEEE, 96(5), 879-899.

  1. (abstract) Topographic Signatures in Geology

    NASA Technical Reports Server (NTRS)

    Farr, Tom G.; Evans, Diane L.

    1996-01-01

Topographic information is required for many Earth Science investigations. For example, topography is an important element in regional and global geomorphic studies because it reflects the interplay between the climate-driven processes of erosion and the tectonic processes of uplift. A number of techniques have been developed to analyze digital topographic data, including Fourier texture analysis. A Fourier transform of the topography of an area allows the spatial frequency content of the topography to be analyzed. Band-pass filtering of the transform produces images representing the amplitude of different spatial wavelengths. These are then used in a multi-band classification to map units based on their spatial frequency content. The results using a radar image instead of digital topography showed good correspondence to a geologic map; however, brightness variations in the image unrelated to topography caused errors. An additional benefit of using Fourier band-pass images for the classification is that the textural signatures of the units are quantitative measures of the spatial characteristics of the units that may be used to map similar units in similar environments.
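A one-dimensional analogue of the band-pass amplitude measure might look like the sketch below. The profile, frequency-band indices, and function names are hypothetical; the authors' method operates on 2D topography and radar imagery.

```python
# Illustrative 1D analogue of Fourier band-pass texture analysis
# (assumed, not the authors' pipeline): take the DFT of an elevation
# profile and measure the amplitude within a spatial-frequency band.

import cmath
import math

def dft_amplitudes(signal):
    """Normalized DFT magnitudes of a real-valued profile."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n)]

def band_energy(signal, k_lo, k_hi):
    """Summed DFT amplitude for frequency indices k_lo..k_hi (inclusive)."""
    amps = dft_amplitudes(signal)
    return sum(amps[k_lo:k_hi + 1])

# A profile dominated by frequency index 4 concentrates its amplitude there.
n = 32
profile = [math.sin(2 * math.pi * 4 * t / n) for t in range(n)]
```

In the 2D case, each band-pass amplitude image becomes one band of the multi-band classification input.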

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharyya, D.; Turton, R.; Zitney, S.

In this presentation, development of a plant-wide dynamic model of an advanced Integrated Gasification Combined Cycle (IGCC) plant with CO2 capture will be discussed. The IGCC reference plant generates 640 MWe of net power using Illinois No.6 coal as the feed. The plant includes an entrained, downflow, General Electric Energy (GEE) gasifier with a radiant syngas cooler (RSC), a two-stage water gas shift (WGS) conversion process, and two advanced 'F' class combustion turbines partially integrated with an elevated-pressure air separation unit (ASU). A subcritical steam cycle is considered for heat recovery steam generation. Syngas is selectively cleaned by a SELEXOL acid gas removal (AGR) process. Sulfur is recovered using a two-train Claus unit with tail gas recycle to the AGR. A multistage intercooled compressor is used for compressing CO2 to the pressure required for sequestration. The plant-wide steady-state and dynamic IGCC simulations have been generated using the Aspen Plus® and Aspen Plus Dynamics® process simulators, respectively. The model is based on the Case 2 IGCC configuration detailed in the study available on the NETL website. The GEE gasifier is represented with a restricted-equilibrium reactor model in which the temperature approach to equilibrium for individual reactions can be modified based on experimental data. In this radiant-only configuration, the syngas from the RSC is quenched in a scrubber. The blackwater from the scrubber bottom is further cleaned in the blackwater treatment plant. The cleaned water is returned to the scrubber and also used for slurry preparation. The acid gas from the sour water stripper (SWS) is sent to the Claus plant. The syngas from the scrubber passes through a sour shift process. 
The WGS reactors are modeled as adiabatic plug flow reactors with rigorous kinetics based on the mid-life activity of the shift catalyst. The SELEXOL unit consists of the H2S and CO2 absorbers, which are designed to meet the stringent environmental limits and the requirements of other associated units. The model also considers the stripper for recovering H2S, which is sent as a feed to a split-flow Claus unit. The tail gas from the Claus unit is recycled to the SELEXOL unit. The cleaned syngas is sent to the GE 7FB gas turbine. This turbine is modeled as per published data in the literature. Diluent N2 from the elevated-pressure ASU is used to reduce NOx formation. The heat recovery steam generator (HRSG) is modeled by considering the generation of high-pressure, intermediate-pressure, and low-pressure steam. All of the vessels, reactors, heat exchangers, and columns have been sized. The basic IGCC process control structure has been synthesized following standard guidelines and existing practices. The steady-state simulation is solved in sequential-modular mode in Aspen Plus® and consists of more than 300 unit operations, 33 design specs, and 16 calculator blocks. The equation-oriented dynamic simulation consists of more than 100,000 equations solved using a multi-step Gear's integrator in Aspen Plus Dynamics®. The challenges faced in solving the dynamic model and key transient results from this dynamic model will also be discussed.

  3. The effect of side-chain substitution and hot processing on diketopyrrolopyrrole-based polymers for organic solar cells.

    PubMed

    Heintges, Gaël H L; Leenaers, Pieter J; Janssen, René A J

    2017-07-14

    The effects of cold and hot processing on the performance of polymer-fullerene solar cells are investigated for diketopyrrolopyrrole (DPP) based polymers that were specifically designed and synthesized to exhibit a strong temperature-dependent aggregation in solution. The polymers, consisting of alternating DPP and oligothiophene units, are substituted with linear and second position branched alkyl side chains. For the polymer-fullerene blends that can be processed at room temperature, hot processing does not enhance the power conversion efficiencies compared to cold processing because the increased solubility at elevated temperatures results in the formation of wider polymer fibres that reduce charge generation. Instead, hot processing seems to be advantageous when cold processing is not possible due to a limited solubility at room temperature. The resulting morphologies are consistent with a nucleation-growth mechanism for polymer fibres during drying of the films.

  4. How reliable are odour assessments?

    PubMed

    Bokowa, A; Beukes, J A

    2012-01-01

This paper demonstrates the differences found in odour test results when odour sampling is performed at the same sources by two different consultants. By examining two case studies, this paper highlights that the difference between the results can be significant. Both studies are based on odour sampling programs determining the odour removal efficiency of odour control units installed at two different facilities: a pet food facility and an oil/grease recycling facility. The first study is based on odour measurements at the inlet and outlet of the unit installed by Applied Plasma Physics AS at the pet food facility. Odour assessments were performed by two separate consultants at the same time. The second study is based on testing the odour removal effectiveness of two units, a scrubber and a biofilter, at an oil/grease recycling facility. During this study, two odour sampling programs were performed by two consultants at different times, but under the same process conditions. This paper shows how varying results can play a role in choosing adequate odour control technologies. The final results suggest that although an odour control unit may appear to be insufficient, it may actually be successful at removing the odours.
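The removal-efficiency figure both studies measure is conventionally computed from inlet and outlet odour concentrations; the formula below reflects standard practice rather than anything stated in the abstract, and the concentration values are hypothetical.

```python
# Illustrative calculation (standard-practice formula, assumed rather
# than quoted from the paper): odour removal efficiency of a control
# unit from inlet and outlet concentrations in odour units per m^3.

def removal_efficiency(inlet_ou, outlet_ou):
    """Percentage of odour removed between the unit inlet and outlet."""
    return 100.0 * (inlet_ou - outlet_ou) / inlet_ou
```

The paper's point is that two consultants can measure different inlet and outlet values at the same source, so the computed efficiency, and therefore the apparent adequacy of the unit, can differ between assessments.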

  5. Distributed and recoverable digital control system

    NASA Technical Reports Server (NTRS)

    Stange, Kent (Inventor); Hess, Richard (Inventor); Kelley, Gerald B (Inventor); Rogers, Randy (Inventor)

    2010-01-01

    A real-time multi-tasking digital control system with rapid recovery capability is disclosed. The control system includes a plurality of computing units comprising a plurality of redundant processing units, with each of the processing units configured to generate one or more redundant control commands. One or more internal monitors are employed for detecting data errors in the control commands. One or more recovery triggers are provided for initiating rapid recovery of a processing unit if data errors are detected. The control system also includes a plurality of actuator control units each in operative communication with the computing units. The actuator control units are configured to initiate a rapid recovery if data errors are detected in one or more of the processing units. A plurality of smart actuators communicates with the actuator control units, and a plurality of redundant sensors communicates with the computing units.
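One common way an internal monitor can compare redundant control commands is majority voting; the sketch below is an assumption for illustration, not the patented design, and the function names are hypothetical.

```python
# Illustrative majority-vote monitor (assumed, not the patent's logic):
# vote across redundant processing units' commands and flag any unit
# whose command disagrees, as a candidate for rapid recovery.

from collections import Counter

def vote(commands):
    """Return the most common command value among redundant units."""
    return Counter(commands).most_common(1)[0][0]

def recovery_triggers(commands):
    """Indices of processing units whose command disagrees with the vote."""
    voted = vote(commands)
    return [i for i, c in enumerate(commands) if c != voted]
```

In the described architecture, a non-empty trigger list would initiate the rapid-recovery process for the affected processing unit while the remaining units continue to drive the actuators.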

  6. Realigning Shared Governance With Magnet® and the Organization's Operating System to Achieve Clinical Excellence.

    PubMed

    Moreno, Janette V; Girard, Anita S; Foad, Wendy

    2018-03-01

In 2012, an academic medical center successfully overhauled a 15-year-old shared governance structure to align 6 house-wide and 30 unit-based councils with the new Magnet Recognition Program® and the organization's operating system, using the processes of Lean methodology. The redesign improved cross-council communication structures, facilitated effective shared decision-making processes, increased staff engagement, and improved clinical outcomes. The innovative structural and process elements of the new model are replicable in other health institutions.

  7. Analysis of the Appropriateness of the Use of Peltier Cells as Energy Sources.

    PubMed

    Hájovský, Radovan; Pieš, Martin; Richtár, Lukáš

    2016-05-25

The article describes the possibilities of using Peltier cells as an energy source to power telemetry units, which are used in large-scale monitoring systems as central units ensuring the collection of data from sensors, its processing, and its transmission to a database server. The article describes the various experiments that were carried out, their progress, and their results. Based on the evaluated experiments, the paper also discusses the suitability of various cell types depending on the temperature difference between the cold and hot sides.
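As a rough illustration of the underlying physics, the open-circuit voltage of a thermoelectric module scales with the temperature difference across it, and the matched-load power follows from that voltage. The module coefficients below are hypothetical, not values from the article.

```python
# Illustrative Peltier-as-generator estimate (hypothetical module
# constants; not the article's measurements).

SEEBECK_V_PER_K = 0.05  # module Seebeck coefficient, V/K (hypothetical)
INTERNAL_OHMS = 2.0     # module internal resistance, ohms (hypothetical)

def open_circuit_voltage(delta_t_k):
    """Open-circuit voltage for a hot/cold-side temperature difference."""
    return SEEBECK_V_PER_K * delta_t_k

def matched_load_power(delta_t_k):
    """Power into a load equal to the internal resistance: V^2 / (4 R)."""
    v = open_circuit_voltage(delta_t_k)
    return v * v / (4.0 * INTERNAL_OHMS)
```

This quadratic dependence on the temperature difference is why the article's comparison of cell types hinges on the achievable hot/cold-side gradient.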

  8. A model of interaction between anticorruption authority and corruption groups

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neverova, Elena G.; Malafeyef, Oleg A.

The paper provides a model of interaction between an anticorruption unit and corruption groups. The main policy functions of the anticorruption unit involve reducing corrupt practices in some entities through an optimal approach to resource allocation and an effective anticorruption policy. We develop a model based on a Markov decision process and use Howard's policy-improvement algorithm to solve for an optimal decision strategy. We examine the assumption that corruption groups retaliate against the anticorruption authority to protect themselves. This model was implemented as a stochastic game.
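Howard's policy-improvement algorithm alternates policy evaluation with greedy improvement until the policy is stable. The sketch below runs it on a hypothetical two-state toy MDP; the states, transitions, and rewards are illustrative and have nothing to do with the paper's corruption model.

```python
# Minimal sketch of Howard's policy iteration (toy data, not the
# paper's model). P[s][a] = list of (prob, next_state); R[s][a] = reward.

def policy_iteration(P, R, gamma=0.9, tol=1e-9):
    n = len(P)
    policy = [0] * n
    while True:
        # Policy evaluation by iterative sweeps.
        V = [0.0] * n
        while True:
            delta = 0.0
            for s in range(n):
                a = policy[s]
                v = R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a])
                delta = max(delta, abs(v - V[s]))
                V[s] = v
            if delta < tol:
                break
        # Greedy policy improvement against the evaluated values.
        new_policy = [
            max(range(len(P[s])),
                key=lambda a: R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a]))
            for s in range(n)
        ]
        if new_policy == policy:
            return policy, V
        policy = new_policy

# Toy MDP: in state 0, action 1 moves to the high-reward state 1.
P = [[[(1.0, 0)], [(1.0, 1)]], [[(1.0, 1)]]]
R = [[0.0, 1.0], [2.0]]
policy, V = policy_iteration(P, R)
```

For a finite MDP the loop terminates because each improvement step yields a strictly better policy until the optimum is reached.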

  9. When family enters the picture: the model of cultural negotiation and gendered experiences of Japanese academic sojourners in the United States.

    PubMed

    Sakamoto, Izumi

    2006-07-01

This grounded-theory study aimed to reconceptualize cultural adaptation processes from gender-role and family/couple perspectives while critically drawing on the acculturation and culture-and-self literatures. In-depth interviews with 34 Japanese academic sojourners (international students and scholars) and their spouses (a total of 50 interviews, with select longitudinal interviews) were conducted. The author earlier developed the Model of Cultural Negotiation (2001; 2006), capturing the uneven and cyclical processes of dealing with multiple cultural contexts. The current study further develops more tailored versions of this model, Family-Based (Couple-Based) Cultural Negotiation and Individual-Based Cultural Negotiation, highlighting the impacts of family/couple and gender roles, especially for female spouses. These conceptualizations afford a sophisticated understanding of the processes of culture.

  10. The implementation of unit-based perinatal mortality audit in perinatal cooperation units in the northern region of the Netherlands

    PubMed Central

    2012-01-01

Background Perinatal (mortality) audit can be considered a way to improve the care process for all pregnant women and their newborns by creating an opportunity to learn from unwanted events in the care process. In unit-based perinatal audit, the caregivers involved in cases that result in mortality are usually part of the audit group. This makes such an audit a delicate matter. Methods The purpose of this study was to implement unit-based perinatal mortality audit in all 15 perinatal cooperation units in the northern region of the Netherlands between September 2007 and March 2010. These units consist of hospital-based and independent community-based perinatal caregivers. The implementation strategy encompassed an information plan, an organization plan, and a training plan. The main outcomes are the number of participating perinatal cooperation units at the end of the project, the identified substandard factors (SSF), the actions to improve care, and the opinions of the participants. Results The perinatal mortality audit was implemented in all 15 perinatal cooperation units. 677 different caregivers analyzed 112 cases of perinatal mortality and identified 163 substandard factors. In 31% of cases the guidelines were not followed and in 23% care was not according to normal practice. In 28% of cases, the documentation was not in order, while in 13% of cases the communication between caregivers was insufficient. 442 actions to improve care were reported for ‘external cooperation’ (15%), ‘internal cooperation’ (17%), ‘practice organization’ (26%), ‘training and education’ (10%), and ‘medical performance’ (27%). Valued aspects of the audit meetings were: the multidisciplinary character (13%), the collective and non-judgmental search for substandard factors (21%), the perception of safety (13%), the motivation to reflect on one’s own professional performance (5%), and the inherent postgraduate education (10%). 
Conclusion Following our implementation strategy, the perinatal mortality audit has been successfully implemented in all 15 perinatal cooperation units. An important feature was our emphasis on the delicate character of the caregivers evaluating the care they provided. However, the actual implementation of the proposed actions for improving care is still a point of concern. PMID:22776712

  11. The implementation of unit-based perinatal mortality audit in perinatal cooperation units in the northern region of the Netherlands.

    PubMed

    van Diem, Mariet Th; Timmer, Albertus; Bergman, Klasien A; Bouman, Katelijne; van Egmond, Nico; Stant, Dennis A; Ulkeman, Lida H M; Veen, Wenda B; Erwich, Jan Jaap H M

    2012-07-09

Perinatal (mortality) audit can be considered a way to improve the care process for all pregnant women and their newborns by creating an opportunity to learn from unwanted events in the care process. In unit-based perinatal audit, the caregivers involved in cases that result in mortality are usually part of the audit group. This makes such an audit a delicate matter. The purpose of this study was to implement unit-based perinatal mortality audit in all 15 perinatal cooperation units in the northern region of the Netherlands between September 2007 and March 2010. These units consist of hospital-based and independent community-based perinatal caregivers. The implementation strategy encompassed an information plan, an organization plan, and a training plan. The main outcomes are the number of participating perinatal cooperation units at the end of the project, the identified substandard factors (SSF), the actions to improve care, and the opinions of the participants. The perinatal mortality audit was implemented in all 15 perinatal cooperation units. 677 different caregivers analyzed 112 cases of perinatal mortality and identified 163 substandard factors. In 31% of cases the guidelines were not followed and in 23% care was not according to normal practice. In 28% of cases, the documentation was not in order, while in 13% of cases the communication between caregivers was insufficient. 442 actions to improve care were reported for 'external cooperation' (15%), 'internal cooperation' (17%), 'practice organization' (26%), 'training and education' (10%), and 'medical performance' (27%). Valued aspects of the audit meetings were: the multidisciplinary character (13%), the collective and non-judgmental search for substandard factors (21%), the perception of safety (13%), the motivation to reflect on one's own professional performance (5%), and the inherent postgraduate education (10%). 
Following our implementation strategy, the perinatal mortality audit has been successfully implemented in all 15 perinatal cooperation units. An important feature was our emphasis on the delicate character of the caregivers evaluating the care they provided. However, the actual implementation of the proposed actions for improving care is still a point of concern.

  12. Environmental Engineering Unit Operations and Unit Processes Laboratory Manual.

    ERIC Educational Resources Information Center

    O'Connor, John T., Ed.

    This manual was prepared to stimulate the development of effective unit operations and unit processes laboratory courses in environmental engineering. Laboratory activities emphasizing physical operations and biological and chemical processes are designed for various educational and equipment levels. An introductory section reviews…

  13. 32 CFR 701.54 - Collection of fees and fee rates for technical data.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) Computer search is based on the total cost of the central processing unit, input-output devices, and memory... charge for office copy up to six images)—$3.50 Each additional image—$ .10 Each typewritten page—$3.50...

  14. 32 CFR 701.54 - Collection of fees and fee rates for technical data.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) Computer search is based on the total cost of the central processing unit, input-output devices, and memory... charge for office copy up to six images)—$3.50 Each additional image—$ .10 Each typewritten page—$3.50...

  15. Demonstration of ROV Based Underwater Electromagnetic Array Technology

    DTIC Science & Technology

    2016-03-01

    levels. In addition, South Florida experiences more hurricanes and tropical depressions than any other area in the United States. Storms are most...organisms and processes building reefs and islands of the Dry Tortugas: The Carnegie Dry Tortugas laboratory centennial celebrations (1905-2005

  16. Demonstration of ROV-Based Underwater Electromagnetic Array Technology

    DTIC Science & Technology

    2016-03-01

    levels. In addition, South Florida experiences more hurricanes and tropical depressions than any other area in the United States. Storms are most...organisms and processes building reefs and islands of the Dry Tortugas: The Carnegie Dry Tortugas laboratory centennial celebrations (1905-2005

  17. Benefits and challenges of using LCA to advance sustainable wasteand materials management

    EPA Science Inventory

    MSW management can be complex and involve many unit processes that vary with the needs of urban, suburban, and rural communities to safely manage waste and to optimize energy and resource recovery while considering local infrastructure and priorities.

  18. Clinical governance in practice: closing the loop with integrated audit systems.

    PubMed

    Taylor, L; Jones, S

    2006-04-01

    Clinical governance has been acknowledged as the driving force behind National Health Service (NHS) reform since the government white paper outlined a new style of NHS in the UK in 1997. The framework of clinical governance ensures that NHS organizations are accountable for continually improving the quality of their services and safeguarding high standards of care by creating an environment in which excellence in clinical care will develop. A major component of a clinical governance framework requires utilizing audit procedures, which assess the effectiveness of current systems and ultimately direct continual quality improvement. This paper describes the audit component of a local clinical governance framework designed for a unit based within an NHS trust, which has utilized a multidisciplinary approach to assess the effectiveness of a newly commissioned service and its impact on the residents and staff. The unit is a 12-bedded, low-secure-intensive rehabilitation unit for clients with severe and enduring mental illness. Using recognized and standardized psychometric outcome measures, information was collected on clinical symptoms, social functioning, social behaviour, quality of life, relationship quality with named nurses and medication side-effects. Additionally, confidential staff measures were included to assess levels of burnout, identify expressed emotion and assess staff perception of models of illness. The paper includes a comprehensive account of how managerial commitment, teaching processes and application of technology ensured prompt data collection and maintained the momentum through the audit timescale. Data analysis and presentation of data in both clinical reviews and in senior management meetings within the unit are discussed. Findings highlight the full integration of the audit system into the processes of the unit. 
    Clinically, the paper highlights the enhancement of the knowledge base of the client group and the influence on clinical decision-making processes and care delivery as a result of the audit. Brief clinical examples are given. In conclusion, the impact of the audit on unit strategy and organizational efficiency is discussed to highlight the importance of closing the audit loop and completing the cycle of clinical governance. The audit system has positive implications for replication in other services.

  19. Analysis of very-high-resolution Galileo images of Europa: Implications for small-scale structure and surface evolution

    NASA Astrophysics Data System (ADS)

    Leonard, E. J.; Pappalardo, R. T.; Yin, A.; Prockter, L. M.; Patthoff, D. A.

    2014-12-01

    The Galileo Solid State Imager (SSI) recorded nine very high-resolution frames (8 at 12 m/pixel and 1 at 6 m/pixel) during the E12 flyby of Europa in Dec. 1997. To understand the implications for the small-scale structure and evolution of Europa, we mosaicked these frames (observations 12ESMOTTLE01 and 02, incidence ≈18°, emission ≈77°) into their regional context (part of observation 11ESREGMAP01, 220 m/pixel, incidence ≈74°, emission ≈23°), despite their very different viewing and lighting conditions. We created a map of geological units based on morphology, structure, and albedo along with stereoscopic images where the frames overlapped. The highly diverse units range from: high albedo sub-parallel ridge and grooved terrain; to variegated-albedo hummocky terrain; to low albedo and relatively smooth terrain. We classified and analyzed the diverse units solely based on the high-resolution image mosaic, prior to comparison to the context image, to obtain an in-depth look at possible surface evolution and underlying formational processes. We infer that some of these units represent different stages and forms of resurfacing, including cryovolcanic and tectonic resurfacing. However, significant morphological variation among units in the region indicates that there are different degrees of resurfacing at work. We have created candidate morphological sequences that provide insight into the conversion of ridged plains to chaotic terrain—generally, a process of subduing formerly sharp features through tectonic modification and/or cryovolcanism. When the map of the high-resolution area is compared to the regional context, features that appear to be one unit at regional resolution are comprised of several distinct units at high resolution, and features that appear to be smooth in the context image are found to show distinct textures. 
    Moreover, in the context image, transitions from ridged units to disrupted units appear to be gradual; however, the high-resolution images reveal them to be abrupt, suggesting tectonic control of these boundaries. These discrepancies could have important implications for future landed exploration.

  20. A Glucose Biosensor Using CMOS Potentiostat and Vertically Aligned Carbon Nanofibers.

    PubMed

    Al Mamun, Khandaker A; Islam, Syed K; Hensley, Dale K; McFarlane, Nicole

    2016-08-01

    This paper reports a linear, low-power, and compact CMOS-based potentiostat for vertically aligned carbon nanofiber (VACNF) based amperometric glucose sensors. The CMOS-based potentiostat consists of a single-ended potential control unit, a low-noise common-gate difference-differential pair transimpedance amplifier, and a low-power VCO. The potentiostat current-measuring unit can detect electrochemical currents ranging from 500 nA to 7 [Formula: see text] from the VACNF working electrodes with a high degree of linearity. This current corresponds to a range of glucose concentrations, which depends on the fiber forest density. The potentiostat consumes 71.7 [Formula: see text] of power from a 1.8 V supply and occupies 0.017 [Formula: see text] of chip area realized in a 0.18 [Formula: see text] standard CMOS process.

  1. Acoustic reverse-time migration using GPU card and POSIX thread based on the adaptive optimal finite-difference scheme and the hybrid absorbing boundary condition

    NASA Astrophysics Data System (ADS)

    Cai, Xiaohui; Liu, Yang; Ren, Zhiming

    2018-06-01

    Reverse-time migration (RTM) is a powerful tool for imaging geologically complex structures such as steep-dip and subsalt features. However, its implementation is quite computationally expensive. Recently, as a low-cost solution, the graphic processing unit (GPU) was introduced to improve the efficiency of RTM. In this paper, we develop three ameliorative strategies to implement RTM on a GPU card. First, given the high accuracy and efficiency of the adaptive optimal finite-difference (FD) method based on least squares (LS) on the central processing unit (CPU), we study the optimal LS-based FD method on the GPU. Second, we extend the CPU-based hybrid absorbing boundary condition (ABC) to a GPU-based one by addressing two issues that arise when the former is moved to a GPU card: excessive run time and chaotic threads. Third, for large-scale data, a combined strategy of optimal checkpointing and efficient boundary storage is introduced to trade off memory against recomputation. To save communication time between host and disk, a portable operating system interface (POSIX) thread is utilized to engage another CPU core at the checkpoints. Applications of the three strategies to RTM on a GPU with the compute unified device architecture (CUDA) programming language demonstrate their efficiency and validity.
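    The checkpoint/POSIX-thread idea above, where a background thread on a spare CPU core absorbs the disk writes while the compute loop keeps running, can be sketched with Python's `threading` module (an illustrative analogue only; the function names and the toy "wavefield" are hypothetical, and the authors' implementation uses pthreads alongside CUDA):

```python
import os
import pickle
import queue
import tempfile
import threading

def checkpoint_writer(q, directory):
    """Consume (step, wavefield) checkpoints from the queue and write them
    to disk, so the main compute loop never blocks on file I/O."""
    while True:
        item = q.get()
        if item is None:              # sentinel: computation finished
            break
        step, wavefield = item
        with open(os.path.join(directory, f"ckpt_{step}.pkl"), "wb") as f:
            pickle.dump(wavefield, f)

def run_simulation(n_steps, ckpt_every, directory):
    """Time-step a (stand-in) wavefield while a second thread persists
    checkpoints in the background."""
    q = queue.Queue(maxsize=4)        # bounded: caps memory held by pending writes
    writer = threading.Thread(target=checkpoint_writer, args=(q, directory))
    writer.start()
    wavefield = [0.0] * 8             # stand-in for the propagated wavefield
    for step in range(n_steps):
        wavefield = [v + 1.0 for v in wavefield]   # stand-in for one FD time step
        if step % ckpt_every == 0:
            q.put((step, list(wavefield)))         # hand off a copy to the I/O thread
    q.put(None)                       # signal the writer to finish
    writer.join()
    return wavefield

ckpt_dir = tempfile.mkdtemp()
final = run_simulation(n_steps=10, ckpt_every=3, directory=ckpt_dir)
```

    The bounded queue is the design point: it limits how many checkpoints can be pending, so the compute loop throttles only when the disk genuinely falls behind.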

  2. Assessing the efficiency of different CSO positions based on network graph characteristics.

    PubMed

    Sitzenfrei, R; Urich, C; Möderl, M; Rauch, W

    2013-01-01

    The technical design of urban drainage systems comprises two major aspects: first, the spatial layout of the sewer system and, second, the pipe-sizing process. Usually, engineers determine the spatial layout of the sewer network manually, taking into account physical features and future planning scenarios. Before the pipe-sizing process starts, it is important to determine locations of possible weirs and combined sewer overflows (CSOs) based on, e.g., distance to receiving water bodies or to a wastewater treatment plant and available space for storage units. However, positions of CSOs are also determined by topological characteristics of the sewer networks. In order to better understand the impact of placement choices for CSOs and storage units in new systems, this work aims to determine case-unspecific, general rules. Therefore, based on numerous stochastically generated virtual alpine sewer systems of different sizes, it is investigated how choices for the placement of CSOs and storage units affect the pipe-sizing process (hence, also investment costs) and technical performance (CSO efficiency and flooding). Graph characteristics are used to describe the impact of the topological positions of these elements in the sewer networks. An evaluation of 2,000 different alpine combined sewer systems showed that, as expected, CSOs at more downstream positions in the network result in greater construction costs and better performance regarding CSO efficiency. Beyond a specific point (i.e. topological network position), no significant further increase in construction costs can be identified. Conversely, the flooding efficiency increases with more upstream positions of the CSOs. CSO and flooding efficiency are therefore in a trade-off conflict, and a compromise is required.
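    The role of topological position can be illustrated with a toy graph measure: the number of upstream nodes draining through a candidate CSO location. This is a hypothetical proxy built on the `networkx` library; the paper's actual graph characteristics may differ:

```python
import networkx as nx  # assumed dependency for this illustration

# Toy dendritic sewer network: edges point in the flow direction,
# ending at the single outlet (e.g., a wastewater treatment plant).
G = nx.DiGraph([
    ("a", "c"), ("b", "c"), ("c", "e"),
    ("d", "e"), ("e", "outlet"),
])

def upstream_load(graph, node):
    """Number of nodes whose flow passes through `node` -- a simple
    proxy for how far downstream a candidate CSO position sits."""
    return len(nx.ancestors(graph, node))

# A CSO at "e" drains more of the network than one at "c":
load_c = upstream_load(G, "c")   # nodes a and b
load_e = upstream_load(G, "e")   # nodes a, b, c and d
```

    On such a measure, "more downstream" positions (larger upstream load) correspond to the larger-pipe, higher-cost, higher-CSO-efficiency end of the trade-off described above.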

  3. Using the Six Sigma Process to Implement the Centers for Disease Control and Prevention Guideline for Hand Hygiene in 4 Intensive Care Units

    PubMed Central

    Eldridge, Noel E; Woods, Susan S; Bonello, Robert S; Clutter, Kay; Ellingson, LeAnn; Harris, Mary Ann; Livingston, Barbara K; Bagian, James P; Danko, Linda H; Dunn, Edward J; Parlier, Renee L; Pederson, Cheryl; Reichling, Kim J; Roselle, Gary A; Wright, Steven M

    2006-01-01

    BACKGROUND The Centers for Disease Control and Prevention (CDC) Guideline for Hand Hygiene in Health Care Settings was issued in 2002. In 2003, the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) established complying with the CDC Guideline as a National Patient Safety Goal for 2004. This goal has been maintained through 2006. The CDC's emphasis on the use of alcohol-based hand rubs (ABHRs) rather than soap and water was an opportunity to improve compliance, but the Guideline contained over 40 specific recommendations to implement. OBJECTIVE To use the Six Sigma process to examine hand hygiene practices and increase compliance with the CDC hand hygiene recommendations required by JCAHO. DESIGN Six Sigma Project with pre-post design. PARTICIPANTS Physicians, nurses, and other staff working in 4 intensive care units at 3 hospitals. MEASUREMENTS Observed compliance with 10 required hand hygiene practices, mass of ABHR used per month per 100 patient-days, and staff attitudes and perceptions regarding hand hygiene reported by questionnaire. RESULTS Observed compliance increased from 47% to 80%, based on over 4,000 total observations. The mass of ABHR used per 100 patient-days in 3 intensive care units (ICUs) increased by 97%, 94%, and 70%; increases were sustained for 9 months. Self-reported compliance using the questionnaire did not change. Staff reported increased use of ABHR and increased satisfaction with hand hygiene practices and products. CONCLUSIONS The Six Sigma process was effective for organizing the knowledge, opinions, and actions of a group of professionals to implement the CDC's evidence-based hand hygiene practices in 4 ICUs. Several tools were developed for widespread use. PMID:16637959

  4. Using the six sigma process to implement the Centers for Disease Control and Prevention Guideline for Hand Hygiene in 4 intensive care units.

    PubMed

    Eldridge, Noel E; Woods, Susan S; Bonello, Robert S; Clutter, Kay; Ellingson, Leann; Harris, Mary Ann; Livingston, Barbara K; Bagian, James P; Danko, Linda H; Dunn, Edward J; Parlier, Renee L; Pederson, Cheryl; Reichling, Kim J; Roselle, Gary A; Wright, Steven M

    2006-02-01

    The Centers for Disease Control and Prevention (CDC) Guideline for Hand Hygiene in Health Care Settings was issued in 2002. In 2003, the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) established complying with the CDC Guideline as a National Patient Safety Goal for 2004. This goal has been maintained through 2006. The CDC's emphasis on the use of alcohol-based hand rubs (ABHRs) rather than soap and water was an opportunity to improve compliance, but the Guideline contained over 40 specific recommendations to implement. To use the Six Sigma process to examine hand hygiene practices and increase compliance with the CDC hand hygiene recommendations required by JCAHO. Six Sigma Project with pre-post design. Physicians, nurses, and other staff working in 4 intensive care units at 3 hospitals. Observed compliance with 10 required hand hygiene practices, mass of ABHR used per month per 100 patient-days, and staff attitudes and perceptions regarding hand hygiene reported by questionnaire. Observed compliance increased from 47% to 80%, based on over 4,000 total observations. The mass of ABHR used per 100 patient-days in 3 intensive care units (ICUs) increased by 97%, 94%, and 70%; increases were sustained for 9 months. Self-reported compliance using the questionnaire did not change. Staff reported increased use of ABHR and increased satisfaction with hand hygiene practices and products. The Six Sigma process was effective for organizing the knowledge, opinions, and actions of a group of professionals to implement the CDC's evidence-based hand hygiene practices in 4 ICUs. Several tools were developed for widespread use.

  5. Breastfeeding protection, promotion, and support in the United States: a time to nudge, a time to measure.

    PubMed

    Pérez-Escamilla, Rafael; Chapman, Donna J

    2012-05-01

    Strong evidence-based advocacy efforts have now translated into high-level political support and concrete goals for improving breastfeeding outcomes among women in the United States. In spite of this, major challenges remain for promoting, supporting, and especially for protecting breastfeeding in the country. The goals of this commentary are to argue in favor of: A) changes in the default social and environmental systems that would allow women to implement their right to breastfeed their infants, and B) a multi-level and comprehensive monitoring system to measure process and outcome indicators in the country. Evidence-based commentary. Breastfeeding rates in the United States can improve based on a well-coordinated social marketing framework. This approach calls for innovative promotion through mass media, appropriate facility-based and community-based support (e.g., Baby Friendly Hospital Initiative, WIC-coordinated community-based peer counseling), and adequate protection for working women (e.g., longer paid maternity leave, breaks for breastfeeding or breast milk extraction during the working day) and women at large by adhering to and enforcing the WHO ethics Code for the Marketing of Breast Milk Substitutes. Sound infant feeding practices monitoring systems, which include WIC administrative food package data, are needed. Given the current high level of political support for improving breastfeeding in the United States, a window of opportunity has opened. Establishing breastfeeding as the social norm in the USA will take time, but the global experience indicates that it can be done.

  6. Global Perspective for Protecting Intellectual Property - Patenting in USA and Poland

    NASA Astrophysics Data System (ADS)

    Grebski, Michalene Eva; Wolniak, Radosław

    2018-06-01

    The paper addresses the different methods for protecting intellectual property in modern knowledge-based economies. Its focus is a comparison between the procedures for applying for patents in Poland and the United States. The comparison is made from the perspective of the cost of obtaining and maintaining a patent in Poland, the United States, and some other countries, as well as from the perspective of the procedures for applying for a patent in different countries under the Patent Cooperation Treaty. The paper also compares the time needed for processing a patent application. The low-cost, twelve-month provisional patent-pending protection available in the United States is also discussed. Finally, the paper provides some guidance and recommendations for conducting a patent search in order to validate the originality of an invention.

  7. Structure, form, and meaning in the mental lexicon: evidence from Arabic

    PubMed Central

    Boudelaa, Sami; Marslen-Wilson, William D.

    2015-01-01

    Does the organization of the mental lexicon reflect the combination of abstract underlying morphemic units or the concatenation of word-level phonological units? We address these fundamental issues in Arabic, a Semitic language where every surface form is potentially analyzable into abstract morphemic units – the word pattern and the root – and where this view contrasts with stem-based approaches, chiefly driven by linguistic considerations, in which neither roots nor word patterns play independent roles in word formation and lexical representation. Five cross-modal priming experiments examine the processing of morphologically complex forms in the three major subdivisions of the Arabic lexicon – deverbal nouns, verbs, and primitive nouns. The results demonstrate that root and word pattern morphemes function as abstract cognitive entities, operating independently of semantic factors and dissociable from possible phonological confounds, while stem-based approaches consistently fail to accommodate the basic psycholinguistic properties of the Arabic mental lexicon. PMID:26682237

  8. The United Nations programme on space applications: priority thematic areas

    NASA Astrophysics Data System (ADS)

    Haubold, H.

    The Third United Nations Conference on the Exploration and Peaceful Uses of Outer Space (UNISPACE III) was held in 1999 in an effort to identify the worldwide benefits of developing space science and technology, particularly in the developing nations. One of the main vehicles for implementing recommendations of UNISPACE III is the United Nations Programme on Space Applications of the Office for Outer Space Affairs at UN Headquarters in Vienna. Following a process of prioritization by Member States, the Programme focuses its activities on (i) knowledge-based themes such as space law and basic space science, (ii) application-based themes such as disaster management, natural resources management, environmental monitoring, and tele-health, and (iii) enabling technologies such as remote sensing satellites, communications satellites, global navigation satellite systems, and small satellites. Current activities of the Programme are reviewed. Further information is available at http://www.oosa.unvienna.org/sapidx.html

  9. Metalenses based on the non-parallel double-slit arrays

    NASA Astrophysics Data System (ADS)

    Shao, Hongyan; Chen, Chen; Wang, Jicheng; Pan, Liang; Sang, Tian

    2017-09-01

    Metalenses based on surface plasmon polaritons have played an indispensable role in the design of ultra-thin devices. The amplitude, phase, and polarization of electromagnetic waves can all be controlled easily by modifying the metasurface structures. Here we propose and investigate a new type of structure with Babinet-inverted nano-antennas that provides a series of unit-cells with phase-shifts covering 2π while ensuring almost the same transmittance. As a result, the wavefront can be manipulated by arraying the unit-cells in sequence. Metalenses with linear asymmetrical double-slit unit-cell arrays are designed, and the simulation results exhibit their excellent focusing characteristics, including single-focus and multi-focus lenses. Their small focus size and high numerical aperture make them stand out from traditional counterparts in precision sensing applications. We expect our designs to provide new insights into practical applications of metasurfaces in data storage, optical information processing, and optical holography.
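    The design principle quoted above, a library of unit-cells whose phase-shifts cover 2π, amounts to quantising a target lens phase profile onto the nearest available cell. A schematic numerical sketch using the generic hyperbolic metalens phase formula (all parameter values here are hypothetical, not the authors'):

```python
import numpy as np

wavelength = 0.98          # arbitrary units; hypothetical design values
focal_len = 10.0
n_cells = 8                # unit-cell library spanning 2*pi in steps of pi/4
cell_phases = 2 * np.pi * np.arange(n_cells) / n_cells

x = np.linspace(-5.0, 5.0, 101)   # positions along the lens aperture

# Hyperbolic phase profile that focuses a plane wave at distance focal_len:
target = (2 * np.pi / wavelength) * (focal_len - np.hypot(x, focal_len))
target %= 2 * np.pi

# Pick, at each position, the library cell with the closest (circular) phase.
diff = np.abs(target[:, None] - cell_phases[None, :])
circ = np.minimum(diff, 2 * np.pi - diff)
chosen = cell_phases[np.argmin(circ, axis=1)]

# Quantisation error is bounded by half the library's phase spacing.
max_err = np.max(np.min(circ, axis=1))
```

    With 8 cells the worst-case phase error is π/8; a denser unit-cell library trades fabrication complexity for a smoother wavefront.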

  10. MSuPDA: A memory efficient algorithm for sequence alignment.

    PubMed

    Khan, Mohammad Ibrahim; Kamal, Md Sarwar; Chowdhury, Linkon

    2015-01-16

    Space complexity is a million-dollar question in DNA sequence alignment. In this regard, MSuPDA (Memory Saving under Pushdown Automata) can help reduce the space occupied in computer memory. In our proposed process, an Anchor Seed (AS) is selected from a given data set of nucleotide base pairs for local sequence alignment. A Quick Splitting (QS) technique separates the Anchor Seed from all the DNA genome segments. The selected Anchor Seed is placed in the pushdown automaton's (PDA) input unit, while the whole DNA genome segments are placed on the PDA's stack. The Anchor Seed from the input unit is matched against the DNA genome segments from the stack of the PDA. Each match, mismatch, or indel of nucleotides is popped from the stack under the control of the PDA's control unit. During the POP operation, the stack frees the memory cell occupied by the nucleotide base pair.
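    The automaton described in the abstract can be sketched as follows. This is a minimal illustrative reading, not the authors' implementation; the function and event names are hypothetical:

```python
def msupda_match(anchor_seed, genome):
    """Illustrative stack-based matching in the spirit of MSuPDA: the
    genome segment sits on a stack, and each popped nucleotide is
    classified against the anchor seed, freeing its memory cell."""
    stack = list(reversed(genome))    # top of the stack = first nucleotide
    events = []
    for base in anchor_seed:
        if not stack:
            events.append(("indel", base, None))
            continue
        top = stack.pop()             # the POP releases the occupied cell
        events.append(("match" if base == top else "mismatch", base, top))
    return events, len(stack)         # unvisited genome remains on the stack

events, leftover = msupda_match("ACGT", "ACCTGA")
```

    The memory-saving claim maps to the `pop` call: once a nucleotide has been classified, its cell is no longer referenced and can be reclaimed.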

  11. Resource costing for multinational neurologic clinical trials: methods and results.

    PubMed

    Schulman, K; Burke, J; Drummond, M; Davies, L; Carlsson, P; Gruger, J; Harris, A; Lucioni, C; Gisbert, R; Llana, T; Tom, E; Bloom, B; Willke, R; Glick, H

    1998-11-01

    We present the results of a multinational resource costing study for a prospective economic evaluation of a new medical technology for treatment of subarachnoid hemorrhage within a clinical trial. The study describes a framework for the collection and analysis of international resource cost data that can contribute to a consistent and accurate intercountry estimation of cost. Of the 15 countries that participated in the clinical trial, we collected cost information in the following seven: Australia, France, Germany, the UK, Italy, Spain, and Sweden. The collection of cost data in these countries was structured through the use of worksheets to provide accurate and efficient cost reporting. We converted total average costs to average variable costs and then aggregated the data to develop study unit costs. When unit costs were unavailable, we developed an index table, based on a market-basket approach, to estimate unit costs. To estimate the cost of a given procedure, the market-basket estimation process required that cost information be available for at least one country. When cost information was unavailable in all countries for a given procedure, we estimated costs using a method based on physician-work and practice-expense resource-based relative value units. Finally, we converted study unit costs to a common currency using purchasing power parity measures. Through this costing exercise we developed a set of unit costs for patient services and per diem hospital services. We conclude by discussing the implications of our costing exercise and suggest guidelines to facilitate more effective multinational costing exercises.
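    The two estimation steps described above — the market-basket scaling for missing unit costs and the purchasing-power-parity conversion to a common currency — can be sketched as follows. All figures and country factors here are hypothetical, not study data:

```python
# Hypothetical PPP conversion factors to the common currency.
PPP_TO_COMMON = {"Germany": 0.9, "UK": 1.5}

def to_common_currency(unit_cost, country):
    """Convert a local-currency unit cost to the common currency via PPP."""
    return unit_cost * PPP_TO_COMMON[country]

def estimate_missing(known_costs, basket_index, target_country):
    """Market-basket approach: scale a known country's unit cost by the
    ratio of market-basket index values; requires cost data for the
    procedure in at least one country."""
    ref_country, ref_cost = next(iter(known_costs.items()))
    return ref_cost * basket_index[target_country] / basket_index[ref_country]

# A procedure costs 100 local units in Germany; no UK figure is available.
basket = {"Germany": 1.0, "UK": 1.2}
uk_cost = estimate_missing({"Germany": 100.0}, basket, "UK")
common = to_common_currency(uk_cost, "UK")
```

    This mirrors the paper's requirement that the market-basket estimate needs cost information in at least one country to anchor the ratio.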

  12. Joint Symposium on Compatibility of Plastics/Materials with Explosives Processing Explosives Held in Albuquerque, New Mexico on 15-17 May 1979.

    DTIC Science & Technology

    1979-05-01

    A. S. Tompa, REAL-TIME LOW TEMPERATURE NC AND PBX 9404 DECOMPOSITION STUDIES ... Dr. Hermann...the five major unit operations for multi-base cannon propellant: nitrocellulose dehydration, premixing, mixing, extruding and cutting. Throughout the...during facility design, a general process description is presented as follows: Thermal Dehydration: Nitrocellulose (NC) slurry is fed to a continuous

  13. Discussion of Carbon Emissions for Charging Hot Metal in EAF Steelmaking Process

    NASA Astrophysics Data System (ADS)

    Yang, Ling-zhi; Jiang, Tao; Li, Guang-hui; Guo, Yu-feng

    2017-07-01

    As the cost of hot metal falls with declining iron ore prices on the international market, more and more electric arc furnace (EAF) steelmaking enterprises use partial hot metal instead of scrap as raw material to reduce costs and power consumption. In this paper, carbon emissions per 1,000 kg of molten steel when charging hot metal in EAF steelmaking are studied. Based on material and energy balance calculations for the EAF, the results show that 146.9, 142.2, 137.0, and 130.8 kg/t of carbon emissions are produced at hot metal ratios of 0%, 30%, 50%, and 70%, respectively, while 143.4, 98.5, 65.81, and 31.5 kg/t are produced at the same ratios when gas waste heat utilization (coal gas production) is used for the EAF steelmaking unit process. However, carbon emissions increase with hot metal charging for the whole blast furnace-electric arc furnace (BF-EAF) steelmaking process. When the hot metal produced by the BF is in surplus, since the carbon monoxide content of the off-gas increases with hot metal charging, coal gas production can be used for waste heat utilization, which reduces carbon emissions in the EAF steelmaking unit process.
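    Tabulating the abstract's own figures shows that the benefit of the coal gas recovery route grows with the hot metal ratio — a minimal arithmetic check:

```python
# Figures quoted in the abstract: carbon emissions (kg per tonne of
# molten steel) at hot metal ratios of 0 %, 30 %, 50 % and 70 %.
hot_metal_ratios = [0.0, 0.3, 0.5, 0.7]
without_recovery = [146.9, 142.2, 137.0, 130.8]
with_gas_recovery = [143.4, 98.5, 65.81, 31.5]

# Reduction attributable to off-gas waste-heat (coal gas) utilisation:
savings = [a - b for a, b in zip(without_recovery, with_gas_recovery)]
```

    At a 70% hot metal ratio the recovery route saves roughly 99 kg/t, against only about 3.5 kg/t with no hot metal charged.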

  14. Accelerating Fibre Orientation Estimation from Diffusion Weighted Magnetic Resonance Imaging Using GPUs

    PubMed Central

    Hernández, Moisés; Guerrero, Ginés D.; Cecilia, José M.; García, José M.; Inuggi, Alberto; Jbabdi, Saad; Behrens, Timothy E. J.; Sotiropoulos, Stamatios N.

    2013-01-01

    With the performance of central processing units (CPUs) having effectively reached a limit, parallel processing offers an alternative for applications with high computational demands. Modern graphics processing units (GPUs) are massively parallel processors that can simultaneously execute thousands of light-weight processes. In this study, we propose and implement a parallel GPU-based design of a popular method that is used for the analysis of brain magnetic resonance imaging (MRI). More specifically, we are concerned with a model-based approach for extracting tissue structural information from diffusion-weighted (DW) MRI data. DW-MRI offers, through tractography approaches, the only way to study brain structural connectivity non-invasively and in vivo. We parallelise the Bayesian inference framework for the ball & stick model, as it is implemented in the tractography toolbox of the popular FSL software package (University of Oxford). For our implementation, we utilise the Compute Unified Device Architecture (CUDA) programming model. We show that the parameter estimation, performed through Markov Chain Monte Carlo (MCMC), is accelerated by at least two orders of magnitude when comparing a single GPU with the respective sequential single-core CPU version. We also illustrate similar speed-up factors (up to 120x) when comparing a multi-GPU with a multi-CPU implementation. PMID:23658616

  15. Okayama optical polarimetry and spectroscopy system (OOPS) II. Network-transparent control software.

    NASA Astrophysics Data System (ADS)

    Sasaki, T.; Kurakami, T.; Shimizu, Y.; Yutani, M.

    The control system of the OOPS (Okayama Optical Polarimetry and Spectroscopy system) is designed to integrate several instruments whose controllers are distributed over a network: the OOPS instrument, a CCD camera and data acquisition unit, the 91 cm telescope, an autoguider, a weather monitor, and the image display tool SAOimage. With the help of message-based communication, the control processes cooperate with related processes to perform an astronomical observation under the supervisory control of a scheduler process. A logger process collects status data from all the instruments and distributes them to related processes upon request. The software structure of each process is described.
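    The logger-process pattern described above — instruments post status reports, and any process can request a snapshot on demand — can be sketched as follows. The message names and fields are illustrative only; the actual OOPS protocol is not reproduced here:

```python
import threading

class StatusLogger:
    """Minimal message-based status logger: 'report' messages update the
    per-instrument status table; 'query' messages return a snapshot."""

    def __init__(self):
        self._status = {}
        self._lock = threading.Lock()   # controllers may post concurrently

    def handle(self, msg):
        """Dispatch one message dict with keys 'type', 'sender', 'payload'."""
        with self._lock:
            if msg["type"] == "report":
                self._status[msg["sender"]] = msg["payload"]
                return None
            if msg["type"] == "query":
                return dict(self._status)   # snapshot for the requester
        raise ValueError("unknown message type: %r" % msg["type"])

logger = StatusLogger()
logger.handle({"type": "report", "sender": "telescope", "payload": {"ra": 1.2}})
logger.handle({"type": "report", "sender": "ccd", "payload": {"temp_c": -100}})
snapshot = logger.handle({"type": "query", "sender": "scheduler"})
```

    Centralising status in one logger keeps the instrument controllers decoupled: the scheduler and display tools query the logger rather than each controller directly.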

  16. A Real-Time Capable Software-Defined Receiver Using GPU for Adaptive Anti-Jam GPS Sensors

    PubMed Central

    Seo, Jiwon; Chen, Yu-Hsuan; De Lorenzo, David S.; Lo, Sherman; Enge, Per; Akos, Dennis; Lee, Jiyun

    2011-01-01

    Due to their weak received signal power, Global Positioning System (GPS) signals are vulnerable to radio frequency interference. Adaptive beam and null steering of the gain pattern of a GPS antenna array can significantly increase the resistance of GPS sensors to signal interference and jamming. Since adaptive array processing requires intensive computational power, beamsteering GPS receivers were usually implemented using hardware such as field-programmable gate arrays (FPGAs). However, a software implementation using general-purpose processors is much more desirable because of its flexibility and cost effectiveness. This paper presents a GPS software-defined radio (SDR) with adaptive beamsteering capability for anti-jam applications. The GPS SDR design is based on an optimized desktop parallel processing architecture using a quad-core Central Processing Unit (CPU) coupled with a new generation Graphics Processing Unit (GPU) having massively parallel processors. This GPS SDR demonstrates sufficient computational capability to support a four-element antenna array and future GPS L5 signal processing in real time. After providing the details of our design and optimization schemes for future GPU-based GPS SDR developments, the jamming resistance of our GPS SDR under synthetic wideband jamming is presented. Since the GPS SDR uses commercial-off-the-shelf hardware and processors, it can be easily adopted in civil GPS applications requiring anti-jam capabilities. PMID:22164116
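    Adaptive null steering of the array gain pattern, as described above, can be sketched with the standard MVDR (minimum-variance distortionless response) beamformer — a generic textbook formulation assuming NumPy, not the paper's GPU implementation:

```python
import numpy as np

def steering(theta_deg, n_elem=4):
    """Array response vector for a plane wave arriving at angle theta
    on a uniform, half-wavelength-spaced linear array."""
    n = np.arange(n_elem)
    return np.exp(-1j * np.pi * n * np.sin(np.radians(theta_deg)))

a_sig = steering(0.0)        # desired GPS satellite direction
a_jam = steering(40.0)       # jammer direction

# Interference-plus-noise covariance: strong jammer plus unit noise.
R = 100.0 * np.outer(a_jam, a_jam.conj()) + np.eye(4)

# MVDR weights: minimise output power subject to unit gain toward the signal.
w = np.linalg.solve(R, a_sig)
w /= a_sig.conj() @ w        # normalise: distortionless response

gain_sig = abs(w.conj() @ a_sig)   # ~1 by construction
gain_jam = abs(w.conj() @ a_jam)   # deep null toward the jammer
```

    Computing such weights and applying them per sample across four antenna elements is the kind of intensive, data-parallel workload the CPU/GPU split in the paper is designed to absorb.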

  17. Heterogeneous real-time computing in radio astronomy

    NASA Astrophysics Data System (ADS)

    Ford, John M.; Demorest, Paul; Ransom, Scott

    2010-07-01

Modern computer architectures suited for general-purpose computing are often not the best choice for either I/O-bound or compute-bound problems. Sometimes the best choice is not a single architecture, but a combination that takes advantage of the best characteristics of several. This paper examines the tradeoffs between computer systems based on the ubiquitous x86 Central Processing Units (CPUs), Field Programmable Gate Array (FPGA) based signal processors, and Graphics Processing Units (GPUs). We show how a heterogeneous system can be produced that blends the best of each of these technologies into a real-time signal processing system. FPGAs tightly coupled to analog-to-digital converters connect the instrument to the telescope and supply the first level of computing in the system. These FPGAs are coupled to other FPGAs to continue providing highly efficient processing power. Data are then packaged and shipped over fast networks to a cluster of general-purpose computers equipped with GPUs, which are used for floating-point-intensive computation. Finally, the data are handled by the CPUs and written to disk, or processed further. Each element of the system has been chosen for its specific characteristics and the role it can play in creating a system that does the most for the least in terms of power, space, and money.
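The staged pipeline the abstract describes (FPGA front end, network transport, GPU floating-point stage, CPU output stage) can be sketched as a chain of queue-connected workers. This is an illustrative toy, not the paper's software; the stage names and the squared-sum "work" are stand-ins.

```python
import queue
import threading

def frontend(out_q, n_packets):
    """Stand-in for the FPGA stage: digitize and packetize samples."""
    for i in range(n_packets):
        out_q.put(list(range(i, i + 4)))
    out_q.put(None)                     # end-of-stream marker

def gpu_stage(in_q, out_q):
    """Stand-in for the GPU stage: floating-point-heavy reduction."""
    while (pkt := in_q.get()) is not None:
        out_q.put(sum(x * x for x in pkt))
    out_q.put(None)

def cpu_stage(in_q, results):
    """Stand-in for the CPU stage: 'write to disk' (collect results)."""
    while (r := in_q.get()) is not None:
        results.append(r)

q1, q2, results = queue.Queue(), queue.Queue(), []
threads = [
    threading.Thread(target=frontend, args=(q1, 3)),
    threading.Thread(target=gpu_stage, args=(q1, q2)),
    threading.Thread(target=cpu_stage, args=(q2, results)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)                          # one reduced value per packet
```

The queues play the role of the fast networks between stages: each stage runs at its own rate, and each technology handles only the part of the problem it is best at.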

  19. 40 CFR 63.100 - Applicability and designation of source.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... manufacturing process unit has two or more products that have the same maximum annual design capacity on a mass... subject to this subpart. (3) For chemical manufacturing process units that are designed and operated as... chemical manufacturing process units that are designed and operated as flexible operation units shall be...

  20. 15 CFR 971.427 - Processing outside the United States.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Processing outside the United States... outside the United States. If appropriate TCRs will incorporate provisions to implement the decision of the Administrator regarding the return of resources processed outside the United States, in accordance...
