Sample records for identifiable processing units

  1. A Comparative Study on Retirement Process in Korea, Germany, and the United States: Identifying Determinants of Retirement Process.

    PubMed

    Cho, Joonmo; Lee, Ayoung; Woo, Kwangho

    2016-10-01

    This study classifies the retirement process and empirically identifies the individual and institutional characteristics determining the retirement process of the aged in South Korea, Germany, and the United States. Using data from the Cross-National Equivalent File, we fit a multinomial logistic regression with individual factors, public pension, and an interaction term between occupation and education level. We found that in Germany, where the social support system is relatively well developed, the elderly with higher education levels were more likely to continue working after retirement, while in Korea the elderly with lower education levels tended, in almost all occupation sectors, to work off and on after retirement. In the United States, the public pension and the interaction terms had no statistically significant impact on work after retirement. In both Germany and Korea, receiving a higher pension decreased the probability of working after retirement, but the influence of a pension was much greater in Korea than in Germany. In South Korea, elderly workers with lower education levels tended to work off and on repeatedly because there is no proper security in either the labor market or the pension system. © The Author(s) 2016.
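
    The regression setup described above can be sketched in a few lines; this is a minimal sketch assuming hypothetical column names (retirement_path, age, educ, occupation, pension), not the actual CNEF variable names.

    ```python
    # Minimal sketch of the study's multinomial logit, with placeholder columns.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("cnef_extract.csv")  # hypothetical CNEF extract
    # Encode the categorical outcome (e.g., full exit / gradual exit / off-and-on work).
    df["path_code"] = df["retirement_path"].astype("category").cat.codes

    # Individual factors, public pension, and the occupation-by-education interaction.
    model = smf.mnlogit("path_code ~ age + C(educ) + pension + C(occupation):C(educ)", data=df)
    print(model.fit().summary())
    ```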

  2. Recurrent seascape units identify key ecological processes along the western Antarctic Peninsula.

    PubMed

    Bowman, Jeff S; Kavanaugh, Maria T; Doney, Scott C; Ducklow, Hugh W

    2018-04-10

    The western Antarctic Peninsula (WAP) is a bellwether of global climate change and natural laboratory for identifying interactions between climate and ecosystems. The Palmer Long-Term Ecological Research (LTER) project has collected data on key ecological and environmental processes along the WAP since 1993. To better understand how key ecological parameters are changing across space and time, we developed a novel seascape classification approach based on in situ temperature, salinity, chlorophyll a, nitrate + nitrite, phosphate, and silicate. We anticipate that this approach will be broadly applicable to other geographical areas. Through the application of self-organizing maps (SOMs), we identified eight recurrent seascape units (SUs) in these data. These SUs have strong fidelity to known regional water masses but with an additional layer of biogeochemical detail, allowing us to identify multiple distinct nutrient profiles in several water masses. To identify the temporal and spatial distribution of these SUs, we mapped them across the Palmer LTER sampling grid via objective mapping of the original parameters. Analysis of the abundance and distribution of SUs since 1993 suggests two year types characterized by the partitioning of chlorophyll a into SUs with different spatial characteristics. By developing generalized linear models for correlated, time-lagged external drivers, we conclude that early spring sea ice conditions exert a strong influence on the distribution of chlorophyll a and nutrients along the WAP, but not necessarily the total chlorophyll a inventory. Because the distribution and density of phytoplankton biomass can have an impact on biomass transfer to the upper trophic levels, these results highlight anticipated links between the WAP marine ecosystem and climate. © 2018 John Wiley & Sons Ltd.
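
    As a rough illustration of the SOM step, the following sketch uses the third-party MiniSom package; the 4x2 grid (eight nodes, one per seascape unit), the input file, and the standardization step are assumptions for illustration, not details taken from the paper.

    ```python
    # Illustrative SOM clustering of water-column samples into eight seascape units.
    import numpy as np
    from minisom import MiniSom                      # pip install minisom
    from sklearn.preprocessing import StandardScaler

    # Columns: temperature, salinity, chlorophyll a, nitrate+nitrite, phosphate, silicate.
    X = np.loadtxt("palmer_lter_samples.csv", delimiter=",", skiprows=1)  # hypothetical file
    X = StandardScaler().fit_transform(X)            # put the six variables on comparable scales

    som = MiniSom(4, 2, X.shape[1], sigma=1.0, learning_rate=0.5, random_seed=42)
    som.train_random(X, num_iteration=10_000)

    # Each sample is assigned the SOM node (seascape unit) that wins the competition.
    units = np.array([som.winner(x) for x in X])     # (row, col) node index per sample
    ```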

  3. Which functional unit to identify sustainable foods?

    PubMed

    Masset, Gabriel; Vieux, Florent; Darmon, Nicole

    2015-09-01

    In life-cycle assessment, the functional unit defines the unit for calculation of environmental indicators. The objective of the present study was to assess the influence of two functional units, 100 g and 100 kcal (420 kJ), on the associations between three dimensions for identifying sustainable foods, namely environmental impact (via greenhouse gas emissions (GHGE)), nutritional quality (using two distinct nutrient profiling systems) and price. GHGE and price data were collected for individual foods, and were each expressed per 100 g and per 100 kcal. Two nutrient profiling models, SAIN,LIM and UK Ofcom, were used to assess foods' nutritional quality. Spearman correlations were used to assess associations between variables. Sustainable foods were identified as those having more favourable values for all three dimensions. Data came from the French Individual and National Dietary Survey (INCA2), 2006-2007; the analysis covered 373 foods highly consumed in INCA2, accounting for 65% of the total energy intake of adult participants. When GHGE and price were expressed per 100 g, low-GHGE foods had a lower price and higher SAIN,LIM and Ofcom scores (r=0·59, -0·34 and -0·43, respectively), suggesting a compatibility between the three dimensions; 101 and 100 sustainable foods were identified with SAIN,LIM and Ofcom, respectively. When GHGE and price were expressed per 100 kcal, low-GHGE foods had a lower price but also lower SAIN,LIM and Ofcom scores (r=0·67, 0·51 and 0·47, respectively), suggesting that more environment-friendly foods were less expensive but also less healthy; thirty-four sustainable foods were identified with both SAIN,LIM and Ofcom. The choice of functional unit strongly influenced the compatibility between the sustainability dimensions and the identification of sustainable foods.
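
    To make the functional-unit effect concrete, here is a minimal sketch of the correlation step with hypothetical column names (ghge_per_100g, price_per_100g, kcal_per_100g, sain_lim), not the INCA2 variables.

    ```python
    # Re-express GHGE and price per 100 g and per 100 kcal, then correlate.
    import pandas as pd
    from scipy.stats import spearmanr

    foods = pd.read_csv("foods.csv")  # one row per food; columns are placeholders

    for unit in ("per_100g", "per_100kcal"):
        # value per 100 kcal = value per 100 g * (100 / kcal per 100 g)
        scale = 1.0 if unit == "per_100g" else 100.0 / foods["kcal_per_100g"]
        ghge = foods["ghge_per_100g"] * scale
        price = foods["price_per_100g"] * scale
        rho_price, _ = spearmanr(ghge, price)
        rho_quality, _ = spearmanr(ghge, foods["sain_lim"])
        print(unit, round(rho_price, 2), round(rho_quality, 2))
    ```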

  4. Associative list processing unit

    DOEpatents

    Hemmert, Karl Scott; Underwood, Keith D.

    2013-01-29

    An associative list processing unit and method comprising employing a plurality of prioritized cell blocks and permitting inserts to occur in a single clock cycle if all of the cell blocks are not full. Also, an associative list processing unit and method comprising employing a plurality of prioritized cell blocks and using a tree of prioritized multiplexers descending from the plurality of cell blocks.
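
    The patent describes hardware, but the insert rule it claims can be pictured with a behavioral sketch; the class below is an invented software analogue, not the patented circuit.

    ```python
    # Behavioral sketch: an insert succeeds (in hardware, within one clock cycle)
    # whenever at least one of the prioritized cell blocks has a free cell.
    class AssociativeList:
        def __init__(self, num_blocks=4, block_size=8):
            self.blocks = [[] for _ in range(num_blocks)]  # lower index = higher priority
            self.block_size = block_size

        def insert(self, key, value):
            for block in self.blocks:               # scan blocks in priority order
                if len(block) < self.block_size:    # any non-full block accepts the insert
                    block.append((key, value))
                    return True
            return False                            # all blocks full: insert rejected

        def match(self, key):
            # The hardware resolves multiple hits with a tree of prioritized
            # multiplexers; here we simply return the highest-priority match.
            for block in self.blocks:
                for k, v in block:
                    if k == key:
                        return v
            return None
    ```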

  5. Developing a Model for Identifying Students at Risk of Failure in a First Year Accounting Unit

    ERIC Educational Resources Information Center

    Smith, Malcolm; Therry, Len; Whale, Jacqui

    2012-01-01

    This paper reports on the process involved in attempting to build a predictive model capable of identifying students at risk of failure in a first year accounting unit in an Australian university. Identifying attributes that contribute to students being at risk can lead to the development of appropriate intervention strategies and support…

  6. The process of implementation of emergency care units in Brazil.

    PubMed

    O'Dwyer, Gisele; Konder, Mariana Teixeira; Reciputti, Luciano Pereira; Lopes, Mônica Guimarães Macau; Agostinho, Danielle Fernandes; Alves, Gabriel Farias

    2017-12-11

    To analyze the process of implementation of emergency care units in Brazil. We have carried out a documentary analysis, with interviews with twenty-four state urgency coordinators and a panel of experts. We have analyzed issues related to policy background and trajectory, players involved in the implementation, expansion process, advances, limits, and implementation difficulties, and state coordination capacity. We have used the theoretical framework of the analysis of the strategic conduct of the Giddens theory of structuration. Emergency care units have been implemented since 2007, initially in the Southeast region, and 446 emergency care units were present in all Brazilian regions in 2016. Currently, 620 emergency care units are under construction, which indicates an expectation of expansion. Federal funding was a strong driver for the implementation. The states have planned their emergency care units, but direct negotiation between municipalities and the Union has contributed to the significant number of emergency care units that have been built but do not operate. In relation to the urgency network, there is tension with hospitals because of the lack of beds in the country, which generates hospitalizations in the emergency care unit. The management of emergency care units is predominantly municipal, and most of the emergency care units are located outside the capitals and classified as Size III. The main challenges identified were underfunding and difficulty in recruiting physicians. The emergency care unit has the merit of having technological resources and being architecturally differentiated, but it will only succeed within an urgency network. Federal induction has generated contradictory responses, since not all states consider the emergency care unit a priority. The strengthening of state management has been identified as a challenge for the implementation of the urgency network.

  7. The process of implementation of emergency care units in Brazil

    PubMed Central

    O'Dwyer, Gisele; Konder, Mariana Teixeira; Reciputti, Luciano Pereira; Lopes, Mônica Guimarães Macau; Agostinho, Danielle Fernandes; Alves, Gabriel Farias

    2017-01-01

    OBJECTIVE To analyze the process of implementation of emergency care units in Brazil. METHODS We have carried out a documentary analysis, with interviews with twenty-four state urgency coordinators and a panel of experts. We have analyzed issues related to policy background and trajectory, players involved in the implementation, expansion process, advances, limits, and implementation difficulties, and state coordination capacity. We have used the theoretical framework of the analysis of the strategic conduct of the Giddens theory of structuration. RESULTS Emergency care units have been implemented since 2007, initially in the Southeast region, and 446 emergency care units were present in all Brazilian regions in 2016. Currently, 620 emergency care units are under construction, which indicates an expectation of expansion. Federal funding was a strong driver for the implementation. The states have planned their emergency care units, but direct negotiation between municipalities and the Union has contributed to the significant number of emergency care units that have been built but do not operate. In relation to the urgency network, there is tension with hospitals because of the lack of beds in the country, which generates hospitalizations in the emergency care unit. The management of emergency care units is predominantly municipal, and most of the emergency care units are located outside the capitals and classified as Size III. The main challenges identified were underfunding and difficulty in recruiting physicians. CONCLUSIONS The emergency care unit has the merit of having technological resources and being architecturally differentiated, but it will only succeed within an urgency network. Federal induction has generated contradictory responses, since not all states consider the emergency care unit a priority. The strengthening of state management has been identified as a challenge for the implementation of the urgency network.

  8. Environmental Engineering Unit Operations and Unit Processes Laboratory Manual.

    ERIC Educational Resources Information Center

    O'Connor, John T., Ed.

    This manual was prepared for the purpose of stimulating the development of effective unit operations and unit processes laboratory courses in environmental engineering. Laboratory activities emphasizing physical operations and biological and chemical processes are designed for various educational and equipment levels. An introductory section reviews…

  9. Identifying Corridors among Large Protected Areas in the United States.

    PubMed

    Belote, R Travis; Dietz, Matthew S; McRae, Brad H; Theobald, David M; McClure, Meredith L; Irwin, G Hugh; McKinley, Peter S; Gage, Josh A; Aplet, Gregory H

    2016-01-01

    Conservation scientists emphasize the importance of maintaining a connected network of protected areas to prevent ecosystems and populations from becoming isolated, reduce the risk of extinction, and ultimately sustain biodiversity. Keeping protected areas connected in a network is increasingly recognized as a conservation priority in the current era of rapid climate change. Models that identify suitable linkages between core areas have been used to prioritize potentially important corridors for maintaining functional connectivity. Here, we identify the most "natural" (i.e., least human-modified) corridors between large protected areas in the contiguous United States. We aggregated results from multiple connectivity models to develop a composite map of corridors reflecting agreement of models run under different assumptions about how human modification of land may influence connectivity. To identify which land units are most important for sustaining structural connectivity, we used the composite map of corridors to evaluate connectivity priorities in two ways: (1) among land units outside of our pool of large core protected areas and (2) among units administratively protected as Inventoried Roadless (IRAs) or Wilderness Study Areas (WSAs). Corridor values varied substantially among classes of "unprotected" non-core land units, and land units of high connectivity value and priority represent diverse ownerships and existing levels of protection. We provide a ranking of IRAs and WSAs that should be prioritized for additional protection to maintain minimal human modification. Our results provide a coarse-scale assessment of connectivity priorities for maintaining a connected network of protected areas.

  10. Identifying Corridors among Large Protected Areas in the United States

    PubMed Central

    Belote, R. Travis; Dietz, Matthew S.; McRae, Brad H.; Theobald, David M.; McClure, Meredith L.; Irwin, G. Hugh; McKinley, Peter S.; Gage, Josh A.; Aplet, Gregory H.

    2016-01-01

    Conservation scientists emphasize the importance of maintaining a connected network of protected areas to prevent ecosystems and populations from becoming isolated, reduce the risk of extinction, and ultimately sustain biodiversity. Keeping protected areas connected in a network is increasingly recognized as a conservation priority in the current era of rapid climate change. Models that identify suitable linkages between core areas have been used to prioritize potentially important corridors for maintaining functional connectivity. Here, we identify the most “natural” (i.e., least human-modified) corridors between large protected areas in the contiguous United States. We aggregated results from multiple connectivity models to develop a composite map of corridors reflecting agreement of models run under different assumptions about how human modification of land may influence connectivity. To identify which land units are most important for sustaining structural connectivity, we used the composite map of corridors to evaluate connectivity priorities in two ways: (1) among land units outside of our pool of large core protected areas and (2) among units administratively protected as Inventoried Roadless (IRAs) or Wilderness Study Areas (WSAs). Corridor values varied substantially among classes of “unprotected” non-core land units, and land units of high connectivity value and priority represent diverse ownerships and existing levels of protection. We provide a ranking of IRAs and WSAs that should be prioritized for additional protection to maintain minimal human modification. Our results provide a coarse-scale assessment of connectivity priorities for maintaining a connected network of protected areas. PMID:27104683

  11. Judicial Process, Grade Eight. Resource Unit (Unit V).

    ERIC Educational Resources Information Center

    Minnesota Univ., Minneapolis. Project Social Studies Curriculum Center.

    This resource unit, developed by the University of Minnesota's Project Social Studies, introduces eighth graders to the judicial process. The unit was designed with two major purposes in mind. First, it helps pupils understand judicial decision-making, and second, it provides for the study of the rights guaranteed by the federal Constitution. Both…

  12. Legislative Process, Grade Eight. Resource Unit (Unit IV).

    ERIC Educational Resources Information Center

    Minnesota Univ., Minneapolis. Project Social Studies Curriculum Center.

    This resource unit, developed by the University of Minnesota's Project Social Studies, introduces eighth graders to the legislative process. The unit uses case studies such as the Civil Rights Acts of 1960 and 1964 and attempts to change the Rules Committee in 1961. It also uses much data on background of congressmen and on distribution of…

  13. Adaptive-optics optical coherence tomography processing using a graphics processing unit.

    PubMed

    Shafer, Brandon A; Kriske, Jeffery E; Kocaoglu, Omer P; Turner, Timothy L; Liu, Zhuolin; Lee, John Jaehwan; Miller, Donald T

    2014-01-01

    Graphics processing units are increasingly being used for scientific computing because of their powerful parallel processing abilities and moderate price compared to supercomputers and computing grids. In this paper we have used a general purpose graphics processing unit to process adaptive-optics optical coherence tomography (AOOCT) images in real time. Increasing the processing speed of AOOCT is an essential step in moving this super high resolution technology closer to clinical viability.

  14. Biodiversity of indigenous staphylococci of naturally fermented dry sausages and manufacturing environments of small-scale processing units.

    PubMed

    Leroy, Sabine; Giammarinaro, Philippe; Chacornac, Jean-Paul; Lebert, Isabelle; Talon, Régine

    2010-04-01

    The staphylococcal community of the environments of nine French small-scale processing units and their naturally fermented meat products was identified by analyzing 676 isolates. Fifteen species were accurately identified using validated molecular methods. The three prevalent species were Staphylococcus equorum (58.4%), Staphylococcus saprophyticus (15.7%) and Staphylococcus xylosus (9.3%). S. equorum was isolated in all the processing units, in similar proportions in meat and environmental samples. S. saprophyticus was also isolated in all the processing units, with a higher percentage in environmental samples. S. xylosus was present sporadically in the processing units and its prevalence was higher in meat samples. The genetic diversity of the strains within the three species isolated from one processing unit was studied by PFGE and revealed high diversity for S. equorum and S. saprophyticus in both the environmental and meat isolates. The genetic diversity remained high through the manufacturing steps. A small percentage of the strains of the two species shared the two ecological niches. These results highlight that some strains, probably introduced by the meat, persist in the manufacturing environment, while other strains are more adapted to the meat products.

  15. The Executive Process, Grade Eight. Resource Unit (Unit III).

    ERIC Educational Resources Information Center

    Minnesota Univ., Minneapolis. Project Social Studies Curriculum Center.

    This resource unit, developed by the University of Minnesota's Project Social Studies, introduces eighth graders to the executive process. The unit uses case studies of presidential decision making such as the decision to drop the atomic bomb on Hiroshima, the Cuba Bay of Pigs and quarantine decisions, and the Little Rock decision. A case study of…

  16. Associative list processing unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemmert, Karl Scott; Underwood, Keith D

    2014-04-01

    An associative list processing unit and method comprising employing a plurality of prioritized cell blocks and permitting inserts to occur in a single clock cycle if all of the cell blocks are not full.

  17. Portable brine evaporator unit, process, and system

    DOEpatents

    Hart, Paul John; Miller, Bruce G.; Wincek, Ronald T.; Decker, Glenn E.; Johnson, David K.

    2009-04-07

    The present invention discloses a comprehensive, efficient, and cost-effective portable evaporator unit, method, and system for the treatment of brine. The evaporator unit, method, and system require a pretreatment process that removes heavy metals, crude oil, and other contaminants in preparation for the evaporator unit. The pretreatment and the evaporator unit, method, and system process metals and brine at the site where they are generated (the well site), saving significant money for producers, who can avoid present and future increases in transportation costs.

  18. Hyperspectral processing in graphical processing units

    NASA Astrophysics Data System (ADS)

    Winter, Michael E.; Winter, Edwin M.

    2011-06-01

    With the advent of the commercial 3D video card in the mid-1990s, we have seen an order of magnitude performance increase with each generation of new video cards. While these cards were designed primarily for visualization and video games, it became apparent after a short while that they could be used for scientific purposes. These Graphical Processing Units (GPUs) are rapidly being incorporated into data processing tasks usually reserved for general purpose computers. It has been found that many image processing problems scale well to modern GPU systems. We have implemented four popular hyperspectral processing algorithms (N-FINDR, linear unmixing, Principal Components, and the RX anomaly detection algorithm). These algorithms show an across-the-board speedup of at least a factor of 10, with some special cases showing extreme speedups of a hundred times or more.
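
    As one example of the four algorithms named above, here is a minimal NumPy sketch of the global RX anomaly detector; a GPU port would apply the same arithmetic with a GPU array library, and the cube layout is an assumption.

    ```python
    # Global RX: Mahalanobis distance of each pixel spectrum from the scene background.
    import numpy as np

    def rx_scores(cube):                      # cube shape: (rows, cols, bands)
        rows, cols, bands = cube.shape
        X = cube.reshape(-1, bands).astype(np.float64)
        mu = X.mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))  # pinv guards against singular covariance
        d = X - mu
        scores = np.einsum("ij,jk,ik->i", d, cov_inv, d)   # d_i^T C^-1 d_i per pixel
        return scores.reshape(rows, cols)
    ```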

  19. Compact hybrid optoelectrical unit for image processing and recognition

    NASA Astrophysics Data System (ADS)

    Cheng, Gang; Jin, Guofan; Wu, Minxian; Liu, Haisong; He, Qingsheng; Yuan, ShiFu

    1998-07-01

    In this paper a compact hybrid opto-electrical unit (CHOEU) for digital image processing and recognition is proposed. The central part of CHOEU is an incoherent optical correlator, realized with a SHARP QA-1200 8.4-inch active-matrix TFT liquid crystal display panel used as two real-time spatial light modulators, one for the input image and one for the reference template. CHOEU performs two main processing tasks: digital filtering and object matching. Using CHOEU, an edge-detection operator is realized to extract the edges from the input images. The preprocessed images are then sent to the object recognition unit to identify the important targets. A novel template-matching method is proposed for gray-tone image recognition. A positive-and-negative cycle-encoding method is introduced to realize absolute-difference pixel matching simply on a correlator structure. The system has good fault tolerance to rotation distortion, Gaussian noise, and information loss. Experiments are given at the end of this paper.
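
    The matching stage can be approximated in software by exhaustive sum-of-absolute-differences template matching; the sketch below stands in for the optical correlator's cycle-encoded absolute-difference measurement and is not the paper's implementation.

    ```python
    # Brute-force SAD template matching on a gray-tone image (NumPy).
    import numpy as np

    def sad_match(image, template):
        ih, iw = image.shape
        th, tw = template.shape
        best_score, best_pos = np.inf, (0, 0)
        for r in range(ih - th + 1):
            for c in range(iw - tw + 1):
                patch = image[r:r + th, c:c + tw].astype(int)
                score = np.abs(patch - template.astype(int)).sum()  # absolute-difference measure
                if score < best_score:
                    best_score, best_pos = score, (r, c)
        return best_pos, best_score
    ```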

  20. 15 CFR 971.209 - Processing outside the United States.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Processing outside the United States... Applications Contents § 971.209 Processing outside the United States. (a) Except as provided in this section... contravenes the overriding national interests of the United States. (b) If foreign processing is proposed, the...

  1. 15 CFR 971.209 - Processing outside the United States.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Processing outside the United States... Applications Contents § 971.209 Processing outside the United States. (a) Except as provided in this section... contravenes the overriding national interests of the United States. (b) If foreign processing is proposed, the...

  2. Identifying strengths and weaknesses of Quality Management Unit University of Sumatera Utara software using SCAMPI C

    NASA Astrophysics Data System (ADS)

    Gunawan, D.; Amalia, A.; Rahmat, R. F.; Muchtar, M. A.; Siregar, I.

    2018-02-01

    Identification of software maturity level is a technique to determine the quality of the software. By identifying the software maturity level, the weaknesses of the software can be observed. As a result, the recommendations might be a reference for future software maintenance and development. This paper discusses the software Capability Level (CL) with case studies on the Quality Management Unit (Unit Manajemen Mutu) of the University of Sumatera Utara (UMM-USU). This research utilized the Standard CMMI Appraisal Method for Process Improvement class C (SCAMPI C) model with continuous representation. This model focuses on activities for developing quality products and services. The observation covers three process areas: Project Planning (PP), Project Monitoring and Control (PMC), and Requirements Management (REQM). According to the measurement of software capability level for the UMM-USU software, the capability level for the observed process areas is in the range of CL1 to CL2. Project Planning (PP) is the only process area that reaches capability level 2, while PMC and REQM are still at CL1, the performed level. This research reveals several weaknesses of the existing UMM-USU software. Therefore, this study proposes several recommendations for UMM-USU to improve the capability level for the observed process areas.
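
    As a toy illustration of how practice ratings roll up into a capability level, consider the sketch below; the four-value rating scale and the "all practices fully implemented" threshold are simplifications for illustration, not the SCAMPI C rules.

    ```python
    # Toy capability-level rollup for one process area.
    RATINGS = {"FI": 1.0, "LI": 0.5, "PI": 0.25, "NI": 0.0}  # fully/largely/partially/not implemented

    def capability_level(ratings_by_level):
        # ratings_by_level: {1: ["FI", ...], 2: ["FI", "LI", ...], ...}
        achieved = 0
        for cl in sorted(ratings_by_level):
            if all(RATINGS[r] >= 1.0 for r in ratings_by_level[cl]):
                achieved = cl                 # every practice at this level fully implemented
            else:
                break                         # a gap caps the capability level here
        return achieved

    print(capability_level({1: ["FI", "FI"], 2: ["FI", "LI"]}))  # -> 1, like PMC and REQM above
    ```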

  3. ON DEVELOPING CLEANER ORGANIC UNIT PROCESSES

    EPA Science Inventory

    Organic waste products, potentially harmful to the human health and the environment, are primarily produced in the synthesis stage of manufacturing processes. Many such synthetic unit processes, such as halogenation, oxidation, alkylation, nitration, and sulfonation are common to...

  4. 15 CFR 971.427 - Processing outside the United States.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Processing outside the United States... outside the United States. If appropriate TCRs will incorporate provisions to implement the decision of the Administrator regarding the return of resources processed outside the United States, in accordance...

  5. High-performance computing with quantum processing units

    DOE PAGES

    Britt, Keith A.; Humble, Travis S.; ...

    2017-03-01

    The prospects of quantum computing have driven efforts to realize fully functional quantum processing units (QPUs). Recent success in developing proof-of-principle QPUs has prompted the question of how to integrate these emerging processors into modern high-performance computing (HPC) systems. We examine how QPUs can be integrated into current and future HPC system architectures by accounting for functional and physical design requirements. We identify two integration pathways that are differentiated by infrastructure constraints on the QPU and the use cases expected for the HPC system. This includes a tight integration that assumes infrastructure bottlenecks can be overcome as well as a loose integration that assumes they cannot. We find that the performance of both approaches is likely to depend on the quantum interconnect that serves to entangle multiple QPUs. As a result, we also identify several challenges in assessing QPU performance for HPC, and we consider new metrics that capture the interplay between system architecture and the quantum parallelism underlying computational performance.

  6. High-performance computing with quantum processing units

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Britt, Keith A.; Humble, Travis S.

    The prospects of quantum computing have driven efforts to realize fully functional quantum processing units (QPUs). Recent success in developing proof-of-principle QPUs has prompted the question of how to integrate these emerging processors into modern high-performance computing (HPC) systems. We examine how QPUs can be integrated into current and future HPC system architectures by accounting for functional and physical design requirements. We identify two integration pathways that are differentiated by infrastructure constraints on the QPU and the use cases expected for the HPC system. This includes a tight integration that assumes infrastructure bottlenecks can be overcome as well as a loose integration that assumes they cannot. We find that the performance of both approaches is likely to depend on the quantum interconnect that serves to entangle multiple QPUs. As a result, we also identify several challenges in assessing QPU performance for HPC, and we consider new metrics that capture the interplay between system architecture and the quantum parallelism underlying computational performance.

  7. Power-processing unit

    NASA Technical Reports Server (NTRS)

    Wessel, Frank J. (Inventor); Hancock, Donald J. (Inventor)

    1987-01-01

    Power-processing unit uses AC buses (30, 32) to supply all current dependent needs such as connections (54, 56) to an ion thruster through an inductor (88) and the primary of a transformer (90), to assure limited currents to such loads. Where temperature control is also required, such as to the main discharge vaporizer heater connection (36, 38), switches (100, 102) are serially connected with inductor (96) and the primary of transformer (98). Temperature sensor (104) controls the switches (100, 102) for temperature regulation.

  8. The implementation of a postoperative care process on a neurosurgical unit.

    PubMed

    Douglas, Mary; Rowed, Sheila

    2005-12-01

    The postoperative phase is a critical time for any neurosurgical patient. Historically, certain patients having neurosurgical procedures, such as craniotomies and other more complex surgeries, have been nursed postoperatively in the intensive care unit (ICU) for an overnight stay, prior to transfer to a neurosurgical floor. At the Hospital for Sick Children in Toronto, this practice was reexamined because of challenges with access to ICU beds and surgery cancellations caused by a lack of available ICU nurses. A set of criteria was developed to identify which postoperative patients should come directly to the neurosurgical unit immediately following their anesthetic recovery. The criteria were based on patient diagnosis, preoperative condition, comorbidities, the surgical procedure, intraoperative complications, and postoperative status. A detailed process was then outlined that allowed the optimum patients to be selected for this process to ensure patient safety. Included in this process was a postoperative protocol addressing details such as standard physician orders and the levels of monitoring required. Outcomes of this new process include fewer surgical cancellations for patients and families, equally safe or better patient care, and the conservation of limited ICU resources. The program has currently been expanded to include patients who have undergone endovascular therapies.

  9. Unit Process Wetlands for Removal of Trace Organic Contaminants and Pathogens from Municipal Wastewater Effluents

    PubMed Central

    Jasper, Justin T.; Nguyen, Mi T.; Jones, Zackary L.; Ismail, Niveen S.; Sedlak, David L.; Sharp, Jonathan O.; Luthy, Richard G.; Horne, Alex J.; Nelson, Kara L.

    2013-01-01

    Treatment wetlands have become an attractive option for the removal of nutrients from municipal wastewater effluents due to their low energy requirements and operational costs, as well as the ancillary benefits they provide, including creating aesthetically appealing spaces and wildlife habitats. Treatment wetlands also hold promise as a means of removing other wastewater-derived contaminants, such as trace organic contaminants and pathogens. However, concerns about variations in treatment efficacy of these pollutants, coupled with an incomplete mechanistic understanding of their removal in wetlands, hinder the widespread adoption of constructed wetlands for these two classes of contaminants. A better understanding is needed so that wetlands as a unit process can be designed for their removal, with individual wetland cells optimized for the removal of specific contaminants, and connected in series or integrated with other engineered or natural treatment processes. In this article, removal mechanisms of trace organic contaminants and pathogens are reviewed, including sorption and sedimentation, biotransformation and predation, photolysis and photoinactivation, and remaining knowledge gaps are identified. In addition, suggestions are provided for how these treatment mechanisms can be enhanced in commonly employed unit process wetland cells or how they might be harnessed in novel unit process cells. It is hoped that application of the unit process concept to a wider range of contaminants will lead to more widespread application of wetland treatment trains as components of urban water infrastructure in the United States and around the globe. PMID:23983451

  10. Unit Process Wetlands for Removal of Trace Organic Contaminants and Pathogens from Municipal Wastewater Effluents.

    PubMed

    Jasper, Justin T; Nguyen, Mi T; Jones, Zackary L; Ismail, Niveen S; Sedlak, David L; Sharp, Jonathan O; Luthy, Richard G; Horne, Alex J; Nelson, Kara L

    2013-08-01

    Treatment wetlands have become an attractive option for the removal of nutrients from municipal wastewater effluents due to their low energy requirements and operational costs, as well as the ancillary benefits they provide, including creating aesthetically appealing spaces and wildlife habitats. Treatment wetlands also hold promise as a means of removing other wastewater-derived contaminants, such as trace organic contaminants and pathogens. However, concerns about variations in treatment efficacy of these pollutants, coupled with an incomplete mechanistic understanding of their removal in wetlands, hinder the widespread adoption of constructed wetlands for these two classes of contaminants. A better understanding is needed so that wetlands as a unit process can be designed for their removal, with individual wetland cells optimized for the removal of specific contaminants, and connected in series or integrated with other engineered or natural treatment processes. In this article, removal mechanisms of trace organic contaminants and pathogens are reviewed, including sorption and sedimentation, biotransformation and predation, photolysis and photoinactivation, and remaining knowledge gaps are identified. In addition, suggestions are provided for how these treatment mechanisms can be enhanced in commonly employed unit process wetland cells or how they might be harnessed in novel unit process cells. It is hoped that application of the unit process concept to a wider range of contaminants will lead to more widespread application of wetland treatment trains as components of urban water infrastructure in the United States and around the globe.

  11. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; Russell, Samuel S.

    2012-01-01

    Objective: Develop a software application utilizing high performance computing techniques, including general purpose graphics processing units (GPGPUs), for the analysis and visualization of large thermographic data sets. Over the past several years, an increasing effort among scientists and engineers to utilize graphics processing units (GPUs) in a more general purpose fashion is allowing for previously unobtainable levels of computation by individual workstations. As data sets grow, the methods to work with them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU, which yield significant increases in performance. These common computations have high degrees of data parallelism; that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Image processing is one area where GPUs are being used to greatly increase the performance of certain analysis and visualization techniques.
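
    The data-parallel pattern described above (the same per-pixel computation, independent across pixels) is easy to picture; this sketch composites a synthetic frame stack with NumPy, and swapping the import for CuPy would run the identical expression on a GPU. The array shape and the peak-contrast composite are illustrative assumptions.

    ```python
    # Per-pixel peak-contrast composite over a stack of thermographic frames.
    import numpy as np   # with CuPy installed, `import cupy as np` runs this on the GPU

    frames = np.random.rand(100, 256, 256).astype(np.float32)  # (frames, rows, cols), synthetic

    # Each output pixel depends only on its own time series: ideal GPU data parallelism.
    composite = frames.max(axis=0) - frames.min(axis=0)
    ```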

  12. 40 CFR 63.1275 - Glycol dehydration unit process vent standards.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 12 2014-07-01 2014-07-01 false Glycol dehydration unit process vent... Storage Facilities § 63.1275 Glycol dehydration unit process vent standards. (a) This section applies to each glycol dehydration unit subject to this subpart that must be controlled for air emissions as...

  13. 40 CFR 63.1275 - Glycol dehydration unit process vent standards.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 12 2013-07-01 2013-07-01 false Glycol dehydration unit process vent... Storage Facilities § 63.1275 Glycol dehydration unit process vent standards. (a) This section applies to each glycol dehydration unit subject to this subpart that must be controlled for air emissions as...

  14. 15 CFR 971.209 - Processing outside the United States.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Processing outside the United States... THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL RECOVERY PERMITS Applications Contents § 971.209 Processing outside the United States. (a) Except as provided in this section...

  15. 40 CFR 63.1275 - Glycol dehydration unit process vent standards.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 12 2012-07-01 2011-07-01 true Glycol dehydration unit process vent... Facilities § 63.1275 Glycol dehydration unit process vent standards. (a) This section applies to each glycol dehydration unit subject to this subpart with an actual annual average natural gas flowrate equal to or...

  16. 40 CFR 63.1275 - Glycol dehydration unit process vent standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 11 2010-07-01 2010-07-01 true Glycol dehydration unit process vent... Facilities § 63.1275 Glycol dehydration unit process vent standards. (a) This section applies to each glycol dehydration unit subject to this subpart with an actual annual average natural gas flowrate equal to or...

  17. 40 CFR 63.765 - Glycol dehydration unit process vent standards.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 11 2013-07-01 2013-07-01 false Glycol dehydration unit process vent... Facilities § 63.765 Glycol dehydration unit process vent standards. (a) This section applies to each glycol dehydration unit subject to this subpart that must be controlled for air emissions as specified in either...

  18. 40 CFR 63.765 - Glycol dehydration unit process vent standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 10 2011-07-01 2011-07-01 false Glycol dehydration unit process vent... Facilities § 63.765 Glycol dehydration unit process vent standards. (a) This section applies to each glycol dehydration unit subject to this subpart with an actual annual average natural gas flowrate equal to or...

  19. 40 CFR 63.765 - Glycol dehydration unit process vent standards.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 11 2014-07-01 2014-07-01 false Glycol dehydration unit process vent... Facilities § 63.765 Glycol dehydration unit process vent standards. (a) This section applies to each glycol dehydration unit subject to this subpart that must be controlled for air emissions as specified in either...

  20. 40 CFR 63.765 - Glycol dehydration unit process vent standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 10 2010-07-01 2010-07-01 false Glycol dehydration unit process vent... Facilities § 63.765 Glycol dehydration unit process vent standards. (a) This section applies to each glycol dehydration unit subject to this subpart with an actual annual average natural gas flowrate equal to or...

  1. 40 CFR 63.1275 - Glycol dehydration unit process vent standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 11 2011-07-01 2011-07-01 false Glycol dehydration unit process vent... Facilities § 63.1275 Glycol dehydration unit process vent standards. (a) This section applies to each glycol dehydration unit subject to this subpart with an actual annual average natural gas flowrate equal to or...

  2. 40 CFR 63.765 - Glycol dehydration unit process vent standards.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 11 2012-07-01 2012-07-01 false Glycol dehydration unit process vent... Facilities § 63.765 Glycol dehydration unit process vent standards. (a) This section applies to each glycol dehydration unit subject to this subpart with an actual annual average natural gas flowrate equal to or...

  3. Line-by-line spectroscopic simulations on graphics processing units

    NASA Astrophysics Data System (ADS)

    Collange, Sylvain; Daumas, Marc; Defour, David

    2008-01-01

    We report here on software that performs line-by-line spectroscopic simulations on gases. Elaborate models (such as narrow band and correlated-K) are accurate and efficient for bands where various components are not simultaneously and significantly active. Line-by-line is probably the most accurate model in the infrared for blends of gases that contain high proportions of H2O and CO2, as was the case for our prototype simulation. Our implementation on graphics processing units sustains a speedup close to 330 on computation-intensive tasks and 12 on memory-intensive tasks compared to implementations on one core of high-end processors. This speedup is due to data parallelism, efficient memory access for specific patterns and some dedicated hardware operators only available in graphics processing units. It is obtained leaving most of processor resources available and it would scale linearly with the number of graphics processing units in parallel machines. Line-by-line simulation coupled with simulation of fluid dynamics was long believed to be economically intractable but our work shows that it could be done with some affordable additional resources compared to what is necessary to perform simulations on fluid dynamics alone. Program summary: Program title: GPU4RE; Catalogue identifier: ADZY_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZY_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 62 776; No. of bytes in distributed program, including test data, etc.: 1 513 247; Distribution format: tar.gz; Programming language: C++; Computer: x86 PC; Operating system: Linux, Microsoft Windows. Compilation requires either gcc/g++ under Linux or Visual C++ 2003/2005 and Cygwin under Windows. It has been tested using gcc 4.1.2 under Ubuntu Linux 7.04 and using Visual C
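
    The core of a line-by-line code is a sum of line profiles over a fine spectral grid; here is a minimal sketch with Lorentzian profiles and toy line parameters (real codes, GPU4RE included, use far more physics, e.g. Voigt shapes and temperature/pressure scaling).

    ```python
    # Absorption coefficient as a sum of Lorentzian lines on a wavenumber grid.
    import numpy as np

    nu = np.linspace(2000.0, 2500.0, 200_000)           # wavenumber grid (cm^-1)
    centers = np.array([2100.0, 2250.0, 2400.0])        # toy line positions (cm^-1)
    strengths = np.array([1.0, 0.6, 0.3])               # toy line strengths
    gamma = 0.07                                        # Lorentzian half-width (cm^-1)

    # One Lorentzian per line, broadcast over the grid, summed over lines.
    k = (strengths[:, None] * (gamma / np.pi)
         / ((nu[None, :] - centers[:, None]) ** 2 + gamma ** 2)).sum(axis=0)
    ```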

  4. Identifying influential individuals on intensive care units: using cluster analysis to explore culture.

    PubMed

    Fong, Allan; Clark, Lindsey; Cheng, Tianyi; Franklin, Ella; Fernandez, Nicole; Ratwani, Raj; Parker, Sarah Henrickson

    2017-07-01

    The objective of this paper is to identify attribute patterns of influential individuals in intensive care units using unsupervised cluster analysis. Despite the acknowledgement that the culture of an organisation is critical to improving patient safety, specific methods to shift culture have not been explicitly identified. A social network analysis survey was conducted and an unsupervised cluster analysis was used. A total of 100 surveys were gathered. Unsupervised cluster analysis was used to group individuals with similar dimensions highlighting three general genres of influencers: well-rounded, knowledge and relational. Culture is created locally by individual influencers. Cluster analysis is an effective way to identify common characteristics among members of an intensive care unit team that are noted as highly influential by their peers. To change culture, identifying and then integrating the influencers in intervention development and dissemination may create more sustainable and effective culture change. Additional studies are ongoing to test the effectiveness of utilising these influencers to disseminate patient safety interventions. This study offers an approach that can be helpful in both identifying and understanding influential team members and may be an important aspect of developing methods to change organisational culture. © 2017 John Wiley & Sons Ltd.
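
    The clustering step can be sketched as follows; k-means with three clusters (echoing the three influencer genres) and the input file are assumptions, since the paper does not state which clustering algorithm was used.

    ```python
    # Group survey respondents by their influence-attribute profiles.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    X = np.loadtxt("influence_scores.csv", delimiter=",", skiprows=1)  # hypothetical survey matrix
    X = StandardScaler().fit_transform(X)          # one row per respondent, one column per attribute

    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)  # three influencer genres
    ```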

  5. Runoff sensitivity to snowmelt process representation for the conterminous United States

    NASA Astrophysics Data System (ADS)

    Driscoll, J. M.; Sexstone, G. A.

    2017-12-01

    Watershed-scale hydrologic models that operate at a continental extent must balance detailed descriptions of spatiotemporal variability against simplified process representations across a diverse range of physiographic and climatic regimes. Some of these models describe the sub-grid variability of snow-cover extent and snowmelt processes using snow depletion curves (SDCs), which relate the snow covered area to the snow water equivalent (SWE). The U.S. Geological Survey's National Hydrologic Modeling (NHM) system run with the daily-timestep Precipitation Runoff Modeling System (PRMS), or NHM-PRMS, originally used two default SDCs to describe snowmelt processes: one for hydrologic response units with elevations above treeline and one for hydrologic response units with elevations below treeline. Seeking to improve upon this approach, spatially-distributed SWE, derived from Snow Data Assimilation System (SNODAS) over eleven years, was used to develop new, site-specific SDCs for each hydrologic response unit in the NHM-PRMS. This study investigates the sensitivity of NHM-PRMS to changes in SDCs for a 30-year historical period by first running the NHM-PRMS with the default binary SDCs and then with the site-specific SDCs. Comparison of simulated snowmelt and streamflow response during the snowmelt season allows for spatial analysis and grouping of the sensitivity of streamflow to changes in snowmelt dynamics. Site-specific SDCs allow for the identification and categorization of areas where faster or slower snowmelt could have a greater impact to water resources. These new SDCs can be used to identify locations where increased SWE observation density would be most useful for seasonal water availability assessments.
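
    A snow depletion curve is simply a monotone map from relative SWE to fractional snow-covered area; the sketch below shows the idea with a single shape parameter, which is an illustrative choice rather than an NHM-PRMS or SNODAS-derived value.

    ```python
    # Illustrative snow depletion curve: snow-covered area as a function of SWE.
    import numpy as np

    def sca_from_swe(swe, swe_max, shape=0.5):
        ratio = np.clip(swe / swe_max, 0.0, 1.0)
        return ratio ** shape   # concave: SCA stays high until SWE is nearly depleted

    print(sca_from_swe(np.array([0.0, 0.25, 0.5, 1.0]), swe_max=1.0))
    ```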

  6. Process Architecture for Managing Digital Object Identifiers

    NASA Astrophysics Data System (ADS)

    Wanchoo, L.; James, N.; Stolte, E.

    2014-12-01

    In 2010, NASA's Earth Science Data and Information System (ESDIS) Project implemented a process for registering Digital Object Identifiers (DOIs) for data products distributed by the Earth Observing System Data and Information System (EOSDIS). For the first 3 years, ESDIS evolved the process, involving the data provider community in developing procedures for creating and assigning DOIs and guidelines for the landing page. To accomplish this, ESDIS established two DOI User Working Groups: one for reviewing the DOI process, whose recommendations were submitted to ESDIS in February 2014; and the other recently tasked to review and further develop DOI landing page guidelines for ESDIS approval by the end of 2014. ESDIS has recently upgraded the DOI system from a manually driven system to one that largely automates the DOI process. The new automated features include: a) reviewing the DOI metadata, b) assigning an opaque DOI name if the data provider chooses, and c) reserving, registering, and updating the DOIs. The flexibility of reserving the DOI allows data providers to embed and test the DOI in the data product metadata before formally registering with EZID. The DOI update process allows changing any DOI metadata except the DOI name once the name has been registered. Currently, ESDIS has processed a total of 557 DOIs, of which 379 are registered with EZID and 178 are reserved with ESDIS. The DOI incorporates several metadata elements that effectively identify the data product and the source of availability. Of these elements, the Uniform Resource Locator (URL) attribute has the very important function of identifying the landing page, which describes the data product. ESDIS, in consultation with data providers in the Earth Science community, is currently developing landing page guidelines that specify the key data product descriptive elements to be included on each data product's landing page. This poster will describe in detail the unique automated process and
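
    The reserve/register/update rules described above amount to a small state machine; here is a hedged sketch with invented class and method names (this is not the ESDIS system or the EZID API).

    ```python
    # Sketch of the DOI lifecycle: metadata stays editable, the name freezes at registration.
    class DoiRecord:
        def __init__(self, name, metadata):
            self.name, self.metadata = name, dict(metadata)
            self.state = "reserved"      # reserved DOIs can be embedded and tested first

        def register(self):
            self.state = "registered"    # e.g., formally registered with EZID

        def update(self, metadata=None, name=None):
            if metadata:
                self.metadata.update(metadata)   # any metadata may change
            if name is not None:
                if self.state == "registered":
                    raise ValueError("the DOI name cannot change once registered")
                self.name = name
    ```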

  7. On Tour... Primary Hardwood Processing, Products and Recycling Unit

    Treesearch

    Philip A. Araman; Daniel L. Schmoldt

    1995-01-01

    Housed within the Department of Wood Science and Forest Products at Virginia Polytechnic Institute is a three-person USDA Forest Service research work unit (with one vacancy) devoted to hardwood processing and recycling research. Phil Araman is the project leader of this truly unique and productive unit, titled "Primary Hardwood Processing, Products and Recycling." The...

  8. Process Improvement to Enhance Quality in a Large Volume Labor and Birth Unit.

    PubMed

    Bell, Ashley M; Bohannon, Jessica; Porthouse, Lisa; Thompson, Heather; Vago, Tony

    The goal of the perinatal team at Mercy Hospital St. Louis is to provide a quality patient experience during labor and birth. After the move to a new labor and birth unit in 2013, the team recognized that many of the routines and practices needed to be modified based on different demands. The Lean process was used to plan and implement required changes. This technique was chosen because it is based on feedback from clinicians, teamwork, strategizing, and immediate evaluation and implementation of common sense solutions. Through rapid improvement events, presence of leaders in the work environment, and daily huddles, team member engagement and communication were enhanced. The process allowed for team members to offer ideas, test these ideas, and evaluate results, all within a rapid time frame. For 9 months, frontline clinicians met monthly for a weeklong rapid improvement event to create better experiences for childbearing women and those who provide their care, using Lean concepts. At the end of each week, an implementation plan and metrics were developed to help ensure sustainment. The issues that were the focus of these process improvements included on-time initiation of scheduled cases such as induction of labor and cesarean birth, timely and efficient assessment and triage disposition, postanesthesia care and immediate newborn care completed within approximately 2 hours, transfer from the labor unit to the mother baby unit, and emergency transfers to the main operating room and intensive care unit. On-time case initiation for labor induction and cesarean birth improved, length of stay in obstetric triage decreased, postanesthesia recovery care was reorganized to be completed within the expected 2-hour standard time frame, and emergency transfers to the main hospital operating room and intensive care units were standardized and enhanced for efficiency and safety. Participants were pleased with the process improvements and quality outcomes. Working together as a team

  9. Process methods and levels of automation of wood pallet repair in the United States

    Treesearch

    Jonghun Park; Laszlo Horvath; Robert J. Bush

    2016-01-01

    This study documented the current status of wood pallet repair in the United States by identifying the types of processing and equipment usage in repair operations from an automation perspective. The wood pallet repair firms included in the study received an average of approximately 1.28 million cores (i.e., used pallets) for recovery in 2012. A majority of the cores...

  10. Identifying and assessing strategies for evaluating the impact of mobile eye health units on health outcomes.

    PubMed

    Fu, Shiwan; Turner, Angus; Tan, Irene; Muir, Josephine

    2017-12-01

    To identify and assess strategies for evaluating the impact of mobile eye health units on health outcomes. Systematic literature review. Worldwide. Peer-reviewed journal articles that included the use of a mobile eye health unit. Journal articles were included if outcome measures reflected an assessment of the impact of a mobile eye health unit on health outcomes. Six studies were identified with mobile services offering diabetic retinopathy screening (three studies), optometric services (two studies) and orthoptic services (one study). This review identified and assessed strategies in existing literature used to evaluate the impact of mobile eye health units on health outcomes. Studies included in this review used patient outcomes (i.e. disease detection, vision impairment, treatment compliance) and/or service delivery outcomes (i.e. cost per attendance, hospital transport use, inappropriate referrals, time from diabetic retinopathy photography to treatment) to evaluate the impact of mobile eye health units. Limitations include difficulty proving causation of specific outcome measures and the overall shortage of impact evaluation studies. Variation in geographical location, service population and nature of eye care providers limits broad application. © 2017 National Rural Health Alliance Inc.

  11. The interaction between specific and general information in category learning and representation: unitization and parallel interactive processing.

    PubMed

    Nahinsky, Irwin D; Harbison, J Isaiah

    2011-01-01

    We investigated the effects of specific stimulus information on the use of rule information in a category learning task in 2 experiments, one presented here and an intercategory transfer task reported in an earlier article. In the present experiment, photograph-name combinations, called identifiers, were associated with 4 demographic attributes. The same attribute information was shown to all participants. However, for one group of participants, half of the identifiers were paired with attribute values repeated over presentation blocks. For the other group the identifier information was new for each presentation block. The first group performed less well than the second group on stimuli with nonrepeated identifiers, indicating a negative effect of specific stimulus information on processing rule information. Application of a network model to the 2 experiments, which provided for the growth of connections between attribute values in learning, indicated that repetition of identifiers produced a unitizing effect on stimuli. Results suggested that unitization produced interference through connections between irrelevant attribute values.

  12. Efficient particle-in-cell simulation of auroral plasma phenomena using a CUDA enabled graphics processing unit

    NASA Astrophysics Data System (ADS)

    Sewell, Stephen

    This thesis introduces a software framework that effectively utilizes low-cost commercially available Graphics Processing Units (GPUs) to simulate complex scientific plasma phenomena that are modeled using the Particle-In-Cell (PIC) paradigm. The software framework that was developed conforms to the Compute Unified Device Architecture (CUDA), a standard for general purpose graphics processing that was introduced by NVIDIA Corporation. This framework has been verified for correctness and applied to advance the state of understanding of the electromagnetic aspects of the development of the Aurora Borealis and Aurora Australis. For each phase of the PIC methodology, this research has identified one or more methods to exploit the problem's natural parallelism and effectively map it for execution on the graphics processing unit and its host processor. The sources of overhead that can reduce the effectiveness of parallelization for each of these methods have also been identified. One of the novel aspects of this research was the utilization of particle sorting during the grid interpolation phase. The final representation resulted in simulations that executed about 38 times faster than simulations that were run on a single-core general-purpose processing system. The scalability of this framework to larger problem sizes and future generation systems has also been investigated.
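
    The particle-sorting idea highlighted above can be shown in miniature; NumPy stands in for the CUDA kernels here, the function name is invented, and np.add.at plays the role of the GPU's atomic adds.

    ```python
    # Nearest-grid-point charge deposition with particles sorted by cell index.
    import numpy as np

    def deposit_charge(positions, charges, num_cells, dx):
        cells = np.clip((positions / dx).astype(int), 0, num_cells - 1)
        order = np.argsort(cells)          # the sorting step: improves memory locality on a GPU
        cells, charges = cells[order], charges[order]
        grid = np.zeros(num_cells)
        np.add.at(grid, cells, charges)    # scatter-add, analogous to atomicAdd in CUDA
        return grid
    ```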

  13. High Input Voltage, Silicon Carbide Power Processing Unit Performance Demonstration

    NASA Technical Reports Server (NTRS)

    Bozak, Karin E.; Pinero, Luis R.; Scheidegger, Robert J.; Aulisio, Michael V.; Gonzalez, Marcelo C.; Birchenough, Arthur G.

    2015-01-01

    A silicon carbide brassboard power processing unit has been developed by the NASA Glenn Research Center in Cleveland, Ohio. The power processing unit operates from two sources: a nominal 300 Volt high voltage input bus and a nominal 28 Volt low voltage input bus. The design of the power processing unit includes four low voltage, low power auxiliary supplies, and two parallel 7.5 kilowatt (kW) discharge power supplies that are capable of providing up to 15 kilowatts of total power at 300 to 500 Volts (V) to the thruster. Additionally, the unit contains a housekeeping supply, high voltage input filter, low voltage input filter, and master control board, such that the complete brassboard unit is capable of operating a 12.5 kilowatt Hall effect thruster. The performance of the unit was characterized under both ambient and thermal vacuum test conditions, and the results demonstrate exceptional performance with full power efficiencies exceeding 97%. The unit was also tested with a 12.5 kW Hall effect thruster to verify compatibility and output filter specifications. With space-qualified silicon carbide or similar high voltage, high efficiency power devices, this would provide a design solution to address the need for high power electric propulsion systems.

  14. Natural Gas Processing Plants in the United States: 2010 Update

    EIA Publications

    2011-01-01

    This special report presents an analysis of natural gas processing plants in the United States as of 2009 and highlights characteristics of this segment of the industry. The purpose of the paper is to examine the role of natural gas processing plants in the natural gas supply chain and to provide an overview and summary of processing plant characteristics in the United States, such as locations, capacities, and operations.

  15. High Input Voltage, Silicon Carbide Power Processing Unit Performance Demonstration

    NASA Technical Reports Server (NTRS)

    Bozak, Karin E.; Pinero, Luis R.; Scheidegger, Robert J.; Aulisio, Michael V.; Gonzalez, Marcelo C.; Birchenough, Arthur G.

    2015-01-01

    A silicon carbide brassboard power processing unit has been developed by the NASA Glenn Research Center in Cleveland, Ohio. The power processing unit operates from two sources - a nominal 300-Volt high voltage input bus and a nominal 28-Volt low voltage input bus. The design of the power processing unit includes four low voltage, low power supplies that provide power to the thruster auxiliary supplies, and two parallel 7.5 kilowatt power supplies that are capable of providing up to 15 kilowatts of total power at 300 to 500 Volts to the thruster discharge supply. Additionally, the unit contains a housekeeping supply, high voltage input filter, low voltage input filter, and master control board, such that the complete brassboard unit is capable of operating a 12.5 kilowatt Hall effect thruster. The performance of the unit was characterized under both ambient and thermal vacuum test conditions, and the results demonstrate exceptional performance with full power efficiencies exceeding 97%. With a space-qualified silicon carbide or similar high voltage, high efficiency power device, this design could evolve into a flight design for future missions that require high power electric propulsion systems.

  16. Factors associated with the process of adaptation among Pakistani adolescent females living in United States.

    PubMed

    Khuwaja, Salma A; Selwyn, Beatrice J; Mgbere, Osaro; Khuwaja, Alam; Kapadia, Asha; McCurdy, Sheryl; Hsu, Chiehwen E

    2013-04-01

    This study explored post-migration experiences of recently migrated Pakistani Muslim adolescent females residing in the United States. In-depth, semi-structured interviews were conducted with thirty Pakistani Muslim adolescent females between the ages of 15 and 18 years living with their families in Houston, Texas. Data obtained from the interviews were evaluated using discourse analysis to identify major recurring themes. Participants discussed factors associated with the process of adaptation to American culture. The results revealed that the main factors associated with the adaptation process included positive motivation for migration, family bonding, social support networks, inter-familial communication, adolescents' aspiration to learn about other cultures, availability of English-as-second-language programs, participation in community rebuilding activities, faith practices, English proficiency, peer pressure, and inter-generational conflicts. This study provided much needed information on factors associated with the adaptation process of Pakistani Muslim adolescent females in the United States. The results have important implications for improving the adaptation process of this group and offer potential directions for intervention and counseling services.

  17. 8 CFR 215.8 - Requirements for biometric identifiers from aliens on departure from the United States.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 8 Aliens and Nationality 1 2011-01-01 2011-01-01 false Requirements for biometric identifiers from... Requirements for biometric identifiers from aliens on departure from the United States. (a)(1) The Secretary of... of entry, to provide fingerprints, photograph(s) or other specified biometric identifiers...

  18. 8 CFR 215.8 - Requirements for biometric identifiers from aliens on departure from the United States.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 8 Aliens and Nationality 1 2012-01-01 2012-01-01 false Requirements for biometric identifiers from... Requirements for biometric identifiers from aliens on departure from the United States. (a)(1) The Secretary of... of entry, to provide fingerprints, photograph(s) or other specified biometric identifiers...

  19. 8 CFR 215.8 - Requirements for biometric identifiers from aliens on departure from the United States.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Requirements for biometric identifiers from... Requirements for biometric identifiers from aliens on departure from the United States. (a)(1) The Secretary of... of entry, to provide fingerprints, photograph(s) or other specified biometric identifiers...

  20. 8 CFR 215.8 - Requirements for biometric identifiers from aliens on departure from the United States.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 8 Aliens and Nationality 1 2013-01-01 2013-01-01 false Requirements for biometric identifiers from... Requirements for biometric identifiers from aliens on departure from the United States. (a)(1) The Secretary of... of entry, to provide fingerprints, photograph(s) or other specified biometric identifiers...

  1. 15 CFR 971.427 - Processing outside the United States.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Processing outside the United States... THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL RECOVERY PERMITS Issuance... outside the United States. If appropriate TCRs will incorporate provisions to implement the decision of...

  2. 8 CFR 215.8 - Requirements for biometric identifiers from aliens on departure from the United States.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 8 Aliens and Nationality 1 2014-01-01 2014-01-01 false Requirements for biometric identifiers from... Requirements for biometric identifiers from aliens on departure from the United States. (a)(1) The Secretary of... designated port of entry, to provide fingerprints, photograph(s) or other specified biometric identifiers...

  3. Identifying Core Concepts of Cybersecurity: Results of Two Delphi Processes

    ERIC Educational Resources Information Center

    Parekh, Geet; DeLatte, David; Herman, Geoffrey L.; Oliva, Linda; Phatak, Dhananjay; Scheponik, Travis; Sherman, Alan T.

    2018-01-01

    This paper presents and analyzes results of two Delphi processes that polled cybersecurity experts to rate cybersecurity topics based on importance, difficulty, and timelessness. These ratings can be used to identify core concepts--cross-cutting ideas that connect knowledge in the discipline. The first Delphi process identified core concepts that…

  4. Graphics processing unit-assisted lossless decompression

    DOEpatents

    Loughry, Thomas A.

    2016-04-12

    Systems and methods for decompressing compressed data that has been compressed by way of a lossless compression algorithm are described herein. In a general embodiment, a graphics processing unit (GPU) is programmed to receive compressed data packets and decompress such packets in parallel. The compressed data packets are compressed representations of an image, and the lossless compression algorithm is a Rice compression algorithm.
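
    The patent names the Rice algorithm as the lossless scheme being decoded. As a format-level illustration only (block layout and parameter names are assumptions, not taken from the patent), the sketch below decodes one Rice-coded block sequentially; the patented system performs the equivalent work in parallel across compressed packets on the GPU.

    ```python
    def rice_decode(bits, k, n_values):
        """Decode n_values non-negative integers from a Rice-coded bit list.

        Each value is stored as a unary quotient (q ones, then a zero)
        followed by a k-bit binary remainder: value = (q << k) | r.
        """
        out, i = [], 0
        for _ in range(n_values):
            q = 0
            while bits[i] == 1:      # unary part: count ones up to the 0 terminator
                q += 1
                i += 1
            i += 1                   # skip the terminating 0
            r = 0
            for _ in range(k):       # fixed-width remainder, MSB first
                r = (r << 1) | bits[i]
                i += 1
            out.append((q << k) | r)
        return out

    # 13 with k = 2: quotient 3 -> "1110", remainder 1 -> "01"
    assert rice_decode([1, 1, 1, 0, 0, 1], k=2, n_values=1) == [13]
    ```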

  5. Social processes underlying acculturation: a study of drinking behavior among immigrant Latinos in the Northeast United States

    PubMed Central

    LEE, CHRISTINA S.; LÓPEZ, STEVEN REGESER; COLBY, SUZANNE M.; TEJADA, MONICA; GARCÍA-COLL, CYNTHIA; SMITH, MARCIA

    2010-01-01

    Study Goals To identify social processes that underlie the relationship of acculturation and heavy drinking behavior among Latinos who have immigrated to the Northeast United States of America (USA). Method Community-based recruitment strategies were used to identify 36 Latinos who reported heavy drinking. Participants were 48% female, 23 to 56 years of age, and were from South or Central America (39%) and the Caribbean (24%). Six focus groups were audiotaped and transcribed. Results Content analyses indicated that the social context of drinking is different in the participants’ countries of origin and in the United States. In Latin America, alcohol consumption was part of everyday living (being with friends and family). Nostalgia and isolation reflected some of the reasons for drinking in the USA. Results suggest that drinking in the Northeastern United States (US) is related to Latinos’ adaptation to a new sociocultural environment. Knowledge of the shifting social contexts of drinking can inform health interventions. PMID:20376331

  6. Identifying influential factors of business process performance using dependency analysis

    NASA Astrophysics Data System (ADS)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
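
    As a toy illustration of the dependency-tree idea (synthetic data and hypothetical metric names; the paper's framework works on monitored process events and QoS measurements), a regression tree can be induced over lower-level metrics and inspected to see which of them a KPI depends on:

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor, export_text

    rng = np.random.default_rng(1)
    n = 500
    metrics = ["payment_service_ms", "queue_length", "error_rate"]  # hypothetical names
    X = np.column_stack([
        rng.normal(200, 50, n),     # payment service response time (ms)
        rng.poisson(5, n),          # orders waiting in queue
        rng.uniform(0, 0.05, n),    # fraction of failed service calls
    ])
    # Synthetic KPI (order fulfillment time) driven by the first two metrics only.
    kpi = 0.01 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 0.5, n)

    tree = DecisionTreeRegressor(max_depth=3).fit(X, kpi)
    print(export_text(tree, feature_names=metrics))              # drill-down structure
    print(dict(zip(metrics, tree.feature_importances_.round(3))))
    ```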

  7. Developing Statistical Models to Assess Transplant Outcomes Using National Registries: The Process in the United States.

    PubMed

    Snyder, Jon J; Salkowski, Nicholas; Kim, S Joseph; Zaun, David; Xiong, Hui; Israni, Ajay K; Kasiske, Bertram L

    2016-02-01

    Created by the US National Organ Transplant Act in 1984, the Scientific Registry of Transplant Recipients (SRTR) is obligated to publicly report data on transplant program and organ procurement organization performance in the United States. These reports include risk-adjusted assessments of graft and patient survival, and programs performing worse or better than expected are identified. The SRTR currently maintains 43 risk adjustment models for assessing posttransplant patient and graft survival and, in collaboration with the SRTR Technical Advisory Committee, has developed and implemented a new systematic process for model evaluation and revision. Patient cohorts for the risk adjustment models are identified, and single-organ and multiorgan transplants are defined, then each risk adjustment model is developed following a prespecified set of steps. Model performance is assessed, the model is refit to a more recent cohort before each evaluation cycle, and then it is applied to the evaluation cohort. The field of solid organ transplantation is unique in the breadth of the standardized data that are collected. These data allow for quality assessment across all transplant providers in the United States. A standardized process of risk model development using data from national registries may enhance the field.
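
    The cycle described (define the cohort, fit a risk-adjustment model, refit it to a recent cohort, then apply it to the evaluation cohort) maps naturally onto survival-modeling code. The sketch below shows that loop with the lifelines library on synthetic data; it is a generic stand-in, not one of the SRTR's 43 models, and the covariates are invented.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter
    from lifelines.utils import concordance_index

    rng = np.random.default_rng(6)

    def cohort(n):
        """Synthetic graft-survival cohort with two invented covariates."""
        age = rng.normal(50, 12, n)
        diabetes = rng.integers(0, 2, n)
        hazard = np.exp(0.03 * (age - 50) + 0.5 * diabetes)
        time = rng.exponential(5.0 / hazard)
        return pd.DataFrame({"age": age, "diabetes": diabetes,
                             "T": np.minimum(time, 3.0),          # censor at 3 years
                             "E": (time < 3.0).astype(int)})

    dev, evaluation = cohort(2000), cohort(500)   # refit cohort vs. evaluation cohort
    model = CoxPHFitter().fit(dev, duration_col="T", event_col="E")
    risk = model.predict_partial_hazard(evaluation)
    # Discrimination on the held-out cohort (higher partial hazard = shorter survival).
    print(concordance_index(evaluation["T"], -risk, evaluation["E"]))
    ```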

  8. A Shipping Container-Based Sterile Processing Unit for Low Resources Settings

    PubMed Central

    2016-01-01

    Deficiencies in the sterile processing of medical instruments contribute to poor outcomes for patients, such as surgical site infections, longer hospital stays, and deaths. In low resources settings, such as some rural and semi-rural areas and secondary and tertiary cities of developing countries, deficiencies in sterile processing are accentuated due to the lack of access to sterilization equipment, improperly maintained and malfunctioning equipment, lack of power to operate equipment, poor protocols, and inadequate quality control over inventory. Inspired by our sterile processing fieldwork at a district hospital in Sierra Leone in 2013, we built an autonomous, shipping-container-based sterile processing unit to address these deficiencies. The sterile processing unit, dubbed “the sterile box,” is a full suite capable of handling instruments from the moment they leave the operating room to the point they are sterile and ready to be reused for the next surgery. The sterile processing unit is self-sufficient in power and water and features an intake for contaminated instruments, decontamination, sterilization via non-electric steam sterilizers, and secure inventory storage. To validate efficacy, we ran tests of decontamination and sterilization performance. Results of 61 trials validate convincingly that our sterile processing unit achieves satisfactory outcomes for decontamination and sterilization and as such holds promise to support healthcare facilities in low resources settings. PMID:27007568

  9. No Exit: Identifying Avoidable Terminal Oncology Intensive Care Unit Hospitalizations

    PubMed Central

    Hantel, Andrew; Wroblewski, Kristen; Balachandran, Jay S.; Chow, Selina; DeBoer, Rebecca; Fleming, Gini F.; Hahn, Olwen M.; Kline, Justin; Liu, Hongtao; Patel, Bhakti K.; Verma, Anshu; Witt, Leah J.; Fukui, Mayumi; Kumar, Aditi; Howell, Michael D.; Polite, Blase N.

    2016-01-01

    Purpose: Terminal oncology intensive care unit (ICU) hospitalizations are associated with high costs and inferior quality of care. This study identifies and characterizes potentially avoidable terminal admissions of oncology patients to ICUs. Methods: This was a retrospective case series of patients cared for in an academic medical center's ambulatory oncology practice who died in an ICU during July 1, 2012 to June 30, 2013. An oncologist, intensivist, and hospitalist reviewed each patient's electronic health record from 3 months preceding terminal hospitalization until death. The primary outcome was the proportion of terminal ICU hospitalizations identified as potentially avoidable by two or more reviewers. Univariate and multivariate analyses were performed to identify characteristics associated with avoidable terminal ICU hospitalizations. Results: Seventy-two patients met inclusion criteria. The majority had solid tumor malignancies (71%), poor performance status (51%), and multiple encounters with the health care system. Despite high-intensity health care utilization, only 25% had documented advance directives. During a 4-day median ICU length of stay, 81% were intubated and 39% had cardiopulmonary resuscitation. Forty-seven percent of these hospitalizations were identified as potentially avoidable. Avoidable hospitalizations were associated with worse performance status before admission (median 2 v 1; P = .01), worse Charlson comorbidity score (median 8.5 v 7.0; P = .04), reason for hospitalization (P = .006), and a greater number of prior hospitalizations (median 2 v 1; P = .05). Conclusion: Given the high frequency of avoidable terminal ICU hospitalizations, health care leaders should develop strategies to prospectively identify patients at high risk and formulate interventions to improve end-of-life care. PMID:27601514

  10. Real-time radar signal processing using GPGPU (general-purpose graphic processing unit)

    NASA Astrophysics Data System (ADS)

    Kong, Fanxing; Zhang, Yan Rockee; Cai, Jingxiao; Palmer, Robert D.

    2016-05-01

    This study introduces a practical approach to developing a real-time signal processing chain for general phased array radar on NVIDIA GPUs (Graphics Processing Units) using CUDA (Compute Unified Device Architecture) libraries such as cuBLAS and cuFFT, which are adopted from open source libraries and optimized for NVIDIA GPUs. The processed results are rigorously verified against those from the CPUs. Performance, benchmarked as computation time for various input data cube sizes, is compared across GPUs and CPUs. Through this analysis, it is demonstrated that GPGPU (general-purpose GPU) real-time processing of array radar data is possible with relatively low-cost commercial GPUs.
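
    For a sense of the kind of kernel such a chain offloads, the sketch below implements FFT-based pulse compression (matched filtering) of a batch of received pulses in NumPy; the same lines would run on the GPU if NumPy were swapped for a GPU array library, which is essentially what the cuFFT-based chain does at much larger scale. Waveform parameters are illustrative.

    ```python
    import numpy as np

    def pulse_compress(rx, chirp):
        """Matched-filter each row of rx (pulses x samples) against the chirp."""
        n = rx.shape[1] + chirp.size - 1            # full correlation length
        H = np.conj(np.fft.fft(chirp, n))           # matched filter = conjugate spectrum
        return np.fft.ifft(np.fft.fft(rx, n, axis=1) * H, axis=1)

    fs, T = 1e6, 100e-6
    t = np.arange(int(fs * T)) / fs
    chirp = np.exp(1j * np.pi * 2e9 * t**2)         # linear FM pulse, 200 kHz sweep
    echo = np.roll(np.pad(chirp, (0, 400)), 57)     # one echo delayed by 57 samples
    peak = np.abs(pulse_compress(echo[None, :], chirp))[0].argmax()
    print(peak)                                     # 57: peak lands at the echo delay
    ```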

  11. Identifying Hydrologic Processes in Agricultural Watersheds Using Precipitation-Runoff Models

    USGS Publications Warehouse

    Linard, Joshua I.; Wolock, David M.; Webb, Richard M.T.; Wieczorek, Michael

    2009-01-01

    Understanding the fate and transport of agricultural chemicals applied to agricultural fields will assist in designing the most effective strategies to prevent water-quality impairments. At a watershed scale, the processes controlling the fate and transport of agricultural chemicals are generally understood only conceptually. To examine the applicability of conceptual models to the processes actually occurring, two precipitation-runoff models - the Soil and Water Assessment Tool (SWAT) and the Water, Energy, and Biogeochemical Model (WEBMOD) - were applied in different agricultural settings of the contiguous United States. Each model, through different physical processes, simulated the transport of water to a stream from the surface, the unsaturated zone, and the saturated zone. Models were calibrated for watersheds in Maryland, Indiana, and Nebraska. The calibrated sets of input parameters for each model at each watershed are discussed, and the criteria used to validate the models are explained. The SWAT and WEBMOD model results at each watershed conformed to each other and to the processes identified in each watershed's conceptual hydrology. In Maryland the conceptual understanding of the hydrology indicated groundwater flow was the largest annual source of streamflow; the simulation results for the validation period confirm this. The dominant source of water to the Indiana watershed was thought to be tile drains. Although tile drains were not explicitly simulated in the SWAT model, a large component of streamflow was received from lateral flow, which could be attributed to tile drains. Being able to explicitly account for tile drains, WEBMOD indicated water from tile drains constituted most of the annual streamflow in the Indiana watershed. The Nebraska models indicated annual streamflow was composed primarily of perennial groundwater flow and infiltration-excess runoff, which conformed to the conceptual hydrology developed for that watershed. The hydrologic

  12. Combination of an electrolytic pretreatment unit with secondary water reclamation processes

    NASA Technical Reports Server (NTRS)

    Wells, G. W.; Bonura, M. S.

    1973-01-01

    The design and fabrication of a flight concept prototype electrolytic pretreatment unit (EPU) and of a contractor-furnished air evaporation unit (AEU) are described. The integrated EPU and AEU potable water recovery system is referred to as the Electrovap and is capable of processing the urine and flush water of a six-man crew. Results of a five-day performance verification test of the Electrovap system are presented and plans are included for the extended testing of the Electrovap to produce data applicable to the combination of electrolytic pretreatment with most final potable water recovery systems. Plans are also presented for a program to define the design requirements for combining the electrolytic pretreatment unit with a reverse osmosis final processing unit.

  13. Assessment of Process Capability: the case of Soft Drinks Processing Unit

    NASA Astrophysics Data System (ADS)

    Sri Yogi, Kottala

    2018-03-01

    Process capability studies play a significant role in investigating process variation, which is important for achieving desired product quality characteristics. Capability indices measure the inherent variability of a process and thus support radical improvement of process performance. The main objective of this paper is to assess whether a soft drinks processing unit, bottling premier brands marketed in India, produces within specification. A few selected critical parameters in soft drinks processing were considered for this study: gas volume concentration, Brix concentration, and crock torque. Relevant statistical measures were assessed from a process capability indices perspective, covering both short-term and long-term capability. The assessment used real-time data from a soft drinks bottling company located in the state of Chhattisgarh, India. The analysis suggested reasons for variation in the process, which were validated using ANOVA; a Taguchi cost function was also applied to predict waste in monetary terms, which the organization can use to improve process parameters. This research work has substantially benefitted the organization in understanding the variation of selected critical parameters en route to achieving zero rejection.
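
    For reference, a minimal sketch of the standard Cp/Cpk computation for one quality parameter follows (here with the overall sample standard deviation; short-term variants use within-subgroup variation). The specification limits and readings are assumed for illustration.

    ```python
    import numpy as np

    def capability(samples, lsl, usl):
        """Standard Cp/Cpk process capability indices for one quality parameter."""
        mu, sigma = samples.mean(), samples.std(ddof=1)
        cp = (usl - lsl) / (6 * sigma)               # potential capability (spread only)
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # actual capability (spread + centering)
        return cp, cpk

    rng = np.random.default_rng(2)
    brix = rng.normal(10.5, 0.12, size=200)          # hypothetical Brix readings
    cp, cpk = capability(brix, lsl=10.0, usl=11.0)   # assumed specification limits
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")         # Cpk >= 1.33 is a common target
    ```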

  14. Sedimentary processes on the Atlantic Continental Slope of the United States

    USGS Publications Warehouse

    Knebel, H.J.

    1984-01-01

    Until recently, the sedimentary processes on the United States Atlantic Continental Slope were inferred mainly from descriptive studies based on the bathymetry and on widely spaced grab samples, bottom photographs, and seismic-reflection profiles. Over the past 6 years, however, much additional information has been collected on the bottom morphology, characteristics of shallow-subbottom strata, velocity of bottom currents, and transport of suspended and bottom sediments. A review of these new data provides a much clearer understanding of the kinds and relative importance of gravitational and hydrodynamic processes that affect the surface sediments. On the rugged slope between Georges Bank and Cape Lookout, N.C., these processes include: (1) small scale mass wasting within submarine canyons and peripheral gullies; (2) density flows within some submarine valleys; (3) sand spillover near the shelf break; (4) sediment creep on the upper slope; and (5) hemipelagic sedimentation on the middle and lower slope. The area between Georges Bank and Hudson Canyon is further distinguished by the relative abundance of large-scale slump scars and deposits on the open slope, the presence of ice-rafted debris, and the transport of sand within the heads of some submarine canyons. Between Cape Lookout and southern Florida, the slope divides into two physiographic units, and the topography is smooth and featureless. On the Florida-Hatteras Slope, offshelf sand spillover and sediment winnowing, related to Gulf Stream flow and possibly to storm-driven currents, are the major processes, whereas hemipelagic sedimentation is dominant over the offshore slope along the seaward edge of the Blake Plateau north of the Blake Spur. Slumping generally is absent south of Cape Lookout, although one large slump scarp (related to uplift over salt diapirs) has been identified east of Cape Romain. Future studies concerning sedimentary processes on the Atlantic slope need to resolve: (1) the ages and

  15. The Ortho-Syllable as a Processing Unit in Handwriting: The Mute E Effect

    ERIC Educational Resources Information Center

    Lambert, Eric; Sausset, Solen; Rigalleau, François

    2015-01-01

    Some research on written production has focused on the role of the syllable as a processing unit. However, the precise nature of this syllable unit has yet to be elucidated. The present study examined whether the nature of this processing unit is orthographic (i.e., an ortho-syllable) or phonological. We asked French adults to copy three-syllable…

  16. Ising Processing Units: Potential and Challenges for Discrete Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coffrin, Carleton James; Nagarajan, Harsha; Bent, Russell Whitford

    The recent emergence of novel computational devices, such as adiabatic quantum computers, CMOS annealers, and optical parametric oscillators, presents new opportunities for hybrid-optimization algorithms that leverage these kinds of specialized hardware. In this work, we propose the idea of an Ising processing unit as a computational abstraction for these emerging tools. Challenges involved in using and benchmarking these devices are presented, and open-source software tools are proposed to address some of these challenges. The proposed benchmarking tools and methodology are demonstrated by conducting a baseline study of established solution methods to a D-Wave 2X adiabatic quantum computer, one example of a commercially available Ising processing unit.
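
    To make the abstraction concrete: an Ising processing unit minimizes an energy of the form H(s) = 0.5 * s^T J s + h^T s over spins s_i in {-1, +1}. The sketch below is a plain simulated-annealing baseline of the kind such benchmarking studies compare specialized hardware against; the problem instance is random and illustrative.

    ```python
    import numpy as np

    def anneal(J, h, n_sweeps=2000, t0=2.0, t1=0.01, seed=0):
        """Minimize H(s) = 0.5*s.J.s + h.s over s in {-1,+1}^n by annealing.

        J must be symmetric with zero diagonal (pairwise Ising couplings).
        """
        rng = np.random.default_rng(seed)
        s = rng.choice([-1, 1], size=h.size)
        for temp in np.geomspace(t0, t1, n_sweeps):
            for i in rng.permutation(h.size):
                dE = -2 * s[i] * (J[i] @ s + h[i])   # energy change if s[i] flips
                if dE <= 0 or rng.random() < np.exp(-dE / temp):
                    s[i] = -s[i]
        return s, 0.5 * s @ J @ s + h @ s

    n = 30
    rng = np.random.default_rng(3)
    W = np.triu(rng.normal(0, 1, (n, n)), k=1)
    J = W + W.T                                      # symmetric, zero diagonal
    spins, energy = anneal(J, h=np.zeros(n))
    print(energy)
    ```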

  17. Aquarius Digital Processing Unit

    NASA Technical Reports Server (NTRS)

    Forgione, Joshua; Winkert, George; Dobson, Norman

    2009-01-01

    Three documents provide information on a digital processing unit (DPU) for the planned Aquarius mission, in which a radiometer aboard a spacecraft orbiting Earth is to measure radiometric temperatures from which data on sea-surface salinity are to be deduced. The DPU is the interface between the radiometer and an instrument-command-and-data system aboard the spacecraft. The DPU cycles the radiometer through a programmable sequence of states, collects and processes all radiometric data, and collects all housekeeping data pertaining to operation of the radiometer. The documents summarize the DPU design, with emphasis on innovative aspects that include mainly the following: a) In the radiometer and the DPU, conversion from analog voltages to digital data is effected by means of asynchronous voltage-to-frequency converters in combination with a frequency-measurement scheme implemented in field-programmable gate arrays (FPGAs). b) A scheme to compensate for aging and changes in the temperature of the DPU in order to provide an overall temperature-measurement accuracy within 0.01 K includes a high-precision, inexpensive DC temperature measurement scheme and a drift-compensation scheme that was used on the Cassini radar system. c) An interface among multiple FPGAs in the DPU guarantees setup and hold times.

  18. Developing a Comprehensive Model of Intensive Care Unit Processes: Concept of Operations.

    PubMed

    Romig, Mark; Tropello, Steven P; Dwyer, Cindy; Wyskiel, Rhonda M; Ravitz, Alan; Benson, John; Gropper, Michael A; Pronovost, Peter J; Sapirstein, Adam

    2015-04-23

    This study aimed to use a systems engineering approach to improve performance and stakeholder engagement in the intensive care unit to reduce several different patient harms. We developed a conceptual framework or concept of operations (ConOps) to analyze different types of harm that included 4 steps: risk assessment, appropriate therapies, monitoring and feedback, and patient and family communications. This framework used a transdisciplinary approach to inventory the tasks and work flows required to eliminate 7 common types of harm experienced by patients in the intensive care unit. The inventory gathered both implicit and explicit information about how the system works or should work and converted the information into a detailed specification that clinicians could understand and use. Using the ConOps document, we created highly detailed work flow models to reduce harm and offer an example of its application to deep venous thrombosis. In the deep venous thrombosis model, we identified tasks that were synergistic across different types of harm. We will use a system of systems approach to integrate the variety of subsystems and coordinate processes across multiple types of harm to reduce the duplication of tasks. Through this process, we expect to improve efficiency and demonstrate synergistic interactions that ultimately can be applied across the spectrum of potential patient harms and patient locations. Engineering health care to be highly reliable will first require an understanding of the processes and work flows that comprise patient care. The ConOps strategy provided a framework for building complex systems to reduce patient harm.

  19. Exploring the decision-making process in the delivery of physiotherapy in a stroke unit.

    PubMed

    McGlinchey, Mark P; Davenport, Sally

    2015-01-01

    The aim of this study was to explore the decision-making process in the delivery of physiotherapy in a stroke unit. A focused ethnographical approach involving semi-structured interviews and observations of clinical practice was used. A purposive sample of seven neurophysiotherapists and four patients participated in semi-structured interviews. From this group, three neurophysiotherapists and four patients were involved in observation of practice. Data from interviews and observations were analysed to generate themes. Three themes were identified: planning the ideal physiotherapy delivery, the reality of physiotherapy delivery and involvement in the decision-making process. Physiotherapists used a variety of clinical reasoning strategies and considered many factors to influence their decision-making in the planning and delivery of physiotherapy post-stroke. These factors included the therapist's clinical experience, patient's presentation and response to therapy, prioritisation, organisational constraints and compliance with organisational practice. All physiotherapists highlighted the importance to involve patients in planning and delivering their physiotherapy. However, there were varying levels of patient involvement observed in this process. The study has generated insight into the reality of decision-making in the planning and delivery of physiotherapy post-stroke. Further research involving other stroke units is required to gain a greater understanding of this aspect of physiotherapy. Implications for Rehabilitation Physiotherapists need to consider multiple patient, therapist and organisational factors when planning and delivering physiotherapy in a stroke unit. Physiotherapists should continually reflect upon how they provide physiotherapy, with respect to the duration, frequency and time of day sessions are delivered, in order to guide current and future physiotherapy delivery. As patients may demonstrate varying levels of participation in deciding and

  20. Models of unit operations used for solid-waste processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savage, G.M.; Glaub, J.C.; Diaz, L.F.

    1984-09-01

    This report documents the unit operations models that have been developed for typical refuse-derived-fuel (RDF) processing systems. These models, which represent the mass balances, energy requirements, and economics of the unit operations, are derived, where possible, from basic principles. Empiricism has been invoked where a governing theory has yet to be developed. Field test data and manufacturers' information, where available, supplement the analytical development of the models. A literature review has also been included for the purpose of compiling and discussing in one document the available information pertaining to the modeling of front-end unit operations. Separate analyses have been done for each task.

  1. A consensus exercise identifying priorities for research into clinical effectiveness among children's orthopaedic surgeons in the United Kingdom.

    PubMed

    Perry, D C; Wright, J G; Cooke, S; Roposch, A; Gaston, M S; Nicolaou, N; Theologis, T

    2018-05-01

    Aims High-quality clinical research in children's orthopaedic surgery has lagged behind other surgical subspecialties. This study used a consensus-based approach to identify research priorities for clinical trials in children's orthopaedics. Methods A modified Delphi technique was used, which involved an initial scoping survey, a two-round Delphi process and an expert panel formed of members of the British Society of Children's Orthopaedic Surgery. The survey was conducted amongst orthopaedic surgeons treating children in the United Kingdom and Ireland. Results A total of 86 clinicians contributed to both rounds of the Delphi process, scoring priorities from one (low priority) to five (high priority). Elective topics were ranked higher than those relating to trauma, with the top ten elective research questions scoring higher than the top question for trauma. Ten elective and five trauma research priorities were identified, with the three highest ranked questions relating to the treatment of slipped capital femoral epiphysis (mean score 4.6/5), Perthes' disease (4.5) and bone infection (4.5). Conclusion This consensus-based research agenda will guide surgeons, academics and funders to improve the evidence in children's orthopaedic surgery and encourage the development of multicentre clinical trials. Cite this article: Bone Joint J 2018;100-B:680-4.

  2. An Ontology for Identifying Cyber Intrusion Induced Faults in Process Control Systems

    NASA Astrophysics Data System (ADS)

    Hieb, Jeffrey; Graham, James; Guan, Jian

    This paper presents an ontological framework that permits formal representations of process control systems, including elements of the process being controlled and the control system itself. A fault diagnosis algorithm based on the ontological model is also presented. The algorithm can identify traditional process elements as well as control system elements (e.g., IP network and SCADA protocol) as fault sources. When these elements are identified as a likely fault source, the possibility exists that the process fault is induced by a cyber intrusion. A laboratory-scale distillation column is used to illustrate the model and the algorithm. Coupled with a well-defined statistical process model, this fault diagnosis approach provides cyber security enhanced fault diagnosis information to plant operators and can help identify that a cyber attack is underway before a major process failure is experienced.

  3. Fast, multi-channel real-time processing of signals with microsecond latency using graphics processing units.

    PubMed

    Rath, N; Kato, S; Levesque, J P; Mauel, M E; Navratil, G A; Peng, Q

    2014-04-01

    Fast digital signal processing (DSP) has many applications. Typical hardware options for performing DSP are field-programmable gate arrays (FPGAs), application-specific integrated DSP chips, or general purpose personal computer systems. This paper presents a novel DSP platform that has been developed for feedback control on the HBT-EP tokamak device. The system runs all signal processing exclusively on a Graphics Processing Unit (GPU) to achieve real-time performance with latencies below 8 μs. Signals are transferred into and out of the GPU using PCI Express peer-to-peer direct-memory-access transfers without involvement of the central processing unit or host memory. Tests were performed on the feedback control system of the HBT-EP tokamak using forty 16-bit floating point inputs and outputs each and a sampling rate of up to 250 kHz. Signals were digitized by a D-TACQ ACQ196 module, processing was done on an NVIDIA GTX 580 GPU programmed in CUDA, and analog output was generated by D-TACQ AO32CPCI modules.

  4. The interprocess NIR sampling as an alternative approach to multivariate statistical process control for identifying sources of product-quality variability.

    PubMed

    Marković, Snežana; Kerč, Janez; Horvat, Matej

    2017-03-01

    We present a new approach to identifying sources of variability within a manufacturing process by NIR measurements of samples of intermediate material after each consecutive unit operation (interprocess NIR sampling technique). In addition, we summarize the development of a multivariate statistical process control (MSPC) model for the production of an enteric-coated pellet product of the proton-pump inhibitor class. By developing provisional NIR calibration models, the identification of critical process points yields results comparable to the established MSPC modeling procedure. Both approaches are shown to lead to the same conclusion, identifying parameters of extrusion/spheronization and characteristics of lactose that have the greatest influence on the end-product's enteric coating performance. The proposed approach enables quicker and easier identification of variability sources during the manufacturing process, especially in cases when historical process data is not straightforwardly available. In the presented case, changes in lactose characteristics influenced the performance of the extrusion/spheronization process step. The pellet cores produced by using one (considered less suitable) lactose source were on average larger and more fragile, leading to consequent breakage of the cores during subsequent fluid bed operations. These results were confirmed by additional experimental analyses illuminating the underlying mechanism of fracture of oblong pellets during the pellet coating process, leading to compromised film coating.
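
    The established MSPC route that the interprocess NIR approach is compared against typically builds a latent-variable model of in-control batches and charts new batches against it. The sketch below shows that generic idea as a PCA-based Hotelling T-squared chart on synthetic data; it is a textbook illustration, not the authors' actual model.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(4)
    normal = rng.normal(0.0, 1.0, (100, 12))     # historical in-control batch records
    new = rng.normal(0.8, 1.0, (5, 12))          # new batches with a shifted mean

    pca = PCA(n_components=3).fit(normal)

    def t2(X):
        """Hotelling T^2 statistic in the PCA score space."""
        scores = pca.transform(X)
        return ((scores ** 2) / pca.explained_variance_).sum(axis=1)

    limit = np.percentile(t2(normal), 99)        # empirical 99% control limit
    print(t2(new) > limit)                       # flags batches drifting from normal
    ```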

  5. Estimating Missing Unit Process Data in Life Cycle Assessment Using a Similarity-Based Approach.

    PubMed

    Hou, Ping; Cai, Jiarui; Qu, Shen; Xu, Ming

    2018-05-01

    In life cycle assessment (LCA), collecting unit process data from empirical sources (i.e., meter readings, operation logs/journals) is often costly and time-consuming. We propose a new computational approach that estimates missing unit process data solely from limited known data, based on a similarity-based link prediction method. The intuition is that similar processes in a unit process network tend to have similar material/energy inputs and waste/emission outputs. We use the ecoinvent 3.1 unit process data sets to test our method in four steps: (1) dividing the data sets into a training set and a test set; (2) randomly removing certain numbers of data points in the test set, which are then treated as missing; (3) using similarity-weighted means of various numbers of most similar processes in the training set to estimate the missing data in the test set; and (4) comparing estimated data with the original values to determine the performance of the estimation. The results show that missing data can be accurately estimated when less than 5% of the data are missing in one process. The estimation performance decreases as the percentage of missing data increases. This study provides a new approach to compiling unit process data and demonstrates the promising potential of computational approaches for LCA data compilation.
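
    The core estimator (a similarity-weighted mean over the k most similar processes that report the target flow) is compact enough to sketch directly. The distance choice below, cosine similarity over the flows both processes report, is one reasonable reading rather than the paper's exact metric, and all names and data are illustrative.

    ```python
    import numpy as np

    def estimate_missing(P, i, j, k=5):
        """Estimate missing flow j of unit process i in matrix P (processes x flows).

        Missing entries are np.nan; the estimate is the similarity-weighted
        mean of flow j over the k most similar donor processes.
        """
        donors, sims = [], []
        for other in range(P.shape[0]):
            if other == i or np.isnan(P[other, j]):
                continue
            shared = ~np.isnan(P[i]) & ~np.isnan(P[other])   # flows both report
            if shared.sum() < 2:
                continue
            a, b = P[i, shared], P[other, shared]
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            if denom > 0:
                donors.append(P[other, j])
                sims.append(a @ b / denom)                   # cosine similarity
        top = np.argsort(sims)[-k:]                          # k most similar donors
        w = np.clip(np.array(sims)[top], 0, None)
        return np.average(np.array(donors)[top], weights=w) if w.sum() > 0 else np.nan

    rng = np.random.default_rng(5)
    P = rng.lognormal(size=(40, 8))
    true_value, P[3, 2] = P[3, 2], np.nan
    print(estimate_missing(P, 3, 2), "vs true", true_value)
    ```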

  6. Fast analytical scatter estimation using graphics processing units.

    PubMed

    Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris

    2015-01-01

    To develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds for a single projection, using a single NVIDIA 9800 GX2 video card. Accounting for first order scatter in cone-beam image reconstruction improves the contrast to noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and with further acceleration and a method to account for multiple scatter may be useful for practical scatter correction schemes.

  7. Testing a model of componential processing of multi-symbol numbers-evidence from measurement units.

    PubMed

    Huber, Stefan; Bahnmueller, Julia; Klein, Elise; Moeller, Korbinian

    2015-10-01

    Research on numerical cognition has addressed the processing of nonsymbolic quantities and symbolic digits extensively. However, magnitude processing of measurement units is still a neglected topic in numerical cognition research. Hence, we investigated the processing of measurement units to evaluate whether typical effects of multi-digit number processing such as the compatibility effect, the string length congruity effect, and the distance effect are also present for measurement units. In three experiments, participants had to single out the larger one of two physical quantities (e.g., lengths). In Experiment 1, the compatibility of number and measurement unit (compatible: 3 mm_6 cm with 3 < 6 and mm < cm; incompatible: 3 cm_6 mm with 3 < 6 but cm > mm) as well as string length congruity (congruent: 1 m_2 km with m < km and 2 < 3 characters; incongruent: 2 mm_1 m with mm < m, but 3 > 2 characters) were manipulated. We observed reliable compatibility effects with prolonged reaction times (RT) for incompatible trials. Moreover, a string length congruity effect was present in RT with longer RT for incongruent trials. Experiments 2 and 3 served as control experiments showing that compatibility effects persist when controlling for holistic distance and that a distance effect for measurement units exists. Our findings indicate that numbers and measurement units are processed in a componential manner and thus highlight that processing characteristics of multi-digit numbers generalize to measurement units. Thereby, our data lend further support to the recently proposed generalized model of componential multi-symbol number processing.

  8. High Power Silicon Carbide (SiC) Power Processing Unit Development

    NASA Technical Reports Server (NTRS)

    Scheidegger, Robert J.; Santiago, Walter; Bozak, Karin E.; Pinero, Luis R.; Birchenough, Arthur G.

    2015-01-01

    NASA GRC successfully designed, built and tested a technology-push power processing unit for electric propulsion applications that utilizes high voltage silicon carbide (SiC) technology. The development specifically addresses the need for high power electronics to enable electric propulsion systems in the 100s of kilowatts. This unit demonstrated how high voltage combined with superior semiconductor components resulted in exceptional converter performance.

  9. Grace: A cross-platform micromagnetic simulator on graphics processing units

    NASA Astrophysics Data System (ADS)

    Zhu, Ru

    2015-12-01

    A micromagnetic simulator running on graphics processing units (GPUs) is presented. Different from GPU implementations of other research groups which are predominantly running on NVidia's CUDA platform, this simulator is developed with C++ Accelerated Massive Parallelism (C++ AMP) and is hardware platform independent. It runs on GPUs from venders including NVidia, AMD and Intel, and achieves significant performance boost as compared to previous central processing unit (CPU) simulators, up to two orders of magnitude. The simulator paved the way for running large size micromagnetic simulations on both high-end workstations with dedicated graphics cards and low-end personal computers with integrated graphics cards, and is freely available to download.

  10. High Temperature Boost (HTB) Power Processing Unit (PPU) Formulation Study

    NASA Technical Reports Server (NTRS)

    Chen, Yuan; Bradley, Arthur T.; Iannello, Christopher J.; Carr, Gregory A.; Mohammad, Mojarradi M.; Hunter, Don J.; DelCastillo, Linda; Stell, Christopher B.

    2013-01-01

    This technical memorandum is to summarize the Formulation Study conducted during fiscal year 2012 on the High Temperature Boost (HTB) Power Processing Unit (PPU). The effort is authorized and supported by the Game Changing Technology Division, NASA Office of the Chief Technologist. NASA center participation during the formulation includes LaRC, KSC and JPL. The Formulation Study continues into fiscal year 2013. The formulation study has focused on the power processing unit. The team has proposed a modular, power scalable, and new technology enabled High Temperature Boost (HTB) PPU, which has 5-10X improvement in PPU specific power/mass and over 30% in-space solar electric system mass saving.

  11. 40 CFR 63.1016 - Alternative means of emission limitation: Enclosed-vented process units.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 10 2010-07-01 2010-07-01 false Alternative means of emission limitation: Enclosed-vented process units. 63.1016 Section 63.1016 Protection of Environment ENVIRONMENTAL... § 63.1016 Alternative means of emission limitation: Enclosed-vented process units. (a) Use of closed...

  12. 40 CFR 63.1016 - Alternative means of emission limitation: Enclosed-vented process units.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 10 2011-07-01 2011-07-01 false Alternative means of emission limitation: Enclosed-vented process units. 63.1016 Section 63.1016 Protection of Environment ENVIRONMENTAL... § 63.1016 Alternative means of emission limitation: Enclosed-vented process units. (a) Use of closed...

  13. Developing and Implementing a Process for the Review of Nonacademic Units.

    ERIC Educational Resources Information Center

    Brown, Marilyn K.

    1989-01-01

    A major research university's recently developed process for systematic evaluation of nonacademic units is described, and the steps in its development and implementation are outlined: review of literature on organizational effectiveness; survey of peer institutions; development of guidelines for review; and implementation in several campus units.…

  14. Technical options for processing additional light tight oil volumes within the United States

    EIA Publications

    2015-01-01

    This report examines technical options for processing additional LTO volumes within the United States. Domestic processing of additional LTO would enable an increase in petroleum product exports from the United States, already the world’s largest net exporter of petroleum products. Unlike crude oil, products are not subject to export limitations or licensing requirements. While this is one possible approach to absorbing higher domestic LTO production in the absence of a relaxation of current limitations on crude exports, domestic LTO would have to be priced at a level required to encourage additional LTO runs at existing refinery units, debottlenecking, or possible additions of processing capacity.

  15. Psychiatry training in the United Kingdom--part 2: the training process.

    PubMed

    Christodoulou, N; Kasiakogia, K

    2015-01-01

    In the second part of this diptych, we shall deal with psychiatric training in the United Kingdom in detail, and we will compare it--wherever this is meaningful--with the equivalent system in Greece. As explained in the first part of the paper, due to the recently increased emigration of Greek psychiatrists and psychiatric trainees, and the fact that the United Kingdom is a popular destination, it has become necessary to inform those aspiring to train in the United Kingdom of the system and the circumstances they should expect to encounter. This paper principally describes the structure of the United Kingdom's psychiatric training system, including the different stages trainees progress through and their respective requirements and processes. Specifically, specialty and subspecialty options are described and explained, special paths in training are analysed, and the notions of "special interest day" and the optional "Out of programme experience" schemes are explained. Furthermore, detailed information is offered on the pivotal points of each of the stages of the training process, with special care to explain the important differences and similarities between the systems in Greece and the United Kingdom. Special attention is given to The Royal College of Psychiatrists' Membership Exams (MRCPsych) because they are the only exams towards completing specialisation in Psychiatry in the United Kingdom. Also, the educational culture of progressing according to a set curriculum, of utilising diverse means of professional development, of empowering the trainees' autonomy by allowing initiative-based development and of applying peer supervision as a tool for professional development is stressed. We conclude that psychiatric training in the United Kingdom differs substantially from that of Greece in both structure and process. There are various differences such as pure psychiatric training in the United Kingdom versus neurological and medical modules in Greece, in

  16. Using Mathematica to Teach Process Units: A Distillation Case Study

    ERIC Educational Resources Information Center

    Rasteiro, Maria G.; Bernardo, Fernando P.; Saraiva, Pedro M.

    2005-01-01

    The question addressed here is how to integrate computational tools, namely interactive general-purpose platforms, in the teaching of process units. Mathematica has been selected as a complementary tool to teach distillation processes, with the main objective of leading students to achieve a better understanding of the physical phenomena involved…

  17. Identifying cryptotephra units using correlated rapid, nondestructive methods: VSWIR spectroscopy, X-ray fluorescence, and magnetic susceptibility

    NASA Astrophysics Data System (ADS)

    McCanta, Molly C.; Hatfield, Robert G.; Thomson, Bradley J.; Hook, Simon J.; Fisher, Elizabeth

    2015-12-01

    Understanding the frequency, magnitude, and nature of explosive volcanic eruptions is essential for hazard planning and risk mitigation. Terrestrial stratigraphic tephra records can be patchy and incomplete due to subsequent erosion and burial processes. In contrast, the marine sedimentary record commonly preserves a more complete historical record of volcanic activity as individual events are archived within continually accumulating background sediments. While larger tephra layers are often identifiable by changes in sediment color and/or texture, smaller fallout layers may also be present that are not visible to the naked eye. These cryptotephra are commonly more difficult to identify and often require time-consuming and destructive point counting, petrography, and microscopy work. Here we present several rapid, nondestructive, and quantitative core scanning methodologies (magnetic susceptibility, visible to shortwave infrared spectroscopy, and XRF core scanning) which, when combined, can be used to identify the presence of increased volcaniclastic components (interpreted to be cryptotephra) in the sedimentary record. We develop a new spectral parameter (BDI1000VIS) that exploits the absorption of the 1 µm near-infrared band in tephra. Using predetermined mixtures, BDI1000VIS can accurately identify tephra layers in concentrations >15-20%. When applied to the upper ~270 kyr record of IODP core U1396C from the Caribbean Sea, and verified by traditional point counting, 29 potential cryptotephra layers were identified as originating from eruptions of the Lesser Antilles Volcanic Arc. Application of these methods in future coring endeavors can be used to minimize the need for physical disaggregation of valuable drill core material and allow for near-real-time recognition of tephra units, both visible and cryptotephra.
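
    The new spectral parameter is an integrated band-depth measure around the 1 µm absorption. Purely to illustrate what such a parameter computes (the shoulder wavelengths and details below are illustrative, not the published BDI1000VIS definition), band depth can be summed after removing a linear continuum:

    ```python
    import numpy as np

    def band_depth_1um(wavelength_nm, reflectance, left=750.0, right=1250.0):
        """Integrated 1-um band depth after linear continuum removal.

        A straight-line continuum is drawn between two shoulder wavelengths
        and the depth (1 - R/continuum) is summed across the band. Shoulder
        positions here are illustrative.
        """
        rl = np.interp(left, wavelength_nm, reflectance)
        rr = np.interp(right, wavelength_nm, reflectance)
        band = (wavelength_nm >= left) & (wavelength_nm <= right)
        continuum = rl + (rr - rl) * (wavelength_nm[band] - left) / (right - left)
        return np.sum(1.0 - reflectance[band] / continuum)

    wl = np.linspace(400, 2500, 500)
    spectrum = 0.3 - 0.05 * np.exp(-((wl - 1000) / 120) ** 2)  # toy 1-um absorption
    print(band_depth_1um(wl, spectrum))   # larger values suggest stronger absorption
    ```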

  18. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating the model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is also simulated by two models of different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  19. Identifying pathogenic processes by integrating microarray data with prior knowledge

    PubMed Central

    2014-01-01

    Background It is of great importance to identify molecular processes and pathways that are involved in disease etiology. Although there has been an extensive use of various high-throughput methods for this task, pathogenic pathways are still not completely understood. Often the set of genes or proteins identified as altered in genome-wide screens show a poor overlap with canonical disease pathways. These findings are difficult to interpret, yet crucial in order to improve the understanding of the molecular processes underlying the disease progression. We present a novel method for identifying groups of connected molecules from a set of differentially expressed genes. These groups represent functional modules sharing common cellular function and involve signaling and regulatory events. Specifically, our method makes use of Bayesian statistics to identify groups of co-regulated genes based on the microarray data, where external information about molecular interactions and connections are used as priors in the group assignments. Markov chain Monte Carlo sampling is used to search for the most reliable grouping. Results Simulation results showed that the method improved the ability of identifying correct groups compared to traditional clustering, especially for small sample sizes. Applied to a microarray heart failure dataset the method found one large cluster with several genes important for the structure of the extracellular matrix and a smaller group with many genes involved in carbohydrate metabolism. The method was also applied to a microarray dataset on melanoma cancer patients with or without metastasis, where the main cluster was dominated by genes related to keratinocyte differentiation. Conclusion Our method found clusters overlapping with known pathogenic processes, but also pointed to new connections extending beyond the classical pathways. PMID:24758699

  20. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  2. Undergraduate Game Degree Programs in the United Kingdom and United States: A Comparison of the Curriculum Planning Process

    ERIC Educational Resources Information Center

    McGill, Monica M.

    2010-01-01

    Digital games are marketed, mass-produced, and consumed by an increasing number of people and the game industry is only expected to grow. In response, post-secondary institutions in the United Kingdom (UK) and the United States (US) have started to create game degree programs. Though curriculum theorists provide insight into the process of…

  3. 76 FR 34031 - United States Standards for Grades of Processed Raisins

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-10

    ...The Agricultural Marketing Service (AMS), of the United States Department of Agriculture (USDA) is withdrawing a notice soliciting comments on its proposed revision to the United States Standards for Grades of Processed Raisins. Based on the petitioner's request to withdraw their petition, the agency has decided not to proceed with this action.

  4. COST ESTIMATION MODELS FOR DRINKING WATER TREATMENT UNIT PROCESSES

    EPA Science Inventory

    Cost models for unit processes typically utilized in a conventional water treatment plant and in package treatment plant technology are compiled in this paper. The cost curves are represented as a function of specified design parameters and are categorized into four major catego...

  5. Survived so what? Identifying priorities for research with children and families post-paediatric intensive care unit.

    PubMed

    Manning, Joseph C; Hemingway, Pippa; Redsell, Sarah A

    2018-03-01

    The involvement of patients and the public in the development, implementation and evaluation of health care services and research is recognized to have tangible benefits in relation to effectiveness and credibility. However, despite >96% of children and young people surviving critical illness or injury, there is a paucity of published reports demonstrating their contribution to informing the priorities for aftercare services and outcomes research. We aimed to identify the service and research priorities for Paediatric Intensive Care Unit survivors with children and young people, their families and other stakeholders. We conducted a face-to-face, multiple-stakeholder consultation event, held in the Midlands (UK), to provide opportunities for experiences, views and priorities to be elicited. Data were gathered using write/draw and tell and focus group approaches. An inductive content analytical approach was used to categorize and conceptualize feedback. A total of 26 individuals attended the consultation exercise, including children and young people who were critical care survivors; their siblings; parents and carers; health professionals; academics; commissioners; and service managers. Consultation findings indicated that future services, interventions and research must be holistic and family-centred. Children and young people advisors reported priorities that focused on longer-term outcomes, whereas adult advisors identified priorities that mapped against the pathways of care. Specific priorities included developing and testing interventions that address unmet communication and information needs. Furthermore, initiatives to optimize the lives and longer-term functional and psycho-social outcomes of Paediatric Intensive Care Unit survivors were identified. This consultation exercise provides further evidence of the value of meaningful patient and public involvement in identifying the priorities for research and services for Paediatric Intensive Care Unit survivors

  6. Use of general purpose graphics processing units with MODFLOW

    USGS Publications Warehouse

    Hughes, Joseph D.; White, Jeremy T.

    2013-01-01

    To evaluate the use of general-purpose graphics processing units (GPGPUs) to improve the performance of MODFLOW, an unstructured preconditioned conjugate gradient (UPCG) solver has been developed. The UPCG solver uses a compressed sparse row storage scheme and includes Jacobi, zero fill-in incomplete, and modified-incomplete lower-upper (LU) factorization, and generalized least-squares polynomial preconditioners. The UPCG solver also includes options for sequential and parallel solution on the central processing unit (CPU) using OpenMP. For simulations utilizing the GPGPU, all basic linear algebra operations are performed on the GPGPU; memory copies between the CPU and GPGPU occur prior to the first iteration of the UPCG solver and after satisfying head and flow criteria or exceeding a maximum number of iterations. The efficiency of the UPCG solver for GPGPU and CPU solutions is benchmarked using simulations of a synthetic, heterogeneous unconfined aquifer with tens of thousands to millions of active grid cells. Testing indicates GPGPU speedups on the order of 2 to 8, relative to the standard MODFLOW preconditioned conjugate gradient (PCG) solver, can be achieved when (1) memory copies between the CPU and GPGPU are optimized, (2) the percentage of time performing memory copies between the CPU and GPGPU is small relative to the calculation time, (3) high-performance GPGPU cards are utilized, and (4) CPU-GPGPU combinations are used to execute sequential operations that are difficult to parallelize. Furthermore, UPCG solver testing indicates GPGPU speedups exceed parallel CPU speedups achieved using OpenMP on multicore CPUs for preconditioners that can be easily parallelized.
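
    The core of such a solver is an ordinary preconditioned conjugate gradient loop over CSR data. A minimal serial sketch with the Jacobi preconditioner is shown below for illustration only; the function name and test problem are mine, and the real UPCG solver executes these kernels on the GPGPU with additional preconditioner options:

```python
# Jacobi-preconditioned conjugate gradient on a CSR matrix (serial reference).
import numpy as np
import scipy.sparse as sp

def jacobi_pcg(A, b, tol=1e-8, maxiter=500):
    minv = 1.0 / A.diagonal()          # Jacobi preconditioner: M^-1 = 1/diag(A)
    x = np.zeros_like(b)
    r = b - A @ x
    z = minv * r
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:    # head/flow-style closure criterion
            break
        z = minv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# 1-D Poisson test problem assembled directly in CSR format.
n = 200
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
x = jacobi_pcg(A, b)
print(np.linalg.norm(A @ x - b))       # residual norm after convergence
```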

  7. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; McDougal, Matthew; Russell, Sam

    2013-01-01

    Objective: To develop a software application utilizing general purpose graphics processing units (GPUs) for the analysis of large sets of thermographic data. Background: Over the past few years, an increasing effort among scientists and engineers to utilize the GPU in a more general purpose fashion is allowing for supercomputer level results at individual workstations. As data sets grow, the methods to work them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU to allow for throughput that was previously reserved for compute clusters. These common computations have high degrees of data parallelism, that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Signal (image) processing is one area where GPUs are being used to greatly increase the performance of certain algorithms and analysis techniques.

  8. The Self in Movement: Being Identified and Identifying Oneself in the Process of Migration and Asylum Seeking.

    PubMed

    Watzlawik, Meike; Brescó de Luna, Ignacio

    2017-06-01

    How migration influences the processes of identity development has been under longstanding scrutiny in the social sciences. Usually, stage models have been suggested, and different strategies for acculturation (e.g., integration, assimilation, separation, and marginalization) have been considered as ways to make sense of the psychological transformations of migrants as a group. On an individual level, however, identity development is a more complex endeavor: identity does not just develop by itself, but is constructed as an ongoing process. To capture these processes, we look at different aspects of migration and asylum seeking; for example, the culture-specific values and expectations of the hosting (European) countries (e.g., as identifier), but also of the arriving individuals/groups (e.g., identified as refugees). Since the two may contradict each other, negotiations between identity claims and identity assignments become necessary. Ways to resolve these contradictions are discussed, with a special focus on the experienced (and often missing) agency in different settings upon arrival in a new country. In addition, it is shown how sudden events (e.g., 9/11, the Charlie Hebdo attack) may challenge identity processes in different ways.

  9. Acceleration of integral imaging based incoherent Fourier hologram capture using graphic processing unit.

    PubMed

    Jeong, Kyeong-Min; Kim, Hee-Seung; Hong, Sung-In; Lee, Sung-Keun; Jo, Na-Young; Kim, Yong-Soo; Lim, Hong-Gi; Park, Jae-Hyeung

    2012-10-08

    Speed enhancement of integral imaging based incoherent Fourier hologram capture using a graphic processing unit is reported. The integral imaging based method enables exact hologram capture of real, existing three-dimensional objects under regular incoherent illumination. In our implementation, we apply a parallel computation scheme using the graphic processing unit, accelerating the processing speed. Using the enhanced speed of hologram capture, we also implement a pseudo real-time hologram capture and optical reconstruction system. The overall operation speed is measured to be 1 frame per second.

  10. Quality Improvement Process in a Large Intensive Care Unit: Structure and Outcomes.

    PubMed

    Reddy, Anita J; Guzman, Jorge A

    2016-11-01

    Quality improvement in the health care setting is a complex process, and even more so in the critical care environment. The development of intensive care unit process measures and quality improvement strategies is associated with improved outcomes, but should be individualized to each medical center as structure and culture can differ from institution to institution. The purpose of this report is to describe the structure of quality improvement processes within a large medical intensive care unit while using examples of the study institution's successes and challenges in the areas of stat antibiotic administration, reduction in blood product waste, central line-associated bloodstream infections, and medication errors. © The Author(s) 2015.

  11. Accelerating Malware Detection via a Graphics Processing Unit

    DTIC Science & Technology

    2010-09-01

    Abbreviations defined in the report front matter include GPU (Graphics Processing Unit), PE (Portable Executable), and COFF (Common Object File Format). The PE format is an updated version of the common object file format (COFF) [Mic06]. Alerts on misidentified files can be costly in terms of the time and resources individuals and organizations must spend investigating them [YWL07] [Vak10].

  12. Identifying patient-level health and social care costs for older adults discharged from acute medical units in England.

    PubMed

    Franklin, Matthew; Berdunov, Vladislav; Edmans, Judi; Conroy, Simon; Gladman, John; Tanajewski, Lukasz; Gkountouras, Georgios; Elliott, Rachel A

    2014-09-01

    Acute medical units allow those who need admission to be correctly identified, and those who could be managed in ambulatory settings to be discharged. However, re-admission rates for older people following discharge from acute medical units are high and may be associated with substantial health and social care costs. This study identifies patient-level health and social care costs for older people discharged from acute medical units in England, through a prospective cohort study of health and social care resource use in an acute medical unit in Nottingham, England. Four hundred and fifty-six people aged over 70 who were discharged from an acute medical unit within 72 h of admission took part. Hospitalisation and social care data were collected for 3 months post-recruitment. In Nottingham, further approvals were gained to obtain data from general practices, ambulance services, intermediate care and mental healthcare. Resource use was combined with national unit costs. Costs from all sectors were available for 250 participants. The mean (95% CI, median, range) total cost was £1926 (1579-2383, 659, 0-23,612). Contribution was: secondary care (76.1%), primary care (10.9%), ambulance service (0.7%), intermediate care (0.2%), mental healthcare (2.1%) and social care (10.0%). The costliest 10% of participants accounted for 50% of the cost. This study highlights the costs accrued by older people discharged from acute medical units (AMUs): they are mainly (76%) in secondary care, and half of all costs were incurred by a minority of participants (10%). © The Author 2014. Published by Oxford University Press on behalf of the British Geriatrics Society. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  13. Alternative Procedure of Heat Integration Technique Election between Two Unit Processes to Improve Energy Saving

    NASA Astrophysics Data System (ADS)

    Santi, S. S.; Renanto; Altway, A.

    2018-01-01

    The energy use system in a production process, in this case heat exchanger networks (HENs), is one element that plays a role in the smoothness and sustainability of the industry itself. Optimizing heat exchanger networks from process streams can have a major effect on the economic value of an industry as a whole, so solving design problems with heat integration becomes an important requirement. In a plant, heat integration can be carried out internally or in combination between process units. However, determining a suitable heat integration technique requires lengthy calculations and considerable time. In this paper, we propose an alternative procedure for determining the heat integration technique by investigating 6 hypothetical units using a Pinch Analysis approach, with the energy target and total annual cost target as objective functions. The six hypothetical units consist of units A, B, C, D, E, and F, where each unit has a different location of its process streams relative to the pinch temperature. The result is a potential heat integration (ΔH') formula that trims the conventional procedure from 7 steps to just 3. The preferred heat integration technique is then determined by calculating the heat integration potential (ΔH') between the hypothetical process units. Calculations were performed in MATLAB.
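
    The energy targeting that underlies this kind of study can be illustrated with the standard problem-table algorithm of Pinch Analysis. The sketch below computes hot/cold utility targets and the pinch from a small stream set; the stream data are invented, and the authors' ΔH' formula is not given in the abstract and is not reproduced here:

```python
# Problem-table algorithm: shift stream temperatures by dtmin/2, cascade the
# interval heat balances, and read off utility targets and the pinch.
def problem_table(streams, dtmin=10.0):
    # streams: (supply T, target T, heat capacity flowrate CP); hot streams cool.
    shifted = []
    for ts, tt, cp in streams:
        if ts > tt:                       # hot stream: shift down by dtmin/2
            shifted.append((ts - dtmin / 2, tt - dtmin / 2, cp, "hot"))
        else:                             # cold stream: shift up by dtmin/2
            shifted.append((ts + dtmin / 2, tt + dtmin / 2, cp, "cold"))
    bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)
    cascade, heat = [0.0], 0.0
    for hi, lo in zip(bounds, bounds[1:]):
        net = 0.0
        for ts, tt, cp, kind in shifted:
            top, bot = max(ts, tt), min(ts, tt)
            if top >= hi and bot <= lo:   # stream spans this temperature interval
                net += cp * (hi - lo) * (1 if kind == "hot" else -1)
        heat += net
        cascade.append(heat)
    hot_utility = -min(min(cascade), 0.0)  # lift cascade so no interval is negative
    cold_utility = hot_utility + cascade[-1]
    pinch = bounds[cascade.index(min(cascade))]
    return hot_utility, cold_utility, pinch

# Two hot and two cold streams (T in deg C, CP in kW/K), invented for the demo.
streams = [(180, 60, 3.0), (150, 30, 1.5), (30, 135, 2.0), (80, 140, 4.0)]
print(problem_table(streams))
```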

  14. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    A cost effective process sequence and machinery for the production of flat plate photovoltaic modules are described. Cells were fabricated using the process sequence which was optimized, as was a lamination procedure. Insulator tapes and edge seal material were identified and tested. Encapsulation materials were evaluated.

  15. High-Throughput Characterization of Porous Materials Using Graphics Processing Units

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jihan; Martin, Richard L.; Rübel, Oliver

    We have developed a high-throughput graphics processing units (GPU) code that can characterize a large database of crystalline porous materials. In our algorithm, the GPU is utilized to accelerate energy grid calculations where the grid values represent interactions (i.e., Lennard-Jones + Coulomb potentials) between gas molecules (i.e., CH4 and CO2) and the material's framework atoms. Using a parallel flood fill CPU algorithm, inaccessible regions inside the framework structures are identified and blocked based on their energy profiles. Finally, we compute the Henry coefficients and heats of adsorption through statistical Widom insertion Monte Carlo moves in the domain restricted to the accessible space. The code offers significant speedup over a single core CPU code and allows us to characterize a set of porous materials at least an order of magnitude larger than ones considered in earlier studies. For structures selected from such a prescreening algorithm, full adsorption isotherms can be calculated by conducting multiple grand canonical Monte Carlo simulations concurrently within the GPU.
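
    The Widom-insertion step lends itself to a compact sketch: average the Boltzmann factor of the guest-framework energy over random trial positions restricted to the accessible region. The grid values, temperature, and accessibility mask below are illustrative assumptions; converting the average to a Henry coefficient additionally requires the framework density, which is omitted here:

```python
# Widom insertion over a precomputed energy grid, restricted to cells the
# flood fill marked accessible (a random mask stands in for that result).
import numpy as np

rng = np.random.default_rng(1)
kB = 1.380649e-23        # Boltzmann constant, J/K
T = 298.0                # temperature, K

# Stand-in guest-framework energy grid (J) over a 3-D unit cell.
grid = rng.normal(-2e-21, 1e-21, size=(32, 32, 32))
accessible = grid < 5e-21            # placeholder for the flood-fill mask

def mean_boltzmann_factor(grid, mask, n_insert=100_000):
    # Average exp(-U/kT) over random trial insertions in accessible cells;
    # this quantity is proportional to the Henry coefficient.
    cells = np.argwhere(mask)
    picks = cells[rng.integers(len(cells), size=n_insert)]
    u = grid[picks[:, 0], picks[:, 1], picks[:, 2]]
    return np.mean(np.exp(-u / (kB * T)))

print(mean_boltzmann_factor(grid, accessible))
```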

  16. Graphics processing unit based computation for NDE applications

    NASA Astrophysics Data System (ADS)

    Nahas, C. A.; Rajagopal, Prabhu; Balasubramaniam, Krishnan; Krishnamurthy, C. V.

    2012-05-01

    Advances in parallel processing in recent years are helping to improve the cost of numerical simulation. Breakthroughs in Graphical Processing Unit (GPU) based computation now offer the prospect of further drastic improvements. The introduction of 'compute unified device architecture' (CUDA) by NVIDIA (the global technology company based in Santa Clara, California, USA) has made programming GPUs for general purpose computing accessible to the average programmer. Here we use CUDA to develop parallel finite difference schemes as applicable to two problems of interest to the NDE community, namely heat diffusion and elastic wave propagation. The implementations are two-dimensional. Performance improvement of the GPU implementation against a serial CPU implementation is then discussed.
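
    As a serial reference for the first of these problems, the explicit finite-difference update for 2-D heat diffusion can be written with array slices; this is the same stencil a CUDA kernel would evaluate per grid point. The grid size, time step, and boundary handling are illustrative assumptions:

```python
# Explicit FTCS stencil for 2-D heat diffusion; stable for dt <= dx^2/(4*alpha).
import numpy as np

def diffuse(u, alpha=1.0, dx=1.0, dt=0.2, steps=100):
    c = alpha * dt / dx**2
    for _ in range(steps):
        # Interior update; NumPy evaluates the right side before assigning,
        # so this is a proper Jacobi-style sweep. Boundaries stay fixed.
        u[1:-1, 1:-1] += c * (u[2:, 1:-1] + u[:-2, 1:-1] +
                              u[1:-1, 2:] + u[1:-1, :-2] -
                              4.0 * u[1:-1, 1:-1])
    return u

u = np.zeros((128, 128))
u[60:68, 60:68] = 100.0      # hot patch in the middle of the domain
print(diffuse(u).max())
```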

  17. [Work process and workers' health in a food and nutrition unit: prescribed versus actual work].

    PubMed

    Colares, Luciléia Granhen Tavares; Freitas, Carlos Machado de

    2007-12-01

    This study focuses on the relationship between the work process in a food and nutrition unit and workers' health, in the words of the participants themselves. Direct observation, a semi-structured interview, and focus groups were used to collect the data. The reference was the dialogue between human ergonomics and work psychodynamics. The results showed that work organization in the study unit represents a routine activity, the requirements of which in terms of the work situation are based on criteria set by the institution. Variability in the activities is influenced mainly by the available equipment, instruments, and materials, thereby generating improvisation in meal production that produces both a physical and psychological cost for workers. Dissatisfaction during the performance of tasks results mainly from the supervisory style and relationship to immediate superiors. Workers themselves proposed changes in the work organization, based on greater dialogue and trust between supervisors and the workforce. Finally, the study identifies the need for an intervention that encourages workers' participation as agents of change.

  18. Delivering a National Process Design Unit with Industry Support

    NASA Astrophysics Data System (ADS)

    Ibana, Don

    Supported by the Minerals Council of Australia (MCA) through the Minerals Tertiary Education Council (MTEC), three Australian universities (Curtin University, Murdoch University and the University of Queensland) have formed the Metallurgical Education Partnership (MEP) to jointly develop and deliver an engineering design capstone unit, Metallurgical Process and Plant Design, in their respective undergraduate programs in extractive metallurgy, in order to enhance the students' educational experience. A unique feature of the program is the close interaction of the students in all three universities and a significant involvement of industry professionals. Now in its sixth year, the unit is clearly achieving its objectives.

  19. Process for Making Carbon-Carbon Turbocharger Housing Unit for Intermittent Combustion Engines

    NASA Technical Reports Server (NTRS)

    Northam, G. Burton (Inventor); Ransone, Philip O. (Inventor); Rivers, H. Kevin (Inventor)

    1999-01-01

    An improved, lightweight turbine housing unit for an intermittent combustion reciprocating internal combustion engine turbocharger is prepared from a lay-up or molding of carbon-carbon composite materials in a single-piece or two-piece process. When compared to conventional steel or cast iron, the use of carbon-carbon composite materials in a turbine housing unit reduces the overall weight of the engine and reduces the heat energy lost in the turbocharging process. These reductions in heat energy loss and weight provide for more efficient engine operation.

  20. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; McDougal, Matthew; Russell, Sam

    2012-01-01

    Objective: To develop a software application utilizing general purpose graphics processing units (GPUs) for the analysis of large sets of thermographic data. Background: Over the past few years, an increasing effort among scientists and engineers to utilize the GPU in a more general purpose fashion is allowing for supercomputer level results at individual workstations. As data sets grow, the methods to work them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU to allow for throughput that was previously reserved for compute clusters. These common computations have high degrees of data parallelism, that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Signal (image) processing is one area where GPUs are being used to greatly increase the performance of certain algorithms and analysis techniques. Technical Methodology/Approach: Apply massively parallel algorithms and data structures to the specific analysis requirements presented when working with thermographic data sets.

  1. A centerless grinding unit used for precisely processing ferrules of optical fiber connector

    NASA Astrophysics Data System (ADS)

    Wu, Yongbo; Kondo, Takahiro; Kato, Masana

    2005-02-01

    This paper describes the development of a centerless grinding unit used for precisely processing ferrules, a key component of optical fiber connectors. In the conventional processing procedure, the outer diameter of a ferrule is ground by employing a special machine tool, i.e., a centerless grinder. However, when processing only a small number of ferrules, introducing a centerless grinder leads to high processing cost. To address this problem, the present authors propose a new centerless grinding technique in which a compact centerless grinding unit, composed of an ultrasonic elliptic-vibration shoe, a workrest blade, and their respective holders, is installed on a popular surface grinder to perform centerless grinding operations for outer diameter machining of ferrules. In this work, a unit was designed and constructed, and installed on a surface grinder equipped with a diamond grinding wheel. The performance of the unit was then examined experimentally, followed by grinding tests of the ferrule's outer diameter. As a result, the roundness of the ferrule's outer diameter improved from an original value of around 3 μm to a final value of around 0.5 μm, confirming the validity of the new technique.

  2. Identifying designatable units for intraspecific conservation prioritization: a hierarchical approach applied to the lake whitefish species complex (Coregonus spp.)

    PubMed Central

    Mee, Jonathan A; Bernatchez, Louis; Reist, Jim D; Rogers, Sean M; Taylor, Eric B

    2015-01-01

    The concept of the designatable unit (DU) affords a practical approach to identifying diversity below the species level for conservation prioritization. However, its suitability for defining conservation units in ecologically diverse, geographically widespread and taxonomically challenging species complexes has not been broadly evaluated. The lake whitefish species complex (Coregonus spp.) is geographically widespread in the Northern Hemisphere, and it contains a great deal of variability in ecology and evolutionary legacy within and among populations, as well as a great deal of taxonomic ambiguity. Here, we employ a set of hierarchical criteria to identify DUs within the Canadian distribution of the lake whitefish species complex. We identified 36 DUs based on (i) reproductive isolation, (ii) phylogeographic groupings, (iii) local adaptation and (iv) biogeographic regions. The identification of DUs is required for clear discussion regarding the conservation prioritization of lake whitefish populations. We suggest conservation priorities among lake whitefish DUs based on biological consequences of extinction, risk of extinction and distinctiveness. Our results exemplify the need for extensive genetic and biogeographic analyses for any species with broad geographic distributions and the need for detailed evaluation of evolutionary history and adaptive ecological divergence when defining intraspecific conservation units. PMID:26029257

  3. Understanding the development of minimum unit pricing of alcohol in Scotland: a qualitative study of the policy process.

    PubMed

    Katikireddi, Srinivasa Vittal; Hilton, Shona; Bonell, Chris; Bond, Lyndal

    2014-01-01

    Minimum unit pricing of alcohol is a novel public health policy with the potential to improve population health and reduce health inequalities. Theories of the policy process may help to understand the development of policy innovation and in turn identify lessons for future public health research and practice. This study aims to explain minimum unit pricing's development by taking a 'multiple-lenses' approach to understanding the policy process. In particular, we apply three perspectives of the policy process (Kingdon's multiple streams, Punctuated-Equilibrium Theory, Multi-Level Governance) to understand how and why minimum unit pricing has developed in Scotland and describe implications for efforts to develop evidence-informed policymaking. Semi-structured interviews were conducted with policy actors (politicians, civil servants, academics, advocates, industry representatives) involved in the development of MUP (n = 36). Interviewees were asked about the policy process and the role of evidence in policy development. Data from two other sources (a review of policy documents and an analysis of evidence submission documents to the Scottish Parliament) were used for triangulation. The three perspectives provide complementary understandings of the policy process. Evidence has played an important role in presenting the policy issue of alcohol as a problem requiring action. Scotland-specific data and a change in the policy 'image' to a population-based problem contributed to making alcohol-related harms a priority for action. The limited powers of Scottish Government help explain the type of price intervention pursued while distinct aspects of the Scottish political climate favoured the pursuit of price-based interventions. Evidence has played a crucial but complex role in the development of an innovative policy. Utilising different political science theories helps explain different aspects of the policy process, with Multi-Level Governance particularly useful for

  4. Understanding the Development of Minimum Unit Pricing of Alcohol in Scotland: A Qualitative Study of the Policy Process

    PubMed Central

    Katikireddi, Srinivasa Vittal; Hilton, Shona; Bonell, Chris; Bond, Lyndal

    2014-01-01

    Background Minimum unit pricing of alcohol is a novel public health policy with the potential to improve population health and reduce health inequalities. Theories of the policy process may help to understand the development of policy innovation and in turn identify lessons for future public health research and practice. This study aims to explain minimum unit pricing’s development by taking a ‘multiple-lenses’ approach to understanding the policy process. In particular, we apply three perspectives of the policy process (Kingdon’s multiple streams, Punctuated-Equilibrium Theory, Multi-Level Governance) to understand how and why minimum unit pricing has developed in Scotland and describe implications for efforts to develop evidence-informed policymaking. Methods Semi-structured interviews were conducted with policy actors (politicians, civil servants, academics, advocates, industry representatives) involved in the development of MUP (n = 36). Interviewees were asked about the policy process and the role of evidence in policy development. Data from two other sources (a review of policy documents and an analysis of evidence submission documents to the Scottish Parliament) were used for triangulation. Findings The three perspectives provide complementary understandings of the policy process. Evidence has played an important role in presenting the policy issue of alcohol as a problem requiring action. Scotland-specific data and a change in the policy ‘image’ to a population-based problem contributed to making alcohol-related harms a priority for action. The limited powers of Scottish Government help explain the type of price intervention pursued while distinct aspects of the Scottish political climate favoured the pursuit of price-based interventions. Conclusions Evidence has played a crucial but complex role in the development of an innovative policy. Utilising different political science theories helps explain different aspects of the policy process

  5. Automated processing of whole blood units: operational value and in vitro quality of final blood components

    PubMed Central

    Jurado, Marisa; Algora, Manuel; Garcia-Sanchez, Félix; Vico, Santiago; Rodriguez, Eva; Perez, Sonia; Barbolla, Luz

    2012-01-01

    Background The Community Transfusion Centre in Madrid currently processes whole blood using a conventional procedure (Compomat, Fresenius) followed by automated processing of buffy coats with the OrbiSac system (CaridianBCT). The Atreus 3C system (CaridianBCT) automates the production of red blood cells, plasma and an interim platelet unit from a whole blood unit. Interim platelet units are pooled to produce a transfusable platelet unit. In this study the Atreus 3C system was evaluated and compared to the routine method with regard to product quality and operational value. Materials and methods Over a 5-week period 810 whole blood units were processed using the Atreus 3C system. The attributes of the automated process were compared to those of the routine method by assessing productivity, space, equipment and staffing requirements. The data obtained were evaluated in order to estimate the impact of implementing the Atreus 3C system in the routine setting of the blood centre. Yield and in vitro quality of the final blood components processed with the two systems were evaluated and compared. Results The Atreus 3C system enabled higher throughput while requiring less space and employee time by decreasing the amount of equipment and processing time per unit of whole blood processed. Whole blood units processed on the Atreus 3C system gave a higher platelet yield, a similar amount of red blood cells and a smaller volume of plasma. Discussion These results support the conclusion that the Atreus 3C system produces blood components meeting quality requirements while providing a high operational efficiency. Implementation of the Atreus 3C system could result in a large organisational improvement. PMID:22044958

  6. Automated processing of whole blood units: operational value and in vitro quality of final blood components.

    PubMed

    Jurado, Marisa; Algora, Manuel; Garcia-Sanchez, Félix; Vico, Santiago; Rodriguez, Eva; Perez, Sonia; Barbolla, Luz

    2012-01-01

    The Community Transfusion Centre in Madrid currently processes whole blood using a conventional procedure (Compomat, Fresenius) followed by automated processing of buffy coats with the OrbiSac system (CaridianBCT). The Atreus 3C system (CaridianBCT) automates the production of red blood cells, plasma and an interim platelet unit from a whole blood unit. Interim platelet units are pooled to produce a transfusable platelet unit. In this study the Atreus 3C system was evaluated and compared to the routine method with regard to product quality and operational value. Over a 5-week period 810 whole blood units were processed using the Atreus 3C system. The attributes of the automated process were compared to those of the routine method by assessing productivity, space, equipment and staffing requirements. The data obtained were evaluated in order to estimate the impact of implementing the Atreus 3C system in the routine setting of the blood centre. Yield and in vitro quality of the final blood components processed with the two systems were evaluated and compared. The Atreus 3C system enabled higher throughput while requiring less space and employee time by decreasing the amount of equipment and processing time per unit of whole blood processed. Whole blood units processed on the Atreus 3C system gave a higher platelet yield, a similar amount of red blood cells and a smaller volume of plasma. These results support the conclusion that the Atreus 3C system produces blood components meeting quality requirements while providing a high operational efficiency. Implementation of the Atreus 3C system could result in a large organisational improvement.

  7. Roles, processes, and outcomes of interprofessional shared decision-making in a neonatal intensive care unit: A qualitative study.

    PubMed

    Dunn, Sandra I; Cragg, Betty; Graham, Ian D; Medves, Jennifer; Gaboury, Isabelle

    2018-05-01

    Shared decision-making provides an opportunity for the knowledge and skills of care providers to synergistically influence patient care. Little is known about interprofessional shared decision-making processes in critical care settings. The aim of this study was to explore interprofessional team members' perspectives about the nature of interprofessional shared decision-making in a neonatal intensive care unit (NICU) and to determine if there are any differences in perspectives across professional groups. An exploratory qualitative approach was used consisting of semi-structured interviews with 22 members of an interprofessional team working in a tertiary care NICU in Canada. Participants identified four key roles involved in interprofessional shared decision-making: leader, clinical experts, parents, and synthesizer. Participants perceived that interprofessional shared decision-making happens through collaboration, sharing, and weighing the options, the evidence and the credibility of opinions put forward. The process of interprofessional shared decision-making leads to a well-informed decision and participants feeling valued. Findings from this study identified key concepts of interprofessional shared decision-making, increased awareness of differing professional perspectives about this process of shared decision-making, and clarified understanding of the different roles involved in the decision-making process in an NICU.

  8. How Prepared Are Medical and Nursing Students to Identify Common Hazards in the Intensive Care Unit?

    PubMed

    Clay, Alison S; Chudgar, Saumil M; Turner, Kathleen M; Vaughn, Jacqueline; Knudsen, Nancy W; Farnan, Jeanne M; Arora, Vineet M; Molloy, Margory A

    2017-04-01

    Care in the hospital is hazardous. Harm in the hospital may prolong hospitalization, increase suffering, result in death, and increase costs of care. Although the interprofessional team is critical to eliminating hazards that may result in adverse events to patients, professional students' formal education may not prepare them adequately for this role. To determine if medical and nursing students can identify hazards of hospitalization that could result in harm to patients and to detect differences between professions in the types of hazards identified. Mixed-methods observational study of graduating nursing (n = 51) and medical (n = 93) students who completed two "Room of Horrors" simulations to identify patient safety hazards. Qualitative analysis was used to extract themes from students' written hazard descriptions. Fisher's exact test was used to determine differences in frequency of hazards identified between groups. Identification of hazards by students was low: 66% did not identify missing personal protective equipment for a patient on contact isolation, and 58% did not identify a medication administration error (medication hanging for a patient with similar name). Interprofessional differences existed in how hazards were identified: medical students noted that restraints were not indicated (73 vs. 2%, P < 0.001), whereas nursing students noted that there was no order for the restraints (58.5 vs. 0%, P < 0.0001). Nursing students discovered more issues with malfunctioning or incorrectly used equipment than medical students. Teams performed better than individuals, especially for hazards in the second simulation that were similar to those in the first: need to replace a central line with erythema (73% teams identified) versus need to replace a peripheral intravenous line (10% individuals, P < 0.0001). Nevertheless, teams of students missed many intensive care unit-specific hazards: 54% failed to identify the presence of pressure ulcers; 85

  9. Using hyperspectral imaging technology to identify diseased tomato leaves

    NASA Astrophysics Data System (ADS)

    Li, Cuiling; Wang, Xiu; Zhao, Xueguan; Meng, Zhijun; Zou, Wei

    2016-11-01

    During the growth of tomato plants, genetic factors, adverse environmental conditions, or parasite damage can produce a series of abnormal symptoms in the plants' physiology, tissue structure and external form; as a result, the plants cannot grow normally, which in turn reduces tomato yield and economic benefit. Hyperspectral images usually have high spectral resolution and contain both spectral and image information, so this study adopted hyperspectral imaging technology to identify diseased tomato leaves and developed a simple hyperspectral imaging system, including a halogen lamp light source unit, a hyperspectral image acquisition unit and a data processing unit. The spectrometer detection wavelength ranged from 400 nm to 1000 nm. After the hyperspectral images of tomato leaves were captured, they were calibrated. This research used a spectral angle matching method and a spectral red-edge parameter discriminant method, respectively, to identify diseased tomato leaves. The spectral red-edge parameter discriminant method produced higher recognition accuracy, above 90%. The results show that using hyperspectral imaging technology to identify diseased tomato leaves is feasible, and provides a discriminant basis for subsequent disease control of tomato plants.
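
    Both classifiers named here reduce to short computations: the spectral angle between a pixel spectrum and a healthy reference, and the red-edge position (the wavelength of maximum reflectance slope near 700 nm). The sketch below illustrates the idea with synthetic spectra; the band range, reference spectra, and any thresholds are assumptions, not values from the study:

```python
# Spectral angle matching and red-edge position on synthetic leaf spectra.
import numpy as np

def spectral_angle(pixel, reference):
    # Angle between two spectra; a small angle means similar material.
    cos = pixel @ reference / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def red_edge_position(wavelengths, spectrum):
    # Wavelength of maximum reflectance slope in the 680-760 nm window.
    band = (wavelengths >= 680) & (wavelengths <= 760)
    slope = np.gradient(spectrum[band], wavelengths[band])
    return wavelengths[band][np.argmax(slope)]

wavelengths = np.linspace(400, 1000, 301)
healthy = 0.05 + 0.45 / (1 + np.exp(-(wavelengths - 720) / 15))   # synthetic
diseased = 0.08 + 0.35 / (1 + np.exp(-(wavelengths - 700) / 20))  # shifted edge

print(spectral_angle(diseased, healthy))        # larger than healthy vs. healthy
print(red_edge_position(wavelengths, healthy))  # ~720 nm
print(red_edge_position(wavelengths, diseased)) # shifted to shorter wavelengths
```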

  10. Accelerating Molecular Dynamic Simulation on Graphics Processing Units

    PubMed Central

    Friedrichs, Mark S.; Eastman, Peter; Vaidyanathan, Vishal; Houston, Mike; Legrand, Scott; Beberg, Adam L.; Ensign, Daniel L.; Bruns, Christopher M.; Pande, Vijay S.

    2009-01-01

    We describe a complete implementation of all-atom protein molecular dynamics running entirely on a graphics processing unit (GPU), including all standard force field terms, integration, constraints, and implicit solvent. We discuss the design of our algorithms and important optimizations needed to fully take advantage of a GPU. We evaluate its performance, and show that it can be more than 700 times faster than a conventional implementation running on a single CPU core. PMID:19191337

  11. 32 CFR 516.10 - Service of civil process within the United States.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Service of civil process within the United States. 516.10 Section 516.10 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS LITIGATION Service of Process § 516.10 Service of civil process...

  12. 32 CFR 516.10 - Service of civil process within the United States.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 3 2011-07-01 2009-07-01 true Service of civil process within the United States. 516.10 Section 516.10 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS LITIGATION Service of Process § 516.10 Service of civil process...

  13. Influence of technical processing units on chemical composition and antimicrobial activity of carrot (Daucus carota L.) juice essential oil.

    PubMed

    Ma, Tingting; Luo, Jiyang; Tian, Chengrui; Sun, Xiangyu; Quan, Meiping; Zheng, Cuiping; Kang, Lina; Zhan, Jicheng

    2015-03-01

    The effect of three processing units (blanching, enzyme liquefaction, pasteurisation) on the chemical composition and antimicrobial activity of carrot juice essential oil was investigated in this paper. A total of 36 compounds were identified by GC-MS from fresh carrot juice essential oil. The main constituents were carotol (20.20%), sabinene (12.80%), β-caryophyllene (8.04%) and α-pinene (6.05%). Compared with the oil of fresh juice, blanching and pasteurisation significantly decreased the components of the juice essential oil, whereas enzyme liquefaction had no considerable effect on its composition. With regard to antimicrobial activity, carrot juice essential oil caused physical damage and morphological alteration to microorganisms, while the oils from the three processing units differed noticeably in the species of microorganisms affected and in their minimum inhibitory and minimum bactericidal concentrations. The results revealed that carrot juice essential oil has great potential for application as a natural antimicrobial in the pharmaceutical and food industries. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. A historical review of additives and modifiers used in paving asphalt refining processes in the United States.

    PubMed

    Mundt, Diane J; Adams, Robert C; Marano, Kristin M

    2009-11-01

    The U.S. asphalt paving industry has evolved over time to meet various performance specifications for liquid petroleum asphalt binder (known as bitumen outside the United States). Additives to liquid petroleum asphalt produced in the refinery may affect exposures to workers in the hot mix paving industry. This investigation documented the changes in the composition and distribution of the liquid petroleum asphalt products produced from petroleum refining in the United States since World War II. This assessment was accomplished by reviewing documents and interviewing individual experts in the industry to identify current and historical practices. Individuals from 18 facilities were surveyed; the number of facilities reporting use of any material within a particular class ranged from none to more than half the respondents. Materials such as products of the process stream, polymers, elastomers, and anti-strip compounds have been added to liquid petroleum asphalt in the United States over the past 50 years, but modification has not been generally consistent by geography or time. Modifications made to liquid petroleum asphalt were made generally to improve performance and were dictated by state specifications.

  15. Real-time digital holographic microscopy using the graphic processing unit.

    PubMed

    Shimobaba, Tomoyoshi; Sato, Yoshikuni; Miura, Junya; Takenouchi, Mai; Ito, Tomoyoshi

    2008-08-04

    Digital holographic microscopy (DHM) is a well-known powerful method allowing both the amplitude and phase of a specimen to be simultaneously observed. In order to obtain a reconstructed image from a hologram, numerous calculations for the Fresnel diffraction are required. The Fresnel diffraction can be accelerated by the FFT (Fast Fourier Transform) algorithm. However, real-time reconstruction from a hologram is difficult even if we use a recent central processing unit (CPU) to calculate the Fresnel diffraction by the FFT algorithm. In this paper, we describe a real-time DHM system using a graphic processing unit (GPU) with many stream processors, which allows use as a highly parallel processor. The computational speed of the Fresnel diffraction using the GPU is faster than that of recent CPUs. The real-time DHM system can obtain reconstructed images from holograms whose size is 512 x 512 grids in 24 frames per second.
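
    The computational core being accelerated here is a single-FFT Fresnel transform. A minimal NumPy sketch of the reconstruction step follows; the pixel pitch, wavelength, and propagation distance are illustrative, and constant output-plane phase factors are omitted since only the amplitude is displayed:

```python
# Single-FFT Fresnel reconstruction: multiply the hologram by a quadratic
# phase ("chirp"), take a centered 2-D FFT, and keep the amplitude.
import numpy as np

def fresnel_reconstruct(hologram, wavelength, pitch, z):
    n = hologram.shape[0]
    k = 2 * np.pi / wavelength
    x = (np.arange(n) - n / 2) * pitch
    X, Y = np.meshgrid(x, x)
    chirp = np.exp(1j * k / (2 * z) * (X**2 + Y**2))
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(hologram * chirp)))
    return np.abs(field)

holo = np.random.rand(512, 512)          # stand-in for a 512 x 512 hologram
image = fresnel_reconstruct(holo, 632.8e-9, 10e-6, 0.1)
print(image.shape)
```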

  16. Units of Distinction: Creating a Blueprint for Recognition of High-Performing Medical-Surgical Nursing Units.

    PubMed

    Jeffery, Alvin D; Mosier, Sammie; Baker, Allison; Korwek, Kimberly; Borum, Cindy; Englebright, Jane

    2018-02-01

    Hospital medical-surgical (M/S) nursing units are responsible for up to 28 million encounters annually, yet receive little attention from professional organizations and national initiatives targeted to improve quality and performance. We sought to develop a framework recognizing high-performing units within our large hospital system. This was a retrospective data analysis of M/S units throughout a 168-hospital system. Measures represented patient experience, employee engagement, staff scheduling, nursing-sensitive patient outcomes, professional practices, and clinical process measures. Four hundred ninety units from 129 hospitals contributed information to test the framework. A manual scoring system identified the top 5% and recognized them as a "Unit of Distinction." Secondary analyses with machine learning provided validation of the proposed framework. Similar to external recognition programs, this framework and process provide a holistic evaluation useful for meaningful recognition and lay the groundwork for benchmarking in improvement efforts.

  17. A Case Study on Improving Intensive Care Unit (ICU) Services Reliability: By Using Process Failure Mode and Effects Analysis (PFMEA)

    PubMed Central

    Yousefinezhadi, Taraneh; Jannesar Nobari, Farnaz Attar; Goodari, Faranak Behzadi; Arab, Mohammad

    2016-01-01

    Introduction: In any complex human system, human error is inevitable and cannot be eliminated by blaming wrongdoers. With the aim of improving the reliability of hospital Intensive Care Units (ICUs), this research identifies and analyzes ICU process failure modes from the standpoint of a systematic approach to errors. Methods: In this descriptive research, data were gathered qualitatively by observations, document reviews, and Focus Group Discussions (FGDs) with the process owners in two selected ICUs in Tehran in 2014. Data analysis was quantitative, based on each failure's Risk Priority Number (RPN) under the Failure Mode and Effects Analysis (FMEA) method; some causes of failures were then analyzed with the qualitative Eindhoven Classification Model (ECM). Results: Through the FMEA methodology, 378 potential failure modes from 180 ICU activities in hospital A and 184 potential failure modes from 99 ICU activities in hospital B were identified and evaluated. At 90% reliability (RPN ≥ 100), a total of 18 failures in hospital A and 42 in hospital B were identified as non-acceptable risks, and their causes were analyzed by ECM. Conclusions: Applying modified PFMEA to improve the process reliability of two selected ICUs in two different kinds of hospitals shows that this method empowers staff to identify, evaluate, prioritize and analyze all potential failure modes, and makes them eager to identify causes, recommend corrective actions and participate in improving the process without feeling blamed by top management. Moreover, by combining FMEA and ECM, team members can easily identify failure causes from a health care perspective. PMID:27157162
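
    The RPN arithmetic at the heart of the method is simple enough to show directly: each failure mode is scored as severity × occurrence × detection, and modes at or above the study's cut-off (RPN ≥ 100) are sent on to cause analysis. The failure modes and ratings below are invented for illustration:

```python
# Worked RPN example: score failure modes and flag those above the cut-off.
failure_modes = [
    # (description, severity, occurrence, detection) -- illustrative values
    ("ventilator alarm silenced without review", 9, 3, 5),
    ("drug dose transcription error",            7, 4, 4),
    ("delayed lab result communication",         5, 3, 3),
]

for name, sev, occ, det in failure_modes:
    rpn = sev * occ * det                       # RPN = S x O x D
    flag = "analyze causes" if rpn >= 100 else "acceptable risk"
    print(f"{name}: RPN={rpn} -> {flag}")
```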

  18. Physiotherapists' Perceptions of and Experiences with the Discharge Planning Process in Acute-Care General Internal Medicine Units in Ontario

    PubMed Central

    Uyeno, Jennifer; Heck, Carol S.

    2014-01-01

    ABSTRACT Purpose: To examine discharge planning of patients in general internal medicine units in Ontario acute-care hospitals from the perspective of physiotherapists. Methods: A cross-sectional study using an online questionnaire was sent to participants in November 2011. Respondents' demographic characteristics and ranking of factors were analyzed using descriptive statistics; t-tests were performed to determine between-group differences (based on demographic characteristics). Responses to open-ended questions were coded to identify themes. Results: Mobility status was identified as the key factor in determining discharge readiness; other factors included the availability of social support and community resources. While inter-professional communication was identified as important, processes were often informal. Discharge policies, timely availability of other discharge options, and pressure for early discharge were identified as affecting discharge planning. Respondents also noted a lack of training in discharge planning; accounts of ethical dilemmas experienced by respondents supported these themes. Conclusions: Physiotherapists consider many factors beyond the patient's physical function during the discharge planning process. The improvement of team communication and resource allocation should be considered to deal with the realities of discharge planning. PMID:25125778

  19. 26 CFR 1.924(d)-1 - Requirement that economic processes take place outside the United States.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 10 2011-04-01 2011-04-01 false Requirement that economic processes take place... Citizens of United States § 1.924(d)-1 Requirement that economic processes take place outside the United... any transaction only if economic processes with respect to such transaction take place outside the...

  20. Massively Parallel Signal Processing using the Graphics Processing Unit for Real-Time Brain-Computer Interface Feature Extraction.

    PubMed

    Wilson, J Adam; Williams, Justin C

    2009-01-01

    The clock speeds of modern computer processors have nearly plateaued in the past 5 years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card [graphics processing unit (GPU)] was developed for real-time neural signal processing of a brain-computer interface (BCI). The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter), followed by calculating the power spectral density on every channel using an auto-regressive method, and finally classifying appropriate features for control. In this study, the first two computationally intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a central processing unit-based implementation that uses multi-threading. Significant performance gains were obtained with GPU processing: the current implementation processed 1000 channels of 250 ms in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.
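
    A CPU reference for the two offloaded stages can be sketched compactly: the spatial filter is one matrix-matrix multiply, and the per-channel spectrum comes from an autoregressive fit, here via the Yule-Walker equations. The filter matrix, model order, channel count, and sampling rate below are illustrative assumptions, not parameters of the BCI system described:

```python
# Stage 1: spatial filtering as a matrix-matrix multiply.
# Stage 2: per-channel AR (Yule-Walker) power spectral density.
import numpy as np

def yule_walker(x, order=16):
    # Solve the Yule-Walker equations for AR coefficients and noise variance.
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order] / len(x)
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])
    sigma2 = r[0] - a @ r[1:order + 1]
    return a, sigma2

def ar_psd(a, sigma2, nfreq=128):
    # AR power spectrum: sigma^2 / |1 - sum_k a_k exp(-i w k)|^2.
    w = np.linspace(0, np.pi, nfreq)
    k = np.arange(1, len(a) + 1)
    denom = np.abs(1 - np.exp(-1j * np.outer(w, k)) @ a) ** 2
    return sigma2 / denom

rng = np.random.default_rng(0)
raw = rng.normal(size=(1000, 250))       # 1000 channels x 250 samples (250 ms @ 1 kHz)
W = rng.normal(size=(1000, 1000))        # spatial filter matrix (illustrative)
filtered = W @ raw                       # stage 1: matrix-matrix multiply
psd = ar_psd(*yule_walker(filtered[0]))  # stage 2: AR spectrum for one channel
print(psd.shape)
```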

  1. Evaluation of the effects of the seasonal variation of solar elevation angle and azimuth on the processes of digital filtering and thematic classification of relief units

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Novo, E. M. L. M.

    1983-01-01

    The effects of the seasonal variation of illumination on the digital processing of LANDSAT images are evaluated. Two sets of LANDSAT data referring to orbit 150 and row 28 were selected, with illumination parameters varying from 43 deg to 64 deg in azimuth and from 30 deg to 36 deg in solar elevation. The IMAGE-100 system permitted the digital processing of the LANDSAT data. The original images were transformed by means of digital filtering so as to enhance their spatial features. The resulting images were used to obtain an unsupervised classification of relief units. Topographic variables (declivity, altitude, relief range and slope length) were used to identify the true relief units existing on the ground. The LANDSAT overpass data show that digital processing is highly affected by illumination geometry, and there is no correspondence between relief units as defined by spectral features and those resulting from topographic features.

  2. Software Graphics Processing Unit (sGPU) for Deep Space Applications

    NASA Technical Reports Server (NTRS)

    McCabe, Mary; Salazar, George; Steele, Glen

    2015-01-01

    A graphics processing capability will be required for deep space missions and must support a range of applications, from safety-critical vehicle health status to telemedicine for crew health. However, preliminary radiation testing of commercial graphics processing cards suggests they cannot operate in the deep space radiation environment. Investigation into a Software Graphics Processing Unit (sGPU) composed of commercial-equivalent radiation hardened/tolerant single board computers, field programmable gate arrays, and safety-critical display software shows promising results. A preliminary performance of approximately 30 frames per second (FPS) has been achieved. Use of multi-core processors may provide a significant increase in performance.

  3. US Federal LCA Commons Life Cycle Inventory Unit Process Template

    EPA Science Inventory

    The US Federal LCA Commons Life Cycle Inventory Unit Process Template is a multi-sheet Excel template for life cycle inventory data, metadata and other documentation. The template comes as a package consisting of three parts: (1) the main template itself for life cycle inventory...

  4. Active microchannel fluid processing unit and method of making

    DOEpatents

    Bennett, Wendy D [Kennewick, WA; Martin, Peter M [Kennewick, WA; Matson, Dean W [Kennewick, WA; Roberts, Gary L [West Richland, WA; Stewart, Donald C [Richland, WA; Tonkovich, Annalee Y [Pasco, WA; Zilka, Jennifer L [Pasco, WA; Schmitt, Stephen C [Dublin, OH; Werner, Timothy M [Columbus, OH

    2001-01-01

    The present invention is an active microchannel fluid processing unit and method of making, both relying on having (a) at least one inner thin sheet; (b) at least one outer thin sheet; (c) defining at least one first sub-assembly for performing at least one first unit operation by stacking a first of the at least one inner thin sheet in alternating contact with a first of the at least one outer thin sheet into a first stack and placing an end block on the at least one inner thin sheet, the at least one first sub-assembly having at least a first inlet and a first outlet; and (d) defining at least one second sub-assembly for performing at least one second unit operation either as a second flow path within the first stack or by stacking a second of the at least one inner thin sheet in alternating contact with a second of the at least one outer thin sheet as a second stack, the at least one second sub-assembly having at least a second inlet and a second outlet.

  5. Active microchannel fluid processing unit and method of making

    DOEpatents

    Bennett, Wendy D [Kennewick, WA; Martin, Peter M [Kennewick, WA; Matson, Dean W [Kennewick, WA; Roberts, Gary L [West Richland, WA; Stewart, Donald C [Richland, WA; Tonkovich, Annalee Y [Pasco, WA; Zilka, Jennifer L [Pasco, WA; Schmitt, Stephen C [Dublin, OH; Werner, Timothy M [Columbus, OH

    2002-12-10

    The present invention is an active microchannel fluid processing unit and method of making, both relying on having (a) at least one inner thin sheet; (b) at least one outer thin sheet; (c) defining at least one first sub-assembly for performing at least one first unit operation by stacking a first of the at least one inner thin sheet in alternating contact with a first of the at least one outer thin sheet into a first stack and placing an end block on the at least one inner thin sheet, the at least one first sub-assembly having at least a first inlet and a first outlet; and (d) defining at least one second sub-assembly for performing at least one second unit operation either as a second flow path within the first stack or by stacking a second of the at least one inner thin sheet in alternating contact with a second of the at least one outer thin sheet as a second stack, the at least one second sub-assembly having at least a second inlet and a second outlet.

  6. Exploiting graphics processing units for computational biology and bioinformatics.

    PubMed

    Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H

    2010-09-01

    Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show that our final GPU implementation outperforms the CPU implementation by a factor of 1700.
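
    The all-pairs distance computation that the article uses as its running example parallelizes naturally: one thread per (i, j) pair. The sketch below, with assumed names and layout, shows the naive version; note that consecutive j threads read the data matrix with a stride of d, which is exactly the uncoalesced access pattern the article's advice on coalesced reads and shared-memory staging is meant to eliminate.

        // Minimal sketch: pairwise Euclidean distances between n instances
        // with d features each (row-major); dist is the n x n result.
        __global__ void allPairsDistance(const float *data, float *dist,
                                         int n, int d)
        {
            int i = blockIdx.y * blockDim.y + threadIdx.y;
            int j = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n && j < n) {
                float sum = 0.0f;
                for (int k = 0; k < d; ++k) {
                    float diff = data[i * d + k] - data[j * d + k];
                    sum += diff * diff;
                }
                dist[i * n + j] = sqrtf(sum);
            }
        }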

  7. Accelerating sino-atrium computer simulations with graphic processing units.

    PubMed

    Zhang, Hong; Xiao, Zheng; Lin, Shien-fong

    2015-01-01

    Sino-atrial node cells (SANCs) play a significant role in rhythmic firing. To investigate their role in arrhythmia and their interactions with the atrium, computer simulations based on cellular dynamic mathematical models are generally used. However, the large-scale computation usually makes research difficult, given the limited computational power of Central Processing Units (CPUs). In this paper, an accelerating approach with Graphic Processing Units (GPUs) is proposed for a simulation consisting of the SAN tissue and the adjoining atrium. By using the operator splitting method, the computational task was made parallel. Three parallelization strategies were then put forward. The strategy with the shortest running time was further optimized by considering block size, data transfer and partition. The results showed that for a simulation with 500 SANCs and 30 atrial cells, the execution time taken by the non-optimized program decreased by 62% with respect to a serial program running on a CPU. The execution time decreased by 80% after the program was optimized. The larger the tissue, the more significant the acceleration became. The results demonstrated the effectiveness of the proposed GPU-accelerating methods and their promising applications in more complicated biological simulations.
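
    The operator-splitting approach described above decouples each cell's local membrane kinetics from the diffusive coupling between neighbours, so both sub-steps become embarrassingly parallel. The sketch below is a hedged illustration with placeholder kinetics, not the paper's model: a real SANC simulation would evaluate a full ionic model where the toy reaction terms appear.

        // Minimal sketch of one split time step: a per-cell reaction update
        // followed by a 1-D diffusion update coupling neighbouring cells.
        __global__ void reactionStep(float *v, float *gate, int nCells, float dt)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < nCells) {
                float dv = -gate[i] * v[i];   // placeholder membrane kinetics
                float dg = 0.1f * (1.0f - gate[i]) - 0.2f * gate[i] * v[i];
                v[i]    += dt * dv;
                gate[i] += dt * dg;
            }
        }

        __global__ void diffusionStep(const float *vIn, float *vOut,
                                      int nCells, float dt, float D)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i > 0 && i < nCells - 1)      // interior cells only
                vOut[i] = vIn[i] + dt * D * (vIn[i-1] - 2.0f * vIn[i] + vIn[i+1]);
        }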

  8. 20 CFR 1010.300 - What processes are to be implemented to identify covered persons?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false What processes are to be implemented to identify covered persons? 1010.300 Section 1010.300 Employees' Benefits OFFICE OF THE ASSISTANT SECRETARY... processes to identify covered persons who physically access service delivery points or who access virtual...

  9. Mechanical and Assembly Units of Viral Capsids Identified via Quasi-Rigid Domain Decomposition

    PubMed Central

    Polles, Guido; Indelicato, Giuliana; Potestio, Raffaello; Cermelli, Paolo; Twarock, Reidun; Micheletti, Cristian

    2013-01-01

    Key steps in a viral life-cycle, such as self-assembly of a protective protein container or in some cases also subsequent maturation events, are governed by the interplay of physico-chemical mechanisms involving various spatial and temporal scales. These salient aspects of a viral life cycle are hence well described and rationalised from a mesoscopic perspective. Accordingly, various experimental and computational efforts have been directed towards identifying the fundamental building blocks that are instrumental for the mechanical response, or constitute the assembly units, of a few specific viral shells. Motivated by these earlier studies we introduce and apply a general and efficient computational scheme for identifying the stable domains of a given viral capsid. The method is based on elastic network models and quasi-rigid domain decomposition. It is first applied to a heterogeneous set of well-characterized viruses (CCMV, MS2, STNV, STMV) for which the known mechanical or assembly domains are correctly identified. The validated method is next applied to other viral particles such as L-A, Pariacoto and polyoma viruses, whose fundamental functional domains are still unknown or debated and for which we formulate verifiable predictions. The numerical code implementing the domain decomposition strategy is made freely available. PMID:24244139

  10. The Performance Improvement of the Lagrangian Particle Dispersion Model (LPDM) Using Graphics Processing Unit (GPU) Computing

    DTIC Science & Technology

    2017-08-01

    access to the GPU for general purpose processing. CUDA is designed to work easily with multiple programming languages, including Fortran. ... Approved for public release; distribution unlimited.

  11. Graphics Processing Units for HEP trigger systems

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Bauce, M.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Fantechi, R.; Fiorini, M.; Giagu, S.; Gianoli, A.; Lamanna, G.; Lonardo, A.; Messina, A.; Neri, I.; Paolucci, P. S.; Piandani, R.; Pontisso, L.; Rescigno, M.; Simula, F.; Sozzi, M.; Vicini, P.

    2016-07-01

    General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming ripe. We discuss the use of online parallel computing on GPUs for a synchronous low-level trigger, focusing on the CERN NA62 experiment trigger system. The use of GPUs in higher-level trigger systems is also briefly considered.

  12. 32 CFR 516.9 - Service of criminal process within the United States.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Service of criminal process within the United States. 516.9 Section 516.9 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS LITIGATION Service of Process § 516.9 Service of criminal...

  13. 32 CFR 516.12 - Service of civil process outside the United States.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Service of civil process outside the United States. 516.12 Section 516.12 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS LITIGATION Service of Process § 516.12 Service of civil...

  14. 32 CFR 516.11 - Service of criminal process outside the United States.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Service of criminal process outside the United States. 516.11 Section 516.11 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS LITIGATION Service of Process § 516.11 Service of...

  15. 32 CFR 516.12 - Service of civil process outside the United States.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 3 2011-07-01 2009-07-01 true Service of civil process outside the United States. 516.12 Section 516.12 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS LITIGATION Service of Process § 516.12 Service of civil...

  16. 32 CFR 516.9 - Service of criminal process within the United States.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 3 2011-07-01 2009-07-01 true Service of criminal process within the United States. 516.9 Section 516.9 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS LITIGATION Service of Process § 516.9 Service of criminal...

  17. 32 CFR 516.11 - Service of criminal process outside the United States.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 3 2011-07-01 2009-07-01 true Service of criminal process outside the United States. 516.11 Section 516.11 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS LITIGATION Service of Process § 516.11 Service of...

  18. Identifying soil landscape units at the district scale by numerically clustering remote and proximal sensed data

    NASA Astrophysics Data System (ADS)

    Zare, Ehsan; Huang, Jingyi; Triantafilis, John

    2017-04-01

    Identifying soil landscape units at a district scale is important as it allows for sustainable land-use management. However, given the large number of soil properties that need to be understood and mapped, cost-effective methods are required. In this study, we use a digital soil mapping (DSM) approach in which remote and proximal sensed ancillary data collected across a farming district near Bourke are numerically clustered (fuzzy k-means: FKM) to identify soil landscape units. The remote data were obtained from an air-borne gamma-ray spectrometer survey (i.e., potassium-K, uranium-U, thorium-Th and total counts-TC). Proximal sensed data were collected using an EM38 in the horizontal (EM38h) and vertical (EM38v) modes of operation. The FKM analysis (using the Mahalanobis metric) of the kriged ancillary data (on a common 100 m grid) revealed that a fuzziness exponent (φ) of 1.4 was suitable for further analysis and that k = 4 classes minimized the fuzziness performance index (FPI) and normalised classification entropy (NCE). Using laboratory-measured physical (i.e., clay) and chemical (i.e., CEC, ECe and pH) properties, we found k = 4 was also minimized in terms of mean squared prediction error (σ²p,C) when considering topsoil (0-0.3 m) clay (159.76), CEC (21.943), ECe (13.56) and pH (0.2296) and subsoil (0.9-1.2 m) clay (80.81), CEC (31.251) and ECe (16.66). These σ²p,C values are smaller than those calculated using the mapped soil landscape units identified using a traditional approach. Class 4A represents the Aeolian soil landscape (i.e., Nb4), while 4D represents deep grey (CC19) self-mulching clays, and 4B and 4C yellow-grey (II1) self-mulching clays adjacent to the river and clay alluvial plain, respectively. The differences in clay and CEC reveal why 4B, 4C and 4D have been extensively developed for irrigated cotton production and also why the slightly less reactive 4B might be a source of deep drainage; evidenced by smaller topsoil (2.13 dS/m) and subsoil
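
    For readers unfamiliar with FKM, the membership update at its core assigns each grid point a fractional membership in every class from its distances to the class centroids: membership in class c is u_c = 1 / Σ_j (d_c/d_j)^(2/(φ-1)). The CUDA kernel below is a hypothetical, illustrative sketch of that update (names and layout assumed; the study itself was not GPU-based and used Mahalanobis distances, assumed precomputed and nonzero here).

        // Illustrative sketch of the fuzzy k-means membership update for
        // nPoints grid cells and k classes; dist holds precomputed
        // point-to-centroid distances (nPoints x k, row-major, nonzero).
        __global__ void fkmMembership(const float *dist, float *u,
                                      int nPoints, int k, float phi)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= nPoints) return;
            float expo = 2.0f / (phi - 1.0f);   // phi = 1.4 in the study
            for (int c = 0; c < k; ++c) {
                float sum = 0.0f;
                for (int j = 0; j < k; ++j)
                    sum += powf(dist[i * k + c] / dist[i * k + j], expo);
                u[i * k + c] = 1.0f / sum;      // fractional membership in class c
            }
        }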

  19. Low cost solar array project production process and equipment task. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Technical readiness for the production of photovoltaic modules using single crystal silicon dendritic web sheet material is demonstrated by: (1) selection, design and implementation of solar cell and photovoltaic module process sequence in a Module Experimental Process System Development Unit; (2) demonstration runs; (3) passing of acceptance and qualification tests; and (4) achievement of a cost effective module.

  20. Phrase Units as Determinants of Visual Processing in Music Reading

    ERIC Educational Resources Information Center

    Sloboda, John A.

    1977-01-01

    Keyboard musicians sight-read passages of music in which the amount of information about the presence of phrase units was systematically varied. Results suggest a clear analogy between the cognition of music and language, in that knowledge of abstract structure is of importance in the organization of immediate visual processing of text. (Editor/RK)

  1. Graphics processing units in bioinformatics, computational biology and systems biology.

    PubMed

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

    Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools here reviewed is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.

  2. How many schools adopt interviews during the student admission process across the health professions in the United States of America?

    PubMed Central

    2016-01-01

    Health profession schools use interviews during the admissions process to identify certain non-cognitive skills that are needed for success in diverse, inter-professional settings. This study aimed to assess the use of interviews during the student admissions process across health disciplines at schools in the United States of America in 2014. The type and frequency of non-cognitive skills assessed were also evaluated. Descriptive methods were used to analyze a sample of interview rubrics collected as part of a national survey on admissions in the health professions, which surveyed 228 schools of medicine, dentistry, pharmacy, nursing, and public health. Of the 228 schools, 130 used interviews. The most desirable non-cognitive skills from 34 schools were identified as follows: communication skills (30), motivation (22), readiness for the profession (17), service (12), and problem-solving (12). Ten schools reported using the multiple mini-interview format, which may indicate potential for expanding this practice. Disparities in the use of interviewing across health professions should be verified to help schools adopt interviews during student admissions processes. PMID:26924541

  3. How many schools adopt interviews during the student admission process across the health professions in the United States of America?

    PubMed

    Glazer, Greer; Startsman, Laura F; Bankston, Karen; Michaels, Julia; Danek, Jennifer C; Fair, Malika

    2016-01-01

    Health profession schools use interviews during the admissions process to identify certain non-cognitive skills that are needed for success in diverse, inter-professional settings. This study aimed to assess the use of interviews during the student admissions process across health disciplines at schools in the United States of America in 2014. The type and frequency of non-cognitive skills assessed were also evaluated. Descriptive methods were used to analyze a sample of interview rubrics collected as part of a national survey on admissions in the health professions, which surveyed 228 schools of medicine, dentistry, pharmacy, nursing, and public health. Of the 228 schools, 130 used interviews. The most desirable non-cognitive skills from 34 schools were identified as follows: communication skills (30), motivation (22), readiness for the profession (17), service (12), and problem-solving (12). Ten schools reported using the multiple mini-interview format, which may indicate potential for expanding this practice. Disparities in the use of interviewing across health professions should be verified to help schools adopt interviews during student admissions processes.

  4. Factors impeding flexible inpatient unit design.

    PubMed

    Pati, Debajyoti; Evans, Jennie; Harvey, Thomas E; Bazuin, Doug

    2012-01-01

    To identify and examine factors extraneous to the design decision-making process that could impede the optimization of flexibility on inpatient units. A 2006 empirical study to identify domains of design decisions that affect flexibility on inpatient units found some indication in the context of the acuity-adaptable operational model that factors extraneous to the design process could have negatively influenced the successful implementation of the model. This raised questions regarding extraneous factors that might influence the successful optimization of flexibility. An exploratory, qualitative method was adopted to examine the question. Stakeholders from five recently built acute care inpatient units participated in the study, which involved three types of data collection: (1) verbal protocol data from a gaming session; (2) in-depth semi-structured interviews; and (3) shadowing frontline personnel. Data collection was conducted between June 2009 and November 2010. The study revealed at least nine factors extraneous to the design process that have the potential to hinder the optimization of flexibility in four domains: (1) systemic; (2) cultural; (3) human; and (4) financial. Flexibility is critical to hospital operations in the new healthcare climate, where cost reduction constitutes a vital target. From this perspective, flexibility and efficiency strategies can be influenced by (1) return on investment, (2) communication, (3) culture change, and (4) problem definition. Extraneous factors identified in this study could also affect flexibility in other care settings; therefore, these findings may be viewed from the overall context of hospital design.

  5. Efficient Acceleration of the Pair-HMMs Forward Algorithm for GATK HaplotypeCaller on Graphics Processing Units.

    PubMed

    Ren, Shanshan; Bertels, Koen; Al-Ars, Zaid

    2018-01-01

    GATK HaplotypeCaller (HC) is a popular variant caller, which is widely used to identify variants in complex genomes. However, its high variant-detection accuracy comes at the cost of long execution times. In GATK HC, the pair-HMMs forward algorithm accounts for a large percentage of the total execution time. This article proposes to accelerate the pair-HMMs forward algorithm on graphics processing units (GPUs) to improve the performance of GATK HC. It presents several GPU-based implementations of the pair-HMMs forward algorithm and analyzes their performance bottlenecks on an NVIDIA Tesla K40 card with various data sets. Based on these results and the characteristics of GATK HC, we are able to identify the GPU-based implementations with the highest performance for the various analyzed data sets. Experimental results show that the GPU-based implementations of the pair-HMMs forward algorithm achieve a speedup of up to 5.47× over existing GPU-based implementations.
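
    The pair-HMM forward recurrences have a wavefront dependency structure: every cell (i, j) depends only on (i-1, j), (i, j-1), and (i-1, j-1), so all cells on one anti-diagonal can be updated concurrently. The kernel below is a simplified sketch of that idea with placeholder transition and emission constants, not the article's implementation; the host would launch it once per anti-diagonal, after initializing row 0 and column 0.

        // Simplified wavefront sketch: update match (M), insertion (I) and
        // deletion (D) state probabilities on anti-diagonal d of the
        // (read x haplotype) matrix. Constants are illustrative only.
        __global__ void forwardDiagonal(float *M, float *I, float *D,
                                        const char *read, const char *hap,
                                        int rows, int cols, int d)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x + 1;  // read index
            int j = d - i;                                      // haplotype index
            if (i >= rows || j < 1 || j >= cols) return;
            int idx  = i * cols + j,       up   = (i - 1) * cols + j;
            int left = i * cols + (j - 1), diag = (i - 1) * cols + (j - 1);
            float emit = (read[i - 1] == hap[j - 1]) ? 0.99f : 0.01f;
            M[idx] = emit * (0.98f * M[diag] + 0.10f * (I[diag] + D[diag]));
            I[idx] = 0.01f * M[up]   + 0.90f * I[up];
            D[idx] = 0.01f * M[left] + 0.90f * D[left];
        }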

  6. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    PubMed

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications are shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
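
    To make the Monte Carlo idea concrete, the sketch below (a toy illustration with invented response models and names, not the authors' IPM) has each GPU thread simulate one batch: it samples process-parameter variation, propagates the sample through two stacked unit-operation models, and counts out-of-specification outcomes, from which process capability can be estimated.

        // Toy sketch of MC propagation through stacked unit operations.
        #include <curand_kernel.h>

        __global__ void simulateBatches(int nBatches, unsigned long long seed,
                                        float specLimit, int *oosCount)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= nBatches) return;
            curandState rng;
            curand_init(seed, i, 0, &rng);
            float pp1 = 7.0f  + 0.2f * curand_normal(&rng);  // e.g. pH
            float pp2 = 30.0f + 1.5f * curand_normal(&rng);  // e.g. temperature
            float inter = 0.8f * pp1 + 0.05f * pp2;          // toy unit operation 1
            float cqa   = inter - 0.02f * pp1 * pp2;         // toy unit operation 2
            if (cqa > specLimit) atomicAdd(oosCount, 1);     // OOS event
        }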

  7. AFSO 21: Identifying Potential Failure Points in Sustaining Continuous Process Improvement Across the Air Force

    DTIC Science & Technology

    2007-04-01

    Michael W. Wynne, and the Air Force Chief of Staff, General T. Michael Moseley, “our strategy will be a comprehensive effort to improve our work processes...” ... Preface: I have always been a proponent of working smarter and not harder. ...

  8. Disciplinary power and the process of training informal carers on stroke units.

    PubMed

    Sadler, Euan; Hawkins, Rebecca; Clarke, David J; Godfrey, Mary; Dickerson, Josie; McKevitt, Christopher

    2018-01-01

    This article examines the process of training informal carers on stroke units using the lens of power. Caring is usually assumed to be a kinship obligation, but the state has long had an interest in framing the carer and caring work. Training carers in healthcare settings raises questions about the power of the state and healthcare professionals as its agents to shape expectations and practices related to the caring role. Drawing on Foucault's notion of disciplinary power, we show how disciplinary forms of power exercised in interactions between healthcare professionals and carers shape the engagement and resistance of carers in the process of training. Interview and observational field note extracts are drawn from a multi-sited study of a training programme on stroke units targeting family carers of people with stroke to consider the consequences of subjecting caring to this intervention. We found that the process of training informal carers on stroke units was not simply a matter of transferring skills from professional to lay person, but entailed disciplinary forms of power intended to shape the conduct of the carer. We interrogate the extent to which a specific kind of carer is produced through such an approach, and the wider implications for the participation of carers in training in healthcare settings and the empowerment of carers. © 2017 Foundation for the Sociology of Health & Illness.

  9. Toward a formal verification of a floating-point coprocessor and its composition with a central processing unit

    NASA Technical Reports Server (NTRS)

    Pan, Jing; Levitt, Karl N.; Cohen, Gerald C.

    1991-01-01

    Discussed here is work to formally specify and verify a floating-point coprocessor based on the MC68881. The HOL verification system developed at Cambridge University was used. The coprocessor consists of two independent units: the bus interface unit, used to communicate with the CPU, and the arithmetic processing unit, used to perform the actual calculation. Reasoning about the interaction and synchronization among processes using higher order logic is demonstrated.

  10. Analysis of the policymaking process in Burkina Faso's health sector: case studies of the creation of two health system support units.

    PubMed

    Zida, Andre; Lavis, John N; Sewankambo, Nelson K; Kouyate, Bocar; Moat, Kaelan; Shearer, Jessica

    2017-02-13

    Burkina Faso has made a number of health system policy decisions to improve performance on health indicators and strengthen responsiveness to health-related challenges. These included the creation of a General Directorate of Health Information and Statistics (DGISS) and a technical unit to coordinate performance-based financing (CT-FBR). We analysed the policymaking processes associated with the establishment of these units, and documented the factors that influenced this process. We used a multiple-case study design based on Kingdon's agenda-setting model to investigate the DGISS and CT-FBR policymaking processes. Data were collected from interviews with key informants (n = 28), published literature, policy documents (including two strategic and 230 action plans), and 55 legal/regulatory texts. Interviews were analysed using thematic qualitative analysis. Data from the documentary analysis were triangulated with the qualitative interview data. Key factors influencing the policymaking processes associated with the two units involved the 'problem' (problem identification), 'policy' (formation of policy proposals), and 'politics' (political climate/change) streams, which came together in a way that resulted in proposals being placed on the decision agenda. A number of problems with Burkina Faso's health information and financing systems were identified. Policy proposals for the DGISS and CT-FBR units were developed in response to these problems, emerging from several sources including development partners. Changes in political and public service administrations (specifically the 2008 appointment of a new Minister of Health and the establishment of a new budget allocation system), with corresponding changes in the actors and interests involved, appeared key in elevating the proposals to the decision agenda. Efforts to improve performance on health indicators and strengthen responsiveness to health-related challenges need focus on the need for a compelling problem, a

  11. Widespread loess-like deposit in the Martian northern lowlands identifies Middle Amazonian climate change

    USGS Publications Warehouse

    Skinner, James A.; Tanaka, Kenneth L.; Platz, Thomas

    2014-01-01

    Consistently mappable units critical to distinguishing the style and interplay of geologic processes through time are sparse in the Martian lowlands. This study identifies a previously unmapped Middle Amazonian (ca. 1 Ga) unit (Middle Amazonian lowland unit, mAl) that postdates the Late Hesperian and Early Amazonian lowland plains by >2 b.y. The unit is regionally defined by subtle marginal scarps and slopes, has a mean thickness of 32 m, and extends >3.1 × 10⁶ km² between lat 35°N and 80°N. Pedestal-type craterforms and nested, arcuate ridges (thumbprint terrain) tend to occur adjacent to unit mAl outcrops, suggesting that current outcrops are vestiges of a more extensive deposit that previously covered ∼16 × 10⁶ km². Exposed layers, surface pits, and the draping of subjacent landforms allude to a sedimentary origin, perhaps as a loess-like deposit emplaced rhythmically through atmospheric fallout. We propose that unit mAl accumulated coevally with, and at the expense of, the erosion of the north polar basal units, identifying a major episode of Middle Amazonian climate-driven sedimentation in the lowlands. This work links ancient sedimentary processes to climate change that occurred well before those implied by current orbital and spin axis models.

  12. Characterization of suspended bacteria from processing units in an advanced drinking water treatment plant of China.

    PubMed

    Wang, Feng; Li, Weiying; Zhang, Junpeng; Qi, Wanqi; Zhou, Yanyan; Xiang, Yuan; Shi, Nuo

    2017-05-01

    In drinking water treatment plants (DWTPs), organic pollutant removal is the primary focus, while suspended bacteria are often neglected. In this study, the suspended bacteria from each processing unit in a DWTP employing an ozone-biological activated carbon process were characterized using heterotrophic plate counts (HPCs), a flow cytometer, and 454-pyrosequencing methods. The results showed opposing trends in HPC and total cell counts in the sand filtration tank (SFT), where the cultivability of suspended bacteria increased to 34%. The cultivability level of other units stayed below 3%, except for the ozone contact tank (OCT, 13.5%) and the activated carbon filtration tank (ACFT, 34.39%). This indicates that the filtration processes markedly promoted the cultivability of suspended bacteria, suggesting biodegrading capability. In the OCT, microbial diversity indexes declined drastically, and the dominant bacteria were affiliated with the Proteobacteria phylum (99.9%) and the Betaproteobacteria class (86.3%), which were also dominant in the effluent of other units. The primary genus in the effluents of the SFT (17.4%) and the ACFT (25.6%) was Limnohabitans, inferred to be a crucial contributor to the biodegrading function of the filtration units. Overall, this paper provides an overview of the community composition of each processing unit in a DWTP as well as a reference for better developing microbial function for drinking water treatment in the future.

  13. Optimized Laplacian image sharpening algorithm based on graphic processing unit

    NASA Astrophysics Data System (ADS)

    Ma, Tinghuai; Li, Lu; Ji, Sai; Wang, Xin; Tian, Yuan; Al-Dhelaan, Abdullah; Al-Rodhaan, Mznah

    2014-12-01

    In classical Laplacian image sharpening, all pixels are processed one by one, which leads to a large amount of computation. Traditional Laplacian sharpening on a CPU is considerably time-consuming, especially for large images. In this paper, we propose a parallel implementation of Laplacian sharpening based on the Compute Unified Device Architecture (CUDA), a computing platform for Graphic Processing Units (GPUs), and analyze the impact of image size on performance as well as the relationship between data transfer time and parallel computing time. Further, according to the different characteristics of the different memory types, an improved scheme is developed that exploits shared memory in the GPU instead of global memory, further increasing efficiency. Experimental results show that the two novel algorithms outperform the traditional sequential method based on OpenCV in terms of computing speed.
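
    The shared-memory optimization the authors describe can be sketched as follows (an illustrative reimplementation, not the paper's code): each block stages a TILE×TILE patch of the image plus a one-pixel halo into shared memory, then every thread evaluates the sharpening stencil g = f − ∇²f from the staged values, so each pixel is read from global memory only once per block. A launch configuration with blockDim = (TILE, TILE) is assumed; halo corners are skipped because the 4-point stencil never reads them.

        // Minimal sketch of tiled Laplacian sharpening with shared memory.
        #define TILE 16

        __device__ float pix(const unsigned char *img, int w, int h, int x, int y)
        {
            x = min(max(x, 0), w - 1);          // clamp at image borders
            y = min(max(y, 0), h - 1);
            return (float)img[y * w + x];
        }

        __global__ void laplacianSharpen(const unsigned char *in,
                                         unsigned char *out, int w, int h)
        {
            __shared__ float tile[TILE + 2][TILE + 2];
            int x = blockIdx.x * TILE + threadIdx.x;
            int y = blockIdx.y * TILE + threadIdx.y;
            int tx = threadIdx.x + 1, ty = threadIdx.y + 1;

            tile[ty][tx] = pix(in, w, h, x, y); // centre load; halo loads below
            if (threadIdx.x == 0)        tile[ty][0]        = pix(in, w, h, x - 1, y);
            if (threadIdx.x == TILE - 1) tile[ty][TILE + 1] = pix(in, w, h, x + 1, y);
            if (threadIdx.y == 0)        tile[0][tx]        = pix(in, w, h, x, y - 1);
            if (threadIdx.y == TILE - 1) tile[TILE + 1][tx] = pix(in, w, h, x, y + 1);
            __syncthreads();

            if (x < w && y < h) {
                float lap = tile[ty-1][tx] + tile[ty+1][tx] + tile[ty][tx-1]
                          + tile[ty][tx+1] - 4.0f * tile[ty][tx];
                float g = tile[ty][tx] - lap;   // sharpened value
                out[y * w + x] = (unsigned char)fminf(fmaxf(g, 0.0f), 255.0f);
            }
        }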

  14. Treatment barriers identified by substance abusers assessed at a centralized intake unit.

    PubMed

    Rapp, Richard C; Xu, Jiangmin; Carr, Carey A; Lane, D Tim; Wang, Jichuan; Carlson, Robert

    2006-04-01

    The 59-item Barriers to Treatment Inventory (BTI) was administered to 312 substance abusers at a centralized intake unit following assessment but before treatment entry to assess their views on barriers to treatment. Factor analysis identified 25 items in 7 well-defined latent constructs: Absence of Problem, Negative Social Support, Fear of Treatment, Privacy Concerns, Time Conflict, Poor Treatment Availability, and Admission Difficulty. The factorial structure of the barriers is consistent with the findings of other studies that asked substance abusers about barriers to treatment and is conceptually compatible with Andersen's model of health care utilization. Factors were moderately to highly correlated, suggesting that they interact with one another. Selected characteristics were generally not predictive of barrier factors. Overall, results indicate that the BTI has good content validity and is a reliable instrument for assessing barriers to drug treatment. The potential utility of the BTI in assessment settings is discussed.

  15. Identifying an Education Gap in Wound Care Training in United States Dermatology.

    PubMed

    Ruiz, Emily Stamell; Ingram, Amber; Landriscina, Angelo; Tian, Jiaying; Kirsner, Robert S; Friedman, Adam

    2015-07-01

    As restoration of the integument is paramount to wound healing, dermatologists should be central to managing wounds; yet this is often not the case. If a training gap exists during residency training, this may account for the observed discrepancy. To identify United States (US) dermatology residents' impressions regarding their preparedness to care for wounds, and to assess the amount and type of training devoted to wound care during residency. An online survey among current US dermatology residents enrolled in a residency training program. The primary goal was to determine whether dermatology residents believe more wound care education is needed, evaluate preparedness to care for wounds, and identify future plans to manage wounds. Responses were received from 175 of 517 (33.8%) US Dermatology residents contacted. The majority of residents did not feel prepared to manage acute (78.3%) and chronic (84.6%) wounds. Over three quarters (77.1%) felt that more education is needed. Fewer than half (49.1% and 35.4%) of residents planned to care for acute and chronic wounds, respectively, when in practice. There is a gap in wound care education in US dermatology residency training. This translates to a low percentage of dermatology residents planning to care for wounds in future practice. Dermatology residents need to receive focused wound care training in order to translate the underpinnings of wound healing biology and ultimately better serve patients.

  16. Using Loop Heat Pipes to Minimize Survival Heater Power for NASA's Evolutionary Xenon Thruster Power Processing Units

    NASA Technical Reports Server (NTRS)

    Choi, Michael K.

    2017-01-01

    A thermal design concept of using propylene loop heat pipes to minimize survival heater power for NASA's Evolutionary Xenon Thruster power processing units is presented. It reduces the survival heater power from 183 W to 35 W per power processing unit. The reduction is 81%.

  17. Identifying carcinogens: the tobacco industry and regulatory politics in the United States.

    PubMed

    Cook, Daniel M; Bero, Lisa A

    2006-01-01

    The process of identifying carcinogens for purposes of health and safety regulation has been contested internationally. The U.S. government produces a "Report on Carcinogens" every two years, which lists known and likely human carcinogenic substances. In the late 1990s the tobacco industry responded to the proposed listing of secondhand smoke with a multi-part strategy. Despite industry efforts to challenge both the substance of the report and the agency procedures, environmental tobacco smoke was declared by the agency in 2000 to be a known human carcinogen. A subsequent lawsuit, launched by chemical interests but linked to the tobacco industry, failed, but it produced a particular legal precedent of judicial review that is favorable to all regulated industries. The authors argue that, in this case, tobacco industry regulation contradicts academic expectations of business regulatory victories. However, the tobacco industry's participation in the regulatory process influenced the process in favor of all regulated industry.

  18. Designing and Implementing an OVERFLOW Reader for ParaView and Comparing Performance Between Central Processing Units and Graphical Processing Units

    NASA Technical Reports Server (NTRS)

    Chawner, David M.; Gomez, Ray J.

    2010-01-01

    In the Applied Aerosciences and CFD branch at Johnson Space Center, computational simulations are run that face many challenges, two of which are the ability to customize software for specialized needs and the need to run simulations as fast as possible. Many different tools are used for running these simulations, and each one has its own pros and cons. Once these simulations are run, software is needed that is capable of visualizing the results in an appealing manner. Some of this software is open source, meaning that anyone can edit the source code and distribute the modifications to all other users in a future release. This is very useful, especially in this branch, where many different tools are being used. File readers can be written to load any file format into a program, to ease the bridging from one tool to another. Programming such a reader requires knowledge of the file format being read as well as the equations necessary to obtain the derived values after loading. When these CFD simulations are run, extremely large files are loaded and values calculated, and the simulations usually take a few hours to complete, even on the fastest machines. Graphics processing units (GPUs) are typically used to render graphics on computers; however, in recent years, GPUs have been used for more general applications because of the speed of these processors. Applications run on GPUs have been known to run up to forty times faster than they would on normal central processing units (CPUs). If these CFD programs are extended to run on GPUs, the amount of time they require to complete would be much less. This would allow more simulations to be run in the same amount of time and possibly permit more complex computations.

  19. NASA's Evolutionary Xenon Thruster (NEXT) Power Processing Unit (PPU) Capacitor Failure Root Cause Analysis

    NASA Technical Reports Server (NTRS)

    Soeder, James F.; Pinero, Luis; Schneidegger, Robert; Dunning, John; Birchenough, Art

    2012-01-01

    NASA's Evolutionary Xenon Thruster (NEXT) project is developing an advanced ion propulsion system for future NASA solar system exploration missions. A critical element of the propulsion system is the Power Processing Unit (PPU), which supplies regulated power to the key components of the thruster. The PPU contains six different power supplies, including the beam, discharge, discharge heater, neutralizer, neutralizer heater, and accelerator supplies. The beam supply is the largest and processes 93+% of the power. The NEXT PPU had been operated for approximately 200+ hours and experienced a series of three capacitor failures in the beam supply. The capacitors are in the same, nominally non-critical location: the input filter capacitor to a full-wave switching inverter. The three failures occurred after about 20, 30, and 135 hours of operation. This paper provides background on the NEXT PPU and the capacitor failures. It discusses the failure investigation approach, the beam supply power switching topology and its operating modes, capacitor characteristics, and circuit testing. Finally, it identifies the root cause of the failures to be the unusual confluence of the circuit switching frequency, the physical layout of the power circuits, and the characteristics of the capacitor.

  20. NASA's Evolutionary Xenon Thruster (NEXT) Power Processing Unit (PPU) Capacitor Failure Root Cause Analysis

    NASA Technical Reports Server (NTRS)

    Soeder, James F.; Scheidegger, Robert J.; Pinero, Luis R.; Birchenough, Arthur J.; Dunning, John W.

    2012-01-01

    NASA's Evolutionary Xenon Thruster (NEXT) project is developing an advanced ion propulsion system for future NASA solar system exploration missions. A critical element of the propulsion system is the Power Processing Unit (PPU), which supplies regulated power to the key components of the thruster. The PPU contains six different power supplies, including the beam, discharge, discharge heater, neutralizer, neutralizer heater, and accelerator supplies. The beam supply is the largest and processes 93+% of the power. The NEXT PPU had been operated for approximately 200+ hr and experienced a series of three capacitor failures in the beam supply. The capacitors are in the same, nominally non-critical location: the input filter capacitor to a full-wave switching inverter. The three failures occurred after about 20, 30, and 135 hr of operation. This paper provides background on the NEXT PPU and the capacitor failures. It discusses the failure investigation approach, the beam supply power switching topology and its operating modes, capacitor characteristics, and circuit testing. Finally, it identifies the root cause of the failures to be the unusual confluence of the circuit switching frequency, the physical layout of the power circuits, and the characteristics of the capacitor.

  1. Characteristics of vibrator use by gay and bisexually identified men in the United States.

    PubMed

    Reece, Michael; Rosenberger, Joshua G; Schick, Vanessa; Herbenick, Debby; Dodge, Brian; Novak, David S

    2010-10-01

    Recent reports indicate that vibrator use during solo and partnered sexual activities is common among heterosexual men and women in the United States. However, little research has comprehensively assessed vibrator use among gay and bisexually identified men. This study sought to document the extent to which gay and bisexually identified men report using vibrators, the sexual and relational situations within which they use them, and how men use vibrators on their own and their partners' bodies. Data were collected from 25,294 gay and bisexually identified men from 50 U.S. states and from the District of Columbia via an internet-based survey. Measures included sociodemographics, health-related indicators, sexual behaviors, and those related to recent and past use of vibrators during solo and partnered sexual interactions with other men. Approximately half (49.8%) of gay and bisexually identified men reported having used vibrators. Most men who had used a vibrator in the past reported use during masturbation (86.2%). When used during partnered interactions, vibrators were incorporated into foreplay (65.9%) and intercourse (59.4%). Men reported frequent insertion of vibrators into the anus or rectum when using them during masturbation (87.3%), which was also common during partnered interactions (∼60%), but varied slightly for casual and relationship sex partners. For both masturbation and partnered interactions, men overwhelmingly endorsed the extent to which vibrator use contributed to sexual arousal, orgasm, and pleasure. Vibrator use during both solo and partnered sexual acts was common among the gay and bisexually identified men in this sample and was described by men as adding to the quality of their sexual experiences. © 2010 International Society for Sexual Medicine.

  2. Patterns of genetic differentiation at MHC class I genes and microsatellites identify conservation units in the giant panda.

    PubMed

    Zhu, Ying; Wan, Qiu-Hong; Yu, Bin; Ge, Yun-Fa; Fang, Sheng-Guo

    2013-10-22

    Evaluating patterns of genetic variation is important to identify conservation units (i.e., evolutionarily significant units [ESUs], management units [MUs], and adaptive units [AUs]) in endangered species. While neutral markers could be used to infer population history, their application in the estimation of adaptive variation is limited. The capacity to adapt to various environments is vital for the long-term survival of endangered species. Hence, analysis of adaptive loci, such as the major histocompatibility complex (MHC) genes, is critical for conservation genetics studies. Here, we investigated 4 classical MHC class I genes (Aime-C, Aime-F, Aime-I, and Aime-L) and 8 microsatellites to infer patterns of genetic variation in the giant panda (Ailuropoda melanoleuca) and to further define conservation units. Overall, we identified 24 haplotypes (9 for Aime-C, 1 for Aime-F, 7 for Aime-I, and 7 for Aime-L) from 218 individuals obtained from 6 populations of giant panda. We found that the Xiaoxiangling population had the highest genetic variation at microsatellites among the 6 giant panda populations and higher genetic variation at Aime-MHC class I genes than other larger populations (Qinling, Qionglai, and Minshan populations). Differentiation index (FST)-based phylogenetic and Bayesian clustering analyses for Aime-MHC-I and microsatellite loci both supported that most populations were highly differentiated. The Qinling population was the most genetically differentiated. The giant panda showed a relatively higher level of genetic diversity at MHC class I genes compared with endangered felids. Using all of the loci, we found that the 6 giant panda populations fell into 2 ESUs: Qinling and non-Qinling populations. We defined 3 MUs based on microsatellites: Qinling, Minshan-Qionglai, and Daxiangling-Xiaoxiangling-Liangshan. We also recommended 3 possible AUs based on MHC loci: Qinling, Minshan-Qionglai, and Daxiangling-Xiaoxiangling-Liangshan. Furthermore, we recommend

  3. Evolution of the Power Processing Units Architecture for Electric Propulsion at CRISA

    NASA Astrophysics Data System (ADS)

    Palencia, J.; de la Cruz, F.; Wallace, N.

    2008-09-01

    Since 2002, the team formed by EADS Astrium CRISA, Astrium GmbH Friedrichshafen, and QinetiQ has participated in several flight programs where electric propulsion based on Kaufman-type ion thrusters is the baseline concept. In 2002, CRISA won the contract for the development of the Ion Propulsion Control Unit (IPCU) for GOCE. This unit, together with the T5 thruster by QinetiQ, provides near-perfect atmospheric drag compensation, offering thrust levels in the range of 1 to 20 mN. By the end of 2003, CRISA started the adaptation of the IPCU concept to the QinetiQ T6 ion thruster for the Alphabus program. This paper shows how the Power Processing Unit design has evolved over time, including the current developments.

  4. A data mining paradigm for identifying key factors in biological processes using gene expression data.

    PubMed

    Li, Jin; Zheng, Le; Uchiyama, Akihiko; Bin, Lianghua; Mauro, Theodora M; Elias, Peter M; Pawelczyk, Tadeusz; Sakowicz-Burkiewicz, Monika; Trzeciak, Magdalena; Leung, Donald Y M; Morasso, Maria I; Yu, Peng

    2018-06-13

    A large volume of biological data is being generated for studying the mechanisms of various biological processes. These valuable data enable large-scale computational analyses to gain biological insights. However, it remains a challenge to mine the data efficiently for knowledge discovery. The heterogeneity of these data makes it difficult to integrate them consistently, slowing down the process of biological discovery. We introduce a data processing paradigm to identify key factors in biological processes via systematic collection of gene expression datasets, primary analysis of data, and evaluation of consistent signals. To demonstrate its effectiveness, our paradigm was applied to epidermal development and identified many genes that play a potential role in this process. Besides the known epidermal development genes, a substantial proportion of the identified genes are still not supported by gain- or loss-of-function studies, yielding many novel genes for future studies. Among them, we selected a top gene for loss-of-function experimental validation and confirmed its function in epidermal differentiation, proving the ability of this paradigm to identify new factors in biological processes. In addition, this paradigm revealed many key genes in cold-induced thermogenesis using data from cold-challenged tissues, demonstrating its generalizability. This paradigm can lead to fruitful results for studying molecular mechanisms in an era of explosive accumulation of publicly available biological data.
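
    One way to picture the paradigm's final "consistent signal" step: after primary analysis reduces each dataset to per-gene differential-expression scores, a gene is kept if it passes a significance threshold in many datasets. The kernel below is a hypothetical sketch of that vote count (the paradigm itself is not tied to GPUs; all names, the layout, and the scoring scheme are assumptions for illustration).

        // Hypothetical sketch: count, per gene, how many datasets show a
        // significant signal; z is a (nGenes x nDatasets) score matrix.
        __global__ void consistencyVotes(const float *z, int *votes,
                                         int nGenes, int nDatasets, float thresh)
        {
            int g = blockIdx.x * blockDim.x + threadIdx.x;
            if (g >= nGenes) return;
            int count = 0;
            for (int d = 0; d < nDatasets; ++d)
                if (fabsf(z[g * nDatasets + d]) > thresh) ++count;
            votes[g] = count;   // high counts mark consistent candidate genes
        }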

  5. Case Studies of Internationalization in Adult and Higher Education: Inside the Processes of Four Universities in the United States and the United Kingdom

    ERIC Educational Resources Information Center

    Coryell, Joellen Elizabeth; Durodoye, Beth A.; Wright, Robin Redmon; Pate, P. Elizabeth; Nguyen, Shelbee

    2012-01-01

    This report outlines a method for learning about the internationalization processes at institutions of adult and higher education and then provides the analysis of data gathered from the researchers' own institution and from site visits to three additional universities in the United States and the United Kingdom. It was found that campus…

  6. Development and pilot test of a process to identify research needs from a systematic review.

    PubMed

    Saldanha, Ian J; Wilson, Lisa M; Bennett, Wendy L; Nicholson, Wanda K; Robinson, Karen A

    2013-05-01

    To ensure appropriate allocation of research funds, we need methods for identifying high-priority research needs. We developed and pilot tested a process to identify needs for primary clinical research using a systematic review in gestational diabetes mellitus. We conducted eight steps: abstract research gaps from a systematic review using the Population, Intervention, Comparison, Outcomes, and Settings (PICOS) framework; solicit feedback from the review authors; translate gaps into researchable questions using the PICOS framework; solicit feedback from multidisciplinary stakeholders at our institution; establish consensus among multidisciplinary external stakeholders on the importance of the research questions using the Delphi method; prioritize outcomes; develop conceptual models to highlight research needs; and evaluate the process. We identified 19 research questions. During the Delphi method, external stakeholders established consensus for 16 of these 19 questions (15 with "high" and 1 with "medium" clinical benefit/importance). We pilot tested an eight-step process to identify clinically important research needs. Before wider application of this process, it should be tested using systematic reviews of other diseases. Further evaluation should include assessment of the usefulness of the research needs generated using this process for primary researchers and funders. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. groHMM: a computational tool for identifying unannotated and cell type-specific transcription units from global run-on sequencing data.

    PubMed

    Chae, Minho; Danko, Charles G; Kraus, W Lee

    2015-07-16

    Global run-on coupled with deep sequencing (GRO-seq) provides extensive information on the location and function of coding and non-coding transcripts, including primary microRNAs (miRNAs), long non-coding RNAs (lncRNAs), and enhancer RNAs (eRNAs), as well as yet undiscovered classes of transcripts. However, few computational tools tailored toward this new type of sequencing data are available, limiting the applicability of GRO-seq data for identifying novel transcription units. Here, we present groHMM, a computational tool in R, which defines the boundaries of transcription units de novo using a two-state hidden Markov model (HMM). A systematic comparison of performance between groHMM and two existing peak-calling methods tuned to identify broad regions (SICER and HOMER) favorably supports our approach on existing GRO-seq data from MCF-7 breast cancer cells. To demonstrate the broader utility of our approach, we have used groHMM to annotate a diverse array of transcription units (i.e., primary transcripts) from four GRO-seq data sets derived from cells representing a variety of different human tissue types, including non-transformed cells (cardiomyocytes and lung fibroblasts) and transformed cells (LNCaP and MCF-7 cancer cells), as well as non-mammalian cells (from flies and worms). As an example of the utility of groHMM and its application to questions about the transcriptome, we show how groHMM can be used to analyze cell type-specific enhancers as defined by newly annotated enhancer transcripts. Our results show that groHMM can reveal new insights into cell type-specific transcription by identifying novel transcription units, and serves as a complete and useful tool for evaluating functional genomic elements in cells.
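
    To illustrate the underlying idea (groHMM itself is an R package; none of the following is its code): a two-state HMM labels each bin of a binned read-count signal as transcribed or not, and a Viterbi-style pass recovers contiguous transcription units. The sketch below, with placeholder emission and transition scores, runs one independent decode per contig, one thread each.

        // Hypothetical sketch: per-contig two-state log-space Viterbi.
        // counts: (nContigs x len) binned read signal; path: 0/1 labels;
        // bp: (nContigs x 2 x len) backpointer scratch space.
        __global__ void viterbiTwoState(const float *counts, unsigned char *path,
                                        unsigned char *bp, int nContigs, int len)
        {
            int c = blockIdx.x * blockDim.x + threadIdx.x;
            if (c >= nContigs) return;
            const float *x = counts + c * len;
            unsigned char *b = bp + c * 2 * len;
            float v0 = -x[0], v1 = x[0] - 3.0f;            // placeholder emissions
            for (int t = 1; t < len; ++t) {
                float e0 = -x[t], e1 = x[t] - 3.0f;
                float s00 = v0 - 0.01f, s10 = v1 - 5.0f;   // transitions into state 0
                float s01 = v0 - 5.0f,  s11 = v1 - 0.01f;  // transitions into state 1
                b[2*t]     = (s10 > s00); v0 = fmaxf(s00, s10) + e0;
                b[2*t + 1] = (s11 > s01); v1 = fmaxf(s01, s11) + e1;
            }
            unsigned char s = (v1 > v0);                   // best final state
            for (int t = len - 1; t >= 0; --t) {           // backtrace
                path[c * len + t] = s;
                if (t > 0) s = b[2*t + s];
            }
        }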

  8. 50 CFR Table 1 to Subpart H of... - Pacific Salmon EFH Identified by USGS Hydrologic Unit Code (HUC)

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Pacific Salmon EFH Identified by USGS Hydrologic Unit Code (HUC) 1 Table 1 to Subpart H of Part 660 Wildlife and Fisheries FISHERY CONSERVATION AND... WEST COAST STATES West Coast Salmon Fisheries Pt. 660, Subpt. H, Table 1 Table 1 to Subpart H of Part...

  9. Pulp capacity in the United States, 2000.

    Treesearch

    Brett R. Smith; Robert W. Rice; Peter J. Ince

    2003-01-01

    Production capacities of all woodpulp mills in the United States are identified by location, ownership, and process type. For each mill, production capacity is reported for the year 2000 by process type; total mill capacities are also reported for 1961, 1965, 1979, 1974, and 1983. In addition, the report summarizes the recent history and current status of woodpulp...

  10. Potential for solar industrial process heat in the United States: A look at California

    NASA Astrophysics Data System (ADS)

    Kurup, Parthiv; Turchi, Craig

    2016-05-01

    The use of Concentrating Solar Power (CSP) collectors (e.g., parabolic trough or linear Fresnel systems) for industrial thermal applications has attracted increasing global interest in the last few years. In particular, the European Union has been tracking the deployment of Solar Industrial Process Heat (SIPH) plants. Although relatively few plants have been deployed in the United States (U.S.), we establish that 29% of primary energy consumption in the U.S. manufacturing sector is used for process heating. Perhaps the best opportunities for SIPH reside in the state of California due to its excellent solar resource, strong industrial base, and solar-friendly policies. This initial analysis identified 48 TWhth/year of process heat demand in certain California industries versus a technical solar-thermal energy potential of 23,000 TWhth/year. The top five users of industrial steam in the state are highlighted, and special attention is paid to the food sector, which has been an early adopter of SIPH in other countries. A comparison of the cost of heat from solar-thermal collectors versus the cost of industrial natural gas in California indicates that SIPH may be cost effective even under the relatively low gas prices seen in 2014. A recommended next step is the identification of pilot project candidates to promote the deployment of SIPH facilities.
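
    As a rough illustration of the final cost comparison, the sketch below computes a levelized cost of heat (LCOH) for a solar-thermal collector via a capital recovery factor and compares it with gas heat; every number is an illustrative placeholder, not a value from the study.

    ```python
    # Toy LCOH comparison: solar-thermal collector vs. industrial natural gas.
    # All inputs are illustrative placeholders.
    def lcoh_solar(capex_per_kwth, annual_kwhth_per_kwth,
                   opex_frac=0.01, rate=0.07, years=25):
        """$/kWh_th: annualize capex with a capital recovery factor."""
        crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
        return capex_per_kwth * (crf + opex_frac) / annual_kwhth_per_kwth

    def lcoh_gas(price_per_mmbtu, boiler_eff=0.8):
        """$/kWh_th: 1 MMBtu = 293.07 kWh of fuel energy."""
        return price_per_mmbtu / (293.07 * boiler_eff)

    print(f"solar: {lcoh_solar(300.0, 1800.0):.3f} $/kWh_th")
    print(f"gas:   {lcoh_gas(4.0):.3f} $/kWh_th")
    ```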

  11. Potential for Solar Industrial Process Heat in the United States: A Look at California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurup, Parthiv; Turchi, Craig

    The use of Concentrating Solar Power (CSP) collectors (e.g., parabolic trough or linear Fresnel systems) for industrial thermal applications has attracted increasing global interest in the last few years. In particular, the European Union has been tracking the deployment of Solar Industrial Process Heat (SIPH) plants. Although relatively few plants have been deployed in the United States (U.S.), we establish that 29% of primary energy consumption in the U.S. manufacturing sector is used for process heating. Perhaps the best opportunities for SIPH reside in the state of California due to its excellent solar resource, strong industrial base, and solar-friendly policies. This initial analysis identified 48 TWhth/year of process heat demand in certain California industries versus a technical solar-thermal energy potential of 23,000 TWhth/year. The top five users of industrial steam in the state are highlighted, and special attention is paid to the food sector, which has been an early adopter of SIPH in other countries. A comparison of the cost of heat from solar-thermal collectors versus the cost of industrial natural gas in California indicates that SIPH may be cost effective even under the relatively low gas prices seen in 2014. A recommended next step is the identification of pilot project candidates to promote the deployment of SIPH facilities.

  12. Data Handling and Processing Unit for Alphabus/Alphasat TDP-8

    NASA Astrophysics Data System (ADS)

    Habinc, Sandi; Martins, Rodolfo; Costa Pinto, Joao; Furano, Gianluca

    2011-08-01

    ESA's and Inmarsat's ARTES 8 Alphabus/Alphasat is a specific programme dedicated to the development and deployment of Alphasat. It encompasses several technology demonstration payloads (TDPs), of which TDP8 is an environment-effects facility to monitor the GEO radiation environment and its effects on electronic components and sensors. This paper discusses the rapid development of the processor and board for TDP8's data handling and processing unit.

  13. Risk management of emergency service vehicle crashes in the United States fire service: process, outputs, and recommendations.

    PubMed

    Bui, David P; Pollack Porter, Keshia; Griffin, Stephanie; French, Dustin D; Jung, Alesia M; Crothers, Stephen; Burgess, Jefferey L

    2017-11-17

    Emergency service vehicle crashes (ESVCs) are a leading cause of death in the United States fire service. Risk management (RM) is a proactive process for identifying occupational risks and reducing hazards and unwanted events through an iterative process of scoping hazards, risk assessment, and implementing controls. We describe the process, outputs, and lessons learned from the application of a proactive RM process to reduce ESVCs in US fire departments. Three fire departments, representative of urban, suburban, and rural geographies, participated in a facilitated RM process delivered through focus groups and stakeholder discussion. Crash reports from department databases were reviewed to characterize the context, circumstances, hazards, and risks of ESVCs. Identified risks were ranked using a risk matrix that considered risk likelihood and severity. Department-specific control measures were selected based on group consensus. Interviews and focus groups were used to assess the acceptability and utility of the RM process and perceived facilitators of and barriers to implementation. Three to six RM meetings were conducted at each fire department. There were 7.4 crashes per 100 personnel in the urban department and 10.5 per 100 personnel in the suburban department; the rural department experienced zero crashes. All departments identified emergency response, backing, on-scene struck-by incidents, driver distraction, vehicle/road visibility, and driver training as high or medium concerns. Additional high-priority risks varied by department: the urban department prioritized turning and rear-ending crashes; the suburban firefighters prioritized inclement weather/road environment and low-visibility-related crashes; and the rural volunteer fire department prioritized exiting station, vehicle failure, and inclement weather/road environment related incidents. Selected controls included new policies and standard operating procedures to reduce emergency response, cameras to enhance driver

  14. System and method for identifying, reporting, and evaluating presence of substance

    DOEpatents

    Smith, Maurice [Kansas City, MO]; Lusby, Michael [Kansas City, MO]; Van Hook, Arthur [Lotawana, MO]; Cook, Charles J [Raytown, MO]; Wenski, Edward G [Lenexa, KS]; Solyom, David [Overland Park, KS]

    2012-02-14

    A system and method for identifying, reporting, and evaluating a presence of a solid, liquid, gas, or other substance of interest, particularly a dangerous, hazardous, or otherwise threatening chemical, biological, or radioactive substance. The system comprises one or more substantially automated, location self-aware remote sensing units; a control unit; and one or more data processing and storage servers. Data is collected by the remote sensing units and transmitted to the control unit; the control unit generates and uploads a report incorporating the data to the servers; and thereafter the report is available for review by a hierarchy of responsive and evaluative authorities via a wide area network. The evaluative authorities include a group of relevant experts who may be widely or even globally distributed.

  15. Korean Americans in the United States: Problems and Alternatives.

    ERIC Educational Resources Information Center

    Kim, Eugene C.

    Problems faced by Koreans in the United States are identified and analyzed in this paper, and some pragmatic remedies are offered. First, the acculturation process is slow--the mean length of Koreans' sojourn in the United States is only 6.5 years, whereas complete acculturation takes several generations. Second, although most Korean emigres learned…

  16. A Master Plan for Unit Cost Studies Among Community Junior Colleges.

    ERIC Educational Resources Information Center

    Sims, Howard D.

    The need for higher education programs is being challenged, and unit cost studies may become an integral part of the funding process for junior colleges. This paper describes the major tasks in a cost study and reviews the problems encountered in the unit costing efforts. The main tasks are: (1) identifying units of measurement (the language used…

  17. Modeling of Unit-Cells With Z-Pins Using FLASH: Pre-Processing and Post Processing

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald

    2005-01-01

    Although the toughening properties of stitches, z-pins and similar structures have been studied extensively, investigations on the effect of z-pins on the in-plane properties of laminates are limited. A brief summary on the effect of z-pins on the in-plane tensile and compressive properties of composite laminates is presented together with a concise introduction into the finite element code FLASH. The remainder of the report illustrates the modeling aspect of unit cells with z-pins in FLASH and focuses on input and output data as well as post-processing of results.

  18. Eco-Efficient Process Improvement at the Early Development Stage: Identifying Environmental and Economic Process Hotspots for Synergetic Improvement Potential.

    PubMed

    Piccinno, Fabiano; Hischier, Roland; Seeger, Stefan; Som, Claudia

    2018-05-15

    We present here a new eco-efficiency process-improvement method to highlight the combined environmental and cost hotspots of the production process of a new material at a very early development stage. Production-specific and scaled-up results for life cycle assessment (LCA) and production costs are combined in a new analysis to identify synergetic improvement potentials and trade-offs, setting goals for the eco-design of new processes. The identified hotspots and bottlenecks will help users to focus on the relevant steps for improvements from an eco-efficiency perspective and potentially reduce the associated environmental impacts and production costs. Our method is illustrated with a case study of nanocellulose. The results indicate that the production route should start with carrot pomace, use heat and solvent recovery, and deactivate the enzymes with bleach instead of heat. To further improve the process, the results show that focus should be laid on the carrier polymer, sodium alginate, and the production of the GripX coating. Overall, the method shows that the underlying LCA scale-up framework is valuable for purposes beyond conventional LCA studies and is applicable at a very early stage to provide researchers with a better understanding of their production process.

  19. On the technological development of cotton primary processing, using a new drying-purifying unit

    NASA Astrophysics Data System (ADS)

    Agzamov, M. M.; Yunusov, S. Z.; Gafurov, J. K.

    2017-10-01

    The article presents a feasibility study of research on the technological development of cotton primary processing with modified parameters of the drying and cleaning process for small litter. As a result of theoretical and experimental research, a drying-purifying unit was designed that eliminates the heat source, exhaust fans, dryer drum, peg-drum cotton cleaner, and the conveyor transferring raw cotton from the dryer to the purifier used in existing processes. Experiments showed that the installed drying-purifying unit (with eight wheels) achieved a cleaning effect of 34% on small litter, i.e., a cleaning effect higher than that of the 1XK drum cleaner currently in operation. Based on this research, patent RU UZ FAP 00674, "Apparatus for drying and cleaning fibrous material," was granted.

  20. Optimisation of multiplet identifier processing on a PLAYSTATION® 3

    NASA Astrophysics Data System (ADS)

    Hattori, Masami; Mizuno, Takashi

    2010-02-01

    To enable high-performance computing (HPC) for applications with large datasets using a Sony® PLAYSTATION® 3 (PS3™) video game console, we configured a hybrid system consisting of a Windows® PC and a PS3™. To validate this system, we implemented the real-time multiplet identifier (RTMI) application, which identifies multiplets of microearthquakes in terms of the similarity of their waveforms. The cross-correlation computation, which is the core algorithm of the RTMI application, was optimised for the PS3™ platform, while the rest of the computation, including data input and output, remained on the PC. With this configuration, the core part of the algorithm ran 69 times faster than the original program, accelerating the total computation more than fivefold. As a result, the system processed up to 2100 total microseismic events, whereas the original implementation had a limit of 400 events. These results indicate that this system enables high-performance computing for large datasets using the PS3™, as long as data transfer time is negligible compared with computation time.
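
    The similarity test at the heart of the application can be sketched as follows: two events belong to the same multiplet when the peak of their normalized cross-correlation exceeds a threshold. The 0.9 threshold and the greedy grouping below are assumptions for illustration, not the optimised PS3™ implementation.

    ```python
    # Sketch of multiplet identification by waveform cross-correlation.
    import numpy as np

    def max_norm_xcorr(a, b):
        """Peak normalized cross-correlation over all lags (1.0 = identical)."""
        a = (a - a.mean()) / (a.std() * len(a))
        b = (b - b.mean()) / b.std()
        return np.max(np.correlate(a, b, mode="full"))

    def group_multiplets(waveforms, threshold=0.9):
        """Greedy grouping: an event joins the first group it correlates with."""
        groups = []
        for w in waveforms:
            for g in groups:
                if max_norm_xcorr(w, g[0]) >= threshold:
                    g.append(w)
                    break
            else:
                groups.append([w])
        return groups

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 200)
    ev1 = np.sin(40 * t) * np.exp(-3 * t)                    # template event
    ev2 = np.roll(ev1, 5) + 0.02 * rng.standard_normal(200)  # shifted, noisy copy
    ev3 = rng.standard_normal(200)                           # unrelated event
    print([len(g) for g in group_multiplets([ev1, ev2, ev3])])  # [2, 1]
    ```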

  1. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Design work for a photovoltaic module, fabricated using single crystal silicon dendritic web sheet material, resulted in the identification of a surface treatment for the module glass superstrate which improved module efficiencies. A final solar module environmental test, a simulated hailstone impact test, was conducted on full size module superstrates to verify that the module's tempered glass superstrate can withstand specified hailstone impacts near the corners and edges of the module. Process sequence design work on the selective metallization process, liquid dopant investigation, dry processing, and antireflective/photoresist application technique tasks, as well as the optimum thickness for Ti/Pd, is discussed. A noncontact cleaning method for raw web cleaning was identified, and antireflective and photoresist coatings for the dendritic webs were selected. The design of a cell string conveyor, an interconnect feed system, and a rolling ultrasonic spot bonding head, and the identification of the optimal commercially available programmable control system are also discussed. An economic analysis to assess cost goals of the process sequence is also given.

  2. 40 CFR 65.118 - Alternative means of emission limitation: Enclosed-vented process units.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONSOLIDATED FEDERAL AIR RULE Equipment Leaks § 65.118... control device. Process units that are enclosed in such a manner that all emissions from equipment leaks...

  3. Identifying target processes for microbial electrosynthesis by elementary mode analysis.

    PubMed

    Kracke, Frauke; Krömer, Jens O

    2014-12-30

    Microbial electrosynthesis and electro-fermentation are techniques that aim to optimize microbial production of chemicals and fuels by regulating the cellular redox balance via interaction with electrodes. While the concept has been known for decades, major knowledge gaps remain which make it hard to evaluate its biotechnological potential. Here we present an in silico approach to identify beneficial production processes for electro-fermentation by elementary mode analysis. Since the fundamentals of electron transport between electrodes and microbes have not been fully uncovered yet, we propose different options and discuss their impact on biomass and product yields. For the first time, 20 different valuable products were screened for their potential to show increased yields during anaerobic, electrically enhanced fermentation. Surprisingly, we found that an increase in product formation by electrical enhancement is not necessarily dependent on the degree of reduction of the product but rather on the metabolic pathway it is derived from. We present a variety of beneficial processes with product yield increases of up to 36% in reductive and 84% in oxidative fermentations, and final theoretical product yields up to 100%. This includes compounds that are already produced at industrial scale, such as succinic acid, lysine, and diaminopentane, as well as potential novel bio-commodities, such as isoprene, para-hydroxybenzoic acid, and para-aminobenzoic acid. Furthermore, it is shown that the mode of electron transport has a major impact on achievable biomass and product yields. The coupling of electron transport to energy conservation was identified as crucial for most processes. This study introduces a powerful tool to determine beneficial substrate and product combinations for electro-fermentation. It also highlights that the maximal yield achievable by bioelectrochemical techniques depends strongly on the actual electron transport mechanisms. Therefore it is of great importance to
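
    How yields fall out of elementary modes can be shown with a toy example: each mode is a feasible steady-state flux pattern, and its product yield is simply product flux over substrate uptake, screened with and without an electrode reaction allowed. The mode matrix below is invented for illustration; the study derives modes from a full stoichiometric model.

    ```python
    # Toy elementary-mode yield screen. Columns: [substrate uptake,
    # product export, biomass, electrode electron exchange]; rows are
    # hypothetical elementary modes (relative fluxes).
    import numpy as np

    modes = np.array([
        [1.0, 1.2, 0.00, 0.0],   # fermentative mode, no electrode
        [1.0, 1.6, 0.00, 2.0],   # electrically enhanced mode
        [1.0, 0.4, 0.05, 0.0],   # growth-coupled mode
    ])
    yields = modes[:, 1] / modes[:, 0]   # mol product per mol substrate
    no_electrode = modes[:, 3] == 0
    print("max yield, conventional fermentation:", yields[no_electrode].max())  # 1.2
    print("max yield, electro-fermentation:     ", yields.max())                # 1.6
    ```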

  4. Flat-plate solar array project: Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The process technology for the manufacture of semiconductor-grade silicon in a large commercial plant by 1986, at a price of less than $14 per kilogram of silicon (in 1975 dollars), is discussed. The engineering design, installation, checkout, and operation of an Experimental Process System Development Unit are discussed. Quality control in scaling up the process and an economic analysis of product and production costs are also discussed.

  5. System and method for identifying, reporting, and evaluating presence of substance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Maurice; Lusby, Michael; Van Hook, Arthur

    A system and method for identifying, reporting, and evaluating a presence of a solid, liquid, gas, or other substance of interest, particularly a dangerous, hazardous, or otherwise threatening chemical, biological, or radioactive substance. The system comprises one or more substantially automated, location self-aware remote sensing units; a control unit; and one or more data processing and storage servers. Data is collected by the remote sensing units and transmitted to the control unit; the control unit generates and uploads a report incorporating the data to the servers; and thereafter the report is available for review by a hierarchy of responsive and evaluative authorities via a wide area network. The evaluative authorities include a group of relevant experts who may be widely or even globally distributed.

  6. System And Method For Identifying, Reporting, And Evaluating Presence Of Substance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Maurice; Lusby, Michael; Hook, Arthur Van

    A system and method for identifying, reporting, and evaluating a presence of a solid, liquid, gas, or other substance of interest, particularly a dangerous, hazardous, or otherwise threatening chemical, biological, or radioactive substance. The system comprises one or more substantially automated, location self-aware remote sensing units; a control unit; and one or more data processing and storage servers. Data is collected by the remote sensing units and transmitted to the control unit; the control unit generates and uploads a report incorporating the data to the servers; and thereafter the report is available for review by a hierarchy of responsive and evaluative authorities via a wide area network. The evaluative authorities include a group of relevant experts who may be widely or even globally distributed.

  7. Envisioning successful teamwork: An exploratory qualitative study of team processes used by nursing teams in a paediatric hospital unit.

    PubMed

    Whitehair, Leeann; Hurley, John; Provost, Steve

    2018-06-12

    To explore how team processes support nursing teams in hospital units during everyday work. Due to their close proximity to patients, nurses are central to the process of maintaining patient safety. Globally, changes in models of care delivery by nurses, inclusive of team nursing, are being considered. This qualitative study used purposive sampling in a single hospital, and participants were nurses employed to work on a paediatric unit. Data were collected using non-participant observation. Thematic analysis was used to analyse and code data to create themes. Three clear themes emerged. Theme 1, "We are a close knit team": behaviours building a successful team - outlines expectations regarding how members are to behave when establishing, nurturing, and managing a team. Theme 2, "Onto it": ways of interacting with each other - identifies the expected pattern of relating within the team which contributes to shared understanding and actions. Theme 3, "No point in second guessing": maintaining a global view of the unit - focuses on the processes for monitoring and reporting signals that team performance is on course or breaking down, and includes accepting responsibility to lead the team and team members having a widespread sensitivity to what needs to happen. Essential to successful teamwork is the interplay and mutuality of team members and team leaders. Leadership behaviours exhibited in this study provide useful insights into how informal and shared or distributed leadership of teams may be achieved. Without buy-in from team members, teams may not achieve successful desired outcomes. It is not sufficient for teams to rely on current successful outcomes, as they need to be on the look-out for new ways to ensure that they can anticipate possible risks or threats to the team before harm is done. This article is protected by copyright. All rights reserved.

  8. Grand Junction projects office mixed-waste treatment program, VAC*TRAX mobile treatment unit process hazards analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bloom, R.R.

    1996-04-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented VAC*TRAX mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses an indirectly heated, batch vacuum dryer to thermally desorb organic compounds from mixed wastes. This process hazards analysis evaluated 102 potential hazards. The three significant hazards identified involved the inclusion of oxygen in a process that also included an ignition source and fuel. Changes to the design of the MTU were made concurrent with the hazard identification and analysis; all hazards with initial risk rankings of 1 or 2 were reduced to acceptable risk rankings of 3 or 4. The overall risk to any population group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  9. Failure mode and effect analysis: improving intensive care unit risk management processes.

    PubMed

    Askari, Roohollah; Shafii, Milad; Rafiei, Sima; Abolhassani, Mohammad Sadegh; Salarikhah, Elaheh

    2017-04-18

    Purpose Failure modes and effects analysis (FMEA) is a practical tool to evaluate risks, discover failures in a proactive manner, and propose corrective actions to reduce or eliminate potential risks. The purpose of this paper is to apply the FMEA technique to examine the hazards associated with the process of service delivery in the intensive care unit (ICU) of a tertiary hospital in Yazd, Iran. Design/methodology/approach This was a before-after study conducted between March 2013 and December 2014. By forming a FMEA team, all potential hazards associated with ICU services - their frequency and severity - were identified. Then a risk priority number was calculated for each activity as an indicator representing high-priority areas that need special attention and resource allocation. Findings Eight failure modes with the highest priority scores, including endotracheal tube defect, wrong placement of endotracheal tube, EVD interface, aspiration failure during suctioning, chest tube failure, tissue injury, and deep vein thrombosis, were selected for improvement. Findings affirmed that the improvement strategies were generally satisfying and significantly decreased total failures. Practical implications Application of FMEA in ICUs proved to be effective in proactively decreasing the risk of failures and corrected the control measures up to acceptable levels in all eight areas of function. Originality/value Using a prospective risk assessment approach, such as FMEA, could be beneficial in dealing with potential failures through proposing preventive actions in a proactive manner. The method could be used as a tool for continuous quality improvement in healthcare, since it identifies both systemic and human errors and offers practical advice to deal effectively with them.
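
    The risk priority number (RPN) step can be sketched in a few lines using the classical FMEA formulation RPN = severity x occurrence x detectability, each scored 1-10 (the study's matrix considered likelihood and severity); all scores below are illustrative placeholders, not the study's data.

    ```python
    # Toy FMEA ranking by risk priority number; scores are illustrative.
    failure_modes = {
        "endotracheal tube defect":     (9, 4, 5),   # (severity, occurrence, detectability)
        "wrong ET tube placement":      (10, 3, 4),
        "aspiration during suctioning": (8, 5, 3),
        "deep vein thrombosis":         (7, 3, 6),
    }
    ranked = sorted(failure_modes.items(),
                    key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2],
                    reverse=True)
    for name, (sev, occ, det) in ranked:
        print(f"{name:30s} RPN = {sev * occ * det}")
    ```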

  10. Location identifiers

    DOT National Transportation Integrated Search

    1997-01-30

    This order lists the location identifiers authorized by the Federal Aviation Administration, Department of the Navy, and Transport Canada. It lists United States airspace fixes and procedure codes. The order also includes guidelines for requesting id...

  11. [Analysis of the safety culture in a Cardiology Unit managed by processes].

    PubMed

    Raso-Raso, Rafael; Uris-Selles, Joaquín; Nolasco-Bonmatí, Andreu; Grau-Jornet, Guillermo; Revert-Gandia, Rosa; Jiménez-Carreño, Rebeca; Sánchez-Soriano, Ruth M; Chamorro-Fernández, Carlos I; Marco-Francés, Elvira; Albero-Martínez, José V

    2017-04-04

    Safety culture is one of the requirements for preventing the occurrence of adverse events. However, it has not been studied in the field of cardiology. The aim of this study is to evaluate the safety culture in a cardiology unit that has implemented and certified an integrated quality and risk management system for patient safety. A cross-sectional observational study was conducted in 2 consecutive years, with all staff completing the Spanish version of the questionnaire "Hospital Survey on Patient Safety Culture" of the Agency for Healthcare Research and Quality, with 42 items grouped into 12 dimensions. The percentages of positive responses in each dimension in 2014 and 2015 were compared, as well as against national and United States data, following the established rules. The overall assessment, out of a possible 5, was 4.5 in 2014 and 4.7 in 2015. Seven dimensions were identified as strengths. The worst rated were: staffing, management support, and teamwork between units. The comparison showed superiority in all dimensions compared to national data, and in 8 of them compared to American data. The safety culture in a cardiology unit with an integrated quality and risk management patient safety system is high, and higher than the national level in all dimensions and higher than the United States level in most of them. Copyright © 2017 Instituto Nacional de Cardiología Ignacio Chávez. Publicado por Masson Doyma México S.A. All rights reserved.

  12. Culling a clinical terminology: a systematic approach to identifying problematic content.

    PubMed Central

    Sable, J. H.; Nash, S. K.; Wang, A. Y.

    2001-01-01

    The College of American Pathologists and the National Health Service (NHS) in the United Kingdom are merging their respective clinical terminologies, SNOMED RT and Clinical Terms Version 3, into a new terminology, SNOMED CT. This requires mapping concept descriptions between the two existing terminologies. During the mapping process, many descriptions were identified as being potentially problematic. They require further review by the SNOMED editorial process before either (1) being incorporated into SNOMED CT or (2) being retired from active use. This article presents data on the concept descriptions that were identified as needing further review during the early phases of SNOMED CT development. Based on this work, we describe fourteen types of problematic terminology content. Identifying problematic terminology content can be approached in a systematic manner. PMID:11825253

  13. An Excel Workbook for Identifying Redox Processes in Ground Water

    USGS Publications Warehouse

    Jurgens, Bryant C.; McMahon, Peter B.; Chapelle, Francis H.; Eberts, Sandra M.

    2009-01-01

    The reduction/oxidation (redox) condition of ground water affects the concentration, transport, and fate of many anthropogenic and natural contaminants. The redox state of a ground-water sample is defined by the dominant type of reduction/oxidation reaction, or redox process, occurring in the sample, as inferred from water-quality data. However, because of the difficulty in defining and applying a systematic redox framework to samples from diverse hydrogeologic settings, many regional water-quality investigations do not attempt to determine the predominant redox process in ground water. Recently, McMahon and Chapelle (2008) devised a redox framework that was applied to a large number of samples from 15 principal aquifer systems in the United States to examine the effect of redox processes on water quality. This framework was expanded by Chapelle and others (in press) to use measured sulfide data to differentiate between iron(III)- and sulfate-reducing conditions. These investigations showed that a systematic approach to characterize redox conditions in ground water could be applied to datasets from diverse hydrogeologic settings using water-quality data routinely collected in regional water-quality investigations. This report describes the Microsoft Excel workbook, RedoxAssignment_McMahon&Chapelle.xls, that assigns the predominant redox process to samples using the framework created by McMahon and Chapelle (2008) and expanded by Chapelle and others (in press). Assignment of redox conditions is based on concentrations of dissolved oxygen (O2), nitrate (NO3-), manganese (Mn2+), iron (Fe2+), sulfate (SO42-), and sulfide (sum of dihydrogen sulfide [aqueous H2S], hydrogen sulfide [HS-], and sulfide [S2-]). The logical arguments for assigning the predominant redox process to each sample are performed by a program written in Microsoft Visual Basic for Applications (VBA). The program is called from buttons on the main worksheet. The number of samples that can be analyzed
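
    The workbook's core logic is a threshold cascade over the measured species. The sketch below captures the spirit of that assignment; the threshold values (mg/L) are illustrative stand-ins, and the published framework includes mixed categories and sulfide-based splits not reproduced here.

    ```python
    # Sketch of threshold-based redox assignment from water-quality data.
    # Thresholds are illustrative; the published framework is more detailed.
    def assign_redox(o2, no3, mn, fe, so4):
        if o2 >= 0.5:
            return "oxic: O2 reduction"
        if no3 >= 0.5:
            return "anoxic: NO3- reduction"
        if mn >= 0.05:
            return "anoxic: Mn(IV) reduction"
        if fe < 0.1:
            return "suboxic: indeterminate"
        if so4 >= 0.5:
            return "anoxic: Fe(III)/SO4 reduction"
        return "anoxic: methanogenesis"

    print(assign_redox(o2=0.2, no3=0.1, mn=0.02, fe=0.8, so4=20.0))
    ```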

  14. Real-time blood flow visualization using the graphics processing unit

    NASA Astrophysics Data System (ADS)

    Yang, Owen; Cuccia, David; Choi, Bernard

    2011-01-01

    Laser speckle imaging (LSI) is a technique in which coherent light incident on a surface produces a reflected speckle pattern that is related to the underlying movement of optical scatterers, such as red blood cells, indicating blood flow. Image-processing algorithms can be applied to produce speckle flow index (SFI) maps of relative blood flow. We present a novel algorithm that employs the NVIDIA Compute Unified Device Architecture (CUDA) platform to perform laser speckle image processing on the graphics processing unit. Software written in C was integrated with CUDA and integrated into a LabVIEW Virtual Instrument (VI) that is interfaced with a monochrome CCD camera able to acquire high-resolution raw speckle images at nearly 10 fps. With the CUDA code integrated into the LabVIEW VI, the processing and display of SFI images were performed also at ~10 fps. We present three video examples depicting real-time flow imaging during a reactive hyperemia maneuver, with fluid flow through an in vitro phantom, and a demonstration of real-time LSI during laser surgery of a port wine stain birthmark.
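
    The per-pixel computation that the CUDA kernel parallelizes can be sketched in NumPy: local speckle contrast K = sigma/mean over a sliding window, with the flow index taken here as 1/K^2, one common convention (the paper's exact SFI definition may differ).

    ```python
    # CPU sketch of the speckle flow index computation accelerated on the GPU.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def speckle_flow_index(raw, win=7, eps=1e-8):
        raw = raw.astype(np.float64)
        mean = uniform_filter(raw, size=win)
        mean_sq = uniform_filter(raw * raw, size=win)
        var = np.clip(mean_sq - mean * mean, 0.0, None)
        k = np.sqrt(var) / (mean + eps)   # local speckle contrast
        return 1.0 / (k * k + eps)        # faster flow blurs speckle -> lower K

    frame = np.random.rand(480, 640)      # stand-in for one raw speckle frame
    print(speckle_flow_index(frame).shape)
    ```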

  15. Real-time blood flow visualization using the graphics processing unit

    PubMed Central

    Yang, Owen; Cuccia, David; Choi, Bernard

    2011-01-01

    Laser speckle imaging (LSI) is a technique in which coherent light incident on a surface produces a reflected speckle pattern that is related to the underlying movement of optical scatterers, such as red blood cells, indicating blood flow. Image-processing algorithms can be applied to produce speckle flow index (SFI) maps of relative blood flow. We present a novel algorithm that employs the NVIDIA Compute Unified Device Architecture (CUDA) platform to perform laser speckle image processing on the graphics processing unit. Software written in C was integrated with CUDA and integrated into a LabVIEW Virtual Instrument (VI) that is interfaced with a monochrome CCD camera able to acquire high-resolution raw speckle images at nearly 10 fps. With the CUDA code integrated into the LabVIEW VI, the processing and display of SFI images were performed also at ∼10 fps. We present three video examples depicting real-time flow imaging during a reactive hyperemia maneuver, with fluid flow through an in vitro phantom, and a demonstration of real-time LSI during laser surgery of a port wine stain birthmark. PMID:21280915

  16. Orthographic units in the absence of visual processing: Evidence from sublexical structure in braille.

    PubMed

    Fischer-Baum, Simon; Englebretson, Robert

    2016-08-01

    Reading relies on the recognition of units larger than single letters and smaller than whole words. Previous research has linked sublexical structures in reading to properties of the visual system, specifically on the parallel processing of letters that the visual system enables. But whether the visual system is essential for this to happen, or whether the recognition of sublexical structures may emerge by other means, is an open question. To address this question, we investigate braille, a writing system that relies exclusively on the tactile rather than the visual modality. We provide experimental evidence demonstrating that adult readers of (English) braille are sensitive to sublexical units. Contrary to prior assumptions in the braille research literature, we find strong evidence that braille readers do indeed access sublexical structure, namely the processing of multi-cell contractions as single orthographic units and the recognition of morphemes within morphologically-complex words. Therefore, we conclude that the recognition of sublexical structure is not exclusively tied to the visual system. However, our findings also suggest that there are aspects of morphological processing on which braille and print readers differ, and that these differences may, crucially, be related to reading using the tactile rather than the visual sensory modality. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Multidisciplinary Simulation Acceleration using Multiple Shared-Memory Graphical Processing Units

    NASA Astrophysics Data System (ADS)

    Kemal, Jonathan Yashar

    For purposes of optimizing and analyzing turbomachinery and other designs, the unsteady Favre-averaged flow-field differential equations for an ideal compressible gas can be solved in conjunction with the heat conduction equation. We solve all equations using the finite-volume multiple-grid numerical technique, with the dual time-step scheme used for unsteady simulations. Our numerical solver code targets CUDA-capable Graphical Processing Units (GPUs) produced by NVIDIA. Making use of MPI, our solver can run across networked compute nodes, where each MPI process can use either a GPU or a Central Processing Unit (CPU) core for primary solver calculations. We use NVIDIA Tesla C2050/C2070 GPUs based on the Fermi architecture, and compare our resulting performance against Intel Xeon X5690 CPUs. Solver routines converted to CUDA typically run about 10 times faster on a GPU for sufficiently dense computational grids. We used a conjugate cylinder computational grid and ran a turbulent steady flow simulation using 4 increasingly dense computational grids. Our densest computational grid is divided into 13 blocks each containing 1033x1033 grid points, for a total of 13.87 million grid points or 1.07 million grid points per domain block. To obtain overall speedups, we compare the execution time of the solver's iteration loop, including all resource-intensive GPU-related memory copies. Comparing the performance of 8 GPUs to that of 8 CPUs, we obtain an overall speedup of about 6.0 when using our densest computational grid. This amounts to an 8-GPU simulation running about 39.5 times faster than a single-CPU simulation.

  18. Model of areas for identifying risks influencing the compliance of technological processes and products

    NASA Astrophysics Data System (ADS)

    Misztal, A.; Belu, N.

    2016-08-01

    The operation of every company is associated with the risk of interference with the proper performance of its fundamental processes. This risk is associated with various internal areas of the company, as well as the environment in which it operates. From the point of view of ensuring the compliance of specific technological processes and, consequently, the conformity of products with requirements, it is important to identify these threats and eliminate or reduce the risk of their occurrence. The purpose of this article is to present a model of the areas for identifying risks affecting the compliance of processes and products, which is based on multiregional targeted monitoring of typical places of interference and risk management methods. The model is based on the verification of risk analyses carried out in small and medium-sized manufacturing companies in various industries.

  19. CORROSION PROCESS IN REINFORCED CONCRETE IDENTIFIED BY ACOUSTIC EMISSION

    NASA Astrophysics Data System (ADS)

    Kawasaki, Yuma; Kitaura, Misuzu; Tomoda, Yuichi; Ohtsu, Masayasu

    Deterioration of Reinforced Concrete (RC) due to salt attack is known as one of the serious problems. Thus, development of non-destructive evaluation (NDE) techniques is important to assess the corrosion process. Reinforcement in concrete normally does not corrode because of a passive film on the surface of the reinforcement. When the chloride concentration at the reinforcement exceeds the threshold level, the passive film is destroyed. Thus maintenance is desirable at an early stage. In this study, to identify the onset of corrosion and the nucleation of corrosion-induced cracking in concrete due to expansion of corrosion products, continuous acoustic emission (AE) monitoring is applied. Accelerated corrosion and cyclic wet and dry tests are performed in a laboratory. The SiGMA (Simplified Green's functions for Moment tensor Analysis) procedure is applied to AE waveforms to clarify the source kinematics of micro-cracks: their locations, types, and orientations. Results show that the onset of corrosion and the nucleation of corrosion-induced cracking in concrete are successfully identified. Additionally, cross-sections inside the reinforcement are observed by a scanning electron microscope (SEM). From these results, great promise for AE techniques to monitor salt damage at an early stage in RC structures is demonstrated.

  20. Modeling of yield and environmental impact categories in tea processing units based on artificial neural networks.

    PubMed

    Khanali, Majid; Mobli, Hossein; Hosseinzadeh-Bandbafha, Homa

    2017-12-01

    In this study, an artificial neural network (ANN) model was developed for predicting the yield and life cycle environmental impacts based on energy inputs required in the processing of black tea, green tea, and oolong tea in Guilan province of Iran. A life cycle assessment (LCA) approach was used to investigate the environmental impact categories of processed tea based on the cradle-to-gate approach, i.e., from production of input materials using raw materials to the gate of the tea processing units, i.e., packaged tea. Thus, all the tea processing operations such as withering, rolling, fermentation, drying, and packaging were considered in the analysis. The initial data were obtained from tea processing units, while the required data about the background system were extracted from the EcoInvent 2.2 database. LCA results indicated that diesel fuel and the corrugated paper box used in the drying and packaging operations, respectively, were the main hotspots. The black tea processing unit caused the highest pollution among the three processing units. Three feed-forward back-propagation ANN models based on the Levenberg-Marquardt training algorithm, with two hidden layers accompanied by sigmoid activation functions and a linear transfer function in the output layer, were applied for the three types of processed tea. The neural networks were developed based on energy equivalents of eight different input parameters (energy equivalents of fresh tea leaves, human labor, diesel fuel, electricity, adhesive, carton, corrugated paper box, and transportation) and 11 output parameters (yield, global warming, abiotic depletion, acidification, eutrophication, ozone layer depletion, human toxicity, freshwater aquatic ecotoxicity, marine aquatic ecotoxicity, terrestrial ecotoxicity, and photochemical oxidation). The results showed that the developed ANN models, with R² values in the range of 0.878 to 0.990, had excellent performance in predicting all the output variables based on the inputs. Energy consumption for
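
    The described network shape (8 energy inputs, two sigmoid hidden layers, 11 linear outputs) can be sketched with scikit-learn. Note that scikit-learn has no Levenberg-Marquardt trainer, so 'adam' stands in here, and the data below are random placeholders rather than the survey data.

    ```python
    # Sketch of an 8-input, 11-output feed-forward network with two logistic
    # (sigmoid) hidden layers and a linear output layer. Placeholder data;
    # 'adam' replaces the paper's Levenberg-Marquardt training algorithm.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.random((120, 8))    # energy equivalents of the eight inputs
    Y = rng.random((120, 11))   # yield plus the ten impact categories

    Xs = StandardScaler().fit_transform(X)
    model = MLPRegressor(hidden_layer_sizes=(12, 12), activation="logistic",
                         solver="adam", max_iter=5000, random_state=0)
    model.fit(Xs, Y)            # MLPRegressor handles multi-output targets
    print(model.predict(Xs[:2]).shape)  # -> (2, 11)
    ```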

  1. The Use of the Nursing Process in Spain as Compared to the United States and Canada.

    PubMed

    Huitzi-Egilegor, Joseba Xabier; Elorza-Puyadena, Maria Isabel; Asurabarrena-Iraola, Carmen

    2017-05-18

    To analyze the development of the nursing process in Spain and compare it with its development in the United States and Canada. This is a narrative review. The teaching of the nursing process in nursing schools started in Spain in 1977, and the process began to be used in professional practice in the 1990s. The development, the difficulties, the nursing models used, and the form of application are discussed. Developments in the nursing process that occurred in the United States and Canada reached Spain about 15-20 years later and, today, the nursing process is a reality there. Cross-sectional studies are needed to determine the changes in the development of the nursing process in Spain. © 2017 NANDA International, Inc.

  2. How Does the United States Rank According to the World Breastfeeding Trends Initiative?

    PubMed

    Cadwell, Karin; Turner-Maffei, Cynthia; Blair, Anna; Brimdyr, Kajsa; OʼConnor, Barbara

    The World Breastfeeding Trends Initiative is an assessment process designed to facilitate an ongoing national appraisal of progress toward the goals of the United Nations Children's Fund (UNICEF)/World Health Organization (WHO) Global Strategy for Infant and Young Child Feeding. More than 80 countries have completed this national assessment, including the United States of America. This article describes the process undertaken by the US World Breastfeeding Trends Initiative team, the findings of the expert panel related to infant and young child feeding policies, programs, and practices and the ranking of the United States compared with the 83 other participating nations. Identified strengths of the United States include data collection and monitoring, especially by the Centers for Disease Control and Prevention, the US Baby-Friendly Hospital Initiative, and the United States Breastfeeding Committee. The absence of a national infant feeding policy, insufficient maternity protection, and lack of preparation for infant and young children feeding in emergencies are key targets identified by the assessment requiring concerted national effort.

  3. Hydrologic landscape units and adaptive management of intermountain wetlands

    USGS Publications Warehouse

    Custer, Stephen G.; Sojda, R.S.

    2006-01-01

    Adaptive management is often proposed to assist in the management of national wildlife refuges and allows the exploration of alternatives as well as the addition of new knowledge as it becomes available. The hydrologic landscape unit can be a good foundation for such efforts. Red Rock Lakes National Wildlife Refuge (NWR) is in an intermountain basin dominated by vertical tectonics in the Northern Rocky Mountains. A geographic information system was used to define the boundaries for the hydrologic landscape units there. Units identified include alluvial fan, interfan, stream alluvium, and basin flat. Management alternatives can be informed by examination of processes that occur on the units. For example, an ancient alluvial fan unit related to Red Rock Creek appears to be isolated from stream flow today, with recharge dominated by precipitation and bedrock springs, while other alluvial fan units in the area have shallow ground water recharged from mountain streams and precipitation. The scale of hydrologic processes in interfan units differs from that in alluvial fan hydrologic landscape units. These differences are important when the refuge is evaluating habitat management activities. Hydrologic landscape units provide scientific underpinnings for the refuge's comprehensive planning process. New geologic, hydrologic, and biologic knowledge can be integrated into the hydrologic landscape unit definition and improve adaptive management.

  4. Matrix decomposition graphics processing unit solver for Poisson image editing

    NASA Astrophysics Data System (ADS)

    Lei, Zhao; Wei, Li

    2012-10-01

    In recent years, gradient-domain methods have been widely discussed in the image processing field, including seamless cloning and image stitching. These algorithms are commonly carried out by solving a large sparse linear system: the Poisson equation. However, solving the Poisson equation is a computationally and memory intensive task, which makes it unsuitable for real-time image editing. A new matrix decomposition graphics processing unit (GPU) solver (MDGS) is proposed to settle the problem. A matrix decomposition method is used to distribute the work among GPU threads, so that MDGS will take full advantage of the computing power of current GPUs. Additionally, MDGS is a hybrid solver (combining both direct and iterative techniques) and has a two-level architecture. These enable MDGS to generate solutions identical to those of the common Poisson methods and to achieve a high convergence rate in most cases. This approach is advantageous in terms of parallelizability, enabling real-time image processing, low memory consumption, and extensive applications.
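
    The linear system being solved can be illustrated with a plain CPU Jacobi iteration for Poisson seamless cloning; MDGS's matrix decomposition and GPU scheduling are not reproduced here, only the equation it targets.

    ```python
    # Jacobi iteration for Poisson image editing: inside the mask, solve
    # Laplace(f) = Laplace(g) with the target image fixed on the boundary.
    import numpy as np

    def poisson_clone(target, source, mask, iters=2000):
        out = target.astype(np.float64).copy()
        src = source.astype(np.float64)
        # Discrete Laplacian of the source is the guidance divergence.
        div = (np.roll(src, 1, 0) + np.roll(src, -1, 0) +
               np.roll(src, 1, 1) + np.roll(src, -1, 1) - 4.0 * src)
        for _ in range(iters):
            nb = (np.roll(out, 1, 0) + np.roll(out, -1, 0) +
                  np.roll(out, 1, 1) + np.roll(out, -1, 1))
            out[mask] = (nb[mask] - div[mask]) / 4.0
        return out

    tgt = np.zeros((64, 64))
    src = np.ones((64, 64))                  # constant source: zero Laplacian
    m = np.zeros((64, 64), dtype=bool)
    m[20:40, 20:40] = True                   # region to clone into the target
    # A constant source relaxes to the target's boundary values (0 here).
    print(poisson_clone(tgt, src, m)[30, 30].round(4))
    ```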

  5. System design of ELITE power processing unit

    NASA Astrophysics Data System (ADS)

    Caldwell, David J.

    The Electric Propulsion Insertion Transfer Experiment (ELITE) is a space mission planned for the mid 1990s in which technological readiness will be demonstrated for electric orbit transfer vehicles (EOTVs). A system-level design of the power processing unit (PPU), which conditions solar array power for the arcjet thruster, was performed to optimize performance with respect to reliability, power output, efficiency, specific mass, and radiation hardness. The PPU system consists of multiphased parallel switchmode converters, configured as current sources, connected directly from the array to the thruster. The PPU control system includes a solar array peak power tracker (PPT) to maximize the power delivered to the thruster regardless of variations in array characteristics. A stability analysis has been performed to verify that the system is stable despite the nonlinear negative impedance of the PPU input and the arcjet thruster. Performance specifications are given to provide the required spacecraft capability with existing technology.

  6. Water Use in the United States Energy System: A National Assessment and Unit Process Inventory of Water Consumption and Withdrawals.

    PubMed

    Grubert, Emily; Sanders, Kelly T

    2018-06-05

    The United States (US) energy system is a large water user, but the nature of that use is poorly understood. To support resource comanagement and fill this noted gap in the literature, this work presents detailed estimates for US-based water consumption and withdrawals for the US energy system as of 2014, including both intensity values and the first known estimate of total water consumption and withdrawal by the US energy system. We address 126 unit processes, many of which are new additions to the literature, differentiated among 17 fuel cycles, five life cycle stages, three water source categories, and four levels of water quality. Overall coverage is about 99% of commercially traded US primary energy consumption with detailed energy flows by unit process. Energy-related water consumption, or water removed from its source and not directly returned, accounts for about 10% of both total and freshwater US water consumption. Major consumers include biofuels (via irrigation), oil (via deep well injection, usually of nonfreshwater), and hydropower (via evaporation and seepage). The US energy system also accounts for about 40% of both total and freshwater US water withdrawals, i.e., water removed from its source regardless of fate. About 70% of withdrawals are associated with the once-through cooling systems of approximately 300 steam cycle power plants that produce about 25% of US electricity.

  7. Identifying influential directors in the United States corporate governance network.

    PubMed

    Huang, Xuqing; Vodenska, Irena; Wang, Fengzhong; Havlin, Shlomo; Stanley, H Eugene

    2011-10-01

    The influence of directors has been one of the most engaging topics recently, but surprisingly little research has been done to quantitatively evaluate the influence and power of directors. We analyze the structure of the US corporate governance network for the 11-year period 1996-2006 based on director data from the Investor Responsibility Research Center director database, and we develop a centrality measure named the influence factor to estimate the influence of directors quantitatively. The US corporate governance network is a network of directors with nodes representing directors and links between two directors representing their service on common company boards. We assume that information flows in the network through information-sharing processes among linked directors. The influence factor assigned to a director is based on the level of information that a director obtains from the entire network. We find that, contrary to commonly accepted belief that directors of large companies, measured by market capitalization, are the most powerful, in some instances, the directors who are influential do not necessarily serve on boards of large companies. By applying our influence factor method to identify the influential people contained in the lists created by popular magazines such as Fortune, Networking World, and Treasury and Risk Management, we find that the influence factor method is consistently either the best or one of the two best methods in identifying powerful people compared to other general centrality measures that are used to denote the significance of a node in complex network theory.
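
    The co-board network construction is easy to sketch; because the influence factor's exact definition is not reproduced in this record, a PageRank-style information-flow score stands in below, and the toy boards are invented.

    ```python
    # Co-board director network with a random-walk "information flow" score
    # standing in for the paper's influence factor. Boards are invented.
    import numpy as np
    from itertools import combinations

    boards = [["ann", "bob", "eve"], ["bob", "carl"], ["eve", "dan", "carl"]]
    names = sorted({d for b in boards for d in b})
    idx = {n: i for i, n in enumerate(names)}

    A = np.zeros((len(names), len(names)))
    for board in boards:                  # link directors sharing a board
        for a, b in combinations(board, 2):
            A[idx[a], idx[b]] = A[idx[b], idx[a]] = 1.0

    P = A / A.sum(axis=1, keepdims=True)  # information passes evenly to neighbours
    score = np.full(len(names), 1.0 / len(names))
    for _ in range(100):                  # power iteration to a steady state
        score = 0.85 * score @ P + 0.15 / len(names)
    print(dict(zip(names, np.round(score, 3))))
    ```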

  8. Identifying influential directors in the United States corporate governance network

    NASA Astrophysics Data System (ADS)

    Huang, Xuqing; Vodenska, Irena; Wang, Fengzhong; Havlin, Shlomo; Stanley, H. Eugene

    2011-10-01

    The influence of directors has been one of the most engaging topics recently, but surprisingly little research has been done to quantitatively evaluate the influence and power of directors. We analyze the structure of the US corporate governance network for the 11-year period 1996-2006 based on director data from the Investor Responsibility Research Center director database, and we develop a centrality measure named the influence factor to estimate the influence of directors quantitatively. The US corporate governance network is a network of directors with nodes representing directors and links between two directors representing their service on common company boards. We assume that information flows in the network through information-sharing processes among linked directors. The influence factor assigned to a director is based on the level of information that a director obtains from the entire network. We find that, contrary to commonly accepted belief that directors of large companies, measured by market capitalization, are the most powerful, in some instances, the directors who are influential do not necessarily serve on boards of large companies. By applying our influence factor method to identify the influential people contained in the lists created by popular magazines such as Fortune, Networking World, and Treasury and Risk Management, we find that the influence factor method is consistently either the best or one of the two best methods in identifying powerful people compared to other general centrality measures that are used to denote the significance of a node in complex network theory.

  9. The impact of a lean rounding process in a pediatric intensive care unit.

    PubMed

    Vats, Atul; Goin, Kristin H; Villarreal, Monica C; Yilmaz, Tuba; Fortenberry, James D; Keskinocak, Pinar

    2012-02-01

    Poor workflow associated with physician rounding can produce inefficiencies that decrease time for essential activities, delay clinical decisions, and reduce staff and patient satisfaction. Workflow and provider resources were not optimized when a pediatric intensive care unit increased by 22,000 square feet (to 33,000) and by nine beds (to 30). Lean methods (focusing on essential processes) and scenario analysis were used to develop and implement a patient-centric standardized rounding process, which we hypothesize would lead to improved rounding efficiency, decrease required physician resources, improve satisfaction, and enhance throughput. Human factors techniques and statistical tools were used to collect and analyze observational data for 11 rounding events before and 12 rounding events after process redesign. Actions included: 1) recording rounding events, times, and patient interactions and classifying them as essential, nonessential, or nonvalue added; 2) comparing rounding duration and time per patient to determine the impact on efficiency; 3) analyzing discharge orders for timeliness; 4) conducting staff surveys to assess improvements in communication and care coordination; and 5) analyzing customer satisfaction data to evaluate impact on patient experience. Thirty-bed pediatric intensive care unit in a children's hospital with academic affiliation. Eight attending pediatric intensivists and their physician rounding teams. Eight attending physician-led teams were observed for 11 rounding events before and 12 rounding events after implementation of a standardized lean rounding process focusing on essential processes. Total rounding time decreased significantly (157 ± 35 mins before vs. 121 ± 20 mins after), through a reduction in time spent on nonessential (53 ± 30 vs. 9 ± 6 mins) activities. The previous process required three attending physicians for an average of 157 mins (7.55 attending physician man-hours), while the new process required two

  10. 20 CFR 1010.300 - What processes are to be implemented to identify covered persons?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 4 2014-04-01 2014-04-01 false What processes are to be implemented to identify covered persons? 1010.300 Section 1010.300 Employees' Benefits OFFICE OF THE ASSISTANT SECRETARY... FOR COVERED PERSONS Applying Priority of Service § 1010.300 What processes are to be implemented to...

  11. Medical review practices for driver licensing volume 3: guidelines and processes in the United States.

    DOT National Transportation Integrated Search

    2017-04-01

    This is the third of three reports examining driver medical review practices in the United States and how they fulfill the basic functions of identifying, assessing, and rendering licensing decisions on medically or functionally at-risk drivers. ...

  12. A Method for Identifying Contours in Processing Digital Images from Computer Tomograph

    NASA Astrophysics Data System (ADS)

    Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela

    2011-09-01

    The first step in the digital processing of two-dimensional computed tomography images is to identify the contours of component elements. This paper presents the joint work of specialists in medicine and applied mathematics on elaborating new algorithms and methods in medical 2D and 3D imagery.

  13. Identifying differences in biased affective information processing in major depression.

    PubMed

    Gollan, Jackie K; Pane, Heather T; McCloskey, Michael S; Coccaro, Emil F

    2008-05-30

    This study investigates the extent to which participants with major depression differ from healthy comparison participants in the irregularities in affective information processing, characterized by deficits in facial expression recognition, intensity categorization, and reaction time to identifying emotionally salient and neutral information. Data on diagnoses, symptom severity, and affective information processing using a facial recognition task were collected from 66 participants, male and female between ages 18 and 54 years, grouped by major depressive disorder (N=37) or healthy non-psychiatric (N=29) status. Findings from MANCOVAs revealed that major depression was associated with a significantly longer reaction time to sad facial expressions compared with healthy status. Also, depressed participants demonstrated a negative bias towards interpreting neutral facial expressions as sad significantly more often than healthy participants. In turn, healthy participants interpreted neutral faces as happy significantly more often than depressed participants. No group differences were observed for facial expression recognition and intensity categorization. The observed effects suggest that depression has significant effects on the perception of the intensity of negative affective stimuli, delayed speed of processing sad affective information, and biases towards interpreting neutral faces as sad.

  14. Low cost solar array project: Experimental process system development unit for producing semiconductor-grade silicon using silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The design, fabrication, and installation of an experimental process system development unit (EPSDU) were analyzed. Supporting research and development were performed to provide an information data base usable for the EPSDU and for the technological design and economic analysis of potential scale-up of the process. Iterative economic analyses were conducted of the estimated product cost for the production of semiconductor-grade silicon in a facility capable of producing 1000 MT/yr.

  15. Miniaturized Power Processing Unit Study: A Cubesat Electric Propulsion Technology Enabler Project

    NASA Technical Reports Server (NTRS)

    Ghassemieh, Shakib M.

    2014-01-01

    This study evaluates High Voltage Power Processing Unit (PPU) technology and driving requirements necessary to enable the Microfluidic Electric Propulsion technology research and development by NASA and university partners. This study provides an overview of the state of the art PPU technology with recommendations for technology demonstration projects and missions for NASA to pursue.

  16. Parallelized CCHE2D flow model with CUDA Fortran on Graphics Processing Units

    USDA-ARS?s Scientific Manuscript database

    This paper presents the CCHE2D implicit flow model parallelized using CUDA Fortran programming technique on Graphics Processing Units (GPUs). A parallelized implicit Alternating Direction Implicit (ADI) solver using Parallel Cyclic Reduction (PCR) algorithm on GPU is developed and tested. This solve...
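
    For readers unfamiliar with the Parallel Cyclic Reduction (PCR) algorithm named above, the following minimal NumPy sketch solves a tridiagonal system with PCR; every step is a whole-array operation, which is what makes the method map naturally onto one GPU thread per row. This is a generic illustration, not code from the CCHE2D model; the toy system is invented.

```python
import numpy as np

def pcr_solve(a, b, c, d):
    """Solve a tridiagonal system by Parallel Cyclic Reduction.

    a, b, c are the sub-, main- and super-diagonals (a[0], c[-1] unused).
    """
    a, b, c, d = (np.asarray(v, dtype=float).copy() for v in (a, b, c, d))
    n = len(b)
    a[0], c[-1] = 0.0, 0.0
    i = np.arange(n)
    stride = 1
    while stride < n:
        im = np.clip(i - stride, 0, n - 1)     # lower neighbour (clamped)
        ip = np.clip(i + stride, 0, n - 1)     # upper neighbour (clamped)
        alpha = np.where(i - stride >= 0, -a / b[im], 0.0)
        gamma = np.where(i + stride < n, -c / b[ip], 0.0)
        b_new = b + alpha * c[im] + gamma * a[ip]
        d_new = d + alpha * d[im] + gamma * d[ip]
        a, c = alpha * a[im], gamma * c[ip]    # couplings move out by 2*stride
        b, d = b_new, d_new
        stride *= 2
    return d / b                               # each equation is now b_i x_i = d_i

# Check against a dense solve on a small diagonally dominant system
n = 8
a, b, c = np.full(n, -1.0), np.full(n, 2.1), np.full(n, -1.0)
d = np.arange(1.0, n + 1.0)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(pcr_solve(a, b, c, d), np.linalg.solve(A, d)))
```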

  17. Awareness of technology-induced errors and processes for identifying and preventing such errors.

    PubMed

    Bellwood, Paule; Borycki, Elizabeth M; Kushniruk, Andre W

    2015-01-01

    There is a need to determine if organizations working with health information technology are aware of technology-induced errors and how they are addressing and preventing them. The purpose of this study was to: a) determine the degree of technology-induced error awareness in various Canadian healthcare organizations, and b) identify those processes and procedures that are currently in place to help address, manage, and prevent technology-induced errors. We identified a lack of technology-induced error awareness among participants. Participants identified there was a lack of well-defined procedures in place for reporting technology-induced errors, addressing them when they arise, and preventing them.

  18. Graphics Processing Unit Acceleration of Gyrokinetic Turbulence Simulations

    NASA Astrophysics Data System (ADS)

    Hause, Benjamin; Parker, Scott

    2012-10-01

    We find a substantial increase in on-node performance using Graphics Processing Unit (GPU) acceleration in gyrokinetic delta-f particle-in-cell simulation. Optimization is performed on a two-dimensional slab gyrokinetic particle simulation using the Portland Group Fortran compiler with the GPU accelerator compiler directives. We have implemented the GPU acceleration on a Core i7 gaming PC with an NVIDIA GTX 580 GPU. We find comparable, or better, acceleration relative to the NERSC DIRAC cluster with the NVIDIA Tesla C2050 computing processor. The Tesla C2050 is about 2.6 times more expensive than the GTX 580 gaming GPU. Optimization strategies and comparisons between DIRAC and the gaming PC will be presented. We will also discuss progress on optimizing the comprehensive three-dimensional, general-geometry GEM code.

  19. General purpose graphic processing unit implementation of adaptive pulse compression algorithms

    NASA Astrophysics Data System (ADS)

    Cai, Jingxiao; Zhang, Yan

    2017-07-01

    This study introduces a practical approach to implement real-time signal processing algorithms for general surveillance radar based on NVIDIA graphical processing units (GPUs). The pulse compression algorithms are implemented using compute unified device architecture (CUDA) libraries such as CUDA basic linear algebra subroutines and CUDA fast Fourier transform library, which are adopted from open source libraries and optimized for the NVIDIA GPUs. For more advanced, adaptive processing algorithms such as adaptive pulse compression, customized kernel optimization is needed and investigated. A statistical optimization approach is developed for this purpose without needing much knowledge of the physical configurations of the kernels. It was found that the kernel optimization approach can significantly improve the performance. Benchmark performance is compared with the CPU performance in terms of processing accelerations. The proposed implementation framework can be used in various radar systems including ground-based phased array radar, airborne sense and avoid radar, and aerospace surveillance radar.
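
    As background for the abstract above, basic (non-adaptive) pulse compression is a frequency-domain matched filter: the same FFT/multiply/IFFT pattern that the paper maps onto the cuFFT library, with adaptive variants replacing the fixed filter by a data-dependent one. A minimal NumPy sketch, with an invented linear-FM chirp and noise level:

```python
import numpy as np

def pulse_compress(rx, tx):
    """Frequency-domain matched filter: correlate the received signal with
    the transmitted pulse via FFT (zero-padded to linear correlation)."""
    n = len(rx) + len(tx) - 1
    return np.fft.ifft(np.fft.fft(rx, n) * np.conj(np.fft.fft(tx, n)))

# Toy example: detect a delayed linear-FM chirp buried in noise
np.random.seed(0)
fs, T, B = 1e6, 100e-6, 200e3                  # sample rate, width, bandwidth
t = np.arange(int(fs * T)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t**2)    # linear FM pulse
rx = np.zeros(2048, dtype=complex)
rx[500:500 + len(chirp)] += chirp              # target echo at sample 500
rx += 0.1 * (np.random.randn(2048) + 1j * np.random.randn(2048))
print("compressed peak at sample", np.argmax(np.abs(pulse_compress(rx, chirp))))
```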

  20. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Restructuring research objectives from a technical readiness demonstration program to an investigation of high risk, high payoff activities associated with producing photovoltaic modules using non-CZ sheet material is reported. Deletion of the module frame in favor of a frameless design, and modification in cell series parallel electrical interconnect configuration are reviewed. A baseline process sequence was identified for the fabrication of modules using the selected dendritic web sheet material, and economic evaluations of the sequence were completed.

  1. Congestion estimation technique in the optical network unit registration process.

    PubMed

    Kim, Geunyong; Yoo, Hark; Lee, Dongsoo; Kim, Youngsun; Lim, Hyuk

    2016-07-01

    We present a congestion estimation technique (CET) to estimate the optical network unit (ONU) registration success ratio for the ONU registration process in passive optical networks. An optical line terminal (OLT) estimates the number of collided ONUs via the proposed scheme during the serial number state. The OLT can thus obtain the congestion level among the ONUs to be registered, and this information may be exploited to change the size of the quiet window to decrease the collision probability. We verified the efficiency of the proposed method through simulation and experimental results.
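
    The abstract gives no implementation detail; as a loose illustration of the idea only, the toy simulation below models ONUs answering in random slots of a quiet window, counts collisions as a congestion signal, and enlarges the window when congestion is high. All parameters and the doubling rule are invented for the example and are not the paper's CET.

```python
import random

def sn_round(n_onus, window_slots):
    """One serial-number round: each unregistered ONU answers in a random
    slot; slots with more than one response collide."""
    slots = {}
    for onu in range(n_onus):
        slots.setdefault(random.randrange(window_slots), []).append(onu)
    registered = sum(1 for v in slots.values() if len(v) == 1)
    collided = sum(len(v) for v in slots.values() if len(v) > 1)
    return registered, collided

random.seed(1)
pending, window = 64, 16
while pending > 0:
    ok, collided = sn_round(pending, window)
    pending -= ok
    print(f"window={window:4d} slots: +{ok:2d} registered, {collided:2d} collided")
    if collided > pending // 2:     # congestion high -> enlarge quiet window
        window *= 2
```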

  2. Submarine hydrothermal processes, mirroring the geotectonic evolution of the NE Hungarian Jurassic Szarvaskő Unit

    NASA Astrophysics Data System (ADS)

    Kiss, Gabriella B.; Zagyva, Tamás; Pásztor, Domokos; Zaccarini, Federica

    2018-05-01

    The Jurassic pillow basalt of the NE Hungarian Szarvaskő Unit is part of an incomplete ophiolitic sequence, formed in a back-arc or marginal basin of Neotethyan origin. Different, often superimposed hydrothermal processes were studied, aiming to characterise them and to discover their relationship with the geotectonic evolution of the region. Closely packed pillow, pillow-fragmented hyaloclastite breccia and transition to peperitic facies of a submarine lava flow were observed. The rocks underwent primary and cooling-related local submarine hydrothermal processes immediately after eruption at a ridge setting. Physico-chemical data for this process and volcanic facies analyses revealed distal formation in the submarine lava flow. A superimposed, more extensive fluid circulation system resulted in intense alteration of the basalt and in the formation of mostly sulphide-filled cavities. This lower-temperature but larger-scale process was similar to VMS systems and was related to the ridge setting. As a peculiarity of the Szarvaskő Unit, basalt may locally be completely altered to a grossular-bearing mineral assemblage formed by rodingitisation s.l. This unique process observed in basalt happened in a ridge setting/during spreading, in the absence of known large ultramafic blocks. Epigenetic veins also formed during Alpine regional metamorphism, related to subduction/obduction. The observed hydrothermal minerals represent different steps in the geotectonic evolution of the Szarvaskő Unit, from ridge setting and spreading to subduction/obduction. Hence, studying the superimposed alteration mineral assemblages can be a useful tool for reconstructing the tectonic history of an ophiolitic complex. Though the mineral parageneses found are often similar, careful study can help in distinguishing the processes and characterising their P, T, and X conditions.

  3. Reducing time-to-unit among patients referred to an outpatient stroke assessment unit with a novel triage process: a prospective cohort study.

    PubMed

    Bibok, Maximilian B; Votova, Kristine; Balshaw, Robert F; Lesperance, Mary L; Croteau, Nicole S; Trivedi, Anurag; Morrison, Jaclyn; Sedgwick, Colin; Penn, Andrew M

    2018-02-27

    To evaluate the performance of a novel triage system for Transient Ischemic Attack (TIA) units, built upon an existing clinical prediction rule (CPR), in reducing time to unit arrival, relative to the time of symptom onset, for true TIA and minor stroke patients. Differentiating between true and false TIA/minor stroke cases (mimics) is necessary for effective triage, as medical intervention for true TIA/minor stroke is time-sensitive and TIA unit spots are a finite resource. Prospective cohort study design utilizing patient referral data and TIA unit arrival times from a regional fast-track TIA unit on Vancouver Island, Canada, accepting referrals from emergency departments (ED) and general practice (GP). The historical referral cohort (N = 2942) from May 2013-Oct 2014 was triaged using the ABCD2 score; the prospective referral cohort (N = 2929) from Nov 2014-Apr 2016 was triaged using the novel system. A retrospective survival curve analysis, censored at 28 days to unit arrival, was used to compare days to unit arrival from event date between cohort patients matched by low (0-3), moderate (4-5) and high (6-7) ABCD2 scores. Survival curve analysis indicated that, using the novel triage system, prospectively referred TIA/minor stroke patients with low and moderate ABCD2 scores arrived at the unit 2 and 1 days earlier than matched historical patients, respectively. The novel triage process is associated with a reduction in time to unit arrival from symptom onset for referred true TIA/minor stroke patients with low and moderate ABCD2 scores.
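
    For reference, the score bands used above (low 0-3, moderate 4-5, high 6-7) come from the standard ABCD2 rubric, which can be computed as follows; the example patient is invented.

```python
def abcd2(age, sbp, dbp, unilateral_weakness, speech_disturbance,
          duration_min, diabetes):
    """Standard ABCD2 score (0-7): age >= 60 (1); BP >= 140/90 mmHg (1);
    unilateral weakness (2) or speech disturbance without weakness (1);
    duration >= 60 min (2) or 10-59 min (1); diabetes (1)."""
    score = (1 if age >= 60 else 0) + (1 if sbp >= 140 or dbp >= 90 else 0)
    score += 2 if unilateral_weakness else (1 if speech_disturbance else 0)
    score += 2 if duration_min >= 60 else (1 if duration_min >= 10 else 0)
    return score + (1 if diabetes else 0)

def triage_band(score):
    """Bands used in the study: low 0-3, moderate 4-5, high 6-7."""
    return "low" if score <= 3 else "moderate" if score <= 5 else "high"

s = abcd2(age=72, sbp=150, dbp=85, unilateral_weakness=True,
          speech_disturbance=False, duration_min=45, diabetes=False)
print(s, triage_band(s))    # -> 5 moderate
```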

  4. Using Systems Theory to Examine Patient and Nurse Structures, Processes, and Outcomes in Centralized and Decentralized Units.

    PubMed

    Real, Kevin; Fay, Lindsey; Isaacs, Kathy; Carll-White, Allison; Schadler, Aric

    2018-01-01

    This study utilizes systems theory to understand how changes to physical design structures impact communication processes and patient and staff design-related outcomes. Many scholars and researchers have noted the importance of communication and teamwork for patient care quality. Few studies have examined changes to nursing station design within a systems theory framework. This study employed a multimethod, before-and-after, quasi-experimental research design. Nurses completed surveys in centralized units and later in decentralized units (N = 26 pre, N = 51 post). Patients completed surveys (N = 62 pre) in centralized units and later in decentralized units (N = 49 post). Surveys included quantitative measures and qualitative open-ended responses. Patients preferred the decentralized units because of larger single-occupancy rooms, greater privacy/confidentiality, and overall satisfaction with design. Nurses had a more complex response. Nurses approved of the patient rooms, unit environment, and noise levels in decentralized units. However, they reported reduced access to support spaces, lower levels of team/mentoring communication, and less satisfaction with design than in centralized units. Qualitative findings supported these results. Nurses were more positive about centralized units and patients were more positive toward decentralized units. The results of this study suggest a need to understand how system components operate in concert. A major contribution of this study is the inclusion of patient satisfaction with design, an important yet overlooked factor in patient satisfaction research. Healthcare design researchers and practitioners may consider how changing system interdependencies can lead to unexpected changes to communication processes and system outcomes in complex systems.

  5. A multiplex PCR mini-barcode assay to identify processed shark products in the global trade.

    PubMed

    Cardeñosa, Diego; Fields, Andrew; Abercrombie, Debra; Feldheim, Kevin; Shea, Stanley K H; Chapman, Demian D

    2017-01-01

    Protecting sharks from overexploitation has become a global priority after widespread population declines have occurred. Tracking catches and trade on a species-specific basis has proven challenging, in part due to difficulties in identifying processed shark products such as fins, meat, and liver oil. This has hindered efforts to implement regulations aimed at promoting sustainable use of commercially important species and protection of imperiled species. Genetic approaches to identify shark products exist but are typically based on sequencing or amplifying large DNA regions and may fail to work on heavily processed products in which DNA is degraded. Here, we describe a novel multiplex PCR mini-barcode assay based on two short fragments of the cytochrome oxidase I (COI) gene. This assay can identify to species all sharks currently listed on the Convention on International Trade in Endangered Species (CITES) and most shark species present in the international trade. It achieves species diagnosis based on a single PCR and one to two downstream DNA sequencing reactions. The assay is capable of identifying highly processed shark products including fins, cooked shark fin soup, and skin-care products containing liver oil. This is a straightforward and reliable identification method for data collection and enforcement of regulations implemented for certain species at all governance levels.

  6. A multiplex PCR mini-barcode assay to identify processed shark products in the global trade

    PubMed Central

    Fields, Andrew; Abercrombie, Debra; Feldheim, Kevin; Shea, Stanley K. H.; Chapman, Demian D.

    2017-01-01

    Protecting sharks from overexploitation has become a global priority after widespread population declines have occurred. Tracking catches and trade on a species-specific basis has proven challenging, in part due to difficulties in identifying processed shark products such as fins, meat, and liver oil. This has hindered efforts to implement regulations aimed at promoting sustainable use of commercially important species and protection of imperiled species. Genetic approaches to identify shark products exist but are typically based on sequencing or amplifying large DNA regions and may fail to work on heavily processed products in which DNA is degraded. Here, we describe a novel multiplex PCR mini-barcode assay based on two short fragments of the cytochrome oxidase I (COI) gene. This assay can identify to species all sharks currently listed on the Convention on International Trade in Endangered Species (CITES) and most shark species present in the international trade. It achieves species diagnosis based on a single PCR and one to two downstream DNA sequencing reactions. The assay is capable of identifying highly processed shark products including fins, cooked shark fin soup, and skin-care products containing liver oil. This is a straightforward and reliable identification method for data collection and enforcement of regulations implemented for certain species at all governance levels. PMID:29020095

  7. How do formulation and process parameters impact blend and unit dose uniformity? Further analysis of the product quality research institute blend uniformity working group industry survey.

    PubMed

    Hancock, Bruno C; Garcia-Munoz, Salvador

    2013-03-01

    Responses from the second Product Quality Research Institute (PQRI) Blend Uniformity Working Group (BUWG) survey of industry have been reanalyzed to identify potential links between formulation and processing variables and the measured uniformity of blends and unit dosage forms. As expected, the variability of the blend potency and tablet potency data increased with a decrease in the loading of the active pharmaceutical ingredient (API). There was also an inverse relationship between the nominal strength of the unit dose and the blend uniformity data. The data from the PQRI industry survey do not support the commonly held viewpoint that granulation processes are necessary to create and sustain tablet and capsule formulations with a high degree of API uniformity. There was no correlation between the blend or tablet potency variability and the type of process used to manufacture the product. Although it is commonly believed that direct compression processes should be avoided for low API loading formulations because of blend and tablet content uniformity concerns, the data for direct compression processes reported by the respondents to the PQRI survey suggest that such processes are being used routinely to manufacture solid dosage forms of acceptable quality even when the drug loading is quite low. Copyright © 2012 Wiley Periodicals, Inc.

  8. Flat-plate solar array project: Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The engineering design, fabrication, assembly, operation, economic analysis, and process support research and development for an Experimental Process System Development Unit for producing semiconductor-grade silicon using the silane-to-silicon process are reported. The design activity was completed. About 95% of purchased equipment was received. The draft of the operations manual was about 50% complete and the design of the free-space system continued. The system using silicon powder transfer, melting, and shotting on a pseudocontinuous basis was demonstrated.

  9. Utilization of 134Cs/137Cs in the environment to identify the reactor units that caused atmospheric releases during the Fukushima Daiichi accident

    NASA Astrophysics Data System (ADS)

    Chino, Masamichi; Terada, Hiroaki; Nagai, Haruyasu; Katata, Genki; Mikami, Satoshi; Torii, Tatsuo; Saito, Kimiaki; Nishizawa, Yukiyasu

    2016-08-01

    The Fukushima Daiichi nuclear power reactor units that generated large amounts of airborne discharges during the period of March 12-21, 2011 were identified individually by analyzing the combination of measured 134Cs/137Cs depositions on ground surfaces and atmospheric transport and deposition simulations. Because the values of 134Cs/137Cs differ between reactor units owing to fuel burnup differences, the 134Cs/137Cs ratio measured in the environment was used to determine which reactor unit ultimately contaminated a specific area. Atmospheric dispersion model simulations were used for predicting specific areas contaminated by each dominant release. Finally, by comparing the results from both sources, the specific reactor units that yielded the most dominant atmospheric release quantities could be determined. The major source reactor units were Unit 1 in the afternoon of March 12, 2011, and Unit 2 during the period from the late night of March 14 to the morning of March 15, 2011. These results corresponded to those assumed in our previous source term estimation studies. Furthermore, new findings suggested that the major source reactors from the evening of March 15, 2011 were Units 2 and 3 and that the dominant source reactor on March 20, 2011 temporally changed from Unit 3 to Unit 2.
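
    The core arithmetic of the method is decay-correcting a measured 134Cs/137Cs activity ratio back to the release date (134Cs decays much faster: half-life about 2.07 years versus about 30.2 years for 137Cs) and matching it against burnup-specific per-unit ratios. A minimal sketch follows; the per-unit ratios in it are placeholders for illustration, not the paper's values.

```python
import math

T_HALF_CS134 = 2.065 * 365.25    # days (~2.07 years)
T_HALF_CS137 = 30.17 * 365.25    # days (~30.2 years)

def ratio_at_release(measured_ratio, days_since_release):
    """Back-correct a measured 134Cs/137Cs activity ratio for decay."""
    f134 = math.exp(-math.log(2.0) * days_since_release / T_HALF_CS134)
    f137 = math.exp(-math.log(2.0) * days_since_release / T_HALF_CS137)
    return measured_ratio * f137 / f134

def attribute_unit(r0, unit_ratios):
    """Assign the reactor unit whose burnup-specific ratio is closest."""
    return min(unit_ratios, key=lambda unit: abs(unit_ratios[unit] - r0))

# Placeholder per-unit ratios for illustration only (not the paper's values)
unit_ratios = {"Unit 1": 0.94, "Unit 2": 1.08, "Unit 3": 1.04}
r0 = ratio_at_release(measured_ratio=0.90, days_since_release=90)
print(f"ratio at release ~ {r0:.3f} -> {attribute_unit(r0, unit_ratios)}")
```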

  10. Utilization of (134)Cs/(137)Cs in the environment to identify the reactor units that caused atmospheric releases during the Fukushima Daiichi accident.

    PubMed

    Chino, Masamichi; Terada, Hiroaki; Nagai, Haruyasu; Katata, Genki; Mikami, Satoshi; Torii, Tatsuo; Saito, Kimiaki; Nishizawa, Yukiyasu

    2016-08-22

    The Fukushima Daiichi nuclear power reactor units that generated large amounts of airborne discharges during the period of March 12-21, 2011 were identified individually by analyzing the combination of measured (134)Cs/(137)Cs depositions on ground surfaces and atmospheric transport and deposition simulations. Because the values of (134)Cs/(137)Cs differ between reactor units owing to fuel burnup differences, the (134)Cs/(137)Cs ratio measured in the environment was used to determine which reactor unit ultimately contaminated a specific area. Atmospheric dispersion model simulations were used for predicting specific areas contaminated by each dominant release. Finally, by comparing the results from both sources, the specific reactor units that yielded the most dominant atmospheric release quantities could be determined. The major source reactor units were Unit 1 in the afternoon of March 12, 2011, and Unit 2 during the period from the late night of March 14 to the morning of March 15, 2011. These results corresponded to those assumed in our previous source term estimation studies. Furthermore, new findings suggested that the major source reactors from the evening of March 15, 2011 were Units 2 and 3 and that the dominant source reactor on March 20, 2011 temporally changed from Unit 3 to Unit 2.

  11. Utilization of 134Cs/137Cs in the environment to identify the reactor units that caused atmospheric releases during the Fukushima Daiichi accident

    PubMed Central

    Chino, Masamichi; Terada, Hiroaki; Nagai, Haruyasu; Katata, Genki; Mikami, Satoshi; Torii, Tatsuo; Saito, Kimiaki; Nishizawa, Yukiyasu

    2016-01-01

    The Fukushima Daiichi nuclear power reactor units that generated large amounts of airborne discharges during the period of March 12–21, 2011 were identified individually by analyzing the combination of measured 134Cs/137Cs depositions on ground surfaces and atmospheric transport and deposition simulations. Because the values of 134Cs/137Cs differ between reactor units owing to fuel burnup differences, the 134Cs/137Cs ratio measured in the environment was used to determine which reactor unit ultimately contaminated a specific area. Atmospheric dispersion model simulations were used for predicting specific areas contaminated by each dominant release. Finally, by comparing the results from both sources, the specific reactor units that yielded the most dominant atmospheric release quantities could be determined. The major source reactor units were Unit 1 in the afternoon of March 12, 2011, and Unit 2 during the period from the late night of March 14 to the morning of March 15, 2011. These results corresponded to those assumed in our previous source term estimation studies. Furthermore, new findings suggested that the major source reactors from the evening of March 15, 2011 were Units 2 and 3 and that the dominant source reactor on March 20, 2011 temporally changed from Unit 3 to Unit 2. PMID:27546490

  12. 21st Century Parent-Child Sex Communication in the United States: A Process Review.

    PubMed

    Flores, Dalmacio; Barroso, Julie

    Parent-child sex communication results in the transmission of family expectations, societal values, and role modeling of sexual health risk-reduction strategies. Parent-child sex communication's potential to curb negative sexual health outcomes has sustained a multidisciplinary effort to better understand the process and its impact on the development of healthy sexual attitudes and behaviors among adolescents. This review advances what is known about the process of sex communication in the United States by reviewing studies published from 2003 to 2015. We used the Cumulative Index to Nursing and Allied Health Literature (CINAHL), PsycINFO, SocINDEX, and PubMed, and the key terms "parent child" AND "sex education" for the initial query; we included 116 original articles for analysis. Our review underscores long-established factors that prevent parents from effectively broaching and sustaining talks about sex with their children and has also identified emerging concerns unique to today's parenting landscape. Parental factors salient to sex communication are established long before individuals become parents and are acted upon by influences beyond the home. Child-focused communication factors likewise describe a maturing audience that is far from captive. The identification of both enduring and emerging factors that affect how sex communication occurs will inform subsequent work that will result in more positive sexual health outcomes for adolescents.

  13. Identification of different geologic units using fuzzy constrained resistivity tomography

    NASA Astrophysics Data System (ADS)

    Singh, Anand; Sharma, S. P.

    2018-01-01

    Different geophysical inversion strategies are utilized as components of an interpretation process that tries to separate geologic units based on the resistivity distribution. In the present study, we present the results of separating different geologic units using fuzzy constrained resistivity tomography. This was accomplished using fuzzy c-means, a clustering procedure that improves the 2D resistivity image and the geologic separation within the iterative minimization of the inversion. First, we developed a Matlab-based inversion technique to obtain a reliable resistivity image using different geophysical data sets (electrical resistivity and electromagnetic data). Following this, the recovered resistivity model was converted into a fuzzy constrained resistivity model by assigning each model cell to the cluster for which it has the highest membership value, using the fuzzy c-means clustering procedure during the iterative process. The efficacy of the algorithm is demonstrated using three synthetic plane-wave electromagnetic data sets and one electrical resistivity field dataset. The presented approach improves on the conventional inversion approach in differentiating geologic units, provided the correct number of geologic units is identified. Further, fuzzy constrained resistivity tomography was performed to examine the augmentation of uranium mineralization in the Beldih open cast mine as a case study. We also compared the geologic units identified by fuzzy constrained resistivity tomography with the geologic units interpreted from borehole information.
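
    As an illustration of the clustering step described above, plain fuzzy c-means on a one-dimensional resistivity attribute looks like the following NumPy sketch. The three-unit toy model is invented, and in the real algorithm the memberships feed back into the inversion's objective function rather than being used stand-alone.

```python
import numpy as np

def fuzzy_c_means(x, n_clusters, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means on a 1D attribute (e.g., log-resistivity of
    model cells); returns cluster centres and the membership matrix."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)          # memberships sum to 1
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]
        d = np.abs(x - centers.T) + 1e-12      # cell-to-centre distances
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)
    return centers.ravel(), u

# Toy model: three geologic units with distinct log10-resistivity
np.random.seed(0)
log_rho = np.concatenate([np.random.normal(1.0, 0.1, 50),
                          np.random.normal(2.5, 0.1, 50),
                          np.random.normal(4.0, 0.1, 50)])
centers, u = fuzzy_c_means(log_rho, n_clusters=3)
print("unit centres (log10 ohm-m):", np.sort(centers).round(2))
```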

  14. Community Health Workers in the United States: Challenges in Identifying, Surveying, and Supporting the Workforce.

    PubMed

    Sabo, Samantha; Allen, Caitlin G; Sutkowi, Katherine; Wennerstrom, Ashley

    2017-12-01

    Community health workers (CHWs) are members of a growing profession in the United States. Studying this dynamic labor force is challenging, in part because its members have more than 100 different job titles. The demand for timely, accurate information about CHWs is increasing as the profession gains recognition for its ability to improve health outcomes and reduce costs. Although numerous surveys of CHWs have been conducted, the field lacks well-delineated methods for gaining access to this hard-to-identify workforce. We outline methods for surveying CHWs and promising approaches to engage the workforce and other stakeholders in conducting local, state, and national studies. We also highlight successful strategies to overcome challenges in CHW surveys and future directions for surveying the field.

  15. [Variations in the diagnostic confirmation process between breast cancer mass screening units].

    PubMed

    Natal, Carmen; Fernández-Somoano, Ana; Torá-Rocamora, Isabel; Tardón, Adonina; Castells, Xavier

    2016-01-01

    To analyse variations in the diagnostic confirmation process between screening units, variations in the outcome of each episode and the relationship between the use of the different diagnostic confirmation tests and the lesion detection rate. Observational study of variability of the standardised use of diagnostic and lesion detection tests in 34 breast cancer mass screening units participating in early-detection programmes in three Spanish regions from 2002-2011. The diagnostic test variation ratio in percentiles 25-75 ranged from 1.68 (further appointments) to 3.39 (fine-needle aspiration). The variation ratio in detection rates of benign lesions, ductal carcinoma in situ and invasive cancer were 2.79, 1.99 and 1.36, respectively. A positive relationship between rates of testing and detection rates was found with fine-needle aspiration-benign lesions (R(2): 0.53), fine-needle aspiration-invasive carcinoma (R(2): 0.28), core biopsy-benign lesions (R(2): 0.64), core biopsy-ductal carcinoma in situ (R(2): 0.61) and core biopsy-invasive carcinoma (R(2): 0.48). Variation in the use of invasive tests between the breast cancer screening units participating in early-detection programmes was found to be significantly higher than variations in lesion detection. Units which conducted more fine-needle aspiration tests had higher benign lesion detection rates, while units that conducted more core biopsies detected more benign lesions and cancer. Copyright © 2016 SESPAS. Published by Elsevier Espana. All rights reserved.

  16. High-Speed Particle-in-Cell Simulation Parallelized with Graphic Processing Units for Low Temperature Plasmas for Material Processing

    NASA Astrophysics Data System (ADS)

    Hur, Min Young; Verboncoeur, John; Lee, Hae June

    2014-10-01

    Particle-in-cell (PIC) simulations offer higher fidelity than fluid simulations for plasma devices that require transient kinetic modeling. They make fewer approximations to the plasma kinetics but require many particles and grid points to obtain meaningful results, so the simulation time grows in proportion to the number of particles. Therefore, PIC simulation needs high-performance computing. In this research, a graphic processing unit (GPU) is adopted for high-performance computing of PIC simulations of low-temperature discharge plasmas. GPUs have many-core processors and high memory bandwidth compared with a central processing unit (CPU). NVIDIA GeForce GPUs with hundreds of cores were used for the tests and showed cost-effective performance. The PIC code algorithm is divided into two modules: a field solver and a particle mover. The particle mover module is divided into four routines, named move, boundary, Monte Carlo collision (MCC), and deposit. Overall, the GPU code solves particle motions as well as the electrostatic potential in two-dimensional geometry almost 30 times faster than a single-CPU code. This work was supported by the Korea Institute of Science Technology Information.
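
    To make the module structure named above concrete (a field solver plus a particle mover split into move, boundary, MCC, and deposit), here is a minimal one-dimensional electrostatic PIC cycle in NumPy. It is a generic textbook sketch, not the authors' GPU code; the collision model that simply reverses velocities is a toy stand-in for a real MCC routine.

```python
import numpy as np

def pic_step(x, v, qm, grid_n, length, dt, rng, nu_coll=0.0):
    """One 1D electrostatic PIC cycle: deposit -> field solve -> move ->
    boundary -> toy Monte Carlo collisions (periodic domain, CIC weighting)."""
    dx = length / grid_n
    # deposit: cloud-in-cell charge assignment to the two nearest nodes
    g = x / dx
    i0 = np.floor(g).astype(int) % grid_n
    w1 = g - np.floor(g)
    rho = (np.bincount(i0, 1.0 - w1, grid_n)
           + np.bincount((i0 + 1) % grid_n, w1, grid_n)) / dx
    rho -= rho.mean()                          # neutralising background
    # field solve: periodic Poisson equation via FFT, then E = -dphi/dx
    k = 2.0 * np.pi * np.fft.fftfreq(grid_n, d=dx)
    k[0] = 1.0                                 # avoid division by zero
    phi_hat = np.fft.fft(rho) / k**2
    phi_hat[0] = 0.0
    E = np.real(np.fft.ifft(-1j * k * phi_hat))
    # move: gather the field at particle positions, then push
    Ep = (1.0 - w1) * E[i0] + w1 * E[(i0 + 1) % grid_n]
    v = v + qm * Ep * dt
    x = x + v * dt
    # boundary: periodic wrap
    x %= length
    # MCC: toy collision that reverses velocity with probability nu*dt
    hit = rng.random(len(v)) < nu_coll * dt
    v[hit] = -v[hit]
    return x, v

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 10000)
v = rng.normal(0.0, 1.0, 10000)
for _ in range(10):
    x, v = pic_step(x, v, qm=-1.0, grid_n=64, length=1.0, dt=0.01,
                    rng=rng, nu_coll=0.5)
print("mean kinetic energy:", 0.5 * np.mean(v**2))
```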

  17. Sodium content of popular commercially processed and restaurant foods in the United States

    USDA-ARS?s Scientific Manuscript database

    Nutrient Data Laboratory (NDL) of the U.S. Department of Agriculture (USDA) in close collaboration with U.S. Center for Disease Control and Prevention is monitoring the sodium content of commercially processed and restaurant foods in the United States. The main purpose of this manuscript is to prov...

  18. Business Process Improvement Applied to Written Temporary Duty Travel Orders within the United States Air Force

    DTIC Science & Technology

    1993-12-01

    Generally Accepted Process: While neither DoD Directives nor USAF Regulations specify exact mandatory TDY order processing methods, most USAF units...functional input. Finally, TDY order processing functional experts at Hanscom, Los Angeles, and McClellan AFBs provided inputs based on their experiences...current electronic auditing capabilities. DTPS Initiative: This DFAS-initiated action to standardize TDY order processing throughout DoD is currently

  19. Approaches to Identifying the Emerging Innovative Water Technology Industry in the United States

    PubMed Central

    WOOD, ALLISON R.; HARTEN, TERESA; GUTIERREZ, SALLY C.

    2018-01-01

    Clean water is vital to sustaining our natural environment, human health, and our economy. As infrastructure continues to deteriorate and water resources become increasingly threatened, new technologies will be needed to ensure safe and sustainable water in the future. Though the US water industry accounts for approximately 1% of gross domestic product and regional “clusters” for water technology exist throughout the country, this emerging industry has not been captured by recent studies. As use of the term “cluster” becomes more prevalent, regional mapping efforts have revealed international differences in definition yet showcase this industry’s economic impact. In reality, institutional processes may inhibit altering industry coding to better describe water technology. Forgoing the benefits of national economic tracking, alternative data sets are available, which may support new ways of identifying these clusters. This work provides cluster definitions; summarizes current approaches to identifying industry activity using data, interviews, and literature; and sets a foundation for future research. PMID:29937546

  20. Approaches to Identifying the Emerging Innovative Water Technology Industry in the United States.

    PubMed

    Wood, Allison R; Harten, Teresa; Gutierrez, Sally C

    2018-04-25

    Clean water is vital to sustaining our natural environment, human health, and our economy. As infrastructure continues to deteriorate and water resources become increasingly threatened, new technologies will be needed to ensure safe and sustainable water in the future. Though the US water industry accounts for approximately 1% of gross domestic product and regional "clusters" for water technology exist throughout the country, this emerging industry has not been captured by recent studies. As use of the term "cluster" becomes more prevalent, regional mapping efforts have revealed international differences in definition yet showcase this industry's economic impact. In reality, institutional processes may inhibit altering industry coding to better describe water technology. Forgoing the benefits of national economic tracking, alternative data sets are available, which may support new ways of identifying these clusters. This work provides cluster definitions; summarizes current approaches to identifying industry activity using data, interviews, and literature; and sets a foundation for future research.

  1. Theoretical Dimensions of Small Unit Resilience

    DTIC Science & Technology

    2010-12-01

    ...strategies and coping mechanisms. Unit Leadership and Coping; Willingness to Seek Care; Reducing Barriers to Care; Family and Marital Support...identifies the following 10 combat skills: Buddies (Cohesion), Accountability, Targeted Aggression, Tactical Awareness, Lethally Armed, Emotional...

  2. The Process and Impact of Stakeholder Engagement in Developing a Pediatric Intensive Care Unit Communication and Decision-Making Intervention.

    PubMed

    Michelson, Kelly N; Frader, Joel; Sorce, Lauren; Clayman, Marla L; Persell, Stephen D; Fragen, Patricia; Ciolino, Jody D; Campbell, Laura C; Arenson, Melanie; Aniciete, Danica Y; Brown, Melanie L; Ali, Farah N; White, Douglas

    2016-12-01

    Stakeholder-developed interventions are needed to support pediatric intensive care unit (PICU) communication and decision-making. Few publications delineate methods and outcomes of stakeholder engagement in research. We describe the process and impact of stakeholder engagement on developing a PICU communication and decision-making support intervention. We also describe the resultant intervention. Stakeholders included parents of PICU patients, healthcare team members (HTMs), and research experts. Through a year-long iterative process, we involved 96 stakeholders in 25 meetings and 26 focus groups or interviews. Stakeholders adapted an adult navigator model by identifying core intervention elements and then determining how to operationalize those core elements in pediatrics. The stakeholder input led to PICU-specific refinements, such as supporting transitions after PICU discharge and including ancillary tools. The resultant intervention includes navigator involvement with parents and HTMs and navigator-guided use of ancillary tools. Subsequent research will test the feasibility and efficacy of our intervention.

  3. Streamlining the medication process improves safety in the intensive care unit.

    PubMed

    Benoit, E; Eckert, P; Theytaz, C; Joris-Frasseren, M; Faouzi, M; Beney, J

    2012-09-01

    Multiple interventions were made to optimize the medication process in our intensive care unit (ICU): (1) transcriptions from the medical order form to the administration plan were eliminated by merging both into a single document; (2) the new form was built in a logical sequence and was highly structured to promote completeness and standardization of information; (3) frequently used drug names, approved units, and fixed routes were pre-printed; and (4) physicians and nurses were trained with regard to the correct use of the new form. This study was aimed at evaluating the impact of these interventions on clinically significant types of medication errors. Eight types of medication errors were measured by a prospective chart review before and after the interventions in the ICU of a public tertiary care hospital. We used an interrupted time-series design to control for secular trends. Over 85 days, 9298 lines of drug prescription and/or administration to 294 patients, corresponding to 754 patient-days, were collected and analysed for the three series before and the three series following the intervention. The global error rate decreased from 4.95 to 2.14% (-56.8%, P < 0.001). The safety of the medication process in our ICU was improved by simple and inexpensive interventions. In addition to the optimization of the prescription writing process, the documentation of intravenous preparation, and the scheduling of administration, the elimination of transcription in combination with the training of users contributed to reducing errors and carried an interesting potential to increase safety. © 2012 The Authors. Acta Anaesthesiologica Scandinavica © 2012 The Acta Anaesthesiologica Scandinavica Foundation.

  4. Introduction to Crop Production. Unit A-7.

    ERIC Educational Resources Information Center

    Luft, Vernon D.; Backlund, Paul

    This document is a teacher's guide for a unit in vocational agriculture for college freshmen. It is intended to be used for 20 hours of instruction as an introductory course on the crop industry. It provides a broad background of the industry, including production, marketing, processing, and transportation, with emphasis on identifying major crops…

  5. Discrete typing units of Trypanosoma cruzi identified in rural dogs and cats in the humid Argentinean Chaco

    PubMed Central

    ENRIQUEZ, G.F.; CARDINAL, M.V.; OROZCO, M.M.; LANATI, L.; SCHIJMAN, A.G.; GÜRTLER, R.E.

    2013-01-01

    The discrete typing units (DTUs) of Trypanosoma cruzi that infect domestic dogs and cats have rarely been studied. With this purpose we conducted a cross-sectional xenodiagnostic survey of dog and cat populations residing in two infested rural villages in Pampa del Indio, in the humid Argentine Chaco. Parasites were isolated by culture from 44 dogs and 12 cats with a positive xenodiagnosis. DTUs were identified from parasite culture samples using a strategy based on multiple polymerase chain reactions. TcVI was identified in 37 of 44 dogs and in 10 of 12 cats, whereas TcV was identified in five dogs and in two cats – a new finding for cats. No mixed infections were detected. The occurrence of two dogs infected with TcIII – classically found in armadillos – suggests a probable link with the local sylvatic transmission cycle involving Dasypus novemcinctus armadillos and a potential risk of human infection with TcIII. Our study reinforces the importance of dogs and cats as domestic reservoir hosts and sources of various DTUs infecting humans, and suggests a link between dogs and the sylvatic transmission cycle of TcIII. PMID:23058180

  6. Safety Management of a Clinical Process Using Failure Mode and Effect Analysis: Continuous Renal Replacement Therapies in Intensive Care Unit Patients.

    PubMed

    Sanchez-Izquierdo-Riera, Jose Angel; Molano-Alvarez, Esteban; Saez-de la Fuente, Ignacio; Maynar-Moliner, Javier; Marín-Mateos, Helena; Chacón-Alves, Silvia

    2016-01-01

    The failure mode and effect analysis (FMEA) may improve the safety of continuous renal replacement therapies (CRRT) in the intensive care unit. We used this tool in three phases: 1) a retrospective observational study; 2) a process FMEA, with implementation of the improvement measures identified; and 3) a cohort study after the FMEA. We included 54 patients in the pre-FMEA group and 72 patients in the post-FMEA group. Comparing the frequency of risks per patient in both groups, we observed fewer cases of filter survival under 24 hours in the post-FMEA group (31 patients [57.4%] vs. 21 patients [29.6%]; p < 0.05); fewer patients suffered circuit coagulation with inability to return the blood to the patient (25 patients [46.3%] vs. 16 patients [22.2%]; p < 0.05); 54 patients (100%) versus 5 (6.94%) did not have phosphorus levels monitored (p < 0.05); and in 14 patients (25.9%) versus 0 (0%), the CRRT prescription did not appear in the medical orders. As a measure of improvement, we adopted dynamic dosage management. After the process FMEA, there were several improvements in the management of intensive care unit patients receiving CRRT, and we consider it a useful tool for improving the safety of critically ill patients.
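
    The abstract does not show the scoring arithmetic; in standard FMEA practice each failure mode receives 1-10 severity, occurrence, and detection scores and is ranked by the risk priority number RPN = S x O x D. The failure modes and scores below are hypothetical illustrations, not the paper's data.

```python
# Hypothetical CRRT failure modes with 1-10 scores; RPN = S * O * D
failure_modes = [
    # (description,                          severity, occurrence, detection)
    ("filter clots < 24 h, blood not returned",     8,          6,         4),
    ("phosphorus levels not monitored",             6,          9,         2),
    ("CRRT dose missing from medical orders",       7,          4,         3),
    ("circuit pressure alarm mis-set",              5,          3,         6),
]

# Rank failure modes by descending RPN to prioritise improvement measures
for desc, s, o, d in sorted(failure_modes,
                            key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True):
    print(f"RPN {s * o * d:4d}  {desc}")
```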

  7. General purpose molecular dynamics simulations fully implemented on graphics processing units

    NASA Astrophysics Data System (ADS)

    Anderson, Joshua A.; Lorenz, Chris D.; Travesset, A.

    2008-05-01

    Graphics processing units (GPUs), originally developed for rendering real-time effects in computer games, now provide unprecedented computational power for scientific applications. In this paper, we develop a general purpose molecular dynamics code that runs entirely on a single GPU. It is shown that our GPU implementation provides performance equivalent to that of a fast 30-processor-core distributed-memory cluster. Our results show that GPUs already provide an inexpensive alternative to such clusters, and we discuss the implications for the future.

  8. Nurse adoption of continuous patient monitoring on acute post-surgical units: managing technology implementation.

    PubMed

    Jeskey, Mary; Card, Elizabeth; Nelson, Donna; Mercaldo, Nathaniel D; Sanders, Neal; Higgins, Michael S; Shi, Yaping; Michaels, Damon; Miller, Anne

    2011-10-01

    To report an exploratory action-research process used during the implementation of continuous patient monitoring in acute post-surgical nursing units. Substantial US Federal funding has been committed to implementing new health care technology, but failure to manage implementation processes may limit successful adoption and the realisation of proposed benefits. Effective approaches for managing barriers to new technology implementation are needed. Continuous patient monitoring was implemented in three of 13 medical/surgical units. An exploratory action-feedback approach, using time-series nurse surveys, was used to identify barriers and develop and evaluate responses. Post-hoc interviews and document analysis were used to describe the change implementation process. Significant differences were identified in night- and dayshift nurses' perceptions of technology benefits. Research nurses facilitated the change process by developing 'clinical nurse implementation specialist' expertise. Health information technology (HIT)-related patient outcomes are mediated through nurses acting on new information, but HIT designed for critical care may not transfer to acute care settings. Exploratory action-feedback approaches can assist nurse managers in assessing and mitigating the real-world effects of HIT implementations. It is strongly recommended that nurse managers identify stakeholders and develop comprehensive plans for monitoring the effects of HIT in their units. © 2011 Blackwell Publishing Ltd.

  9. Application of graphics processing units to search pipelines for gravitational waves from coalescing binaries of compact objects

    NASA Astrophysics Data System (ADS)

    Chung, Shin Kee; Wen, Linqing; Blair, David; Cannon, Kipp; Datta, Amitava

    2010-07-01

    We report a novel application of a graphics processing unit (GPU) for the purpose of accelerating the search pipelines for gravitational waves from coalescing binaries of compact objects. A speed-up of 16-fold in total has been achieved with an NVIDIA GeForce 8800 Ultra GPU card compared with one core of a 2.5 GHz Intel Q9300 central processing unit (CPU). We show that substantial improvements are possible and discuss the reduction in CPU count required for the detection of inspiral sources afforded by the use of GPUs.

  10. Proteomics of Aspergillus fumigatus Conidia-containing Phagolysosomes Identifies Processes Governing Immune Evasion.

    PubMed

    Schmidt, Hella; Vlaic, Sebastian; Krüger, Thomas; Schmidt, Franziska; Balkenhol, Johannes; Dandekar, Thomas; Guthke, Reinhard; Kniemeyer, Olaf; Heinekamp, Thorsten; Brakhage, Axel A

    2018-06-01

    Invasive infections by the human pathogenic fungus Aspergillus fumigatus start with the outgrowth of asexual, airborne spores (conidia) into the lung tissue of immunocompromised patients. The resident alveolar macrophages phagocytose conidia, which end up in phagolysosomes. However, A. fumigatus conidia resist phagocytic degradation to a certain degree. This is mainly attributable to the pigment 1,8-dihydroxynaphthalene (DHN) melanin located in the cell wall of conidia, which manipulates phagolysosomal maturation and prevents intracellular killing. To gain insight into the underlying molecular mechanisms, we comparatively analyzed proteins of mouse macrophage phagolysosomes containing melanized wild-type (wt) or nonmelanized pksP mutant conidia. For this purpose, a protocol to isolate conidia-containing phagolysosomes was established and a reference protein map of phagolysosomes was generated. We identified 637 host and 22 A. fumigatus proteins that were differentially abundant in the phagolysosome. 472 of the host proteins were overrepresented in the pksP mutant and 165 in the wt conidia-containing phagolysosome. Eight of the fungal proteins were produced only in pksP mutant and 14 proteins in wt conidia-containing phagolysosomes. Bioinformatic analysis compiled a regulatory module, which indicates host processes affected by the fungus. These processes include vATPase-driven phagolysosomal acidification, Rab5 and Vamp8-dependent endocytic trafficking, signaling pathways, as well as recruitment of the Lamp1 phagolysosomal maturation marker and the lysosomal cysteine protease cathepsin Z. Western blotting and immunofluorescence analyses confirmed the proteome data and moreover showed differential abundance of the major metabolic regulator mTOR. Taken together, with the help of a protocol optimized to isolate A. fumigatus conidia-containing phagolysosomes and a potent bioinformatics algorithm, we were able to confirm A. fumigatus conidia

  11. Power processing units for high power solar electric propulsion

    NASA Astrophysics Data System (ADS)

    Frisbee, Robert H.; Das, Radhe S.; Krauthamer, Stanley

    An evaluation of high-power processing units (PPUs) for multimegawatt solar electric propulsion (SEP) vehicles using advanced ion thrusters is presented. Significant savings of scale are possible for PPUs used to supply power to ion thrusters operating at 0.1 to 1.5 MWe per thruster. The PPU specific mass is found to be strongly sensitive to variations in the ion thruster's power per thruster and moderately sensitive to variations in the thruster's screen voltage due to varying the I(sp) of the thruster. Each PPU consists of a dc-to-dc converter to increase the voltage from the 500 V dc of the photovoltaic power system to the 5 to 13 kV dc required by the ion thrusters.

  12. Understanding the College Choice Process of United States Military-Affiliated Transfer Students

    ERIC Educational Resources Information Center

    Ives, Emily Joanne

    2017-01-01

    This study examined the college choice process of transfer student veterans who are currently enrolled in a public research university. The research presented in this dissertation utilized both quantitative and qualitative strategies to identify key factors in students' college choice process. This study focuses on the following two research…

  13. Near-realtime simulations of bioelectric activity in small mammalian hearts using graphical processing units

    PubMed Central

    Vigmond, Edward J.; Boyle, Patrick M.; Leon, L. Joshua; Plank, Gernot

    2014-01-01

    Simulations of cardiac bioelectric phenomena remain a significant challenge despite continual advancements in computational machinery. Spanning large temporal and spatial ranges demands millions of nodes to accurately depict geometry, and a comparable number of timesteps to capture dynamics. This study explores a new hardware computing paradigm, the graphics processing unit (GPU), to accelerate cardiac models, and analyzes results in the context of simulating a small mammalian heart in real time. The ODEs associated with membrane ionic flow were computed on a traditional CPU and compared with GPU performance for one to four parallel processing units. The scalability of solving the PDE responsible for tissue coupling was examined on a cluster using up to 128 cores. Results indicate that the GPU implementation was between 9 and 17 times faster than the CPU implementation and scaled similarly. Solving the PDE was still 160 times slower than real time. PMID:19964295
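
    The split described above (per-node membrane ODEs plus a tissue-coupling PDE) can be sketched with the simple FitzHugh-Nagumo model standing in for the detailed ionic models used in practice; the reaction update is the part that parallelizes per node on a GPU. All parameters below are generic textbook values, not those of the paper.

```python
import numpy as np

def step(v, w, D, dx, dt, a=0.1, eps=0.01, b=0.5):
    """One explicit Euler step of a FitzHugh-Nagumo cable: the reaction
    terms are the per-node membrane ODEs, the Laplacian is the tissue PDE."""
    lap = (np.roll(v, 1) - 2.0 * v + np.roll(v, -1)) / dx**2
    dv = v * (v - a) * (1.0 - v) - w + D * lap
    dw = eps * (v - b * w)
    return v + dt * dv, w + dt * dw

n, dx, dt = 400, 0.1, 0.01
v, w = np.zeros(n), np.zeros(n)
v[:20] = 1.0                        # stimulate one end of the fibre
for _ in range(5000):               # integrate to t = 50
    v, w = step(v, w, D=0.1, dx=dx, dt=dt)
print("first excited node (v > 0.5):", np.argmax(v > 0.5))
```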

  14. Methodological systematic review identifies major limitations in prioritization processes for updating.

    PubMed

    Martínez García, Laura; Pardo-Hernandez, Hector; Superchi, Cecilia; Niño de Guzman, Ena; Ballesteros, Monica; Ibargoyen Roteta, Nora; McFarlane, Emma; Posso, Margarita; Roqué I Figuls, Marta; Rotaeche Del Campo, Rafael; Sanabria, Andrea Juliana; Selva, Anna; Solà, Ivan; Vernooij, Robin W M; Alonso-Coello, Pablo

    2017-06-01

    The aim of the study was to identify and describe strategies to prioritize the updating of systematic reviews (SRs), health technology assessments (HTAs), or clinical guidelines (CGs). We conducted an SR of studies describing one or more methods to prioritize SRs, HTAs, or CGs for updating. We searched MEDLINE (PubMed, from 1966 to August 2016) and The Cochrane Methodology Register (The Cochrane Library, Issue 8 2016). We hand searched abstract books, reviewed reference lists, and contacted experts. Two reviewers independently screened the references and extracted data. We included 14 studies. Six studies were classified as descriptive (6 of 14, 42.9%) and eight as implementation studies (8 of 14, 57.1%). Six studies reported an updating strategy (6 of 14, 42.9%), six a prioritization process (6 of 14, 42.9%), and two a prioritization criterion (2 of 14, 14.2%). Eight studies focused on SRs (8 of 14, 57.1%), six studies focused on CGs (6 of 14, 42.9%), and none were about HTAs. We identified 76 prioritization criteria that can be applied when prioritizing documents for updating. The most frequently cited criteria were as follows: available evidence (19 of 76, 25.0%), clinical relevance (10 of 76; 13.2%), and users' interest (10 of 76; 13.2%). There is wide variability and suboptimal reporting of the methods used to develop and implement processes to prioritize updating of SRs, HTAs, and CGs. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Finite Element Optimization for Nondestructive Evaluation on a Graphics Processing Unit for Ground Vehicle Hull Inspection

    DTIC Science & Technology

    2013-08-22

    4 cores, where the code may simultaneously run on the multiple cores or the graphics processing unit (or GPU – to be more specific, on an NVIDIA ...allowed to get accurate crack shapes.

  16. Catchment-Scale Conservation Units Identified for the Threatened Yarra Pygmy Perch (Nannoperca obscura) in Highly Modified River Systems

    PubMed Central

    Brauer, Chris J.; Unmack, Peter J.; Hammer, Michael P.; Adams, Mark; Beheregaray, Luciano B.

    2013-01-01

    Habitat fragmentation caused by human activities alters metapopulation dynamics and decreases biological connectivity through reduced migration and gene flow, leading to lowered levels of population genetic diversity and to local extinctions. The threatened Yarra pygmy perch, Nannoperca obscura, is a poor disperser found in small, isolated populations in wetlands and streams of southeastern Australia. Modifications to natural flow regimes in anthropogenically-impacted river systems have recently reduced the amount of habitat for this species and likely further limited its opportunity to disperse. We employed highly resolving microsatellite DNA markers to assess genetic variation, population structure and the spatial scale that dispersal takes place across the distribution of this freshwater fish and used this information to identify conservation units for management. The levels of genetic variation found for N. obscura are amongst the lowest reported for a fish species (mean heterozygosity of 0.318 and mean allelic richness of 1.92). We identified very strong population genetic structure, nil to little evidence of recent migration among demes and a minimum of 11 units for conservation management, hierarchically nested within four major genetic lineages. A combination of spatial analytical methods revealed hierarchical genetic structure corresponding with catchment boundaries and also demonstrated significant isolation by riverine distance. Our findings have implications for the national recovery plan of this species by demonstrating that N. obscura populations should be managed at a catchment level and highlighting the need to restore habitat and avoid further alteration of the natural hydrology. PMID:24349405
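
    For context on the diversity statistics quoted above, expected heterozygosity at a locus is He = 1 - sum(p_i^2) over the allele frequencies p_i, averaged across loci; a toy computation with invented allele counts follows. Allelic richness additionally corrects for sample size by rarefaction, which is omitted here.

```python
import numpy as np

def expected_heterozygosity(allele_counts):
    """Nei's expected heterozygosity for one locus: He = 1 - sum(p_i^2)."""
    p = np.asarray(allele_counts, dtype=float)
    p /= p.sum()                       # allele counts -> frequencies
    return 1.0 - float(np.sum(p**2))

# Invented allele counts at three microsatellite loci (one monomorphic)
loci = [[38, 2], [25, 10, 5], [40]]
he = [expected_heterozygosity(c) for c in loci]
print("per-locus He:", np.round(he, 3), "| mean He:", round(sum(he) / len(he), 3))
```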

  17. Flat-plate solar-array project. Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The engineering design, fabrication, assembly, operation, economic analysis, and process support R and D for an Experimental Process System Development Unit (EPSDU) are reported. About 95% of the purchased equipment was received and will be reshipped to the West Coast location. The Data Collection System was completed. In the area of melting/consolidation, the system using silicon powder transfer, melting, and shotting on a pseudocontinuous basis was demonstrated. It is proposed to continue the very promising fluid bed work.

  18. Nontimber forest products in the United States: Montreal Process indicators as measures of current conditions and sustainability

    Treesearch

    Susan J. Alexander; Sonja N. Oswalt; Marla R. Emery

    2011-01-01

    The United States, in partnership with 11 other countries, participates in the Montreal Process. Each country assesses national progress toward the sustainable management of forest resources by using a set of criteria and indicators agreed on by all member countries. Several indicators focus on nontimber forest products (NTFPs). In the United States, permit and...

  19. Real-time computation of parameter fitting and image reconstruction using graphical processing units

    NASA Astrophysics Data System (ADS)

    Locans, Uldis; Adelmann, Andreas; Suter, Andreas; Fischer, Jannis; Lustermann, Werner; Dissertori, Günther; Wang, Qiulin

    2017-06-01

    In recent years, graphical processing units (GPUs) have become a powerful tool in scientific computing. Their potential to speed up highly parallel applications brings the power of high performance computing to a wider range of users. However, programming these devices and integrating their use in existing applications is still a challenging task. In this paper we examined the potential of GPUs for two different applications. The first application, created at Paul Scherrer Institut (PSI), is used for parameter fitting during data analysis of μSR (muon spin rotation, relaxation and resonance) experiments. The second application, developed at ETH, is used for PET (Positron Emission Tomography) image reconstruction and analysis. Applications currently in use were examined to identify parts of the algorithms in need of optimization. Efficient GPU kernels were created in order to allow applications to use a GPU, to speed up the previously identified parts. Benchmarking tests were performed in order to measure the achieved speedup. During this work, we focused on single GPU systems to show that real time data analysis of these problems can be achieved without the need for large computing clusters. The results show that the currently used application for parameter fitting, which uses OpenMP to parallelize calculations over multiple CPU cores, can be accelerated around 40 times through the use of a GPU. The speedup may vary depending on the size and complexity of the problem. For PET image analysis, the GPU version achieved speedups of more than 40× compared with a single-core CPU implementation. The achieved results show that it is possible to improve the execution time by orders of magnitude.

  20. Performance of the NEXT Engineering Model Power Processing Unit

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Hopson, Mark; Todd, Philip C.; Wong, Brian

    2007-01-01

    NASA's Evolutionary Xenon Thruster (NEXT) project is developing an advanced ion propulsion system for future NASA missions for solar system exploration. An engineering model (EM) power processing unit (PPU) for the NEXT project was designed and fabricated by L-3 Communications under contract with NASA Glenn Research Center (GRC). This modular PPU is capable of processing from 0.5 to 7.0 kW of output power for the NEXT ion thruster. Its design includes many significant improvements for better performance over the state-of-the-art PPU. The most significant difference is the beam supply, which comprises six modules and is capable of very efficient operation over a wide voltage range because of innovative features such as dual controls, module addressing, and a high-current mode. The low voltage power supplies are based on elements of the previously validated NASA Solar Electric Propulsion Technology Application Readiness (NSTAR) PPU. The highly modular construction of the PPU resulted in improved manufacturability, simpler scalability, and lower cost. This paper describes the design of the EM PPU and the results of the bench-top performance tests.

  1. Porting a Hall MHD Code to a Graphic Processing Unit

    NASA Technical Reports Server (NTRS)

    Dorelli, John C.

    2011-01-01

    We present our experience porting a Hall MHD code to a Graphics Processing Unit (GPU). The code is a 2nd order accurate MUSCL-Hancock scheme which makes use of an HLL Riemann solver to compute numerical fluxes and second-order finite differences to compute the Hall contribution to the electric field. The divergence of the magnetic field is controlled with Dedner's hyperbolic divergence cleaning method. Preliminary benchmark tests indicate a speedup (relative to a single Nehalem core) of 58x for a double precision calculation. We discuss scaling issues which arise when distributing work across multiple GPUs in a CPU-GPU cluster.
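
    For readers unfamiliar with the HLL flux named above, the following is a generic one-dimensional sketch; it is not the actual Hall MHD implementation, and the wave-speed estimates SL and SR are assumed to be supplied by the caller.

    ```python
    # Hedged sketch of an HLL approximate Riemann flux for a 1-D
    # conservation law; works on scalars or NumPy arrays of states.
    def hll_flux(UL, UR, FL, FR, SL, SR):
        """HLL numerical flux from left/right conserved states (UL, UR),
        their physical fluxes (FL, FR), and slowest/fastest speeds (SL, SR)."""
        if SL >= 0.0:   # all waves travel right: upwind on the left state
            return FL
        if SR <= 0.0:   # all waves travel left: upwind on the right state
            return FR
        # Intermediate case: blended flux plus a dissipative jump term
        return (SR * FL - SL * FR + SL * SR * (UR - UL)) / (SR - SL)

    # Linear advection u_t + u_x = 0: F(u) = u and both speeds equal 1,
    # so HLL reduces to simple upwinding on the left state.
    print(hll_flux(2.0, 1.0, 2.0, 1.0, 1.0, 1.0))  # -> 2.0
    ```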

  2. Monte Carlo MP2 on Many Graphical Processing Units.

    PubMed

    Doran, Alexander E; Hirata, So

    2016-10-11

    In the Monte Carlo second-order many-body perturbation (MC-MP2) method, the long sum-of-product matrix expression of the MP2 energy, whose literal evaluation may be poorly scalable, is recast into a single high-dimensional integral of functions of electron pair coordinates, which is evaluated by the scalable method of Monte Carlo integration. The sampling efficiency is further accelerated by the redundant-walker algorithm, which allows a maximal reuse of electron pairs. Here, a multitude of graphical processing units (GPUs) offers a uniquely ideal platform to expose multilevel parallelism: fine-grain data-parallelism for the redundant-walker algorithm in which millions of threads compute and share orbital amplitudes on each GPU; coarse-grain instruction-parallelism for near-independent Monte Carlo integrations on many GPUs with few and infrequent interprocessor communications. While the efficiency boost by the redundant-walker algorithm on central processing units (CPUs) grows linearly with the number of electron pairs and tends to saturate when the latter exceeds the number of orbitals, on a GPU it grows quadratically before it increases linearly and then eventually saturates at a much larger number of pairs. This is because the orbital constructions are nearly perfectly parallelized on a GPU and thus completed in a near-constant time regardless of the number of pairs. In consequence, an MC-MP2/cc-pVDZ calculation of a benzene dimer is 2700 times faster on 256 GPUs (using 2048 electron pairs) than on two CPUs, each with 8 cores (which can use only up to 256 pairs effectively). We also numerically determine that the cost to achieve a given relative statistical uncertainty in an MC-MP2 energy increases as O(n^3) or better with system size n, which may be compared with the O(n^5) scaling of the conventional implementation of deterministic MP2. We thus establish the scalability of MC-MP2 with both system and computer sizes.
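
    The core scalability argument, that independent Monte Carlo batches parallelize trivially while their combined statistical uncertainty shrinks as 1/sqrt(N), can be sketched with a toy integrand; this is not the MP2 energy expression, and the batch-per-worker layout merely mimics the many-GPU decomposition described above.

    ```python
    # Hedged sketch: embarrassingly parallel Monte Carlo integration with
    # inverse-variance combination of independent batches (toy integrand).
    import numpy as np

    def mc_batch(f, dim, n, seed):
        rng = np.random.default_rng(seed)
        x = rng.random((n, dim))          # uniform samples on the unit cube
        vals = f(x)
        return vals.mean(), vals.std(ddof=1) / np.sqrt(n)

    f = lambda x: np.exp(-np.sum(x ** 2, axis=1))  # toy 6-D integrand
    means, errs = zip(*(mc_batch(f, 6, 100_000, s) for s in range(8)))
    w = 1.0 / np.asarray(errs) ** 2                # inverse-variance weights
    est = np.sum(w * np.asarray(means)) / np.sum(w)
    print(est, np.sqrt(1.0 / np.sum(w)))           # estimate and its error
    ```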

  3. Standardized severe maternal morbidity review: rationale and process.

    PubMed

    Kilpatrick, Sarah J; Berg, Cynthia; Bernstein, Peter; Bingham, Debra; Delgado, Ana; Callaghan, William M; Harris, Karen; Lanni, Susan; Mahoney, Jeanne; Main, Elliot; Nacht, Amy; Schellpfeffer, Michael; Westover, Thomas; Harper, Margaret

    2014-08-01

    Severe maternal morbidity and mortality have been rising in the United States. To begin a national effort to reduce morbidity, a specific call has been made to identify, for routine review, all pregnant and postpartum women experiencing admission to an intensive care unit or receipt of 4 or more units of blood. Although this call advocated review of these cases, it provided no specific guidance for the review process. Therefore, the aim of this expert opinion is to present guidelines for a standardized severe maternal morbidity interdisciplinary review process to identify systems, professional, and facility factors that can be ameliorated, with the overall goal of improving institutional obstetric safety and reducing severe morbidity and mortality among pregnant and recently pregnant women. This opinion was developed by a multidisciplinary working group that included general obstetrician-gynecologists, maternal-fetal medicine subspecialists, certified nurse-midwives, and registered nurses, all with experience in maternal mortality reviews. A process for standardized review of severe maternal morbidity addressing committee organization, review process, medical record abstraction and assessment, review culture, data management, review timing, and review confidentiality is presented. Reference is made to a sample severe maternal morbidity abstraction and assessment form.

  4. Beowulf Distributed Processing and the United States Geological Survey

    USGS Publications Warehouse

    Maddox, Brian G.

    2002-01-01

    Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing

  5. Identifying additional studies for a systematic review of retention strategies in randomised controlled trials: making contact with trials units and trial methodologists.

    PubMed

    Brueton, Valerie; Tierney, Jayne F; Stenning, Sally; Rait, Greta

    2017-08-22

    Search strategies for systematic reviews aim to identify all evidence relevant to the research question posed. Reports of methodological research can be difficult to find, leading to biased results in systematic reviews of research methodology. Evidence suggests that contact with investigators can help to identify unpublished research. To identify additional eligible randomised controlled trials (RCTs) for a Cochrane systematic review of strategies to improve retention in RCTs, we conducted a survey of UK clinical trials units (CTUs) and made contact with RCT methodologists. Key contacts for all UK CTUs were sent a personalised email with a short questionnaire and summary protocol of the Cochrane methodology review. The questionnaire asked whether an RCT evaluating strategies to improve retention, embedded in a host RCT, had ever been conducted by the CTU. Questions about the stage of completion and publication of such RCTs were included. The summary protocol outlined the aims, eligibility criteria, examples of types of retention strategies, and the primary outcome for the systematic review. Personal communication with RCT methodologists and presentations of preliminary results of the review at conferences were also used to identify additional eligible RCTs. We checked the results of our standard searches to see if eligible studies identified through these additional methods were also found using our standard searches. We identified 14 of the 38 RCTs included in the Cochrane methodology review by contacting trials units and methodologists. Eleven of the 14 RCTs identified by these methods were either published in grey literature, in press or unpublished. The three remaining RCTs were fully published at the time. Six of the RCTs identified were not found through any other searches. The RCTs identified represented data for 6 of 14 RCTs of incentive strategies (52% of randomised participants included in the review), and 6 of 14 RCTs of communication strategies (52% of randomised

  6. Evaluating Mobile Graphics Processing Units (GPUs) for Real-Time Resource Constrained Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meredith, J; Conger, J; Liu, Y

    2005-11-11

    Modern graphics processing units (GPUs) can provide tremendous performance boosts for some applications beyond what a single CPU can accomplish, and their performance is growing at a rate faster than CPUs as well. Mobile GPUs available for laptops have the small form factor and low power requirements suitable for use in embedded processing. We evaluated several desktop and mobile GPUs and CPUs on traditional and non-traditional graphics tasks, as well as on the most time consuming pieces of a full hyperspectral imaging application. Accuracy remained high despite small differences in arithmetic operations like rounding. Performance improvements are summarized here relative to a desktop Pentium 4 CPU.

  7. 78 FR 19019 - Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-28

    ... DEPARTMENT OF LABOR Employment and Training Administration Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United States: Prevailing Wage Rates for Certain Occupations Processed Under H-2A Special Procedures; Correction and Rescission AGENCY: Employment and Training...

  8. 43 CFR 429.37 - Does interest accrue on monies owed to the United States during my appeal process?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... United States during my appeal process? 429.37 Section 429.37 Public Lands: Interior Regulations Relating... States during my appeal process? Except for any period in the appeal process during which a stay is then... decision to OHA, or during judicial review of final agency action. ...

  9. Simplifying the complexity surrounding ICU work processes--identifying the scope for information management in ICU settings.

    PubMed

    Munir, Samina K; Kay, Stephen

    2005-08-01

    A multi-site study, conducted in two English and two Danish intensive care units, investigates the complexity of work processes in intensive care, and the implications of this complexity for information management with regard to clinical information systems. Data were collected via observations, shadowing of clinical staff, interviews and questionnaires. The construction of role activity diagrams enabled the capture of critical care work processes. Upon analysing these diagrams, it was found that intensive care work processes consist of 'simplified complexity'; these processes change with the introduction of information systems for the everyday use and management of all clinical information. The prevailing notion of complexity surrounding critical care clinical work processes was refuted and found to be misleading; in reality, it is not the work processes that cause the complexity, but the way in which clinical information is used and managed. This study emphasises that the potential for clinical information systems that consider integrating all clinical information requirements is not only immense but also very plausible.

  10. Identifying unusual performance in Australian and New Zealand intensive care units from 2000 to 2010.

    PubMed

    Solomon, Patricia J; Kasza, Jessica; Moran, John L

    2014-04-22

    The Australian and New Zealand Intensive Care Society (ANZICS) Adult Patient Database (APD) collects voluntary data on patient admissions to Australian and New Zealand intensive care units (ICUs). This paper presents an in-depth statistical analysis of risk-adjusted mortality of ICU admissions from 2000 to 2010 for the purpose of identifying ICUs with unusual performance. A cohort of 523,462 patients from 144 ICUs was analysed. For each ICU, the natural logarithm of the standardised mortality ratio (log-SMR) was estimated from a risk-adjusted, three-level hierarchical model. This is the first time a three-level model has been fitted to such a large ICU database anywhere. The analysis was conducted in three stages which included the estimation of a null distribution to describe usual ICU performance. Log-SMRs with appropriate estimates of standard errors are presented in a funnel plot using 5% false discovery rate thresholds. False coverage-statement rate confidence intervals are also presented. The observed numbers of deaths for ICUs identified as unusual are compared to the predicted true worst numbers of deaths under the model for usual ICU performance. Seven ICUs were identified as performing unusually over the period 2000 to 2010, in particular, demonstrating high risk-adjusted mortality compared to the majority of ICUs. Four of the seven were ICUs in private hospitals. Our three-stage approach to the analysis detected outlying ICUs which were not identified in a conventional (single) risk-adjusted model for mortality using SMRs to compare ICUs. We also observed a significant linear decline in mortality over the decade. Distinct yearly and weekly respiratory seasonal effects were observed across regions of Australia and New Zealand for the first time. The statistical approach proposed in this paper is intended to be used for the review of observed ICU and hospital mortality. Two important messages from our study are firstly, that comprehensive risk
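
    The flagging step described above can be sketched as follows: compute log-SMRs with approximate Poisson standard errors and apply a Benjamini-Hochberg false discovery rate threshold. The counts below are invented, and this simple approximation stands in for the paper's three-level hierarchical model.

    ```python
    # Hedged sketch: flag unusual units by FDR-thresholded log-SMRs.
    import numpy as np
    from scipy import stats
    from statsmodels.stats.multitest import multipletests

    observed = np.array([52, 40, 75, 120, 33, 90])              # invented deaths
    expected = np.array([45.0, 42.0, 60.0, 118.0, 30.0, 65.0])  # risk-adjusted

    log_smr = np.log(observed / expected)
    se = 1.0 / np.sqrt(expected)                    # approximate SE of a log-SMR
    p = 2.0 * stats.norm.sf(np.abs(log_smr / se))   # two-sided p-values
    reject, p_adj, _, _ = multipletests(p, alpha=0.05, method='fdr_bh')
    for i, flag in enumerate(reject):
        print(f'ICU {i}: log-SMR={log_smr[i]:+.3f} q={p_adj[i]:.3f} unusual={flag}')
    ```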

  11. Identifying unusual performance in Australian and New Zealand intensive care units from 2000 to 2010

    PubMed Central

    2014-01-01

    Background The Australian and New Zealand Intensive Care Society (ANZICS) Adult Patient Database (APD) collects voluntary data on patient admissions to Australian and New Zealand intensive care units (ICUs). This paper presents an in-depth statistical analysis of risk-adjusted mortality of ICU admissions from 2000 to 2010 for the purpose of identifying ICUs with unusual performance. Methods A cohort of 523,462 patients from 144 ICUs was analysed. For each ICU, the natural logarithm of the standardised mortality ratio (log-SMR) was estimated from a risk-adjusted, three-level hierarchical model. This is the first time a three-level model has been fitted to such a large ICU database anywhere. The analysis was conducted in three stages which included the estimation of a null distribution to describe usual ICU performance. Log-SMRs with appropriate estimates of standard errors are presented in a funnel plot using 5% false discovery rate thresholds. False coverage-statement rate confidence intervals are also presented. The observed numbers of deaths for ICUs identified as unusual are compared to the predicted true worst numbers of deaths under the model for usual ICU performance. Results Seven ICUs were identified as performing unusually over the period 2000 to 2010, in particular, demonstrating high risk-adjusted mortality compared to the majority of ICUs. Four of the seven were ICUs in private hospitals. Our three-stage approach to the analysis detected outlying ICUs which were not identified in a conventional (single) risk-adjusted model for mortality using SMRs to compare ICUs. We also observed a significant linear decline in mortality over the decade. Distinct yearly and weekly respiratory seasonal effects were observed across regions of Australia and New Zealand for the first time. Conclusions The statistical approach proposed in this paper is intended to be used for the review of observed ICU and hospital mortality. Two important messages from our study are

  12. Microbial Contaminants of Cord Blood Units Identified by 16S rRNA Sequencing and by API Test System, and Antibiotic Sensitivity Profiling

    PubMed Central

    França, Luís; Simões, Catarina; Taborda, Marco; Diogo, Catarina; da Costa, Milton S.

    2015-01-01

    Over a period of ten months a total of 5618 cord blood units (CBU) were screened for microbial contamination under routine conditions. The antibiotic resistance profile for all isolates was also examined using ATB strips. The detection rate for culture-positive units was 7.5%, corresponding to 422 samples. 16S rRNA sequence analysis and identification with the API test system were used to identify the culturable aerobic, microaerophilic and anaerobic bacteria from CBUs. From these samples we recovered 485 isolates (84 operational taxonomic units, OTUs) assigned to the classes Bacteroidia, Actinobacteria, Clostridia, Bacilli, Betaproteobacteria and primarily to the Gammaproteobacteria. Sixty-nine OTUs, corresponding to 447 isolates, showed 16S rRNA sequence similarities above 99.0% with known cultured bacteria. However, 14 OTUs had 16S rRNA sequence similarities between 95 and 99% in support of genus-level identification and one OTU with 16S rRNA sequence similarity of 90.3% supporting a family-level identification only. The phenotypic identification formed 29 OTUs that could be identified to the species level and 9 OTUs that could be identified to the genus level by the API test system. We failed to obtain identification for 14 OTUs, while 32 OTUs comprised organisms producing mixed identifications. Forty-two OTUs covered species not included in the API system databases. The API test systems Rapid ID 32 Strep and Rapid ID 32 E showed the highest proportion of identifications to the species level, the lowest ratio of unidentified results and the highest agreement with the results of 16S rRNA assignments. Isolates affiliated to the Bacilli and Bacteroidia showed the highest antibiotic multi-resistance indices and microorganisms of the Clostridia displayed the most antibiotic-sensitive phenotypes. PMID:26512991

  13. Microbial Contaminants of Cord Blood Units Identified by 16S rRNA Sequencing and by API Test System, and Antibiotic Sensitivity Profiling.

    PubMed

    França, Luís; Simões, Catarina; Taborda, Marco; Diogo, Catarina; da Costa, Milton S

    2015-01-01

    Over a period of ten months a total of 5618 cord blood units (CBU) were screened for microbial contamination under routine conditions. The antibiotic resistance profile for all isolates was also examined using ATB strips. The detection rate for culture-positive units was 7.5%, corresponding to 422 samples. 16S rRNA sequence analysis and identification with the API test system were used to identify the culturable aerobic, microaerophilic and anaerobic bacteria from CBUs. From these samples we recovered 485 isolates (84 operational taxonomic units, OTUs) assigned to the classes Bacteroidia, Actinobacteria, Clostridia, Bacilli, Betaproteobacteria and primarily to the Gammaproteobacteria. Sixty-nine OTUs, corresponding to 447 isolates, showed 16S rRNA sequence similarities above 99.0% with known cultured bacteria. However, 14 OTUs had 16S rRNA sequence similarities between 95 and 99% in support of genus-level identification and one OTU with 16S rRNA sequence similarity of 90.3% supporting a family-level identification only. The phenotypic identification formed 29 OTUs that could be identified to the species level and 9 OTUs that could be identified to the genus level by the API test system. We failed to obtain identification for 14 OTUs, while 32 OTUs comprised organisms producing mixed identifications. Forty-two OTUs covered species not included in the API system databases. The API test systems Rapid ID 32 Strep and Rapid ID 32 E showed the highest proportion of identifications to the species level, the lowest ratio of unidentified results and the highest agreement with the results of 16S rRNA assignments. Isolates affiliated to the Bacilli and Bacteroidia showed the highest antibiotic multi-resistance indices and microorganisms of the Clostridia displayed the most antibiotic-sensitive phenotypes.

  14. Ecoregions of the conterminous United States: evolution of a hierarchical spatial framework

    USGS Publications Warehouse

    Omernik, James M.; Griffith, Glenn E.

    2014-01-01

    A map of ecological regions of the conterminous United States, first published in 1987, has been greatly refined and expanded into a hierarchical spatial framework in response to user needs, particularly by state resource management agencies. In collaboration with scientists and resource managers from numerous agencies and institutions in the United States, Mexico, and Canada, the framework has been expanded to cover North America, and the original ecoregions (now termed Level III) have been refined, subdivided, and aggregated to identify coarser as well as more detailed spatial units. The most generalized units (Level I) define 10 ecoregions in the conterminous U.S., while the finest-scale units (Level IV) identify 967 ecoregions. In this paper, we explain the logic underpinning the approach, discuss the evolution of the regional mapping process, and provide examples of how the ecoregions were distinguished at each hierarchical level. The variety of applications of the ecoregion framework illustrates its utility in resource assessment and management.

  15. Ecoregions of the Conterminous United States: Evolution of a Hierarchical Spatial Framework

    NASA Astrophysics Data System (ADS)

    Omernik, James M.; Griffith, Glenn E.

    2014-12-01

    A map of ecological regions of the conterminous United States, first published in 1987, has been greatly refined and expanded into a hierarchical spatial framework in response to user needs, particularly by state resource management agencies. In collaboration with scientists and resource managers from numerous agencies and institutions in the United States, Mexico, and Canada, the framework has been expanded to cover North America, and the original ecoregions (now termed Level III) have been refined, subdivided, and aggregated to identify coarser as well as more detailed spatial units. The most generalized units (Level I) define 10 ecoregions in the conterminous U.S., while the finest-scale units (Level IV) identify 967 ecoregions. In this paper, we explain the logic underpinning the approach, discuss the evolution of the regional mapping process, and provide examples of how the ecoregions were distinguished at each hierarchical level. The variety of applications of the ecoregion framework illustrates its utility in resource assessment and management.

  16. Numerical simulation of disperse particle flows on a graphics processing unit

    NASA Astrophysics Data System (ADS)

    Sierakowski, Adam J.

    In both nature and technology, we commonly encounter solid particles being carried within fluid flows, from dust storms to sediment erosion and from food processing to energy generation. The motion of uncountably many particles in highly dynamic flow environments characterizes the tremendous complexity of such phenomena. While methods exist for the full-scale numerical simulation of such systems, current computational capabilities require the simplification of the numerical task with significant approximation using closure models widely recognized as insufficient. There is therefore a fundamental need for the investigation of the underlying physical processes governing these disperse particle flows. In the present work, we develop a new tool based on the Physalis method for the first-principles numerical simulation of thousands of particles (a small fraction of an entire disperse particle flow system) in order to assist in the search for new reduced-order closure models. We discuss numerous enhancements to the efficiency and stability of the Physalis method, which introduces the influence of spherical particles to a fixed-grid incompressible Navier-Stokes flow solver using a local analytic solution to the flow equations. Our first-principles investigation demands the modeling of unresolved length and time scales associated with particle collisions. We introduce a collision model alongside Physalis, incorporating lubrication effects and proposing a new nonlinearly damped Hertzian contact model. By reproducing experimental studies from the literature, we document extensive validation of the methods. We discuss the implementation of our methods for massively parallel computation using a graphics processing unit (GPU). We combine Eulerian grid-based algorithms with Lagrangian particle-based algorithms to achieve computational throughput up to 90 times faster than the legacy implementation of Physalis for a single central processing unit. By avoiding all data

  17. ECO LOGIC INTERNATIONAL GAS-PHASE CHEMICAL REDUCTION PROCESS - THE THERMAL DESORPTION UNIT - APPLICATIONS ANALYSIS REPORT

    EPA Science Inventory

    ELI ECO Logic International, Inc.'s Thermal Desorption Unit (TDU) is specifically designed for use with Eco Logic's Gas Phase Chemical Reduction Process. The technology uses an externally heated bath of molten tin in a hydrogen atmosphere to desorb hazardous organic compounds fro...

  18. A numerical investigation of the scale-up effects on flow, heat transfer, and kinetics processes of FCC units.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, S. L.

    1998-08-25

    Fluid Catalytic Cracking (FCC) technology is the most important process used by the refinery industry to convert crude oil to valuable lighter products such as gasoline. Process development is generally very time consuming, especially when a small pilot unit is being scaled up to a large commercial unit, because of the lack of information to aid in the design of scaled-up units. Such information can now be obtained by analysis based on pilot-scale measurements and computer simulation that includes the controlling physics of the FCC system. A computational fluid dynamics (CFD) code, ICRKFLO, has been developed at Argonne National Laboratory (ANL) and has been successfully applied to the simulation of catalytic petroleum cracking risers. It employs hybrid hydrodynamic-chemical kinetic coupling techniques, enabling the analysis of an FCC unit with complex chemical reaction sets containing tens or hundreds of subspecies. The code has been continuously validated against pilot-scale experimental data. It is now being used to investigate scale-up effects in FCC units. Among FCC operating conditions, the feed injection conditions are found to have a strong impact on the product yields of scaled-up FCC units. The feed injection conditions appear to affect flow and heat transfer patterns, and the interaction of hydrodynamics and cracking kinetics causes the product yields to change accordingly.

  19. Towards the understanding of network information processing in biology

    NASA Astrophysics Data System (ADS)

    Singh, Vijay

    Living organisms perform incredibly well in detecting a signal present in the environment. This information processing is achieved near optimally and quite reliably, even though the sources of signals are highly variable and complex. The work in the last few decades has given us a fair understanding of how individual signal processing units like neurons and cell receptors process signals, but the principles of collective information processing on biological networks are far from clear. Information processing in biological networks, like the brain, metabolic circuits, cellular-signaling circuits, etc., involves complex interactions among a large number of units (neurons, receptors). The combinatorially large number of states such a system can exist in makes it impossible to study these systems from first principles, starting from the interactions between the basic units. The principles of collective information processing on such complex networks can be identified using coarse-graining approaches. This could provide insights into the organization and function of complex biological networks. Here I study models of biological networks using continuum dynamics, renormalization, maximum likelihood estimation and information theory. Such coarse-graining approaches identify features that are essential for certain processes performed by underlying biological networks. We find that long-range connections in the brain allow for global scale feature detection in a signal. These also suppress the noise and remove any gaps present in the signal. Hierarchical organization with long-range connections leads to large-scale connectivity at low synapse numbers. Time delays can be utilized to separate a mixture of signals with temporal scales. Our observations indicate that the rules in multivariate signal processing are quite different from traditional single unit signal processing.

  20. Real-time liquid-crystal atmosphere turbulence simulator with graphic processing unit.

    PubMed

    Hu, Lifa; Xuan, Li; Li, Dayu; Cao, Zhaoliang; Mu, Quanquan; Liu, Yonggang; Peng, Zenghui; Lu, Xinghai

    2009-04-27

    To generate time-evolving atmosphere turbulence in real time, a phase-generating method for our liquid-crystal (LC) atmosphere turbulence simulator (ATS) is derived based on the Fourier series (FS) method. A real matrix expression for generating turbulence phases is given and calculated with a graphic processing unit (GPU), the GeForce 8800 Ultra. A liquid crystal on silicon (LCOS) with 256x256 pixels is used as the turbulence simulator. The total time to generate a turbulence phase is about 7.8 ms for calculation and readout with the GPU. A parallel processing method of calculating and sending a picture to the LCOS is used to improve the simulating speed of our LC ATS. Therefore, the real-time turbulence phase-generation frequency of our LC ATS is up to 128 Hz. To our knowledge, it is the highest speed used to generate a turbulence phase in real time.
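
    The spectral idea behind such phase generation can be sketched with an FFT variant (the paper derives a Fourier-series formulation): filter white Gaussian noise by the square root of a Kolmogorov power spectrum and transform back. The normalization below is schematic, and the r0 value, grid size, and pixel pitch are invented.

    ```python
    # Hedged sketch: Kolmogorov-like phase screen via spectral filtering.
    import numpy as np

    def phase_screen(n=256, r0=0.1, dx=0.01, seed=0):
        """Random phase screen with an approximate Kolmogorov spectrum."""
        rng = np.random.default_rng(seed)
        fx = np.fft.fftfreq(n, d=dx)
        f = np.hypot(*np.meshgrid(fx, fx))
        f[0, 0] = 1.0                      # avoid division by zero at DC
        psd = 0.023 * r0 ** (-5.0 / 3.0) * f ** (-11.0 / 3.0)
        psd[0, 0] = 0.0                    # drop the piston (DC) term
        noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        df = 1.0 / (n * dx)
        # Normalization here is schematic, not radiometrically exact.
        return np.real(np.fft.ifft2(noise * np.sqrt(psd)) * n * n * df)

    phi = phase_screen()                   # one 256x256 turbulence phase
    ```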

  1. 40 CFR Table 8 to Subpart G of... - Organic HAP's Subject to the Wastewater Provisions for Process Units at New Sources

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Wastewater Provisions for Process Units at New Sources 8 Table 8 to Subpart G of Part 63 Protection of... Vessels, Transfer Operations, and Wastewater Pt. 63, Subpt. G, Table 8 Table 8 to Subpart G of Part 63—Organic HAP's Subject to the Wastewater Provisions for Process Units at New Sources Chemical name CAS No...

  2. 40 CFR Table 8 to Subpart G of... - Organic HAP's Subject to the Wastewater Provisions for Process Units at New Sources

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Wastewater Provisions for Process Units at New Sources 8 Table 8 to Subpart G of Part 63 Protection of... Vessels, Transfer Operations, and Wastewater Pt. 63, Subpt. G, Table 8 Table 8 to Subpart G of Part 63—Organic HAP's Subject to the Wastewater Provisions for Process Units at New Sources Chemical name CAS No...

  3. 40 CFR Table 8 to Subpart G of... - Organic HAP's Subject to the Wastewater Provisions for Process Units at New Sources

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Wastewater Provisions for Process Units at New Sources 8 Table 8 to Subpart G of Part 63 Protection of... Vessels, Transfer Operations, and Wastewater Pt. 63, Subpt. G, Table 8 Table 8 to Subpart G of Part 63—Organic HAP's Subject to the Wastewater Provisions for Process Units at New Sources Chemical name CAS No...

  4. Observations on the Use of SCAN To Identify Children at Risk for Central Auditory Processing Disorder.

    ERIC Educational Resources Information Center

    Emerson, Maria F.; And Others

    1997-01-01

    The SCAN: A Screening Test for Auditory Processing Disorders was administered to 14 elementary children with a history of otitis media and 14 typical children, to evaluate the validity of the test in identifying children with central auditory processing disorder. Another experiment found that test results differed based on the testing environment…

  5. 25 CFR 170.501 - What happens when the review process identifies areas for improvement?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false What happens when the review process identifies areas for improvement? 170.501 Section 170.501 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER INDIAN RESERVATION ROADS PROGRAM Planning, Design, and Construction of Indian Reservation Roads...

  6. Influence of coatings on the thermal and mechanical processes at insulating glass units

    NASA Astrophysics Data System (ADS)

    Penkova, Nina; Krumov, Kalin; Surleva, Andriana; Geshkova, Zlatka

    2017-09-01

    Different coatings on structural glass are used in advanced transparent facades and window systems in order to increase the thermal performance of the glass units and to regulate their optical properties. Coated glass has a higher absorptance in the solar spectrum, which leads to a correspondingly higher temperature under solar load compared to uncoated glass. That process results in higher climatic loads on the insulating glass units (IGU) and in thermal stresses in the coated glass elements. Temperature fields and gradients in the glass panes and climatic loads on the IGU in window systems are estimated for different coatings of the glazed system. The study is implemented by numerical simulation of conjugate heat transfer in the window systems for summer daytime conditions with solar irradiation, as well as for winter night-time conditions.

  7. Reducing intraoperative red blood cell unit wastage in a large academic medical center.

    PubMed

    Whitney, Gina M; Woods, Marcella C; France, Daniel J; Austin, Thomas M; Deegan, Robert J; Paroskie, Allison; Booth, Garrett S; Young, Pampee P; Dmochowski, Roger R; Sandberg, Warren S; Pilla, Michael A

    2015-11-01

    The wastage of red blood cell (RBC) units within the operative setting results in significant direct costs to health care organizations. Previous education-based efforts to reduce wastage were unsuccessful at our institution. We hypothesized that a quality and process improvement approach would result in sustained reductions in intraoperative RBC wastage in a large academic medical center. Utilizing a failure mode and effects analysis supplemented with time and temperature data, key drivers of perioperative RBC wastage were identified and targeted for process improvement. Multiple contributing factors, including improper storage and transport and lack of accurate, locally relevant RBC wastage event data were identified as significant contributors to ongoing intraoperative RBC unit wastage. Testing and implementation of improvements to the process of transport and storage of RBC units occurred in liver transplant and adult cardiac surgical areas due to their history of disproportionately high RBC wastage rates. Process interventions targeting local drivers of RBC wastage resulted in a significant reduction in RBC wastage (p < 0.0001; adjusted odds ratio, 0.24; 95% confidence interval, 0.15-0.39), despite an increase in operative case volume over the period of the study. Studied process interventions were then introduced incrementally in the remainder of the perioperative areas. These results show that a multidisciplinary team focused on the process of blood product ordering, transport, and storage was able to significantly reduce operative RBC wastage and its associated costs using quality and process improvement methods. © 2015 AABB.

  8. Process engineering design of pathological waste incinerator with an integrated combustion gases treatment unit.

    PubMed

    Shaaban, A F

    2007-06-25

    Management of medical wastes generated at different hospitals in Egypt is considered a highly serious problem. The sources and quantities of regulated medical wastes have been thoroughly surveyed and estimated (75 t/day from governmental hospitals in Cairo). From the collected data it was concluded that the most appropriate incinerator capacity is 150 kg/h. The objective of this work is to develop the process engineering design of an integrated unit, which is technically and economically capable of incinerating medical wastes and treating the combustion gases. Such a unit consists of (i) an incineration unit (INC-1) having an operating temperature of 1100 degrees C at 300% excess air, (ii) a combustion-gases cooler (HE-1) generating 35 m³/h of hot water at 75 degrees C, (iii) a dust filter (DF-1) capable of reducing particulates to 10-20 mg/Nm³, (iv) gas scrubbers (GS-1,2) for removing acidic gases, (v) a multi-tube fixed-bed catalytic converter (CC-1) to maintain the level of dioxins and furans below 0.1 ng/Nm³, and (vi) an induced-draft suction fan system (SF-1) that can handle 6500 Nm³/h at 250 degrees C. The residence time of combustion gases in the ignition, mixing and combustion chambers was found to be 2 s, 0.25 s and 0.75 s, respectively. This will ensure both thorough homogenization of combustion gases and complete destruction of harmful constituents of the refuse. The adequate engineering design of individual process equipment results in competitive fixed and operating investments. The incineration unit has proved its high operating efficiency through measurements of different pollutant levels vented to the open atmosphere, which were found to be in conformity with the maximum allowable limits as specified in law number 4/1994 issued by the Egyptian Environmental Affairs Agency (EEAA) and the European standards.

  9. Towards simplification of hydrologic modeling: Identification of dominant processes

    USGS Publications Warehouse

    Markstrom, Steven; Hay, Lauren E.; Clark, Martyn P.

    2016-01-01

    The Precipitation–Runoff Modeling System (PRMS), a distributed-parameter hydrologic model, has been applied to the conterminous US (CONUS). Parameter sensitivity analysis was used to identify: (1) the sensitive input parameters and (2) particular model output variables that could be associated with the dominant hydrologic process(es). Sensitivity values of 35 PRMS calibration parameters were computed using the Fourier amplitude sensitivity test procedure on 110 000 independent hydrologically based spatial modeling units covering the CONUS and then summarized to process (snowmelt, surface runoff, infiltration, soil moisture, evapotranspiration, interflow, baseflow, and runoff) and model performance statistic (mean, coefficient of variation, and autoregressive lag 1). Identified parameters and processes provide insight into model performance at the location of each unit and allow the modeler to identify the most dominant process on the basis of which processes are associated with the most sensitive parameters. The results of this study indicate that: (1) the choice of performance statistic and output variables has a strong influence on parameter sensitivity, (2) the apparent model complexity to the modeler can be reduced by focusing on those processes that are associated with sensitive parameters and disregarding those that are not, (3) different processes require different numbers of parameters for simulation, and (4) some sensitive parameters influence only one hydrologic process, while others may influence many
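
    The parameter-screening step can be sketched with SALib's implementation of the Fourier amplitude sensitivity test on a toy model; the three parameter names below are invented stand-ins, not the actual 35 PRMS calibration parameters.

    ```python
    # Hedged sketch: FAST sensitivity indices for a toy hydrologic response.
    import numpy as np
    from SALib.sample import fast_sampler
    from SALib.analyze import fast

    problem = {
        'num_vars': 3,
        'names': ['snow_melt_rate', 'soil_capacity', 'baseflow_coeff'],
        'bounds': [[0.0, 1.0], [0.0, 1.0], [0.0, 1.0]],
    }
    X = fast_sampler.sample(problem, 1000)
    # Toy response: strongly nonlinear in the first parameter.
    Y = X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.1 * np.sin(2 * np.pi * X[:, 2])
    Si = fast.analyze(problem, Y)          # first- and total-order indices
    print(dict(zip(problem['names'], np.round(Si['S1'], 3))))
    ```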

  10. Assessing pine regeneration for the South Central United States

    Treesearch

    William H. McWilliams

    1990-01-01

    Poor regeneration of pine following harvest on nonindustrial timberland has been identified as a major cause for loss of pine forests and slowdown of softwood growth in the Southern United States. Developing a strategy for regeneration assessment requires clear definition of sampling objectives, sampling design, and analytical processes. It is important that...

  11. Computational methods using genome-wide association studies to predict radiotherapy complications and to identify correlative molecular processes

    NASA Astrophysics Data System (ADS)

    Oh, Jung Hun; Kerns, Sarah; Ostrer, Harry; Powell, Simon N.; Rosenstein, Barry; Deasy, Joseph O.

    2017-02-01

    The biological cause of clinically observed variability of normal tissue damage following radiotherapy is poorly understood. We hypothesized that machine/statistical learning methods using single nucleotide polymorphism (SNP)-based genome-wide association studies (GWAS) would identify groups of patients of differing complication risk, and furthermore could be used to identify key biological sources of variability. We developed a novel learning algorithm, called pre-conditioned random forest regression (PRFR), to construct polygenic risk models using hundreds of SNPs, thereby capturing genomic features that confer small differential risk. Predictive models were trained and validated on a cohort of 368 prostate cancer patients for two post-radiotherapy clinical endpoints: late rectal bleeding and erectile dysfunction. The proposed method results in better predictive performance compared with existing computational methods. Gene ontology enrichment analysis and protein-protein interaction network analysis are used to identify key biological processes and proteins that were plausible based on other published studies. In conclusion, we confirm that novel machine learning methods can produce large predictive models (hundreds of SNPs), yielding clinically useful risk stratification models, as well as identifying important underlying biological processes in the radiation damage and tissue repair process. The methods are generally applicable to GWAS data and are not specific to radiotherapy endpoints.
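
    The two-stage pattern, univariate pre-selection of SNPs followed by a forest-based polygenic model, can be sketched as below. This is a simplified stand-in for the authors' pre-conditioned random forest regression, with simulated genotypes and an invented signal.

    ```python
    # Hedged sketch: pre-select SNPs by univariate p-value, then fit a
    # random forest on the reduced genotype matrix (simulated data).
    import numpy as np
    from scipy import stats
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X = rng.integers(0, 3, size=(368, 2000)).astype(float)  # SNP dosages
    y = 0.5 * X[:, 10] - 0.3 * X[:, 42] + rng.normal(0.0, 1.0, 368)

    # Stage 1: rank SNPs by univariate association, keep the top 200.
    p = np.array([stats.pearsonr(X[:, j], y)[1] for j in range(X.shape[1])])
    keep = np.argsort(p)[:200]

    # Stage 2: polygenic model on the pre-selected SNPs.
    rf = RandomForestRegressor(n_estimators=300, random_state=0)
    rf.fit(X[:, keep], y)
    print(rf.score(X[:, keep], y))         # in-sample fit, for illustration
    ```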

  12. General Purpose Graphics Processing Unit Based High-Rate Rice Decompression and Reed-Solomon Decoding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loughry, Thomas A.

    As the volume of data acquired by space-based sensors increases, mission data compression/decompression and forward error correction code processing performance must likewise scale. This competency development effort was explored using the General Purpose Graphics Processing Unit (GPGPU) to accomplish high-rate Rice Decompression and high-rate Reed-Solomon (RS) decoding at the satellite mission ground station. Each algorithm was implemented and benchmarked on a single GPGPU. Distributed processing across one to four GPGPUs was also investigated. The results show that the GPGPU has considerable potential for performing satellite communication Data Signal Processing, with three times or better performance improvements and up to ten times reduction in cost over custom hardware, at least in the case of Rice Decompression and Reed-Solomon Decoding.

  13. Note: Quasi-real-time analysis of dynamic near field scattering data using a graphics processing unit

    NASA Astrophysics Data System (ADS)

    Cerchiari, G.; Croccolo, F.; Cardinaux, F.; Scheffold, F.

    2012-10-01

    We present an implementation of the analysis of dynamic near field scattering (NFS) data using a graphics processing unit. We introduce an optimized data management scheme thereby limiting the number of operations required. Overall, we reduce the processing time from hours to minutes, for typical experimental conditions. Previously the limiting step in such experiments, the processing time is now comparable to the data acquisition time. Our approach is applicable to various dynamic NFS methods, including shadowgraph, Schlieren and differential dynamic microscopy.

  14. High-throughput sequence alignment using Graphics Processing Units

    PubMed Central

    Schatz, Michael C; Trapnell, Cole; Delcher, Arthur L; Varshney, Amitabh

    2007-01-01

    Background The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion MUMmerGPU is a low cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU. PMID:18070356

  15. Gravitational tree-code on graphics processing units: implementation in CUDA

    NASA Astrophysics Data System (ADS)

    Gaburov, Evghenii; Bédorf, Jeroen; Portegies Zwart, Simon

    2010-05-01

    We present a new very fast tree-code which runs on massively parallel Graphical Processing Units (GPU) with NVIDIA CUDA architecture. The tree construction and calculation of multipole moments is carried out on the host CPU, while the force calculation, which consists of tree walks and evaluation of interaction lists, is carried out on the GPU. In this way we achieve a sustained performance of about 100 GFLOP/s and data transfer rates of about 50 GB/s. It takes about a second to compute forces on a million particles with an opening angle of θ ≈ 0.5. The code has a convenient user interface and is freely available for use. http://castle.strw.leidenuniv.nl/software/octgrav.html

  16. A novel mini-DNA barcoding assay to identify processed fins from internationally protected shark species.

    PubMed

    Fields, Andrew T; Abercrombie, Debra L; Eng, Rowena; Feldheim, Kevin; Chapman, Demian D

    2015-01-01

    There is a growing need to identify shark products in trade, in part due to the recent listing of five commercially important species on the Appendices of the Convention on International Trade in Endangered Species (CITES; porbeagle, Lamna nasus, oceanic whitetip, Carcharhinus longimanus, scalloped hammerhead, Sphyrna lewini, smooth hammerhead, S. zygaena, and great hammerhead, S. mokarran) in addition to three species listed in the early part of this century (whale, Rhincodon typus, basking, Cetorhinus maximus, and white, Carcharodon carcharias). Shark fins are traded internationally to supply the Asian dried seafood market, in which they are used to make the luxury dish shark fin soup. Shark fins usually enter international trade with their skin still intact and can be identified using morphological characters or standard DNA-barcoding approaches. Once they reach Asia and are traded in this region the skin is removed and they are treated with chemicals that eliminate many key diagnostic characters and degrade their DNA ("processed fins"). Here, we present a validated mini-barcode assay based on partial sequences of the cytochrome oxidase I gene that can reliably identify the processed fins of seven of the eight CITES listed shark species. We also demonstrate that the assay can even frequently identify the species or genus of origin of shark fin soup (31 out of 50 samples).
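
    The basic assignment logic of a mini-barcode assay can be sketched as a percent-identity match of a short COI fragment against references. The sequences below are invented placeholders, not the validated primers or reference barcodes of the paper.

    ```python
    # Hedged sketch: assign a short sequence read to the closest reference
    # by percent identity, with a simple confidence threshold.
    def identity(a, b):
        """Fraction of matching positions between equal-length sequences."""
        return sum(x == y for x, y in zip(a, b)) / len(a)

    references = {                      # hypothetical partial COI fragments
        'Sphyrna lewini':  'ACCTGGAGCATCAATCAGCC',
        'Lamna nasus':     'ACTTGGTGCATCTATCAGCC',
        'Rhincodon typus': 'ACCTGGAGCTTCAGTTAGCC',
    }

    query = 'ACCTGGAGCATCAATCAGCC'      # hypothetical degraded-fin read
    best = max(references, key=lambda sp: identity(query, references[sp]))
    score = identity(query, references[best])
    print((best, score) if score >= 0.95 else ('no confident assignment', score))
    ```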

  17. A Novel Mini-DNA Barcoding Assay to Identify Processed Fins from Internationally Protected Shark Species

    PubMed Central

    Fields, Andrew T.; Abercrombie, Debra L.; Eng, Rowena; Feldheim, Kevin; Chapman, Demian D.

    2015-01-01

    There is a growing need to identify shark products in trade, in part due to the recent listing of five commercially important species on the Appendices of the Convention on International Trade in Endangered Species (CITES; porbeagle, Lamna nasus, oceanic whitetip, Carcharhinus longimanus, scalloped hammerhead, Sphyrna lewini, smooth hammerhead, S. zygaena, and great hammerhead, S. mokarran) in addition to three species listed in the early part of this century (whale, Rhincodon typus, basking, Cetorhinus maximus, and white, Carcharodon carcharias). Shark fins are traded internationally to supply the Asian dried seafood market, in which they are used to make the luxury dish shark fin soup. Shark fins usually enter international trade with their skin still intact and can be identified using morphological characters or standard DNA-barcoding approaches. Once they reach Asia and are traded in this region the skin is removed and they are treated with chemicals that eliminate many key diagnostic characters and degrade their DNA (“processed fins”). Here, we present a validated mini-barcode assay based on partial sequences of the cytochrome oxidase I gene that can reliably identify the processed fins of seven of the eight CITES listed shark species. We also demonstrate that the assay can even frequently identify the species or genus of origin of shark fin soup (31 out of 50 samples). PMID:25646789

  18. Improving patient care by making small sustainable changes: a cardiac telemetry unit's experience.

    PubMed

    Braaten, Jane S; Bellhouse, Dorothy E

    2007-01-01

    With the introduction of each new drug, technology, and regulation, the processes of care become more complicated, creating an elaborate set of procedures connecting various hospital units and departments. Using methods of Adaptive Design and the Toyota Production System, a nursing unit redesigned work systems to achieve sustainable improvements in productivity, staff and patient satisfaction, and quality outcomes. The first hurdle of redesign was identifying problems, to which staff had become so accustomed with various work arounds that they had trouble seeing the process bottlenecks. Once the staff identified problems, they assumed they could solve the problem because they assumed they knew the causes. Utilizing root cause analysis, asking, "why, why, why," was essential to unearthing the true cause of a problem. Similarly, identifying solutions that were simple and low cost was an essential step in problem solving. Adopting new procedures and sustaining the commitment to identify and signal problems was a last and critical step toward realizing improvement, requiring a manager to function as "teacher/coach" rather than "fixer/firefighter".

  19. 78 FR 18234 - Service of Process on Manufacturers; Manufacturers Importing Electronic Products Into the United...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration 21 CFR Part 1005 [Docket No. FDA-2007-N-0091; (formerly 2007N-0104)] Service of Process on Manufacturers; Manufacturers Importing Electronic Products Into the United States; Agent Designation; Change of Address AGENCY: Food and Drug...

  20. A mobile unit for memory retrieval in daily life based on image and sensor processing

    NASA Astrophysics Data System (ADS)

    Takesumi, Ryuji; Ueda, Yasuhiro; Nakanishi, Hidenobu; Nakamura, Atsuyoshi; Kakimori, Nobuaki

    2003-10-01

    We developed a mobile unit whose purpose is to support memory retrieval in daily life. In this paper, we describe the two characteristic features of this unit: (1) behavior classification with an acceleration sensor, and (2) extraction of environmental differences with image processing technology. In (1), by analyzing the power and frequency of an acceleration sensor aligned with the direction of gravity, the user's activities can be classified into walking, staying, and so on. In (2), by extracting the difference between the beginning and ending scenes of a stay with image processing, the change made by the user is recognized as a difference in the environment. Using these two techniques, specific scenes of daily life can be extracted, and important information at scene changes can be recorded. In particular, we describe the effectiveness of the unit for retrieving important things, such as an item left behind or the state of work finished halfway.
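
    The walk/stay classification in (1) can be sketched as follows: compute the signal power and dominant frequency of the gravity-axis acceleration over a short window and apply simple thresholds. The sampling rate and threshold values are invented for illustration.

    ```python
    # Hedged sketch: threshold-based activity classification from the
    # power and peak frequency of a gravity-axis acceleration window.
    import numpy as np

    def classify_window(acc, fs=50.0):
        """acc: 1-D gravity-axis acceleration samples for one window."""
        detrended = acc - acc.mean()
        power = np.mean(detrended ** 2)
        spectrum = np.abs(np.fft.rfft(detrended))
        freqs = np.fft.rfftfreq(acc.size, d=1.0 / fs)
        peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
        if power < 0.01:
            return 'stay'
        return 'walk' if 1.0 <= peak <= 3.0 else 'other'

    t = np.arange(0, 2, 1 / 50.0)                   # two-second window
    print(classify_window(1.0 + 0.3 * np.sin(2 * np.pi * 2.0 * t)))  # 'walk'
    ```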

  1. Google matrix of business process management

    NASA Astrophysics Data System (ADS)

    Abel, M. W.; Shepelyansky, D. L.

    2011-12-01

    Development of efficient business process models and determination of their characteristic properties are subject of intense interdisciplinary research. Here, we consider a business process model as a directed graph. Its nodes correspond to the units identified by the modeler and the link direction indicates the causal dependencies between units. It is of primary interest to obtain the stationary flow on such a directed graph, which corresponds to the steady-state of a firm during the business process. Following the ideas developed recently for the World Wide Web, we construct the Google matrix for our business process model and analyze its spectral properties. The importance of nodes is characterized by PageRank and recently proposed CheiRank and 2DRank, respectively. The results show that this two-dimensional ranking gives a significant information about the influence and communication properties of business model units. We argue that the Google matrix method, described here, provides a new efficient tool helping companies to make their decisions on how to evolve in the exceedingly dynamic global market.
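
    The two-dimensional ranking described above can be sketched with networkx: PageRank on the process graph gives the incoming-flow importance, and CheiRank is obtained here as PageRank on the reversed graph. The miniature process model is invented.

    ```python
    # Hedged sketch: PageRank and CheiRank of a toy business process graph.
    import networkx as nx

    G = nx.DiGraph([('order', 'credit_check'), ('credit_check', 'fulfilment'),
                    ('fulfilment', 'invoice'), ('invoice', 'order'),
                    ('credit_check', 'reject')])

    pagerank = nx.pagerank(G, alpha=0.85)            # importance (inflow)
    cheirank = nx.pagerank(G.reverse(), alpha=0.85)  # communicativity (outflow)
    for node in G:
        print(f'{node:12s} K={pagerank[node]:.3f} K*={cheirank[node]:.3f}')
    ```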

  2. Analysis of possible designs of processing units with radial plasma flows

    NASA Astrophysics Data System (ADS)

    Kolesnik, V. V.; Zaitsev, S. V.; Vashilin, V. S.; Limarenko, M. V.; Prochorenkov, D. S.

    2018-03-01

    Analysis of plasma-ion methods for obtaining thin-film coatings shows that their development follows the increasing use of sputter deposition processes, which make it possible to obtain multicomponent coatings with a varying percentage of particular components. One method that allows multicomponent coatings of virtually any elemental composition to be formed is coating deposition using quasi-magnetron sputtering systems [1]. This requires the creation of an axial magnetic field of a defined configuration with a flux density in the range of 0.01-0.1 T [2]. In order to compare and analyze various configurations of processing-unit magnetic systems, it is necessary to obtain the following dependencies: the dependence of the magnetic core cross-section on the input power to the inductors, the distribution of magnetic induction in the equatorial plane of the corresponding sections, and the distribution of the magnetic induction in the region of the cathode target.

  3. Impact of memory bottleneck on the performance of graphics processing units

    NASA Astrophysics Data System (ADS)

    Son, Dong Oh; Choi, Hong Jun; Kim, Jong Myon; Kim, Cheol Hong

    2015-12-01

    Recent graphics processing units (GPUs) can process general-purpose applications as well as graphics applications with the help of various user-friendly application programming interfaces (APIs) supported by GPU vendors. Unfortunately, utilizing the hardware resources in the GPU efficiently is a challenging problem, since the GPU architecture is totally different from the traditional CPU architecture. To solve this problem, many studies have focused on techniques for improving system performance using GPUs. In this work, we analyze GPU performance while varying GPU parameters such as the number of cores and the clock frequency. According to our simulations, GPU performance can be improved by 125.8% and 16.2% on average as the number of cores and the clock frequency increase, respectively. However, performance saturates when memory bottleneck problems occur due to the huge volume of data requests to the memory. The performance of GPUs can be improved further as the memory bottleneck is reduced by changing GPU parameters dynamically.
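
    The saturation effect described here can be illustrated with a crude roofline-style model: effective throughput grows with cores × clock until the memory system caps it. The sketch below is a deliberately simplified illustration with invented constants, not the simulator used in the study.

    ```python
    # Roofline-style toy model: throughput is the minimum of the compute
    # roof (scales with cores and clock) and the memory roof (fixed by
    # bandwidth). All constants are invented for illustration.
    def throughput(cores, clock_ghz, mem_bw_gbs=192.0, flops_per_byte=2.0):
        compute_roof = cores * clock_ghz * 2        # 2 ops/cycle/core, assumed
        memory_roof = mem_bw_gbs * flops_per_byte   # bandwidth-limited GFLOP/s
        return min(compute_roof, memory_roof)

    # Adding cores stops helping once the memory roof is reached.
    for cores in (128, 256, 512, 1024, 2048):
        print(f"{cores:5d} cores -> {throughput(cores, 1.0):7.1f} GFLOP/s")
    ```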

  4. Prototype design of singles processing unit for the small animal PET

    NASA Astrophysics Data System (ADS)

    Deng, P.; Zhao, L.; Lu, J.; Li, B.; Dong, R.; Liu, S.; An, Q.

    2018-05-01

    Positron Emission Tomography (PET) is an advanced clinical diagnostic imaging technique in nuclear medicine. Small-animal PET is increasingly used for studying animal models of disease, new drugs and new therapies. A prototype Singles Processing Unit (SPU) for a small animal PET system was designed to obtain the time, energy, and position information. The energy and position are actually calculated through high-precision charge measurement, which is based on amplification, shaping, A/D conversion and area calculation in the digital signal processing domain. Analysis and simulations were also conducted to optimize the key parameters in the system design. Initial tests indicate that the charge and time precision are better than 3‰ FWHM and 350 ps FWHM respectively, while the position resolution is better than 3.5‰ FWHM. Combined tests of the SPU prototype with the PET detector indicate that the system time precision is better than 2.5 ns, while the flood map and energy spectra agreed well with expectations.

  5. Identifying counties vulnerable to diabetes from obesity prevalence in the United States: a spatiotemporal analysis.

    PubMed

    Li, Xiao; Staudt, Amanda; Chien, Lung-Chang

    2016-11-21

    Clinical and epidemiological research has reported a strong association between diabetes and obesity. However, whether increased diabetes prevalence is more likely to appear in areas with increased obesity prevalence has not been thoroughly investigated in the United States (US). The Bayesian structured additive regression model was applied to identify whether counties with higher obesity prevalence are more likely to be clustered in specific regions of the 48 contiguous US states. Prevalence data were the small-area estimates from the Behavioral Risk Factor Surveillance System. Data for confounding variables, such as socioeconomic status, were taken from the American Community Survey. This study reveals that an increased relative risk of diabetes was more likely to appear in the Southeast, Northeast, Central and South regions. Of counties vulnerable to diabetes, 36.8% had low obesity prevalence, and most of them were located in the Southeast, Central, and South regions. The geographic distribution of counties vulnerable to diabetes expanded to the Southwest, West and Northern regions as obesity prevalence increased. This study also discloses that 7.4% of counties had the largest average predicted diabetes prevalence compared to the other counties. Their average diabetes prevalence escalated from 8.7% in 2004 to 11.2% in 2011. This study not only identifies counties vulnerable to diabetes due to obesity, but also distinguishes counties in terms of different levels of vulnerability to diabetes. The findings provide the possibility of establishing targeted surveillance systems to raise awareness of diabetes in those counties.

  6. Clocking the social mind by identifying mental processes in the IAT with electrical neuroimaging

    PubMed Central

    Schiller, Bastian; Gianotti, Lorena R. R.; Baumgartner, Thomas; Nash, Kyle; Koenig, Thomas; Knoch, Daria

    2016-01-01

    Why do people take longer to associate the word “love” with outgroup words (incongruent condition) than with ingroup words (congruent condition)? Despite the widespread use of the implicit association test (IAT), it has remained unclear whether this IAT effect is due to additional mental processes in the incongruent condition, or due to longer duration of the same processes. Here, we addressed this previously insoluble issue by assessing the spatiotemporal evolution of brain electrical activity in 83 participants. From stimulus presentation until response production, we identified seven processes. Crucially, all seven processes occurred in the same temporal sequence in both conditions, but participants needed more time to perform one early occurring process (perceptual processing) and one late occurring process (implementing cognitive control to select the motor response) in the incongruent compared with the congruent condition. We also found that the latter process contributed to individual differences in implicit bias. These results advance understanding of the neural mechanics of response time differences in the IAT: They speak against theories that explain the IAT effect as due to additional processes in the incongruent condition and speak in favor of theories that assume a longer duration of specific processes in the incongruent condition. More broadly, our data analysis approach illustrates the potential of electrical neuroimaging to illuminate the temporal organization of mental processes involved in social cognition. PMID:26903643

  7. Clocking the social mind by identifying mental processes in the IAT with electrical neuroimaging.

    PubMed

    Schiller, Bastian; Gianotti, Lorena R R; Baumgartner, Thomas; Nash, Kyle; Koenig, Thomas; Knoch, Daria

    2016-03-08

    Why do people take longer to associate the word "love" with outgroup words (incongruent condition) than with ingroup words (congruent condition)? Despite the widespread use of the implicit association test (IAT), it has remained unclear whether this IAT effect is due to additional mental processes in the incongruent condition, or due to longer duration of the same processes. Here, we addressed this previously insoluble issue by assessing the spatiotemporal evolution of brain electrical activity in 83 participants. From stimulus presentation until response production, we identified seven processes. Crucially, all seven processes occurred in the same temporal sequence in both conditions, but participants needed more time to perform one early occurring process (perceptual processing) and one late occurring process (implementing cognitive control to select the motor response) in the incongruent compared with the congruent condition. We also found that the latter process contributed to individual differences in implicit bias. These results advance understanding of the neural mechanics of response time differences in the IAT: They speak against theories that explain the IAT effect as due to additional processes in the incongruent condition and speak in favor of theories that assume a longer duration of specific processes in the incongruent condition. More broadly, our data analysis approach illustrates the potential of electrical neuroimaging to illuminate the temporal organization of mental processes involved in social cognition.

  8. Identifying critical success factors for designing selection processes into postgraduate specialty training: the case of UK general practice.

    PubMed

    Plint, Simon; Patterson, Fiona

    2010-06-01

    The UK national recruitment process into general practice training has been developed over several years, with incremental introduction of stages which have been piloted and validated. Previously independent processes, which encouraged multiple applications and produced inconsistent outcomes, have been replaced by a robust national process which has high reliability and predictive validity, and is perceived to be fair by candidates and allocates applicants equitably across the country. Best selection practice involves a job analysis which identifies required competencies, then designs reliable assessment methods to measure them, and over the long term ensures that the process has predictive validity against future performance. The general practitioner recruitment process introduced machine markable short listing assessments for the first time in the UK postgraduate recruitment context, and also adopted selection centre workplace simulations. The key success factors have been identified as corporate commitment to the goal of a national process, with gradual convergence maintaining locus of control rather than the imposition of change without perceived legitimate authority.

  9. Extreme Environment Capable, Modular and Scalable Power Processing Unit for Solar Electric Propulsion

    NASA Technical Reports Server (NTRS)

    Carr, Gregory A.; Iannello, Christopher J.; Chen, Yuan; Hunter, Don J.; DelCastillo, Linda; Bradley, Arthur T.; Stell, Christopher; Mojarradi, Mohammad M.

    2013-01-01

    This paper presents a concept for a modular and scalable High Temperature Boost (HTB) Power Processing Unit (PPU) capable of operating at temperatures beyond the standard military temperature range. The various extreme-environment technologies are also described as the fundamental technology path to this concept. The proposed HTB PPU is intended for power processing in the area of space solar electric propulsion, where reduction of in-space mass and volume is desired, and sometimes even critical, to achieve the goals of future space flight missions. The concept of the HTB PPU can also be applied to other extreme-environment applications, such as geothermal and petroleum deep-well drilling, where higher-temperature operation is required.

  10. Extreme Environment Capable, Modular and Scalable Power Processing Unit for Solar Electric Propulsion

    NASA Technical Reports Server (NTRS)

    Carr, Gregory A.; Iannello, Christopher J.; Chen, Yuan; Hunter, Don J.; Del Castillo, Linda; Bradley, Arthur T.; Stell, Christopher; Mojarradi, Mohammad M.

    2013-01-01

    This paper presents a concept for a modular and scalable High Temperature Boost (HTB) Power Processing Unit (PPU) capable of operating at temperatures beyond the standard military temperature range. The various extreme-environment technologies are also described as the fundamental technology path to this concept. The proposed HTB PPU is intended for power processing in the area of space solar electric propulsion, where the reduction of in-space mass and volume is desired, and sometimes even critical, to achieve the goals of future space flight missions. The concept of the HTB PPU can also be applied to other extreme-environment applications, such as geothermal and petroleum deep-well drilling, where higher-temperature operation is required.

  11. Model of a programmable quantum processing unit based on a quantum transistor effect

    NASA Astrophysics Data System (ADS)

    Ablayev, Farid; Andrianov, Sergey; Fetisov, Danila; Moiseev, Sergey; Terentyev, Alexandr; Urmanchev, Andrey; Vasiliev, Alexander

    2018-02-01

    In this paper we propose a model of a programmable quantum processing device realizable with existing nano-photonic technologies. It can be viewed as a basis for new high-performance hardware architectures. Protocols for the physical implementation of the device, based on controlled photon transfer and atomic transitions, are presented. These protocols are designed for executing basic single-qubit and multi-qubit gates forming a universal set. We analyze the possible operation of this quantum computer scheme. We then formalize the physical architecture by a mathematical model of a Quantum Processing Unit (QPU), which we use as a basis for the Quantum Programming Framework. This framework makes it possible to perform universal quantum computations in a multitasking environment.
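
    As a concrete picture of what a gate-model QPU executes, the sketch below simulates a universal gate set (H, T, CNOT) acting on a state vector with plain NumPy. This is generic gate-model simulation under our own simplifying assumptions, not the authors' photonic protocol.

    ```python
    import numpy as np

    # Single-qubit gates of a universal set {H, T, CNOT}.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    T = np.diag([1, np.exp(1j * np.pi / 4)])

    def apply_1q(state, gate, qubit, n):
        """Apply a single-qubit gate to `qubit` of an n-qubit state vector."""
        psi = np.moveaxis(state.reshape([2] * n), qubit, 0)
        psi = np.tensordot(gate, psi, axes=([1], [0]))
        return np.moveaxis(psi, 0, qubit).reshape(-1)

    def apply_cnot(state, control, target, n):
        """Apply CNOT by flipping the target axis on the control=|1> slab."""
        psi = np.moveaxis(state.reshape([2] * n).copy(), (control, target), (0, 1))
        psi[1] = psi[1][::-1].copy()
        return np.moveaxis(psi, (0, 1), (control, target)).reshape(-1)

    n = 2
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0                          # start in |00>
    state = apply_1q(state, H, 0, n)        # Hadamard on qubit 0
    state = apply_cnot(state, 0, 1, n)      # entangle into a Bell state
    print(np.round(state, 3))               # [0.707, 0, 0, 0.707]
    ```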

  12. Influence of unit operations on the levels of polyacetylenes in minimally processed carrots and parsnips: An industrial trial.

    PubMed

    Koidis, Anastasios; Rawson, Ashish; Tuohy, Maria; Brunton, Nigel

    2012-06-01

    Carrots and parsnips are often consumed as minimally processed, ready-to-eat convenience foods and contain, in minor quantities, bioactive aliphatic C17-polyacetylenes (falcarinol, falcarindiol, falcarindiol-3-acetate). Their retention during minimal processing was evaluated in an industrial trial. Carrots and parsnips were prepared in four different forms (disc cutting, baton cutting, cubing and shredding), and samples were taken at every point of the processing line. The unit operations were peeling, cutting and washing with chlorinated water; retention during 7 days of storage was also evaluated. The results showed that the initial unit operations (mainly peeling) influence polyacetylene retention. This was attributed to the high polyacetylene content of the peels. In most cases, when washing was performed after cutting, lower retention was observed, possibly due to leakage from the tissue damage that occurred in the cutting step. The relatively high retention during storage indicates high plant-matrix stability. Comparing the behaviour of polyacetylenes in the two vegetables during storage, the results showed that they were slightly better retained in parsnips than in carrots. Unit operations, and especially abrasive peeling, may need further optimisation to make them gentler and minimise bioactive losses. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. Report: Inconsistencies With EPA Policy Identified in Region 10's Biweekly Pay Cap Waiver Process

    EPA Pesticide Factsheets

    Report #18-P-0068, January 12, 2018. We identified issues with documentation and review of biweekly pay cap waivers at Region 10, resulting from a lack of an internal policy or process. Region 10 recently issued a new procedure that addresses our concerns.

  14. A Study Identifying and Validating Competencies Needed for Mid-Managers That Work in Housing and Residence Life at Colleges and Universities in the United States of America

    ERIC Educational Resources Information Center

    Morrison, Hassel Andre

    2016-01-01

    The researcher identified a gap in the knowledge of competencies needed for mid-managers who work in housing and residence life at southeast colleges and universities in the United States. The purpose of this study was to identify and develop a consensus on the competencies needed by mid-managers. The review of the literature describes and…

  15. The Narrative-Emotion Process Coding System 2.0: A multi-methodological approach to identifying and assessing narrative-emotion process markers in psychotherapy.

    PubMed

    Angus, Lynne E; Boritz, Tali; Bryntwick, Emily; Carpenter, Naomi; Macaulay, Christianne; Khattra, Jasmine

    2017-05-01

    Recent studies suggest that it is not simply the expression of emotion or emotional arousal in session that is important, but rather the reflective processing of emergent, adaptive emotions, arising in the context of personal storytelling and/or Emotion-Focused Therapy (EFT) interventions, that is associated with change. To enhance narrative-emotion integration specifically in EFT, Angus and Greenberg originally identified a set of eight clinically derived narrative-emotion integration markers for the implementation of process-guiding therapeutic responses. Further evaluation and testing by the Angus Narrative-Emotion Marker Lab resulted in the identification of 10 empirically validated Narrative-Emotion Process (N-EP) markers that are included in the Narrative-Emotion Process Coding System Version 2.0 (NEPCS 2.0). Based on empirical research findings, individual markers are clustered into Problem (e.g., stuckness in repetitive story patterns, over-controlled or dysregulated emotion, lack of reflectivity), Transition (e.g., reflective, access to adaptive emotions and new emotional plotlines, heightened narrative and emotion integration), and Change (e.g., new story outcomes and self-narrative discovery, and co-construction and re-conceptualization) subgroups. To date, research using the NEPCS 2.0 has investigated the proportion and pattern of narrative-emotion markers in Emotion-Focused, Client-Centered, and Cognitive Therapy for Major Depression, Motivational Interviewing plus Cognitive Behavioral Therapy for Generalized Anxiety Disorder, and EFT for Complex Trauma. Results have consistently identified significantly higher proportions of N-EP Transition and Change markers, and productive shifts, in mid- and late-phase sessions, for clients who achieved recovery by treatment termination. Recovery is consistently associated with client storytelling that is emotionally engaged, reflective, and evidencing new story outcomes and self

  16. Implementing evidence in an onco-haematology nursing unit: a process of change using participatory action research.

    PubMed

    Abad-Corpa, Eva; Delgado-Hito, Pilar; Cabrero-García, Julio; Meseguer-Liza, Cristobal; Zárate-Riscal, Carmen Lourdes; Carrillo-Alcaraz, Andrés; Martínez-Corbalán, José Tomás; Caravaca-Hernández, Amor

    2013-03-01

    To implement evidence in a nursing unit and to gain a better understanding of the experience of change within a participatory action research. A participatory action research design was used, from the constructivist paradigm. The analytical-methodological decisions were inspired by Checkland Flexible Systems for evidence implementation in the nursing unit. The study was carried out between March and November 2007 in the isolation unit section for onco-haematological patients in a tertiary-level general university hospital in Spain. Accidental sampling was carried out with the participation of six nurses. Data were collected through five group meetings and individual reflections in participants' diaries. Participant observation was also carried out by the researchers. Data analysis was carried out by content analysis. Rigour criteria were applied: credibility, confirmability, dependence, transferability and reflexivity. A lack of use of evidence in clinical practice was the main problem. The factors involved were identified (training, values, beliefs, resources and professional autonomy). Daily practice (complexity in taking decisions, variability, lack of professional autonomy and safety) was compared with an ideal situation (using evidence, it would be possible to normalise practice and to work more effectively in teams by increasing safety and professional recognition). It was decided to create five working areas on several clinical topics (mucositis, pain, anxiety, satisfaction, nutritional assessment, nausea and vomiting, pressure ulcers and catheter-related problems), and seven changes in clinical practice were agreed upon together with 11 implementation strategies. Some reflections were made about the features of the study: the changes produced; the strategies used and how to improve them; the nursing 'subculture'; attitudes towards innovation; and the commitment as participants in the study and as healthcare professionals. The

  17. Life Science, A Process Approach, Second Edition: Revised, 1970.

    ERIC Educational Resources Information Center

    Phare, Wayne; And Others

    Seventeen scientific processes are identified and annotated; some suggestions for activities to demonstrate them are given. These processes are used as headings in the teacher's guide to succeeding units on biological classifications, microbiology, physiology of plants and animals, chick embryology and ecology. Similar headings are usually used in…

  18. Integration Process for the Habitat Demonstration Unit

    NASA Technical Reports Server (NTRS)

    Gill, Tracy; Merbitz, Jerad; Kennedy, Kriss; Tri, Terry; Howe, A. Scott

    2010-01-01

    The Habitat Demonstration Unit (HDU) is an experimental exploration habitat technology and architecture test platform designed for analog demonstration activities. The HDU project has required a team to integrate a variety of contributions from NASA centers and outside collaborators, and poses a challenge in integrating these disparate efforts into a cohesive architecture. To complete the development of the HDU from conception in June 2009 to rollout for operations in July 2010, a cohesive integration strategy has been developed to integrate the various systems of HDU and the payloads, such as the Geology Lab, that those systems will support. The utilization of interface design standards and uniquely tailored reviews has allowed for an accelerated design process. Scheduled activities include early fit-checks and the utilization of a Habitat avionics test bed prior to equipment installation into HDU. A coordinated effort to utilize modeling and simulation systems has aided in design and integration concept development. Modeling tools have been effective in hardware systems layout, cable routing and length estimation, and human factors analysis. Decision processes on the shell development, including the assembly sequence and the transportation, have been fleshed out early on HDU to maximize the efficiency of both integration and field operations. Incremental test operations leading up to an integrated systems test allow for an orderly systems test program. The HDU will begin its journey as an emulation of a Pressurized Excursion Module (PEM) for 2010 field testing and then may evolve to a Pressurized Core Module (PCM) for 2011 and later field tests, depending on agency architecture decisions. The HDU deployment will vary slightly from current lunar architecture plans to include developmental hardware and software items and additional systems called opportunities for technology demonstration. One of the HDU challenges has been designing to be prepared for the integration of

  19. PO*WW*ER mobile treatment unit process hazards analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richardson, R.B.

    1996-06-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented PO*WW*ER mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat aqueous mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses evaporation to separate organics and water from radionuclides and solids, and catalytic oxidation to convert the hazardous components into byproducts. This process hazards analysis evaluated a number of accident scenarios not directly related to the operation of the MTU, such as natural phenomena damage and mishandling of chemical containers. Worst-case accident scenarios were further evaluated to determine the risk potential to the MTU and to workers, the public, and the environment. The overall risk to any group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  20. Identifying sediment sources in the sediment TMDL process

    USGS Publications Warehouse

    Gellis, Allen C.; Fitzpatrick, Faith A.; Schubauer-Berigan, Joseph P.; Landy, R.B.; Gorman Sanisaca, Lillian E.

    2015-01-01

    Sediment is an important pollutant contributing to aquatic-habitat degradation in many waterways of the United States. This paper discusses the application of sediment budgets in conjunction with sediment fingerprinting as tools to determine the sources of sediment in impaired waterways. These approaches complement monitoring, assessment, and modeling of sediment erosion, transport, and storage in watersheds. Combining the sediment fingerprinting and sediment budget approaches can help determine specific adaptive management plans and techniques applied to targeting hot spots or areas of high erosion.
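
    The fingerprinting step typically ends in an unmixing problem: find non-negative source fractions, summing to one, whose tracer signature best matches the target sediment sample. A minimal sketch with invented tracer values is given below; real applications use many more tracers plus uncertainty analysis.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Sketch of sediment-fingerprinting unmixing: estimate the fraction each
    # source contributes to a target sample from tracer concentrations.
    # All tracer values and source names are invented for illustration.
    sources = np.array([   # rows: tracers; cols: (cropland, bank, road)
        [12.0, 30.0, 5.0],
        [0.8,  0.2,  1.5],
        [40.0, 90.0, 20.0],
    ])
    target = np.array([22.0, 0.55, 67.0])  # tracer concentrations in sample

    # Enforce "fractions sum to one" with a heavily weighted extra row.
    w = 1e3
    A = np.vstack([sources, w * np.ones(3)])
    b = np.append(target, w)
    fractions, _ = nnls(A, b)              # non-negative least squares
    print(fractions.round(3), fractions.sum().round(3))
    ```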

  1. Mendel-GPU: haplotyping and genotype imputation on graphics processing units

    PubMed Central

    Chen, Gary K.; Wang, Kai; Stram, Alex H.; Sobel, Eric M.; Lange, Kenneth

    2012-01-01

    Motivation: In modern sequencing studies, one can improve the confidence of genotype calls by phasing haplotypes using information from an external reference panel of fully typed unrelated individuals. However, the computational demands are so high that they prohibit researchers with limited computational resources from haplotyping large-scale sequence data. Results: Our graphics processing unit based software delivers haplotyping and imputation accuracies comparable to competing programs at a fraction of the computational cost and peak memory demand. Availability: Mendel-GPU, our OpenCL software, runs on Linux platforms and is portable across AMD and nVidia GPUs. Users can download both code and documentation at http://code.google.com/p/mendel-gpu/. Contact: gary.k.chen@usc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22954633

  2. Reducing intraoperative red blood cell unit wastage in a large academic medical center

    PubMed Central

    Whitney, Gina M.; Woods, Marcella C.; France, Daniel J.; Austin, Thomas M.; Deegan, Robert J.; Paroskie, Allison; Booth, Garrett S.; Young, Pampee P.; Dmochowski, Roger R.; Sandberg, Warren S.; Pilla, Michael A.

    2015-01-01

    BACKGROUND The wastage of red blood cell (RBC) units within the operative setting results in significant direct costs to health care organizations. Previous education-based efforts to reduce wastage were unsuccessful at our institution. We hypothesized that a quality and process improvement approach would result in sustained reductions in intraoperative RBC wastage in a large academic medical center. STUDY DESIGN AND METHODS Utilizing a failure mode and effects analysis supplemented with time and temperature data, key drivers of perioperative RBC wastage were identified and targeted for process improvement. RESULTS Multiple contributing factors, including improper storage and transport and lack of accurate, locally relevant RBC wastage event data were identified as significant contributors to ongoing intraoperative RBC unit wastage. Testing and implementation of improvements to the process of transport and storage of RBC units occurred in liver transplant and adult cardiac surgical areas due to their history of disproportionately high RBC wastage rates. Process interventions targeting local drivers of RBC wastage resulted in a significant reduction in RBC wastage (p <0.0001; adjusted odds ratio, 0.24; 95% confidence interval, 0.15–0.39), despite an increase in operative case volume over the period of the study. Studied process interventions were then introduced incrementally in the remainder of the perioperative areas. CONCLUSIONS These results show that a multidisciplinary team focused on the process of blood product ordering, transport, and storage was able to significantly reduce operative RBC wastage and its associated costs using quality and process improvement methods. PMID:26202213

  3. Cardiorespiratory instability in monitored step-down unit patients: using cluster analysis to identify patterns of change

    PubMed Central

    Clermont, Gilles; Chen, Lujie; Dubrawski, Artur W.; Ren, Dianxu; Hoffman, Leslie A.; Pinsky, Michael R.; Hravnak, Marilyn

    2018-01-01

    Cardiorespiratory instability (CRI) in monitored step-down unit (SDU) patients has a variety of etiologies, and likely manifests in patterns of vital signs (VS) changes. We explored use of clustering techniques to identify patterns in the initial CRI epoch (CRI1; first exceedances of VS beyond stability thresholds after SDU admission) of unstable patients, and inter-cluster differences in admission characteristics and outcomes. Continuous noninvasive monitoring of heart rate (HR), respiratory rate (RR), and pulse oximetry (SpO2) were sampled at 1/20 Hz. We identified CRI1 in 165 patients, employed hierarchical and k-means clustering, tested several clustering solutions, used 10-fold cross validation to establish the best solution and assessed inter-cluster differences in admission characteristics and outcomes. Three clusters (C) were derived: C1) normal/high HR and RR, normal SpO2 (n = 30); C2) normal HR and RR, low SpO2 (n = 103); and C3) low/normal HR, low RR and normal SpO2 (n = 32). Clusters were significantly different based on age (p < 0.001; older patients in C2), number of comorbidities (p = 0.008; more C2 patients had ≥ 2) and hospital length of stay (p = 0.006; C1 patients stayed longer). There were no between-cluster differences in SDU length of stay, or mortality. Three different clusters of VS presentations for CRI1 were identified. Clusters varied on age, number of comorbidities and hospital length of stay. Future study is needed to determine if there are common physiologic underpinnings of VS clusters which might inform clinical decision-making when CRI first manifests. PMID:28229353
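
    A minimal sketch of the clustering step is shown below, using synthetic HR/RR/SpO2 values shaped loosely like the three reported clusters; the study's actual pipeline (hierarchical plus k-means with 10-fold cross-validation) is richer than this.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # Synthetic vital-sign features (HR bpm, RR breaths/min, SpO2 %),
    # loosely mimicking the three reported cluster profiles.
    rng = np.random.default_rng(0)
    X = np.vstack([
        rng.normal([105, 24, 97], [8, 3, 1], (30, 3)),   # high HR/RR, normal SpO2
        rng.normal([80, 16, 88], [8, 3, 2], (103, 3)),   # normal HR/RR, low SpO2
        rng.normal([62, 9, 96], [6, 2, 1], (32, 3)),     # low HR/RR, normal SpO2
    ])

    Xz = StandardScaler().fit_transform(X)   # z-score so units are comparable
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(Xz)
    for k in range(3):                       # per-cluster mean vital signs
        print(k, X[km.labels_ == k].mean(axis=0).round(1))
    ```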

  4. Graphics Processing Unit Acceleration of Gyrokinetic Turbulence Simulations

    NASA Astrophysics Data System (ADS)

    Hause, Benjamin; Parker, Scott; Chen, Yang

    2013-10-01

    We find a substantial increase in on-node performance using Graphics Processing Unit (GPU) acceleration in gyrokinetic delta-f particle-in-cell simulation. Optimization is performed on a two-dimensional slab gyrokinetic particle simulation using the Portland Group Fortran compiler with OpenACC compiler directives and CUDA Fortran. A mixed implementation of both OpenACC and CUDA is demonstrated; CUDA is required for optimizing the particle deposition algorithm. We have implemented the GPU acceleration on a third-generation Core i7 gaming PC with two NVIDIA GTX 680 GPUs. We find comparable, or better, acceleration relative to the NERSC DIRAC cluster with the NVIDIA Tesla C2050 computing processor. The Tesla C2050 is about 2.6 times more expensive than the GTX 580 gaming GPU. We also see enormous speedups (10 or more) on the Titan supercomputer at Oak Ridge with Kepler K20 GPUs. Results show speed-ups comparable to or better than those of OpenMP models utilizing multiple cores. The use of hybrid OpenACC, CUDA Fortran, and MPI models across many nodes will also be discussed. Optimization strategies will be presented. We will discuss progress on optimizing the comprehensive three-dimensional general-geometry GEM code.

  5. Ultra-processed food consumption in children from a Basic Health Unit.

    PubMed

    Sparrenberger, Karen; Friedrich, Roberta Roggia; Schiffner, Mariana Dihl; Schuch, Ilaine; Wagner, Mário Bernardes

    2015-01-01

    To evaluate the contribution of ultra-processed food (UPF) on the dietary consumption of children treated at a Basic Health Unit and the associated factors. Cross-sectional study carried out with a convenience sample of 204 children, aged 2-10 years old, in Southern Brazil. Children's food intake was assessed using a 24-h recall questionnaire. Food items were classified as minimally processed, processed for culinary use, and ultra-processed. A semi-structured questionnaire was applied to collect socio-demographic and anthropometric variables. Overweight in children was classified using a Z score >2 for children younger than 5 and Z score >+1 for those aged between 5 and 10 years, using the body mass index for age. Overweight frequency was 34% (95% CI: 28-41%). Mean energy consumption was 1672.3 kcal/day, with 47% (95% CI: 45-49%) coming from ultra-processed food. In the multiple linear regression model, maternal education (r=0.23; p=0.001) and child age (r=0.40; p<0.001) were factors associated with a greater percentage of UPF in the diet (r=0.42; p<0.001). Additionally, a statistically significant trend for higher UPF consumption was observed when data were stratified by child age and maternal educational level (p<0.001). The contribution of UPF is significant in children's diets and age appears to be an important factor for the consumption of such products. Copyright © 2015 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.

  6. Magnetoencephalographic Signals Identify Stages in Real-Life Decision Processes

    PubMed Central

    Braeutigam, Sven; Stins, John F.; Rose, Steven P. R.; Swithenby, Stephen J.; Ambler, Tim

    2001-01-01

    We used magnetoencephalography (MEG) to study the dynamics of neural responses in eight subjects engaged in shopping for day-to-day items from supermarket shelves. This behavior not only has personal and economic importance but also provides an example of an experience that is both personal and shared between individuals. The shopping experience enables the exploration of neural mechanisms underlying choice based on complex memories. Choosing among different brands of closely related products activated a robust sequence of signals within the first second after the presentation of the choice images. This sequence engaged first the visual cortex (80-100 ms), then, as the images were analyzed, predominantly the left temporal regions (310-340 ms). At longer latency, characteristic neural activation was found in motor speech areas (500-520 ms) for images requiring low salience choices with respect to previous (brand) memory, and in right parietal cortex for high salience choices (850-920 ms). We argue that the neural processes associated with the particular brand-choice stimulus can be separated into identifiable stages through observation of MEG responses and knowledge of functional anatomy. PMID:12018772

  7. Using natural language processing to identify problem usage of prescription opioids.

    PubMed

    Carrell, David S; Cronkite, David; Palmer, Roy E; Saunders, Kathleen; Gross, David E; Masters, Elizabeth T; Hylan, Timothy R; Von Korff, Michael

    2015-12-01

    Accurate and scalable surveillance methods are critical to understand widespread problems associated with misuse and abuse of prescription opioids and for implementing effective prevention and control measures. Traditional diagnostic coding incompletely documents problem use. Relevant information for each patient is often obscured in vast amounts of clinical text. We developed and evaluated a method that combines natural language processing (NLP) and computer-assisted manual review of clinical notes to identify evidence of problem opioid use in electronic health records (EHRs). We used the EHR data and text of 22,142 patients receiving chronic opioid therapy (≥70 days' supply of opioids per calendar quarter) during 2006-2012 to develop and evaluate an NLP-based surveillance method and compare it to traditional methods based on International Classification of Diseases, Ninth Revision (ICD-9) codes. We developed a 1288-term dictionary for clinician mentions of opioid addiction, abuse, misuse or overuse, and an NLP system to identify these mentions in unstructured text. The system distinguished affirmative mentions from those that were negated or otherwise qualified. We applied this system to the 7,336,445 electronic chart notes of the 22,142 patients. Trained abstractors using a custom computer-assisted software interface manually reviewed 7751 chart notes (from 3156 patients) selected by the NLP system and classified each note as to whether or not it contained textual evidence of problem opioid use. Traditional diagnostic codes for problem opioid use were found for 2240 (10.1%) patients. NLP-assisted manual review identified an additional 728 (3.1%) patients with evidence of clinically diagnosed problem opioid use in clinical notes. Inter-rater reliability among pairs of abstractors reviewing notes was high, with kappa=0.86 and 97% agreement for one pair, and kappa=0.71 and 88% agreement for another pair. Scalable, semi-automated NLP methods can efficiently and
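
    The core NLP idea, dictionary matching with negation handling, can be sketched in a few lines. The toy dictionary, negation cue list, and context window below are illustrative assumptions; the study's system used 1288 terms and a more sophisticated qualifier model.

    ```python
    import re

    # Toy dictionary-based mention detection with a simple negation window.
    DICTIONARY = ["opioid abuse", "opioid misuse", "opioid addiction"]
    NEGATIONS = {"no", "denies", "without"}

    def find_mentions(note, window=3):
        """Return (term, status) pairs for dictionary hits in a chart note."""
        text = note.lower()
        hits = []
        for term in DICTIONARY:
            for m in re.finditer(re.escape(term), text):
                # Look a few tokens back for a negation cue.
                preceding = text[:m.start()].split()[-window:]
                negated = any(tok.strip(".,;:") in NEGATIONS for tok in preceding)
                hits.append((term, "negated" if negated else "affirmed"))
        return hits

    print(find_mentions("Patient denies opioid misuse; history of opioid abuse."))
    # [('opioid abuse', 'affirmed'), ('opioid misuse', 'negated')]
    ```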

  8. Optimized mobile retroreflectivity unit data processing algorithms.

    DOT National Transportation Integrated Search

    2017-04-01

    The University of North Florida, in collaboration with the FDOT, was tasked to establish precise line-stripe evaluation methods using the Mobile Retroreflectivity Unit (MRU). Initial implementation of the manufacturer's software resulted in measure...

  9. Dynamic motif occupancy (DynaMO) analysis identifies transcription factors and their binding sites driving dynamic biological processes

    PubMed Central

    Kuang, Zheng; Ji, Zhicheng

    2018-01-01

    Biological processes are usually associated with genome-wide remodeling of transcription driven by transcription factors (TFs). Identifying key TFs and their spatiotemporal binding patterns is indispensable to understanding how dynamic processes are programmed. However, most methods are designed to predict TF binding sites only. We present a computational method, dynamic motif occupancy analysis (DynaMO), to infer important TFs and their spatiotemporal binding activities in dynamic biological processes using chromatin profiling data from multiple biological conditions, such as time-course histone modification ChIP-seq data. In the first step, DynaMO predicts TF binding sites with a random forests approach. Next and uniquely, DynaMO infers dynamic TF binding activities at predicted binding sites using their local chromatin profiles from multiple biological conditions. Another distinctive feature of DynaMO is the identification of key TFs in a dynamic process using a clustering and enrichment analysis of dynamic TF binding patterns. Application of DynaMO to the yeast ultradian cycle, the mouse circadian clock and human neural differentiation exhibits its accuracy and versatility. We anticipate DynaMO will be generally useful for elucidating transcriptional programs in dynamic processes. PMID:29325176
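
    The first DynaMO step can be approximated as follows: train a random forest to score candidate motif sites from local chromatin features. The sketch below uses synthetic features and labels purely for illustration; it is not the authors' implementation.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Synthetic stand-ins: each row is a candidate motif site, each column a
    # local chromatin feature (e.g. a histone-mark ChIP-seq signal bin).
    rng = np.random.default_rng(1)
    n_sites = 1000
    X = rng.normal(size=(n_sites, 6))
    w = np.array([1.5, -0.8, 0.0, 2.0, 0.0, 0.5])
    y = (X @ w + rng.normal(scale=0.5, size=n_sites)) > 0  # synthetic labels

    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    print("feature importances:", clf.feature_importances_.round(2))
    print("P(bound) for one candidate:", clf.predict_proba(X[:1])[0, 1])
    ```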

  10. Conceptual design of distillation-based hybrid separation processes.

    PubMed

    Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang

    2013-01-01

    Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.

  11. The fundamental units, processes and patterns of evolution, and the Tree of Life conundrum

    PubMed Central

    Koonin, Eugene V; Wolf, Yuri I

    2009-01-01

    Background: The elucidation of the dominant role of horizontal gene transfer (HGT) in the evolution of prokaryotes led to a severe crisis of the Tree of Life (TOL) concept and intense debates on this subject. Concept: Prompted by the crisis of the TOL, we attempt to define the primary units and the fundamental patterns and processes of evolution. We posit that replication of the genetic material is the singular fundamental biological process and that replication with an error rate below a certain threshold both enables and necessitates evolution by drift and selection. Starting from this proposition, we outline a general concept of evolution that consists of three major precepts. 1. The primary agency of evolution consists of Fundamental Units of Evolution (FUEs), that is, units of genetic material that possess a substantial degree of evolutionary independence. The FUEs include both bona fide selfish elements such as viruses, viroids, transposons, and plasmids, which encode some of the information required for their own replication, and regular genes that possess quasi-independence owing to their distinct selective value that provides for their transfer between ensembles of FUEs (genomes) and preferential replication along with the rest of the recipient genome. 2. The history of replication of a genetic element without recombination is isomorphously represented by a directed tree graph (an arborescence, in the graph theory language). Recombination within a FUE is common between very closely related sequences where homologous recombination is feasible but becomes negligible for longer evolutionary distances. In contrast, shuffling of FUEs occurs at all evolutionary distances. Thus, a tree is a natural representation of the evolution of an individual FUE on the macro scale, but not of an ensemble of FUEs such as a genome. 3. The history of life is properly represented by the "forest" of evolutionary trees for individual FUEs (Forest of Life, or FOL). Search for trends

  12. Accelerating NBODY6 with graphics processing units

    NASA Astrophysics Data System (ADS)

    Nitadori, Keigo; Aarseth, Sverre J.

    2012-07-01

    We describe the use of graphics processing units (GPUs) for speeding up the code NBODY6 which is widely used for direct N-body simulations. Over the years, the N² nature of the direct force calculation has proved a barrier for extending the particle number. Following an early introduction of force polynomials and individual time steps, the calculation cost was first reduced by the introduction of a neighbour scheme. After a decade of GRAPE computers which speeded up the force calculation further, we are now in the era of GPUs where relatively small hardware systems are highly cost effective. A significant gain in efficiency is achieved by employing the GPU to obtain the so-called regular force which typically involves some 99 per cent of the particles, while the remaining local forces are evaluated on the host. However, the latter operation is performed up to 20 times more frequently and may still account for a significant cost. This effort is reduced by parallel SSE/AVX procedures where each interaction term is calculated using mainly single precision. We also discuss further strategies connected with coordinate and velocity prediction required by the integration scheme. This leaves hard binaries and multiple close encounters which are treated by several regularization methods. The present NBODY6-GPU code is well balanced for simulations in the particle range 10⁴-2 × 10⁵ for a dual-GPU system attached to a standard PC.
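
    The regular-force kernel at the heart of this discussion is just direct summation over all pairs. A minimal NumPy sketch (softened gravity with G = 1; all parameters invented) is shown below; NBODY6 adds individual time steps, the neighbour scheme, and regularization on top of this O(N²) core.

    ```python
    import numpy as np

    def accelerations(pos, mass, eps=1e-3):
        """Direct-summation gravitational acceleration on every particle."""
        d = pos[None, :, :] - pos[:, None, :]          # pairwise separations
        r2 = (d ** 2).sum(-1) + eps ** 2               # softened distances
        np.fill_diagonal(r2, np.inf)                   # no self-interaction
        return (mass[None, :, None] * d / r2[:, :, None] ** 1.5).sum(axis=1)

    rng = np.random.default_rng(0)
    pos = rng.normal(size=(1000, 3))      # random particle positions
    mass = np.full(1000, 1.0 / 1000)      # equal masses, total mass 1
    acc = accelerations(pos, mass)
    print(acc.shape)                      # (1000, 3)
    ```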

  13. Compliance of clinical microbiology laboratories in the United States with current recommendations for processing respiratory tract specimens from patients with cystic fibrosis.

    PubMed

    Zhou, Juyan; Garber, Elizabeth; Desai, Manisha; Saiman, Lisa

    2006-04-01

    Respiratory tract specimens from patients with cystic fibrosis (CF) require unique processing by clinical microbiology laboratories to ensure detection of all potential pathogens. The present study sought to determine the compliance of microbiology laboratories in the United States with recently published recommendations for CF respiratory specimens. Microbiology laboratory protocols from 150 of 190 (79%) CF care sites were reviewed. Most described the use of selective media for Burkholderia cepacia complex (99%), Staphylococcus aureus (82%), and Haemophilus influenzae (89%) and identified the species of all gram-negative bacilli (87%). Only 52% delineated the use of agar diffusion assays for susceptibility testing of Pseudomonas aeruginosa. Standardizing laboratory practices will improve treatment, infection control, and our understanding of the changing epidemiology of CF microbiology.

  14. Effect of medium on friction and wear properties of compacted graphite cast iron processed by biomimetic coupling laser remelting process

    NASA Astrophysics Data System (ADS)

    Guo, Qing-chun; Zhou, Hong; Wang, Cheng-tao; Zhang, Wei; Lin, Peng-yu; Sun, Na; Ren, Luquan

    2009-04-01

    Stimulated by the cuticles of soil animals, an attempt was made to improve the wear resistance of compacted graphite cast iron (CGI) with biomimetic units on the surface by using a biomimetic coupled laser remelting process in air and under water films of various thicknesses, respectively. The microstructures of the biomimetic units were examined by scanning electron microscopy, and X-ray diffraction was used to describe the microstructure and identify the phases in the melted zone. Microhardness was measured, and the wear behaviors of biomimetic specimens as functions of the different mediums and the various water-film thicknesses were investigated under dry sliding conditions. The results indicated that the microstructures of the biomimetic units processed with a water film were refined compared with those processed in air and showed better wear resistance, increased by 60%; the microhardness of the biomimetic units was improved significantly. The application of a water film provided finer microstructures and a much more regular grain shape in the biomimetic units, which played a key role in improving the friction properties and wear resistance of CGI.

  15. Employing OpenCL to Accelerate Ab Initio Calculations on Graphics Processing Units.

    PubMed

    Kussmann, Jörg; Ochsenfeld, Christian

    2017-06-13

    We present an extension of our graphics processing units (GPU)-accelerated quantum chemistry package to employ OpenCL compute kernels, which can be executed on a wide range of computing devices like CPUs, Intel Xeon Phi, and AMD GPUs. Here, we focus on the use of AMD GPUs and discuss differences as compared to CUDA-based calculations on NVIDIA GPUs. First illustrative timings are presented for hybrid density functional theory calculations using serial as well as parallel compute environments. The results show that AMD GPUs are as fast or faster than comparable NVIDIA GPUs and provide a viable alternative for quantum chemical applications.
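
    For orientation, a minimal PyOpenCL example is sketched below: the same kernel source compiles and runs on AMD or NVIDIA GPUs (or CPUs) through whichever OpenCL driver is present, which is the portability point the paper exploits. This is a generic vector-add demo, not code from the package discussed.

    ```python
    import numpy as np
    import pyopencl as cl

    a = np.arange(16, dtype=np.float32)
    b = np.arange(16, dtype=np.float32)

    ctx = cl.create_some_context()        # picks an available OpenCL device
    queue = cl.CommandQueue(ctx)
    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    # The kernel source is plain OpenCL C, portable across vendors.
    prg = cl.Program(ctx, """
    __kernel void add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
        int i = get_global_id(0);
        out[i] = a[i] + b[i];
    }
    """).build()

    prg.add(queue, a.shape, None, a_buf, b_buf, out)
    res = np.empty_like(a)
    cl.enqueue_copy(queue, res, out)
    print(res)
    ```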

  16. Concentrations of polychlorinated dibenzo-p-dioxins in processed ball clay from the United States.

    PubMed

    Ferrario, Joseph; Byrne, Christian; Schaum, John

    2007-04-01

    Processed ball clays commonly used by the ceramic art industry in the United States were collected from retail suppliers and analyzed for the presence and concentration of the 2,3,7,8-Cl substituted polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDDs/PCDFs). The average PCDD toxic equivalent (TEQ) concentrations of these processed ball clays was approximately 800 pg/g (TEQ-WHO) with characteristic congener profiles and isomer distributions similar to patterns of previously analyzed raw and processed ball clays. The PCDF concentrations were below the average limit of detection (LOD) of 0.5 pg/g. Correlation analyses reveal no significant relationship between total organic carbon (TOC) and either individual, homologues, and total tetra-through octa-chlorinated PCDD congeners, or TEQ concentrations of the processed ball clays. The results are consistent with earlier studies on levels of PCDDs in ball clays. Data from earlier studies indicated that dioxins may be released to the environment during the processing of raw clay or the firing process used in commercial ceramic facilities. The presence of dioxin in the clays also raises concerns about potential occupational exposure for individuals involved in the mining/processing of ball clay, ceramics manufacturing and ceramic artwork.

  17. Real-time processing for full-range Fourier-domain optical-coherence tomography with zero-filling interpolation using multiple graphic processing units.

    PubMed

    Watanabe, Yuuki; Maeno, Seiya; Aoshima, Kenji; Hasegawa, Haruyuki; Koseki, Hitoshi

    2010-09-01

    The real-time display of full-range, 2048 axial pixel × 1024 lateral pixel, Fourier-domain optical-coherence tomography (FD-OCT) images is demonstrated. The required speed was achieved by using dual graphic processing units (GPUs) with many stream processors to realize highly parallel processing. We used a zero-filling technique, including a forward Fourier transform, zero padding to increase the axial data-array size to 8192, an inverse Fourier transform back to the spectral domain, a linear interpolation from wavelength to wavenumber, a lateral Hilbert transform to obtain the complex spectrum, a Fourier transform to obtain the axial profiles, and log scaling. The data-transfer time of the frame grabber was 15.73 ms, and the processing time, which includes the data transfer between the GPU memory and the host computer, was 14.75 ms, for a total time shorter than the 36.70 ms frame-interval time using a line-scan CCD camera operated at 27.9 kHz. That is, our OCT system achieved a processed-image display rate of 27.23 frames/s.
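
    The processing chain maps naturally onto array operations. The sketch below walks through the described steps (forward FFT, zero padding to 8192, inverse FFT, wavelength-to-wavenumber interpolation, lateral Hilbert transform, axial FFT, log scaling) in NumPy on synthetic data; the wavelength range and the random input frame are assumptions, and the real system performs this on GPUs.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    # Sizes follow the paper (1024 lateral x 2048 axial, padded to 8192);
    # the source band and input frame are invented for illustration.
    lateral, axial, padded = 1024, 2048, 8192
    spectra = np.random.rand(lateral, axial)              # interferogram frame
    wl = np.linspace(800e-9, 880e-9, axial)               # assumed source band

    z = np.fft.fft(spectra, axis=1)                       # forward FFT
    z = np.pad(z, ((0, 0), (0, padded - axial)))          # zero pad to 8192
    dense = np.fft.ifft(z, axis=1).real                   # back to spectral domain
    k_dense = 2 * np.pi / np.linspace(wl[0], wl[-1], padded)  # dense lambda grid -> k
    k_lin = np.linspace(k_dense[-1], k_dense[0], padded)  # uniform-k grid
    resampled = np.array([np.interp(k_lin, k_dense[::-1], row[::-1])
                          for row in dense])              # wavelength -> wavenumber
    analytic = hilbert(resampled, axis=0)                 # lateral Hilbert transform
    ascans = np.fft.fft(analytic, axis=1)                 # axial profiles
    image = 20 * np.log10(np.abs(ascans) + 1e-12)         # log scaling
    print(image.shape)                                    # (1024, 8192)
    ```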

  18. Multi-Unit Considerations for Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    St. Germain, S.; Boring, R.; Banaseanu, G.

    This paper uses the insights from the Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) methodology to help identify human actions currently modeled in the single-unit PSA that may need to be modified to account for additional challenges imposed by a multi-unit accident, as well as to identify possible new human actions that might be modeled to more accurately characterize multi-unit risk. In identifying these potential human action impacts, the use of the SPAR-H strategy to include both errors in diagnosis and errors in action is considered, as well as identifying characteristics of a multi-unit accident scenario that may impact the selection of the performance shaping factors (PSFs) used in SPAR-H. The lessons learned from the Fukushima Daiichi reactor accident will be addressed to further help identify areas where improved modeling may be required. While these multi-unit impacts may require modifications to a Level 1 PSA model, it is expected to have much more importance for Level 2 modeling. There is little currently written specifically about multi-unit HRA issues. A review of related published research will be presented. While this paper cannot answer all issues related to multi-unit HRA, it will hopefully serve as a starting point to generate discussion and spark additional ideas towards the proper treatment of HRA in a multi-unit PSA.

  19. Low cost solar array project: Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1981-01-01

    This phase consists of the engineering design, fabrication, assembly, operation, economic analysis, and process support R&D for an Experimental Process System Development Unit (EPSDU). The mechanical bid package was issued and the bid responses are under evaluation. Similarly, the electrical bid package was issued; however, responses are not yet due. The majority of all equipment is on order or has been received at the EPSDU site. The pyrolysis/consolidation process design package was issued. Preparation of the process and instrumentation diagram for the free-space reactor was started. In the area of melting/consolidation, Kayex successfully melted chunk silicon and produced silicon shot. The free-space reactor powder was successfully transported pneumatically from a storage bin to the auger feeder twenty-five feet up and was melted. The fluid-bed PDU has successfully operated at silane feed concentrations up to 21%. The writing of the operating manual has started. Overall, the design phase is nearing completion.

  20. Lean methodology for performance improvement in the trauma discharge process.

    PubMed

    O'Mara, Michael Shaymus; Ramaniuk, Aliaksandr; Graymire, Vickie; Rozzell, Monica; Martin, Stacey

    2014-07-01

    High-volume, complex services such as trauma and acute care surgery are at risk for inefficiency. Lean process improvement can reduce health care waste. Lean allows a structured look at processes not easily amenable to analysis. We applied lean methodology to the current state of communication and discharge planning on an urban trauma service, citing areas for improvement. A lean process mapping event was held. The process map was used to identify areas for immediate analysis and intervention-defining metrics for the stakeholders. After intervention, new performance was assessed by direct data evaluation. The process was completed with an analysis of effect and plans made for addressing future focus areas. The primary area of concern identified was interservice communication. Changes centering on a standardized morning report structure reduced the number of consult questions unanswered from 67% to 34% (p = 0.0021). Physical therapy rework was reduced from 35% to 19% (p = 0.016). Patients admitted to units not designated to the trauma service had 1.6 times longer stays (p < 0.0001). The lean process lasted 8 months, and three areas for new improvement were identified: (1) the off-unit patients; (2) patients with length of stay more than 15 days contribute disproportionately to length of stay; and (3) miscommunication exists around patient education at discharge. Lean process improvement is a viable means of health care analysis. When applied to a trauma service with 4,000 admissions annually, lean identifies areas ripe for improvement. Our inefficiencies surrounded communication and patient localization. Strategies arising from the input of all stakeholders led to real solutions for communication through a face-to-face morning report and identified areas for ongoing improvement. This focuses resource use and identifies areas for improvement of throughput in care delivery.

  1. The AMchip04 and the processing unit prototype for the FastTracker

    NASA Astrophysics Data System (ADS)

    Andreani, A.; Annovi, A.; Beretta, M.; Bogdan, M.; Citterio, M.; Alberti, F.; Giannetti, P.; Lanza, A.; Magalotti, D.; Piendibene, M.; Shochet, M.; Stabile, A.; Tang, J.; Tompkins, L.; Volpi, G.

    2012-08-01

    Modern experiments search for extremely rare processes hidden in much larger background levels. As the experiment's complexity, the accelerator backgrounds and the luminosity increase, we need increasingly complex and exclusive event selection. We present the first prototype of a new Processing Unit (PU), the core of the FastTracker processor (FTK). FTK is a real-time tracking device for the ATLAS experiment's trigger upgrade. The computing power of the PU is such that a few hundred of them will be able to reconstruct all the tracks with transverse momentum above 1 GeV/c in ATLAS events up to Phase II instantaneous luminosities (3 × 10³⁴ cm⁻² s⁻¹) with an event input rate of 100 kHz and a latency below a hundred microseconds. The PU provides massive computing power to minimize the online execution time of complex tracking algorithms. The time-consuming pattern recognition problem, generally referred to as the "combinatorial challenge", is solved by the Associative Memory (AM) technology, which exploits parallelism to the maximum extent; it compares the event to all pre-calculated "expectations" or "patterns" (pattern matching) simultaneously, looking for candidate tracks called "roads". This approach reduces the typically exponential complexity of CPU-based algorithms to a linear behavior. Pattern recognition is completed by the time the data are loaded into the AM devices. We report on the design of the first Processing Unit prototypes. The design had to address the most challenging aspects of this technology: a huge number of detector clusters ("hits") must be distributed at high rate with very large fan-out to all patterns (10 million patterns will be located on 128 chips placed on a single board), and a huge number of roads must be collected and sent back to the FTK post-pattern-recognition functions. A network of high-speed serial links is used to solve the data distribution problem.
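
    Functionally, the associative memory compares an event's hits against every stored pattern at once; in software the match is easy to emulate, as in the sketch below, where the pattern bank, bin encoding, and road names are invented for illustration and the hardware's parallel lookup is replaced by set operations.

    ```python
    # Each pattern is a set of (layer, coarse bin) pairs; a road fires when
    # all of its stored bins appear among the event hits.
    pattern_bank = {
        "road_1": {(0, 12), (1, 40), (2, 33), (3, 7)},
        "road_2": {(0, 12), (1, 41), (2, 33), (3, 8)},
        "road_3": {(0, 90), (1, 17), (2, 64), (3, 50)},
    }

    def match_roads(event_hits, bank, min_layers=4):
        """Return candidate roads whose stored bins all appear in the event."""
        hits = set(event_hits)
        return [road for road, patt in bank.items()
                if len(patt & hits) >= min_layers]

    event = [(0, 12), (1, 40), (1, 41), (2, 33), (3, 7), (3, 8)]
    print(match_roads(event, pattern_bank))   # ['road_1', 'road_2']
    ```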

  2. Motor unit action potential conduction velocity estimated from surface electromyographic signals using image processing techniques.

    PubMed

    Soares, Fabiano Araujo; Carvalho, João Luiz Azevedo; Miosso, Cristiano Jacques; de Andrade, Marcelino Monteiro; da Rocha, Adson Ferreira

    2015-09-17

    In surface electromyography (surface EMG, or S-EMG), conduction velocity (CV) refers to the velocity at which the motor unit action potentials (MUAPs) propagate along the muscle fibers during contractions. The CV is related to the type and diameter of the muscle fibers, ion concentration, pH, and firing rate of the motor units (MUs). The CV can be used in the evaluation of contractile properties of MUs and of muscle fatigue. The most popular methods for CV estimation are those based on maximum likelihood estimation (MLE). This work proposes an algorithm for estimating CV from S-EMG signals using digital image processing techniques. The proposed approach is demonstrated and evaluated using both simulated and experimentally-acquired multichannel S-EMG signals. We show that the proposed algorithm is as precise and accurate as the MLE method in typical conditions of noise and CV. The proposed method is not susceptible to errors associated with MUAP propagation direction or inadequate initialization parameters, which are common with the MLE algorithm. Image-processing-based approaches may be useful in S-EMG analysis to extract different physiological parameters from multichannel S-EMG signals. Other new methods based on image processing could also be developed to help solve other tasks in EMG analysis, such as estimation of the CV for individual MUs, localization and tracking of innervation zones, and study of MU recruitment strategies.
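
    As a simplified illustration of what any CV estimator must do, the sketch below recovers the propagation delay between two adjacent S-EMG channels from the cross-correlation peak and converts it to a velocity; the sampling rate, electrode spacing, and synthetic MUAP are assumptions, and this is a stand-in for, not a reproduction of, the MLE or image-processing methods discussed above.

```python
# Conduction velocity from the inter-channel delay (illustrative sketch).
import numpy as np

fs = 2048.0  # sampling rate in Hz (assumed)
d = 0.01     # inter-electrode distance in m (assumed)

def conduction_velocity(ch1, ch2):
    """CV = d / delay, with the delay taken from the cross-correlation peak."""
    xcorr = np.correlate(ch2 - ch2.mean(), ch1 - ch1.mean(), mode="full")
    lag = np.argmax(xcorr) - (len(ch1) - 1)  # in samples; >0 if ch2 lags ch1
    delay = lag / fs
    return d / delay if delay != 0 else float("inf")

# Synthetic Gaussian MUAP propagating with a 2-sample delay.
t = np.arange(512)
muap = np.exp(-0.5 * ((t - 200) / 10.0) ** 2)
print(conduction_velocity(muap, np.roll(muap, 2)))  # d * fs / 2 = 10.24 m/s
```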

  3. Efficacy of identifying neural components in the face and emotion processing system in schizophrenia using a dynamic functional localizer.

    PubMed

    Arnold, Aiden E G F; Iaria, Giuseppe; Goghari, Vina M

    2016-02-28

    Schizophrenia is associated with deficits in face perception and emotion recognition. Despite consistent behavioural results, the neural mechanisms underlying these cognitive abilities have been difficult to isolate, in part due to differences between studies in the neuroimaging methods used for identifying regions in the face processing system. Given this problem, we aimed to validate a recently developed fMRI-based dynamic functional localizer task for use in studies of psychiatric populations, and specifically schizophrenia. Previously, this functional localizer successfully identified each of the core face processing regions (i.e. fusiform face area, occipital face area, superior temporal sulcus) and regions within an extended system (e.g. amygdala) in healthy individuals. In this study, we tested the functional localizer success rate in 27 schizophrenia patients and in 24 community controls. Overall, the core face processing regions were localized equally well in both the schizophrenia and control groups. Additionally, the amygdala, a candidate brain region from the extended system, was identified in nearly half the participants from both groups. These results indicate the effectiveness of a dynamic functional localizer at identifying regions of interest associated with face perception and emotion recognition in schizophrenia. The use of dynamic functional localizers may help standardize the investigation of the facial and emotion processing system in this and other clinical populations. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  4. Speedup for quantum optimal control from automatic differentiation based on graphics processing units

    NASA Astrophysics Data System (ADS)

    Leung, Nelson; Abdelhafez, Mohamed; Koch, Jens; Schuster, David

    2017-04-01

    We implement a quantum optimal control algorithm based on automatic differentiation and harness the acceleration afforded by graphics processing units (GPUs). Automatic differentiation allows us to specify advanced optimization criteria and incorporate them in the optimization process with ease. We show that the use of GPUs can speed up calculations by more than an order of magnitude. Our strategy facilitates efficient numerical simulations on affordable desktop computers and exploration of a host of optimization constraints and system parameters relevant to real-life experiments. We demonstrate optimization of quantum evolution based on fine-grained evaluation of performance at each intermediate time step, thus enabling more intricate control of the evolution path, suppression of departures from the truncated model subspace, as well as minimization of the physical time needed to perform high-fidelity state preparation and unitary gates.
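
    The core idea, differentiating a fidelity measure through the whole time evolution to obtain gradients with respect to every control amplitude, can be sketched in a few lines. The Python example below uses the JAX library for the automatic differentiation (standing in for the authors' GPU framework); the two-level system, piecewise-constant controls, and all parameters are illustrative assumptions. With a CUDA-enabled JAX installation the identical code runs on a GPU.

```python
# GRAPE-style pulse optimization via automatic differentiation (sketch).
import jax
import jax.numpy as jnp

sx = jnp.array([[0., 1.], [1., 0.]], dtype=jnp.complex64)   # control term
sz = jnp.array([[1., 0.], [0., -1.]], dtype=jnp.complex64)  # drift term
psi0 = jnp.array([1., 0.], dtype=jnp.complex64)             # start in |0>
target = jnp.array([0., 1.], dtype=jnp.complex64)           # aim for |1>
dt, n_steps = 0.1, 50

def infidelity(controls):
    """1 - |<target|psi(T)>|^2 for piecewise-constant control amplitudes."""
    psi = psi0
    for u in controls:
        w = jnp.sqrt(1.0 + u**2)  # exact 2-level propagator for H = sz + u*sx
        U = jnp.cos(dt * w) * jnp.eye(2) - 1j * jnp.sin(dt * w) * (sz + u * sx) / w
        psi = U @ psi
    return 1.0 - jnp.abs(jnp.vdot(target, psi)) ** 2

grad_fn = jax.jit(jax.grad(infidelity))  # gradient through the full evolution
controls = 0.5 * jnp.ones(n_steps)
for _ in range(200):                     # plain gradient descent
    controls = controls - 1.0 * grad_fn(controls)
print(float(infidelity(controls)))       # decreases toward 0 as pulses converge
```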

  5. Parallelized multi-graphics processing unit framework for high-speed Gabor-domain optical coherence microscopy.

    PubMed

    Tankam, Patrice; Santhanam, Anand P; Lee, Kye-Sung; Won, Jungeun; Canavesi, Cristina; Rolland, Jannick P

    2014-07-01

    Gabor-domain optical coherence microscopy (GD-OCM) is a volumetric high-resolution technique capable of acquiring three-dimensional (3-D) skin images with histological resolution. Real-time image processing is needed to enable GD-OCM imaging in a clinical setting. We present a parallelized and scalable multi-graphics processing unit (GPU) computing framework for real-time GD-OCM image processing. A parallelized control mechanism was developed to individually assign computation tasks to each of the GPUs. For each GPU, the optimal number of amplitude-scans (A-scans) to be processed in parallel was selected to maximize GPU memory usage and core throughput. We investigated five computing architectures for computational speed-up in processing 1000×1000 A-scans. The proposed parallelized multi-GPU computing framework enables processing at a computational speed faster than the GD-OCM image acquisition, thereby facilitating high-speed GD-OCM imaging in a clinical setting. Using two parallelized GPUs, the image processing of a 1×1×0.6 mm³ skin sample was performed in about 13 s, and the performance was benchmarked at 6.5 s with four GPUs. This work thus demonstrates that 3-D GD-OCM data may be displayed in real-time to the examiner using parallelized GPU processing.

  6. 77 FR 12882 - Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... DEPARTMENT OF LABOR Employment and Training Administration Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United States: 2012 Allowable Charges for Agricultural Workers' Meals and Travel Subsistence Reimbursement, Including Lodging AGENCY: Employment and Training...

  7. 78 FR 15741 - Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-12

    ... DEPARTMENT OF LABOR Employment and Training Administration Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United States: 2013 Allowable Charges for Agricultural Workers' Meals and Travel Subsistence Reimbursement, Including Lodging AGENCY: Employment and Training...

  8. 77 FR 13635 - Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-07

    ... DEPARTMENT OF LABOR Employment and Training Administration Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United States: 2012 Allowable Charges for Agricultural Workers' Meals and Travel Subsistence Reimbursement, Including Lodging AGENCY: Employment and Training...

  9. 76 FR 11286 - Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-01

    ... DEPARTMENT OF LABOR Employment and Training Administration Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United States: 2011 Adverse Effect Wage Rates, Allowable Charges for Agricultural Workers' Meals, and Maximum Travel Subsistence Reimbursement AGENCY...

  10. Startup of Pumping Units in Process Water Supplies with Cooling Towers at Thermal and Nuclear Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berlin, V. V., E-mail: vberlin@rinet.ru; Murav’ev, O. A., E-mail: muraviov1954@mail.ru; Golubev, A. V., E-mail: electronik@inbox.ru

    Aspects of the startup of pumping units in the cooling and process water supply systems for thermal and nuclear power plants with cooling towers, the startup stages, and the limits imposed on the extreme parameters during transients are discussed.

  11. Optimization of the coherence function estimation for multi-core central processing unit

    NASA Astrophysics Data System (ADS)

    Cheremnov, A. G.; Faerman, V. A.; Avramchuk, V. S.

    2017-02-01

    The paper considers the use of parallel processing on a multi-core central processing unit for optimizing the evaluation of the coherence function arising in digital signal processing. The coherence function, along with other methods of spectral analysis, is commonly used for vibration diagnosis of rotating machinery and its particular nodes. An algorithm is given for evaluating the function for signals represented by digital samples. The algorithm is analyzed with respect to its software implementation and computational problems. Optimization measures are described, including algorithmic, architecture and compiler optimization, and their results are assessed for multi-core processors from different manufacturers. The speedup of the parallel execution with respect to sequential execution was studied, and results are presented for Intel Core i7-4720HQ and AMD FX-9590 processors. The results show comparatively high efficiency of the optimization measures taken. In particular, acceleration indicators and average CPU utilization were significantly improved, showing a high degree of parallelism in the constructed calculation functions. The developed software underwent state registration and will be used as part of a software and hardware solution for rotating machinery fault diagnosis and pipeline leak location with the acoustic correlation method.
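
    For reference, the magnitude-squared coherence underlying the discussion is C_xy(f) = |P_xy(f)|² / (P_xx(f) P_yy(f)), where P_xy is the cross-spectral density and P_xx, P_yy are the auto-spectral densities. The sketch below estimates it with SciPy and spreads channel pairs over CPU cores with a process pool; this is a plain-Python stand-in for the authors' optimized implementation, and the sampling rate, segment length, and test signals are assumptions.

```python
# Parallel coherence estimation across channel pairs (illustrative sketch).
import numpy as np
from concurrent.futures import ProcessPoolExecutor
from scipy.signal import coherence

fs = 8192.0  # sampling rate in Hz (assumed)

def pair_coherence(args):
    x, y = args
    return coherence(x, y, fs=fs, nperseg=1024)  # Cxy = |Pxy|^2 / (Pxx * Pyy)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    shared = rng.normal(size=65536)  # common component -> nonzero coherence
    chans = [shared + rng.normal(size=65536) for _ in range(4)]
    pairs = [(chans[i], chans[j]) for i in range(4) for j in range(i + 1, 4)]
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(pair_coherence, pairs))
    f, cxy = results[0]
    print(cxy.mean())  # near the ideal 0.25 for these unit-SNR white signals
```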

  12. The Units Ontology: a tool for integrating units of measurement in science

    PubMed Central

    Gkoutos, Georgios V.; Schofield, Paul N.; Hoehndorf, Robert

    2012-01-01

    Units are basic scientific tools that render meaning to numerical data. Their standardization and formalization caters for the reporting, exchange, processing, reproducibility and integration of quantitative measurements. Ontologies are tools that facilitate the integration of data and knowledge, allowing interoperability and semantic information processing between diverse biomedical resources and domains. Here, we present the Units Ontology (UO), an ontology currently being used in many scientific resources for the standardized description of units of measurement. PMID:23060432

  13. Dynamic motif occupancy (DynaMO) analysis identifies transcription factors and their binding sites driving dynamic biological processes.

    PubMed

    Kuang, Zheng; Ji, Zhicheng; Boeke, Jef D; Ji, Hongkai

    2018-01-09

    Biological processes are usually associated with genome-wide remodeling of transcription driven by transcription factors (TFs). Identifying key TFs and their spatiotemporal binding patterns is indispensable to understanding how dynamic processes are programmed. However, most methods are designed to predict TF binding sites only. We present a computational method, dynamic motif occupancy analysis (DynaMO), to infer important TFs and their spatiotemporal binding activities in dynamic biological processes using chromatin profiling data from multiple biological conditions, such as time-course histone modification ChIP-seq data. In the first step, DynaMO predicts TF binding sites with a random forests approach. Next and uniquely, DynaMO infers dynamic TF binding activities at predicted binding sites using their local chromatin profiles from multiple biological conditions. Another distinctive feature of DynaMO is the identification of key TFs in a dynamic process using a clustering and enrichment analysis of dynamic TF binding patterns. Application of DynaMO to the yeast ultradian cycle, mouse circadian clock and human neural differentiation demonstrates its accuracy and versatility. We anticipate DynaMO will be generally useful for elucidating transcriptional programs in dynamic processes. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. Morphology study of thoracic transverse processes and its significance in pedicle-rib unit screw fixation.

    PubMed

    Cui, Xin-gang; Cai, Jin-fang; Sun, Jian-min; Jiang, Zhen-song

    2015-03-01

    Thoracic transverse process is an important anatomic structure of the spine. Several anatomic studies have investigated the structures adjacent to the thoracic transverse process, but the morphology of the thoracic transverse processes themselves has not been characterized. The purpose of this cadaveric study was to investigate the morphology of thoracic transverse processes and to provide a morphologic basis for the pedicle-rib unit (extrapedicular) screw fixation method. Forty-five adult dehydrated skeletons (T1-T10) were included in this study. The length, width, thickness, and the tilt angle (upward and backward) of the thoracic transverse process were measured. The data were then analyzed statistically. On the basis of the morphometric study, 5 fresh cadavers were used to place screws from transverse processes to the vertebral body in the thoracic spine, assessed afterward by visual inspection and on computed tomography scans. The lengths of thoracic transverse processes were between 16.63±1.59 and 18.10±1.95 mm; the longest was at T7, and the shortest was at T10. The widths were between 11.68±0.80 and 12.87±1.48 mm; the widest was at T3, and the narrowest was at T7. The thicknesses were between 7.86±1.24 and 10.78±1.35 mm; the thickest was at T1, and the thinnest was at T7. The upward tilt angles ranged from 3.0±1.56 to 24.9±3.1 degrees; the maximal upward tilt angle was at T1, and the minimal was at T7. The upward tilt angles of T1 and T2 differed markedly from those of the other thoracic transverse processes (P<0.01). The backward tilt angles gradually increased from 24.5±2.91 degrees at T1 to 64.5±5.12 degrees at T10, and were significantly different from each other, except between T5 and T6. In the validation study, screws were all placed successfully from transverse processes to the vertebrae of

  15. Identifying potential misfit items in cognitive process of learning engineering mathematics based on Rasch model

    NASA Astrophysics Data System (ADS)

    Ataei, Sh; Mahmud, Z.; Khalid, M. N.

    2014-04-01

    Students' learning outcomes clarify what students should know and be able to demonstrate after completing their course, so one of the issues in the process of teaching and learning is how to assess students' learning. This paper describes an application of the dichotomous Rasch measurement model in measuring the cognitive process of engineering students' learning of mathematics. The study provides insights into the cognitive ability of 54 engineering students learning Calculus III, assessed on 31 items based on Bloom's Taxonomy. The results denote that some of the examination questions are either too difficult or too easy for the majority of the students. The analysis yields fit statistics that identify departures of the data from the Rasch theoretical model. The study identified some potential misfit items based on the ZSTD measure, with items flagged for removal when the outfit MNSQ was above 1.3 or below 0.7. It is therefore recommended that these items be reviewed or revised to better match the range of students' ability in the respective course.
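
    A minimal sketch of the misfit screening described above: given person abilities theta and item difficulties b from a dichotomous Rasch model, the outfit MNSQ of an item is the mean squared standardized residual over persons, and items outside the 0.7-1.3 band are flagged. The simulated abilities, difficulties, and responses below are assumptions for illustration.

```python
# Outfit MNSQ screening for a dichotomous Rasch model (illustrative sketch).
import numpy as np

def outfit_mnsq(responses, theta, b):
    """responses: persons x items 0/1 matrix; theta: abilities; b: difficulties."""
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))  # P(correct)
    z2 = (responses - p) ** 2 / (p * (1 - p))  # squared standardized residuals
    return z2.mean(axis=0)                     # mean over persons, per item

rng = np.random.default_rng(1)
theta = rng.normal(size=54)  # 54 students, as in the study
b = rng.normal(size=31)      # 31 items
p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
responses = (rng.random((54, 31)) < p).astype(float)
mnsq = outfit_mnsq(responses, theta, b)
print(np.where((mnsq < 0.7) | (mnsq > 1.3))[0])  # candidate misfit items
```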

  16. Process-oriented modelling to identify main drivers of erosion-induced carbon fluxes

    NASA Astrophysics Data System (ADS)

    Wilken, Florian; Sommer, Michael; Van Oost, Kristof; Bens, Oliver; Fiener, Peter

    2017-05-01

    Coupled modelling of soil erosion, carbon redistribution, and turnover has received great attention over the last decades due to large uncertainties regarding erosion-induced carbon fluxes. For a process-oriented representation of event dynamics, coupled soil-carbon erosion models have been developed. However, there are currently few models that represent tillage erosion, preferential water erosion, and transport of different carbon fractions (e.g. mineral bound carbon, carbon encapsulated by soil aggregates). We couple a process-oriented multi-class sediment transport model with a carbon turnover model (MCST-C) to identify relevant redistribution processes for carbon dynamics. The model is applied for two arable catchments (3.7 and 7.8 ha) located in the Tertiary Hills about 40 km north of Munich, Germany. Our findings indicate the following: (i) redistribution by tillage has a large effect on erosion-induced vertical carbon fluxes and has a large carbon sequestration potential; (ii) water erosion has a minor effect on vertical fluxes, but episodic soil organic carbon (SOC) delivery controls the long-term erosion-induced carbon balance; (iii) delivered sediments are highly enriched in SOC compared to the parent soil, and sediment delivery is driven by event size and catchment connectivity; and (iv) soil aggregation enhances SOC deposition due to the transformation of highly mobile carbon-rich fine primary particles into rather immobile soil aggregates.

  17. 26 CFR 1.924(d)-1 - Requirement that economic processes take place outside the United States.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 26 Internal Revenue 10 2014-04-01 2013-04-01 true Requirement that economic processes take place outside the United States. 1.924(d)-1 Section 1.924(d)-1 Internal Revenue INTERNAL REVENUE SERVICE... otherwise constitute advertising (such as sending sales literature to a customer or potential customer) will...

  18. Identifying Falls Risk Screenings Not Documented with Administrative Codes Using Natural Language Processing

    PubMed Central

    Zhu, Vivienne J; Walker, Tina D; Warren, Robert W; Jenny, Peggy B; Meystre, Stephane; Lenert, Leslie A

    2017-01-01

    Quality reporting that relies on coded administrative data alone may not completely and accurately depict providers' performance. To assess this concern with a test case, we developed and evaluated a natural language processing (NLP) approach to identify falls risk screenings documented in clinical notes of patients without coded falls risk screening data. Extracting information from 1,558 clinical notes (mainly progress notes) from 144 eligible patients, we generated a lexicon of 38 keywords relevant to falls risk screening, 26 terms for pre-negation, and 35 terms for post-negation. The NLP algorithm identified 62 of the 144 patients whose falls risk screening was documented only in clinical notes and not coded. Manual review confirmed 59 patients as true positives and 77 patients as true negatives. Our NLP approach scored 0.92 for precision, 0.95 for recall, and 0.93 for F-measure. These results support the concept of utilizing NLP to enhance healthcare quality reporting. PMID:29854264
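
    A toy version of the lexicon-plus-negation logic makes the approach concrete. The tiny keyword and negation lists below are illustrative placeholders, not the authors' 38/26/35-term lexicons, and the fixed context window is an assumption.

```python
# Negation-aware keyword matching for screening documentation (sketch).
import re

KEYWORDS = ["falls risk", "fall risk", "morse fall scale"]
PRE_NEG = ["no", "denies", "not assessed for"]
POST_NEG = ["not performed", "deferred"]
WINDOW = 5  # words of context inspected around each keyword match

def screening_documented(note):
    words = note.lower().split()
    text = " ".join(words)
    for kw in KEYWORDS:
        for m in re.finditer(re.escape(kw), text):
            start = len(text[:m.start()].split())
            pre = " ".join(words[max(0, start - WINDOW):start])
            post = " ".join(words[start:start + WINDOW + len(kw.split())])
            if any(n in pre for n in PRE_NEG) or any(n in post for n in POST_NEG):
                continue  # this match is negated; keep looking
            return True
    return False

print(screening_documented("Morse Fall Scale score 25, low risk."))       # True
print(screening_documented("Falls risk screening not performed today."))  # False
```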

  19. Using Neuropsychological Process Scores to Identify Subtle Cognitive Decline and Predict Progression to Mild Cognitive Impairment.

    PubMed

    Thomas, Kelsey R; Edmonds, Emily C; Eppig, Joel; Salmon, David P; Bondi, Mark W

    2018-05-26

    We previously operationally defined subtle cognitive decline (SCD) in preclinical Alzheimer's disease (AD) using total scores on neuropsychological (NP) tests. NP process scores (i.e., scores that provide information about how a total NP score was achieved) may be a useful tool for identifying early cognitive inefficiencies prior to the objective impairment seen in mild cognitive impairment (MCI) and dementia. We aimed to integrate process scores into the SCD definition to identify stages of SCD and improve early detection of those at risk for decline. Cognitively "normal" participants from the Alzheimer's Disease Neuroimaging Initiative were classified as "early" SCD (E-SCD; >1 SD below the mean on 2 process scores or on 1 process score plus 1 NP total score), "late" SCD (L-SCD; existing SCD criteria of >1 SD below the norm-adjusted mean on 2 NP total scores in different domains), or "no SCD" (NC). Process scores considered in the SCD criteria were word-list intrusion errors, retroactive interference, and learning slope. Cerebrospinal fluid AD biomarkers were used to examine pathologic burden across groups. E-SCD and L-SCD progressed to MCI 2.5-3.4 times faster than the NC group. Survival curves for E-SCD and L-SCD converged at 7-8 years after baseline. The combined (E-SCD+L-SCD) group had improved sensitivity to detect progression to MCI relative to L-SCD only. AD biomarker positivity increased across the NC, SCD, and MCI groups. Process scores can be integrated into the SCD criteria to allow for increased sensitivity and earlier identification of cognitively normal older adults at risk for decline prior to frank impairment on NP total scores.

  20. General purpose graphics-processing-unit implementation of cosmological domain wall network evolution.

    PubMed

    Correia, J R C C C; Martins, C J A P

    2017-10-01

    Topological defects unavoidably form at symmetry-breaking phase transitions in the early universe. To probe the parameter space of theoretical models and set tighter experimental constraints (exploiting the recent advances in astrophysical observations), one requires more and more demanding simulations, and therefore more hardware resources and computation time. Improving the speed and efficiency of existing codes is essential. Here we present a general purpose graphics-processing-unit implementation of the canonical Press-Ryden-Spergel algorithm for the evolution of cosmological domain wall networks. This is ported to the Open Computing Language standard, and as a consequence significant speedups are achieved in both two-dimensional (2D) and three-dimensional (3D) simulations.

  1. Using Graphical Processing Units to Accelerate Orthorectification, Atmospheric Correction and Transformations for Big Data

    NASA Astrophysics Data System (ADS)

    O'Connor, A. S.; Justice, B.; Harris, A. T.

    2013-12-01

    Graphics Processing Units (GPUs) are high-performance multiple-core processors capable of very high computational speeds and large data throughput. Modern GPUs are inexpensive and widely available commercially. These are general-purpose parallel processors with support for a variety of programming interfaces, including industry standard languages such as C. GPU implementations of algorithms that are well suited for parallel processing can often achieve speedups of several orders of magnitude over optimized CPU codes. Significant improvements in speed for imagery orthorectification, atmospheric correction, target detection and image transformations like Independent Components Analysis (ICA) have been achieved using GPU-based implementations. Additional optimizations, when factored in with GPU processing capabilities, can provide a 50x - 100x reduction in the time required to process large imagery. Exelis Visual Information Solutions (VIS) has implemented a CUDA-based GPU processing framework for accelerating ENVI and IDL processes that can best take advantage of parallelization. Testing Exelis VIS has performed shows that orthorectification of a WorldView-1 35,000 × 35,000 pixel image can take as long as two hours; with GPU acceleration, the same orthorectification takes three minutes. By speeding up image processing, imagery can be used successfully by first responders and by scientists making rapid discoveries with near-real-time data, and it provides an operational component to data centers needing to quickly process and disseminate data.

  2. Harnessing Biomedical Natural Language Processing Tools to Identify Medicinal Plant Knowledge from Historical Texts.

    PubMed

    Sharma, Vivekanand; Law, Wayne; Balick, Michael J; Sarkar, Indra Neil

    2017-01-01

    The growing amount of data describing historical medicinal uses of plants from digitization efforts provides the opportunity to develop systematic approaches for identifying potential plant-based therapies. However, cataloguing plant use information from natural language text is a challenging task for ethnobotanists. To date, there has been only limited adoption of informatics approaches for supporting the identification of ethnobotanical information associated with medicinal uses. This study explored the feasibility of using biomedical terminologies and natural language processing approaches for extracting relevant plant-associated therapeutic use information from the historical biodiversity literature collection available from the Biodiversity Heritage Library. The results from this preliminary study suggest that there is potential utility of informatics methods to identify medicinal plant knowledge from digitized resources, as well as highlight opportunities for improvement.

  3. Harnessing Biomedical Natural Language Processing Tools to Identify Medicinal Plant Knowledge from Historical Texts

    PubMed Central

    Sharma, Vivekanand; Law, Wayne; Balick, Michael J.; Sarkar, Indra Neil

    2017-01-01

    The growing amount of data describing historical medicinal uses of plants from digitization efforts provides the opportunity to develop systematic approaches for identifying potential plant-based therapies. However, cataloguing plant use information from natural language text is a challenging task for ethnobotanists. To date, there has been only limited adoption of informatics approaches for supporting the identification of ethnobotanical information associated with medicinal uses. This study explored the feasibility of using biomedical terminologies and natural language processing approaches for extracting relevant plant-associated therapeutic use information from the historical biodiversity literature collection available from the Biodiversity Heritage Library. The results from this preliminary study suggest that there is potential utility of informatics methods to identify medicinal plant knowledge from digitized resources, as well as highlight opportunities for improvement. PMID:29854223

  4. Identifying children at risk for being bullies in the United States.

    PubMed

    Shetgiri, Rashmi; Lin, Hua; Flores, Glenn

    2012-01-01

    To identify risk factors associated with the greatest and lowest prevalence of bullying perpetration among U.S. children. Using the 2001-2002 Health Behavior in School-Aged Children, a nationally representative survey of U.S. children in 6th-10th grades, bivariate analyses were conducted to identify factors associated with any (once or twice or more), moderate (two to three times/month or more), and frequent (weekly or more) bullying. Stepwise multivariable analyses identified risk factors associated with bullying. Recursive partitioning analysis (RPA) identified risk factors which, in combination, identify students with the highest and lowest bullying prevalence. The prevalence of any bullying in the 13,710 students was 37.3%, moderate bullying was 12.6%, and frequent bullying was 6.6%. Characteristics associated with bullying were similar in the multivariable analyses and RPA clusters. In RPA, the highest prevalence of any bullying (67%) accrued in children with a combination of fighting and weapon-carrying. Students who carry weapons, smoke, and drink alcohol more than 5 to 6 days/week were at greatest risk for moderate bullying (61%). Those who carry weapons, smoke, have more than one alcoholic drink per day, have above-average academic performance, moderate/high family affluence, and feel irritable or bad-tempered daily were at greatest risk for frequent bullying (68%). Risk clusters for any, moderate, and frequent bullying differ. Children who fight and carry weapons are at greatest risk of any bullying. Weapon-carrying, smoking, and alcohol use are included in the greatest risk clusters for moderate and frequent bullying. Risk-group categories may be useful to providers in identifying children at the greatest risk for bullying and in targeting interventions. Copyright © 2012 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  5. Uniting Gradual and Abrupt set Processes in Resistive Switching Oxides

    NASA Astrophysics Data System (ADS)

    Fleck, Karsten; La Torre, Camilla; Aslam, Nabeel; Hoffmann-Eifert, Susanne; Böttger, Ulrich; Menzel, Stephan

    2016-12-01

    Identifying limiting factors is crucial for a better understanding of the dynamics of the resistive switching phenomenon in transition-metal oxides. This improved understanding is important for the design of fast-switching, energy-efficient, and long-term stable redox-based resistive random-access memory devices. Therefore, this work presents a detailed study of the set kinetics of valence change resistive switches on a time scale from 10 ns to 10⁴ s, taking Pt/SrTiO3/TiN nanocrossbars as a model material. The analysis of the transient currents reveals that the switching process can be subdivided into a linear-degradation process that is followed by a thermal runaway. The comparison with a dynamical electrothermal model of the memory cell allows the deduction of the physical origin of the degradation. The origin is an electric-field-induced increase of the oxygen-vacancy concentration near the Schottky barrier of the Pt/SrTiO3 interface that is accompanied by a steadily rising local temperature due to Joule heating. The positive feedback of the temperature increase on the oxygen-vacancy mobility, and thereby on the conductivity of the filament, leads to a self-acceleration of the set process.

  6. Identifying Repetitive Institutional Review Board Stipulations by Natural Language Processing and Network Analysis.

    PubMed

    Kury, Fabrício S P; Cimino, James J

    2015-01-01

    The corrections ("stipulations") to a proposed research study protocol produced by an institutional review board (IRB) can often be repetitive across many studies; however, there is no standard set of stipulations that could be used, for example, by researchers wishing to anticipate and correct problems in their research proposals prior to submitting to an IRB. The objective of the research was to computationally identify the most repetitive types of stipulations generated in the course of IRB deliberations. The text of each stipulation was normalized using the natural language processing techniques. An undirected weighted network was constructed in which each stipulation was represented by a node, and each link, if present, had weight corresponding to the TF-IDF Cosine Similarity of the stipulations. Network analysis software was then used to identify clusters in the network representing similar stipulations. The final results were correlated with additional data to produce further insights about the IRB workflow. From a corpus of 18,582 stipulations we identified 31 types of repetitive stipulations. Those types accounted for 3,870 stipulations (20.8% of the corpus) produced for 697 (88.7%) of all protocols in 392 (also 88.7%) of all the CNS IRB meetings with stipulations entered in our data source. A notable peroportion of the corrections produced by the IRB can be considered highly repetitive. Our shareable method relied on a minimal manual analysis and provides an intuitive exploration with theoretically unbounded granularity. Finer granularity allowed for the insight that is anticipated to prevent the need for identifying the IRB panel expertise or any human supervision.

  7. The Daily Readiness Huddle: a process to rapidly identify issues and foster improvement through problem-solving accountability.

    PubMed

    Donnelly, Lane F; Cherian, Shirley S; Chua, Kimberly B; Thankachan, Sam; Millecker, Laura A; Koroll, Alex G; Bisset, George S

    2017-01-01

    Because of the increasing complexities of providing imaging for pediatric health care services, a more reliable process to manage the daily delivery of care is necessary. We describe our Daily Readiness Huddle and the effects of the process on problem identification and improvement. Our Daily Readiness Huddle has four elements: metrics review, clinical volume review, daily readiness assessment, and problem accountability. It is attended by radiologists, directors, managers, front-line staff with concerns, representatives from support services (information technology [IT] and biomedical engineering [biomed]), and representatives who join the meeting in a virtual format from off-site locations. Data are visually displayed on erasable whiteboards. The daily readiness assessment uses cues to determine whether anyone has concerns or outlier data in regard to S-MESA (Safety, Methods, Equipment, Supplies or Associates). Through this assessment, problems are identified and categorized as quick hits (to be resolved in 24-48 h, not requiring project management) and complex issues. Complex issues are assigned an owner, a quality coach, and a report-back date. Additionally, projects are defined as improvements that are often strategic, are anticipated to take more than 60 days, and do not necessarily arise out of issues identified during the Daily Readiness Huddle. We tracked and calculated the mean, median and range of days to resolution and completion for complex issues and for projects during the first full year of implementing this process. During the first 12 months, 91 complex issues were identified and resolved, 11 projects were in progress and 33 completed, with 23 other projects active or in planning. Time to resolution of complex issues (in days) was mean 37.5, median 34.0, and range 1-105. For projects, time to completion (in days) was mean 86.0, median 84.0, and range 5-280. The Daily Readiness Huddle process has given us a framework to rapidly identify

  8. Development of microcontroller-based acquisition and processing unit for fiber optic vibration sensor

    NASA Astrophysics Data System (ADS)

    Suryadi; Puranto, P.; Adinanta, H.; Waluyo, T. B.; Priambodo, P. S.

    2017-04-01

    A microcontroller-based acquisition and processing unit (MAPU) has been developed to measure vibration signals from a fiber optic vibration sensor. The MAPU utilizes a 32-bit ARM microcontroller to perform acquisition and processing of the input signal. The input signal is acquired with a 12-bit ADC and processed using an FFT method to extract frequency information. The stability of the MAPU was characterized by supplying a constant 500 Hz input signal for 29 hours, showing stable operation. To characterize the frequency response, the input signal was swept from 20 to 1000 Hz in 20 Hz intervals. The characterization results show that the MAPU can detect input signals from 20 to 1000 Hz with a minimum signal of 4 mV RMS. An experiment was set up that uses the MAPU with a singlemode-multimode-singlemode (SMS) fiber optic sensor to detect vibration induced by a transducer in a wooden platform. The experimental results indicate that vibration signals from 20 to 600 Hz were successfully detected. Due to the limitation of the vibration source used in the experiment, vibration signals above 600 Hz were not detected.
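
    The FFT step can be illustrated in a few lines: window a block of ADC samples, take the real FFT, and read off the dominant bin. The sampling rate, block size, and test tone below are assumptions chosen to mirror the figures quoted above.

```python
# Dominant-frequency extraction from a block of ADC samples (sketch).
import numpy as np

fs = 4096  # ADC sampling rate in Hz (assumed)
n = 1024   # samples per acquisition block (assumed)

def dominant_frequency(samples, fs):
    windowed = samples * np.hanning(len(samples))  # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    spectrum[0] = 0.0                              # ignore the DC bin
    return np.fft.rfftfreq(len(samples), 1.0 / fs)[np.argmax(spectrum)]

t = np.arange(n) / fs
adc = 0.004 * np.sin(2 * np.pi * 500.0 * t)  # ~4 mV test tone at 500 Hz
print(dominant_frequency(adc, fs))           # -> 500.0
```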

  9. Career Education Resource Units: Grade 3.

    ERIC Educational Resources Information Center

    Newark School District, DE.

    The units contained in this guide are intended primarily as resource materials to assist grade 3 teachers in identifying units into which career awareness concepts can be infused and also in identifying instructional activities that correlate basic skills and career education objectives. Introductory information includes a definition of career…

  10. Career Education Resource Units: Grade 4.

    ERIC Educational Resources Information Center

    Newark School District, DE.

    The units contained in this guide are intended primarily as resource materials to assist grade 4 teachers in identifying units into which career awareness concepts can be infused and also in identifying instructional activities that correlate basic skills and career education objectives. Introductory information includes a definition of career…

  11. 78 FR 1260 - Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-08

    ... DEPARTMENT OF LABOR Employment and Training Administration Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United States: Prevailing Wage Rates for Certain... Agriculture (USDA) farm production region that includes another State either with its own wage rate finding or...

  12. Parallelized multi-graphics processing unit framework for high-speed Gabor-domain optical coherence microscopy

    PubMed Central

    Tankam, Patrice; Santhanam, Anand P.; Lee, Kye-Sung; Won, Jungeun; Canavesi, Cristina; Rolland, Jannick P.

    2014-01-01

    Gabor-domain optical coherence microscopy (GD-OCM) is a volumetric high-resolution technique capable of acquiring three-dimensional (3-D) skin images with histological resolution. Real-time image processing is needed to enable GD-OCM imaging in a clinical setting. We present a parallelized and scalable multi-graphics processing unit (GPU) computing framework for real-time GD-OCM image processing. A parallelized control mechanism was developed to individually assign computation tasks to each of the GPUs. For each GPU, the optimal number of amplitude-scans (A-scans) to be processed in parallel was selected to maximize GPU memory usage and core throughput. We investigated five computing architectures for computational speed-up in processing 1000×1000 A-scans. The proposed parallelized multi-GPU computing framework enables processing at a computational speed faster than the GD-OCM image acquisition, thereby facilitating high-speed GD-OCM imaging in a clinical setting. Using two parallelized GPUs, the image processing of a 1×1×0.6 mm³ skin sample was performed in about 13 s, and the performance was benchmarked at 6.5 s with four GPUs. This work thus demonstrates that 3-D GD-OCM data may be displayed in real-time to the examiner using parallelized GPU processing. PMID:24695868

  13. Specialized Inpatient Psychiatry Units for Children with Autism and Developmental Disorders: A United States Survey

    ERIC Educational Resources Information Center

    Siegel, Matthew; Doyle, Kathleen; Chemelski, Bruce; Payne, David; Ellsworth, Beth; Harmon, Jamie; Robbins, Douglas; Milligan, Briana; Lubetsky, Martin

    2012-01-01

    A cross sectional survey was performed to obtain the characteristics of specialized inpatient psychiatry units exclusively serving children with autism and other developmental disorders in the United States. Identified units were surveyed on basic demographic characteristics, clinical challenges and therapeutic modalities. Average length of stay…

  14. [IMPLEMENTATION OF A QUALITY MANAGEMENT SYSTEM IN A NUTRITION UNIT ACCORDING TO ISO 9001:2008].

    PubMed

    Velasco Gimeno, Cristina; Cuerda Compés, Cristina; Alonso Puerta, Alba; Frías Soriano, Laura; Camblor Álvarez, Miguel; Bretón Lesmes, Irene; Plá Mestre, Rosa; Izquierdo Membrilla, Isabel; García-Peris, Pilar

    2015-09-01

    The implementation of quality management systems (QMS) in the health sector has made great progress in recent years and remains a key tool for the management and improvement of the services provided to patients. We describe the process of implementing a quality management system (QMS) according to the standard ISO 9001:2008 in a Nutrition Unit. The implementation began in October 2012. The Nutrition Unit was supported by the Hospital Preventive Medicine and Quality Management Service (PMQM). Initially, training sessions on QMS and ISO standards were held for staff. A Quality Committee (QC) was established with representation of the medical and nursing staff. Every week, meetings took place among members of the QC and PMQM to define processes, procedures and quality indicators. We carried out a 2-month follow-up of these documents after their validation. A total of 4 processes were identified and documented (Nutritional status assessment, Nutritional treatment, Monitoring of nutritional treatment, and Planning and control of oral feeding), along with 13 operating procedures in which all the activity of the Unit was described. The interactions among them were defined in the process map. Each process has associated specific quality indicators for measuring the state of the QMS and identifying opportunities for improvement. All the documents associated with the requirements of ISO 9001:2008 were developed: quality policy, quality objectives, quality manual, document and record control, internal audit, nonconformities, and corrective and preventive actions. The Unit was certified by AENOR in April 2013. The implementation of a QMS causes a reorganization of the activities of the Unit in order to meet customers' expectations. Documenting these activities ensures a better understanding of the organization, defines the responsibilities of all staff, and brings better management of time and resources. A QMS also improves internal communication and is a motivational element. Explore the satisfaction

  15. Status of the Development of Flight Power Processing Units for NASA's Evolutionary Xenon Thruster - Commercial (NEXT-C) Project

    NASA Technical Reports Server (NTRS)

    Aulisio, Michael V.; Pinero, Luis R.; White, Brandon L.; Hickman, Tyler A.; Bontempo, James J.; Hertel, Thomas A.; Birchenough, Arthur G.

    2016-01-01

    A pathfinder prototype unit and two flight power processing units (PPUs) are being developed by the Aerojet Rocketdyne Corporation in Redmond, Washington and ZIN Technologies in Cleveland, Ohio, in support of the NEXT-C Project. This project is being led by the NASA Glenn Research Center in Cleveland, Ohio, and will also yield two flight thrusters. This hardware is being considered to be provided as Government Furnished Equipment for the New Frontiers Program, and is applicable to a variety of planetary science missions and astrophysics science missions. The design of the NEXT-C PPU evolves from the hardware fabricated under the NEXT technology development project. The power processing unit operates from two sources: a wide input 80 to 160 V high-power bus and a nominal 28 V low-power bus. The unit includes six power supplies. Four power supplies (beam, accelerator, discharge, and neutralizer keeper) are needed for steady state operation, while two cathode heater power supplies (neutralizer and discharge) are utilized during thruster startup. The unit in total delivers up to 7 kW of regulated power to a single gridded-ion thruster. Significant modifications to the initial design include: high-power adaptive-delay control, upgrade of design to EEE-INST-002 compliance, telemetry accuracy improvements, incorporation of telemetry to detect plume-mode operation, and simplification of the design in select areas to improve manufacturability and commercialization potential. The project is presently in the prototype phase and preparing for qualification level environmental testing.

  16. Geomorphic Units on Titan

    NASA Astrophysics Data System (ADS)

    Lopes, R. M. C.; Malaska, M. J.; Schoenfeld, A.; Birch, S. P.; Hayes, A. G., Jr.

    2014-12-01

    The Cassini-Huygens mission has revealed the surface of Titan in unprecedented detail. The Synthetic Aperture Radar (SAR) mode on the Cassini Titan Radar Mapper is able to penetrate clouds and haze to provide high resolution (~350 m spatial resolution at best) views of the surface geology. The instrument's other modes (altimetry, scatterometry, radiometry) also provide valuable data for interpreting the geology, as do other instruments on Cassini, in particular, the Imaging Science Subsystem (ISS) and the Visual and Infrared Mapping Spectrometer (VIMS). Continuing the initial work described in Lopes et al. (2010, Icarus, 212, 744-750), we have established the major geomorphologic unit classes on Titan using data from flybys Ta through T92 (October 2004-July 2013). We will present the global distribution of the major classes of units and, where there are direct morphological contacts, describe how these classes of units relate to each other in terms of setting and emplacement history. The classes of units are mountainous/hummocky terrains, plains, dunes, labyrinthic terrains and lakes. The oldest classes of units are the mountainous/hummocky and the labyrinthic terrains. The mountainous/hummocky terrains consist of mountain chains and isolated radar-bright terrains. The labyrinthic terrains consist of highly incised dissected plateaux with medium radar backscatter. The plains are younger than both mountainous/hummocky and labyrinthic unit classes. Dunes and lakes are the youngest unit classes on Titan; no contact is observed between the dunes and lakes but it is likely that both processes are still active. We have identified individual features such as craters, channels, and candidate cryovolcanic features. Characterization and comparison of the properties of the unit classes and the individual features with data from radiometry, ISS, and VIMS provides information on their composition and possible provenance. We can use these correlations to also infer global

  17. Geomorphic Units on Titan

    NASA Astrophysics Data System (ADS)

    Lopes, Rosaly; Malaska, Michael; Schoenfeld, Ashley; Birch, Samuel; Hayes, Alexander; Solomonidou, Anezina; Radebaugh, Jani

    2015-04-01

    The Cassini-Huygens mission has revealed the surface of Titan in unprecedented detail. The Synthetic Aperture Radar (SAR) mode on the Cassini Titan Radar Mapper is able to penetrate clouds and haze to provide high resolution (~350 m spatial resolution at best) views of the surface geology. The instrument's other modes (altimetry, scatterometry, radiometry) also provide valuable data for interpreting the geology, as do other instruments on Cassini, in particular, the Imaging Science Subsystem (ISS) and the Visual and Infrared Mapping Spectrometer (VIMS). Continuing the initial work described in Lopes et al. (2010, Icarus, 212, 744-750), we have established the major geomorphologic unit classes on Titan using data from flybys Ta through T92 (October 2004-July 2013). We will present the global distribution of the major classes of units and, where there are direct morphological contacts, describe how these classes of units relate to each other in terms of setting and emplacement history. The classes of units are mountainous/hummocky terrains, plains, dunes, labyrinthic terrains and lakes. The oldest classes of units are the mountainous/hummocky and the labyrinthic terrains. The mountainous/hummocky terrains consist of mountain chains and isolated radar-bright terrains. The labyrinthic terrains consist of highly incised dissected plateaux with medium radar backscatter. The plains are younger than both mountainous/hummocky and labyrinthic unit classes. Dunes and lakes are the youngest unit classes on Titan; no contact is observed between the dunes and lakes but it is likely that both processes are still active. We have identified individual features such as craters, channels, and candidate cryovolcanic features. Characterization and comparison of the properties of the unit classes and the individual features with data from radiometry, ISS, and VIMS provides information on their composition and possible provenance. We can use these correlations to also infer global

  18. Dynamic wavefront creation for processing units using a hybrid compactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Puthoor, Sooraj; Beckmann, Bradford M.; Yudanov, Dmitri

    A method, a non-transitory computer readable medium, and a processor for repacking dynamic wavefronts during program code execution on a processing unit, each dynamic wavefront including multiple threads, are presented. If a branch instruction is detected, a determination is made whether all wavefronts following a same control path in the program code have reached a compaction point, which is the branch instruction. If no branch instruction is detected in executing the program code, a determination is made whether all wavefronts following the same control path have reached a reconvergence point, which is a beginning of a program code segment to be executed by both a taken branch and a not taken branch from a previous branch instruction. The dynamic wavefronts are repacked with all threads that follow the same control path, if all wavefronts following the same control path have reached the branch instruction or the reconvergence point.
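
    The repacking idea is easy to model outside hardware: collect the threads that reached the compaction or reconvergence point, bucket them by control path, and refill fixed-size wavefronts from each bucket. The Python model below is purely conceptual; the wavefront size and data layout are assumptions, not the patented mechanism.

```python
# Conceptual model of dynamic wavefront repacking (illustrative only).
WAVEFRONT_SIZE = 4

def repack(threads):
    """threads: (thread_id, control_path) pairs that reached the compaction
    point; returns new wavefronts, each uniform in control path."""
    by_path = {}
    for tid, path in threads:
        by_path.setdefault(path, []).append(tid)
    wavefronts = []
    for path, tids in by_path.items():
        for i in range(0, len(tids), WAVEFRONT_SIZE):
            wavefronts.append((path, tids[i:i + WAVEFRONT_SIZE]))
    return wavefronts

# Two divergent wavefronts' worth of threads, alternating taken/not-taken.
threads = [(t, "taken" if t % 2 == 0 else "not_taken") for t in range(8)]
print(repack(threads))
# -> [('taken', [0, 2, 4, 6]), ('not_taken', [1, 3, 5, 7])]
```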

  19. Real-time speckle variance swept-source optical coherence tomography using a graphics processing unit.

    PubMed

    Lee, Kenneth K C; Mariampillai, Adrian; Yu, Joe X Z; Cadotte, David W; Wilson, Brian C; Standish, Beau A; Yang, Victor X D

    2012-07-01

    Advances in swept source laser technology continue to increase the imaging speed of swept-source optical coherence tomography (SS-OCT) systems. These fast imaging speeds are ideal for microvascular detection schemes, such as speckle variance (SV), where interframe motion can cause severe imaging artifacts and loss of vascular contrast. However, full utilization of the laser scan speed has been hindered by the computationally intensive signal processing required by SS-OCT and SV calculations. Using a commercial graphics processing unit that has been optimized for parallel data processing, we report a complete high-speed SS-OCT platform capable of real-time data acquisition, processing, display, and saving at 108,000 lines per second. Subpixel image registration of structural images was performed in real-time prior to SV calculations in order to reduce decorrelation from stationary structures induced by bulk tissue motion. The viability of the system was successfully demonstrated in a high bulk tissue motion scenario of human fingernail root imaging, where SV images (512 × 512 pixels, n = 4) were displayed at 54 frames per second.
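
    The SV computation itself is simple: for each pixel, take the intensity variance across N consecutive co-registered frames; flowing scatterers decorrelate between frames and therefore show high variance. The NumPy sketch below uses synthetic data with assumed shapes (the N = 4 matches the figure quoted above).

```python
# Interframe speckle variance for microvascular contrast (sketch).
import numpy as np

def speckle_variance(frames):
    """frames: (N, rows, cols) stack of co-registered structural B-scans."""
    return frames.var(axis=0)  # per-pixel variance across the N frames

rng = np.random.default_rng(0)
frames = 100.0 + rng.normal(0.0, 1.0, (4, 512, 512))  # static tissue
frames[:, 200:210, 200:210] += rng.normal(0.0, 30.0, (4, 10, 10))  # "vessel"
sv = speckle_variance(frames)
print(sv[205, 205] > sv[50, 50])  # True: flow pixels vary far more
```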

  20. Reproducibility of Mammography Units, Film Processing and Quality Imaging

    NASA Astrophysics Data System (ADS)

    Gaona, Enrique

    2003-09-01

    The purpose of this study was to carry out an exploratory survey of the problems of quality control in mammography and film-processing units, as a diagnosis of the current situation of mammography facilities. Measurements of reproducibility, optical density, optical difference, and gamma index are included. Breast cancer is the most frequently diagnosed cancer and the second leading cause of cancer death among women in the Mexican Republic. Mammography is a radiographic examination specially designed for detecting breast pathology. We found that the reproducibility problems of the automatic exposure control (AEC) are smaller than the problems of the processing units, because almost all processors fall outside the acceptable variation limits, which can affect mammographic image quality and the dose to the breast. Only four mammography units met the minimum score established by the ACR and FDA for the phantom image.

  1. Assessment of changes in plasma hemoglobin and potassium levels in red cell units during processing and storage.

    PubMed

    Saini, Nishant; Basu, Sabita; Kaur, Ravneet; Kaur, Jasbinder

    2015-06-01

    Red cell units undergo changes during storage and processing. The study was planned to assess plasma potassium, plasma hemoglobin, and percentage hemolysis during storage, and to determine the effects of outdoor blood collection and processing on those parameters. Blood was collected in three types of blood storage bags: single CPDA bags (40 outdoor and 40 in-house collections), triple CPD + SAGM bags (40 in-house collections), and quadruple CPD + SAGM bags with an integral leukoreduction filter (40 in-house collections). All bags were sampled on day 0 (day of collection), day 1 (after processing), day 7, day 14 and day 28 for measurement of percentage hemolysis and potassium levels in the plasma of the bag contents. There was a significant increase in percentage hemolysis, plasma hemoglobin and plasma potassium level in all the groups during storage (p < 0.001). No significant difference was found in any parameter analyzed between outdoor and in-house collected single CPDA red cell units. There was significantly lower percentage hemolysis (p < 0.001) and potassium (day 7 to day 14 - p < 0.05 and day 14 to day 28 - p < 0.001) in red cell units from day 7 onward until day 28 of storage in the leukoreduced quadruple bag as compared to the triple bag. The in-house single CPDA red cell units showed significantly more hemolysis (p < 0.001) than the triple bags with SAGM additive solution after 28 days of storage. There is a gradual increase in plasma hemoglobin and plasma potassium levels during the storage of red blood cells. Blood collection can be safely undertaken in outdoor blood donation camps, even in hot summer months, in monitored blood transport boxes. SAGM additive solution decreases red cell hemolysis and allows extended storage of red cells. Prestorage leukoreduction decreases red cell hemolysis and improves the quality of blood. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Parent-identified barriers to pediatric health care: a process-oriented model.

    PubMed

    Sobo, Elisa J; Seid, Michael; Reyes Gelhard, Leticia

    2006-02-01

    To further understand barriers to care as experienced by health care consumers, and to demonstrate the importance of conjoining qualitative and quantitative health services research. Transcripts from focus groups conducted in San Diego with English- and Spanish-speaking parents of children with special health care needs. Participants were asked about the barriers to care they had experienced or perceived, and their strategies for overcoming these barriers. Using elementary anthropological discourse analysis techniques, a process-based conceptual model of the parent experience was devised. The analysis revealed a parent-motivated model of barriers to care that enriched our understanding of quantitative findings regarding the population from which the focus group sample was drawn. Parent-identified barriers were grouped into the following six temporally and spatially sequenced categories: necessary skills and prerequisites for gaining access to the system; realizing access once it is gained; front office experiences; interactions with physicians; system arbitrariness and fragmentation; outcomes that affect future interaction with the system. Key to the successful navigation of the system was parents' functional biomedical acculturation; this construct likens the biomedical health services system to a cultural system within which all parents/patients must learn to function competently. Qualitative analysis of focus group data enabled a deeper understanding of barriers to care--one that went beyond the traditional association of marker variables with poor outcomes ("what") to reveal an understanding of the processes by which parents experience the health care system ("how," "why") and by which disparities may arise. Development of such process-oriented models furthers the provision of patient-centered care and the creation of interventions, programs, and curricula to enhance such care. Qualitative discourse analysis, for example using this project's widely applicable

  3. Computing the Density Matrix in Electronic Structure Theory on Graphics Processing Units.

    PubMed

    Cawkwell, M J; Sanville, E J; Mniszewski, S M; Niklasson, Anders M N

    2012-11-13

    The self-consistent solution of a Schrödinger-like equation for the density matrix is a critical and computationally demanding step in quantum-based models of interatomic bonding. This step was tackled historically via the diagonalization of the Hamiltonian. We have investigated the performance and accuracy of the second-order spectral projection (SP2) algorithm for the computation of the density matrix via a recursive expansion of the Fermi operator in a series of generalized matrix-matrix multiplications. We demonstrate that owing to its simplicity, the SP2 algorithm [Niklasson, A. M. N. Phys. Rev. B 2002, 66, 155115] is exceptionally well suited to implementation on graphics processing units (GPUs). The performance in double and single precision arithmetic of hybrid GPU/central processing unit (CPU) and full GPU implementations of the SP2 algorithm exceeds that of a CPU-only implementation of the SP2 algorithm and of traditional matrix diagonalization when the dimensions of the matrices exceed about 2000 × 2000. Padding schemes for arrays allocated in the GPU memory that optimize the performance of the CUBLAS implementations of the level 3 BLAS DGEMM and SGEMM subroutines for generalized matrix-matrix multiplications are described in detail. The analysis of the relative performance of the hybrid CPU/GPU and full GPU implementations indicates that the transfer of arrays between the GPU and CPU constitutes only a small fraction of the total computation time. The errors measured in the self-consistent density matrices computed using the SP2 algorithm are generally smaller than those measured in matrices computed via diagonalization. Furthermore, the errors in the density matrices computed using the SP2 algorithm do not exhibit any dependence on system size, whereas the errors increase linearly with the number of orbitals when diagonalization is employed.
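
    A dense NumPy sketch of the SP2 recursion conveys why it maps so well onto GEMM-dominated hardware: each iteration is essentially one matrix-matrix multiply, with the branch chosen to steer the occupation toward the target electron count. The matrix, occupation, and tolerances below are illustrative; the spectral bounds are taken from exact eigenvalues here purely for convenience.

```python
# SP2 recursive Fermi-operator expansion (illustrative dense sketch).
import numpy as np

def sp2_density_matrix(H, nocc, emin, emax, tol=1e-8, max_iter=100):
    n = H.shape[0]
    X = (emax * np.eye(n) - H) / (emax - emin)  # map spectrum into [0, 1]
    for _ in range(max_iter):
        X2 = X @ X                              # the one GEMM per iteration
        if abs(np.trace(X2) - nocc) < abs(np.trace(2 * X - X2) - nocc):
            X_next = X2                         # pushes occupation down
        else:
            X_next = 2 * X - X2                 # pushes occupation up
        if abs(np.trace(X_next) - nocc) < tol and np.allclose(X_next, X, atol=tol):
            return X_next
        X = X_next
    return X

rng = np.random.default_rng(0)
A = rng.normal(size=(6, 6))
H = (A + A.T) / 2                               # toy symmetric "Hamiltonian"
w = np.linalg.eigvalsh(H)
P = sp2_density_matrix(H, nocc=3, emin=w[0], emax=w[-1])
print(np.trace(P), np.linalg.norm(P @ P - P))   # ~3 and ~0 (idempotent)
```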

  4. Real time 3D structural and Doppler OCT imaging on graphics processing units

    NASA Astrophysics Data System (ADS)

    Sylwestrzak, Marcin; Szlag, Daniel; Szkulmowski, Maciej; Gorczyńska, Iwona; Bukowska, Danuta; Wojtkowski, Maciej; Targowski, Piotr

    2013-03-01

    In this report we present the application of graphics processing unit (GPU) programming to real-time 3D Fourier domain Optical Coherence Tomography (FdOCT) imaging, with Doppler algorithms implemented for visualization of flow in capillary vessels. In general, the time needed to process FdOCT data on the computer's main processor (CPU) is the main limitation for real-time imaging, and additional algorithms, such as Doppler OCT analysis, make the processing even more time consuming. Recently developed GPUs, which offer very high computational power, provide a solution to this problem: exploiting them for massively parallel data processing allows real-time imaging in FdOCT. The presented software for structural and Doppler OCT performs the complete processing and visualization of 2D data consisting of 2000 A-scans generated from 2048-pixel spectra at a frame rate of about 120 fps. 3D imaging in the same mode, for volume data built of 220 × 100 A-scans, is performed at about 8 frames per second. In this paper the software architecture, the organization of threads, and the optimizations applied are described. For illustration, screen shots recorded during real-time imaging of a phantom (a homogeneous water solution of Intralipid in a glass capillary) and of the human eye in vivo are presented.
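
    As a rough illustration of the processing chain described above (spectral windowing, Fourier transform to depth space, and phase-difference Doppler between adjacent A-scans), here is a minimal CPU-side NumPy sketch; the array shapes and names are assumptions, not the authors' GPU implementation.

        import numpy as np

        def process_bscan(spectra, window=None):
            """Minimal FdOCT B-scan sketch.

            spectra: (n_ascans, n_pixels) array of background-subtracted,
            k-space-resampled spectral fringes. Returns a log-scaled
            structural image and a Doppler phase-shift image.
            """
            n_ascans, n_pix = spectra.shape
            if window is None:
                window = np.hanning(n_pix)
            # FFT along the spectral axis; keep the positive-depth half.
            field = np.fft.fft(spectra * window, axis=1)[:, : n_pix // 2]

            structural = 20 * np.log10(np.abs(field) + 1e-12)
            # Doppler: phase difference between adjacent A-scans per depth.
            doppler = np.angle(field[1:] * np.conj(field[:-1]))
            return structural, doppler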

  5. Acceleration of Linear Finite-Difference Poisson-Boltzmann Methods on Graphics Processing Units.

    PubMed

    Qi, Ruxi; Botello-Smith, Wesley M; Luo, Ray

    2017-07-11

    Electrostatic interactions play crucial roles in biophysical processes such as protein folding and molecular recognition. Poisson-Boltzmann equation (PBE)-based models have emerged as a widely used approach for modeling these important processes. Though great effort has been put into developing efficient PBE numerical models, challenges remain due to the high dimensionality of typical biomolecular systems. In this study, we implemented and analyzed commonly used linear PBE solvers on ever-improving graphics processing units (GPUs) for biomolecular simulations, including both standard and preconditioned conjugate gradient (CG) solvers with several alternative preconditioners. Our implementation utilizes the standard Nvidia CUDA libraries cuSPARSE, cuBLAS, and CUSP. Extensive tests show that good numerical accuracy can be achieved even though single precision is often used for numerical applications on GPU platforms. The optimal GPU performance was observed with the Jacobi-preconditioned CG solver, which showed a significant speedup over the standard CG solver on the CPU in our diversified test cases. Our analysis further shows that different matrix storage formats also considerably affect the efficiency of different linear PBE solvers on the GPU, with the diagonal format best suited to our standard finite-difference linear systems. Further efficiency may be possible with matrix-free operations and integrated grid stencil setup specifically tailored for the banded matrices in PBE-specific linear systems.
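
    For readers unfamiliar with the Jacobi-preconditioned CG solver named above, a generic textbook sketch follows; on the GPU, each A @ p becomes a cuSPARSE sparse matrix-vector product and the vector updates become cuBLAS calls. This is not the authors' code, and A is assumed symmetric positive definite.

        import numpy as np

        def jacobi_pcg(A, b, tol=1e-6, max_iter=1000):
            """Jacobi-preconditioned conjugate gradient for SPD systems."""
            m_inv = 1.0 / A.diagonal()      # Jacobi preconditioner: 1/diag(A)
            x = np.zeros_like(b)
            r = b - A @ x                   # residual
            z = m_inv * r                   # preconditioned residual
            p = z.copy()
            rz = r @ z
            for _ in range(max_iter):
                Ap = A @ p                  # the dominant (SpMV) cost
                alpha = rz / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                if np.linalg.norm(r) < tol * np.linalg.norm(b):
                    break
                z = m_inv * r
                rz_new = r @ z
                p = z + (rz_new / rz) * p
                rz = rz_new
            return x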

  6. Evaluation of virus reduction efficiency in wastewater treatment unit processes as a credit value in the multiple-barrier system for wastewater reclamation and reuse.

    PubMed

    Ito, Toshihiro; Kato, Tsuyoshi; Hasegawa, Makoto; Katayama, Hiroyuki; Ishii, Satoshi; Okabe, Satoshi; Sano, Daisuke

    2016-12-01

    The virus reduction efficiency of each unit process is commonly determined from the ratio of virus concentration in the influent to that in the effluent of a unit, but virus concentrations in wastewater often fall below the analytical quantification limit, which makes it impossible to calculate the concentration ratio at each sampling event. In this study, left-censored datasets of norovirus (genogroups I and II) and adenovirus were used to calculate the virus reduction efficiency in unit processes of secondary biological treatment and chlorine disinfection. Virus concentrations in the influent, the effluent from the secondary treatment, and the chlorine-disinfected effluent of four municipal wastewater treatment plants were analyzed by a quantitative polymerase chain reaction (PCR) approach, and the probabilistic distributions of log reduction (LR) were estimated by a Bayesian estimation algorithm. The mean values of LR in the secondary treatment units ranged from 0.9 to 2.2, whereas those in the free chlorine disinfection units ranged from -0.1 to 0.5. The LR value in the secondary treatment was virus type and unit process dependent, which underscores the importance of accumulating virus LR values applicable to the multiple-barrier system, a global concept of microbial risk management in wastewater reclamation and reuse.
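
    To make the left-censoring idea concrete, the sketch below fits a normal model to log10 virus concentrations by maximum likelihood, handling non-detects through the cumulative distribution. The paper itself uses a Bayesian algorithm, so this SciPy version is only a simplified stand-in with invented variable names.

        import numpy as np
        from scipy import optimize, stats

        def censored_normal_mle(detected, limits):
            """MLE for normal(mu, sigma) log10 concentrations.

            detected: observed values (log10 copies/L);
            limits: quantification limits for the left-censored samples,
            which contribute P(X < limit) to the likelihood.
            """
            def nll(params):
                mu, log_sigma = params
                sigma = np.exp(log_sigma)   # keep sigma positive
                ll = stats.norm.logpdf(detected, mu, sigma).sum()
                ll += stats.norm.logcdf(limits, mu, sigma).sum()
                return -ll

            res = optimize.minimize(nll, x0=[np.mean(detected), 0.0])
            return res.x[0], np.exp(res.x[1])   # (mu, sigma)

        # Mean log reduction across a unit is then simply
        # mu_influent - mu_effluent on the log10 scale.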

  7. 78 FR 1259 - Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-08

    ... DEPARTMENT OF LABOR Employment and Training Administration Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United States: 2013 Adverse Effect Wage Rates AGENCY... Department of Agriculture (USDA). 20 CFR 655.120(c) requires that the Administrator of the Office of Foreign...

  8. 76 FR 79711 - Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-22

    ... DEPARTMENT OF LABOR Employment and Training Administration Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United States: 2012 Adverse Effect Wage Rates AGENCY... Department of Agriculture (USDA). 20 CFR 655.120(c) requires the Administrator of the Office of Foreign Labor...

  9. Network-Based Methods for Identifying Key Active Proteins in the Extracellular Electron Transfer Process in Shewanella oneidensis MR-1.

    PubMed

    Ding, Dewu; Sun, Xiao

    2018-01-16

    Shewanella oneidensis MR-1 can transfer electrons from the intracellular environment to the extracellular space to reduce extracellular insoluble electron acceptors (Extracellular Electron Transfer, EET). Benefiting from this EET capability, Shewanella has been widely used in different areas, such as energy production, wastewater treatment, and bioremediation. Genome-wide proteomics data were used to determine the active proteins involved in activating the EET process. We identified 1012 proteins with decreased expression and 811 proteins with increased expression when the EET process changed from inactivation to activation. We then networked these proteins to construct the active protein networks, and identified the top 20 key active proteins by network centralization analysis, including metabolism- and energy-related proteins, signal and transcriptional regulatory proteins, translation-related proteins, and EET-related proteins. We also constructed the integrated protein interaction and transcriptional regulatory networks for the active proteins, found three exclusive active network motifs involved in activating the EET process (Bi-feedforward Loop, Regulatory Cascade with a Feedback, and Feedback with a Protein-Protein Interaction (PPI)), and identified the active proteins involved in these motifs. Both enrichment analysis and comparative analysis against the whole-genome data implicated the multiheme c-type cytochromes and multiple signal processing proteins in the process. Furthermore, the interactions of these motif-guided active proteins and the functional modules involved are discussed. Collectively, using network-based methods, this work reports a proteome-wide search for the key active proteins that potentially activate the EET process.
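
    A generic sketch of the network centrality step using networkx follows; the paper does not publish code, so the edge data, the combined scoring rule, and all names here are illustrative assumptions.

        import networkx as nx

        def top_active_proteins(edges, k=20):
            """Rank proteins in an active-protein network by centrality.

            edges: iterable of (protein_a, protein_b) interaction pairs
            among the differentially expressed proteins. Combines two
            common centrality views into one simple score.
            """
            g = nx.Graph(edges)
            degree = nx.degree_centrality(g)
            betweenness = nx.betweenness_centrality(g)
            score = {n: degree[n] + betweenness[n] for n in g}
            return sorted(score, key=score.get, reverse=True)[:k]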

  10. Multilevel Summation of Electrostatic Potentials Using Graphics Processing Units*

    PubMed Central

    Hardy, David J.; Stone, John E.; Schulten, Klaus

    2009-01-01

    Physical and engineering practicalities involved in microprocessor design have resulted in flat performance growth for traditional single-core microprocessors. The urgent need for continuing increases in the performance of scientific applications requires the use of many-core processors and accelerators such as graphics processing units (GPUs). This paper discusses GPU acceleration of the multilevel summation method for computing electrostatic potentials and forces for a system of charged atoms, which is a problem of paramount importance in biomolecular modeling applications. We present and test a new GPU algorithm for the long-range part of the potentials that computes a cutoff pair potential between lattice points, essentially convolving a fixed 3-D lattice of “weights” over all sub-cubes of a much larger lattice. The implementation exploits the different memory subsystems provided on the GPU to stream optimally sized data sets through the multiprocessors. We demonstrate for the full multilevel summation calculation speedups of up to 26 using a single GPU and 46 using multiple GPUs, enabling the computation of a high-resolution map of the electrostatic potential for a system of 1.5 million atoms in under 12 seconds. PMID:20161132
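
    The long-range step described above amounts to convolving a fixed kernel of weights over a charge lattice. A minimal SciPy sketch of that structure follows; the truncated 1/r kernel is a placeholder for the method's actual smoothed interaction, and nothing here reproduces the paper's GPU memory strategy.

        import numpy as np
        from scipy.ndimage import convolve

        def long_range_potential(charge_lattice, cutoff, spacing):
            """Lattice cutoff-pair-potential sketch (CPU stand-in).

            Builds a fixed 3-D kernel of softened 1/r weights out to the
            cutoff and convolves it over the charge lattice -- the same
            stencil the paper streams through the GPU multiprocessors.
            """
            n = int(np.ceil(cutoff / spacing))
            ax = np.arange(-n, n + 1) * spacing
            dx, dy, dz = np.meshgrid(ax, ax, ax, indexing="ij")
            r = np.sqrt(dx**2 + dy**2 + dz**2)
            kernel = np.where((r > 0) & (r < cutoff),
                              1.0 / np.maximum(r, spacing), 0.0)
            kernel[n, n, n] = 1.0 / spacing   # softened self term
            return convolve(charge_lattice, kernel, mode="constant")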

  11. Identifying Plant Poisoning in Livestock

    USDA-ARS?s Scientific Manuscript database

    Poisonous plant intoxication is a common and often deadly problem that annually costs the livestock industry more than $340 million in the western United States alone. Despite the cost or frequency, definitively identifying or diagnosing poisoning by plants in livestock is challenging. The purpos...

  12. Assessment Approach for Identifying Compatibility of Restoration Projects with Geomorphic and Flooding Processes in Gravel Bed Rivers

    NASA Astrophysics Data System (ADS)

    DeVries, Paul; Aldrich, Robert

    2015-08-01

    A critical requirement for a successful river restoration project in a dynamic gravel bed river is that it be compatible with natural hydraulic and sediment transport processes operating at the reach scale. The potential for failure is greater at locations where the influence of natural processes is inconsistent with intended project function and performance. We present an approach using practical GIS, hydrologic, hydraulic, and sediment transport analyses to identify locations where specific restoration project types have the greatest likelihood of working as intended because their function and design are matched with flooding and morphologic processes. The key premise is to identify whether a specific river analysis segment (length ~1-10 bankfull widths) within a longer reach is geomorphically active or inactive in the context of vertical and lateral stabilities, and hydrologically active for floodplain connectivity. Analyses involve empirical channel geometry relations, aerial photographic time series, LiDAR data, HEC-RAS hydraulic modeling, and a time-integrated sediment transport budget to evaluate trapping efficiency within each segment. The analysis segments are defined by HEC-RAS model cross sections. The results have been used effectively to identify feasible projects in a variety of alluvial gravel bed river reaches with lengths between 11 and 80 km and 2-year flood magnitudes between ~350 and 1330 m³/s. Projects constructed based on the results have all performed as planned. In addition, the results provide key criteria for formulating erosion and flood management plans.
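
    As a toy illustration of the time-integrated sediment budget idea, a segment's trapping efficiency can be summarized as the fraction of the incoming load it retains over the flow record; the sketch below assumes transport-capacity time series at the bounding cross sections, and the names and units are ours, not the authors'.

        import numpy as np

        def trapping_efficiency(qs_in, qs_out, dt):
            """Time-integrated sediment budget for one analysis segment.

            qs_in, qs_out: sediment transport rate series (kg/s) at the
            upstream and downstream cross sections; dt: time step (s).
            Values near +1 flag strong net deposition (an aggrading,
            geomorphically active segment); near 0, transport continuity.
            """
            load_in = np.sum(qs_in) * dt
            load_out = np.sum(qs_out) * dt
            return (load_in - load_out) / load_in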

  13. Assessment Approach for Identifying Compatibility of Restoration Projects with Geomorphic and Flooding Processes in Gravel Bed Rivers.

    PubMed

    DeVries, Paul; Aldrich, Robert

    2015-08-01

    A critical requirement for a successful river restoration project in a dynamic gravel bed river is that it be compatible with natural hydraulic and sediment transport processes operating at the reach scale. The potential for failure is greater at locations where the influence of natural processes is inconsistent with intended project function and performance. We present an approach using practical GIS, hydrologic, hydraulic, and sediment transport analyses to identify locations where specific restoration project types have the greatest likelihood of working as intended because their function and design are matched with flooding and morphologic processes. The key premise is to identify whether a specific river analysis segment (length ~1-10 bankfull widths) within a longer reach is geomorphically active or inactive in the context of vertical and lateral stabilities, and hydrologically active for floodplain connectivity. Analyses involve empirical channel geometry relations, aerial photographic time series, LiDAR data, HEC-RAS hydraulic modeling, and a time-integrated sediment transport budget to evaluate trapping efficiency within each segment. The analysis segments are defined by HEC-RAS model cross sections. The results have been used effectively to identify feasible projects in a variety of alluvial gravel bed river reaches with lengths between 11 and 80 km and 2-year flood magnitudes between ~350 and 1330 m³/s. Projects constructed based on the results have all performed as planned. In addition, the results provide key criteria for formulating erosion and flood management plans.

  14. The implementation of unit-based perinatal mortality audit in perinatal cooperation units in the northern region of the Netherlands.

    PubMed

    van Diem, Mariet Th; Timmer, Albertus; Bergman, Klasien A; Bouman, Katelijne; van Egmond, Nico; Stant, Dennis A; Ulkeman, Lida H M; Veen, Wenda B; Erwich, Jan Jaap H M

    2012-07-09

    Perinatal (mortality) audit can be considered a way to improve the care process for all pregnant women and their newborns by creating an opportunity to learn from unwanted events in the care process. In unit-based perinatal audit, the caregivers involved in cases that result in mortality are usually part of the audit group. This makes such an audit a delicate matter. The purpose of this study was to implement unit-based perinatal mortality audit in all 15 perinatal cooperation units in the northern region of the Netherlands between September 2007 and March 2010. These units consist of hospital-based and independent community-based perinatal caregivers. The implementation strategy encompassed an information plan, an organization plan, and a training plan. The main outcomes are the number of participating perinatal cooperation units at the end of the project, the identified substandard factors (SSF), the actions to improve care, and the opinions of the participants. The perinatal mortality audit was implemented in all 15 perinatal cooperation units. In total, 677 different caregivers analyzed 112 cases of perinatal mortality and identified 163 substandard factors. In 31% of cases the guidelines were not followed and in 23% care was not according to normal practice. In 28% of cases, the documentation was not in order, while in 13% of cases the communication between caregivers was insufficient. A total of 442 actions to improve care were reported for 'external cooperation' (15%), 'internal cooperation' (17%), 'practice organization' (26%), 'training and education' (10%), and 'medical performance' (27%). Valued aspects of the audit meetings were: the multidisciplinary character (13%), the collective and non-judgmental search for substandard factors (21%), the perception of safety (13%), the motivation to reflect on one's own professional performance (5%), and the inherent postgraduate education (10%). Following our implementation strategy, the perinatal mortality audit has been

  15. Nanosilver as a disinfectant in dental unit waterlines ...

    EPA Pesticide Factsheets

    Dental unit water lines (DUWL) are susceptible to biofilm development and bacterial growth leading to water contamination, causing health and ecological effects. This study monitors the interactions between a commonly used nanosilver disinfectant (ASAP-AGX-32, an antimicrobial cleaner for dental units, 0.0032% Ag) and biofilm development in DUWL. To simulate the disinfection scenario, an in-house DUWL model was assembled and biofilm accumulation was allowed. Subsequent to biofilm development, the disinfection process was performed according to the manufacturer's instructions. The pristine nanosilver particles in the cleaner measured between 3 and 5 nm in diameter and were surrounded by a stabilizing polymer. However, the polymeric stabilizing agent diminished over the disinfection process, initiating partial AgNP aggregation. Furthermore, the surface speciation of the pristine AgNPs was identified as primarily AgO, and after the disinfection process, transformations to AgCl were observed. The physicochemical characteristics of AgNPs are known to govern their fate, transport, and environmental implications. Hence, knowledge of the AgNP characteristics after the disinfection process (usage scenario) is of significance. This study demonstrates the adsorption of AgNPs onto biofilm surfaces and, therefore, will assist in elucidating the toxicity mechanisms of AgNPs to bacteria and biofilms. This work can be an initial step in better understanding how

  16. Changes in estrogenicity and micropollutant concentrations across unit processes in a biological wastewater treatment system.

    PubMed

    Chen, Jian Lin; Ravindran, Shanthinie; Swift, Simon; Singhal, Naresh

    2018-03-01

    The behavior of 10 micropollutants, i.e., four estrogens (estrone, 17β-estradiol, estriol, 17α-ethynylestradiol), carbamazepine (CBZ), sulfamethoxazole (SMX), triclosan, oxybenzone, 4-nonylphenol, and bisphenol A, was investigated in a typical domestic wastewater treatment plant. LC-MS and a yeast estrogen screen bioassay were used to study the changes in micropollutants and estrogenicity across unit processes in the treatment system. Primary treatment via sedimentation removed only 4-nonylphenol and led to no significant change in estrogenicity. Secondary treatment by the biological nitrification-denitrification process completely removed oxybenzone and partially removed the estrogens, decreasing the estrogenic activity from 80 to 48 ng/L as estradiol equivalent (EEq). Ultraviolet treatment completely degraded the estrogens and triclosan but failed to lower the concentrations of bisphenol A, SMX, and CBZ; estrogenic activity nevertheless decreased from 48 to 5 ng/L EEq across the unit, a value only slightly larger than the 1 ng/L EEq observed for the deionized control. Similarly, the anaerobic digestion of sludge completely degraded the estrogens, oxybenzone, and SMX, but had no impact on bisphenol A, triclosan, and CBZ. The study emphasises the need to complement chemical analyses with estrogenic bioassays to evaluate the efficacy of wastewater treatment plants.

  17. Hydrologic Unit Map -- 1978, state of South Dakota

    USGS Publications Warehouse


    1978-01-01

    This map and accompanying table show Hydrologic Units that are basically hydrographic in nature. The Cataloging Units shown supplant the Cataloging Units previously depicted on the 1974 State Hydrologic Unit Map. The boundaries as shown have been adapted from the 1974 State Hydrologic Unit Map, "The Catalog of Information on Water Data" (1972), "Water Resources Regions and Subregions for the National Assessment of Water and Related Land Resources" by the U.S. Water Resources Council (1970), "River Basin of the United States" by the U.S. Soil Conservation Service (1963, 1970), "River Basin Maps Showing Hydrologic Stations" by the Inter-Agency Committee on Water Resources, Subcommittee on Hydrology (1961), and State planning maps. The political subdivisions have been adopted from "Counties and County Equivalents of the States of the United States" presented in Federal Information Processing Standards Publication 6-2, issued by the National Bureau of Standards (1973), in which each county or county equivalent is identified by a 2-character State code and a 3-character county code. The Regions, Subregions, and Accounting Units are aggregates of the Cataloging Units. The Regions and Subregions are currently (1978) used by the U.S. Water Resources Council for comprehensive planning, including the National Assessment, and as a standard geographical framework for more detailed water and related land-resources planning. The Accounting Units are those currently (1978) in use by the U.S. Geological Survey for managing the National Water Data Network. This map was revised to include a boundary realignment between Cataloging Units 10140103 and 10160009.

  18. Large-scale neural circuit mapping data analysis accelerated with the graphical processing unit (GPU).

    PubMed

    Shi, Yulin; Veidenbaum, Alexander V; Nicolau, Alex; Xu, Xiangmin

    2015-01-15

    Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post hoc processing and analysis. Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU-enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU, with up to a 22× speedup depending on the computational task. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. To the best of our knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Together, GPU-enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Large scale neural circuit mapping data analysis accelerated with the graphical processing unit (GPU)

    PubMed Central

    Shi, Yulin; Veidenbaum, Alexander V.; Nicolau, Alex; Xu, Xiangmin

    2014-01-01

    Background: Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post-hoc processing and analysis. New method: Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU-enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. Results: We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU, with up to a 22x speedup depending on the computational task. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. Comparison with existing method(s): To the best of our knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Conclusions: Together, GPU-enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. PMID:25277633

  20. An Examination of Individual Level Factors in Stress and Coping Processes: Perspectives of Chinese International Students in the United States

    ERIC Educational Resources Information Center

    Yan, Kun; Berliner, David C.

    2011-01-01

    No empirical research has focused solely upon understanding the stress and coping processes of Chinese international students in the United States. This qualitative inquiry examines the individual-level variables that affect the stress-coping process of Chinese international students and how they conceptualize and adapt to their stress at an…

  1. Energy resources of the United States

    USGS Publications Warehouse

    Theobald, P.K.; Schweinfurth, Stanley P.; Duncan, Donald Cave

    1972-01-01

    Estimates are made of United States resources of coal, petroleum liquids, natural gas, uranium, geothermal energy, and oil from oil shale. The estimates, compiled by specialists of the U.S. Geological Survey, are generally made on geologic projections of favorable rocks and on anticipated frequency of the energy resource in the favorable rocks. Accuracy of the estimates probably ranges from 20 to 50 percent for identified-recoverable resources to about an order of magnitude for undiscovered-submarginal resources. The total coal resource base in the United States is estimated to be about 3,200 billion tons, of which 200-390 billion tons can be considered in the category identified and recoverable. More than 70 percent of current production comes from the Appalachian basin where the resource base, better known than for the United States as a whole, is about 330 billion tons, of which 22 billion tons is identified and recoverable. Coals containing less than 1 percent sulfur are the premium coals. These are abundant in the western coal fields, but in the Appalachian basin the resource base for low-sulfur coal is estimated to be only a little more than 100 billion tons, of which 12 billion tons is identified and recoverable. Of the many estimates of petroleum liquids and natural-gas resources, those of the U.S. Geological Survey are the largest because, in general, our estimates include the largest proportion of favorable ground for exploration. We estimate the total resource base for petroleum liquids to be about 2,900 billion barrels, of which 52 billion barrels is identified and recoverable. Of the total resource base, some 600 billion barrels is in Alaska or offshore from Alaska, 1,500 billion barrels is offshore from the United States, and 1,300 billion barrels is onshore in the conterminous United States. Identified-recoverable resources of petroleum liquids corresponding to these geographic units are 11, 6, and 36 billion barrels, respectively. The total natural

  2. OCTGRAV: Sparse Octree Gravitational N-body Code on Graphics Processing Units

    NASA Astrophysics Data System (ADS)

    Gaburov, Evghenii; Bédorf, Jeroen; Portegies Zwart, Simon

    2010-10-01

    Octgrav is a very fast tree-code which runs on massively parallel Graphics Processing Units (GPUs) with the NVIDIA CUDA architecture. The algorithms are based on parallel-scan and sort methods. The tree construction and the calculation of multipole moments are carried out on the host CPU, while the force calculation, which consists of tree walks and the evaluation of interaction lists, is carried out on the GPU. In this way, a sustained performance of about 100 GFLOP/s and data transfer rates of about 50 GB/s are achieved. It takes about a second to compute forces on a million particles with an opening angle of θ ≈ 0.5. To test the performance and feasibility, we implemented the algorithms in CUDA in the form of a gravitational tree-code which runs completely on the GPU. The tree construction and traverse algorithms are portable to many-core devices which have support for the CUDA or OpenCL programming languages. The gravitational tree-code outperforms tuned CPU code during tree construction and shows a performance improvement of more than a factor of 20 overall, resulting in a processing rate of more than 2.8 million particles per second. The code has a convenient user interface and is freely available for use.

  3. Chapter 3. Coordination and collaboration with interface units

    PubMed Central

    Joynt, Gavin M.; Loo, Shi; Taylor, Bruce L.; Margalit, Gila; Christian, Michael D.; Sandrock, Christian; Danis, Marion; Leoniv, Yuval

    2016-01-01

    Purpose: To provide recommendations and standard operating procedures (SOPs) for intensive care unit (ICU) and hospital preparations for an influenza pandemic or mass disaster with a specific focus on enhancing coordination and collaboration between the ICU and other key stakeholders. Methods: Based on a literature review and expert opinion, a Delphi process was used to define the essential topics including coordination and collaboration. Results: Key recommendations include: (1) establish an Incident Management System with Emergency Executive Control Groups at facility, local, regional/state or national levels to exercise authority and direction over resource use and communications; (2) develop a system of communication, coordination and collaboration between the ICU and key interface departments within the hospital; (3) identify key functions or processes requiring coordination and collaboration, the most important of these being manpower and resources utilization (surge capacity) and re-allocation of personnel, equipment and physical space; (4) develop processes to allow smooth inter-departmental patient transfers; (5) creating systems and guidelines is not sufficient; it is important to: (a) identify the roles and responsibilities of key individuals necessary for the implementation of the guidelines; (b) ensure that these individuals are adequately trained and prepared to perform their roles; (c) ensure adequate equipment to allow key coordination and collaboration activities; (d) ensure an adequate physical environment to allow staff to properly implement guidelines; (6) trigger events for determining a crisis should be defined. Conclusions: Judicious planning and adoption of protocols for coordination and collaboration with interface units are necessary to optimize outcomes during a pandemic. PMID:20213418

  4. Psychometric assessment of the Family Satisfaction in the Intensive Care Unit questionnaire in the United Kingdom.

    PubMed

    Harrison, David A; Ferrando-Vivas, Paloma; Wright, Stephen E; McColl, Elaine; Heyland, Daren K; Rowan, Kathryn M

    2017-04-01

    To establish the psychometric properties of the Family Satisfaction in the Intensive Care Unit 24-item (FS-ICU-24) questionnaire in the United Kingdom. The Family-Reported Experiences Evaluation study recruited family members of patients staying at least 24 hours in 20 participating intensive care units. Questionnaires were evaluated for nonresponse, floor/ceiling effects, redundancy, and construct validity. Internal consistency was evaluated with item-to-own scale correlations and Cronbach α. Confirmatory and exploratory factor analyses were used to explore the underlying structure. Twelve thousand three hundred forty-six family members of 6380 patients were recruited and 7173 (58%) family members of 4615 patients returned a completed questionnaire. One family member per patient was included in the psychometric assessment. Six items had greater than 10% nonresponse; 1 item had a ceiling effect; and 11 items had potential redundancy. Internal consistency was high (Cronbach α, overall .96; satisfaction with care, .94; satisfaction with decision making, .93). The 2-factor solution was not a good fit. Exploratory factor analysis indicated that satisfaction with decision making encompassed 2 constructs: satisfaction with information and satisfaction with the decision-making process. The Family Satisfaction in the Intensive Care Unit 24-item questionnaire demonstrated good psychometric properties in the United Kingdom setting. Construct validity could be improved by use of 3 domains, and some scope for further improvement was identified. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Silicon-Carbide Power MOSFET Performance in High Efficiency Boost Power Processing Unit for Extreme Environments

    NASA Technical Reports Server (NTRS)

    Ikpe, Stanley A.; Lauenstein, Jean-Marie; Carr, Gregory A.; Hunter, Don; Ludwig, Lawrence L.; Wood, William; Del Castillo, Linda Y.; Fitzpatrick, Fred; Chen, Yuan

    2016-01-01

    Silicon-Carbide device technology has generated much interest in recent years. With superior thermal performance, power ratings, and potential switching frequencies compared to its Silicon counterpart, Silicon-Carbide offers a greater possibility for high-powered switching applications in extreme environments. In particular, the maturing process technology of Silicon-Carbide Metal-Oxide-Semiconductor Field-Effect Transistors (MOSFETs) has produced a plethora of commercially available power-dense, low on-state resistance devices capable of switching at high frequencies. A novel hard-switched power processing unit (PPU) is implemented utilizing Silicon-Carbide power devices. Accelerated life data are captured and assessed in conjunction with a damage accumulation model of gate oxide and drain-source junction lifetime to evaluate potential system performance in high temperature environments.

  6. Identifying Potentially Preventable Emergency Department Visits by Nursing Home Residents in the United States.

    PubMed

    Burke, Robert E; Rooks, Sean P; Levy, Cari; Schwartz, Robert; Ginde, Adit A

    2015-05-01

    To identify and describe potentially preventable emergency department (ED) visits by nursing home (NH) residents in the United States. These visits are important because they are common, frequently lead to hospitalization, and can be associated with significant cost to the patient and the health care system. Retrospective analysis of the 2005-2010 National Hospital Ambulatory Medical Care Survey (NHAMCS), comparing ED visits by nursing home residents that did not lead to hospital admission (potentially preventable) with those that led to admission (less likely preventable). Nationally representative sample of US EDs; federal hospitals and hospitals with fewer than 6 beds were excluded. Older (age ≥65 years) NH residents with an ED visit during this time period. Patient demographics, ED visit information including testing performed, interventions (both procedures and medications) provided, and diagnoses treated. Older NH residents accounted for 3857 of 208,956 ED visits during the time period of interest (1.8%). When weighted to be nationally representative, these represent 13.97 million ED visits, equivalent to 1.8 ED visits annually per NH resident in the United States. More than half of visits (53.5%) did not lead to hospital admission; of those discharged from the ED, 62.8% had normal vital signs on presentation and 18.9% did not have any diagnostic testing before ED discharge. Injuries were 1.78 times more likely to be discharged than admitted (44.8% versus 25.3%, respectively, P < .001), whereas infections were 2.06 times as likely to be admitted as discharged (22.9% versus 11.1%, respectively). Computed tomography (CT) scans were performed in 25.4% and 30.1% of older NH residents who were discharged from the ED and admitted to the hospital, respectively, and more than 70% of these were CTs of the head. NH residents received centrally acting, sedating medications before ED discharge in 9.4% of visits. This nationally representative sample of older NH residents

  7. Accelerating Wright–Fisher Forward Simulations on the Graphics Processing Unit

    PubMed Central

    Lawrie, David S.

    2017-01-01

    Forward Wright–Fisher simulations are powerful in their ability to model complex demography and selection scenarios, but suffer from slow execution on the Central Processor Unit (CPU), thus limiting their usefulness. However, the single-locus Wright–Fisher forward algorithm is exceedingly parallelizable, with many steps that are so-called “embarrassingly parallel,” consisting of a vast number of individual computations that are all independent of each other and thus capable of being performed concurrently. The rise of modern Graphics Processing Units (GPUs) and programming languages designed to leverage the inherent parallel nature of these processors have allowed researchers to dramatically speed up many programs that have such high arithmetic intensity and intrinsic concurrency. The presented GPU Optimized Wright–Fisher simulation, or “GO Fish” for short, can be used to simulate arbitrary selection and demographic scenarios while running over 250-fold faster than its serial counterpart on the CPU. Even modest GPU hardware can achieve an impressive speedup of over two orders of magnitude. With simulations so accelerated, one can not only do quick parametric bootstrapping of previously estimated parameters, but also use simulated results to calculate the likelihoods and summary statistics of demographic and selection models against real polymorphism data, all without restricting the demographic and selection scenarios that can be modeled or requiring approximations to the single-locus forward algorithm for efficiency. Further, as many of the parallel programming techniques used in this simulation can be applied to other computationally intensive algorithms important in population genetics, GO Fish serves as an exciting template for future research into accelerating computation in evolution. GO Fish is part of the Parallel PopGen Package available at: http://dl42.github.io/ParallelPopGen/. PMID:28768689
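
    To show why the single-locus forward algorithm parallelizes so well, here is a minimal NumPy sketch of many independent Wright-Fisher sites evolving under genic selection and binomial drift; it illustrates the general algorithm only, not GO Fish's CUDA implementation, and all names are ours.

        import numpy as np

        def wright_fisher(n_pop, p0, s, n_gen, n_sites, rng=None):
            """Vectorized single-locus Wright-Fisher forward simulation.

            Tracks allele frequencies at n_sites independent sites in a
            diploid population of size n_pop under selection coefficient s.
            The per-site independence is the "embarrassing" parallelism that
            GO Fish maps onto GPU threads; NumPy vectorization stands in.
            """
            rng = np.random.default_rng() if rng is None else rng
            p = np.full(n_sites, p0)
            for _ in range(n_gen):
                # Genic selection shifts the expected frequency...
                p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))
                # ...and binomial sampling of 2N gametes supplies the drift.
                p = rng.binomial(2 * n_pop, p_sel) / (2 * n_pop)
            return p

        # Example: 10,000 sites, N = 1000, weak positive selection.
        freqs = wright_fisher(1000, 0.05, 0.01, n_gen=500, n_sites=10_000)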

  8. Method of identifying plant pathogen tolerance

    DOEpatents

    Ecker, Joseph R.; Staskawicz, Brian J.; Bent, Andrew F.; Innes, Roger W.

    1997-10-07

    A process for identifying a plant having disease tolerance comprising administering to a plant an inhibitory amount of ethylene and screening for ethylene insensitivity, thereby identifying a disease tolerant plant, is described. Plants identified by the foregoing process are also described.

  9. Comparative qualitative phosphoproteomics analysis identifies shared phosphorylation motifs and associated biological processes in evolutionary divergent plants.

    PubMed

    Al-Momani, Shireen; Qi, Da; Ren, Zhe; Jones, Andrew R

    2018-06-15

    Phosphorylation is one of the most prevalent post-translational modifications and plays a key role in regulating cellular processes. We carried out a bioinformatics analysis of pre-existing phosphoproteomics data, profiling two model species that represent the two largest subclasses of flowering plants, the dicot Arabidopsis thaliana and the monocot Oryza sativa, to understand the extent to which phosphorylation signaling and function are conserved across evolutionarily divergent plants. We identified 6537 phosphopeptides from 3189 phosphoproteins in Arabidopsis and 2307 phosphopeptides from 1613 phosphoproteins in rice. We identified phosphorylation motifs, finding nineteen pS motifs and two pT motifs shared between rice and Arabidopsis. The majority of shared motif-containing proteins were mapped to the same biological processes with similar patterns of fold enrichment, indicating high functional conservation. We also identified shared patterns of crosstalk between phosphoserines, with enrichment for the motifs pSXpS, pSXXpS and pSXXXpS, where X is any amino acid. Lastly, our results identified several pairs of motifs that significantly co-occur in Arabidopsis proteins, indicating crosstalk between different sites, but this was not observed in rice. Our results demonstrate that there are evolutionarily conserved mechanisms of phosphorylation-mediated signaling in plants, via analysis of high-throughput phosphoproteomics data from key monocot and dicot species: rice and Arabidopsis thaliana. The results also suggest that there is increased crosstalk between phosphorylation sites in A. thaliana compared with rice. The results are important for our general understanding of cell signaling in plants, and for the ability to use A. thaliana as a general model for plant biology. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
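
    The spacing motifs above (pSXpS, pSXXpS, pSXXXpS) can be counted with simple pattern matching. The sketch below assumes a hypothetical encoding in which phosphoserines appear as lowercase "s" within otherwise uppercase peptides; it illustrates the idea only and is not the authors' pipeline.

        import re

        # Phosphoserine spacing motifs as regular expressions.
        CROSSTALK = {"pSXpS": r"s.s", "pSXXpS": r"s..s", "pSXXXpS": r"s...s"}

        def count_crosstalk_motifs(phosphopeptides):
            """Count co-occurring phosphoserine spacings across peptides.

            Lookahead assertions let overlapping matches all be counted.
            """
            counts = {name: 0 for name in CROSSTALK}
            for pep in phosphopeptides:
                for name, pat in CROSSTALK.items():
                    counts[name] += len(re.findall(f"(?={pat})", pep))
            return counts

        # Example: phosphosites at positions 3, 6 and 9 of a toy peptide.
        print(count_crosstalk_motifs(["AKsPGsLLsR"]))
        # {'pSXpS': 0, 'pSXXpS': 2, 'pSXXXpS': 0}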

  10. Particle-in-cell simulations with charge-conserving current deposition on graphic processing units

    NASA Astrophysics Data System (ADS)

    Ren, Chuang; Kong, Xianglong; Huang, Michael; Decyk, Viktor; Mori, Warren

    2011-10-01

    Recently, using CUDA, we have developed an electromagnetic Particle-in-Cell (PIC) code with charge-conserving current deposition for Nvidia graphics processing units (GPUs) (Kong et al., Journal of Computational Physics 230, 1676 (2011)). On a Tesla M2050 (Fermi) card, the GPU PIC code can achieve a one-particle-step process time of 1.2 - 3.2 ns in 2D and 2.3 - 7.2 ns in 3D, depending on plasma temperatures. In this talk we will discuss novel algorithms for GPU-PIC, including a charge-conserving current deposition scheme with little branching and a parallel particle sort. These algorithms make efficient use of the GPU shared memory. We will also discuss how to replace the computation kernels of existing parallel CPU codes while keeping their parallel structures. This work was supported by the U.S. Department of Energy under Grant Nos. DE-FG02-06ER54879 and DE-FC02-04ER54789 and by NSF under Grant Nos. PHY-0903797 and CCF-0747324.
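
    Of the algorithms mentioned, the particle sort is the easiest to sketch. Below is a serial, one-dimensional NumPy stand-in for the parallel sort-by-cell; on the GPU the histogram and prefix scan are themselves parallel kernels, and the stable sort becomes a scatter using the computed offsets. Names are illustrative, not from the paper.

        import numpy as np

        def sort_particles_by_cell(positions, cell_size, n_cells):
            """Counting-sort particles by cell index (1D sketch).

            Keeping particles from the same cell contiguous in memory is
            what lets shared-memory current-deposition kernels coalesce.
            """
            cell = (positions // cell_size).astype(int)
            counts = np.bincount(cell, minlength=n_cells)   # histogram
            offsets = np.concatenate(([0], np.cumsum(counts)[:-1]))  # scan
            order = np.argsort(cell, kind="stable")   # stand-in for scatter
            return positions[order], cell[order], offsets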

  11. Texture segmentation: do the processing units on the saliency map increase with eccentricity?

    PubMed

    Schade, Ursula; Meinecke, Cristina

    2011-01-01

    The saliency map is a computational model constructed to simulate human saliency processing, e.g. pop-out target detection (e.g. Itti & Koch, 2000). In this study the spatial structure of the saliency map was investigated. It is proposed that the saliency map is structured into processing units whose size increases with retinal eccentricity. In two experiments the distance between a target in the stimulus and an irrelevant structure in the mask was varied systematically. Our findings make two main points. First, in texture segmentation tasks the saliency signals from two texture irregularities interfere when these irregularities appear within a critical spatial distance. Second, the critical distances increase with target eccentricity. The eccentricity-dependent critical distances can be interpreted as crowding effects. It is assumed that, in addition to the target eccentricity, the strength of a saliency signal can determine the spatial area of its impairing influence. Copyright © 2010 Elsevier Ltd. All rights reserved.

  12. Modeling of 2D diffusion processes based on microscopy data: parameter estimation and practical identifiability analysis.

    PubMed

    Hock, Sabrina; Hasenauer, Jan; Theis, Fabian J

    2013-01-01

    Diffusion is a key component of many biological processes such as chemotaxis, developmental differentiation and tissue morphogenesis. Recently, it has become possible to assess the spatial gradients caused by diffusion in vitro and in vivo using microscopy-based imaging techniques. The resulting time series of two-dimensional, high-resolution images, in combination with mechanistic models, enable the quantitative analysis of the underlying mechanisms. However, such model-based analysis is still challenging due to measurement noise and sparse observations, which result in uncertainties in the model parameters. We introduce a likelihood function for image-based measurements with log-normally distributed noise. Based upon this likelihood function we formulate the maximum likelihood estimation problem, which is solved using PDE-constrained optimization methods. To assess the uncertainty and practical identifiability of the parameters we introduce profile likelihoods for diffusion processes. As proof of concept, we model certain aspects of the guidance of dendritic cells towards lymphatic vessels, an example of haptotaxis. Using a realistic set of artificial measurement data, we estimate the five kinetic parameters of this model and compute profile likelihoods. Our novel approach for the estimation of model parameters from image data, as well as the proposed identifiability analysis approach, is widely applicable to diffusion processes. The profile likelihood based method provides more rigorous uncertainty bounds than local approximation methods.
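
    A minimal sketch of the two ingredients named above, a log-normal measurement likelihood and a profile likelihood over one parameter, is given below. The PDE model is abstracted into a generic predict function, and everything else (names, optimizer choice, sigma as the last parameter) is an assumption rather than the authors' implementation.

        import numpy as np
        from scipy.optimize import minimize

        def neg_log_likelihood(theta, predict, data):
            """Log-normal noise model for positive image intensities.

            data and predict(theta[:-1]) are flattened positive arrays;
            theta's last entry is the log-normal noise scale sigma.
            """
            sigma = theta[-1]
            mu = np.log(predict(theta[:-1]))
            return np.sum(np.log(data * sigma * np.sqrt(2 * np.pi))
                          + (np.log(data) - mu) ** 2 / (2 * sigma ** 2))

        def profile_likelihood(i, grid, theta0, predict, data):
            """Fix parameter i at each grid value, re-optimize the rest."""
            prof = []
            for val in grid:
                def nll_fixed(free):
                    full = np.insert(free, i, val)   # rebuild full vector
                    return neg_log_likelihood(full, predict, data)
                free0 = np.delete(theta0, i)
                prof.append(minimize(nll_fixed, free0,
                                     method="Nelder-Mead").fun)
            return np.array(prof)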

  13. Method of identifying plant pathogen tolerance

    DOEpatents

    Ecker, J.R.; Staskawicz, B.J.; Bent, A.F.; Innes, R.W.

    1997-10-07

    A process for identifying a plant having disease tolerance comprising administering to a plant an inhibitory amount of ethylene and screening for ethylene insensitivity, thereby identifying a disease tolerant plant, is described. Plants identified by the foregoing process are also described. 7 figs.

  14. Graphics Processing Unit (GPU) implementation of image processing algorithms to improve system performance of the Control, Acquisition, Processing, and Image Display System (CAPIDS) of the Micro-Angiographic Fluoroscope (MAF).

    PubMed

    Vasan, S N Swetadri; Ionita, Ciprian N; Titus, A H; Cartwright, A N; Bednarek, D R; Rudin, S

    2012-02-23

    We present the image processing upgrades implemented on a Graphics Processing Unit (GPU) in the Control, Acquisition, Processing, and Image Display System (CAPIDS) for the custom Micro-Angiographic Fluoroscope (MAF) detector. Most of the image processing currently implemented in the CAPIDS system is pixel independent; that is, the operation on each pixel is the same and the operation on one pixel does not depend upon the result of the operation on another, allowing the entire image to be processed in parallel. GPU hardware is designed for exactly this kind of massively parallel processing, so for an algorithm with a high degree of parallelism a GPU implementation is much faster than a CPU implementation. The image processing algorithm upgrades implemented on the CAPIDS system include flat-field correction, temporal filtering, image subtraction, roadmap mask generation, and display windowing and leveling. A comparison between the previous and the upgraded versions of CAPIDS is presented to demonstrate how the improvement is achieved. By performing the image processing on a GPU, significant improvements in timing and frame rate have been achieved, including stable operation of the system at 30 fps during a fluoroscopy run, a DSA run, a roadmap procedure, and automatic image windowing and leveling during each frame.
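
    Two of the pixel-independent operations listed above are easy to make concrete. The NumPy sketch below shows flat-field correction and display window/leveling in their generic textbook forms; this is not the CAPIDS code, and the clamping constants are placeholders.

        import numpy as np

        def flat_field_correct(frame, dark, flat):
            """Pixel-independent flat-field (gain) correction.

            Each output pixel depends only on the same pixel of the
            inputs, so the frame maps onto one GPU thread per pixel.
            """
            signal = np.maximum(flat - dark, 1.0)   # avoid divide-by-zero
            gain = np.mean(signal) / signal
            return (frame - dark) * gain

        def window_level(image, window, level):
            """Map [level - window/2, level + window/2] to 8-bit display."""
            lo = level - window / 2.0
            out = np.clip((image - lo) / window, 0.0, 1.0)
            return (out * 255).astype(np.uint8)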

  15. Intermediate SCDC Spanish Curricula Units. Science/Health, Unit 1, Kits 1-4, Teacher's Guide.

    ERIC Educational Resources Information Center

    Spanish Curricula Development Center, Miami Beach, FL.

    Unified by the theme "our community", this unit, part of nine basic instructional units for intermediate level, reflects the observations of Mexican Americans, Puerto Ricans, and Cubans in various regions of the United States. Comprised of Kits 1-4, the unit extends the following basic and interpreted science processes: observing, communicating,…

  16. Two-digit number comparison: Decade-unit and unit-decade produce the same compatibility effect with number words.

    PubMed

    Macizo, Pedro; Herrera, Amparo

    2010-03-01

    This study explored the processing of 2-digit number words by examining the unit-decade compatibility effect in Spanish. Participants were required to choose the larger of 2-digit number words presented in verbal notation. In compatible trials the decade and unit comparisons led to the same response (e.g., 53-68) while in incompatible trials the decade and unit comparisons led to different responses (e.g., 59-74). Participants were slower on compatible trials as compared to incompatible trials. In Experiments 2 and 3, we evaluated whether the reverse compatibility effect in Spanish was only due to a pure left-to-right encoding which favours the decade processing in this language (decade-unit order). When participants processed 2-digit number words presented in reverse form (in the unit-decade order), the same reverse compatibility effect was found. This pattern of results suggests that participants have learnt a language-dependent process for analysing written numbers which is used irrespective of the specific arrangement of units and decades in the comparison task. 2010 APA, all rights reserved.

  17. Nanoscale multireference quantum chemistry: full configuration interaction on graphical processing units.

    PubMed

    Fales, B Scott; Levine, Benjamin G

    2015-10-13

    Methods based on a full configuration interaction (FCI) expansion in an active space of orbitals are widely used for modeling chemical phenomena such as bond breaking, multiply excited states, and conical intersections in small-to-medium-sized molecules, but these phenomena occur in systems of all sizes. To scale such calculations up to the nanoscale, we have developed an implementation of FCI in which electron repulsion integral transformation and several of the more expensive steps in σ vector formation are performed on graphical processing unit (GPU) hardware. When applied to a 1.7 × 1.4 × 1.4 nm silicon nanoparticle (Si72H64) described with the polarized, all-electron 6-31G** basis set, our implementation can solve for the ground state of the 16-active-electron/16-active-orbital CASCI Hamiltonian (more than 100,000,000 configurations) in 39 min on a single NVidia K40 GPU.

  18. Kinematic modelling of disc galaxies using graphics processing units

    NASA Astrophysics Data System (ADS)

    Bekiaris, G.; Glazebrook, K.; Fluke, C. J.; Abraham, R.

    2016-01-01

    With large-scale integral field spectroscopy (IFS) surveys of thousands of galaxies currently underway or planned, the astronomical community is in need of methods, techniques and tools that will allow the analysis of huge amounts of data. We focus on the kinematic modelling of disc galaxies and investigate the potential use of massively parallel architectures, such as the graphics processing unit (GPU), as an accelerator for the computationally expensive model-fitting procedure. We review the algorithms involved in model-fitting and evaluate their suitability for GPU implementation. We employ different optimization techniques, including the Levenberg-Marquardt and nested sampling algorithms, but also a naive brute-force approach based on nested grids. We find that the GPU can accelerate the model-fitting procedure by up to a factor of ~100 when compared to a single-threaded CPU, and by up to a factor of ~10 when compared to a multithreaded dual-CPU configuration. Our method's accuracy, precision and robustness are assessed by successfully recovering the kinematic properties of simulated data, and also by verifying the kinematic modelling results of galaxies from the GHASP and DYNAMO surveys as found in the literature. The resulting GBKFIT code is available for download from: http://supercomputing.swin.edu.au/gbkfit.
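
    As a toy analogue of the model-fitting loop, the sketch below fits a one-dimensional arctan rotation-curve model with SciPy's Levenberg-Marquardt driver; GBKFIT's real gain comes from evaluating full 2D/3D kinematic models on the GPU, which this CPU sketch does not attempt. The model form and parameter names are common conventions, not taken from the paper.

        import numpy as np
        from scipy.optimize import least_squares

        def arctan_velocity(r, v_max, r_turn):
            """Simple arctan rotation-curve model for a disc galaxy."""
            return v_max * (2 / np.pi) * np.arctan(r / r_turn)

        def fit_rotation_curve(r, v_obs, v_err):
            """Levenberg-Marquardt fit of the 1D rotation curve."""
            resid = lambda p: (arctan_velocity(r, *p) - v_obs) / v_err
            return least_squares(resid, x0=[200.0, 1.0], method="lm")

        # Usage: fit = fit_rotation_curve(r, v_obs, v_err)
        #        fit.x holds the best-fit [v_max, r_turn].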

  19. Identifying and tracking dynamic processes in social networks

    NASA Astrophysics Data System (ADS)

    Chung, Wayne; Savell, Robert; Schütt, Jan-Peter; Cybenko, George

    2006-05-01

    The detection and tracking of embedded malicious subnets in an active social network can be computationally daunting due to the quantity of transactional data generated by the natural interaction of the large numbers of actors comprising a network. In addition, detection of illicit behavior may be further complicated by evasive strategies designed to camouflage the activities of the covert subnet. In this work, we move beyond traditional static methods of social network analysis to develop a set of dynamic process models which encode various modes of behavior in active social networks. These models serve as the basis for a new application of the Process Query System (PQS) to the identification and tracking of covert dynamic processes in social networks. We present a preliminary result from the application of our technique to a real-world data stream: the Enron email corpus.

  20. Cross-disciplinary links in environmental systems science: Current state and claimed needs identified in a meta-review of process models.

    PubMed

    Ayllón, Daniel; Grimm, Volker; Attinger, Sabine; Hauhs, Michael; Simmer, Clemens; Vereecken, Harry; Lischeid, Gunnar

    2018-05-01

    Terrestrial environmental systems are characterised by numerous feedback links between their different compartments. However, scientific research is organized into disciplines that focus on processes within the respective compartments rather than on interdisciplinary links. Major feedback mechanisms between compartments might therefore have been systematically overlooked so far. Without identifying these gaps, initiatives on future comprehensive environmental monitoring schemes and experimental platforms might fail. We compiled a comprehensive overview of the feedbacks between compartments currently represented in the environmental sciences and explored to what degree missing links have already been acknowledged in the literature. We focused on process models, as they can be regarded as repositories of scientific knowledge that compile the findings of numerous single studies. In total, 118 simulation models from 23 model types were analysed. Missing processes linking different environmental compartments were identified based on a meta-review of 346 published reviews, model intercomparison studies, and model descriptions. Eight disciplines of environmental sciences were considered, and 396 linking processes were identified and ascribed to the physical, chemical or biological domain. There were significant differences between model types and scientific disciplines regarding implemented interdisciplinary links. The most widespread interdisciplinary links were between physical processes in meteorology, hydrology and soil science that drive or set the boundary conditions for other processes (e.g., ecological processes). In contrast, most chemical and biological processes were restricted to links within the same compartment. Integration of multiple environmental compartments and interdisciplinary knowledge was scarce in most model types. There was a strong bias of suggested future research foci and model extensions towards reinforcing existing interdisciplinary knowledge rather than

  1. Making processes reliable: a validated pubmed search strategy for identifying new or emerging technologies.

    PubMed

    Varela-Lema, Leonora; Punal-Riobóo, Jeanette; Acción, Beatriz Casal; Ruano-Ravina, Alberto; García, Marisa López

    2012-10-01

    Horizon scanning systems need to handle a wide range of sources to identify new or emerging health technologies. The objective of this study is to develop a validated Medline bibliographic search strategy (PubMed search engine) to systematically identify new or emerging health technologies. The proposed Medline search strategy combines free-text terms commonly used in article titles to denote innovation with index terms that refer to the specific fields of interest. Efficacy was assessed by running the search over a period of 1 year (2009) and analyzing its retrieval performance (number and characteristics of retrieved records). For comparison purposes, all article abstracts published during 2009 in six preselected key research journals and eight high-impact surgery journals were scanned. Sensitivity was defined as the proportion of relevant new or emerging technologies published in key journals that the search strategy identified within the first 2 years of publication. The search yielded 6,228 abstracts of potentially new or emerging technologies. Of these, 459 were classified as new or emerging (383 truly new or emerging and 76 new indications). The scanning of 12,061 journal abstracts identified 35 relevant new or emerging technologies. Of these, twenty-nine were located by the Medline search strategy during the first 2 years of publication (sensitivity = 83 percent). The current search strategy, validated against key journals, has proven effective for horizon scanning. Even though it may require adaptation depending on the scope of the horizon scanning system, it could serve to simplify and standardize scanning processes.
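
    A hypothetical fragment in the spirit of such a strategy, submitted through NCBI's E-utilities via Biopython, is shown below; the query itself is an invented example for illustration, not the validated strategy from the paper.

        from Bio import Entrez

        Entrez.email = "you@example.org"   # required by NCBI; placeholder

        # Invented example: innovation-flagging title words combined with
        # a topical MeSH heading -- NOT the paper's validated strategy.
        query = ('(novel[TI] OR "first-in-human"[TI] OR emerging[TI]) '
                 'AND "Surgical Procedures, Operative"[MeSH]')

        handle = Entrez.esearch(db="pubmed", term=query, retmax=100,
                                datetype="pdat", mindate="2009/01/01",
                                maxdate="2009/12/31")
        ids = Entrez.read(handle)["IdList"]   # PMIDs for manual screening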

  2. Fast generation of computer-generated hologram by graphics processing unit

    NASA Astrophysics Data System (ADS)

    Matsuda, Sho; Fujii, Tomohiko; Yamaguchi, Takeshi; Yoshikawa, Hiroshi

    2009-02-01

    A cylindrical hologram is well known to be viewable over 360 deg. Such a hologram requires very high pixel resolution, so a Computer-Generated Cylindrical Hologram (CGCH) involves a huge amount of calculation. In our previous research, we used a look-up table method for fast calculation on an Intel Pentium 4 at 2.8 GHz. It took 480 hours to calculate a high-resolution CGCH (504,000 x 63,000 pixels, with an average of 27,000 object points). To improve the quality of the CGCH reconstructed image, the fringe pattern requires higher spatial frequency and resolution; therefore, to increase the calculation speed, we have to change the calculation method. In this paper, to reduce the calculation time of a larger CGCH (912,000 x 108,000 pixels), we employ a Graphics Processing Unit (GPU); the same calculation took 4,406 hours on a Xeon at 3.4 GHz. Since a GPU has many streaming processors and a parallel processing structure, it works as a high-performance parallel processor. In addition, a GPU delivers its maximum performance on 2-dimensional and streaming data. Recently, GPUs have also become usable for general-purpose computation (GPGPU). For example, NVIDIA's GeForce 7 series became programmable with the Cg programming language, and the subsequent GeForce 8 series supports CUDA, a software development kit made by NVIDIA. Theoretically, the calculation ability of the GPU is quoted as 500 GFLOPS. From the experimental results, we achieved a calculation 47 times faster than our previous CPU-based work; the CGCH can therefore be generated in 95 hours, and the total time to calculate and print the CGCH is 110 hours.
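
    The core of such point-based hologram computation is an accumulation of spherical-wave fringes over all object points, which parallelizes naturally per pixel. A minimal NumPy sketch under assumed parameters (wavelength, pixel pitch, and sizes are illustrative, not the paper's):

        # Accumulate cos(k*r) fringes from random object points onto a small
        # hologram plane; GPU implementations run this per-pixel in parallel.
        import numpy as np

        wavelength = 633e-9                      # assumed He-Ne wavelength (m)
        k = 2 * np.pi / wavelength
        nx, ny, pitch = 1024, 1024, 1e-6         # toy hologram, 1 um pixels

        rng = np.random.default_rng(0)
        points = rng.uniform([-1e-3, -1e-3, 0.05],
                             [1e-3, 1e-3, 0.10], size=(100, 3))

        xs = (np.arange(nx) - nx / 2) * pitch
        ys = (np.arange(ny) - ny / 2) * pitch
        X, Y = np.meshgrid(xs, ys)

        fringe = np.zeros((ny, nx))
        for px, py, pz in points:                # one spherical wave per point
            r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
            fringe += np.cos(k * r)              # plane reference wave assumed

        fringe = (fringe - fringe.min()) / np.ptp(fringe)   # normalize to [0, 1]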

  3. Health care units and human resources management trends.

    PubMed

    André, Adriana Maria; Ciampone, Maria Helena Trench; Santelle, Odete

    2013-02-01

    To identify factors producing new trends in basic health care unit management and changes in management models. This was a prospective study with ten health care unit managers and ten specialists in the field of Health in São Paulo, Southeastern Brazil, in 2010. The Delphi methodology was adopted. There were four stages of data collection, three quantitative and the fourth qualitative. The first three rounds dealt with changing trends in management models, manager profiles and required competencies, and the Mann-Whitney test was used in the analysis. The fourth round took the form of a panel of those involved, using thematic analysis. The main factors which are driving change in basic health care units were identified, as were changes in management models. There was consensus that this process is influenced by the difficulties in managing teams and by politics. The managers were found to be up-to-date with trends in the wider context, with the arrival of social health organizations, but they are not yet anticipating these within the institutions. Not only the content, but the professional development aspect of training courses in this area should be reviewed. Selection and recruitment, training and assessment of these professionals should be guided by these competencies aligned to the health service mission, vision, values and management models.

  4. Graphics processing unit accelerated intensity-based optical coherence tomography angiography using differential frames with real-time motion correction.

    PubMed

    Watanabe, Yuuki; Takahashi, Yuhei; Numazawa, Hiroshi

    2014-02-01

    We demonstrate intensity-based optical coherence tomography (OCT) angiography using the squared difference of two sequential frames with bulk-tissue-motion (BTM) correction. This motion correction was performed by minimization of the sum of the pixel values using axial- and lateral-pixel-shifted structural OCT images. We extract the BTM-corrected image from a total of 25 calculated OCT angiographic images. Image processing was accelerated by a graphics processing unit (GPU) with many stream processors to optimize the parallel processing procedure. The GPU processing rate was faster than that of a line scan camera (46.9 kHz). Our OCT system provides the means of displaying structural OCT images and BTM-corrected OCT angiographic images in real time.
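
    The 25 candidate angiographic images correspond to a 5 x 5 grid of axial and lateral pixel shifts; the BTM-corrected image is the shifted difference with the smallest residual energy. A minimal NumPy sketch of that search, with synthetic frames standing in for structural OCT data:

        # Squared-difference angiography with a +/-2 pixel BTM shift search
        # (5 x 5 = 25 candidate images, matching the abstract's count).
        import numpy as np

        def angiogram(frame_a, frame_b, max_shift=2):
            best = None
            for dz in range(-max_shift, max_shift + 1):      # axial shifts
                for dx in range(-max_shift, max_shift + 1):  # lateral shifts
                    diff = (frame_a - np.roll(frame_b, (dz, dx), axis=(0, 1))) ** 2
                    if best is None or diff.sum() < best.sum():
                        best = diff          # keep the minimum-residual image
            return best

        a = np.random.rand(256, 256)
        b = np.roll(a, (1, -1), axis=(0, 1)) + 0.01 * np.random.rand(256, 256)
        angio = angiogram(a, b)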

  5. A Systematic Approach of Employing Quality by Design Principles: Risk Assessment and Design of Experiments to Demonstrate Process Understanding and Identify the Critical Process Parameters for Coating of the Ethylcellulose Pseudolatex Dispersion Using Non-Conventional Fluid Bed Process.

    PubMed

    Kothari, Bhaveshkumar H; Fahmy, Raafat; Claycamp, H Gregg; Moore, Christine M V; Chatterjee, Sharmista; Hoag, Stephen W

    2017-05-01

    The goal of this study was to utilize risk assessment techniques and statistical design of experiments (DoE) to gain process understanding and to identify critical process parameters for the manufacture of controlled release multiparticulate beads using a novel disk-jet fluid bed technology. The material attributes and process parameters were systematically assessed using the Ishikawa fishbone diagram and failure mode and effect analysis (FMEA) risk assessment methods. The high-risk attributes identified by the FMEA analysis were then explored in a resolution V fractional factorial study to gain an understanding of the processing parameters. Using the knowledge gained from the resolution V study, a resolution IV fractional factorial study was conducted to identify the critical process parameters (CPPs) that impact the critical quality attributes and to understand the influence of these parameters on film formation. For both studies, the microclimate, atomization pressure, inlet air volume, product temperature (during spraying and curing), curing time, and percent solids in the coating solutions were studied. The responses evaluated were percent agglomeration, percent fines, percent yield, bead aspect ratio, median particle size diameter (d50), assay, and drug release rate. Pyrobuttons® were used to record real-time temperature and humidity changes in the fluid bed. The risk assessment methods and process analytical tools helped in understanding the novel disk-jet technology and in systematically developing models of coating process responses such as process efficiency and the extent of curing during the coating process.
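
    For illustration, a two-level resolution V fractional factorial with five factors can be built from a full factorial in four factors plus one generated column (defining relation I = ABCDE). A sketch with placeholder factor names (the study's actual factors and levels are not reproduced):

        # Build a 2^(5-1) resolution V design: E = ABCD, so 16 runs cover
        # five factors instead of the 32 runs of the full factorial.
        from itertools import product

        factors = ["atomization", "inlet_air", "product_temp",
                   "curing_time", "percent_solids"]        # placeholder names

        runs = []
        for a, b, c, d in product([-1, 1], repeat=4):
            e = a * b * c * d                   # generator E = ABCD
            runs.append(dict(zip(factors, (a, b, c, d, e))))

        print(len(runs), "runs")                # 16
        print(runs[0])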

  6. Using Statistical Process Control Charts to Identify the Steroids Era in Major League Baseball: An Educational Exercise

    ERIC Educational Resources Information Center

    Hill, Stephen E.; Schvaneveldt, Shane J.

    2011-01-01

    This article presents an educational exercise in which statistical process control charts are constructed and used to identify the Steroids Era in American professional baseball. During this period (roughly 1993 until the present), numerous baseball players were alleged or proven to have used banned, performance-enhancing drugs. Also observed…
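
    A sketch of the kind of individuals control chart such an exercise builds, with fabricated home-run-rate data (not real MLB statistics); the control limits come from the moving range of a "pre-era" baseline:

        # Individuals (X) chart: 3-sigma limits from the average moving range
        # of baseline seasons; later seasons are flagged if they exceed them.
        import numpy as np

        rates = np.array([0.80, 0.82, 0.79, 0.81, 0.83,
                          0.95, 1.02, 1.05, 1.01, 0.98])   # fabricated data
        baseline = rates[:5]
        center = baseline.mean()
        sigma = np.abs(np.diff(baseline)).mean() / 1.128   # d2 for n = 2
        ucl, lcl = center + 3 * sigma, center - 3 * sigma

        for year, r in zip(range(1988, 1998), rates):
            flag = "OUT OF CONTROL" if not lcl <= r <= ucl else ""
            print(year, round(r, 2), flag)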

  7. Gravity driven and in situ fractional crystallization processes in the Centre Hill complex, Abitibi Subprovince, Canada: Evidence from bilaterally-paired cyclic units

    NASA Astrophysics Data System (ADS)

    Thériault, R. D.; Fowler, A. D.

    1996-12-01

    The formation of layers in mafic intrusions has been explained by various processes, making it the subject of much controversy. The concept that layering originates from gravitational settling of crystals has been superseded in recent years by models involving in situ fractional crystallization. Here we present evidence from the Centre Hill complex that both processes may be operative simultaneously within the same intrusion. The Centre Hill complex is part of the Munro Lake sill, an Archean layered mafic intrusion emplaced in volcanic rocks of the Abitibi Subprovince. The Centre Hill complex comprises the following lithostratigraphic units: six lower cyclic units of peridotite and clinopyroxenite; a middle unit of leucogabbro; six upper cyclic units of branching-textured gabbro (BTG) and clotted-textured gabbro (CTG), the uppermost of these units being overlain by a marginal zone of fine-grained gabbro. The cyclic units of peridotite/clinopyroxenite and BTG/CTG are interpreted to have formed concurrently through fractional crystallization, associated with periodic replenishment of magma to the chamber. The units of peridotite and clinopyroxenite formed by gravitational accumulation of crystals that grew under the roof. The cyclic units of BTG and CTG formed along the upper margin of the sill by two different mechanisms: (1) layers of BTG crystallized in situ along an inward-growing roof and (2) layers of CTG formed by accumulation of buoyant plagioclase crystals. The layers of BTG are characterized by branching pseudomorphs after fayalite up to 50 cm in length that extend away from the upper margin. The original branching crystals are interpreted to have grown from stagnant intercumulus melt in a high thermal gradient resulting from the injection of new magma to the chamber.

  8. Defining the Process of a Cardiovascular Risk Assessment Program: Lessons Learnt From Cardiac Assessment of Elite Soccer Players in the United Kingdom.

    PubMed

    Speers, Christopher; Seth, Ajai Narain; Patel, Kiran Chhaganbhai; Rakhit, Dhrubo Jyoti; Gillett, Mark James

    2017-12-14

    Retrospectively analyze the cardiac assessment process for elite soccer players, and provide team physicians with a systematic guide to managing longitudinal cardiac risk. Descriptive epidemiology study. Cardiac assessments incorporated clinical examination, 12-lead ECG, echocardiography, and a health questionnaire. Soccer players at 5 professional clubs in England, the United Kingdom. Data were retrospectively collected, inspected, and analyzed to determine the players' clinical management and subsequent follow-up. Over 2 years, 265 soccer players, aged 13 to 37 years and 66% of white European ethnicity, were included in the cohort. Eleven percent had "not-normal" assessments; of these, 83% were considered gray screens, falling into three broad categories: structural cardiac features (including valvular abnormalities), functional cardiac features, and electrocardiogram changes. After cardiology consultation, all assessments were grouped into low-, enhanced-, and high-risk categories for ongoing longitudinal risk management. Overall, clear-cut pathology was identified in 2%. Cardiovascular assessment is a vital tool for identifying athletes at risk of sudden cardiac death and mitigating that risk through surveillance, intervention, or participation restriction. The decision whether a player is fit to play requires a robust risk assessment followed by input from a multidisciplinary team that includes both the team physician and a cardiologist. This educational article proposes a clinical management pathway to aid clinicians with this process. Sudden cardiac death is the most important medical cause of death during exercise, and the team physician should assume responsibility for managing the longitudinal risk of their players' cardiac assessments in conjunction with a sports cardiologist.

  9. Fast ray-tracing of human eye optics on Graphics Processing Units.

    PubMed

    Wei, Qi; Patkar, Saket; Pai, Dinesh K

    2014-05-01

    We present a new technique for simulating retinal image formation by tracing a large number of rays from objects in three dimensions as they pass through the optic apparatus of the eye to the retina. Simulating human optics is useful for understanding basic questions of vision science and for studying vision defects and their corrections. Because of the complexity of computing such simulations accurately, most previous efforts used simplified analytical models of the normal eye. This makes them less effective for modeling vision disorders associated with abnormal shapes of the ocular structures, which are hard to represent precisely with analytical surfaces. We have developed a computer simulator that can handle ocular structures of arbitrary shapes, for instance represented by polygon meshes. Topographic and geometric measurements of the cornea, lens, and retina from keratometer or medical imaging data can be integrated for individualized examination. We utilize parallel processing on modern Graphics Processing Units (GPUs) to efficiently compute retinal images by tracing millions of rays, and a stable retinal image can be generated within minutes. We simulated depth-of-field, accommodation, chromatic aberrations, as well as astigmatism and its correction. We also show application of the technique to patient-specific vision correction by incorporating geometric models of the orbit reconstructed from clinical medical images. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
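
    At the heart of any such simulator is the refraction of each ray at each ocular surface. A minimal sketch of the vector form of Snell's law (the corneal index below is a textbook value assumed for illustration):

        # Refract a unit ray direction d at a surface with unit normal n,
        # going from index n1 to n2; returns None on total internal reflection.
        import numpy as np

        def refract(d, n, n1, n2):
            d = d / np.linalg.norm(d)
            n = n / np.linalg.norm(n)
            cos_i = -np.dot(n, d)
            eta = n1 / n2
            k = 1.0 - eta ** 2 * (1.0 - cos_i ** 2)
            if k < 0:
                return None
            return eta * d + (eta * cos_i - np.sqrt(k)) * n

        # Ray entering the cornea: air (1.0) to cornea (~1.376, assumed)
        print(refract(np.array([0.0, 0.1, 1.0]),
                      np.array([0.0, 0.0, -1.0]), 1.0, 1.376))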

  10. Using information theory to identify redundancy in common laboratory tests in the intensive care unit.

    PubMed

    Lee, Joon; Maslove, David M

    2015-07-31

    Clinical workflow is infused with large quantities of data, particularly in areas with enhanced monitoring such as the Intensive Care Unit (ICU). Information theory can quantify the expected amounts of total and redundant information contained in a given clinical data type, and as such has the potential to inform clinicians on how to manage the vast volumes of data they are required to analyze in their daily practice. The objective of this proof-of-concept study was to quantify the amounts of redundant information associated with common ICU lab tests. We analyzed the information content of 11 laboratory test results from 29,149 adult ICU admissions in the MIMIC II database. Information theory was applied to quantify the expected amount of redundant information both between lab values from the same ICU day, and between consecutive ICU days. Most lab values showed a decreasing trend over time in the expected amount of novel information they contained. Platelet, blood urea nitrogen (BUN), and creatinine measurements exhibited the most amount of redundant information on days 2 and 3 compared to the previous day. The creatinine-BUN and sodium-chloride pairs had the most redundancy. Information theory can help identify and discourage unnecessary testing and bloodwork, and can in general be a useful data analytic technique for many medical specialties that deal with information overload.
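
    The redundancy in question can be phrased as mutual information: I(X;Y) = H(X) + H(Y) - H(X,Y) after discretizing the lab values. A small sketch on synthetic data (standing in for MIMIC II values):

        # Estimate mutual information between two lab series by binning.
        import numpy as np

        def entropy(labels):
            _, counts = np.unique(labels, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        def mutual_information(x, y, bins=10):
            xb = np.digitize(x, np.histogram_bin_edges(x, bins))
            yb = np.digitize(y, np.histogram_bin_edges(y, bins))
            joint = np.array([f"{a},{b}" for a, b in zip(xb, yb)])
            return entropy(xb) + entropy(yb) - entropy(joint)

        rng = np.random.default_rng(1)
        bun = rng.normal(20, 5, 1000)                       # synthetic BUN
        creat = 0.05 * bun + rng.normal(0, 0.2, 1000)       # correlated pair
        print(round(mutual_information(bun, creat), 2), "bits shared")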

  11. Monitoring of health care personnel employee and occupational health immunization program practices in the United States.

    PubMed

    Carrico, Ruth M; Sorrells, Nikka; Westhusing, Kelly; Wiemken, Timothy

    2014-01-01

    Recent studies have identified concerns with various elements of health care personnel immunization programs, including the handling and management of the vaccine. The purpose of this study was to assess monitoring processes that support evaluation of the care of vaccines in health care settings. An 11-question survey instrument was developed for use in scripted telephone surveys. State health departments in all 50 states in the United States and the District of Columbia were the target audience for the surveys. Data from a total of 47 states were obtained and analyzed. No states reported an existing monitoring process for evaluation of health care personnel immunization programs in their states. Our assessment indicates that vaccine evaluation processes for health care facilities are rare to nonexistent in the United States. Identifying existing practice gaps and resultant opportunities for improvements may be an important safety initiative that protects patients and health care personnel. Copyright © 2014 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.

  12. Method for identifying biochemical and chemical reactions and micromechanical processes using nanomechanical and electronic signal identification

    DOEpatents

    Holzrichter, J.F.; Siekhaus, W.J.

    1997-04-15

    A scanning probe microscope, such as an atomic force microscope (AFM) or a scanning tunneling microscope (STM), is operated in a stationary mode on a site where an activity of interest occurs to measure and identify characteristic time-varying micromotions caused by biological, chemical, mechanical, electrical, optical, or physical processes. The tip and cantilever assembly of an AFM is used as a micromechanical detector of characteristic micromotions transmitted either directly by a site of interest or indirectly through the surrounding medium. Alternatively, the exponential dependence of the tunneling current on the size of the gap in the STM is used to detect micromechanical movement. The stationary mode of operation can be used to observe dynamic biological processes in real time and in a natural environment, such as polymerase processing of DNA for determining the sequence of a DNA molecule. 6 figs.

  13. Method for identifying biochemical and chemical reactions and micromechanical processes using nanomechanical and electronic signal identification

    DOEpatents

    Holzrichter, John F.; Siekhaus, Wigbert J.

    1997-01-01

    A scanning probe microscope, such as an atomic force microscope (AFM) or a scanning tunneling microscope (STM), is operated in a stationary mode on a site where an activity of interest occurs to measure and identify characteristic time-varying micromotions caused by biological, chemical, mechanical, electrical, optical, or physical processes. The tip and cantilever assembly of an AFM is used as a micromechanical detector of characteristic micromotions transmitted either directly by a site of interest or indirectly through the surrounding medium. Alternatively, the exponential dependence of the tunneling current on the size of the gap in the STM is used to detect micromechanical movement. The stationary mode of operation can be used to observe dynamic biological processes in real time and in a natural environment, such as polymerase processing of DNA for determining the sequence of a DNA molecule.

  14. Accelerating cardiac bidomain simulations using graphics processing units.

    PubMed

    Neic, A; Liebmann, M; Hoetzl, E; Mitchell, L; Vigmond, E J; Haase, G; Plank, G

    2012-08-01

    Anatomically realistic and biophysically detailed multiscale computer models of the heart are playing an increasingly important role in advancing our understanding of integrated cardiac function in health and disease. Such detailed simulations, however, are computationally vastly demanding, which is a limiting factor for a wider adoption of in-silico modeling. While current trends in high-performance computing (HPC) hardware promise to alleviate this problem, exploiting the potential of such architectures remains challenging since strongly scalable algorithms are necessitated to reduce execution times. Alternatively, acceleration technologies such as graphics processing units (GPUs) are being considered. While the potential of GPUs has been demonstrated in various applications, benefits in the context of bidomain simulations where large sparse linear systems have to be solved in parallel with advanced numerical techniques are less clear. In this study, the feasibility of multi-GPU bidomain simulations is demonstrated by running strong scalability benchmarks using a state-of-the-art model of rabbit ventricles. The model is spatially discretized using the finite element methods (FEM) on fully unstructured grids. The GPU code is directly derived from a large pre-existing code, the Cardiac Arrhythmia Research Package (CARP), with very minor perturbation of the code base. Overall, bidomain simulations were sped up by a factor of 11.8 to 16.3 in benchmarks running on 6-20 GPUs compared to the same number of CPU cores. To match the fastest GPU simulation which engaged 20 GPUs, 476 CPU cores were required on a national supercomputing facility.
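
    The computational bottleneck named here, repeatedly solving large sparse linear systems, can be shown in miniature with a conjugate-gradient solve of a 2-D Laplacian; GPU variants (e.g., via CuPy) follow the same structure. A CPU-only SciPy sketch:

        # Assemble a sparse SPD 2-D Laplacian and solve it with conjugate gradients.
        import numpy as np
        from scipy.sparse import diags, identity, kron
        from scipy.sparse.linalg import cg

        n = 100                                          # toy grid side length
        lap1d = diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
        A = kron(identity(n), lap1d) + kron(lap1d, identity(n))

        b = np.ones(n * n)
        x, info = cg(A, b)                               # info == 0: converged
        print("converged" if info == 0 else "failed",
              np.linalg.norm(A @ x - b))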

  15. Accelerating Cardiac Bidomain Simulations Using Graphics Processing Units

    PubMed Central

    Neic, Aurel; Liebmann, Manfred; Hoetzl, Elena; Mitchell, Lawrence; Vigmond, Edward J.; Haase, Gundolf

    2013-01-01

    Anatomically realistic and biophysically detailed multiscale computer models of the heart are playing an increasingly important role in advancing our understanding of integrated cardiac function in health and disease. Such detailed simulations, however, are computationally vastly demanding, which is a limiting factor for a wider adoption of in-silico modeling. While current trends in high-performance computing (HPC) hardware promise to alleviate this problem, exploiting the potential of such architectures remains challenging since strongly scalable algorithms are necessitated to reduce execution times. Alternatively, acceleration technologies such as graphics processing units (GPUs) are being considered. While the potential of GPUs has been demonstrated in various applications, benefits in the context of bidomain simulations where large sparse linear systems have to be solved in parallel with advanced numerical techniques are less clear. In this study, the feasibility of multi-GPU bidomain simulations is demonstrated by running strong scalability benchmarks using a state-of-the-art model of rabbit ventricles. The model is spatially discretized using the finite element methods (FEM) on fully unstructured grids. The GPU code is directly derived from a large pre-existing code, the Cardiac Arrhythmia Research Package (CARP), with very minor perturbation of the code base. Overall, bidomain simulations were sped up by a factor of 11.8 to 16.3 in benchmarks running on 6–20 GPUs compared to the same number of CPU cores. To match the fastest GPU simulation which engaged 20 GPUs, 476 CPU cores were required on a national supercomputing facility. PMID:22692867

  16. Violations identified from routine swimming pool inspections--selected states and counties, United States, 2008.

    PubMed

    2010-05-21

    Swimming is the third most popular U.S. sport or exercise activity, with approximately 314 million visits to recreational water venues, including treated venues (e.g., pools), each year. The most frequently reported type of recreational water illness (RWI) outbreak is gastroenteritis, the incidence of which is increasing. During 1997-2006, chlorine- and bromine-susceptible pathogens (e.g., Shigella and norovirus) caused 24 (23%) of 104 treated venue-associated RWI outbreaks of gastroenteritis, indicating lapses in proper operation of pools. Pool inspectors help minimize the risk for RWIs and injuries by enforcing regulations that govern public treated recreational water venues. To assess pool code compliance, CDC analyzed 2008 data from 121,020 routine pool inspections conducted by a convenience sample of 15 state and local agencies. Because pool codes and, therefore, inspection items differed across jurisdictions, reported denominators varied. Of 111,487 inspections, 13,532 (12.1%) resulted in immediate closure because of serious violations (e.g., lack of disinfectant in the water). Of 120,975 inspections, 12,917 (10.7%) identified disinfectant level violations. Although these results likely are not representative of all pools in the United States, they suggest the need for increased public health scrutiny and improved pool operation. The results also demonstrate that pool inspection data can be used as a potential source for surveillance to guide resource allocation and regulatory decision-making. Collecting pool inspection data in a standardized, electronic format can facilitate routine analysis to support efforts to reduce health and safety risks for swimmers.

  17. Processes and patterns of interaction as units of selection: An introduction to ITSNTS thinking.

    PubMed

    Doolittle, W Ford; Inkpen, S Andrew

    2018-04-17

    Many practicing biologists accept that nothing in their discipline makes sense except in the light of evolution, and that natural selection is evolution's principal sense-maker. But what natural selection actually is (a force or a statistical outcome, for example) and the levels of the biological hierarchy (genes, organisms, species, or even ecosystems) at which it operates directly are still actively disputed among philosophers and theoretical biologists. Most formulations of evolution by natural selection emphasize the differential reproduction of entities at one or the other of these levels. Some also recognize differential persistence, but in either case the focus is on lineages of material things: even species can be thought of as spatiotemporally restricted, if dispersed, physical beings. Few consider, as "units of selection" in their own right, the processes implemented by genes, cells, species, or communities. "It's the song not the singer" (ITSNTS) theory does that, also claiming that evolution by natural selection of processes is more easily understood and explained as differential persistence than as differential reproduction. ITSNTS was formulated as a response to the observation that the collective functions of microbial communities (the songs) are more stably conserved and ecologically relevant than are the taxa that implement them (the singers). It aims to serve as a useful corrective to claims that "holobionts" (microbes and their animal or plant hosts) are aggregate "units of selection," claims that often conflate meanings of that latter term. But ITSNTS also seems broadly applicable, for example, to the evolution of global biogeochemical cycles and the definition of ecosystem function.

  18. Social policy devolution: a historical review of Canada, the United Kingdom, and the United States (1834-1999).

    PubMed

    Dunlop, Judith M

    2009-01-01

    This paper explores the recurring themes of devolution and social policy across time and nation in Canada, the United Kingdom, and the United States. Devolution is defined as the transfer of responsibility from national governments to state and local levels. Using a historical framework, the central/local tensions that characterize devolution and social policy in these countries are noted from 1834 to the late 1990s. This chronology shows that despite their geographical, ideological, and cultural differences, all of these countries have shifted responsibility for social provision back and forth between central and local governments in similar ways throughout the three eras delineated in this analysis. Clearly, devolution characterizes the current social policy climate in these three countries and across many Western democracies. Recent trends in the environment such as privatization, mandatory collaboration, community capacity building, and service integration are identified, and process questions are presented as a guide for practitioners who seek to explore the current devolution reality.

  19. Full Stokes finite-element modeling of ice sheets using a graphics processing unit

    NASA Astrophysics Data System (ADS)

    Seddik, H.; Greve, R.

    2016-12-01

    Thermo-mechanical simulation of ice sheets is an important approach to understanding and predicting their evolution in a changing climate. For that purpose, higher-order (e.g., ISSM, BISICLES) and full Stokes (e.g., Elmer/Ice, http://elmerice.elmerfem.org) models are increasingly used to model the flow of entire ice sheets more accurately. In parallel to this development, the rapidly improving performance and capabilities of Graphics Processing Units (GPUs) make it possible to offload more of the calculations of complex and computationally demanding problems onto those devices. Thus, to continue the trend of running full Stokes models at greater resolutions, GPUs should be considered for the implementation of ice sheet models. We developed the GPU-accelerated ice-sheet model Sainō. Sainō is an Elmer (http://www.csc.fi/english/pages/elmer) derivative implemented in Objective-C which solves the full Stokes equations with the finite element method. It uses the standard OpenCL language (http://www.khronos.org/opencl/) to offload the assembly of the finite element matrix onto the GPU. A mesh-coloring scheme is used so that elements with the same color (sharing no nodes) are assembled in parallel on the GPU without the need for synchronization primitives. The current implementation shows that, for the ISMIP-HOM experiment A, during the matrix assembly in double precision with 8,000, 87,500 and 252,000 brick elements, Sainō is respectively 2x, 10x and 14x faster than Elmer/Ice (when both models are run on a single processing unit). In single precision, Sainō is even 3x, 20x and 25x faster than Elmer/Ice. A detailed description of the comparative results between Sainō and Elmer/Ice will be presented, along with further perspectives on optimization and the limitations of the current implementation.
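
    The mesh-coloring step can be sketched with a greedy algorithm: elements sharing a node must receive different colors, and each color class can then be assembled concurrently without atomics. A toy Python version (real meshes use brick elements, not the triangles below):

        # Greedy element coloring: no two node-sharing elements share a color.
        from collections import defaultdict

        def color_elements(elements):
            node_to_elems = defaultdict(set)
            for e, nodes in enumerate(elements):
                for node in nodes:
                    node_to_elems[node].add(e)

            colors = {}
            for e, nodes in enumerate(elements):
                taken = {colors[o] for node in nodes
                         for o in node_to_elems[node] if o in colors}
                c = 0
                while c in taken:
                    c += 1
                colors[e] = c
            return colors

        mesh = [(0, 1, 2), (1, 2, 3), (2, 3, 4), (5, 6, 7)]
        print(color_elements(mesh))      # {0: 0, 1: 1, 2: 2, 3: 0}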

  20. Identifying geochemical processes using End Member Mixing Analysis to decouple chemical components for mixing ratio calculations

    NASA Astrophysics Data System (ADS)

    Pelizardi, Flavia; Bea, Sergio A.; Carrera, Jesús; Vives, Luis

    2017-07-01

    Mixing calculations (i.e., the calculation of the proportions in which end-members are mixed in a sample) are essential for hydrological research and water management. However, they typically require the use of conservative species, a condition that may be difficult to meet due to chemical reactions. Mixing calculations also require identifying end-member waters, which is usually achieved through End Member Mixing Analysis (EMMA). We present a methodology to help in the identification of both end-members and such reactions, so as to improve mixing ratio calculations. The proposed approach consists of: (1) identifying the potential chemical reactions with the help of EMMA; (2) defining decoupled conservative chemical components consistent with those reactions; (3) repeating EMMA with the decoupled (i.e., conservative) components, so as to identify end-member waters; and (4) computing mixing ratios using the new set of components and end-members. The approach is illustrated by application to two synthetic mixing examples involving mineral dissolution and cation exchange reactions. Results confirm that the methodology can be successfully used to identify geochemical processes affecting the mixtures, thus improving the accuracy of mixing ratio calculations and relaxing the need for conservative species.
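
    Step (4) reduces to a small linear problem: stack the end-member compositions (in the decoupled conservative components) as columns, append a sum-to-one row, and solve for the mixing proportions. A sketch with made-up concentrations:

        # Recover mixing ratios of three end-members from one sample by
        # least squares with a sum-to-one constraint appended as a row.
        import numpy as np

        # rows: conservative components; columns: end-members (fabricated)
        E = np.array([[100.0, 10.0, 40.0],
                      [  0.5,  0.1,  0.2],
                      [ -8.0, -4.0, -6.0]])
        sample = np.array([55.0, 0.30, -6.0])   # a 50/50 mix of EM1 and EM2

        A = np.vstack([E, np.ones(3)])          # constraint: ratios sum to 1
        b = np.append(sample, 1.0)
        ratios, *_ = np.linalg.lstsq(A, b, rcond=None)
        print(np.round(ratios, 3))              # approx. [0.5, 0.5, 0.0]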

  1. Sono-leather technology with ultrasound: a boon for unit operations in leather processing - review of our research work at Central Leather Research Institute (CLRI), India.

    PubMed

    Sivakumar, Venkatasubramanian; Swaminathan, Gopalaraman; Rao, Paruchuri Gangadhar; Ramasami, Thirumalachari

    2009-01-01

    Ultrasound is a sound wave with a frequency above the human audible range of 16 Hz to 16 kHz. In recent years, numerous unit operations involving physical as well as chemical processes have been reported to be enhanced by ultrasonic irradiation. The benefits include improved process efficiency, reduced process times, operation under milder conditions, and avoidance of some toxic chemicals, resulting in cleaner processing; ultrasound can therefore serve as an advanced means of process augmentation. The important point here is that ultrasonic irradiation activates processes by physical means rather than through added chemical entities. Detailed studies have been made of unit operations related to leather, such as diffusion rate enhancement through the porous leather matrix, cleaning, degreasing, tanning, dyeing, fatliquoring, oil-water emulsification and solid-liquid tannin extraction from vegetable tanning materials, as well as precipitation reactions in wastewater treatment. The fundamental mechanism involved in these processes is ultrasonic cavitation in liquid media, complemented by some process-specific mechanisms. For instance, possible real-time reversible pore-size changes during ultrasound propagation through the skin/leather matrix could explain the diffusion rate enhancement in leather processing, as reported for the first time. Exhaustive scientific research has been carried out in this area by our group in the Chemical Engineering Division of CLRI, and most of these benefits have been demonstrated in publications in peer-reviewed international journals. The overall results indicate an approximately 2-5-fold increase in process efficiency due to ultrasound under the given process conditions for the various unit operations, along with additional benefits. Scale-up studies are underway to convert these concepts into viable larger-scale operations.

  2. Panel 2: anticipatory risk assessment: identifying, assessing, and mitigating exposure risks before they occur.

    PubMed

    Guidotti, Tee L; Pacha, Laura

    2011-07-01

    Health threats place the military mission and deployed service members at risk. A commander's focus is on preventing acute health risks, such as diarrhea, because these quickly compromise the mission. However, in recent conflicts chronic and long-term illness risks have emerged as concerns. Department of Defense and Joint Chiefs of Staff mandates require documentation of exposures and environmental conditions to reconstruct exposures and evaluate future health risks. Current processes for identifying and assessing hazards, including identification and assessment before deployment and in time to take action to prevent or reduce exposures, when followed, are generally adequate for known hazards. Identifying and addressing novel, unexpected risks remain challenges. Armed conflicts are associated with rapidly changing conditions, making ongoing hazard identification and assessment difficult. Therefore, surveillance of the environment for hazards and surveillance of personnel for morbidity must be practiced at all times. Communication of risk information to decision makers is critical but problematic. Preventive Medicine (PM) personnel should take responsibility for communicating this information to non-PM military medical people and to military commanders. Communication of risks identified and lessons learned between PM personnel of different military units is extremely important when one military unit replaces another in a deployed environment.

  3. Establishment and progress of the chest pain unit certification process in Germany and the local experiences of Mainz.

    PubMed

    Post, Felix; Gori, Tommaso; Senges, Jochen; Giannitsis, Evangelos; Katus, Hugo; Münzel, Thomas

    2012-03-01

    The establishment of chest pain units (CPUs) in the USA and UK has led to improvements in the prognosis of patients with chest pain and myocardial infarction, optimizing access to specialized diagnostic and therapeutic facilities and reducing costs. To establish a uniform implementation of this type of service in Germany, the German Cardiac Society (DGK) founded a 'CPU task force' in 2007, which developed a set of standard requirements and a nationwide certification programme. The recommendations for minimum standard requirements were published in 2008. As of November 2011, 132 CPUs were certified and 36 units were in the certification process. The aim of the DGK is to certify as many as 250 centres (units) throughout Germany within the next 2 years, to provide nationwide coverage. Applications from Switzerland are also being filed. Public awareness campaigns in cooperation with national league soccer teams were organized to raise awareness of the importance for early diagnosis and treatment of cardiac diseases and to publicize the existence of these new facilities. The German model of CPU certification allows nationwide and prospectively European-wide standardization of patient care and to improve adherence to international guidelines. Coupled with awareness campaigns and with the launch of a German CPU Registry, this process is aimed at improving the education and treatment of patients with chest pain and to provide scientific information about the quality of patient care.

  4. Imperfect physician assistant and physical therapist admissions processes in the United States

    PubMed Central

    2014-01-01

    We compared and contrasted physician assistant and physical therapy profession admissions processes based on the similar number of accredited programs in the United States and the co-existence of many programs in the same school of health professions, because both professions conduct similar centralized application procedures administered by the same organization. Many studies are critical of the fallibility and inadequate scientific rigor of the high-stakes nature of health professions admissions decisions, yet typical admission processes remain very similar. Cognitive variables, most notably undergraduate grade point averages, have been shown to be the best predictors of academic achievement in the health professions. The variability of non-cognitive attributes assessed and the methods used to measure them have come under increasing scrutiny in the literature. The variance in health professions students’ performance in the classroom and on certifying examinations remains unexplained, and cognitive considerations vary considerably between and among programs that describe them. One uncertainty resulting from this review is whether or not desired candidate attributes highly sought after by individual programs are more student-centered or graduate-centered. Based on the findings from the literature, we suggest that student success in the classroom versus the clinic is based on a different set of variables. Given the range of positions and general lack of reliability and validity in studies of non-cognitive admissions attributes, we think that health professions admissions processes remain imperfect works in progress. PMID:24810020

  5. Prompting children to reason proportionally: Processing discrete units as continuous amounts.

    PubMed

    Boyer, Ty W; Levine, Susan C

    2015-05-01

    Recent studies reveal that children can solve proportional reasoning problems presented with continuous amounts that enable intuitive strategies by around 6 years of age but have difficulties with problems presented with discrete units that tend to elicit explicit count-and-match strategies until at least 10 years of age. The current study tests whether performance on discrete unit problems might be improved by prompting intuitive reasoning with continuous-format problems. Participants were kindergarten, second-grade, and fourth-grade students (N = 194) assigned to either an experimental condition, where they were given continuous amount proportion problems before discrete unit proportion problems, or a control condition, where they were given all discrete unit problems. Results of a three-way mixed-model analysis of variance examining school grade, experimental condition, and block of trials indicated that fourth-grade students in the experimental condition outperformed those in the control condition on discrete unit problems in the second half of the experiment, but kindergarten and second-grade students did not differ by condition. This suggests that older children can be prompted to use intuitive strategies to reason proportionally. (c) 2015 APA, all rights reserved.

  6. The implementation of unit-based perinatal mortality audit in perinatal cooperation units in the northern region of the Netherlands

    PubMed Central

    2012-01-01

    Background Perinatal (mortality) audit can be considered to be a way to improve the care process for all pregnant women and their newborns by creating an opportunity to learn from unwanted events in the care process. In unit-based perinatal audit, the caregivers involved in cases that result in mortality are usually part of the audit group. This makes such an audit a delicate matter. Methods The purpose of this study was to implement unit-based perinatal mortality audit in all 15 perinatal cooperation units in the northern region of the Netherlands between September 2007 and March 2010. These units consist of hospital-based and independent community-based perinatal caregivers. The implementation strategy encompassed an information plan, an organization plan, and a training plan. The main outcomes are the number of participating perinatal cooperation units at the end of the project, the identified substandard factors (SSF), the actions to improve care, and the opinions of the participants. Results The perinatal mortality audit was implemented in all 15 perinatal cooperation units. 677 different caregivers analyzed 112 cases of perinatal mortality and identified 163 substandard factors. In 31% of cases the guidelines were not followed and in 23% care was not according to normal practice. In 28% of cases, the documentation was not in order, while in 13% of cases the communication between caregivers was insufficient. 442 actions to improve care were reported for ‘external cooperation’ (15%), ‘internal cooperation’ (17%), ‘practice organization’ (26%), ‘training and education’ (10%), and ‘medical performance’ (27%). Valued aspects of the audit meetings were: the multidisciplinary character (13%), the collective and non-judgmental search for substandard factors (21%), the perception of safety (13%), the motivation to reflect on one’s own professional performance (5%), and the inherent postgraduate education (10%). Conclusion Following our

  7. Exploring the institutional logics of health professions education scholarship units.

    PubMed

    Varpio, Lara; O'Brien, Bridget; Hu, Wendy; Ten Cate, Olle; Durning, Steven J; van der Vleuten, Cees; Gruppen, Larry; Irby, David; Humphrey-Murto, Susan; Hamstra, Stanley J

    2017-07-01

    Although health professions education scholarship units (HPESUs) share a commitment to the production and dissemination of rigorous educational practices and research, they are situated in many different contexts and have a wide range of structures and functions. In this study, the authors explore the institutional logics common across HPESUs, and how these logics influence the organisation and activities of HPESUs. The authors analysed interviews with HPESU leaders in Canada (n = 12), Australia (n = 21), New Zealand (n = 3) and the USA (n = 11). Using an iterative process, they engaged in inductive and deductive analyses to identify institutional logics across all participating HPESUs. They explored the contextual factors that influence how these institutional logics impact each HPESU's structure and function. Participants identified three institutional logics influencing the organisational structure and functions of an HPESU: (i) the logic of financial accountability; (ii) the logic of a cohesive education continuum, and (iii) the logic of academic research, service and teaching. Although most HPESUs embodied all three logics, the power of the logics varied among units. The relative power of each logic influenced leaders' decisions about how members of the unit allocate their time, and what kinds of scholarly contribution and product are valued by the HPESU. Identifying the configuration of these three logics within and across HPESUs provides insights into the reasons why individual units are structured and function in particular ways. Having a common language in which to discuss these logics can enhance transparency, facilitate evaluation, and help leaders select appropriate indicators of HPESU success. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  8. Unit Operation Experiment Linking Classroom with Industrial Processing

    ERIC Educational Resources Information Center

    Benson, Tracy J.; Richmond, Peyton C.; LeBlanc, Weldon

    2013-01-01

    An industrial-type distillation column, including appropriate pumps, heat exchangers, and automation, was used as a unit operations experiment to provide a link between classroom teaching and real-world applications. Students were presented with an open-ended experiment where they defined the testing parameters to solve a generalized problem. The…

  9. Statistical transformation and the interpretation of inpatient glucose control data from the intensive care unit.

    PubMed

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B

    2014-05-01

    Glucose control can be problematic in critically ill patients. We evaluated the impact of statistical transformation on interpretation of intensive care unit inpatient glucose control data. Point-of-care blood glucose (POC-BG) data derived from patients in the intensive care unit for 2011 was obtained. Box-Cox transformation of POC-BG measurements was performed, and distribution of data was determined before and after transformation. Different data subsets were used to establish statistical upper and lower control limits. Exponentially weighted moving average (EWMA) control charts constructed from April, October, and November data determined whether out-of-control events could be identified differently in transformed versus nontransformed data. A total of 8679 POC-BG values were analyzed. POC-BG distributions in nontransformed data were skewed but approached normality after transformation. EWMA control charts revealed differences in projected detection of out-of-control events. In April, an out-of-control process resulting in the lower control limit being exceeded was identified at sample 116 in nontransformed data but not in transformed data. October transformed data detected an out-of-control process exceeding the upper control limit at sample 27 that was not detected in nontransformed data. Nontransformed November results remained in control, but transformation identified an out-of-control event less than 10 samples into the observation period. Using statistical methods to assess population-based glucose control in the intensive care unit could alter conclusions about the effectiveness of care processes for managing hyperglycemia. Further study is required to determine whether transformed versus nontransformed data change clinical decisions about the interpretation of care or intervention results. © 2014 Diabetes Technology Society.
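
    A sketch of the two-stage pipeline described here, on simulated right-skewed glucose values rather than patient data: Box-Cox transform first, then an EWMA chart with asymptotic 3-sigma limits.

        # Box-Cox transform skewed values, then track an EWMA statistic.
        import numpy as np
        from scipy.stats import boxcox

        rng = np.random.default_rng(7)
        glucose = rng.lognormal(mean=4.9, sigma=0.3, size=300)   # skewed mg/dL

        z, lam = boxcox(glucose)            # lambda fit by maximum likelihood

        w = 0.2                             # EWMA smoothing weight
        mu, sd = z.mean(), z.std(ddof=1)
        ewma = np.empty_like(z)
        ewma[0] = mu
        for i in range(1, len(z)):
            ewma[i] = w * z[i] + (1 - w) * ewma[i - 1]

        half = 3 * sd * np.sqrt(w / (2 - w))         # asymptotic 3-sigma band
        out = np.where(np.abs(ewma - mu) > half)[0]
        print("Box-Cox lambda:", round(lam, 2), "| out-of-control:", out[:5])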

  10. Statistical Transformation and the Interpretation of Inpatient Glucose Control Data From the Intensive Care Unit

    PubMed Central

    Saulnier, George E.; Castro, Janna C.

    2014-01-01

    Glucose control can be problematic in critically ill patients. We evaluated the impact of statistical transformation on interpretation of intensive care unit inpatient glucose control data. Point-of-care blood glucose (POC-BG) data derived from patients in the intensive care unit for 2011 was obtained. Box–Cox transformation of POC-BG measurements was performed, and distribution of data was determined before and after transformation. Different data subsets were used to establish statistical upper and lower control limits. Exponentially weighted moving average (EWMA) control charts constructed from April, October, and November data determined whether out-of-control events could be identified differently in transformed versus nontransformed data. A total of 8679 POC-BG values were analyzed. POC-BG distributions in nontransformed data were skewed but approached normality after transformation. EWMA control charts revealed differences in projected detection of out-of-control events. In April, an out-of-control process resulting in the lower control limit being exceeded was identified at sample 116 in nontransformed data but not in transformed data. October transformed data detected an out-of-control process exceeding the upper control limit at sample 27 that was not detected in nontransformed data. Nontransformed November results remained in control, but transformation identified an out-of-control event less than 10 samples into the observation period. Using statistical methods to assess population-based glucose control in the intensive care unit could alter conclusions about the effectiveness of care processes for managing hyperglycemia. Further study is required to determine whether transformed versus nontransformed data change clinical decisions about the interpretation of care or intervention results. PMID:24876620

  11. RNA splicing process analysis for identifying antisense oligonucleotide inhibitors with padlock probe-based isothermal amplification.

    PubMed

    Ren, Xiaojun; Deng, Ruijie; Wang, Lida; Zhang, Kaixiang; Li, Jinghong

    2017-08-01

    RNA splicing, which mainly involves two transesterification steps, is a fundamental process of gene expression and its abnormal regulation contributes to serious genetic diseases. Antisense oligonucleotides (ASOs) are genetic control tools that can be used to specifically control genes through alteration of the RNA splicing pathway. Despite intensive research, how ASOs or various other factors influence the multiple processes of RNA splicing still remains obscure. This is largely due to an inability to analyze the splicing efficiency of each step in the RNA splicing process with high sensitivity. We addressed this limitation by introducing a padlock probe-based isothermal amplification assay to achieve quantification of the specific products in different splicing steps. With this amplified assay, the roles that ASOs play in RNA splicing inhibition in the first and second steps could be distinguished. We identified that 5'-ASO could block RNA splicing by inhibiting the first step, while 3'-ASO could block RNA splicing by inhibiting the second step. This method provides a versatile tool for assisting efficient ASO design and discovering new splicing modulators and therapeutic drugs.

  12. Evaluation of solar angle variation over digital processing of LANDSAT imagery. [Brazil

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Novo, E. M. L. M.

    1984-01-01

    The effects of the seasonal variation of illumination over digital processing of LANDSAT images are evaluated. Original images are transformed by means of digital filtering to enhance their spatial features. The resulting images are used to obtain an unsupervised classification of relief units. After defining relief classes, which are supposed to be spectrally different, topographic variables (declivity, altitude, relief range and slope length) are used to identify the true relief units existing on the ground. The samples are also clustered by means of an unsupervised classification option. The results obtained for each LANDSAT overpass are compared. Digital processing is highly affected by illumination geometry. There is no correspondence between relief units as defined by spectral features and those resulting from topographic features.
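
    The unsupervised classification step can be sketched with k-means on the four topographic variables named here; the feature values below are synthetic stand-ins for pixels:

        # Cluster pixels into relief classes from topographic variables.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)
        # columns: declivity, altitude, relief range, slope length (synthetic)
        pixels = np.column_stack([rng.uniform(0, 30, 500),
                                  rng.uniform(200, 900, 500),
                                  rng.uniform(5, 120, 500),
                                  rng.uniform(10, 400, 500)])

        X = StandardScaler().fit_transform(pixels)   # variables differ in unit
        labels = KMeans(n_clusters=4, n_init=10,
                        random_state=0).fit_predict(X)
        print(np.bincount(labels))                   # pixels per relief unit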

  13. Transposon mutagenesis identifies genes and cellular processes driving epithelial-mesenchymal transition in hepatocellular carcinoma

    PubMed Central

    Kodama, Takahiro; Newberg, Justin Y.; Kodama, Michiko; Rangel, Roberto; Yoshihara, Kosuke; Tien, Jean C.; Parsons, Pamela H.; Wu, Hao; Finegold, Milton J.; Copeland, Neal G.; Jenkins, Nancy A.

    2016-01-01

    Epithelial-mesenchymal transition (EMT) is thought to contribute to metastasis and chemoresistance in patients with hepatocellular carcinoma (HCC), leading to their poor prognosis. The genes driving EMT in HCC are not yet fully understood, however. Here, we show that mobilization of Sleeping Beauty (SB) transposons in immortalized mouse hepatoblasts induces mesenchymal liver tumors on transplantation to nude mice. These tumors show significant down-regulation of epithelial markers, along with up-regulation of mesenchymal markers and EMT-related transcription factors (EMT-TFs). Sequencing of transposon insertion sites from tumors identified 233 candidate cancer genes (CCGs) that were enriched for genes and cellular processes driving EMT. Subsequent trunk driver analysis identified 23 CCGs that are predicted to function early in tumorigenesis and whose mutation or alteration in patients with HCC is correlated with poor patient survival. Validation of the top trunk drivers identified in the screen, including MET (MET proto-oncogene, receptor tyrosine kinase), GRB2-associated binding protein 1 (GAB1), HECT, UBA, and WWE domain containing 1 (HUWE1), lysine-specific demethylase 6A (KDM6A), and protein-tyrosine phosphatase, nonreceptor-type 12 (PTPN12), showed that deregulation of these genes activates an EMT program in human HCC cells that enhances tumor cell migration. Finally, deregulation of these genes in human HCC was found to confer sorafenib resistance through apoptotic tolerance and reduced proliferation, consistent with recent studies showing that EMT contributes to the chemoresistance of tumor cells. Our unique cell-based transposon mutagenesis screen appears to be an excellent resource for discovering genes involved in EMT in human HCC and potentially for identifying new drug targets. PMID:27247392

  14. Porting of the transfer-matrix method for multilayer thin-film computations on graphics processing units

    NASA Astrophysics Data System (ADS)

    Limmer, Steffen; Fey, Dietmar

    2013-07-01

    Thin-film computations are often a time-consuming task during optical design. An efficient way to accelerate these computations with the help of graphics processing units (GPUs) is described. It turned out that significant speed-ups can be achieved. We investigate the circumstances under which the best speed-up values can be expected. Therefore we compare different GPUs among themselves and with a modern CPU. Furthermore, the effect of thickness modulation on the speed-up and the runtime behavior depending on the input data is examined.
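
    The method being ported is compact: each layer contributes a 2 x 2 characteristic matrix, and the product over the stack gives its reflectance, which is what makes it attractive for massive parallelization over wavelengths or designs. A NumPy sketch at normal incidence with a made-up quarter-wave stack (not the paper's data):

        # Transfer-matrix reflectance of a multilayer at normal incidence.
        import numpy as np

        def layer_matrix(n, d, wl):
            delta = 2 * np.pi * n * d / wl          # phase thickness
            return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                             [1j * n * np.sin(delta), np.cos(delta)]])

        def reflectance(layers, n_in, n_out, wl):
            M = np.eye(2, dtype=complex)
            for n, d in layers:
                M = M @ layer_matrix(n, d, wl)
            (m11, m12), (m21, m22) = M
            num = n_in * m11 + n_in * n_out * m12 - m21 - n_out * m22
            den = n_in * m11 + n_in * n_out * m12 + m21 + n_out * m22
            return abs(num / den) ** 2

        wl = 550e-9                                  # design wavelength
        stack = [(2.35, wl / (4 * 2.35)),            # high-index quarter wave
                 (1.46, wl / (4 * 1.46))] * 8        # low-index quarter wave
        print(round(reflectance(stack, 1.0, 1.52, wl), 3))   # near 1.0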

  15. People detection method using graphics processing units for a mobile robot with an omnidirectional camera

    NASA Astrophysics Data System (ADS)

    Kang, Sungil; Roh, Annah; Nam, Bodam; Hong, Hyunki

    2011-12-01

    This paper presents a novel vision system for people detection using an omnidirectional camera mounted on a mobile robot. In order to determine regions of interest (ROI), we compute a dense optical flow map using graphics processing units, which enable us to examine compliance with the ego-motion of the robot in a dynamic environment. Shape-based classification algorithms are employed to sort ROIs into human beings and nonhumans. The experimental results show that the proposed system detects people more precisely than previous methods.
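
    A sketch of the dense optical-flow ROI step using OpenCV's Farneback estimator on two synthetic frames; a full system would additionally subtract the flow predicted by the robot's ego-motion before thresholding:

        # Dense optical flow on two frames; large-magnitude pixels become ROIs.
        import cv2
        import numpy as np

        prev_f = np.zeros((240, 320), np.uint8)
        next_f = prev_f.copy()
        cv2.rectangle(next_f, (100, 80), (140, 180), 255, -1)  # "moving" blob

        flow = cv2.calcOpticalFlowFarneback(prev_f, next_f, None,
                                            pyr_scale=0.5, levels=3,
                                            winsize=15, iterations=3,
                                            poly_n=5, poly_sigma=1.2, flags=0)
        magnitude = np.linalg.norm(flow, axis=2)
        roi_mask = magnitude > 1.0               # candidate person regions
        print(int(roi_mask.sum()), "flow pixels above threshold")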

  16. Integration Process for the Habitat Demonstration Unit

    NASA Technical Reports Server (NTRS)

    Gill, Tracy; Merbitz, Jerad; Kennedy, Kriss; Tn, Terry; Toups, Larry; Howe, A. Scott; Smitherman, David

    2011-01-01

    The Habitat Demonstration Unit (HDU) is an experimental exploration habitat technology and architecture test platform designed for analog demonstration activities. The HDU previously served as a test bed for testing technologies and sub-systems in a terrestrial surface environment in 2010, in the Pressurized Excursion Module (PEM) configuration. Due to the amount of work involved to make the HDU project successful, the project has required a team to integrate a variety of contributions from NASA centers and outside collaborators. The size of the team and the number of systems involved with the HDU make integration a complicated process. However, because the HDU shell manufacturing is complete, the team has a head start on FY-11 integration activities and can focus on integrating upgrades to existing systems as well as integrating new additions. To complete the development of the FY-11 HDU from conception to rollout for operations in July 2011, a cohesive integration strategy has been developed to integrate the various systems of the HDU and its payloads. The highlighted HDU work for FY-11 will focus on performing upgrades to the PEM configuration, adding the X-Hab as a second level, adding a new porch providing the astronauts a larger work area outside the HDU for EVA preparations, and adding a Hygiene module. Together these upgrades result in a prototype configuration of the Deep Space Habitat (DSH), an element under evaluation by NASA's Human Exploration Framework Team (HEFT). Scheduled activities include early fit-checks and the utilization of a habitat avionics test bed prior to installation into the HDU. A coordinated effort to utilize modeling and simulation systems has aided design and integration concept development; modeling tools have been effective in hardware systems layout, cable routing, sub-system interface length estimation, and human factors analysis. Decision processes on integration and use of all new subsystems will be defined early in the project.

  17. Critical Concerns for Oral Communication Education in the United States and the United Kingdom

    ERIC Educational Resources Information Center

    Emanuel, Richard

    2011-01-01

    An examination of oral communication education in the United States (U.S.) and United Kingdom (U.K.) identified four critical concerns: (1) Today's college students are not getting adequate oral communication education; (2) Oral communication education is being relegated to a "module" in another discipline-specific course; (3) When an…

  18. Decreasing laboratory turnaround time and patient wait time by implementing process improvement methodologies in an outpatient oncology infusion unit.

    PubMed

    Gjolaj, Lauren N; Gari, Gloria A; Olier-Pino, Angela I; Garcia, Juan D; Fernandez, Gustavo L

    2014-11-01

    Prolonged patient wait times in the outpatient oncology infusion unit indicated a need to streamline phlebotomy processes by using existing resources to decrease laboratory turnaround time and improve patient wait time. Using the DMAIC (define, measure, analyze, improve, control) method, a project to streamline phlebotomy processes within the outpatient oncology infusion unit of an academic Comprehensive Cancer Center, known as the Comprehensive Treatment Unit (CTU), was completed. Laboratory turnaround time for patients who needed same-day laboratory and CTU services and wait time for all CTU patients were tracked for 9 weeks. During the pilot, the wait time from arrival at the CTU to sitting in the treatment area decreased by 17% for all patients treated in the CTU. A total of 528 patients were seen at the CTU phlebotomy location, representing 16% of the total patients who received treatment in the CTU, with a mean turnaround time of 24 minutes compared with a baseline turnaround time of 51 minutes. Streamlining workflows and placing a phlebotomy station inside the CTU decreased laboratory turnaround times by 53% for patients requiring same-day laboratory and CTU services. The success of the pilot project prompted the team to make the station a permanent fixture. Copyright © 2014 by American Society of Clinical Oncology.

  19. Efficiency of the energy transfer in the FMO complex using hierarchical equations on Graphics Processing Units

    NASA Astrophysics Data System (ADS)

    Kramer, Tobias; Kreisbeck, Christoph; Rodriguez, Mirta; Hein, Birgit

    2011-03-01

    We study the efficiency of the energy transfer in the Fenna-Matthews-Olson complex by solving the non-Markovian hierarchical equations (HE) proposed by Ishizaki and Fleming in 2009, which properly include the reorganization process. We compare this approach to the Markovian one and find that Markovian dynamics overestimates the thermalization rate, yielding higher efficiencies than the HE. Using the high performance of graphics processing units (GPUs), we cover a large range of reorganization energies and temperatures and find that initial quantum beatings are important for the energy distribution but of limited influence on the efficiency. Our efficient GPU implementation of the HE allows us to calculate nonlinear spectra of the FMO complex. For references, see www.quantumdynamics.de
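
    The hierarchical equations are beyond a short sketch, but the efficiency measure itself is simple to illustrate. Below is a deliberately Markovian two-site toy model (Haken-Strobl-style dephasing plus a trapping term), accumulating the efficiency eta = 2*kappa*integral of the trap population over time; the non-Markovian reorganization physics that distinguishes the HE approach is exactly what this sketch omits. All parameters are hypothetical.

      import numpy as np

      J, dE = 1.0, 0.5          # inter-site coupling and energy gap
      gamma, kappa = 0.3, 0.1   # dephasing and trapping rates
      H = np.array([[0.0, J], [J, dE]], dtype=complex)
      trap = np.diag([0.0, kappa]).astype(complex)

      rho = np.zeros((2, 2), dtype=complex)
      rho[0, 0] = 1.0           # excitation starts on site 1
      dt, eta = 0.01, 0.0
      for _ in range(200_000):
          comm = -1j * (H @ rho - rho @ H)
          deph = -gamma * (rho - np.diag(np.diag(rho)))  # damp coherences
          sink = -(trap @ rho + rho @ trap)              # trapping loss
          rho = rho + dt * (comm + deph + sink)
          eta += 2.0 * kappa * rho[1, 1].real * dt
      print(f"transfer efficiency ~ {eta:.3f}")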

  20. Using Lean principles to manage throughput on an inpatient rehabilitation unit.

    PubMed

    Chiodo, Anthony; Wilke, Ruste; Bakshi, Rishi; Craig, Anita; Duwe, Doug; Hurvitz, Edward

    2012-11-01

    Performance improvement is a mainstay of operations management and maintenance of certification. In this study at a University Hospital inpatient rehabilitation unit, Lean management techniques were used to manage throughput of patients into and out of the inpatient rehabilitation unit. At the start of this process, the average admission time to the rehabilitation unit was 5:00 p.m., with a median time of 3:30 p.m., and no patients received therapy on the day of admission. Within 8 mos, the mean admission time was 1:22 p.m., 50% of the patients were on the rehabilitation unit by 1:00 p.m., and more than 70% of all patients received therapy on the day of admission. Negative variance from this performance was evaluated, the identification of inefficient discharges holding up admissions as a problem was identified, and a Lean workshop was initiated. Once this problem was tackled, the prime objective of 70% of patients receiving therapy on the date of admission was consistently met. Lean management tools are effective in improving throughput on an inpatient rehabilitation unit.

  1. Informatics for the Modern Intensive Care Unit.

    PubMed

    Anderson, Diana C; Jackson, Ashley A; Halpern, Neil A

    Advanced informatics systems can help improve health care delivery and the environment of care for critically ill patients. However, identifying, testing, and deploying advanced informatics systems can be quite challenging. These processes often require involvement from a collaborative group of health care professionals of varied disciplines with knowledge of the complexities related to designing the modern and "smart" intensive care unit (ICU). In this article, we explore the connectivity environment within the ICU, middleware technologies to address a host of patient care initiatives, and the core informatics concepts necessary for both the design and implementation of advanced informatics systems.

  2. Effects of the Scientific Argumentation Based Learning Process on Teaching the Unit of Cell Division and Inheritance to Eighth Grade Students

    ERIC Educational Resources Information Center

    Balci, Ceyda; Yenice, Nilgun

    2016-01-01

    The aim of this study is to analyse the effects of scientific argumentation based learning process on the eighth grade students' achievement in the unit of "cell division and inheritance". It also deals with the effects of this process on their comprehension about the nature of scientific knowledge, their willingness to take part in…

  3. Parallel design of JPEG-LS encoder on graphics processing units

    NASA Astrophysics Data System (ADS)

    Duan, Hao; Fang, Yong; Huang, Bormin

    2012-01-01

    With recent technical advances, graphics processing units (GPUs) have outperformed CPUs in terms of compute capability and memory bandwidth, and many successful GPU applications to high-performance computing have been reported. JPEG-LS is an ISO/IEC standard for lossless image compression which utilizes adaptive context modeling and run-length coding to improve the compression ratio. However, adaptive context modeling causes data dependency among adjacent pixels, and run-length coding has to be performed sequentially. Hence, using JPEG-LS to compress large-volume hyperspectral image data is quite time-consuming. We implement an efficient parallel JPEG-LS encoder for lossless hyperspectral compression on an NVIDIA GPU using the compute unified device architecture (CUDA) programming technology. We use a block-parallel strategy, as well as such CUDA techniques as coalesced global memory access, parallel prefix sum, and asynchronous data transfer. We also show the relation between GPU speedup and AVIRIS block size, as well as the relation between compression ratio and AVIRIS block size. When AVIRIS images are divided into blocks of 64×64 pixels each, we obtain the best GPU performance, a 26.3x speedup over the original CPU code.
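
    The CUDA encoder itself is not reproduced in the record; the NumPy sketch below shows two of the ingredients named above that make block parallelism work: the JPEG-LS median edge detector (MED) predictor, vectorized inside a block, and the division of the image into independent 64×64 blocks, each of which would map to one CUDA thread block. Border handling is simplified relative to the standard.

      import numpy as np

      def med_residuals(block):
          # JPEG-LS median edge detector (MED) prediction, vectorized.
          x = block.astype(np.int32)
          a = np.zeros_like(x); a[:, 1:] = x[:, :-1]     # left neighbor
          b = np.zeros_like(x); b[1:, :] = x[:-1, :]     # upper neighbor
          c = np.zeros_like(x); c[1:, 1:] = x[:-1, :-1]  # upper-left
          lo, hi = np.minimum(a, b), np.maximum(a, b)
          pred = np.where(c >= hi, lo, np.where(c <= lo, hi, a + b - c))
          return x - pred  # residuals for context modeling / coding

      def encode_blocks(image, bs=64):
          # Independent blocks: one CUDA thread block each in a GPU port;
          # a plain loop stands in for that here.
          h, w = image.shape
          return [med_residuals(image[i:i + bs, j:j + bs])
                  for i in range(0, h, bs) for j in range(0, w, bs)]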

  4. Volcanic Centers in the East Africa Rift: Volcanic Processes with Seismic Stresses to Identify Potential Hydrothermal Vents

    NASA Astrophysics Data System (ADS)

    Patlan, E.; Wamalwa, A. M.; Kaip, G.; Velasco, A. A.

    2015-12-01

    The Geothermal Development Company (GDC) in Kenya actively seeks to produce geothermal energy within the East African Rift System (EARS). The EARS, an active continental rift zone, appears to be a developing tectonic plate boundary and thus has a number of active as well as dormant volcanoes throughout its extent. These volcanic centers can be used as potential sources for geothermal energy. The University of Texas at El Paso (UTEP) and the GDC deployed seismic sensors to monitor several volcanic centers: Menengai, Silali, Paka, and Korosi. We identify microseismic local events and tilt-like events using automatic detection algorithms and manual review to find potential local earthquakes within our seismic network. We then apply the double-difference location method to events of local magnitude less than two to image the boundary of the magma chamber and the conduits feeding the volcanoes. In the process of locating local seismicity, we also identify long-period, explosion, and tremor signals that we interpret as magma passing through conduits of the magma chamber and/or fluid being transported as a function of magma movement or hydrothermal activity. We use waveform inversion and shear-wave splitting to approximate the orientation of the local stresses from the vent or fissure-like conduit of each volcano. The microseismic and long-period events will help us interpret the activity of the volcanoes. Our goal is to investigate basement structures beneath the volcanoes and identify the extent of magmatic modification of the crust. Overall, these seismic techniques will help us understand magma movement and volcanic processes in the region.
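
    The record does not name its detection algorithms; a classic STA/LTA trigger, shown below with ObsPy, is the standard first pass for such volcano monitoring networks and is offered only as a representative stand-in. The file name and trigger settings are hypothetical.

      from obspy import read
      from obspy.signal.trigger import classic_sta_lta, trigger_onset

      st = read("paka_station.mseed")     # hypothetical station file
      tr = st[0]
      df = tr.stats.sampling_rate
      # 0.5 s short-term / 10 s long-term averaging windows.
      cft = classic_sta_lta(tr.data, int(0.5 * df), int(10.0 * df))
      for on, off in trigger_onset(cft, 3.5, 1.0):  # on/off thresholds
          print(tr.stats.starttime + on / df, "to",
                tr.stats.starttime + off / df)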

  5. Distributive Distillation Enabled by Microchannel Process Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arora, Ravi

    The application of microchannel technology for distributive distillation was studied to achieve the Grand Challenge goals of 25% energy savings and 10% return on investment. In Task 1, a detailed study was conducted and two distillation systems were identified that would meet the Grand Challenge goals if microchannel distillation technology were used. Material and heat balance calculations were performed to develop process flowsheet designs for the two distillation systems in Task 2. The process designs focused on two methods of integrating the microchannel technology: 1) integrating microchannel distillation with an existing conventional column, and 2) microchannel distillation for new plants. A design concept for a modular microchannel distillation unit was developed in Task 3. In Task 4, ultrasonic additive machining (UAM) was evaluated as a manufacturing method for microchannel distillation units. However, it was found that significant development work would be required to establish process parameters before UAM could be used for commercial distillation manufacturing. Two alternate manufacturing methods were explored, and both were experimentally tested to confirm their validity. The conceptual design of the microchannel distillation unit (Task 3) was combined with the manufacturing methods developed in Task 4 and the flowsheet designs of Task 2 to estimate the cost of the microchannel distillation unit, which was compared to a conventional distillation column. The best results were for a methanol-water separation unit for use in a biodiesel facility. For this application, microchannel distillation was found to be more cost effective than the conventional system and capable of meeting the DOE Grand Challenge performance requirements.
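
    As a flavor of the Task 2 flowsheet calculations, the sketch below solves the overall and component material balances around a single methanol-water column; all stream numbers are hypothetical.

      # F = D + B (total), F*z = D*xD + B*xB (methanol)
      # => D = F * (z - xB) / (xD - xB)
      F, z = 100.0, 0.40       # feed, kmol/h, and methanol mole fraction
      xD, xB = 0.995, 0.01     # distillate and bottoms purities
      D = F * (z - xB) / (xD - xB)
      B = F - D
      print(f"distillate {D:.1f} kmol/h, bottoms {B:.1f} kmol/h")
      print(f"methanol recovery {(D * xD) / (F * z):.1%}")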

  6. Area-delay trade-offs of texture decompressors for a graphics processing unit

    NASA Astrophysics Data System (ADS)

    Novoa Súñer, Emilio; Ituero, Pablo; López-Vallejo, Marisa

    2011-05-01

    Graphics processing units have become a driving force for the microelectronics industry. However, due to intellectual property issues, there is a serious lack of information on the implementation details of the hardware architecture behind GPUs. For instance, the way texture is handled and decompressed in a GPU to reduce bandwidth usage has never been dealt with in depth from a hardware point of view. This work presents a comparative study of the hardware implementation of different texture decompression algorithms for both conventional (PCs and video game consoles) and mobile platforms. Circuit synthesis is performed targeting both a reconfigurable hardware platform and a 90 nm standard cell library. Area-delay trade-offs have been extensively analyzed, which allows us to compare the complexity of the decompressors and thus determine the suitability of each algorithm for systems with limited hardware resources.
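
    The hardware details are precisely what the record notes are scarce, so as a software reference point the sketch below decodes one 4×4 block of BC1/DXT1, a representative fixed-rate texture compression format of the kind such decompressors handle in silicon.

      import struct
      import numpy as np

      def decode_bc1_block(block8):
          # One 64-bit BC1/DXT1 block: two RGB565 endpoints plus sixteen
          # 2-bit palette indices, row-major from the top-left texel.
          c0, c1, bits = struct.unpack("<HHI", block8)

          def rgb565(v):
              r, g, b = (v >> 11) & 31, (v >> 5) & 63, v & 31
              return np.array([r * 255 // 31, g * 255 // 63, b * 255 // 31])

          p0, p1 = rgb565(c0), rgb565(c1)
          if c0 > c1:   # 4-color mode: two interpolated entries
              palette = [p0, p1, (2 * p0 + p1) // 3, (p0 + 2 * p1) // 3]
          else:         # 3-color mode: midpoint plus black
              palette = [p0, p1, (p0 + p1) // 2, np.zeros(3, dtype=int)]
          idx = [(bits >> (2 * i)) & 3 for i in range(16)]
          return np.array([palette[i] for i in idx]).reshape(4, 4, 3)

      # All-red block: endpoint 0 is pure red and every index selects it.
      texels = decode_bc1_block(struct.pack("<HHI", 0xF800, 0x001F, 0))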

  7. Development Status of a Power Processing Unit for Low Power Ion Thrusters

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Bowers, Glen E.; Lafontaine, Eric M.

    2000-01-01

    An advanced breadboard Power Processing Unit (PPU) for a low power ion propulsion system incorporating mass reduction techniques was designed and fabricated. Because of similar output current requirements, the discharge supply was also used to provide the neutralizer heater and discharge heater functions, with three relays switching the output connections. This multi-function supply reduces to four the number of power converters needed to produce the required six electrical outputs. Switching frequencies of 20 and 50 kHz were chosen as a compromise between the size of the magnetic components and switching losses. The advanced breadboard PPU is capable of a maximum total output power of 0.47 kW. Its component mass is 0.65 kg and its total mass is 1.9 kg. The total efficiency at full power is 0.89.

  8. Performance and scalability of Fourier domain optical coherence tomography acceleration using graphics processing units.

    PubMed

    Li, Jian; Bloch, Pavel; Xu, Jing; Sarunic, Marinko V; Shannon, Lesley

    2011-05-01

    Fourier domain optical coherence tomography (FD-OCT) provides faster line rates, better resolution, and higher sensitivity for noninvasive, in vivo biomedical imaging compared to traditional time domain OCT (TD-OCT). However, because the signal processing for FD-OCT is computationally intensive, real-time FD-OCT applications demand powerful computing platforms to deliver acceptable performance. Graphics processing units (GPUs) have been used as coprocessors to accelerate FD-OCT by leveraging their relatively simple programming model to exploit thread-level parallelism. Unfortunately, GPUs do not "share" memory with their host processors, requiring additional data transfers between the GPU and CPU. In this paper, we implement a complete FD-OCT accelerator on a consumer grade GPU/CPU platform. Our data acquisition system uses spectrometer-based detection and a dual-arm interferometer topology with numerical dispersion compensation for retinal imaging. We demonstrate that the maximum line rate is dictated by the memory transfer time and not the processing time due to the GPU platform's memory model. Finally, we discuss how the performance trends of GPU-based accelerators compare to the expected future requirements of FD-OCT data rates.
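
    The accelerator itself is not public; the NumPy sketch below shows the usual spectrometer-based FD-OCT chain the record describes (background subtraction, wavelength-to-wavenumber resampling, dispersion compensation, inverse FFT). Calibration inputs are hypothetical, and per the record's finding it is the host-device copy of the raw spectra, not this arithmetic, that bounds the GPU line rate.

      import numpy as np

      def fdoct_ascans(spectra, background, lam, phase_disp):
          # spectra: (n_lines, n_pix) raw spectrometer frames.
          s = spectra - background                 # fixed-pattern removal
          k = 2 * np.pi / lam                      # wavenumber per pixel
          k_lin = np.linspace(k.min(), k.max(), k.size)
          # np.interp needs ascending x, hence the [::-1] flips for a
          # wavelength axis that increases (so k decreases).
          s_k = np.stack([np.interp(k_lin, k[::-1], line[::-1])
                          for line in s])
          s_k = s_k * np.exp(1j * phase_disp)      # dispersion compensation
          return np.abs(np.fft.ifft(s_k, axis=1))[:, :k.size // 2]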

  9. Bayesian Treed Multivariate Gaussian Process with Adaptive Design: Application to a Carbon Capture Unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Konomi, Bledar A.; Karagiannis, Georgios; Sarkar, Avik

    2014-05-16

    Computer experiments (numerical simulations) are widely used in scientific research to study and predict the behavior of complex systems, which usually have responses consisting of a set of distinct outputs. The computational cost of high-resolution simulations is often prohibitive, making parametric studies at different input values impractical. To overcome these difficulties, we develop a Bayesian treed multivariate Gaussian process (BTMGP) as an extension of the Bayesian treed Gaussian process (BTGP) in order to model and evaluate a multivariate process. A suitable choice of covariance function and prior distributions facilitates the different Markov chain Monte Carlo (MCMC) moves. We utilize this model to sequentially sample the input space for the most informative values, taking into account model uncertainty and the expertise gained. A simulation study demonstrates the use of the proposed method and compares it with alternative approaches. We apply the sequential sampling technique and the BTMGP to model the multiphase flow in a full-scale regenerator of a carbon capture unit. The application presented in this paper is an important tool for research into carbon dioxide emissions from thermal power plants.
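
    The BTMGP is not in mainstream libraries, but its building block is: each leaf of the treed model fits an ordinary Gaussian process emulator to simulator runs. The scikit-learn sketch below fits a single-region, single-output emulator and picks the next design point where predictive uncertainty is largest, a bare-bones version of the sequential sampling idea. The simulator function is a hypothetical stand-in.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      def expensive_simulator(x):  # hypothetical stand-in response
          return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1])

      rng = np.random.default_rng(0)
      X = rng.uniform(0, 2, size=(40, 2))   # 40 completed simulator runs
      y = expensive_simulator(X)

      gp = GaussianProcessRegressor(ConstantKernel() * RBF([0.5, 0.5]),
                                    normalize_y=True).fit(X, y)
      cand = rng.uniform(0, 2, size=(500, 2))
      mu, sd = gp.predict(cand, return_std=True)
      print("next run at", cand[np.argmax(sd)])  # most informative point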

  10. [Nurse's concept in the managerial conception of a basic health unit].

    PubMed

    Passos, Joanir Pereira; Ciosak, Suely Itsuko

    2006-12-01

    This study is part of a larger survey called "Use of indicators in nurses' managerial practice in Basic Health Care Units in the city of Rio de Janeiro", which was carried out in the Basic Health Care Units of the Planning Area 5.3 and whose objectives were to identify nurses' conception regarding the tools required for management in those units and to discuss the role of management in organizing health services. The study is descriptive and data were collected in interviews with seven nurse managers. The results show that health services actions are organized and directed to the purpose of the working process through the relationship established between the object, the instruments and the final product, and that for those nurses the end result to be achieved is client's satisfaction and the quality of medical and nursing care.

  11. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit.

    PubMed

    Badal, Andreu; Badano, Aldo

    2009-11-01

    It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: the use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA™ programming model (NVIDIA Corporation, Santa Clara, CA). An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed-up factor was obtained using a GPU compared to a single-core CPU. The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
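
    PENELOPE's physics models are far richer than anything reproducible here; the toy random walk below only shows why this workload suits GPUs: every photon history is independent, so one history per thread parallelizes trivially. Medium parameters are hypothetical.

      import numpy as np

      rng = np.random.default_rng(1)
      mu_t, albedo = 1.0, 0.8   # total attenuation (1/cm), scatter fraction
      depths = np.zeros(20_000)

      for i in range(depths.size):   # each history -> one GPU thread
          pos = np.zeros(3)
          d = np.array([0.0, 0.0, 1.0])
          while True:
              pos = pos + d * rng.exponential(1.0 / mu_t)  # free path
              if rng.random() > albedo:   # absorbed: tally depth, stop
                  depths[i] = pos[2]
                  break
              cos_t = 2 * rng.random() - 1        # isotropic scatter
              phi = 2 * np.pi * rng.random()
              sin_t = np.sqrt(1 - cos_t ** 2)
              d = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])

      print("mean absorption depth:", depths.mean())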

  12. Collaboration as a process and an outcome: Consumer experiences of collaborating with nurses in care planning in an acute inpatient mental health unit.

    PubMed

    Reid, Rebecca; Escott, Phil; Isobel, Sophie

    2018-04-14

    This qualitative study explores inpatient mental health consumer perceptions of how collaborative care planning with mental health nurses impacts personal recovery. Semi-structured interviews were conducted with consumers close to discharge from one unit in Sydney, Australia. The unit had been undertaking a collaborative care planning project which encouraged nurses to use care plan documentation to promote person-centred and goal-focussed interactions and the development of meaningful strategies to aid consumer recovery. The interviews explored consumer understandings of the collaborative care planning process, perceptions of the utility of the care plan document and the process of collaborating with the nurses, and their perception of the impact of collaboration on their recovery. Findings are presented under four organizing themes: the process of collaborating, the purpose of collaborating, the nurse as collaborator and the role of collaboration in wider care and recovery. Consumers highlighted the importance of the process of developing their care plan with a nurse as being as helpful for recovery as the goals and strategies themselves. The findings provide insights into consumers' experiences of care planning in an acute inpatient unit, the components of care that support recovery and highlight specific areas for mental health nursing practice improvement in collaboration. © 2018 Australian College of Mental Health Nurses Inc.

  13. Vocabulary Development and Maintenance--Identifiers. ERIC Processing Manual, Section VIII (Part 2).

    ERIC Educational Resources Information Center

    Weller, Carolyn R., Ed.; Houston, Jim, Ed.

    Comprehensive rules, guidelines, and examples are provided for use by ERIC indexers and lexicographers in creating and using Identifiers, and in developing and maintaining the ERIC Identifier file via the "Identifier Authority List (IAL)." Identifiers and the IAL are defined/described: Identifiers are highly specific entities, including…

  14. PGAS in-memory data processing for the Processing Unit of the Upgraded Electronics of the Tile Calorimeter of the ATLAS Detector

    NASA Astrophysics Data System (ADS)

    Ohene-Kwofie, Daniel; Otoo, Ekow

    2015-10-01

    The ATLAS detector, operated at the Large Hadron Collider (LHC) at CERN, records proton-proton collisions every 50 ns, resulting in a sustained data flow of up to petabytes per second. The upgraded Tile Calorimeter of the ATLAS experiment will sustain about 5 PB/s of digital throughput. These massive data rates require extremely fast data capture and processing. Although there has been a steady increase in the processing speed of CPUs/GPGPUs assembled for high-performance computing, the rate of data input and output, even under parallel I/O, has not kept up with the general increase in computing speeds. The problem, then, is whether one can implement an I/O subsystem infrastructure capable of matching the computational speeds of advanced computing systems at the petascale and exascale level. We propose a system architecture that leverages the Partitioned Global Address Space (PGAS) model of computing to maintain an in-memory data store for the Processing Unit (PU) of the upgraded electronics of the Tile Calorimeter, which is proposed as a high-throughput general-purpose co-processor to the sROD of the upgraded Tile Calorimeter. The physical memories of the PUs are aggregated into a large global logical address space using RDMA-capable interconnects such as PCI-Express to enhance data processing throughput.
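
    No runtime code accompanies the record; as a single-node analogue of the PGAS idea, the sketch below lets several processes address one flat shared buffer directly through Python's multiprocessing.shared_memory, the zero-copy partitioned-access pattern that a real PGAS runtime extends across nodes over RDMA interconnects.

      import numpy as np
      from multiprocessing import Process, shared_memory

      N = 1_000_000

      def worker(name, lo, hi):
          shm = shared_memory.SharedMemory(name=name)
          store = np.ndarray((N,), dtype=np.float64, buffer=shm.buf)
          store[lo:hi] = np.arange(lo, hi)  # write this rank's partition
          shm.close()

      if __name__ == "__main__":
          shm = shared_memory.SharedMemory(create=True, size=8 * N)
          step = N // 4
          ps = [Process(target=worker, args=(shm.name, r * step, (r + 1) * step))
                for r in range(4)]
          for p in ps: p.start()
          for p in ps: p.join()
          data = np.ndarray((N,), dtype=np.float64, buffer=shm.buf)
          print(data[step - 1], data[step])  # partitions meet seamlessly
          shm.close(); shm.unlink()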

  15. Viscoelastic Finite Difference Modeling Using Graphics Processing Units

    NASA Astrophysics Data System (ADS)

    Fabien-Ouellet, G.; Gloaguen, E.; Giroux, B.

    2014-12-01

    Full-waveform seismic modeling requires a huge amount of computing power that still challenges today's technology, which limits the applicability of powerful processing approaches in seismic exploration such as full-waveform inversion. This paper explores the use of graphics processing units (GPUs) to compute a time-domain finite-difference solution to the viscoelastic wave equation. The aim is to investigate whether adopting GPU technology can significantly reduce the computing time of simulations. The code presented herein is based on the freely available 2D software of Bohlen (2002), provided under the GNU General Public License (GPL). The implementation uses a second-order centered-difference scheme to approximate time derivatives and staggered-grid schemes with centered differences of order 2, 4, 6, 8, and 12 for spatial derivatives. The code is fully parallel and written using the Message Passing Interface (MPI), and it thus supports simulations of vast seismic models on a cluster of CPUs. To port the code of Bohlen (2002) to GPUs, the OpenCL framework was chosen for its ability to work on both CPUs and GPUs and its adoption by most GPU manufacturers. In our implementation, OpenCL works in conjunction with MPI, which allows computations on a cluster of GPUs for large-scale model simulations. We tested our code for model sizes between 100² and 6000² elements. Comparison shows a decrease in computation time of more than two orders of magnitude between the GPU implementation run on an AMD Radeon HD 7950 and the CPU implementation run on a 2.26 GHz Intel Xeon Quad-Core. The speed-up varies with the order of the finite-difference approximation and generally increases for higher orders. Increasing speed-ups are also obtained for increasing model sizes, which can be explained by the amortization of kernel overheads and of the delays introduced by memory transfers to and from the GPU through the PCI-E bus. Those tests indicate that the GPU memory size
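
    Bohlen's viscoelastic code is the record's actual starting point and is much larger than a sketch; the kernel below is a minimal acoustic, second-order 2D stencil update of the same structural kind, the per-grid-point work that the OpenCL port assigns to GPU work-items. Grid and source values are hypothetical.

      import numpy as np

      nx = nz = 300
      dx, dt, c = 5.0, 5e-4, 3000.0        # m, s, m/s
      assert c * dt / dx < 1 / np.sqrt(2)  # CFL stability condition
      p_old = np.zeros((nz, nx)); p = np.zeros((nz, nx))
      lap = np.zeros((nz, nx))

      for it in range(1000):
          lap[1:-1, 1:-1] = (p[2:, 1:-1] + p[:-2, 1:-1] + p[1:-1, 2:]
                             + p[1:-1, :-2] - 4 * p[1:-1, 1:-1]) / dx ** 2
          p_new = 2 * p - p_old + (c * dt) ** 2 * lap
          p_new[nz // 2, nx // 2] += np.sin(2 * np.pi * 25 * it * dt)  # 25 Hz
          p_old, p = p, p_new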

  16. Sanitary Engineering Unit Operations and Unit Processes Laboratory Manual.

    ERIC Educational Resources Information Center

    American Association of Professors in Sanitary Engineering.

    This manual contains a compilation of experiments in Physical Operations, Biological and Chemical Processes for various education and equipment levels. The experiments are designed to be flexible so that they can be adapted to fit the needs of a particular program. The main emphasis is on hands-on student experiences to promote understanding.…

  17. Making things happen through challenging goals: leader proactivity, trust, and business-unit performance.

    PubMed

    Crossley, Craig D; Cooper, Cecily D; Wernsing, Tara S

    2013-05-01

    Building on decades of research on the proactivity of individual performers, this study integrates research on goal setting and trust in leadership to examine manager proactivity and business unit sales performance in one of the largest sales organizations in the United States. Results of a moderated-mediation model suggest that proactive senior managers establish more challenging goals for their business units (N = 50), which in turn are associated with higher sales performance. We further found that employees' trust in the manager is a critical contingency variable that facilitates the relationship between challenging sales goals and subsequent sales performance. This research contributes to growing literatures on trust in leadership and proactivity by studying their joint effects at a district-unit level of analysis while identifying district managers' tendency to set challenging goals as a process variable that helps translate their proactivity into the collective performance of their units. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  18. Using stable isotopes to identify the scaling effects of riparian peatlands on runoff generation processes and DOC mobilisation

    NASA Astrophysics Data System (ADS)

    Tunaley, Claire; Tetzlaff, Doerthe; Soulsby, Chris

    2017-04-01

    Knowledge of hydrological sources, flow paths, and their connectivity is fundamental to understanding stream flow generation and surface water quality in peatlands. Stable isotopes are proven tools for tracking the sources and flow paths of runoff. However, relatively few studies have used isotopes in peat-dominated catchments. Here, we combined 13 months (June 2014 - July 2015) of daily isotope measurements in stream water with daily DOC and 15-minute FDOM (fluorescent component of dissolved organic matter) data, at three nested scales in NE Scotland, to identify the hydrological processes occurring in riparian peatlands. We investigated how runoff generation processes in a small, riparian peatland dominated headwater catchment (0.65 km2) propagate to larger scales (3.2 km2 and 31 km2) with a decreasing percentage of riparian peatland coverage. Isotope damping was most pronounced in the 0.65 km2 catchment due to high water storage in the organic soils, which encouraged tracer mixing and resulted in attenuated runoff peaks. At the largest scale, stream flow and water isotope dynamics showed a flashier response. Particularly insightful in this study was calculating the deviation of the isotopes from the local meteoric water line, the lc-excess. The lc-excess revealed evaporative fractionation in the peatland-dominated catchment, particularly during summer low flows. This implied high hydrological connectivity in the form of constant seepage from the peatlands sustaining high baseflows at the headwater scale. This constant connectivity resulted in high DOC concentrations at the peatland site during baseflow (~5 mg l-1). In contrast, at the larger scales, DOC was minimal during low flows (~2 mg l-1) due to increased groundwater influence and the disconnection between DOC sources and the stream. Insights into event dynamics through the analysis of DOC hysteresis loops showed slight dilution on the rising limb, the strong influence of dry antecedent conditions and a
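
    The lc-excess used above is the line-conditioned excess of Landwehr and Coplen: lc-excess = δ2H - a·δ18O - b, where a and b are the slope and intercept of the local meteoric water line. A minimal sketch with hypothetical LMWL coefficients and sample values:

      import numpy as np

      def lc_excess(d2H, d18O, a, b):
          # Deviation from the local meteoric water line d2H = a*d18O + b;
          # negative values flag evaporative fractionation.
          return np.asarray(d2H) - a * np.asarray(d18O) - b

      a, b = 7.7, 5.1  # hypothetical LMWL slope and intercept (per mil)
      d18O = np.array([-8.2, -7.6, -6.9])
      d2H = np.array([-59.0, -56.5, -50.2])
      print(lc_excess(d2H, d18O, a, b))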

  19. Computerized nursing process in the Intensive Care Unit: ergonomics and usability.

    PubMed

    Almeida, Sônia Regina Wagner de; Sasso, Grace Teresinha Marcon Dal; Barra, Daniela Couto Carvalho

    2016-01-01

    Analyzing the ergonomics and usability criteria of the Computerized Nursing Process based on the International Classification for Nursing Practice in the Intensive Care Unit according to the International Organization for Standardization (ISO). A quantitative, quasi-experimental, before-and-after study with a sample of 16 participants, performed in an Intensive Care Unit. Data collection was performed through the application of five simulated clinical cases and an evaluation instrument. Data analysis was performed by descriptive and inferential statistics. The organization, content, and technical criteria were considered "excellent", and the interface criteria were considered "very good", obtaining means of 4.54, 4.60, 4.64, and 4.39, respectively. The analyzed standards obtained means above 4.0, being considered "very good" by the participants. The Computerized Nursing Process met ergonomic and usability standards according to those set by ISO. This technology supports nurses' clinical decision-making by providing complete and up-to-date content for Nursing practice in the Intensive Care Unit.

  20. Identification of Action Units Related to Affective States in a Tutoring System for Mathematics

    ERIC Educational Resources Information Center

    Padrón-Rivera, Gustavo; Rebolledo-Mendez, Genaro; Parra, Pilar Pozos; Huerta-Pacheco, N. Sofia

    2016-01-01

    Affect is an important element of the learning process both in the classroom and with educational technology. This paper presents analyses in relation to the identification of Action Units (AUs) related to affective states and their impact on learning with a tutoring system. To assess affect, a tool was devised to identify AUs on pictures of human…