Sample records for standard central processing

  1. DEC Personnel Preparation Standards: Revision 2005-2008

    ERIC Educational Resources Information Center

    Lifter, Karin; Chandler, Lynette K.; Cochran, Deborah C.; Dinnebeil, Laurie A.; Gallagher, Peggy A.; Christensen, Kimberly A.; Stayton, Vicki D.

    2011-01-01

    The revision and process of validation of standards for early childhood special education (ECSE) and early intervention (EI) personnel at the initial and advanced levels of preparation, which occurred during 2005-2008, are described to provide a record of the process and to inform future cycles of standards revision. Central components focus on…

  2. Multicenter Cell Processing for Cardiovascular Regenerative Medicine Applications - The Cardiovascular Cell Therapy Research Network (CCTRN) Experience

    PubMed Central

    Gee, Adrian P.; Richman, Sara; Durett, April; McKenna, David; Traverse, Jay; Henry, Timothy; Fisk, Diann; Pepine, Carl; Bloom, Jeannette; Willerson, James; Prater, Karen; Zhao, David; Koç, Jane Reese; Ellis, Steven; Taylor, Doris; Cogle, Christopher; Moyé, Lemuel; Simari, Robert; Skarlatos, Sonia

    2013-01-01

    Background & Aims: Multicenter cellular therapy clinical trials require the establishment and implementation of standardized cell processing protocols and associated quality control mechanisms. The aims here were to develop such an infrastructure in support of the Cardiovascular Cell Therapy Research Network (CCTRN) and to report on the results of processing for the first 60 patients. Methods: Standardized cell preparations, consisting of autologous bone marrow mononuclear cells prepared using the Sepax device, were manufactured at each of the five processing facilities that supported the clinical treatment centers. Processing staff underwent centralized training that included proficiency evaluation. Quality was subsequently monitored by a central quality control program that included product evaluation by the CCTRN biorepositories. Results: Data from the first 60 procedures demonstrate that uniform products meeting all release criteria could be manufactured at all five sites within 7 hours of receipt of the bone marrow. Uniformity was facilitated by use of automated systems (the Sepax for processing and the Endosafe device for endotoxin testing), standardized procedures, and centralized quality control. Conclusions: Complex multicenter cell therapy and regenerative medicine protocols can, where necessary, successfully utilize local processing facilities once an effective infrastructure is in place to provide training and quality control. PMID:20524773

  3. Addressing the medicinal chemistry bottleneck: a lean approach to centralized purification.

    PubMed

    Weller, Harold N; Nirschl, David S; Paulson, James L; Hoffman, Steven L; Bullock, William H

    2012-09-10

    The use of standardized lean manufacturing principles to improve drug discovery productivity is often thought to be at odds with fostering innovation. This manuscript describes how selective implementation of a lean optimized process, in this case centralized purification for medicinal chemistry, can improve operational productivity and increase scientist time available for innovation. A description of the centralized purification process is provided along with both operational and impact (productivity) metrics, which indicate lower cost, higher output, and presumably more free time for innovation as a result of the process changes described.

  4. Noise Equally Degrades Central Auditory Processing in 2- and 4-Year-Old Children

    ERIC Educational Resources Information Center

    Niemitalo-Haapola, Elina; Haapala, Sini; Kujala, Teija; Raappana, Antti; Kujala, Tiia; Jansson-Verkasalo, Eira

    2017-01-01

    Purpose: The aim of this study was to investigate developmental and noise-induced changes in central auditory processing indexed by event-related potentials in typically developing children. Method: P1, N2, and N4 responses as well as mismatch negativities (MMNs) were recorded for standard syllables and consonants, frequency, intensity, vowel, and…

  5. Monte Carlo based toy model for fission process

    NASA Astrophysics Data System (ADS)

    Kurniadi, R.; Waris, A.; Viridi, S.

    2014-09-01

    There are many models and calculation techniques for obtaining a visible image of the fission yield process. In particular, fission yield can be calculated using two approaches, namely the macroscopic approach and the microscopic approach. This work proposes another approach in which the nucleus is treated as a toy model; hence, the fission process does not completely represent the real fission process in nature. The toy model is formed by a Gaussian distribution of random numbers that randomizes distances, such as the distance between a particle and a central point. The scission process is started by smashing the compound nucleus central point into two parts, the left central and right central points. These three points have different Gaussian distribution parameters, namely mean (μCN, μL, μR) and standard deviation (σCN, σL, σR). By overlaying the three distributions, the number of particles (NL, NR) trapped by the central points can be obtained. This process is iterated until (NL, NR) become constant. The smashing process is repeated by changing σL and σR randomly.
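The iterative trapping scheme sketched in this abstract resembles a one-dimensional, two-centre clustering loop. The sketch below is an illustrative reading of that scheme, not the authors' actual algorithm; the parameter values, initial placement of the smashed centres, and the recentring rule are all assumptions.

```python
import random

def toy_fission(n_particles=1000, mu_cn=0.0, sigma_cn=1.0,
                sigma_l=0.8, sigma_r=0.8, max_iter=100, seed=42):
    """Toy scission: a Gaussian cloud of particles around the compound-nucleus
    centre is partitioned between a left and a right fragment centre until
    the trapped counts (NL, NR) stop changing."""
    rng = random.Random(seed)
    # Gaussian cloud of particle positions around the compound nucleus
    particles = [rng.gauss(mu_cn, sigma_cn) for _ in range(n_particles)]
    # assumed initial positions of the two smashed central points
    mu_l, mu_r = mu_cn - sigma_cn, mu_cn + sigma_cn
    n_l = n_r = -1
    for _ in range(max_iter):
        # a particle is trapped by the centre with the smaller scaled distance
        left = [x for x in particles
                if abs(x - mu_l) / sigma_l < abs(x - mu_r) / sigma_r]
        right = [x for x in particles
                 if abs(x - mu_l) / sigma_l >= abs(x - mu_r) / sigma_r]
        if (len(left), len(right)) == (n_l, n_r):  # (NL, NR) constant: stop
            break
        n_l, n_r = len(left), len(right)
        if left:                    # recentre each fragment on its particles
            mu_l = sum(left) / len(left)
        if right:
            mu_r = sum(right) / len(right)
    return n_l, n_r
```

Repeating the call while drawing new σL and σR values would mimic the paper's repeated smashing step.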

  6. National Education Standards: The Complex Challenge for Educational Leaders.

    ERIC Educational Resources Information Center

    Faidley, Ray; Musser, Steven

    1991-01-01

    National standards for education are important elements in the excellence process, but standards imposed by a central authority simply do not work in the Information Era. It would be wise to increase teachers' decision-making role in establishing and implementing local level excellence standards and train teachers to employ the Japanese "kaizen"…

  7. Preliminary design review package for the solar heating and cooling central data processing system

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The Central Data Processing System (CDPS) is designed to transform the raw data collected at remote sites into performance evaluation information for assessing the performance of solar heating and cooling systems. Software requirements for the CDPS are described. The programming standards to be used in development, documentation, and maintenance of the software are discussed along with the CDPS operations approach in support of daily data collection and processing.

  8. 45 CFR 95.613 - Procurement standards.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) Automatic Data Processing Equipment and Services-Conditions for Federal Financial Participation (FFP... conditions for prior approval. Those standards include a requirement for maximum practical open and free... State or local agency, and the ADP services and equipment acquired by a State or local Central Data...

  9. [Study on effect of 3 types of drinking water emergent disinfection models in flood/waterlog areas].

    PubMed

    Ban, Haiqun; Li, Jin; Li, Xinwu; Zhang, Liubo

    2010-09-01

    To establish 3 drinking water emergent disinfection processing models, separated medicate dispensing, specific duty medicate dispensing, and centralized filtering, in flood/waterlog areas, and to compare the effects of these 3 models on drinking water disinfection. From October to December 2008, 18 villages were selected as the trial field in Yanglinwei town, Xiantao city, Hubei province, and were divided into three groups: separated medicate dispensing, specific duty medicate dispensing, and centralized filtering. Every 2 weeks, drinking water source water, water yielded by the emergency central filtrate water equipment (ECFWE), and container water in the kitchen were sampled, and microbial indices of the water samples, standard plate-count bacteria, total coliforms, thermotolerant coliform bacteria, and Escherichia coli, were measured. The microbial pollution of the water from these 3 water source groups was heavy; all samples failed. The elimination rate of standard plate-count bacteria by the drinking water emergent centralized processing equipment was 99.95%; the rates for separated medicate dispensing, specific duty medicate dispensing, and centralized filtering were 81.93%, 99.67%, and 98.28%, respectively. The passing rates of the microbial indices of residents' container water were 13.33%, 70.00%, and 43.33%, respectively. These differences were statistically significant. The drinking water disinfection effects of the centralized filtering model and of the specific duty medicate dispensing model were better than that of the separated medicate dispensing model in the flood/waterlog areas.

  10. The Central Limit Theorem for Supercritical Oriented Percolation in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Tzioufas, Achillefs

    2018-04-01

    We consider the cardinality of supercritical oriented bond percolation in two dimensions. We show that, whenever the origin is conditioned to percolate, the process appropriately normalized converges asymptotically in distribution to the standard normal law. This resolves a longstanding open problem pointed out in several instances in the literature. The result applies also to the continuous-time analog of the process, viz. the basic one-dimensional contact process. We also derive general random-indices central limit theorems for associated random variables as byproducts of our proof.
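The normalization the theorem refers to, centring by the mean and dividing by the standard deviation, can be illustrated on a much simpler i.i.d. surrogate. The Bernoulli sums below are a stand-in for the percolation cluster counts, not the authors' process; all parameter values are illustrative.

```python
import random
import statistics

def normalized_sums(n=500, trials=2000, p=0.7, seed=1):
    """Centre and scale sums of i.i.d. Bernoulli(p) draws; by the classical
    central limit theorem the results are approximately standard normal."""
    rng = random.Random(seed)
    mu = n * p                          # mean of the sum
    sigma = (n * p * (1 - p)) ** 0.5    # standard deviation of the sum
    return [(sum(rng.random() < p for _ in range(n)) - mu) / sigma
            for _ in range(trials)]

zs = normalized_sums()
# the empirical mean should be near 0 and the empirical spread near 1
m, s = statistics.mean(zs), statistics.stdev(zs)
```

Conditioning on percolation makes the count a random-index sum, which is why the paper needs the random-indices central limit theorems it derives.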

  12. Cracks in Continuing Education's Mirror and a Fix To Correct Its Distorted Internal and External Image.

    ERIC Educational Resources Information Center

    Loch, John R.

    2003-01-01

    Outlines problems in continuing higher education, suggesting that it lacks (1) a standard name; (2) a unified voice on national issues; (3) a standard set of roles and functions; (4) a standard title for the chief administrative officer; (5) an accreditation body and process; and (6) resolution of the centralization/decentralization issue. (SK)

  13. DNA methylation-based classification of central nervous system tumours.

    PubMed

    Capper, David; Jones, David T W; Sill, Martin; Hovestadt, Volker; Schrimpf, Daniel; Sturm, Dominik; Koelsche, Christian; Sahm, Felix; Chavez, Lukas; Reuss, David E; Kratz, Annekathrin; Wefers, Annika K; Huang, Kristin; Pajtler, Kristian W; Schweizer, Leonille; Stichel, Damian; Olar, Adriana; Engel, Nils W; Lindenberg, Kerstin; Harter, Patrick N; Braczynski, Anne K; Plate, Karl H; Dohmen, Hildegard; Garvalov, Boyan K; Coras, Roland; Hölsken, Annett; Hewer, Ekkehard; Bewerunge-Hudler, Melanie; Schick, Matthias; Fischer, Roger; Beschorner, Rudi; Schittenhelm, Jens; Staszewski, Ori; Wani, Khalida; Varlet, Pascale; Pages, Melanie; Temming, Petra; Lohmann, Dietmar; Selt, Florian; Witt, Hendrik; Milde, Till; Witt, Olaf; Aronica, Eleonora; Giangaspero, Felice; Rushing, Elisabeth; Scheurlen, Wolfram; Geisenberger, Christoph; Rodriguez, Fausto J; Becker, Albert; Preusser, Matthias; Haberler, Christine; Bjerkvig, Rolf; Cryan, Jane; Farrell, Michael; Deckert, Martina; Hench, Jürgen; Frank, Stephan; Serrano, Jonathan; Kannan, Kasthuri; Tsirigos, Aristotelis; Brück, Wolfgang; Hofer, Silvia; Brehmer, Stefanie; Seiz-Rosenhagen, Marcel; Hänggi, Daniel; Hans, Volkmar; Rozsnoki, Stephanie; Hansford, Jordan R; Kohlhof, Patricia; Kristensen, Bjarne W; Lechner, Matt; Lopes, Beatriz; Mawrin, Christian; Ketter, Ralf; Kulozik, Andreas; Khatib, Ziad; Heppner, Frank; Koch, Arend; Jouvet, Anne; Keohane, Catherine; Mühleisen, Helmut; Mueller, Wolf; Pohl, Ute; Prinz, Marco; Benner, Axel; Zapatka, Marc; Gottardo, Nicholas G; Driever, Pablo Hernáiz; Kramm, Christof M; Müller, Hermann L; Rutkowski, Stefan; von Hoff, Katja; Frühwald, Michael C; Gnekow, Astrid; Fleischhack, Gudrun; Tippelt, Stephan; Calaminus, Gabriele; Monoranu, Camelia-Maria; Perry, Arie; Jones, Chris; Jacques, Thomas S; Radlwimmer, Bernhard; Gessi, Marco; Pietsch, Torsten; Schramm, Johannes; Schackert, Gabriele; Westphal, Manfred; Reifenberger, Guido; Wesseling, Pieter; Weller, Michael; Collins, Vincent Peter; Blümcke, 
Ingmar; Bendszus, Martin; Debus, Jürgen; Huang, Annie; Jabado, Nada; Northcott, Paul A; Paulus, Werner; Gajjar, Amar; Robinson, Giles W; Taylor, Michael D; Jaunmuktane, Zane; Ryzhova, Marina; Platten, Michael; Unterberg, Andreas; Wick, Wolfgang; Karajannis, Matthias A; Mittelbronn, Michel; Acker, Till; Hartmann, Christian; Aldape, Kenneth; Schüller, Ulrich; Buslei, Rolf; Lichter, Peter; Kool, Marcel; Herold-Mende, Christel; Ellison, David W; Hasselblatt, Martin; Snuderl, Matija; Brandner, Sebastian; Korshunov, Andrey; von Deimling, Andreas; Pfister, Stefan M

    2018-03-22

    Accurate pathological diagnosis is crucial for optimal management of patients with cancer. For the approximately 100 known tumour types of the central nervous system, standardization of the diagnostic process has been shown to be particularly challenging, with substantial inter-observer variability in the histopathological diagnosis of many tumour types. Here we present a comprehensive approach for the DNA methylation-based classification of central nervous system tumours across all entities and age groups, and demonstrate its application in a routine diagnostic setting. We show that the availability of this method may have a substantial impact on diagnostic precision compared to standard methods, resulting in a change of diagnosis in up to 12% of prospective cases. For broader accessibility, we have designed a free online classifier tool, the use of which does not require any additional onsite data processing. Our results provide a blueprint for the generation of machine-learning-based tumour classifiers across other cancer entities, with the potential to fundamentally transform tumour pathology.

  14. Cataloging, Processing, Administering AV Materials. A Model for Wisconsin Schools. Revised, 1974.

    ERIC Educational Resources Information Center

    Little, Robert David, Ed.; And Others

    The Wisconsin Association of School Librarians has produced a manual for standardized processing of all nonprint media, based on two principles: (1) the media should be centralized, organized, and administered for maximum access; and (2) content is more important than form. Definitions, cataloging, processing, housing, circulation, and care are…

  15. Biospecimen Core Resource - TCGA

    Cancer.gov

    The Cancer Genome Atlas (TCGA) Biospecimen Core Resource centralized laboratory reviews and processes blood and tissue samples and their associated data using optimized standard operating procedures for the entire TCGA Research Network.

  16. A Discussion of Change Theory, System Theory, and State Designed Standards and Accountability Initiatives.

    ERIC Educational Resources Information Center

    McNeal, Larry; Christy, W. Keith

    This brief paper is a presentation that preceded another case of considering the ongoing dialogue on the advantages and disadvantages of centralized and decentralized school-improvement processes. It attempts to raise a number of questions about the relationship between state-designed standards and accountability initiatives and change and…

  17. A School Leadership Faculty Struggles for Democracy: Leadership Education Priorities for a Democratic Society

    ERIC Educational Resources Information Center

    Bogotch, Ira

    2010-01-01

    For the past 3 years, the educational leadership faculty at Florida Atlantic University (FAU) has been engaged in program reform and curricular innovation. The reform process was initiated by centralized external authorities--a combination of international rankings, national accreditation bodies, national leadership standards, and state standards,…

  18. Microgliomatosis in a Schnauzer dog.

    PubMed

    Willard, M D; Delahunta, A

    1982-04-01

    Microgliomatosis was found in the central nervous system of a 7-year-old male Standard Schnauzer. History, neurologic examination, laboratory tests and electrodiagnostics could not localize the disease process in the central nervous system. The animal was not treated, continued to deteriorate, and was euthanatized approximately 8 weeks after clinical signs were first detected. Diagnosis was made upon histologic examination of the brain.

  19. 40 CFR 63.161 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... good engineering judgement and standards, such as ANSI B31-3. In food/medical service means that a... manufacture a Food and Drug Administration regulated product where leakage of a barrier fluid into the process..., storage at the chemical manufacturing process unit to which the records pertain, or storage in central...

  20. The Local and the National in a Diverse County: Objectification as a Social and Policy Process

    ERIC Educational Resources Information Center

    Kuipers, Joel

    2008-01-01

    The rich diversity of suburban Washington DC's ethnic and linguistic communities flourishes side by side with some of the most powerful homogenizing forces in the United States: the National Bureau of Standards, National Institutes of Health, Food and Drug Administration, and Department of Education. This concern on accommodating centralized standards--not…

  1. Quality control and assurance for validation of DOS/I measurements

    NASA Astrophysics Data System (ADS)

    Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.

    2010-02-01

    Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement QA/QC procedures: 1. Standardize and validate calibration standards and procedures. (DOS/I technology requires both frequency domain and spectral calibration procedures using tissue simulating phantoms and reflectance standards, respectively.) 2. Standardize and validate data acquisition, processing and visualization (optimize instrument software-EZDOS; centralize data processing) 3. Monitor, catalog and maintain instrument performance (document performance; modularize maintenance; integrate new technology) 4. Standardize and coordinate trial data entry (from individual sites) into centralized database 5. Monitor, audit and communicate all research procedures (database, teleconferences, training sessions) between participants ensuring "calibration". This manuscript describes our ongoing efforts, successes and challenges implementing these strategies.

  2. A Distributed Processing Approach to Payroll Time Reporting for a Large School District.

    ERIC Educational Resources Information Center

    Freeman, Raoul J.

    1983-01-01

    Describes a system for payroll reporting from geographically disparate locations in which data is entered, edited, and verified locally on minicomputers and then uploaded to a central computer for the standard payroll process. Communications and hardware, time-reporting software, data input techniques, system implementation, and its advantages are…

  3. Structural and process factors affecting the implementation of antimicrobial resistance prevention and control strategies in U.S. hospitals.

    PubMed

    Chou, Ann F; Yano, Elizabeth M; McCoy, Kimberly D; Willis, Deanna R; Doebbeling, Bradley N

    2008-01-01

    To address increases in the incidence of infection with antimicrobial-resistant pathogens, the National Foundation for Infectious Diseases and Centers for Disease Control and Prevention proposed two sets of strategies to (a) optimize antibiotic use and (b) prevent the spread of antimicrobial resistance and control transmission. However, little is known about the implementation of these strategies. Our objective is to explore organizational structural and process factors that facilitate the implementation of National Foundation for Infectious Diseases/Centers for Disease Control and Prevention strategies in U.S. hospitals. We surveyed 448 infection control professionals from a national sample of hospitals. Anchored in the Donabedian model that defines quality in terms of structural and process factors, with the structural domain further informed by a contingency approach, we modeled the degree to which National Foundation for Infectious Diseases and Centers for Disease Control and Prevention strategies were implemented as a function of formalization and standardization of protocols, centralization of decision-making hierarchy, information technology capabilities, culture, communication mechanisms, and interdepartmental coordination, controlling for hospital characteristics. Formalization, standardization, centralization, institutional culture, provider-management communication, and information technology use were associated with optimal antibiotic use and enhanced implementation of strategies that prevent and control antimicrobial resistance spread (all p < .001). However, interdepartmental coordination for patient care was inversely related to antibiotic use, in contrast to antimicrobial resistance spread prevention and control (p < .0001). Formalization and standardization may eliminate staff role conflict, whereas centralized authority may minimize ambiguity. 
Culture and communication likely promote internal trust, whereas information technology use helps integrate and support these organizational processes. These findings suggest concrete strategies for evaluating current capabilities to implement effective practices and foster and sustain a culture of patient safety.

  4. Scene and human face recognition in the central vision of patients with glaucoma

    PubMed Central

    Aptel, Florent; Attye, Arnaud; Guyader, Nathalie; Boucart, Muriel; Chiquet, Christophe; Peyrin, Carole

    2018-01-01

    Primary open-angle glaucoma (POAG) initially affects mainly peripheral vision. Current behavioral studies support the idea that visual defects of patients with POAG extend into parts of the central visual field classified as normal by static automated perimetry analysis. This is particularly true for visual tasks involving processes of a higher level than mere detection. The purpose of this study was to assess visual abilities of POAG patients in central vision. Patients were assigned to two groups following a visual field examination (Humphrey 24–2 SITA-Standard test). Patients with both peripheral and central defects and patients with peripheral but no central defect, as well as age-matched controls, participated in the experiment. All participants had to perform two visual tasks where low-contrast stimuli were presented in the central 6° of the visual field. A categorization task of scene images and human face images assessed high-level visual recognition abilities. In contrast, a detection task using the same stimuli assessed low-level visual function. The difference in performance between detection and categorization revealed the cost of high-level visual processing. Compared to controls, patients with a central visual defect showed a deficit in both detection and categorization of all low-contrast images. This is consistent with the abnormal retinal sensitivity as assessed by perimetry. However, the deficit was greater for categorization than detection. Patients without a central defect showed similar performances to the controls concerning the detection and categorization of faces. However, while the detection of scene images was well-maintained, these patients showed a deficit in their categorization. This suggests that the simple loss of peripheral vision could be detrimental to scene recognition, even when the information is displayed in central vision. 
This study revealed subtle defects in the central visual field of POAG patients that cannot be predicted by static automated perimetry assessment using Humphrey 24–2 SITA-Standard test. PMID:29481572

  5. 12 CFR 234.4 - Standards for central securities depositories and central counterparties.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... it meets or exceeds the following risk-management standards with respect to the payment, clearing... central counterparty's risk-management procedures. (9) The central securities depository or central... plausible market conditions. (b) The Board, by order, may apply heightened risk-management standards to a...

  6. 12 CFR 234.4 - Standards for central securities depositories and central counterparties.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... it meets or exceeds the following risk-management standards with respect to the payment, clearing... central counterparty's risk-management procedures. (9) The central securities depository or central... plausible market conditions. (b) The Board, by order, may apply heightened risk-management standards to a...

  7. The Power of Teacher Selection to Improve Education. Evidence Speaks Reports, Vol 1, #12

    ERIC Educational Resources Information Center

    Jacob, Brian A.

    2016-01-01

    This report describes the findings from a new study of the teacher selection process in Washington, DC public schools. In 2009, the district created a centralized application process to streamline hiring by screening out less desirable candidates. Following the collection of standard information, applicants are asked to complete up to three…

  8. Defense Finance and Accounting Service Work on the Navy Defense Business Operations Fund FY 1995 Financial Statements

    DTIC Science & Technology

    1996-11-22

    consolidation of financial statements, and for an automated process to transfer financial statement data from the Central Data Base to a... consolidation of financial statements. The Deputy Chief Financial Officer also indicated that the DFAS Cleveland Center approved a system change request...ently is developing Standard Operating Procedures to ensure consistency and standardization in the adjustment and consolidation of financial statements.

  9. Morphometric analysis of pulp size in maxillary permanent central incisors correlated with age: An indirect digital study.

    PubMed

    Ravindra, S V; Mamatha, G P; Sunita, J D; Balappanavar, Aswini Y; Sardana, Varun

    2015-01-01

    Teeth are the hardest part of the body and are least affected by the taphonomic process. They are considered one of the reliable methods of identification of a person in forensic sciences. The aim of the following study is to establish morphometric measurements, by AutoCad 2009 (Autodesk, Inc), of permanent maxillary central incisors in different age groups of the Udaipur population. Hospital-based descriptive cross-sectional study carried out in Udaipur. A study was carried out on 308 subjects of both genders with an age range of 9-68 years. Standardized intra-oral radiographs were made by the paralleling technique and processed. The radiographs were scanned and the obtained images were standardized to the actual size of the radiographic film. This was followed by measuring them using the AutoCad 2009 software. F-test, post-hoc test, Pearson's correlation test. For the left maxillary central incisor, the total pulp area was found to be 38.41 ± 12.88 mm and 14.32 ± 7.04 mm, respectively. For the right maxillary central incisor, the total pulp size was 38.39 ± 14.95 mm and 12.35 ± 5 mm, respectively. Males (32.50, 32.87 mm²) had more pulp area when compared with females (28.82, 30.05 mm²). There was a decrease in total pulp area with increasing age, which may be attributed to secondary dentin formation.
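The age-versus-pulp-area trend can be quantified with Pearson's correlation coefficient, one of the tests the study lists. The data below are synthetic stand-ins for the 308 radiographic measurements; the linear decline and noise level are assumptions made purely for illustration, not the study's values.

```python
import random
import statistics

def pearson_r(xs, ys):
    """Plain Pearson product-moment correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

rng = random.Random(0)
# 308 subjects over the study's 9-68 year age range (synthetic)
ages = [rng.uniform(9, 68) for _ in range(308)]
# hypothetical linear decline of pulp area with age, plus measurement noise
areas = [45.0 - 0.3 * age + rng.gauss(0, 4) for age in ages]
r = pearson_r(ages, areas)  # strongly negative: area shrinks with age
```

A negative r of large magnitude on real measurements would support the secondary-dentin interpretation given in the abstract.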

  10. Dual-Task Processing When Task 1 Is Hard and Task 2 Is Easy: Reversed Central Processing Order?

    ERIC Educational Resources Information Center

    Leonhard, Tanja; Fernandez, Susana Ruiz; Ulrich, Rolf; Miller, Jeff

    2011-01-01

    Five psychological refractory period (PRP) experiments were conducted with an especially time-consuming first task (Experiments 1, 3, and 5: mental rotation; Experiments 2 and 4: memory scanning) and with equal emphasis on the first task and on the second (left-right tone judgment). The standard design with varying stimulus onset asynchronies…

  11. Application Transparent HTTP Over a Disruption Tolerant Smartnet

    DTIC Science & Technology

    2014-09-01

    American Standard Code for Information Interchange BP Bundle Protocol BPA bundle protocol agent CLA convergence layer adapters CPU central processing...forwarding them through the plugin pipeline. The initial version of the DTNInput plugin uses the BBN Spindle bundle protocol agent (BPA) implementation

  12. Electrophysiological Evidence for Hyperfocusing of Spatial Attention in Schizophrenia.

    PubMed

    Kreither, Johanna; Lopez-Calderon, Javier; Leonard, Carly J; Robinson, Benjamin M; Ruffle, Abigail; Hahn, Britta; Gold, James M; Luck, Steven J

    2017-04-05

    A recently proposed hyperfocusing hypothesis of cognitive dysfunction in schizophrenia holds that people with schizophrenia (PSZ) tend to concentrate processing resources more narrowly but more intensely than healthy control subjects (HCS). The present study tests a key prediction of this hypothesis, namely, that PSZ will hyperfocus on information presented at the center of gaze. This should lead to greater filtering of peripheral stimuli when the task requires focusing centrally but reduced filtering of central stimuli when the task requires attending broadly in the periphery. These predictions were tested in a double oddball paradigm, in which frequent standard stimuli and rare oddball stimuli were presented at central and peripheral locations while event-related potentials were recorded. Participants were instructed to discriminate between the standard and oddball stimuli at either the central location or at the peripheral locations. PSZ and HCS showed opposite patterns of spatial bias at the level of early sensory processing, as assessed with the P1 component: PSZ exhibited stronger sensory suppression of peripheral stimuli when the task required attending narrowly to the central location, whereas HCS exhibited stronger sensory suppression of central stimuli when the task required attending broadly to the peripheral locations. Moreover, PSZ exhibited a stronger stimulus categorization response than HCS, as assessed with the P3b component, for central stimuli when the task required attending to the peripheral region. These results provide strong evidence of hyperfocusing in PSZ, which may provide a unified mechanistic account of multiple aspects of cognitive dysfunction in schizophrenia. SIGNIFICANCE STATEMENT Schizophrenia clearly involves impaired attention, but attention is complex, and delineating the precise nature of attentional dysfunction in schizophrenia has been difficult. The present study tests a new hyperfocusing hypothesis, which proposes that people with schizophrenia (PSZ) tend to concentrate processing resources more intensely but more narrowly than healthy control subjects (HCS). Using electrophysiological measures of sensory and cognitive processing, we found that PSZ were actually superior to HCS in focusing attention at the point of gaze and filtering out peripheral distractors when the task required a narrow focusing of attention. This finding of superior filtering in PSZ supports the hyperfocusing hypothesis, which may provide the mechanism underlying a broad range of cognitive impairments in schizophrenia. Copyright © 2017 the authors 0270-6474/17/373813-11$15.00/0.

  13. Electrophysiological Evidence for Hyperfocusing of Spatial Attention in Schizophrenia

    PubMed Central

    Kreither, Johanna; Lopez-Calderon, Javier; Leonard, Carly J.; Robinson, Benjamin M.; Ruffle, Abigail; Hahn, Britta; Gold, James M.

    2017-01-01

    A recently proposed hyperfocusing hypothesis of cognitive dysfunction in schizophrenia holds that people with schizophrenia (PSZ) tend to concentrate processing resources more narrowly but more intensely than healthy control subjects (HCS). The present study tests a key prediction of this hypothesis, namely, that PSZ will hyperfocus on information presented at the center of gaze. This should lead to greater filtering of peripheral stimuli when the task requires focusing centrally but reduced filtering of central stimuli when the task requires attending broadly in the periphery. These predictions were tested in a double oddball paradigm, in which frequent standard stimuli and rare oddball stimuli were presented at central and peripheral locations while event-related potentials were recorded. Participants were instructed to discriminate between the standard and oddball stimuli at either the central location or at the peripheral locations. PSZ and HCS showed opposite patterns of spatial bias at the level of early sensory processing, as assessed with the P1 component: PSZ exhibited stronger sensory suppression of peripheral stimuli when the task required attending narrowly to the central location, whereas HCS exhibited stronger sensory suppression of central stimuli when the task required attending broadly to the peripheral locations. Moreover, PSZ exhibited a stronger stimulus categorization response than HCS, as assessed with the P3b component, for central stimuli when the task required attending to the peripheral region. These results provide strong evidence of hyperfocusing in PSZ, which may provide a unified mechanistic account of multiple aspects of cognitive dysfunction in schizophrenia. SIGNIFICANCE STATEMENT Schizophrenia clearly involves impaired attention, but attention is complex, and delineating the precise nature of attentional dysfunction in schizophrenia has been difficult. The present study tests a new hyperfocusing hypothesis, which proposes that people with schizophrenia (PSZ) tend to concentrate processing resources more intensely but more narrowly than healthy control subjects (HCS). Using electrophysiological measures of sensory and cognitive processing, we found that PSZ were actually superior to HCS in focusing attention at the point of gaze and filtering out peripheral distractors when the task required a narrow focusing of attention. This finding of superior filtering in PSZ supports the hyperfocusing hypothesis, which may provide the mechanism underlying a broad range of cognitive impairments in schizophrenia. PMID:28283557

  14. 49 CFR 71.6 - Central zone.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Office of the Secretary of Transportation STANDARD TIME ZONE BOUNDARIES § 71.6 Central zone. The third zone, the central standard time zone, includes that part of the United States that is west of the boundary line between the eastern and central standard time zones described in § 71.5 and east of the...

  15. Educational Organization, School Localization and the Process of Urbanization in Sweden.

    ERIC Educational Resources Information Center

    Andrae, Annika

    Traditionally, Sweden's educational system has been highly centralized; physical characteristics, administrative factors, and teacher qualifications have been generally standardized, as have curricula, though local implementation has been afforded considerable freedom. In 1971 the upper secondary school (9-12) consolidated three previously…

  16. [Effect of angiotensin II depot administration on bioelectric functional processes of the central nervous system].

    PubMed

    Martin, G; Baumann, H; Grieger, F

    1976-01-01

    Using the average evoked potential technique, angiotensin-II depot effects (1 mg implantate = 3-4 mg/kg body weight angiotensin-II) were studied neuroelectrophysiologically in reticular, hippocampal and neocortical structures of albino rats. A multivariate variance and discriminant analysis program revealed differentiated changes in the bioelectrical processing data of the CNS. Evidence was obtained for a varying structural sensitivity of central-nervous substructures under depot administration of angiotensin-II. In later phases of angiotensin-II action, the hippocampus was characterized by an electrographic synchronization phenomenon with high-amplitude average evoked potentials. The reticular formation, and to a lesser extent the visual cortex, showed an angiotensin-induced diminution of bioelectrical excitation. However, the intensity of the change in functional CNS patterns did not always correlate with maximal blood pressure rises. The described changes in the processing of afference to standardized sensory stimuli, especially in hippocampal and reticular structures of the CNS following angiotensin depot action, point to a central-nervous action mechanism of angiotensin-II.

  17. Technical Challenges and Opportunities of Centralizing Space Science Mission Operations (SSMO) at NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Ido, Haisam; Burns, Rich

    2015-01-01

    The NASA Goddard Space Science Mission Operations project (SSMO) is performing a technical cost-benefit analysis for centralizing and consolidating operations of a diverse set of missions into a unified and integrated technical infrastructure. The presentation will focus on the notion of normalizing spacecraft operations processes, workflows, and tools. It will also show the processes of creating a standardized open architecture, with common security models and implementations, interfaces, services, automations, notifications, alerts, logging, publish/subscribe, and middleware capabilities. The presentation will also discuss how to leverage traditional capabilities along with virtualization, cloud computing services, control groups and containers, and possibly Big Data concepts.

  18. Morphometric analysis of pulp size in maxillary permanent central incisors correlated with age: An indirect digital study

    PubMed Central

    Ravindra, S. V.; Mamatha, G. P.; Sunita, J. D.; Balappanavar, Aswini Y.; Sardana, Varun

    2015-01-01

    Context: Teeth are the hardest part of the body and the least affected by taphonomic processes; they are considered one of the reliable means of identifying a person in forensic science. Aim: The aim of the following study is to establish morphometric measurements, using AutoCAD 2009 (Autodesk, Inc.), of permanent maxillary central incisors in different age groups of the Udaipur population. Setting and Design: Hospital-based descriptive cross-sectional study carried out in Udaipur. Materials and Methods: A study was carried out on 308 subjects of both genders with an age range of 9-68 years. Standardized intra-oral radiographs were made by the paralleling technique and processed. The radiographs were scanned and the obtained images were standardized to the actual size of the radiographic film. This was followed by measuring them using AutoCAD 2009. Statistical Analysis Used: F-test, post-hoc test, Pearson's correlation test. Results: For the left maxillary central incisor, the total pulp area was found to be 38.41 ± 12.88 mm2 and 14.32 ± 7.04 mm2, respectively. For the right maxillary central incisor, the total pulp size was 38.39 ± 14.95 mm2 and 12.35 ± 5 mm2, respectively. Males (32.50, 32.87 mm2) had a larger pulp area than females (28.82, 30.05 mm2). Conclusion: There was a decrease in total pulp area with increasing age, which may be attributed to secondary dentin formation. PMID:26816461
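The film-size standardization step described in Materials and Methods amounts to a pixel-to-millimetre conversion before areas are measured; a minimal sketch of that conversion (the image and film dimensions below are invented for illustration, and the study itself performed the measurements in AutoCAD):

```python
def pixel_area_to_mm2(area_px, image_width_px, film_width_mm):
    """Convert a pixel area measured on a scanned radiograph to mm^2,
    after the image has been standardized to the physical film width.

    Assumes square pixels and a uniform scan scale (illustrative only).
    """
    mm_per_px = film_width_mm / image_width_px
    return area_px * mm_per_px ** 2

# Hypothetical numbers: a 620-px-wide scan of a 31-mm-wide intraoral film.
area_mm2 = pixel_area_to_mm2(12000, 620, 31.0)
```

Since each pixel here maps to 0.05 mm, the 12000-pixel region works out to about 30 mm², in the same range as the pulp areas reported above.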

  19. Propulsion Test Handbook: MSFC and SSC. Draft 01

    NASA Technical Reports Server (NTRS)

    Hammond, John M.

    2010-01-01

    This Handbook was prepared to provide propulsion test personnel with a central source of fundamental reference material. The testing process comprises three parts: pre-test activities, testing, and post-test activities. It involves a collaborative effort from the mechanical, electrical, safety, and environmental disciplines in the test environment. These processes will vary with test requirements; however, this Handbook covers the basic procedures and standards that are shared across Centers.

  20. ESO science data product standard for 1D spectral products

    NASA Astrophysics Data System (ADS)

    Micol, Alberto; Arnaboldi, Magda; Delmotte, Nausicaa A. R.; Mascetti, Laura; Retzlaff, Joerg

    2016-07-01

    The ESO Phase 3 process allows the upload, validation, storage, and publication of reduced data through the ESO Science Archive Facility. Since its introduction, 2 million data products have been archived and published; 80% of them are one-dimensional extracted and calibrated spectra. Central to Phase 3 is the ESO science data product standard, which defines the metadata and data format of any product. This contribution describes the ESO data standard for 1D spectra, its adoption by the reduction pipelines of selected instrument modes for in-house generation of reduced spectra, and the enhanced legacy value of the archive. Archive usage statistics are provided.

  1. 76 FR 18581 - Correction; Central Valley Project Improvement Act, Standard Criteria for Agricultural and Urban...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-04

    ... DEPARTMENT OF THE INTERIOR Bureau of Reclamation Correction; Central Valley Project Improvement Act, Standard Criteria for Agricultural and Urban Water Management Plans AGENCY: Bureau of Reclamation... notice in the Federal Register at 76 FR 16818 on the Central Valley Project Improvement Act Standard...

  2. Advanced Map For Real-Time Process Control

    NASA Astrophysics Data System (ADS)

    Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto

    1987-10-01

    MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of data exchanged with various devices for distributed control are important in the case of real-time process control with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point and request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicast communication voluntarily and periodically in the priority order of the data to be exchanged.
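The transmission control sketched in the abstract — voluntary, periodic multicasting in the priority order of the data — can be illustrated with a toy scheduler. This is not the paper's link-layer mechanism, only a sketch of the scheduling idea; the periods, priorities, and data names are invented:

```python
import heapq

def schedule(items, horizon):
    """Toy priority-ordered periodic scheduler.

    `items` is a list of (priority, period, name) tuples, lower
    priority number = more urgent. Returns the (time, name)
    transmission events up to `horizon`; when several items are due
    at the same tick, the higher-priority one is emitted first.
    """
    heap = []  # entries: (next_time, priority, name, period)
    for priority, period, name in items:
        heapq.heappush(heap, (0, priority, name, period))
    events = []
    while heap and heap[0][0] <= horizon:
        t, prio, name, period = heapq.heappop(heap)
        events.append((t, name))
        heapq.heappush(heap, (t + period, prio, name, period))
    return events

# Hypothetical process data: fast high-priority alarms, slow trend data.
events = schedule([(0, 10, "alarm"), (1, 50, "trend")], horizon=60)
```

The heap key (next_time, priority, …) is what encodes "periodic, but in priority order": time decides when an item becomes due, and priority breaks ties among simultaneously due items.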

  3. A BPMN solution for chaining OGC services to quality assure location-based crowdsourced data

    NASA Astrophysics Data System (ADS)

    Meek, Sam; Jackson, Mike; Leibovici, Didier G.

    2016-02-01

    The Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard enables access to a centralized repository of processes and services from compliant clients. A crucial part of the standard is the provision to chain disparate processes and services to form a reusable workflow. To date this has been realized by methods such as embedding XML requests, using Business Process Execution Language (BPEL) engines, and other external orchestration engines. Although these allow the user to define tasks and data artifacts as web services, they are often considered inflexible and complicated, frequently because of vendor-specific solutions and inaccessible documentation. This paper introduces a new method of flexible service chaining using the Business Process Model and Notation (BPMN) standard. A prototype system has been developed upon an existing open source BPMN suite to illustrate the advantages of the approach. The motivation for the software design is qualification of crowdsourced data for use in policy-making. The software is tested as part of a project that seeks to qualify, assure, and add value to crowdsourced data in a biological monitoring use case.
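The "embedding XML requests" approach the abstract contrasts with can be made concrete: a WPS client invokes one step of a chain by posting an Execute document to the service. The sketch below builds a minimal WPS 1.0.0 Execute request body with the standard library; the process identifier and input name are hypothetical (a real client discovers them via GetCapabilities/DescribeProcess):

```python
import xml.etree.ElementTree as ET

WPS = "http://www.opengis.net/wps/1.0.0"
OWS = "http://www.opengis.net/ows/1.1"
ET.register_namespace("wps", WPS)
ET.register_namespace("ows", OWS)

def execute_request(process_id, inputs):
    """Build a minimal WPS 1.0.0 Execute request with literal inputs.

    Only the skeleton is shown: real requests also carry response-form
    and output-format elements depending on the process description.
    """
    root = ET.Element("{%s}Execute" % WPS,
                      {"service": "WPS", "version": "1.0.0"})
    ET.SubElement(root, "{%s}Identifier" % OWS).text = process_id
    data_inputs = ET.SubElement(root, "{%s}DataInputs" % WPS)
    for name, value in inputs.items():
        inp = ET.SubElement(data_inputs, "{%s}Input" % WPS)
        ET.SubElement(inp, "{%s}Identifier" % OWS).text = name
        data = ET.SubElement(inp, "{%s}Data" % WPS)
        ET.SubElement(data, "{%s}LiteralData" % WPS).text = str(value)
    return ET.tostring(root, encoding="unicode")

# Hypothetical quality-assurance process and parameter name.
body = execute_request("qa.PositionalAccuracyCheck", {"tolerance_m": 50})
```

Chaining via BPMN, as the paper proposes, moves the wiring of such requests out of hand-written XML and into a graphical, reusable workflow model.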

  4. Improving Adherence to Mediterranean-Style Diet With a Community Culinary Coaching Program: Methodology Development and Process Evaluation.

    PubMed

    Polak, Rani; Pober, David; Morris, Avigail; Arieli, Rakefet; Moore, Margaret; Berry, Elliot; Ziv, Mati

    The Community Culinary Coaching Program is a community-based participatory program aimed at improving communal settlement residents' nutrition. The residents, central kitchens, preschools, and communal dining rooms were identified as areas for intervention. Evaluation included goal accomplishment, assessed through food purchases by the central kitchens, and residents' feedback through focus groups. Purchasing included more vegetables (mean (standard error) percent change: +7% (4); P = .32), fish (+115% (11); P < .001), and whole grains and legumes (+77% (9); P < .001), and less soup powder (-40% (9); P < .05), processed beef (-55% (8); P < .001), and margarine (-100% (4); P < .001). Residents recommended continuing the program beyond the project duration. This model might be useful in organizations with communal dining facilities.

  5. 49 CFR 71.7 - Boundary line between central and mountain zones.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false Boundary line between central and mountain zones. 71.7 Section 71.7 Transportation Office of the Secretary of Transportation STANDARD TIME ZONE... mountain standard time zone, except Murdo, S. Dak., which is in the central standard time zone. [Amdt. 71...

  6. Isoform-specific functions of Mud/NuMA mediate binucleation of Drosophila male accessory gland cells.

    PubMed

    Taniguchi, Kiichiro; Kokuryo, Akihiko; Imano, Takao; Minami, Ryunosuke; Nakagoshi, Hideki; Adachi-Yamada, Takashi

    2014-12-20

    In standard cell division, the cells undergo karyokinesis and then cytokinesis. Some cells, however, such as cardiomyocytes and hepatocytes, can produce binucleate cells by going through mitosis without cytokinesis. This cytokinesis skipping is thought to be due to the inhibition of cytokinesis machinery such as the central spindle or the contractile ring, but the mechanisms regulating it are unclear. We investigated them by characterizing the binucleation event during development of the Drosophila male accessory gland, in which all cells are binucleate. The accessory gland cells arrested the cell cycle at 50 hours after puparium formation (APF) and in the middle of the pupal stage stopped proliferating for 5 hours. They then restarted the cell cycle and at 55 hours APF entered the M-phase synchronously. At this stage, accessory gland cells binucleated by mitosis without cytokinesis. Binucleating cells displayed the standard karyokinesis progression but also showed unusual features such as a non-round shape, spindle orientation along the apico-basal axis, and poor assembly of the central spindle. Mud, a Drosophila homolog of NuMA, regulated the processes responsible for these three features; the classical isoform Mud(PBD) and the two newly characterized isoforms Mud(L) and Mud(S) regulated them differently: Mud(L) repressed cell rounding, Mud(PBD) and Mud(S) oriented the spindle along the apico-basal axis, and Mud(S) and Mud(L) repressed central spindle assembly. Importantly, overexpression of Mud(S) induced binucleation even in standard proliferating cells such as those in imaginal discs. We characterized the binucleation in the Drosophila male accessory gland and examined the mechanisms that regulate the unusual morphologies of binucleating cells. We demonstrated that Mud, a microtubule-binding protein regulating spindle orientation, was involved in this binucleation. We suggest that atypical functions exerted by three structurally different isoforms of Mud regulate cell rounding, spindle orientation and central spindle assembly in binucleation. We also propose that Mud(S) is a key regulator triggering cytokinesis skipping in binucleation processes.

  7. Who Needs Lewis Structures to Get VSEPR Geometries?

    ERIC Educational Resources Information Center

    Lindmark, Alan F.

    2010-01-01

    Teaching the VSEPR (valence shell electron-pair repulsion) model can be a tedious process. Traditionally, Lewis structures are drawn and the number of "electron clouds" (groups) around the central atom is counted and related to the standard VSEPR table of possible geometries. A simpler method to deduce the VSEPR structure without first drawing…

  8. Using a Guided Inquiry Approach in the Traditional Vertebrate Anatomy Laboratory

    ERIC Educational Resources Information Center

    Meuler, Debra

    2008-01-01

    A central theme of the "National Science Education Standards" is teaching science as an inquiry process, allowing students to explore an authentic problem using the tools and skills of the discipline. Research indicates that more active participation by the student, which usually requires higher-order thinking skills, results in deeper learning.…

  9. Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Blais, P. D.; Davis, J. R., Jr.

    1977-01-01

    Thirty-five (35) furnace runs were carried out during this quarter, of which 25 produced a total of 120 web crystals. The two main thermal models for the dendritic growth process were completed and are being used to assist the design of the thermal geometry of the web growth apparatus. The first model, a finite element representation of the susceptor and crucible, was refined to give greater precision and resolution in the critical central region of the melt. The second thermal model, which describes the dissipation of the latent heat to generate thickness-velocity data, was completed. Dendritic web samples were fabricated into solar cells using a standard configuration and a standard process for an N(+)-P-P(+) configuration. The detailed engineering design was completed for a new dendritic web growth facility of greater width capability than previous facilities.

  10. Implementation of the Clinical Encounters Tracking system at the Indiana University School of Medicine.

    PubMed

    Hatfield, Amy J; Bangert, Michael P

    2005-01-01

    The Indiana University School of Medicine (IUSM) Office of Medical Education & Student Services directed the IUSM Educational Technology Unit to develop a Clinical Encounters Tracking system in response to the Liaison Committee on Medical Education's (LCME) updated accreditation standards. A personal digital assistant (PDA) and centralized database server solution was implemented. Third-year medical students are required to carry a PDA on which they record clinical encounter experiences during all clerkship clinical rotations. Clinical encounters data collected on the PDAs are routinely uploaded to the central server via the PDA HotSyncing process. Real-time clinical encounter summary reports are accessed in the school's online curriculum management system: ANGEL. The resulting IUSM Clinical Encounters Tracking program addresses the LCME accreditation standard which mandates the tracking of medical students' required clinical curriculum experiences.

  11. Comparison of the Effect of Aliskiren Versus Negative Controls on Aortic Stiffness in Patients With Marfan Syndrome Under Treatment With Atenolol.

    PubMed

    Hwang, Ji-Won; Kim, Eun Kyoung; Jang, Shin Yi; Chung, Tae-Young; Ki, Chang-Seok; Sung, Kiick; Kim, Sung Mok; Ahn, Joonghyun; Carriere, Keumhee; Choe, Yeon Hyeon; Chang, Sung-A; Kim, Duk-Kyung

    2017-11-29

    The aim of this study was to evaluate the effect of aliskiren on aortic stiffness in patients with Marfan syndrome (MS). Twenty-eight MS patients (mean age ± standard deviation: 32.6 ± 10.6 years) were recruited from November 2009 to October 2014. All patients were receiving atenolol as standard beta-blocker therapy. A prospective randomization process was performed to assign participants to either aliskiren treatment (150-300 mg orally per day) or no aliskiren treatment (negative control) in an open-label design. Central aortic distensibility and central pulse wave velocity (PWV) by magnetic resonance imaging (MRI), peripheral PWV, central aortic blood pressure and augmentation index by peripheral tonometry, and aortic dilatation by echocardiography were examined initially and after 24 weeks. The primary endpoint was central aortic distensibility by MRI. In analyses of differences between baseline and 24 weeks for the aliskiren treatment group vs the negative control group, central distensibility (overall; P = .26) and central PWV (0.2 ± 0.9 vs 0.03 ± 0.7 [m/s]; P = .79) by MRI were not significantly different. Central systolic aortic blood pressure tended to be lower by 14 mm Hg in patients in the aliskiren treatment group than in the control group (P = .09). A significant decrease in peripheral PWV (brachial-ankle PWV) in the aliskiren treatment group (-1.6 m/s) compared with the control group (+0.28 m/s) was noted (P = .005). Among patients with MS, the addition of aliskiren to beta-blocker treatment did not significantly improve central aortic stiffness during a 24-week period. Copyright © 2017 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.
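The two stiffness endpoints above follow standard definitions: pulse wave velocity is the path length divided by the pulse transit time, and aortic distensibility is the relative change in lumen area per unit pulse pressure. A minimal sketch of both formulas (the numeric inputs are invented; the study's exact MRI post-processing may differ):

```python
def pulse_wave_velocity(path_length_m, transit_time_s):
    """PWV (m/s): distance the pulse wave travels divided by transit time."""
    return path_length_m / transit_time_s

def aortic_distensibility(area_max_mm2, area_min_mm2, pulse_pressure_mmhg):
    """Distensibility (1/mmHg): relative lumen-area change per unit
    pulse pressure, the standard cross-sectional definition."""
    return (area_max_mm2 - area_min_mm2) / (area_min_mm2 * pulse_pressure_mmhg)

# Invented example values: 0.5 m aortic segment, 0.1 s transit time.
pwv = pulse_wave_velocity(0.5, 0.1)  # 5.0 m/s, a plausible central PWV
```

A stiffer vessel shows a higher PWV and a lower distensibility, which is why the study tracks both in opposite directions.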

  12. Atypical central auditory speech-sound discrimination in children who stutter as indexed by the mismatch negativity.

    PubMed

    Jansson-Verkasalo, Eira; Eggers, Kurt; Järvenpää, Anu; Suominen, Kalervo; Van den Bergh, Bea; De Nil, Luc; Kujala, Teija

    2014-09-01

    Recent theoretical conceptualizations suggest that disfluencies in stuttering may arise from several factors, one of them being atypical auditory processing. The main purpose of the present study was to investigate whether speech-sound encoding and central auditory discrimination are affected in children who stutter (CWS). Participants were 10 CWS and 12 typically developing children with fluent speech (TDC). Event-related potentials (ERPs) for syllables and syllable changes [consonant, vowel, vowel-duration, frequency (F0), and intensity changes], critical in speech perception and the language development of CWS, were compared to those of TDC. There were no significant group differences in the amplitudes or latencies of the P1 or N2 responses elicited by the standard stimuli. However, the mismatch negativity (MMN) amplitude was significantly smaller in CWS than in TDC. For TDC, all deviants of the linguistic multifeature paradigm elicited significant MMN amplitudes, comparable with the results found earlier with the same paradigm in 6-year-old children. In contrast, only the duration change elicited a significant MMN in CWS. The results showed that central auditory speech-sound processing was typical at the level of sound encoding in CWS. In contrast, central speech-sound discrimination, as indexed by the MMN for multiple sound features (both phonetic and prosodic), was atypical in the group of CWS. Findings were linked to existing conceptualizations of stuttering etiology. The reader will be able (a) to describe recent findings on central auditory speech-sound processing in individuals who stutter, (b) to describe the measurement of auditory reception and central auditory speech-sound discrimination, and (c) to describe the findings of central auditory speech-sound discrimination, as indexed by the mismatch negativity (MMN), in children who stutter. Copyright © 2014 Elsevier Inc. All rights reserved.
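The MMN amplitude compared above is conventionally obtained as a difference wave: the averaged deviant-stimulus ERP minus the averaged standard-stimulus ERP, with mean amplitude taken over a time window. A minimal sketch of that computation (toy two-sample waveforms; real analyses average many epochs per condition and pick the window from the grand average):

```python
def mismatch_negativity(deviant_erp, standard_erp):
    """MMN difference wave: deviant-minus-standard ERP, sample by sample."""
    return [d - s for d, s in zip(deviant_erp, standard_erp)]

def mean_amplitude(wave, start, end):
    """Mean amplitude (e.g. in microvolts) over samples [start, end)."""
    segment = wave[start:end]
    return sum(segment) / len(segment)

# Toy waveforms: the deviant response dips below the standard response.
mmn = mismatch_negativity([1.0, -2.0], [1.0, 1.0])
amp = mean_amplitude(mmn, 0, 2)
```

A "smaller MMN" in the abstract means this mean amplitude is less negative, i.e. closer to zero.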

  13. "Heidelberg standard examination" and "Heidelberg standard procedures" - Development of faculty-wide standards for physical examination techniques and clinical procedures in undergraduate medical education.

    PubMed

    Nikendei, C; Ganschow, P; Groener, J B; Huwendiek, S; Köchel, A; Köhl-Hackert, N; Pjontek, R; Rodrian, J; Scheibe, F; Stadler, A-K; Steiner, T; Stiepak, J; Tabatabai, J; Utz, A; Kadmon, M

    2016-01-01

    The competent physical examination of patients and the safe and professional implementation of clinical procedures constitute essential components of medical practice in nearly all areas of medicine. The central objective of the projects "Heidelberg standard examination" and "Heidelberg standard procedures", which were initiated by students, was to establish uniform interdisciplinary standards for physical examination and clinical procedures, and to distribute them in coordination with all clinical disciplines at the Heidelberg University Hospital. The presented project report illuminates the background of the initiative and its methodological implementation. Moreover, it describes the multimedia documentation in the form of pocketbooks and a multimedia internet-based platform, as well as the integration into the curriculum. The project presentation aims to provide orientation and action guidelines to facilitate similar processes in other faculties.

  14. Preparing Laboratory and Real-World EEG Data for Large-Scale Analysis: A Containerized Approach

    PubMed Central

    Bigdely-Shamlo, Nima; Makeig, Scott; Robbins, Kay A.

    2016-01-01

    Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain–computer interface models. However, the absence of standardized vocabularies for annotating events in a machine-understandable manner, the welter of collection-specific data organizations, the difficulty of moving data across processing platforms, and the unavailability of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a “containerized” approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining, and (meta-)analysis. The EEG Study Schema (ESS) comprises three data “Levels,” each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data using a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. ESS schema and tools are freely available at www.eegstudy.org, and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org). PMID:27014048
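The idea of a self-describing, shippable study — an XML document listing the recordings a tool can iterate over — can be sketched with the standard library. The element names below are purely illustrative and are not the actual ESS schema (which is published at www.eegstudy.org); the sketch only shows the pattern of building and consuming such a study description:

```python
import xml.etree.ElementTree as ET

def build_study_description(title, level, recordings):
    """Build a toy ESS-style study description (hypothetical element names).

    `recordings` is a list of (filename, task) pairs; `level` mirrors
    the idea of ESS data Levels (1 = raw, 2 = PREP-preprocessed).
    """
    root = ET.Element("studyLevel%d" % level)
    ET.SubElement(root, "title").text = title
    recs = ET.SubElement(root, "recordings")
    for filename, task in recordings:
        rec = ET.SubElement(recs, "recording")
        ET.SubElement(rec, "filename").text = filename
        ET.SubElement(rec, "task").text = task
    return ET.tostring(root, encoding="unicode")

xml_doc = build_study_description(
    "Example oddball study", 1,
    [("sub01_run1.set", "oddball"), ("sub02_run1.set", "oddball")])

# A consumer tool parses the description instead of querying a database.
parsed = ET.fromstring(xml_doc)
n_recordings = len(parsed.find("recordings"))
```

Because everything a pipeline needs travels inside the container, no central database is required, which is the property the abstract emphasizes.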

  15. Local, regional and national interoperability in hospital-level systems architecture.

    PubMed

    Mykkänen, J; Korpela, M; Ripatti, S; Rannanheimo, J; Sorri, J

    2007-01-01

    Interoperability of applications in health care must address the varied needs of patients, health professionals, organizations, and policy makers. A combination of existing and new applications is a necessity. Hospitals are in a position to drive many integration solutions, but they need approaches that combine local, regional and national requirements and initiatives with open standards to support flexible processes and applications at the local hospital level. We discuss the systems architecture of hospitals in relation to various processes and applications, and highlight current challenges and prospects using a service-oriented architecture approach. We also illustrate these aspects with examples from Finnish hospitals. A set of main services and elements of service-oriented architectures for health care facilities are identified, with a medium-term focus that acknowledges existing systems as a core part of service-oriented solutions. The services and elements are grouped according to functional and interoperability cohesion. A transition towards service-oriented architecture in health care must acknowledge existing health information systems and promote the specification of central processes and software services locally and across organizations. Software industry best practices such as SOA must be combined with health care knowledge to respond to central challenges such as continuous change in health care. A service-oriented approach cannot entirely rely on common standards and frameworks but must be locally adapted and complemented.

  16. The National and University Library in Zagreb: The Goal Is Known--How Can It Be Attained?

    ERIC Educational Resources Information Center

    Miletic-Vejzovic, Laila

    1994-01-01

    Provides an overview of the state of libraries and their resources in Croatia. Highlights include destruction of libraries resulting from the war; the need for centralization, uniformity, and standards; the role of the National and University Library; processing library materials; and the development of an automated system and network. (Contains…

  17. 7 CFR 457.147 - Central and Southern potato crop insurance provisions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Processing; for potatoes produced for seed, the United States Standards for Grades of Seed Potatoes; and for.... Definitions Certified seed. Potatoes that were entered into the potato certified seed program and that meet all requirements for production to be used to produce a seed crop for the next crop year or a potato...

  18. 7 CFR 457.147 - Central and Southern potato crop insurance provisions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Processing; for potatoes produced for seed, the United States Standards for Grades of Seed Potatoes; and for.... Definitions Certified seed. Potatoes that were entered into the potato certified seed program and that meet all requirements for production to be used to produce a seed crop for the next crop year or a potato...

  19. Visual Processing in Adolescents with Autism Spectrum Disorder: Evidence from Embedded Figures and Configural Superiority Tests

    ERIC Educational Resources Information Center

    Dillen, Claudia; Steyaert, Jean; Op de Beeck, Hans P.; Boets, Bart

    2015-01-01

    The embedded figures test has often been used to reveal weak central coherence in individuals with autism spectrum disorder (ASD). Here, we administered a more standardized automated version of the embedded figures test in combination with the configural superiority task, to investigate the effect of contextual modulation on local feature…

  20. Towards a Personal Best: A Case for Introducing Ipsative Assessment in Higher Education

    ERIC Educational Resources Information Center

    Hughes, Gwyneth

    2011-01-01

    The central role that assessment plays is recognised in higher education, in particular how formative feedback guides learning. A model for effective feedback practice is used to argue that, in current schemes, formative feedback is often not usable because it is strongly linked to external criteria and standards, rather than to the processes of…

  1. Industrial Processes to Reduce Generation of Hazardous Waste at DoD Facilities. Phase III Report. Appendix C. Workshop Manual Centralized Vehicle Wash Racks and Scheduled Maintenance Facilities, Fort Lewis, Washington.

    DTIC Science & Technology

    1985-12-01

    Lobster Shop - 759-2165. 4013 Ruston Way. Known for excellent seafood. Nautical. The Bay Co. - 752-6661. 3327 Ruston Way. Various entrees. CI Shenanigans ... a serious potential health problem ... "RCRA permit is inappropriate." According to Rogers and Darrah ... forms of financial responsibility ... administrative, monitoring, and financial standards for them. EPA will use these independently enforceable standards to issue permits to owners

  2. CDISC SHARE, a Global, Cloud-based Resource of Machine-Readable CDISC Standards for Clinical and Translational Research

    PubMed Central

    Hume, Samuel; Chow, Anthony; Evans, Julie; Malfait, Frederik; Chason, Julie; Wold, J. Darcy; Kubick, Wayne; Becnel, Lauren B.

    2018-01-01

    The Clinical Data Interchange Standards Consortium (CDISC) is a global non-profit standards development organization that creates consensus-based standards for clinical and translational research. Several of these standards are now required by regulators for electronic submissions of regulated clinical trials’ data and by government funding agencies. These standards are free and open, available for download on the CDISC Website as PDFs. While these documents are human readable, they are not amenable to ready use by electronic systems. CDISC launched the CDISC Shared Health And Research Electronic library (SHARE) to provide the standards metadata in machine-readable formats to facilitate the automated management and implementation of the standards. This paper describes how CDISC SHARE’s standards can facilitate collecting, aggregating and analyzing standardized data from early design to end analysis, and its role as a central resource providing information systems with metadata that drives process automation, including study setup and data pipelining. PMID:29888049
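The idea of metadata-driven study setup can be sketched as follows; the JSON shape shown is a simplified stand-in, not the actual CDISC SHARE payload format (the variable names are standard SDTM identifiers, used here only as examples).

```python
import json

# Hypothetical machine-readable standard metadata; real CDISC SHARE payloads
# differ, but the principle is the same: metadata, not PDFs, drives setup.
share_metadata = json.loads("""
{
  "dataset": "VS",
  "variables": [
    {"name": "USUBJID", "label": "Unique Subject Identifier", "type": "text"},
    {"name": "VSTESTCD", "label": "Vital Signs Test Short Name", "type": "text"},
    {"name": "VSORRES", "label": "Result or Finding in Original Units", "type": "text"}
  ]
}
""")

def build_empty_dataset(meta: dict) -> dict:
    """Derive a column skeleton for data capture from standards metadata."""
    return {var["name"]: [] for var in meta["variables"]}

columns = build_empty_dataset(share_metadata)
print(sorted(columns))  # ['USUBJID', 'VSORRES', 'VSTESTCD']
```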

  3. Failure to Rescue, Rescue Surgery and Centralization of Postoperative Complications: A Challenge for General and Acute Care Surgeons.

    PubMed

    Zago, Mauro; Bozzo, Samantha; Carrara, Giulia; Mariani, Diego

    2017-01-01

    To explore the current literature on the failure to rescue and rescue surgery concepts, to identify the key items for decreasing the failure to rescue rate and improving outcomes, and to verify whether there is a rationale for centralization of patients suffering postoperative complications. There is growing awareness of the need to assess and measure the failure to rescue rate on an institutional, regional and national basis. Many factors affect failure to rescue, and all should be individually analyzed and considered. Rescue surgery is one of these factors. Rescue surgery assumes an acute care surgery background. Measurement of the failure to rescue rate should become a standard for quality improvement programs. Implementation of all the clinical and organizational items involved is the key to better outcomes. Preparedness for rescue surgery is a main pillar in this process. Centralization of management, audit, and communication are as important as patient centralization.

  4. Centralized Data Management in a Multicountry, Multisite Population-based Study.

    PubMed

    Rahman, Qazi Sadeq-ur; Islam, Mohammad Shahidul; Hossain, Belal; Hossain, Tanvir; Connor, Nicholas E; Jaman, Md Jahiduj; Rahman, Md Mahmudur; Ahmed, A S M Nawshad Uddin; Ahmed, Imran; Ali, Murtaza; Moin, Syed Mamun Ibne; Mullany, Luke; Saha, Samir K; El Arifeen, Shams

    2016-05-01

    A centralized data management system was developed for data collection and processing for the Aetiology of Neonatal Infection in South Asia (ANISA) study. ANISA is a longitudinal cohort study involving neonatal infection surveillance and etiology detection in multiple sites in South Asia. The primary goal of designing such a system was to collect and store data from different sites in a standardized way to pool the data for analysis. We designed the data management system centrally and implemented it to enable data entry at individual sites. This system uses validation rules and audit checks that reduce errors. The study sites employ a dual data entry method to minimize keystroke errors. They upload collected data weekly to a central server via the internet to create a pooled central database. Any inconsistent data identified in the central database are flagged and corrected after discussion with the relevant site. The ANISA Data Coordination Centre in Dhaka provides technical support for operating, maintaining and updating the data management system centrally. Password-protected login identifications and audit trails are maintained for the management system to ensure the integrity and safety of stored data. Centralized management of the ANISA database allows the use of common data capture forms (DCFs), adapted to site-specific contextual requirements. DCFs and data entry interfaces allow on-site data entry. This reduces the workload, as DCFs do not need to be shipped to a single location for entry. It also improves data quality, as all collected data from ANISA go through the same quality check and cleaning process.
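The dual data entry step described above can be sketched as a simple reconciliation pass: the same form is keyed twice and any mismatching fields are flagged for review. Field names here are illustrative, not ANISA's actual data capture form layout.

```python
# Flag fields that differ between two independent keyings of the same form.
def flag_mismatches(entry1: dict, entry2: dict) -> list:
    return sorted(k for k in entry1 if entry1[k] != entry2.get(k))

first_pass  = {"subject_id": "A-1024", "weight_g": 3100, "temp_c": 37.2}
second_pass = {"subject_id": "A-1024", "weight_g": 3010, "temp_c": 37.2}

print(flag_mismatches(first_pass, second_pass))  # ['weight_g']
```

Flagged fields would then be resolved against the paper form, the same way inconsistencies in the pooled database are resolved with the relevant site.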

  5. Managing public health in the Army through a standard community health promotion council model.

    PubMed

    Courie, Anna F; Rivera, Moira Shaw; Pompey, Allison

    2014-01-01

    Public health processes in the US Army remain uncoordinated due to competing lines of command, funding streams, and multiple subject matter experts in overlapping public health concerns. The US Army Public Health Command (USAPHC) has identified a standard model for community health promotion councils (CHPCs) as an effective framework for synchronizing and integrating these overlapping systems to ensure a coordinated approach to managing the public health process. The purpose of this study is to test a foundational assumption of the CHPC effectiveness theory: the 3 features of a standard CHPC model (a CHPC chaired by a strong leader, i.e., the senior commander; a full-time health promotion team dedicated to the process; and centralized management through the USAPHC) will lead to high-quality health promotion councils capable of providing a coordinated approach to addressing public health on Army installations. The study employed 2 evaluation questions: (1) Do CHPCs with centralized management through the USAPHC, alignment with the senior commander, and a health promotion operations team adhere more closely to the evidence-based CHPC program framework than CHPCs without these 3 features? (2) Do members of standard CHPCs report that participation in the CHPC leads to a well-coordinated approach to public health at the installation? The results revealed that both time (F(5,76)=25.02, P<.0001) and the 3 critical features of the standard CHPC model (F(1,76)=28.40, P<.0001) independently predicted program adherence. Evaluation evidence supports the USAPHC's approach to CHPC implementation as part of public health management on Army installations. Preliminary evidence suggests that the standard CHPC model may lead to a more coordinated approach to public health and may assure that CHPCs follow an evidence-informed design. This is consistent with past research demonstrating that community coalitions and public health systems that have strong leadership; dedicated staff time and expertise; influence over policy, governance and oversight; and formalized rules and regulations function more effectively than those without. It also demonstrates the feasibility of implementing an evidence-informed approach to community coalitions in an Army environment.

  6. From IHE Audit Trails to XES Event Logs Facilitating Process Mining.

    PubMed

    Paster, Ferdinand; Helm, Emmanuel

    2015-01-01

    Recently, business intelligence approaches such as process mining have been applied to the healthcare domain. The goal of process mining is to gain process knowledge, check compliance and identify room for improvement by investigating recorded event data. Previous approaches focused on process discovery using event data from various specific systems. IHE, as a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The following approach presents how audit trails collected by means of ATNA can be transformed to enable process mining. Using the standardized audit trails provides the ability to apply these methods to all IHE-based information systems.
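The ATNA-to-XES mapping idea can be sketched with the standard library: one audit record becomes one XES event inside a trace. The audit record fields shown are simplified stand-ins for the full audit message schema, while `concept:name`, `org:resource` and `time:timestamp` are standard XES extension attribute keys.

```python
import xml.etree.ElementTree as ET

# Map one simplified ATNA-style audit record onto one XES <event> element.
def audit_to_xes_event(audit: dict) -> ET.Element:
    event = ET.Element("event")
    ET.SubElement(event, "string", key="concept:name", value=audit["event_type"])
    ET.SubElement(event, "string", key="org:resource", value=audit["user_id"])
    ET.SubElement(event, "date", key="time:timestamp", value=audit["timestamp"])
    return event

log = ET.Element("log")              # XES root element
trace = ET.SubElement(log, "trace")  # one trace per patient/case
trace.append(audit_to_xes_event({
    "event_type": "Patient Record Read",
    "user_id": "nurse-07",
    "timestamp": "2015-01-01T08:30:00+01:00",
}))
print(ET.tostring(log, encoding="unicode"))
```

A real transformation would also have to group audit messages into traces (e.g., by patient identifier), which is where most of the effort in such an approach lies.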

  7. “Heidelberg standard examination” and “Heidelberg standard procedures” – Development of faculty-wide standards for physical examination techniques and clinical procedures in undergraduate medical education

    PubMed Central

    Nikendei, C.; Ganschow, P.; Groener, J. B.; Huwendiek, S.; Köchel, A.; Köhl-Hackert, N.; Pjontek, R.; Rodrian, J.; Scheibe, F.; Stadler, A.-K.; Steiner, T.; Stiepak, J.; Tabatabai, J.; Utz, A.; Kadmon, M.

    2016-01-01

    The competent physical examination of patients and the safe and professional implementation of clinical procedures constitute essential components of medical practice in nearly all areas of medicine. The central objective of the projects “Heidelberg standard examination” and “Heidelberg standard procedures”, which were initiated by students, was to establish uniform interdisciplinary standards for physical examination and clinical procedures, and to distribute them in coordination with all clinical disciplines at the Heidelberg University Hospital. The presented project report illuminates the background of the initiative and its methodological implementation. Moreover, it describes the multimedia documentation in the form of pocketbooks and a multimedia internet-based platform, as well as the integration into the curriculum. The project presentation aims to provide orientation and action guidelines to facilitate similar processes in other faculties. PMID:27579354

  8. Rapid cycle development of a multifactorial intervention achieved sustained reductions in central line-associated bloodstream infections in haematology oncology units at a children's hospital: a time series analysis.

    PubMed

    Dandoy, Christopher E; Hausfeld, Jackie; Flesch, Laura; Hawkins, Deanna; Demmel, Kathy; Best, Deanna; Osterkamp, Erin; Bracke, Tracey; Nagarajan, Rajaram; Jodele, Sonata; Holt, Julie; Giaccone, Mary Jo; Davies, Stella M; Kotagal, Uma; Simmons, Jeffrey

    2016-08-01

    Immunocompromised children are at high risk for central line-associated bloodstream infections (CLABSIs) and their associated morbidity and mortality. Prevention of CLABSIs depends on highly reliable care. Since the summer of 2013, we saw an increase in patient volume and acuity in our centre, and CLABSI rates more than tripled during this period. The purpose of this initiative was to rapidly identify and mitigate potential underlying drivers of the increased CLABSI rate. Through small tests of change, we implemented a standard process for daily hygiene; increased awareness of high-risk patients with CLABSI; improved education and assistance for nurses performing high-risk central venous catheter procedures; and developed a system to improve allocation of resources to de-escalate system stress. The CLABSI rate from June 2013 to May 2014 was 2.03 CLABSIs/1000 line days. After implementation of our interventions, we saw a significant decrease in the CLABSI rate to 0.39 CLABSIs/1000 line days (p=0.008). Key processes have become more reliable: 100% of dressing changes are completed with the new two-person standard; daily hygiene adherence has increased from 25% to 70%; 100% of nurses are approached daily by senior nursing for assistance with high-risk procedures; and patients at risk for a CLABSI are identified daily. Stress to a complex system caring for high-risk patients can challenge CLABSI rates. Identifying key processes and executing them reliably can stabilise outcomes during times of system stress. Published by the BMJ Publishing Group Limited.
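The rates quoted above are expressed per 1,000 central-line days; a one-line sketch of that arithmetic. The infection and line-day counts below are illustrative only, chosen to reproduce the reported 2.03 figure, since the article reports rates rather than raw counts.

```python
# CLABSI rate per 1,000 central-line days:
# rate = (number of infections / line days) * 1000
def clabsi_rate(infections: int, line_days: int) -> float:
    return infections / line_days * 1000

print(round(clabsi_rate(13, 6400), 2))  # 2.03
```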

  9. 24 CFR 3280.511 - Comfort cooling certificate and information.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Refrigeration Institute Standards The central air conditioning system provided with this home has been sized... and Refrigeration Institute Standards. The central air conditioning system provided with this home has... the appropriate Air Conditioning and Refrigeration Institute Standards. When the air circulators of...

  10. Archive of Digital Boomer Seismic Reflection Data Collected During USGS Field Activity 08LCA04 in Lakes Cherry, Helen, Hiawassee, Louisa, and Prevatt, Central Florida, September 2008

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Davis, Jeffrey B.; Flocks, James G.; Wiese, Dana S.

    2009-01-01

    From September 2 through 4, 2008, the U.S. Geological Survey and St. Johns River Water Management District (SJRWMD) conducted geophysical surveys in Lakes Cherry, Helen, Hiawassee, Louisa, and Prevatt, central Florida. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, FACS logs, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  11. The integration of the risk management process with the lifecycle of medical device software.

    PubMed

    Pecoraro, F; Luzi, D

    2014-01-01

    The application of software in the Medical Device (MD) domain has become central to the improvement of diagnoses and treatments. The new European regulations that specifically address software as an important component of MDs require complex procedures to make software compliant with safety requirements, thereby introducing new challenges in the qualification and classification of MD software as well as in the performance of risk management activities. Under this perspective, the aim of this paper is to propose an integrated framework that combines the activities to be carried out by the manufacturer to develop safe software within the development lifecycle, based on the regulatory requirements reported in US and European regulations as well as in the relevant standards and guidelines. A comparative analysis was carried out to identify the main issues related to the application of the current new regulations. In addition, standards and guidelines recently released to harmonise procedures for the validation of MD software have been used to define the risk management activities to be carried out by the manufacturer during the software development process. This paper highlights the main issues related to the qualification and classification of MD software, providing an analysis of the different regulations applied in Europe and the US. A model that integrates the risk management process within the software development lifecycle is also proposed. It is based on regulatory requirements and considers software risk analysis as a central input to be managed by the manufacturer from the initial stages of software design, in order to prevent MD failures. Relevant changes in the process of MD development have been introduced with the recognition of software as an important component of MDs, as stated in regulations and standards. This implies the performance of highly iterative processes that integrate risk management into the framework of software development. It also makes it necessary to involve both medical and software engineering competences to safeguard patient and user safety.

  12. Digital surfaces and hydrogeologic data for the Floridan aquifer system in Florida and in parts of Georgia, Alabama, and South Carolina

    USGS Publications Warehouse

    Bellino, Jason C.

    2011-01-01

    A digital dataset for the Floridan aquifer system in Florida and in parts of Georgia, Alabama, and South Carolina was developed from selected reports published as part of the Regional Aquifer-System Analysis (RASA) Program of the U.S. Geological Survey (USGS) in the 1980s. These reports contain maps and data depicting the extent and elevation of both time-stratigraphic and hydrogeologic units of which the aquifer system is composed, as well as data on hydrology, meteorology, and aquifer properties. The three primary reports used for this dataset compilation were USGS Professional Paper 1403-B (Miller, 1986), Professional Paper 1403-C (Bush and Johnston, 1988), and USGS Open-File Report 88-86 (Miller, 1988). Paper maps from Professional Papers 1403-B and 1403-C were scanned and georeferenced to the North American Datum of 1927 (NAD27) using the Lambert Conformal Conic projection (standard parallels 33 and 45 degrees, central longitude -96 degrees, central latitude 39 degrees). Once georeferenced, tracing of pertinent line features contained in each image (for example, contours and faults) was facilitated by specialized software using algorithms that automated much of the process. Resulting digital line features were then processed using standard geographic information system (GIS) software to remove artifacts from the digitization process and to verify and update attribute tables. The digitization process for polygonal features (for example, outcrop areas and unit extents) was completed by hand using GIS software.
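The georeferencing parameters quoted in the report correspond directly to a PROJ-style definition of the Lambert Conformal Conic projection on NAD27. This is a sketch of that correspondence only; tools such as pyproj accept strings of this form, but no projection library is invoked here.

```python
# Map the report's stated parameters onto PROJ keywords.
params = {
    "proj": "lcc",     # Lambert Conformal Conic
    "lat_1": 33,       # first standard parallel (degrees)
    "lat_2": 45,       # second standard parallel (degrees)
    "lat_0": 39,       # central latitude (degrees)
    "lon_0": -96,      # central longitude (degrees)
    "datum": "NAD27",  # North American Datum of 1927
}
proj_string = " ".join(f"+{k}={v}" for k, v in params.items())
print(proj_string)
# +proj=lcc +lat_1=33 +lat_2=45 +lat_0=39 +lon_0=-96 +datum=NAD27
```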

  13. Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report.

    PubMed

    Levitt, Heidi M; Bamberg, Michael; Creswell, John W; Frost, David M; Josselson, Ruthellen; Suárez-Orozco, Carola

    2018-01-01

    The American Psychological Association Publications and Communications Board Working Group on Journal Article Reporting Standards for Qualitative Research (JARS-Qual Working Group) was charged with examining the state of journal article reporting standards as they applied to qualitative research and with generating recommendations for standards that would be appropriate for a wide range of methods within the discipline of psychology. These standards describe what should be included in a research report to enable and facilitate the review process. This publication marks a historic moment: the first inclusion of qualitative research in APA Style, which is the basis of both the Publication Manual of the American Psychological Association (APA, 2010) and APA Style CENTRAL, an online program to support APA Style. In addition to the general JARS-Qual guidelines, the Working Group has developed standards for both qualitative meta-analysis and mixed methods research. The reporting standards were developed for psychological qualitative research but may hold utility for a broad range of social sciences. They honor a range of qualitative traditions, methods, and reporting styles. The Working Group was composed of researchers with backgrounds in varying methods, research topics, and approaches to inquiry. In this article, they present these standards and their rationale, and they detail the ways that the standards differ from the quantitative research reporting standards. They describe how the standards can be used by authors in the process of writing qualitative research for submission as well as by reviewers and editors in the process of reviewing research.

  14. When does word frequency influence written production?

    PubMed

    Baus, Cristina; Strijkers, Kristof; Costa, Albert

    2013-01-01

    The aim of the present study was to explore the central (e.g., lexical processing) and peripheral processes (motor preparation and execution) underlying word production during typewriting. To do so, we tested non-professional typists in a picture typing task while continuously recording EEG. Participants were instructed to write (by means of a standard keyboard) the corresponding name for a given picture. The lexical frequency of the words was manipulated: half of the picture names were of high frequency while the remaining were of low frequency. Different measures were obtained: (1) first keystroke latency and (2) keystroke latency of the subsequent letters and duration of the word. Moreover, ERPs locked to the onset of the picture presentation were analyzed to explore the temporal course of word frequency in typewriting. The results showed an effect of word frequency for the first keystroke latency but not for the duration of the word or the speed with which letters were typed (interstroke intervals). The electrophysiological results showed the expected ERP frequency effect at posterior sites: amplitudes for low-frequency words were more positive than those for high-frequency words. However, relative to previous evidence in the spoken modality, the frequency effect appeared in a later time-window. These results demonstrate two marked differences in the processing dynamics underpinning typing compared to speaking: First, central processing dynamics between speaking and typing differ already in the manner that words are accessed; second, central processing differences in typing, unlike speaking, do not cascade to peripheral processes involved in response execution.
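The behavioural measures described (first keystroke latency, interstroke intervals, word duration) can be sketched from a list of keystroke timestamps; the numbers below are illustrative, not the study's data.

```python
# Derive the three typing measures from picture onset and keystroke times (ms).
def keystroke_measures(picture_onset: int, key_times: list):
    first_latency = key_times[0] - picture_onset          # central measure
    interstroke = [b - a for a, b in zip(key_times, key_times[1:])]  # peripheral
    word_duration = key_times[-1] - key_times[0]          # peripheral
    return first_latency, interstroke, word_duration

lat, isi, dur = keystroke_measures(0, [850, 1020, 1180, 1360])
print(lat, isi, dur)  # 850 [170, 160, 180] 510
```

The study's finding is that lexical frequency shifts the first measure but not the latter two, which is why the effect is attributed to central rather than peripheral processing.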

  15. When does word frequency influence written production?

    PubMed Central

    Baus, Cristina; Strijkers, Kristof; Costa, Albert

    2013-01-01

    The aim of the present study was to explore the central (e.g., lexical processing) and peripheral processes (motor preparation and execution) underlying word production during typewriting. To do so, we tested non-professional typists in a picture typing task while continuously recording EEG. Participants were instructed to write (by means of a standard keyboard) the corresponding name for a given picture. The lexical frequency of the words was manipulated: half of the picture names were of high frequency while the remaining were of low frequency. Different measures were obtained: (1) first keystroke latency and (2) keystroke latency of the subsequent letters and duration of the word. Moreover, ERPs locked to the onset of the picture presentation were analyzed to explore the temporal course of word frequency in typewriting. The results showed an effect of word frequency for the first keystroke latency but not for the duration of the word or the speed with which letters were typed (interstroke intervals). The electrophysiological results showed the expected ERP frequency effect at posterior sites: amplitudes for low-frequency words were more positive than those for high-frequency words. However, relative to previous evidence in the spoken modality, the frequency effect appeared in a later time-window. These results demonstrate two marked differences in the processing dynamics underpinning typing compared to speaking: First, central processing dynamics between speaking and typing differ already in the manner that words are accessed; second, central processing differences in typing, unlike speaking, do not cascade to peripheral processes involved in response execution. PMID:24399980

  16. Quantitative analysis of residual protein contamination of podiatry instruments reprocessed through local and central decontamination units

    PubMed Central

    2011-01-01

    Background The cleaning stage of the instrument decontamination process has come under increased scrutiny due to the increasing complexity of surgical instruments and the adverse effects of residual protein contamination. Instruments used in the podiatry field have a complex surface topography and are exposed to a wide range of biological contamination. Currently, podiatry instruments are reprocessed locally within surgeries, while national strategies favour a move toward reprocessing in central facilities. The aim of this study was to determine the efficacy of local and central reprocessing of podiatry instruments by measuring residual protein contamination of instruments reprocessed by both methods. Methods The residual protein of 189 instruments reprocessed centrally and 189 instruments reprocessed locally was determined using a fluorescent assay based on the reaction of proteins with o-phthaldialdehyde/sodium 2-mercaptoethanesulfonate. Results Residual protein was detected on 72% (n = 136) of instruments reprocessed centrally and 90% (n = 170) of instruments reprocessed locally. Significantly less protein (p < 0.001) was recovered from instruments reprocessed centrally (median 20.62 μg, range 0 - 5705 μg) than locally (median 111.9 μg, range 0 - 6344 μg). Conclusions Overall, the results show the superiority of central reprocessing for complex podiatry instruments when protein contamination is considered, though no significant difference in residual protein was found between local and central decontamination unit processes for Black's files. Further research is needed to undertake qualitative identification of protein contamination to identify any cross-contamination risks, and a standard for acceptable residual protein contamination applicable to different instruments and specialities should be considered as a matter of urgency. PMID:21219613

  17. Quantitative analysis of residual protein contamination of podiatry instruments reprocessed through local and central decontamination units.

    PubMed

    Smith, Gordon Wg; Goldie, Frank; Long, Steven; Lappin, David F; Ramage, Gordon; Smith, Andrew J

    2011-01-10

    The cleaning stage of the instrument decontamination process has come under increased scrutiny due to the increasing complexity of surgical instruments and the adverse effects of residual protein contamination. Instruments used in the podiatry field have a complex surface topography and are exposed to a wide range of biological contamination. Currently, podiatry instruments are reprocessed locally within surgeries, while national strategies favour a move toward reprocessing in central facilities. The aim of this study was to determine the efficacy of local and central reprocessing of podiatry instruments by measuring residual protein contamination of instruments reprocessed by both methods. The residual protein of 189 instruments reprocessed centrally and 189 instruments reprocessed locally was determined using a fluorescent assay based on the reaction of proteins with o-phthaldialdehyde/sodium 2-mercaptoethanesulfonate. Residual protein was detected on 72% (n = 136) of instruments reprocessed centrally and 90% (n = 170) of instruments reprocessed locally. Significantly less protein (p < 0.001) was recovered from instruments reprocessed centrally (median 20.62 μg, range 0 - 5705 μg) than locally (median 111.9 μg, range 0 - 6344 μg). Overall, the results show the superiority of central reprocessing for complex podiatry instruments when protein contamination is considered, though no significant difference in residual protein was found between local and central decontamination unit processes for Black's files. Further research is needed to undertake qualitative identification of protein contamination to identify any cross-contamination risks, and a standard for acceptable residual protein contamination applicable to different instruments and specialities should be considered as a matter of urgency.

  18. Assessment of quality and geochemical processes occurring in groundwaters near central air conditioning plant site in Trombay, Maharashtra, India.

    PubMed

    Tirumalesh, K; Shivanna, K; Sriraman, A K; Tyagi, A K

    2010-04-01

    This paper summarizes the findings of a monitoring study to understand the sources and processes affecting the quality of shallow and deep groundwater near the central air conditioning plant site in the Trombay region, making use of physicochemical and biological analyses. All the measured parameters indicate that the groundwater quality is good and within the permissible limits set by the Indian Bureau of Standards (1990). Shallow groundwater is dominantly of Na-HCO(3) type, whereas deep groundwater is of Ca-Mg-HCO(3) type. The groundwater chemistry is mainly influenced by dissolution of minerals and base exchange processes. High total dissolved solids in shallow groundwater compared with deeper groundwater indicate faster circulation of groundwater in the deep zone, preferably through fissures and fractures, whereas groundwater flow is sluggish in the shallow zone. The characteristic ionic ratio values and the absence of bromide point to the fact that seawater has no influence on the groundwater system.
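Water-type labels like Na-HCO3 and Ca-Mg-HCO3 reflect which ions dominate the cation and anion budgets. Here is a hedged sketch of that kind of classification, with illustrative concentrations and an arbitrary 25% dominance threshold; the study itself does not state its classification rule (full hydrochemical typing usually relies on Piper diagrams).

```python
# Label a water sample by its dominant cations and anions (concentrations
# in meq/L; the 25% dominance threshold is an illustrative choice).
def water_type(cations: dict, anions: dict, threshold: float = 0.25) -> str:
    def dominant(ions: dict) -> list:
        total = sum(ions.values())
        return [name for name, c in
                sorted(ions.items(), key=lambda kv: -kv[1])
                if c / total >= threshold]
    return "-".join(dominant(cations) + dominant(anions))

# Illustrative shallow-groundwater sample: sodium and bicarbonate dominate.
shallow = water_type({"Na": 4.0, "Ca": 1.0, "Mg": 0.8}, {"HCO3": 4.5, "Cl": 1.0})
print(shallow)  # Na-HCO3
```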

  19. The Vanderbilt Professional Nursing Practice Program, part 3: managing an advancement process.

    PubMed

    Steaban, Robin; Fudge, Mitzie; Leutgens, Wendy; Wells, Nancy

    2003-11-01

    Consistency of performance standards across multiple clinical settings is an essential component of a credible advancement system. Our advancement process incorporates a central committee, composed of nurses from all clinical settings within the institution, to ensure consistency of performance in inpatient, outpatient, and procedural settings. An analysis of nurses advanced during the first 18 months of the program indicates that performance standards are applicable to nurses in all clinical settings. The first article (September 2003) in this 3-part series described the foundation for and the philosophical background of the Vanderbilt Professional Nursing Practice Program (VPNPP), the career advancement program underway at Vanderbilt University Medical Center. Part 2 described the development of the evaluation tools used in the VPNPP, the implementation and management of this new system, program evaluation, and improvements since the program's inception. The purpose of this article is to review the advancement process, review the roles of those involved in the process, and to describe outcomes and lessons learned.

  20. Biomarker Discovery in Gulf War Veterans: Development of a War Illness Diagnostic Panel

    DTIC Science & Technology

    2016-12-01

    with GWI reflect a persistent disruption in central nervous system (CNS) proinflammatory and neuroendocrine parameters. These processes can...differences in these systems are more subtle than the frank “abnormalities” identified with standard diagnostic tests (e.g., measures indicating...the coagulation system in Gulf War Illness: a potential pathophysiologic link with chronic fatigue syndrome. A laboratory approach to diagnosis. Blood

  1. Oil Pharmacy at the Thermal Protection System Facility

    NASA Image and Video Library

    2017-08-08

    An overall view of the Oil Pharmacy operated under the Test and Operations Support Contract, or TOSC. The facility consolidated storage and distribution of petroleum products used in equipment maintained under the contract. This included standardized naming and testing processes, and it provided a central location for distribution of oils used in everything from simple machinery to the crawler-transporter and the cranes in the Vehicle Assembly Building.

  2. Army Logistician. Volume 39, Issue 2, March-April 2007

    DTIC Science & Technology

    2007-04-01

    Most Army Reduces Tactical Supply System Footprint by Thomas H. Ament, Jr. Centralizing all of the Army’s Corps/Theater Automated Data Processing...Middleware, which comprises both hardware and software, revises data in the Standard Army Retail Supply System (SARSS), thereby extending the use of the...Logistics: Supply Based or Distribution Based? The Changing Face of Fuel Management Combat Logistics Patrol Methodology Distribution-Based

  3. Scale structure: Processing Minimum Standard and Maximum Standard Scalar Adjectives

    PubMed Central

    Frazier, Lyn; Clifton, Charles; Stolterfoht, Britta

    2008-01-01

    Gradable adjectives denote a function that takes an object and returns a measure of the degree to which the object possesses some gradable property (Kennedy, 1999). Scales, ordered sets of degrees, have begun to be studied systematically in semantics (Kennedy, to appear, Kennedy & McNally, 2005, Rotstein & Winter, 2004). We report four experiments designed to investigate the processing of absolute adjectives with a maximum standard (e.g., clean) and their minimum standard antonyms (dirty). The central hypothesis is that the denotation of an absolute adjective introduces a ‘standard value’ on a scale as part of the normal comprehension of a sentence containing the adjective (the “Obligatory Scale” hypothesis). In line with the predictions of Kennedy and McNally (2005) and Rotstein and Winter (2004), maximum standard adjectives and minimum standard adjectives systematically differ from each other when they are combined with minimizing modifiers like slightly, as indicated by speeded acceptability judgments. An eye movement recording study shows that, as predicted by the Obligatory Scale hypothesis, the penalty due to combining slightly with a maximum standard adjective can be observed during the processing of the sentence; the penalty is not the result of some after-the-fact inferencing mechanism. Further, a type of ‘quantificational variability effect’ may be observed when a quantificational adverb (mostly) is combined with a minimum standard adjective in sentences like The dishes are mostly dirty, which may receive either a degree interpretation (e.g. 80% dirty) or a quantity interpretation (e.g., 80% of the dishes are dirty). The quantificational variability results provide suggestive support for the Obligatory Scale hypothesis by showing that the standard of a scalar adjective influences the preferred interpretation of other constituents in the sentence. PMID:17376422

  4. DAMT - DISTRIBUTED APPLICATION MONITOR TOOL (HP9000 VERSION)

    NASA Technical Reports Server (NTRS)

    Keith, B.

    1994-01-01

    Typical network monitors measure status of host computers and data traffic among hosts. A monitor to collect statistics about individual processes must be unobtrusive and possess the ability to locate and monitor processes, locate and monitor circuits between processes, and report traffic back to the user through a single application program interface (API). DAMT, Distributed Application Monitor Tool, is a distributed application program that will collect network statistics and make them available to the user. This distributed application has one component (i.e., process) on each host the user wishes to monitor as well as a set of components at a centralized location. DAMT provides the first known implementation of a network monitor at the application layer of abstraction. Potential users only need to know the process names of the distributed application they wish to monitor. The tool locates the processes and the circuit between them, and reports any traffic between them at a user-defined rate. The tool operates without the cooperation of the processes it monitors. Application processes require no changes to be monitored by this tool. Neither does DAMT require the UNIX kernel to be recompiled. The tool obtains process and circuit information by accessing the operating system's existing process database. This database contains all information available about currently executing processes. Expanding the information monitored by the tool can be done by utilizing more information from the process database. Traffic on a circuit between processes is monitored by a low-level LAN analyzer that has access to the raw network data. The tool also provides features such as dynamic event reporting and virtual path routing. A reusable object approach was used in the design of DAMT. The tool has four main components: the Virtual Path Switcher, the Central Monitor Complex, the Remote Monitor, and the LAN Analyzer.
All of DAMT's components are independent, asynchronously executing processes. The independent processes communicate with each other via UNIX sockets through a Virtual Path router, or Switcher. The Switcher maintains a routing table showing the host of each component process of the tool, eliminating the need for each process to do so. The Central Monitor Complex provides the single application program interface (API) to the user and coordinates the activities of DAMT. The Central Monitor Complex is itself divided into independent objects that perform its functions. The component objects are the Central Monitor, the Process Locator, the Circuit Locator, and the Traffic Reporter. Each of these objects is an independent, asynchronously executing process. User requests to the tool are interpreted by the Central Monitor. The Process Locator identifies whether a named process is running on a monitored host and which host that is. The circuit between any two processes in the distributed application is identified using the Circuit Locator. The Traffic Reporter handles communication with the LAN Analyzer and accumulates traffic updates until it must send a traffic report to the user. The Remote Monitor process is replicated on each monitored host. It serves the Central Monitor Complex processes with application process information. The Remote Monitor process provides access to operating systems information about currently executing processes. It allows the Process Locator to find processes and the Circuit Locator to identify circuits between processes. It also provides lifetime information about currently monitored processes. The LAN Analyzer consists of two processes. Low-level monitoring is handled by the Sniffer. The Sniffer analyzes the raw data on a single, physical LAN. It responds to commands from the Analyzer process, which maintains the interface to the Traffic Reporter and keeps track of which circuits to monitor. 
DAMT is written in C-language for HP-9000 series computers running HP-UX and Sun 3 and 4 series computers running SunOS. DAMT requires 1Mb of disk space and 4Mb of RAM for execution. This package requires MIT's X Window System, Version 11 Revision 4, with OSF/Motif 1.1. The HP-9000 version (GSC-13589) includes sample HP-9000/375 and HP-9000/730 executables which were compiled under HP-UX, and the Sun version (GSC-13559) includes sample Sun3 and Sun4 executables compiled under SunOS. The standard distribution medium for the HP version of DAMT is a .25 inch HP pre-formatted streaming magnetic tape cartridge in UNIX tar format. It is also available on a 4mm magnetic tape in UNIX tar format. The standard distribution medium for the Sun version of DAMT is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. DAMT was developed in 1992.
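The Process Locator's job, resolving a process name to the monitored host running it, can be sketched as follows. This is a minimal illustration of the idea only, with invented host names and process tables, not DAMT's actual implementation:

```python
# Minimal sketch of DAMT's Process Locator concept: each Remote Monitor
# exposes its host's process table, and the locator searches the tables
# by process name.  (Hypothetical data and function name, not DAMT code.)

def locate_process(process_name, process_tables):
    """Return the first host whose process table contains process_name."""
    for host, processes in process_tables.items():
        if process_name in processes:
            return host
    return None  # process not found on any monitored host

# Invented example: two monitored hosts and their running processes.
tables = {
    "hp9000-a": ["init", "sendmail", "app_server"],
    "sun4-b":   ["init", "app_client"],
}

print(locate_process("app_client", tables))  # -> sun4-b
```

In the real tool this lookup would be serviced by the Remote Monitor on each host, which reads the operating system's process database on behalf of the Central Monitor Complex.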

  5. Adapting the CUAHSI Hydrologic Information System to OGC standards

    NASA Astrophysics Data System (ADS)

    Valentine, D. W.; Whitenack, T.; Zaslavsky, I.

    2010-12-01

    The CUAHSI Hydrologic Information System (HIS) provides web and desktop client access to hydrologic observations via water data web services using an XML schema called “WaterML”. The WaterML 1.x specification and the corresponding Water Data Services have been the backbone of the HIS service-oriented architecture (SOA) and have been adopted for serving hydrologic data by several federal agencies and many academic groups. The central discovery service, HIS Central, is based on a metadata catalog that references 4.7 billion observations, organized as 23 million data series from 1.5 million sites from 51 organizations. Observations data are published using HydroServer nodes that have been deployed at 18 organizations. Usage of HIS increased eightfold from 2008 to 2010, doubling from 1600 data series a day in 2009 to 3600 data series a day in the first half of 2010. The HIS Central metadata catalog currently harvests information from 56 Water Data Services. We collaborate on catalog updates with two federal partners, USGS and US EPA: their data series are periodically reloaded into the HIS metadata catalog. We are pursuing two main development directions in the HIS project: cloud-based computing, and further compliance with Open Geospatial Consortium (OGC) standards. The goal of moving to cloud computing is to provide a scalable collaborative system with simpler deployment and less dependence on hardware maintenance and staff. This move requires re-architecting the information models underlying the metadata catalog and the Water Data Services to be independent of the underlying relational database model, allowing for implementation on both relational databases and cloud-based processing systems. Cloud-based HIS Central resources can be managed collaboratively; partners share responsibility for their metadata by publishing data series information into the centralized catalog. 
Publishing data series will use REST-based service interfaces, like OData, as the basis for ingesting data series information into a cloud-hosted catalog. The future HIS services involve providing information via OGC standards that will allow for observational data access from commercial GIS applications. Use of standards will allow tools to access observational data from other projects using standards, such as the Ocean Observatories Initiative, and tools from such projects to be integrated into the HIS toolset. With international collaborators, we have been developing a water information exchange language called “WaterML 2.0” which will be used to deliver observations data over OGC Sensor Observation Services (SOS). A software stack of OGC standard services will provide access to HIS information. In addition to SOS, Web Mapping and Feature Services (WMS and WFS) will provide access to location information. Catalog Services for the Web (CSW) will provide a catalog for water information that is both centralized and distributed. We intend the OGC standards to supplement the existing HIS service interfaces, rather than replace them. The ultimate goal of this development is to expand access to hydrologic observations data and to create an environment where these data can be seamlessly integrated with standards-compliant data resources.
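A REST/OData-style catalog query of the kind described can be sketched as follows. The endpoint path, parameter names, and codes below are illustrative assumptions for the sketch, not the actual CUAHSI service interface:

```python
# Sketch of querying data series from a cloud-hosted catalog through an
# OData-style REST interface.  The /DataSeries path and the SiteCode /
# VariableCode field names are invented for illustration.
from urllib.parse import urlencode

def series_query_url(base_url, site_code, variable_code, top=100):
    """Build a hypothetical OData-style $filter query for data series."""
    filt = f"SiteCode eq '{site_code}' and VariableCode eq '{variable_code}'"
    params = urlencode({"$filter": filt, "$top": top})
    return f"{base_url}/DataSeries?{params}"

url = series_query_url("https://example.org/catalog",
                       "USGS:01646500", "NWIS:00060")
print(url)
```

The appeal of an OData-like interface here is that `$filter`/`$top` style query options are database-agnostic, which matches the stated goal of decoupling the catalog from any one relational backend.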

  6. Topographic mapping of electroencephalography coherence in hypnagogic state.

    PubMed

    Tanaka, H; Hayashi, M; Hori, T

    1998-04-01

    The present study examined the topographic characteristics of hypnagogic electroencephalography (EEG), using topographic mapping of EEG power and coherence corresponding to nine EEG stages (Hori's hypnagogic EEG stages). EEG stages 1 and 2, EEG stages 3-8, and EEG stage 9 correspond to standard sleep stages W, 1, and 2, respectively. The dominant topographic components of delta and theta activities increased clearly from the vertex sharp-wave stage (EEG stages 6 and 7) in the anterior-central areas. The dominant topographic component of alpha 3 activities increased clearly from EEG stage 9 in the anterior-central areas. The dominant topographic component of sigma activities increased clearly from EEG stage 8 in the central-parietal area. These results suggest that the basic sleep process might start before the onset of sleep stage 2 or of manually scored spindles.

  7. Bringing central line-associated bloodstream infection prevention home: CLABSI definitions and prevention policies in home health care agencies.

    PubMed

    Rinke, Michael L; Bundy, David G; Milstone, Aaron M; Deuber, Kristin; Chen, Allen R; Colantuoni, Elizabeth; Miller, Marlene R

    2013-08-01

    A study was conducted to investigate home health care agency central line-associated bloodstream infection (CLABSI) definitions and prevention policies and compare them to the Joint Commission National Patient Safety Goal (NPSG.07.04.01), the Centers for Disease Control and Prevention (CDC) CLABSI prevention recommendations, and a best-practice central line care bundle for inpatients. A telephone-based survey was conducted in 2011 of a convenience sample of home health care agencies associated with children's hematology/oncology centers. Of the 97 eligible home health care agencies, 57 (59%) completed the survey. No agency reported using all five aspects of the National Healthcare Safety Network/Association for Professionals in Infection Control and Epidemiology CLABSI definition and adjudication process, and of the 50 agencies that reported tracking CLABSI rates, 20 (40%) reported using none. Only 10 agencies (18%) had policies consistent with all elements of the inpatient-focused NPSG.07.04.01, 10 agencies (18%) were consistent with all elements of the home care targeted CDC CLABSI prevention recommendations, and no agencies were consistent with all elements of the central line care bundle. Only 14 agencies (25%) knew their overall CLABSI rate: mean 0.40 CLABSIs per 1,000 central line days (95% confidence interval [CI], 0.18 to 0.61). Six agencies (11%) knew their agency's pediatric CLABSI rate: mean 0.54 CLABSIs per 1,000 central line days (95% CI, 0.06 to 1.01). The policies of a national sample of home health care agencies varied significantly from national inpatient and home health care agency targeted standards for CLABSI definitions and prevention. Future research should assess strategies for standardizing home health care practices consistent with evidence-based recommendations.
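The rates quoted above use the standard normalization of infections per 1,000 central line days, which a one-line calculation makes explicit (the counts below are illustrative, not from the study):

```python
def clabsi_rate(infections, central_line_days):
    """CLABSI rate expressed per 1,000 central line days."""
    return infections / central_line_days * 1000

# Illustrative numbers: 2 CLABSIs observed over 5,000 central line days.
print(clabsi_rate(2, 5000))  # -> 0.4
```

Normalizing by line days rather than by patient count is what makes rates comparable across agencies with very different caseloads.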

  8. Domain Modeling and Application Development of an Archetype- and XML-based EHRS. Practical Experiences and Lessons Learnt.

    PubMed

    Kropf, Stefan; Chalopin, Claire; Lindner, Dirk; Denecke, Kerstin

    2017-06-28

    Access to patient data within or between hospitals is still problematic, since a variety of information systems are in use, applying different vendor-specific terminologies and underlying knowledge models. Moreover, the development of electronic health record systems (EHRSs) is time- and resource-consuming. Thus, there is a substantial need for a development strategy for standardized EHRSs. We apply a reuse-oriented process model and demonstrate its feasibility and realization on a practical medical use case: an EHRS holding all relevant data arising in the context of treatment of tumors of the sella region. In this paper, we describe the development process and our practical experiences. Requirements for the development of the EHRS were collected through interviews with a neurosurgeon and analysis of patient data. For modelling patient data, we selected openEHR as the standard and exploited the software tools provided by the openEHR foundation. The patient information model forms the core of the development process, which comprises EHR generation and the implementation of an EHRS architecture. Moreover, a reuse-oriented process model from the business domain was adapted to the development of the EHRS. The reuse-oriented process model suitably abstracts both the modeling and the development of an EHR-centered EHRS. The information modeling process resulted in 18 archetypes that were aggregated in a template and formed the basis of the model-driven development. The EHRs and the EHRS were developed using openEHR and W3C standards, tightly supported by well-established XML techniques. The GUI of the final EHRS integrates and visualizes information from various examinations, medical reports, findings, and laboratory test results. We conclude that the development of a standardized overarching EHR and an EHRS is feasible using openEHR and W3C standards, enabling a high degree of semantic interoperability. The standardized representation visualizes the data and can in this way support clinicians' decision processes.

  9. Quality of haemophilia care in The Netherlands: new standards for optimal care.

    PubMed

    Leebeek, Frank W G; Fischer, Kathelijn

    2014-04-01

    In the Netherlands, the first formal haemophilia comprehensive care centre was established in 1964, and Dutch haemophilia doctors have been organised since 1972. Although several steps were taken to centralise haemophilia care and maintain quality of care, treatment was still delivered in many hospitals, and formal criteria for haemophilia treatment centres as well as a national haemophilia registry were lacking. In collaboration with patients and other stakeholders, Dutch haemophilia doctors have undertaken a formal process to draft new quality standards for the haemophilia treatment centres. First, a project group including doctors, nurses, patients, and the institute for harmonisation of quality standards undertook a literature study on quality standards and performed explorative visits to several haemophilia treatment centres in the Netherlands. Afterwards, concept standards were defined and validated in two treatment centres. Next, the concept standards were evaluated by haemophilia doctors, patients, health insurance representatives, and regulators. Finally, the final version of the standards of care was approved by the Central Body of Experts on quality standards in clinical care and the Dutch Ministry of Health. A team of expert auditors has been trained and, together with an independent auditor, will perform audits in haemophilia centres applying for formal certification. Concomitantly, a national registry for haemophilia and allied disorders is being set up. It is expected that these processes will lead to further concentration and improved quality of haemophilia care in the Netherlands.

  10. Vascular surgical data registries for small computers.

    PubMed

    Kaufman, J L; Rosenberg, N

    1984-08-01

    Recent designs for computer-based vascular surgical registries and clinical data bases have employed large centralized systems with formal programming and mass storage. Small computers, of the types created for office use or for word processing, now contain sufficient speed and memory storage capacity to allow construction of decentralized office-based registries. Using a standardized dictionary of terms and a method of data organization adapted to word processing, we have created a new vascular surgery data registry, "VASREG." Data files are organized without programming, and a limited number of powerful logical statements in English are used for sorting. The capacity is 25,000 records with current inexpensive memory technology. VASREG is adaptable to computers made by a variety of manufacturers, and interface programs are available for converting the word-processor-formatted registry data into forms suitable for analysis by programs written in a standard programming language. This is a low-cost clinical data registry available to any physician. With a standardized dictionary, preparation of regional and national statistical summaries may be facilitated.

  11. Standardization of fixation, processing and staining methods for the central nervous system of vertebrates.

    PubMed

    Aldana Marcos, H J; Ferrari, C C; Benitez, I; Affanni, J M

    1996-12-01

    This paper reports the standardization of methods used for processing and embedding various vertebrate brains of different size in paraffin. Other technical details developed for avoiding frequent difficulties arising during laboratory routine are also reported. Some modifications of the Nissl and Klüver-Barrera staining methods are proposed. These modifications include: 1) a Nissl stain solution with a rapid and efficient action with easier differentiation; 2) the use of a cheap microwave oven for the Klüver-Barrera stain. These procedures have the advantage of permitting Nissl and Klüver-Barrera staining of nervous tissue in about five and fifteen minutes respectively. The proposed procedures have been tested in brains obtained from fish, amphibians, reptiles and mammals of different body sizes. They are the result of our long experience in preparing slides for comparative studies. Serial sections of excellent quality were regularly obtained in all the specimens studied. These standardized methods, being simple and quick, are recommended for routine use in neurobiological laboratories.

  12. Acute effects of Delta9-tetrahydrocannabinol and standardized cannabis extract on the auditory evoked mismatch negativity.

    PubMed

    Juckel, Georg; Roser, Patrik; Nadulski, Thomas; Stadelmann, Andreas M; Gallinat, Jürgen

    2007-12-01

    Reduced amplitudes of auditory evoked mismatch negativity (MMN) have often been found in schizophrenic patients, indicating deficient auditory information processing and working memory. Cannabis-induced psychotic states may resemble schizophrenia. Currently, there are discussions focusing on the close relationship between cannabis, the endocannabinoid and dopaminergic systems, and the onset of schizophrenic psychosis. This study investigated the effects of cannabis on MMN amplitude in 22 healthy volunteers (age 28+/-6 years, 11 male) by comparing Delta(9)-tetrahydrocannabinol (Delta(9)-THC) and a standardized cannabis extract containing Delta(9)-THC and cannabidiol (CBD) in a prospective, double-blind, placebo-controlled cross-over design. The MMNs resulting from 1000 auditory stimuli were recorded by 32-channel EEG. The standard stimuli were 1000 Hz, 80 dB SPL, and 100 ms duration. The deviant stimuli differed in frequency (1500 Hz). Significantly greater MMN amplitude values at central electrodes were found under cannabis extract, but not under Delta(9)-THC. There were no significant differences between MMN amplitudes at frontal electrodes. MMN amplitudes at central electrodes were significantly correlated with the concentration of 11-OH-THC, the most important psychoactive metabolite of Delta(9)-THC. Since the main difference between Delta(9)-THC and the standardized cannabis extract is CBD, which seems to have neuroprotective and antipsychotic properties, it can be speculated that the greater MMN amplitude, which may imply higher cortical activation and cognitive performance, is related to the positive effects of CBD. This effect may be particularly relevant for auditory cortex activity because only MMN amplitudes at the central, but not the frontal, electrodes were enhanced under cannabis.

  13. Sr/Ca and Mg/Ca vital effects correlated with skeletal architecture in a scleractinian deep-sea coral and the role of Rayleigh fractionation

    NASA Astrophysics Data System (ADS)

    Gagnon, Alexander C.; Adkins, Jess F.; Fernandez, Diego P.; Robinson, Laura F.

    2007-09-01

    Deep-sea corals are a new tool in paleoceanography with the potential to provide century-long records of deep ocean change at sub-decadal resolution. Complicating the reconstruction of past deep-sea temperatures, Mg/Ca and Sr/Ca paleothermometers in corals are also influenced by non-environmental factors, termed vital effects. To determine the magnitude, pattern and mechanism of vital effects we measure detailed collocated Sr/Ca and Mg/Ca ratios, using a combination of micromilling and isotope-dilution ICP-MS across skeletal features in recent samples of Desmophyllum dianthus, a scleractinian coral that grows in the near-constant environment of the deep sea. Sr/Ca variability across skeletal features is less than 5% (2σ relative standard deviation) and variability of Sr/Ca within the optically dense central band, composed of small and irregular aragonite crystals, is significantly less than in the surrounding skeleton. The mean Sr/Ca of the central band, 10.6 ± 0.1 mmol/mol (2σ standard error), and that of the surrounding skeleton, 10.58 ± 0.09 mmol/mol, are statistically similar, and agree well with the inorganic aragonite Sr/Ca-temperature relationship at the temperature of coral growth. In the central band, Mg/Ca is greater than 3 mmol/mol, more than twice that of the surrounding skeleton, a general result observed in the relative Mg/Ca ratios of D. dianthus collected from separate oceanographic locations. This large vital effect corresponds to a ~10 °C signal, when calibrated via surface coral Mg/Ca-temperature relationships, and has the potential to complicate paleoreconstructions. Outside the central band, Mg/Ca ratios increase with decreasing Sr/Ca. We explain the correlated behavior of Mg/Ca and Sr/Ca outside the central band by Rayleigh fractionation from a closed pool, an explanation that has been proposed elsewhere, but which is tested in this study by a simple and general relationship. 
We constrain the initial solution and effective partition coefficients for a Rayleigh process consistent with our accurate Metal/Ca measurements. A process other than Rayleigh fractionation influences Mg in the central band and our data constrain a number of possible mechanisms for the precipitation of this aragonite. Understanding the process affecting tracer behavior during coral biomineralization can help us better interpret paleoproxies in biogenic carbonates and lead to an improved deep-sea paleothermometer.
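In its textbook form, Rayleigh fractionation from a closed pool predicts the following (our notation; a standard-formulation sketch, not necessarily the authors' exact derivation). With f the fraction of Ca remaining in the calcifying pool and D_E the effective partition coefficient of element E:

```latex
% Instantaneous solid composition during Rayleigh fractionation
% from a closed pool (f = fraction of Ca remaining in the pool):
\left(\frac{E}{\mathrm{Ca}}\right)_{\mathrm{solid}}
  = D_E \left(\frac{E}{\mathrm{Ca}}\right)_{0} f^{\,D_E - 1}

% Eliminating f between two elements (Mg and Sr) gives a linear
% relation in log-log space whose slope depends only on the two
% effective partition coefficients:
\ln\!\left(\frac{\mathrm{Mg}}{\mathrm{Ca}}\right)_{\mathrm{solid}}
  = \frac{D_{\mathrm{Mg}} - 1}{D_{\mathrm{Sr}} - 1}\,
    \ln\!\left(\frac{\mathrm{Sr}}{\mathrm{Ca}}\right)_{\mathrm{solid}}
  + \mathrm{const.}
```

Because Sr is compatible in aragonite (D_Sr > 1) while Mg is strongly incompatible (D_Mg << 1), the predicted slope is negative, consistent with the observed anticorrelation of Mg/Ca and Sr/Ca outside the central band.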

  14. Radiation processing and market economy

    NASA Astrophysics Data System (ADS)

    Zagórski, Z. P.

    1998-06-01

    In a system of totalitarian economy regulated by bureaucracy, the real value of equipment, materials, and services is almost completely unknown, which makes comparison of different technologies impossible, eliminates competition, and hampers research and development. With the introduction of market economy in Central and Eastern Europe, radiation processing lost that dubious support and became an independent business, subject to the laws of a free market economy. Only the most valuable objects of processing have survived that test. At the top of the list are radiation sterilization of medical equipment and radiation-induced crosslinking of polymers, polyethylene in particular. New elements of competition have entered the scene, and questions of international regulations and standards have appeared.

  15. Application-ready expedited MODIS data for operational land surface monitoring of vegetation condition

    USGS Publications Warehouse

    Brown, Jesslyn; Howard, Daniel M.; Wylie, Bruce K.; Friesz, Aaron M.; Ji, Lei; Gacke, Carolyn

    2015-01-01

    Monitoring systems benefit from high temporal frequency image data collected from the Moderate Resolution Imaging Spectroradiometer (MODIS) system. Because of near-daily global coverage, MODIS data are beneficial to applications that require timely information about vegetation condition related to drought, flooding, or fire danger. Rapid satellite data streams in operational applications have clear benefits for monitoring vegetation, especially when information can be delivered as fast as changing surface conditions. An “expedited” processing system called “eMODIS” operated by the U.S. Geological Survey provides rapid MODIS surface reflectance data to operational applications in less than 24 h, offering tailored, consistently processed information products that complement standard MODIS products. We assessed eMODIS quality and consistency by comparison to standard MODIS data. Only land data with known high quality were analyzed in a central U.S. study area. When compared to standard MODIS (MOD/MYD09Q1), the eMODIS Normalized Difference Vegetation Index (NDVI) maintained a strong, significant relationship to standard MODIS NDVI, whether from morning (Terra) or afternoon (Aqua) orbits. The Aqua eMODIS data were more prone to noise than the Terra data, likely due to differences in the internal cloud mask used in MOD/MYD09Q1 or in compositing rules. Post-processing temporal smoothing decreased noise in the eMODIS data.
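The NDVI compared here is derived from the red and near-infrared surface reflectance bands with the standard normalized-difference formula:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# Dense green vegetation reflects strongly in the near-infrared, so NDVI
# approaches 1; bare soil or water yields low or negative values.
print(round(ndvi(0.5, 0.1), 2))  # -> 0.67
```

Because the index is a ratio of band differences to band sums, it partially cancels illumination and viewing-geometry effects, which is one reason it is robust enough for operational drought and fire-danger monitoring.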

  16. Individual Differences in Effectiveness of Cochlear Implants in Children Who Are Prelingually Deaf: New Process Measures of Performance

    PubMed Central

    Pisoni, David B.; Cleary, Miranda; Geers, Ann E.; Tobey, Emily A.

    2011-01-01

    The efficacy of cochlear implants in children who are deaf has been firmly established in the literature. However, the effectiveness of cochlear implants varies widely and is influenced by demographic and experiential factors. Several key findings suggest new directions for research on central auditory factors that underlie the effectiveness of cochlear implants. First, enormous individual differences have been observed in both adults and children on a wide range of audiological outcome measures. Some patients show large increases in speech perception scores after implantation, whereas others display only modest gains on standardized tests. Second, age of implantation and length of deafness affect all outcome measures. Children implanted at younger ages do better than children implanted at older ages, and children who have been deaf for shorter periods do better than children who have been deaf for longer periods. Third, communication mode affects outcome measures. Children from “oral-only” environments do much better on standardized tests that assess phonological processing skills than children who use Total Communication. Fourth, at the present time there are no preimplant predictors of outcome performance in young children. The underlying perceptual, cognitive, and linguistic abilities and skills emerge after implantation and improve over time. Finally, there are no significant differences in audiological outcome measures among current implant devices or processing strategies. This finding suggests that the major source of variance in outcome measures lies in the neural and cognitive information processing operations that the user applies to the signal provided by the implant. Taken together, this overall pattern of results suggests that higher-level central processes such as perception, attention, learning, and memory may play important roles in explaining the large individual differences observed among users of cochlear implants. 
Investigations of the content and flow of information in the central nervous system and interactions between sensory input and stored knowledge may provide important new insights into the basis of individual differences. Knowledge about the underlying basis of individual differences may also help in developing new intervention strategies to improve the effectiveness of cochlear implants in children who show relatively poor development of oral/aural language skills. PMID:21666760

  17. A new intuitionistic fuzzy rule-based decision-making system for an operating system process scheduler.

    PubMed

    Butt, Muhammad Arif; Akram, Muhammad

    2016-01-01

    We present a new intuitionistic fuzzy rule-based decision-making system based on intuitionistic fuzzy sets for the process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm takes as input the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers the appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of every process in the ready queue. Once the dp of every process is calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
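
A minimal sketch of the dispatch step described above, assuming a placeholder crisp scoring function in place of the paper's intuitionistic fuzzy inference engine (the actual rule base and membership functions are not given here):

```python
from dataclasses import dataclass

@dataclass
class Process:
    pid: int
    nice: int     # -20 (highest priority) .. 19 (lowest), as in Unix schedulers
    burst: float  # estimated CPU burst time

def dynamic_priority(p: Process) -> float:
    """Stand-in for the paper's intuitionistic fuzzy inference engine.

    The real engine fuzzifies nice and burst into membership and
    non-membership degrees and fires IF-THEN rules; here a simple crisp
    score rewards high scheduling priority (low nice) and short bursts.
    """
    nice_score = (19 - p.nice) / 39        # maps nice -20..19 onto 1..0
    burst_score = 1.0 / (1.0 + p.burst)    # higher for shorter bursts
    return 0.5 * nice_score + 0.5 * burst_score

def schedule(ready_queue: list) -> list:
    """Sort the ready queue in decreasing order of dynamic priority (dp)."""
    return sorted(ready_queue, key=dynamic_priority, reverse=True)

ready = [Process(1, 0, 10.0), Process(2, -10, 5.0), Process(3, 10, 2.0)]
order = schedule(ready)
next_pid = order[0].pid  # process dispatched to the CPU next
```

The queue is re-sorted after each dispatch, matching the non-preemptive scheme the abstract compares against.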

  18. Final Report. An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenthal, Andrew

    The DOE grant, “An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group,” to New Mexico State University created the Solar America Board for Codes and Standards (Solar ABCs). From 2007 to 2013, with funding from this grant, Solar ABCs identified current issues, established a dialogue among key stakeholders, and catalyzed appropriate activities to support the development of codes and standards that facilitated the installation of high-quality, safe photovoltaic systems. Solar ABCs brought the following resources to the PV stakeholder community: formal coordination in the planning or revision of interrelated codes and standards, removing “stovepipes” that have only roofing experts working on roofing codes, PV experts on PV codes, fire enforcement experts working on fire codes, etc.; a conduit through which all interested stakeholders were able to see the steps being taken in the development or modification of codes and standards and participate directly in the processes; a central clearing house for new documents, standards, proposed standards, analytical studies, and recommendations of best practices available to the PV community; a forum of experts that invites and welcomes all interested parties into the process of performing studies, evaluating results, and building consensus on standards and code-related topics that affect all aspects of the market; and a biennial gap analysis to formally survey the PV community to identify needs that are unmet and inhibiting the market and necessary technical developments.

  19. Geologic sources and concentrations of selenium in the West-Central Denver Basin, including the Toll Gate Creek watershed, Aurora, Colorado, 2003-2007

    USGS Publications Warehouse

    Paschke, Suzanne S.; Walton-Day, Katherine; Beck, Jennifer A.; Webbers, Ank; Dupree, Jean A.

    2014-01-01

    Toll Gate Creek, in the west-central part of the Denver Basin, is a perennial stream in which concentrations of dissolved selenium have consistently exceeded the Colorado aquatic-life standard of 4.6 micrograms per liter. Recent studies of selenium in Toll Gate Creek identified the Denver lignite zone of the non-marine Cretaceous- to Tertiary-aged (Paleocene) Denver Formation underlying the watershed as the geologic source of dissolved selenium to shallow groundwater and surface water. Previous work led to this study by the U.S. Geological Survey, in cooperation with the City of Aurora Utilities Department, which investigated geologic sources of selenium and selenium concentrations in the watershed. This report documents the occurrence of selenium-bearing rocks and groundwater within the Cretaceous- to Tertiary-aged Denver Formation in the west-central part of the Denver Basin, including the Toll Gate Creek watershed. The report presents background information on geochemical processes controlling selenium concentrations in the aquatic environment and possible geologic sources of selenium; the hydrogeologic setting of the watershed; selenium results from groundwater-sampling programs; and chemical analyses of solids samples as evidence that weathering of the Denver Formation is a geologic source of selenium to groundwater and surface water in the west-central part of the Denver Basin, including Toll Gate Creek. Analyses of water samples collected from 61 water-table wells in 2003 and from 19 water-table wells in 2007 indicate that dissolved selenium concentrations in groundwater in the west-central Denver Basin frequently exceeded the Colorado aquatic-life standard and in some locations exceeded the primary drinking-water standard of 50 micrograms per liter. The greatest selenium concentrations were associated with oxidized groundwater samples from wells completed in bedrock materials.
Selenium analysis of geologic core samples indicates that total selenium concentrations were greatest in samples containing indications of reducing conditions and organic matter (dark gray to black claystones and lignite horizons). The Toll Gate Creek watershed is situated in a unique hydrogeologic setting in the west-central part of the Denver Basin such that weathering of Cretaceous- to Tertiary-aged, non-marine, selenium-bearing rocks releases selenium to groundwater and surface water under present-day semi-arid environmental conditions. The Denver Formation contains several known and suspected geologic sources of selenium including: (1) lignite deposits; (2) tonstein partings; (3) organic-rich bentonite claystones; (4) salts formed as secondary weathering products; and possibly (5) the Cretaceous-Tertiary boundary. Organically complexed selenium and/or selenium-bearing pyrite in the enclosing claystones are likely the primary mineral sources of selenium in the Denver Formation, and correlations between concentrations of dissolved selenium and dissolved organic carbon in groundwater indicate that weathering and dissolution of organically complexed selenium from organic-rich claystone is a primary process mobilizing selenium. Secondary salts accumulated along fractures and bedding planes in the weathered zone are another potential geologic source of selenium, although their composition was not specifically addressed by the solids analyses. Results from this and previous work indicate that shallow groundwater and streams similarly positioned over Denver Formation claystone units at other locations in the Denver Basin also may contain concentrations of dissolved selenium greater than the Colorado aquatic-life standard or the drinking-water standard.

  20. The Structure of Dark Matter Halos in Dwarf Galaxies

    NASA Astrophysics Data System (ADS)

    Burkert, A.

    1995-07-01

    Recent observations indicate that dark matter halos have flat central density profiles. Cosmological simulations with nonbaryonic dark matter, however, predict self-similar halos with central density cusps. This contradiction has led to the conclusion that dark matter must be baryonic. Here it is shown that the dark matter halos of dwarf spiral galaxies represent a one-parameter family with self-similar density profiles. The observed global halo parameters are coupled with each other through simple scaling relations which can be explained by the standard cold dark matter model if one assumes that all the halos formed from density fluctuations with the same primordial amplitude. We find that the finite central halo densities correlate with the other global parameters. This result rules out scenarios where the flat halo cores formed subsequently through violent dynamical processes in the baryonic component. These cores instead provide important information on the origin and nature of dark matter in dwarf galaxies.
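
For reference, the cored, self-similar halo density profile introduced in this paper (now widely known as the Burkert profile) can be written in terms of a central density rho_0 and core radius r_0:

```latex
\rho(r) = \frac{\rho_0\, r_0^{3}}{\left(r + r_0\right)\left(r^{2} + r_0^{2}\right)}
```

The profile approaches a constant density rho_0 for r much smaller than r_0, matching the observed flat cores, and falls off as r^{-3} at large radii.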

  1. Central auditory neurons have composite receptive fields.

    PubMed

    Kozlov, Andrei S; Gentner, Timothy Q

    2016-02-02

    High-level neurons processing complex, behaviorally relevant signals are sensitive to conjunctions of features. Characterizing the receptive fields of such neurons is difficult with standard statistical tools, however, and the principles governing their organization remain poorly understood. Here, we demonstrate multiple distinct receptive-field features in individual high-level auditory neurons in a songbird, the European starling, in response to natural vocal signals (songs). We then show that receptive fields with similar characteristics can be reproduced by an unsupervised neural network trained to represent starling songs with a single learning rule that enforces sparseness and divisive normalization. We conclude that central auditory neurons have composite receptive fields that can arise through a combination of sparseness and normalization in neural circuits. Our results, along with descriptions of random, discontinuous receptive fields in the central olfactory neurons of mammals and insects, suggest general principles of neural computation across sensory systems and animal classes.

  2. Neuroimaging of Pain: Human Evidence and Clinical Relevance of Central Nervous System Processes and Modulation.

    PubMed

    Martucci, Katherine T; Mackey, Sean C

    2018-06-01

    Neuroimaging research has demonstrated definitive involvement of the central nervous system in the development, maintenance, and experience of chronic pain. Structural and functional neuroimaging has helped elucidate central nervous system contributors to chronic pain in humans. Neuroimaging of pain has provided a tool for increasing our understanding of how pharmacologic and psychologic therapies improve chronic pain. To date, findings from neuroimaging pain research have benefitted clinical practice by providing clinicians with an educational framework to discuss the biopsychosocial nature of pain with patients. Future advances in neuroimaging-based therapeutics (e.g., transcranial magnetic stimulation, real-time functional magnetic resonance imaging neurofeedback) may provide additional benefits for clinical practice. In the future, with standardization and validation, brain imaging could provide objective biomarkers of chronic pain, and guide treatment for personalized pain management. Similarly, brain-based biomarkers may provide an additional predictor of perioperative prognoses.

  3. Acid-β-glycerophosphatase reaction products in the central nervous system mitochondria following x-ray irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roizin, L.; Orlovskaja, D.; Liu, J.C.

    A survey of the literature to date on the enzyme histochemistry of intracellular organelles has not yielded any reference to the presence of acid phosphatase reaction products in the mammalian mitochondria of the central nervous system. A combination of Gomori's acid phosphatase method, however, with standard electron microscopy has disclosed the presence of enzyme reaction products in the mitochondria of the central nervous system of rats from 2 hr to 22 weeks after x-ray irradiation, as well as in a cerebral biopsy performed on a patient affected by Huntington's chorea. No enzyme reaction products, on the other hand, were observed in serial sections that had been incubated in substrates either containing sodium fluoride or lacking in β-glycerophosphate. The abnormal mitochondrial enzyme reaction (chemical lesion) is considered to be the consequence of the pathologic process affecting the ultrastructural-chemical organization of the organelle. (auth)

  4. Tools to manage the enterprise-wide picture archiving and communications system environment.

    PubMed

    Lannum, L M; Gumpf, S; Piraino, D

    2001-06-01

    The presentation will focus on the implementation and utilization of a central picture archiving and communications system (PACS) network-monitoring tool that allows for enterprise-wide operations management and support of the image distribution network. The MagicWatch (Siemens, Iselin, NJ) PACS/radiology information system (RIS) monitoring station from Siemens has allowed our organization to create a service support structure that has given us proactive control of our environment and has allowed us to meet the service level performance expectations of the users. The Radiology Help Desk has used the MagicWatch PACS monitoring station as an applications support tool that has allowed the group to monitor network activity and individual systems performance at each node. Fast and timely recognition of the effects of single events within the PACS/RIS environment has allowed the group to proactively recognize possible performance issues and resolve problems. The PACS/operations group performs network management control, image storage management, and software distribution management from a single, central point in the enterprise. The MagicWatch station allows for the complete automation of software distribution, installation, and configuration process across all the nodes in the system. The tool has allowed for the standardization of the workstations and provides a central configuration control for the establishment and maintenance of the system standards. This report will describe the PACS management and operation prior to the implementation of the MagicWatch PACS monitoring station and will highlight the operational benefits of a centralized network and system-monitoring tool.

  5. The Best Interest Standard: Same Name but Different Roles in Pediatric Bioethics and Child Rights Frameworks.

    PubMed

    Ross, Lainie Friedman; Swota, Alissa Hurwitz

    2017-01-01

    This article explores the intersection of pediatric bioethics and child rights by examining the best interest standard as it operates within the pediatric bioethics framework in the United States and the child rights framework based on the UN 1989 Convention on the Rights of the Child (CRC). While the "best interest of the child" standard is central to both pediatric bioethics and the child rights community, it operates only as a guidance principle, and not as an intervention principle, in decision-making within U.S. pediatric bioethics, whereas it operates as both a guidance and intervention principle in the child rights community. The differences in how the best interest standard is operationalized lead to different roles for the family, the state, and the minor in decision-making processes and outcomes. We examine the recent case of Charlie Gard to illustrate some of these differences.

  6. Specifications of Standards in Systems and Synthetic Biology.

    PubMed

    Schreiber, Falk; Bader, Gary D; Golebiewski, Martin; Hucka, Michael; Kormeier, Benjamin; Le Novère, Nicolas; Myers, Chris; Nickerson, David; Sommer, Björn; Waltemath, Dagmar; Weise, Stephan

    2015-09-04

    Standards shape our everyday life. From nuts and bolts to electronic devices and technological processes, standardised products and processes are all around us. Standards have technological and economic benefits, such as making information exchange, production, and services more efficient. However, novel, innovative areas often either lack proper standards, or documents about standards in these areas are not available from a centralised platform or formal body (such as the International Standardisation Organisation). Systems and synthetic biology is a relatively novel area, and it is only in the last decade that the standardisation of data, information, and models related to systems and synthetic biology has become a community-wide effort. Several open standards have been established and are under continuous development as a community initiative. COMBINE, the ‘COmputational Modeling in BIology’ NEtwork, has been established as an umbrella initiative to coordinate and promote the development of the various community standards and formats for computational models. There are two yearly meetings: HARMONY (Hackathons on Resources for Modeling in Biology), hackathon-type meetings with a focus on developing support for the standards, and COMBINE forums, workshop-style events with oral presentations, discussions, posters, and breakout sessions for further developing the standards. For more information see http://co.mbine.org/. So far the different standards have been published and made accessible through the standards’ web pages or preprint services. The aim of this special issue is to provide a single, easily accessible and citable platform for the publication of standards in systems and synthetic biology. This special issue is intended to serve as a central access point to standards and related initiatives in systems and synthetic biology; it will be published annually to provide an opportunity for standard development groups to communicate updated specifications.

  7. A cross sectional study on nursing process implementation and associated factors among nurses working in selected hospitals of Central and Northwest zones, Tigray Region, Ethiopia.

    PubMed

    Baraki, Zeray; Girmay, Fiseha; Kidanu, Kalayou; Gerensea, Hadgu; Gezehgne, Dejen; Teklay, Hafte

    2017-01-01

    The nursing process is a systematic method of planning, delivering, and evaluating individualized care for clients in any state of health or illness. Many countries have adopted the nursing process as the standard of care to guide nursing practice; however, the problem is its implementation. If nurses fail to carry out the necessary nursing care through the nursing process, the effectiveness of patient progress may be compromised, which can lead to preventable adverse events. This study aimed to assess the implementation of the nursing process and associated factors among nurses working in selected hospitals of the central and northwest zones of Tigray, Ethiopia, 2015. A cross-sectional observational study design was utilized. Data were collected from 200 participants using a structured self-administered questionnaire that was contextually adapted from standardized, reliable, and validated measures. The data were entered using Epi Info version 7 and analyzed using SPSS version 20 software. Data were summarized and described using descriptive statistics, and multivariate logistic regression was used to determine the relationships between the independent and dependent variables. Finally, data were presented in tables, graphs, and frequency percentages of the different variables. Seventy (35%) of the participants had implemented the nursing process. Several factors showed significant associations. Nurses who worked in a stressful workplace atmosphere were 99% less likely to implement the nursing process than nurses who worked in a very good atmosphere. Nurses with a BSc degree were 6.972 times more likely to implement the nursing process than those who were diploma qualified. Nurses without a consistent supply of materials for using the nursing process were 95.1% less likely to implement it than nurses with a consistent supply. The majority of the participants were not implementing the nursing process properly. Many factors hinder them from applying the nursing process, of which level of education, knowledge of nurses, skill of nurses, atmosphere of the workplace, shortage of material supply to use the nursing process, and high patient load were statistically significant in the association test.
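
The reported effect sizes are odds ratios from the multivariate logistic regression; a short sketch of how such coefficients translate into the "% less likely" and "times more likely" phrasing used above (the coefficients below are back-computed from the abstract's figures, not taken from the study's data):

```python
import math

def odds_ratio(beta: float) -> float:
    """Convert a logistic-regression coefficient to an odds ratio."""
    return math.exp(beta)

def describe(or_value: float) -> str:
    """Phrase an odds ratio the way the abstract does."""
    if or_value >= 1.0:
        return f"{or_value:.3f} times more likely"
    return f"{(1.0 - or_value) * 100:.1f}% less likely"

# Coefficients back-computed from the reported odds ratios (illustrative).
beta_bsc = math.log(6.972)     # BSc degree vs. diploma
beta_supply = math.log(0.049)  # no consistent material supply vs. consistent

education_effect = describe(odds_ratio(beta_bsc))
supply_effect = describe(odds_ratio(beta_supply))
```

An odds ratio below 1 is conventionally reported as a percentage reduction, which is how 0.049 becomes "95.1% less likely".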

  8. Governing decentralization in health care under tough budget constraint: what can we learn from the Italian experience?

    PubMed

    Tediosi, Fabrizio; Gabriele, Stefania; Longo, Francesco

    2009-05-01

    In many European countries since World War II, there has been a trend towards decentralization of health policy to lower levels of government, while more recently there have been re-centralization processes. Whether re-centralization will become the new paradigm of European health policy is difficult to say. In the Italian National Health Service (SSN), decentralization raised two related questions that may be of interest for the international debate on decentralization in health care: (a) what sort of regulatory framework and institutional balances are required to govern decentralization in health care in a heterogeneous country under tough budget constraints? and (b) how can it be ensured that the most advanced parts of the country remain committed to solidarity, supporting the weakest ones? To address these questions, this article describes recent trends in SSN funding and expenditure, reviews the strategy adopted by the Italian government for governing the decentralization process, and discusses the findings to draw policy conclusions. The main lessons emerging from this experience are that: (1) when the differences in administrative and policy skills, socio-economic standards, and social capital are wide, decentralization may lead to undesirable divergent evolution paths; (2) even in decentralized systems, the role of the Central government can be very important in containing health expenditure; (3) strong governance by the Central government may help, rather than hinder, the enforcement of decentralization; and (4) supporting the weakest Regions and maintaining inter-regional solidarity is hard but possible. In Italy, despite an increasing role of the Central government in steering the SSN, the pattern of regional decentralization of health-sector decision making does not seem at risk. Nevertheless, the Italian case confirms the complexity of decentralization and re-centralization processes, which can sometimes paradoxically reinforce each other.

  9. Sequential anaerobic-aerobic biological treatment of colored wastewaters: case study of a textile dyeing factory wastewater.

    PubMed

    Abiri, Fardin; Fallah, Narges; Bonakdarpour, Babak

    2017-03-01

    In the present study, the feasibility of using a bacterial batch sequential anaerobic-aerobic process, in which activated sludge was used in both parts of the process, for pretreatment of wastewater generated by a textile dyeing factory was considered. The activated sludge used in the process was obtained from a municipal wastewater treatment plant and adapted to real dyeing wastewater using either an anaerobic-only or an anaerobic-aerobic process over a period of 90 days. The use of activated sludge adapted with the anaerobic-aerobic process resulted in a higher overall decolorization efficiency than that achieved with activated sludge adapted using anaerobic-only cycles. Anaerobic and aerobic periods of around 34 and 22 hours, respectively, resulted in an effluent whose chemical oxygen demand (COD) and color content met the standards for discharge into the centralized wastewater treatment plant of the industrial estate in which the dyeing factory was situated. Neutralization of the real dyeing wastewater and the addition of a carbon source to it, both of which result in a significant increase in the cost of the bacterial treatment process, were not found to be necessary to achieve the required discharge standards.

  10. Proportions of maxillary anterior teeth relative to each other and to golden standard in tabriz dental faculty students.

    PubMed

    Parnia, Fereydoun; Hafezeqoran, Ali; Mahboub, Farhang; Moslehifard, Elnaz; Koodaryan, Rodabeh; Moteyagheni, Rosa; Saleh Saber, Fariba

    2010-01-01

    Various methods are used to measure the size and form of the teeth, including the golden proportion and the width-to-length ratio of the central teeth, referred to as the golden standard. The aim of this study was to evaluate the occurrence of golden standard values and the golden proportion in the anterior teeth. Photographs of 100 dentistry students (50 males and 50 females) were taken under standard conditions. The visible widths and lengths of the maxillary right and left incisors were calculated and the ratios were compared with the golden standard. Data were analyzed using SPSS 14 software. The results showed statistically significant differences between the ratio of the right lateral tooth width to the central tooth width and the golden proportion (P<0.001); the difference was likewise significant for the left side (P<0.001). As a result, there is no golden proportion among the maxillary incisors. The mean differences between the width-to-length proportions of the left and right central teeth and the golden standard were also statistically significant (P<0.001). Therefore, considering the width-to-length proportion of the maxillary central teeth, no golden standard exists. In the evaluation of the width-to-width and width-to-length proportions of the maxillary incisors, no golden proportions or golden standards, respectively, were detected.
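
A small sketch of the two comparisons, assuming the conventional target of 0.618 for the golden proportion and roughly 0.80 for the width-to-length "golden standard" (the exact target values and helper names below are assumptions; the abstract does not state them):

```python
GOLDEN_PROPORTION = 0.618  # classical lateral-width : central-width target
GOLDEN_STANDARD = 0.80     # assumed ideal central width-to-length ratio

def incisor_ratios(central_w: float, central_l: float, lateral_w: float) -> dict:
    """Measured ratios to compare against the two targets."""
    return {
        "lateral_to_central": lateral_w / central_w,  # vs. GOLDEN_PROPORTION
        "width_to_length": central_w / central_l,     # vs. GOLDEN_STANDARD
    }

# Illustrative measurements in millimetres, not data from the study.
r = incisor_ratios(central_w=8.5, central_l=10.5, lateral_w=6.5)
deviation_from_golden = r["lateral_to_central"] - GOLDEN_PROPORTION
```

The study's finding is that, across subjects, these deviations differ significantly from zero.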

  11. [Merging different biobanks under one roof : Benefits and constraints on the way to a centralized biobank using the example of the BMBH].

    PubMed

    Schmitt, S; Döllinger, C; Maier, A; Herpel, E; Schirmacher, P; Kirsten, R

    2018-05-23

    Founded in 1386, Heidelberg University is Germany's oldest and one of Europe's most reputable universities. As a scientific hub in Germany, Heidelberg is home to several internationally renowned medical research facilities that have an enormous demand for biomaterial samples and data, especially in the fields of translational and cancer research. The main objective of the BMBF-funded project "BioMaterialBank Heidelberg" (BMBH) was the harmonization of local biobanking under the same administrative roof through the implementation of common and standardized project, data, and quality management procedures. At the very beginning, the existing structures and processes of the participating biobanks in Heidelberg were identified, and a common administrative structure with central representatives for IT and quality management (QM) was established to coordinate all BMBH activities. Over time, the implementation of the consented structures and processes took place, also revealing organizational challenges that had to be solved concerning, for example, differences in sample handling and the definition of consistent access regulations. We discuss these challenges as well as the opportunities of building a centralized biobank and show, using the example of the BMBH, how such issues can be resolved.

  12. Oil Pharmacy at the Thermal Protection System Facility

    NASA Image and Video Library

    2017-08-08

    Tim King of Jacobs at NASA's Kennedy Space Center in Florida explains operations in the Oil Pharmacy operated under the Test and Operations Support Contract, or TOSC. The facility consolidated storage and distribution of petroleum products used in equipment maintained under the contract. This included standardized naming and testing processes, and provided a central location for distribution of oils used in everything from simple machinery to the crawler-transporter and cranes in the Vehicle Assembly Building.

  13. A Practical Guide to the Open Standards for Unattended Sensors (OSUS)

    DTIC Science & Technology

    2018-01-01

    [Abstract fragment] …in OSUS development and some of the benefits of implementing an OSUS controller as the central processing platform in a smart sensing device.

  14. Using OPC and HL7 Standards to Incorporate an Industrial Big Data Historian in a Health IT Environment.

    PubMed

    Cruz, Márcio Freire; Cavalcante, Carlos Arthur Mattos Teixeira; Sá Barretto, Sérgio Torres

    2018-05-30

    Health Level Seven (HL7) is one of the standards most used to centralize data from different vital sign monitoring systems. This solution significantly limits the data available for historical analysis, because it typically uses databases that are not effective at storing large volumes of data. In industry, a specific Big Data historian, known as a Process Information Management System (PIMS), solves this problem. This work proposes the same solution to overcome the restriction on storing vital sign data. The PIMS needs a compatible communication standard for storage, and the one most commonly used is OLE for Process Control (OPC). This paper presents an HL7-OPC Server that permits communication between vital sign monitoring systems and a PIMS, thus allowing the storage of long historical series of vital signs. In addition, it reviews local and cloud-based Big Medical Data research, followed by an analysis of the PIMS in a health IT environment. It then shows the architecture of the HL7 and OPC standards. Finally, it presents the HL7-OPC Server and a sequence of tests that demonstrated its full operation and performance.
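
A minimal sketch of the bridging idea, assuming HL7 v2 pipe-delimited OBX segments on the input side and a simple tag/value pair on the OPC side; the tag naming scheme and helper names are illustrative, not part of either standard or of the HL7-OPC Server described above:

```python
def parse_obx(segment: str) -> dict:
    """Extract code, value, and units from an HL7 v2 OBX segment.

    In HL7 v2, OBX-3 is the observation identifier (e.g. a LOINC code),
    OBX-5 the observation value, and OBX-6 the units.
    """
    f = segment.split("|")
    if f[0] != "OBX":
        raise ValueError("not an OBX segment")
    return {
        "code": f[3].split("^")[0],
        "value": float(f[5]),
        "units": f[6],
    }

def to_opc_write(obs: dict, bed: str) -> tuple:
    """Map an observation to a (tag, value) pair for an OPC write.

    The 'Hospital/<bed>/<code>' tag scheme is a hypothetical convention.
    """
    return (f"Hospital/{bed}/{obs['code']}", obs["value"])

obs = parse_obx("OBX|1|NM|8867-4^Heart rate^LN||72|/min")
tag, value = to_opc_write(obs, bed="ICU-01")
```

The (tag, value) pairs would then be written to the historian through an OPC client, which is where the PIMS's time-series compression takes over.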

  15. Systems Architecture for a Nationwide Healthcare System.

    PubMed

    Abin, Jorge; Nemeth, Horacio; Friedmann, Ignacio

    2015-01-01

    To provide Internet technology support at the national level, the Nationwide Integrated Healthcare System in Uruguay requires an Information Systems Architecture model. This system has multiple healthcare providers (public and private) and a strong component of supplementary services. Thus, the data processing system should have an architecture that takes this into account while integrating the central services provided by the Ministry of Public Health. The national electronic health record, as well as other related data processing systems, should be based on this architecture. The architecture model described here conceptualizes a federated framework of electronic health record systems, according to the IHE affinity model, HL7 standards, local standards on interoperability and security, and technical advice provided by AGESIC. It is the outcome of research done by AGESIC and the Systems Integration Laboratory (LINS) on the development and use of the e-Government Platform since 2008, as well as research done by the Salud.uy team since 2013.

  16. Noise Equally Degrades Central Auditory Processing in 2- and 4-Year-Old Children.

    PubMed

    Niemitalo-Haapola, Elina; Haapala, Sini; Kujala, Teija; Raappana, Antti; Kujala, Tiia; Jansson-Verkasalo, Eira

    2017-08-16

    The aim of this study was to investigate developmental and noise-induced changes in central auditory processing indexed by event-related potentials in typically developing children. P1, N2, and N4 responses as well as mismatch negativities (MMNs) were recorded for standard syllables and consonants, frequency, intensity, vowel, and vowel duration changes in silent and noisy conditions in the same 14 children at the ages of 2 and 4 years. The P1 and N2 latencies decreased and the N2, N4, and MMN amplitudes increased with development of the children. The amplitude changes were strongest at frontal electrodes. At both ages, background noise decreased the P1 amplitude, increased the N2 amplitude, and shortened the N4 latency. The noise-induced amplitude changes of P1, N2, and N4 were strongest frontally. Furthermore, background noise degraded the MMN. At both ages, MMN was significantly elicited only by the consonant change, and at the age of 4 years, also by the vowel duration change during noise. Developmental changes indexing maturation of central auditory processing were found from every response studied. Noise degraded sound encoding and echoic memory and impaired auditory discrimination at both ages. The older children were as vulnerable to the impact of noise as the younger children. https://doi.org/10.23641/asha.5233939.

  17. Protocol and standard operating procedures for common use in a worldwide multicenter study on reference values.

    PubMed

    Ozarda, Yesim; Ichihara, Kiyoshi; Barth, Julian H; Klee, George

    2013-05-01

    The reference intervals (RIs) given in laboratory reports have an important role in aiding clinicians in interpreting test results in reference to values of healthy populations. In this report, we present a proposed protocol and standard operating procedures (SOPs) for common use in conducting multicenter RI studies on a national or international scale. The protocols and consensus on their contents were refined through discussions in recent C-RIDL meetings. The protocol describes in detail (1) the scheme and organization of the study, (2) the target population, inclusion/exclusion criteria, ethnicity, and sample size, (3) health status questionnaire, (4) target analytes, (5) blood collection, (6) sample processing and storage, (7) assays, (8) cross-check testing, (9) ethics, (10) data analyses, and (11) reporting of results. In addition, the protocol proposes the common measurement of a panel of sera when no standard materials exist for harmonization of test results. It also describes the requirements of the central laboratory, including the method of cross-check testing between the central laboratory of each country and local laboratories. This protocol and the SOPs remain largely exploratory and may require a reevaluation from the practical point of view after their implementation in the ongoing worldwide study. The paper is mainly intended to be a basis for discussion in the scientific community.
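
    For context, the nonparametric reference-interval computation such a protocol typically feeds is the central 95% of the healthy reference population, i.e. the 2.5th and 97.5th percentiles. A sketch using a simple linear-interpolation percentile (CLSI documents define the exact rank conventions used in practice):

```python
# Sketch of a nonparametric reference interval: the 2.5th and 97.5th
# percentiles of results from the reference population. The percentile
# method here is plain linear interpolation, for illustration only.

def percentile(sorted_vals, p):
    """Linear-interpolation percentile (p in 0..100) of pre-sorted data."""
    k = (len(sorted_vals) - 1) * p / 100.0
    lo = int(k)
    hi = min(lo + 1, len(sorted_vals) - 1)
    frac = k - lo
    return sorted_vals[lo] + (sorted_vals[hi] - sorted_vals[lo]) * frac

def reference_interval(values):
    vals = sorted(values)
    return percentile(vals, 2.5), percentile(vals, 97.5)

# Toy example: 101 evenly spaced "healthy" results from 0 to 100.
data = list(range(101))
low, high = reference_interval(data)
```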

  18. The Large Introductory Class as an Exercise in Organization Design.

    ERIC Educational Resources Information Center

    Wagner, John A., III; Van Dyne, Linn

    1999-01-01

    Four methods for large group instruction differ in control and coordination dimensions: (1) centralization with mutual adjustment; (2) centralization with standardization; (3) decentralization with standardization; and (4) decentralization with mutual adjustment. Other factors to consider include class size and interests of various constituencies:…

  19. Ozone exposures and implications for vegetation in rural areas of the central Appalachian Mountains, U.S.A.

    Treesearch

    Pamela Edwards; Cindy Huber; Frederica Wood

    2004-01-01

    The United States is making the transition from the 1979 1 hr maximum ozone standard to the newly adopted 8 hr ozone standard (3 yr average of the 4th highest maximum 8 hr ozone concentration). Consequently, we analyzed and compared ozone concentrations under both standards from a variety of monitoring sites throughout the central Appalachian region of Kentucky (KY),...

  20. Environmental assessment for the proposed effluent limitations guidelines, pretreatment standards, and new source performance standards for the centralized waste treatment industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-12-01

    This report assesses the water quality-related benefits that would be expected if the US Environmental Protection Agency (EPA) adopts the proposed effluent limitations guidelines and pretreatment standards for the Centralized Waste Treatment (CWT) Industry. EPA estimates that under baseline conditions 205 CWT facilities discharge approximately 5.22 million lbs/year of metal and organic pollutants.

  1. Standardizing Flow Cytometry Immunophenotyping Analysis from the Human ImmunoPhenotyping Consortium

    PubMed Central

    Finak, Greg; Langweiler, Marc; Jaimes, Maria; Malek, Mehrnoush; Taghiyar, Jafar; Korin, Yael; Raddassi, Khadir; Devine, Lesley; Obermoser, Gerlinde; Pekalski, Marcin L.; Pontikos, Nikolas; Diaz, Alain; Heck, Susanne; Villanova, Federica; Terrazzini, Nadia; Kern, Florian; Qian, Yu; Stanton, Rick; Wang, Kui; Brandes, Aaron; Ramey, John; Aghaeepour, Nima; Mosmann, Tim; Scheuermann, Richard H.; Reed, Elaine; Palucka, Karolina; Pascual, Virginia; Blomberg, Bonnie B.; Nestle, Frank; Nussenblatt, Robert B.; Brinkman, Ryan Remy; Gottardo, Raphael; Maecker, Holden; McCoy, J Philip

    2016-01-01

    Standardization of immunophenotyping requires careful attention to reagents, sample handling, instrument setup, and data analysis, and is essential for successful cross-study and cross-center comparison of data. Experts developed five standardized, eight-color panels for identification of major immune cell subsets in peripheral blood. These were produced as pre-configured, lyophilized reagents in 96-well plates. We present the results of a coordinated analysis of samples across nine laboratories using these panels with standardized operating procedures (SOPs). Manual gating was performed by each site and by a central site. Automated gating algorithms were developed and tested by the FlowCAP consortium. Centralized manual gating can reduce cross-center variability, and we sought to determine whether automated methods could streamline and standardize the analysis. Within-site variability was low in all experiments, but cross-site variability was lower when central analysis was performed in comparison with site-specific analysis. It was also lower for clearly defined cell subsets than for those based on dim markers, and for rare populations. Automated gating was able to match the performance of central manual analysis for all tested panels, exhibiting little to no bias and comparable variability. Standardized staining, data collection, and automated gating can increase power, reduce variability, and streamline analysis for immunophenotyping. PMID:26861911
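
    The within-site versus cross-site comparison above rests on a simple statistic, the percent coefficient of variation (%CV = 100 · SD / mean), computed within one site's replicates and across site means. A sketch with invented numbers:

```python
# Sketch of the %CV comparison used to quantify cross-site variability.
# The replicate values below are hypothetical, for illustration only.

from statistics import mean, stdev

def percent_cv(values):
    """Percent coefficient of variation: 100 * SD / mean."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical replicate results (e.g. % of a cell subset) from three sites:
site_a = [40.1, 39.8, 40.3]
site_b = [42.0, 41.7, 42.2]
site_c = [38.9, 39.2, 39.0]

within_site_cv = [percent_cv(s) for s in (site_a, site_b, site_c)]
cross_site_cv = percent_cv([mean(site_a), mean(site_b), mean(site_c)])
```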

  2. Generic worklist handler for workflow-enabled products

    NASA Astrophysics Data System (ADS)

    Schmidt, Joachim; Meetz, Kirsten; Wendler, Thomas

    1999-07-01

    Workflow management (WfM) is an emerging field of medical information technology. It appears to be a promising key technology to model, optimize and automate processes, for the sake of improved efficiency, reduced costs and improved patient care. The application of WfM concepts requires the standardization of architectures and interfaces. A component of central interest proposed in this report is a generic worklist handler: a standardized interface between a workflow enactment service and an application system. Application systems with embedded worklist handlers will be called 'Workflow-Enabled Application Systems'. In this paper we discuss the functional requirements of worklist handlers, as well as their integration into workflow architectures and interfaces. To lay the foundation for this specification, basic workflow terminology, the fundamentals of workflow management and - later in the paper - the available standards as defined by the Workflow Management Coalition are briefly reviewed.
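
    The worklist-handler interface can be illustrated with a minimal sketch: the workflow enactment service pushes work items to a handler embedded in an application system, which queries, claims, and completes them. Method names are assumptions for illustration, not the WfMC-defined APIs:

```python
# Minimal sketch of a generic worklist handler. The enactment service calls
# push(); the application system calls pending() and complete(). All names
# are illustrative, not a standardized interface.

class WorklistHandler:
    def __init__(self):
        self._items = {}                 # item_id -> (description, state)
        self._next_id = 1

    def push(self, description):
        """Called by the workflow enactment service to offer a work item."""
        item_id = self._next_id
        self._next_id += 1
        self._items[item_id] = (description, "offered")
        return item_id

    def pending(self):
        """Called by the application: IDs of items not yet completed."""
        return [i for i, (_, s) in self._items.items() if s == "offered"]

    def complete(self, item_id):
        """Mark a work item as done, so the enactment service can proceed."""
        desc, _ = self._items[item_id]
        self._items[item_id] = (desc, "completed")

handler = WorklistHandler()
a = handler.push("verify patient demographics")
b = handler.push("acquire chest X-ray series")
handler.complete(a)
```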

  3. Cross flow cyclonic flotation column for coal and minerals beneficiation

    DOEpatents

    Lai, Ralph W.; Patton, Robert A.

    2000-01-01

    An apparatus and process for the separation of coal from pyritic impurities using a modified froth flotation system. The froth flotation column incorporates a helical track about the inner wall of the column in a region intermediate between the top and base of the column. A standard impeller located about the central axis of the column is used to generate a centrifugal force, thereby increasing the separation efficiency of coal from the pyritic particles and hydrophilic tailings.

  4. Avionics Architecture Standards as an Approach to Obsolescence Management

    DTIC Science & Technology

    2000-10-01

    ...and goals is one method of achieving the necessary critical mass of skilled... The term System Architecture refers to a consistent set of such... Processing Module (GPM), Mass Memory Module (MMM) and Power Conversion Module (PCM). ...executed on the modules within an ASAAC system will be stored in a central location, the Mass Memory Module (MMM). Therefore, if modules are to be... MOS - Module Support Layer to Operating System. The purpose of the MOS...

  5. 76 FR 16818 - Central Valley Project Improvement Act, Standard Criteria for Ag and Urban Water Management Plans

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-25

    ... Valley Project water conservation best management practices (BMPs) that shall develop Criteria for... project contractors using best available cost- effective technology and best management practices.'' The... DEPARTMENT OF THE INTERIOR Bureau of Reclamation Central Valley Project Improvement Act, Standard...

  6. Incorporating Experience Curves in Appliance Standards Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garbesi, Karina; Chan, Peter; Greenblatt, Jeffery

    2011-10-31

    The technical analyses in support of U.S. energy conservation standards for residential appliances and commercial equipment have typically assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. There is, however, considerable evidence that this assumption does not reflect real market prices. Costs and prices generally fall in relation to cumulative production, a phenomenon known as experience and modeled by a fairly robust empirical experience curve. Using price data from the Bureau of Labor Statistics, and shipment data obtained as part of the standards analysis process, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These allow us to develop more representative appliance price projections than the assumption-based approach of constant prices. These experience curves were incorporated into recent energy conservation standards for these products. The impact on the national modeling can be significant, often increasing the net present value of potential standard levels in the analysis. In some cases a previously cost-negative potential standard level demonstrates a benefit when incorporating experience. These results imply that past energy conservation standards analyses may have undervalued the economic benefits of potential standard levels.
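
    The experience curve mentioned above models price as a power law of cumulative production, price ∝ Q^(-b), so the "learning rate" (the fractional price drop per doubling of production) is 1 - 2^(-b). A sketch fitting b by least squares on synthetic data:

```python
# Sketch of an experience-curve fit: log(price) is linear in
# log(cumulative production), so the exponent b comes from a log-log
# least-squares slope. The price series below is synthetic, constructed
# with a 20% learning rate.

from math import log, log2

def fit_experience_curve(cum_production, prices):
    """Least-squares fit of log(price) = a - b*log(production); returns b."""
    xs = [log(q) for q in cum_production]
    ys = [log(p) for p in prices]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
    return -slope                      # b, the experience exponent

# Synthetic series: price falls 20% with each doubling of production.
b_true = -log2(0.8)                    # about 0.322
production = [1, 2, 4, 8, 16]
prices = [100 * q ** (-b_true) for q in production]

b = fit_experience_curve(production, prices)
learning_rate = 1 - 2 ** (-b)          # recovers about 0.20
```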

  7. Establishment and maintenance of a standardized glioma tissue bank: Huashan experience.

    PubMed

    Aibaidula, Abudumijiti; Lu, Jun-feng; Wu, Jin-song; Zou, He-jian; Chen, Hong; Wang, Yu-qian; Qin, Zhi-yong; Yao, Yu; Gong, Ye; Che, Xiao-ming; Zhong, Ping; Li, Shi-qi; Bao, Wei-min; Mao, Ying; Zhou, Liang-fu

    2015-06-01

    Cerebral glioma is the most common brain tumor, as well as one of the top ten malignant tumors in humans. Despite great progress in chemotherapy, radiotherapy and surgical strategies during the past decades, mortality and morbidity remain high. One of the major challenges is to explore the pathogenesis and invasion of glioma at various "omics" levels (such as proteomics or genomics) and the clinical implications of biomarkers for diagnosis, prognosis or treatment of glioma patients. Establishment of a standardized tissue bank with high-quality biospecimens annotated with clinical information is pivotal to the solution of these questions, as well as to the drug development process and translational research on glioma. Therefore, based on previous experience of tissue banks, standardized protocols for sample collection and storage were developed. We also developed two systems for glioma patient and sample management: a local database for medical records and a local image database for medical images. For future set-up of a regional biobank network in Shanghai, we also founded a centralized database for medical records. Hence we established a standardized glioma tissue bank with sufficient clinical data and medical images in Huashan Hospital. By September 2013, tissue samples from 1,326 cases had been collected. Histological diagnosis revealed that 73 % were astrocytic tumors, 17 % were oligodendroglial tumors, 2 % were oligoastrocytic tumors, 4 % were ependymal tumors and 4 % were other central nervous system neoplasms.

  8. 40 CFR 437.16 - Pretreatment standards for new sources (PSNS).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) EFFLUENT GUIDELINES AND STANDARDS THE CENTRALIZED WASTE TREATMENT POINT SOURCE CATEGORY Metals Treatment... standards: Standards for antimony, arsenic, cadmium, chromium, cobalt, copper, lead, mercury, nickel, silver...

  9. A Preliminary Investigation into the Effect of Standards-Based Grading on the Academic Performance of African-American Students

    NASA Astrophysics Data System (ADS)

    Bradbury-Bailey, Mary

    With the implementation of No Child Left Behind came a wave of educational reform intended for those working with student populations whose academic performance seemed to indicate an alienation from the educational process. Central to these reforms was the implementation of standards-based instruction and its accompanying standardized assessments; however, in one area reform seemed nonexistent---the teacher's gradebook (Erickson, 2010; Marzano, 2006; Scriffiny, 2008). Given the link between the grading process and achievement motivation, Ames (1992) suggested the use of practices that promote mastery goal orientation. The purpose of this study was to examine the impact of a standards-based grading system, as a factor contributing to mastery goal orientation, on the academic performance of urban African American students. To determine the degree of impact, this study first compared the course content averages and End-of-Course-Test (EOCT) scores for science classes using a traditional grading system to those using a standards-based grading system by employing an Analysis of Covariance (ANCOVA). While there was an increase in all grading areas, two showed a significant difference---the Physical Science course content average (p = 0.024) and the Biology EOCT scores (p = 0.0876). These gains suggest that standards-based grading can have a positive impact on the academic performance of African American students. Secondly, this study examined the correlation between the course content averages and the EOCT scores for both the traditional and standards-based grading systems; for both Physical Science and Biology, there was a stronger correlation between these two scores for the standards-based grading system.

  10. 49 CFR 71.6 - Central zone.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... States that is west of the boundary line between the eastern and central standard time zones described in... between the central and mountain time zones. The Chicago, Rock Island and Gulf Railway Company and the...

  11. Heart Rate and Oxygen Uptake Kinetics in Type 2 Diabetes Patients - A Pilot Study on the Influence of Cardiovascular Medication on Regulatory Processes.

    PubMed

    Koschate, Jessica; Drescher, Uwe; Baum, Klaus; Brinkmann, Christian; Schiffer, Thorsten; Latsch, Joachim; Brixius, Klara; Hoffmann, Uwe

    2017-05-01

    The aim of this pilot study was to investigate whether there are differences in heart rate and oxygen uptake kinetics in type 2 diabetes patients, considering their cardiovascular medication. It was hypothesized that cardiovascular medication would affect heart rate and oxygen uptake kinetics and that this could be detected using a standardized exercise test. 18 subjects were tested for maximal oxygen uptake. Kinetics were measured in a single test session with standardized, randomized moderate-intensity work rate changes. Time series analysis was used to estimate kinetics. Greater maxima in cross-correlation functions indicate faster kinetics. 6 patients did not take any cardiovascular medication, 6 subjects took peripherally acting medication and 6 patients were treated with centrally acting medication. Maximum oxygen uptake was not significantly different between groups. Significant main effects were identified regarding differences in muscular oxygen uptake kinetics and heart rate kinetics. Muscular oxygen uptake kinetics were significantly faster than heart rate kinetics in the group with no cardiovascular medication (maximum in cross-correlation function of muscular oxygen uptake vs. heart rate: 0.32±0.08 vs. 0.25±0.06; p=0.001) and in the group taking peripherally acting medication (0.34±0.05 vs. 0.28±0.05; p=0.009) but not in the patients taking centrally acting medication (0.28±0.05 vs. 0.30±0.07; n.s.). It can be concluded that the regulatory processes underlying achievement of a similar maximal oxygen uptake differ between the groups. The standardized test used provided plausible results for heart rate and oxygen uptake kinetics in a single measurement session in this patient group. © Georg Thieme Verlag KG Stuttgart · New York.
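
    The kinetics measure used here, the maximum of the cross-correlation function (CCF) between a work-rate signal and a response signal such as heart rate, can be sketched as follows: a more heavily smoothed (slower) response yields a lower CCF maximum. All signals below are synthetic:

```python
# Sketch of the CCF-maximum kinetics measure: correlate a response signal
# against the work-rate input over a range of non-negative lags and take
# the maximum. Faster (less smoothed) responses give larger maxima.

from math import exp
from statistics import mean, pstdev

def first_order(x, tau):
    """Discrete first-order (exponential) response to input x."""
    a = exp(-1.0 / tau)
    y, prev = [], 0.0
    for v in x:
        prev = a * prev + (1 - a) * v
        y.append(prev)
    return y

def ccf_max(x, y, max_lag=10):
    """Maximum normalized cross-correlation of y against x over lags 0..max_lag."""
    mx, my = mean(x), mean(y)
    sx, sy = pstdev(x), pstdev(y)
    n = len(x)
    best = -1.0
    for lag in range(max_lag + 1):
        pairs = [(x[i] - mx) * (y[i + lag] - my) for i in range(n - lag)]
        r = sum(pairs) / ((n - lag) * sx * sy)
        best = max(best, r)
    return best

# A step change in work rate, with a fast and a slow simulated response:
work = [0] * 10 + [1] * 30
fast = first_order(work, tau=2.0)    # fast kinetics
slow = first_order(work, tau=12.0)   # slow kinetics
```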

  12. Implementation of standardized time limits in sickness insurance and return-to-work: experiences of four actors.

    PubMed

    Ståhl, Christian; Müssener, Ulrika; Svensson, Tommy

    2012-01-01

    In 2008, time limits were introduced in Swedish sickness insurance, comprising a pre-defined schedule for return-to-work. The purpose of this study was to explore the experienced consequences of these time limits. Sick-listed persons, physicians, insurance officials and employers were interviewed regarding the process of sick-listing, rehabilitation and return-to-work in relation to the reform. The study comprises qualitative interviews with 11 sick-listed persons, 4 insurance officials, 5 employers and 4 physicians (n = 24). Physicians, employers, and sick-listed persons described insurance officials as increasingly passive and said that responsibility for the process was placed on the sick-listed. Several ethical dilemmas were identified, in which officials were forced to act against their ethical principles. Insurance officials' principle of care often clashed with the standardization of the process, which is based on principles of egalitarianism and equal treatment. The cases reported in this study suggest that a policy for activation and early return-to-work has in some cases had the opposite effect: central actors remain passive and responsibility is placed on sick-listed persons, who lack the strength and knowledge to understand and navigate the system. The standardized insurance system here promoted experiences of procedural injustice, for both officials and sick-listed persons.

  13. Making Validated Educational Models Central in Preschool Standards.

    ERIC Educational Resources Information Center

    Schweinhart, Lawrence J.

    This paper presents some ideas to preschool educators and policy makers about how to make validated educational models central in standards for preschool education and care programs that are available to all 3- and 4-year-olds. Defining an educational model as a coherent body of program practices, curriculum content, program and child, and teacher…

  14. Rolling Deck to Repository I: Designing a Database Infrastructure

    NASA Astrophysics Data System (ADS)

    Arko, R. A.; Miller, S. P.; Chandler, C. L.; Ferrini, V. L.; O'Hara, S. H.

    2008-12-01

    The NSF-supported academic research fleet collectively produces a large and diverse volume of scientific data, which are increasingly being shared across disciplines and contributed to regional and global syntheses. As both Internet connectivity and storage technology improve, it becomes practical for ships to routinely deliver data and documentation for a standard suite of underway instruments to a central shoreside repository. Routine delivery will facilitate data discovery and integration, quality assessment, cruise planning, compliance with funding agency and clearance requirements, and long-term data preservation. We are working collaboratively with ship operators and data managers to develop a prototype "data discovery system" for NSF-supported research vessels. Our goal is to establish infrastructure for a central shoreside repository, and to develop and test procedures for the routine delivery of standard data products and documentation to the repository. Related efforts are underway to identify tools and criteria for quality control of standard data products, and to develop standard interfaces and procedures for maintaining an underway event log. Development of a shoreside repository infrastructure will include: 1. Deployment and testing of a central catalog that holds cruise summaries and vessel profiles. A cruise summary will capture the essential details of a research expedition (operating institution, ports/dates, personnel, data inventory, etc.), as well as related documentation such as event logs and technical reports. A vessel profile will capture the essential details of a ship's installed instruments (manufacturer, model, serial number, reference location, etc.), with version control as the profile changes through time. The catalog's relational database schema will be based on the UNOLS Data Best Practices Committee's recommendations, and published as a formal XML specification. 2. Deployment and testing of a central repository that holds navigation and routine underway data. Based on discussion with ship operators and data managers at a workgroup meeting in September 2008, we anticipate that a subset of underway data could be delivered from ships to the central repository in near-realtime - enabling the integrated display of ship tracks at a public Web portal, for example - and a full data package could be delivered post-cruise by network transfer or disk shipment. Once ashore, data sets could be distributed to assembly centers such as the Shipboard Automated Meteorological and Oceanographic System (SAMOS) for routine processing, quality assessment, and synthesis efforts - as well as transmitted to national data centers such as NODC and NGDC for permanent archival. 3. Deployment and testing of a basic suite of Web services to make cruise summaries, vessel profiles, event logs, and navigation data easily available. A standard set of catalog records, maps, and navigation features will be published via the Open Archives Initiative (OAI) and Open Geospatial Consortium (OGC) protocols, which can then be harvested by partner data centers and/or embedded in client applications.
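
    The cruise-summary and vessel-profile records described above might be modeled along these lines; the field names are illustrative, not the UNOLS schema:

```python
# Sketch of the two central catalog record types: a versioned vessel
# profile (installed instrument suite) and a cruise summary referencing it.
# All names and values are hypothetical.

from dataclasses import dataclass, field
from typing import List

@dataclass
class VesselProfile:
    vessel_name: str
    instruments: List[str] = field(default_factory=list)  # installed suite
    version: int = 1          # incremented as the installation changes

@dataclass
class CruiseSummary:
    cruise_id: str
    operator: str
    port_start: str
    port_end: str
    vessel: VesselProfile
    data_inventory: List[str] = field(default_factory=list)

profile = VesselProfile("R/V Example", ["gravimeter", "multibeam", "ADCP"])
cruise = CruiseSummary("EX0801", "Example Institution", "Honolulu",
                       "San Diego", profile,
                       data_inventory=["navigation", "multibeam"])
```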

  15. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform.

    PubMed

    Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N

    2017-03-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.
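
    A Jenkins-CI job driving CellProfiler headlessly typically shells out to its batch interface. The sketch below only composes the command line; the flags shown (-c: run without the GUI, -r: run the pipeline, -p: pipeline/project file, -i/-o: input and output directories) reflect CellProfiler's documented batch options, but treat them as assumptions and verify against the installed version:

```python
# Sketch of composing a headless CellProfiler invocation for a Jenkins job.
# The command is built but deliberately not executed here; file paths are
# hypothetical.

def cellprofiler_command(pipeline, image_dir, output_dir):
    """Compose the batch invocation a Jenkins 'Execute shell' step would run."""
    return ["cellprofiler", "-c", "-r",
            "-p", pipeline,
            "-i", image_dir,
            "-o", output_dir]

cmd = cellprofiler_command("analysis.cppipe", "plate01/images", "plate01/out")
# e.g. a Jenkins shell step would execute: " ".join(cmd)
```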


  17. Forward J/ψ production at high energy: Centrality dependence and mean transverse momentum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ducloué, B.; Lappi, T.; Mäntysaari, H.

    2016-10-21

    Forward rapidity J/ψ meson production in proton-nucleus collisions can be an important constraint on descriptions of the small-x nuclear wave function. In an earlier work we studied this process using a dipole cross section satisfying the Balitsky-Kovchegov equation, fit to HERA inclusive data and consistently extrapolated to the nuclear case using a standard Woods-Saxon distribution. In this paper we present further calculations of these cross sections, studying the mean transverse momentum of the meson and the dependence on collision centrality. We also extend the calculation to backward rapidities using nuclear parton distribution functions. Here, we show that the parametrization is overall rather consistent with the available experimental data. However, there is a tendency towards a too strong centrality dependence. This can be traced back to the rather small transverse area occupied by small-x gluons in the nucleon that is seen in the HERA data, compared to the total inelastic nucleon-nucleon cross section.
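
    The standard Woods-Saxon distribution referred to above is ρ(r) = ρ0 / (1 + exp((r - R)/a)), with R the nuclear radius and a the surface diffuseness. A sketch using typical textbook parameters for lead (R ≈ 6.62 fm, a ≈ 0.546 fm), chosen here purely for illustration:

```python
# Sketch of the Woods-Saxon nuclear density profile. By construction the
# density falls to half its central value at r = R, and the width of the
# falloff is set by the diffuseness a.

from math import exp

def woods_saxon(r, R=6.62, a=0.546, rho0=1.0):
    """Nuclear density (arbitrary normalization) at radius r, in fm."""
    return rho0 / (1.0 + exp((r - R) / a))

half_density_radius = 6.62     # rho(R) = rho0 / 2 by construction
```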

  18. Increasing the Automation and Autonomy for Spacecraft Operations with Criteria Action Table

    NASA Technical Reports Server (NTRS)

    Li, Zhen-Ping; Savki, Cetin

    2005-01-01

    The Criteria Action Table (CAT) is an automation tool developed for monitoring real-time system messages for specific events and processes in order to take user-defined actions based on a set of user-defined rules. CAT was developed by Lockheed Martin Space Operations as part of a larger NASA effort at the Goddard Space Flight Center (GSFC) to create a component-based, middleware-based, and standards-based general-purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. CAT has been integrated into the upgraded ground systems for the Tropical Rainfall Measuring Mission (TRMM) and Small Explorer (SMEX) satellites, and it plays the central role in their automation effort to reduce cost and increase reliability in spacecraft operations. The GMSEC architecture provides a standard communication interface and protocol for components to publish and subscribe to messages on an information bus. It also provides a standard message definition so components send and receive messages through the bus interface rather than to each other, thus reducing component-to-component coupling, interfaces, protocols, and link (socket) management. With the GMSEC architecture, components can publish standard event messages to the bus for all nominal, significant, and surprising events in regard to satellite, celestial, ground system, or any other activity. In addition to sending standard event messages, each GMSEC-compliant component is required to accept and process GMSEC directive request messages.
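
    The rule-matching idea behind the Criteria Action Table can be sketched as follows; the message fields and rule format are invented for illustration and are not the GMSEC message specification:

```python
# Sketch of criteria/action rule evaluation: each rule pairs a set of
# required message-field values with an action to run on a match.
# Field names, severities, and actions are all hypothetical.

def make_rule(criteria, action):
    """criteria: dict of field -> required value; action: callable."""
    return (criteria, action)

def process_message(message, rules, log):
    """Run every rule whose criteria all match the incoming message."""
    for criteria, action in rules:
        if all(message.get(k) == v for k, v in criteria.items()):
            action(message, log)

def page_operator(msg, log):
    log.append("PAGE: " + msg["text"])

rules = [make_rule({"severity": "critical", "source": "TRMM"}, page_operator)]

log = []
process_message({"severity": "info", "source": "TRMM",
                 "text": "pass start"}, rules, log)
process_message({"severity": "critical", "source": "TRMM",
                 "text": "battery undervoltage"}, rules, log)
```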

  19. Identifying the source of fluvial terrace deposits using XRF scanning and Canonical Discriminant Analysis: A case study of the Chihshang terraces, eastern Taiwan

    NASA Astrophysics Data System (ADS)

    Chang, Queenie; Lee, Jian-Cheng; Hunag, Jyh-Jaan; Wei, Kuo-Yen; Chen, Yue-Gau; Byrne, Timothy B.

    2018-05-01

    The source of fluvial deposits in terraces provides important information about catchment fluvial processes and landform evolution. In this study, we propose a novel approach that combines high-resolution Itrax-XRF scanning and Canonical Discriminant Analysis (CDA) to identify the source of fine-grained fluvial terrace deposits. We apply this approach to a group of terraces located on the hanging wall of the Chihshang Fault in eastern Taiwan, with two possible sources: the Coastal Range on the east and the Central Range on the west. Our results for standard samples from the two potential sources show distinct ranges of canonical variables, which provide better separation than individual chemical elements. We then tested this approach by applying it to several samples with known sediment sources and obtained positive results. Applying the same approach to the fine-grained sediments in the Chihshang terraces indicates that they are mostly composed of Coastal Range material but also contain some input from the Central Range. In the two lowest terraces, T1 and T2, the fine-grained deposits show a significant Central Range component. For terrace T4, the results show less Central Range input and a trend of decreasing Central Range influence up-section. Coastal Range material becomes dominant in the two highest terraces, T7 and T10. Sediments in terrace T5 appear to have been altered by post-depositional chemical alteration processes and are not included in the analysis. Our results show that the change in source material in the terrace deposits was relatively gradual, rather than sharp as suggested by the composition of the gravels and conglomerates. We suggest that this change in sources is related to a change in the dominant fluvial processes, which are controlled by tectonic activity.
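
    As a pared-down illustration of the discriminant idea (two classes, two variables), Fisher's linear discriminant finds the axis, the canonical variable, that best separates the two source signatures. The element ratios below are invented stand-ins for real XRF signatures:

```python
# Two-class Fisher linear discriminant, computed by hand for two features:
# w = Sw^-1 (mean_a - mean_b), where Sw is the pooled within-class scatter.
# Samples project onto w; an unknown is assigned to the nearer class side.
# All numbers are hypothetical.

def mean_vec(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def fisher_axis(class_a, class_b):
    """Discriminant axis for two 2-feature classes."""
    ma, mb = mean_vec(class_a), mean_vec(class_b)
    s = [[0.0, 0.0], [0.0, 0.0]]          # pooled within-class scatter
    for rows, m in ((class_a, ma), (class_b, mb)):
        for r in rows:
            d = [r[0] - m[0], r[1] - m[1]]
            s[0][0] += d[0] * d[0]
            s[0][1] += d[0] * d[1]
            s[1][0] += d[1] * d[0]
            s[1][1] += d[1] * d[1]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    dm = [ma[0] - mb[0], ma[1] - mb[1]]
    # w = Sw^-1 * dm via the explicit 2x2 inverse
    return [(s[1][1] * dm[0] - s[0][1] * dm[1]) / det,
            (-s[1][0] * dm[0] + s[0][0] * dm[1]) / det]

def project(w, x):
    return w[0] * x[0] + w[1] * x[1]

# Hypothetical (Fe/Ti, Ca/K) ratios for the two source terrains:
coastal = [[2.0, 0.8], [2.2, 0.9], [1.9, 0.7], [2.1, 0.85]]
central = [[1.2, 1.5], [1.1, 1.62], [1.3, 1.4], [1.15, 1.55]]

w = fisher_axis(coastal, central)
unknown = [2.05, 0.82]                 # resembles the coastal signature
score = project(w, unknown)
mid = (project(w, mean_vec(coastal)) + project(w, mean_vec(central))) / 2
is_coastal = score > mid   # the coastal mean projects above the midpoint
```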

  20. Sensitivity of boundary layer variables to PBL schemes over the central Tibetan Plateau

    NASA Astrophysics Data System (ADS)

    Xu, L.; Liu, H.; Wang, L.; Du, Q.; Liu, Y.

    2017-12-01

    Planetary boundary layer (PBL) parameterization schemes play a critical role in numerical weather prediction and research. They describe the physical processes associated with the exchange of momentum, heat and humidity between the land surface and the atmosphere. In this study, two non-local (YSU and ACM2) and two local (MYJ and BouLac) PBL parameterization schemes in the Weather Research and Forecasting (WRF) model have been tested over the central Tibetan Plateau with regard to their capability to model boundary layer parameters relevant for surface energy exchange. The model performance has been evaluated against measurements from the Third Tibetan Plateau atmospheric scientific experiment (TIPEX-III). Simulated meteorological parameters and turbulence fluxes have been compared with observations through standard statistical measures. Model results show acceptable behavior, but no particular scheme produces the best performance for all locations and parameters. All PBL schemes underestimate near-surface air temperatures over the Tibetan Plateau. By investigating the surface energy budget components, the results suggest that downward longwave radiation and sensible heat flux are the main factors causing the lower near-surface temperatures. Because the downward longwave radiation and sensible heat flux are affected by atmospheric moisture and land-atmosphere coupling, respectively, improvements in the water vapor distribution and land-atmosphere energy exchange are meaningful for a better representation of PBL physical processes over the central Tibetan Plateau.
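
    For reference, PBL schemes in WRF are selected through the physics block of namelist.input; the option indices below are the standard WRF values for the schemes named above, but should be verified against the documentation of the WRF version in use:

```fortran
&physics
 ! bl_pbl_physics selects the PBL scheme (one entry per domain):
 ! 1 = YSU, 2 = MYJ, 7 = ACM2, 8 = BouLac  (standard WRF option indices)
 bl_pbl_physics      = 1,
 ! the surface-layer scheme must be compatible with the PBL choice,
 ! e.g. sf_sfclay_physics = 1 with YSU, = 2 with MYJ
 sf_sfclay_physics   = 1,
/
```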

  1. Central effects of acetylsalicylic acid on trigeminal-nociceptive stimuli

    PubMed Central

    2014-01-01

    Background Acetylsalicylic acid (ASA) is one of the most widely used analgesics for treating an acute migraine attack. In addition to its inhibitory effects on peripheral prostaglandin synthesis, central mechanisms of action have also been discussed. Methods Using a standardized model of trigeminal-nociceptive stimulation during fMRI scanning, we investigated the effect of acetylsalicylic acid on acute pain compared to saline in 22 healthy volunteers in a double-blind within-subject design. Painful stimulation was applied using gaseous ammonia and presented in a pseudo-randomized order with several control stimuli. All participants were instructed to rate the intensity and unpleasantness of every stimulus on a visual analogue scale (VAS). Based on previous results, we hypothesized an effect of ASA on central pain-processing structures such as the ACC, SI and SII, as well as the trigeminal nuclei and the hypothalamus. Results Although we found no differences in pain ratings between saline and ASA, we observed decreased BOLD signal changes in response to trigeminal-nociceptive stimulation in the ACC and SII after administration of ASA compared to saline. This finding is in line with earlier imaging results on the effect of ASA on acute pain. Contrary to earlier findings from animal studies, we found no effect of ASA on the trigeminal nuclei in the brainstem or within the hypothalamic area. Conclusion Taken together, our study replicates earlier findings of an attenuating effect of ASA on pain-processing structures, adding further evidence for a possibly central mechanism of action of ASA. PMID:25201152

  2. 75 FR 64971 - Proposed Establishment of Class E Airspace; Central City, NE

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-21

    ...-0837; Airspace Docket No. 10-ACE-10] Proposed Establishment of Class E Airspace; Central City, NE...: This action proposes to establish Class E airspace at Central City, NE. Controlled airspace is necessary to accommodate new Standard Instrument Approach Procedures (SIAP) at Central City Municipal--Larry...

  3. Total quality management: It works for aerospace information services

    NASA Technical Reports Server (NTRS)

    Erwin, James; Eberline, Carl; Colquitt, Wanda

    1993-01-01

    Today we are in the midst of information and 'total quality' revolutions. At the NASA STI Program's Center for AeroSpace Information (CASI), we are focused on using continuous-improvement techniques to enrich today's services and products and to ensure that tomorrow's technology supports the TQM-based improvement of future STI Program products and services. The Continuous Improvements Program at CASI is the foundation for Total Quality Management in products and services. The focus is customer-driven; its goal is to identify processes and procedures that can be improved and new technologies that can be integrated with those processes to gain efficiency, provide effectiveness, and promote customer satisfaction. The program seeks to establish quality through an iterative defect-prevention approach based on the incorporation of standards and measurements into the processing cycle. Four projects are described that utilize cross-functional problem-solving teams for identifying requirements and defining tasks and task standards, management participation, attention to critical processes, and measurable long-term goals. The implementation of these projects provides the customer with measurably improved access to information through several channels: the NASA STI Database, document requests for microfiche and hardcopy, and the Centralized Help Desk.

  4. Hybrid life-cycle assessment of natural gas based fuel chains for transportation.

    PubMed

    Strømman, Anders Hammer; Solli, Christian; Hertwich, Edgar G

    2006-04-15

    This research compares the use of natural gas, methanol, and hydrogen as transportation fuels. These three fuel chains start with the extraction and processing of natural gas in the Norwegian North Sea and end with final use in Central Europe. The end use is passenger transportation with a sub-compact car that has an internal combustion engine for the natural gas case and a fuel cell for the methanol and hydrogen cases. The life-cycle assessment is performed by combining a process-based life-cycle inventory with economic input-output data. The analysis shows that the potential climate impacts are lowest for the hydrogen fuel scenario with CO2 deposition. The hydrogen fuel chain scenario has no significant environmental disadvantage compared with the other fuel chains. Detailed analysis shows that the construction of the car contributes significantly to most impact categories. Finally, it is shown how the application of a hybrid inventory model ensures a more complete inventory description than standard process-based life-cycle assessment. This is particularly significant for car construction, which would have been significantly underestimated in this study using standard process-based life-cycle assessment alone.

  5. Mismatch and conflict: neurophysiological and behavioral evidence for conflict priming.

    PubMed

    Mager, Ralph; Meuth, Sven G; Kräuchi, Kurt; Schmidlin, Maria; Müller-Spahn, Franz; Falkenstein, Michael

    2009-11-01

    Conflict-related cognitive processes are critical for adapting to sudden environmental changes that confront the individual with inconsistent or ambiguous information; they thus play a crucial role in coping with daily life. Conflicts tend to accumulate especially in complex and threatening situations, which raises the question of how conflict-related cognitive processes are modulated by the close succession of conflicts. In the present study, we investigated the effect of interactions between different types of conflict on performance as well as on electrophysiological parameters. A task-irrelevant auditory stimulus and a task-relevant visual stimulus were presented in succession: a standard or deviant tone, followed by a congruent or incongruent Stroop stimulus. After standard prestimuli, performance deteriorated for incongruent compared to congruent Stroop stimuli, accompanied by a widespread negativity for incongruent versus congruent stimuli in the event-related potentials (ERPs). After deviant prestimuli, however, performance was better for incongruent than for congruent Stroop stimuli, and an additional early negativity with a fronto-central maximum emerged in the ERP. Our data show that deviant auditory prestimuli specifically facilitate the processing of stimulus-related conflict, providing evidence for a conflict-priming effect.

  6. Trends in Planetary Data Analysis. Executive summary of the Planetary Data Workshop

    NASA Technical Reports Server (NTRS)

    Evans, N.

    1984-01-01

    Planetary data include non-imaging remote sensing data (spectrometric, radiometric, and polarimetric observations), in-situ and radio/radar data, and Earth-based observations. The development of a planetary data system is also discussed. A catalog identifying observations will be the initial entry point for all levels of users into the data system. Seven distinct data support services are planned: encyclopedia, data index, data inventory, browse, search, sample, and acquire. Data systems for planetary science users must provide access to data and must process, store, and display data. Two standards will be incorporated into the planetary data system: a standard communications protocol and a standard format data unit. The data system configuration must combine features of a distributed system with those of a centralized system. Fiscal constraints have made prioritization important; activities include saving previous mission data, planning and cost analysis, and publishing of proceedings.

  7. A centralized informatics infrastructure for the National Institute on Drug Abuse Clinical Trials Network.

    PubMed

    Pan, Jeng-Jong; Nahm, Meredith; Wakim, Paul; Cushing, Carol; Poole, Lori; Tai, Betty; Pieper, Carl F

    2009-02-01

    Clinical trial networks (CTNs) were created to provide a sustaining infrastructure for the conduct of multisite clinical trials; as such, they must withstand changes in membership. Centralization of infrastructure, including knowledge management, portfolio management, information management, process automation, and work policies and procedures, facilitates consistency and ultimately research in clinical research networks. In 2005, the National Institute on Drug Abuse (NIDA) CTN transitioned from a distributed data management model to a centralized informatics infrastructure to support the network's trial activities and administration. We describe the centralized informatics infrastructure and discuss the challenges we faced, to inform others considering such an endeavor. During the migration of the network from a decentralized to a centralized data center model, descriptive data were captured and are presented here to assess the impact of centralization. We present the framework for the informatics infrastructure and evaluative metrics. The network has decreased the time from last patient-last visit to database lock from an average of 7.6 months to 2.8 months. The average database error rate decreased from 0.8% to 0.2%, with a corresponding decrease in the interquartile range from 0.04%-1.0% before centralization to 0.01%-0.27% after centralization. Centralization has provided the CTN with integrated trial status reporting and its first standards-based public data share. A preliminary cost-benefit analysis showed a 50% reduction in data management cost per study participant over the life of a trial. A single clinical trial network comprising addiction researchers and community treatment programs was assessed, so the findings may not be applicable to other research settings. The identified informatics components provide the information and infrastructure needed for our clinical trial network. Post-centralization data management operations are more efficient and less costly, with higher data quality.

  8. Size-dependent standard deviation for growth rates: Empirical results and theoretical modeling

    NASA Astrophysics Data System (ADS)

    Podobnik, Boris; Horvatic, Davor; Pammolli, Fabio; Wang, Fengzhong; Stanley, H. Eugene; Grosse, I.

    2008-05-01

    We study annual logarithmic growth rates R of various economic variables such as exports, imports, and foreign debt. For each of these variables we find that the distributions of R can be approximated by double exponential (Laplace) distributions in the central parts and power-law distributions in the tails. For each of these variables we further find a power-law dependence of the standard deviation σ(R) on the average size of the economic variable, with a scaling exponent surprisingly close to that found for the gross domestic product (GDP) [Phys. Rev. Lett. 81, 3275 (1998)]. By analyzing annual logarithmic growth rates R of wages of 161 different occupations, we find a power-law dependence of the standard deviation σ(R) on the average value of the wages, with a scaling exponent β≈0.14 close to those found for the growth of exports, imports, debt, and the growth of the GDP. In contrast to these findings, we observe for payroll data collected from 50 states of the USA that the standard deviation σ(R) of the annual logarithmic growth rate R increases monotonically with the average value of payroll. However, in this case too, we observe a power-law dependence of σ(R) on the average payroll, with a scaling exponent β≈-0.08. Based on these observations we propose a stochastic process for multiple cross-correlated variables where for each variable (i) the distribution of logarithmic growth rates decays exponentially in the central part, (ii) the distribution of the logarithmic growth rate decays algebraically in the far tails, and (iii) the standard deviation of the logarithmic growth rate depends algebraically on the average size of the stochastic variable.
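    The power-law dependence σ(R) ∝ S^β described above is conventionally estimated as the slope of a straight-line fit in log-log coordinates. A minimal sketch on synthetic data; the exponent, prefactor and scatter level below are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data generated from sigma(R) = c * S**beta with small
# multiplicative scatter; S plays the role of the average size of
# the economic variable (e.g. GDP, wages, payroll).
beta_true, c = -0.15, 2.0
S = np.logspace(2, 8, 40)
sigma = c * S ** beta_true * np.exp(rng.normal(0.0, 0.02, S.size))

# A power law is a straight line in log-log space, so the scaling
# exponent beta is the slope of a degree-1 least-squares fit.
beta_fit, log_c_fit = np.polyfit(np.log(S), np.log(sigma), 1)
```

    The fitted slope recovers the exponent used to generate the data to within the scatter, which is the basic procedure behind reported exponents such as β≈0.14 or β≈-0.08.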

  9. Size-dependent standard deviation for growth rates: empirical results and theoretical modeling.

    PubMed

    Podobnik, Boris; Horvatic, Davor; Pammolli, Fabio; Wang, Fengzhong; Stanley, H Eugene; Grosse, I

    2008-05-01

    We study annual logarithmic growth rates R of various economic variables such as exports, imports, and foreign debt. For each of these variables we find that the distributions of R can be approximated by double exponential (Laplace) distributions in the central parts and power-law distributions in the tails. For each of these variables we further find a power-law dependence of the standard deviation sigma(R) on the average size of the economic variable, with a scaling exponent surprisingly close to that found for the gross domestic product (GDP) [Phys. Rev. Lett. 81, 3275 (1998)]. By analyzing annual logarithmic growth rates R of wages of 161 different occupations, we find a power-law dependence of the standard deviation sigma(R) on the average value of the wages, with a scaling exponent beta approximately 0.14 close to those found for the growth of exports, imports, debt, and the growth of the GDP. In contrast to these findings, we observe for payroll data collected from 50 states of the USA that the standard deviation sigma(R) of the annual logarithmic growth rate R increases monotonically with the average value of payroll. However, in this case too, we observe a power-law dependence of sigma(R) on the average payroll, with a scaling exponent beta approximately -0.08. Based on these observations we propose a stochastic process for multiple cross-correlated variables where for each variable (i) the distribution of logarithmic growth rates decays exponentially in the central part, (ii) the distribution of the logarithmic growth rate decays algebraically in the far tails, and (iii) the standard deviation of the logarithmic growth rate depends algebraically on the average size of the stochastic variable.

  10. A randomized trial of protocol-based care for early septic shock.

    PubMed

    Yealy, Donald M; Kellum, John A; Huang, David T; Barnato, Amber E; Weissfeld, Lisa A; Pike, Francis; Terndrup, Thomas; Wang, Henry E; Hou, Peter C; LoVecchio, Frank; Filbin, Michael R; Shapiro, Nathan I; Angus, Derek C

    2014-05-01

    In a single-center study published more than a decade ago involving patients presenting to the emergency department with severe sepsis and septic shock, mortality was markedly lower among those who were treated according to a 6-hour protocol of early goal-directed therapy (EGDT), in which intravenous fluids, vasopressors, inotropes, and blood transfusions were adjusted to reach central hemodynamic targets, than among those receiving usual care. We conducted a trial to determine whether these findings were generalizable and whether all aspects of the protocol were necessary. In 31 emergency departments in the United States, we randomly assigned patients with septic shock to one of three groups for 6 hours of resuscitation: protocol-based EGDT; protocol-based standard therapy that did not require the placement of a central venous catheter, administration of inotropes, or blood transfusions; or usual care. The primary end point was 60-day in-hospital mortality. We tested sequentially whether protocol-based care (EGDT and standard-therapy groups combined) was superior to usual care and whether protocol-based EGDT was superior to protocol-based standard therapy. Secondary outcomes included longer-term mortality and the need for organ support. We enrolled 1341 patients, of whom 439 were randomly assigned to protocol-based EGDT, 446 to protocol-based standard therapy, and 456 to usual care. Resuscitation strategies differed significantly with respect to the monitoring of central venous pressure and oxygen and the use of intravenous fluids, vasopressors, inotropes, and blood transfusions. By 60 days, there were 92 deaths in the protocol-based EGDT group (21.0%), 81 in the protocol-based standard-therapy group (18.2%), and 86 in the usual-care group (18.9%) (relative risk with protocol-based therapy vs. usual care, 1.04; 95% confidence interval [CI], 0.82 to 1.31; P=0.83; relative risk with protocol-based EGDT vs. 
protocol-based standard therapy, 1.15; 95% CI, 0.88 to 1.51; P=0.31). There were no significant differences in 90-day mortality, 1-year mortality, or the need for organ support. In a multicenter trial conducted in the tertiary care setting, protocol-based resuscitation of patients in whom septic shock was diagnosed in the emergency department did not improve outcomes. (Funded by the National Institute of General Medical Sciences; ProCESS ClinicalTrials.gov number, NCT00510835.).
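    The reported relative risks and confidence intervals follow from the death counts given in the abstract via the standard Wald interval on the log scale. A minimal sketch reproducing the protocol-based care vs usual care comparison:

```python
import math

def relative_risk(events_a, n_a, events_b, n_b):
    """Relative risk of group A vs group B with a 95% Wald
    confidence interval computed on the log scale."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = rr * math.exp(-1.96 * se)
    hi = rr * math.exp(+1.96 * se)
    return rr, lo, hi

# Protocol-based care (EGDT and standard-therapy groups combined)
# vs usual care, using the 60-day death counts from the abstract:
# (92 + 81) deaths among (439 + 446) vs 86 deaths among 456.
rr, lo, hi = relative_risk(92 + 81, 439 + 446, 86, 456)
# rr, lo and hi round to the reported 1.04 (95% CI, 0.82 to 1.31)
```

    The same function applied to the EGDT vs protocol-based standard-therapy counts yields the trial's other reported comparison.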

  11. Definition, discrimination, diagnosis and treatment of central breathing disturbances during sleep.

    PubMed

    Randerath, Winfried; Verbraecken, Johan; Andreas, Stefan; Arzt, Michael; Bloch, Konrad E; Brack, Thomas; Buyse, Bertien; De Backer, Wilfried; Eckert, Danny Joel; Grote, Ludger; Hagmeyer, Lars; Hedner, Jan; Jennum, Poul; La Rovere, Maria Teresa; Miltz, Carla; McNicholas, Walter T; Montserrat, Josep; Naughton, Matthew; Pepin, Jean-Louis; Pevernagie, Dirk; Sanner, Bernd; Testelmans, Dries; Tonia, Thomy; Vrijsen, Bart; Wijkstra, Peter; Levy, Patrick

    2017-01-01

    The complexity of central breathing disturbances during sleep has become increasingly obvious. They present as central sleep apnoeas (CSAs) and hypopnoeas, periodic breathing with apnoeas, or irregular breathing in patients with cardiovascular, other internal or neurological disorders, and can emerge under positive airway pressure treatment or opioid use, or at high altitude. As yet, there is insufficient knowledge of the clinical features, pathophysiological background and consecutive algorithms for stepped-care treatment. Most recently, it has been discussed intensively whether CSA in heart failure is a "marker" of disease severity or a "mediator" of disease progression, and whether, and which type of, positive airway pressure therapy is indicated. In addition, disturbances of respiratory drive or of the translation of central impulses may result in hypoventilation associated with cerebral or neuromuscular diseases, or with severe diseases of the lung or thorax. These statements report the results of a European Respiratory Society Task Force addressing current diagnostic and therapeutic standards. The statements are based on a systematic review of the literature and a systematic two-step decision process. Although the Task Force does not make recommendations, it describes its current practice for the treatment of CSA in heart failure and hypoventilation. Copyright ©ERS 2017.

  12. Extraction and visualization of the central chest lymph-node stations

    NASA Astrophysics Data System (ADS)

    Lu, Kongkuo; Merritt, Scott A.; Higgins, William E.

    2008-03-01

    Lung cancer remains the leading cause of cancer death in the United States and is expected to account for nearly 30% of all cancer deaths in 2007. Central to the lung-cancer diagnosis and staging process is the assessment of the central chest lymph nodes. This assessment typically requires two major stages: (1) location of the lymph nodes in a three-dimensional (3D) high-resolution volumetric multi-detector computed-tomography (MDCT) image of the chest; (2) subsequent nodal sampling using transbronchial needle aspiration (TBNA). We describe a computer-based system for automatically locating the central chest lymph-node stations in a 3D MDCT image. Automated analysis methods are first run that extract the airway tree, airway-tree centerlines, aorta, pulmonary artery, lungs, key skeletal structures, and major-airway labels. This information provides geometrical and anatomical cues for localizing the major nodal stations. Our system demarcates these stations, conforming to criteria outlined for the Mountain and Wang standard classification systems. Visualization tools within the system then enable the user to interact with these stations to locate visible lymph nodes. Results derived from a set of human 3D MDCT chest images illustrate the usage and efficacy of the system.

  13. The Vanderbilt Holistic Face Processing Test: A short and reliable measure of holistic face processing

    PubMed Central

    Richler, Jennifer J.; Floyd, R. Jackie; Gauthier, Isabel

    2014-01-01

    Efforts to understand individual differences in high-level vision necessitate the development of measures that have sufficient reliability, which is generally not a concern in group studies. Holistic processing is central to research on face recognition and, more recently, to the study of individual differences in this area. However, recent work has shown that the most popular measure of holistic processing, the composite task, has low reliability. This is particularly problematic for the recent surge in interest in studying individual differences in face recognition. Here, we developed and validated a new measure of holistic face processing specifically for use in individual-differences studies. It avoids some of the pitfalls of the standard composite design and capitalizes on the idea that trial variability allows for better traction on reliability. Across four experiments, we refine this test and demonstrate its reliability. PMID:25228629

  14. 78 FR 77019 - Energy Conservation Program: Energy Conservation Standards for Certain Consumer Products

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-20

    ... Regulations the definitions for ``through-the-wall central air conditioner'' and ``through-the-wall central... superseded effective in 2006, and the now defunct references to the ``through-the-wall air conditioner and... definitions for ``through-the-wall central air conditioner'' and ``through-the-wall central air conditioning...

  15. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  16. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  17. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  18. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  19. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  20. Arterial stiffness, central hemodynamics, and cardiovascular risk in hypertension

    PubMed Central

    Palatini, Paolo; Casiglia, Edoardo; Gąsowski, Jerzy; Głuszek, Jerzy; Jankowski, Piotr; Narkiewicz, Krzysztof; Saladini, Francesca; Stolarz-Skrzypek, Katarzyna; Tikhonoff, Valérie; Van Bortel, Luc; Wojciechowska, Wiktoria; Kawecka-Jaszcz, Kalina

    2011-01-01

    This review summarizes several scientific contributions to the recent Satellite Symposium of the European Society of Hypertension, held in Milan, Italy. Arterial stiffening and its hemodynamic consequences can be easily and reliably measured using a range of noninvasive techniques. However, like blood pressure (BP) measurements, arterial stiffness should be measured carefully under standardized patient conditions. Carotid-femoral pulse wave velocity has been proposed as the gold standard for arterial stiffness measurement and is a well-recognized predictor of adverse cardiovascular outcome. Systolic BP and pulse pressure in the ascending aorta may be lower than pressures measured in the upper limb, especially in young individuals. A number of studies suggest a closer correlation of end-organ damage with central BP than with peripheral BP, and central BP may provide additional prognostic information regarding cardiovascular risk. Moreover, BP-lowering drugs can have differential effects on central aortic pressures and hemodynamics compared with brachial BP. This may explain the greater beneficial effect provided by newer antihypertensive drugs beyond peripheral BP reduction. Although many methodological problems still hinder the wide clinical application of parameters of arterial stiffness, these parameters will likely contribute to cardiovascular assessment and management in future clinical practice. Each of the above-mentioned parameters reflects a different characteristic of the atherosclerotic process, involving functional and/or morphological changes in the vessel wall. Therefore, acquiring simultaneous measurements of different parameters of vascular function and structure could theoretically enhance the power to improve risk stratification. Continuous technological effort is necessary to refine our methods of investigation in order to detect early arterial abnormalities. Arterial stiffness and its consequences represent the great challenge of the twenty-first century for affluent countries, and “de-stiffening” will be the goal of the next decades. PMID:22174583

  1. Measurement of HbA1c in multicentre diabetes trials - should blood samples be tested locally or sent to a central laboratory: an agreement analysis.

    PubMed

    Arch, Barbara N; Blair, Joanne; McKay, Andrew; Gregory, John W; Newland, Paul; Gamble, Carrol

    2016-10-24

    Glycated haemoglobin (HbA1c) is an important outcome measure in diabetes clinical trials. For multicentre designs, HbA1c can be measured locally at participating centres or by sending blood samples to a central laboratory. This study analyses the agreement between local and central measurements, using 1-year follow-up data collected in a multicentre randomised controlled trial (RCT) of newly diagnosed children with type 1 diabetes. HbA1c measurements were routinely analysed both locally and centrally at baseline and then at 3, 6, 9 and 12 months, with the data reported in mmol/mol. Agreement was assessed by calculating the bias and 95% limits of agreement, using the Bland-Altman analysis method. A predetermined benchmark for a clinically acceptable margin of error between measurements was subjectively set at ±10% for HbA1c, and the percentage of measurement pairs classified as clinically acceptable was calculated. Descriptive statistics were used to examine the agreement within centres. Treatment group was not considered. Five hundred and ninety pairs of measurements, representing 255 children and 15 trial centres across four follow-up time points, were compared. There was no significant bias: local measurements were an average of 0.16 mmol/mol (SD = 4.5, 95% CI -0.2 to 0.5) higher than central measurements. The 95% limits of agreement were -8.6 to 9.0 mmol/mol (local minus central). Eighty percent of local measurements were within ±10% of the corresponding central measurements. Some trial centres showed more variation in the differences between local and central measurements, with IQRs ranging from 3 to 9 mmol/mol; none indicated systematic bias. Variation in agreement between HbA1c measurements was greater than expected, although no overall bias was detected and standard deviations were similar. Discrepancies were present across all participating centres. These findings have implications for the comparison of standards of clinical care between centres, the design of future multicentre RCTs and existing quality assurance processes for HbA1c measurements. We recommend centralised HbA1c measurement in the multicentre clinical trial setting. EudraCT No. 2010-023792-25, registered on 4 November 2010.
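    The Bland-Altman quantities used in this study, the bias (mean paired difference) and the 95% limits of agreement (bias ± 1.96 SD of the differences), can be sketched as follows; the paired HbA1c values are invented for illustration, not trial data.

```python
import numpy as np

def bland_altman(local, central):
    """Bland-Altman agreement between paired measurements:
    returns (bias, lower limit, upper limit) in the data's units."""
    diff = np.asarray(local, float) - np.asarray(central, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)          # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative paired HbA1c values (mmol/mol), local vs central.
local   = [52.0, 58.0, 61.0, 49.0, 66.0]
central = [51.0, 59.0, 60.0, 50.0, 64.0]
bias, lower, upper = bland_altman(local, central)
```

    In the trial this calculation over 590 pairs gave a bias of 0.16 mmol/mol with limits of agreement of -8.6 to 9.0 mmol/mol.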

  2. Race to the Top Leaves Children and Future Citizens behind: The Devastating Effects of Centralization, Standardization, and High Stakes Accountability

    ERIC Educational Resources Information Center

    Onosko, Joe

    2011-01-01

    President Barack Obama's Race to the Top (RTT) is a profoundly flawed educational reform plan that increases standardization, centralization, and test-based accountability in our nation's schools. Following a brief summary of the interest groups supporting the plan, who is currently participating in this race, why so many states voluntarily…

  3. Comparing Standardized Test Scores among Arts-Integrated and Non-Arts Integrated Schools in Central Mississippi

    ERIC Educational Resources Information Center

    Dean, Darlene

    2014-01-01

    The topic of arts integration creates continuing dialog among educators and arts advocates. This study examined the degree to which student achievement was affected when arts education is limited or eliminated from schools to meet the mandates of NCLB (2001) legislation. Standardized test scores from 12 schools in Central Mississippi were used to…

  4. 42 CFR 412.64 - Federal rates for inpatient operating costs for Federal fiscal year 2005 and subsequent fiscal...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... the greater number of workers in the county commute if the rural county would otherwise be considered part of an urban area, under the standards for designating MSAs if the commuting rates used in... commute to (and, if applicable under the standards, from) the central county or central counties of all...

  5. 42 CFR 412.64 - Federal rates for inpatient operating costs for Federal fiscal year 2005 and subsequent fiscal...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the greater number of workers in the county commute if the rural county would otherwise be considered part of an urban area, under the standards for designating MSAs if the commuting rates used in... commute to (and, if applicable under the standards, from) the central county or central counties of all...

  6. 42 CFR 412.64 - Federal rates for inpatient operating costs for Federal fiscal year 2005 and subsequent fiscal...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... the greater number of workers in the county commute if the rural county would otherwise be considered part of an urban area, under the standards for designating MSAs if the commuting rates used in... commute to (and, if applicable under the standards, from) the central county or central counties of all...

  7. Prediction by regression and intrarange data scatter in surface-process studies

    USGS Publications Warehouse

    Toy, T.J.; Osterkamp, W.R.; Renard, K.G.

    1993-01-01

    Modeling is a major component of contemporary earth science, and regression analysis occupies a central position in the parameterization, calibration, and validation of geomorphic and hydrologic models. Although this methodology can be used in many ways, we are primarily concerned with the prediction of values of one variable from another. Examination of the literature reveals considerable inconsistency in the presentation of the results of regression analysis and the occurrence of patterns in the scatter of data points about the regression line. Both circumstances confound the use and evaluation of the models. Statisticians are well aware of the various problems associated with the use of regression analysis and offer improved practices; often, however, their guidelines are not followed. In light of these circumstances, and until standard criteria for model evaluation become established, we recommend, as a minimum, the inclusion of scatter diagrams, the standard error of the estimate, and the sample size when reporting the results of regression analyses for most surface-process studies. © 1993 Springer-Verlag.
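    The standard error of the estimate recommended above is sqrt(SS_res / (n - 2)) for a simple linear regression, i.e. the typical scatter of the data about the fitted line in the units of the predicted variable. A minimal sketch:

```python
import numpy as np

def standard_error_of_estimate(x, y):
    """Standard error of the estimate for a simple linear regression
    of y on x: sqrt(SS_res / (n - 2)), with n - 2 degrees of freedom
    because the slope and intercept are both estimated."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return np.sqrt((resid ** 2).sum() / (x.size - 2))
```

    Reporting this value alongside a scatter diagram and the sample size lets a reader judge prediction uncertainty directly, which a correlation coefficient alone does not convey.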

  8. A portable device for detecting fruit quality by diffuse reflectance Vis/NIR spectroscopy

    NASA Astrophysics Data System (ADS)

    Sun, Hongwei; Peng, Yankun; Li, Peng; Wang, Wenxiu

    2017-05-01

    Soluble solids content (SSC) is a major quality parameter of fruit that influences its flavor and texture. Although on-line, non-invasive detection of fruit quality has been reported, consumers currently desire portable devices. This study aimed to develop a portable device for accurate, real-time, and nondestructive determination of fruit quality factors based on diffuse reflectance Vis/NIR spectroscopy (520-950 nm). The hardware of the device consisted of four units: a light source unit, a spectral acquisition unit, a central processing unit, and a display unit. A halogen lamp was chosen as the light source. During operation, the hand-held probe was placed in contact with the surface of the fruit sample, forming a dark environment that shielded interfering outside light. Diffusely reflected light was collected and measured by a spectrometer (USB4000). An ARM (Advanced RISC Machines) processor, serving as the central processing unit, controlled all parts of the device and analyzed the spectral data. A liquid crystal display (LCD) touch screen was used to interface with users. To validate the device's reliability and stability, 63 apples were tested, 47 of which were chosen as the calibration set and the remainder as the prediction set. Their SSC reference values were measured with a refractometer. The spectral data acquired by the portable device were processed by standard normal variate (SNV) transformation and Savitzky-Golay (S-G) filtering to eliminate spectral noise. Partial least squares regression (PLSR) was then applied to build prediction models, and the best prediction result was achieved with a correlation coefficient (r) of 0.855 and a standard error of 0.6033 °Brix. The results demonstrated that this device is feasible for quantitative analysis of the soluble solids content of apples.
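    The SNV preprocessing step mentioned in the abstract is a simple per-spectrum normalization: each spectrum is centered on its own mean and scaled by its own standard deviation. A minimal sketch, with the function name and the choice of sample (n-1) standard deviation assumed rather than taken from the paper:

```python
import math

def snv(spectrum):
    """Standard normal variate transform of one spectrum:
    subtract the spectrum's own mean, divide by its own
    (sample, n-1) standard deviation."""
    n = len(spectrum)
    mean = sum(spectrum) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in spectrum) / (n - 1))
    return [(v - mean) / sd for v in spectrum]
```

    Because each spectrum is scaled by its own statistics, SNV suppresses multiplicative scatter differences between samples before the PLSR model is fit.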

  9. Distribution of magnesium in central nervous system tissue, trabecular and cortical bone in rats fed with unbalanced diets of minerals.

    PubMed

    Yasui, M; Yano, I; Yase, Y; Ota, K

    1990-11-01

    Recent epidemiological changes in patterns of foci of amyotrophic lateral sclerosis (ALS) in the Western Pacific suggest that environmental factors play a contributory role in the pathogenic process of this disorder. In this experimental study on rats, a similar situation of dietary mineral imbalance was created as is found in the soil and drinking water of these ALS foci with a low content of calcium (Ca) and magnesium (Mg) and a high content of aluminum (Al). In groups of rats fed a low Ca diet, low Ca-Mg diet, and low Ca-Mg plus high Al diet, serum Ca levels were found to be lower than those in a group fed a standard diet. Also, serum Mg levels were lower in the groups fed a low Ca-Mg diet and a low Ca-Mg plus high Al diet than in the groups fed a standard diet and only a low Ca diet. There was no significant difference in Mg content of central nervous system (CNS) tissues of groups fed unbalanced and standard diets, except for a significant decrease in Mg content of the spinal cord of rats fed a low Ca-Mg plus high Al diet. Mg content of the lumbar spine and cortical bone decreased in the unbalanced diet groups compared with that of a group fed a standard diet. These findings suggest that under the disturbed bone mineralization induced by unbalanced mineral diets, Mg may be mobilized from bone to maintain the level necessary for vital activity in soft tissues including CNS tissue.

  10. Coals of Hungary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Landis, E.R.; Rohrbacher, T.J.; Gluskoter, H.

    1999-07-01

    As part of the activities conducted under the U.S. Hungarian Science and Technology Fund, a total of 39 samples from five coal mines in Hungary were selected for standard coal analyses and major, minor and trace elements analysis. The mine areas sampled were selected to provide a spectrum of coal quality information for comparison with other coal areas in central Europe and worldwide. All of the areas are of major importance in the energy budget of Hungary. The five sample sites contain coal in rocks of Jurassic, Cretaceous, Eocene, Miocene, and Pliocene age. The coals, from four underground and one surface mine, range in rank from high volatile bituminous to lignite B. Most of the coal produced from the mines sampled is used to generate electricity. Some of the power plants that utilize the coals also provide heat for domestic and process usage. The standard coal analysis program is based on tests performed in accordance with standards of the American Society for Testing and Materials (ASTM). Proximate and ultimate analyses were supplemented by determinations of the heating value, equilibrium moisture, forms of sulfur, free-swelling index, ash fusion temperatures (both reducing and oxidizing), apparent specific gravity and Hardgrove Grindability index. The major, minor and trace element analyses were performed in accordance with standardized procedures of the U.S. Geological Survey. The analytical results will be available in the International Coal Quality Data Base of the USGS. The results of the program provide data for comparison with test data from Europe and information of value to potential investors or cooperators in the coal industry of Hungary and Central Europe.

  11. Cross-sectional description of nursing and midwifery pre-service education accreditation in east, central, and southern Africa in 2013.

    PubMed

    McCarthy, Carey F; Gross, Jessica M; Verani, Andre R; Nkowane, Annette M; Wheeler, Erica L; Lipato, Thokozire J; Kelley, Maureen A

    2017-07-24

    In 2013, the World Health Organization issued guidelines, Transforming and Scaling Up Health Professional Education and Training, to improve the quality and relevance of health professional pre-service education. Central to these guidelines was establishing and strengthening education accreditation systems. To establish what the current accreditation systems were for nursing and midwifery education and to highlight areas for strengthening them, a study was undertaken to document the pre-service accreditation policies, approaches, and practices in 16 African countries relative to the 2013 WHO guidelines. This study utilized a cross-sectional group survey with a standardized questionnaire administered to a convenience sample of approximately 70 nursing and midwifery leaders from 16 countries in east, central, and southern Africa. Each national delegation completed one survey together, representing the responses for their country. Almost all countries in this study (15; 94%) mandated pre-service nursing education accreditation. However, there was wide variation in who was responsible for accrediting programs. The percent of active programs accredited decreased by program level, from 80% for doctorate programs to 62% for masters nursing, 50% for degree nursing, and 35% for diploma nursing programs. The majority of countries indicated that accreditation processes were transparent (i.e., included stakeholder engagement (81%), self-assessment (100%), evaluation feedback (94%), and public disclosure (63%)) and that the processes were evaluated on a routine basis (69%). Over half of the countries (nine; 56%) reported limited financial resources as a barrier to increasing accreditation activities, and seven countries (44%) noted limited materials and technical expertise. In line with the 2013 WHO guidelines, there was a strong legal mandate for nursing education accreditation, as compared with the global average of 50%. Accreditation levels were low in the programs that produce the majority of the nurses in this region and were higher in public programs than in non-public programs. WHO guidelines for transparency and routine review were met more fully than those for standards-based and independent accreditation processes. The new global strategy, Workforce 2030, has renewed the focus on accreditation and provides an opportunity to strengthen pre-service accreditation and ensure the production of a qualified and relevant nursing workforce.

  12. The influence of (central) auditory processing disorder on the severity of speech-sound disorders in children.

    PubMed

    Vilela, Nadia; Barrozo, Tatiane Faria; Pagan-Neves, Luciana de Oliveira; Sanches, Seisse Gabriela Gandolfi; Wertzner, Haydée Fiszbein; Carvallo, Renata Mota Mamede

    2016-02-01

    To identify a cutoff value based on the Percentage of Consonants Correct-Revised index that could indicate the likelihood of a child with a speech-sound disorder also having a (central) auditory processing disorder, language, audiological and (central) auditory processing evaluations were administered. The participants were 27 subjects with speech-sound disorders aged 7 years to 10 years and 11 months, who were divided into two groups according to their (central) auditory processing evaluation results. When a (central) auditory processing disorder was present in association with a speech disorder, the children tended to have lower scores on phonological assessments. A greater severity of speech disorder was related to a greater probability of the child having a (central) auditory processing disorder. The use of a cutoff value for the Percentage of Consonants Correct-Revised index successfully distinguished between children with and without a (central) auditory processing disorder. The severity of speech-sound disorder in children was influenced by the presence of (central) auditory processing disorder. The attempt to identify a cutoff value based on a severity index was successful.

  13. The Exchange Data Communication System based on Centralized Database for the Meat Industry

    NASA Astrophysics Data System (ADS)

    Kobayashi, Yuichi; Taniguchi, Yoji; Terada, Shuji; Komoda, Norihisa

    We propose applying an EDI system based on a centralized database that supports conversion of code data to the meat industry. This system makes it possible to share exchange data on beef between enterprises, from producers to retailers, by using Web EDI technology. To convert codes efficiently, a sender's code is converted directly to a receiver's code using a code map. A system implementing this function went into operation in September 2004, and twelve enterprises, including retailers, processing traders, and wholesalers, were using it as of June 2005. In this system, the number of code maps, which determines the introductory cost of the code conversion function, was lower than the theoretical value and close to that of the case in which a standard code is mediated.
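    The direct code-map conversion described above can be pictured as a lookup keyed by the sender/receiver pair, with no intermediate standard code. All enterprise names and item codes below are hypothetical, chosen only to illustrate the mechanism:

```python
# Hypothetical code map: for each (sender, receiver) pair, the sender's
# item codes are mapped directly to the receiver's codes, with no
# intermediate standard code in between.
CODE_MAP = {
    ("producerA", "retailerB"): {"BEEF-001": "R-1001", "BEEF-002": "R-1002"},
    ("producerA", "wholesalerC"): {"BEEF-001": "W-55", "BEEF-002": "W-56"},
}

def convert_code(sender, receiver, code):
    """Translate a sender's item code straight into the receiver's code
    via the pairwise code map."""
    return CODE_MAP[(sender, receiver)][code]
```

    The trade-off the abstract measures follows from this structure: pairwise maps grow with the number of trading pairs, whereas a mediated standard code needs only one map per enterprise, so the observed number of maps is what drives the introductory cost.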

  14. Strategy for development of the Polish electricity sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dybowski, J.

    1995-12-01

    This paper presents the strategy for development of the Polish electricity sector, dealing with specific problems that are common to all of East Central Europe. In 1990 Poland adopted a restructuring program for the entire energy sector. Very ambitious plans were changed several times, but the main direction of change was preserved. The most difficult period of transformation is marked by several contradictions that have to be balanced. Electricity prices should increase in order to cover the modernization and development program, but society is not able to bear this burden in such a short time. Furthermore, the new environmental protection standards force the growth of the capital investment program, which sooner or later has to be passed through to electricity prices. New economic mechanisms have to be introduced to the electricity sector to replace the old, ineffective, centrally planned ones. This process has to follow slow management changes. Also, the introduction of a new electricity market is limited by those constraints. However, this process of change would not be possible without parallel governmental initiatives such as the preparation of a new energy law and a regulatory framework.

  15. Archive of digital boomer seismic reflection data collected during USGS field activities 95LCA03 and 96LCA02 in the Peace River of West-Central Florida, 1995 and 1996

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Tihansky, Ann B.; Lewelling, Bill R.; Flocks, James G.; Wiese, Dana S.; Kindinger, Jack G.; Harrison, Arnell S.

    2006-01-01

    In October and November of 1995 and February of 1996, the U.S. Geological Survey, in cooperation with the Southwest Florida Water Management District, conducted geophysical surveys of the Peace River in west-central Florida from east of Bartow to west of Arcadia. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, observers' logbooks, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  16. [Martin Heidegger, beneficence, health, and evidence based medicine--contemplations regarding ethics and complementary and alternative medicine].

    PubMed

    Oberbaum, Menachem; Gropp, Cornelius

    2015-03-01

    Beneficence is considered a core principle of medical ethics. Evidence Based Medicine (EBM) is used almost synonymously with beneficence and has become the gold standard of efficiency of conventional medicine. Conventional modern medicine and EBM in particular are based on what Heidegger called calculative thinking, whereas complementary medicine (CM) is often based on contemplative thinking according to Heidegger's distinction of different thinking processes. A central issue of beneficence is the striving for health and wellbeing. EBM is little concerned directly with wellbeing, though it does claim to aim at improving quality of life by correcting pathological processes and conditions like infectious diseases, ischemic heart disease but also hypertension and hyperlipidemia. On the other hand, wellbeing is central to therapeutic efforts of CM. Scientific methods to gauge results of EBM are quantitative and based on calculative thinking, while results of treatments with CM are expressed in a qualitative way and based on meditative thinking. In order to maximize beneficence it seems important and feasible to use both approaches, by combining EBM and CM in the best interest of the individual patient.

  17. Cost of and soil loss on "minimum-standard" forest truck roads constructed in the central Appalachians

    Treesearch

    J. N. Kochenderfer; G. W. Wendel; H. Clay Smith

    1984-01-01

    A "minimum-standard" forest truck road that provides efficient and environmentally acceptable access for several forest activities is described. Cost data are presented for eight of these roads constructed in the central Appalachians. The average cost per mile excluding gravel was $8,119. The range was $5,048 to $14,424. Soil loss was measured from several...

  18. Measurement Assurance for End-Item Users

    NASA Technical Reports Server (NTRS)

    Mimbs, Scott M.

    2008-01-01

    The goal of a Quality Management System (QMS) as specified in ISO 9001 and AS9100 is to assure the end product meets specifications and customer requirements. Measuring devices, often called measuring and test equipment (MTE), provide the evidence of product conformity to the prescribed requirements. Therefore, the processes that employ MTE can become a weak link in the overall QMS if proper attention is not given to the development and execution of these processes. Traditionally, calibration of MTE is given more focus in industry standards and process control efforts than the equally important proper usage of the same equipment. It is a common complaint of calibration laboratory personnel that MTE users are only interested in "a sticker." If the QMS requires the MTE "to demonstrate conformity of the product," then the quality of the measurement process must be adequate for the task. This leads to an ad hoc definition: measurement assurance is a discipline that assures that all processes, activities, environments, standards, and procedures involved in making a measurement produce a result that can be rigorously evaluated for validity and accuracy. To evaluate whether existing measurement processes are providing an adequate level of quality to support the decisions based upon the measurement data, an understanding of measurement assurance basics is essential. This topic is complementary to the calibration standard, ANSI/NCSL Z540.3-2006, which targets the calibration of MTE at the organizational level. This paper discusses general measurement assurance when MTE is used to provide evidence of product conformity; the target audience is therefore end-item users of MTE. A central focus of the paper is the verification of tolerances and the associated risks, so calibration professionals may also find the paper useful in communication with their customers, the MTE users.

  19. Exacerbated degradation and desertification of grassland in Central Asia

    NASA Astrophysics Data System (ADS)

    Zhang, G.; Xiao, X.; Biradar, C. M.; Dong, J.; Zhou, Y.; Qin, Y.; Zhang, Y.; Liu, F.; Ding, M.; Thomas, R. J.

    2016-12-01

    Grassland desertification is a complex process, including both state conversion (e.g., grasslands to deserts) and gradual within-state change (e.g., greenness dynamics). Existing studies generally did not separate the two components and analyzed them based on time series vegetation indices, which however cannot provide a clear and comprehensive picture for desertification. Here we proposed a desertification zone classification-based grassland degradation strategy to detect the grassland desertification process in Central Asia. First, annual spatially explicit maps of grasslands and deserts were generated to track the conversion between grasslands and deserts. The results showed that 13 % of grasslands were converted to deserts from 2000 to 2014, with an increasing desertification trend northward in the latitude range of 43-48°N. Second, a fragile and unstable Transitional zone was identified in southern Kazakhstan based on desert frequency maps. Third, gradual vegetation dynamics during the thermal growing season (EVITGS) were investigated using linear regression and Mann-Kendall approaches. The results indicated that grasslands generally experienced widespread degradation in Central Asia, with an additional hotspot identified in the northern Kazakhstan. Finally, attribution analyses of desertification were conducted by correlating vegetation dynamics with three different drought indices (Palmer Drought Severity Index (PDSI), Standardized Precipitation Index (SPI), and Drought Severity Index (DSI)), precipitation, and temperature, and showed that grassland desertification was exacerbated by droughts, and persistent drought was the main factor for grassland desertification in Central Asia. This study provided essential information for taking practical actions to prevent the further desertification and targeting right spots for better intervention to combat the land degradation in the region.

  20. Investing in innovation: trade-offs in the costs and cost-efficiency of school feeding using community-based kitchens in Bangladesh.

    PubMed

    Gelli, Aulo; Suwa, Yuko

    2014-09-01

    School feeding programs have been a key response to the recent food and economic crises and function to some degree in nearly every country in the world. However, school feeding programs are complex and exhibit different, context-specific models or configurations. To examine the trade-offs, including the costs and cost-efficiency, of an innovative cluster kitchen implementation model in Bangladesh, a standardized framework was used. A supply chain framework based on international standards provided benchmarks for meaningful comparisons across models. Implementation processes specific to the program in Bangladesh were mapped against this reference to provide a basis for standardized performance measures. Qualitative and quantitative data on key metrics were collected retrospectively using semistructured questionnaires following an ingredients approach, including both financial and economic costs. Costs were standardized to a 200-feeding-day year and 700 kcal daily. The cluster kitchen model had similarities with the semidecentralized and outsourced models in the literature, the main differences involving implementation scale, scale of purchasing volumes, and frequency of purchasing. Two important features stand out in terms of implementation: the nutritional quality of meals and the level of community involvement. The standardized full cost per child per year was US$110. Despite the nutritious content of the meals, the overall cost-efficiency in cost per nutrient output was lower than the benchmark for centralized programs, due mainly to support and start-up costs. Cluster kitchens provide an example of an innovative implementation model, combining an emphasis on quality meal delivery with strong community engagement. However, the standardized costs per child were above the average benchmarks for both low- and middle-income countries. In contrast to the existing benchmark data from mature, centralized models, the main cost drivers of the program were associated with support and start-up activities. Further research is required to better understand changes in cost drivers as programs mature.

  1. Power iteration ranking via hybrid diffusion for vital nodes identification

    NASA Astrophysics Data System (ADS)

    Wu, Tao; Xian, Xingping; Zhong, Linfeng; Xiong, Xi; Stanley, H. Eugene

    2018-09-01

    One of the most interesting challenges in network science is to understand the relation between network structure and the dynamics on it, and many topological properties, including degree distribution, community strength and clustering coefficient, have been proposed in the last decade. Prominent in this context are centrality measures, which aim to quantify the relative importance of individual nodes in the overall topology with regard to network organization and function. However, most previous centrality measures are based on different concepts, and each focuses on a specific structural feature of networks. Thus, straightforward standard methods may introduce bias into node importance measures. In this paper, we introduce two physical processes with potential complementarity between them. We then combine them into an elegant integration with the classic eigenvector centrality framework to improve the accuracy of node ranking. To test the resulting power iteration ranking (PIRank) algorithm, we apply it to the selection of attack targets in the network optimal-attack problem. Extensive experimental results on synthetic networks and real-world networks suggest that the proposed centrality performs better than other well-known measures. Moreover, compared with eigenvector centrality, the PIRank algorithm achieves about thirty percent performance improvement while keeping similar running time. Our experiment on random networks also shows that the PIRank algorithm avoids the localization phenomenon of eigenvector centrality, in particular for networks with high-degree hubs.
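    The eigenvector-centrality core that PIRank builds on is plain power iteration on the adjacency matrix. A minimal sketch of that baseline follows; the PIRank-specific hybrid-diffusion step is not reproduced here, and the function name, normalization choice, and stopping rule are assumptions:

```python
def eigenvector_centrality(adj, iters=200, tol=1e-12):
    """Power iteration on an adjacency matrix given as a list of lists.
    Returns L1-normalized leading-eigenvector scores. Assumes a
    connected, non-bipartite graph so the iteration converges."""
    n = len(adj)
    x = [1.0 / n] * n  # uniform starting vector
    for _ in range(iters):
        # One multiplication by the adjacency matrix: y = A x
        y = [sum(adj[i][j] * x[j] for j in range(n)) for i in range(n)]
        s = sum(y) or 1.0
        y = [v / s for v in y]  # renormalize to keep scores comparable
        if max(abs(a - b) for a, b in zip(x, y)) < tol:
            return y
        x = y
    return x
```

    On a triangle with a pendant node, for example, the triangle vertex carrying the pendant scores highest and the pendant lowest, matching the intuition that eigenvector centrality rewards connections to well-connected neighbors.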

  2. The importance of becoming double-stranded: Innate immunity and the kinetic model of HIV-1 central plus strand synthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poeschla, Eric, E-mail: poeschla.eric@mayo.edu

    Central initiation of plus strand synthesis is a conserved feature of lentiviruses and certain other retroelements. This complication of the standard reverse transcription mechanism produces a transient “central DNA flap” in the viral cDNA, which has been proposed to mediate its subsequent nuclear import. This model has assumed that the important feature is the flapped DNA structure itself rather than the process that produces it. Recently, an alternative kinetic model was proposed. It posits that central plus strand synthesis functions to accelerate conversion to the double-stranded state, thereby helping HIV-1 to evade single-strand DNA-targeting antiviral restrictions such as APOBEC3 proteins, and perhaps to avoid innate immune sensor mechanisms. The model is consistent with evidence that lentiviruses must often synthesize their cDNAs when dNTP concentrations are limiting and with data linking reverse transcription and uncoating. There may be additional kinetic advantages for the artificial genomes of lentiviral gene therapy vectors. Highlights: • Two main functional models for HIV central plus strand synthesis have been proposed. • In one, a transient central DNA flap in the viral cDNA mediates HIV-1 nuclear import. • In the other, multiple kinetic consequences are emphasized. • One is defense against APOBEC3G, which deaminates single-stranded DNA. • Future questions pertain to antiviral restriction, uncoating and nuclear import.

  3. Concept of Integrated Information Systems of Rail Transport

    NASA Astrophysics Data System (ADS)

    Siergiejczyk, Mirosław; Gago, Stanisław

    This paper presents the need to create integrated information systems for rail transport and their links with other means of public transportation. The IT standards expected to underpin such integrated information systems are discussed. The main tasks of centralized information systems are also presented, along with the concept of their architecture, business processes and their implementation, and the proposed measures to secure data. A method is proposed for implementing a system to inform rail transport participants under Polish conditions.

  4. [Biobanking requirements from the perspective of the clinician : Experiences in hematology and oncology].

    PubMed

    Koschmieder, S; Brümmendorf, T H

    2018-04-05

    The requirements for optimal biobanking from the point of view of the clinical partner can be highly variable. Depending on the material, processing, storage conditions, clinical data, and involvement of external partners, there will be special requirements for the participating clinician and specialist areas. What they all have in common is that the goal of any biobanking must be to improve clinical, translational, and basic research. While in the past biomaterials often had to be individually stored for each research project, modern biobanking offers decisive advantages: a comprehensive ethics vote fulfilling state-of-the-art data safety requirements, standardized processing and storage protocols, specialized biobank software for pseudonymization and localization, protection against power failures and defects of the equipment, centralized and sustainable storage, easy localization and return of samples, and their destruction or anonymization after completion of an individual project. In addition to this important pure storage function, central biobanking can provide a link to clinical data as well as the anonymous use of samples for project-independent research. Both biobank functions serve different purposes, are associated with specific requirements, and should be pursued in parallel. If successful, central biomaterial management can achieve a sustainable improvement of academic and non-academic biomedical research and the optimal use of resources. The close collaboration between clinicians and non-clinicians is a crucial prerequisite for this.

  5. Development and validation of a novel hydrolysis probe real-time polymerase chain reaction for agamid adenovirus 1 in the central bearded dragon (Pogona vitticeps).

    PubMed

    Fredholm, Daniel V; Coleman, James K; Childress, April L; Wellehan, James F X

    2015-03-01

    Agamid adenovirus 1 (AgAdv-1) is a significant cause of disease in bearded dragons (Pogona sp.). Clinical manifestations of AgAdv-1 infection are variable and often nonspecific; the manifestations range from lethargy, weight loss, and inappetence, to severe enteritis, hepatitis, and sudden death. Currently, diagnosis of AgAdv-1 infection is achieved through a single published method: standard nested polymerase chain reaction (nPCR) and sequencing. Standard nPCR with sequencing provides reliable sensitivity, specificity, and validation of PCR products. However, this process is comparatively expensive, laborious, and slow. Probe hybridization, as used in a TaqMan assay, represents the best option for validating PCR products aside from the time-consuming process of sequencing. This study developed a real-time PCR (qPCR) assay using a TaqMan probe-based assay, targeting a highly conserved region of the AgAdv-1 genome. Standard curves were generated, detection results were compared with the gold standard conventional PCR and sequencing assay, and limits of detection were determined. Additionally, the qPCR assay was run on samples known to be positive for AgAdv-1 and samples known to be positive for other adenoviruses. Based on the results of these evaluations, this assay allows for a less expensive, rapid, quantitative detection of AgAdv-1 in bearded dragons. © 2015 The Author(s).
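    A qPCR assay of the kind described is typically quantified against a standard curve of Ct versus log10 template copies, with amplification efficiency derived from the slope. The sketch below is a generic illustration of that standard-curve arithmetic, not the authors' code; the function names are assumptions, while the efficiency formula 10^(-1/slope) - 1 is the conventional one:

```python
def fit_standard_curve(log10_copies, ct_values):
    """Least-squares line Ct = slope * log10(copies) + intercept,
    plus the amplification efficiency implied by the slope
    (1.0 means perfect doubling each cycle)."""
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(ct_values) / n
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
    slope = sxy / sxx
    intercept = my - slope * mx
    efficiency = 10 ** (-1.0 / slope) - 1
    return slope, intercept, efficiency

def quantify(ct, slope, intercept):
    """Invert the standard curve: copies = 10**((Ct - intercept) / slope)."""
    return 10 ** ((ct - intercept) / slope)
```

    With a perfectly efficient assay the slope is about -3.32 cycles per decade of template, which is the usual sanity check when the curve is generated from a dilution series.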

  6. Current controlled vocabularies are insufficient to uniquely map molecular entities to mass spectrometry signal.

    PubMed

    Smith, Rob; Taylor, Ryan M; Prince, John T

    2015-01-01

    The comparison of analyte mass spectrometry precursor (MS1) signal is central to many proteomic (and other -omic) workflows. Standard vocabularies for mass spectrometry exist and provide good coverage for most experimental applications, yet they are insufficient for concise and unambiguous description of data concepts spanning the range of signal provenance from a molecular perspective (e.g. from charged peptides down to fine isotopes). Without a standard unambiguous nomenclature, literature searches, algorithm reproducibility and algorithm evaluation for MS-omics data processing are nearly impossible. We show how terms from current official ontologies are too vague or ambiguous to explicitly map molecular entities to MS signals, and we illustrate the inconsistency and ambiguity of current colloquially used terms. We also propose a set of terms for MS1 signal that uniquely, succinctly and intuitively describe data concepts spanning the range of signal provenance from full molecule down to fine isotopes. We propose a novel nomenclature that spans the range of the required granularity to describe MS data processing from the perspective of the molecular provenance of the MS signal. The proposed nomenclature provides a chain of succinct and unique terms spanning the signal created by a charged molecule down through each of its constituent subsignals. We suggest that additional community discussion of these terms should precede any further standardization efforts.

  7. Standardized, Interdepartmental, Simulation-Based Central Line Insertion Course Closes an Educational Gap and Improves Intern Comfort with the Procedure.

    PubMed

    Grudziak, Joanna; Herndon, Blair; Dancel, Ria D; Arora, Harendra; Tignanelli, Christopher J; Phillips, Michael R; Crowner, Jason R; True, Nicholas A; Kiser, Andy C; Brown, Rebecca F; Goodell, Harry P; Murty, Neil; Meyers, Michael O; Montgomery, Sean P

    2017-06-01

    Central line placement is a common procedure, routinely performed by junior residents in medical and surgical departments. Before this project, no standardized instructional course on the insertion of central lines existed at our institution, and few interns had received formal ultrasound training. Interns from five departments participated in a simulation-based central line insertion course. Intern familiarity with the procedure and with ultrasound, as well as their prior experience with line placement and their level of comfort, was assessed. Of the 99 interns in participating departments, 45 per cent had been trained as of October 2015. Forty-one per cent were female. The majority (59.5%) had no prior formal ultrasound training, and 46.0 per cent had never placed a line as primary operator. Scores increased significantly, from a precourse score mean of 13.7 to a postcourse score mean of 16.1, P < 0.001. All three of the self-reported measures of comfort with ultrasound also improved significantly. All interns reported the course was "very much" helpful, and 100 per cent reported they felt "somewhat" or "much" more comfortable with the procedure after attendance. To our knowledge, this is the first hospital-wide, standardized, simulation-based central line insertion course in the United States. Preliminary results indicate overwhelming satisfaction with the course, better ultrasound preparedness, and improved comfort with central line insertion.

  8. High Available COTS Based Computer for Space

    NASA Astrophysics Data System (ADS)

    Hartmann, J.; Magistrati, Giorgio

    2015-09-01

The availability and reliability of a system are central requirements of its target application. From a simple fuel-injection system in a car up to the flight control system of an autonomously navigating spacecraft, each application defines its specific availability factor under its own boundary conditions. Increasing quality requirements on data processing systems used in space flight applications call for new architectures that fulfill availability and reliability demands as well as the growing need for data processing power. At the same time, customers clearly need simplification and the use of COTS components to decrease costs, while keeping interface compatibility with currently used system standards. Data processing system design is mostly dominated by strict fulfillment of customer requirements, and reuse of available computer systems has not always been possible because of obsolescence of EEE parts, insufficient I/O capabilities, or the fact that available data processing systems did not provide the required scalability and performance.

  9. Corporate governance and the adoption of health information technology within integrated delivery systems.

    PubMed

    Baird, Aaron; Furukawa, Michael F; Rahman, Bushra; Schneller, Eugene S

    2014-01-01

    Although several previous studies have found "system affiliation" to be a significant and positive predictor of health information technology (IT) adoption, little is known about the association between corporate governance practices and adoption of IT within U.S. integrated delivery systems (IDSs). Rooted in agency theory and corporate governance research, this study examines the association between corporate governance practices (centralization of IT decision rights and strategic alignment between business and IT strategy) and IT adoption, standardization, and innovation within IDSs. Cross-sectional, retrospective analyses using data from the 2011 Health Information and Management Systems Society Analytics Database on adoption within IDSs (N = 485) is used to analyze the correlation between two corporate governance constructs (centralization of IT decision rights and strategic alignment) and three IT constructs (adoption, standardization, and innovation) for clinical and supply chain IT. Multivariate fractional logit, probit, and negative binomial regressions are applied. Multivariate regressions controlling for IDS and market characteristics find that measures of IT adoption, IT standardization, and innovative IT adoption are significantly associated with centralization of IT decision rights and strategic alignment. Specifically, centralization of IT decision rights is associated with 22% higher adoption of Bar Coding for Materials Management and 30%-35% fewer IT vendors for Clinical Data Repositories and Materials Management Information Systems. 
A combination of centralization and clinical IT strategic alignment is associated with 50% higher Computerized Physician Order Entry adoption, and centralization along with supply chain IT strategic alignment is significantly negatively correlated with Radio Frequency Identification adoption. Although IT adoption and standardization are likely to benefit from corporate governance practices within IDSs, innovation is likely to be delayed. In addition, corporate governance is not one-size-fits-all, and contingencies are important considerations.

  10. Gamification and Microlearning for Engagement With Quality Improvement (GAMEQI): A Bundled Digital Intervention for the Prevention of Central Line-Associated Bloodstream Infection.

    PubMed

    Orwoll, Benjamin; Diane, Shelley; Henry, Duncan; Tsang, Lisa; Chu, Kristin; Meer, Carrie; Hartman, Kevin; Roy-Burman, Arup

    Central line-associated bloodstream infections (CLABSIs) cause major patient harm, preventable through attention to line care best practice standards. The objective was to determine if a digital self-assessment application (CLABSI App), bundling line care best practices with social gamification and in-context microlearning, could engage nurses in CLABSI prevention. Nurses caring for children with indwelling central venous catheters in 3 high-risk units were eligible to participate. All other units served as controls. The intervention was a 12-month nonrandomized quality improvement study of CLABSI App implementation with interunit competitions. Compared to the preceding year, the intervention group (9886 line days) CLABSI rate decreased by 48% ( P = .03). Controls (7879 line days) did not change significantly. In all, 105 unique intervention group nurses completed 673 self-assessments. Competitions were associated with increased engagement as measured by self-assessments and unique participants. This model could be extended to other health care-associated infections, and more broadly to process improvement within and across health care systems.

  11. Improving Prediction Accuracy of “Central Line-Associated Blood Stream Infections” Using Data Mining Models

    PubMed Central

    Noaman, Amin Y.; Jamjoom, Arwa; Al-Abdullah, Nabeela; Nasir, Mahreen; Ali, Anser G.

    2017-01-01

Prediction of nosocomial infections among patients is an important part of clinical surveillance programs, enabling the related personnel to take preventive actions in advance. Designing a clinical surveillance program capable of predicting nosocomial infections is a challenging task for several reasons, including the high dimensionality of medical data, heterogeneous data representation, and the special knowledge required to extract patterns for prediction. In this paper, we present details of six data mining methods implemented using the cross-industry standard process for data mining (CRISP-DM) to predict central line-associated blood stream infections. For our study, we selected datasets of healthcare-associated infections from the US National Healthcare Safety Network and consumer survey data from the Hospital Consumer Assessment of Healthcare Providers and Systems. Our experiments show that central line-associated blood stream infections (CLABSIs) can be successfully predicted using the AdaBoost method with an accuracy of up to 89.7%. This will help in implementing effective clinical surveillance programs for infection control, as well as improving the accuracy of CLABSI detection, reducing the cost of patients' hospital stays and maintaining patient safety. PMID:29085836
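The boosted-classifier approach described in this record can be sketched as follows. This is a minimal illustration on synthetic data only; the study used real NHSN infection records, and the feature names here (line days, ICU stay, device utilization) are hypothetical stand-ins.

```python
# Sketch of AdaBoost-based CLABSI prediction on synthetic data.
# Features and labels are invented for illustration, not from the study.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.integers(0, 30, n),   # hypothetical: central line days
    rng.integers(0, 2, n),    # hypothetical: ICU stay flag
    rng.random(n),            # hypothetical: device utilization ratio
])
# Synthetic label loosely tied to line days and utilization
y = (X[:, 0] + 10 * X[:, 2] + rng.normal(0, 5, n) > 15).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {accuracy_score(y_te, clf.predict(X_te)):.3f}")
```

On real surveillance data the same pattern applies, with the feature engineering and evaluation steps organized along the CRISP-DM phases the paper follows.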

  12. Dendritic cells for active immunotherapy: optimizing design and manufacture in order to develop commercially and clinically viable products.

    PubMed

    Nicolette, C A; Healey, D; Tcherepanova, I; Whelton, P; Monesmith, T; Coombs, L; Finke, L H; Whiteside, T; Miesowicz, F

    2007-09-27

    Dendritic cell (DC) active immunotherapy is potentially efficacious in a broad array of malignant disease settings. However, challenges remain in optimizing DC-based therapy for maximum clinical efficacy within manufacturing processes that permit quality control and scale-up of consistent products. In this review we discuss the critical issues that must be addressed in order to optimize DC-based product design and manufacture, and highlight the DC based platforms currently addressing these issues. Variables in DC-based product design include the type of antigenic payload used, DC maturation steps and activation processes, and functional assays. Issues to consider in development include: (a) minimizing the invasiveness of patient biological material collection; (b) minimizing handling and manipulations of tissue at the clinical site; (c) centralized product manufacturing and standardized processing and capacity for commercial-scale production; (d) rapid product release turnaround time; (e) the ability to manufacture sufficient product from limited starting material; and (f) standardized release criteria for DC phenotype and function. Improvements in the design and manufacture of DC products have resulted in a handful of promising leads currently in clinical development.

  13. Evaluating concentration estimation errors in ELISA microarray experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daly, Don S.; White, Amanda M.; Varnum, Susan M.

Enzyme-linked immunosorbent assay (ELISA) is a standard immunoassay used to predict a protein concentration in a sample. Deploying ELISA in a microarray format permits simultaneous prediction of the concentrations of numerous proteins in a small sample. These predictions, however, are uncertain due to processing error and biological variability. Evaluating prediction error is critical to interpreting biological significance and improving the ELISA microarray process, and it must be automated to realize a reliable high-throughput ELISA microarray system. In this paper, we present a statistical method based on propagation of error to evaluate prediction errors in the ELISA microarray process. Although propagation of error is central to this method, it is effective only when comparable data are available. Therefore, we briefly discuss the roles of experimental design, data screening, normalization and statistical diagnostics when evaluating ELISA microarray prediction errors. We use an ELISA microarray investigation of breast cancer biomarkers to illustrate the evaluation of prediction errors. The illustration begins with a description of the design and resulting data, followed by a brief discussion of data screening and normalization. We then fit a standard curve to the screened and normalized data, review the modeling diagnostics, and apply propagation of error.
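The core idea, inverting a fitted standard curve and propagating the measurement and curve-fit uncertainty through to the concentration estimate, can be sketched as below. This is a simplified illustration with invented calibration numbers and a linear log-log curve; real ELISA standard curves are often four-parameter logistic, and the paper's full method also covers screening and normalization.

```python
# Minimal propagation-of-error sketch for a concentration estimate read
# off a fitted standard curve. Calibration data are hypothetical.
import numpy as np

conc = np.array([1, 2, 4, 8, 16, 32, 64.0])          # known standards
signal = np.array([110, 205, 420, 810, 1600, 3100, 6400.0])

# Fit log(signal) = a + b*log(conc), keeping the parameter covariance
(b, a), cov = np.polyfit(np.log(conc), np.log(signal), 1, cov=True)

def predict_conc(y, y_sd):
    """Invert the curve; propagate signal and parameter uncertainty."""
    ly = np.log(y)
    lx = (ly - a) / b                       # point estimate, log scale
    # Partial derivatives of lx w.r.t. (ly, a, b)
    d_ly, d_a, d_b = 1 / b, -1 / b, -(ly - a) / b**2
    var = (d_ly * y_sd / y) ** 2 \
        + d_a**2 * cov[1, 1] + d_b**2 * cov[0, 0] \
        + 2 * d_a * d_b * cov[0, 1]
    # Delta-method back-transform from the log scale
    return np.exp(lx), np.exp(lx) * np.sqrt(var)

est, err = predict_conc(1200.0, 60.0)
print(f"estimated concentration: {est:.2f} +/- {err:.2f}")
```

The same delta-method bookkeeping extends to nonlinear curve models; only the partial derivatives change.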

  14. Terminological aspects of data elements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strehlow, R.A.; Kenworthey, W.H. Jr.; Schuldt, R.E.

    1991-01-01

The creation and display of data comprise a process that involves a sequence of steps requiring both semantic and systems analysis. An essential early step in this process is the choice, definition, and naming of data element concepts, followed by the specification of other needed data element concept attributes. These attributes and their values remain associated with a data element concept from its birth as a concept to a generic data element that serves as a template for final application. Terminology is therefore centrally important to the entire data creation process. Smooth mapping from natural language to a database is a critical aspect of database design, and consequently it requires terminology standardization from the outset of database work. In this paper the semantic aspects of data elements are analyzed and discussed. Seven kinds of data element concept information are considered, and those that require terminological development and standardization are identified. The four terminological components of a data element are the hierarchical type of a concept, functional dependencies, schemata showing conceptual structures, and definition statements. These constitute the conventional role of terminology in database design. 12 refs., 8 figs., 1 tab.

  15. Evaluating environmental survivability of optical coatings

    NASA Astrophysics Data System (ADS)

    Joseph, Shay; Yadlovker, Doron; Marcovitch, Orna; Zipin, Hedva

    2009-05-01

In this paper we report ongoing research to correlate optical coating survivability with military (MIL) standards. For this purpose, 8 different types of coatings were deposited on 1" substrates of sapphire, multi-spectral ZnS (MS-ZnS), germanium, silicon and BK7. All coatings underwent MIL standard evaluation as defined by customer specifications and passed successfully. Two other sets were left to age for 12 months at two different locations, one near central Tel-Aviv and one by the shoreline of the Mediterranean Sea. A third set was aged for 2000 hours in a special environmental chamber simultaneously simulating temperature, humidity and ultra-violet (UV) radiation conditions. Measurements of optical transmission before and after aging from all 3 sets reveal, in some cases, major transmission loss indicating severe coating damage. The different aging methods and their relation to the MIL standards are discussed in detail. The most pronounced conclusion is that MIL standards alone are not sufficient for predicting the lifetime of an external coated optical element and are only useful for certifying the coating process and for comparison between coatings.

  16. The governance of quality management in Dutch health care: new developments and strategic challenges.

    PubMed

    Maarse, J A M; Ruwaard, D; Spreeuwenberg, C

    2013-01-01

    This article gives a brief sketch of quality management in Dutch health care. Our focus is upon the governance of guideline development and quality measurement. Governance is conceptualized as the structure and process of steering of quality management. The governance structure of guideline development in the Netherlands can be conceptualized as a network without central coordination. Much depends upon the self-initiative of stakeholders. A similar picture can be found in quality measurement. Special attention is given to the development of care standards for chronic disease. Care standards have a broader scope than guidelines and take an explicit patient perspective. They not only contain evidence-based and up-to-date guidelines for the care pathway but also contain standards for self-management. Furthermore, they comprise a set of indicators for measuring the quality of care of the entire pathway covered by the standard. The final part of the article discusses the mission, tasks and strategic challenges of the newly established National Health Care Institute (Zorginstituut Nederland), which is scheduled to be operative in 2013.

  17. Central Asia Water (CAWa) - A visualization platform for hydro-meteorological sensor data

    NASA Astrophysics Data System (ADS)

    Stender, Vivien; Schroeder, Matthias; Wächter, Joachim

    2014-05-01

    Water is an indispensable necessity of life for people all over the world. In Central Asia, water is the key factor for economic development, but it is already a scarce resource in the region, and in the face of climate change, managing the water problem will be a major challenge for the future. The regional research network "Central Asia Water" (CAWa) aims at providing a scientific basis for transnational water resources management for the five Central Asian states Kyrgyzstan, Uzbekistan, Tajikistan, Turkmenistan and Kazakhstan. CAWa is part of the Central Asia Water Initiative (also known as the Berlin Process), which was launched by the Federal Foreign Office on 1 April 2008 at the "Water Unites" conference in Berlin. To produce future scenarios and strategies for sustainable water management, data on water reserves and on the use of water in Central Asia must be collected consistently across the region. Hydro-meteorological stations equipped with sophisticated sensors have been installed in Central Asia and send their data via real-time satellite communication to the operation centre of the monitoring network and to the participating National Hydro-meteorological Services.[1] The challenge for CAWa is to integrate all aspects of data management, data workflows, data modeling and visualization in a properly designed monitoring infrastructure. The use of standardized interfaces to support data transfer and interoperability is essential in CAWa. 
Uniform treatment of sensor data is realized with the OGC Sensor Web Enablement (SWE) framework, which makes a number of standards and interface definitions available: the Observations & Measurements (O&M) model for the description of observations and measurements, the Sensor Model Language (SensorML) for the description of sensor systems, the Sensor Observation Service (SOS) for obtaining sensor observations, the Sensor Planning Service (SPS) for tasking sensors, the Web Notification Service (WNS) for asynchronous dialogues and the Sensor Alert Service (SAS) for sending alerts. An open-source web platform bundles the data provided by the SWE web services of the hydro-meteorological stations and provides tools for data visualization and data access. The visualization tool was implemented using open-source tools such as GeoExt/ExtJS and OpenLayers. With the application, users can query the relevant sensor data, select a parameter and time period, visualize the data and finally download them. [1] http://www.cawa-project.net
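As an illustration of the SOS interface mentioned above, a client can retrieve observations through a key-value-pair GetObservation request. The endpoint, offering and property names below are placeholders, not the actual CAWa service identifiers.

```python
# Hypothetical sketch of a SOS 2.0 KVP GetObservation request URL.
# Endpoint, offering and observedProperty are placeholder names.
from urllib.parse import urlencode

SOS_ENDPOINT = "https://example.org/sos"  # placeholder, not the CAWa URL

def get_observation_url(offering, observed_property, start, end):
    """Build a SOS 2.0 KVP GetObservation request URL."""
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        "temporalFilter": f"om:phenomenonTime,{start}/{end}",
    }
    return f"{SOS_ENDPOINT}?{urlencode(params)}"

url = get_observation_url("station_example", "air_temperature",
                          "2013-06-01T00:00:00Z", "2013-06-30T23:59:59Z")
print(url)
```

The response is an O&M-encoded observation document, which a portal such as the one described here parses for plotting and download.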

  18. 78 FR 54394 - Determination of Attainment for the West Central Pinal Nonattainment Area for the 2006 Fine...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-04

    ...] Determination of Attainment for the West Central Pinal Nonattainment Area for the 2006 Fine Particle Standard... Central Pinal nonattainment area in Arizona has attained the 2006 24-hour fine particle (PM 2.5 ) National... NAAQS \\2\\ for [[Page 54395

  19. SU-F-T-476: Performance of the AS1200 EPID for Periodic Photon Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeMarco, J; Fraass, B; Yang, W

    2016-06-15

    Purpose: To assess the dosimetric performance of a new amorphous-silicon flat-panel electronic portal imaging device (EPID) suitable for high-intensity, flattening-filter-free delivery mode. Methods: An EPID-based QA suite was created with automation to periodically monitor photon central-axis output and two-dimensional beam-profile constancy as a function of gantry angle and dose rate. A Varian TrueBeam linear accelerator with Developer Mode installed was used to customize and deliver XML script routines for the QA suite, using dosimetry-mode image acquisition for an aS1200 EPID. Automatic post-processing software was developed to analyze the resulting DICOM images. Results: The EPID was used to monitor photon beam output constancy (central axis), flatness, and symmetry over a period of 10 months for four photon beam energies (6x, 15x, 6xFFF, and 10xFFF). EPID results were consistent with those measured with a standard daily QA check device. At the four cardinal gantry angles, the standard deviation of the EPID central-axis output was <0.5%. Likewise, EPID measurements were independent of dose rate over the wide range studied (including up to 2400 MU/min for 10xFFF), with a standard deviation of <0.8% relative to the nominal dose rate for each energy. Profile constancy and field-size measurements also showed good agreement with the reference acquisition at 0° gantry angle and nominal dose rate. XML script files were also tested for MU linearity and picket-fence delivery. Using Developer Mode, the test suite was delivered in <60 minutes for all 4 photon energies with 4 dose rates per energy and 5 picket-fence acquisitions. Conclusion: Dosimetry image acquisition using the new EPID was found to be accurate for standard and high-intensity photon beams over a broad range of dose rates over 10 months. Developer Mode provided an efficient platform to customize the EPID acquisitions with custom script files, which significantly reduced the time required. 
This work was funded in part by Varian Medical Systems.

  20. Towards a standards-compliant genomic and metagenomic publication record

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fenner, Marsha W; Garrity, George M.; Field, Dawn

    2008-04-01

    Increasingly we are aware as a community of the growing need to manage the avalanche of genomic and metagenomic data, in addition to related data types like ribosomal RNA and barcode sequences, in a way that tightly integrates contextual data with traditional literature in a machine-readable form. It is for this reason that the Genomic Standards Consortium (GSC) formed in 2005. Here we suggest that we move beyond the development of standards and tackle standards compliance and improved data capture at the level of the scientific publication. We are supported in this goal by the fact that the scientific community is in the midst of a publishing revolution. This revolution is marked by a growing shift away from the traditional dichotomy between 'journal articles' and 'database entries' and an increasing adoption of hybrid models of collecting and disseminating scientific information. With respect to genomes, metagenomes and related data types, we feel the scientific community would be best served by the immediate launch of a central repository of short, highly structured 'Genome Notes' that must be standards-compliant. This could be done in the context of an existing journal, but we also suggest the more radical solution of launching a new journal. Such a journal could be designed to cater to a wide range of standards-related content types that are not currently centralized in the published literature. It could also support the demand for centralizing aspects of the 'gray literature' (documents developed by institutions or communities), such as the call by the GSC for a central repository of Standard Operating Procedures describing the genomic annotation pipelines of the major sequencing centers. 
We argue that such an 'eJournal', published under the Open Access paradigm by the GSC, could be an attractive publishing forum for a broader range of standardization initiatives within, and beyond, the GSC, and could thereby fill an unoccupied yet increasingly important niche within the current research landscape.

  1. Phenotypes of intermediate forms of Fasciola hepatica and F. gigantica in buffaloes from Central Punjab, Pakistan.

    PubMed

    Afshan, K; Valero, M A; Qayyum, M; Peixoto, R V; Magraner, A; Mas-Coma, S

    2014-12-01

Fascioliasis is an important food-borne parasitic disease caused by two trematode species, Fasciola hepatica and Fasciola gigantica. The phenotypic features of fasciolid adults and eggs infecting buffaloes in the Central Punjab area of Pakistan were studied to characterize the fasciolid populations involved. Morphometric analyses were made with a computer image analysis system (CIAS) applied on the basis of standardized measurements. Since this is the first study of its kind undertaken in Pakistan, the results are compared to pure fasciolid populations: (a) F. hepatica from the European Mediterranean area; and (b) F. gigantica from Burkina Faso; i.e. geographical areas where the two species do not co-exist. Only parasites obtained from bovines were used. The multivariate analysis showed that the morphometric characteristics of both the adults and the eggs of fasciolids from Central Punjab, Pakistan, lie between those of the F. hepatica and F. gigantica standard populations. These results demonstrate the existence of fasciolid intermediate forms in endemic areas of Pakistan.

  2. Applying the food technology neophobia scale in a developing country context. A case-study on processed matooke (cooking banana) flour in Central Uganda.

    PubMed

    De Steur, Hans; Odongo, Walter; Gellynck, Xavier

    2016-01-01

The success of new food technologies largely depends on consumers' behavioral responses to the innovation. In Eastern Africa, and in Uganda in particular, a technology to process matooke into flour has been introduced with limited success. We measure and apply the Food Technology Neophobia Scale (FTNS) to this specific case. This technique has been increasingly used in consumer research to determine consumers' fear of foods produced by novel technologies. Although it has been successful in developed countries, the low number and limited scope of past studies underline the need to test its applicability in a developing-country context. Data were collected from 209 matooke consumers from Central Uganda. In general, respondents are relatively neophobic towards the new technology, with an average FTNS score of 58.7%, which hampers the success of processed matooke flour. Besides socio-demographic indicators, 'risk perception', 'healthiness' and the 'necessity of technologies' were key factors that influenced consumers' preference for processed matooke flour. Benchmarking the findings against previous FTNS surveys allows factor solutions to be evaluated and standardized FTNS scores to be compared, and further lends support to the multidimensionality of the FTNS. As the first application in a developing-country context, this study provides a case for examining food technology neophobia for processed staple crops in various regions and cultures. Nevertheless, further research is needed to replicate this method and evaluate the external validity of our findings. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Decentralized Orchestration of Composite Ogc Web Processing Services in the Cloud

    NASA Astrophysics Data System (ADS)

    Xiao, F.; Shea, G. Y. K.; Cao, J.

    2016-09-01

Current web-based GIS or remote sensing (RS) applications generally rely on a centralized structure, which has inherent drawbacks such as single points of failure, network congestion, and data inconsistency; these disadvantages of traditional GISs need to be overcome for new applications on the Internet or Web. Decentralized orchestration offers performance improvements in terms of increased throughput and scalability and lower response time. This paper investigates build-time and runtime issues related to decentralized orchestration of composite geospatial processing services based on the OGC WPS standard specification. A case study of dust storm detection was used to evaluate the proposed method, and the experimental results indicate that the method is effective in producing high-quality solutions at a low communication cost for the geospatial processing service composition problem.

  4. Hypercalculia in savant syndrome: central executive failure?

    PubMed

    González-Garrido, Andrés Antonio; Ruiz-Sandoval, José Luis; Gómez-Velázquez, Fabiola R; de Alba, José Luis Oropeza; Villaseñor-Cabrera, Teresa

    2002-01-01

    The existence of outstanding cognitive talent in mentally retarded subjects persists as a challenge to present knowledge. We report the case of a 16-year-old male patient with exceptional mental calculation abilities and moderate mental retardation. The patient was clinically evaluated. Data from standard magnetic resonance imaging (MRI) and two 99mTc-ethyl cysteine dimer (ECD)-single photon emission computer tomography (SPECT) (in resting condition and performing a mental calculation task) studies were analyzed. Main neurologic findings were brachycephalia, right-side neurologic soft signs, obsessive personality profile, low color-word interference effect in Stroop test, and diffuse increased cerebral blood flow during calculation task in 99mTc-ECD SPECT. MRI showed anatomical temporal plane inverse asymmetry. Evidence appears to support the hypothesis that savant skill is related to excessive and erroneous use of cognitive processing resources instigated by probable failure in central executive control mechanisms.

  5. From the selfish gene to selfish metabolism: revisiting the central dogma.

    PubMed

    de Lorenzo, Víctor

    2014-03-01

    The standard representation of the Central Dogma (CD) of Molecular Biology conspicuously ignores metabolism. However, both the metabolites and the biochemical fluxes behind any biological phenomenon are encrypted in the DNA sequence. Metabolism constrains and even changes the information flow when the DNA-encoded instructions conflict with the homeostasis of the biochemical network. Inspection of adaptive virulence programs and emergence of xenobiotic-biodegradation pathways in environmental bacteria suggest that their main evolutionary drive is the expansion of their metabolic networks towards new chemical landscapes rather than perpetuation and spreading of their DNA sequences. Faulty enzymatic reactions on suboptimal substrates often produce reactive oxygen species (ROS), a process that fosters DNA diversification and ultimately couples catabolism of the new chemicals to growth. All this calls for a revision of the CD in which metabolism (rather than DNA) has the leading role. © 2014 WILEY Periodicals, Inc.

  6. A national clinical decision support infrastructure to enable the widespread and consistent practice of genomic and personalized medicine.

    PubMed

    Kawamoto, Kensaku; Lobach, David F; Willard, Huntington F; Ginsburg, Geoffrey S

    2009-03-23

    In recent years, the completion of the Human Genome Project and other rapid advances in genomics have led to increasing anticipation of an era of genomic and personalized medicine, in which an individual's health is optimized through the use of all available patient data, including data on the individual's genome and its downstream products. Genomic and personalized medicine could transform healthcare systems and catalyze significant reductions in morbidity, mortality, and overall healthcare costs. Critical to the achievement of more efficient and effective healthcare enabled by genomics is the establishment of a robust, nationwide clinical decision support infrastructure that assists clinicians in their use of genomic assays to guide disease prevention, diagnosis, and therapy. Requisite components of this infrastructure include the standardized representation of genomic and non-genomic patient data across health information systems; centrally managed repositories of computer-processable medical knowledge; and standardized approaches for applying these knowledge resources against patient data to generate and deliver patient-specific care recommendations. Here, we provide recommendations for establishing a national decision support infrastructure for genomic and personalized medicine that fulfills these needs, leverages existing resources, and is aligned with the Roadmap for National Action on Clinical Decision Support commissioned by the U.S. Office of the National Coordinator for Health Information Technology. Critical to the establishment of this infrastructure will be strong leadership and substantial funding from the federal government. A national clinical decision support infrastructure will be required for reaping the full benefits of genomic and personalized medicine. 
Essential components of this infrastructure include standards for data representation; centrally managed knowledge repositories; and standardized approaches for leveraging these knowledge repositories to generate patient-specific care recommendations at the point of care.

  7. Polishing of treated palm oil mill effluent (POME) from ponding system by electrocoagulation process.

    PubMed

    Bashir, Mohammed J K; Mau Han, Tham; Jun Wei, Lim; Choon Aun, Ng; Abu Amr, Salem S

    2016-01-01

As the ponding system used to treat palm oil mill effluent (POME) frequently fails to satisfy the discharge standard in Malaysia, the present study aimed to resolve this problem using an optimized electrocoagulation process. A central composite design (CCD) module in response surface methodology was employed to optimize the interactions of the process variables, namely current density, contact time and initial pH, targeting maximum removal of chemical oxygen demand (COD), colour and turbidity with a satisfactory pH of the discharged POME. The batch study was initially designed by CCD, and statistical models of the responses were subsequently derived to indicate the significant terms of the interacting process variables. All models were verified by analysis of variance, showing model significance (Prob > F < 0.01). The optimum performance was obtained at a current density of 56 mA/cm², a contact time of 65 min and an initial pH of 4.5, rendering complete removal of colour and turbidity with a COD removal of 75.4%. A post-treatment POME pH of 7.6 was achieved, which is suitable for direct discharge. These predicted outputs were subsequently confirmed by insignificant standard deviations between predicted and actual values. This optimum condition also permitted the simultaneous removal of NH3-N and various metal ions, signifying the superiority of the electrocoagulation process optimized by CCD.
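The CCD-plus-quadratic-model workflow described above can be sketched numerically. The following is a minimal illustration (synthetic coefficients, not the study's data) of building coded central composite design points for three factors and fitting a full quadratic response-surface model by least squares:

```python
import itertools
import numpy as np

def ccd_points(k=3, alpha=1.682):
    """Coded design points of a central composite design:
    2^k factorial corners, 2k axial (star) points, and a center point."""
    corners = list(itertools.product([-1.0, 1.0], repeat=k))
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            p = [0.0] * k
            p[i] = a
            axial.append(tuple(p))
    return np.array(corners + axial + [(0.0,) * k])

def quadratic_design_matrix(X):
    """Columns: intercept, linear, pairwise interaction, and squared terms."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    return np.column_stack(cols)

# Synthetic quadratic surface standing in for a response such as COD removal.
X = ccd_points()
true_beta = np.array([75.0, 4.0, -2.0, 1.5, 0.5, -0.3, 0.2, -3.0, -1.0, -0.5])
y = quadratic_design_matrix(X) @ true_beta

# Least-squares fit recovers the coefficients; ANOVA on these terms would
# then flag the significant interactions, as in the abstract above.
beta_hat, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
```

The 15-run design (8 corners, 6 axial points, 1 center point) is the standard minimal CCD that makes all 10 quadratic-model coefficients estimable.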

  8. The definition of pneumonia, the assessment of severity, and clinical standardization in the Pneumonia Etiology Research for Child Health study.

    PubMed

    Scott, J Anthony G; Wonodi, Chizoba; Moïsi, Jennifer C; Deloria-Knoll, Maria; DeLuca, Andrea N; Karron, Ruth A; Bhat, Niranjan; Murdoch, David R; Crawley, Jane; Levine, Orin S; O'Brien, Katherine L; Feikin, Daniel R

    2012-04-01

    To develop a case definition for the Pneumonia Etiology Research for Child Health (PERCH) project, we sought a widely acceptable classification that was linked to existing pneumonia research and focused on very severe cases. We began with the World Health Organization's classification of severe/very severe pneumonia and refined it through literature reviews and a 2-stage process of expert consultation. PERCH will study hospitalized children, aged 1-59 months, with pneumonia who present with cough or difficulty breathing and have either severe pneumonia (lower chest wall indrawing) or very severe pneumonia (central cyanosis, difficulty breastfeeding/drinking, vomiting everything, convulsions, lethargy, unconsciousness, or head nodding). It will exclude patients with recent hospitalization and children with wheeze whose indrawing resolves after bronchodilator therapy. The PERCH investigators agreed upon standard interpretations of the symptoms and signs. These will be maintained by a clinical standardization monitor who conducts repeated instruction at each site and by recurrent local training and testing.

  9. The Definition of Pneumonia, the Assessment of Severity, and Clinical Standardization in the Pneumonia Etiology Research for Child Health Study

    PubMed Central

    Wonodi, Chizoba; Moïsi, Jennifer C.; Deloria-Knoll, Maria; DeLuca, Andrea N.; Karron, Ruth A.; Bhat, Niranjan; Murdoch, David R.; Crawley, Jane; Levine, Orin S.; O’Brien, Katherine L.; Feikin, Daniel R.

    2012-01-01

    To develop a case definition for the Pneumonia Etiology Research for Child Health (PERCH) project, we sought a widely acceptable classification that was linked to existing pneumonia research and focused on very severe cases. We began with the World Health Organization’s classification of severe/very severe pneumonia and refined it through literature reviews and a 2-stage process of expert consultation. PERCH will study hospitalized children, aged 1–59 months, with pneumonia who present with cough or difficulty breathing and have either severe pneumonia (lower chest wall indrawing) or very severe pneumonia (central cyanosis, difficulty breastfeeding/drinking, vomiting everything, convulsions, lethargy, unconsciousness, or head nodding). It will exclude patients with recent hospitalization and children with wheeze whose indrawing resolves after bronchodilator therapy. The PERCH investigators agreed upon standard interpretations of the symptoms and signs. These will be maintained by a clinical standardization monitor who conducts repeated instruction at each site and by recurrent local training and testing. PMID:22403224

  10. The Effect of Highlighting on Processing and Memory of Central and Peripheral Text Information: Evidence from Eye Movements

    ERIC Educational Resources Information Center

    Yeari, Menahem; Oudega, Marja; van den Broek, Paul

    2017-01-01

    The present study investigated the effect of text highlighting on online processing and memory of central and peripheral information. We compared processing time (using eye-tracking methodology) and recall of central and peripheral information for three types of highlighting: (a) highlighting of central information, (b) highlighting of peripheral…

  11. Three-dimensional measurement system for crime scene documentation

    NASA Astrophysics Data System (ADS)

    Adamczyk, Marcin; Hołowko, Elwira; Lech, Krzysztof; Michoński, Jakub; Mączkowski, Grzegorz; Bolewicki, Paweł; Januszkiewicz, Kamil; Sitnik, Robert

    2017-10-01

    Three-dimensional measurements (such as photogrammetry, Time of Flight, Structure from Motion or Structured Light techniques) are becoming a standard in the crime scene documentation process. The use of 3D measurement techniques provides an opportunity to prepare a more insightful investigation and helps to show every trace in the context of the entire crime scene. In this paper we present a hierarchical three-dimensional measurement system designed for the crime scene documentation process. Our system reflects current standards in crime scene documentation: it performs measurements in two stages. The first stage, the most general, is carried out with a scanner of relatively low spatial resolution but large measuring volume, and is used to document the whole scene. The second stage is much more detailed: high resolution but a smaller measuring volume, for areas that require a closer approach. The documentation process is supervised by a specialised application, CrimeView3D, a software platform for measurement management (connecting to scanners and carrying out measurements, with automatic or semi-automatic data registration in real time) and data visualisation (3D visualisation of documented scenes). It also provides a series of useful tools for forensic technicians: a virtual measuring tape, searching for sources of blood spatter, a virtual walk through the crime scene, and many others. We also provide an outcome from research on metrological validation of the scanners, performed according to the VDI/VDE standard, and results from measurement sessions conducted at real crime scenes in cooperation with technicians from the Central Forensic Laboratory of the Police.

  12. Technical Note: The Modular Earth Submodel System (MESSy) - a new approach towards Earth System Modeling

    NASA Astrophysics Data System (ADS)

    Jöckel, P.; Sander, R.; Kerkweg, A.; Tost, H.; Lelieveld, J.

    2005-02-01

    The development of a comprehensive Earth System Model (ESM) to study the interactions between chemical, physical, and biological processes requires coupling of the different domains (land, ocean, atmosphere, ...). One strategy is to link existing domain-specific models with a universal coupler, i.e. an independent standalone program organizing the communication between other programs. In many cases, however, a much simpler approach is more feasible. We have developed the Modular Earth Submodel System (MESSy). It comprises (1) a modular interface structure to connect submodels to a base model, (2) an extendable set of such submodels for miscellaneous processes, and (3) a coding standard. MESSy is therefore not a coupler in the classical sense, but exchanges data between a base model and several submodels within one comprehensive executable. The internal complexity of the submodels is controllable in a transparent and user-friendly way. This provides remarkable new possibilities to study feedback mechanisms (by two-way coupling). Note that the MESSy and the coupler approach can be combined. For instance, an atmospheric model implemented according to the MESSy standard could easily be coupled to an ocean model by means of an external coupler. The vision is to ultimately form a comprehensive ESM which includes a large set of submodels, and a base model which contains only a central clock and runtime control. This can be reached stepwise, since each process can be included independently. Starting from an existing model, process submodels can be reimplemented according to the MESSy standard. This procedure guarantees the availability of a state-of-the-art model for scientific applications at any time of the development. In principle, MESSy can be implemented into any kind of model, either global or regional. So far, the MESSy concept has been applied to the general circulation model ECHAM5 and a number of process boxmodels.
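The base-model/submodel pattern described above can be illustrated with a toy sketch. The class and method names below are illustrative, not MESSy's actual Fortran interface; the point is the architecture: a base model holding only a clock and runtime control, dispatching each time step to registered process submodels that exchange data through shared state (which is what enables two-way feedback):

```python
class BaseModel:
    """Toy base model: a central clock, runtime control, and a registry of
    process submodels. All names here are illustrative, not MESSy's API."""
    def __init__(self):
        self.state = {"time": 0.0, "temperature": 288.0}
        self.submodels = []

    def register(self, submodel):
        self.submodels.append(submodel)

    def run(self, steps, dt=1.0):
        for _ in range(steps):
            for sm in self.submodels:
                sm.step(self.state, dt)   # each process reads/writes shared state
            self.state["time"] += dt

class RadiationSubmodel:
    def step(self, state, dt):
        state["temperature"] -= 0.1 * dt  # toy cooling tendency

class ChemistrySubmodel:
    def step(self, state, dt):
        # Feedback: the reaction rate depends on the temperature that the
        # radiation submodel just updated (two-way coupling via shared state).
        state["ozone"] = state.get("ozone", 0.0) + 1e-3 * dt * (state["temperature"] / 288.0)

model = BaseModel()
model.register(RadiationSubmodel())
model.register(ChemistrySubmodel())
model.run(steps=10)
```

Because each submodel only touches the shared state through the common interface, a new process can be added or removed independently, which mirrors the stepwise migration path described in the abstract.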

  13. Centralized drug review processes: are they fair?

    PubMed

    Mitton, Craig R; McMahon, Meghan; Morgan, Steve; Gibson, Jennifer

    2006-07-01

    Numerous countries have implemented centralized drug review processes to assist in making drug coverage decisions. In addition to examining the final recommendations of these bodies, it is also important to ensure fairness in decision making. Accountability for reasonableness is an ethics-based framework for examining the fairness of priority setting processes. The objective of this study was to assess the fairness of four internationally established centralized drug review processes using accountability for reasonableness. Semi-structured telephone interviews were conducted with stakeholders in Canada, New Zealand, Australia and the UK (n=16). Participants were asked to evaluate their country's centralized drug review process against the four conditions of accountability for reasonableness. Each centralized drug review process satisfied at least one of the four ethical conditions, but none satisfied all four conditions. All participants viewed transparency as critical to both the legitimacy and fairness of centralized drug review processes. Additional strides need to be made in each of the four countries under study to improve the fairness of their centralized drug review processes. Ideally, a fair priority setting process should foster constructive stakeholder engagement and enhance the legitimacy of decisions made in assessing pharmaceutical products for funding. As policy makers are under increasing scrutiny in allocating limited resources, fair process should be seen as a critical component of such activity. This study represents the first attempt to conduct an international comparison of the fairness of centralized drug review agencies in the eyes of participating stakeholders.

  14. NCI Central Review Board Receives Accreditation

    Cancer.gov

    The Association for the Accreditation of Human Research Protection Programs has awarded the NCI Central Institutional Review Board full accreditation. AAHRPP awards accreditation to organizations demonstrating the highest ethical standards in clinical research.

  15. Shifting Gears: Standards, Assessments, Curriculum, & Instruction.

    ERIC Educational Resources Information Center

    Dougherty, Eleanor

    This book is designed to help educators move from a system that measures students against students to one that values mastery of central concepts and skills, striving for proficiency in publicly acknowledged standards of academic performance. It aims to connect the operative parts of standards-based education (standards, assessment, curriculum,…

  16. Centralization and Decentralization in American Education Policy

    ERIC Educational Resources Information Center

    DeBoer, Jennifer

    2012-01-01

    This article examines the trend toward centralization in American education policy over the last century through a variety of lenses. The overall picture that emerges is one of a continuous tug-of-war, with national and local policymakers stumbling together toward incrementally more standardized and centralized policies. There is a center of power…

  17. Centrally Determined Standardization of Flow Cytometry Methods Reduces Interlaboratory Variation in a Prospective Multicenter Study.

    PubMed

    Westera, Liset; van Viegen, Tanja; Jeyarajah, Jenny; Azad, Azar; Bilsborough, Janine; van den Brink, Gijs R; Cremer, Jonathan; Danese, Silvio; D'Haens, Geert; Eckmann, Lars; Faubion, William; Filice, Melissa; Korf, Hannelie; McGovern, Dermot; Panes, Julian; Salas, Azucena; Sandborn, William J; Silverberg, Mark S; Smith, Michelle I; Vermeire, Severine; Vetrano, Stefania; Shackelton, Lisa M; Stitt, Larry; Jairath, Vipul; Levesque, Barrett G; Spencer, David M; Feagan, Brian G; Vande Casteele, Niels

    2017-11-02

    Flow cytometry (FC) aids in characterization of cellular and molecular factors involved in pathologic immune responses. Although FC has the potential to facilitate early drug development in inflammatory bowel disease, interlaboratory variability limits its use in multicenter trials. Standardization of methods may address this limitation. We compared variability in FC-aided quantitation of T-cell responses across international laboratories using three analytical strategies. Peripheral blood mononuclear cells (PBMCs) were isolated from three healthy donors, stimulated with phorbol 12-myristate 13-acetate and ionomycin at a central laboratory, fixed, frozen, and shipped to seven international laboratories. Permeabilization and staining were performed in triplicate at each laboratory using a common protocol and centrally provided reagents. Gating was performed using local gating with a local strategy (LGLS), local gating with a central strategy (LGCS), and central gating (CG). Median cell percentages were calculated across triplicates and donors, and reported for each condition and strategy. The coefficient of variation (CV) was calculated across laboratories. Between-strategy comparisons were made using a two-way analysis of variance adjusting for donor. Mean interlaboratory CV ranged from 1.8 to 102.1% depending on cell population and gating strategy (LGLS, 4.4-102.1%; LGCS, 10.9-65.6%; CG, 1.8-20.9%). Mean interlaboratory CV differed significantly across strategies and was consistently lower with CG. Central gating was the only strategy with mean CVs consistently lower than 25%, which is a proposed standard for pharmacodynamic and exploratory biomarker assays.
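The interlaboratory coefficient of variation used as the comparison metric above can be computed as follows. The readings are hypothetical, and the sketch assumes the CV is taken across per-laboratory triplicate means, which is one plausible reading of the described analysis:

```python
import numpy as np

def interlab_cv(measurements):
    """Interlaboratory coefficient of variation (%): average each laboratory's
    triplicate readings, then take SD/mean across the laboratory means.
    A sketch of the statistic described above, not the study's exact protocol."""
    lab_means = np.array([np.mean(trip) for trip in measurements])
    return 100.0 * lab_means.std(ddof=1) / lab_means.mean()

# Hypothetical cell-percentage readings (triplicates) from four laboratories.
labs = [[44.1, 45.0, 44.6], [46.2, 45.8, 46.0], [43.9, 44.3, 44.0], [45.1, 45.5, 45.2]]
cv = interlab_cv(labs)   # well under the proposed 25% acceptance threshold
```

With tightly clustered laboratory means like these, the CV lands around 2%; the 25% threshold cited in the abstract would flag assays whose laboratory means spread far more widely.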

  18. libdrdc: software standards library

    NASA Astrophysics Data System (ADS)

    Erickson, David; Peng, Tie

    2008-04-01

    This paper presents the libdrdc software standards library, including internal nomenclature, definitions, units of measure, coordinate reference frames, and representations for use in autonomous systems research. This library is a configurable, portable, C-function-wrapped C++/Object Oriented C library developed to be independent of software middleware, system architecture, processor, or operating system. It is designed to use the Automatically Tuned Linear Algebra Software (ATLAS) and Basic Linear Algebra Subprograms (BLAS) and to port to firmware and software. The library goal is to unify data collection and representation for various microcontrollers and Central Processing Unit (CPU) cores and to provide a common Application Binary Interface (ABI) for research projects at all scales. The library supports multi-platform development and currently works on Windows, Unix, GNU/Linux, and the Real-Time Executive for Multiprocessor Systems (RTEMS). This library is made available under the LGPL version 2.1 license.

  19. Managing the clinical setting for best nursing practice: a brief overview of contemporary initiatives.

    PubMed

    Henderson, Amanda; Winch, Sarah

    2008-01-01

    Leadership strategies are important in enabling the nursing profession to reach optimum standards in the practice environment. The aim was to compare and contrast the central tenets of contemporary quality initiatives that are commensurate with enabling the environment so that best practice can occur. Democratic leadership, accessible and relevant education and professional development, the incorporation of evidence into practice and the ability of facilities to be responsive to change are core considerations for the successful maintenance of practice standards that are consistent with best nursing practice. While different management concerns drive the adoption of contemporary approaches, there are many similarities in how these approaches are translated into action in the clinical setting. Managers should focus on core principles of professional nursing that add value to practice rather than business processes.

  20. Security Requirements Management in Software Product Line Engineering

    NASA Astrophysics Data System (ADS)

    Mellado, Daniel; Fernández-Medina, Eduardo; Piattini, Mario

    Security requirements engineering is both a central task and a critical success factor in product line development due to the complexity and extensive nature of product lines. However, most of the current product line practices in requirements engineering do not adequately address security requirements engineering. Therefore, in this chapter we will propose a security requirements engineering process (SREPPLine) driven by security standards and based on a security requirements decision model along with a security variability model to manage the variability of the artefacts related to security requirements. The aim of this approach is to deal with security requirements from the early stages of the product line development in a systematic way, in order to facilitate conformance with the most relevant security standards with regard to the management of security requirements, such as ISO/IEC 27001 and ISO/IEC 15408.

  1. Are visual peripheries forever young?

    PubMed

    Burnat, Kalina

    2015-01-01

    The paper presents a concept of lifelong plasticity of peripheral vision. Central vision processing is accepted as critical and irreplaceable for normal perception in humans. While peripheral processing chiefly carries information about motion stimuli features and redirects foveal attention to new objects, it can also take over functions typical for central vision. Here I review the data showing the plasticity of peripheral vision found in functional, developmental, and comparative studies. Even though it is well established that afferent projections from central and peripheral retinal regions are not established simultaneously during early postnatal life, central vision is commonly used as a general model of development of the visual system. Based on clinical studies and visually deprived animal models, I describe how central and peripheral visual field representations separately rely on early visual experience. Peripheral visual processing (motion) is more affected by binocular visual deprivation than central visual processing (spatial resolution). In addition, our own experimental findings show the possible recruitment of coarse peripheral vision for fine spatial analysis. Accordingly, I hypothesize that the balance between central and peripheral visual processing, established in the course of development, is susceptible to plastic adaptations during the entire life span, with peripheral vision capable of taking over central processing.

  2. West Central U.S. Imagery (GOES-WEST) - Satellite Services Division /

    Science.gov Websites


  3. Analysis of moisture content, acidity and contamination by yeast and molds in Apis mellifera L. honey from central Brazil

    PubMed Central

    Ananias, Karla Rubia; de Melo, Adriane Alexandre Machado; de Moura, Celso José

    2013-01-01

    The development of mold of environmental origin in honey affects its quality and leads to its deterioration, so yeasts and molds counts have been used as an important indicator of hygiene levels during its processing, transportation and storage. The aim of this study was to evaluate the levels of yeasts and molds contamination and their correlation with moisture and acidity levels in Apis mellifera L. honey from central Brazil. In 20% of the samples, the yeasts and molds counts exceeded the limit established by legislation for the marketing of honey in the MERCOSUR, while 42.8% and 5.7% presented above-standard acidity and moisture levels, respectively. Although samples showed yeasts and molds counts over 1.0 × 10² UFC·g⁻¹, there was no correlation between moisture content and the number of microorganisms, since, in part of the samples with above-standard counts, the moisture level was below 20%. In some samples the acidity level was higher than that established by legislation, but only one sample presented a yeasts and molds count above the limit established by MERCOSUR, which would suggest the influence of the floral source on this parameter. In general, of the 35 samples analyzed, the quality was considered inadequate in 45.7% of cases. PMID:24516434

  4. Analysis of moisture content, acidity and contamination by yeast and molds in Apis mellifera L. honey from central Brazil.

    PubMed

    Ananias, Karla Rubia; de Melo, Adriane Alexandre Machado; de Moura, Celso José

    2013-01-01

    The development of mold of environmental origin in honey affects its quality and leads to its deterioration, so yeasts and molds counts have been used as an important indicator of hygiene levels during its processing, transportation and storage. The aim of this study was to evaluate the levels of yeasts and molds contamination and their correlation with moisture and acidity levels in Apis mellifera L. honey from central Brazil. In 20% of the samples, the yeasts and molds counts exceeded the limit established by legislation for the marketing of honey in the MERCOSUR, while 42.8% and 5.7% presented above-standard acidity and moisture levels, respectively. Although samples showed yeasts and molds counts over 1.0 × 10² UFC·g⁻¹, there was no correlation between moisture content and the number of microorganisms, since, in part of the samples with above-standard counts, the moisture level was below 20%. In some samples the acidity level was higher than that established by legislation, but only one sample presented a yeasts and molds count above the limit established by MERCOSUR, which would suggest the influence of the floral source on this parameter. In general, of the 35 samples analyzed, the quality was considered inadequate in 45.7% of cases.

  5. Nasa-wide Standard Administrative Systems

    NASA Technical Reports Server (NTRS)

    Schneck, P.

    1984-01-01

    Factors to be considered in developing agency-wide standard administrative systems for NASA include uniformity of hardware and software; centralization vs. decentralization; risk exposure; and models for software development.

  6. IMPLICATIONS OF NEW ARSENIC STANDARDS ON OKLAHOMA WATER RESOURCES

    EPA Science Inventory

    The new national standard for arsenic in drinking water supplies, slated to take effect in 2006, is having an unexpected impact on a number of Oklahoma communities. Currently, several municipalities in north central Oklahoma are in compliance with existing arsenic standards (50 ...

  7. ANSI Standard: Complying with Background Noise Limits.

    ERIC Educational Resources Information Center

    Schaffer, Mark E.

    2003-01-01

    Discusses the new classroom acoustics standard, ANSI Standard S12.60, which specifies maximum sound level limits that are significantly lower than currently typical for classrooms. Addresses guidelines for unducted HVAC systems, ducted single-zone systems, and central VAV or multizone systems. (EV)

  8. [National Conference on Cataloguing Standards (Ottawa, May 19-20, 1970].

    ERIC Educational Resources Information Center

    National Library of Canada, Ottawa (Ontario).

    The following papers were presented at an invitational conference on cataloging standards: (1) "Canadiana Meets Automation;" (2) "The Union Catalogues in the National Library - The Present Condition;" (3) "A Centralized Bibliographic Data Bank;" (4) "The Standardization of Cataloguing;" (5) "The…

  9. Tele-ICU and Patient Safety Considerations.

    PubMed

    Hassan, Erkan

    The tele-ICU is designed to leverage, not replace, the need for bedside clinical expertise in the diagnosis, treatment, and assessment of various critical illnesses. Tele-ICUs are primarily decentralized or centralized models with differing advantages and disadvantages. The centralized model has sufficiently powered published data to be associated with improved mortality and ICU length of stay in a cost-effective manner. Factors associated with improved clinical outcomes include improved compliance with best practices; providing off-hours implementation of the bedside physician's care plan; and identification of and rapid response to physiological instability (initial clinical review within 1 hour) and rapid response to alerts, alarms, or direct notification by bedside clinicians. With improved communication and frequent review of patients between the tele-ICU and the bedside clinicians, the bedside clinician can provide the care that only they can provide. Although technology continues to evolve at a rapid pace, technology alone will most likely not improve clinical outcomes. Technology will enable us to process real or near real-time data into complex and powerful predictive algorithms. However, the remote and bedside teams must work collaboratively to develop care processes to better monitor, prioritize, standardize, and expedite care to drive greater efficiencies and improve patient safety.

  10. Improvement of constraint-based flux estimation during L-phenylalanine production with Escherichia coli using targeted knock-out mutants.

    PubMed

    Weiner, Michael; Tröndle, Julia; Albermann, Christoph; Sprenger, Georg A; Weuster-Botz, Dirk

    2014-07-01

    Fed-batch production of the aromatic amino acid L-phenylalanine was studied with recombinant Escherichia coli strains at 15 L scale using glycerol as the carbon source. Flux Variability Analysis (FVA) was applied for intracellular flux estimation to obtain insight into the intracellular flux distribution during L-phenylalanine production. Variability analysis revealed great flux uncertainties in the central carbon metabolism, especially concerning malate consumption. Based on these results, two recombinant strains differing in their ability for malate degradation and anaplerotic reactions were genetically engineered (E. coli FUS4.11 ΔmaeA pF81kan and E. coli FUS4.11 ΔmaeA ΔmaeB pF81kan). Applying these malic enzyme knock-out mutants in the standardized L-phenylalanine production process resulted in almost identical process performances (e.g., L-phenylalanine concentration, production rate and byproduct formation), clearly highlighting great redundancies in the central metabolism of E. coli. Uncertainties of intracellular flux estimations by constraint-based analyses during fed-batch production of L-phenylalanine were drastically reduced by application of the malic enzyme knock-out mutants.
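Flux Variability Analysis as used above fixes the objective flux at its optimum and then minimizes and maximizes each individual flux subject to the steady-state constraint. A minimal sketch on a toy network with two redundant routes (loosely analogous to the malate-consumption redundancy noted above; not the actual E. coli model):

```python
import numpy as np
from scipy.optimize import linprog

def fva(S, bounds, obj_idx):
    """FVA sketch: maximize the objective flux, pin it at its optimum, then
    minimize and maximize every flux subject to steady state S v = 0."""
    n = S.shape[1]
    c = np.zeros(n); c[obj_idx] = -1.0          # linprog minimizes, so negate
    opt = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
    fixed = list(bounds)
    fixed[obj_idx] = (opt.x[obj_idx], opt.x[obj_idx])  # pin objective at optimum
    ranges = []
    for j in range(n):
        cj = np.zeros(n); cj[j] = 1.0
        lo = linprog(cj, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=fixed).fun
        hi = -linprog(-cj, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=fixed).fun
        ranges.append((lo, hi))
    return ranges

# Toy network: uptake v0 -> A; two redundant routes A -> B (v1, v2); B -> product (v3).
S = np.array([[1.0, -1.0, -1.0,  0.0],    # metabolite A balance
              [0.0,  1.0,  1.0, -1.0]])   # metabolite B balance
bounds = [(0, 10)] * 4
ranges = fva(S, bounds, obj_idx=3)
```

At the production optimum, the uptake and product fluxes are fully determined, but each of the two redundant internal routes can individually range over its whole bound. That wide range is exactly the kind of flux uncertainty the abstract describes, and deleting one route (a knock-out) collapses it.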

  11. Combining local and global limitations of visual search.

    PubMed

    Põder, Endel

    2017-04-01

    There are different opinions about the roles of local interactions and central processing capacity in visual search. This study attempts to clarify the problem using a new version of relevant set cueing. A central precue indicates two symmetrical segments (that may contain a target object) within a circular array of objects presented briefly around the fixation point. The number of objects in the relevant segments, and density of objects in the array were varied independently. Three types of search experiments were run: (a) search for a simple visual feature (color, size, and orientation); (b) conjunctions of simple features; and (c) spatial configuration of simple features (rotated Ts). For spatial configuration stimuli, the results were consistent with a fixed global processing capacity and standard crowding zones. For simple features and their conjunctions, the results were different, dependent on the features involved. While color search exhibits virtually no capacity limits or crowding, search for an orientation target was limited by both. Results for conjunctions of features can be partly explained by the results from the respective features. This study shows that visual search is limited by both local interference and global capacity, and the limitations are different for different visual features.

  12. Improving Building Construction Specifications in State and Local Governments

    NASA Technical Reports Server (NTRS)

    1980-01-01

    State and local governments can benefit from master specifications systems that centralize data on all types of building materials, products, and processes. Most of these systems are organized according to the MASTERFORMAT system, which, along with guide specifications that require the insertion or deletion of standardized information, resulted from the specific needs of users and providers. For jurisdictions preparing their own specifications, staff time and cost are reduced. For those subcontracting the preparation, master specifications provide a means of evaluating the specifications submitted. Current management specification systems described include SPECINTACT, OMSPEC, MASTERPEC, and the NAVFAC, Corps of Engineers, and GSA guide specifications.

  13. Employment training for disadvantaged or dependent populations.

    PubMed

    Stern, H

    1982-01-01

    The vocational rehabilitation process is viewed as having two dominant work-related components: the actual work-training experience and employability skills. The paper argues that both components are critical and must be integrated. The major role of the vocational rehabilitation agency is viewed as that of provider of employability (or job-seeking) skills programs. These programs consist of: (1) employability skills courses, (2) work performance demand standard setting, and (3) on-the-job rotational task schemes. Actual work skills can only be provided in the "real world" of work. Centralized work-training programs are viewed as creating inappropriate socialization and only moderately transferable skills.

  14. High-contrast grating hollow-core waveguide splitter applied to optical phased array

    NASA Astrophysics Data System (ADS)

    Zhao, Che; Xue, Ping; Zhang, Hanxing; Chen, Te; Peng, Chao; Hu, Weiwei

    2014-11-01

    A novel hollow-core (HW) Y-branch waveguide splitter based on a high-contrast grating (HCG) is presented. We calculated and designed the HCG-HW splitter using Rigorous Coupled Wave Analysis (RCWA). Finite-difference time-domain (FDTD) simulation shows that the splitter has a broad bandwidth and a branching loss as low as 0.23 dB. Fabrication was accomplished with a standard Silicon-On-Insulator (SOI) process. The experimental measurements indicate good beam-splitting performance near the central wavelength λ = 1550 nm with a total insertion loss of 7.0 dB.
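The quoted loss figures can be related to power fractions with the standard decibel formula; a quick illustrative check:

```python
import math

def db_to_fraction(loss_db):
    """Convert an optical loss in dB to the transmitted power fraction."""
    return 10 ** (-loss_db / 10.0)

# Figures quoted in the abstract above (illustrative conversion, not new data):
branch = db_to_fraction(0.23)      # ~95% of power survives the branch split
total = db_to_fraction(7.0)        # ~20% of input power reaches the output
ideal_split = 10 * math.log10(2)   # a lossless 1x2 split costs ~3.01 dB per arm
```

This makes the gap between simulation and measurement concrete: a 0.23 dB branching loss is a few percent, while the measured 7.0 dB total insertion loss leaves roughly a fifth of the input power.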

  15. Changes in Student Attributions Due to the Implementation of Central Exit Exams

    ERIC Educational Resources Information Center

    Oerke, Britta; Maag Merki, Katharina; Holmeier, Monika; Jager, Daniela J.

    2011-01-01

    The central aim of standardized exit exams is to motivate students and teachers to work harder on critical subject matters and thus increase student achievement. However, the effects of the implementation of central exams on student motivation have not been analyzed in a longitudinal section until now. In the present study, the consequences of…

  16. 76 FR 37407 - Energy Conservation Program: Energy Conservation Standards for Residential Furnaces and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-27

    ...The Energy Policy and Conservation Act of 1975 (EPCA), as amended, prescribes energy conservation standards for various consumer products and certain commercial and industrial equipment, including residential furnaces and residential central air conditioners and heat pumps. EPCA also requires the U.S. Department of Energy (DOE) to determine whether more-stringent, amended standards for these products would be technologically feasible and economically justified, and would save a significant amount of energy. In this direct final rule, DOE adopts amended energy conservation standards for residential furnaces and for residential central air conditioners and heat pumps. A notice of proposed rulemaking that proposes identical energy efficiency standards is published elsewhere in this issue of the Federal Register. If DOE receives adverse comment and determines that such comment may provide a reasonable basis for withdrawing the direct final rule, this final rule will be withdrawn, and DOE will proceed with the proposed rule.

  17. A differentially expressed set of microRNAs in cerebro-spinal fluid (CSF) can diagnose CNS malignancies

    PubMed Central

    Drusco, Alessandra; Bottoni, Arianna; Laganà, Alessandro; Acunzo, Mario; Fassan, Matteo; Cascione, Luciano; Antenucci, Anna; Kumchala, Prasanthi; Vicentini, Caterina; Gardiman, Marina P.; Alder, Hansjuerg; Carosi, Mariantonia A.; Ammirati, Mario; Gherardi, Stefano; Luscrì, Marilena; Carapella, Carmine; Zanesi, Nicola; Croce, Carlo M.

    2015-01-01

Central Nervous System malignancies often require stereotactic biopsy or biopsy for differential diagnosis and for tumor staging and grading. Furthermore, stereotactic biopsy can be non-diagnostic or can underestimate grading. Hence, there is a compelling need for new diagnostic biomarkers to avoid such invasive procedures. Several biological markers have been proposed, but they can only identify specific prognostic subtypes of Central Nervous System tumors, and none of them has found a standardized clinical application. The aim of the study was to identify a Cerebro-Spinal Fluid microRNA signature that could differentiate among Central Nervous System malignancies. CSF total RNA of 34 neoplastic and 14 non-diseased patients was processed by NanoString. Comparison among groups (Normal, Benign, Glioblastoma, Medulloblastoma, Metastasis and Lymphoma) led to the identification of a microRNA profile that was further confirmed by RT-PCR and in situ hybridization. Hsa-miR-451, -711, -935, -223 and -125b were significantly differentially expressed among the above-mentioned groups, allowing us to draw a hypothetical diagnostic chart for Central Nervous System malignancies. This is the first study to employ the NanoString technique for Cerebro-Spinal Fluid microRNA profiling. In this article, we demonstrated that Cerebro-Spinal Fluid microRNA profiling mirrors Central Nervous System physiologic or pathologic conditions. Although more cases need to be tested, we identified a diagnostic Cerebro-Spinal Fluid microRNA signature with good prospects for future diagnostic clinical applications. PMID:26246487

  18. A differentially expressed set of microRNAs in cerebro-spinal fluid (CSF) can diagnose CNS malignancies.

    PubMed

    Drusco, Alessandra; Bottoni, Arianna; Laganà, Alessandro; Acunzo, Mario; Fassan, Matteo; Cascione, Luciano; Antenucci, Anna; Kumchala, Prasanthi; Vicentini, Caterina; Gardiman, Marina P; Alder, Hansjuerg; Carosi, Mariantonia A; Ammirati, Mario; Gherardi, Stefano; Luscrì, Marilena; Carapella, Carmine; Zanesi, Nicola; Croce, Carlo M

    2015-08-28

Central Nervous System malignancies often require stereotactic biopsy or biopsy for differential diagnosis and for tumor staging and grading. Furthermore, stereotactic biopsy can be non-diagnostic or can underestimate grading. Hence, there is a compelling need for new diagnostic biomarkers to avoid such invasive procedures. Several biological markers have been proposed, but they can only identify specific prognostic subtypes of Central Nervous System tumors, and none of them has found a standardized clinical application. The aim of the study was to identify a Cerebro-Spinal Fluid microRNA signature that could differentiate among Central Nervous System malignancies. CSF total RNA of 34 neoplastic and 14 non-diseased patients was processed by NanoString. Comparison among groups (Normal, Benign, Glioblastoma, Medulloblastoma, Metastasis and Lymphoma) led to the identification of a microRNA profile that was further confirmed by RT-PCR and in situ hybridization. Hsa-miR-451, -711, -935, -223 and -125b were significantly differentially expressed among the above-mentioned groups, allowing us to draw a hypothetical diagnostic chart for Central Nervous System malignancies. This is the first study to employ the NanoString technique for Cerebro-Spinal Fluid microRNA profiling. In this article, we demonstrated that Cerebro-Spinal Fluid microRNA profiling mirrors Central Nervous System physiologic or pathologic conditions. Although more cases need to be tested, we identified a diagnostic Cerebro-Spinal Fluid microRNA signature with good prospects for future diagnostic clinical applications.

  19. Policy implications of differential health status in East and West Europe. The case of Hungary.

    PubMed

    Makara, P

    1994-11-01

Morbidity and mortality trends in Western and Eastern Europe have differed considerably during the past three decades, although the major unfavourable processes have been essentially the same in each of the Central European countries. The most striking feature has been the decline in average life expectancy and the deterioration of age-specific mortality rates for the middle-aged, especially men. The former socialist government took no effective action. Owing to the denial of social and environmental problems, social, health and environmental policy were underdeveloped and deformed. Partly inherited from previous historical traditions, wishful thinking, victimization and a patronizing attitude were the primary ways of dealing with problems. In these circumstances even the few specially supported health education campaigns were doomed to fail. People depended in vain on the omnipotent central state to solve their problems, so health promotion based on the community and self-empowerment did not develop. During the early nineties, no central political strategies were initiated or launched in Eastern and Central Europe to combat these mortality and morbidity tendencies. The economic and social prerequisites of a long-term, gradual improvement in health status are missing in Central and Eastern Europe. A declining standard of living due to recession, growing deprivation, poverty, unemployment and migration are unfavourable to improvements in health. In a time of crisis, with stress but without adequate coping skills, forced adaptation associated with sudden changes and perceived failure has only made matters worse. There are no short-term 'solutions'.

  20. Integration of spectral domain optical coherence tomography with microperimetry generates unique datasets for the simultaneous identification of visual function and retinal structure in ophthalmological applications

    NASA Astrophysics Data System (ADS)

    Koulen, Peter; Gallimore, Gary; Vincent, Ryan D.; Sabates, Nelson R.; Sabates, Felix N.

    2011-06-01

    Conventional perimeters are used routinely in various eye disease states to evaluate the central visual field and to quantitatively map sensitivity. However, standard automated perimetry proves difficult for retina and specifically macular disease due to the need for central and steady fixation. Advances in instrumentation have led to microperimetry, which incorporates eye tracking for placement of macular sensitivity values onto an image of the macular fundus thus enabling a precise functional and anatomical mapping of the central visual field. Functional sensitivity of the retina can be compared with the observed structural parameters that are acquired with high-resolution spectral domain optical coherence tomography and by integration of scanning laser ophthalmoscope-driven imaging. Findings of the present study generate a basis for age-matched comparison of sensitivity values in patients with macular pathology. Microperimetry registered with detailed structural data performed before and after intervention treatments provides valuable information about macular function, disease progression and treatment success. This approach also allows for the detection of disease or treatment related changes in retinal sensitivity when visual acuity is not affected and can drive the decision making process in choosing different treatment regimens and guiding visual rehabilitation. This has immediate relevance for applications in central retinal vein occlusion, central serous choroidopathy, age-related macular degeneration, familial macular dystrophy and several other forms of retina related visual disability.

  1. Implementation of a reference standard and proficiency testing programme by the World Wide Antimalarial Resistance Network (WWARN)

    PubMed Central

    2010-01-01

Background The Worldwide Antimalarial Resistance Network (WWARN) is a global collaboration to support the objective that anyone affected by malaria receives effective and safe drug treatment. The Pharmacology module aims to inform optimal anti-malarial drug selection. There is an urgent need to define the drug exposure-effect relationship for most anti-malarial drugs. Few anti-malarials have had their therapeutic blood concentration levels defined. One of the main challenges in assessing safety and efficacy data in relation to drug concentrations is the comparability of data generated from different laboratories. To explain differences in anti-malarial pharmacokinetics in studies with different measurement laboratories, it is necessary to confirm the accuracy of the assay methods. This requires the establishment of an external quality assurance process to ensure that results can be compared. This paper describes that process. Methods The pharmacology module of WWARN has established a quality assurance/quality control (QA/QC) programme consisting of two separate components: 1. A proficiency testing programme in which blank human plasma spiked with certified reference material (CRM) at different concentrations is sent out to participating bioanalytical laboratories. 2. A certified reference standard programme in which accurately weighed amounts of certified anti-malarial reference standards, metabolites, and internal standards are sent to participating bioanalytical and in vitro laboratories. Conclusion The proficiency testing programme is designed as a cooperative effort to help participating laboratories assess their ability to carry out drug analysis, resolve any potential problem areas and improve their results and, in so doing, to improve the quality of anti-malarial pharmacokinetic data published and shared with WWARN. By utilizing the same source of standards for all laboratories, it is possible to minimize bias arising from poor-quality reference standards. By providing anti-malarial drug standards from a central point, it is possible to lower the cost of these standards. PMID:21184684
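As an illustration of how a proficiency-testing round like the one described above can be scored (my own sketch, not part of the WWARN protocol): each laboratory's reported concentration is compared against the spiked nominal value, and the relative bias is checked against a pass threshold. The ±15% limit below is an assumption, borrowed from common bioanalytical acceptance criteria:

```python
def relative_bias(measured: float, nominal: float) -> float:
    """Relative bias (%) of a lab's reported concentration vs. the spiked nominal value."""
    return 100 * (measured - nominal) / nominal

def passes(measured: float, nominal: float, limit_pct: float = 15.0) -> bool:
    """True if the result falls within +/- limit_pct of the nominal concentration."""
    return abs(relative_bias(measured, nominal)) <= limit_pct

# Hypothetical round: plasma spiked at 100 ng/mL, three labs report their values
nominal = 100.0
for lab, value in [("lab A", 96.0), ("lab B", 118.5), ("lab C", 104.2)]:
    verdict = "pass" if passes(value, nominal) else "fail"
    print(lab, f"{relative_bias(value, nominal):+.1f}%", verdict)
```

Here lab B, at +18.5% bias, would be flagged for follow-up, which is the kind of problem-area resolution the programme describes.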

  2. Current controlled vocabularies are insufficient to uniquely map molecular entities to mass spectrometry signal

    PubMed Central

    2015-01-01

Background The comparison of analyte mass spectrometry precursor (MS1) signal is central to many proteomic (and other -omic) workflows. Standard vocabularies for mass spectrometry exist and provide good coverage for most experimental applications, yet are insufficient for concise and unambiguous description of data concepts spanning the range of signal provenance from a molecular perspective (e.g. from charged peptides down to fine isotopes). Without a standard unambiguous nomenclature, literature searches, algorithm reproducibility and algorithm evaluation for MS-omics data processing are nearly impossible. Results We show how terms from current official ontologies are too vague or ambiguous to explicitly map molecular entities to MS signals, and we illustrate the inconsistency and ambiguity of current colloquially used terms. We also propose a set of terms for MS1 signal that uniquely, succinctly and intuitively describe data concepts spanning the range of signal provenance from the full molecule down to fine isotopes. We suggest that additional community discussion of these terms should precede any further standardization efforts. We propose a novel nomenclature that spans the range of the required granularity to describe MS data processing from the perspective of the molecular provenance of the MS signal. Conclusions The proposed nomenclature provides a chain of succinct and unique terms spanning the signal created by a charged molecule down through each of its constituent subsignals. We suggest that additional community discussion of these terms should precede any further standardization efforts. PMID:25952148

  3. INFIBRA: machine vision inspection of acrylic fiber production

    NASA Astrophysics Data System (ADS)

    Davies, Roger; Correia, Bento A. B.; Contreiras, Jose; Carvalho, Fernando D.

    1998-10-01

    This paper describes the implementation of INFIBRA, a machine vision system for the inspection of acrylic fiber production lines. The system was developed by INETI under a contract from Fisipe, Fibras Sinteticas de Portugal, S.A. At Fisipe there are ten production lines in continuous operation, each approximately 40 m in length. A team of operators used to perform periodic manual visual inspection of each line in conditions of high ambient temperature and humidity. It is not surprising that failures in the manual inspection process occurred with some frequency, with consequences that ranged from reduced fiber quality to production stoppages. The INFIBRA system architecture is a specialization of a generic, modular machine vision architecture based on a network of Personal Computers (PCs), each equipped with a low cost frame grabber. Each production line has a dedicated PC that performs automatic inspection, using specially designed metrology algorithms, via four video cameras located at key positions on the line. The cameras are mounted inside custom-built, hermetically sealed water-cooled housings to protect them from the unfriendly environment. The ten PCs, one for each production line, communicate with a central PC via a standard Ethernet connection. The operator controls all aspects of the inspection process, from configuration through to handling alarms, via a simple graphical interface on the central PC. At any time the operator can also view on the central PC's screen the live image from any one of the 40 cameras employed by the system.

  4. Mismatch negativity as a potential neurobiological marker of early-stage Alzheimer disease and vascular dementia.

    PubMed

    Jiang, Shixiang; Yan, Chang; Qiao, Zhengxue; Yao, Haiqian; Jiang, Shiquan; Qiu, Xiaohui; Yang, Xiuxian; Fang, Deyu; Yang, Yanjie; Zhang, Limei; Wang, Lina; Zhang, Liming

    2017-04-24

    Alzheimer's disease (AD) and vascular dementia (VD) are serious, irreversible forms of cognitive impairment, which means that an early diagnosis is essential to slow down their progression. One potential neurophysiological biomarker of these diseases is the mismatch negativity (MMN) event-related potentials (ERP) component, which reflects an automatic detection mechanism at the pre-attentive stages of information processing. We evaluated the auditory MMN response in individuals from two patient groups: those in the prodromal stages of AD (P-AD) and those in the prodromal stages of VD (P-VD). Thirty patients (15 P-AD patients and 15 P-VD patients) and 30 age-matched controls were recruited to undergo electrophysiological recordings during the presentation of an auditory deviant-standard-reverse oddball paradigm that was used to elicit genuine MMN responses. We show that over the frontal-central area, the mean amplitude of the MMN was significantly reduced in both the P-AD (p=0.017) and P-VD groups (p=0.013) compared with controls. The MMN peak latency in P-VD patients was significantly shorter than in controls (p=0.027). No MMN response differences between the P-AD and P-VD were found in either the frontal-central or the temporal areas. These results indicate that P-AD and P-VD patients exhibit impaired pre-attentive information processing mechanisms as revealed by the frontal-central area MMN response, which is associated with sensory memory and cognitive deficits. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Comparison of intermediate-dose methotrexate with cranial irradiation for the post-induction treatment of acute lymphocytic leukemia in children

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, A.I.; Weinberg, V.; Brecher, M.L.

    1983-03-03

The authors compared two regimens with respect to their ability to prolong disease-free survival in 506 children and adolescents with acute lymphocytic leukemia. All responders to induction therapy were randomized to treatment with 2400 rad of cranial irradiation plus intrathecal methotrexate or to treatment with intermediate-dose methotrexate plus intrathecal methotrexate, as prophylaxis for involvement of the central nervous system and other sanctuary areas. Complete responders were stratified into either standard-risk or increased-risk groups on the basis of age and white-cell count at presentation. Among patients with standard risk, hematologic relapses occurred in 9 of 117 given methotrexate and 24 of 120 given irradiation. The rate of central-nervous-system relapse was higher in the methotrexate group (23 of 117) than in the irradiation group. Among patients with increased risk, radiation offered greater protection to the central nervous system than methotrexate; there was no difference in the rate of hematologic relapse. Methotrexate offered better protection against systemic relapse in standard-risk patients and better protection against testicular relapse overall, but it offered less protection against relapses in the central nervous system than cranial irradiation.

  6. Spatial Reorientation of Sensorimotor Balance Control in Altered Gravity

    NASA Technical Reports Server (NTRS)

    Paloski, W. H.; Black, F. L.; Kaufman, G. D.; Reschke, M. F.; Wood, S. J.

    2007-01-01

Changes in the sensorimotor coordination of body segments following space flight are more pronounced after landing when the head is actively tilted with respect to the trunk. This suggests that central vestibular processing shifts from a gravitational frame of reference to a head frame of reference in microgravity. A major effect of such changes is significant postural instability, documented by standard head-erect Sensory Organization Tests. Decrements in functional performance may still be underestimated when the head and gravity reference frames remain aligned. The purpose of this study was to examine adaptive changes in spatial processing for balance control following space flight by incorporating static and dynamic tilts that dissociate the head and gravity reference frames. A second aim of this study was to examine the feasibility of altering the re-adaptation process following space flight by providing discordant visual-vestibular-somatosensory stimuli using short-radius pitch centrifugation.

  7. ADASS Web Database XML Project

    NASA Astrophysics Data System (ADS)

    Barg, M. I.; Stobie, E. B.; Ferro, A. J.; O'Neil, E. J.

In the spring of 2000, at the request of the ADASS Program Organizing Committee (POC), we began organizing information from previous ADASS conferences in an effort to create a centralized database. The database originated from data (invited speakers, participants, papers, etc.) extracted from HyperText Markup Language (HTML) documents from past ADASS host sites. Unfortunately, not all HTML documents are well formed, and parsing them proved to be an iterative process. It was evident from the beginning that if these Web documents were organized in a standardized way, such as XML (Extensible Markup Language), the processing of this information across the Web could be automated, more efficient, and less error prone. This paper will briefly review the many programming tools available for processing XML, including Java, Perl and Python, and will explore the mapping of relational data from our MySQL database to XML.
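A minimal sketch of the relational-to-XML mapping this record describes, using Python's standard library. The element names and row schema here are invented for illustration; the actual ADASS database schema is not given in the abstract:

```python
import xml.etree.ElementTree as ET

def rows_to_xml(rows):
    """Serialize a list of row dicts (e.g. fetched from a MySQL table) into XML."""
    root = ET.Element("conference")
    for row in rows:
        paper = ET.SubElement(root, "paper", id=str(row["id"]))
        ET.SubElement(paper, "title").text = row["title"]
        ET.SubElement(paper, "author").text = row["author"]
    return ET.tostring(root, encoding="unicode")

rows = [
    {"id": 1, "title": "ADASS Web Database XML Project", "author": "Barg, M. I."},
]
print(rows_to_xml(rows))
```

The point of such a mapping is the one the abstract makes: once the conference data lives in well-formed XML rather than ad hoc HTML, downstream processing can be automated with any standard XML toolchain instead of iterative, error-prone screen scraping.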

  8. Variability of the institutional review board process within a national research network.

    PubMed

    Khan, Muhammad A; Barratt, Michelle S; Krugman, Scott D; Serwint, Janet R; Dumont-Driscoll, Marilyn

    2014-06-01

To determine the variability of the institutional review board (IRB) process for a minimal-risk multicenter study. Participants included 24 Continuity Research Network (CORNET) sites of the Academic Pediatric Association that participated in a cross-sectional study. Each site obtained individual institutional IRB approval. An anonymous questionnaire was sent to site investigators about the IRB process at their institution. Twenty-two of 24 sites (92%) responded. Preparation time ranged from 1 to 20 hours, with a mean of 7.1 hours. Individuals submitting ≤3 IRB applications/year required more time for completion than those submitting >3/year (P < .05). Thirteen of 22 (59%) study sites received approval with "exempt" status, and 6 (27%) were approved as "expedited" studies. IRB experiences were highly variable across study sites. These findings indicate that multicenter research projects should anticipate barriers to timely study implementation. Improved IRB standardization or centralization for multicenter clinical studies would facilitate this type of practice-based clinical research.

  9. Enumeration of major peripheral blood leukocyte populations for multicenter clinical trials using a whole blood phenotyping assay.

    PubMed

    Hensley, Tiffany R; Easter, Austin B; Gerdts, Sarah E; De Rosa, Stephen C; Heit, Antje; McElrath, M Juliana; Andersen-Nissen, Erica

    2012-09-16

    Cryopreservation of peripheral blood leukocytes is widely used to preserve cells for immune response evaluations in clinical trials and offers many advantages for ease and standardization of immunological assessments, but detrimental effects of this process have been observed on some cell subsets, such as granulocytes, B cells, and dendritic cells. Assaying fresh leukocytes gives a more accurate picture of the in vivo state of the cells, but is often difficult to perform in the context of large clinical trials. Fresh cell assays are dependent upon volunteer commitments and timeframes and, if time-consuming, their application can be impractical due to the working hours required of laboratory personnel. In addition, when trials are conducted at multiple centers, laboratories with the resources and training necessary to perform the assays may not be located in sufficient proximity to clinical sites. To address these issues, we have developed an 11-color antibody staining panel that can be used with Trucount tubes (Becton Dickinson; San Jose, CA) to phenotype and enumerate the major leukocyte populations within the peripheral blood, yielding more robust cell-type specific information than assays such as a complete blood count (CBC) or assays with commercially-available panels designed for Trucount tubes that stain for only a few cell types. The staining procedure is simple, requires only 100 μl of fresh whole blood, and takes approximately 45 minutes, making it feasible for standard blood-processing labs to perform. It is adapted from the BD Trucount tube technical data sheet (version 8/2010). The staining antibody cocktail can be prepared in advance in bulk at a central assay laboratory and shipped to the site processing labs. Stained tubes can be fixed and frozen for shipment to the central assay laboratory for multicolor flow cytometry analysis. 
The data generated from this staining panel can be used to track changes in leukocyte concentrations over time in relation to intervention and could easily be further developed to assess activation states of specific cell types of interest. In this report, we demonstrate the procedure used by blood-processing lab technicians to perform staining on fresh whole blood and the steps to analyze these stained samples at a central assay laboratory supporting a multicenter clinical trial. The video details the procedure as it is performed in the context of a clinical trial blood draw in the HIV Vaccine Trials Network (HVTN).
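Trucount tubes contain a known number of fluorescent beads, so absolute cell concentrations follow from the ratio of gated cell events to gated bead events. The sketch below shows the standard Trucount-style calculation; the event counts and bead lot size are invented for illustration:

```python
def absolute_count(cell_events: int, bead_events: int,
                   beads_per_tube: int, sample_volume_ul: float) -> float:
    """Absolute cell concentration (cells/uL) from a bead-based counting tube."""
    return (cell_events / bead_events) * (beads_per_tube / sample_volume_ul)

# Hypothetical acquisition: 25,000 gated CD3+ events, 10,000 gated bead events,
# a bead lot of 50,000 beads per tube, and 100 uL of stained whole blood
print(absolute_count(25_000, 10_000, 50_000, 100))  # -> 1250.0 cells/uL
```

Because the bead count per tube is lot-specific and printed on the tube pouch, the same formula works at every processing site, which is what makes the assay suitable for the multicenter setting described above.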

  10. Assembling proteomics data as a prerequisite for the analysis of large scale experiments

    PubMed Central

    Schmidt, Frank; Schmid, Monika; Thiede, Bernd; Pleißner, Klaus-Peter; Böhme, Martina; Jungblut, Peter R

    2009-01-01

Background Despite the complete determination of the genome sequence of a huge number of bacteria, their proteomes remain relatively poorly defined. Besides new methods to increase the number of identified proteins, new database applications are necessary to store and present results of large-scale proteomics experiments. Results In the present study, a database concept has been developed to address these issues and to offer complete information via a web interface. In our concept, the Oracle-based data repository system SQL-LIMS plays the central role in the proteomics workflow and was applied to the proteomes of Mycobacterium tuberculosis, Helicobacter pylori, Salmonella typhimurium and protein complexes such as the 20S proteasome. Technical operations of our proteomics labs were used as the standard for SQL-LIMS template creation. By means of a Java-based data parser, post-processed data of different approaches, such as LC/ESI-MS, MALDI-MS and 2-D gel electrophoresis (2-DE), were stored in SQL-LIMS. A minimum set of the proteomics data were transferred to our public 2D-PAGE database using a Java-based interface (Data Transfer Tool) meeting the requirements of the PEDRo standardization. Furthermore, the stored proteomics data were extractable from SQL-LIMS via XML. Conclusion The Oracle-based data repository system SQL-LIMS played the central role in the proteomics workflow concept. Technical operations of our proteomics labs were used as standards for SQL-LIMS templates. Using a Java-based parser, post-processed data of different approaches such as LC/ESI-MS, MALDI-MS, 1-DE and 2-DE were stored in SQL-LIMS. Thus, unique data formats of different instruments were unified and stored in SQL-LIMS tables. Moreover, a unique submission identifier allowed fast access to all experimental data. This was the main advantage compared to multi-software solutions, especially where personnel fluctuations are high. Moreover, large-scale and high-throughput experiments must be managed in a comprehensive repository system such as SQL-LIMS to query results in a systematic manner. On the other hand, these database systems are expensive and require at least one full-time administrator and a specialized lab manager. Moreover, the high technical dynamics in proteomics may make it difficult to accommodate new data formats. To summarize, SQL-LIMS met the requirements of proteomics data handling, especially in skilled processes such as gel electrophoresis or mass spectrometry, and fulfilled the PSI standardization criteria. The data transfer into a public domain via DTT facilitated validation of proteomics data. Additionally, evaluation of mass spectra by post-processing using MS-Screener improved the reliability of mass analysis and prevented storage of data junk. PMID:19166578

  11. Do clinical guidelines improve management of sepsis in critically ill elderly patients? A before-and-after study of the implementation of a sepsis protocol.

    PubMed

    Heppner, Hans Juergen; Singler, Katrin; Kwetkat, Anja; Popp, Steffen; Esslinger, Adelheid Susanne; Bahrmann, Philipp; Kaiser, Matthias; Bertsch, Thomas; Sieber, Cornel Christian; Christ, Michael

    2012-10-01

Guidelines for the management of sepsis have been published but not validated for elderly patients, though a prompt work-up and initiation of appropriate therapy are crucial. This study assesses the impact of a sepsis protocol on timelines for therapy and mortality under standardized management. Consecutive patients aged 70 years and older who were diagnosed with sepsis and admitted during the observation periods were included in this before-and-after study at a medical intensive care unit (ICU). Age, sex, and process-of-care variables, including timely administration of antibiotics, obtaining blood cultures before the start of antibiotics, documenting central venous pressure, evaluation of central venous blood oxygen saturation, fluid resuscitation, and patient outcome, were recorded. A total of 122 patients were included. Sepsis was diagnosed in 22.9% of patients prior to the introduction of the protocol and 57.4% after introduction. Volume therapy was conducted in 63.9% of the patients (11.5% preprotocol). Blood culture samples were taken prior to the administration of antibiotics in 67.2% of patients (4.9% preprotocol), and antibiotics were applied early in 72.1% of patients (32.8% preprotocol). Lactate was measured in 77.0% of patients (11.5% preprotocol). A central venous catheter was inserted in 88.5% of patients (68.9% preprotocol), and the target central venous pressure was achieved in 64.3% of patients (47.2% preprotocol). ICU mortality was reduced by 5.2% and hospital mortality by 6.4%. The use of standardized order sets for the management of sepsis in elderly patients should be strongly recommended for better treatment performance. Compliance with the protocol was associated with reduced length of stay, reduced mortality, and improved initial appropriate therapy.

  12. 40 CFR 437.15 - Pretreatment standards for existing sources (PSES).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) EFFLUENT GUIDELINES AND STANDARDS THE CENTRALIZED WASTE TREATMENT POINT SOURCE CATEGORY Metals Treatment..., cadmium, chromium, cobalt, copper, lead, mercury, nickel, silver, tin, titanium, vanadium, and zinc are...

  13. Visual processing in adolescents with autism spectrum disorder: evidence from embedded figures and configural superiority tests.

    PubMed

    Dillen, Claudia; Steyaert, Jean; Op de Beeck, Hans P; Boets, Bart

    2015-05-01

The embedded figures test has often been used to reveal weak central coherence in individuals with autism spectrum disorder (ASD). Here, we administered a more standardized, automated version of the embedded figures test in combination with the configural superiority task to investigate the effect of contextual modulation on local feature detection in 23 adolescents with ASD and 26 matched typically developing controls. On both tasks, the two groups performed similarly in terms of accuracy and reaction time, and both displayed the contextual modulation effect. This indicates that individuals with ASD are as sensitive as typically developing individuals to the contextual effects of the task and that there is no evidence for a local processing bias in adolescents with ASD.

  14. Renyi entropy measures of heart rate Gaussianity.

    PubMed

    Lake, Douglas E

    2006-01-01

    Sample entropy and approximate entropy are measures that have been successfully utilized to study the deterministic dynamics of heart rate (HR). A complementary stochastic point of view and a heuristic argument using the Central Limit Theorem suggests that the Gaussianity of HR is a complementary measure of the physiological complexity of the underlying signal transduction processes. Renyi entropy (or q-entropy) is a widely used measure of Gaussianity in many applications. Particularly important members of this family are differential (or Shannon) entropy (q = 1) and quadratic entropy (q = 2). We introduce the concepts of differential and conditional Renyi entropy rate and, in conjunction with Burg's theorem, develop a measure of the Gaussianity of a linear random process. Robust algorithms for estimating these quantities are presented along with estimates of their standard errors.
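
    The Renyi family the abstract builds on has a simple closed form for discrete distributions; the differential (continuous) versions used for HR analysis are the analogous integrals. A minimal sketch of the discrete case (function name is ours), assuming entropy in nats:

```python
import numpy as np

def renyi_entropy(p, q):
    """Renyi entropy of order q for a discrete distribution p, in nats.
    q -> 1 recovers Shannon (differential) entropy; q = 2 is quadratic entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                                   # ignore zero-probability outcomes
    if np.isclose(q, 1.0):
        return float(-np.sum(p * np.log(p)))       # Shannon limit of the family
    return float(np.log(np.sum(p ** q)) / (1.0 - q))

uniform = [0.25] * 4                               # maximally "spread-out" distribution
```

    For a uniform distribution over n outcomes, every order q gives the same value, log n; differences between orders (e.g. q = 1 versus q = 2) quantify departure from uniformity, which is the intuition behind using q-entropy comparisons as measures of distributional shape such as Gaussianity.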

  15. National Emission Standards for Hazardous Air Pollutants (NESHAP) Memorandum of Agreement (MOA) Between NASA Headquarters and MSFC (Marshall Space Flight Center) for NASA Principal Center for Review of Clean Air Regulations

    NASA Technical Reports Server (NTRS)

    Caruso, Salvadore V.; Clark-Ingram, Marceia A.

    2000-01-01

    This paper presents a memorandum of agreement on Clean Air regulations. NASA Headquarters (Code JE and Code M) has asked MSFC to serve as the Principal Center for review of Clean Air Act (CAA) regulations. The purpose of the Principal Center is to provide centralized support to NASA Headquarters for the management and leadership of NASA's CAA regulation review process and to identify the potential impact of proposed CAA regulations on NASA program hardware and supporting facilities. The materials and processes utilized in the manufacture of NASA's programmatic hardware contain HAPs (Hazardous Air Pollutants), VOCs (Volatile Organic Compounds), and ODCs (Ozone-Depleting Chemicals). This paper is presented in viewgraph form.

  16. MATHEMATICAL METHODS IN MEDICAL IMAGE PROCESSING

    PubMed Central

    ANGENENT, SIGURD; PICHON, ERIC; TANNENBAUM, ALLEN

    2013-01-01

    In this paper, we describe some central mathematical problems in medical imaging. The subject has been undergoing rapid changes driven by better hardware and software. Much of the software is based on novel methods utilizing geometric partial differential equations in conjunction with standard signal/image processing techniques as well as computer graphics facilitating man/machine interactions. As part of this enterprise, researchers have been trying to base biomedical engineering principles on rigorous mathematical foundations for the development of software methods to be integrated into complete therapy delivery systems. These systems support the more effective delivery of many image-guided procedures such as radiation therapy, biopsy, and minimally invasive surgery. We will show how mathematics may impact some of the main problems in this area, including image enhancement, registration, and segmentation. PMID:23645963
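
    As a toy illustration of the PDE-based enhancement methods the paper surveys, the simplest case is linear (heat-equation) diffusion, which smooths an image by evolving u_t = Δu. This sketch (names ours) uses an explicit finite-difference scheme with periodic boundaries; the geometric PDEs discussed in the paper (e.g. anisotropic or curvature-driven flows) refine this idea to preserve edges:

```python
import numpy as np

def heat_diffuse(img, steps=20, dt=0.2):
    """Explicit finite-difference solve of u_t = Laplacian(u), the simplest
    PDE-based image smoother. dt <= 0.25 keeps the 2-D scheme stable; np.roll
    gives periodic boundaries, so total intensity is conserved."""
    u = np.asarray(img, dtype=float).copy()
    for _ in range(steps):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
        u += dt * lap
    return u
```

    Running this on a single bright pixel spreads its intensity into a blob while preserving the image's total mass, which is the defining behavior of linear diffusion.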

  17. Optimization of composite coagulant made from polyferric chloride and tapioca starch in landfill leachate treatment

    NASA Astrophysics Data System (ADS)

    Shaylinda, M. Z. N.; Hamidi, A. A.; Mohd, N. A.; Ariffin, A.; Irvan, D.; Hazreek, Z. A. M.; Nizam, Z. M.

    2018-04-01

    In this research, the performance of polyferric chloride combined with tapioca flour as a composite coagulant for partially stabilized leachate was investigated. Response surface methodology (RSM) was used to optimize the coagulation and flocculation process of partially stabilized leachate. Central composite design, a standard design tool in RSM, was applied to evaluate the interactions and effects of dose and pH. A dose of 0.2 g/L Fe and a pH of 4.71 were the optimum values suggested by RSM. An experimental test based on the optimum conditions resulted in removals of 95.9%, 94.6%, and 50.4% for SS, color, and COD, respectively. The percentage difference recorded between experimental and model responses was <5%. It can therefore be concluded that RSM is an appropriate optimization tool for the coagulation and flocculation process.
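
    A central composite design in k coded factors combines 2^k factorial corners, 2k axial (star) points at distance alpha, and replicated center points; for two factors such as dose and pH, the rotatable choice is alpha = sqrt(2). A minimal generator (names ours), as a sketch of the kind of design matrix the study would have run:

```python
import itertools
import numpy as np

def central_composite_design(k=2, alpha=None, n_center=5):
    """Central composite design in coded units: 2^k factorial corners,
    2k axial points at +/- alpha on each axis, and n_center center replicates.
    alpha defaults to the rotatable choice (2^k)**0.25."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25
    factorial = np.array(list(itertools.product([-1, 1], repeat=k)), dtype=float)
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i] = -alpha
        axial[2 * i + 1, i] = alpha
    center = np.zeros((n_center, k))
    return np.vstack([factorial, axial, center])
```

    Each row is one run in coded units; the coded levels are then mapped linearly onto the real factor ranges (here, coagulant dose and pH) before the experiments are performed.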

  18. Preventing CLABSIs among pediatric hematology/oncology inpatients: national collaborative results.

    PubMed

    Bundy, David G; Gaur, Aditya H; Billett, Amy L; He, Bing; Colantuoni, Elizabeth A; Miller, Marlene R

    2014-12-01

    Central lines (CLs) are essential for the delivery of modern cancer care to children. Nonetheless, CLs are subject to potentially life-threatening complications, including central line-associated bloodstream infections (CLABSIs). The objective of this study was to assess the feasibility of a multicenter effort to standardize CL care and CLABSI tracking, and to quantify the impact of standardizing these processes on CLABSI rates among pediatric hematology/oncology inpatients. We conducted a multicenter quality improvement collaborative starting in November 2009. Multidisciplinary teams at participating sites implemented a standardized bundle of CL care practices and adopted a common approach to CLABSI surveillance. Thirty-two units participated in the collaborative and reported a mean, precollaborative CLABSI rate of 2.85 CLABSIs per 1000 CL-days. Self-reported adoption of the CL care bundle was brisk, with average compliance approaching 80% by the end of the first year of the collaborative and exceeding 80% thereafter. As of August 2012, the mean CLABSI rate during the collaborative was 2.04 CLABSIs per 1000 CL-days, a reduction of 28% (relative risk: 0.71 [95% confidence interval: 0.55-0.92]). Changes in self-reported CL care bundle compliance were not statistically associated with changes in CLABSI rates, although there was little variability in bundle compliance rates after the first year of the collaborative. A multicenter quality improvement collaborative found significant reductions in observed CLABSI rates in pediatric hematology/oncology inpatients. Additional interventions will likely be required to bring and sustain CLABSI rates closer to zero for this high-risk population. Copyright © 2014 by the American Academy of Pediatrics.
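
    The reported 28% reduction can be checked directly from the two rates; note that the abstract's relative risk of 0.71 comes from an adjusted model, which is close to this crude ratio:

```python
pre, post = 2.85, 2.04          # CLABSIs per 1000 CL-days, from the abstract
crude_ratio = post / pre        # crude rate ratio, close to the adjusted RR of 0.71
reduction_pct = (1 - crude_ratio) * 100
print(f"{reduction_pct:.0f}%")  # prints "28%"
```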

  19. Development and standardization of Arabic words in noise test in Egyptian children.

    PubMed

    Abdel Rahman, Tayseer Taha

    2018-05-01

    To develop and establish norms for an Arabic Words in Noise test in Egyptian children. A total of 152 participants with normal hearing, ranging in age from 5 to 12 years, were included. They were subdivided into two main groups: a standardization group comprising 120 children with normal scholastic achievement and an application group comprising 32 children with different types of central auditory processing disorders. Arabic versions of both the Speech Perception in Noise (SPIN) and Words in Noise (WIN) tests were presented to each ear at zero signal-to-noise ratio (SNR) using ipsilateral cafeteria noise fixed at 50 dB sensation level (dB SL). The lowest performance on the WIN test occurred between 5 and 7 years and the highest scores from 9 to 12 years; however, no statistically significant difference was found among the three standardization age groups. Moreover, no statistically significant difference was found between the right- and left-ear scores or among the three lists. When the WIN test was compared to the SPIN test in children with and without abnormal SPIN scores, it showed highly consistent results except in children suffering from memory deficits, reflecting that the WIN test is more accurate than the SPIN in this group of children. The Arabic WIN test can be used in children as young as 5 years. It can also serve as a cross-check with the SPIN test, or be used to follow up hearing-impaired children after a rehabilitation program or children with selective auditory attention deficit after central auditory remediation. Copyright © 2017. Published by Elsevier B.V.

  20. Centrally Determined Standardization of Flow Cytometry Methods Reduces Interlaboratory Variation in a Prospective Multicenter Study

    PubMed Central

    Westera, Liset; van Viegen, Tanja; Jeyarajah, Jenny; Azad, Azar; Bilsborough, Janine; van den Brink, Gijs R; Cremer, Jonathan; Danese, Silvio; D'Haens, Geert; Eckmann, Lars; Faubion, William; Filice, Melissa; Korf, Hannelie; McGovern, Dermot; Panes, Julian; Salas, Azucena; Sandborn, William J; Silverberg, Mark S; Smith, Michelle I; Vermeire, Severine; Vetrano, Stefania; Shackelton, Lisa M; Stitt, Larry; Jairath, Vipul; Levesque, Barrett G; Spencer, David M; Feagan, Brian G; Vande Casteele, Niels

    2017-01-01

    Objectives: Flow cytometry (FC) aids in characterization of cellular and molecular factors involved in pathologic immune responses. Although FC has the potential to facilitate early drug development in inflammatory bowel disease, interlaboratory variability limits its use in multicenter trials. Standardization of methods may address this limitation. We compared variability in FC-aided quantitation of T-cell responses across international laboratories using three analytical strategies. Methods: Peripheral blood mononuclear cells (PBMCs) were isolated from three healthy donors, stimulated with phorbol 12-myristate 13-acetate and ionomycin at a central laboratory, fixed, frozen, and shipped to seven international laboratories. Permeabilization and staining were performed in triplicate at each laboratory using a common protocol and centrally provided reagents. Gating was performed using local gating with a local strategy (LGLS), local gating with a central strategy (LGCS), and central gating (CG). Median cell percentages were calculated across triplicates and donors, and reported for each condition and strategy. The coefficient of variation (CV) was calculated across laboratories. Between-strategy comparisons were made using a two-way analysis of variance adjusting for donor. Results: Mean interlaboratory CV ranged from 1.8% to 102.1% depending on cell population and gating strategy (LGLS, 4.4–102.1%; LGCS, 10.9–65.6%; CG, 1.8–20.9%). Mean interlaboratory CV differed significantly across strategies and was consistently lower with CG. Conclusions: Central gating was the only strategy with mean CVs consistently lower than 25%, which is a proposed standard for pharmacodynamic and exploratory biomarker assays. PMID:29095427
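
    The dispersion statistic used here, the interlaboratory CV, is simply the sample standard deviation of the per-laboratory values divided by their mean, expressed as a percentage. A minimal sketch (function name ours):

```python
import numpy as np

def interlab_cv(values):
    """Interlaboratory coefficient of variation (%): sample standard deviation
    (ddof=1) divided by the mean, across per-laboratory values for one cell
    population."""
    v = np.asarray(values, dtype=float)
    return float(100.0 * v.std(ddof=1) / v.mean())
```

    For example, two laboratories reporting 8% and 12% for the same population give a CV of about 28%, already above the 25% threshold the study proposes for acceptable assays.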

  1. Strengthening Learners' Perspectives in Professional Standards to Restore Relationality as Central to Teaching

    ERIC Educational Resources Information Center

    Kriewaldt, Jeana A.

    2015-01-01

    Australian teacher standards have effects on what is thought about teachers' work. Just as teacher standards give expression to some characteristics of quality teaching, so too do students' views if solicited and made public, yet the archive of teaching standards pays little attention to learners' perspectives. This paper uses a theoretical…

  2. Thermostable 𝜶-Amylase Activity from Thermophilic Bacteria Isolated from Bora Hot Spring, Central Sulawesi

    NASA Astrophysics Data System (ADS)

    Gazali, F. M.; Suwastika, I. N.

    2018-03-01

    α-Amylase is one of the most important enzymes in the biotechnology field, especially in industrial applications. The thermostability of α-amylase produced by thermophilic bacteria improves industrial starch-degradation processes in the starch industry. The present study concerned the characterization of α-amylase activity from indigenous thermophilic bacteria isolated from the Bora hot spring, Central Sulawesi. Eighteen isolates were successfully obtained from 90°C sediment samples of the Bora hot spring, and 13 of them showed amylolytic activity. α-Amylase activity was measured qualitatively on starch agar and quantitatively by the DNS (3,5-dinitrosalicylic acid) method, using maltose as the standard solution. Two isolates (out of the 13 amylolytic bacteria), BR 002 and BR 015, showed amylolytic indices of 0.8 mm and 0.5 mm, respectively, after incubation at 55°C on 0.002% starch agar medium. The α-amylase activity was further characterized quantitatively, including the optimum pH and temperature of the crude α-amylase enzyme from each isolate. To our knowledge, this is the first report on the isolation and characterization of a thermostable α-amylase from thermophilic bacteria isolated from Central Sulawesi, particularly from the Bora hot spring.

  3. The effects of baryon physics, black holes and active galactic nucleus feedback on the mass distribution in clusters of galaxies

    NASA Astrophysics Data System (ADS)

    Martizzi, Davide; Teyssier, Romain; Moore, Ben; Wentz, Tina

    2012-06-01

    The spatial distribution of matter in clusters of galaxies is mainly determined by the dominant dark matter component; however, physical processes involving baryonic matter are able to modify it significantly. We analyse a set of 500 pc resolution cosmological simulations of a cluster of galaxies with mass comparable to Virgo, performed with the AMR code RAMSES. We compare the mass density profiles of the dark, stellar and gaseous matter components of the cluster that result from different assumptions for the subgrid baryonic physics and galaxy formation processes. First, the prediction of a gravity-only N-body simulation is compared to that of a hydrodynamical simulation with standard galaxy formation recipes, and then all results are compared to a hydrodynamical simulation which includes thermal active galactic nucleus (AGN) feedback from supermassive black holes (SMBHs). We find the usual effects of overcooling and adiabatic contraction in the run with standard galaxy formation physics, but very different results are found when implementing SMBHs and AGN feedback. Star formation is strongly quenched, producing lower stellar densities throughout the cluster, and much less cold gas is available for star formation at low redshifts. At redshift z= 0 we find a flat density core of radius 10 kpc in both the dark and stellar matter density profiles. We speculate on the possible formation mechanisms able to produce such cores and we conclude that they can be produced through the coupling of different processes: (I) dynamical friction from the decay of black hole orbits during galaxy mergers; (II) AGN-driven gas outflows producing fluctuations of the gravitational potential causing the removal of collisionless matter from the central region of the cluster; (III) adiabatic expansion in response to the slow expulsion of gas from the central region of the cluster during the quiescent mode of AGN activity.

  4. A stand density management diagram for sawtimber-sized mixed upland central hardwoods

    Treesearch

    J.A., Jr. Kershaw; B.C. Fischer

    1991-01-01

    Data from 190 CFI plots located in southern and west-central Indiana are used to develop a stand density diagram for sawtimber-sized mixed upland hardwoods in the Central States. The stand density diagram utilizes the concepts of self-thinning to establish a maximum size-density curve, and the stocking standards of Gingrich (1967) to formulate intermediate stocking...

  5. An Exploratory Study of Central Office Best Practices in Washington's Top 5% Performing School Districts

    ERIC Educational Resources Information Center

    Ansingh, Pamela Jean

    2012-01-01

    How prepared are central office administrators to lead school districts in the face of changing standards and changing student populations? The purpose of this mixed methods study was to examine the responses of central office administrators in the top 5% performing school districts in Washington State with the goal of identifying common attitudes…

  6. Transfer function-derived central pressure and cardiovascular disease events: the Framingham Heart Study.

    PubMed

    Mitchell, Gary F; Hwang, Shih-Jen; Larson, Martin G; Hamburg, Naomi M; Benjamin, Emelia J; Vasan, Ramachandran S; Levy, Daniel; Vita, Joseph A

    2016-08-01

    Relations between central pulse pressure (PP) or pressure amplification and major cardiovascular disease (CVD) events are controversial. Estimates of central aortic pressure derived using radial artery tonometry and a generalized transfer function may better predict CVD risk beyond the predictive value of brachial SBP. Augmentation index, central SBP, central PP, and central-to-peripheral PP amplification were evaluated using radial artery tonometry and a generalized transfer function as implemented in the SphygmoCor device (AtCor Medical, Itasca, Illinois, USA). We used proportional hazards models to examine relations between central hemodynamics and first-onset major CVD events in 2183 participants (mean age 62 years, 58% women) in the Framingham Heart Study. During median follow-up of 7.8 (limits 0.2-8.9) years, 149 participants (6.8%) had an incident event. Augmentation index (P = 0.6), central aortic systolic pressure (P = 0.20), central aortic PP (P = 0.24), and PP amplification (P = 0.15) were not related to CVD events in multivariable models that adjusted for age, sex, brachial cuff systolic pressure, use of antihypertensive therapy, total and high-density lipoprotein cholesterol concentrations, smoking, and presence of diabetes. In a model that included standard risk factors, model fit was improved (P = 0.03) when brachial systolic pressure was added after central, whereas model fit was not improved (P = 0.30) when central systolic pressure was added after brachial. After considering standard risk factors, including brachial cuff SBP, augmentation index, central PP and PP amplification derived using radial artery tonometry, and a generalized transfer function were not predictive of CVD risk.
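
    For readers outside the field: pulse pressure (PP) is systolic minus diastolic pressure, and PP amplification is conventionally the ratio of peripheral (brachial) to central (aortic) PP. A sketch with illustrative numbers (not study data; function names ours):

```python
def pulse_pressure(sbp, dbp):
    """Pulse pressure (PP) = systolic minus diastolic pressure, in mmHg."""
    return sbp - dbp

def pp_amplification(brachial_pp, central_pp):
    """Central-to-peripheral PP amplification: ratio of peripheral (brachial)
    to central (aortic) pulse pressure; values > 1 reflect amplification of
    the pressure pulse toward the periphery."""
    return brachial_pp / central_pp

# Illustrative example: brachial 120/80 mmHg vs. central 110/82 mmHg
brachial = pulse_pressure(120, 80)   # 40 mmHg
central = pulse_pressure(110, 82)    # 28 mmHg
amplification = pp_amplification(brachial, central)
```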

  7. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    PubMed

    Ishibashi, Midori

    2015-01-01

    Cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Against this background, the importance of laboratory tests is increasing, especially in evaluating clinical study participants' entry and safety, and drug efficacy. To assure the quality of laboratory tests, providing high-quality laboratory tests is mandatory. For adequate quality assurance in laboratory tests, quality control in the three fields of pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, some laboratory tests are done in offsite central laboratories after specimen shipping. Individual and inter-individual variations are well-known factors affecting laboratory tests. Besides these factors, standardizing specimen collection, handling, preparation, storage, and shipping may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference interval are also important factors. It is concluded that, to overcome the problems arising from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  8. Recognition of oral spelling is diagnostic of the central reading processes.

    PubMed

    Schubert, Teresa; McCloskey, Michael

    2015-01-01

    The task of recognition of oral spelling (stimulus: "C-A-T", response: "cat") is often administered to individuals with acquired written language disorders, yet there is no consensus about the underlying cognitive processes. We adjudicate between two existing hypotheses: Recognition of oral spelling uses central reading processes, or recognition of oral spelling uses central spelling processes in reverse. We tested the recognition of oral spelling and spelling to dictation abilities of a single individual with acquired dyslexia and dysgraphia. She was impaired relative to matched controls in spelling to dictation but unimpaired in recognition of oral spelling. Recognition of oral spelling for exception words (e.g., colonel) and pronounceable nonwords (e.g., larth) was intact. Our results were predicted by the hypothesis that recognition of oral spelling involves the central reading processes. We conclude that recognition of oral spelling is a useful tool for probing the integrity of the central reading processes.

  9. 45 CFR 170.205 - Content exchange standards and implementation specifications for exchanging electronic health...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... The Healthcare Information Technology Standards Panel (HITSP) Summary Documents Using HL7 CCD... specifications. Implementation Guide for Ambulatory Healthcare Provider Reporting to Central Cancer Registries...

  10. Iowa Central Quality Fuel Testing Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heach, Don; Bidieman, Julaine

    2013-09-30

    The objective of this project is to finalize the creation of an independent quality fuel testing laboratory on the campus of Iowa Central Community College in Fort Dodge, Iowa, that shall provide the rapidly growing biofuels industry a timely, cost-effective, and centrally located laboratory to complete all required state and federal fuel and related tests. The recipient shall work with various state regulatory agencies, biofuel companies, and state and national industry associations to ensure that the training and testing needs of their members and American consumers are met. The recipient shall work with the Iowa Department of Ag and Land Stewardship on the development of an Iowa Biofuel Quality Standard, along with the development of a standard that can be used throughout the industry.

  11. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    PubMed

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and absence of bar codes at the primary-level packaging level, such as single blister packs. The team in Ethiopia developed an open-sourced smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. 
Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems, countries are able to lay the foundation for interoperability and ensure a harmonized language between global health stakeholders. © Hara et al.
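
    The global standards referenced here are the GS1 system, whose GTIN identifiers end in a mod-10 check digit computed with alternating weights of 3 and 1 from the right. A minimal sketch of that rule (function name ours):

```python
def gtin_check_digit(data_digits):
    """Mod-10 check digit per the GS1 General Specifications: starting from
    the rightmost data digit, weight digits 3, 1, 3, 1, ...; the check digit
    brings the weighted sum up to the next multiple of 10."""
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(data_digits)))
    return (10 - total % 10) % 10

# GTIN-13 example: data digits 400638133393 yield check digit 1,
# so the full bar-coded number reads 4006381333931.
```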

  12. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility

    PubMed Central

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-01-01

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and absence of bar codes at the primary-level packaging level, such as single blister packs. The team in Ethiopia developed an open-sourced smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. 
Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems, countries are able to lay the foundation for interoperability and ensure a harmonized language between global health stakeholders. PMID:29284701

  13. Recommendations for selecting drug-drug interactions for clinical decision support.

    PubMed

    Tilson, Hugh; Hines, Lisa E; McEvoy, Gerald; Weinstein, David M; Hansten, Philip D; Matuszewski, Karl; le Comte, Marianne; Higby-Baker, Stefanie; Hanlon, Joseph T; Pezzullo, Lynn; Vieson, Kathleen; Helwig, Amy L; Huang, Shiew-Mei; Perre, Anthony; Bates, David W; Poikonen, John; Wittie, Michael A; Grizzle, Amy J; Brown, Mary; Malone, Daniel C

    2016-04-15

    Recommendations for including drug-drug interactions (DDIs) in clinical decision support (CDS) are presented. A conference series was conducted to improve CDS for DDIs. A work group consisting of 20 experts in pharmacology, drug information, and CDS from academia, government agencies, health information vendors, and healthcare organizations was convened to address (1) the process to use for developing and maintaining a standard set of DDIs, (2) the information that should be included in a knowledge base of standard DDIs, (3) whether a list of contraindicated drug pairs can or should be established, and (4) how to more intelligently filter DDI alerts. We recommend a transparent, systematic, and evidence-driven process with graded recommendations by a consensus panel of experts and oversight by a national organization. We outline key DDI information needed to help guide clinician decision-making. We recommend judicious classification of DDIs as contraindicated and more research to identify methods to safely reduce repetitive and less-relevant alerts. An expert panel with a centralized organizer or convener should be established to develop and maintain a standard set of DDIs for CDS in the United States. The process should be evidence driven, transparent, and systematic, with feedback from multiple stakeholders for continuous improvement. The scope of the expert panel's work should be carefully managed to ensure that the process is sustainable. Support for research to improve DDI alerting in the future is also needed. Adoption of these steps may lead to consistent and clinically relevant content for interruptive DDIs, thus reducing alert fatigue and improving patient safety. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  14. Dried blood spot specimen quality and validation of a new pre-analytical processing method for qualitative HIV-1 PCR, KwaZulu-Natal, South Africa

    PubMed Central

    Parboosing, Raveen; Siyaca, Ntombizandile; Moodley, Pravikrishnen

    2016-01-01

    Background Poor quality dried blood spot (DBS) specimens are usually rejected by virology laboratories, affecting early infant diagnosis of HIV. The practice of combining two incompletely-filled DBS in one specimen preparation tube during pre-analytical specimen processing (i.e., the two-spot method) has been implemented to reduce the number of specimens being rejected for insufficient volume. Objectives This study analysed laboratory data to describe the quality of DBS specimens and the use of the two-spot method over a one-year period, then validated the two-spot method against the standard (one-spot) method. Methods Data on HIV-1 PCR test requests submitted in 2014 to the Department of Virology at Inkosi Albert Luthuli Central Hospital in KwaZulu-Natal province, South Africa were analysed to describe reasons for specimen rejection, as well as results of the two-spot method. The accuracy, lower limit of detection and precision of the two-spot method were assessed. Results Of the 88 481 specimens received, 3.7% were rejected for pre-analytical problems. Of those, 48.9% were rejected as a result of insufficient specimen volume. Two health facilities had significantly more specimen rejections than other facilities. The two-spot method prevented 10 504 specimen rejections. The Pearson correlation coefficient comparing the standard to the two-spot method was 0.997. Conclusions The two-spot method was comparable with the standard method of pre-analytical specimen processing. Two health facilities were identified for targeted retraining on specimen quality. The two-spot method of DBS specimen processing can be used as an adjunct to retraining, to reduce the number of specimens rejected and improve linkage to care. PMID:28879108

  15. Dried blood spot specimen quality and validation of a new pre-analytical processing method for qualitative HIV-1 PCR, KwaZulu-Natal, South Africa.

    PubMed

    Govender, Kerusha; Parboosing, Raveen; Siyaca, Ntombizandile; Moodley, Pravikrishnen

    2016-01-01

    Poor quality dried blood spot (DBS) specimens are usually rejected by virology laboratories, affecting early infant diagnosis of HIV. The practice of combining two incompletely-filled DBS in one specimen preparation tube during pre-analytical specimen processing (i.e., the two-spot method) has been implemented to reduce the number of specimens being rejected for insufficient volume. This study analysed laboratory data to describe the quality of DBS specimens and the use of the two-spot method over a one-year period, then validated the two-spot method against the standard (one-spot) method. Data on HIV-1 PCR test requests submitted in 2014 to the Department of Virology at Inkosi Albert Luthuli Central Hospital in KwaZulu-Natal province, South Africa were analysed to describe reasons for specimen rejection, as well as results of the two-spot method. The accuracy, lower limit of detection and precision of the two-spot method were assessed. Of the 88 481 specimens received, 3.7% were rejected for pre-analytical problems. Of those, 48.9% were rejected as a result of insufficient specimen volume. Two health facilities had significantly more specimen rejections than other facilities. The two-spot method prevented 10 504 specimen rejections. The Pearson correlation coefficient comparing the standard to the two-spot method was 0.997. The two-spot method was comparable with the standard method of pre-analytical specimen processing. Two health facilities were identified for targeted retraining on specimen quality. The two-spot method of DBS specimen processing can be used as an adjunct to retraining, to reduce the number of specimens rejected and improve linkage to care.
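
    The method-comparison statistic reported here (r = 0.997) is the Pearson correlation between paired results from the two pre-analytical methods. A minimal sketch (function name and data are illustrative, not study data); note that complementary analyses such as Bland-Altman limits of agreement are often added when validating one method against another:

```python
import numpy as np

def pearson_r(method_a, method_b):
    """Pearson correlation between paired measurements from two methods;
    values near 1 indicate a strong linear relationship between them."""
    return float(np.corrcoef(method_a, method_b)[0, 1])

# Illustrative paired results from a standard and an alternative method:
one_spot = [3.1, 4.8, 2.2, 5.5, 4.0]
two_spot = [3.0, 4.9, 2.3, 5.4, 4.1]
```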

  16. Information Quality in Regulatory Decision Making: Peer Review versus Good Laboratory Practice.

    PubMed

    McCarty, Lynn S; Borgert, Christopher J; Mihaich, Ellen M

    2012-07-01

    There is an ongoing discussion on the provenance of toxicity testing data regarding how best to ensure its validity and credibility. A central argument is whether journal peer-review procedures are superior to Good Laboratory Practice (GLP) standards employed for compliance with regulatory mandates. We sought to evaluate the rationale for regulatory decision making based on peer-review procedures versus GLP standards. We examined pertinent published literature regarding how scientific data quality and validity are evaluated for peer review, GLP compliance, and development of regulations. Some contend that peer review is a coherent, consistent evaluative procedure providing quality control for experimental data generation, analysis, and reporting sufficient to reliably establish relative merit, whereas GLP is seen as merely a tracking process designed to thwart investigator corruption. This view is not supported by published analyses pointing to subjectivity and variability in peer-review processes. Although GLP is not designed to establish relative merit, it is an internationally accepted quality assurance, quality control method for documenting experimental conduct and data. Neither process is completely sufficient for establishing relative scientific soundness. However, changes occurring both in peer-review processes and in regulatory guidance resulting in clearer, more transparent communication of scientific information point to an emerging convergence in ensuring information quality. The solution to determining relative merit lies in developing a well-documented, generally accepted weight-of-evidence scheme to evaluate both peer-reviewed and GLP information used in regulatory decision making where both merit and specific relevance inform the process.

  17. Software tool for physics chart checks.

    PubMed

    Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa

    2014-01-01

    Physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports were identified, studied, and summarized. Meanwhile, the reported events related to the physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed in C# under Microsoft .NET to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the authors' radiation oncology clinic. During more than a year of use, the tool has proven very helpful in chart check management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process. It is potentially useful for any radiation oncology clinic that is either pursuing or maintaining American College of Radiology accreditation.

  18. Donor human milk bank data collection in north america: an assessment of current status and future needs.

    PubMed

    Brownell, Elizabeth A; Lussier, Mary M; Herson, Victor C; Hagadorn, James I; Marinelli, Kathleen A

    2014-02-01

    The Human Milk Banking Association of North America (HMBANA) is a nonprofit association that standardizes and facilitates the establishment and operation of donor human milk (DHM) banks in North America. Each HMBANA milk bank in the network collects data on the DHM it receives and distributes, but a centralized data repository does not yet exist. In 2010, the Food and Drug Administration recognized the need to collect and disseminate systematic, standardized DHM bank data and suggested that HMBANA develop a DHM data repository. This study aimed to describe data currently collected by HMBANA DHM banks and evaluate feasibility and interest in participating in a centralized data repository. We conducted phone interviews with individuals in different HMBANA milk banks and summarized descriptive statistics. Eight of 13 (61.5%) sites consented to participate. All respondents collected donor demographics, and half (50%; n = 4) rescreened donors after 6 months of continued donation. The definition of preterm milk varied between DHM banks (≤ 32 to ≤ 40 weeks). The specific computer program used to house the data also differed. Half (50%; n = 4) indicated that they would consider participation in a centralized repository. Without standardized data across all HMBANA sites, the creation of a centralized data repository is not yet feasible. Lack of standardization and transparency may deter implementation of donor milk programs in the neonatal intensive care unit setting and hinder benchmarking, research, and quality improvement initiatives.

  19. Comparison of intermediate-dose methotrexate with cranial irradiation for the post-induction treatment of acute lymphocytic leukemia in children

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, A.I.; Weinberg, V.; Brecher, M.L.

    1983-03-03

    We compared two regimens with respect to their ability to prolong disease-free survival in 506 children and adolescents with acute lymphocytic leukemia. All responders to induction therapy were randomized to treatment with 2400 rad of cranial irradiation plus intrathecal methotrexate or to treatment with intermediate-dose methotrexate plus intrathecal methotrexate, as prophylaxis for involvement of the central nervous system and other sanctuary areas. Patients were then treated with a standard maintenance regimen. Complete responders were stratified into either standard-risk or increased-risk groups on the basis of age and white-cell count at presentation. Among patients with standard risk, hematologic relapses occurred in 9 of 117 given methotrexate and 24 of 120 given irradiation (P less than 0.01). The rate of central-nervous-system relapse was higher in the methotrexate group (23 of 117) than in the irradiation group (8 of 120) (P = 0.01). Among patients with increased risk, radiation offered greater protection to the central nervous system than methotrexate (P = 0.03); there was no difference in the rate of hematologic relapse. In both risk strata the frequency of testicular relapse was significantly lower in the methotrexate group (1 patient) than the radiation group (10 patients) (P = 0.01). Methotrexate offered better protection against systemic relapse in standard-risk patients and better protection against testicular relapse overall, but it offered less protection against relapses in the central nervous system than cranial irradiation.
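The relapse comparisons above (e.g., 9 of 117 vs. 24 of 120 hematologic relapses) are standard two-proportion contrasts. A sketch of a pooled two-proportion z statistic, one common way to test such a difference (an illustration, not necessarily the trial's actual analysis):

```python
import math

def two_proportion_z(k1, n1, k2, n2):
    """Pooled two-proportion z statistic, returned with the two observed rates."""
    p1, p2 = k1 / n1, k2 / n2
    p = (k1 + k2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se, p1, p2

# Hematologic relapse among standard-risk patients (counts from the abstract):
# 9/117 with methotrexate vs. 24/120 with cranial irradiation
z, p_mtx, p_rad = two_proportion_z(9, 117, 24, 120)
```

Here |z| exceeds 2.58, consistent with the reported P < 0.01.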

  20. Sterilization and disinfection in the physician's office.

    PubMed Central

    Drummond, D C; Skidmore, A G

    1991-01-01

    OBJECTIVE: To review the principles and practice of sterilization and disinfection of medical instruments in the office setting. DATA SOURCES: Searches of MEDLINE for articles published from 1980 to 1990 on disinfection, sterilization, cross infection, surgical instruments and iatrogenic disease, bibliographies, standard texts and reference material located in a central processing department. STUDY SELECTION: We reviewed surveys of decontamination practices in physicians' offices, reviews of current recommendations for office decontamination procedures, case reports of cross infection in offices and much of the standard reference material on decontamination theory and practice. DATA SYNTHESIS: There have been few surveys of physicians' decontamination practices and few case reports of cross infection. Office practitioners have little access to practical information on sterilization and disinfection. CONCLUSION: The increasing threat of cross infection from medical instruments calls for greater knowledge about decontamination. We have adapted material from various sources and offer a primer on the subject. PMID:1913427

  1. Molecular replication

    NASA Technical Reports Server (NTRS)

    Orgel, L. E.

    1986-01-01

    The object of our research program is to understand how polynucleotide replication originated on the primitive Earth. This is a central issue in studies of the origins of life, since a process similar to modern DNA and RNA synthesis is likely to have formed the basis for the most primitive system of genetic information transfer. The major conclusion of studies so far is that a preformed polynucleotide template under many different experimental conditions will facilitate the synthesis of a new oligonucleotide with a sequence complementary to that of the template. It has been shown, for example, that poly(C) facilitates the synthesis of long oligo(G)s and that the short template CCGCC facilitates the synthesis of its complement GGCGG. Very recently we have shown that template-directed synthesis is not limited to the standard oligonucleotide substrates. Nucleic acid-like molecules with a pyrophosphate group replacing the phosphate of the standard nucleic acid backbone are readily synthesized from deoxynucleotide 3'-5'-diphosphates on appropriate templates.
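The template-directed complementarity described above (poly(C) templating oligo(G), CCGCC templating GGCGG) follows Watson-Crick pairing, which can be expressed as a simple mapping:

```python
# RNA-style base pairing used in template-directed synthesis (G-C, A-U)
COMPLEMENT = {"C": "G", "G": "C", "A": "U", "U": "A"}

def template_complement(template):
    """Sequence complementary to a template, base by base."""
    return "".join(COMPLEMENT[base] for base in template)

# Example from the abstract: the template CCGCC directs synthesis of GGCGG
product = template_complement("CCGCC")
```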

  2. Automating Electronic Clinical Data Capture for Quality Improvement and Research: The CERTAIN Validation Project of Real World Evidence.

    PubMed

    Devine, Emily Beth; Van Eaton, Erik; Zadworny, Megan E; Symons, Rebecca; Devlin, Allison; Yanez, David; Yetisgen, Meliha; Keyloun, Katelyn R; Capurro, Daniel; Alfonso-Cristancho, Rafael; Flum, David R; Tarczy-Hornoch, Peter

    2018-05-22

    The availability of high fidelity electronic health record (EHR) data is a hallmark of the learning health care system. Washington State's Surgical Care Outcomes and Assessment Program (SCOAP) is a network of hospitals participating in quality improvement (QI) registries wherein data are manually abstracted from EHRs. To create the Comparative Effectiveness Research and Translation Network (CERTAIN), we semi-automated SCOAP data abstraction using a centralized federated data model, created a central data repository (CDR), and assessed whether these data could be used as real world evidence for QI and research. Describe the validation processes and complexities involved and lessons learned. Investigators installed a commercial CDR to retrieve and store data from disparate EHRs. Manual and automated abstraction systems were conducted in parallel (10/2012-7/2013) and validated in three phases using the EHR as the gold standard: 1) ingestion, 2) standardization, and 3) concordance of automated versus manually abstracted cases. Information retrieval statistics were calculated. Four unaffiliated health systems provided data. Between 6 and 15 percent of data elements were abstracted: 51 to 86 percent from structured data; the remainder using natural language processing (NLP). In phase 1, data ingestion from 12 out of 20 feeds reached 95 percent accuracy. In phase 2, 55 percent of structured data elements performed with 96 to 100 percent accuracy; NLP with 89 to 91 percent accuracy. In phase 3, concordance ranged from 69 to 89 percent. Information retrieval statistics were consistently above 90 percent. Semi-automated data abstraction may be useful, although raw data collected as a byproduct of health care delivery is not immediately available for use as real world evidence. New approaches to gathering and analyzing extant data are required.
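The concordance and information-retrieval statistics reported above (consistently above 90 percent) follow the usual precision/recall definitions. A generic sketch with made-up counts for a single data element:

```python
def ir_stats(tp, fp, fn):
    """Precision, recall, and F1 from true-positive, false-positive,
    and false-negative counts against a gold standard (here, the EHR)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical tallies: automated abstraction vs. manual gold standard
p, r, f1 = ir_stats(tp=930, fp=40, fn=30)
```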

  3. Gender effect on pre-attentive change detection in major depressive disorder patients revealed by auditory MMN.

    PubMed

    Qiao, Zhengxue; Yang, Aiying; Qiu, Xiaohui; Yang, Xiuxian; Zhang, Congpei; Zhu, Xiongzhao; He, Jincai; Wang, Lin; Bai, Bing; Sun, Hailian; Zhao, Lun; Yang, Yanjie

    2015-10-30

    Gender differences in rates of major depressive disorder (MDD) are well established, but gender differences in cognitive function have been little studied. Auditory mismatch negativity (MMN) was used to investigate gender differences in pre-attentive information processing in first episode MDD. In the deviant-standard reverse oddball paradigm, duration auditory MMN was obtained in 30 patients (15 males) and 30 age-/education-matched controls. Over frontal-central areas, mean amplitude of increment MMN (to a 150-ms deviant tone) was smaller in female than male patients; there was no sex difference in decrement MMN (to a 50-ms deviant tone). Neither increment nor decrement MMN differed between female and male patients over temporal areas. Frontal-central MMN and temporal MMN did not differ between male and female controls in any condition. Over frontal-central areas, mean amplitude of increment MMN was smaller in female patients than female controls; there was no difference in decrement MMN. Neither increment nor decrement MMN differed between female patients and female controls over temporal areas. Frontal-central MMN and temporal MMN did not differ between male patients and male controls. Mean amplitude of increment MMN in female patients did not correlate with symptoms, suggesting this sex-specific deficit is a trait- not a state-dependent phenomenon. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. 45 CFR 170.205 - Content exchange standards and implementation specifications for exchanging electronic health...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... The Healthcare Information Technology Standards Panel (HITSP) Summary Documents Using HL7 CCD... Guide for Ambulatory Healthcare Provider Reporting to Central Cancer Registries, HL7 Clinical Document...

  5. Comparison of Centralized-Manual, Centralized-Computerized, and Decentralized-Computerized Order and Management Information Models for the Turkish Air Force Logistics System.

    DTIC Science & Technology

    1986-09-01

    differentiation between the systems. This study will investigate an appropriate Order Processing and Management Information System (OP&MIS) to link base-level...methodology: 1. Reviewed the current order processing and information model of the TUAF Logistics System. (centralized-manual model) 2. Described the...RDS program's order processing and information system. (centralized-computerized model) 3. Described the order processing and information system of

  6. Small-x Physics: From HERA to LHC and beyond

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leonid Frankfurt; Mark Strikman; Christian Weiss

    2005-07-01

    We summarize the lessons learned from studies of hard scattering processes in high-energy electron-proton collisions at HERA and antiproton-proton collisions at the Tevatron, with the aim of predicting new strong interaction phenomena observable in next-generation experiments at the Large Hadron Collider (LHC). Processes reviewed include inclusive deep-inelastic scattering (DIS) at small x, exclusive and diffractive processes in DIS and hadron-hadron scattering, as well as color transparency and nuclear shadowing effects. A unified treatment of these processes is outlined, based on factorization theorems of quantum chromodynamics, and using the correspondence between the "parton" picture in the infinite-momentum frame and the "dipole" picture of high-energy processes in the target rest frame. The crucial role of the three-dimensional quark and gluon structure of the nucleon is emphasized. A new dynamical effect predicted at high energies is the unitarity, or black disk, limit (BDL) in the interaction of small dipoles with hadronic matter, due to the increase of the gluon density at small x. This effect is marginally visible in diffractive DIS at HERA and will lead to the complete disappearance of Bjorken scaling at higher energies. In hadron-hadron scattering at LHC energies and beyond (cosmic ray physics), the BDL will be a standard feature of the dynamics, with implications for (a) hadron production at forward and central rapidities in central proton-proton and proton-nucleus collisions, in particular events with heavy particle production (Higgs), (b) proton-proton elastic scattering, and (c) heavy-ion collisions. We also outline the possibilities for studies of diffractive processes and photon-induced reactions (ultraperipheral collisions) at LHC, as well as possible measurements with a future electron-ion collider.

  7. Standardizing economic analysis in prevention will require substantial effort.

    PubMed

    Guyll, Max

    2014-12-01

    It is exceedingly difficult to compare results of economic analyses across studies due to variations in assumptions, methodology, and outcome measures, a fact which surely decreases the impact and usefulness of prevention-related economic research. Therefore, Crowley et al. (Prevention Science, 2013) are precisely correct in their call for increased standardization and have usefully highlighted the issues that must be addressed. However, having made the need clear, the questions become what form the solution should take, and how should it be implemented. The present discussion outlines the rudiments of a comprehensive framework for promoting standardized methodology in the estimation of economic outcomes, as encouraged by Crowley et al. In short, a single, standard, reference case approach should be clearly articulated, and all economic research should be encouraged to apply that standard approach, with results from compliant analyses being reported in a central archive. Properly done, the process would increase the ability of those without specialized training to contribute to the body of economic research pertaining to prevention, and the most difficult tasks of predicting and monetizing distal outcomes would be readily completed through predetermined models. These recommendations might be viewed as somewhat forcible, insomuch as they advocate for prescribing the details of a standard methodology and establishing a means of verifying compliance. However, it is unclear that the best practices proposed by Crowley et al. will be widely adopted in the absence of a strong and determined approach.

  8. Speech Comprehension Difficulties in Chronic Tinnitus and Its Relation to Hyperacusis

    PubMed Central

    Vielsmeier, Veronika; Kreuzer, Peter M.; Haubner, Frank; Steffens, Thomas; Semmler, Philipp R. O.; Kleinjung, Tobias; Schlee, Winfried; Langguth, Berthold; Schecklmann, Martin

    2016-01-01

    Objective: Many tinnitus patients complain about difficulties regarding speech comprehension. In spite of the high clinical relevance little is known about underlying mechanisms and predisposing factors. Here, we performed an exploratory investigation in a large sample of tinnitus patients to (1) estimate the prevalence of speech comprehension difficulties among tinnitus patients, to (2) compare subjective reports of speech comprehension difficulties with behavioral measurements in a standardized speech comprehension test and to (3) explore underlying mechanisms by analyzing the relationship between speech comprehension difficulties and peripheral hearing function (pure tone audiogram), as well as with co-morbid hyperacusis as a central auditory processing disorder. Subjects and Methods: Speech comprehension was assessed in 361 tinnitus patients presenting between 07/2012 and 08/2014 at the Interdisciplinary Tinnitus Clinic at the University of Regensburg. The assessment included standard audiological assessments (pure tone audiometry, tinnitus pitch, and loudness matching), the Goettingen sentence test (in quiet) for speech audiometric evaluation, two questions about hyperacusis, and two questions about speech comprehension in quiet and noisy environments (“How would you rate your ability to understand speech?”; “How would you rate your ability to follow a conversation when multiple people are speaking simultaneously?”). Results: Subjectively-reported speech comprehension deficits are frequent among tinnitus patients, especially in noisy environments (cocktail party situation). 74.2% of all investigated patients showed disturbed speech comprehension (indicated by values above 21.5 dB SPL in the Goettingen sentence test). Subjective speech comprehension complaints (both for general and in noisy environment) were correlated with hearing level and with audiologically-assessed speech comprehension ability. 
In contrast, co-morbid hyperacusis was only correlated with speech comprehension difficulties in noisy environments, but not with speech comprehension difficulties in general. Conclusion: Speech comprehension deficits are frequent among tinnitus patients. Whereas speech comprehension deficits in quiet environments are primarily due to peripheral hearing loss, speech comprehension deficits in noisy environments are related to both peripheral hearing loss and dysfunctional central auditory processing. Disturbed speech comprehension in noisy environments might be modulated by a central inhibitory deficit. In addition, attentional and cognitive aspects may play a role. PMID:28018209

  9. SMASH - semi-automatic muscle analysis using segmentation of histology: a MATLAB application.

    PubMed

    Smith, Lucas R; Barton, Elisabeth R

    2014-01-01

    Histological assessment of skeletal muscle tissue is commonly applied to many areas of skeletal muscle physiological research. Histological parameters including fiber distribution, fiber type, centrally nucleated fibers, and capillary density are all frequently quantified measures of skeletal muscle. These parameters reflect functional properties of muscle and undergo adaptation in many muscle diseases and injuries. While standard operating procedures have been developed to guide analysis of many of these parameters, the software to freely, efficiently, and consistently analyze them is not readily available. In order to provide this service to the muscle research community we developed an open source MATLAB script to analyze immunofluorescent muscle sections incorporating user controls for muscle histological analysis. The software consists of multiple functions designed to provide tools for the analysis selected. Initial segmentation and fiber filter functions segment the image and remove non-fiber elements based on user-defined parameters to create a fiber mask. Establishing parameters set by the user, the software outputs data on fiber size and type, centrally nucleated fibers, and other structures. These functions were evaluated on stained soleus muscle sections from 1-year-old wild-type and mdx mice, a model of Duchenne muscular dystrophy. In accordance with previously published data, fiber size was not different between groups, but mdx muscles had much higher fiber size variability. The mdx muscle had a significantly greater proportion of type I fibers, but type I fibers did not change in size relative to type II fibers. Centrally nucleated fibers were highly prevalent in mdx muscle and were significantly larger than peripherally nucleated fibers. The MATLAB code described and provided along with this manuscript is designed for image processing of skeletal muscle immunofluorescent histological sections. 
The program allows for semi-automated fiber detection along with user correction. The output of the code provides data in accordance with established standards of practice. The results of the program have been validated using a small set of wild-type and mdx muscle sections. This program is the first freely available and open source image processing program designed to automate analysis of skeletal muscle histological sections.
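One of the findings above is that mdx muscle shows similar mean fiber size but much higher size variability. A common summary of that variability is the coefficient of variation of fiber cross-sectional areas; the sketch below uses invented areas and is not the paper's MATLAB code:

```python
import math

def coefficient_of_variation(values):
    """Population standard deviation divided by the mean."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
    return sd / mean

# Hypothetical fiber areas (um^2): same mean size, but the dystrophic (mdx)
# sample is far more variable, echoing the reported finding
wt_areas = [1800, 1900, 2000, 2100, 2200]
mdx_areas = [600, 1200, 2000, 2800, 3400]
cv_wt = coefficient_of_variation(wt_areas)
cv_mdx = coefficient_of_variation(mdx_areas)
```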

  11. A Lifecycle Approach to Brokered Data Management for Hydrologic Modeling Data Using Open Standards.

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Booth, N.; Kunicki, T.; Walker, J.

    2012-12-01

    The U.S. Geological Survey Center for Integrated Data Analytics has formalized an information management-architecture to facilitate hydrologic modeling and subsequent decision support throughout a project's lifecycle. The architecture is based on open standards and open source software to decrease the adoption barrier and to build on existing, community supported software. The components of this system have been developed and evaluated to support data management activities of the interagency Great Lakes Restoration Initiative, Department of Interior's Climate Science Centers and WaterSmart National Water Census. Much of the research and development of this system has been in cooperation with international interoperability experiments conducted within the Open Geospatial Consortium. Community-developed standards and software, implemented to meet the unique requirements of specific disciplines, are used as a system of interoperable, discipline specific, data types and interfaces. This approach has allowed adoption of existing software that satisfies the majority of system requirements. Four major features of the system include: 1) assistance in model parameter and forcing creation from large enterprise data sources; 2) conversion of model results and calibrated parameters to standard formats, making them available via standard web services; 3) tracking a model's processes, inputs, and outputs as a cohesive metadata record, allowing provenance tracking via reference to web services; and 4) generalized decision support tools which rely on a suite of standard data types and interfaces, rather than particular manually curated model-derived datasets. Recent progress made in data and web service standards related to sensor and/or model derived station time series, dynamic web processing, and metadata management are central to this system's function and will be presented briefly along with a functional overview of the applications that make up the system. 
As the separate pieces of this system progress, they will be combined and generalized to form a sort of social network for nationally consistent hydrologic modeling.

  12. Centralization of Penile Cancer Management in the United States: A Combined Analysis of the American Board of Urology and National Cancer Data Base.

    PubMed

    Matulewicz, Richard S; Flum, Andrew S; Helenowski, Irene; Jovanovic, Borko; Palis, Bryan; Bilimoria, Karl Y; Meeks, Joshua J

    2016-04-01

    To assess the potential benefit of centralization of care in penile cancer. Centralization of care in other disease processes standardizes treatment and improves outcomes. Because penile cancer is a rare malignancy with unchanged mortality rates over the last two decades, we hypothesize that there may be a benefit to centralization. We identified surgeon, patient, and hospital characteristics captured by the National Cancer Data Base (1998-2012) and American Board of Urology case logs (2003-2013) for all penile cancer cases and procedures. Differences in patient demographics, stage of disease, referral patterns, and surgical quality indicators were assessed between academic and community hospitals. Using case logs to evaluate the distribution of penile cancer care, we found that only 4.1% of urologists performed a penile surgery and 1.5% performed a lymph node dissection (LND). Academic centers treated higher-stage cancers and saw more cases/year than community centers, suggesting informal centralization. Two guideline-based quality indicators demonstrated no difference in use of penile-sparing surgery but a higher likelihood of having an LND performed at an academic center (48.4% vs 26.6%). The total lymph node yield was significantly greater at academic centers (18.5 vs 12.5). Regression modeling demonstrated a 2.29-fold increased odds of having an LND at an academic center. Our data provide the first evidence for centralization of penile cancer in the US. At the time of diagnosis, equal numbers of patients are treated with penile-sparing surgery, but there is greater use of LND and higher lymph node yield at academic centers. Ultimately, longer follow-up is necessary to determine if this improves survival of patients with penile cancer. Copyright © 2016 Elsevier Inc. All rights reserved.
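The likelihood of LND at academic centers (48.4% vs. 26.6%, adjusted odds 2.29) rests on an odds ratio. An unadjusted odds ratio can be read directly from a 2x2 table; the counts below are hypothetical, scaled only to echo the reported rates:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
    [[a, b], [c, d]] = [[exposed & outcome, exposed & no outcome],
                        [unexposed & outcome, unexposed & no outcome]]."""
    return (a * d) / (b * c)

# Hypothetical counts echoing the reported LND rates (48.4% vs. 26.6%):
# academic centers: 484 LND of 1000; community centers: 266 LND of 1000
or_unadj = odds_ratio(484, 516, 266, 734)
```

The unadjusted value (about 2.6 here) differs from the adjusted OR of 2.29 reported, which comes from a regression model that controls for covariates.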

  13. The influence of (central) auditory processing disorder in speech sound disorders.

    PubMed

    Barrozo, Tatiane Faria; Pagan-Neves, Luciana de Oliveira; Vilela, Nadia; Carvallo, Renata Mota Mamede; Wertzner, Haydée Fiszbein

    2016-01-01

    Considering the importance of auditory information for the acquisition and organization of phonological rules, the assessment of (central) auditory processing contributes to both the diagnosis and targeting of speech therapy in children with speech sound disorders. To study phonological measures and (central) auditory processing of children with speech sound disorder. Clinical and experimental study, with 21 subjects with speech sound disorder aged between 7.0 and 9.11 years, divided into two groups according to their (central) auditory processing disorder. The assessment comprised tests of phonology, speech inconsistency, and metalinguistic abilities. The group with (central) auditory processing disorder demonstrated greater severity of speech sound disorder. The cutoff value obtained for the process density index was the one that best characterized the occurrence of phonological processes for children above 7 years of age. The comparison among the tests evaluated between the two groups showed differences in some phonological and metalinguistic abilities. Children with an index value above 0.54 demonstrated strong tendencies towards presenting a (central) auditory processing disorder, and this measure was effective to indicate the need for evaluation in children with speech sound disorder. Copyright © 2015 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.
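The cutoff of 0.54 on the process density index acts as a simple threshold classifier for suspected (central) auditory processing disorder. Such a rule is typically summarized by its sensitivity and specificity; the sketch below uses invented scores and labels, not the study's data:

```python
def threshold_performance(scores, labels, cutoff):
    """Sensitivity and specificity of the rule `score > cutoff`
    against binary labels (True = disorder present)."""
    tp = sum(1 for s, y in zip(scores, labels) if s > cutoff and y)
    fn = sum(1 for s, y in zip(scores, labels) if s <= cutoff and y)
    tn = sum(1 for s, y in zip(scores, labels) if s <= cutoff and not y)
    fp = sum(1 for s, y in zip(scores, labels) if s > cutoff and not y)
    return tp / (tp + fn), tn / (tn + fp)

# Invented process density indices with (C)APD status
scores = [0.30, 0.45, 0.50, 0.58, 0.62, 0.70, 0.40, 0.60]
labels = [False, False, False, True, True, True, False, True]
sens, spec = threshold_performance(scores, labels, cutoff=0.54)
```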

  14. Teacher Education Follow-Up Study 2001: A Summary of First and Second Year Teachers, and their Employers with Respect to the State of Missouri Standards.

    ERIC Educational Resources Information Center

    Zelazek, John R.; Williams, Wayne W.; McAdams, Charles; Palmer, Kyle

    This report represents the thirteenth Follow-Up Study by the Teacher Education Assessment Committee (TEAC) at Central Missouri State University. TEAC is a centralized system of data collection and assessment that solicits input from Central's professional education faculty, preservice teachers, program graduates, employers of teachers prepared at…

  15. [Relationship between crown form of upper central incisors and papilla filling in Chinese Han-nationality youth].

    PubMed

    Yang, X; Le, D; Zhang, Y L; Liang, L Z; Yang, G; Hu, W J

    2016-10-18

To explore a crown form classification method for the upper central incisor that is more objective and scientific than the traditional classification method, based on a standardized photography technique, and to analyze the relationship between crown form of upper central incisors and papilla filling in periodontally healthy Chinese Han-nationality youth. In the study, 180 periodontally healthy Chinese youth (75 males and 105 females) aged 20-30 (24.3±4.5) years were included. With the standardized upper central incisor photography technique, pictures of 360 upper central incisors were obtained. Each tooth was classified as triangular, ovoid, or square by 13 experienced specialists in prosthodontics independently, and the final classification was decided by majority vote to ensure objectivity. The standardized digital photo was also used to evaluate the gingival papilla filling situation. The papilla filling result was recorded as present or absent according to naked eye observation. The papilla filling rates of the different crown forms were analyzed. Statistical analyses were performed with SPSS 19.0. The proportions of triangular, ovoid, and square forms of the upper central incisor in Chinese Han-nationality youth were 31.4% (113/360), 37.2% (134/360), and 31.4% (113/360), respectively, and no statistical difference was found between the males and females. The average κ value between pairs of evaluators was 0.381; the average κ rose to 0.563 when individual evaluators were compared against the final classification. In the study, 24 upper central incisors without contact were excluded, and the papilla filling rates of triangular, ovoid, and square crowns were 56.4% (62/110), 69.6% (87/125), and 76.2% (77/101), respectively. The papilla filling rate of the square form was the highest (P=0.007). The proportion of clinical crown forms of the upper central incisor in Chinese Han-nationality youth is obtained. Compared with the triangular form, the square form is found to favor a gingival papilla that fills the interproximal embrasure space. The consistency of the present classification method for the upper central incisor is not satisfactory, which indicates that a new classification method, more scientific and objective than the present one, remains to be found.
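The inter-rater κ values reported above (0.381 between evaluator pairs, 0.563 against the final classification) are Cohen's kappa, which corrects observed agreement for chance agreement. A minimal sketch; the category labels in the example are hypothetical, not the study's data:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same items with
    nominal categories: (observed - expected) / (1 - expected)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # observed agreement: fraction of items both raters labeled identically
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement: product of each rater's marginal category frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical crown-form labels: T=triangular, O=ovoid, S=square
kappa = cohen_kappa(["T", "T", "O", "S"], ["T", "O", "O", "S"])
```

Values around 0.4-0.6, as in the study, are conventionally read as moderate agreement, consistent with the authors' conclusion that the classification method needs improvement.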

  16. Quantifying the measurement uncertainty of results from environmental analytical methods.

    PubMed

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.
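The Eurachem/CITAC estimation process referred to above ultimately combines independent standard-uncertainty components in quadrature and multiplies by a coverage factor. A minimal sketch of that basic rule for uncorrelated inputs, not the guide's full procedure:

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-of-squares combination of independent standard
    uncertainties (the basic GUM/Eurachem rule for uncorrelated inputs)."""
    return math.sqrt(sum(u * u for u in components))

def expanded_uncertainty(u_c, k=2):
    """Expanded uncertainty U = k * u_c; k=2 gives ~95% coverage
    for an approximately normal output distribution."""
    return k * u_c

# Illustrative components (e.g. precision, calibration, recovery), same units
u_c = combined_standard_uncertainty([0.03, 0.04])
U = expanded_uncertainty(u_c)
```

When, as the paper argues, the reproducibility standard deviation of the method dominates, the sum collapses to essentially that single component.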

  17. Morphology of the Corneal Limbus Following Standard and Accelerated Corneal Collagen Cross-Linking (9 mW/cm2) for Keratoconus.

    PubMed

    Uçakhan, Ömür Ö; Bayraktutar, Betül

    2017-01-01

    To evaluate the morphological features of the corneal limbus as measured by in vivo confocal microscopy (IVCM) following standard and accelerated corneal collagen cross-linking (CXL) for keratoconus. Patients with progressive keratoconus scheduled to undergo standard CXL (group 1; 31 patients, 3 mW/cm2, 370 nm, 30 minutes), or accelerated CXL (group 2; 20 patients, 9 mW/cm2, 370 nm, 10 minutes) in the worse eye were included in this prospective study. Thirty eyes of 30 age-matched patients served as controls (group 3). All patient eyes underwent IVCM scanning of the central cornea and the inferior limbal area at baseline and 1, 3, and 6 months after CXL. After CXL, epithelial regrowth was complete by day 4 in both groups 1 and 2. There were no statistically significant differences between the baseline mean central corneal wing or basal cell density, or limbus-palisade middle or basal cell densities, of groups 1, 2, or 3. At postoperative months 1, 3, and 6, there were no statistically significant differences in either central or limbus-palisade epithelial cell densities or diameters in keratoconic eyes that underwent standard or accelerated CXL (P > 0.05). The morphology of the limbal cells was preserved as well. The morphology of limbus structures seems to be preserved following standard and accelerated CXL in short-term follow-up, as measured using IVCM.

  18. Techniques for estimating monthly mean streamflow at gaged sites and monthly streamflow duration characteristics at ungaged sites in central Nevada

    USGS Publications Warehouse

    Hess, G.W.; Bohman, L.R.

    1996-01-01

    Techniques for estimating monthly mean streamflow at gaged sites and monthly streamflow duration characteristics at ungaged sites in central Nevada were developed using streamflow records at six gaged sites and basin physical and climatic characteristics. Streamflow data at gaged sites were related by regression techniques to concurrent flows at nearby gaging stations so that monthly mean streamflows for periods of missing or no record can be estimated for gaged sites in central Nevada. The standard error of estimate for relations at these sites ranged from 12 to 196 percent. Also, monthly streamflow data for selected percent exceedence levels were used in regression analyses with basin and climatic variables to determine relations for ungaged basins for annual and monthly percent exceedence levels. Analyses indicate that the drainage area and percent of drainage area at altitudes greater than 10,000 feet are the most significant variables. For the annual percent exceedence, the standard error of estimate of the relations for ungaged sites ranged from 51 to 96 percent and standard error of prediction for ungaged sites ranged from 96 to 249 percent. For the monthly percent exceedence values, the standard error of estimate of the relations ranged from 31 to 168 percent, and the standard error of prediction ranged from 115 to 3,124 percent. Reliability and limitations of the estimating methods are described.
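The relations above come from ordinary least-squares regression, with accuracy reported as a standard error of estimate in percent. A minimal sketch; the log10-to-percent conversion shown is a commonly used approximation, not necessarily the report's exact formula:

```python
import math

def ols_fit(x, y):
    """Least-squares fit y = a + b*x, returning intercept, slope, and the
    standard error of estimate (residual SD with n-2 degrees of freedom)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    s = math.sqrt(sse / (n - 2))
    return a, b, s

def se_percent(s_log10):
    """Convert a standard error in log10 units to percent (a common
    hydrologic-regression convention; shown here as an assumption)."""
    return 100.0 * math.sqrt(math.exp((s_log10 * math.log(10)) ** 2) - 1)
```

Fitting in log space and converting back to percent is what produces the large, asymmetric error ranges quoted in the abstract (e.g. 115 to 3,124 percent for monthly predictions).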

  19. Working Memory in Children With Neurocognitive Effects From Sickle Cell Disease: Contributions of the Central Executive and Processing Speed

    PubMed Central

    Smith, Kelsey E.; Schatz, Jeffrey

    2017-01-01

    Children with sickle cell disease (SCD) are at risk for working memory deficits due to multiple disease processes. We assessed working memory abilities and related functions in 32 school-age children with SCD and 85 matched comparison children using Baddeley’s working memory model as a framework. Children with SCD performed worse than controls for working memory, central executive function, and processing/rehearsal speed. Central executive function was found to mediate the relationship between SCD status and working memory, but processing speed did not. Cognitive remediation strategies that focus on central executive processes may be important for remediating working memory deficits in SCD. PMID:27759435

  20. ABR Examinations: The Why, What, and How

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, Gary J.; Bosma, Jennifer L., E-mail: jbosma@theabr.org; Guiberteau, Milton J.

The American Board of Radiology (ABR) has provided certification for diagnostic radiologists and other specialists and subspecialists for more than 75 years. The Board certification process is a tangible expression of the social contract between the profession and the public by which the profession enjoys the privilege of self-regulation and the public is assured that it can expect medical professionals to put patients' interests first, guarantees the competence of practitioners, and guards the public health. A primary tool used by the ABR in fulfilling this responsibility is the secure proctored examination. This article sets forth seven standards based on authoritative sources in the field of psychometrics (the science of mental measurements), and explains in each case how the ABR implements that standard. Readers are encouraged to understand that, despite the multiple opinions that may be held, these standards developed over decades by experts using the scientific method should be the central feature in any discussion or critique of examinations given for the privilege of professional practice and for safeguarding the public well-being.

  1. Alternative policies for the control of air pollution in Poland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bates, R.; Cofala, J.; Toman, M.

    1994-01-01

    Like other Central European countries, Poland faces the twin challenges of improving environmental quality while also promoting economic development. The study examines the cost of achieving alternative emission standards and the savings in abatement cost that might be achieved with policies that rely on economic incentives rather than with rigid command and control measures. A central element of the analysis is a dynamic model of least-cost energy supply in Poland that allows examination at a national level of the effects of different pollution standards and policies.

  2. Solitons riding on solitons and the quantum Newton's cradle.

    PubMed

    Ma, Manjun; Navarro, R; Carretero-González, R

    2016-02-01

The reduced dynamics for dark and bright soliton chains in the one-dimensional nonlinear Schrödinger equation is used to study the behavior of collective compression waves corresponding to Toda lattice solitons. We coin the term hypersoliton to describe such solitary waves riding on a chain of solitons. It is observed that in the case of dark soliton chains, the formulated reduction dynamics provides an accurate and robust evolution of traveling hypersolitons. As an application to Bose-Einstein condensates trapped in a standard harmonic potential, we study the case of a finite dark soliton chain confined at the center of the trap. When the central chain is hit by a dark soliton, the energy is transferred through the chain as a hypersoliton that, in turn, ejects a dark soliton on the other end of the chain that, as it returns from its excursion up the trap, hits the central chain, repeating the process. This periodic evolution is an analog of the classical Newton's cradle.

  3. The interRAI Acute Care instrument incorporated in an eHealth system for standardized and web-based geriatric assessment: strengths, weaknesses, opportunities and threats in the acute hospital setting

    PubMed Central

    2013-01-01

    Background The interRAI Acute Care instrument is a multidimensional geriatric assessment system intended to determine a hospitalized older persons’ medical, psychosocial and functional capacity and needs. Its objective is to develop an overall plan for treatment and long-term follow-up based on a common set of standardized items that can be used in various care settings. A Belgian web-based software system (BelRAI-software) was developed to enable clinicians to interpret the output and to communicate the patients’ data across wards and care organizations. The purpose of the study is to evaluate the (dis)advantages of the implementation of the interRAI Acute Care instrument as a comprehensive geriatric assessment instrument in an acute hospital context. Methods In a cross-sectional multicenter study on four geriatric wards in three acute hospitals, trained clinical staff (nurses, occupational therapists, social workers, and geriatricians) assessed 410 inpatients in routine clinical practice. The BelRAI-system was evaluated by focus groups, observations, and questionnaires. The Strengths, Weaknesses, Opportunities and Threats were mapped (SWOT-analysis) and validated by the participants. Results The primary strengths of the BelRAI-system were a structured overview of the patients’ condition early after admission and the promotion of multidisciplinary assessment. Our study was a first attempt to transfer standardized data between home care organizations, nursing homes and hospitals and a way to centralize medical, allied health professionals and nursing data. With the BelRAI-software, privacy of data is guaranteed. Weaknesses are the time-consuming character of the process and the overlap with other assessment instruments or (electronic) registration forms. There is room for improving the user-friendliness and the efficiency of the software, which needs hospital-specific adaptations. Opportunities are a timely and systematic problem detection and continuity of care. An actual shortage of funding of personnel to coordinate the assessment process is the most important threat. Conclusion The BelRAI-software allows standardized transmural information transfer and the centralization of medical, allied health professionals and nursing data. It is strictly secured and follows strict privacy regulations, allowing hospitals to optimize (transmural) communication and interaction. However, weaknesses and threats exist and must be tackled in order to promote large scale implementation. PMID:24007312

  4. The interRAI Acute Care instrument incorporated in an eHealth system for standardized and web-based geriatric assessment: strengths, weaknesses, opportunities and threats in the acute hospital setting.

    PubMed

    Devriendt, Els; Wellens, Nathalie I H; Flamaing, Johan; Declercq, Anja; Moons, Philip; Boonen, Steven; Milisen, Koen

    2013-09-05

    The interRAI Acute Care instrument is a multidimensional geriatric assessment system intended to determine a hospitalized older persons' medical, psychosocial and functional capacity and needs. Its objective is to develop an overall plan for treatment and long-term follow-up based on a common set of standardized items that can be used in various care settings. A Belgian web-based software system (BelRAI-software) was developed to enable clinicians to interpret the output and to communicate the patients' data across wards and care organizations. The purpose of the study is to evaluate the (dis)advantages of the implementation of the interRAI Acute Care instrument as a comprehensive geriatric assessment instrument in an acute hospital context. In a cross-sectional multicenter study on four geriatric wards in three acute hospitals, trained clinical staff (nurses, occupational therapists, social workers, and geriatricians) assessed 410 inpatients in routine clinical practice. The BelRAI-system was evaluated by focus groups, observations, and questionnaires. The Strengths, Weaknesses, Opportunities and Threats were mapped (SWOT-analysis) and validated by the participants. The primary strengths of the BelRAI-system were a structured overview of the patients' condition early after admission and the promotion of multidisciplinary assessment. Our study was a first attempt to transfer standardized data between home care organizations, nursing homes and hospitals and a way to centralize medical, allied health professionals and nursing data. With the BelRAI-software, privacy of data is guaranteed. Weaknesses are the time-consuming character of the process and the overlap with other assessment instruments or (electronic) registration forms. There is room for improving the user-friendliness and the efficiency of the software, which needs hospital-specific adaptations. Opportunities are a timely and systematic problem detection and continuity of care. An actual shortage of funding of personnel to coordinate the assessment process is the most important threat. The BelRAI-software allows standardized transmural information transfer and the centralization of medical, allied health professionals and nursing data. It is strictly secured and follows strict privacy regulations, allowing hospitals to optimize (transmural) communication and interaction. However, weaknesses and threats exist and must be tackled in order to promote large scale implementation.

  5. Investigation of optical current transformer signal processing method based on an improved Kalman algorithm

    NASA Astrophysics Data System (ADS)

    Shen, Yan; Ge, Jin-ming; Zhang, Guo-qing; Yu, Wen-bin; Liu, Rui-tong; Fan, Wei; Yang, Ying-xuan

    2018-01-01

    This paper explores the problem of signal processing in optical current transformers (OCTs). Based on the noise characteristics of OCTs, such as overlapping signals, noise frequency bands, low signal-to-noise ratios, and difficulties in acquiring statistical features of noise power, an improved standard Kalman filtering algorithm was proposed for direct current (DC) signal processing. The state-space model of the OCT DC measurement system is first established, and then mixed noise can be processed by adding mixed noise into measurement and state parameters. According to the minimum mean squared error criterion, state predictions and update equations of the improved Kalman algorithm could be deduced based on the established model. An improved central difference Kalman filter was proposed for alternating current (AC) signal processing, which improved the sampling strategy and noise processing of colored noise. Real-time estimation and correction of noise were achieved by designing AC and DC noise recursive filters. Experimental results show that the improved signal processing algorithms had a good filtering effect on the AC and DC signals with mixed noise of OCT. Furthermore, the proposed algorithm was able to achieve real-time correction of noise during the OCT filtering process.
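The paper's improved algorithms are not reproduced here, but the standard scalar Kalman filter they build on, applied to a noisy DC level, can be sketched as follows. The process-noise and measurement-noise variances `q` and `r` are illustrative assumptions, not values from the paper:

```python
def kalman_dc(measurements, q=1e-5, r=0.01):
    """Scalar Kalman filter tracking a constant (DC) level.
    q: process-noise variance, r: measurement-noise variance (assumed known,
    unlike the mixed/colored noise the paper's improved algorithm handles)."""
    x, p = 0.0, 1.0              # initial state estimate and covariance
    estimates = []
    for z in measurements:
        p = p + q                # predict: state model is constant, so only
                                 # the covariance grows by q
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # update estimate with the innovation z - x
        p = (1 - k) * p          # update covariance
        estimates.append(x)
    return estimates
```

The improved algorithm in the paper extends this basic recursion by folding mixed noise into the measurement and state parameters and, for AC signals, by using a central difference formulation with colored-noise handling.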

  6. Dynamic Eye Tracking Based Metrics for Infant Gaze Patterns in the Face-Distractor Competition Paradigm

    PubMed Central

    Ahtola, Eero; Stjerna, Susanna; Yrttiaho, Santeri; Nelson, Charles A.; Leppänen, Jukka M.; Vanhatalo, Sampsa

    2014-01-01

    Objective To develop new standardized eye tracking based measures and metrics for infants’ gaze dynamics in the face-distractor competition paradigm. Method Eye tracking data were collected from two samples of healthy 7-month-old infants (total n = 45), as well as one sample of 5-month-old infants (n = 22), in a paradigm with a picture of a face or a non-face pattern as a central stimulus, and a geometric shape as a lateral stimulus. The data were analyzed by using conventional measures of infants’ initial disengagement from the central to the lateral stimulus (i.e., saccadic reaction time and probability) and, additionally, novel measures reflecting infants’ gaze dynamics after the initial disengagement (i.e., cumulative allocation of attention to the central vs. peripheral stimulus). Results The results showed that the initial saccade away from the centrally presented stimulus is followed by a rapid re-engagement of attention with the central stimulus, leading to cumulative preference for the central stimulus over the lateral stimulus over time. This pattern tended to be stronger for salient facial expressions as compared to non-face patterns, was replicable across two independent samples of 7-month-old infants, and differentiated between 7- and 5-month-old infants. Conclusion The results suggest that eye tracking based assessments of infants’ cumulative preference for faces over time can be readily parameterized and standardized, and may provide valuable techniques for future studies examining normative developmental changes in preference for social signals. Significance Standardized measures of early developing face preferences may have potential to become surrogate biomarkers of neurocognitive and social development. PMID:24845102

  7. Experimentally induced central sensitization in the cervical spine evokes postural stiffening strategies in healthy young adults.

    PubMed

    Huntley, Andrew H; Srbely, John Z; Zettel, John L

    2015-02-01

    Disequilibrium of cervicogenic origin can result from pain and injury to cervical paraspinal tissues post-whiplash; however, the specific physiological mechanisms still remain unclear. Central sensitization is a neuroadaptive process which has been clinically associated with conditions of chronic pain and hypersensitivity. Strong links have been demonstrated between pain hypersensitivity and postural deficits post-whiplash; however, the precise mechanisms are still poorly understood. The purpose of this study was to explore the mechanisms of cervicogenic disequilibrium by investigating the effect of experimentally induced central sensitization in the cervical spine on postural stability in young healthy adults. Sixteen healthy young adults (7 males (22.6±1.13 years) and 9 females (22±2.69 years)) performed 30-s full-tandem stance trials on an AMTI force plate under normal and centrally sensitized conditions. The primary outcome variables included the standard deviation of the center of pressure (COP) position in medio-lateral (M-L) and antero-posterior (A-P) directions; sway range of the COP in M-L and A-P directions and the mean power frequency (MPF) of the COP and horizontal ground shear forces. Variability and sway range of the COP decreased with experimental induction of central sensitization, accompanied by an increase in MPF of COP displacement in both M-L and A-P directions, suggesting an increase in postural stiffening post-sensitization versus non-sensitized controls. Future studies need to further explore this relationship in clinical (whiplash, chronic pain) populations. Copyright © 2015 Elsevier B.V. All rights reserved.
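The COP outcome measures named above (standard deviation, sway range, and mean power frequency) can be sketched in a few lines. This is an illustrative computation only; the naive DFT and the example sampling rate are assumptions, not the study's processing pipeline:

```python
import cmath
import math
import statistics

def cop_metrics(signal, fs):
    """Variability (SD), sway range, and mean power frequency (MPF) of a
    center-of-pressure trace sampled at fs Hz.
    MPF = sum(f * P(f)) / sum(P(f)) over the positive-frequency spectrum."""
    n = len(signal)
    mean = sum(signal) / n
    centred = [s - mean for s in signal]
    sd = statistics.pstdev(signal)          # variability of COP position
    sway_range = max(signal) - min(signal)  # peak-to-peak sway
    # Naive DFT power spectrum over positive frequencies (fine for a sketch;
    # a real pipeline would use an FFT and possibly Welch averaging)
    freqs, power = [], []
    for k in range(1, n // 2):
        coeff = sum(x * cmath.exp(-2j * math.pi * k * i / n)
                    for i, x in enumerate(centred))
        freqs.append(k * fs / n)
        power.append(abs(coeff) ** 2)
    mpf = sum(f * p for f, p in zip(freqs, power)) / sum(power)
    return sd, sway_range, mpf
```

The study's pattern of decreased SD and range with increased MPF is the signature of stiffening: smaller but faster corrective oscillations.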

  8. 49 CFR 71.8 - Mountain zone.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Office of the Secretary of Transportation STANDARD TIME ZONE BOUNDARIES § 71.8 Mountain zone. The fourth zone, the mountain standard time zone, includes that part of the United States that is west of the boundary line between the central and mountain standard time zones described in § 71.7 and east of the...

  9. The Common Core State Standards: The Emperor Is Still Looking for His Clothes

    ERIC Educational Resources Information Center

    Tienken, Christopher H.

    2012-01-01

    As of September 2012, only Alaska, Minnesota, Nebraska, Texas, and Virginia had not adopted the Common Core State Standards (CCSS). Yet empirical evidence that demonstrates the efficacy of the initiative remains elusive. The Emperor has no clothes (Tienken 2011a). This latest installment of standardization and centralization of curriculum and…

  10. The Role of NCA Commission on Schools: Voluntary Cooperative Action.

    ERIC Educational Resources Information Center

    Gose, Kenneth F.

    1997-01-01

    Calls for a new standard of educational excellence following the assumption of the new North Central Association (NCA) Standard and Criteria for Accreditation. Argues that in addition to maintaining past standards of excellence, modern-age schools must also actively adapt and improve; they must examine their current status and plan improvements.…

  11. Central Auditory Processing of Temporal and Spectral-Variance Cues in Cochlear Implant Listeners

    PubMed Central

    Pham, Carol Q.; Bremen, Peter; Shen, Weidong; Yang, Shi-Ming; Middlebrooks, John C.; Zeng, Fan-Gang; Mc Laughlin, Myles

    2015-01-01

    Cochlear implant (CI) listeners have difficulty understanding speech in complex listening environments. This deficit is thought to be largely due to peripheral encoding problems arising from current spread, which results in wide peripheral filters. In normal hearing (NH) listeners, central processing contributes to segregation of speech from competing sounds. We tested the hypothesis that basic central processing abilities are retained in post-lingually deaf CI listeners, but processing is hampered by degraded input from the periphery. In eight CI listeners, we measured auditory nerve compound action potentials to characterize peripheral filters. Then, we measured psychophysical detection thresholds in the presence of multi-electrode maskers placed either inside (peripheral masking) or outside (central masking) the peripheral filter. This was intended to distinguish peripheral from central contributions to signal detection. Introduction of temporal asynchrony between the signal and masker improved signal detection in both peripheral and central masking conditions for all CI listeners. Randomly varying components of the masker created spectral-variance cues, which seemed to benefit only two out of eight CI listeners. Contrastingly, the spectral-variance cues improved signal detection in all five NH listeners who listened to our CI simulation. Together these results indicate that widened peripheral filters significantly hamper central processing of spectral-variance cues but not of temporal cues in post-lingually deaf CI listeners. As indicated by two CI listeners in our study, however, post-lingually deaf CI listeners may retain some central processing abilities similar to NH listeners. PMID:26176553

  12. Archaeology as a social science.

    PubMed

    Smith, Michael E; Feinman, Gary M; Drennan, Robert D; Earle, Timothy; Morris, Ian

    2012-05-15

    Because of advances in methods and theory, archaeology now addresses issues central to debates in the social sciences in a far more sophisticated manner than ever before. Coupled with methodological innovations, multiscalar archaeological studies around the world have produced a wealth of new data that provide a unique perspective on long-term changes in human societies, as they document variation in human behavior and institutions before the modern era. We illustrate these points with three examples: changes in human settlements, the roles of markets and states in deep history, and changes in standards of living. Alternative pathways toward complexity suggest how common processes may operate under contrasting ecologies, populations, and economic integration.

  13. Archaeology as a social science

    PubMed Central

    Smith, Michael E.; Feinman, Gary M.; Drennan, Robert D.; Earle, Timothy; Morris, Ian

    2012-01-01

    Because of advances in methods and theory, archaeology now addresses issues central to debates in the social sciences in a far more sophisticated manner than ever before. Coupled with methodological innovations, multiscalar archaeological studies around the world have produced a wealth of new data that provide a unique perspective on long-term changes in human societies, as they document variation in human behavior and institutions before the modern era. We illustrate these points with three examples: changes in human settlements, the roles of markets and states in deep history, and changes in standards of living. Alternative pathways toward complexity suggest how common processes may operate under contrasting ecologies, populations, and economic integration. PMID:22547811

  14. High rate information systems - Architectural trends in support of the interdisciplinary investigator

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Preheim, Larry E.

    1990-01-01

    Data systems requirements in the Earth Observing System (EOS) and Space Station Freedom (SSF) eras indicate increasing data volume, increased discipline interplay, higher complexity, and broader data integration and interpretation. A response to the needs of the interdisciplinary investigator is proposed, considering the increasing complexity and rising costs of scientific investigation. The EOS Data Information System, conceived as a widely distributed system with reliable communication links between central processing and the science user community, is described. Details are provided on information architecture, system models, intelligent data management of large complex databases, and standards for archiving ancillary data, using a research library, a laboratory, and collaboration services.

  15. Higgs radiation off top quarks at the Tevatron and the LHC.

    PubMed

    Beenakker, W; Dittmaier, S; Krämer, M; Plümper, B; Spira, M; Zerwas, P M

    2001-11-12

    Higgs bosons can be searched for in the channels pp̄/pp → tt̄H + X at the Fermilab Tevatron and the CERN Large Hadron Collider (LHC). We have calculated the QCD corrections to these processes in the standard model at next-to-leading order. The higher-order corrections reduce the renormalization and factorization scale dependence considerably and stabilize the theoretical predictions for the cross sections. At the central scale μ = (2m_t + M_H)/2 the properly defined K factors are slightly below unity for the Tevatron (K ≈ 0.8) and slightly above unity for the LHC (K ≈ 1.2).

  16. The NAIMS cooperative pilot project: Design, implementation and future directions.

    PubMed

    Oh, Jiwon; Bakshi, Rohit; Calabresi, Peter A; Crainiceanu, Ciprian; Henry, Roland G; Nair, Govind; Papinutto, Nico; Constable, R Todd; Reich, Daniel S; Pelletier, Daniel; Rooney, William; Schwartz, Daniel; Tagge, Ian; Shinohara, Russell T; Simon, Jack H; Sicotte, Nancy L

    2017-10-01

    The North American Imaging in Multiple Sclerosis (NAIMS) Cooperative represents a network of 27 academic centers focused on accelerating the pace of magnetic resonance imaging (MRI) research in multiple sclerosis (MS) through idea exchange and collaboration. Recently, NAIMS completed its first project evaluating the feasibility of implementation and reproducibility of quantitative MRI measures derived from scanning a single MS patient using a high-resolution 3T protocol at seven sites. The results showed the feasibility of utilizing advanced quantitative MRI measures in multicenter studies and demonstrated the importance of careful standardization of scanning protocols, central image processing, and strategies to account for inter-site variability.

  17. 76 FR 37549 - Energy Conservation Program: Energy Conservation Standards for Residential Furnaces and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-27

    ...The Energy Policy and Conservation Act of 1975 (EPCA), as amended, prescribes energy conservation standards for various consumer products and certain commercial and industrial equipment, including residential furnaces and residential central air conditioners and heat pumps. EPCA also requires the U.S. Department of Energy (DOE) to determine whether more-stringent, amended standards for these products would be technologically feasible and economically justified, and would save a significant amount of energy. In this notice, DOE proposes energy conservation standards for residential furnaces and for residential central air conditioners and heat pumps identical to those set forth in a direct final rule published elsewhere in today's Federal Register. If DOE receives adverse comment and determines that such comment may provide a reasonable basis for withdrawing the direct final rule, DOE will publish a notice withdrawing the direct final rule and will proceed with this proposed rule.

  18. 78 FR 9311 - Hazard Communication; Corrections and Technical Amendment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-08

    ... Column for Standard No. 1910.1051. ``Cancer; eye and respiratory tract irritation; center nervous system... irritation; central nervous system effects; and flammability.'' The following table contains a summary of the... In paragraph (l)(1)(ii), ``center nervous system effects'' is corrected to ``central nervous system...

  19. Open up the Ceiling on the Common Core State Standards: Preparing Students for 21st-Century Literacy--Now

    ERIC Educational Resources Information Center

    Drew, Sally Valentino

    2013-01-01

    The Common Core State Standards Initiative is the latest effort to reform education through standards. This article examines how the Standards promise to prepare students for the changing world of the 21st century, yet do not consider the changing nature of literacy--especially the centrality of the Internet as a 21st century text, and online…

  20. Caregiver Education Reduces the Incidence of Community-Acquired CLABSIs in the Pediatric Patient With Intestinal Failure.

    PubMed

    Drews, Barbie; Macaluso, Michelle; Piper, Hannah; Channabasappa, Nandini

    Pediatric patients with intestinal failure often require central venous catheters for extended periods of time for parenteral nutrition, blood sampling, and medication administration, increasing morbidity, mortality, and costs. In 2007, we reported a central line-associated bloodstream infection rate of 7.0 per 1,000 catheter line-days in our pediatric patients with intestinal failure. On the basis of this high rate of catheter-associated infections, we developed and implemented a central line care curriculum for patients, family caregivers, and home health nurses. We aimed to show that, with the implementation of this standardized education, the central line-associated bloodstream infection rate could be significantly and sustainably reduced. A retrospective review of 80 pediatric outpatients with intestinal failure and long-term central venous access was performed between January 1, 2009, and December 31, 2014. During this time period, the nursing department at Children's Medical Center of Dallas implemented a systematic central line care education program for patients and/or caregivers. The number of community-acquired central line-associated bloodstream infections during this time period was collected and compared with our previously reported data from 2005 to 2007, prior to the implementation of the education program. With the implementation of standardized care guidelines and a central venous catheter care curriculum, the community-acquired infection rate decreased from 4.8 to 2.9 per 1,000 catheter-days in the 80 patients with intestinal failure between January 1, 2009, and December 31, 2014 (p < .001). This was also a significant decrease compared with the initial central line-associated bloodstream infection rate of 7.0 per 1,000 central line-days in 2007 (p < .001), prior to the development of the central venous catheter care curriculum. We have shown that the incidence of community-acquired central line-associated bloodstream infections in children with intestinal failure can be reduced through formal education of family members in central venous catheter care.
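
    The infection rates quoted above are all expressed per 1,000 catheter-days, the standard normalization in CLABSI surveillance. A minimal sketch of that calculation (the example numbers below are hypothetical, not taken from the study):

```python
def clabsi_rate_per_1000_days(infections: int, catheter_days: int) -> float:
    """CLABSI rate normalized per 1,000 catheter-days."""
    # Multiply first so an exact integer quotient stays exact
    return 1000 * infections / catheter_days

# Hypothetical illustration: 21 infections over 3,000 catheter-days
print(clabsi_rate_per_1000_days(21, 3000))  # 7.0
```

    Normalizing by catheter-days rather than by patient count makes rates comparable across cohorts whose catheter dwell times differ.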

  1. Transplanting a Western-Style Journalism Education to the Central Asian Republics of the Former Soviet Union: Experiences and Challenges at the American University of Central Asia in Kyrgyzstan

    ERIC Educational Resources Information Center

    Skochilo, Elena; Toralieva, Gulnura; Freedman, Eric; Shafer, Richard

    2013-01-01

    Western standards of journalism education, as well as western professional journalistic practices, have had difficulty taking root in the five independent countries of formerly Soviet Central Asia. This essay examines the experience of one university's Department of Journalism and Mass Communication since 1997 and the challenges it faces,…

  2. Centralization or decentralization of facial structures in Korean young adults.

    PubMed

    Yoo, Ja-Young; Kim, Jeong-Nam; Shin, Kang-Jae; Kim, Soon-Heum; Choi, Hyun-Gon; Jeon, Hyun-Soo; Koh, Ki-Seok; Song, Wu-Chul

    2013-05-01

    It is well known that facial beauty is dictated by facial type and by harmony between the eyes, nose, and mouth. Furthermore, facial impression is judged according to the overall facial contour and the relationship between the facial structures. The aims of the present study were to determine the optimal criteria for the assessment of gathering or separation of the facial structures and to define standardized ratios for centralization or decentralization of the facial structures. Four different lengths were measured, and 2 indexes were calculated from standardized photographs of 551 volunteers. Centralization and decentralization were assessed using the width index (interpupillary distance / facial width) and height index (eyes-mouth distance / facial height). The mean ranges of the width index and height index were 42.0 to 45.0 and 36.0 to 39.0, respectively. The width index did not differ with sex, but vertically, males had more decentralized faces and females had more centralized faces. The incidence rate of decentralized faces among the men was 30.3%, and that of centralized faces among the women was 25.2%. The mean ranges of the width and height indexes have been determined in a Korean population. Faces with width and height index scores under and over these ranges are determined to be "centralized" and "decentralized," respectively.
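
    The two ratios above can be computed directly. A minimal sketch, assuming the indexes are expressed on a 0-100 scale (consistent with the reported 42.0-45.0 and 36.0-39.0 ranges) and using hypothetical measurements:

```python
def width_index(interpupillary_mm: float, facial_width_mm: float) -> float:
    # Interpupillary distance / facial width, scaled to a 0-100 index
    return interpupillary_mm / facial_width_mm * 100

def height_index(eyes_mouth_mm: float, facial_height_mm: float) -> float:
    # Eyes-mouth distance / facial height, scaled to a 0-100 index
    return eyes_mouth_mm / facial_height_mm * 100

def classify(index: float, lo: float, hi: float) -> str:
    # Below the mean range -> "centralized"; above it -> "decentralized"
    if index < lo:
        return "centralized"
    if index > hi:
        return "decentralized"
    return "average"

print(round(width_index(61.0, 140.0), 1))  # 43.6
print(classify(40.2, 42.0, 45.0))          # centralized
print(classify(46.0, 42.0, 45.0))          # decentralized
```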

  3. Optical Features of Efficient Europium(III) Complexes with β-Diketonato and Auxiliary Ligands and Mechanistic Investigation of Energy Transfer Process.

    PubMed

    Bala, Manju; Kumar, Satish; Taxak, V B; Boora, Priti; Khatkar, S P

    2016-09-01

    Two new europium(III) complexes have been synthesized with 1,3-[bis(4-methoxyphenyl)]propane-1,3-dionato (HBMPD) as the main ligand and 2,2'-bipyridyl (bipy) or 1,10-phenanthroline (phen) as an auxiliary ligand. The main ligand HBMPD was synthesized by an ecofriendly microwave approach and the complexes by a solution precipitation method. The resulting materials were characterized by IR, (1)H-NMR, elemental analysis, X-ray diffraction, UV-visible, and TG-DTG techniques. Photoluminescence (PL) spectroscopy provides a detailed analysis of the photophysical properties of the complexes; the results show that the ligands act as antennas, interacting with the central europium(III) ion and efficiently transferring the absorbed energy to it via a sensitization process. As a consequence of this interaction, these materials exhibit excellent luminescent intensity, long decay time (τ), high quantum efficiency (η), and a large Judd-Ofelt intensity parameter (Ω2). The CIE coordinates fall in the deep red region, matching well with the NTSC (National Television System Committee) standard. Hence, these highly efficient optical materials can be used as a red component in organic light-emitting diodes (OLEDs) and full-color flat panel displays.

  4. Clinical research in a hospital--from the lone rider to teamwork.

    PubMed

    Hannisdal, E

    1996-01-01

    Clinical research of a high international standard is very demanding and requires clinical data of high quality, software, hardware, and competence in research design and statistical treatment of data. Most busy clinicians have little time allocated for clinical research, and this increases the need for a potent infrastructure. This paper describes how the Norwegian Radium Hospital, a specialized cancer hospital, has reorganized the clinical research process. This includes a new department, the Clinical Research Office, which provides the formal framework, a central Diagnosis Registry, clinical databases, and multicentre studies. The department assists about 120 users, mainly clinicians. Installation of a networked software package with over 10 programs has strongly promoted internal standardization, reduced costs, and saved clinicians a great deal of time. The hospital is building up about 40 diagnosis-specific clinical databases with up to 200 variables registered. These databases are shared by the treatment groups and seem to be important tools for quality assurance. We conclude that the clinical research process benefits from a firm infrastructure facilitating teamwork through extensive use of modern information technology. We are now ready for the next phase, which is to work for a better external technical framework for cooperation with other institutions throughout the world.

  5. Development and evaluation of a web-based application for digital findings and documentation in physiotherapy education.

    PubMed

    Spieler, Bernadette; Burgsteiner, Harald; Messer-Misak, Karin; Gödl-Purrer, Barbara; Salchinger, Beate

    2015-01-01

    Findings in physiotherapy follow standardized approaches in treatment, but there is also a significant margin of difference in how these standards are implemented. Clinical decisions require experience and continuous learning processes to consolidate personal values and opinions, and studies suggest that lecturers can influence students positively. Until recently, the Physiotherapy degree programme at the University of Applied Sciences in Graz used a paper-based findings document. This document supported decisions through adaptation of the clinical reasoning process, and it was the starting point for our learning application called "EasyAssess", a Java-based web application for digital findings documentation. A central point of our work was to ensure the efficiency, effectiveness, and usability of the web application through usability tests conducted with both students and lecturers. The results show that our application fulfills the previously defined requirements and can be used efficiently in daily routine, largely because of its simple user interface and modest design. Due to the close cooperation with the Physiotherapy degree programme, the application has incorporated the various needs of the target audiences, confirming its usefulness.

  6. Ultrasound as a Screening Tool for Central Venous Catheter Positioning and Exclusion of Pneumothorax.

    PubMed

    Amir, Rabia; Knio, Ziyad O; Mahmood, Feroze; Oren-Grinberg, Achikam; Leibowitz, Akiva; Bose, Ruma; Shaefi, Shahzad; Mitchell, John D; Ahmed, Muneeb; Bardia, Amit; Talmor, Daniel; Matyal, Robina

    2017-07-01

    Although real-time ultrasound guidance during central venous catheter insertion has become a standard of care, postinsertion chest radiograph remains the gold standard to confirm central venous catheter tip position and rule out associated lung complications like pneumothorax. We hypothesize that a combination of transthoracic echocardiography and lung ultrasound is noninferior to chest radiograph when used to accurately assess central venous catheter positioning and screen for pneumothorax. All operating rooms and surgical and trauma ICUs at the institution. Single-center, prospective noninferiority study. Patients receiving ultrasound-guided subclavian or internal jugular central venous catheters. During ultrasound-guided central venous catheter placement, correct positioning of central venous catheter was accomplished by real-time visualization of the guide wire and positive right atrial swirl sign using the subcostal four-chamber view. After insertion, pneumothorax was ruled out by the presence of lung sliding and seashore sign on M-mode. Data analysis was done for 137 patients. Chest radiograph ruled out pneumothorax in 137 of 137 patients (100%). Lung ultrasound was performed in 123 of 137 patients and successfully screened for pneumothorax in 123 of 123 (100%). Chest radiograph approximated accurate catheter tip position in 136 of 137 patients (99.3%). Adequate subcostal four-chamber views could not be obtained in 13 patients. Accurate positioning of central venous catheter with ultrasound was then confirmed in 121 of 124 patients (97.6%) as described previously. Transthoracic echocardiography and lung ultrasound are noninferior to chest x-ray for screening of pneumothorax and accurate central venous catheter positioning. Thus, the point of care use of ultrasound can reduce central venous catheter insertion-to-use time and exposure to radiation, and improve patient safety.

  7. Amodal processing in human prefrontal cortex.

    PubMed

    Tamber-Rosenau, Benjamin J; Dux, Paul E; Tombu, Michael N; Asplund, Christopher L; Marois, René

    2013-07-10

    Information enters the cortex via modality-specific sensory regions, whereas actions are produced by modality-specific motor regions. Intervening central stages of information processing map sensation to behavior. Humans perform this central processing in a flexible, abstract manner such that sensory information in any modality can lead to response via any motor system. Cognitive theories account for such flexible behavior by positing amodal central information processing (e.g., "central executive," Baddeley and Hitch, 1974; "supervisory attentional system," Norman and Shallice, 1986; "response selection bottleneck," Pashler, 1994). However, the extent to which brain regions embodying central mechanisms of information processing are amodal remains unclear. Here we apply multivariate pattern analysis to functional magnetic resonance imaging (fMRI) data to compare response selection, a cognitive process widely believed to recruit an amodal central resource across sensory and motor modalities. We show that most frontal and parietal cortical areas known to activate across a wide variety of tasks code modality, casting doubt on the notion that these regions embody a central processor devoid of modality representation. Importantly, regions of anterior insula and dorsolateral prefrontal cortex consistently failed to code modality across four experiments. However, these areas code at least one other task dimension, process (instantiated as response selection vs response execution), ensuring that failure to find coding of modality is not driven by insensitivity of multivariate pattern analysis in these regions. We conclude that abstract encoding of information modality is primarily a property of subregions of the prefrontal cortex.

  8. 32 CFR 1904.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Defense Other Regulations Relating to National Defense CENTRAL INTELLIGENCE AGENCY PROCEDURES GOVERNING ACCEPTANCE OF SERVICE OF PROCESS § 1904.2 Definitions. (a) Agency or CIA means the Central Intelligence Agency and include all staff elements of the Director of Central Intelligence. (b) Process means a...

  9. 32 CFR 1904.2 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Defense Other Regulations Relating to National Defense CENTRAL INTELLIGENCE AGENCY PROCEDURES GOVERNING ACCEPTANCE OF SERVICE OF PROCESS § 1904.2 Definitions. (a) Agency or CIA means the Central Intelligence Agency and include all staff elements of the Director of Central Intelligence. (b) Process means a...

  10. 32 CFR 1904.2 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Defense Other Regulations Relating to National Defense CENTRAL INTELLIGENCE AGENCY PROCEDURES GOVERNING ACCEPTANCE OF SERVICE OF PROCESS § 1904.2 Definitions. (a) Agency or CIA means the Central Intelligence Agency and include all staff elements of the Director of Central Intelligence. (b) Process means a...

  11. 32 CFR 1904.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Defense Other Regulations Relating to National Defense CENTRAL INTELLIGENCE AGENCY PROCEDURES GOVERNING ACCEPTANCE OF SERVICE OF PROCESS § 1904.2 Definitions. (a) Agency or CIA means the Central Intelligence Agency and include all staff elements of the Director of Central Intelligence. (b) Process means a...

  12. 32 CFR 1904.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Defense Other Regulations Relating to National Defense CENTRAL INTELLIGENCE AGENCY PROCEDURES GOVERNING ACCEPTANCE OF SERVICE OF PROCESS § 1904.2 Definitions. (a) Agency or CIA means the Central Intelligence Agency and include all staff elements of the Director of Central Intelligence. (b) Process means a...

  13. Comparison of weighted and unweighted network analysis in the case of a pig trade network in Northern Germany.

    PubMed

    Büttner, Kathrin; Krieter, Joachim

    2018-08-01

    The analysis of trade networks, as well as of the spread of diseases within these systems, focuses mainly on pure animal movements between farms. However, additional data included as edge weights can complement the informational content of the network analysis; the inclusion of edge weights can also alter its outcome. Thus, the aim of the study was to compare unweighted and weighted network analyses of a pork supply chain in Northern Germany and to evaluate the impact on the centrality parameters. Five different weighted network versions were constructed by adding the following edge weights: number of trade contacts, number of delivered livestock, average number of delivered livestock per trade contact, geographical distance, and reciprocal geographical distance. Additionally, two different edge weight standardizations were used. The network observed from 2013 to 2014 contained 678 farms which were connected by 1,018 edges. General network characteristics, including shortest path structure (e.g. identical shortest paths, shortest path lengths) as well as centrality parameters, were calculated for each network version. Furthermore, targeted and random removal of farms was performed in order to evaluate the structural changes in the networks. All network versions and edge weight standardizations revealed the same number of shortest paths (1,935), and between 94.4% and 98.9% of the shortest paths were identical between the unweighted network and the weighted network versions. Furthermore, depending on the centrality parameters calculated and the edge weight standardization used, the weighted network versions either differed from the unweighted network (e.g. for centrality parameters based on ingoing trade contacts) or did not (e.g. for centrality parameters based on outgoing trade contacts) with regard to the Spearman rank correlation and the targeted removal of farms. The choice of standardization method, as well as the inclusion or exclusion of specific farm types (e.g. abattoirs), can alter the results significantly. These facts have to be considered when centrality parameters are to be used for the implementation of prevention and control strategies in the case of an epidemic. Copyright © 2018 Elsevier B.V. All rights reserved.
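
    The contrast between unweighted and weighted views of a trade network can be illustrated with out-degree (number of outgoing trade contacts) versus out-strength (total delivered livestock): the two measures can rank the same farms differently. The toy network below is hypothetical, not the study's data:

```python
# Toy directed trade network: (source farm, target farm, delivered livestock)
edges = [("A", "B", 10), ("A", "C", 200), ("B", "C", 5), ("C", "D", 50)]

def out_degree(edges):
    # Unweighted centrality: count outgoing trade contacts per farm
    deg = {}
    for src, _dst, _w in edges:
        deg[src] = deg.get(src, 0) + 1
    return deg

def out_strength(edges):
    # Weighted centrality: sum delivered livestock over outgoing edges
    s = {}
    for src, _dst, w in edges:
        s[src] = s.get(src, 0) + w
    return s

print(out_degree(edges))    # {'A': 2, 'B': 1, 'C': 1}
print(out_strength(edges))  # {'A': 210, 'B': 5, 'C': 50}
```

    Note that B and C are tied by degree but differ by an order of magnitude in strength, which is exactly the kind of reordering the study reports for some centrality parameters.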

  14. CLABSI Conversations: Lessons From Peer-to-Peer Assessments to Reduce Central Line-Associated Bloodstream Infections.

    PubMed

    Pham, Julius Cuong; Goeschel, Christine A; Berenholtz, Sean M; Demski, Renee; Lubomski, Lisa H; Rosen, Michael A; Sawyer, Melinda D; Thompson, David A; Trexler, Polly; Weaver, Sallie J; Weeks, Kristina R; Pronovost, Peter J

    2016-01-01

    A national collaborative helped many hospitals dramatically reduce central line-associated bloodstream infections (CLABSIs), but some hospitals struggled to reduce infection rates. This article describes the development of a peer-to-peer assessment process (CLABSI Conversations) and the practical, actionable practices we discovered that helped intensive care unit teams achieve a CLABSI rate of less than 1 infection per 1000 catheter-days for at least 1 year. CLABSI Conversations was designed as a learning-oriented process, in which a team of peers visited hospitals to surface barriers to infection prevention and to share best practices and insights from successful intensive care units. Common practices led to 10 recommendations: executive and board leaders communicate the goal of zero CLABSI throughout the hospital; senior and unit-level leaders hold themselves accountable for CLABSI rates; unit physicians and nurse leaders own the problem; clinical leaders and infection preventionists build infection prevention training and simulation programs; infection preventionists participate in unit-based CLABSI reduction efforts; hospital managers make compliance with best practices easy; clinical leaders standardize the hospital's catheter insertion and maintenance practices and empower nurses to stop any potentially harmful acts; unit leaders and infection preventionists investigate CLABSIs to identify root causes; and unit nurses and staff audit catheter maintenance policies and practices.

  15. BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.

    PubMed

    Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel

    2015-06-02

    Bioinformaticians face a range of difficulties in getting locally installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools behind an easy interface for input and output. Web services, due to their universal nature and widely known interface, are a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS mediates access to registered tools by providing front-end and back-end web services. Programmers can install applications on HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results to BOWS. Programs running on ordinary computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that lists the registered applications and clients. Registered applications can be accessed from virtually any programming language through web services, or using the standard Java clients. The back-end can run on HPC clusters, allowing bioinformaticians to remotely run high-processing-demand applications directly from their machines.
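
    The front-end/back-end split described above is essentially a job broker: clients submit work through one interface while HPC-side agents poll for work and post results through another. A minimal in-memory sketch of that pattern (the class and method names are hypothetical; the real BOWS exposes these operations as web services):

```python
class Broker:
    """Hypothetical in-memory stand-in for the BOWS front-end/back-end pair."""

    def __init__(self):
        self.jobs, self.results = {}, {}
        self.next_id = 0

    # Front-end side: clients submit new processes and read results
    def submit(self, tool: str, params: dict) -> int:
        self.next_id += 1
        self.jobs[self.next_id] = (tool, params)
        return self.next_id

    def get_result(self, job_id: int):
        return self.results.get(job_id)  # None until the back-end reports

    # Back-end side: HPC agents claim pending jobs and post results
    def fetch_job(self):
        if self.jobs:
            job_id = next(iter(self.jobs))
            return job_id, self.jobs.pop(job_id)
        return None

    def post_result(self, job_id: int, payload):
        self.results[job_id] = payload

broker = Broker()
jid = broker.submit("blast", {"query": "ATGC"})
job = broker.fetch_job()            # an HPC agent claims the job
broker.post_result(jid, "hits: 3")  # ...runs it and reports back
print(broker.get_result(jid))       # hits: 3
```

    Because both sides talk only to the broker, the compute cluster needs no inbound connectivity, which matches the abstract's polling design.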

  16. CRAB3: Establishing a new generation of services for distributed analysis at CMS

    NASA Astrophysics Data System (ADS)

    Cinquilli, M.; Spiga, D.; Grandi, C.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Riahi, H.; Vaandering, E.

    2012-12-01

    In CMS Computing the highest priorities for analysis tools are the improvement of the end users' ability to produce and publish reliable samples and analysis results, as well as a transition to a sustainable development and operations model. To achieve these goals CMS decided to incorporate analysis processing into the same framework as data and simulation processing. This strategy foresees that all workload tools (Tier0, Tier1, production, analysis) share a common core with long-term maintainability, as well as the standardization of the operator interfaces. The re-engineered analysis workload manager, called CRAB3, makes use of newer technologies, such as RESTful web services and NoSQL databases, aiming to increase the scalability and reliability of the system. As opposed to CRAB2, in CRAB3 all work is centrally injected and managed in a global queue. A pool of agents, which can be geographically distributed, consumes work from the central services serving the user tasks. The new architecture of CRAB substantially changes the deployment model and operations activities. In this paper we present the implementation of CRAB3, emphasizing how the new architecture improves workflow automation and simplifies maintainability. In particular, we highlight the impact of the new design on daily operations.

  17. [Blood cultures in the paediatric emergency department. Guidelines and recommendations on their indications, collection, processing and interpretation].

    PubMed

    Hernández-Bou, S; Álvarez Álvarez, C; Campo Fernández, M N; García Herrero, M A; Gené Giralt, A; Giménez Pérez, M; Piñeiro Pérez, R; Gómez Cortés, B; Velasco, R; Menasalvas Ruiz, A I; García García, J J; Rodrigo Gonzalo de Liria, C

    2016-05-01

    Blood culture (BC) is the gold standard when a bacteraemia is suspected, and is one of the most requested microbiological tests in paediatrics. Some changes have occurred in recent years: the introduction of new vaccines, the increasing number of patients with central vascular catheters, as well as the introduction of continuous monitoring BC systems. These changes have led to the review and update of different factors related to this technique in order to optimise its use. A practice guideline is presented with recommendations on BC, established by the Spanish Society of Paediatric Emergency Care and the Spanish Society for Paediatric Infectious Diseases. After reviewing the available scientific evidence, several recommendations for each of the following aspects are presented: BC indications in the Emergency Department, how to obtain, transport and process cultures, special situations (indications and interpretation of results in immunosuppressed patients and/or central vascular catheter carriers, indications for anaerobic BC), differentiation between bacteraemia and contamination when a BC shows bacterial growth and actions to take with a positive BC in patients with fever of unknown origin. Copyright © 2015 Asociación Española de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.

  18. Importance of Data Management in a Long-term Biological Monitoring Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Sigurd W; Brandt, Craig C; McCracken, Kitty

    2011-01-01

    The long-term Biological Monitoring and Abatement Program (BMAP) has always needed to collect and retain high-quality data on which to base its assessments of the ecological status of streams and their recovery after remediation. Its formal quality assurance, data processing, and data management components all contribute to this need. The Quality Assurance Program comprehensively addresses requirements from various institutions, funders, and regulators, and includes a data management component. Centralized data management began a few years into the program. An existing relational database was adapted and extended to handle biological data. Data modeling enabled the program's database to process, store, and retrieve its data. The database's main data tables and several key reference tables are described. One of the most important related activities supporting long-term analyses was the establishment of standards for sampling site names, taxonomic identification, flagging, and other components. There are limitations: some types of program data were not easily accommodated in the central systems, and many possible data-sharing and integration options are not easily accessible to investigators. The implemented relational database supports the transmittal of data to the Oak Ridge Environmental Information System (OREIS) as the permanent repository. From our experience we offer data management advice to other biologically oriented long-term environmental sampling and analysis programs.
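
    The combination of main data tables and standardized reference tables described above is the classic relational pattern: free-text fields such as site names are replaced by keys into a controlled vocabulary table. A much-simplified, hypothetical sketch in SQLite (the table layout and site name are illustrative, not the actual BMAP schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Reference table: the single authoritative list of sampling-site names
con.execute("CREATE TABLE site (site_id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
# Main data table: each sample points at a standardized site, with a QA flag
con.execute("""CREATE TABLE sample (
    sample_id INTEGER PRIMARY KEY,
    site_id INTEGER REFERENCES site(site_id),
    taxon TEXT, value REAL, flag TEXT)""")
con.execute("INSERT INTO site VALUES (1, 'EFK 23.4')")
con.execute("INSERT INTO sample VALUES (1, 1, 'Salmo trutta', 12.5, NULL)")
row = con.execute("""SELECT site.name, sample.taxon
                     FROM sample JOIN site USING (site_id)""").fetchone()
print(row)  # ('EFK 23.4', 'Salmo trutta')
```

    Keeping names in one reference table is what makes decades of samples comparable: a renamed or mistyped site is a one-row fix, not a sweep through every data table.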

  19. Processing efficiency theory in children: working memory as a mediator between trait anxiety and academic performance.

    PubMed

    Owens, Matthew; Stevenson, Jim; Norgate, Roger; Hadwin, Julie A

    2008-10-01

    Working memory skills are positively associated with academic performance. In contrast, high levels of trait anxiety are linked with educational underachievement. Based on Eysenck and Calvo's (1992) processing efficiency theory (PET), the present study investigated whether associations between anxiety and educational achievement were mediated via poor working memory performance. Fifty children aged 11-12 years completed verbal (backwards digit span; tapping the phonological store/central executive) and spatial (Corsi blocks; tapping the visuospatial sketchpad/central executive) working memory tasks. Trait anxiety was measured using the State-Trait Anxiety Inventory for Children. Academic performance was assessed using school administered tests of reasoning (Cognitive Abilities Test) and attainment (Standard Assessment Tests). The results showed that the association between trait anxiety and academic performance was significantly mediated by verbal working memory for three of the six academic performance measures (math, quantitative and non-verbal reasoning). Spatial working memory did not significantly mediate the relationship between trait anxiety and academic performance. On average verbal working memory accounted for 51% of the association between trait anxiety and academic performance, while spatial working memory only accounted for 9%. The findings indicate that PET is a useful framework to assess the impact of children's anxiety on educational achievement.
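
    The "percentage of the association accounted for" figures above correspond to the proportion-mediated statistic from simple mediation analysis: the drop from the total effect c (anxiety on performance) to the direct effect c' (after adding working memory as a mediator), divided by c. A numeric sketch with hypothetical coefficients, not the study's:

```python
def proportion_mediated(total_effect: float, direct_effect: float) -> float:
    # (c - c') / c: the share of the total effect carried by the mediator
    return (total_effect - direct_effect) / total_effect

# Hypothetical: total effect c = -0.45, direct effect c' = -0.22 after
# controlling for verbal working memory -> roughly half the association
# is mediated, comparable in spirit to the 51% reported above.
print(round(proportion_mediated(-0.45, -0.22), 2))  # 0.51
```

    This indirect-effect decomposition is the standard first pass; formal inference would add a test such as bootstrapping the indirect effect.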

  20. The Latin-American region and the challenges to develop one homogeneous and harmonized hazard model: preliminary results for the Caribbean and Central America regions in the GEM context

    NASA Astrophysics Data System (ADS)

    Garcia, J.; Arcila, M.; Benito, B.; Eraso, J.; García, R.; Gomez Capera, A.; Pagani, M.; Pinho, R.; Rendon, H.; Torres, Y.

    2013-05-01

    Latin America is a seismically active region with complex tectonic settings that make the creation of hazard models challenging. Over the past two decades, PSHA studies have been completed for this region in the context of global (Shedlock, 1999), regional (Dimaté et al., 1999) and national initiatives. Currently, different research groups are developing new models for various nations. The Global Earthquake Model (GEM), an initiative aiming at the creation of a large global community working collaboratively on building hazard and risk models using open standards and tools, is promoting collaboration between different national projects and groups so as to facilitate the creation of harmonized regional models. The creation of a harmonized hazard model can follow different approaches, varying from a simple patching together of available models to a complete homogenisation of basic information and the subsequent creation of a completely new PSHA model. In this contribution we describe the process and results of a first attempt at creating a community-based model covering the Caribbean and Central America regions. It consists of five main steps: (1) identification and collection of available PSHA input models; (2) analysis of the consistency, transparency and reproducibility of each model; (3) selection (if more than one model exists for the same region); (4) representation of the models in a standardized format and incorporation of new knowledge from recent studies; and (5) proposal(s) for harmonization. We consider several PSHA studies completed over the last twenty years in the region comprising the Caribbean (CAR), Central America (CAM) and northern South America (SA), we illustrate a tentative harmonization of the seismic source geometry models, and we discuss the steps needed toward a complete harmonisation of the models. Our aim is to have a model based on best practices and high standards, created through a combination of knowledge and competences coming from the scientific community and incorporating national and regional institutions. This is an ambitious goal that can be pursued only through intense and open cooperation between all the interested parties.

  1. Use of general purpose graphics processing units with MODFLOW

    USGS Publications Warehouse

    Hughes, Joseph D.; White, Jeremy T.

    2013-01-01

    To evaluate the use of general-purpose graphics processing units (GPGPUs) to improve the performance of MODFLOW, an unstructured preconditioned conjugate gradient (UPCG) solver has been developed. The UPCG solver uses a compressed sparse row storage scheme and includes Jacobi, zero fill-in incomplete and modified-incomplete lower-upper (LU) factorization, and generalized least-squares polynomial preconditioners. The UPCG solver also includes options for sequential and parallel solution on the central processing unit (CPU) using OpenMP. For simulations utilizing the GPGPU, all basic linear algebra operations are performed on the GPGPU; memory copies between the CPU and GPGPU occur prior to the first iteration of the UPCG solver and after satisfying head and flow criteria or exceeding a maximum number of iterations. The efficiency of the UPCG solver for GPGPU and CPU solutions is benchmarked using simulations of a synthetic, heterogeneous unconfined aquifer with tens of thousands to millions of active grid cells. Testing indicates GPGPU speedups on the order of 2 to 8, relative to the standard MODFLOW preconditioned conjugate gradient (PCG) solver, can be achieved when (1) memory copies between the CPU and GPGPU are optimized, (2) the percentage of time performing memory copies between the CPU and GPGPU is small relative to the calculation time, (3) high-performance GPGPU cards are utilized, and (4) CPU-GPGPU combinations are used to execute sequential operations that are difficult to parallelize. Furthermore, UPCG solver testing indicates GPGPU speedups exceed parallel CPU speedups achieved using OpenMP on multicore CPUs for preconditioners that can be easily parallelized.
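    The solver components named above, compressed sparse row (CSR) storage, a Jacobi preconditioner, and conjugate-gradient iteration, can be sketched in a few lines of pure Python (an illustrative sketch of the generic algorithm, not MODFLOW's UPCG code; all names are ours):

```python
def csr_matvec(data, indices, indptr, x):
    """y = A @ x for a matrix A stored in compressed sparse row form."""
    n = len(indptr) - 1
    y = [0.0] * n
    for i in range(n):
        s = 0.0
        for k in range(indptr[i], indptr[i + 1]):
            s += data[k] * x[indices[k]]
        y[i] = s
    return y

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def jacobi_pcg(data, indices, indptr, diag, b, tol=1e-10, maxiter=200):
    """Jacobi-preconditioned conjugate gradient for a symmetric
    positive-definite matrix A in CSR form; diag holds diag(A)."""
    n = len(b)
    x = [0.0] * n
    r = list(b)                              # r = b - A @ x0, with x0 = 0
    z = [r[i] / diag[i] for i in range(n)]   # z = M^-1 r (Jacobi step)
    p = list(z)
    rz = dot(r, z)
    for _ in range(maxiter):
        Ap = csr_matvec(data, indices, indptr, p)
        alpha = rz / dot(p, Ap)
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if max(abs(ri) for ri in r) < tol:
            break
        z = [r[i] / diag[i] for i in range(n)]
        rz_new = dot(r, z)
        beta = rz_new / rz
        rz = rz_new
        p = [z[i] + beta * p[i] for i in range(n)]
    return x
```

    The matrix-vector product and the vector updates in this loop are exactly the basic linear algebra operations that the abstract describes offloading to the GPGPU, with OpenMP parallelizing the same loops on the CPU.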

  2. Behavioral Signs of (Central) Auditory Processing Disorder in Children With Nonsyndromic Cleft Lip and/or Palate: A Parental Questionnaire Approach.

    PubMed

    Ma, Xiaoran; McPherson, Bradley; Ma, Lian

    2016-03-01

    Objective Children with nonsyndromic cleft lip and/or palate often have a high prevalence of middle ear dysfunction. However, there are also indications that they may have a higher prevalence of (central) auditory processing disorder. This study used Fisher's Auditory Problems Checklist for caregivers to determine whether children with nonsyndromic cleft lip and/or palate have potentially more auditory processing difficulties compared with craniofacially normal children. Methods Caregivers of 147 school-aged children with nonsyndromic cleft lip and/or palate were recruited for the study. This group was divided into three subgroups: cleft lip, cleft palate, and cleft lip and palate. Caregivers of 60 craniofacially normal children were recruited as a control group. Hearing health tests were conducted to evaluate peripheral hearing. Caregivers of children who passed this assessment battery completed Fisher's Auditory Problems Checklist, which contains 25 questions related to behaviors linked to (central) auditory processing disorder. Results Children with cleft palate showed the lowest scores on the Fisher's Auditory Problems Checklist questionnaire, consistent with a higher index of suspicion for (central) auditory processing disorder. There was a significant difference in the manifestation of (central) auditory processing disorder-linked behaviors between the cleft palate and the control groups. The most common behaviors reported in the nonsyndromic cleft lip and/or palate group were short attention span and reduced learning motivation, along with hearing difficulties in noise. Conclusion A higher occurrence of (central) auditory processing disorder-linked behaviors was found in children with nonsyndromic cleft lip and/or palate, particularly cleft palate. 
Auditory processing abilities should not be ignored in children with nonsyndromic cleft lip and/or palate, and it is necessary to consider assessment tests for (central) auditory processing disorder when an auditory diagnosis is made for this population.

  3. 41 CFR Appendix C to Chapter 301 - Standard Data Elements for Federal Travel [Traveler Identification

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Method employee used to purchase transportation tickets Method Indicator GTR U.S. Government Transportation Request Central Billing Account A contractor centrally billed account Government Charge Card In.../Date Fields Claimant Signature Traveler's signature, or digital representation. The signature signifies...

  4. Water quality of the Garang River, Semarang, Central Java, Indonesia based on the government regulation standard

    NASA Astrophysics Data System (ADS)

    Ujianti, R. M. D.; Anggoro, S.; Bambang, A. N.; Purwanti, F.

    2018-05-01

    The Garang watershed, composed of three main river streams, is managed by the regional water company of Semarang city, Central Java, for drinking water supply. The river is often polluted by domestic waste and industrial effluents, so its water quality should be maintained to meet the government regulation standard. This study analyzes the water quality of the Garang River using a pollution index based on the government regulation. Series data from 2010 to 2016 were derived from the Environmental and Forestry Office of the Central Java Province, and water samples were taken in August 2017 from the middle of the watershed. Water quality parameters include temperature, pH, TDS, DO, COD, phosphate, nitrate, chromium, copper, cadmium and H2S. The research indicates that the concentration of copper exceeds the standard of Government Regulation No. 82 Year 2001. The water pollution index is 1.23, which means the river is lightly polluted. The river should therefore be managed comprehensively for sustainable use, in line with the "one river, one management" concept.
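    The abstract does not give the index formula. Assessments against Government Regulation No. 82 Year 2001 commonly use the Nemerow-style pollution index of Ministerial Decree No. 115/2003, sketched below (a hedged reconstruction for illustration, not the authors' exact computation; the decree additionally rescales concentration ratios above 1 logarithmically, which is omitted here):

```python
import math

def pollution_index(concentrations, standards):
    """Nemerow-style pollution index from measured concentrations and
    their regulatory limits, listed in the same parameter order."""
    ratios = [c / s for c, s in zip(concentrations, standards)]
    r_max = max(ratios)
    r_avg = sum(ratios) / len(ratios)
    return math.sqrt((r_max ** 2 + r_avg ** 2) / 2.0)

def classify(pi):
    """Status classes used with the pollution index."""
    if pi <= 1.0:
        return "meets standard"
    if pi <= 5.0:
        return "lightly polluted"
    if pi <= 10.0:
        return "moderately polluted"
    return "heavily polluted"
```

    With one parameter above its limit and the rest below it, the index lands between 1 and 5, the "lightly polluted" band reported in the abstract.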

  5. Light neutral C P -even Higgs boson within the next-to-minimal supersymmetric standard model at the Large Hadron Electron Collider

    NASA Astrophysics Data System (ADS)

    Das, Siba Prasad; Nowakowski, Marek

    2017-09-01

    We analyze the prospects of observing the light charge-parity (CP)-even neutral Higgs boson (h1) in its decays into bb̄ quarks, in the neutral- and charged-current production processes e h1 q and ν h1 q at the upcoming Large Hadron Electron Collider (LHeC), with √s ≈ 1.296 TeV. Assuming that the intermediate Higgs boson (h2) is Standard Model (SM)-like, we study Higgs production within the framework of the next-to-minimal supersymmetric Standard Model (NMSSM). We consider the constraints from dark matter, sparticle masses, and the Higgs boson data. The signal in our analysis can be classified as three jets, with an electron (missing energy) coming from the neutral (charged) current interaction. We demand that the number of b-tagged jets in the central rapidity region be greater than or equal to two. The remaining jet is tagged in the forward regions. With this forward jet and the two b-tagged jets in the central region, we reconstruct three-jet invariant masses. Applying lower limits on these invariant masses turns out to be an essential criterion for enhancing the signal-to-background rates, with slightly different sets of kinematical selections in the two channels. We consider almost all reducible and irreducible SM background processes. We find that the non-SM-like Higgs boson, h1, would be accessible in some of the NMSSM benchmark points at approximately the 0.4σ (2.5σ) level in the e + 3j channel up to Higgs boson masses of 75 GeV, and in the ET + 3j channel could be observed at the 1.7σ (2.4σ) level up to Higgs boson masses of 88 GeV, with 100 fb⁻¹ of data in a simple cut-based (with optimization) selection. With ten times more data accumulated at the end of the LHeC run, and using optimization, one can have a 5σ discovery in the electron (missing energy) channel up to 85 (more than 90) GeV.
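    The growth from sub-σ sensitivity at 100 fb⁻¹ toward 5σ with ten times the data is the usual √luminosity scaling of a counting experiment; a sketch of the standard significance estimates (generic formulas, not the authors' analysis chain; the counts below are hypothetical):

```python
import math

def naive_significance(s, b):
    """Simple s/sqrt(b) estimate for expected signal s over background b."""
    return s / math.sqrt(b)

def asimov_significance(s, b):
    """Median discovery significance for a counting experiment
    (Asimov approximation); approaches s/sqrt(b) when s << b."""
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))
```

    Scaling the integrated luminosity by a factor L multiplies both s and b by L, so either estimate grows by √L, which is why a tenfold data set lifts the expected significance by roughly a factor of 3.2.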

  6. Free visual exploration of natural movies in schizophrenia.

    PubMed

    Silberg, Johanna Elisa; Agtzidis, Ioannis; Startsev, Mikhail; Fasshauer, Teresa; Silling, Karen; Sprenger, Andreas; Dorr, Michael; Lencer, Rebekka

    2018-01-05

    Eye tracking dysfunction (ETD) observed with standard pursuit stimuli represents a well-established biomarker for schizophrenia. How ETD may manifest during free visual exploration of real-life movies is unclear. Eye movements were recorded (EyeLink®1000) while 26 schizophrenia patients and 25 healthy age-matched controls freely explored nine uncut movies and nine pictures of real-life situations for 20 s each. Subsequently, participants were shown still shots of these scenes to decide whether they had explored them as movies or pictures. Participants were additionally assessed on standard eye-tracking tasks. Patients made smaller saccades (movies (p = 0.003), pictures (p = 0.002)) and had a stronger central bias (movies and pictures (p < 0.001)) than controls. In movies, patients' exploration behavior was less driven by image-defined, bottom-up stimulus saliency than controls' (p < 0.05). Proportions of pursuit tracking on movies differed between groups depending on the individual movie (group*movie p = 0.011, movie p < 0.001). Eye velocity on standard pursuit stimuli was reduced in patients (p = 0.029) but did not correlate with pursuit behavior on movies. Additionally, patients were less accurate at identifying still shots as coming from movies or pictures (p = 0.046). Our results suggest a restricted, centrally focused visual exploration behavior in patients, not only on pictures but also on movies of real-life scenes. While ETD observed in the laboratory cannot be directly transferred to natural viewing conditions, these alterations support a model of impaired motion information processing in patients, resulting in a reduced ability to perceive moving objects and less saliency-driven exploration behavior, presumably contributing to alterations in the perception of the natural environment.

  7. The role of the amygdala and the basal ganglia in visual processing of central vs. peripheral emotional content.

    PubMed

    Almeida, Inês; van Asselen, Marieke; Castelo-Branco, Miguel

    2013-09-01

    In human cognition, most relevant stimuli, such as faces, are processed in central vision. However, it is widely believed that recognition of relevant stimuli (e.g. threatening animal faces) at peripheral locations is also important due to their survival value. Moreover, task instructions have been shown to modulate brain regions involved in threat recognition (e.g. the amygdala). In this respect it is also controversial whether tasks requiring explicit focus on stimulus threat content vs. implicit processing differently engage primitive subcortical structures involved in emotional appraisal. Here we have addressed the role of central vs. peripheral processing in the human amygdala using animal threatening vs. non-threatening face stimuli. First, a simple animal face recognition task with threatening and non-threatening animal faces, as well as non-face control stimuli, was employed in naïve subjects (implicit task). A subsequent task was then performed with the same stimulus categories (but different stimuli) in which subjects were told to explicitly detect threat signals. We found lateralized amygdala responses both to the spatial location of stimuli and to the threatening content of faces depending on the task performed: the right amygdala showed increased responses to central compared to left presented stimuli specifically during the threat detection task, while the left amygdala was better able to discriminate threatening faces from non-facial displays during the animal face recognition task. Additionally, the right amygdala responded to faces during the threat detection task but only when centrally presented. Moreover, we have found no evidence for superior responses of the amygdala to peripheral stimuli. Importantly, we have found that striatal regions activate differentially depending on peripheral vs. central processing of threatening faces. 
Accordingly, peripheral processing of these stimuli more strongly activated the putamen, while central processing engaged mainly the caudate nucleus. We conclude that the human amygdala has a central bias for face stimuli, and that visual processing recruits different striatal regions, putaminal or caudate-based, depending on the task and on whether peripheral or central visual processing is involved. © 2013 Elsevier Ltd. All rights reserved.

  8. Distributed trace using central performance counter memory

    DOEpatents

    Satterfield, David L; Sexton, James C

    2013-10-22

    A plurality of processing cores and a central storage unit having at least one memory are connected in a daisy chain manner, forming a daisy chain ring layout on an integrated chip. At least one of the plurality of processing cores places trace data on the daisy chain connection for transmitting the trace data to the central storage unit, and the central storage unit detects the trace data and stores it in the memory co-located with the central storage unit.
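    The claimed layout, processing cores and a central storage unit chained into a ring with trace packets forwarded hop by hop until the central unit detects and stores them, can be modeled with a small software toy (an illustration of the topology only, not the patented hardware):

```python
class Core:
    """A processing core that forwards trace packets to the next hop."""
    def __init__(self, name):
        self.name = name
        self.next_hop = None  # assigned when the ring is wired up

    def send(self, packet):
        # Non-central nodes simply pass trace data along the daisy chain.
        return self.next_hop.send(packet)

class CentralStore:
    """The central storage unit: detects trace data and keeps it in memory."""
    def __init__(self):
        self.next_hop = None
        self.memory = []

    def send(self, packet):
        self.memory.append(packet)
        return len(self.memory)

# Wire a daisy chain ring: core0 -> core1 -> core2 -> central -> core0.
cores = [Core(f"core{i}") for i in range(3)]
central = CentralStore()
ring = cores + [central]
for node, nxt in zip(ring, ring[1:] + ring[:1]):
    node.next_hop = nxt

cores[0].send(("core0", "trace-event"))  # forwarded twice, then stored
```

    Every core reaches the central store over the same one-directional chain, which is the property that keeps the on-chip wiring a single ring rather than a star.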

  9. Distributed trace using central performance counter memory

    DOEpatents

    Satterfield, David L.; Sexton, James C.

    2013-01-22

    A plurality of processing cores and a central storage unit having at least one memory are connected in a daisy chain manner, forming a daisy chain ring layout on an integrated chip. At least one of the plurality of processing cores places trace data on the daisy chain connection for transmitting the trace data to the central storage unit, and the central storage unit detects the trace data and stores it in the memory co-located with the central storage unit.

  10. Early goal-directed therapy (EGDT) for severe sepsis/septic shock: which components of treatment are more difficult to implement in a community-based emergency department?

    PubMed

    O'Neill, Rory; Morales, Javier; Jule, Michael

    2012-05-01

    Early goal-directed therapy (EGDT) has been shown to reduce mortality in patients with severe sepsis/septic shock; however, implementation of this protocol in the emergency department (ED) is sometimes difficult. We evaluated our sepsis protocol to determine which EGDT elements were more difficult to implement in our community-based ED. This was a non-concurrent cohort study of adult patients entered into a sepsis protocol at a single community hospital from July 2008 to March 2009. Charts were reviewed for the following process measures: a predefined crystalloid bolus, antibiotic administration, central venous catheter insertion, central venous pressure measurement, arterial line insertion, vasopressor utilization, central venous oxygen saturation measurement, and use of a standardized order set. We also compared the individual component adherence with survival to hospital discharge. A total of 98 patients presented over a 9-month period. Measures with the highest adherence were vasopressor administration (79%; 95% confidence interval [CI] 69-89%) and antibiotic use (78%; 95% CI 68-85%). Measures with the lowest adherence included arterial line placement (42%; 95% CI 32-52%), central venous pressure measurement (27%; 95% CI 18-36%), and central venous oxygen saturation measurement (15%; 95% CI 7-23%). Fifty-seven patients survived to hospital discharge (Mortality: 33%). The only element of EGDT to demonstrate a statistically significant association with survival to hospital discharge was the crystalloid bolus (79% vs. 46%; relative risk [RR] = 1.76, 95% CI 1.11-2.58). In our community hospital, arterial line placement, central venous pressure measurement, and central venous oxygen saturation measurement were the most difficult elements of EGDT to implement. Patients who survived to hospital discharge were more likely to receive the crystalloid bolus. Copyright © 2012 Elsevier Inc. All rights reserved.
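    The per-measure confidence intervals quoted above are standard binomial intervals; a minimal sketch of the normal-approximation (Wald) form (the abstract does not report each measure's denominator, so the counts used below are hypothetical):

```python
import math

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) confidence interval for a binomial
    proportion; z = 1.96 gives the usual 95% interval."""
    p = successes / n
    half = z * math.sqrt(p * (1.0 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)
```

    For example, 77 adherent charts out of a hypothetical 98 gives a point estimate near 79% with a 95% interval of roughly 70-87%; exact (Clopper-Pearson) or Wilson intervals are preferable for small counts.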

  11. Like Driving from "The Back Seat": Teaching English as a Second Language in Commodified Curricular Terrains

    ERIC Educational Resources Information Center

    Plaisance, Michelle; Salas, Spencer; D'Amico, Mark M.

    2018-01-01

    Contemporary K-12 standards-based educational reform has emerged as a central focus of scholarship in TESOL, with robust discussions (practical and theoretical) addressing the shift from ESL as a subject matter unto itself to teaching standards-based content in English (and the standardized assessment of students' achievement across those content…

  12. Changes in the Cognitive Complexity of English Instruction: The Moderating Effects of School and Classroom Characteristics

    ERIC Educational Resources Information Center

    Polikoff, Morgan S.; Struthers, Kathryn

    2013-01-01

    Background/Context: A central aim of standards-based reform is to close achievement gaps by raising academic standards for all students. Rigorous standards coupled with aligned assessments will purportedly improve student opportunity to learn through high-quality, aligned instruction. After 10 years of No Child Left Behind (NCLB), the impact of…

  13. A Creative Approach to the Common Core Standards: The Da Vinci Curriculum

    ERIC Educational Resources Information Center

    Chaucer, Harry

    2012-01-01

    "A Creative Approach to the Common Core Standards: The Da Vinci Curriculum" challenges educators to design programs that boldly embrace the Common Core State Standards by imaginatively drawing from the genius of great men and women such as Leonardo da Vinci. A central figure in the High Renaissance, Leonardo made extraordinary contributions as a…

  14. Centralized light-source optical access network based on polarization multiplexing.

    PubMed

    Grassi, Fulvio; Mora, José; Ortega, Beatriz; Capmany, José

    2010-03-01

    This paper presents and demonstrates a centralized light source optical access network based on an optical polarization multiplexing technique. By using two optical sources in the Central Node emitting orthogonally polarized light for downstream and upstream operations, the Remote Node is kept source-free. EVM values below telecommunication standard requirements were measured experimentally when bidirectional digital signals were transmitted over 10 km of SMF employing subcarrier multiplexing in the electrical domain.
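    Error-vector magnitude (EVM), the figure of merit reported above, is the RMS distance between received and ideal constellation symbols relative to the RMS power of the ideal symbols; a minimal sketch of the generic definition (not the paper's measurement setup):

```python
import math

def evm_percent(measured, ideal):
    """RMS error-vector magnitude of received complex symbols relative
    to their ideal constellation points, as a percentage."""
    err = sum(abs(m - i) ** 2 for m, i in zip(measured, ideal))
    ref = sum(abs(i) ** 2 for i in ideal)
    return 100.0 * math.sqrt(err / ref)
```

    A uniform 0.1 offset on unit-energy QPSK symbols, for instance, yields an EVM of about 7%; telecommunication standards specify the maximum EVM a transmitter may exhibit for each modulation format.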

  15. 16 CFR 305.12 - Labeling for central air conditioners, heat pumps, and furnaces.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... pumps, and furnaces. (a) Layout. All energy labels for central air conditioners, heat pumps, and... the end of this part illustrating the basic layout. All positioning, spacing, type sizes, and line... calculated for heating Region IV for the standardized design heating requirement nearest the capacity...

  16. 16 CFR 305.12 - Labeling for central air conditioners, heat pumps, and furnaces.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... pumps, and furnaces. (a) Layout. All energy labels for central air conditioners, heat pumps, and... the end of this part illustrating the basic layout. All positioning, spacing, type sizes, and line... calculated for heating Region IV for the standardized design heating requirement nearest the capacity...

  17. 16 CFR 305.12 - Labeling for central air conditioners, heat pumps, and furnaces.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... pumps, and furnaces. (a) Layout. All energy labels for central air conditioners, heat pumps, and... end of this part illustrating the basic layout. All positioning, spacing, type sizes, and line widths... calculated for heating Region IV for the standardized design heating requirement nearest the capacity...

  18. 16 CFR 305.12 - Labeling for central air conditioners, heat pumps, and furnaces.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... pumps, and furnaces. (a) Layout. All energy labels for central air conditioners, heat pumps, and... the end of this part illustrating the basic layout. All positioning, spacing, type sizes, and line... calculated for heating Region IV for the standardized design heating requirement nearest the capacity...

  19. 75 FR 81440 - Establishment of Class E Airspace; Central City, NE

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-28

    ... this action to enhance the safety and management of Instrument Flight Rule (IFR) operations at the... airspace at Central City, NE, to accommodate new Area Navigation (RNAV) Standard Instrument Approach... this incorporation by reference action under 1 CFR part 51, subject to the annual revision of FAA Order...

  20. CORRELATIONS OF PERSONAL EXPOSURE TO PARTICLES WITH OUTDOOR AIR MEASUREMENT: A REVIEW OF RECENT STUDIES

    EPA Science Inventory

    Epidemiological studies have found a correlation between daily mortality and particle concentrations in outdoor air as measured at a central monitoring station. These studies have been the central reason for the U.S. EPA to propose new tighter particle standards. However, perso...

  1. 40 CFR 437.14 - New source performance standards (NSPS).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... GUIDELINES AND STANDARDS THE CENTRALIZED WASTE TREATMENT POINT SOURCE CATEGORY Metals Treatment and Recovery....0522 Cobalt 0.182 0.0703 Copper 0.659 0.216 Lead 1.32 0.283 Mercury 0.000641 0.000246 Nickel 0.794 0...

  2. Implementation of a Goal-Based Systems Engineering Process Using the Systems Modeling Language (SysML)

    NASA Technical Reports Server (NTRS)

    Patterson, Jonathan D.; Breckenridge, Jonathan T.; Johnson, Stephen B.

    2013-01-01

    Building upon the purpose, theoretical approach, and use of a Goal-Function Tree (GFT) presented by Dr. Stephen B. Johnson in a related Infotech 2013 ISHM abstract titled "Goal-Function Tree Modeling for Systems Engineering and Fault Management", this paper describes the core framework used to implement the GFT-based systems engineering process using the Systems Modeling Language (SysML). These two papers are ideally presented together in the same Infotech session. Statement of problem: SysML, as a tool, is currently not capable of implementing the theoretical approach described in the "Goal-Function Tree Modeling for Systems Engineering and Fault Management" paper cited above. More generally, SysML's current capabilities to model functional decompositions in the rigorous manner required by the GFT approach are limited. The GFT is a new Model-Based Systems Engineering (MBSE) approach to the development of goals, requirements, and functions, and their linkage to design. Because SysML is a growing standard for systems engineering, it is important to develop methods to implement GFT in SysML. Proposed method of solution: many of the central concepts of the SysML language are needed to implement a GFT for large complex systems. In the implementation of those central concepts, the following will be described in detail: changes to the nominal SysML process, model view definitions and examples, diagram definitions and examples, and detailed SysML construct and stereotype definitions.

  3. Central localization of plasticity involved in appetitive conditioning in Lymnaea

    PubMed Central

    Straub, Volko A.; Styles, Benjamin J.; Ireland, Julie S.; O'Shea, Michael; Benjamin, Paul R.

    2004-01-01

    Learning to associate a conditioned (CS) and unconditioned stimulus (US) results in changes in the processing of CS information. Here, we address directly the question whether chemical appetitive conditioning of Lymnaea feeding behavior involves changes in the peripheral and/or central processing of the CS by using extracellular recording techniques to monitor neuronal activity at two stages of the sensory processing pathway. Our data show that appetitive conditioning does not affect significantly the overall CS response of afferent nerves connecting chemosensory structures in the lips and tentacles to the central nervous system (CNS). In contrast, neuronal output from the cerebral ganglia, which represent the first central processing stage for chemosensory information, is enhanced significantly in response to the CS after appetitive conditioning. This demonstrates that chemical appetitive conditioning in Lymnaea affects the central, but not the peripheral processing of chemosensory information. It also identifies the cerebral ganglia of Lymnaea as an important site for neuronal plasticity and forms the basis for detailed cellular studies of neuronal plasticity. PMID:15537733

  4. D-MSR: a distributed network management scheme for real-time monitoring and process control applications in wireless industrial automation.

    PubMed

    Zand, Pouria; Dilo, Arta; Havinga, Paul

    2013-06-27

    Current wireless technologies for industrial applications, such as WirelessHART and ISA100.11a, use a centralized management approach where a central network manager handles the requirements of the static network. However, such a centralized approach has several drawbacks. For example, it cannot cope with dynamicity/disturbance in large-scale networks in a real-time manner and it incurs a high communication overhead and latency for exchanging management traffic. In this paper, we therefore propose a distributed network management scheme, D-MSR. It enables the network devices to join the network, schedule their communications, establish end-to-end connections by reserving the communication resources for addressing real-time requirements, and cope with network dynamicity (e.g., node/edge failures) in a distributed manner. To our knowledge, this is the first distributed management scheme based on the IEEE 802.15.4e standard, which guides the nodes in different phases from joining until publishing their sensor data in the network. We demonstrate via simulation that D-MSR can address real-time and reliable communication as well as the high throughput requirements of industrial automation wireless networks, while also achieving higher efficiency in network management than WirelessHART, in terms of delay and overhead.

  5. Central precocious puberty: from physiopathological mechanisms to treatment.

    PubMed

    Chirico, V; Lacquaniti, A; Salpietro, V; Buemi, M; Salpietro, C; Arrigo, T

    2014-01-01

    Puberty is a complex, coordinated biological process with multiple levels of regulation. The timing of puberty varies greatly in children and is influenced by environmental, endocrine and genetic factors. Precocious puberty (PP) is an important issue, affecting between 1 in 5,000 and 1 in 10,000 children. The physiopathological mechanism is still unknown. From an etiological point of view, PP may be subdivided into gonadotropin-releasing hormone (GnRH)-dependent and -independent causes. GnRH-dependent PP, often called central precocious puberty (CPP), is based on hypothalamic-pituitary-gonadal axis activation associated with progressive pubertal development, accelerated growth rate and advancement of skeletal age. Conversely, peripheral precocious puberty (PPP) is related to sex steroid exposure, independently of hypothalamic-pituitary-gonadal (HPG) axis activation. Kisspeptins play a central role in the modulation of GnRH secretion, together with peripheral factors that influence the timing of puberty, such as adipokines and endocrine-disrupting chemicals. Moreover, PP could be related to genetic disorders involving pivotal genes of the HPG axis. The standard test used to verify HPG activity is the gonadotropin response to administered GnRH analogs. We describe the physiopathological mechanisms of PP and its clinical implications, analysing the diagnostic flow-chart and new potential biomarkers that could reveal PP. An update of the current literature was also carried out regarding recent developments in treatment.

  6. Variables associated with peripherally inserted central catheter related infection in high risk newborn infants 1

    PubMed Central

    Rangel, Uesliz Vianna; Gomes, Saint Clair dos Santos; Costa, Ana Maria Aranha Magalhães; Moreira, Maria Elisabeth Lopes

    2014-01-01

    OBJECTIVE: to relate the variables from a surveillance form for intravenous devices in high risk newborn infants with peripherally inserted central catheter related infection. METHODOLOGY: approximately 15 variables were studied for association with peripherally inserted central catheter related infection, defined by blood culture results. The variables analyzed were obtained from the surveillance forms used with intravenous devices, attached to the medical records of newborn infants weighing between 500 and 1,499 g. Statistical association was assessed using the Chi-squared and Student t tests. The study was approved by the Research Ethics Committee of the Instituto Fernandes Figueira under process N. 140.703/12. RESULTS: 63 medical records were analyzed. The infection rate observed was 25.4%. Of the variables analyzed, only three had a statistically significant relationship with blood culture results - the use of drugs capable of inhibiting acid secretion, post-natal steroid use, and undergoing more than one invasive procedure (p-values of 0.0141, 0.0472 and 0.0277, respectively). CONCLUSION: the absence of significance of the other variables on the form may be related to the quality of the records and to the absence of standardization. It is recommended that the teams be encouraged to adhere to the protocol and fill out the form. PMID:25493681
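    The Chi-squared test named in the methodology has a closed form for a 2×2 contingency table (exposure vs. infection); a sketch with hypothetical counts, not the study's data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 contingency table
    [[a, b], [c, d]], without continuity correction."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den
```

    The statistic is compared against the chi-square distribution with one degree of freedom (critical value 3.84 at p = 0.05); with the small cell counts typical of a 63-record series, Fisher's exact test is the usual fallback.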

  7. D-MSR: A Distributed Network Management Scheme for Real-Time Monitoring and Process Control Applications in Wireless Industrial Automation

    PubMed Central

    Zand, Pouria; Dilo, Arta; Havinga, Paul

    2013-01-01

    Current wireless technologies for industrial applications, such as WirelessHART and ISA100.11a, use a centralized management approach where a central network manager handles the requirements of the static network. However, such a centralized approach has several drawbacks. For example, it cannot cope with dynamicity/disturbance in large-scale networks in a real-time manner and it incurs a high communication overhead and latency for exchanging management traffic. In this paper, we therefore propose a distributed network management scheme, D-MSR. It enables the network devices to join the network, schedule their communications, establish end-to-end connections by reserving the communication resources for addressing real-time requirements, and cope with network dynamicity (e.g., node/edge failures) in a distributed manner. According to our knowledge, this is the first distributed management scheme based on IEEE 802.15.4e standard, which guides the nodes in different phases from joining until publishing their sensor data in the network. We demonstrate via simulation that D-MSR can address real-time and reliable communication as well as the high throughput requirements of industrial automation wireless networks, while also achieving higher efficiency in network management than WirelessHART, in terms of delay and overhead. PMID:23807687

  8. Finger tapping and pre-attentive sensorimotor timing in adults with ADHD.

    PubMed

    Hove, Michael J; Gravel, Nickolas; Spencer, Rebecca M C; Valera, Eve M

    2017-12-01

    Sensorimotor timing deficits are considered central to attention-deficit/hyperactivity disorder (ADHD). However, the tasks establishing timing impairments often involve interconnected processes, including low-level sensorimotor timing and higher level executive processes such as attention. Thus, the source of timing deficits in ADHD remains unclear. Low-level sensorimotor timing can be isolated from higher level processes in a finger-tapping task that examines the motor response to unexpected shifts of metronome onsets. In this study, adults with ADHD and ADHD-like symptoms (n = 25) and controls (n = 26) performed two finger-tapping tasks. The first assessed tapping variability in a standard tapping task (metronome-paced and unpaced). In the other task, participants tapped along with a metronome that contained unexpected shifts (±15, 50 ms); the timing adjustment on the tap following the shift captures pre-attentive sensorimotor timing (i.e., phase correction) and thus should be free of potential higher order confounds (e.g., attention). In the standard tapping task, as expected, the ADHD group had higher timing variability in both paced and unpaced tapping. However, in the pre-attentive task, performance did not differ between the ADHD and control groups. Together, results suggest that low-level sensorimotor timing and phase correction are largely preserved in ADHD and that some timing impairments observed in ADHD may stem from higher level factors (such as sustained attention).
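    Phase correction of the kind probed by the shifted-metronome task is often quantified with a first-order linear correction model, in which the tap-to-tone asynchrony introduced by a shift decays geometrically at the correction gain alpha (our modeling assumption for illustration; the paper's exact analysis may differ):

```python
def phase_correction_series(shift_ms, alpha, n_taps):
    """Asynchronies after an unexpected metronome onset shift under the
    first-order linear phase-correction model:
        a[k+1] = (1 - alpha) * a[k],  with a[0] = -shift_ms.
    alpha is the fraction of the asynchrony corrected on each tap."""
    a = [-float(shift_ms)]
    for _ in range(n_taps - 1):
        a.append((1.0 - alpha) * a[-1])
    return a
```

    The adjustment on the first tap after the shift, alpha times the shift, is the pre-attentive phase-correction response the task isolates; a gain near 1 means the asynchrony is absorbed almost immediately.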

  9. Pulse-rate discrimination by cochlear-implant and normal-hearing listeners with and without binaural cues

    PubMed Central

    Carlyon, Robert P.; Long, Christopher J.; Deeks, John M.

    2008-01-01

    Experiment 1 measured rate discrimination of electric pulse trains by bilateral cochlear implant (CI) users, for standard rates of 100, 200, and 300 pps. In the diotic condition the pulses were presented simultaneously to the two ears. Consistent with previous results with unilateral stimulation, performance deteriorated at higher standard rates. In the signal interval of each trial in the dichotic condition, the standard rate was presented to the left ear and the (higher) signal rate was presented to the right ear; the non-signal intervals were the same as in the diotic condition. For some listeners, performance was better in the dichotic condition than in the diotic condition at standard rates of 100 and 200 pps, but not at 300 pps. It is concluded that the deterioration in rate discrimination observed for CI users at high rates cannot be alleviated by the introduction of a binaural cue, and is unlikely to be limited solely by central pitch processes. Experiment 2 was an analogous experiment in which 300-pps acoustic pulse trains were bandpass filtered (3900-5400 Hz) and presented in a noise background to normal-hearing listeners. Unlike in experiment 1, performance was better in the dichotic condition than in the diotic condition. PMID:18397032

  10. Coherent Frequency Reference System for the NASA Deep Space Network

    NASA Technical Reports Server (NTRS)

    Tucker, Blake C.; Lauf, John E.; Hamell, Robert L.; Gonzaler, Jorge, Jr.; Diener, William A.; Tjoelker, Robert L.

    2010-01-01

    The NASA Deep Space Network (DSN) requires state-of-the-art frequency references that are derived and distributed from very stable atomic frequency standards. A new Frequency Reference System (FRS) and Frequency Reference Distribution System (FRD) have been developed, which together replace the previous Coherent Reference Generator System (CRG). The FRS and FRD each provide new capabilities that significantly improve operability and reliability. The FRS allows for selection and switching between frequency standards, a flywheel capability (to avoid interruptions when switching frequency standards), and a frequency synthesis system (to generate standardized 5-, 10-, and 100-MHz reference signals). The FRS is powered by redundant, specially filtered, and sustainable power systems and includes a monitor and control capability for station operations to interact and control the frequency-standard selection process. The FRD receives the standardized 5-, 10-, and 100-MHz reference signals and distributes signals to distribution amplifiers in a fan-out fashion to dozens of DSN users that require the highly stable reference signals. The FRD is also powered by redundant, specially filtered, and sustainable power systems. The new DSN Frequency Distribution System, which consists of the FRS and FRD systems described here, is central to all operational activities of the NASA DSN. The frequency generation and distribution system provides ultra-stable, coherent, and very low phase-noise references at 5, 10, and 100 MHz to between 60 and 100 separate users at each Deep Space Communications Complex.

  11. Arms Transfers to Venezuela: A Comparative and Critical Analysis of the Acquisition Process (1980-1996).

    DTIC Science & Technology

    1999-03-01

    Budget (Oficina Central de Presupuesto [OCEPRE]), which is the presidential agency with overall responsibility to formulate the national budget...Budget (Oficina Central de Presupuesto, OCEPRE), and they receive special treatment in the Venezuelan budgetary process. The OCEPRE is the...the Central Office of Budget (Oficina Central de Presupuesto, OCEPRE). This occurs when funds for weapons acquisitions come from the ordinary budget

  12. An Analog Macroscopic Technique for Studying Molecular Hydrodynamic Processes in Dense Gases and Liquids.

    PubMed

    Dahlberg, Jerry; Tkacik, Peter T; Mullany, Brigid; Fleischhauer, Eric; Shahinian, Hossein; Azimi, Farzad; Navare, Jayesh; Owen, Spencer; Bisel, Tucker; Martin, Tony; Sholar, Jodie; Keanini, Russell G

    2017-12-04

    An analog, macroscopic method for studying molecular-scale hydrodynamic processes in dense gases and liquids is described. The technique applies a standard fluid dynamic diagnostic, particle image velocimetry (PIV), to measure: i) velocities of individual particles (grains), extant on short, grain-collision time-scales, ii) velocities of systems of particles, on both short collision-time- and long, continuum-flow-time-scales, iii) collective hydrodynamic modes known to exist in dense molecular fluids, and iv) short- and long-time-scale velocity autocorrelation functions, central to understanding particle-scale dynamics in strongly interacting, dense fluid systems. The basic system is composed of an imaging system, light source, vibrational sensors, vibrational system with a known medium, and PIV and analysis software. Required experimental measurements and an outline of the theoretical tools needed when using the analog technique to study molecular-scale hydrodynamic processes are highlighted. The proposed technique provides a relatively straightforward alternative to photonic and neutron beam scattering methods traditionally used in molecular hydrodynamic studies.
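
    A minimal sketch of item iv), the velocity autocorrelation function, as it might be computed from PIV-derived grain velocities. The one-component velocity series below is invented for illustration; the function and normalization are standard, not taken from the paper.

```python
# Normalized velocity autocorrelation function (VACF):
#   C(t) = <v(0) * v(t)> / <v(0) * v(0)>,
# averaged over all time origins of a uniformly sampled series.

def vacf(velocities, max_lag):
    """VACF of a single velocity component, for lags 0..max_lag."""
    n = len(velocities)
    c0 = sum(v * v for v in velocities) / n  # mean squared velocity
    out = []
    for lag in range(max_lag + 1):
        acc = sum(velocities[i] * velocities[i + lag] for i in range(n - lag))
        out.append(acc / ((n - lag) * c0))
    return out

vs = [1.0, 0.5, -0.2, -0.6, -0.1, 0.4]  # toy grain velocities
print(vacf(vs, 2))  # C(0) is 1.0 by construction
```

The decay (and possible negative lobe) of C(t) is what distinguishes the short collision-time-scale dynamics from the long continuum-flow regime discussed above.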

  13. Brain responses to sound intensity changes dissociate depressed participants and healthy controls.

    PubMed

    Ruohonen, Elisa M; Astikainen, Piia

    2017-07-01

    Depression is associated with bias in emotional information processing, but less is known about the processing of neutral sensory stimuli. Of particular interest is the processing of sound intensity, which is suggested to indicate central serotonergic function. We tested whether event-related brain potentials (ERPs) to occasional changes in sound intensity can dissociate first-episode depressed, recurrent depressed, and healthy control participants. The first-episode depressed group showed larger N1 amplitude to deviant sounds compared to the recurrent depression group and control participants. In addition, both depression groups, but not the control group, showed larger N1 amplitude to deviant than to standard sounds. Whether these manifestations of sensory over-excitability in depression are directly related to serotonergic neurotransmission requires further research. The method based on ERPs to sound intensity change is a fast and low-cost way to objectively measure brain activation and holds promise as a future diagnostic tool. Copyright © 2017 Elsevier B.V. All rights reserved.
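
    The deviant-vs-standard N1 comparison rests on simple epoch averaging. A hedged sketch with invented two-epoch, three-sample data (a real ERP pipeline would filter, baseline-correct, and average hundreds of epochs per condition):

```python
# Average epochs time-locked to standard vs deviant sounds, then take
# the N1 amplitude as the most negative voltage in a sample window.
# Data and window are toy values for illustration only.

def average_erp(epochs):
    """Pointwise mean across epochs (lists of equal length)."""
    n = len(epochs)
    return [sum(col) / n for col in zip(*epochs)]

def n1_amplitude(erp, window):
    """Most negative value inside the (start, end) sample window."""
    start, end = window
    return min(erp[start:end])

standards = [[0.0, -1.0, 0.5], [0.0, -1.2, 0.3]]
deviants  = [[0.0, -2.0, 0.4], [0.0, -2.6, 0.2]]
std_erp, dev_erp = average_erp(standards), average_erp(deviants)
# The deviant average is more negative, i.e. the larger N1, mirroring
# the deviant-vs-standard effect reported for the depression groups.
print(n1_amplitude(std_erp, (0, 3)), n1_amplitude(dev_erp, (0, 3)))
```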

  14. Autophagy proteins are not universally required for phagosome maturation.

    PubMed

    Cemma, Marija; Grinstein, Sergio; Brumell, John H

    2016-09-01

    Phagocytosis plays a central role in immunity and tissue homeostasis. After internalization of cargo into single-membrane phagosomes, these compartments undergo a maturation sequence that terminates in lysosome fusion and cargo degradation. Components of the autophagy pathway have recently been linked to phagosome maturation in a process called LC3-associated phagocytosis (LAP). In this process, autophagy machinery is thought to conjugate LC3 directly onto the phagosomal membrane to promote lysosome fusion. However, a recent study has suggested that ATG proteins may in fact impair phagosome maturation to promote antigen presentation. Here, we examined the impact of ATG proteins on phagosome maturation in murine cells using FCGR2A/FcγR-dependent phagocytosis as a model. We show that phagosome maturation is not affected in Atg5-deficient mouse embryonic fibroblasts, or in Atg5- or Atg7-deficient bone marrow-derived macrophages using standard assays of phagosome maturation. We propose that ATG proteins may be required for phagosome maturation under some conditions, but are not universally required for this process.

  15. Improving the Procurement of Information Technology Commodities

    DTIC Science & Technology

    1994-04-12

    and the federal government. Apply Electronic Data Interchange (EDI) to order processing at AVC. Implementing EDI at AVC and throughout the Air Force...ordering process by approximately 2 weeks. 4. Centralize the Central Order Processing Offices. By consolidating AVC and SSCs order processing responsibilities...Implementation...5. ORDER PROCESSING WITH EDI

  16. Central Auditory Processing Disorders: Is It a Meaningful Construct or a Twentieth Century Unicorn?

    ERIC Educational Resources Information Center

    Kamhi, Alan G.; Beasley, Daniel S.

    1985-01-01

    The article demonstrates how professional and theoretical perspectives (including psycholinguistic, behaviorist, and information-processing perspectives) significantly influence the manner in which central auditory processing is viewed, assessed, and remediated. (Author/CL)

  17. Expect the unexpected: a paradoxical effect of cue validity on the orienting of attention.

    PubMed

    Jollie, Ashley; Ivanoff, Jason; Webb, Nicole E; Jamieson, Andrew S

    2016-10-01

    Predictive central cues generate location-based expectancies and voluntary shifts of attention, and they facilitate target processing. Often, location-based expectancies and voluntary attention are confounded in cueing tasks. Here we vary the predictability of central cues to determine whether they can evoke the inhibition of target processing in three go/no-go experiments. In the first experiment, the central cue was uninformative and did not predict the target's location. Importantly, these cues did not seem to affect target processing. In the second experiment, the central cue indicated the most or the least likely location of the target. Surprisingly, both types of cues facilitated target processing at the cued location. In the third experiment, the central cue predicted the most likely location of a no-go target, but it did not provide relevant information pertaining to the location of the go target. Again, the central cue facilitated processing of the go target. These results suggest that efforts to strategically allocate inhibition may be thwarted by the paradoxical monitoring of the cued location. The current findings highlight the need to further explore the relationship between location-based expectancies and spatial attention in cueing tasks.

  18. Data Acquisition System for Multi-Frequency Radar Flight Operations Preparation

    NASA Technical Reports Server (NTRS)

    Leachman, Jonathan

    2010-01-01

    A three-channel data acquisition system was developed for the NASA Multi-Frequency Radar (MFR) system. The system is based on a commercial-off-the-shelf (COTS) industrial PC (personal computer) and two dual-channel 14-bit digital receiver cards. The decimated complex envelope representations of the three radar signals are passed to the host PC via the PCI bus, and then processed in parallel by multiple cores of the PC CPU (central processing unit). The innovation is this parallelization of the radar data processing using multiple cores of a standard COTS multi-core CPU. The data processing portion of the data acquisition software was built using autonomous program modules or threads, which can run simultaneously on different cores. A master program module calculates the optimal number of processing threads, launches them, and continually supplies each with data. The benefit of this new parallel software architecture is that COTS PCs can be used to implement increasingly complex processing algorithms on an increasing number of radar range gates and data rates. As new PCs become available with higher numbers of CPU cores, the software will automatically utilize the additional computational capacity.
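
    The master/worker pattern described above can be sketched as follows. This is a hedged illustration, not the MFR code: `process_block` is a stand-in for the per-range-gate radar processing, and a thread pool stands in for the autonomous program modules (CPU-bound work in Python would need processes or a compiled language to actually occupy multiple cores).

```python
# Master module: size the worker pool to the available CPU cores,
# launch the workers, and keep each one supplied with data blocks.
import os
from concurrent.futures import ThreadPoolExecutor

def process_block(samples):
    # Stand-in per-block work, e.g. average power over one data block.
    return sum(s * s for s in samples) / len(samples)

def run_pipeline(blocks):
    # Compute the "optimal" worker count: no more workers than cores,
    # and no more than there are blocks to process.
    workers = min(os.cpu_count() or 1, len(blocks)) or 1
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves block order while workers run concurrently.
        return list(pool.map(process_block, blocks))

blocks = [[float(i + j) for j in range(4)] for i in range(8)]
print(run_pipeline(blocks))
```

As in the system described, the same code automatically scales to machines with more cores, since the pool size is derived from `os.cpu_count()` at run time.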

  19. Central load reduces peripheral processing: Evidence from incidental memory of background speech.

    PubMed

    Halin, Niklas; Marsh, John E; Sörqvist, Patrik

    2015-12-01

    Is there a trade-off between central (working memory) load and peripheral (perceptual) processing? To address this question, participants were requested to undertake an n-back task in one of two levels of central/cognitive load (i.e., 1-back or 2-back) in the presence of a to-be-ignored story presented via headphones. Participants were told to ignore the background story, but they were given a surprise memory test of what had been said in the background story, immediately after the n-back task was completed. Memory was poorer in the high central load (2-back) condition in comparison with the low central load (1-back) condition. Hence, when people compensate for higher central load, by increasing attentional engagement, peripheral processing is constrained. Moreover, participants with high working memory capacity (WMC) - with a superior ability for attentional engagement - remembered less of the background story, but only in the low central load condition. Taken together, peripheral processing - as indexed by incidental memory of background speech - is constrained when task engagement is high. © 2015 The Authors. Scandinavian Journal of Psychology published by Scandinavian Psychological Associations and John Wiley & Sons Ltd.
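
    For concreteness, the central-load manipulation above (1-back vs 2-back) reduces to detecting whether the current item matches the one n positions back. A minimal sketch with our own toy letter stream:

```python
# n-back target detection: an item is a target when it equals the item
# presented n positions earlier. Holding the last n items in mind is
# what makes 2-back a higher central/working-memory load than 1-back.

def nback_targets(stream, n):
    """Indices whose item matches the item n positions earlier."""
    return [i for i in range(n, len(stream)) if stream[i] == stream[i - n]]

letters = list("ABABCAAC")
print(nback_targets(letters, 1))  # -> [6]
print(nback_targets(letters, 2))  # -> [2, 3]
```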

  20. 7. 'Tunnel No 14, Concrete Lining,' Southern Pacific Standard SingleTrack ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. 'Tunnel No 14, Concrete Lining,' Southern Pacific Standard Single-Track Tunnel, ca. 1909. Under current numbering, this is now Tunnel 29 (HAER No. CA-205). - Central Pacific Transcontinental Railroad, Sacramento to Nevada state line, Sacramento, Sacramento County, CA

  1. 7 CFR 29.6037 - Stem.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Stem. 29.6037 Section 29.6037 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Standards Definitions § 29.6037 Stem. The midrib or large central vein of a tobacco leaf. ...

  2. 7 CFR 29.6037 - Stem.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Stem. 29.6037 Section 29.6037 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Standards Definitions § 29.6037 Stem. The midrib or large central vein of a tobacco leaf. ...

  3. 7 CFR 29.6037 - Stem.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Stem. 29.6037 Section 29.6037 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Standards Definitions § 29.6037 Stem. The midrib or large central vein of a tobacco leaf. ...

  4. 7 CFR 29.6037 - Stem.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Stem. 29.6037 Section 29.6037 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Standards Definitions § 29.6037 Stem. The midrib or large central vein of a tobacco leaf. ...

  5. 7 CFR 29.6037 - Stem.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Stem. 29.6037 Section 29.6037 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Standards Definitions § 29.6037 Stem. The midrib or large central vein of a tobacco leaf. ...

  6. The Educationally Challenged American School District.

    ERIC Educational Resources Information Center

    Clinchy, Evans

    1998-01-01

    Two national reform movements--one focused on creating small, autonomous schools, the other fixated on a standardization agenda--are basically in conflict. The standards movement is touting the traditional, top-down, centralized, bureaucratic system modeled after Frederick Taylor and his efficiency experts. Progressive, decentralized initiatives…

  7. Chemistry in Past and New Science Frameworks and Standards: Gains, Losses, and Missed Opportunities

    ERIC Educational Resources Information Center

    Talanquer, Vicente; Sevian, Hannah

    2014-01-01

    Science education frameworks and standards play a central role in the development of curricula and assessments, as well as in guiding teaching practices in grades K-12. Recently, the National Research Council published a new Framework for K-12 Science Education that has guided the development of the Next Generation Science Standards. In this…

  8. Strategies for success in creating an effective multihospital health-system pharmacy and therapeutics committee.

    PubMed

    Leonard, Mandy C; Thyagarajan, Rema; Wilson, Amy J; Sekeres, Mikkael A

    2018-04-01

    Lessons learned from the creation of a multihospital health-system formulary management and pharmacy and therapeutics (P&T) committee are described. A health system can create and implement a multihospital system formulary and P&T committee to provide evidence-based medications for ideal healthcare. The formulary and P&T process should be multidisciplinary and include adequate representation from system hospitals. The aim of a system formulary and P&T committee is standardization; however, the system should allow flexibility for differences. Key points for a successful multihospital system formulary and P&T committee are patience, collaboration, resilience, and communication. When establishing a multihospital health-system formulary and P&T committee, the needs of individual hospitals are crucial. A designated member of the pharmacy department needs to centrally coordinate and manage formulary requests, medication reviews and monographs, meeting agendas and minutes, and a summary of decisions for implementation. It is imperative to create a timeline for formulary reviews to set expectations, as well as a process for formulary appeals. Collaboration across the various hospitals is critical for successful formulary standardization. When implementing a health-system P&T committee or standardizing a formulary system, it is important to be patient and give local sites time to make practice changes. Evidence-based data and rationale must be provided to all sites to support formulary changes. Finally, there must be multidisciplinary collaboration. There are several options for formulary structures and P&T committees in a health system. Potential strengths and barriers should be evaluated before selecting a formulary management process. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  9. [Assessment of the air quality improvement of cleaning and disinfection on central air-conditioning ventilation systems].

    PubMed

    Liu, Hongliang; Zhang, Lei; Feng, Lihong; Wang, Fei; Xue, Zhiming

    2009-09-01

    To assess the effect of cleaning and disinfection of central air-conditioning ventilation systems on air quality, 102 air-conditioning ventilation systems in 46 public facilities were sampled and investigated based on the Hygienic assessment criterion of cleaning and disinfection of public central air-conditioning systems. Median dust volume decreased from 41.8 g/m2 to 0.4 g/m2, and the percentage of pipes meeting the national standard for dust increased from 17.3% (13/60) to 100% (62/62). In the dust, the median aerobic bacterial count decreased from 14 cfu/cm2 to 1 cfu/cm2, and the median aerobic fungus count decreased from 10 cfu/cm2 to 0 cfu/cm2. The percentage of pipes with bacterial and fungus counts meeting the national standard increased from 92.4% (171/185) and 82.2% (152/185) to 99.4% (165/166) and 100% (166/166), respectively. In the ventilation air, the median aerobic bacterial count decreased from 756 cfu/m3 to 229 cfu/m3, and the median aerobic fungus count decreased from 382 cfu/m3 to 120 cfu/m3. The percentage of pipes meeting the national standard for ventilation air increased from 33.3% (81/243) and 62.1% (151/243) to 79.8% (292/366) and 87.7% (242/276), respectively. However, PM10 rose from 0.060 mg/m3 to 0.068 mg/m3, while the percentage of pipes meeting the national standard for PM10 increased from 74.2% (13/60) to 90.2% (46/51). The cleaning and disinfection of central air-conditioning ventilation systems thus had a beneficial effect on air quality.

  10. Jedburgh Operations: Support to the French Resistance in Central France from June through September 1944

    DTIC Science & Technology

    1991-06-07

    Jedburgh Operations: Support to the French Resistance in Central France From June through September 1944...Europe, beginning with operation OVERLORD in France. This study explains the origins, purpose, and missions/tasks of the Jedburgh project. The focus of...JEDBURGH OPERATIONS: SUPPORT TO THE FRENCH RESISTANCE IN CENTRAL FRANCE FROM JUNE THROUGH SEPTEMBER

  11. Polarization of gamma-ray burst afterglows in the synchrotron self-Compton process from a highly relativistic jet

    NASA Astrophysics Data System (ADS)

    Lin, Hai-Nan; Li, Xin; Chang, Zhe

    2017-04-01

    Linear polarization has been observed in both the prompt phase and afterglow of some bright gamma-ray bursts (GRBs). Polarization in the prompt phase spans a wide range, and may be as high as ≳ 50%. In the afterglow phase, however, it is usually below 10%. According to the standard fireball model, GRBs are produced by synchrotron radiation and the Compton scattering process in a highly relativistic jet ejected from the central engine. It is widely accepted that prompt emissions occur in the internal shock when shells with different velocities collide with each other, and the magnetic field advected by the jet from the central engine can be ordered on a large scale. On the other hand, afterglows are often assumed to occur in the external shock when the jet collides with the interstellar medium, and the magnetic field produced by the shock through, for example, Weibel instability, is possibly random. In this paper, we calculate the polarization properties of the synchrotron self-Compton process from a highly relativistic jet, in which the magnetic field is randomly distributed in the shock plane. We also consider the generalized situation where a uniform magnetic component perpendicular to the shock plane is superposed on the random magnetic component. We show that it is difficult for the polarization to be larger than 10% if the seed electrons are isotropic in the jet frame. This may account for the observed upper limit of polarization in the afterglow phase of GRBs. In addition, if the random and uniform magnetic components decay with time at different speeds, then the polarization angle may change by 90° during the temporal evolution. Supported by Fundamental Research Funds for the Central Universities (106112016CDJCR301206), National Natural Science Fund of China (11375203, 11603005), and Open Project Program of State Key Laboratory of Theoretical Physics, Institute of Theoretical Physics, Chinese Academy of Sciences, China (Y5KF181CJ1)

  12. A clinical study to evaluate the correlation between maxillary central incisor tooth form and face form in an Indian population.

    PubMed

    Koralakunte, Pavankumar R; Budihal, Dhanyakumar H

    2012-09-01

    A study was performed to examine the correlation between maxillary central incisor tooth form and face form in males and females in an Indian population. The selection of prosthetic teeth for edentulous patients is a primary issue in denture esthetics, especially in the case of maxillary central incisors, which are the most prominent teeth in the arch. Two hundred dental students of Indian origin comprising 79 males and 121 females aged 18-28 years studying at Bapuji Dental College and Hospital were randomly selected as the study subjects. A standardized photographic procedure was used to obtain images of the face and the maxillary central incisors. The outline forms of the face and the maxillary right central incisor tooth were determined using a standardized method. The outline forms obtained were used to classify both face form and tooth form on the basis of visual and William's methods. The means were considered after evaluation by five prosthodontists, and the results were tabulated. Statistical analysis was performed using the chi-squared test for association and Z-test for equality of proportions. A correlation greater than 50% was observed between tooth form and face form by the visual method, compared with one of 31.5% by William's method. There was no highly defined correlation between maxillary central incisor tooth form and face form among the male and female Indian subjects studied.
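
    The chi-squared test of association named above can be sketched from first principles. The contingency table below is invented for illustration (face forms by whether the tooth form matched); it is not the study's data.

```python
# Pearson chi-squared statistic for an r x c contingency table:
# expected[i][j] = row_total[i] * col_total[j] / grand_total,
# statistic = sum over cells of (observed - expected)^2 / expected.

def chi_squared(table):
    """Pearson chi-squared statistic for an r x c count table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: face forms (square, tapering, ovoid); columns: tooth form
# matched / did not match. Illustrative counts only.
table = [[30, 10], [25, 15], [28, 12]]
print(round(chi_squared(table), 3))  # -> 1.485
```

The statistic would then be compared against a chi-squared distribution with (rows - 1) x (columns - 1) degrees of freedom to decide whether tooth form and face form are associated.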

  13. The neural correlates of beauty comparison

    PubMed Central

    Mussweiler, Thomas; Mullins, Paul; Linden, David E. J.

    2014-01-01

    Beauty is in the eye of the beholder. How attractive someone is perceived to be depends on the individual or cultural standards to which this person is compared. But although comparisons play a central role in the way people judge the appearance of others, the brain processes underlying attractiveness comparisons remain unknown. In the present experiment, we tested the hypothesis that attractiveness comparisons rely on the same cognitive and neural mechanisms as comparisons of simple nonsocial magnitudes such as size. We recorded brain activity with functional magnetic resonance imaging (fMRI) while participants compared the beauty or height of two women or two dogs. Our data support the hypothesis of a common process underlying these different types of comparisons. First, we demonstrate that the distance effect characteristic of nonsocial comparisons also holds for attractiveness comparisons. Behavioral results indicated, for all our comparisons, longer response times for near than far distances. Second, the neural correlates of these distance effects overlapped in a frontoparietal network known for its involvement in processing simple nonsocial quantities. These results provide evidence for overlapping processes in the comparison of physical attractiveness and nonsocial magnitudes. PMID:23508477

  14. The neural correlates of beauty comparison.

    PubMed

    Kedia, Gayannée; Mussweiler, Thomas; Mullins, Paul; Linden, David E J

    2014-05-01

    Beauty is in the eye of the beholder. How attractive someone is perceived to be depends on the individual or cultural standards to which this person is compared. But although comparisons play a central role in the way people judge the appearance of others, the brain processes underlying attractiveness comparisons remain unknown. In the present experiment, we tested the hypothesis that attractiveness comparisons rely on the same cognitive and neural mechanisms as comparisons of simple nonsocial magnitudes such as size. We recorded brain activity with functional magnetic resonance imaging (fMRI) while participants compared the beauty or height of two women or two dogs. Our data support the hypothesis of a common process underlying these different types of comparisons. First, we demonstrate that the distance effect characteristic of nonsocial comparisons also holds for attractiveness comparisons. Behavioral results indicated, for all our comparisons, longer response times for near than far distances. Second, the neural correlates of these distance effects overlapped in a frontoparietal network known for its involvement in processing simple nonsocial quantities. These results provide evidence for overlapping processes in the comparison of physical attractiveness and nonsocial magnitudes.

  15. Comparison of DP3 Signals Evoked by Comfortable 3D Images and 2D Images — an Event-Related Potential Study using an Oddball Task

    NASA Astrophysics Data System (ADS)

    Ye, Peng; Wu, Xiang; Gao, Dingguo; Liang, Haowen; Wang, Jiahui; Deng, Shaozhi; Xu, Ningsheng; She, Juncong; Chen, Jun

    2017-02-01

    The horizontal binocular disparity is a critical factor for the visual fatigue induced by watching stereoscopic TVs. Stereoscopic images that possess disparity within the ‘comfort zones’ and remain still in the depth direction are considered as comfortable to viewers as 2D images. However, the difference in brain activities between processing such comfortable stereoscopic images and 2D images is still less studied. The DP3 (differential P3) signal refers to an event-related potential (ERP) component indicating attentional processes, which is typically evoked by odd target stimuli among standard stimuli in an oddball task. The present study found that the DP3 signal elicited by the comfortable 3D images exhibits a delayed peak latency and an enhanced peak amplitude over the anterior and central scalp regions compared to the 2D images. The finding suggests that compared to the processing of the 2D images, more attentional resources are involved in the processing of the stereoscopic images even though they are subjectively comfortable.

  16. Development of a practice-based research program.

    PubMed

    Hawk, C; Long, C R; Boulanger, K

    1998-01-01

    To establish an infrastructure to collect accurate data from ambulatory settings. The program was developed through an iterative model governed by a process of formative evaluation. The three iterations were a needs assessment, feasibility study and pilot project. Necessary program components were identified as infrastructure, practitioner-researcher partnership, centralized data management and standardized quality assurance measures. Volunteer chiropractors and their staff collected data on patients in their practices in ambulatory settings in the U.S. and Canada. Evaluative measures were counts of participants, patients and completed forms. Standardized, validated and reliable measures collected by patient self-report were used to assess treatment outcomes. These included the SF-36 or SF-12 Health Survey, the Pain Disability Index, and the Global Well-Being Scale. For characteristics for which appropriate standardized instruments were not available, questionnaires were designed and pilot-tested before use. Information was gathered on practice and patient characteristics and treatment outcomes, but for this report, only those data concerning process evaluation are reported. Through the three program iterations, 65 DCs collected data on 1360 patients, 663 of whom were new patients. Follow-up data recorded by doctors were obtained for more than 70% of patients; a maximum of 50% of patient-completed follow-up forms were collected in the three iterations. This program is capable of providing data for descriptive epidemiology of ambulatory patients, and, with continued effort to maximize follow-up, may have utility in providing insight into utilization patterns and patient outcomes.

  17. Modeling and characterization of the CEJ for optimization of esthetic implant design.

    PubMed

    Gallucci, German O; Belser, Urs C; Bernard, Jean-Pierre; Magne, Pascal

    2004-02-01

    This study evaluated the dimensions and characteristics of the cementoenamel junction (CEJ) of maxillary anterior teeth; the natural CEJ was compared to current implant design and used for design optimization. Standardized digital images of 137 extracted human teeth (45 central incisors, 46 lateral incisors, and 46 canines) were used to measure cervical dimensions, CEJ curvature, and distance from zenith of CEJ to interdental contact on proximal views. The x- and y-coordinates of the CEJ contour were digitized before mathematical processing to allow the representation of a single average curve for buccal, palatal, mesial, and distal surfaces for each tooth type. These measurements were combined with existing data related to the dentogingival and "implantomucosal" junctions to extrapolate specific biologic landmarks around teeth and implants. Mean cervical dimensions, distance from zenith of CEJ to interdental contact, and CEJ curvature were compared. Cervical dimensions significantly differed, with a more symmetric cervical cross-section for central incisors, a slightly more rectangular shape for lateral incisors, and a distinctly rectangular shape for canines. CEJ curvature was statistically different between all tooth groups (centrals > laterals > canines); within groups, the curvature was always greater at the mesial aspect than at the distal (3.46 mm vs 3.13 mm for centrals, 2.97 mm vs 2.38 mm for laterals, and 2.55 mm vs 1.60 mm for canines). Tooth-implant biologic width discrepancies ranged from 4.10 to 5.96 mm and were different between all groups of teeth (centrals > laterals > canines); within groups, the discrepancy was always greater at the mesial aspect than at the distal. Current implant design featuring a flat, rotation-symmetric shoulder should be reconsidered in view of natural CEJ contour to improve biologic considerations and related esthetics.
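
    The study's exact "mathematical processing" of the digitized (x, y) coordinates is not specified; one standard way to derive a single average curve from several digitized contours is to resample each polyline at common arc-length fractions and then average pointwise. A sketch under that assumption, with invented toy contours:

```python
# Average several digitized 2D contours: resample each polyline at n
# points equally spaced by arc length, then take the pointwise mean.
import math

def resample(points, n):
    """Resample a polyline at n points equally spaced by arc length."""
    d = [0.0]  # cumulative arc length at each vertex
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        d.append(d[-1] + math.hypot(x1 - x0, y1 - y0))
    total = d[-1]
    out, seg = [], 0
    for k in range(n):
        target = total * k / (n - 1)
        while seg < len(points) - 2 and d[seg + 1] < target:
            seg += 1
        t = (target - d[seg]) / (d[seg + 1] - d[seg])
        (x0, y0), (x1, y1) = points[seg], points[seg + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def average_curves(curves, n=5):
    """Pointwise mean of several contours after common resampling."""
    sampled = [resample(c, n) for c in curves]
    return [(sum(p[i][0] for p in sampled) / len(sampled),
             sum(p[i][1] for p in sampled) / len(sampled))
            for i in range(n)]

pts = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.0)]  # toy digitized contour
print(average_curves([pts, [(0.0, 0.5), (2.0, 0.5)]], n=3))
```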

  18. Compositional data supports decentralized model of production and circulation of artifacts in the pre-Columbian south-central Andes.

    PubMed

    Lazzari, Marisa; Pereyra Domingorena, Lucas; Stoner, Wesley D; Scattolin, María Cristina; Korstanje, María Alejandra; Glascock, Michael D

    2017-05-16

    The circulation and exchange of goods and resources at various scales have long been considered central to the understanding of complex societies, and the Andes have provided fertile ground for investigating this process. However, long-standing archaeological emphasis on typological analysis, although helpful for hypothesizing the direction of contacts, has left important aspects of ancient exchange open to speculation. To improve understanding of ancient exchange practices and their potential role in structuring alliances, we examine material exchanges in northwest Argentina (part of the south-central Andes) during 400 BC to AD 1000 (part of the regional Formative Period), applying a multianalytical approach (petrography, instrumental neutron activation analysis, laser ablation inductively coupled plasma mass spectrometry) to artifacts previously studied separately. We assess the standard centralized model of interaction vs. a decentralized model using the largest provenance database available to date in the region. The results show: (i) intervalley heterogeneity of clays and fabrics for ordinary wares; (ii) intervalley homogeneity of clays and fabrics for a wide range of decorated wares (e.g., painted Ciénaga); (iii) selective circulation of two distinct polychrome wares (Vaquerías and Condorhuasi); (iv) generalized access to obsidian from one major source and various minor sources; and (v) selective circulation of volcanic rock tools from a single source. These trends reflect the multiple and conflicting demands experienced by people in small-scale societies, which may be difficult for aspiring elites to capitalize on. The study undermines centralized narratives of exchange for this period, offering a new platform for understanding ancient exchange based on actual material transfers, both in the Andes and beyond.

  19. Compositional data supports decentralized model of production and circulation of artifacts in the pre-Columbian south-central Andes

    PubMed Central

    Pereyra Domingorena, Lucas; Stoner, Wesley D.; Scattolin, María Cristina; Korstanje, María Alejandra; Glascock, Michael D.

    2017-01-01

    The circulation and exchange of goods and resources at various scales have long been considered central to the understanding of complex societies, and the Andes have provided fertile ground for investigating this process. However, long-standing archaeological emphasis on typological analysis, although helpful for hypothesizing the direction of contacts, has left important aspects of ancient exchange open to speculation. To improve understanding of ancient exchange practices and their potential role in structuring alliances, we examine material exchanges in northwest Argentina (part of the south-central Andes) during 400 BC to AD 1000 (part of the regional Formative Period), applying a multianalytical approach (petrography, instrumental neutron activation analysis, laser ablation inductively coupled plasma mass spectrometry) to artifacts previously studied separately. We assess the standard centralized model of interaction vs. a decentralized model using the largest provenance database available to date in the region. The results show: (i) intervalley heterogeneity of clays and fabrics for ordinary wares; (ii) intervalley homogeneity of clays and fabrics for a wide range of decorated wares (e.g., painted Ciénaga); (iii) selective circulation of two distinct polychrome wares (Vaquerías and Condorhuasi); (iv) generalized access to obsidian from one major source and various minor sources; and (v) selective circulation of volcanic rock tools from a single source. These trends reflect the multiple and conflicting demands experienced by people in small-scale societies, which may be difficult for aspiring elites to capitalize on. The study undermines centralized narratives of exchange for this period, offering a new platform for understanding ancient exchange based on actual material transfers, both in the Andes and beyond. PMID:28461485

  20. ISO 19115 Experiences in NASA's Earth Observing System (EOS) ClearingHOuse (ECHO)

    NASA Astrophysics Data System (ADS)

    Cechini, M. F.; Mitchell, A.

    2011-12-01

    Metadata is an important entity in the process of cataloging, discovering, and describing earth science data. As science research and the gathered data increase in complexity, so do the complexity and importance of descriptive metadata. To meet these growing needs, the required metadata models utilize richer and more mature metadata attributes. Categorizing, standardizing, and promulgating these metadata models to a politically, geographically, and scientifically diverse community is a difficult process. An integral component of metadata management within NASA's Earth Observing System Data and Information System (EOSDIS) is the Earth Observing System (EOS) ClearingHOuse (ECHO). ECHO is the core metadata repository for the EOSDIS data centers, providing a centralized mechanism for metadata and data discovery and retrieval. ECHO has undertaken an internal restructuring to meet the changing needs of scientists, the consistent advancement of technology, and the advent of new standards such as ISO 19115. These improvements were based on the following tenets for data discovery and retrieval:
    + There exists a set of 'core' metadata fields recommended for data discovery.
    + There exists a set of users who will require the entire metadata record for advanced analysis.
    + There exists a set of users who will require a 'core' set of metadata fields for discovery only.
    + There will never be a cessation of new formats or a total retirement of all old formats.
    + Users should be presented metadata in a consistent format of their choosing.
    To address these items, ECHO's new metadata processing paradigm takes the following approach:
    + Identify a cross-format set of 'core' metadata fields necessary for discovery.
    + Implement format-specific indexers to extract the 'core' metadata fields into an optimized query capability.
    + Archive the original metadata in its entirety for presentation to users requiring the full record.
    + Provide on-demand translation of 'core' metadata to any supported result format.
    Lessons learned by the ECHO team while implementing its new metadata approach to support the ISO 19115 standard will be presented. These lessons learned highlight strengths and weaknesses discovered in the ISO 19115 standard as it is introduced to an existing metadata processing system.
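
    The format-specific indexer idea above can be sketched in a few lines: each metadata format gets its own extractor that maps a full record to a small, shared set of 'core' discovery fields, while the original record is archived unchanged for users who need it in full. All field names and record layouts below are illustrative assumptions, not ECHO's actual schemas.

```python
CORE_FIELDS = ("title", "time_start", "time_end", "bbox")

def index_echo10(record):
    # Extractor for a hypothetical ECHO10-style record layout.
    return {
        "title": record["DataSetId"],
        "time_start": record["Temporal"]["Begin"],
        "time_end": record["Temporal"]["End"],
        "bbox": tuple(record["Spatial"]["BoundingRectangle"]),
    }

def index_iso19115(record):
    # Extractor for a hypothetical flattened ISO 19115-style record layout.
    ident = record["identificationInfo"]
    return {
        "title": ident["citation"]["title"],
        "time_start": ident["extent"]["temporal"][0],
        "time_end": ident["extent"]["temporal"][1],
        "bbox": tuple(ident["extent"]["geographic"]),
    }

INDEXERS = {"echo10": index_echo10, "iso19115": index_iso19115}

def index_record(fmt, record):
    # Every indexer must emit exactly the cross-format 'core' fields.
    core = INDEXERS[fmt](record)
    assert set(core) == set(CORE_FIELDS)
    return core

# The same dataset described in two formats indexes to identical core fields,
# so discovery queries can run against one optimized index.
echo_rec = {
    "DataSetId": "Sample Dataset V1",
    "Temporal": {"Begin": "2000-02-24", "End": "2020-12-31"},
    "Spatial": {"BoundingRectangle": [-90.0, -180.0, 90.0, 180.0]},
}
iso_rec = {
    "identificationInfo": {
        "citation": {"title": "Sample Dataset V1"},
        "extent": {
            "temporal": ["2000-02-24", "2020-12-31"],
            "geographic": [-90.0, -180.0, 90.0, 180.0],
        },
    }
}
core_a = index_record("echo10", echo_rec)
core_b = index_record("iso19115", iso_rec)
```

    Keeping the extractors thin, and archiving the unmodified original record alongside the index, is what lets new formats be added without re-migrating old holdings.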

  1. POM.gpu-v1.0: a GPU-based Princeton Ocean Model

    NASA Astrophysics Data System (ADS)

    Xu, S.; Huang, X.; Oey, L.-Y.; Xu, F.; Fu, H.; Zhang, Y.; Yang, G.

    2015-09-01

    Graphics processing units (GPUs) are an attractive solution in many scientific applications due to their high performance. However, most existing GPU conversions of climate models use GPUs for only a few computationally intensive regions. In the present study, we redesign the mpiPOM (a parallel version of the Princeton Ocean Model) with GPUs. Specifically, we first convert the model from its original Fortran form to a new Compute Unified Device Architecture C (CUDA-C) code, and then optimize the code on each of the GPUs, the communications between the GPUs, and the I/O between the GPUs and the central processing units (CPUs). We show that the performance of the new model on a workstation containing four GPUs is comparable to that on a powerful cluster with 408 standard CPU cores, and that it reduces energy consumption by a factor of 6.8.

  2. Process optimization for sensory characteristics of seriales (Flacourtia jangomas) ready-to-drink (RTD) beverage

    NASA Astrophysics Data System (ADS)

    Cimafranca, L.; Dizon, E.

    2018-01-01

    Seriales (Flacourtia jangomas) is an underutilized fruit in the Philippines. The processing of the fruit into an RTD beverage was standardized by statistical methods. A Plackett-Burman (PB) design was used to determine the most significant factors affecting the sensory characteristics of the product. Response surface methodology (RSM) was then applied, based on a factorial central composite design (CCD), to determine the optimum conditions for maximum sensory acceptability of the seriales RTD beverage. Results of the PB screening revealed that the most significant factors were blanching time, seriales level, and total soluble solids (TSS) level. With blanching times of 0.5, 1.0, and 1.5 min, seriales levels of 10, 20, and 30%, and TSS values of 12, 15, and 18 °Brix, the optimum region for sensory acceptability was found at a blanching time of 0.7 to 1.4 minutes, a seriales level of not more than 27%, and TSS at any level.
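
    The RSM step can be illustrated with a small numerical sketch: fit a second-order model to acceptability scores over the design points, then solve for the stationary point of the fitted surface. The scores below are invented for illustration (only the factor ranges follow the abstract), so this is not a reconstruction of the study's analysis:

```python
import numpy as np

# Hypothetical sensory scores (9-point hedonic scale) over the two most
# influential factors; the score values are illustrative, not the study's data.
t = np.array([0.5, 0.5, 0.5, 1.0, 1.0, 1.0, 1.5, 1.5, 1.5])  # blanching time, min
s = np.array([10.0, 20.0, 30.0, 10.0, 20.0, 30.0, 10.0, 20.0, 30.0])  # seriales, %
score = np.array([6.1, 6.8, 6.5, 6.9, 7.6, 7.2, 6.4, 7.0, 6.7])

# Second-order response surface:
#   score ~ b0 + b1*t + b2*s + b3*t^2 + b4*s^2 + b5*t*s
X = np.column_stack([np.ones_like(t), t, s, t**2, s**2, t * s])
b, *_ = np.linalg.lstsq(X, score, rcond=None)

# Stationary point of the fitted surface: set the gradient to zero and
# solve the resulting 2x2 linear system for (t_opt, s_opt).
A = np.array([[2 * b[3], b[5]],
              [b[5], 2 * b[4]]])
g = -np.array([b[1], b[2]])
t_opt, s_opt = np.linalg.solve(A, g)
```

    With negative quadratic coefficients the surface is concave and the stationary point is the predicted optimum; for this toy data it lands near 1 min blanching and roughly 22% seriales level, inside the optimum region reported above.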

  3. Software Architecture Evaluation in Global Software Development Projects

    NASA Astrophysics Data System (ADS)

    Salger, Frank

    Due to ever-increasing system complexity, comprehensive methods for software architecture evaluation become more and more important. This is further stressed in global software development (GSD), where the software architecture acts as a central knowledge and coordination mechanism. However, existing methods for architecture evaluation do not take the characteristics of GSD into account. In this paper we discuss which aspects are specific to architecture evaluations in GSD. Our experiences from GSD projects at Capgemini sd&m indicate that architecture evaluations differ in how rigorously one has to assess modularization, architecturally relevant processes, knowledge transfer, and process alignment. From our project experiences, we derive nine good practices, compliance with which should be checked in architecture evaluations in GSD. As an example, we discuss how far the standard architecture evaluation method used at Capgemini sd&m already considers the GSD-specific good practices, and outline what extensions are necessary to achieve a comprehensive architecture evaluation framework for GSD.

  4. [Robot--a member of (re)habilitation team].

    PubMed

    Krasnik, Rastislava; Mikov, Aleksandra; Golubović, Spela; Komazec, Zoran; Komazec, Slobodanka Lemajić

    2012-01-01

    The rehabilitation process involves a whole team of experts who participate in it over a long period of time. The intensive development of science and technology has made it possible to design a number of robots which are used for therapeutic purposes and participate in the rehabilitation process. During the long history of technological development of mankind, a number of conceptual and technological solutions for the construction of robots have been known. By using robots in medical rehabilitation it is possible to implement the rehabilitation of peripheral and central motor neurons, increasing the motivation of patients for further recovery and the effectiveness of therapy. The paper presents some technological solutions for robot-assisted rehabilitation of patients of different age groups and some possibilities of its use in treatment. Using robots in standard physiotherapy protocols that involve a number of repetitions, exact dosage, quality design, and adaptability to each individual patient leads to significant progress in the rehabilitation of patients.

  5. Serious Play with Dynamic Plane Transformations

    ERIC Educational Resources Information Center

    King, James

    2011-01-01

    Transformations are a central organizing idea in geometry. They are included in most geometry curricula and are likely to appear with even greater emphasis in the future, given the central role they play in the "Common Core State Standards" for K-12 mathematics. One of the attractions of geometry is the ability to draw and construct the…

  6. A modular docking mechanism for in-orbit assembly and spacecraft servicing

    NASA Technical Reports Server (NTRS)

    Gampe, F.; Priesett, K.; Bentall, R. H.

    1985-01-01

    A Docking Mechanism concept is described which is suitable for use with autonomous docking systems. The central feature of using simple cylindrical handles on one side and a type of prism seating on the other is offered as a practical method of achieving a standardized structural interface without freezing continued development of the latches, either technically or commercially. The main emphasis in docking mechanism concepts is in two directions: (1) a very simple docking mechanism, involving mainly the latch mechanism to achieve a structural link; and (2) a sophisticated Docking Mechanism, where the latch mechanism is designed for nonrigid spacecraft and the achievement of very low dynamic interactions between spacecraft during the docking process.

  7. Optimization of locations of diffusion spots in indoor optical wireless local area networks

    NASA Astrophysics Data System (ADS)

    Eltokhey, Mahmoud W.; Mahmoud, K. R.; Ghassemlooy, Zabih; Obayya, Salah S. A.

    2018-03-01

    In this paper, we present a novel optimization of the locations of the diffusion spots in indoor optical wireless local area networks, based on the central force optimization (CFO) scheme. Uniformity of performance across users is addressed by using the CFO algorithm and adopting different objective-function configurations, considering maximization of the signal-to-noise ratio and minimization of the delay spread, respectively. We also investigate the effect of varying the objective function's weights on system and user performance as part of the adaptation process. The results show that the proposed objective-function-based optimization procedure offers an improvement of 65% in the standard deviation of individual receivers' performance.
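
    The weighted multi-objective idea can be sketched directly. The fitness function below is an illustrative assumption (the weights, the normalization, and the uniformity term are not taken from the paper): it rewards high mean SNR, penalizes spread across receivers (the uniformity goal), and penalizes delay spread.

```python
import numpy as np

def fitness(snr_db, delay_ns, w_snr=0.6, w_uniform=0.3, w_delay=0.1):
    """Score one candidate placement of diffusion spots given the
    per-receiver SNR (dB) and delay spread (ns) it produces; higher is
    better. The standard-deviation term penalizes non-uniform
    performance across users."""
    return (w_snr * np.mean(snr_db)
            - w_uniform * np.std(snr_db)
            - w_delay * np.mean(delay_ns))

# Two candidate placements with identical mean SNR and delay spread:
# the one serving all receivers uniformly scores higher.
uniform = fitness(np.array([20.0, 20.0, 20.0]), np.array([5.0, 5.0, 5.0]))
skewed = fitness(np.array([10.0, 20.0, 30.0]), np.array([5.0, 5.0, 5.0]))
```

    An optimizer such as CFO would evaluate this scalar fitness for each candidate spot layout; retuning the weights shifts the trade-off between aggregate throughput and per-user fairness, which is the effect the abstract investigates.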

  8. Low-Level Space Optimization of an AES Implementation for a Bit-Serial Fully Pipelined Architecture

    NASA Astrophysics Data System (ADS)

    Weber, Raphael; Rettberg, Achim

    A previously developed AES (Advanced Encryption Standard) implementation is optimized and described in this paper. The special architecture targeted by this implementation comprises synchronous and systematic bit-serial processing without a central controlling instance. In order to shrink the design in terms of logic utilization, we analyzed the architecture and the AES implementation in depth to identify the most costly logic elements. We propose merging certain parts of the logic to achieve better area efficiency. The approach was integrated into an existing synthesis tool, which we used to produce synthesizable VHDL code. For testing purposes, we simulated the generated VHDL code and ran tests on an FPGA board.

  9. Entropy Inequalities for Stable Densities and Strengthened Central Limit Theorems

    NASA Astrophysics Data System (ADS)

    Toscani, Giuseppe

    2016-10-01

    We consider the central limit theorem for stable laws in the case of the standardized sum of independent and identically distributed random variables with regular probability density function. By showing decay of different entropy functionals along the sequence, we prove convergence with explicit rate in various norms to a centered Lévy density of parameter λ > 1. This introduces a new information-theoretic approach to the central limit theorem for stable laws, in which the main argument is shown to be the relative fractional Fisher information, recently introduced in Toscani (Ricerche Mat 65(1):71-91, 2016). In particular, it is proven that, with respect to the relative fractional Fisher information, the Lévy density satisfies an analog of the logarithmic Sobolev inequality, which allows one to pass from the monotonicity and decay to zero of the relative fractional Fisher information of the standardized sum to the decay to zero in relative entropy with an explicit decay rate.
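
    In symbols, the standardized sum and the entropy argument described in the abstract can be written as follows (a sketch using common notation for stable limit theorems; the paper's own normalization and constants may differ):

```latex
% Standardized sum of i.i.d. random variables X_i in the domain of
% attraction of a stable law of exponent \lambda \in (1, 2):
T_n = \frac{1}{n^{1/\lambda}} \sum_{i=1}^{n} X_i

% The centered L\'evy density \omega_\lambda satisfies an analog of the
% logarithmic Sobolev inequality, bounding the relative entropy H by the
% relative fractional Fisher information I_\lambda; monotone decay of
% I_\lambda along the sequence then yields decay in relative entropy:
H\!\left(f_{T_n} \mid \omega_\lambda\right)
  \le C_\lambda \, I_\lambda\!\left(f_{T_n} \mid \omega_\lambda\right)
  \xrightarrow[n \to \infty]{} 0
```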

  10. A tale of three Brownfields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweet, F.R.; Worthington, M.A.; Belli, E.

    Contaminated site remediation and reuse, or Brownfield redevelopment, has become an increasingly important approach to site development in the northeast corridor, yet the scale of this activity is but a fraction of its full potential. The problem lies in the multi-jurisdictional quagmire that confronts a Brownfield project. Permitting such projects is an overly taxing dynamic process that has become a staple diet for adept multidisciplinary consulting firms. Limited government sanctions such as clean sites initiatives and economic revitalization zones are at best, when successful, interesting bench studies. The central hypothesis, that if regulations are streamlined then site reuse will occur, is sound. Yet streamlining brings concerns that the protection of public health and the environment will be compromised and that the result will be a lower standard of public protection for urban populations. The authors postulate that the permitting of Brownfield projects can be streamlined without creating a double standard of risk tolerance. The authors present evidence of this by comparing publicly and privately funded projects.

  11. Conceptual Modeling in the Time of the Revolution: Part II

    NASA Astrophysics Data System (ADS)

    Mylopoulos, John

    Conceptual Modeling was a marginal research topic at the very fringes of Computer Science in the 60s and 70s, when the discipline was dominated by topics focusing on programs, systems, and hardware architectures. Over the years, however, the field has moved to centre stage and has come to claim a central role in both Computer Science research and practice, in diverse areas such as Software Engineering, Databases, Information Systems, the Semantic Web, Business Process Management, Service-Oriented Computing, Multi-Agent Systems, Knowledge Management, and more. The transformation was greatly aided by the adoption of standards for modeling languages (e.g., UML) and model-based methodologies (e.g., Model-Driven Architectures) by the Object Management Group (OMG) and other standards organizations. We briefly review the history of the field over the past 40 years, focusing on the evolution of key ideas. We then note some open challenges and report on ongoing research, covering topics such as the representation of variability in conceptual models, capturing model intentions, and models of laws.

  12. Phasor based single-molecule localization microscopy in 3D (pSMLM-3D): An algorithm for MHz localization rates using standard CPUs

    NASA Astrophysics Data System (ADS)

    Martens, Koen J. A.; Bader, Arjen N.; Baas, Sander; Rieger, Bernd; Hohlbein, Johannes

    2018-03-01

    We present a fast and model-free 2D and 3D single-molecule localization algorithm that allows more than 3 × 10⁶ localizations per second to be calculated on a standard multi-core central processing unit, with localization accuracies in line with the most accurate algorithms currently available. Our algorithm converts the region of interest around a point spread function into two phase vectors (phasors) by calculating the first Fourier coefficients in both the x- and y-direction. The angles of these phasors are used to localize the center of the single fluorescent emitter, and the ratio of the magnitudes of the two phasors is a measure of astigmatism, which can be used to obtain depth information (z-direction). Our approach can be used both as a stand-alone algorithm for maximizing localization speed and as a first estimator for more time-consuming iterative algorithms.
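
    The phasor computation is compact enough to sketch. The following is a minimal illustrative implementation (not the authors' released code; the ROI size and the test spot below are assumptions), using the fact that the phase of the first Fourier coefficient of the ROI along each axis encodes the emitter position:

```python
import numpy as np

def phasor_localize(roi):
    """Phasor localization: the phase angles of the first Fourier
    coefficients along x and y encode the emitter's sub-pixel position;
    the ratio of their magnitudes is a measure of astigmatism (usable
    for z-estimation with a cylindrical lens in the detection path)."""
    n = roi.shape[0]                 # square ROI assumed
    f = np.fft.fft2(roi)
    px = f[0, 1]                     # first Fourier coefficient along x (columns)
    py = f[1, 0]                     # first Fourier coefficient along y (rows)
    # phase angle -> sub-pixel position, modulo the ROI size
    x = (-np.angle(px)) % (2 * np.pi) * n / (2 * np.pi)
    y = (-np.angle(py)) % (2 * np.pi) * n / (2 * np.pi)
    astig = np.abs(px) / np.abs(py)  # magnitude ratio -> depth information
    return x, y, astig

# Demo on a noiseless Gaussian spot at a known sub-pixel position.
n = 7
yy, xx = np.mgrid[0:n, 0:n]
x0, y0 = 3.3, 3.7
roi = np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * 1.1 ** 2))
x_est, y_est, astig = phasor_localize(roi)
```

    A full pipeline would run this over ROIs cut around detected intensity peaks and calibrate the magnitude ratio against a z-stack of a known emitter to convert astigmatism into depth.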

  13. Does previous healthcare experience increase success in physician assistant training?

    PubMed

    Hegmann, Theresa; Iverson, Katie

    2016-06-01

    Healthcare experience is used by many physician assistant (PA) programs to rank applicants. Despite a large healthcare literature base evaluating admissions factors, little information is available on the relationship between healthcare experience and educational outcomes. We aimed to test whether previous healthcare experience is associated with increased success during the clinical portion of the PA educational process. Hours of direct healthcare experience reported on Central Application Service for Physician Assistants applications by 124 students in the classes of 2009 through 2013 were compared with a calculated average preceptor evaluation score for each student, and with average standardized-patient examination scores for a subset of students. Average student age was 28.7 years and median healthcare experience was 2,257 hours (range 390-16,400). Previous healthcare experience was not significantly correlated with preceptor evaluations or standardized-patient examination scores. This 5-year, single-institution pilot study did not support the hypothesis that healthcare experience is associated with improved clinical-year outcomes.

  14. Central auditory processing disorder (CAPD) tests in a school-age hearing screening programme - analysis of 76,429 children.

    PubMed

    Skarzynski, Piotr H; Wlodarczyk, Andrzej W; Kochanek, Krzysztof; Pilka, Adam; Jedrzejczak, Wiktor W; Olszewski, Lukasz; Bruski, Lukasz; Niedzielski, Artur; Skarzynski, Henryk

    2015-01-01

    Hearing disorders among school-age children are a current concern. Continuing studies have been performed in Poland since 2008, and on 2 December 2011 the EU Council adopted Conclusions on the Early Detection and Treatment of Communication Disorders in Children, Including the Use of e-Health Tools and Innovative Solutions. The discussion now focuses not only on the efficacy of hearing screening programmes in schoolchildren, but also on what their general aim should be and what tests they should include. This paper makes the case that it is important to include central auditory processing disorder (CAPD) tests. One such test is the dichotic digits test (DDT). The aim of the presented study was to evaluate the usefulness of the DDT in detecting central hearing disorders in school-age children. During hearing screening programmes conducted in Poland in 2008-2010, exactly 235,664 children (7-12 years old) were screened in 9,325 schools. Of this number, 7,642 were examined using the DDT test for CAPD. Screening programmes were conducted using the Sense Examination Platform. With the cut-off criterion set at the 5th percentile, results for the DDT applied in a divided-attention mode were 11.4% positive for 7-year-olds and 11.3% for 12-year-olds. In the focused-attention mode, the comparable result for 12-year-olds was 9.7%. There was a clear right-ear advantage. In children with positive DDT results, a higher incidence of other disorders, such as dyslexia, was observed. A test for CAPD should be included in the hearing screening of school-age children. The results of this study form the basis for developing Polish standards in this area.

  15. Circuit-Detour Design and Implementation - Enhancing the Southern California's Seismic Network Reliability through Redundant Network Paths

    NASA Astrophysics Data System (ADS)

    Watkins, M.; Busby, R.; Rico, H.; Johnson, M.; Hauksson, E.

    2003-12-01

    We provide enhanced network robustness by apportioning redundant data communications paths for seismic stations in the field. By providing more than one telemetry route, either physical or logical, network operators can improve the availability of seismic data during occasional network outages, and also during the loss of key gateway interfaces such as a router or central processor. This is especially important for seismic stations in sparsely populated regions, where the loss of a single site may result in a significant gap in the network's monitoring capability. A number of challenges arise in the application of a circuit-detour mechanism. One requirement is that it fit well within the existing framework of our real-time system processing. It is also necessary to craft a system that is not needlessly complex to maintain or implement, particularly during a crisis. The method that we use for circuit-detours does not require the reconfiguration of dataloggers or communications equipment in the field. Remote network configurations remain static; changes are only required at the central site. We have implemented standardized procedures to detour circuits on similar transport mediums, such as virtual circuits on the same leased line, as well as on physically different communications pathways, such as a microwave link backed up by a leased line. The lessons learned from these improvements in reliability and optimization efforts could be applied to other real-time seismic networks. A fundamental tenet of most seismic networks is that they are reliable and have a high percentage of real-time data availability. A reasonable way to achieve these expectations is to provide alternate means of delivering data to the central processing sites, with a simple method for utilizing these alternate paths.

  16. Brain region-specific enhancement of remyelination and prevention of demyelination by the CSF1R kinase inhibitor BLZ945.

    PubMed

    Beckmann, Nicolau; Giorgetti, Elisa; Neuhaus, Anna; Zurbruegg, Stefan; Accart, Nathalie; Smith, Paul; Perdoux, Julien; Perrot, Ludovic; Nash, Mark; Desrayaud, Sandrine; Wipfli, Peter; Frieauff, Wilfried; Shimshek, Derya R

    2018-02-15

    Multiple sclerosis (MS) is a chronic inflammatory disease affecting the central nervous system (CNS). While multiple effective immunomodulatory therapies for MS exist today, they lack the scope to promote CNS repair, in particular remyelination. Microglia play a pivotal role in regulating myelination processes, and the colony-stimulating factor 1 (CSF-1) pathway is a key regulator of microglia differentiation and survival. Here, we investigated the effects of the CSF-1 receptor kinase inhibitor BLZ945 on central myelination processes in the 5-week murine cuprizone model by non-invasive, longitudinal magnetic resonance imaging (MRI) and histology. Therapeutic 2-week BLZ945 treatment caused a brain region-specific enhancement of remyelination in the striatum/cortex, which was absent in the corpus callosum/external capsule. This beneficial effect correlated positively with microglia reduction, increased oligodendrocytes, and astrogliosis. Prophylactic BLZ945 treatment prevented excessive demyelination in the corpus callosum by reducing microglia and increasing oligodendrocytes. In the external capsule, oligodendrocytes, but not microglia, were depleted, and a buildup of myelin debris and axonal damage was observed. A similar microglial dysfunction in the external capsule, with an increase of myelin debris, was evident in triggering receptor expressed on myeloid cells 2 (TREM2) knock-out mice treated with cuprizone. Finally, therapeutic BLZ945 treatment did not change the disease course in experimental autoimmune encephalomyelitis mice, a peripherally driven neuroinflammation model. Taken together, our data suggest that short-term therapeutic inhibition of the CSF-1 receptor pathway by BLZ945 in the murine cuprizone model enhances central remyelination by modulating neuroinflammation. Thus, microglia-modulating therapies could be considered clinically for promoting myelination in combination with standard-of-care treatments in MS patients.

  17. [Optimization of registry of deaths from chronic kidney disease in agricultural communities in Central America].

    PubMed

    Escamilla-Cejudo, José Antonio; Báez, Jorge Lara; Peña, Rodolfo; Luna, Patricia Lorena Ruiz; Ordunez, Pedro

    2016-11-01

    Several Central American countries are seeing continued growth in the number of deaths from chronic kidney disease of nontraditional causes (CKDnT) among farm workers, and there is underreporting. This report presents the results of a consensus process coordinated by the Pan American Health Organization/World Health Organization (PAHO/WHO), the United States Centers for Disease Control and Prevention (CDC), and the Latin American Society of Nephrology and Hypertension (SLANH). This consensus seeks to increase the probability of detecting and recording deaths from these causes. The process recognized the negative impact of the lack of a standardized instrument and of inadequate training in the medical profession for proper registration of the cause or causes of death. As a result of the consensus, the following has been proposed: temporarily use a code from the Codes for Special Purposes in the International Classification of Diseases (ICD-10); continue to promote use of the WHO international standardized instrument for recording causes and preceding events related to death; increase training of physicians responsible for filling out death certificates; take action to increase the coverage and quality of information on mortality; and create a decision tree to facilitate selection of CKDnT as a specific cause of death, while presenting the role that different regional and subregional mechanisms in the Region of the Americas should play in order to improve CKD and CKDnT mortality records.

  18. Micro X-ray Fluorescence Study of Late Pre-Hispanic Ceramics from the Western Slopes of the South Central Andes Region in the Arica y Parinacota Region, Chile: A New Methodological Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flewett, S.; Saintenoy, T.; Sepulveda, M.

    Archeological ceramic paste material typically consists of a mix of a clay matrix and various millimeter and sub-millimeter sized mineral inclusions. Micro X-ray fluorescence (μXRF) is a standard compositional classification tool, and in this work we propose and demonstrate an improved fluorescence map processing protocol in which the mineral inclusions are automatically separated from the clay matrix to allow independent statistical analysis of the two parts. Application of this protocol allowed us to enhance the discrimination between different ceramic shards compared with the standard procedure of working with only the spatially averaged elemental concentrations. Using the new protocol, we performed an initial compositional classification of a set of 83 ceramic shards from the western slopes of the south-central Andean region in the Arica y Parinacota region of present-day far northern Chile. Comparing the classifications obtained using the new versus the old (average concentrations only) protocols, we found that some samples were erroneously classified with the old protocol. From an archaeological perspective, a very broad and heterogeneous sample set was used in this study because this was the first such study to be performed on ceramics from this region. This allowed a general overview to be obtained; however, further work on more specific sample sets will be necessary to draw concrete archaeological conclusions.

  19. Transgenic mice as an alternative to monkeys for neurovirulence testing of live oral poliovirus vaccine: validation by a WHO collaborative study.

    PubMed Central

    Dragunsky, Eugenia; Nomura, Tatsuji; Karpinski, Kazimir; Furesz, John; Wood, David J.; Pervikov, Yuri; Abe, Shinobu; Kurata, Takeshi; Vanloocke, Olivier; Karganova, Galina; Taffs, Rolf; Heath, Alan; Ivshina, Anna; Levenbook, Inessa

    2003-01-01

    OBJECTIVE: Extensive WHO collaborative studies were performed to evaluate the suitability of transgenic mice susceptible to poliovirus (TgPVR mice, strain 21, bred and provided by the Central Institute for Experimental Animals, Japan) as an alternative to monkeys in the neurovirulence test (NVT) of oral poliovirus vaccine (OPV). METHODS: Nine laboratories participated in the collaborative study, testing the neurovirulence of 94 preparations of OPV and vaccine derivatives of all three serotypes in TgPVR21 mice. FINDINGS: Statistical analysis of the data demonstrated that the TgPVR21 mouse NVT was of comparable sensitivity and reproducibility to the conventional WHO NVT in simians. A statistical model for acceptance/rejection of OPV lots in the mouse test was developed, validated, and shown to be suitable for all three vaccine types. The assessment of the transgenic mouse NVT is based on clinical evaluation of paralysed mice. Unlike the monkey NVT, histological examination of central nervous system tissue of each mouse offered no advantage over careful and detailed clinical observation. CONCLUSIONS: Based on data from the collaborative studies, the WHO Expert Committee on Biological Standardization approved the mouse NVT as an alternative to the monkey test for all three OPV types and defined a standard implementation process for laboratories that wish to use the test. This represents the first successful introduction of transgenic animals into the control of biologicals. PMID:12764491

  20. Quality management and accreditation of research tissue banks: experience of the National Center for Tumor Diseases (NCT) Heidelberg.

    PubMed

    Herpel, Esther; Röcken, Christoph; Manke, Heike; Schirmacher, Peter; Flechtenmacher, Christa

    2010-12-01

    Tissue banks are key resource and technology platforms in biomedical research that address the molecular pathogenesis of diseases as well as disease prevention, diagnosis, and treatment. Due to the central role of tissue banks in the standardized collection, storage, and distribution of human tissues and their derivatives, quality management and its external assessment are becoming increasingly relevant for the maintenance, acceptance, and funding of tissue banks. Little experience exists regarding formalized external evaluation of tissue banks, especially regarding certification and accreditation. Based on the accreditation of the National Center of Tumor Diseases (NCT) tissue bank in Heidelberg (Germany), criteria, requirements, processes, and implications were compiled and evaluated. Accreditation formally approved the professional competence and performance of the tissue bank in all steps involved in tissue collection, storage, and handling, as well as in the macroscopic and histologic examination and the final (exit) examination of the tissue before transfer, all supervised by board-certified histopathologists. Thereby, accreditation provides a comprehensive measure to evaluate and document the quality standard of tissue research banks and may play a significant role in the future assessment of tissue banks. Furthermore, accreditation may support harmonization and standardization of tissue banking for biomedical research purposes.

  1. Effects of Methylphenidate (Ritalin) on Auditory Performance in Children with Attention and Auditory Processing Disorders.

    ERIC Educational Resources Information Center

    Tillery, Kim L.; Katz, Jack; Keller, Warren D.

    2000-01-01

    A double-blind, placebo-controlled study examined effects of methylphenidate (Ritalin) on auditory processing in 32 children with both attention deficit hyperactivity disorder and central auditory processing (CAP) disorder. Analyses revealed that Ritalin did not have a significant effect on any of the central auditory processing measures, although…

  2. Inadequate ventilation for nosocomial tuberculosis prevention in public hospitals in Central Thailand.

    PubMed

    Jiamjarasrangsi, W; Bualert, S; Chongthaleong, A; Chaindamporn, A; Udomsantisuk, N; Euasamarnjit, W

    2009-04-01

    Forty-two community and general hospitals in central Thailand. To examine the adequacy of indoor ventilation for nosocomial tuberculosis (TB) prevention in public hospitals in central Thailand. A cross-sectional survey was conducted among 323 patient care and ancillary areas in the target hospitals. Data on indoor ventilation rate were collected by the tracer gas method and reported as air changes per hour (ACH). The adequacy of the measured ventilation rates was then determined by comparison with the internationally recommended standard values. Indoor ventilation rates were inadequate in almost half of the studied areas (144/323, 44.6%). The inadequacy was particularly serious in the emergency rooms (ERs) and radiological areas, where 73.8% (31/42 each) of the rooms had ACH below the recommended standards. Detailed analysis showed that most of the rooms with natural ventilation had air exchange rates that exceeded the recommended standards, while the opposite was the case for rooms with air-conditioning, particularly the window or wall-mount type. Indoor ventilation in high-risk nosocomial TB areas in public hospitals in Thailand was inadequate due to the installation of air-conditioning systems in modern buildings.
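The tracer gas method reports ventilation as air changes per hour by fitting the exponential decay of tracer concentration in a well-mixed room, C(t) = C0·exp(−ACH·t), so ACH = ln(C0/Ct)/t with t in hours. A minimal sketch of that calculation (the concentrations and time interval below are hypothetical):

```python
import math

def air_changes_per_hour(c_start, c_end, elapsed_hours):
    """Estimate the air change rate from tracer-gas decay, assuming a
    well-mixed room: C(t) = C0 * exp(-ACH * t), so ACH = ln(C0/Ct) / t."""
    if c_end <= 0 or c_start <= c_end:
        raise ValueError("concentration must decay toward zero")
    return math.log(c_start / c_end) / elapsed_hours

# Tracer falls from 100 ppm to 25 ppm in 15 minutes -> ln(4) / 0.25 h
ach = air_changes_per_hour(100.0, 25.0, 0.25)
```

By this calculation, a room where the tracer falls to a quarter of its starting concentration in 15 minutes is ventilated at roughly 5.5 ACH, well below the 12 ACH often cited as the minimum for airborne-precaution areas.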

  3. Social value and individual choice: The value of a choice-based decision-making process in a collectively funded health system.

    PubMed

    Espinoza, Manuel Antonio; Manca, Andrea; Claxton, Karl; Sculpher, Mark

    2018-02-01

    Evidence about cost-effectiveness is increasingly being used to inform decisions about the funding of new technologies, which are usually implemented as guidelines from centralized decision-making bodies. However, there is also increasing recognition of the role of patients in determining their preferred treatment option. This paper presents a method to estimate the value of implementing a choice-based decision process using the cost-effectiveness analysis toolbox. This value is estimated for 3 alternative scenarios. First, it compares centralized decisions, based on population average cost-effectiveness, against a decision process based on patient choice. Second, it compares centralized decisions based on patient subgroups with an individual choice-based decision process. Third, it compares a centralized process based on average cost-effectiveness against a choice-based process where patients choose according to a different measure of outcome to that used by the centralized decision maker. The methods are applied to a case study for the management of acute coronary syndrome. It is concluded that implementing a choice-based process of treatment allocation may be an option in collectively funded health systems. However, its value will depend on the specific health problem and the social values considered relevant to the health system. Copyright © 2017 John Wiley & Sons, Ltd.
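The core of the first comparison can be illustrated with a simplified simulation (not the authors' actual model; the net-benefit distributions below are hypothetical): under a centralized rule everyone receives the option with the highest average net benefit, whereas under patient choice each patient takes the option that is best for them, so the population value is the mean of per-patient maxima.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-patient net benefit of two treatments; heterogeneity
# means different patients are best served by different options.
nb_a = rng.normal(loc=1.0, scale=0.5, size=10_000)
nb_b = rng.normal(loc=0.9, scale=0.5, size=10_000)

# Centralized rule: everyone receives the option with the best average.
value_central = max(nb_a.mean(), nb_b.mean())

# Choice-based rule: each patient takes whichever is better for them.
value_choice = np.maximum(nb_a, nb_b).mean()

# The difference is the per-patient value of implementing choice.
value_of_choice = value_choice - value_central
```

Because the mean of per-patient maxima can never fall below the best single-option mean, the simulated value of choice is non-negative; how large it is depends on how heterogeneous patient outcomes are, which is the paper's point about the specific health problem mattering.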

  4. Investigation of Central Pain Processing in Post-Operative Shoulder Pain and Disability

    PubMed Central

    Valencia, Carolina; Fillingim, Roger B.; Bishop, Mark; Wu, Samuel S.; Wright, Thomas W.; Moser, Michael; Farmer, Kevin; George, Steven Z.

    2014-01-01

    Measures of central pain processing such as conditioned pain modulation (CPM) and suprathreshold heat pain response (SHPR) have been described to assess different components of central pain modulatory mechanisms. Central pain processing potentially plays a role in the development of postsurgical pain; however, the role of CPM and SHPR in explaining postoperative clinical pain and disability is still unclear. Seventy-eight patients with clinical shoulder pain were included in this study. Patients were examined before shoulder surgery and at 3 and 6 months after surgery. The primary outcome measures were pain intensity and upper extremity disability. Analyses revealed that the change score (baseline to 3 months) of the 5th pain rating of SHPR accounted for a significant amount of variance in 6-month postsurgical clinical pain intensity and disability after age, sex, preoperative pain intensity, and relevant psychological factors were considered. The present study suggests that baseline measures of central pain processing were not predictive of the 6-month postoperative pain outcome. Instead, the 3-month change in SHPR might be a relevant factor in the transition to elevated 6-month postoperative pain and disability outcomes. In patients with shoulder pain, the 3-month change in a measure of central pain processing might be a relevant factor in the transition to elevated 6-month postoperative pain and disability scores. PMID:24042347

  5. Recommendations for Implementing the New Illinois Early Learning and Development Standards to Affect Classroom Practices for Social and Emotional Learning

    ERIC Educational Resources Information Center

    Zinsser, Katherine M.; Dusenbury, Linda

    2015-01-01

    The state of Illinois in the central United States has long been a trendsetter both in the development of learning standards and in addressing social and emotional learning in education settings. With a recent revision to the state's early learning standards, published in 2013, the Illinois State Board of Education (ISBE) fully aligned its…

  6. Collaborative Radiological Response Planning

    DTIC Science & Technology

    2013-12-01

    ...critical tasks, under specified conditions and performance standards.”33 Aligned with the central objective of Capabilities-Based Planning, the TCL...affect task performance.36 More specifically, measures and performance criteria describe a standard for how well a task must be performed and on

  7. The Central Coherence Account of Autism Revisited: Evidence from the ComFor Study

    ERIC Educational Resources Information Center

    Noens, Ilse L. J.; van Berckelaer-Onnes, Ina A.

    2008-01-01

    According to the central coherence account, people with autism have a tendency to focus on local rather than global processing. However, there is considerable controversy about the locus of the weak drive for central coherence. Some studies support enhanced bottom-up processing, whereas others claim reduced top-down feedback. The results of the…

  8. A Glossary for Pre-Calculus

    ERIC Educational Resources Information Center

    Arnold, Bruce; Kracht, Brenda; Ross, Judy; Teegarden, Terrie; Tompkins, Maurice

    2012-01-01

    In the deconstruction of the California state standards for trigonometry, linear algebra and mathematical analysis for the Cal-PASS (California Partnership for Achieving Student Success) Content Standards Deconstruction projects, it became apparent that terms were used for which no definition was given. The San Diego Central Cal-PASS Math…

  9. 75 FR 27227 - Energy Conservation Program: Energy Conservation Standards for Residential Central Air...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-14

    ... that DOE could consider for these products. DOE also encouraged written comments on these subjects... conservation standards notice of public meeting (NOPM) and availability of the preliminary technical support... Federal Register notice announcing the availability of its preliminary technical support document for...

  10. Journeying "Down the Rabbit Hole"

    ERIC Educational Resources Information Center

    Rossman, Alan; Dummer, John

    2004-01-01

    In describing the professional development journey of science teachers, the National Science Standards (NRC 1996) provides a useful cartography. Inquiry, those standards suggest, is the central strategy for the teaching of science. By illustrating the parallels between inquiry as a form of scientific investigation and inquiry as a classroom…

  11. Fish: A New Computer Program for Friendly Introductory Statistics Help

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Raffle, Holly

    2005-01-01

    All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…
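The two concepts the article highlights can be demonstrated with a short simulation of the kind such software supports: sample means drawn from even a strongly skewed population are approximately normal (the central limit theorem), and their spread matches the standard error σ/√n. A sketch with arbitrary parameters:

```python
import numpy as np

rng = np.random.default_rng(42)

n, trials = 50, 20_000
# Draw from a decidedly non-normal population (exponential, sigma = 1).
samples = rng.exponential(scale=1.0, size=(trials, n))
sample_means = samples.mean(axis=1)

theoretical_se = 1.0 / np.sqrt(n)   # sigma / sqrt(n)
empirical_se = sample_means.std()   # spread of the sampling distribution
```

With 20,000 replications the empirical spread of the sample means lands within a fraction of a percent of σ/√n, the kind of concrete reinforcement a tool like FISH aims to give students.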

  12. Multi-hazard risk analysis for management strategies

    NASA Astrophysics Data System (ADS)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with a risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g. in Switzerland or South Tyrol-Bolzano) and software packages (e.g. ARMAGEDOM, HAZUS or RiskScape) are being developed to guide, standardize and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes, etc.) of a specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, and the modeling approaches, as well as the incoherencies that occur when combining all these different aspects. Based on this concept, a flexible software package will be established, with ArcGIS as its central base, complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be newly developed; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flow, rockfall, landslide, avalanche and flood are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future, further processes and scales can be included and the instrument thus adapted to any study site.

  13. The Association of State Legal Mandates for Data Submission of Central Line-associated Blood Stream Infections in Neonatal Intensive Care Units with Process and Outcome Measures

    PubMed Central

    Zachariah, Philip; Reagan, Julie; Furuya, E. Yoko; Dick, Andrew; Liu, Hangsheng; Herzig, Carolyn T.A; Pogorzelska-Maziarz, Monika; Stone, Patricia W.; Saiman, Lisa

    2014-01-01

    Objective To determine the association between state legal mandates for data submission of central line-associated blood stream infections (CLABSIs) in neonatal intensive care units (NICUs) with process/outcome measures. Design Cross-sectional study. Participants National sample of level II/III and III NICUs participating in National Healthcare Safety Network (NHSN) surveillance. Methods State mandates for data submission of CLABSIs in NICUs in place by 2011 were compiled and verified with state healthcare-associated infection coordinators. A web-based survey of infection control departments in October 2011 assessed CLABSI prevention practices i.e. compliance with checklist and bundle components (process measures) in ICUs including NICUs. Corresponding 2011 NHSN NICU CLABSI rates (outcome measures) were used to calculate Standardized Infection Ratios (SIR). The association between mandates and process/outcome measures was assessed by multivariable logistic regression. Results Among 190 study NICUs, 107 (56.3%) NICUs were located in states with mandates, with mandates in place for 3 or more years for half. More NICUs in states with mandates reported ≥95% compliance to at least one CLABSI prevention practice (52.3% – 66.4%) than NICUs in states without mandates (28.9% – 48.2%). Mandates were predictors of ≥95% compliance with all practices (OR 2.8; 95% CI 1.4–6.1). NICUs in states with mandates reported lower mean CLABSI rates in the <750gm birth-weight group (2.4 vs. 5.7 CLABSIs/1000 CL-days) but not in others. Mandates were not associated with SIR <1. Conclusions State mandates for NICU CLABSI data submission were significantly associated with ≥95% compliance with CLABSI prevention practices but not with lower CLABSI rates. PMID:25111921
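For reference, the Standardized Infection Ratio used in NHSN reporting is the ratio of observed infections to the number predicted by applying a baseline (e.g., national) rate to the unit's own exposure time; a sketch with hypothetical numbers:

```python
def standardized_infection_ratio(observed, central_line_days, baseline_rate_per_1000):
    """SIR = observed infections / predicted infections, where the predicted
    count comes from a reference baseline rate applied to the unit's own
    central line-days."""
    predicted = baseline_rate_per_1000 * central_line_days / 1000.0
    return observed / predicted

# Hypothetical NICU: 3 CLABSIs over 2000 central line-days,
# against a baseline rate of 2.0 CLABSIs per 1000 line-days.
sir = standardized_infection_ratio(3, 2000, 2.0)  # predicted = 4.0
```

An SIR below 1 means fewer infections than predicted, which is the threshold ("SIR < 1") the study tested for an association with state mandates.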

  14. Practice parameters for the use of autotitrating continuous positive airway pressure devices for titrating pressures and treating adult patients with obstructive sleep apnea syndrome: an update for 2007. An American Academy of Sleep Medicine report.

    PubMed

    Morgenthaler, Timothy I; Aurora, R Nisha; Brown, Terry; Zak, Rochelle; Alessi, Cathy; Boehlecke, Brian; Chesson, Andrew L; Friedman, Leah; Kapur, Vishesh; Maganti, Rama; Owens, Judith; Pancer, Jeffrey; Swick, Todd J

    2008-01-01

    These practice parameters are an update of the previously published recommendations regarding the use of autotitrating positive airway pressure (APAP) devices for titrating pressures and treating adult patients with obstructive sleep apnea syndrome. Continuous positive airway pressure (CPAP) at an effective setting verified by attended polysomnography is a standard treatment for obstructive sleep apnea (OSA). APAP devices change the treatment pressure based on feedback from various patient measures such as airflow, pressure fluctuations, or measures of airway resistance. These devices may aid in the pressure titration process, address possible changes in pressure requirements throughout a given night and from night to night, aid in treatment of OSA when attended CPAP titration has not or cannot be accomplished, or improve patient comfort. A task force of the Standards of Practice Committee of the American Academy of Sleep Medicine has reviewed the literature published since the 2002 practice parameter on the use of APAP. 
Current recommendations follow: (1) APAP devices are not recommended to diagnose OSA; (2) patients with congestive heart failure; patients with significant lung disease, such as chronic obstructive pulmonary disease; patients expected to have nocturnal arterial oxyhemoglobin desaturation due to conditions other than OSA (e.g., obesity hypoventilation syndrome); patients who do not snore (either naturally or as a result of palate surgery); and patients who have central sleep apnea syndromes are not currently candidates for APAP titration or treatment; (3) APAP devices are not currently recommended for split-night titration; (4) certain APAP devices may be used during attended titration with polysomnography to identify a single pressure for use with standard CPAP for treatment of moderate to severe OSA; (5) certain APAP devices may be initiated and used in the self-adjusting mode for unattended treatment of patients with moderate to severe OSA without significant comorbidities (CHF, COPD, central sleep apnea syndromes, or hypoventilation syndromes); (6) certain APAP devices may be used in an unattended way to determine a fixed CPAP treatment pressure for patients with moderate to severe OSA without significant comorbidities (CHF, COPD, central sleep apnea syndromes, or hypoventilation syndromes); (7) patients being treated with fixed CPAP on the basis of APAP titration or being treated with APAP must have close clinical follow-up to determine treatment effectiveness and safety; and (8) a reevaluation and, if necessary, a standard attended CPAP titration should be performed if symptoms do not resolve or the APAP treatment otherwise appears to lack efficacy.

  15. Effect of delayed auditory feedback on stuttering with and without central auditory processing disorders.

    PubMed

    Picoloto, Luana Altran; Cardoso, Ana Cláudia Vieira; Cerqueira, Amanda Venuti; Oliveira, Cristiane Moço Canhetti de

    2017-12-07

    To verify the effect of delayed auditory feedback on the speech fluency of individuals who stutter, with and without central auditory processing disorders. The participants were twenty individuals who stutter, aged 7 to 17 years, divided into two groups: the Stuttering Group with Auditory Processing Disorders (SGAPD), 10 individuals with central auditory processing disorders, and the Stuttering Group (SG), 10 individuals without central auditory processing disorders. Procedures were: fluency assessment with non-altered auditory feedback (NAF) and delayed auditory feedback (DAF), and assessment of stuttering severity and central auditory processing (CAP). Phono Tools software was used to cause a delay of 100 milliseconds in the auditory feedback. The Wilcoxon signed-rank test was used in the intragroup analysis and the Mann-Whitney test in the intergroup analysis. The DAF caused a statistically significant reduction in the SG: in the frequency score of stuttering-like disfluencies in the Stuttering Severity Instrument analysis, in the number of blocks and repetitions of monosyllabic words, and in the frequency of stuttering-like disfluencies of duration. Delayed auditory feedback did not cause statistically significant effects on the fluency of the SGAPD, the individuals who stutter with auditory processing disorders. The effect of delayed auditory feedback on the speech fluency of individuals who stutter differed between the two groups: fluency improved only in the individuals without auditory processing disorders.
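The 100 ms delay used in the study corresponds to a simple delay line of delay_seconds × sample_rate samples. A minimal sketch of that idea (not the Phono Tools implementation; the sample rate and signal below are placeholders):

```python
import numpy as np

def delayed_feedback(signal, sample_rate, delay_seconds=0.1):
    """Return the signal delayed by `delay_seconds`, zero-padded at the
    start, as a DAF device would play it back to the speaker's ears."""
    delay_samples = int(round(delay_seconds * sample_rate))
    return np.concatenate([np.zeros(delay_samples), signal])

sr = 44_100                         # a common audio sample rate
speech = np.ones(1000)              # stand-in for a recorded speech frame
out = delayed_feedback(speech, sr)  # first 4410 samples are silence
```

A real DAF device does this continuously with a ring buffer so the delayed stream is heard while the person keeps speaking, but the arithmetic of the delay is the same.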

  16. Assessment of pharmacy information system performance in selected hospitals in Isfahan City during 2011.

    PubMed

    Saqaeian Nejad Isfahani, Sakineh; Mirzaeian, Razieh; Habibi, Mahbobe

    2013-01-01

    In supporting a therapeutic approach and medication therapy management, the pharmacy information system acts as one of the central pillars of the information system. This ensures that medication therapy is supported and evaluated with an optimal level of safety and quality, similar to other treatments and services. This research aims to evaluate the performance of the pharmacy information system in three types of hospitals: teaching, private, and Social Services-affiliated. The present study is an applied, descriptive, and analytical study conducted on the pharmacy information systems in use in the selected hospitals. The research population included all users of the pharmacy information systems in the selected hospitals; the research sample is the same as the research population. Researchers collected data using a self-designed checklist developed following the guidelines of the American Society of Health-System Pharmacists, the Pharmaceutical Society of Australia, and the therapeutic guidelines of the Drug Commission of the German Medical Association. The checklist's validity was assessed by the research supervisors and by pharmacy information system pharmacists and users. Besides observation, questionnaires were distributed among pharmacy information system pharmacists and users to collect data. Finally, the data were analyzed using the SPSS software. The pharmacy information system was found to be semi-automated in 16 hospitals and automated in 3. Regarding the standards in the guidelines issued by the pharmacists' societies, the highest rank in observing the input standards belonged to the Social Services-affiliated hospitals, with a mean score of 32.75, while the teaching hospitals gained the highest scores in both the processing standards (mean score 29.15) and the output standards (mean score 43.95), and the private hospitals had the lowest mean scores of 23.32, 17.78, and 24.25 in the input, process, and output standards, respectively. 
Based on the findings, the studied hospitals had minimal compliance with the input, processing, and output standards related to the pharmacy information system. It is suggested that establishing a team composed of operational managers, computer experts, health information managers, pharmacists, and physicians could help promote the capabilities of the pharmacy information system so that it can focus on health care practitioners' and users' requirements.

  17. Blood donation in Chile: Replacement and volunteer donors.

    PubMed

    Herrera, Claudia; Martínez, Cristina; Armanet, Leonor; Cárcamo, Amalia; Boye, Patricia; Lyng, Cecilia

    2010-01-01

    In recent years, the Chilean Health Ministry has developed a strategy to improve the safety and timeliness of the blood supply through the creation of a nationally co-ordinated blood transfusion service, centralizing collection management, production and testing in three blood centers across the country and promoting voluntary, regular blood donation. In 2007, a comprehensive study of the situation of blood transfusion services in Chile concluded that several critical factors make it difficult to achieve safe and adequate access to blood and blood components in the country: for example, a low donation rate (14.3/1000 inhabitants), a very low percentage of voluntary donors (10%), an excessive number of blood banks collecting, processing and testing blood (revealing an atomized, non-centralized system), the lack of a national IT system, and insufficient national standards. Two regions of the country, Bio Bio and Valparaíso, where regional blood centers are located, have put in place several strategies in order to obtain better results. Copyright 2009 The International Association for Biologicals. Published by Elsevier Ltd. All rights reserved.

  18. An Infrastructure to Enable Lightweight Context-Awareness for Mobile Users

    PubMed Central

    Curiel, Pablo; Lago, Ana B.

    2013-01-01

    Mobile phones enable us to carry out a wider range of tasks every day, and as a result they have become more ubiquitous than ever. However, they are still more limited in terms of processing power and interaction capabilities than traditional computers, and the often distracting and time-constricted scenarios in which we use them do not help to alleviate these limitations. Context-awareness is a valuable technique for addressing these issues, as it enables application behaviour to be adapted to each situation. In this paper we present a context management infrastructure for mobile environments, aimed at controlling the context information life-cycle in this kind of scenario, with the main goal of enabling applications and services to adapt their behaviour to better meet end-user needs. This infrastructure relies on semantic technologies and open standards to improve interoperability, and is based on a central element, the context manager. This element acts as a central context repository and takes on most of the computational burden derived from dealing with this kind of information, thus relieving the more resource-scarce devices in the system of these tasks. PMID:23899932
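A minimal sketch of the central-repository pattern the paper describes (class and key names are invented for illustration): clients publish context values to the manager, which stores the latest value per key and notifies subscribed applications on change, keeping the bookkeeping off the resource-scarce devices.

```python
from collections import defaultdict

class ContextManager:
    """Minimal central context repository: stores the latest value per
    context key and notifies subscribed callbacks when a key changes."""

    def __init__(self):
        self._store = {}
        self._subscribers = defaultdict(list)

    def publish(self, key, value):
        """Record a new context value and notify interested parties."""
        self._store[key] = value
        for callback in self._subscribers[key]:
            callback(key, value)

    def subscribe(self, key, callback):
        """Register a callback to run whenever `key` is updated."""
        self._subscribers[key].append(callback)

    def query(self, key, default=None):
        """Return the latest known value for a context key."""
        return self._store.get(key, default)

# A hypothetical application adapts when the user's location changes.
manager = ContextManager()
events = []
manager.subscribe("user.location", lambda k, v: events.append(v))
manager.publish("user.location", "office")
```

The real infrastructure layers semantic representations and open protocols on top of this basic publish/subscribe/query cycle, but the division of labour is the same: devices emit and consume context, while the manager holds and reasons over it.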

  19. Widowhood in old age: Viewed in a family context☆

    PubMed Central

    Moss, Miriam S.; Moss, Sidney Z.

    2014-01-01

    Researchers and clinicians have traditionally explored widowhood as an intrapersonal process. We expand the paradigm of bereavement research to explore the widow's perceptions of her experience within a family context. In a study of family bereavement, 24 widows each participated in 2 separate qualitative interviews, followed by standard qualitative analyses of the transcribed narratives. Three inter-related central topics emerged. (1) Widows stress the importance of their independence vis a vis their family as central to their sense of identity. (2) Widows perceive that they and their adult children avoid expressing their feelings of sadness and loss with each other. (3) Widows believe that their children are unable to understand the meaning of the widows' loss because of differences in generations and life situations. Two inter-woven underlying themes emerged: protection of self and of other, and boundaries between widow and children. Just as protection is rooted in a dynamic of separation between widow and child, boundaries are rooted in their deep bond. When researchers and clinicians recognize the dynamics of these two themes they can potentially increase understanding of widowhood within the context of the family. PMID:24655677

  20. Does gaze cueing produce automatic response activation: a lateralized readiness potential (LRP) study.

    PubMed

    Vainio, L; Heimola, M; Heino, H; Iljin, I; Laamanen, P; Seesjärvi, E; Paavilainen, P

    2014-05-01

    Previous research has shown that gaze cues facilitate responses to an upcoming target if the target location is compatible with the direction of the cue. Similar cueing effects have also been observed with central arrow cues. Both of these cueing effects have been attributed to a reflexive orienting of attention triggered by the cue. In addition, orienting of attention has been proposed to result in a partial response activation of the corresponding hand that, in turn, can be observed in the lateralized readiness potential (LRP), an electrophysiological indicator of automatic hand-motor response preparation. For instance, a central arrow cue has been observed to produce automatic hand-motor activation as indicated by the LRPs. The present study investigated whether gaze cues could also produce similar activation patterns in LRP. Although the standard gaze cueing effect was observed in the behavioural data, the LRP data did not reveal any consistent automatic hand-motor activation. The study suggests that motor processes associated with gaze cueing effect may operate exclusively at the level of oculomotor programming. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  1. Psychomotor and cognitive effects of piribedil, a dopamine agonist, in young healthy volunteers.

    PubMed

    Schück, Stéphane; Bentué-Ferrer, Danièle; Kleinermans, Diane; Reymann, Jean-Michel; Polard, Elisabeth; Gandon, Jean-Marc; Allain, Hervé

    2002-02-01

    Piribedil is a dopamine agonist acting on D2 and D3 central nervous system dopamine receptors. This drug was administered to 12 young healthy male volunteers (age 22 +/- 2 years) in a single-center, randomized, double-blind, two-way cross-over, placebo-controlled trial, including a washout period of one week. Placebo and piribedil (3 mg) were administered by a single intravenous infusion over 2 h. Psychomotor performance and cognitive functions were assessed through a standardized, computerized psychometric test battery and continuous electroencephalogram (EEG) mapping. Piribedil improved simple reaction time (P=0.02), immediate (P=0.045 and 0.004) and delayed free recall (P=0.05), and the dual coding test (P=0.02), and increased theta and fast beta waves on the EEG (P < 0.05 and 0.001, respectively). No deleterious effect was observed on the tests exploring attention and concentration via the other procedures. It is concluded that a single intravenous perfusion of piribedil 3 mg improves alertness and information processing speed within the central nervous system in healthy volunteers.

  2. Complex pitch perception mechanisms are shared by humans and a New World monkey.

    PubMed

    Song, Xindong; Osmanski, Michael S; Guo, Yueqi; Wang, Xiaoqin

    2016-01-19

    The perception of the pitch of harmonic complex sounds is a crucial function of human audition, especially in music and speech processing. Whether the underlying mechanisms of pitch perception are unique to humans, however, is unknown. Based on estimates of frequency resolution at the level of the auditory periphery, psychoacoustic studies in humans have revealed several primary features of central pitch mechanisms. It has been shown that (i) pitch strength of a harmonic tone is dominated by resolved harmonics; (ii) pitch of resolved harmonics is sensitive to the quality of spectral harmonicity; and (iii) pitch of unresolved harmonics is sensitive to the salience of temporal envelope cues. Here we show, for a standard musical tuning fundamental frequency of 440 Hz, that the common marmoset (Callithrix jacchus), a New World monkey with a hearing range similar to that of humans, exhibits all of the primary features of central pitch mechanisms demonstrated in humans. Thus, marmosets and humans may share similar pitch perception mechanisms, suggesting that these mechanisms may have emerged early in primate evolution.

  3. Prediction of the Thrust Performance and the Flowfield of Liquid Rocket Engines

    NASA Technical Reports Server (NTRS)

    Wang, T.-S.

    1990-01-01

    In an effort to improve the current solutions in the design and analysis of liquid propulsive engines, a computational fluid dynamics (CFD) model capable of calculating the reacting flows from the combustion chamber, through the nozzle, to the external plume was developed. The Space Shuttle Main Engine (SSME), fired at sea level, was investigated as a sample case. The CFD model, FDNS, is a pressure-based, non-staggered-grid, viscous/inviscid, ideal gas/real gas, reactive code. An adaptive upwind differencing scheme is employed for the spatial discretization. The upwind scheme is based on fourth order central differencing with fourth order damping for smooth regions, and second order central differencing with second order damping for shock capturing. It is equipped with a CHMQGM equilibrium chemistry algorithm and a PARASOL finite rate chemistry algorithm using the point implicit method. The computed flow results and performance compared well with those of other standard codes and with engine hot fire test data. In addition, a transient nozzle flowfield calculation was also performed to demonstrate the ability of FDNS to capture the flow separation during the startup process.
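The flavor of scheme described, central differencing stabilized by an artificial damping term, can be sketched on a periodic 1-D grid as follows. The coefficients and the damping constant are illustrative textbook choices, not FDNS's actual formulation:

```python
import numpy as np

def central_derivative_with_damping(u, dx, eps=0.01):
    """Fourth-order central first derivative on a periodic grid, plus a
    fourth-difference artificial damping term of the kind used to keep
    central schemes stable in smooth regions (coefficients illustrative)."""
    up1, up2 = np.roll(u, -1), np.roll(u, -2)
    um1, um2 = np.roll(u, 1), np.roll(u, 2)
    # 4th-order central difference:
    # (-u[i+2] + 8 u[i+1] - 8 u[i-1] + u[i-2]) / (12 dx)
    dudx = (-up2 + 8.0 * up1 - 8.0 * um1 + um2) / (12.0 * dx)
    # 4th-difference dissipation, vanishingly small for smooth fields:
    damping = -eps * (up2 - 4.0 * up1 + 6.0 * u - 4.0 * um1 + um2) / dx
    return dudx + damping

# Check against a smooth test function: d/dx sin(x) = cos(x).
x = np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False)
dx = x[1] - x[0]
deriv = central_derivative_with_damping(np.sin(x), dx, eps=0.0)
```

For smooth fields the fourth-difference term is tiny (it scales with the fourth derivative), so the damping suppresses grid-scale oscillations without degrading the formal accuracy; shock-capturing variants switch to lower-order differencing with stronger damping near discontinuities.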

  4. Creating and evaluating a data-driven curriculum for central venous catheter placement.

    PubMed

    Duncan, James R; Henderson, Katherine; Street, Mandie; Richmond, Amy; Klingensmith, Mary; Beta, Elio; Vannucci, Andrea; Murray, David

    2010-09-01

    Central venous catheter placement is a common procedure with a high incidence of error. Other fields requiring high reliability have used Failure Mode and Effects Analysis (FMEA) to prioritize quality and safety improvement efforts. Our objective was to use FMEA in the development of a formal, standardized curriculum for central venous catheter training. We surveyed interns regarding their prior experience with central venous catheter placement. A multidisciplinary team used FMEA to identify high-priority failure modes and to develop online and hands-on training modules to decrease the frequency, diminish the severity, and improve the early detection of these failure modes. We required new interns to complete the modules and tracked their progress using multiple assessments. Survey results showed that new interns had little prior experience with central venous catheter placement. Using FMEA, we created a curriculum that focused on planning and execution skills and identified 3 priority topics: (1) retained guidewires, which led to training on handling catheters and guidewires; (2) improved needle access, which prompted the development of an ultrasound training module; and (3) catheter-associated bloodstream infections, which were addressed through training on maximum sterile barriers. Each module included assessments that measured progress toward recognition and avoidance of common failure modes. Since introducing this curriculum, the number of retained guidewires has fallen more than 4-fold. Rates of catheter-associated infections have not yet declined, and it will take time before ultrasound training has a measurable effect. The FMEA provided a process for curriculum development. Precise definitions of failure modes for retained guidewires facilitated development of a curriculum that contributed to a dramatic decrease in the frequency of this complication. Although infections and access complications have not yet declined, failure mode identification, curriculum development, and monitored implementation show substantial promise for improving patient safety during placement of central venous catheters.
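    FMEA prioritization is conventionally done with a risk priority number (RPN); a sketch of that ranking step, with illustrative ratings that are not taken from the study:

```python
def risk_priority_number(severity, occurrence, detection):
    """Classic FMEA score: RPN = severity * occurrence * detection,
    each factor rated on a 1-10 scale (10 = worst)."""
    for rating in (severity, occurrence, detection):
        if not 1 <= rating <= 10:
            raise ValueError("FMEA ratings must lie in 1-10")
    return severity * occurrence * detection

# illustrative (severity, occurrence, detection) ratings only --
# NOT the values assigned by the study's multidisciplinary team
failure_modes = {
    "retained guidewire": (9, 4, 6),
    "difficult needle access": (7, 5, 5),
    "catheter-associated bloodstream infection": (8, 6, 7),
}

# highest-RPN failure modes get curriculum attention first
ranked = sorted(failure_modes,
                key=lambda fm: risk_priority_number(*failure_modes[fm]),
                reverse=True)
```

    Ranking by RPN is what lets a curriculum team spend training time where frequency, severity, and poor detectability compound.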

  5. Annual Forest Inventories for the North Central Region of the United States

    Treesearch

    Ronald E. McRoberts; Mark H. Hansen

    1999-01-01

    The primary objective in developing procedures for annual forest inventories for the north central region of the United States is to establish the capability of producing standard forest inventory and analysis estimates on an annual basis. The inventory system developed to accomplish this objective features several primary functions, including (1) an annual sample of...

  6. Assessment of Loblolly Pine Decline in Central Alabama

    Treesearch

    Nolan J. Hess; William J. Otrosina; Emily A. Carter; Jim R. Steinman; John P. Jones; Lori G. Eckhardt; Ann M. Weber; Charles H. Walkinshaw

    2002-01-01

    Loblolly pine (Pinus taeda L.) decline has been prevalent on upland sites of central Alabama since the 1960's. The purpose of this study was to compare Forest Health Monitoring (FHM) standards and protocols with root health evaluations relative to crown, stem, and site measurements. Thirty-nine 1/6 acre plots were established on loblolly decline...

  7. Child Rights and Quality Education: Child-Friendly Schools in Central and Eastern Europe (CEE)

    ERIC Educational Resources Information Center

    Clair, Nancy; Miske, Shirley; Patel, Deepa

    2012-01-01

    Since the breakup of the Soviet Union and former Yugoslavia, Central and Eastern European (CEE) countries have engaged in education reforms based on international frameworks. One of these, the Child-Friendly Schools (CFS) approach, is distinctively grounded in the Convention on the Rights of the Child (CRC). CFS standards are comprehensive,…

  8. Lessons from the Field: Developing and Implementing the Qatar Student Assessment System, 2002-2006. Technical Report

    ERIC Educational Resources Information Center

    Gonzalez, Gabriella; Le, Vi-Nhuan; Broer, Markus; Mariano, Louis T.; Froemel, J. Enrique; Goldman, Charles A.; DaVanzo, Julie

    2009-01-01

    Qatar has recently positioned itself to be a leader in education. Central to the country's efforts is the implementation of reforms to its K-12 education system. Central to the reform initiatives was the development of internationally benchmarked curriculum standards in four subjects: Arabic, English as a foreign language, mathematics, and…

  9. OUTCROP-BASED HIGH RESOLUTION GAMMA-RAY CHARACTERIZATION OF ARSENIC-BEARING LITHOFACIES IN THE PERMIAN GARBER SANDSTONE AND WELLINGTON FORMATION, CENTRAL OKLAHOMA AQUIFER (COA). CLEVELAND COUNTY, OKLAHOMA

    EPA Science Inventory

    The COA supplies drinking water to a number of municipalities in central Oklahoma. Two major stratigraphic units in the COA, the Garber Sandstone and Wellington Formation, contain naturally occurring arsenic that exceeds government mandated drinking-water standards (EPA, 2001). ...

  10. 78 FR 14076 - Fisheries of the Exclusive Economic Zone Off Alaska; Groundfish of the Gulf of Alaska; Central...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-04

    ... related to management, enforcement and data collection (management costs). Section 304(d)(2) of the MSA... publishes the standard ex-vessel prices and fee percentage for cost recovery under the Central Gulf of... provisions of section 303A of the Magnuson- Stevens Fishery Conservation and Management Act (MSA). Section...

  11. Leading for Instructional Improvement in the Context of Accountability: Central Office Leadership

    ERIC Educational Resources Information Center

    Rigby, Jessica Goodman; Corriell, Rebecca; Kuhl, Katie J.

    2018-01-01

    This case was written to help prepare central office leaders who are expected to design systems and lead toward instructional improvement in the context of both educational accountability and implementation of standards with increased rigor. The intent of this case study is to encourage educators to examine the complex and multiple challenges of…

  12. Capture of Fluorescence Decay Times by Flow Cytometry

    PubMed Central

    Naivar, Mark A.; Jenkins, Patrick; Freyer, James P.

    2012-01-01

    In flow cytometry, the fluorescence decay time of an excitable species has been largely underutilized and is unlikely to be found as a standard parameter on any imaging, sorting, or analyzing cytometry system. Most cytometers lack fluorescence lifetime hardware mainly owing to two central issues. Foremost, research and development with lifetime techniques has not properly exploited modern laser systems, data acquisition boards, and signal processing techniques. Second, a lack of enthusiasm for fluorescence lifetime applications in cells and in bead-based assays has persisted among the greater cytometry community. In this unit, we describe new approaches that address these issues and demonstrate the simplicity of digitally acquiring fluorescence relaxation rates in flow. The unit is divided into protocol and commentary sections in order to provide a comprehensive discourse on acquiring the fluorescence lifetime with frequency-domain methods. The unit covers (i) standard, protocol-based fluorescence lifetime acquisition with frequency-modulated laser excitation, (ii) digital frequency-domain cytometry analyses, and (iii) interfacing fluorescence lifetime measurements onto sorting systems. The unit also discusses how digital methods exploit aliasing in order to harness higher frequency ranges, and closes with a discussion of heterodyning and the processing of waveforms for multi-exponential decay extraction. PMID:25419263
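    The frequency-domain method the unit describes recovers the lifetime from the phase lag between the modulated excitation and the emission; a minimal sketch of the underlying relations, assuming a single-exponential decay (which the unit's multi-exponential discussion generalizes):

```python
import math

def lifetime_from_phase(phase_rad, mod_freq_hz):
    """Single-exponential lifetime from the phase lag of the emission
    behind a sinusoidally modulated excitation: tau = tan(phi) / (2*pi*f)."""
    return math.tan(phase_rad) / (2.0 * math.pi * mod_freq_hz)

def modulation_ratio(tau_s, mod_freq_hz):
    """Demodulation of the emission for the same single-exponential model:
    m = 1 / sqrt(1 + (omega * tau)^2)."""
    omega_tau = 2.0 * math.pi * mod_freq_hz * tau_s
    return 1.0 / math.sqrt(1.0 + omega_tau ** 2)

# round trip: a 4 ns fluorophore probed at 10 MHz modulation
tau = 4e-9
phase = math.atan(2.0 * math.pi * 10e6 * tau)
recovered = lifetime_from_phase(phase, 10e6)
```

    Higher modulation frequencies increase the phase lag for short lifetimes, which is the motivation for the aliasing and heterodyning techniques the unit covers.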

  13. Optimizing the post-graduate institutional program evaluation process.

    PubMed

    Lypson, Monica L; Prince, Mark E P; Kasten, Steven J; Osborne, Nicholas H; Cohan, Richard H; Kowalenko, Terry; Dougherty, Paul J; Reynolds, R Kevin; Spires, M Catherine; Kozlow, Jeffrey H; Gitlin, Scott D

    2016-02-17

    Reviewing program educational efforts is an important component of postgraduate medical education program accreditation. The postgraduate review process has evolved over time to include centralized oversight based on accreditation standards. The institutional review process and its impact on participating faculty are topics not well described in the literature. We conducted multiple Plan-Do-Study-Act (PDSA) cycles to identify and implement areas for change to improve productivity in our institutional program review committee. We also conducted one focus group and six in-person interviews with 18 committee members to explore their perspectives on the committee's evolution. One author (MLL), together with a PhD-level research associate, reviewed the transcripts, performed the initial thematic coding, and identified and categorized themes. These themes were confirmed by all participating committee members upon review of a detailed summary. Emergent themes were triangulated with the University of Michigan Medical School's Admissions Executive Committee (AEC). We present an overview of new practices adopted in the educational program evaluation process at the University of Michigan Health System, including standardization of meetings, inclusion of resident members, development of area content experts, solicitation of committed committee members, transition from paper to electronic committee materials, and a focus on continuous improvement. Faculty and resident committee members identified multiple improvement areas, including the ability to provide high-quality reviews of training programs, personal and professional development, and improved feedback from program trainees. A standing committee that utilizes the expertise of a group of committed faculty members and includes formal resident membership has significant advantages over ad hoc or other organizational structures for program evaluation committees.

  14. [How does central cornea thickness influence intraocular pressure during applanation and contour tonometry?].

    PubMed

    Schwenteck, T; Knappe, M; Moros, I

    2012-09-01

    Goldmann applanation tonometry represents a well-established procedure for measuring intraocular pressure (IOP). This implies the necessity of an accurate measurement of IOP with the reference tonometer. One example is the contour tonometer Pascal, with a measuring probe adapted to the corneal geometry, for measuring the IOP and the ocular pulse amplitude. There is controversy over how strongly corneal thickness affects the measurement of IOP. We thus analysed, for a number of eyes, the correlation between the IOP, as measured by two types of applanation tonometer and one contour tonometer, and the central corneal thickness. In all, 158 patient eyes were investigated in a clinical comparison of the applanation tonometers AT 870 and Ocuton-A. The study was performed by a trained ophthalmologist and the comparison was in accordance with the international standard ISO 8612. In addition, the corneal thickness at the vertex was repeatedly determined using an Oculus Pentacam. The potential effect of central corneal thickness on the IOP measured by the above tonometers was statistically evaluated by rank correlation analysis. We found that the measured IOP values for the three investigated tonometers were not normally distributed. The central corneal thickness values, in contrast, measured on 158 eyes by means of an ultrasound pachymeter and additionally on 235 eyes by the Pentacam, obeyed a Gaussian distribution. For the correlation analysis of both parameters, the Spearman linear rank correlation coefficient (r) was used. We found a very weak (|r| < 0.2) correlation between central corneal thickness and IOP for all 3 tonometers. The weakness of the correlation is also illustrated by the large standard deviation about the regression line. A comparison of the different devices for corneal-thickness measurement shows less variance and a smaller coefficient of variation when the ultrasound pachymeter AL-1000 is used.
    The measured IOP values are only very weakly correlated with the central corneal thickness. For the 3 tonometer types studied there is no need to correct the indicated pressure values for the central corneal thickness of the investigated eye. Clinical comparisons according to the ISO 8612 standard between a tonometer under test and a reference Goldmann applanation tonometer are always a time-consuming procedure. Additional measurements to determine the central corneal thickness of every investigated eye are dispensable. © Georg Thieme Verlag KG Stuttgart · New York.
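    The rank-correlation step can be reproduced with a pure-Python Spearman coefficient (Pearson correlation computed on average ranks, with ties sharing a rank); a sketch for illustration, not the statistics package the authors used:

```python
def _ranks(xs):
    """Average ranks (1-based); tied values share the mean of their ranks."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                      # extend the tie group
        avg_rank = (i + j) / 2.0 + 1.0  # mean rank of the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg_rank
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's r: Pearson correlation of the rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    var_x = sum((a - mx) ** 2 for a in rx)
    var_y = sum((b - my) ** 2 for b in ry)
    return cov / (var_x * var_y) ** 0.5
```

    A value of |r| < 0.2, as reported above, indicates that ranking eyes by corneal thickness tells you almost nothing about their ranking by measured IOP.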

  15. Comparison of the design criteria of 141 onsite wastewater treatment systems available on the French market.

    PubMed

    Dubois, V; Boutin, C

    2018-06-15

    New EC standards published in 2009 led to a surge in onsite wastewater treatment systems reaching the European market. Here we summarize their technical aspects and compare them to known values used in centralized wastewater treatment. The paper deals with two types of processes: attached-growth systems (AGS) on fine media and suspended-growth systems (SGS). Covering 141 technical approvals and 36 manufacturers, we compare onsite design criteria against the centralized wastewater design criteria for each process. The systems use a wide range of materials for bacterial growth, from soil, sand or gravel to zeolite, coconut shavings or rockwool cubes, with a huge range of variation in useful surface, from 0.26 m²/PE for one rockwool cube filter to 5 m²/PE for a traditional vertical sand filter. Some rockwool media can handle an applied daily surface load of 160 g BOD₅/m². SGS design parameters range from 0.025 to 0.34 kg BOD₅ per kg MLVSS/d, with hydraulic retention times of 0.28-3.7 d. For clarifier design, water velocity ranges from 0.15 to 1.47 m/h. In the sludge line, sludge storage volume ranges from 0.125 to 0.56 m³/PE. Copyright © 2017 Elsevier Ltd. All rights reserved.
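    The surface-per-PE figures follow from the daily organic load a medium can accept; a sketch assuming the common European population-equivalent load of 60 g BOD₅/PE/d (an assumption on our part, not stated in the abstract):

```python
def media_area_per_pe(allowable_load_g_per_m2_d, bod5_per_pe_g_d=60.0):
    """Required filter area per population equivalent (m2/PE), given the
    daily BOD5 surface load a medium can accept (g BOD5/m2/d).
    Assumes 60 g BOD5/PE/d, a common European PE definition."""
    return bod5_per_pe_g_d / allowable_load_g_per_m2_d

# e.g. a rockwool medium rated at 160 g BOD5/m2/d
area = media_area_per_pe(160.0)
```

    Under this assumption a 160 g BOD₅/m²/d medium needs 0.375 m²/PE, the same order as the 0.26 m²/PE rockwool filter quoted above, while a low-rate sand filter at 12 g BOD₅/m²/d would need 5 m²/PE.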

  16. Distributed data collection and supervision based on web sensor

    NASA Astrophysics Data System (ADS)

    He, Pengju; Dai, Guanzhong; Fu, Lei; Li, Xiangjun

    2006-11-01

    As a node on the Internet/Intranet, the web sensor has been promoted in recent years and widely applied in remote manufacturing, workshop measurement, and control fields. In the conventional scheme, because of the limited resources of the microprocessor in the sensor, only HTTP is supported and remote users supervise and control the collected data, published as web pages, in a standard browser; moreover, only one data-acquisition node can be supervised and controlled at a time, so the requirement of centralized remote supervision, control, and data processing cannot be satisfied in some fields. In this paper, centralized remote supervision, control, and data processing for web sensors are proposed and implemented on the principle of a device driver program. The useless information in each collected web page embedded in a sensor is filtered out and the useful data is transmitted to the real-time database in the workstation, with a different filter algorithm designed for each sensor that possesses an independent web page. Every sensor node has its own web filter program, called a "web data collection driver program"; the collection details are shielded, and the supervision, control, and configuration software can be implemented by calling the web data collection driver program, just as one uses an I/O driver program. The proposed technology can be applied to data acquisition where only relatively modest real-time performance is required.
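    The driver idea above can be sketched as one parser class per sensor type behind a uniform interface; the class names, page layout, and regular expression below are hypothetical, not taken from the paper:

```python
import re
from abc import ABC, abstractmethod

class WebDataCollectionDriver(ABC):
    """One driver per sensor type hides the page-scraping details behind a
    uniform call, the way an I/O driver hides hardware details."""

    @abstractmethod
    def parse(self, page_html: str) -> dict:
        """Filter out markup and return only the useful readings."""

class TemperatureSensorDriver(WebDataCollectionDriver):
    # hypothetical page layout: "Temperature: 23.5 C" somewhere in the HTML
    _reading = re.compile(r"Temperature:\s*([-+]?\d+(?:\.\d+)?)")

    def parse(self, page_html):
        match = self._reading.search(page_html)
        return {"temperature_c": float(match.group(1))} if match else {}

# supervision software would call parse() on each fetched page and write
# the result into the real-time database
page = "<html><body><h1>Node 7</h1><p>Temperature: 23.5 C</p></body></html>"
reading = TemperatureSensorDriver().parse(page)
```

    Because every driver exposes the same `parse()` call, the central workstation can poll many heterogeneous sensor nodes through one loop, which is the centralization the paper is after.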

  17. Procedures for central auditory processing screening in schoolchildren.

    PubMed

    Carvalho, Nádia Giulian de; Ubiali, Thalita; Amaral, Maria Isabel Ramos do; Santos, Maria Francisca Colella

    2018-03-22

    Central auditory processing screening in schoolchildren has led to debates in the literature, both regarding the protocol to be used and regarding the importance of actions aimed at prevention and promotion of auditory health. Defining effective screening procedures for central auditory processing is a challenge in audiology. This study aimed to analyze the scientific research on central auditory processing screening and discuss the effectiveness of the procedures utilized. A search was performed in the SciELO and PubMed databases by two researchers. The descriptors used, in English and in their Portuguese equivalents, were: auditory processing, screening, hearing, auditory perception, children, and auditory tests. Inclusion criteria: original articles involving schoolchildren, auditory screening of central auditory skills, and articles in Portuguese or English. Exclusion criteria: studies with adult and/or neonatal populations, peripheral auditory screening only, and duplicate articles. After applying these criteria, 11 articles were included. At the international level, the central auditory processing screening methods used were: the screening test for auditory processing disorder and its revised version, the screening test for auditory processing, the scale of auditory behaviors, the children's auditory performance scale, and Feather Squadron. In the Brazilian scenario, the procedures used were the simplified auditory processing assessment and Zaidan's battery of tests. At the international level, the screening test for auditory processing and the Feather Squadron battery stand out as the most comprehensive evaluations of hearing skills. At the national level, there is a paucity of studies that use methods evaluating more than four skills and normed by age group. The use of the simplified auditory processing assessment and questionnaires can be complementary in the search for an easy-access, low-cost alternative for the auditory screening of Brazilian schoolchildren.
    Interactive tools should be proposed that allow the selection of as many hearing skills as possible, validated by comparison with the battery of tests used in diagnosis. Copyright © 2018 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  18. Lean principles to optimize instrument utilization for spine surgery in an academic medical center: an opportunity to standardize, cut costs, and build a culture of improvement.

    PubMed

    Lunardini, David; Arington, Richard; Canacari, Elena G; Gamboa, Kelly; Wagner, Katiri; McGuire, Kevin J

    2014-09-15

    Study design: case study. Objective: to optimize the utilization of operating room instruments for orthopedic and neurosurgical spine cases in an urban level 1 academic medical center through application of Lean principles. Process improvement systems such as Lean have been adapted to health care and offer an opportunity for frank assessment of surgical routines to increase efficiency and enhance value. The goal has been to safely reduce the financial burden on the health care system without compromising care and, if possible, to reallocate these resources or gains in efficiency to further improve value to the patient. The investigators identified instruments as a source of waste in the operating room and proposed a Lean process assessment. The instruments and the instrument processing workflow were described. An audit documented the utilization of each instrument by orthopedic surgeons and neurosurgeons through observation of spine cases. The data were then presented to the stakeholders, including surgeons, the perioperative director, and representatives from nursing, central processing, and the surgical technicians. Across the 38 cases audited, only 89 (58%) of the instruments were used at least once. On the basis of the data and stakeholder consensus, 63 (41%) of the instruments were removed, resulting in a weight reduction of 17.5 lb and consolidation of 2 instrument sets into 1. Projected cost savings were approximately $41,000 annually. Although new instruments were purchased to standardize sets, the return on investment was estimated at 2 years. Inefficient surgical routines may constitute significant resource waste in an institution. Process assessment is an important tool in decreasing health care costs, with objectivity provided by Lean or similar principles and essential impetus to change provided by stakeholders. Level of evidence: 4.

  19. Modified ground-truthing: an accurate and cost-effective food environment validation method for town and rural areas.

    PubMed

    Caspi, Caitlin Eicher; Friebur, Robin

    2016-03-17

    A major concern in food environment research is the lack of accuracy in commercial business listings of food stores, which are convenient and commonly used. Accuracy concerns may be particularly pronounced in rural areas. Ground-truthing, or on-site verification, has been deemed the necessary standard for validating business listings, but researchers perceive this process to be costly and time-consuming. This study calculated the accuracy and cost of ground-truthing three town/rural areas in Minnesota, USA (an area of 564 miles, or 908 km), and simulated a modified validation process to increase efficiency without compromising accuracy. For traditional ground-truthing, all streets in the study area were driven while the route and the geographic coordinates of food stores were recorded. The process required 1510 miles (2430 km) of driving and 114 staff hours. The ground-truthed list of stores was compared with commercial business listings, which had an average positive predictive value (PPV) of 0.57 and sensitivity of 0.62 across the three sites. Using observations from the field, a modified process was proposed in which only the streets located within central commercial clusters (the 1/8 mile, or 200 m, buffer around any cluster of 2 stores) would be validated. Modified ground-truthing would have yielded an estimated PPV of 1.00 and sensitivity of 0.95, and would have reduced mileage costs by approximately 88%. We conclude that ground-truthing is necessary in town/rural settings. The modified ground-truthing process, with excellent accuracy at a fraction of the cost, suggests a new standard and warrants further evaluation.
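    The PPV and sensitivity figures come from comparing the commercial listing against the ground-truthed store list; a minimal sketch with hypothetical store identifiers:

```python
def listing_validation(listed, ground_truth):
    """PPV = true positives / all listed stores (how much of the listing
    is real); sensitivity = true positives / all ground-truthed stores
    (how much of reality the listing captures)."""
    listed, ground_truth = set(listed), set(ground_truth)
    true_pos = len(listed & ground_truth)
    ppv = true_pos / len(listed)
    sensitivity = true_pos / len(ground_truth)
    return ppv, sensitivity

# hypothetical example: the listing names 4 stores, on-site verification
# confirms A and B, finds E the listing missed, and refutes C and D
ppv, sens = listing_validation(
    ["A", "B", "C", "D"],   # commercial business listing
    ["A", "B", "E"],        # stores verified on the ground
)
```

    A PPV of 0.57 therefore means that over 40% of listed stores did not exist on the ground, which is the error the modified validation targets.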

  20. Standards for Libraries Within Regional Library Systems in Saskatchewan.

    ERIC Educational Resources Information Center

    Saskatchewan Library Association, Regina.

    These quantitative standards for the delivery of library services to a dispersed population, which were developed by the Saskatchewan Library Association, are based on the decentralized delivery of library services backed up by the centralized provision of technical services, resource people, and special collections in Saskatchewan. The roles of…

  1. Role of Clinical Practice in Teacher Preparation: Perceptions of Elementary Teacher Candidates

    ERIC Educational Resources Information Center

    Singh, Delar K.

    2017-01-01

    The Council for Accreditation of Teacher Education Programs (CAEP) has established five standards to measure the effectiveness of teacher preparation programs. Clinical partnerships and practice represent "Standard 2." The CAEP requires that teacher education programs design high quality clinical practice that is central to preparation…

  2. Standards for School Library/Media Programs, 1972-75.

    ERIC Educational Resources Information Center

    Wisconsin State Dept. of Public Instruction, Madison. Div. of Library Services.

    To aid elementary, middle, junior high, and high schools in planning an Instructional Materials Center, this handbook presents standards for this modern concept of a school library. The term Instructional Materials Center (IMC) is used throughout to designate a centralized collection of materials, with a staff of professional and clerical…

  3. Assessing the Complexity of Students' Knowledge in Chemistry

    ERIC Educational Resources Information Center

    Bernholt, Sascha; Parchmann, Ilka

    2011-01-01

    Current reforms in the education policy of various countries are intended to produce a paradigm shift in the educational system towards an outcome orientation. After implementing educational standards as normative objectives, the development of test procedures that adequately reflect these targets and standards is a central problem. This paper…

  4. Will Standards Save Public Education? New Democracy Forum Series.

    ERIC Educational Resources Information Center

    Meier, Deborah

    The lead essay in this collection, "Educating a Democracy" by Deborah Meier, rejects the idea of a centralized authority that dictates how and what teachers teach. Standardization prevents citizens from shaping their own schools, classrooms, and communities. Schools teach democratic virtues and provide much of this teaching by example.…

  5. 75 FR 71033 - Air Quality Designations for the 2008 Lead (Pb) National Ambient Air Quality Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    .... These include damage to the central nervous system, cardiovascular function, kidneys, immune system, and... growth); (5) Meteorology (weather/transport patterns); (6) Geography/topography (mountain ranges or other... Air Quality Designations for the 2008 Lead (Pb) National Ambient Air Quality Standards AGENCY...

  6. Rushing Roulette

    PubMed Central

    Scanlan, Larry

    1976-01-01

    In a Canada-wide survey, CANADIAN FAMILY PHYSICIAN found a startling divergence in provincial standards for ambulance crews and vehicles. While some provinces had developed a well-integrated ambulance system with central dispatching, rigorous standards for attendants and advanced paramedical training programs, in some the ambulances are run almost entirely by local undertakers. PMID:21308032

  7. The International Learning Object Metadata Survey

    ERIC Educational Resources Information Center

    Friesen, Norm

    2004-01-01

    A wide range of projects and organizations is currently making digital learning resources (learning objects) available to instructors, students, and designers via systematic, standards-based infrastructures. One standard that is central to many of these efforts and infrastructures is known as Learning Object Metadata (IEEE 1484.12.1-2002, or LOM).…

  8. 76 FR 76328 - Energy Conservation Program: Enforcement of Regional Standards for Residential Furnaces and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-07

    ... contractors in the product supply chain. The Department is considering these approaches or some combination of... Conditioners and Heat Pumps AGENCY: Office of Energy Efficiency and Renewable Energy, Department of Energy... efficiency standards for residential furnaces and residential central air conditioners and heat pumps. DOE...

  9. Central auditory processing disorder (CAPD) in children with specific language impairment (SLI). Central auditory tests.

    PubMed

    Dlouha, Olga; Novak, Alexej; Vokral, Jan

    2007-06-01

    The aim of this project was to use central auditory tests for the diagnosis of central auditory processing disorder (CAPD) in children with specific language impairment (SLI), in order to confirm the relationship between speech-language impairment and central auditory processing. We attempted to establish special dichotic binaural tests in the Czech language, modified for younger children. The tests are based on behavioral audiometry using dichotic listening (different auditory stimuli presented to each ear simultaneously). The experimental tasks consisted of three auditory measures (tests 1-3): dichotic listening to two-syllable words, presented as binaural interaction tests. Children with SLI are unable to create simple sentences from two words that are heard separately but simultaneously. Results in our group of 90 pre-school children (6-7 years old) confirmed an integration deficit and problems with the quality of short-term memory. The average success rate of children with specific language impairment was 56% in test 1, 64% in test 2, and 63% in test 3; results of the control group were 92% in test 1, 93% in test 2, and 92% in test 3 (p<0.001). Our results indicate a relationship between disorders of speech-language perception and central auditory processing disorders.

  10. Review of the general geology and solid-phase geochemical studies in the vicinity of the Central Oklahoma aquifer

    USGS Publications Warehouse

    Mosier, Elwin L.; Bullock, John H.

    1988-01-01

    The Central Oklahoma aquifer is the principal source of ground water for municipal, industrial, and rural use in central Oklahoma. Ground water in the aquifer is contained in consolidated sedimentary rocks consisting of the Admire, Council Grove, and Chase Groups, the Wellington Formation, and the Garber Sandstone, and in the unconsolidated Quaternary alluvium and terrace deposits that occur along the major stream systems in the study area. The Garber Sandstone and the Wellington Formation comprise the main flow system and, as such, the aquifer is often referred to as the 'Garber-Wellington aquifer.' The consolidated sedimentary rocks consist of interbedded lenticular sandstone, shale, and siltstone beds deposited in similar deltaic environments in early Permian time. Arsenic, chromium, and selenium are found in the ground water of the Central Oklahoma aquifer in concentrations that, in places, exceed the primary drinking-water standards of the Environmental Protection Agency. Gross-alpha concentrations also exceed the primary standards in some wells, and uranium concentrations are uncommonly high in places. As a prerequisite to a surface and subsurface solid-phase geochemical study, this report summarizes the general geology of the Central Oklahoma study area. Summaries of results from certain previously reported solid-phase geochemical studies that relate to the vicinity of the Central Oklahoma aquifer are also given, including a summary of the analytical results and distribution plots for arsenic, selenium, chromium, thorium, uranium, copper, and barium from the U.S. Department of Energy's National Uranium Resource Evaluation (NURE) Program.

  11. Change in area of geographic atrophy in the Age-Related Eye Disease Study: AREDS report number 26.

    PubMed

    Lindblad, Anne S; Lloyd, Patricia C; Clemons, Traci E; Gensler, Gary R; Ferris, Frederick L; Klein, Michael L; Armstrong, Jane R

    2009-09-01

    To characterize progression of geographic atrophy (GA) associated with age-related macular degeneration in AREDS, as measured by digitized fundus photographs. Fundus photographs from 181 of 4757 AREDS participants with a GA area of at least 0.5 disc areas at baseline, or from participants who developed bilateral GA during follow-up, were scanned, digitized, and evaluated longitudinally. Geographic atrophy area was determined using planimetry. Rates of progression from noncentral to central GA, and of vision loss following development of central GA, were assessed for the entire AREDS cohort. Median initial lesion size was 4.3 mm². Average change in digital area of GA from baseline was 2.03 mm² (standard error of the mean, 0.24 mm²) at 1 year, 3.78 mm² (0.24 mm²) at 2 years, 5.93 mm² (0.34 mm²) at 3 years, and 1.78 mm² (0.086 mm²) per year overall. Median time to developing central GA after any GA diagnosis was 2.5 years (95% confidence interval, 2.0-3.0). Average visual acuity decreased by 3.7 letters at first documentation of central GA, and by 22 letters at year 5. Growth of GA area can be reliably measured using standard fundus photographs that are digitized and subsequently graded at a reading center. Development of GA is associated with subsequent further growth of GA, development of central GA, and loss of central vision.

  12. Motivation enhances visual working memory capacity through the modulation of central cognitive processes.

    PubMed

    Sanada, Motoyuki; Ikeda, Koki; Kimura, Kenta; Hasegawa, Toshikazu

    2013-09-01

    Motivation is well known to enhance working memory (WM) capacity, but the mechanism underlying this effect remains unclear. The WM process can be divided into encoding, maintenance, and retrieval, and in a change detection visual WM paradigm, the encoding and retrieval processes can be subdivided into perceptual and central processing. To clarify which of these segments are most influenced by motivation, we measured ERPs in a change detection task with differential monetary rewards. The results showed that the enhancement of WM capacity under high motivation was accompanied by modulations of late central components but not those reflecting attentional control on perceptual inputs across all stages of WM. We conclude that the "state-dependent" shift of motivation impacted the central, rather than the perceptual functions in order to achieve better behavioral performances. Copyright © 2013 Society for Psychophysiological Research.

  13. Modern-Day Demographic Processes in Central Europe and Their Potential Interactions with Climate Change

    NASA Astrophysics Data System (ADS)

    Bański, Jerzy

    2013-01-01

    The aim of this article is to evaluate the effect of contemporary transformations in the population of Central European countries on climate change, in addition to singling out the primary points of interaction between demographic processes and the climate. In analyzing the interactions between climate and demographics, we can formulate three basic hypotheses regarding the region in question: 1) as a result of current demographic trends in Central Europe, the influence of the region on its climate will probably diminish, 2) the importance of the "climatically displaced" in global migratory movements will increase, and some of those concerned will move to Central Europe, 3) the contribution of the region to global food security will increase. In the last decade, most of the countries that make up Central Europe have reported a decline in population growth and a negative migration balance. As a process, this loss of population may have a positive effect on the environment and the climate. We can expect ongoing climate change to intensify migration processes, particularly from countries outside Europe. Interactions between climate and demographic processes can also be viewed in the context of food security. The global warming most sources foresee for the coming decades is the process most likely to result in spatial polarization of food production in agriculture. Central Europe will then face the challenge of assuring and improving food security, albeit this time on a global scale.

  14. Feasibility of a Centralized Clinical Trials Coverage Analysis: A Joint Initiative of the American Society of Clinical Oncology and the National Cancer Institute.

    PubMed

    Szczepanek, Connie M; Hurley, Patricia; Good, Marjorie J; Denicoff, Andrea; Willenberg, Kelly; Dawson, Casey; Kurbegov, Dax

    2017-06-01

    Clinical trial billing compliance is a challenge that is faced by overburdened clinical trials sites. The requirements place institutions and research sites at increased potential for financial risk. To reduce their risk, sites develop a coverage analysis (CA) before opening each trial. For multisite trials, this translates into system-wide redundancies, inconsistencies, trial delays, and potential costs to sites and patients. These factors exacerbate low accrual rates to cancer clinical trials. ASCO and the National Cancer Institute (NCI) collaborated to address this problem. An ASCO Research Community Forum working group proposed the concept of providing centrally developed CAs to research sites at protocol startup. The group collaborated with NCI and billing compliance experts to hold a symposium for key stakeholders to share knowledge, build skills, provide tools to conduct centralized CAs, and strategize about the next steps. Forty-eight attendees, who represented a range of stakeholders, participated in the symposium. As a result of this initiative, NCI directed the Cancer Trials Support Unit to convene a working group with NCI's National Clinical Trials Network (NCTN) and Community Oncology Research Program (NCORP) to develop tools and processes for generating CAs for their trials. A CA template with core elements was developed and is being adapted in a pilot project across NCTN Group and NCORP Research Bases. Centralized CAs for multisite trials-using standardized tools and templates-are feasible. They have the potential to reduce risk for patients and sites, forecast budget needs, and help decrease trial startup times that impede patient access and accrual to clinical trials.

  15. [Neurological complications of chronic alcoholism: study of 42 observations in Guinea].

    PubMed

    Cisse, F A; Keita, M M; Diallo, I M; Camara, M I; Konate, M M; Konate, F; Conde, K; Diallo, A N; Nyassinde, J; Djigue, B S; Camara, M; Koumbassa, M L; Diakhate, I; Cisse, A

    2014-01-01

    Neurologic disorders related to chronic alcoholism in traditional areas of Guinea are frequent, but reports about them are rare. We conducted the first study in Guinea on this subject and retrospectively collected 42 cases of neurologic manifestations related to alcoholism over a 7-year period. The standard findings of the literature were confirmed in our population: peak frequency after the age of 40 years (82.8%) and clear male overrepresentation (M/F sex ratio: 13/1). All the standard signs and symptoms are reported, with a clear predominance of alcoholic polyneuropathy and hepatic encephalopathy. The study of nutritional status by both body mass index (BMI) and the Detsky criteria showed that these patients were severely malnourished. Brain MRI was crucial for diagnosing the standard central nervous system complications of alcoholism: Gayet-Wernicke encephalopathy, Marchiafava-Bignami disease, Korsakoff syndrome, central pontine myelinolysis, and cerebellar degeneration.

  16. DOD can save millions by using energy efficient centralized aircraft support systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1982-05-07

    The ways the Department of Defense can save millions of dollars annually by using new energy efficient centralized aircraft support systems at certain Air Force and Navy bases are discussed. The Air Force and Navy have developed and installed several different systems and have realized some degree of success. However, each service has developed its systems independently. Consequently, there is no commonality between the services' systems which could permit economical procurements for standard servicewide systems. Standardization would also prevent duplication of design efforts by the services and minimize proliferation of aircraft support equipment. It also would allow the services to further reduce costs by combining requirements to assure the most economical quantities for buying system components. GAO makes specific recommendations to the Secretaries of Defense and the Air Force to develop standard systems and to install them at all bases where feasible and practical.

  17. The biological standard of living and mortality in Central Italy at the beginning of the 19th century.

    PubMed

    Coppola, Michela

    2013-12-01

    The biological standard of living in Central Italy at the beginning of the 19th century is analyzed using newly collected data on the height of recruits in the army of the Papal States. The results reveal a decline in height for the cohorts born under French rule (1796-1815). Although this trend was common to many parts of Europe, the estimated magnitude of the decline suggests a worsening of the biological standard of living of the working classes in the Papal States even relative to that of other countries. Despite the differences in the economic systems within the Papal States, no significant geographical variation in height has been found: even the most dynamic and advanced regions experienced a dramatic height decline. Mortality also increased during the period under consideration. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Economic analysis of effluent limitation guidelines and standards for the centralized waste treatment industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wheeler, W.

    1998-12-01

    This report estimates the economic and financial effects and the benefits of compliance with the proposed effluent limitations guidelines and standards for the Centralized Waste Treatment (CWT) industry. The Environmental Protection Agency (EPA) has measured these impacts in terms of changes in the profitability of waste treatment operations at CWT facilities, changes in market prices for CWT services, and changes in the quantities of waste managed at CWT facilities in six geographic regions. EPA has also examined the impacts on companies owning CWT facilities (including impacts on small entities), on communities in which CWT facilities are located, and on environmental justice. EPA examined the benefits to society of the CWT effluent limitations guidelines and standards by examining cancer and non-cancer health effects of the regulation, recreational benefits, and cost savings to publicly owned treatment works (POTWs) to which indirect-discharging CWT facilities send their wastewater.

  19. Exploring Pre-K Age 4 Learning Standards and Their Role in Early Childhood Education: Research and Policy Implications. Research Report. ETS RR-16-14

    ERIC Educational Resources Information Center

    De Bruin-Parecki, Andrea; Slutzky, Carly

    2016-01-01

    Currently in the United States, 50 states, 5 territories, and the District of Columbia have established prekindergarten (pre-K) age 4 learning standards that are intended to outline skills and knowledge that set children on a path to success in kindergarten and upcoming grades. These standards are emphasized as a centralizing force in early…

  20. Product and Process, Literacy and Orality: An Essay on Composition and Culture.

    ERIC Educational Resources Information Center

    Killingsworth, M. Jimmie

    1993-01-01

    Argues that two oppositions (product versus process, literacy versus orality) bear a special relationship to one another resembling a ratio. Relates product and literacy to centralized authority, and relates process and orality to open-minded exchange, thus evoking the central dilemma of modern culture. (HB)

  1. Central Processing Dysfunctions in Children: A Review of Research.

    ERIC Educational Resources Information Center

    Chalfant, James C.; Scheffelin, Margaret A.

    Research on central processing dysfunctions in children is reviewed in three major areas. The first, dysfunctions in the analysis of sensory information, includes auditory, visual, and haptic processing. The second, dysfunction in the synthesis of sensory information, covers multiple stimulus integration and short-term memory. The third area of…

  2. The role of the central nervous system in the generation and maintenance of chronic pain in rheumatoid arthritis, osteoarthritis and fibromyalgia

    PubMed Central

    2011-01-01

    Pain is a key component of most rheumatologic diseases. In fibromyalgia, the importance of central nervous system pain mechanisms (for example, loss of descending analgesic activity and central sensitization) is well documented. A few studies have also noted alterations in central pain processing in osteoarthritis, and some data, including the observation of widespread pain sensitivity, suggest that central pain-processing defects may alter the pain response in rheumatoid arthritis patients. When central pain is identified, different classes of analgesics (for example, serotonin-norepinephrine reuptake inhibitors, α2δ ligands) may be more effective than drugs that treat peripheral or nociceptive pain (for example, nonsteroidal anti-inflammatory drugs and opioids). PMID:21542893

  3. Extending the amygdala in theories of threat processing

    PubMed Central

    Fox, Andrew S.; Oler, Jonathan A.; Tromp, Do P.M.; Fudge, Julie L.; Kalin, Ned H.

    2015-01-01

    The central extended amygdala is an evolutionarily conserved set of interconnected brain regions that play an important role in threat processing to promote survival. Two core components of the central extended amygdala, the central nucleus of the amygdala (Ce) and the lateral bed nucleus of the stria terminalis (BST), are highly similar regions that serve complementary roles by integrating fear- and anxiety-relevant information. Survival depends on the central extended amygdala's ability to rapidly integrate and respond to threats that vary in their immediacy, proximity, and characteristics. Future studies will benefit from understanding alterations in central extended amygdala function in relation to stress-related psychopathology. PMID:25851307

  4. Perceptual load-dependent neural correlates of distractor interference inhibition.

    PubMed

    Xu, Jiansong; Monterosso, John; Kober, Hedy; Balodis, Iris M; Potenza, Marc N

    2011-01-18

    The load theory of selective attention hypothesizes that distractor interference is suppressed after perceptual processing (i.e., in the later stage of central processing) at low perceptual load of the central task, but in the early stage of perceptual processing at high perceptual load. Consistently, studies on the neural correlates of attention have found a smaller distractor-related activation in the sensory cortex at high relative to low perceptual load. However, it is not clear whether the distractor-related activation in brain regions linked to later stages of central processing (e.g., in the frontostriatal circuits) is also smaller at high rather than low perceptual load, as might be predicted based on the load theory. We studied 24 healthy participants using functional magnetic resonance imaging (fMRI) during a visual target identification task with two perceptual loads (low vs. high). Participants showed distractor-related increases in activation in the midbrain, striatum, occipital and medial and lateral prefrontal cortices at low load, but distractor-related decreases in activation in the midbrain ventral tegmental area and substantia nigra (VTA/SN), striatum, thalamus, and extensive sensory cortices at high load. Multiple levels of central processing involving midbrain and frontostriatal circuits participate in suppressing distractor interference at either low or high perceptual load. For suppressing distractor interference, the processing of sensory inputs in both early and late stages of central processing is enhanced at low load but inhibited at high load.

  5. Time-saving design of experiment protocol for optimization of LC-MS data processing in metabolomic approaches.

    PubMed

    Zheng, Hong; Clausen, Morten Rahr; Dalsgaard, Trine Kastrup; Mortensen, Grith; Bertram, Hanne Christine

    2013-08-06

    We describe a time-saving protocol for the processing of LC-MS-based metabolomics data by optimizing parameter settings in XCMS and threshold settings for removing noisy and low-intensity peaks using design of experiment (DoE) approaches including Plackett-Burman design (PBD) for screening and central composite design (CCD) for optimization. A reliability index, which is based on evaluation of the linear response to a dilution series, was used as a parameter for the assessment of data quality. After identifying the significant parameters in the XCMS software by PBD, CCD was applied to determine their values by maximizing the reliability and group indexes. Optimal settings by DoE resulted in improvements of 19.4% and 54.7% in the reliability index for a standard mixture and human urine, respectively, as compared with the default setting, and a total of 38 h was required to complete the optimization. Moreover, threshold settings were optimized by using CCD for further improvement. The approach combining optimal parameter setting and the threshold method improved the reliability index about 9.5 times for a standard mixture and 14.5 times for human urine data, which required a total of 41 h. Validation results also showed improvements in the reliability index of about 5-7 times even for urine samples from different subjects. It is concluded that the proposed methodology can be used as a time-saving approach for improving the processing of LC-MS-based metabolomics data.
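
The reliability index described above scores a data-processing setting by how linearly detected peak intensities respond to a dilution series. A minimal sketch of one plausible formulation — the fraction of features whose intensities correlate linearly (r² above a threshold) with the dilution levels; the paper's exact definition and threshold may differ:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

def reliability_index(dilutions, feature_table, r2_min=0.9):
    """Fraction of features whose intensities respond linearly
    (r^2 >= r2_min) across the dilution series."""
    linear = sum(1 for intensities in feature_table
                 if pearson_r(dilutions, intensities) ** 2 >= r2_min)
    return linear / len(feature_table)

dilutions = [1.0, 0.5, 0.25, 0.125]      # relative concentrations
features = [
    [1000, 510, 248, 130],   # tracks the dilution -> counts as reliable
    [500, 900, 400, 850],    # noise, no linear response -> does not count
]
print(reliability_index(dilutions, features))  # -> 0.5
```

A parameter setting that yields more dilution-tracking features (and fewer noise peaks) scores a higher index, which is what the DoE optimization maximizes.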

  6. Productivity Measures Associated With a Patient Access Initiative

    PubMed Central

    Gable, William H.; Pappas, Theodore N.; Jacobs, Danny O.; Cutler, Desmond A.; Kuo, Paul C.

    2006-01-01

    Objective: To assess financial performance associated with a patient 7-day access initiative. Background Data: Patient access to clinical services is frequently an obstacle at academic medical centers. Conflicting surgeon priorities among academic, clinical, educational, and leadership duties often create difficulties for patient entry into the “system.” Methods: The scope and objectives were identified to be: design of a standard, simple new patient appointment process, design of a standard process in cases where an appointment is not available in 7 days, use subspecialty team search capabilities, minimize/eliminate prescheduling requirements, centralize appointment scheduling, and creation and reporting of 7-day access metrics. Following maturation of the process, the 7-day access metrics from the period July 2004 to December 2004 and January 2005 to June 2005 were compared with corresponding time periods from calendar years 2001, 2002, and 2003. Results: Payor mix was unaltered. The median waiting time for a new patient appointment decreased from 21 days to 10 days. When compared with calendar years 2001, 2002, and 2003, respectively, the 2 periods of the 7-day access initiative in calendar years 2004 and 2005 were associated with significantly increased visits, new patients, operative procedures, hospital charges, and physician charges. Conclusions: Implementation of a 7-day access initiative can significantly increase financial productivity of general surgery groups in academic medical centers. We conclude that simplifying access to services can benefit academic surgical practices. Sustaining this level of productivity will continue to prove challenging. PMID:16632994

  7. The Interaction between Central and Peripheral Processing in Chinese Handwritten Production: Evidence from the Effect of Lexicality and Radical Complexity

    PubMed Central

    Zhang, Qingfang; Feng, Chen

    2017-01-01

    The interaction between central and peripheral processing in written word production remains controversial. This study aims to investigate whether the effects of radical complexity and lexicality in central processing cascade into peripheral processing in Chinese written word production. The participants were asked to write characters and non-characters (lexicality) with different radical complexity (few- and many-strokes). The findings indicated that regardless of the lexicality, the writing latencies were longer for characters with higher complexity (the many-strokes condition) than for characters with lower complexity (the few-strokes condition). The participants slowed down their writing execution at the radicals' boundary strokes, which indicated a radical boundary effect in peripheral processing. Interestingly, the lexicality and the radical complexity affected the pattern of shift velocity and writing velocity during the execution of writing. Lexical processing cascades into peripheral processing but only at the beginning of Chinese characters. In contrast, the radical complexity influenced the execution of handwriting movement throughout the entire character, and the pattern of the effect interacted with the character frequency. These results suggest that the processes of the lexicality and the radical complexity function during the execution of handwritten word production, which suggests that central processing cascades into peripheral processing during Chinese character handwriting. PMID:28348536

  8. The CUAHSI Water Data Center: Enabling Data Publication, Discovery and Re-use

    NASA Astrophysics Data System (ADS)

    Seul, M.; Pollak, J.

    2014-12-01

    The CUAHSI Water Data Center (WDC) supports a standards-based, services-oriented architecture for time-series data and provides a separate service to publish spatial data layers as shape files. Two new services that the WDC offers are a cloud-based server (Cloud HydroServer) for publishing data and a web-based client for data discovery. The Cloud HydroServer greatly simplifies data publication by eliminating the need for scientists to set up an SQL-server data base, a requirement that has proven to be a significant barrier, and ensures greater reliability and continuity of service. Uploaders have been developed to simplify the metadata documentation process. The web-based data client eliminates the need for installing a program to be used as a client and works across all computer operating systems. The services provided by the WDC are a foundation for big data use, re-use, and meta-analyses. Using data transmission standards enables far more effective data sharing and discovery; standards used by the WDC are part of a global set of standards that should enable scientists to access unprecedented amounts of data to address larger-scale research questions than was previously possible. A central mission of the WDC is to ensure these services meet the needs of the water science community and are effective at advancing water science.

  9. Reliability of anthropometric measurements in European preschool children: the ToyBox-study.

    PubMed

    De Miguel-Etayo, P; Mesana, M I; Cardon, G; De Bourdeaudhuij, I; Góźdź, M; Socha, P; Lateva, M; Iotova, V; Koletzko, B V; Duvinage, K; Androutsos, O; Manios, Y; Moreno, L A

    2014-08-01

    The ToyBox-study aims to develop and test an innovative and evidence-based obesity prevention programme for preschoolers in six European countries: Belgium, Bulgaria, Germany, Greece, Poland and Spain. In multicentre studies, anthropometric measurements using standardized procedures that minimize errors in the data collection are essential to maximize reliability of measurements. The aim of this paper is to describe the standardization process and reliability (intra- and inter-observer) of height, weight and waist circumference (WC) measurements in preschoolers. All technical procedures and devices were standardized and centralized training was given to the fieldworkers. At least seven children per country participated in the intra- and inter-observer reliability testing. Intra-observer technical error ranged from 0.00 to 0.03 kg for weight and from 0.07 to 0.20 cm for height, with the overall reliability being above 99%. A second training was organized for WC due to low reliability observed in the first training. Intra-observer technical error for WC ranged from 0.12 to 0.71 cm during the first training and from 0.05 to 1.11 cm during the second training, and reliability above 92% was achieved. Epidemiological surveys need standardized procedures and training of researchers to reduce measurement error. In the ToyBox-study, very good intra- and inter-observer agreement was achieved for all anthropometric measurements performed. © 2014 World Obesity.
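
The intra-observer technical error and reliability figures quoted above are conventionally computed from duplicate measurements as TEM = sqrt(Σd²/2N) and R = 1 − TEM²/SD². A minimal sketch under that standard anthropometric convention, with hypothetical duplicate height measurements (the abstract does not give the ToyBox raw data or its exact formulae):

```python
import math
import statistics

def technical_error(pairs):
    """Intra-observer technical error of measurement (TEM) for duplicate
    measurements: sqrt(sum(d^2) / (2N)), d = difference within each pair."""
    return math.sqrt(sum((a - b) ** 2 for a, b in pairs) / (2 * len(pairs)))

def reliability(pairs):
    """Coefficient of reliability R = 1 - TEM^2 / SD^2, where SD is the
    standard deviation of all measurements across subjects."""
    tem = technical_error(pairs)
    all_values = [v for pair in pairs for v in pair]
    sd = statistics.stdev(all_values)
    return 1 - (tem ** 2) / (sd ** 2)

# Hypothetical duplicate height measurements (cm) on seven children.
heights = [(101.2, 101.3), (95.0, 95.1), (110.4, 110.4), (98.7, 98.6),
           (104.1, 104.2), (92.3, 92.3), (107.8, 107.9)]
print(f"TEM = {technical_error(heights):.3f} cm")
print(f"reliability = {reliability(heights):.4f}")
```

Small within-pair differences relative to the between-subject spread give a TEM near zero and a reliability near 1, matching the >99% reliability reported for height.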

  10. Central Auditory Processing through the Looking Glass: A Critical Look at Diagnosis and Management.

    ERIC Educational Resources Information Center

    Young, Maxine L.

    1985-01-01

    The article examines the contributions of both audiologists and speech-language pathologists to the diagnosis and management of students with central auditory processing disorders and language impairments. (CL)

  11. Lessons learned from the integration of local stakeholders in water management approaches in central-northern Namibia

    NASA Astrophysics Data System (ADS)

    Jokisch, A.; Urban, W.

    2012-04-01

    Water is the main limiting factor for economic and agricultural development in central-northern Namibia, where approximately 50% of the Namibian population lives on less than 10% of the country's surface area. The climate in the region can be characterized as semi-arid, with distinctive rainy and dry seasons and an average precipitation of 470 mm/a. Central-northern Namibia can furthermore be characterized by a system of so-called Oshanas, very shallow ephemeral river streams which drain the whole region from north to south towards the Etosha-Saltpan. Water quality within these ephemeral river streams rapidly decreases towards the end of the dry season due to high rates of evaporation (2,700 mm/a) which makes the water unsuitable for human consumption and in certain times of the year also for irrigation purposes. Other local water resources are scarce or of low quality. Therefore, the local water supply is mainly secured via a pipeline scheme which is fed by the Namibian-Angolan border river Kunene. Within the research project CuveWaters - Integrated Water Resources Management in central-northern Namibia different small scale water supply and sanitation technologies are implemented and tested as part of the projects multi-resource mix. The aim is to decentralize the regional water supply and make it more sustainable especially in the face of climate change. To gain understanding and to create ownership within the local population for the technologies implemented, stakeholder participation and capacity development are integral parts of the project. As part of the implementation process of rainwater harvesting and water harvesting from ephemeral river streams, pilot plants for the storage of water were constructed with the help of local stakeholders who will also be the beneficiaries of the pilot plants. The pilot plants consist of covered storage tanks and infrastructure for small scale horticultural use of the water stored. 
These small scale horticultural activities enable the users of the pilot plants to improve their standard of living by producing vegetables for self-consumption or for selling them on local markets. Irrigation for small-scale horticulture was virtually unknown in the region prior to the project which makes intense training for the local users necessary. This paper summarizes the participative process of finding a pilot village and a suitable location along the ephemeral river stream as well as the process of selecting people from the local community for construction and for the operation of the pilot plant. According to the demand-responsive approach of the CuveWaters project, local stakeholders were involved in all these processes. Tools for participation used are workshops and interviews with local stakeholders and the integration of the users in all decision-making processes as well as in construction, maintenance, operation and monitoring.

  12. Poverty and Ethnicity: A Cross-Country Study of Roma Poverty in Central Europe. World Bank Technical Paper.

    ERIC Educational Resources Information Center

    Revenga, Ana; Ringold, Dena; Tracy, William Martin

    Roma, or "gypsies," are the main poverty risk group in many countries of central and eastern Europe. Living standards for the Roma have deteriorated more severely during the region's transition to a market economy than they have for other population groups, and Roma have been poorly positioned to take advantage of emerging economic and…

  13. Supine Length, Weight and Head Circumference at Birth in Central Iran

    ERIC Educational Resources Information Center

    Ayatollahi, S. M. T.; Rafiei, Mohammad

    2007-01-01

    Supine length, weight and head circumferences of 10,241 neonates (5241 boys, 5000 girls, sex ratio 105) born in Arak (central Iran) in 2004 are reported. The mean plus or minus standard deviation of boys' and girls' (p value for sex difference) supine length (mm), weight (g) and head circumference (mm) were estimated as 501 plus or minus 30 and…

  14. Chemical release of pole-sized trees in a central hardwood clearcut

    Treesearch

    J. W. Van Sambeek; D. Abugarshall Kai; David B. Shenaut

    1995-01-01

    Our study evaluated the effectiveness of tree injection and full basal bark treatments using three herbicide formulations at reduced or standard practice rates to release crop trees in an overstocked pole-sized Central Hardwood stand. Herbicides tested included glyphosate (Accord), dicamba only (Banvel CST), and dicamba+2,4-D (Banvel 520). The study was conducted in a...

  15. Essays on remote monitoring as an emerging tool for centralized management of decentralized wastewater systems

    NASA Astrophysics Data System (ADS)

    Solomon, Clement

    According to the United States Environmental Protection Agency (USEPA), nearly one in four households in the United States depends on an individual septic system (commonly referred to as an onsite system or a decentralized wastewater system) to treat and disperse wastewater. More than half of these systems are over 30 years old, and surveys indicate at least 10 to 20% might not be functioning properly. The USEPA concluded in its 1997 report to Congress that adequately managed decentralized wastewater systems (DWS) are a cost-effective and long-term option for meeting public health and water quality goals, particularly in less densely populated areas. The major challenge, however, is the absence of a guiding national regulatory framework based on consistent performance-based standards and lack of proper management of DWS. These inconsistencies pose a significant threat to our water resources, local economies, and public health. This dissertation addresses key policy and regulatory strategies needed in response to the new realities confronting decentralized wastewater management. The two core objectives of this research are to demonstrate the paradigm of centralized management of DWS and to present a scientific methodology to develop performance-based standards (a regulatory shift from prescriptive methods) using remote monitoring. The underlying remote monitoring architecture for centralized DWS management and the value of science-based policy making are presented. Traditionally, prescriptive standards using conventional grab sampling data are the norm by which most standards are set. Three case studies that support the potential of remote monitoring as a tool for standards development and system management are presented. The results revealed a vital role for remote monitoring in the development of standardized protocols, policies and procedures that are greatly lacking in this field. 
This centralized management and remote monitoring paradigm fits well and complements current USEPA policy (13 elements of management); meets the growing need for qualitative data (objective and numerical); has better time efficiencies as real-time events are sampled and translated into machine-readable signals in a short period of time; allows cost saving rapid response to system recovery and operation; produces labor and economic efficiencies through targeted responses; and, improves the quality and operational costs of any management program. This project was funded by the USEPA grant # C-82878001 as part of the National Onsite Demonstration Project (NODP), West Virginia University.

  16. The effect of low versus high approach-motivated positive affect on memory for peripherally versus centrally presented information.

    PubMed

    Gable, Philip A; Harmon-Jones, Eddie

    2010-08-01

    Emotions influence attention and processes involved in memory. Although some research has suggested that positive affect categorically influences these processes differently than neutral affect, recent research suggests that motivational intensity of positive affective states influences these processes. The present experiments examined memory for centrally or peripherally presented information after the evocation of approach-motivated positive affect. Experiment 1 found that, relative to neutral conditions, pregoal, approach-motivated positive affect (caused by a monetary incentives task) enhanced memory for centrally presented information, whereas postgoal, low approach-motivated positive affect enhanced memory for peripherally presented information. Experiment 2 found that, relative to a neutral condition, high approach-motivated positive affect (caused by appetitive pictures) enhanced memory for centrally presented information but hindered memory for peripheral information. These results suggest a more complex relationship between positive affect and memory processes and highlight the importance of considering the motivational intensity of positive affects in cognitive processes. Copyright 2010 APA

  17. Stress and tension-type headache mechanisms.

    PubMed

    Cathcart, Stuart; Winefield, Anthony H; Lushington, Kurt; Rolan, Paul

    2010-10-01

    Stress is widely demonstrated as a contributing factor in tension-type headache (TTH). The mechanisms underlying this remain unclear at present. Recent research indicates the importance of central pain processes in TTH pathophysiology. Concurrently, research with animals and healthy humans has begun to elucidate the relationship between stress and pain processing in the central nervous system, including central pain processes putatively dysfunctional in TTH. Combined, these two fields of research present new insights and hypotheses into possible mechanisms by which stress may contribute to TTH. To date, however, there has been no comprehensive review of this literature. The present paper provides such a review, which may be valuable in facilitating a broader understanding of the central mechanisms by which stress may contribute to TTH.

  18. Testing the Predictions of the Central Capacity Sharing Model

    ERIC Educational Resources Information Center

    Tombu, Michael; Jolicoeur, Pierre

    2005-01-01

    The divergent predictions of 2 models of dual-task performance are investigated. The central bottleneck and central capacity sharing models argue that a central stage of information processing is capacity limited, whereas stages before and after are capacity free. The models disagree about the nature of this central capacity limitation. The…

  19. PCR-DGGE analysis of lactic acid bacteria and yeast dynamics during the production processes of three varieties of Panettone.

    PubMed

    Garofalo, C; Silvestri, G; Aquilanti, L; Clementi, F

    2008-07-01

    To study lactic acid bacteria (LAB) and yeast dynamics during the production processes of sweet-leavened goods manufactured with type I sourdoughs. Fourteen sourdough and dough samples were taken from a baking company in central Italy along the production lines of three varieties of Panettone. The samples underwent pH measurements and plating analysis on three solid media. The microbial DNA was extracted from both the (sour)doughs and the viable LAB and yeast cells collected in bulk, and subjected to PCR-denaturing gradient gel electrophoresis (DGGE) analysis. The molecular fingerprinting of the cultivable plus noncultivable microbial populations provided evidence of the dominance of Lactobacillus sanfranciscensis, Lactobacillus brevis and Candida humilis in the three fermentation processes. The DGGE profiles of the cultivable communities revealed a bacterial shift in the final stages of two of the production processes, suggesting an effect of technological parameters on the selection of the dough microflora. Our findings confirm the importance of using a combined analytical approach to explore microbial communities that develop during the leavening process of sweet-leavened goods. In-depth studies of sourdough biodiversity and population dynamics occurring during sourdough fermentation are fundamental for the control of the leavening process and the manufacture of standardized, high-quality products.

  20. Cloud-based Web Services for Near-Real-Time Web access to NPP Satellite Imagery and other Data

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Valente, E. G.

    2010-12-01

    We are building a scalable, cloud computing-based infrastructure for Web access to near-real-time data products synthesized from the U.S. National Polar-Orbiting Environmental Satellite System (NPOESS) Preparatory Project (NPP) and other geospatial and meteorological data. Given recent and ongoing changes in the NPP and NPOESS programs (now the Joint Polar Satellite System), the need for timely delivery of NPP data is urgent. We propose an alternative to a traditional, centralized ground segment, using distributed Direct Broadcast facilities linked to industry-standard Web services by a streamlined processing chain running in a scalable cloud computing environment. Our processing chain, currently implemented on Amazon.com's Elastic Compute Cloud (EC2), retrieves raw data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) and synthesizes data products such as Sea-Surface Temperature, Vegetation Indices, etc. The cloud computing approach lets us grow and shrink computing resources to meet large and rapid fluctuations (twice daily) in both end-user demand and data availability from polar-orbiting sensors. Early prototypes have delivered various data products to end-users with latencies between 6 and 32 minutes. We have begun to replicate machine instances in the cloud, so as to reduce latency and maintain near-real-time data access regardless of increased data input rates or user demand -- all at quite moderate monthly costs. Our service-based approach (in which users invoke software processes on a Web-accessible server) facilitates access into datasets of arbitrary size and resolution, and allows users to request and receive tailored and composite (e.g., false-color multiband) products on demand. To facilitate broad impact and adoption of our technology, we have emphasized open, industry-standard software interfaces and open source software.
Through our work, we envision the widespread establishment of similar, derived, or interoperable systems for processing and serving near-real-time data from NPP and other sensors. A scalable architecture based on cloud computing ensures cost-effective, real-time processing and delivery of NPP and other data. Access via standard Web services maximizes its interoperability and usefulness.
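
    The elastic-scaling approach this abstract describes (growing and shrinking compute to track twice-daily ingest and demand peaks) can be illustrated with a minimal sketch. The function name, backlog metric, and thresholds below are illustrative assumptions, not part of the NPP processing chain described above.

    ```python
    # Hypothetical sketch of an elastic-scaling decision: choose a worker
    # (cloud-instance) count from the current backlog of unprocessed
    # satellite granules, so capacity tracks twice-daily ingest peaks.
    # All names and thresholds here are assumptions for illustration.

    def workers_needed(queued_granules: int,
                       granules_per_worker: int = 4,
                       min_workers: int = 1,
                       max_workers: int = 20) -> int:
        """Return how many instances to run for the current backlog."""
        if queued_granules <= 0:
            # Keep one warm instance so low-traffic requests stay low-latency.
            return min_workers
        # Ceiling division: enough workers to clear the backlog in one pass.
        needed = -(-queued_granules // granules_per_worker)
        # Clamp to the configured fleet limits.
        return max(min_workers, min(max_workers, needed))

    if __name__ == "__main__":
        for backlog in (0, 3, 40, 500):
            print(backlog, "->", workers_needed(backlog))
    ```

    In a real deployment the backlog metric would come from a queue service and the clamp limits from cost constraints; the point is only that the scaling decision is a simple, frequently re-evaluated function of observed load.
    
    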

Top