Standardized UXO Technology Demonstration Site, Scoring Record No. 943
2014-08-01
COLLERAN ROAD, ABERDEEN PROVING GROUND, MARYLAND 21005-5059. TEDT-AT-SL-M MEMORANDUM FOR Program Manager – SERDP...equipment. Small munitions grid contains 300 grid cells. The center of each grid cell contains either munitions, clutter, or nothing, with a portion...weather was warm and the field dry throughout the survey period for Battelle. 3.3.3 Soil Moisture: Three soil probes were placed at various
A grid for a precise analysis of daily activities.
Wojtasik, V; Olivier, C; Lekeu, F; Quittre, A; Adam, S; Salmon, E
2010-01-01
Assessment of daily living activities is essential in patients with Alzheimer's disease. Most current tools quantitatively assess overall ability but provide little qualitative information on individual difficulties. Only a few tools allow therapists to evaluate stereotyped activities and record different types of errors. We capitalised on the Kitchen Activity Assessment to design a widely applicable analysis grid that provides both qualitative and quantitative data on activity performance. A cooking activity was videotaped in 15 patients with dementia and assessed according to the different steps in the execution of the task. The evaluations obtained with our grid showed good correlations between raters, between versions of the grid and between sessions. Moreover, the degree of independence obtained with our analysis of the task correlated with the Kitchen Activity Assessment score and with a global score of cognitive functioning. We conclude that assessment of a daily living activity with this analysis grid is reproducible and relatively independent of the therapist, and thus provides quantitative and qualitative information useful for both evaluating and caring for demented patients.
2008-08-01
DEMONSTRATOR’S FIELD PERSONNEL: Geophysicist: Craig Hyslop; Geophysicist: John Jacobsen; Geophysicist: Rob Mehl. 3.7 DEMONSTRATOR’S FIELD...Practical Nonparametric Statistics, W.J. Conover, John Wiley & Sons, 1980, pages 144 through 151. APPENDIX F. ABBREVIATIONS
Gupta, Kishan; Beer, Nathan J.; Keller, Lauren A.; Hasselmo, Michael E.
2014-01-01
Prior studies of head direction (HD) cells indicate strong landmark control over the preferred firing direction of these cells, with few studies exhibiting shifts away from local reference frames over time. We recorded spiking activity of grid and HD cells in the medial entorhinal cortex of rats, testing correlations of local environmental cues with the spatial tuning curves of these cells' firing fields as animals performed continuous spatial alternation on a T-maze that shared the boundaries of an open-field arena. The environment was rotated into configurations the animal had either seen or not seen in the past recording week. Tuning curves of both cell types demonstrated commensurate shifts of tuning with T-maze rotations during less recent rotations, more so than recent rotations. This strongly suggests that animals are shifting their reference frame away from the local environmental cues over time, learning to use a different reference frame more likely reliant on distal or idiothetic cues. In addition, grid fields demonstrated varying levels of “fragmentation” on the T-maze. The propensity for fragmentation does not depend on grid spacing and grid score, nor animal trajectory, indicating the cognitive treatment of environmental subcompartments is likely driven by task demands. PMID:23382518
2007-05-01
BOX 25046, FEDERAL CENTER, M.S. 964, DENVER, CO 80225-0046. TECHNOLOGY TYPE/PLATFORM: TMGS MAGNETOMETER/TOWED ARRAY. PREPARED BY: U.S. ARMY...GEOLOGICAL SURVEY, TMGS MAGNETOMETER/TOWED ARRAY) 8-CO-160-UXO-021 Karwatka, Michael... TMGS Magnetometer/Towed Array, MEC
Racadio, John M.; Abruzzo, Todd A.; Johnson, Neil D.; Patel, Manish N.; Kukreja, Kamlesh U.; den Hartog, Mark. J. H.; Hoornaert, Bart P.A.; Nachabe, Rami A.
2015-01-01
The purpose of this study was to reduce pediatric doses while maintaining or improving image quality scores without removing the grid from X‐ray beam. This study was approved by the Institutional Animal Care and Use Committee. Three piglets (5, 14, and 20 kg) were imaged using six different selectable detector air kerma (Kair) per frame values (100%, 70%, 50%, 35%, 25%, 17.5%) with and without the grid. Number of distal branches visualized with diagnostic confidence relative to the injected vessel defined image quality score. Five pediatric interventional radiologists evaluated all images. Image quality score and piglet Kair were statistically compared using analysis of variance and receiver operating curve analysis to define the preferred dose setting and use of grid for a visibility of 2nd and 3rd order vessel branches. Grid removal reduced both dose to subject and imaging quality by 26%. Third order branches could only be visualized with the grid present; 100% detector Kair was required for smallest pig, while 70% detector Kair was adequate for the two larger pigs. Second order branches could be visualized with grid at 17.5% detector Kair for all three pig sizes. Without the grid, 50%, 35%, and 35% detector Kair were required for smallest to largest pig, respectively. Grid removal reduces both dose and image quality score. Image quality scores can be maintained with less dose to subject with the grid in the beam as opposed to removed. Smaller anatomy requires more dose to the detector to achieve the same image quality score. PACS numbers: 87.53.Bn, 87.57.N‐, 87.57.cj, 87.59.cf, 87.59.Dj PMID:26699297
AVQS: attack route-based vulnerability quantification scheme for smart grid.
Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik
2014-01-01
A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification.
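For illustration only, a minimal sketch of route-based scoring in the spirit described above; the paper's actual formulas for the network vulnerability score and end-to-end security score are not given in the abstract, so the per-hop CVSS-style scores, link-protection factors, and the aggregation rule below are assumptions.

```python
# Illustrative sketch only: the aggregation (mean device score along the route,
# discounted by the weakest end-to-end link protection) is an assumption chosen
# for illustration, not the authors' scheme.
from dataclasses import dataclass
from typing import List

@dataclass
class Hop:
    name: str
    cvss_base: float          # per-device vulnerability score, 0..10
    link_protection: float    # 0 (no protection) .. 1 (fully protected link)

def route_vulnerability(route: List[Hop]) -> float:
    """Score one attack route: average device vulnerability along the path,
    discounted by the weakest end-to-end protection on its links."""
    if not route:
        return 0.0
    network_score = sum(h.cvss_base for h in route) / len(route)
    end_to_end_security = min(h.link_protection for h in route)
    return network_score * (1.0 - end_to_end_security)

# Example: a smart meter -> data concentrator -> head-end system route in an AMI domain.
ami_route = [
    Hop("smart meter", cvss_base=6.5, link_protection=0.2),
    Hop("data concentrator", cvss_base=7.8, link_protection=0.5),
    Hop("head-end system", cvss_base=9.0, link_protection=0.7),
]
print(f"route vulnerability score: {route_vulnerability(ami_route):.2f}")
```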
[Mobile emergency care medical records audit: the need for Tunisian guidelines].
Mallouli, Manel; Hchaichi, Imen; Ammar, Asma; Sehli, Jihène; Zedini, Chekib; Mtiraoui, Ali; Ajmi, Thouraya
2017-03-06
Objective: This study was designed to assess the quality of the Gabès (Tunisia) mobile emergency care medical records and propose corrective actions. Materials and methods: A clinical audit was performed at the Gabès mobile emergency care unit (SMUR). Records of day, night and weekend primary and secondary interventions during the first half of 2014 were analysed according to a data collection grid comprising 56 criteria based on the SMUR guidelines and the 2013 French Society of Emergency Medicine evaluation guide. A non-conformance score was calculated for each section. Results: 415 medical records were analysed. The highest non-conformance rates (48.5%) concerned the “specificities of the emergency medical record” section. The lowest non-conformance rates concerned the surveillance data section (23.4%). The non-conformance score for the medical data audit was 24%. Conclusion: This audit identified minor dysfunctions that could be due to the absence of local guidelines concerning medical records in general and more specifically SMUR. Corrective measures were set up in the context of a short-term and intermediate-term action plan.
AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid
Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik
2014-01-01
A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification. PMID:25152923
Standardized UXO Technology Demonstration Site Blind Grid Scoring Record Number 891
2008-08-01
magnetometers (Foerster CON650 gradiometers) and RTK-DGPS georeferencing will be used. The spacing between the individual fluxgate sensors will be 25 cm...used for data acquisition usually ranges from 8 to 32. b. For the demonstration at Aberdeen Proving Ground, a system with eight fluxgate ...up to 32 fluxgate gradiometers (for the APG demonstration: eight fluxgate gradiometers), a robust, all-terrain trailer, the MonMX data acquisition
Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 764
2006-04-01
Attainable accuracy of depth (z): ±0.3 meter. Detection performance for ferrous and nonferrous metals: will detect ammunition components 20-mm...ASSOCIATES, INC., 6832 OLD DOMINION DRIVE, MCLEAN, VA 22101. TECHNOLOGY TYPE/PLATFORM: MULTI CHANNEL DETECTOR SYSTEM (AMOS)/TOWED. PREPARED BY: U.S...Multi Channel Detector System (AMOS)/Towed, MEC
Standardized UXO Technology Demonstration Site, Blind Grid Scoring Record No. 919
2008-07-01
provided by demonstrator) a. The core component of the electromagnetic (EM) AMOS metal detector is a linear multichannel sensor array consisting of a...Attainable accuracy of depth (z): ±0.3 m. h. Detection performance for ferrous and nonferrous metals: will detect ammunition components 20-mm caliber...2-meter-wide transmitter coil and 16 receiver coils, mounted on a robust, all-terrain trailer (fig. 1). b. The AMOS detector unit consists of the
Standardized UXO Technology Demonstration Site, Blind Grid Scoring Record Number 842
2007-06-01
collection sessions. Daily: A location identified as having no subsurface metal will be designated as a calibration point. Readings will be... metallic item will be placed below the center of the sensors, and the instrument’s response will be observed. The item will then be removed, and static... nonferrous anomalies. Due to limitations of the magnetometer, the nonferrous items cannot be detected. Therefore, the ROC curves presented in Figures
NASA Astrophysics Data System (ADS)
Guo, Lijuan; Yan, Haijun; Hao, Yongqi; Chen, Yun
2018-01-01
As urban power grids move toward high-reliability supply, appropriate methods are needed for the comprehensive evaluation of existing equipment. Given the breadth and multi-dimensionality of power system data, big data mining is used to explore the latent patterns and value in power system equipment data. Based on main transformer monitoring data and records of defects and faults, this paper integrates data on the power grid equipment environment. Apriori is used as an association-rule identification algorithm to extract the frequent correlation factors of the main transformer, and the latent dependencies in the big data are analyzed through support and confidence. The integrated data are then analyzed by PCA, and an integrated quantitative scoring model is constructed. The evaluation algorithm and scheme are validated on a test set and shown to be effective. This paper provides a new approach to data fusion for the smart grid and a reference for further evaluation of big data on power grid equipment.
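A minimal sketch of the two stages named in the abstract (Apriori-style support/confidence mining followed by a PCA-based composite score); the event names, thresholds and variance-weighted scoring are illustrative assumptions, not the authors' implementation.

```python
# Stage 1: Apriori-style support/confidence on binarised transformer events.
# Stage 2: PCA-based composite score on continuous condition indicators.
# All data, feature names and thresholds below are synthetic placeholders.
import numpy as np
from itertools import combinations
from sklearn.decomposition import PCA

records = [
    {"high_oil_temp", "dga_alarm", "defect"},
    {"high_oil_temp", "defect"},
    {"dga_alarm"},
    {"high_oil_temp", "dga_alarm", "defect"},
    {"heavy_load"},
]

def support(itemset, records):
    return sum(itemset <= r for r in records) / len(records)

items = sorted(set().union(*records))
for a, b in combinations(items, 2):
    sup = support({a, b}, records)
    if sup >= 0.4:                                   # assumed minimum support
        conf = sup / support({a}, records)           # confidence of a -> b
        print(f"{{{a}, {b}}} support={sup:.2f} conf({a}->{b})={conf:.2f}")

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))          # e.g. oil temperature, load, DGA gases, age
X = (X - X.mean(axis=0)) / X.std(axis=0)
pca = PCA(n_components=2).fit(X)
scores = pca.transform(X) @ pca.explained_variance_ratio_   # variance-weighted composite score
print("first five composite equipment scores:", np.round(scores[:5], 3))
```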
Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 805
2007-03-01
and receiver (RX) coils. b. The Tensor Magnetic Gradiometer System (TMGS) has been reconfigured to improve its performance compared with the...ALL TEM. The TMGS raw data files consist of an ASCII header with system settings followed by the data in binary format. The GPS positions, EDA...exported in ASCII format. A new data acquisition system for the TMGS will be supplied by the demonstrator. It is controlled by LabVIEW, as is the ALL
2008-09-01
heading north from the southern end point, and then returning south from the northern end point. 2) A metallic pin-flag is placed over the midpoint...test involves traverses across a known point located away from buried UXO or other metallic debris. A 5-meter length of line is walked in eight...ferrous and nonferrous anomalies. Due to limitations of the magnetometer, the nonferrous items cannot be detected. Therefore, the ROC curves
Tabuse, Hideaki; Kalali, Amir; Azuma, Hideki; Ozaki, Norio; Iwata, Nakao; Naitoh, Hiroshi; Higuchi, Teruhiko; Kanba, Shigenobu; Shioe, Kunihiko; Akechi, Tatsuo; Furukawa, Toshi A
2007-09-30
The Hamilton Rating Scale for Depression (HAMD) is the de facto international gold standard for the assessment of depression. There are some criticisms, however, especially with regard to its inter-rater reliability, due to the lack of standardized questions or explicit scoring procedures. The GRID-HAMD was developed to provide standardized explicit scoring conventions and a structured interview guide for administration and scoring of the HAMD. We developed the Japanese version of the GRID-HAMD and examined its inter-rater reliability among experienced and inexperienced clinicians (n=70), how rater characteristics may affect it, and how training can improve it in the course of a model training program using videotaped interviews. The results showed that the inter-rater reliability of the GRID-HAMD total score was excellent to almost perfect and those of most individual items were also satisfactory to excellent, both with experienced and inexperienced raters, and both before and after the training. With its standardized definitions, questions and detailed scoring conventions, the GRID-HAMD appears to be the best achievable set of interview guides for the HAMD and can provide a solid tool for highly reliable assessment of depression severity.
Using Option Grids: steps toward shared decision-making for neonatal circumcision.
Fay, Mary; Grande, Stuart W; Donnelly, Kyla; Elwyn, Glyn
2016-02-01
To assess the impact, acceptability and feasibility of a short encounter tool designed to enhance the process of shared decision-making and parental engagement. We analyzed video-recordings of clinical encounters, half undertaken before and half after a brief intervention that trained four clinicians how to use Option Grids, using an observer-based measure of shared decision-making. We also analyzed semi-structured interviews conducted with the clinicians four weeks after their exposure to the intervention. Observer OPTION(5) scores were higher at post-intervention, with a mean of 33.9 (SD=23.5) compared to a mean of 16.1 (SD=7.1) for pre-intervention, a significant difference of 17.8 (95% CI: 2.4, 33.2). Prior to using the intervention, clinicians used a consent document to frame circumcision as a default practice. Encounters with the Option Grid conferred agency to both parents and clinicians, and facilitated shared decision-making. Clinicians reported recognizing the tool's positive effect on their communication process. Tools such as Option Grids have the potential to make it easier for clinicians to achieve shared decision-making. Encounter tools have the potential to change practice. More research is needed to test their feasibility in routine practice. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
A linkable identity privacy algorithm for HealthGrid.
Zhang, Ning; Rector, Alan; Buchan, Iain; Shi, Qi; Kalra, Dipak; Rogers, Jeremy; Goble, Carole; Walker, Steve; Ingram, David; Singleton, Peter
2005-01-01
The issues of confidentiality and privacy have become increasingly important as Grid technology is being adopted in public sectors such as healthcare. This paper discusses the importance of protecting the confidentiality and privacy of patient health/medical records, and the challenges exhibited in enforcing this protection in a Grid environment. It proposes a novel algorithm to allow traceable/linkable identity privacy in dealing with de-identified medical records. Using the algorithm, de-identified health records associated to the same patient but generated by different healthcare providers are given different pseudonyms. However, these pseudonymised records of the same patient can still be linked by a trusted entity such as the NHS trust or HealthGrid manager. The paper has also recommended a security architecture that integrates the proposed algorithm with other data security measures needed to achieve the desired security and privacy in the HealthGrid context.
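A minimal sketch of one way to realise linkable identity privacy, assuming an HMAC-based keyed pseudonym scheme; the paper's actual algorithm is not reproduced here, and the keys and identifiers below are purely illustrative.

```python
# Illustrative sketch, not the paper's algorithm: each provider derives its own
# pseudonym for a patient, so records from different providers are unlinkable to
# outsiders, while a trusted entity holding a master secret can compute a common
# linkage token for the same patient.
import hmac, hashlib

PROVIDER_KEYS = {                       # per-provider secrets (hypothetical)
    "hospital_a": b"secret-key-a",
    "gp_practice_b": b"secret-key-b",
}
TRUST_KEY = b"trusted-entity-master-secret"   # known only to the NHS trust / HealthGrid manager

def pseudonym(provider: str, patient_id: str) -> str:
    """Different providers derive different pseudonyms for the same patient."""
    return hmac.new(PROVIDER_KEYS[provider], patient_id.encode(), hashlib.sha256).hexdigest()

def linkage_token(patient_id: str) -> str:
    """Only the trusted entity, which knows TRUST_KEY, can compute this common
    token and therefore link records carrying different provider pseudonyms."""
    return hmac.new(TRUST_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

pid = "943 476 5919"
print(pseudonym("hospital_a", pid) != pseudonym("gp_practice_b", pid))   # True: unlinkable to outsiders
print(linkage_token(pid) == linkage_token(pid))                          # True: linkable by the trusted entity
```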
An Algorithm for Converting Contours to Elevation Grids.
ERIC Educational Resources Information Center
Reid-Green, Keith S.
Some of the test questions for the National Council of Architectural Registration Boards deal with the site, including drainage, regrading, and the like. Some questions are most easily scored by examining contours, but others, such as water flow questions, are best scored from a grid in which each element is assigned its average elevation. This…
Soblosky, J S; Colgin, L L; Chorney-Lane, D; Davidson, J F; Carey, M E
1997-12-30
Hindlimb and forelimb deficits in rats caused by sensorimotor cortex lesions are frequently tested by using the narrow flat beam (hindlimb), the narrow pegged beam (hindlimb and forelimb) or the grid-walking (forelimb) tests. Although these are excellent tests, the narrow flat beam generates non-parametric data so that using more powerful parametric statistical analyses are prohibited. All these tests can be difficult to score if the rat is moving rapidly. Foot misplacements, especially on the grid-walking test, are indicative of an ongoing deficit, but have not been reliably and accurately described and quantified previously. In this paper we present an easy to construct and use horizontal ladder-beam with a camera system on rails which can be used to evaluate both hindlimb and forelimb deficits in a single test. By slow motion videotape playback we were able to quantify and demonstrate foot misplacements which go beyond the recovery period usually seen using more conventional measures (i.e. footslips and footfaults). This convenient system provides a rapid and reliable method for recording and evaluating rat performance on any type of beam and may be useful for measuring sensorimotor recovery following brain injury.
Bazelet, Corinna S; Thompson, Aileen C; Naskrecki, Piotr
2016-01-01
The use of endemism and vascular plants only for biodiversity hotspot delineation has long been contested. Few studies have focused on the efficacy of global biodiversity hotspots for the conservation of insects, an important, abundant, and often ignored component of biodiversity. We aimed to test five alternative diversity measures for hotspot delineation and examine the efficacy of biodiversity hotspots for conserving a non-typical target organism, South African katydids. Using a 1° fishnet grid, we delineated katydid hotspots in two ways: (1) count-based: grid cells in the top 10% of total, endemic, threatened and/or sensitive species richness; vs. (2) score-based: grid cells with a mean value in the top 10% on a scoring system which scored each species on the basis of its IUCN Red List threat status, distribution, mobility and trophic level. We then compared katydid hotspots with each other and with recognized biodiversity hotspots. Grid cells within biodiversity hotspots had significantly higher count-based and score-based diversity than non-hotspot grid cells. There was a significant association between the three types of hotspots. Of the count-based measures, endemic species richness was the best surrogate for the others. However, the score-based measure out-performed all count-based diversity measures. Species richness was the least successful surrogate of all. The strong performance of the score-based method for hotspot prediction emphasizes the importance of including species' natural history information for conservation decision-making, and is easily adaptable to other organisms. Furthermore, these results add empirical support for the efficacy of biodiversity hotspots in conserving non-target organisms.
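As a rough sketch of the two delineation rules described above (top 10% of grid cells by a count-based and by a score-based measure), assuming synthetic occurrence data and species scores rather than the authors' katydid dataset:

```python
# Synthetic illustration of count-based vs. score-based hotspot delineation on a
# 1-degree fishnet grid; species identities, scores and cell coordinates are made up.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
occ = pd.DataFrame({
    "species": rng.integers(0, 30, 500),       # 30 hypothetical species
    "lat_cell": rng.integers(-35, -22, 500),   # 1-degree fishnet cells
    "lon_cell": rng.integers(16, 33, 500),
})
# Per-species conservation score (e.g. threat status + endemism + mobility + trophic level)
species_score = pd.Series(rng.integers(4, 17, 30), name="score")
occ["score"] = occ["species"].map(species_score)

cells = occ.groupby(["lat_cell", "lon_cell"]).agg(
    richness=("species", "nunique"),    # count-based measure
    mean_score=("score", "mean"),       # score-based measure
)
count_hotspots = cells[cells["richness"] >= cells["richness"].quantile(0.9)]
score_hotspots = cells[cells["mean_score"] >= cells["mean_score"].quantile(0.9)]
print(len(count_hotspots), "count-based cells;", len(score_hotspots), "score-based cells")
print("overlap:", len(count_hotspots.index.intersection(score_hotspots.index)))
```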
Bazelet, Corinna S.; Thompson, Aileen C.; Naskrecki, Piotr
2016-01-01
The use of endemism and vascular plants only for biodiversity hotspot delineation has long been contested. Few studies have focused on the efficacy of global biodiversity hotspots for the conservation of insects, an important, abundant, and often ignored component of biodiversity. We aimed to test five alternative diversity measures for hotspot delineation and examine the efficacy of biodiversity hotspots for conserving a non-typical target organism, South African katydids. Using a 1° fishnet grid, we delineated katydid hotspots in two ways: (1) count-based: grid cells in the top 10% of total, endemic, threatened and/or sensitive species richness; vs. (2) score-based: grid cells with a mean value in the top 10% on a scoring system which scored each species on the basis of its IUCN Red List threat status, distribution, mobility and trophic level. We then compared katydid hotspots with each other and with recognized biodiversity hotspots. Grid cells within biodiversity hotspots had significantly higher count-based and score-based diversity than non-hotspot grid cells. There was a significant association between the three types of hotspots. Of the count-based measures, endemic species richness was the best surrogate for the others. However, the score-based measure out-performed all count-based diversity measures. Species richness was the least successful surrogate of all. The strong performance of the score-based method for hotspot prediction emphasizes the importance of including species’ natural history information for conservation decision-making, and is easily adaptable to other organisms. Furthermore, these results add empirical support for the efficacy of biodiversity hotspots in conserving non-target organisms. PMID:27631131
NeuroGrid: recording action potentials from the surface of the brain.
Khodagholy, Dion; Gelinas, Jennifer N; Thesen, Thomas; Doyle, Werner; Devinsky, Orrin; Malliaras, George G; Buzsáki, György
2015-02-01
Recording from neural networks at the resolution of action potentials is critical for understanding how information is processed in the brain. Here, we address this challenge by developing an organic material-based, ultraconformable, biocompatible and scalable neural interface array (the 'NeuroGrid') that can record both local field potentials (LFPs) and action potentials from superficial cortical neurons without penetrating the brain surface. Spikes with features of interneurons and pyramidal cells were simultaneously acquired by multiple neighboring electrodes of the NeuroGrid, allowing for the isolation of putative single neurons in rats. Spiking activity demonstrated consistent phase modulation by ongoing brain oscillations and was stable in recordings exceeding 1 week's duration. We also recorded LFP-modulated spiking activity intraoperatively in patients undergoing epilepsy surgery. The NeuroGrid constitutes an effective method for large-scale, stable recording of neuronal spikes in concert with local population synaptic activity, enhancing comprehension of neural processes across spatiotemporal scales and potentially facilitating diagnosis and therapy for brain disorders.
Marrin, Katy; Wood, Fiona; Firth, Jill; Kinsey, Katharine; Edwards, Adrian; Brain, Kate E; Newcombe, Robert G; Nye, Alan; Pickles, Timothy; Hawthorne, Kamila; Elwyn, Glyn
2014-04-07
Despite policy interest, an ethical imperative, and evidence of the benefits of patient decision support tools, the adoption of shared decision making (SDM) in day-to-day clinical practice remains slow and is inhibited by barriers that include culture and attitudes; resources and time pressures. Patient decision support tools often require high levels of health and computer literacy. Option Grids are one-page evidence-based summaries of the available condition-specific treatment options, listing patients' frequently asked questions. They are designed to be sufficiently brief and accessible enough to support a better dialogue between patients and clinicians during routine consultations. This paper describes a study to assess whether an Option Grid for osteoarthritis of the knee (OA of the knee) facilitates SDM, and explores the use of Option Grids by patients disadvantaged by language or poor health literacy. This will be a stepped wedge exploratory trial involving 72 patients with OA of the knee referred from primary medical care to a specialist musculoskeletal service in Oldham. Six physiotherapists will sequentially join the trial and consult with six patients using usual care procedures. After a period of brief training in using the Option Grid, the same six physiotherapists will consult with six further patients using an Option Grid in the consultation. The primary outcome will be efficacy of the Option Grid in facilitating SDM as measured by observational scores using the OPTION scale. Comparisons will be made between patients who have received the Option Grid and those who received usual care. A Decision Quality Measure (DQM) will assess quality of decision making. The health literacy of patients will be measured using the REALM-R instrument. Consultations will be observed and audio-recorded. Interviews will be conducted with the physiotherapists, patients and any interpreters present to explore their views of using the Option Grid. Option Grids offer a potential solution to the barriers to implementing traditional decision aids into routine clinical practice. The study will assess whether Option Grids can facilitate SDM in day-to-day clinical practice and explore their use with patients disadvantaged by language or poor health literacy. Current Controlled Trials ISRCTN94871417.
Mukherjee, Sudipto; Rizzo, Robert C.
2014-01-01
Scoring functions are a critically important component of computer-aided screening methods for the identification of lead compounds during early stages of drug discovery. Here, we present a new multi-grid implementation of the footprint similarity (FPS) scoring function that was recently developed in our laboratory which has proven useful for identification of compounds which bind to a protein on a per-residue basis in a way that resembles a known reference. The grid-based FPS method is much faster than its Cartesian-space counterpart which makes it computationally tractable for on-the-fly docking, virtual screening, or de novo design. In this work, we establish that: (i) relatively few grids can be used to accurately approximate Cartesian space footprint similarity, (ii) the method yields improved success over the standard DOCK energy function for pose identification across a large test set of experimental co-crystal structures, for crossdocking, and for database enrichment, and (iii) grid-based FPS scoring can be used to tailor construction of new molecules to have specific properties, as demonstrated in a series of test cases targeting the viral protein HIVgp41. The method will be made available in the program DOCK6. PMID:23436713
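A simplified, hypothetical stand-in for per-residue footprint comparison, not the DOCK6 implementation: each pose is summarised as a vector of per-residue interaction energies and compared against a reference footprint.

```python
# Illustration of the footprint-comparison idea behind FPS scoring; the energies
# are random placeholders, and the distance metric is an assumption.
import numpy as np

def footprint_similarity(candidate: np.ndarray, reference: np.ndarray) -> float:
    """Smaller is better: normalised Euclidean distance between two
    per-residue interaction-energy vectors of equal length."""
    diff = candidate - reference
    return float(np.sqrt(np.sum(diff ** 2)) / len(reference))

n_residues = 120                                       # residues lining the binding site
rng = np.random.default_rng(7)
reference = rng.normal(-0.5, 1.0, n_residues)          # known binder's footprint (kcal/mol per residue)
pose_a = reference + rng.normal(0, 0.2, n_residues)    # pose that mimics the reference
pose_b = rng.normal(-0.5, 1.0, n_residues)             # unrelated pose

print("pose A FPS distance:", round(footprint_similarity(pose_a, reference), 3))
print("pose B FPS distance:", round(footprint_similarity(pose_b, reference), 3))
```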
During running in place, grid cells integrate elapsed time and distance run
Kraus, Benjamin J.; Brandon, Mark P.; Robinson, Robert J.; Connerney, Michael A.; Hasselmo, Michael E.; Eichenbaum, Howard
2015-01-01
The spatial scale of grid cells may be provided by self-generated motion information or by external sensory information from environmental cues. To determine whether grid cell activity reflects distance traveled or elapsed time independent of external information, we recorded grid cells as animals ran in place on a treadmill. Grid cell activity was only weakly influenced by location but most grid cells and other neurons recorded from the same electrodes strongly signaled a combination of distance and time, with some signaling only distance or time. Grid cells were more sharply tuned to time and distance than non-grid cells. Many grid cells exhibited multiple firing fields during treadmill running, parallel to the periodic firing fields observed in open fields, suggesting a common mode of information processing. These observations indicate that, in the absence of external dynamic cues, grid cells integrate self-generated distance and time information to encode a representation of experience. PMID:26539893
A secure and efficiently searchable health information architecture.
Yasnoff, William A
2016-06-01
Patient-centric repositories of health records are an important component of health information infrastructure. However, patient information in a single repository is potentially vulnerable to loss of the entire dataset from a single unauthorized intrusion. A new health record storage architecture, the personal grid, eliminates this risk by separately storing and encrypting each person's record. The tradeoff for this improved security is that a personal grid repository must be sequentially searched since each record must be individually accessed and decrypted. To allow reasonable search times for large numbers of records, parallel processing with hundreds (or even thousands) of on-demand virtual servers (now available in cloud computing environments) is used. Estimated search times for a 10 million record personal grid using 500 servers vary from 7 to 33min depending on the complexity of the query. Since extremely rapid searching is not a critical requirement of health information infrastructure, the personal grid may provide a practical and useful alternative architecture that eliminates the large-scale security vulnerabilities of traditional databases by sacrificing unnecessary searching speed. Copyright © 2016 Elsevier Inc. All rights reserved.
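A back-of-envelope check of the quoted 7-33 minute range, assuming the work is split evenly across servers and each record must be fetched, decrypted and evaluated independently; the per-record times are assumptions chosen to bracket the published figures.

```python
# Assumed per-record processing times; 10 million records split across 500 servers
# gives 20,000 records per server, so 21-100 ms per record spans roughly 7-33 minutes.
def search_minutes(n_records: int, n_servers: int, sec_per_record: float) -> float:
    """Sequential scan time when records are split evenly across servers and each
    record must be fetched, decrypted and evaluated independently."""
    return n_records / n_servers * sec_per_record / 60.0

for label, sec in [("simple query, ~21 ms/record", 0.021),
                   ("complex query, ~100 ms/record", 0.100)]:
    print(f"{label}: {search_minutes(10_000_000, 500, sec):.0f} min")
```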
Ahn, Su Yeon; Chae, Kum Ju; Goo, Jin Mo
2018-01-01
To compare the observer preference of image quality and radiation dose between non-grid, grid-like, and grid images. Each of the 38 patients underwent bedside chest radiography with and without a grid. A grid-like image was generated from a non-grid image using SimGrid software (Samsung Electronics Co. Ltd.) employing deep-learning-based scatter correction technology. Two readers recorded the preference for 10 anatomic landmarks and the overall appearance on a five-point scale for a pair of non-grid and grid-like images, and a pair of grid-like and grid images, respectively, which were randomly presented. The dose area product (DAP) was also recorded. Wilcoxon's rank sum test was used to assess the significance of preference. Both readers preferred grid-like images to non-grid images significantly (p < 0.001); the preference for grid images over grid-like images differed between the two readers (p = 0.317 and 0.034, respectively). In terms of anatomic landmarks, both readers preferred grid-like images to non-grid images (p < 0.05). No significant differences existed between grid-like and grid images except for the preference for grid images in proximal airways by two readers, and in retrocardiac lung and thoracic spine by one reader. The median DAP was 1.48 (range, 1.37-2.17) dGy·cm² in grid images and 1.22 (range, 1.11-1.78) dGy·cm² in grid-like images, a significant difference (p < 0.001). The SimGrid software significantly improved the image quality of non-grid images to a level comparable to that of grid images with a relatively lower level of radiation exposure.
Stability assessment of structures under earthquake hazard through GRID technology
NASA Astrophysics Data System (ADS)
Prieto Castrillo, F.; Boton Fernandez, M.
2009-04-01
This work presents a GRID framework to estimate the vulnerability of structures under earthquake hazard. The tool has been designed to cover the needs of a typical earthquake engineering stability analysis; preparation of input data (pre-processing), response computation and stability analysis (post-processing). In order to validate the application over GRID, a simplified model of structure under artificially generated earthquake records has been implemented. To achieve this goal, the proposed scheme exploits the GRID technology and its main advantages (parallel intensive computing, huge storage capacity and collaboration analysis among institutions) through intensive interaction among the GRID elements (Computing Element, Storage Element, LHC File Catalogue, federated database etc.) The dynamical model is described by a set of ordinary differential equations (ODE's) and by a set of parameters. Both elements, along with the integration engine, are encapsulated into Java classes. With this high level design, subsequent improvements/changes of the model can be addressed with little effort. In the procedure, an earthquake record database is prepared and stored (pre-processing) in the GRID Storage Element (SE). The Metadata of these records is also stored in the GRID federated database. This Metadata contains both relevant information about the earthquake (as it is usual in a seismic repository) and also the Logical File Name (LFN) of the record for its later retrieval. Then, from the available set of accelerograms in the SE, the user can specify a range of earthquake parameters to carry out a dynamic analysis. This way, a GRID job is created for each selected accelerogram in the database. At the GRID Computing Element (CE), displacements are then obtained by numerical integration of the ODE's over time. The resulting response for that configuration is stored in the GRID Storage Element (SE) and the maximum structure displacement is computed. Then, the corresponding Metadata containing the response LFN, earthquake magnitude and maximum structure displacement is also stored. Finally, the displacements are post-processed through a statistically-based algorithm from the available Metadata to obtain the probability of collapse of the structure for different earthquake magnitudes. From this study, it is possible to build a vulnerability report for the structure type and seismic data. The proposed methodology can be combined with the on-going initiatives to build a European earthquake record database. In this context, Grid enables collaboration analysis over shared seismic data and results among different institutions.
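A minimal sketch of the per-accelerogram job described above, assuming a single-degree-of-freedom oscillator and a synthetic ground-motion record in place of the project's Java-encapsulated model:

```python
# Integrate a damped single-degree-of-freedom structure under a ground-motion
# record and report the maximum displacement. Structural parameters and the
# accelerogram are placeholders, not the project's actual model.
import numpy as np
from scipy.integrate import solve_ivp

omega = 2 * np.pi * 1.5      # natural frequency of a 1.5 Hz structure (rad/s)
zeta = 0.05                  # damping ratio

t_rec = np.linspace(0.0, 20.0, 2001)
rng = np.random.default_rng(3)
ag = 0.3 * 9.81 * rng.standard_normal(t_rec.size) * np.exp(-t_rec / 8.0)  # synthetic accelerogram (m/s^2)

def rhs(t, y):
    x, v = y
    a_g = np.interp(t, t_rec, ag)                      # ground acceleration at time t
    return [v, -2 * zeta * omega * v - omega**2 * x - a_g]

sol = solve_ivp(rhs, (0.0, 20.0), [0.0, 0.0], t_eval=t_rec, max_step=0.01)
print(f"maximum displacement: {np.max(np.abs(sol.y[0])):.3f} m")
```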
Using the Bem and Klein Grid Scores to Predict Health Services Usage by Men
Reynolds, Grace L.; Fisher, Dennis G.; Dyo, Melissa; Huckabay, Loucine M.
2016-01-01
We examined the association between scores on the Bem Sex Roles Inventory (BSRI), Klein Sexual Orientation Grid (KSOG) and utilization of hospital inpatient services, emergency departments, and outpatient clinic visits in the past 12 months among 53 men (mean age 39 years). The femininity subscale score on the BSRI, ever having had gonorrhea and age were the three variables identified in a multivariate linear regression significantly predicting use of total health services. This supports the hypothesis that sex roles can assist our understanding of men’s use of health services. PMID:27337618
NASA Astrophysics Data System (ADS)
Hardman, M.; Brodzik, M. J.; Long, D. G.; Paget, A. C.; Armstrong, R. L.
2015-12-01
Beginning in 1978, the satellite passive microwave data record has been a mainstay of remote sensing of the cryosphere, providing twice-daily, near-global spatial coverage for monitoring changes in hydrologic and cryospheric parameters that include precipitation, soil moisture, surface water, vegetation, snow water equivalent, sea ice concentration and sea ice motion. Currently available global gridded passive microwave data sets serve a diverse community of hundreds of data users, but do not meet many requirements of modern Earth System Data Records (ESDRs) or Climate Data Records (CDRs), most notably in the areas of intersensor calibration, quality-control, provenance and consistent processing methods. The original gridding techniques were relatively primitive and were produced on 25 km grids using the original EASE-Grid definition that is not easily accommodated in modern software packages. Further, since the first Level 3 data sets were produced, the Level 2 passive microwave data on which they were based have been reprocessed as Fundamental CDRs (FCDRs) with improved calibration and documentation. We are funded by NASA MEaSUREs to reprocess the historical gridded data sets as EASE-Grid 2.0 ESDRs, using the most mature available Level 2 satellite passive microwave (SMMR, SSM/I-SSMIS, AMSR-E) records from 1978 to the present. We have produced prototype data from SSM/I and AMSR-E for the year 2003, for review and feedback from our Early Adopter user community. The prototype data set includes conventional, low-resolution ("drop-in-the-bucket" 25 km) grids and enhanced-resolution grids derived from the two candidate image reconstruction techniques we are evaluating: 1) Backus-Gilbert (BG) interpolation and 2) a radiometer version of Scatterometer Image Reconstruction (SIR). We summarize our temporal subsetting technique, algorithm tuning parameters and computational costs, and include sample SSM/I images at enhanced resolutions of up to 3 km. We are actively working with our Early Adopters to finalize content and format of this new, consistently-processed high-quality satellite passive microwave ESDR.
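A minimal sketch of the "drop-in-the-bucket" gridding mentioned above, assuming already-projected swath coordinates and a flat 25 km grid instead of a true EASE-Grid 2.0 projection:

```python
# Every swath observation falling in a 25 km cell is averaged into that cell.
# The flat x/y layout is a simplification; real gridding uses a map projection.
import numpy as np

cell_km, nx, ny = 25.0, 40, 40
rng = np.random.default_rng(5)
x_km = rng.uniform(0, nx * cell_km, 5000)           # swath sample positions (already projected)
y_km = rng.uniform(0, ny * cell_km, 5000)
tb = rng.normal(250.0, 10.0, 5000)                   # brightness temperatures (K)

col = (x_km // cell_km).astype(int)
row = (y_km // cell_km).astype(int)

sums = np.zeros((ny, nx))
counts = np.zeros((ny, nx))
np.add.at(sums, (row, col), tb)                      # accumulate every observation into its bucket
np.add.at(counts, (row, col), 1)
gridded = np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)
print("mean gridded Tb:", np.nanmean(gridded).round(2), "K")
```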
Jones, P. D. [University of East Anglia, Norwich, United Kingdom; Raper, S. C.B. [University of East Anglia, Norwich, United Kingdom; Cherry, B. S.G. [University of East Anglia, Norwich, United Kingdom; Goodess, C. M. [University of East Anglia, Norwich, United Kingdom; Wigley, T. M. L. [University of East Anglia, Norwich, United Kingdom; Santer, B. [University of East Anglia, Norwich, United Kingdom; Kelly, P. M. [University of East Anglia, Norwich, United Kingdom; Bradley, R. S. [University of Massachusetts, Amherst, Massachusetts (USA); Diaz, H. F. [National Oceanic and Atmospheric Administration (NOAA), Environmental Research Laboratories, Boulder, CO (United States).
1991-01-01
This NDP presents land-based monthly surface-air-temperature anomalies (departures from a 1951-1970 reference period mean) on a 5° latitude by 10° longitude global grid. Monthly surface-air-temperature anomalies (departures from a 1957-1975 reference period mean) for the Antarctic (grid points from 65°S to 85°S) are presented in a similar way as a separate data set. The data were derived primarily from the World Weather Records and from the archives of the United Kingdom Meteorological Office. This long-term record of temperature anomalies may be used in studies addressing possible greenhouse-gas-induced climate changes. To date, the data have been employed in producing regional, hemispheric, and global time series for determining whether recent (i.e., post-1900) warming trends have taken place. The present updated version of this data set is identical to the earlier version for all records from 1851-1978 except for the addition of the Antarctic surface-air-temperature anomalies beginning in 1957. Beginning with the 1979 data, this package differs from the earlier version in several ways. Erroneous data for some sites have been corrected after a review of the actual station temperature data, and inconsistencies in the representation of missing values have been removed. For some grid locations, data have been added from stations that had not contributed to the original set. Data from satellites have also been used to correct station records where large discrepancies were evident. The present package also extends the record by adding monthly surface-air-temperature anomalies for the Northern (grid points from 85°N to 0°) and Southern (grid points from 5°S to 60°S) Hemispheres for 1985-1990. In addition, this updated package presents the monthly-mean-temperature records for the individual stations that were used to produce the set of gridded anomalies. The periods of record vary by station. Northern Hemisphere data have been corrected for inhomogeneities, while Southern Hemisphere data are presented in uncorrected form.
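A minimal sketch of the anomaly convention used in this data set (departures of a station's monthly means from its 1951-1970 reference-period means), using a synthetic station series; averaging stations into the 5° by 10° grid boxes is omitted.

```python
# Compute monthly anomalies of one synthetic station series relative to its
# 1951-1970 monthly climatology; the station data here are made up.
import numpy as np
import pandas as pd

rng = np.random.default_rng(11)
idx = pd.date_range("1851-01", "1990-12", freq="MS")
station = pd.Series(10 + 8 * np.sin(2 * np.pi * (idx.month - 1) / 12)
                    + rng.normal(0, 1.5, idx.size), index=idx)

base = station["1951":"1970"]
reference = base.groupby(base.index.month).mean()                   # 1951-1970 monthly means
anomaly = station - station.index.month.map(reference).to_numpy()   # remove the climatological month mean
print(anomaly["1980":"1990"].resample("YS").mean().round(2).head())
```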
Wu, Xiangxiang; Zeng, Huahui; Zhu, Xin; Ma, Qiujuan; Hou, Yimin; Wu, Xuefen
2013-11-20
A series of pyrrolopyridinone derivatives as specific inhibitors towards the cell division cycle 7 (Cdc7) was taken into account, and the efficacy of these compounds was analyzed by QSAR and docking approaches to gain deeper insights into the interaction mechanism and ligand selectivity for Cdc7. By regression analysis, prediction models based on the Grid score and the Zou-GB/SA score were found, respectively, with good quality of fit (r² = 0.748 and 0.951; cross-validated r² = 0.712 and 0.839). The accuracy of the models was validated on a test set, and the deviation of the predicted values in the validation set using the Zou-GB/SA score was smaller than that using the Grid score, suggesting that the model based on the Zou-GB/SA score provides a more effective method for predicting potencies of Cdc7 inhibitors. Copyright © 2013 Elsevier B.V. All rights reserved.
Tsulukidze, Maka; Grande, Stuart W; Gionfriddo, Michael R
2015-07-01
To assess the feasibility of Option Grids(®)for facilitating shared decision making (SDM) in simulated clinical consultations and explore clinicians' views on their practicability. We used mixed methods approach to analyze clinical consultations using the Observer OPTION instrument and thematic analysis for follow-up interviews with clinicians. Clinicians achieved high scores on information sharing and low scores on preference elicitation and integration. Four themes were identified: (1) Barriers affect practicability of Option Grids(®); (2) Option Grids(®) facilitate the SDM process; (3) Clinicians are aware of the gaps in their practice of SDM; (4) Training and ongoing feedback on the optimal use of Option Grids(®) are necessary. Use of Option Grids(®) by clinicians with background knowledge in SDM did not facilitate optimal levels of competency on the SDM core concepts of preference elicitation and integration. Future research must evaluate the impact of training on the use of Option Grids(®), and explore how best to help clinicians bridge the gap between knowledge and action. Clinicians proficiently imparting information in simulations struggled to elicit and integrate patient preferences - understanding this gap and developing strategies to close it are the next steps for implementing SDM into clinical practice. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Dickinson, Paul A; Kesisoglou, Filippos; Flanagan, Talia; Martinez, Marilyn N; Mistry, Hitesh B; Crison, John R; Polli, James E; Cruañes, Maria T; Serajuddin, Abu T M; Müllertz, Anette; Cook, Jack A; Selen, Arzu
2016-11-01
The aim of Biopharmaceutics Risk Assessment Roadmap (BioRAM) and the BioRAM Scoring Grid is to facilitate optimization of clinical performance of drug products. BioRAM strategy relies on therapy-driven drug delivery and follows an integrated systems approach for formulating and addressing critical questions and decision-making (J Pharm Sci. 2014,103(11): 3777-97). In BioRAM, risk is defined as not achieving the intended in vivo drug product performance, and success is assessed by time to decision-making and action. Emphasis on time to decision-making and time to action highlights the value of well-formulated critical questions and well-designed and conducted integrated studies. This commentary describes and illustrates application of the BioRAM Scoring Grid, a companion to the BioRAM strategy, which guides implementation of such an integrated strategy encompassing 12 critical areas and 6 assessment stages. Application of the BioRAM Scoring Grid is illustrated using published literature. Organizational considerations for implementing BioRAM strategy, including the interactions, function, and skillsets of the BioRAM group members, are also reviewed. As a creative and innovative systems approach, we believe that BioRAM is going to have a broad-reaching impact, influencing drug development and leading to unique collaborations influencing how we learn, and leverage and share knowledge. Published by Elsevier Inc.
Wang, W; Degenhart, A D; Collinger, J L; Vinjamuri, R; Sudre, G P; Adelson, P D; Holder, D L; Leuthardt, E C; Moran, D W; Boninger, M L; Schwartz, A B; Crammond, D J; Tyler-Kabara, E C; Weber, D J
2009-01-01
In this study human motor cortical activity was recorded with a customized micro-ECoG grid during individual finger movements. The quality of the recorded neural signals was characterized in the frequency domain from three different perspectives: (1) coherence between neural signals recorded from different electrodes, (2) modulation of neural signals by finger movement, and (3) accuracy of finger movement decoding. It was found that, for the high frequency band (60-120 Hz), coherence between neighboring micro-ECoG electrodes was 0.3. In addition, the high frequency band showed significant modulation by finger movement both temporally and spatially, and a classification accuracy of 73% (chance level: 20%) was achieved for individual finger movement using neural signals recorded from the micro-ECoG grid. These results suggest that the micro-ECoG grid presented here offers sufficient spatial and temporal resolution for the development of minimally-invasive brain-computer interface applications.
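A minimal sketch of the first characterisation step (coherence between electrode pairs in the 60-120 Hz band), using synthetic signals and an assumed sampling rate in place of the micro-ECoG recordings:

```python
# Magnitude-squared coherence between two channels, averaged over 60-120 Hz.
# The channels are synthetic stand-ins sharing a high-gamma component.
import numpy as np
from scipy.signal import coherence

fs = 1000.0                                          # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)
shared = np.sin(2 * np.pi * 80 * t)                  # shared high-gamma component
ch1 = shared + rng.normal(0, 2.0, t.size)
ch2 = 0.8 * shared + rng.normal(0, 2.0, t.size)

f, cxy = coherence(ch1, ch2, fs=fs, nperseg=512)
band = (f >= 60) & (f <= 120)
print("mean 60-120 Hz coherence:", round(float(cxy[band].mean()), 3))
```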
Organic electronics for high-resolution electrocorticography of the human brain.
Khodagholy, Dion; Gelinas, Jennifer N; Zhao, Zifang; Yeh, Malcolm; Long, Michael; Greenlee, Jeremy D; Doyle, Werner; Devinsky, Orrin; Buzsáki, György
2016-11-01
Localizing neuronal patterns that generate pathological brain signals may assist with tissue resection and intervention strategies in patients with neurological diseases. Precise localization requires high spatiotemporal recording from populations of neurons while minimizing invasiveness and adverse events. We describe a large-scale, high-density, organic material-based, conformable neural interface device ("NeuroGrid") capable of simultaneously recording local field potentials (LFPs) and action potentials from the cortical surface. We demonstrate the feasibility and safety of intraoperative recording with NeuroGrids in anesthetized and awake subjects. Highly localized and propagating physiological and pathological LFP patterns were recorded, and correlated neural firing provided evidence about their local generation. Application of NeuroGrids to brain disorders, such as epilepsy, may improve diagnostic precision and therapeutic outcomes while reducing complications associated with invasive electrodes conventionally used to acquire high-resolution and spiking data.
NASA Astrophysics Data System (ADS)
Walawender, Jakub; Kothe, Steffen; Trentmann, Jörg; Pfeifroth, Uwe; Cremer, Roswitha
2017-04-01
The purpose of this study is to create a 1 km² gridded daily sunshine duration data record for Germany covering the period from 1983 to 2015 (33 years), based on satellite estimates of direct normalised surface solar radiation and in situ sunshine duration observations, using a geostatistical approach. The CM SAF SARAH direct normalized irradiance (DNI) satellite climate data record and in situ observations of sunshine duration from 121 weather stations operated by DWD are used as input datasets. The 33-year period is determined by the availability of satellite data. The number of ground stations is limited to 121 because only time series with less than 10% missing observations over the selected period are included, to preserve the long-term consistency of the output sunshine duration data record. In the first step, the DNI data record is used to derive sunshine hours by applying the WMO threshold of 120 W/m² (SDU = DNI ≥ 120 W/m²), with sunny slots weighted to correct the sunshine length between two instantaneous images for cloud movement. In the second step, a linear regression between SDU and in situ sunshine duration is calculated to adjust the satellite product to the ground observations, and the output regression coefficients are applied to create a regression grid. In the last step, the regression residuals are interpolated with ordinary kriging and added to the regression grid. A comprehensive accuracy assessment of the gridded sunshine duration data record is performed by calculating prediction errors (cross-validation routine). "R" is used for data processing. A short analysis of the spatial distribution and temporal variability of sunshine duration over Germany based on the created dataset will be presented. The gridded sunshine duration data are useful for applications in various climate-related studies, agriculture and solar energy potential calculations.
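A minimal sketch of the first two processing steps (WMO 120 W/m² thresholding and linear adjustment to station observations), with synthetic data; the slot weighting and the ordinary kriging of the residuals are omitted.

```python
# Apply the WMO threshold to instantaneous DNI slots to get satellite sunshine
# hours, then regress station sunshine duration on the satellite estimate.
# The 30-minute slot interval and all input values are assumptions.
import numpy as np

rng = np.random.default_rng(8)
slots_per_day = 48                                    # assumed 30-min satellite slots
dni = rng.gamma(2.0, 150.0, (365, slots_per_day))     # instantaneous DNI (W/m2)

sdu_sat = (dni >= 120.0).sum(axis=1) * (24.0 / slots_per_day)   # satellite sunshine hours per day

# Station observations at one site, loosely related to the satellite estimate
sdu_obs = 0.9 * sdu_sat + rng.normal(0, 1.0, 365)

slope, intercept = np.polyfit(sdu_sat, sdu_obs, 1)    # adjust satellite product to stations
sdu_adjusted = slope * sdu_sat + intercept
residuals = sdu_obs - sdu_adjusted                    # these would be kriged onto the 1 km grid
print(f"regression: SDU_obs = {slope:.2f} * SDU_sat + {intercept:.2f}; "
      f"residual SD = {residuals.std():.2f} h")
```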
NASA Astrophysics Data System (ADS)
Chen, X.
2016-12-01
This study presents a multi-scale approach that combines the Mode Decomposition and Variance Matching (MDVM) method with the basic procedure of the Point-by-Point Regression (PPR) method. Unlike the widely applied PPR method, the scanning radius for each grid box was recalculated to account for topography (i.e., mean altitude and its fluctuations), so that appropriate proxy records were selected as candidates for reconstruction. This multi-scale methodology provides not only the reconstructed gridded temperature but also the corresponding uncertainties at four typical timescales; a further advantage is that the spatial distribution of uncertainty can be quantified for each scale. To examine the necessity of scale separation in calibration, using proxy record locations over Eastern Asia, we performed two sets of pseudo-proxy experiments (PPEs) based on different ensembles of climate model simulations. One consists of 7 simulations from 5 models (BCC-CSM1-1, CSIRO-MK3L-1-2, HadCM3, MPI-ESM-P, and GISS-E2-R) of the "past1000" experiment from the Coupled Model Intercomparison Project Phase 5; the other is based on the Community Earth System Model Last Millennium Ensemble (CESM-LME) simulations. The pseudo-record network was obtained by adding white noise, with the signal-to-noise ratio (SNR) increasing from 0.1 to 1.0, to the simulated true state; the locations mainly followed the PAGES-2k network in Asia. In total, 400 years of simulation (1601-2000) were used for calibration and 600 years (1001-1600) for verification. The reconstructions were evaluated with three metrics: 1) root mean squared error (RMSE), 2) correlation, and 3) the reduction of error (RE) score. The PPE verification shows that, compared with the ordinary linear calibration method (variance matching), the RMSE and RE score of PPR-MDVM are improved, especially in areas with sparse proxy records. Notably, in some periods with large volcanic activity, the RMSE of MDVM becomes larger than that of VM for higher SNR cases; the volcanic eruptions may blur the intrinsic multi-scale variability of the climate system, so the MDVM method shows less advantage in those cases.
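A minimal sketch of the three verification metrics named above, with synthetic "truth" and reconstruction series standing in for the PPE output:

```python
# RMSE, Pearson correlation, and the reduction of error (RE) score, which
# compares the reconstruction's squared error against a climatology-only
# prediction (the calibration-period mean); RE > 0 indicates skill.
import numpy as np

def verification_scores(truth, recon, calib_mean):
    rmse = float(np.sqrt(np.mean((recon - truth) ** 2)))
    corr = float(np.corrcoef(truth, recon)[0, 1])
    re = 1.0 - np.sum((truth - recon) ** 2) / np.sum((truth - calib_mean) ** 2)
    return rmse, corr, float(re)

rng = np.random.default_rng(4)
truth = rng.normal(0, 0.4, 600)                       # "true" model temperatures, 1001-1600
recon = truth + rng.normal(0, 0.2, 600)               # pseudo-proxy reconstruction
rmse, corr, re = verification_scores(truth, recon, calib_mean=0.0)
print(f"RMSE={rmse:.3f}  r={corr:.3f}  RE={re:.3f}")
```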
NASA Astrophysics Data System (ADS)
Ouellette, G., Jr.; DeLong, K. L.
2016-02-01
High-resolution proxy records of sea surface temperature (SST) are increasingly being produced using trace element and isotope variability within the skeletal materials of marine organisms such as corals, mollusks, sclerosponges, and coralline algae. Translating the geochemical variations within these organisms into records of SST requires calibration with SST observations using linear regression methods, preferably with in situ SST records that span several years. However, locations with such records are sparse; therefore, calibration is often accomplished using gridded SST data products such as the Hadley Center's HADSST (5º) and interpolated HADISST (1º) data sets, NOAA's extended reconstructed SST data set (ERSST; 2º), optimum interpolation SST (OISST; 1º), and Kaplan SST data sets (5º). From these data products, the SST used for proxy calibration is obtained for a single grid cell that includes the proxy's study site. The gridded data sets are based on the International Comprehensive Ocean-Atmosphere Data Set (ICOADS) and each uses different methods of interpolation to produce the globally and temporally complete data products except for HadSST, which is not interpolated but quality controlled. This study compares SST for a single site from these gridded data products with a high-resolution satellite-based SST data set from NOAA (Pathfinder; 4 km) with in situ SST data and coral Sr/Ca variability for our study site in Haiti to assess differences between these SST records with a focus on seasonal variability. Our results indicate substantial differences in the seasonal variability captured for the same site among these data sets on the order of 1-3°C. This analysis suggests that of the data products, high-resolution satellite SST best captured seasonal variability at the study site. Unfortunately, satellite SST records are limited to the past few decades. If satellite SST are to be used to calibrate proxy records, collecting modern, living samples is desirable.
Report of the IAU/IAG Working Group on Cartographic Coordinates and Rotational Elements: 2006
2007-01-01
of Mars is that specified in the final MOLA Mission Experiment Gridded Data Record (MEGDR) Products (Smith et al. 2003). In particular, the 128...Altimeter Mission Experiment Gridded Data Record. NASA Planetary Data System, MGS-M-MOLA-5-MEGDR-L3-V1.0, 2003. Available on-line from http://pds
NASA Technical Reports Server (NTRS)
Obrien, S. O. (Principal Investigator)
1980-01-01
The program, LACREG, extracts all pixels contained in a specified IJ grid section. The pixels, along with a header record, are stored in a disk file defined by the user. The program will extract up to 99 IJ grid sections.
De-identification of clinical notes via recurrent neural network and conditional random field.
Liu, Zengjian; Tang, Buzhou; Wang, Xiaolong; Chen, Qingcai
2017-11-01
De-identification, identifying information from data, such as protected health information (PHI) present in clinical data, is a critical step to enable data to be shared or published. The 2016 Centers of Excellence in Genomic Science (CEGS) Neuropsychiatric Genome-scale and RDOC Individualized Domains (N-GRID) clinical natural language processing (NLP) challenge contains a de-identification track in de-identifying electronic medical records (EMRs) (i.e., track 1). The challenge organizers provide 1000 annotated mental health records for this track, 600 out of which are used as a training set and 400 as a test set. We develop a hybrid system for the de-identification task on the training set. Firstly, four individual subsystems, that is, a subsystem based on bidirectional LSTM (long short-term memory, a variant of recurrent neural network), a subsystem based on bidirectional LSTM with features, a subsystem based on conditional random field (CRF) and a rule-based subsystem, are used to identify PHI instances. Then, an ensemble learning-based classifier is deployed to combine all PHI instances predicted by the above three machine-learning-based subsystems. Finally, the results of the ensemble learning-based classifier and the rule-based subsystem are merged together. Experiments conducted on the official test set show that our system achieves the highest micro F1-scores of 93.07%, 91.43% and 95.23% under the "token", "strict" and "binary token" criteria respectively, ranking first in the 2016 CEGS N-GRID NLP challenge. In addition, on the dataset of the 2014 i2b2 NLP challenge, our system achieves the highest micro F1-scores of 96.98%, 95.11% and 98.28% under the "token", "strict" and "binary token" criteria respectively, outperforming other state-of-the-art systems. All these experiments prove the effectiveness of our proposed method. Copyright © 2017. Published by Elsevier Inc.
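A minimal sketch of the micro-averaged F1 used for ranking, with illustrative per-category counts rather than the challenge data:

```python
# Micro-averaging pools true positives, false positives and false negatives over
# all PHI categories before computing precision, recall and F1.
def micro_f1(per_category_counts):
    """per_category_counts: list of (tp, fp, fn) tuples, one per PHI category."""
    tp = sum(c[0] for c in per_category_counts)
    fp = sum(c[1] for c in per_category_counts)
    fn = sum(c[2] for c in per_category_counts)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

counts = [(950, 40, 60),    # e.g. NAME (illustrative counts)
          (480, 30, 35),    # e.g. DATE
          (120, 15, 20)]    # e.g. PHONE
print(f"micro F1 = {micro_f1(counts):.4f}")
```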
NASA Astrophysics Data System (ADS)
Maaß, Heiko; Cakmak, Hüseyin Kemal; Bach, Felix; Mikut, Ralf; Harrabi, Aymen; Süß, Wolfgang; Jakob, Wilfried; Stucky, Karl-Uwe; Kühnapfel, Uwe G.; Hagenmeyer, Veit
2015-12-01
Power networks will change from a rigid hierarchical architecture to dynamically interconnected smart grids. In traditional power grids, the frequency is the controlled quantity used to maintain the balance between supply and load power. High rotating mass inertia thereby ensures stability. In the future, system stability will have to rely more on real-time measurements and sophisticated control, especially when integrating fluctuating renewable power sources or high-load consumers such as electric vehicles into the low-voltage distribution grid.
Enabling Efficient Intelligence Analysis in Degraded Environments
2013-06-01
evolution analysis; a Magnets Grid widget for multidimensional information exploration; and a record browser of Visual Summary Cards widget for fast visual identification of... attention and inattentional blindness. It also explores and develops various techniques to represent information in a salient way and provide efficient
An updated global grid point surface air temperature anomaly data set: 1851--1990
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sepanski, R.J.; Boden, T.A.; Daniels, R.C.
1991-10-01
This document presents land-based monthly surface air temperature anomalies (departures from a 1951--1970 reference period mean) on a 5° latitude by 10° longitude global grid. Monthly surface air temperature anomalies (departures from a 1957--1975 reference period mean) for the Antarctic (grid points from 65°S to 85°S) are presented in a similar way as a separate data set. The data were derived primarily from the World Weather Records and the archives of the United Kingdom Meteorological Office. This long-term record of temperature anomalies may be used in studies addressing possible greenhouse-gas-induced climate changes. To date, the data have been employed in generating regional, hemispheric, and global time series for determining whether recent (i.e., post-1900) warming trends have taken place. This document also presents the monthly mean temperature records for the individual stations that were used to generate the set of gridded anomalies. The periods of record vary by station. Northern Hemisphere station data have been corrected for inhomogeneities, while Southern Hemisphere data are presented in uncorrected form. 14 refs., 11 figs., 10 tabs.
Cai, Li
2015-06-01
Lord and Wingersky's (Appl Psychol Meas 8:453-461, 1984) recursive algorithm for creating summed score based likelihoods and posteriors has a proven track record in unidimensional item response theory (IRT) applications. Extending the recursive algorithm to handle multidimensionality is relatively simple, especially with fixed quadrature because the recursions can be defined on a grid formed by direct products of quadrature points. However, the increase in computational burden remains exponential in the number of dimensions, making the implementation of the recursive algorithm cumbersome for truly high-dimensional models. In this paper, a dimension reduction method that is specific to the Lord-Wingersky recursions is developed. This method can take advantage of the restrictions implied by hierarchical item factor models, e.g., the bifactor model, the testlet model, or the two-tier model, such that a version of the Lord-Wingersky recursive algorithm can operate on a dramatically reduced set of quadrature points. For instance, in a bifactor model, the dimension of integration is always equal to 2, regardless of the number of factors. The new algorithm not only provides an effective mechanism to produce summed score to IRT scaled score translation tables properly adjusted for residual dependence, but leads to new applications in test scoring, linking, and model fit checking as well. Simulated and empirical examples are used to illustrate the new applications.
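For context, the core Lord-Wingersky recursion referenced above builds the likelihood of each summed score from item response probabilities one item at a time. A minimal unidimensional sketch, assuming dichotomous 2PL items evaluated at a single quadrature point (all parameter values below are hypothetical):

```python
import numpy as np

def item_prob(theta, a, b):
    """2PL probability of a correct response at ability theta."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def lord_wingersky(theta, items):
    """Likelihood of each summed score (0..n items) at a single theta.

    `items` is a list of (a, b) parameter tuples for dichotomous items.
    """
    score_like = np.array([1.0])          # probability 1 for score 0 before any items
    for a, b in items:
        p = item_prob(theta, a, b)
        new = np.zeros(len(score_like) + 1)
        new[:-1] += score_like * (1 - p)  # item answered incorrectly: score unchanged
        new[1:] += score_like * p         # item answered correctly: score + 1
        score_like = new
    return score_like

items = [(1.2, -0.5), (0.8, 0.0), (1.5, 0.7)]
print(lord_wingersky(theta=0.3, items=items))  # sums to 1 over scores 0..3
```

The dimension-reduction idea in the abstract is that, for hierarchical models such as the bifactor model, recursions like this need only be run over a two-dimensional grid of quadrature points rather than the full factor space.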
The BioGRID interaction database: 2013 update.
Chatr-Aryamontri, Andrew; Breitkreutz, Bobby-Joe; Heinicke, Sven; Boucher, Lorrie; Winter, Andrew; Stark, Chris; Nixon, Julie; Ramage, Lindsay; Kolas, Nadine; O'Donnell, Lara; Reguly, Teresa; Breitkreutz, Ashton; Sellam, Adnane; Chen, Daici; Chang, Christie; Rust, Jennifer; Livstone, Michael; Oughtred, Rose; Dolinski, Kara; Tyers, Mike
2013-01-01
The Biological General Repository for Interaction Datasets (BioGRID: http://thebiogrid.org) is an open access archive of genetic and protein interactions that are curated from the primary biomedical literature for all major model organism species. As of September 2012, BioGRID houses more than 500 000 manually annotated interactions from more than 30 model organisms. BioGRID maintains complete curation coverage of the literature for the budding yeast Saccharomyces cerevisiae, the fission yeast Schizosaccharomyces pombe and the model plant Arabidopsis thaliana. A number of themed curation projects in areas of biomedical importance are also supported. BioGRID has established collaborations and/or shares data records for the annotation of interactions and phenotypes with most major model organism databases, including Saccharomyces Genome Database, PomBase, WormBase, FlyBase and The Arabidopsis Information Resource. BioGRID also actively engages with the text-mining community to benchmark and deploy automated tools to expedite curation workflows. BioGRID data are freely accessible through both a user-defined interactive interface and in batch downloads in a wide variety of formats, including PSI-MI2.5 and tab-delimited files. BioGRID records can also be interrogated and analyzed with a series of new bioinformatics tools, which include a post-translational modification viewer, a graphical viewer, a REST service and a Cytoscape plugin.
Qualitative Life-Grids: A Proposed Method for Comparative European Educational Research
ERIC Educational Resources Information Center
Abbas, Andrea; Ashwin, Paul; McLean, Monica
2013-01-01
Drawing upon their large three-year mixed-method study comparing four English university sociology departments, the authors demonstrate the benefits to be gained from concisely recording biographical stories on life-grids. They argue that life-grids have key benefits which are important for comparative European educational research. Some of these…
2008-07-01
dropout rate amongst Grid participants suggests participants found the Grid more frustrating to use, and subjective satisfaction scores show... greatly affect whether policies match their authors' intentions; a bad user interface can lead to policies with many errors, while a good user interface
Basagni, Benedetta; Luzzatti, Claudio; Navarrete, Eduardo; Caputo, Marina; Scrocco, Gessica; Damora, Alessio; Giunchi, Laura; Gemignani, Paola; Caiazzo, Annarita; Gambini, Maria Grazia; Avesani, Renato; Mancuso, Mauro; Trojano, Luigi; De Tanti, Antonio
2017-04-01
Verbal reasoning is a complex, multicomponent function, which involves activation of functional processes and neural circuits distributed in both brain hemispheres. Thus, this ability is often impaired after brain injury. The aim of the present study is to describe the construction of a new verbal reasoning test (VRT) for patients with brain injury and to provide normative values in a sample of healthy Italian participants. Three hundred and eighty healthy Italian subjects (193 women and 187 men) of different ages (range 16-75 years) and educational level (primary school to postgraduate degree) underwent the VRT. VRT is composed of seven subtests, investigating seven different domains. Multiple linear regression analysis revealed a significant effect of age and education on the participants' performance in terms of both VRT total score and all seven subtest scores. No gender effect was found. A correction grid for raw scores was built from the linear equation derived from the scores. Inferential cut-off scores were estimated using a non-parametric technique, and equivalent scores were computed. We also provided a grid for the correction of results by z scores.
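The correction grid derived from the regression above can be thought of as subtracting the fitted age and education effects from each raw score before cut-offs are applied. A hedged sketch of that adjustment, with placeholder coefficients and reference values rather than the published norms:

```python
def adjusted_score(raw, age, education, b_age=-0.05, b_edu=0.4,
                   ref_age=45.0, ref_edu=13.0):
    """Remove fitted age/education effects from a raw VRT score.

    b_age, b_edu and the reference values are placeholders, not the published norms.
    """
    return raw - b_age * (age - ref_age) - b_edu * (education - ref_edu)

print(adjusted_score(raw=32, age=70, education=8))
```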
Sasse, Alexander; de Vries, Sjoerd J; Schindler, Christina E M; de Beauchêne, Isaure Chauvot; Zacharias, Martin
2017-01-01
Protein-protein docking protocols aim to predict the structures of protein-protein complexes based on the structures of the individual partners. Docking protocols usually include several steps of sampling, clustering, refinement and re-scoring. The scoring step is one of the bottlenecks in the performance of many state-of-the-art protocols. The performance of scoring functions depends on the quality of the generated structures and its coupling to the sampling algorithm. A toolkit, GRADSCOPT (GRid Accelerated Directly SCoring OPTimizing), was designed to allow rapid development and optimization of different knowledge-based scoring potentials for specific objectives in protein-protein docking. Different atomistic and coarse-grained potentials can be created by grid-accelerated, directly scoring-dependent Monte Carlo annealing or by linear regression optimization. We demonstrate that the scoring functions generated by our approach are similar to or even outperform state-of-the-art scoring functions for predicting near-native solutions. Of additional importance, we find that potentials specifically trained to identify the native bound complex perform rather poorly at identifying acceptable or medium quality (near-native) solutions. In contrast, atomistic long-range contact potentials can increase the average fraction of near-native poses by up to a factor of 2.5 among the best-scored 1% of decoys (compared to existing scoring functions), emphasizing the need for specific docking potentials for different steps in the docking protocol.
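One of the two optimization routes mentioned, linear regression, can be sketched as fitting per-contact-type weights so that the weighted sum of contact counts tracks decoy quality. This is a hedged toy illustration with invented feature and label values, not the GRADSCOPT implementation:

```python
import numpy as np

# Each decoy is described by counts of contact types (columns) and labeled
# with a quality measure, e.g. fraction of native contacts (invented values).
contact_counts = np.array([[12, 3, 7], [4, 9, 1], [10, 2, 8], [3, 8, 0]], float)
quality = np.array([0.82, 0.10, 0.74, 0.05])

# Least-squares fit of per-contact-type weights: score = counts @ weights.
weights, *_ = np.linalg.lstsq(contact_counts, quality, rcond=None)
scores = contact_counts @ weights
print(weights, scores.argsort()[::-1])  # decoys ranked by the fitted potential
```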
Thurber, Steven; Wilson, Ann; Realmuto, George; Specker, Sheila
2018-03-01
To investigate the concurrent and criterion validity of two independently developed measurement instruments, INTERMED and LOCUS, designed to improve the treatment and clinical management of patients with complex symptom manifestations. Participants (N = 66) were selected from hospital records based on the complexity of presenting symptoms, with tripartite diagnoses across biological, psychiatric and addiction domains. Biopsychosocial information from hospital records was submitted to the INTERMED and LOCUS grids. In addition, Global Assessment of Functioning (GAF) ratings were gathered for statistical analyses. The product-moment correlation between INTERMED and LOCUS was 0.609 (p = .01). Inverse zero-order correlations between the INTERMED and LOCUS total scores and GAF were obtained. However, only the beta weight for LOCUS and GAF was significant. An exploratory principal components analysis further illuminated areas of convergence between the instruments. INTERMED and LOCUS demonstrated shared variance. INTERMED appeared more sensitive to complex medical conditions and severe physiological reactions, whereas LOCUS findings are more strongly related to psychiatric symptoms. Implications are discussed.
Ahmadi, Emad; Katnani, Husam A.; Daftari Besheli, Laleh; Gu, Qiang; Atefi, Reza; Villeneuve, Martin Y.; Eskandar, Emad; Lev, Michael H.; Golby, Alexandra J.; Gupta, Rajiv
2016-01-01
Purpose To develop an electrocorticography (ECoG) grid by using deposition of conductive nanoparticles in a polymer thick film on an organic substrate (PTFOS) that induces minimal, if any, artifacts on computed tomographic (CT) and magnetic resonance (MR) images and is safe in terms of tissue reactivity and MR heating. Materials and Methods All procedures were approved by the Animal Care and Use Committee and complied with the Public Health Services Guide for the Care and Use of Animals. Electrical functioning of PTFOS for cortical recording and stimulation was tested in two mice. PTFOS disks were implanted in two mice; after 30 days, the tissues surrounding the implants were harvested, and tissue injury was studied by using immunostaining. Five neurosurgeons rated mechanical properties of PTFOS compared with conventional grids by using a three-level Likert scale. Temperature increases during 30 minutes of 3-T MR imaging were measured in a head phantom with no grid, a conventional grid, and a PTFOS grid. Two neuroradiologists rated artifacts on CT and MR images of a cadaveric head specimen with no grid, a conventional grid, and a PTFOS grid by using a four-level Likert scale, and the mean ratings were compared between grids. Results Oscillatory local field potentials were captured with cortical recordings. Cortical stimulations in motor cortex elicited muscle contractions. PTFOS implants caused no adverse tissue reaction. Mechanical properties were rated superior to conventional grids (χ2 test, P < .05). The temperature increase during MR imaging for the three cases of no grid, PTFOS grid, and conventional grid was 3.84°C, 4.05°C, and 10.13°C, respectively. PTFOS induced no appreciable artifacts on CT and MR images, and PTFOS image quality was rated significantly higher than that with conventional grids (two-tailed t test, P < .05). Conclusion PTFOS grids may be an attractive alternative to conventional ECoG grids with regard to mechanical properties, 3-T MR heating profile, and CT and MR imaging artifacts. © RSNA, 2016 Online supplemental material is available for this article. PMID:26844363
Mansourian, Arash; Momen-Heravi, Fatemeh; Saheb-Jamee, Mahnaz; Esfehani, Mahsa; Khalilzadeh, Omid; Momen-Beitollahi, Jalil
2011-12-01
Corticosteroids are the mainstay of treatment for oral lichen planus (OLP) but have their own side effects. The aim of this study was to compare the therapeutic effects of aloe vera (AV) mouthwash with triamcinolone acetonide 0.1% (TA) on OLP. A total of 46 patients with OLP were enrolled in this study. The patients were randomly divided into 2 groups, and each group was treated with either AV mouthwash or TA. The treatment period for both groups was 4 weeks. Baseline data were recorded for each patient. Patients were evaluated on days 8, 16 and after completing the course of treatment (visits 1-3). The last follow-up was 2 months after the start of treatment (visit 4). A visual analogue scale was used to evaluate pain and burning sensation, and the Thongprasom index for clinical improvement and healing. In addition, lesion sizes were measured and recorded at each visit using a grid. Baseline characteristics, including pain and burning sensation score, size and clinical characteristics of the lesions according to the Thongprasom index, were not different between the 2 treatment groups. Both AV and TA significantly reduced the visual analogue scale score, Thongprasom score and size of the lesions after treatment (P < 0.001) and after 2 months of discontinuation of the treatment (P < 0.001). In the AV group, 74% of patients and in the TA group 78% of patients showed some degree of healing at the last follow-up. AV mouthwash is an effective substitute for TA in the treatment of OLP.
Designing for Wide-Area Situation Awareness in Future Power Grid Operations
NASA Astrophysics Data System (ADS)
Tran, Fiona F.
Power grid operation uncertainty and complexity continue to increase with the rise of electricity market deregulation, renewable generation, and interconnectedness between multiple jurisdictions. Human operators need appropriate wide-area visualizations to help them monitor system status to ensure reliable operation of the interconnected power grid. We observed transmission operations at a control centre, conducted critical incident interviews, and led focus group sessions with operators. The results informed a Work Domain Analysis of power grid operations, which in turn informed an Ecological Interface Design concept for wide-area monitoring. I validated design concepts through tabletop discussions and a usability evaluation with operators, earning a mean System Usability Scale score of 77 out of 90. The design concepts aim to support an operator's complete and accurate understanding of the power grid state, which operators increasingly require due to the critical nature of power grid infrastructure and growing sources of system uncertainty.
NASA Astrophysics Data System (ADS)
Boudou, Martin; Lang, Michel; Vinet, Freddy; Coeur, Denis
2014-05-01
The 2007 Flood Directive promotes the integration and valorization of historical and significant floods in flood risk management (Flood Directive text, chapter II, article 4). Taking extreme past floods into account appears necessary for mitigating vulnerability to flood risk. In France, this aspect of the Directive was carried out through the elaboration of the Preliminary Flood Risk Assessment (PFRA) and the establishment of a list of 2000 floods. From this first list, a sample of 176 floods considered remarkable has been selected. These floods were compiled in discussion with local authorities in charge of flood management (Lang et al., 2012) and have to be integrated as a priority in local risk management policies. However, a question emerges from this classification: how can a remarkable flood be defined? According to which criteria can it be considered remarkable? To answer these questions, a methodology has been established by building an evaluation grid of remarkable floods in France. The primary objective of this grid is to analyze the characteristics of remarkable floods (hydrological and meteorological characteristics, and sociological, political and economic impacts), and the secondary objective is to propose a classification of the significant floods selected in the 2011 PFRA. To elaborate this evaluation grid, several issues had to be taken into account. First, the objective is to allow the comparison of events from various periods. These temporal disparities require the integration of various kinds of data and point out the importance of historical hydrology. It is possible to evaluate accurately the characteristics of recent floods by interpreting quantitative data (for example, hydrological records). However, for floods that occurred before the 1960s it is necessary to resort to qualitative information such as written sources (Coeur, Lang, 2008). Second, the evaluation grid requires equitable criteria in order not to over-emphasize one flood typology or one flood dynamic (for example, flash floods are often over-represented compared with slow-dynamic floods in existing databases). Thus, the selected criteria have to give a general overview of flood risk in France by integrating all typologies: storm surges, torrential floods, floods caused by rising groundwater levels, etc. The methodology developed for the evaluation grid is inspired by several scientific works related to historical hydrology (Bradzil, 2006; Benito et al., 2004) or extreme flood classification (Kundzewics et al. 2013; Garnier E., 2005). The referenced information is mainly drawn from investigations carried out for the PFRA (archives, local data), from internet databases on flooding disasters, and from a complementary bibliography (including scientists such as Maurice Pardé, a geographer who extensively documented French floods during the 20th century). The proposed classification relies on three main axes. Each axis is associated with a set of criteria, each one related to a score (from 0.5 to 4 points), leading to a final remarkability score. • The flood intensity, characterizing the flood's hazard level.
It is composed of the submersion duration (important to valorize floods with slow dynamics such as flooding from groundwater), the return period of the event's peak discharge, and the presence of factors that significantly increase the hazard level (dyke breaks, log jams, sediment transport…). • The flood severity focuses on economic damage, social and political repercussions, media coverage of the event, the number of fatalities, and eventual flood warning failures. Analyzing the flood's consequences is essential in order to evaluate the vulnerability of society at the date of the disaster. • The spatial extension of the flood, which contributes complementary information to the first two axes. The evaluation grid was tested and applied to the sample of 176 remarkable events. Around twenty events (from 1856 to 2010) come out with a high remarkability rate. The January 1910 flood is one of these remarkable floods. This event is foremost known for its aftermath on the Seine basin, where it remains the strongest flood recorded in Paris since 1658. However, its impacts also extended to France's eastern regions (Martin, 2001). To demonstrate the evaluation grid's interest, we propose an in-depth analysis of the 1910 river flood with the integration of historical documentation. The approach focuses on eastern France, where the flood remains the highest recorded for several rivers but has often been neglected by scientists in favour of the Paris flood. Through transdisciplinary research based on the evaluation grid method, we describe the January 1910 flood event and explain why it can be considered a remarkable flood for these regions.
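As a hedged illustration of how such an evaluation grid can be turned into a score (the criteria names and point values below are placeholders, not those used in the study), each criterion on the three axes contributes between 0.5 and 4 points, and the axis totals are summed into a remarkability score:

```python
# Placeholder criteria grid: axis -> {criterion: points awarded to one flood event}.
evaluation_grid = {
    "intensity": {"submersion_duration": 2.0, "peak_discharge_return_period": 4.0,
                  "aggravating_factors": 1.0},
    "severity": {"economic_damage": 3.0, "fatalities": 2.5, "media_coverage": 1.5,
                 "warning_failure": 0.5},
    "spatial_extension": {"area_affected": 2.0},
}

def remarkability_score(grid):
    """Sum criterion scores per axis and overall (each criterion scored 0.5-4 points)."""
    axis_totals = {axis: sum(criteria.values()) for axis, criteria in grid.items()}
    return axis_totals, sum(axis_totals.values())

axis_totals, total = remarkability_score(evaluation_grid)
print(axis_totals, total)
```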
DOE Office of Scientific and Technical Information (OSTI.GOV)
Troia, Matthew J.; McManamay, Ryan A.
2016-06-12
Primary biodiversity data constitute observations of particular species at given points in time and space. Open-access electronic databases provide unprecedented access to these data, but their usefulness in characterizing species distributions and patterns in biodiversity depends on how complete species inventories are at a given survey location and how uniformly distributed survey locations are along dimensions of time, space, and environment. Our aim was to compare completeness and coverage among three open-access databases representing ten taxonomic groups (amphibians, birds, freshwater bivalves, crayfish, freshwater fish, fungi, insects, mammals, plants, and reptiles) in the contiguous United States. We compiled occurrence records from the Global Biodiversity Information Facility (GBIF), the North American Breeding Bird Survey (BBS), and federally administered fish surveys (FFS). In this study, we aggregated occurrence records by 0.1° × 0.1° grid cells and computed three completeness metrics to classify each grid cell as well-surveyed or not. Next, we compared frequency distributions of surveyed grid cells to background environmental conditions in a GIS and performed Kolmogorov–Smirnov tests to quantify coverage through time, along two spatial gradients, and along eight environmental gradients. The three databases contributed >13.6 million reliable occurrence records distributed among >190,000 grid cells. The percent of well-surveyed grid cells was substantially lower for GBIF (5.2%) than for systematic surveys (BBS and FFS; 82.5%). Still, the large number of GBIF occurrence records produced at least 250 well-surveyed grid cells for six of nine taxonomic groups. Coverages of systematic surveys were less biased across spatial and environmental dimensions but were more biased in temporal coverage compared to GBIF data. GBIF coverages also varied among taxonomic groups, consistent with commonly recognized geographic, environmental, and institutional sampling biases. Lastly, this comprehensive assessment of biodiversity data across the contiguous United States provides a prioritization scheme to fill in the gaps by contributing existing occurrence records to the public domain and planning future surveys.
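The aggregation step described above can be sketched as binning occurrence records into 0.1° grid cells and flagging cells as well-surveyed with a completeness metric. The snippet below is a hedged stand-in: it uses a single observed-over-Chao1-estimated richness ratio with an arbitrary threshold, rather than the three metrics used in the study, and all records are invented.

```python
import math
from collections import defaultdict

def cell_index(lat, lon, size=0.1):
    """Assign an occurrence record to a 0.1 x 0.1 degree grid cell."""
    return (math.floor(lat / size), math.floor(lon / size))

def well_surveyed_cells(records, min_completeness=0.8):
    """records: iterable of (lat, lon, species). Returns cells deemed well-surveyed.

    Completeness here is observed richness / Chao1-estimated richness, a stand-in
    for the metrics used in the study.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for lat, lon, species in records:
        counts[cell_index(lat, lon)][species] += 1

    surveyed = set()
    for cell, species_counts in counts.items():
        s_obs = len(species_counts)
        singletons = sum(1 for c in species_counts.values() if c == 1)
        doubletons = sum(1 for c in species_counts.values() if c == 2)
        chao1 = s_obs + singletons * (singletons - 1) / (2 * (doubletons + 1))
        if s_obs / chao1 >= min_completeness:
            surveyed.add(cell)
    return surveyed
```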
NASA Astrophysics Data System (ADS)
Paget, A. C.; Brodzik, M. J.; Long, D. G.; Hardman, M.
2016-02-01
The historical record of satellite-derived passive microwave brightness temperatures comprises data from multiple imaging radiometers (SMMR, SSM/I-SSMIS, AMSR-E), spanning nearly 40 years of Earth observations from 1978 to the present. Passive microwave data are used to monitor time series of many climatological variables, including ocean wind speeds, cloud liquid water, sea ice concentration and ice velocity. Gridded versions of passive microwave data have been produced using various map projections (polar stereographic, Lambert azimuthal equal-area, cylindrical equal-area, quarter-degree Plate Carrée) and data formats (flat binary, HDF). However, none of the currently available versions can be rendered in the common visualization standard, geoTIFF, without requiring cartographic reprojection. Furthermore, the reprojection details are complicated and often require expert knowledge of obscure software package options. We are producing a consistently calibrated, completely reprocessed data set of this valuable multi-sensor satellite record, using EASE-Grid 2.0, an improved equal-area projection definition that will require no reprojection for translation into geoTIFF. Our approach has been twofold: 1) define the projection ellipsoid to match the reference datum of the satellite data, and 2) include required file-level metadata for standard projection software to correctly render the data in the geoTIFF standard. The Calibrated, Enhanced Resolution Brightness Temperature (CETB) Earth System Data Record (ESDR) leverages image reconstruction techniques to enhance gridded spatial resolution to 3 km and uses newly available intersensor calibrations to improve the quality of derived geophysical products. We expect that our attention to easy geoTIFF compatibility will foster higher-quality analysis with the CETB product by enabling easy and correct intercomparison with other gridded and in situ data.
Bayless, E. Randall; Arihood, Leslie D.; Reeves, Howard W.; Sperl, Benjamin J.S.; Qi, Sharon L.; Stipe, Valerie E.; Bunch, Aubrey R.
2017-01-18
As part of the National Water Availability and Use Program established by the U.S. Geological Survey (USGS) in 2005, this study took advantage of about 14 million records from State-managed collections of water-well drillers’ records and created a database of hydrogeologic properties for the glaciated United States. The water-well drillers’ records were standardized to be relatively complete and error-free and to provide consistent variables and naming conventions that span all State boundaries. Maps and geospatial grids were developed for (1) total thickness of glacial deposits, (2) total thickness of coarse-grained deposits, (3) specific-capacity-based transmissivity and hydraulic conductivity, and (4) texture-based estimated equivalent horizontal and vertical hydraulic conductivity and transmissivity. The information included in these maps and grids is required for most assessments of groundwater availability, in addition to having applications to studies of groundwater flow and transport. The texture-based estimated equivalent horizontal and vertical hydraulic conductivity and transmissivity were based on an assumed range of hydraulic conductivity values for coarse- and fine-grained deposits and should only be used with complete awareness of the methods used to create them. However, the maps and grids of texture-based estimated equivalent hydraulic conductivity and transmissivity may be useful for application to areas where a range of measured values is available for re-scaling. Maps of hydrogeologic information for some States are presented as examples in this report, but maps and grids for all States are available electronically at the project Web site (USGS Glacial Aquifer System Groundwater Availability Study, http://mi.water.usgs.gov/projects/WaterSmart/Map-SIR2015-5105.html) and the Science Base Web site, https://www.sciencebase.gov/catalog/item/58756c7ee4b0a829a3276352.
NASA Astrophysics Data System (ADS)
Kim, Y.; Du, J.; Kimball, J. S.
2017-12-01
The landscape freeze-thaw (FT) status derived from satellite microwave remote sensing is closely linked to vegetation phenology and productivity, surface energy exchange, evapotranspiration, snow/ice melt dynamics, and trace gas fluxes over land areas affected by seasonally frozen temperatures. A long-term global satellite microwave Earth System Data Record of daily landscape freeze-thaw status (FT-ESDR) was developed using similarly calibrated 37 GHz, vertically polarized (V-pol) brightness temperatures (Tb) from the SMMR, SSM/I, and SSMIS sensors. The FT-ESDR shows mean annual spatial classification accuracies of 90.3% and 84.3% for PM and AM overpass retrievals relative to surface air temperature (SAT) measurement-based FT estimates from global weather stations. However, the coarse FT-ESDR gridding (25 km) is insufficient to distinguish finer-scale FT heterogeneity. In this study, we tested alternative finer-scale FT estimates derived from two enhanced polar-grid (3.125-km and 6-km resolution), 36.5 GHz V-pol Tb records derived from calibrated AMSR-E and AMSR2 sensor observations. The daily FT estimates are derived using a modified seasonal threshold algorithm that classifies daily Tb variations in relation to grid-cell-wise FT thresholds calibrated using ERA-Interim reanalysis-based SAT, downscaled using a digital terrain map and estimated temperature lapse rates. The resulting polar-grid FT records for a selected study year (2004) show mean annual spatial classification accuracies of 90.1% (84.2%) and 93.1% (85.8%) for the respective PM (AM) 3.125-km and 6-km Tb retrievals relative to in situ SAT measurement-based FT estimates from regional weather stations. Areas with enhanced FT accuracy include water-land boundaries and mountainous terrain. Differences in FT patterns and relative accuracy obtained from the enhanced-grid Tb records were attributed to several factors, including different noise contributions from the underlying Tb processing and spatial mismatches between the Tb retrievals and the SAT-calibrated FT thresholds.
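The seasonal threshold approach described above classifies each day as frozen or thawed by comparing the brightness temperature to a per-cell threshold. A minimal sketch, assuming the per-cell thresholds have already been calibrated against reanalysis air temperature (all array values below are hypothetical):

```python
import numpy as np

def classify_freeze_thaw(tb, thresholds):
    """Classify daily 36.5 GHz V-pol Tb as frozen (0) or thawed (1).

    tb:         array of shape (days, rows, cols) of brightness temperatures (K)
    thresholds: array of shape (rows, cols) of per-cell Tb thresholds (assumed given)
    """
    return (tb > thresholds).astype(np.uint8)

def classification_accuracy(predicted, reference):
    """Fraction of grid-cell days where satellite FT agrees with station-based FT."""
    valid = ~np.isnan(reference)
    return np.mean(predicted[valid] == reference[valid])

# Hypothetical 3-day, 2x2-cell example.
tb = np.array([[[255., 262.], [270., 268.]],
               [[258., 266.], [274., 271.]],
               [[249., 259.], [265., 263.]]])
thresholds = np.array([[256., 263.], [268., 266.]])
print(classify_freeze_thaw(tb, thresholds))
```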
NASA Astrophysics Data System (ADS)
Hardman, M.; Brodzik, M. J.; Long, D. G.
2017-12-01
Since 1978, the satellite passive microwave data record has been a mainstay of remote sensing of the cryosphere, providing twice-daily, near-global spatial coverage for monitoring changes in hydrologic and cryospheric parameters that include precipitation, soil moisture, surface water, vegetation, snow water equivalent, sea ice concentration and sea ice motion. Until recently, the available global gridded passive microwave data sets had not been produced consistently. Various projections (equal-area, polar stereographic) and a number of different gridding techniques were used, along with varying temporal sampling and a mix of Level 2 source data versions. In addition, not all data from all sensors have been processed completely, and they have not been processed in any one consistent way. Furthermore, the original gridding techniques were relatively primitive, and the data were produced on 25 km grids using the original EASE-Grid definition, which is not easily accommodated in modern software packages. As part of NASA MEaSUREs, we have re-processed all data from the SMMR, SSM/I-SSMIS and AMSR-E instruments, using the most mature Level 2 data. The Calibrated, Enhanced-Resolution Brightness Temperature (CETB) Earth System Data Record (ESDR) gridded data are now available from the NSIDC DAAC. The data are distributed as netCDF files that comply with CF-1.6 and ACDD-1.3 conventions. The data have been produced on EASE-Grid 2.0 projections at a smoothed 25 km resolution and at spatially enhanced resolutions, up to 3.125 km depending on channel frequency, using the radiometer version of the Scatterometer Image Reconstruction (rSIR) method. We expect this newly produced data set to enable scientists to analyze trends in coastal regions, marginal ice zones and mountainous terrain that could not be resolved with the previous gridded passive microwave data. The use of the EASE-Grid 2.0 definition and netCDF-CF formatting allows users to extract compliant geotiff images and provides for easy importing and correct reprojection interoperability in many standard packages. As a consistently processed, high-quality satellite passive microwave ESDR, we expect this data set to replace earlier gridded passive microwave data sets, and to pave the way for new insights from higher-resolution derived geophysical products.
Romanelli, Pantaleo; Piangerelli, Marco; Ratel, David; Gaude, Christophe; Costecalde, Thomas; Puttilli, Cosimo; Picciafuoco, Mauro; Benabid, Alim; Torres, Napoleon
2018-05-11
OBJECTIVE Wireless technology is a novel tool for the transmission of cortical signals. Wireless electrocorticography (ECoG) aims to improve the safety and diagnostic gain of procedures requiring invasive localization of seizure foci and also to provide long-term recording of brain activity for brain-computer interfaces (BCIs). However, no wireless devices aimed at these clinical applications are currently available. The authors present the application of a fully implantable and externally rechargeable neural prosthesis providing wireless ECoG recording and direct cortical stimulation (DCS). Prolonged wireless ECoG monitoring was tested in nonhuman primates by using a custom-made device (the ECoG implantable wireless 16-electrode [ECOGIW-16E] device) containing a 16-contact subdural grid. This is a preliminary step toward large-scale, long-term wireless ECoG recording in humans. METHODS The authors implanted the ECOGIW-16E device over the left sensorimotor cortex of a nonhuman primate (Macaca fascicularis), recording ECoG signals over a time span of 6 months. Daily electrode impedances were measured, aiming to maintain the impedance values below a threshold of 100 kΩ. Brain mapping was obtained through wireless cortical stimulation at fixed intervals (1, 3, and 6 months). After 6 months, the device was removed. The authors analyzed cortical tissues by using conventional histological and immunohistological investigation to assess whether there was evidence of damage after the long-term implantation of the grid. RESULTS The implant was well tolerated; no neurological or behavioral consequences were reported in the monkey, which resumed his normal activities within a few hours of the procedure. The signal quality of wireless ECoG remained excellent over the 6-month observation period. Impedance values remained well below the threshold value; the average impedance per contact remained approximately 40 kΩ. Wireless cortical stimulation induced movements of the upper and lower limbs, and elicited fine movements of the digits as well. After the monkey was euthanized, the grid was found to be encapsulated by a newly formed dural sheet. The grid removal was performed easily, and no direct adhesions of the grid to the cortex were found. Conventional histological studies showed no cortical damage in the brain region covered by the grid, except for a single microscopic spot of cortical necrosis (not visible to the naked eye) in a region that had undergone repeated procedures of electrical stimulation. Immunohistological studies of the cortex underlying the grid showed a mild inflammatory process. CONCLUSIONS This preliminary experience in a nonhuman primate shows that a wireless neuroprosthesis, with related long-term ECoG recording (up to 6 months) and multiple DCSs, was tolerated without sequelae. The authors predict that epilepsy surgery could realize great benefit from this novel prosthesis, providing an extended time span for ECoG recording.
A 149 min periodicity underlies the X-ray flaring of Sgr A*
NASA Astrophysics Data System (ADS)
Leibowitz, Elia
2018-03-01
In a 2017 paper, I showed that 39 large X-ray flares of Sgr A* recorded by the Chandra observatory in 2012 are preferentially concentrated around the tick marks of an equidistant grid on the time axis. The period of this grid, as found in that paper, is 0.1033 d. In this work I show that the effect can be found among all the large X-ray flares recorded by Chandra and XMM-Newton over 15 yr. The mid-points of all 71 large flares recorded between the years 2000 and 2014 are also tightly grouped around the tick marks of a grid with this period or, more likely, 0.1032 d. This result is obtained with a confidence level of at least 3.27σ and very likely of 4.62σ. I also find a possible hint that a similar grid underlies IR flares of the object. I suggest that the pacemaker of the occurrences of the large X-ray flares of Sgr A* is a mass of the order of a low-mass star or a small planet, in a slightly eccentric Keplerian orbit around the SMBH at the centre of the Galaxy. The radius of this orbit is about 6.6 Schwarzschild radii of the BH.
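One simple way to quantify clustering of event times around the tick marks of a trial period is to fold the flare mid-times at that period and measure their phase concentration with a Rayleigh-type statistic. The sketch below is only an illustration of that idea, not the statistical test used in the paper, and the flare times are synthetic:

```python
import numpy as np

def phase_concentration(times, period):
    """Rayleigh-style concentration of event times folded at a trial period.

    Returns a value near 1 when events cluster around a single phase
    (i.e., around the tick marks of an equidistant time grid) and near 0
    when phases are uniform.
    """
    phases = 2 * np.pi * ((times / period) % 1.0)
    return np.hypot(np.cos(phases).mean(), np.sin(phases).mean())

rng = np.random.default_rng(0)
trial_period = 0.1032  # days
# Synthetic flare mid-times: grid-aligned ticks plus small scatter.
times = np.sort(rng.integers(0, 500, 71)) * trial_period + rng.normal(0, 0.01, 71)

periods = np.linspace(0.08, 0.13, 2000)
scores = [phase_concentration(times, p) for p in periods]
print(periods[int(np.argmax(scores))])  # recovers a period near 0.1032 d for this synthetic example
```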
Membrane potential dynamics of grid cells
Domnisoru, Cristina; Kinkhabwala, Amina A.; Tank, David W.
2014-01-01
During navigation, grid cells increase their spike rates in firing fields arranged on a strikingly regular triangular lattice, while their spike timing is often modulated by theta oscillations. Oscillatory interference models of grid cells predict theta amplitude modulations of membrane potential during firing field traversals, while competing attractor network models predict slow depolarizing ramps. Here, using in-vivo whole-cell recordings, we tested these models by directly measuring grid cell intracellular potentials in mice running along linear tracks in virtual reality. Grid cells had large and reproducible ramps of membrane potential depolarization that were the characteristic signature tightly correlated with firing fields. Grid cells also exhibited intracellular theta oscillations that influenced their spike timing. However, the properties of theta amplitude modulations were not consistent with the view that they determine firing field locations. Our results support cellular and network mechanisms in which grid fields are produced by slow ramps, as in attractor models, while theta oscillations control spike timing. PMID:23395984
Elwyn, Glyn; Pickles, Tim; Edwards, Adrian; Kinsey, Katharine; Brain, Kate; Newcombe, Robert G; Firth, Jill; Marrin, Katy; Nye, Alan; Wood, Fiona
2016-04-01
To evaluate whether introducing tools specifically designed for use in clinical encounters, namely Option Grids, into a clinical practice setting leads to higher levels of shared decision making. A stepped wedge trial design was used, in which 6 physiotherapists at an interface clinic in Oldham, UK, were sequentially instructed in how to use an Option Grid for osteoarthritis of the knee. Patients with suspected or confirmed osteoarthritis of the knee were recruited, six per clinician prior to instruction, and six per clinician afterwards. We measured shared decision making, patient knowledge, and readiness to decide. A total of 72 patients were recruited; 36 were allocated to the intervention group. There was an 8.4 point (95% CI 4.4 to 12.2) increase in the Observer OPTION score (range 0-100) in the intervention group. The mean gain in knowledge was 0.9 points (score range 0-5, 95% CI 0.3 to 1.5). There was no increase in encounter duration. Shared decision making increased when clinicians used the knee osteoarthritis Option Grid. Tools designed to support collaboration and deliberation about treatment options lead to increased levels of shared decision making. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Ban, Tomohiro; Ohue, Masahito; Akiyama, Yutaka
2018-04-01
The identification of comprehensive drug-target interactions is important in drug discovery. Although numerous computational methods have been developed over the years, a gold standard technique has not been established. Computational ligand docking and structure-based drug design allow researchers to predict the binding affinity between a compound and a target protein, and thus, they are often used to virtually screen compound libraries. In addition, docking techniques have also been applied to the virtual screening of target proteins (inverse docking) to predict target proteins of a drug candidate. Nevertheless, a more accurate docking method is currently required. In this study, we proposed a method in which a predicted ligand-binding site is covered by multiple grids, termed multiple grid arrangement. Notably, multiple grid arrangement facilitates the conformational search for grid-based ligand docking software and can be applied to the state-of-the-art commercial docking software Glide (Schrödinger, LLC). We validated the proposed method by re-docking with the Astex diverse benchmark dataset and blind binding site situations, which improved the correct prediction rate of the top-scoring docking pose from 27.1% to 34.1%; however, only a slight improvement in target prediction accuracy was observed with inverse docking scenarios. These findings highlight the limitations and challenges of current scoring functions and the need for more accurate docking methods. The proposed multiple grid arrangement method was implemented in Glide by modifying a cross-docking script for Glide, xglide.py. The script of our method is freely available online at http://www.bi.cs.titech.ac.jp/mga_glide/. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
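The geometric idea of covering a predicted binding site with several grid boxes can be sketched independently of any particular docking engine. The following toy example (not the xglide.py modification, and with invented pocket coordinates) simply tiles the bounding box of the predicted site with cubic boxes; a docking run would then be launched per box and the best-scoring pose kept across boxes:

```python
import numpy as np

def arrange_grids(site_points, box_size=10.0):
    """Tile the bounding box of a predicted binding site with cubic grid boxes.

    Returns the centre of each box.
    """
    mins, maxs = site_points.min(axis=0), site_points.max(axis=0)
    axes = [np.arange(lo + box_size / 2, hi + box_size / 2, box_size)
            for lo, hi in zip(mins, maxs)]
    cx, cy, cz = np.meshgrid(*axes, indexing="ij")
    return np.column_stack([cx.ravel(), cy.ravel(), cz.ravel()])

site = np.random.default_rng(2).uniform(0, 25, size=(200, 3))  # invented pocket coordinates
print(arrange_grids(site).shape)
```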
Dinehart, R.L.; Burau, J.R.
2005-01-01
A strategy of repeated surveys by acoustic Doppler current profiler (ADCP) was applied in a tidal river to map velocity vectors and suspended-sediment indicators. The Sacramento River at the junction with the Delta Cross Channel at Walnut Grove, California, was surveyed over several tidal cycles in the fall of 2000 and 2001 with a vessel-mounted ADCP. Velocity profiles were recorded along flow-defining survey paths, with surveys repeated every 27 min through a diurnal tidal cycle. Velocity vectors along each survey path were interpolated to a three-dimensional Cartesian grid that conformed to local bathymetry. A separate array of vectors was interpolated onto a grid from each survey. By displaying interpolated vector grids sequentially with computer animation, flow dynamics of the reach could be studied in three dimensions as flow responded to the tidal cycle. Velocity streamtraces in the grid showed the upwelling of flow from the bottom of the Sacramento River channel into the Delta Cross Channel. The sequential display of vector grids showed that water in the canal briefly returned into the Sacramento River after peak flood tides, which had not been known previously. In addition to velocity vectors, ADCP data were processed to derive channel bathymetry and a spatial indicator for suspended-sediment concentration. Individual beam distances to bed, recorded by the ADCP, were transformed to yield bathymetry accurate enough to resolve small bedforms within the study reach. While recording velocity, ADCPs also record the intensity of acoustic backscatter from particles suspended in the flow. Sequential surveys of backscatter intensity were interpolated to grids and animated to indicate the spatial movement of suspended sediment through the study reach. Calculation of backscatter flux through cross-sectional grids provided a first step for computation of suspended-sediment discharge, the second step being a calibrated relation between backscatter intensity and sediment concentration. Spatial analyses of ADCP data showed that a strategy of repeated surveys and flow-field interpolation has the potential to simplify computation of flow and sediment discharge through complex waterways. The use of trade, product, industry, or firm names in this report is for descriptive purposes only and does not constitute endorsement of products by the US Government. © 2005 Elsevier B.V. All rights reserved.
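The interpolation step described above (scattered ADCP velocity samples mapped onto a regular three-dimensional Cartesian grid, one survey at a time, so the sequence of grids can be animated) might look like the following hedged sketch; the function name and argument layout are assumptions, not the authors' code:

```python
import numpy as np
from scipy.interpolate import griddata

def interpolate_survey(xyz, velocities, grid_shape, bounds):
    """Interpolate scattered ADCP velocity vectors onto a regular 3-D grid.

    xyz:        (n, 3) array of sample positions from one survey
    velocities: (n, 3) array of (u, v, w) at those positions
    bounds:     ((xmin, xmax), (ymin, ymax), (zmin, zmax))
    """
    axes = [np.linspace(lo, hi, n) for (lo, hi), n in zip(bounds, grid_shape)]
    gx, gy, gz = np.meshgrid(*axes, indexing="ij")
    grid_pts = np.column_stack([gx.ravel(), gy.ravel(), gz.ravel()])
    interp = np.stack(
        [griddata(xyz, velocities[:, k], grid_pts, method="linear") for k in range(3)],
        axis=-1,
    )
    return interp.reshape(*grid_shape, 3)

# One interpolated grid per repeated survey; animating the sequence reveals the tidal dynamics.
```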
Chromosomal aberrations in Sigmodon hispidus from a Superfund site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowers, B.; McBee, K.; Lochmiller, R.
1995-12-31
Cotton rats (Sigmodon hispidus) were collected from an EPA Superfund site located on an abandoned oil refinery. Three trapping grids were located on the refinery and three similar grids were located at uncontaminated localities which served as reference sites. Bone marrow metaphase chromosome preparations were examined for chromosomal damage. For each individual, 50 cells were scored for six classes of chromosomal lesions. For the fall 1991 trapping period, mean number of aberrant cells per individual was 2.33, 0.85, and 1.50 for the three Superfund grids. Mean number of aberrant cells per individual was 2.55, 2.55, and 2.12 from the reference grids. Mean number of lesions per cell was 2.77, 0.86, and 1.9 from the Superfund grids, and 3.55, 2.77, and 2.50 from the reference grids. For the spring 1992 trapping period, more damage was observed in animals from both Superfund and reference sites; however, animals from Superfund grids had more damage than animals from reference grids. Mean number of aberrant cells per individual was 3.50, 3.25, and 3.70 from the Superfund grids, and 2.40, 2.11, and 1.40 from the reference grids. Mean number of lesions per cell was 4.80, 4.25, and 5.50 from the Superfund grids, and 2.60, 2.33, and 1.50 from the reference grids. These data suggest animals may be more susceptible to chromosomal damage during winter months, and animals from the Superfund grids appear to be more severely affected than animals from reference grids.
NASA Astrophysics Data System (ADS)
Degenhart, Alan D.; Eles, James; Dum, Richard; Mischel, Jessica L.; Smalianchuk, Ivan; Endler, Bridget; Ashmore, Robin C.; Tyler-Kabara, Elizabeth C.; Hatsopoulos, Nicholas G.; Wang, Wei; Batista, Aaron P.; Cui, X. Tracy
2016-08-01
Objective. Electrocorticography (ECoG), used as a neural recording modality for brain-machine interfaces (BMIs), potentially allows for field potentials to be recorded from the surface of the cerebral cortex for long durations without suffering the host-tissue reaction to the extent that is common with intracortical microelectrodes. Though the stability of signals obtained from chronically implanted ECoG electrodes has begun receiving attention, to date little work has characterized the effects of long-term implantation of ECoG electrodes on underlying cortical tissue. Approach. We implanted and recorded from a high-density ECoG electrode grid subdurally over cortical motor areas of a Rhesus macaque for 666 d. Main results. Histological analysis revealed minimal damage to the cortex underneath the implant, though the grid itself was encapsulated in collagenous tissue. We observed macrophages and foreign body giant cells at the tissue-array interface, indicative of a stereotypical foreign body response. Despite this encapsulation, cortical modulation during reaching movements was observed more than 18 months post-implantation. Significance. These results suggest that ECoG may provide a means by which stable chronic cortical recordings can be obtained with comparatively little tissue damage, facilitating the development of clinically viable BMI systems.
Wuchty, Stefan
2006-05-23
While the analysis of unweighted biological webs as diverse as genetic, protein and metabolic networks allowed spectacular insights into the inner workings of a cell, biological networks are not only determined by their static grid of links. In fact, we expect that the heterogeneity in the utilization of connections has a major impact on the organization of cellular activities as well. We consider a web of interactions between protein domains of the Protein Family database (PFAM), which are weighted by a probability score. We apply metrics that combine the static layout and the weights of the underlying interactions. We observe that unweighted measures as well as their weighted counterparts largely share the same trends in the underlying domain interaction network. However, we only find weak signals that the weights and the static grid of interactions are connected entities. Therefore, assuming that a protein interaction is governed by a single domain interaction, we observe strong and significant correlations between the highest-scoring domain interaction and the confidence of protein interactions in the underlying interaction data of yeast and fly. Modeling an interaction between proteins whenever we find a high-scoring protein domain interaction, we obtain 1,428 protein interactions among 361 proteins in the human malaria parasite Plasmodium falciparum. Assessing their quality with a logistic regression method, we observe that increasing confidence of predicted interactions is accompanied by high-scoring domain interactions and elevated levels of functional similarity and evolutionary conservation. Our results indicate that probability scores are randomly distributed, allowing us to treat the static grid and the weights of domain interactions as separate entities. In particular, this finding confirms earlier observations that a protein interaction is a matter of a single interaction event at the domain level. As an immediate application, we show a simple way to predict potential protein interactions by utilizing expectation scores of single domain interactions.
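The prediction rule sketched above (call a protein pair interacting if its best domain-pair score clears a threshold) can be written compactly. The snippet below is only an illustration of that rule; the domain assignments, scores and threshold are invented:

```python
def best_domain_score(domains_a, domains_b, domain_scores):
    """Highest probability score over all domain pairs of two proteins."""
    return max(
        (domain_scores.get((da, db), domain_scores.get((db, da), 0.0))
         for da in domains_a for db in domains_b),
        default=0.0,
    )

def predict_interaction(domains_a, domains_b, domain_scores, threshold=0.9):
    return best_domain_score(domains_a, domains_b, domain_scores) >= threshold

# Invented PFAM-style domain assignments and domain-pair scores.
protein_domains = {"P1": {"PF00001", "PF00002"}, "P2": {"PF00069"}}
domain_scores = {("PF00002", "PF00069"): 0.95}
print(predict_interaction(protein_domains["P1"], protein_domains["P2"], domain_scores))
```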
Bruel and Kjaer 4944 Microphone Grid Frequency Response Function System Identification
NASA Technical Reports Server (NTRS)
Bennett, Reginald; Lee, Erik
2010-01-01
A Brüel & Kjaer (B&K) 4944B pressure field microphone was judiciously selected to measure acoustic environments (400 Hz to 50 kHz) in close proximity to the nozzle during multiple firings of solid propellant rocket motors. It is well known that protective grids can affect the frequency response of microphones. B&K recommends operation of the B&K 4944B without a protective grid when recording measurements above 10 to 15 kHz.
NASA Astrophysics Data System (ADS)
Liguori, Sara; O'Loughlin, Fiachra; Souvignet, Maxime; Coxon, Gemma; Freer, Jim; Woods, Ross
2014-05-01
This research presents a newly developed observed sub-daily gridded precipitation product for England and Wales. Importantly, our analysis specifically allows a quantification of rainfall errors from the grid to the catchment scale, useful for hydrological model simulation and the evaluation of prediction uncertainties. Our methodology involves the disaggregation of the current one-kilometre daily gridded precipitation records available for the United Kingdom [1]. The hourly product is created using information from: 1) 2000 tipping-bucket rain gauges; and 2) the United Kingdom Met Office weather radar network. These two independent datasets provide rainfall estimates at temporal resolutions much smaller than the current daily gridded rainfall product, thus allowing the disaggregation of the daily rainfall records to an hourly timestep. Our analysis is conducted for the period 2004 to 2008, limited by the current availability of the datasets. We analyse the uncertainty components affecting the accuracy of this product. Specifically, we explore how these uncertainties vary spatially, temporally and with climatic regimes. Preliminary results indicate scope for improvement of hydrological model performance through the use of this new hourly gridded rainfall product. Such a product will improve our ability to diagnose and identify structural errors in hydrological modelling by including the quantification of input errors. References: [1] Keller V, Young AR, Morris D, Davies H (2006) Continuous Estimation of River Flows. Technical Report: Estimation of Precipitation Inputs. Environment Agency.
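Conceptually, the disaggregation step amounts to splitting each daily grid total across 24 hours in proportion to sub-daily weights derived from the radar and tipping-bucket data. A hedged sketch of that idea, assuming the weights are already gridded (all array values below are invented):

```python
import numpy as np

def disaggregate_daily(daily_total, hourly_weights):
    """Split a daily gridded rainfall total into hourly values.

    daily_total:    (rows, cols) daily accumulation in mm
    hourly_weights: (24, rows, cols) non-negative weights from radar/gauge data
    """
    weight_sum = hourly_weights.sum(axis=0)
    # Where no sub-daily information exists, fall back to a uniform split.
    fractions = np.where(weight_sum > 0,
                         hourly_weights / np.where(weight_sum > 0, weight_sum, 1),
                         1 / 24)
    return fractions * daily_total  # (24, rows, cols); sums back to the daily total

daily = np.array([[12.0, 0.0], [3.5, 8.2]])
weights = np.random.default_rng(1).random((24, 2, 2))
hourly = disaggregate_daily(daily, weights)
print(np.allclose(hourly.sum(axis=0), daily))  # True
```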
The functional micro-organization of grid cells revealed by cellular-resolution imaging
Heys, James G.; Rangarajan, Krsna V.; Dombeck, Daniel A.
2015-01-01
Summary Establishing how grid cells are anatomically arranged, on a microscopic scale, in relation to their firing patterns in the environment would facilitate a greater micro-circuit level understanding of the brain’s representation of space. However, all previous grid cell recordings used electrode techniques that provide limited descriptions of fine-scale organization. We therefore developed a technique for cellular-resolution functional imaging of medial entorhinal cortex (MEC) neurons in mice navigating a virtual linear track, enabling a new experimental approach to study MEC. Using these methods, we show that grid cells are physically clustered in MEC compared to non-grid cells. Additionally, we demonstrate that grid cells are functionally micro-organized: The similarity between the environment firing locations of grid cell pairs varies as a function of the distance between them according to a “Mexican Hat” shaped profile. This suggests that, on average, nearby grid cells have more similar spatial firing phases than those further apart. PMID:25467986
mantisGRID: a grid platform for DICOM medical images management in Colombia and Latin America.
Garcia Ruiz, Manuel; Garcia Chaves, Alvin; Ruiz Ibañez, Carlos; Gutierrez Mazo, Jorge Mario; Ramirez Giraldo, Juan Carlos; Pelaez Echavarria, Alejandro; Valencia Diaz, Edison; Pelaez Restrepo, Gustavo; Montoya Munera, Edwin Nelson; Garcia Loaiza, Bernardo; Gomez Gonzalez, Sebastian
2011-04-01
This paper presents the mantisGRID project, an interinstitutional initiative from Colombian medical and academic centers aiming to provide medical grid services for Colombia and Latin America. The mantisGRID is a grid platform, based on open source grid infrastructure, that provides the necessary services to access and exchange medical images and associated information following the digital imaging and communications in medicine (DICOM) and health level 7 standards. The paper focuses first on the data abstraction architecture, which is achieved via Open Grid Services Architecture Data Access and Integration (OGSA-DAI) services and supported by the Globus Toolkit. The grid currently uses a 30-Mb bandwidth of the Colombian High Technology Academic Network, RENATA, connected to Internet2. It also includes a discussion on the relational database created to handle the DICOM objects that were represented using Extensible Markup Language Schema documents, as well as other features implemented such as data security, user authentication, and patient confidentiality. Grid performance was tested using the three currently operative nodes, and the results demonstrated comparable query times between the mantisGRID (OGSA-DAI) and distributed MySQL databases, especially for a large number of records.
NASA Astrophysics Data System (ADS)
Ramsdale, Jason D.; Balme, Matthew R.; Conway, Susan J.; Gallagher, Colman; van Gasselt, Stephan A.; Hauber, Ernst; Orgel, Csilla; Séjourné, Antoine; Skinner, James A.; Costard, Francois; Johnsson, Andreas; Losiak, Anna; Reiss, Dennis; Swirad, Zuzanna M.; Kereszturi, Akos; Smith, Isaac B.; Platz, Thomas
2017-06-01
The increased volume, spatial resolution, and areal coverage of high-resolution images of Mars over the past 15 years have led to an increased quantity and variety of small-scale landform identifications. Though many such landforms are too small to represent individually on regional-scale maps, determining their presence or absence across large areas helps form the observational basis for developing hypotheses on the geological nature and environmental history of a study area. The combination of improved spatial resolution and near-continuous coverage significantly increases the time required to analyse the data. This becomes problematic when attempting regional or global-scale studies of metre and decametre-scale landforms. Here, we describe an approach for mapping small features (from decimetre to kilometre scale) across large areas, formulated for a project to study the northern plains of Mars, and provide context on how this method was developed and how it can be implemented. Rather than "mapping" with points and polygons, grid-based mapping uses a "tick box" approach to efficiently record the locations of specific landforms (we use an example suite of glacial landforms, including viscous flow features, the latitude-dependent mantle and polygonised ground). A grid of squares (e.g. 20 km by 20 km) is created over the mapping area. Then the basemap data are systematically examined, grid-square by grid-square at full resolution, in order to identify the landforms while recording the presence or absence of selected landforms in each grid-square to determine spatial distributions. The result is a series of grids recording the distribution of all the mapped landforms across the study area. In some ways, these are equivalent to raster images, as they show a continuous distribution-field of the various landforms across a defined (rectangular, in most cases) area. When overlain on context maps, these form a coarse, digital landform map. We find that grid-based mapping provides an efficient solution to the problems of mapping small landforms over large areas, by providing a consistent and standardised approach to spatial data collection. The simplicity of the grid-based mapping approach makes it extremely scalable and workable for group efforts, requiring minimal user experience and producing consistent and repeatable results. The discrete nature of the datasets, simplicity of approach, and divisibility of tasks open up the possibility for citizen science, in which crowdsourcing large grid-based mapping areas could be applied.
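A minimal sketch of the "tick box" bookkeeping described above might look as follows; the cell size and landform names are illustrative only.

```python
from collections import defaultdict

CELL_KM = 20  # e.g. 20 km by 20 km grid squares

def cell_index(x_km, y_km):
    """Map a map-projected coordinate (km) to its grid-square index."""
    return (int(x_km // CELL_KM), int(y_km // CELL_KM))

# Presence/absence "tick boxes" per grid square, one flag per landform type.
grid = defaultdict(lambda: {"viscous_flow": False, "mantle": False, "polygons": False})

def record_observation(x_km, y_km, landform):
    grid[cell_index(x_km, y_km)][landform] = True

record_observation(133.0, 41.5, "polygons")   # hypothetical mapped occurrence
print(grid[cell_index(133.0, 41.5)])
```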
Property Grids for the Kansas High Plains Aquifer from Water Well Drillers' Logs
NASA Astrophysics Data System (ADS)
Bohling, G.; Adkins-Heljeson, D.; Wilson, B. B.
2017-12-01
Like a number of state and provincial geological agencies, the Kansas Geological Survey hosts a database of water well drillers' logs, containing the records of sediments and lithologies characterized during drilling. At the moment, the KGS database contains records associated with over 90,000 wells statewide. Over 60,000 of these wells are within the High Plains aquifer (HPA) in Kansas, with the corresponding logs containing descriptions of over 500,000 individual depth intervals. We will present grids of hydrogeological properties for the Kansas HPA developed from this extensive, but highly qualitative, data resource. The process of converting the logs into quantitative form consists of first translating the vast number of unique (and often idiosyncratic) sediment descriptions into a fairly comprehensive set of standardized lithology codes and then mapping the standardized lithologies into a smaller number of property categories. A grid is superimposed on the region and the proportion of each property category is computed within each grid cell, with category proportions in empty grid cells computed by interpolation. Grids of properties such as hydraulic conductivity and specific yield are then computed based on the category proportion grids and category-specific property values. A two-dimensional grid is employed for this large-scale, regional application, with category proportions averaged between two surfaces, such as bedrock and the water table at a particular time (to estimate transmissivity at that time) or water tables at two different times (to estimate specific yield over the intervening time period). We have employed a sequence of water tables for different years, based on annual measurements from an extensive network of wells, providing an assessment of temporal variations in the vertically averaged aquifer properties resulting from water level variations (primarily declines) over time.
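The conversion from standardized lithology intervals to per-cell property estimates could be sketched roughly as below; the category names, conductivity values, and the thickness-weighted averaging rule are assumptions for illustration, not the KGS workflow.

```python
# Illustrative hydraulic conductivities (m/day) per property category.
K_BY_CATEGORY = {"clay": 1e-4, "silt": 1e-2, "sand": 10.0, "gravel": 100.0}

def cell_properties(intervals):
    """intervals: list of (category, thickness_m) logged within one grid cell
    between the two bounding surfaces (e.g. water table and bedrock)."""
    total = sum(t for _, t in intervals)
    proportions = {c: sum(t for cat, t in intervals if cat == c) / total
                   for c in K_BY_CATEGORY}
    # Thickness-weighted arithmetic mean; a geometric mean or another
    # averaging rule could equally be substituted.
    k_mean = sum(proportions[c] * K_BY_CATEGORY[c] for c in K_BY_CATEGORY)
    return proportions, k_mean

props, k = cell_properties([("sand", 12.0), ("clay", 3.0), ("gravel", 5.0)])
print(props, k)
```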
Skin texture parameters of the dorsal hand in evaluating skin aging in China.
Gao, Qian; Hu, Li-Wen; Wang, Yang; Xu, Wen-Ying; Ouyang, Nan-Ning; Dong, Guo-Qing; Shi, Song-Tian; Liu, Yang
2011-11-01
There are various non-invasive methods in skin morphology for assessing skin aging. The use of digital photography makes this easier and more convenient. In this study, we explored some skin texture parameters for evaluating skin aging using digital image processing. Two hundred and twenty-eight subjects who lived in Sanya, China, were involved. Individual sun exposure history and other factors influencing skin aging were collected by a questionnaire. Meanwhile, we took photos of their dorsal hands. Skin images were graded according to the Beagley-Gibson system. These skin images were also processed using image analysis software. Five skin texture parameters, Angle Num., Angle Max., Angle Diff., Distance and Grids, were produced in reference to the Beagley-Gibson system. All texture parameters were significantly associated with the Beagley-Gibson score. Among the parameters, the distance between primary lines (Distance) and the values of the angles formed by intersecting texture lines (Angle Max., Angle Diff.) were positively associated with the Beagley-Gibson score. However, there was a negative correlation between the number of grids (Grids), the number of angles (Angle Num.) and the Beagley-Gibson score. These texture parameters were also correlated with factors influencing skin aging such as sun exposure, age, smoking, drinking and body mass index. In multivariate analysis, Grids and Distance were mainly affected by age, whereas Angle Max. and Angle Diff. were mainly affected by sun exposure. The skin surface morphologic parameters presented in our study appear to reflect skin aging changes to some extent and could be used to describe skin aging using digital image processing. © 2011 John Wiley & Sons A/S.
Globally Gridded Satellite (GridSat) Observations for Climate Studies
NASA Technical Reports Server (NTRS)
Knapp, Kenneth R.; Ansari, Steve; Bain, Caroline L.; Bourassa, Mark A.; Dickinson, Michael J.; Funk, Chris; Helms, Chip N.; Hennon, Christopher C.; Holmes, Christopher D.; Huffman, George J.;
2012-01-01
Geostationary satellites have provided routine, high temporal resolution Earth observations since the 1970s. Despite the long period of record, use of these data in climate studies has been limited for numerous reasons, among them: there is no central archive of geostationary data for all international satellites, full temporal and spatial resolution data are voluminous, and diverse calibration and navigation formats encumber the uniform processing needed for multi-satellite climate studies. The International Satellite Cloud Climatology Project set the stage for overcoming these issues by archiving a subset of the full resolution geostationary data at approximately 10-km resolution at 3-hourly intervals since 1983. Recent efforts at NOAA's National Climatic Data Center to provide convenient access to these data include remapping the data to a standard map projection, recalibrating the data to optimize temporal homogeneity, extending the record of observations back to 1980, and reformatting the data for broad public distribution. The Gridded Satellite (GridSat) dataset includes observations from the visible, infrared window, and infrared water vapor channels. Data are stored in the netCDF format using standards that permit a wide variety of tools and libraries to quickly and easily process the data. A novel data layering approach, together with appropriate satellite and file metadata, allows users to access GridSat data at varying levels of complexity based on their needs. The result is a climate data record already in use by the meteorological community. Examples include reanalysis of tropical cyclones, studies of global precipitation, and detection and tracking of the intertropical convergence zone.
Pastoll, Hugh; Ramsden, Helen L.; Nolan, Matthew F.
2012-01-01
The medial entorhinal cortex (MEC) is an increasingly important focus for investigation of mechanisms for spatial representation. Grid cells found in layer II of the MEC are likely to be stellate cells, which form a major projection to the dentate gyrus. Entorhinal stellate cells are distinguished by distinct intrinsic electrophysiological properties, but how these properties contribute to representation of space is not yet clear. Here, we review the ionic conductances, synaptic, and excitable properties of stellate cells, and examine their implications for models of grid firing fields. We discuss why existing data are inconsistent with models of grid fields that require stellate cells to generate periodic oscillations. An alternative possibility is that the intrinsic electrophysiological properties of stellate cells are tuned specifically to control integration of synaptic input. We highlight recent evidence that the dorsal-ventral organization of synaptic integration by stellate cells, through differences in currents mediated by HCN and leak potassium channels, influences the corresponding organization of grid fields. Because accurate cellular data will be important for distinguishing mechanisms for generation of grid fields, we introduce new data comparing properties measured with whole-cell and perforated patch-clamp recordings. We find that clustered patterns of action potential firing and the action potential after-hyperpolarization (AHP) are particularly sensitive to recording condition. Nevertheless, with both methods, these properties, resting membrane properties and resonance follow a dorsal-ventral organization. Further investigation of the molecular basis for synaptic integration by stellate cells will be important for understanding mechanisms for generation of grid fields. PMID:22536175
Rios, Anthony; Kavuluru, Ramakanth
2017-11-01
The CEGS N-GRID 2016 Shared Task in Clinical Natural Language Processing (NLP) provided a set of 1000 neuropsychiatric notes to participants as part of a competition to predict psychiatric symptom severity scores. This paper summarizes our methods, results, and experiences based on our participation in the second track of the shared task. Classical methods of text classification usually fall into one of three problem types: binary, multi-class, and multi-label classification. In this effort, we study ordinal regression problems with text data where misclassifications are penalized differently based on how far apart the ground truth and model predictions are on the ordinal scale. Specifically, we present our entries (methods and results) in the N-GRID shared task in predicting research domain criteria (RDoC) positive valence ordinal symptom severity scores (absent, mild, moderate, and severe) from psychiatric notes. We propose a novel convolutional neural network (CNN) model designed to handle ordinal regression tasks on psychiatric notes. Broadly speaking, our model combines an ordinal loss function, a CNN, and conventional feature engineering (wide features) into a single model which is learned end-to-end. Given interpretability is an important concern with nonlinear models, we apply a recent approach called locally interpretable model-agnostic explanation (LIME) to identify important words that lead to instance specific predictions. Our best model entered into the shared task placed third among 24 teams and scored a macro mean absolute error (MMAE) based normalized score (100·(1-MMAE)) of 83.86. Since the competition, we improved our score (using basic ensembling) to 85.55, comparable with the winning shared task entry. Applying LIME to model predictions, we demonstrate the feasibility of instance specific prediction interpretation by identifying words that led to a particular decision. In this paper, we present a method that successfully uses wide features and an ordinal loss function applied to convolutional neural networks for ordinal text classification specifically in predicting psychiatric symptom severity scores. Our approach leads to excellent performance on the N-GRID shared task and is also amenable to interpretability using existing model-agnostic approaches. Copyright © 2017 Elsevier Inc. All rights reserved.
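The scoring rule quoted in the abstract, 100·(1-MMAE), can be sketched as follows; the exact competition normalisation and the integer class encoding are assumptions made for this example.

```python
import numpy as np

SEVERITY = {"absent": 0, "mild": 1, "moderate": 2, "severe": 3}

def normalized_mmae_score(y_true, y_pred):
    """100 * (1 - MMAE), with MMAE the mean over true classes of the mean
    absolute error on the ordinal scale."""
    t = np.array([SEVERITY[y] for y in y_true])
    p = np.array([SEVERITY[y] for y in y_pred])
    per_class_mae = [np.abs(p[t == c] - c).mean() for c in np.unique(t)]
    return 100.0 * (1.0 - float(np.mean(per_class_mae)))

print(normalized_mmae_score(["mild", "severe", "absent"], ["mild", "moderate", "absent"]))
```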
Recording wildlife locations with the Universal Transverse Mercator (UTM) grid system
T. G. Grubb; W. L. Eakle
1988-01-01
The Universal Transverse Mercator (UTM) international, planar, grid system is described and shown to offer greater simplicity, efficiency and accuracy for plotting wildlife locations than the more familiar Latitude-Longitude (Latilong) and Section-Township-Range (Cadastral) systems, and the State planar system. Use of the UTM system is explained with examples.
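A minimal sketch of recording a location as a UTM zone/easting/northing triple, assuming the pyproj library is available and a northern-hemisphere observation; the example coordinates are hypothetical.

```python
from pyproj import Proj

def to_utm(lat, lon):
    """Return (zone, easting, northing) for a northern-hemisphere location."""
    zone = int((lon + 180.0) // 6) + 1            # standard 6-degree bands
    proj = Proj(proj="utm", zone=zone, ellps="WGS84")
    easting, northing = proj(lon, lat)
    return zone, round(easting), round(northing)

print(to_utm(35.19, -111.65))  # hypothetical observation in northern Arizona
```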
High-Density Stretchable Electrode Grids for Chronic Neural Recording
Tybrandt, Klas; Khodagholy, Dion; Dielacher, Bernd; Stauffer, Flurin; Renz, Aline F.; Buzsáki, György; Vörös, János
2018-01-01
Electrical interfacing with neural tissue is key to advancing diagnosis and therapies for neurological disorders, as well as providing detailed information about neural signals. A challenge for creating long-term stable interfaces between electronics and neural tissue is the huge mechanical mismatch between the systems. So far, materials and fabrication processes have restricted the development of soft electrode grids able to combine high performance, long-term stability, and high electrode density, aspects all essential for neural interfacing. Here, this challenge is addressed by developing a soft, high-density, stretchable electrode grid based on an inert, high-performance composite material comprising gold-coated titanium dioxide nanowires embedded in a silicone matrix. The developed grid can resolve high spatiotemporal neural signals from the surface of the cortex in freely moving rats with stable neural recording quality and preserved electrode signal coherence during 3 months of implantation. Due to its flexible and stretchable nature, it is possible to minimize the size of the craniotomy required for placement, further reducing the level of invasiveness. The material and device technology presented herein have potential for a wide range of emerging biomedical applications. PMID:29488263
Observations of solar-cell metallization corrosion
NASA Technical Reports Server (NTRS)
Mon, G. R.
1983-01-01
The Engineering Sciences Area of the Jet Propulsion Laboratory (JPL) Flat-Plate Solar Array Project is performing long-term environmental tests on photovoltaic modules at Wyle Laboratories in Huntsville, Alabama. Some modules have been exposed to 85°C/85% RH and 40°C/93% RH for up to 280 days. Other modules undergoing temperature-only exposures ( 3% RH) at 85°C and 100°C have been tested for more than 180 days. At least two modules of each design type are exposed to each environment - one with, and the other without a 100-mA forward bias. Degradation is both visually observed and electrically monitored. Visual observations of changes in appearance are recorded at each inspection time. Significant visual observations relating to metallization corrosion (and/or metallization-induced corrosion) include discoloration (yellowing and browning) of grid lines, migration of grid line material into the encapsulation (blossoming), the appearance of rainbow-like diffraction patterns on the grid lines, and brown spots on collectors and grid lines. All of these observations were recorded for electrically biased modules in the 280-day tests with humidity.
Exploration of exposure conditions with a novel wireless detector for bedside digital radiography
NASA Astrophysics Data System (ADS)
Bosmans, Hilde; Nens, Joris; Delzenne, Louis; Marshall, Nicholas; Pauwels, Herman; De Wever, Walter; Oyen, Raymond
2012-03-01
We propose, apply and validate an optimization scheme for a new wireless CsI-based DR detector in combination with a regular mobile X-ray system for bedside imaging applications. Three different grids were tested in this combination. Signal-difference-to-noise ratio was investigated in two ways: using a 1 mm Cu piece in combination with different thicknesses of PMMA, and by means of the CDRAD phantom using 10 images per condition and an automated evaluation method. A figure of merit (FOM), namely SDNR²/imparted energy, was calculated for a large range of exposure conditions, without and with a grid in place. Misalignment of the grids was evaluated via the same FOMs. This optimization study was validated with comparative X-ray acquisitions performed on dead bodies. An experienced radiologist scored the quality of several specific aspects for all these exposures. Signal-difference-to-noise ratios measured with the Cu method correlated well with the threshold contrasts from the CDRAD analysis (R² > 0.9). The analysis showed optimal FOM with detector air kerma rates as typically used in clinical practice. Lower tube voltages provide a higher FOM than higher values, but their practical use depends on the limitations of X-ray tubes, linked to patient motion artefacts. The use of high-resolution grids should be encouraged, as the FOM increases by 47% at 75 kV. The scores from the visual grading study confirmed the results obtained with the FOM. The switch to (wireless) DR technology for bedside imaging could benefit from devices to improve grid positioning or any scatter reduction technique.
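The figure of merit used above, FOM = SDNR²/imparted energy, reduces to a short calculation; the function and variable names below are illustrative, not those of the study.

```python
def sdnr(mean_open, mean_behind_cu, noise_sd):
    """Signal-difference-to-noise ratio from a copper-disc measurement."""
    return abs(mean_open - mean_behind_cu) / noise_sd

def figure_of_merit(mean_open, mean_behind_cu, noise_sd, imparted_energy):
    """FOM = SDNR^2 / imparted energy (units depend on how energy is expressed)."""
    return sdnr(mean_open, mean_behind_cu, noise_sd) ** 2 / imparted_energy

print(figure_of_merit(1200.0, 900.0, 25.0, 3.5))  # all numbers illustrative
```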
Antiscatter grid use in pediatric digital tomosynthesis imaging†
King, Jenna M.; Reed, Martin
2011-01-01
The objective of this study was to assess the effect of antiscatter grid use on tomosynthesis image quality. We performed an observer study that rated the image quality of digital tomosynthesis scout radiographs and slice images of a Leeds TO.20 contrast‐detail test object embedded in acrylic with and without a grid. We considered 10, 15, 20 and 25 cm of acrylic to represent the wide range of patient thicknesses encountered in pediatric imaging. We also acquired and rated images without a grid at an increased patient dose. The readers counted the total number of visible details in each image as a measure of relative image quality. We observed that the antiscatter grid improves tomosynthesis image quality compared to the grid‐out case, which received image quality scores similar to grid‐in radiography. Our results suggest that, in order to achieve the best image quality in exchange for the increase in patient dose, it may often be appropriate to include an antiscatter grid for pediatric tomosynthesis imaging, particularly if the patient thickness is greater than 10 cm. PACS number: 87.57.‐s PMID:22089021
34 CFR 668.24 - Record retention and examinations.
Code of Federal Regulations, 2014 CFR
2014-07-01
... administration of the Federal Perkins Loan, FWS, FSEOG, Federal Pell Grant, ACG, National SMART Grant, or TEACH... records necessary to support the data contained in the FISAP, including “income grid information,” for...
34 CFR 668.24 - Record retention and examinations.
Code of Federal Regulations, 2012 CFR
2012-07-01
... administration of the Federal Perkins Loan, FWS, FSEOG, Federal Pell Grant, ACG, National SMART Grant, or TEACH... records necessary to support the data contained in the FISAP, including “income grid information,” for...
34 CFR 668.24 - Record retention and examinations.
Code of Federal Regulations, 2013 CFR
2013-07-01
... administration of the Federal Perkins Loan, FWS, FSEOG, Federal Pell Grant, ACG, National SMART Grant, or TEACH... records necessary to support the data contained in the FISAP, including “income grid information,” for...
The functional micro-organization of grid cells revealed by cellular-resolution imaging.
Heys, James G; Rangarajan, Krsna V; Dombeck, Daniel A
2014-12-03
Establishing how grid cells are anatomically arranged, on a microscopic scale, in relation to their firing patterns in the environment would facilitate a greater microcircuit-level understanding of the brain's representation of space. However, all previous grid cell recordings used electrode techniques that provide limited descriptions of fine-scale organization. We therefore developed a technique for cellular-resolution functional imaging of medial entorhinal cortex (MEC) neurons in mice navigating a virtual linear track, enabling a new experimental approach to study MEC. Using these methods, we show that grid cells are physically clustered in MEC compared to nongrid cells. Additionally, we demonstrate that grid cells are functionally micro-organized: the similarity between the environment firing locations of grid cell pairs varies as a function of the distance between them according to a "Mexican hat"-shaped profile. This suggests that, on average, nearby grid cells have more similar spatial firing phases than those further apart. Copyright © 2014 Elsevier Inc. All rights reserved.
Complications and results of subdural grid electrode implantation in epilepsy surgery.
Lee, W S; Lee, J K; Lee, S A; Kang, J K; Ko, T S
2000-11-01
We assessed the risk of delayed subdural hematoma and other complications associated with subdural grid implantation. Forty-nine patients underwent subdural grid implantation with/without subdural strips or depth electrodes from January 1994 to August 1998. To identify the risk associated with subdural grid implantation, a retrospective review of all patients' medical records and radiological studies was performed. The major complications of 50 subdural grid electrode implantations were as follows: four cases (7.8%) of delayed subdural hematoma at the site of the subdural grid, requiring emergency operation; two cases (3.9%) of infection; one case (2.0%) of epidural hematoma; and one case (2.0%) of brain swelling. After subdural hematoma removal, the electrodes were left in place. CCTV monitoring and cortical stimulation studies were continued thereafter. No delayed subdural hematoma has occurred since routine placement of subdural drains was begun. In our experience the worst complication of subdural grid implantation has been delayed subdural hematoma. Placement of subdural drains and close observation may be helpful to prevent this serious complication.
Rousselet, Jérôme; Imbert, Charles-Edouard; Dekri, Anissa; Garcia, Jacques; Goussard, Francis; Vincent, Bruno; Denux, Olivier; Robinet, Christelle; Dorkeld, Franck; Roques, Alain; Rossi, Jean-Pierre
2013-01-01
Mapping species' spatial distributions using spatial inference and prediction requires a lot of data. Occurrence data are generally not easily available from the literature and are very time-consuming to collect in the field. For that reason, we designed a survey to explore to what extent large-scale databases such as Google Maps and Google Street View could be used to derive valid occurrence data. We worked with the Pine Processionary Moth (PPM) Thaumetopoea pityocampa because the larvae of that moth build silk nests that are easily visible. The presence of the species at one location can therefore be inferred from visual records derived from the panoramic views available from Google Street View. We designed a standardized procedure allowing evaluation of the presence of the PPM on a sampling grid covering the landscape under study. The outputs were compared to field data. We investigated two landscapes using grids of different extent and mesh size. Data derived from Google Street View were highly similar to field data in the large-scale analysis based on a square grid with a mesh of 16 km (96% of matching records). Using a 2 km mesh size led to a strong divergence between field and Google-derived data (46% of matching records). We conclude that the Google databases might provide useful occurrence data for mapping the distribution of species whose presence can be visually evaluated, such as the PPM. However, the accuracy of the output strongly depends on the spatial scales considered and on the sampling grid used. Other factors, such as the coverage of the Google Street View network with regard to sampling grid size and the spatial distribution of host trees with regard to the road network, may also be determinant.
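The percentage of matching records reported above can be computed with a sketch like the following, where the grid-cell identifiers and toy data are hypothetical.

```python
def percent_matching(field, street_view):
    """Both arguments map grid-cell id -> True/False (nest present/absent)."""
    shared = set(field) & set(street_view)
    matches = sum(field[c] == street_view[c] for c in shared)
    return 100.0 * matches / len(shared)

field = {(0, 0): True, (0, 1): False, (1, 0): True}
gsv   = {(0, 0): True, (0, 1): True,  (1, 0): True}
print(percent_matching(field, gsv))  # ~66.7 on this toy example
```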
Dekri, Anissa; Garcia, Jacques; Goussard, Francis; Vincent, Bruno; Denux, Olivier; Robinet, Christelle; Dorkeld, Franck; Roques, Alain; Rossi, Jean-Pierre
2013-01-01
Mapping species' spatial distributions using spatial inference and prediction requires a lot of data. Occurrence data are generally not easily available from the literature and are very time-consuming to collect in the field. For that reason, we designed a survey to explore to what extent large-scale databases such as Google Maps and Google Street View could be used to derive valid occurrence data. We worked with the Pine Processionary Moth (PPM) Thaumetopoea pityocampa because the larvae of that moth build silk nests that are easily visible. The presence of the species at one location can therefore be inferred from visual records derived from the panoramic views available from Google Street View. We designed a standardized procedure allowing evaluation of the presence of the PPM on a sampling grid covering the landscape under study. The outputs were compared to field data. We investigated two landscapes using grids of different extent and mesh size. Data derived from Google Street View were highly similar to field data in the large-scale analysis based on a square grid with a mesh of 16 km (96% of matching records). Using a 2 km mesh size led to a strong divergence between field and Google-derived data (46% of matching records). We conclude that the Google databases might provide useful occurrence data for mapping the distribution of species whose presence can be visually evaluated, such as the PPM. However, the accuracy of the output strongly depends on the spatial scales considered and on the sampling grid used. Other factors, such as the coverage of the Google Street View network with regard to sampling grid size and the spatial distribution of host trees with regard to the road network, may also be determinant. PMID:24130675
Multiscale image processing and antiscatter grids in digital radiography.
Lo, Winnie Y; Hornof, William J; Zwingenberger, Allison L; Robertson, Ian D
2009-01-01
Scatter radiation is a source of noise and results in decreased signal-to-noise ratio and thus decreased image quality in digital radiography. We determined subjectively whether a digitally processed image made without a grid would be of similar quality to an image made with a grid but without image processing. Additionally, the effects of exposure dose and of using a grid with digital radiography on overall image quality were studied. Thoracic and abdominal radiographs of five dogs of various sizes were made. Four acquisition techniques were included: (1) with a grid, standard exposure dose, digital image processing; (2) without a grid, standard exposure dose, digital image processing; (3) without a grid, half the exposure dose, digital image processing; and (4) with a grid, standard exposure dose, no digital image processing (to mimic a film-screen radiograph). Full-size radiographs as well as magnified images of specific anatomic regions were generated. Nine reviewers rated the overall image quality subjectively using a five-point scale. All digitally processed radiographs had higher overall scores than nondigitally processed radiographs regardless of patient size, exposure dose, or use of a grid. The images made at half the exposure dose had a slightly lower quality than those made at full dose, but this was only statistically significant in magnified images. Using a grid with digital image processing led to a slight but statistically significant increase in overall quality when compared with digitally processed images made without a grid, but whether this increase in quality is clinically significant is unknown.
Home and Building Energy Management Systems | Grid Modernization | NREL
NREL building assets and energy management systems can provide value to the grid. A pair of NREL researchers received a record of invention for a home energy management system in a smart home laboratory.
Scott, Ingrid U; Ip, Michael S; VanVeldhuisen, Paul C; Oden, Neal L; Blodi, Barbara A; Fisher, Marian; Chan, Clement K; Gonzalez, Victor H; Singerman, Lawrence J; Tolentino, Michael
2009-09-01
To compare the efficacy and safety of 1-mg and 4-mg doses of preservative-free intravitreal triamcinolone with standard care (grid photocoagulation in eyes without dense macular hemorrhage and deferral of photocoagulation until hemorrhage clears in eyes with dense macular hemorrhage) for eyes with vision loss associated with macular edema secondary to branch retinal vein occlusion (BRVO). Multicenter, randomized clinical trial of 411 participants. Main outcome measure: gain in visual acuity letter score of 15 or more from baseline to month 12. Twenty-nine percent, 26%, and 27% of participants achieved the primary outcome in the standard care, 1-mg, and 4-mg groups, respectively. None of the pairwise comparisons between the 3 groups was statistically significant at month 12. The rates of elevated intraocular pressure and cataract were similar for the standard care and 1-mg groups, but higher in the 4-mg group. There was no difference identified in visual acuity at 12 months for the standard care group compared with the triamcinolone groups; however, rates of adverse events (particularly elevated intraocular pressure and cataract) were highest in the 4-mg group. Application to clinical practice: grid photocoagulation as applied in the SCORE Study remains the standard care for patients with vision loss associated with macular edema secondary to BRVO who have characteristics similar to participants in the SCORE-BRVO trial. Grid photocoagulation should remain the benchmark against which other treatments are compared in clinical trials for eyes with vision loss associated with macular edema secondary to BRVO. Trial Registration: clinicaltrials.gov identifier NCT00105027.
Large Area Coverage of a TPC Endcap with GridPix Detectors
NASA Astrophysics Data System (ADS)
Kaminski, Jochen
2018-02-01
The Large Prototype TPC at DESY, Hamburg, was built by the LCTPC collaboration as a testbed for new readout technologies of Time Projection Chambers. Up to seven modules of about 400 cm² each can be placed in the endcap. Three of these modules were equipped with a total of 160 GridPix detectors. This is a combination of a highly pixelated readout ASIC and a Micromegas built on top. GridPix detectors have a very high efficiency of detecting primary electrons, which leads to excellent spatial and energy resolutions. For the first time a large number of GridPix detectors has been operated and long segments of tracks have been recorded with excellent precision.
Bai, Qifeng; Shao, Yonghua; Pan, Dabo; Zhang, Yang; Liu, Huanxiang; Yao, Xiaojun
2014-01-01
We designed a program called MolGridCal that can be used to screen small-molecule databases in a grid computing environment based on JPPF. Based on the MolGridCal program, we proposed an integrated strategy for virtual screening and binding mode investigation that combines molecular docking, molecular dynamics (MD) simulations and free energy calculations. To test the effectiveness of MolGridCal, we screened potential ligands for the β2 adrenergic receptor (β2AR) from a database containing 50,000 small molecules. MolGridCal can not only send tasks to the grid server automatically, but can also distribute tasks using the screensaver function. As for the results of the virtual screening, the known β2AR agonist BI-167107 is ranked among the top 2% of the screened candidates, indicating that the MolGridCal program gives reasonable results. To further study the binding mode and refine the results of MolGridCal, more accurate docking and scoring methods were used to estimate the binding affinity for the top three molecules (agonist BI-167107, neutral antagonist alprenolol and inverse agonist ICI 118,551). The results indicate that the agonist BI-167107 has the best binding affinity. MD simulation and free energy calculation were employed to investigate the dynamic interaction mechanism between the ligands and β2AR, and show that the agonist BI-167107 also has the lowest binding free energy. This study provides a new way to perform virtual screening effectively by integrating molecular docking based on grid computing, MD simulations and free energy calculations. The source code of MolGridCal is freely available at http://molgridcal.codeplex.com. PMID:25229694
Continuous Attractor Network Model for Conjunctive Position-by-Velocity Tuning of Grid Cells
Si, Bailu; Romani, Sandro; Tsodyks, Misha
2014-01-01
The spatial responses of many of the cells recorded in layer II of rodent medial entorhinal cortex (MEC) show a triangular grid pattern, which appears to provide an accurate population code for animal spatial position. In layer III, V and VI of the rat MEC, grid cells are also selective to head-direction and are modulated by the speed of the animal. Several putative mechanisms of grid-like maps were proposed, including attractor network dynamics, interactions with theta oscillations or single-unit mechanisms such as firing rate adaptation. In this paper, we present a new attractor network model that accounts for the conjunctive position-by-velocity selectivity of grid cells. Our network model is able to perform robust path integration even when the recurrent connections are subject to random perturbations. PMID:24743341
Degenhart, Alan D.; Eles, James; Dum, Richard; Mischel, Jessica L.; Smalianchuk, Ivan; Endler, Bridget; Ashmore, Robin C.; Tyler-Kabara, Elizabeth C.; Hatsopoulos, Nicholas G.; Wang, Wei; Batista, Aaron P.; Cui, X. Tracy
2016-01-01
Electrocorticography (ECoG), used as a neural recording modality for brain-machine interfaces (BMIs), potentially allows for field potentials to be recorded from the surface of the cerebral cortex for long durations without suffering the host-tissue reaction to the extent that it is common with intracortical microelectrodes. Though the stability of signals obtained from chronically-implanted ECoG electrodes has begun receiving attention, to date little work has characterized the effects of long-term implantation of ECoG electrodes on underlying cortical tissue. We implanted a high-density ECoG electrode grid subdurally over cortical motor areas of a Rhesus macaque for 666 days. Histological analysis revealed minimal damage to the cortex underneath the implant, though the grid itself was encapsulated in collagenous tissue. We observed macrophages and foreign body giant cells at the tissue-array interface, indicative of a stereotypical foreign body response. Despite this encapsulation, cortical modulation during reaching movements was observed more than 18 months post-implantation. These results suggest that ECoG may provide a means by which stable chronic cortical recordings can be obtained with comparatively little tissue damage, facilitating the development of clinically-viable brain-machine interface systems. PMID:27351722
Multi-scale recordings for neuroprosthetic control of finger movements.
Baker, Justin; Bishop, William; Kellis, Spencer; Levy, Todd; House, Paul; Greger, Bradley
2009-01-01
We trained a rhesus monkey to perform individuated and combined finger flexions and extensions of the thumb, index, and middle finger. A Utah Electrode Array (UEA) was implanted into the hand region of the motor cortex contralateral to the monkey's trained hand. We also implanted a microwire electrocorticography grid (microECoG) epidurally so that it covered the UEA. The microECoG grid spanned the arm and hand regions of both the primary motor and somatosensory cortices. Previously this monkey had Implantable MyoElectric Sensors (IMES) surgically implanted into the finger muscles of the monkey's forearm. Action potentials (APs), local field potentials (LFPs), and microECoG signals were recorded from wired head-stage connectors for the UEA and microECoG grids, while EMG was recorded wirelessly. The monkey performed a finger flexion/extension task while neural and EMG data were acquired. We wrote an algorithm that uses the spike data from the UEA to perform a real-time decode of the monkey's finger movements. Also, analyses of the LFP and microECoG data indicate that these data show trial-averaged differences between different finger movements, indicating the data are potentially decodeable.
High-Density Stretchable Electrode Grids for Chronic Neural Recording.
Tybrandt, Klas; Khodagholy, Dion; Dielacher, Bernd; Stauffer, Flurin; Renz, Aline F; Buzsáki, György; Vörös, János
2018-04-01
Electrical interfacing with neural tissue is key to advancing diagnosis and therapies for neurological disorders, as well as providing detailed information about neural signals. A challenge for creating long-term stable interfaces between electronics and neural tissue is the huge mechanical mismatch between the systems. So far, materials and fabrication processes have restricted the development of soft electrode grids able to combine high performance, long-term stability, and high electrode density, aspects all essential for neural interfacing. Here, this challenge is addressed by developing a soft, high-density, stretchable electrode grid based on an inert, high-performance composite material comprising gold-coated titanium dioxide nanowires embedded in a silicone matrix. The developed grid can resolve high spatiotemporal neural signals from the surface of the cortex in freely moving rats with stable neural recording quality and preserved electrode signal coherence during 3 months of implantation. Due to its flexible and stretchable nature, it is possible to minimize the size of the craniotomy required for placement, further reducing the level of invasiveness. The material and device technology presented herein have potential for a wide range of emerging biomedical applications. © 2018 The Authors. Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Mapping Error in Southern Ocean Transport Computed from Satellite Altimetry and Argo
NASA Astrophysics Data System (ADS)
Kosempa, M.; Chambers, D. P.
2016-02-01
Argo profiling floats have afforded basin-scale coverage of the Southern Ocean since 2005. When density estimates from Argo are combined with surface geostrophic currents derived from satellite altimetry, one can estimate integrated geostrophic transport above 2000 dbar [e.g., Kosempa and Chambers, JGR, 2014]. However, the interpolation techniques relied upon to generate mapped data from Argo and altimetry will impart a mapping error. We quantify this mapping error by sampling the high-resolution Southern Ocean State Estimate (SOSE) at the locations of Argo floats and Jason-1 and -2 altimeter ground tracks, and then creating gridded products using the same optimal interpolation algorithms used for the Argo/altimetry gridded products. We combine these surface and subsurface grids and compare the sampled-then-interpolated transport grids to those from the original SOSE data in an effort to quantify the uncertainty in volume transport integrated across the Antarctic Circumpolar Current (ACC). This uncertainty is then used to answer two fundamental questions: 1) What is the minimum linear trend that can be observed in ACC transport given the present length of the instrument record? 2) How long must the instrument record be to observe a trend with an accuracy of 0.1 Sv/year?
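Question (1) amounts to asking how small a least-squares trend can be separated from mapping noise. A rough sketch under the simplifying assumption of uncorrelated monthly errors is given below, with placeholder values rather than results from the study.

```python
import numpy as np

def min_detectable_trend(sigma_sv, n_months, z=2.0):
    """Smallest slope (Sv/year) distinguishable from noise at roughly z standard
    errors, for monthly transport estimates with independent errors sigma_sv."""
    t = np.arange(n_months) / 12.0                      # time in years
    se_slope = sigma_sv / np.sqrt(np.sum((t - t.mean()) ** 2))
    return z * se_slope

print(min_detectable_trend(sigma_sv=1.5, n_months=10 * 12))  # placeholder values
```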
GridPix detectors: Production and beam test results
NASA Astrophysics Data System (ADS)
Koppert, W. J. C.; van Bakel, N.; Bilevych, Y.; Colas, P.; Desch, K.; Fransen, M.; van der Graaf, H.; Hartjes, F.; Hessey, N. P.; Kaminski, J.; Schmitz, J.; Schön, R.; Zappon, F.
2013-12-01
The innovative GridPix detector is a Time Projection Chamber (TPC) that is read out with a Timepix-1 pixel chip. By using wafer post-processing techniques an aluminium grid is placed on top of the chip. When operated, the electric field between the grid and the chip is sufficient to create electron induced avalanches which are detected by the pixels. The time-to-digital converter (TDC) records the drift time enabling the reconstruction of high precision 3D track segments. Recently GridPixes were produced on full wafer scale, to meet the demand for more reliable and cheaper devices in large quantities. In a recent beam test the contribution of both diffusion and time walk to the spatial and angular resolutions of a GridPix detector with a 1.2 mm drift gap are studied in detail. In addition long term tests show that in a significant fraction of the chips the protection layer successfully quenches discharges, preventing harm to the chip.
Framing of grid cells within and beyond navigation boundaries
Savelli, Francesco; Luck, JD; Knierim, James J
2017-01-01
Grid cells represent an ideal candidate to investigate the allocentric determinants of the brain’s cognitive map. Most studies of grid cells emphasized the roles of geometric boundaries within the navigational range of the animal. Behaviors such as novel route-taking between local environments indicate the presence of additional inputs from remote cues beyond the navigational borders. To investigate these influences, we recorded grid cells as rats explored an open-field platform in a room with salient, remote cues. The platform was rotated or translated relative to the room frame of reference. Although the local, geometric frame of reference often exerted the strongest control over the grids, the remote cues demonstrated a consistent, sometimes dominant, countervailing influence. Thus, grid cells are controlled by both local geometric boundaries and remote spatial cues, consistent with prior studies of hippocampal place cells and providing a rich representational repertoire to support complex navigational (and perhaps mnemonic) processes. DOI: http://dx.doi.org/10.7554/eLife.21354.001 PMID:28084992
NASA Astrophysics Data System (ADS)
van Tuyet, Dao; Tuan, Ngo Anh; van Lang, Tran
Grid computing has attracted increasing attention in recent years from scientists across many fields, and many Grid systems have been built to serve their needs. Tools for developing Grid systems, such as Globus, gLite and Unicore, continue to be developed; in particular, gLite, a Grid middleware, has been developed by the European scientific community in recent years. The constant growth of Grid technology has opened new opportunities for information and data exchange in a secure and collaborative context. These opportunities can be exploited to offer physicians new telemedicine services that improve their capacity for collaboration. Our platform gives physicians an easy-to-use telemedicine environment for managing and sharing patient information (such as electronic medical records and DICOM-formatted images) between remote locations. This paper presents the Grid infrastructure based on gLite, the main components of gLite, the challenge scenario in which new applications can be developed to improve collaborative work between scientists, and the process of deploying the Hospital Open software Platform for E-health (HOPE) on the Grid.
Grid cells form a global representation of connected environments.
Carpenter, Francis; Manson, Daniel; Jeffery, Kate; Burgess, Neil; Barry, Caswell
2015-05-04
The firing patterns of grid cells in medial entorhinal cortex (mEC) and associated brain areas form triangular arrays that tessellate the environment [1, 2] and maintain constant spatial offsets to each other between environments [3, 4]. These cells are thought to provide an efficient metric for navigation in large-scale space [5-8]. However, an accurate and universal metric requires grid cell firing patterns to uniformly cover the space to be navigated, in contrast to recent demonstrations that environmental features such as boundaries can distort [9-11] and fragment [12] grid patterns. To establish whether grid firing is determined by local environmental cues, or provides a coherent global representation, we recorded mEC grid cells in rats foraging in an environment containing two perceptually identical compartments connected via a corridor. During initial exposures to the multicompartment environment, grid firing patterns were dominated by local environmental cues, replicating between the two compartments. However, with prolonged experience, grid cell firing patterns formed a single, continuous representation that spanned both compartments. Thus, we provide the first evidence that in a complex environment, grid cell firing can form the coherent global pattern necessary for them to act as a metric capable of supporting large-scale spatial navigation. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
Grid Cells Form a Global Representation of Connected Environments
Carpenter, Francis; Manson, Daniel; Jeffery, Kate; Burgess, Neil; Barry, Caswell
2015-01-01
Summary The firing patterns of grid cells in medial entorhinal cortex (mEC) and associated brain areas form triangular arrays that tessellate the environment [1, 2] and maintain constant spatial offsets to each other between environments [3, 4]. These cells are thought to provide an efficient metric for navigation in large-scale space [5–8]. However, an accurate and universal metric requires grid cell firing patterns to uniformly cover the space to be navigated, in contrast to recent demonstrations that environmental features such as boundaries can distort [9–11] and fragment [12] grid patterns. To establish whether grid firing is determined by local environmental cues, or provides a coherent global representation, we recorded mEC grid cells in rats foraging in an environment containing two perceptually identical compartments connected via a corridor. During initial exposures to the multicompartment environment, grid firing patterns were dominated by local environmental cues, replicating between the two compartments. However, with prolonged experience, grid cell firing patterns formed a single, continuous representation that spanned both compartments. Thus, we provide the first evidence that in a complex environment, grid cell firing can form the coherent global pattern necessary for them to act as a metric capable of supporting large-scale spatial navigation. PMID:25913404
Design and Implementation of Real-Time Off-Grid Detection Tool Based on FNET/GridEye
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Jiahui; Zhang, Ye; Liu, Yilu
2014-01-01
Real-time situational awareness tools are of critical importance to power system operators, especially during emergencies. The availability of electric power has become a linchpin of most post-disaster response efforts, as it is the primary dependency for public and private sector services, as well as individuals. Knowledge of the scope and extent of facilities impacted, as well as the duration of their dependence on backup power, enables emergency response officials to plan for contingencies and provide better overall response. Based on real-time data acquired by Frequency Disturbance Recorders (FDRs) deployed in the North American power grid, a real-time detection method is proposed. This method monitors critical electrical loads and detects the transition of these loads from an on-grid state, where the loads are fed by the power grid, to an off-grid state, where the loads are fed by an Uninterrupted Power Supply (UPS) or a backup generation system. The details of the proposed detection algorithm are presented, and some case studies and off-grid detection scenarios are also provided to verify the effectiveness and robustness. Meanwhile, the algorithm has already been implemented based on the Grid Solutions Framework (GSF) and has effectively detected several off-grid situations.
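A highly simplified sketch of the kind of logic involved is shown below; the thresholds, window length, and reliance on a plain frequency-deviation test are assumptions for illustration and are not taken from the FNET/GridEye implementation.

```python
import numpy as np

def is_off_grid(local_hz, reference_hz, tol_hz=0.05, window=30):
    """Flag a monitored load as off-grid when its recent frequency samples
    diverge persistently from the wide-area reference frequency."""
    local = np.asarray(local_hz[-window:])
    ref = np.asarray(reference_hz[-window:])
    return bool(np.all(np.abs(local - ref) > tol_hz))

# Toy example: an islanded backup generator drifting above the grid frequency.
print(is_off_grid([60.31] * 30, [60.01] * 30))
```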
Monitoring and Modeling Performance of Communications in Computational Grids
NASA Technical Reports Server (NTRS)
Frumkin, Michael A.; Le, Thuy T.
2003-01-01
Computational grids may include many machines located at a number of sites. For efficient use of the grid we need the ability to estimate the time it takes to communicate data between the machines. For dynamic distributed grids it is unrealistic to know the exact parameters of the communication hardware and the current communication traffic, and we should rely on a model of the network performance to estimate the message delivery time. Our approach to constructing such a model is based on observing message delivery times for various message sizes and time scales. We record these observations in a database and use them to build a model of the message delivery time. Our experiments show the presence of multiple bands in the logarithm of the message delivery times. These bands represent the multiple paths that messages travel between the grid machines and are incorporated in our multiband model.
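As an illustration of building such a model from recorded observations, the sketch below fits the classic latency-plus-size/bandwidth line to (message size, delivery time) pairs; the multiband structure noted above would show up as clusters in the residuals. All data and names here are synthetic, not the authors' model.

```python
import numpy as np

observations = []  # (message size in bytes, delivery time in seconds)

def record(size_bytes, seconds):
    observations.append((size_bytes, seconds))

def fit_latency_bandwidth():
    sizes, times = map(np.array, zip(*observations))
    slope, intercept = np.polyfit(sizes, times, 1)
    return intercept, 1.0 / slope   # (latency in s, bandwidth in bytes/s)

record(1_000, 0.0021)
record(100_000, 0.010)
record(1_000_000, 0.085)
print(fit_latency_bandwidth())
```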
Western Wind Data Set | Grid Modernization | NREL
The SCORE model output replicates the stochastic nature of wind power plant output. NREL modeled hysteresis around wind turbine cut-out: where wind speeds are often near cut-out (~25 m/s), SCORE output does not replicate the behavior of an actual turbine (e.g., the Vestas V90), and the hysteresis-corrected SCORE output is an attempt to represent wind turbine hysteresis at cut-out.
The abrupt development of adult-like grid cell firing in the medial entorhinal cortex
Wills, Thomas J.; Barry, Caswell; Cacucci, Francesca
2012-01-01
Understanding the development of the neural circuits subserving specific cognitive functions such as navigation remains a central problem in neuroscience. Here, we characterize the development of grid cells in the medial entorhinal cortex, which, by nature of their regularly spaced firing fields, are thought to provide a distance metric to the hippocampal neural representation of space. Grid cells emerge at the time of weaning in the rat, at around 3 weeks of age. We investigated whether grid cells in young rats are functionally equivalent to those observed in the adult as soon as they appear, or if instead they follow a gradual developmental trajectory. We find that, from the very youngest ages at which reproducible grid firing is observed (postnatal day 19): grid cells display adult-like firing fields that tessellate to form a coherent map of the local environment; that this map is universal, maintaining its internal structure across different environments; and that grid cells in young rats, as in adults, also encode a representation of direction and speed. To further investigate the developmental processes leading up to the appearance of grid cells, we present data from individual medial entorhinal cortex cells recorded across more than 1 day, spanning the period before and after the grid firing pattern emerged. We find that increasing spatial stability of firing was correlated with increasing gridness. PMID:22557949
Interpolation of unevenly spaced data using a parabolic leapfrog correction method and cubic splines
Julio L. Guardado; William T. Sommers
1977-01-01
The technique proposed allows interpolation of data recorded at unevenly spaced sites to a regular grid or to other sites. Known data are interpolated to an initial guess field grid of unevenly spaced rows and columns by a simple distance weighting procedure. The initial guess field is then adjusted by using a parabolic leapfrog correction and the known data. The final...
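The initial distance-weighting step could be sketched as a simple inverse-distance-weighted interpolation like the one below (the parabolic leapfrog correction and cubic-spline stages are not shown); the coordinates and values are synthetic.

```python
import numpy as np

def idw(xy_known, values, xy_grid, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation.
    xy_known: (n, 2) station coordinates; values: (n,); xy_grid: (m, 2)."""
    d = np.linalg.norm(xy_grid[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)
    return (w @ values) / w.sum(axis=1)

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
obs = np.array([5.0, 7.0, 6.0])
print(idw(stations, obs, np.array([[5.0, 5.0], [1.0, 1.0]])))
```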
Globally Gridded Satellite observations for climate studies
Knapp, K.R.; Ansari, S.; Bain, C.L.; Bourassa, M.A.; Dickinson, M.J.; Funk, Chris; Helms, C.N.; Hennon, C.C.; Holmes, C.D.; Huffman, G.J.; Kossin, J.P.; Lee, H.-T.; Loew, A.; Magnusdottir, G.
2011-01-01
Geostationary satellites have provided routine, high temporal resolution Earth observations since the 1970s. Despite the long period of record, use of these data in climate studies has been limited for numerous reasons, among them that no central archive of geostationary data for all international satellites exists, full temporal and spatial resolution data are voluminous, and diverse calibration and navigation formats encumber the uniform processing needed for multisatellite climate studies. The International Satellite Cloud Climatology Project (ISCCP) set the stage for overcoming these issues by archiving a subset of the full-resolution geostationary data at ~10-km resolution at 3-hourly intervals since 1983. Recent efforts at NOAA's National Climatic Data Center to provide convenient access to these data include remapping the data to a standard map projection, recalibrating the data to optimize temporal homogeneity, extending the record of observations back to 1980, and reformatting the data for broad public distribution. The Gridded Satellite (GridSat) dataset includes observations from the visible, infrared window, and infrared water vapor channels. Data are stored in Network Common Data Format (netCDF) using standards that permit a wide variety of tools and libraries to process the data quickly and easily. A novel data layering approach, together with appropriate satellite and file metadata, allows users to access GridSat data at varying levels of complexity based on their needs. The result is a climate data record already in use by the meteorological community. Examples include reanalysis of tropical cyclones, studies of global precipitation, and detection and tracking of the intertropical convergence zone.
A modified S-DIMM+: applying additional height grids for characterizing daytime seeing profiles
NASA Astrophysics Data System (ADS)
Wang, Zhiyong; Zhang, Lanqiang; Kong, Lin; Bao, Hua; Guo, Youming; Rao, Xuejun; Zhong, Libo; Zhu, Lei; Rao, Changhui
2018-07-01
Characterization of daytime atmospheric turbulence profiles is needed for the design of a multi-conjugate adaptive optical system. S-DIMM+ (solar differential image motion monitor+) is a technique to measure vertical seeing profiles. However, the number of height grids will be limited by the lenslet array of the wide-field Shack-Hartmann wavefront sensor (SHWFS). A small number of subaperture lenslet arrays will lead to a coarse height grid over the atmosphere, which can result in difficulty in finding the location of strong-turbulence layers and overestimates of the turbulence strength for the measured layers. To address this problem, we propose a modified S-DIMM+ method to measure seeing profiles iteratively with decreasing altitude range for a given number of height grids; finally they will be combined as a new seeing profile, with a denser and more uniform distribution of height grids. This method is tested with simulations and recovers the input height and contribution perfectly. Furthermore, this method is applied to the 102 data-sequences recorded from the 1-m New Vacuum Solar Telescope at Fuxian Solar Observatory, 55 of which were recorded at local time between 13:40 and 14:35 on 2016 October 6, and the other 47 between 12:50 and 13:40 on 2017 October 5. A 7x7 lenslet array of SHWFS is used to generate a 16-layer height grid to 15 km, each with 1 km height separation. The experimental results show that the turbulence has three origins in the lower (0-2 km) layers, the higher (3-6 km) layers and the uppermost (≥7 km) layers.
Improved Fast, Deep Record Length, Time-Resolved Visible Spectroscopy of Plasmas Using Fiber Grids
NASA Astrophysics Data System (ADS)
Brockington, S.; Case, A.; Cruz, E.; Williams, A.; Witherspoon, F. D.; Horton, R.; Klauser, R.; Hwang, D.
2017-10-01
HyperV Technologies is developing a fiber-coupled, deep record-length, low-light camera head for performing high time resolution spectroscopy on visible emission from plasma events. By coupling the output of a spectrometer to an imaging fiber bundle connected to a bank of amplified silicon photomultipliers, time-resolved spectroscopic imagers of 100 to 1,000 pixels can be constructed. A second-generation prototype 32-pixel spectroscopic imager employing this technique was constructed and successfully tested at the University of California at Davis Compact Toroid Injection Experiment (CTIX). Pixel performance of 10 Megaframes/sec with record lengths of up to 256,000 frames (25.6 milliseconds) was achieved. Pixel resolution was 12 bits. Pixel pitch can be refined by using grids of 100 μm to 1000 μm diameter fibers. Experimental results will be discussed, along with future plans for this diagnostic. Work supported by USDOE SBIR Grant DE-SC0013801.
The BioGRID interaction database: 2017 update
Chatr-aryamontri, Andrew; Oughtred, Rose; Boucher, Lorrie; Rust, Jennifer; Chang, Christie; Kolas, Nadine K.; O'Donnell, Lara; Oster, Sara; Theesfeld, Chandra; Sellam, Adnane; Stark, Chris; Breitkreutz, Bobby-Joe; Dolinski, Kara; Tyers, Mike
2017-01-01
The Biological General Repository for Interaction Datasets (BioGRID: https://thebiogrid.org) is an open access database dedicated to the annotation and archival of protein, genetic and chemical interactions for all major model organism species and humans. As of September 2016 (build 3.4.140), the BioGRID contains 1 072 173 genetic and protein interactions, and 38 559 post-translational modifications, as manually annotated from 48 114 publications. This dataset represents interaction records for 66 model organisms and represents a 30% increase compared to the previous 2015 BioGRID update. BioGRID curates the biomedical literature for major model organism species, including humans, with a recent emphasis on central biological processes and specific human diseases. To facilitate network-based approaches to drug discovery, BioGRID now incorporates 27 501 chemical–protein interactions for human drug targets, as drawn from the DrugBank database. A new dynamic interaction network viewer allows the easy navigation and filtering of all genetic and protein interaction data, as well as for bioactive compounds and their established targets. BioGRID data are directly downloadable without restriction in a variety of standardized formats and are freely distributed through partner model organism databases and meta-databases. PMID:27980099
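BioGRID's downloadable tab-delimited exports can be processed with ordinary data tools. The Python sketch below assumes a tab-delimited export with a column naming the experimental system; the file name and column label are illustrative assumptions, not the exact BioGRID TAB specification.

```python
# Minimal sketch: count interactions per experimental system in a BioGRID-style
# tab-delimited export. File name and column labels are illustrative assumptions.
import pandas as pd

df = pd.read_csv("biogrid_export.tab.txt", sep="\t", comment="#", low_memory=False)

# Assume a column naming the experimental system for each interaction record.
counts = df.groupby("Experimental System").size().sort_values(ascending=False)
print(counts.head(10))
```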
Study on key techniques for camera-based hydrological record image digitization
NASA Astrophysics Data System (ADS)
Li, Shijin; Zhan, Di; Hu, Jinlong; Gao, Xiangtao; Bo, Ping
2015-10-01
With the development of information technology, the digitization of scientific and engineering drawings has received increasing attention. In hydrology, meteorology, medicine and the mining industry, grid drawing sheets are commonly used to record observations from sensors. However, these paper drawings may be damaged or contaminated by improper preservation or overuse, and manually transcribing the data into a computer is laborious and error prone. Digitizing these drawings and building the corresponding database therefore ensures the integrity of the data and provides invaluable information for further research. This paper presents an automatic system for hydrological record image digitization that consists of three key techniques: image segmentation, intersection point localization and distortion rectification. First, a novel approach to binarizing the curves and grids in the water-level sheet image is proposed, based on adaptive fusion of gradient and color information. Second, a fast search strategy for cross-point location is devised, so that point-by-point processing is avoided with the help of grid distribution information. Finally, we put forward a local rectification method that analyzes the central portions of the image and exploits hydrological domain knowledge. Processing is accelerated while accuracy remains satisfactory. Experiments on several real water-level records show that the proposed techniques are effective and recover the hydrological observations accurately.
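As an illustration of the intersection-point step, the Python sketch below locates candidate grid-line crossings with generic morphological filtering in OpenCV. It is not the authors' fused gradient/color method; the file name and kernel sizes are assumptions.

```python
# Minimal sketch: locate grid-line intersection points in a scanned record sheet
# with morphological filtering. File name and kernel sizes are illustrative.
import cv2
import numpy as np

img = cv2.imread("water_level_sheet.png", cv2.IMREAD_GRAYSCALE)
binary = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                               cv2.THRESH_BINARY_INV, 31, 10)

# Keep long horizontal and vertical runs separately, then AND them together.
h_kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (41, 1))
v_kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (1, 41))
h_lines = cv2.morphologyEx(binary, cv2.MORPH_OPEN, h_kernel)
v_lines = cv2.morphologyEx(binary, cv2.MORPH_OPEN, v_kernel)
crossings = cv2.bitwise_and(h_lines, v_lines)

ys, xs = np.nonzero(crossings)
print(f"candidate intersection pixels: {len(xs)}")
```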
49 CFR 238.305 - Interior calendar day mechanical inspection of passenger cars.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., circuit breakers, contactors, relays, grid resistors, and fuses are installed in non-hazardous locations... has access to the record upon request. (2) The written or electronic record must contain the following...) Any non-complying conditions found; and (iv) The signature or electronic identification of the...
49 CFR 238.305 - Interior calendar day mechanical inspection of passenger cars.
Code of Federal Regulations, 2011 CFR
2011-10-01
..., circuit breakers, contactors, relays, grid resistors, and fuses are installed in non-hazardous locations... has access to the record upon request. (2) The written or electronic record must contain the following...) Any non-complying conditions found; and (iv) The signature or electronic identification of the...
49 CFR 238.305 - Interior calendar day mechanical inspection of passenger cars.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., circuit breakers, contactors, relays, grid resistors, and fuses are installed in non-hazardous locations... has access to the record upon request. (2) The written or electronic record must contain the following...) Any non-complying conditions found; and (iv) The signature or electronic identification of the...
Modeling and Grid Generation of Iced Airfoils
NASA Technical Reports Server (NTRS)
Vickerman, Mary B.; Baez, Marivell; Braun, Donald C.; Hackenberg, Anthony W.; Pennline, James A.; Schilling, Herbert W.
2007-01-01
SmaggIce Version 2.0 is a software toolkit for geometric modeling and grid generation for two-dimensional, single- and multi-element, clean and iced airfoils. A previous version of SmaggIce was described in Preparing and Analyzing Iced Airfoils, NASA Tech Briefs, Vol. 28, No. 8 (August 2004), page 32. To recapitulate: Ice shapes make it difficult to generate quality grids around airfoils, yet these grids are essential for predicting ice-induced complex flow. This software efficiently creates high-quality structured grids with tools that are uniquely tailored for various ice shapes. SmaggIce Version 2.0 significantly enhances the previous version primarily by adding the capability to generate grids for multi-element airfoils. This version of the software is an important step in streamlining the aeronautical analysis of iced airfoils using computational fluid dynamics (CFD) tools. The user may prepare the ice shape, define the flow domain, decompose it into blocks, generate grids, modify/divide/merge blocks, and control grid density and smoothness. All these steps may be performed efficiently even for the difficult glaze and rime ice shapes. Providing the means to generate highly controlled grids near rough ice, the software includes the creation of a wrap-around block (called the "viscous sublayer block"), which is a thin, C-type block around the wake line and iced airfoil. For multi-element airfoils, the software makes use of grids that wrap around and fill in the areas between the viscous sublayer blocks for all elements that make up the airfoil. A scripting feature records the history of interactive steps, which can be edited and replayed later to produce other grids. Using this version of SmaggIce, ice shape handling and grid generation can become a practical engineering process, rather than a laborious research effort.
NASA Astrophysics Data System (ADS)
Clarke, Robin T.; Bulhoes Mendes, Carlos Andre; Costa Buarque, Diogo
2010-07-01
Two issues of particular importance for the Amazon watershed are: whether annual maxima obtained from reanalysis and raingauge records agree well enough for the former to be useful in extending records of the latter; and whether reported trends in Amazon annual rainfall are reflected in the behavior of annual extremes in precipitation estimated from reanalyses and raingauge records. To explore these issues, three sets of daily precipitation data (1979-2001) from the Brazilian Amazon were analyzed (NCEP/NCAR and ERA-40 reanalyses, and records from the raingauge network of the Brazilian water resources agency - ANA), using the following variables: (1) mean annual maximum precipitation totals, accumulated over one, two, three and five days; (2) linear trends in these variables; (3) mean length of longest within-year "dry" spell; (4) linear trends in these variables. Comparisons between variables obtained from all three data sources showed that reanalyses underestimated time-trends and mean annual maximum precipitation (over durations of one to five days), and the correlations between reanalysis and spatially-interpolated raingauge estimates were small for these two variables. Both reanalyses over-estimated mean lengths of dry period relative to the mean length recorded by the raingauge network. Correlations between the trends calculated from all three data sources were small. Time-trends averaged over the reanalysis grid-squares, and spatially-interpolated time trends from raingauge data, were all clustered around zero. In conclusion, although the NCEP/NCAR and ERA-40 gridded data-sets may be valuable for studies of inter-annual variability in precipitation totals, they were found to be inappropriate for analysis of precipitation extremes.
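The extreme-value variables used above (n-day annual maxima and longest within-year dry spell) can be computed directly from a daily series. A minimal Python sketch on synthetic data follows; the 1 mm dry-day threshold is an illustrative assumption.

```python
# Minimal sketch: annual maxima of 1-, 2-, 3- and 5-day accumulated precipitation
# and the longest within-year dry spell, from one synthetic daily series (mm).
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
days = pd.date_range("1979-01-01", "2001-12-31", freq="D")
precip = pd.Series(rng.gamma(0.4, 8.0, len(days)) * (rng.random(len(days)) < 0.5),
                   index=days)

for window in (1, 2, 3, 5):
    annual_max = precip.rolling(window).sum().groupby(precip.index.year).max()
    print(f"{window}-day annual maxima (mean): {annual_max.mean():.1f} mm")

def longest_dry_spell(x, threshold=1.0):
    dry = (x < threshold).astype(int).to_numpy()
    best = run = 0
    for d in dry:
        run = run + 1 if d else 0
        best = max(best, run)
    return best

dry_spells = precip.groupby(precip.index.year).apply(longest_dry_spell)
print("mean longest dry spell (days):", dry_spells.mean())
```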
Fiberglass Grids as Sustainable Reinforcement of Historic Masonry
Righetti, Luca; Edmondson, Vikki; Corradi, Marco; Borri, Antonio
2016-01-01
Fiber-reinforced composite (FRP) materials have gained increasing acceptance, mostly for strengthening, retrofitting and repair of existing historic masonry structures, and may significantly enhance the mechanical properties of the reinforced members. This article summarizes the results of previous experimental activities aimed at investigating the effectiveness of GFRP (Glass Fiber Reinforced Polymers) grids embedded into an inorganic mortar to reinforce historic masonry. The paper also presents innovative results on the relationship between the durability and the governing material properties of GFRP grids. Measurements of the tensile strength were made using specimens cut from GFRP grids before and after ageing in aqueous solution. The tensile strength of a commercially available GFRP grid was tested after up to 450 days of storage in deionized water and NaCl solution. Degradations in tensile strength and Young’s modulus of up to 30.2% and 13.2%, respectively, were recorded. This degradation indicates that extended storage in a wet environment may cause a decrease in mechanical properties. PMID:28773725
Fiberglass Grids as Sustainable Reinforcement of Historic Masonry.
Righetti, Luca; Edmondson, Vikki; Corradi, Marco; Borri, Antonio
2016-07-21
Fiber-reinforced composite (FRP) materials have gained increasing acceptance, mostly for strengthening, retrofitting and repair of existing historic masonry structures, and may significantly enhance the mechanical properties of the reinforced members. This article summarizes the results of previous experimental activities aimed at investigating the effectiveness of GFRP (Glass Fiber Reinforced Polymers) grids embedded into an inorganic mortar to reinforce historic masonry. The paper also presents innovative results on the relationship between the durability and the governing material properties of GFRP grids. Measurements of the tensile strength were made using specimens cut from GFRP grids before and after ageing in aqueous solution. The tensile strength of a commercially available GFRP grid was tested after up to 450 days of storage in deionized water and NaCl solution. Degradations in tensile strength and Young's modulus of up to 30.2% and 13.2%, respectively, were recorded. This degradation indicates that extended storage in a wet environment may cause a decrease in mechanical properties.
NASA Astrophysics Data System (ADS)
Guo, Lijuan; Yan, Haijun; Gao, Wensheng; Chen, Yun; Hao, Yongqi
2018-01-01
With the growth of power big data, appropriate big-data analysis methods can be applied to the wider set of power-system data to mine its underlying patterns and value. Taking into account the various monitoring data and the defect and fault records of main transformers, this paper integrates power-grid, equipment and environmental data and uses a support vector machine (SVM) as the main algorithm to evaluate main-transformer risk. Evaluation results obtained under different modes are compared, demonstrating that the proposed risk-assessment algorithms and schemes are effective. The paper offers a new approach to data fusion for the smart grid and a reference for further big-data evaluation of power-grid equipment.
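A minimal Python sketch of the SVM-based evaluation idea is given below on synthetic transformer records; the features, labels, and model settings are illustrative assumptions rather than the paper's actual data or configuration.

```python
# Minimal sketch: SVM classifier for transformer risk on synthetic records.
# Feature meanings, labels and parameters are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([
    rng.normal(60, 10, n),      # e.g. oil temperature
    rng.normal(5, 2, n),        # e.g. dissolved-gas indicator
    rng.integers(0, 30, n),     # e.g. equipment age (years)
])
y = (X[:, 0] + 4 * X[:, 1] + rng.normal(0, 10, n) > 90).astype(int)  # 1 = high risk

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_tr, y_tr)
print("held-out accuracy:", round(model.score(X_te, y_te), 3))
```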
Alpha1 LASSO data bundles Lamont, OK
Gustafson, William Jr; Vogelmann, Andrew; Endo, Satoshi; Toto, Tami; Xiao, Heng; Li, Zhijin; Cheng, Xiaoping; Krishna, Bhargavi (ORCID:000000018828528X)
2016-08-03
A data bundle is a unified package consisting of LASSO LES input and output, observations, evaluation diagnostics, and model skill scores. LES input includes model configuration information and forcing data. LES output includes profile statistics and full domain fields of cloud and environmental variables. Model evaluation data consists of LES output and ARM observations co-registered on the same grid and sampling frequency. Model performance is quantified by skill scores and diagnostics in terms of cloud and environmental variables.
Xu, Yule; Rong, Ao; Bi, Yanlong; Xu, Wei
2016-01-01
Purpose. To evaluate the efficacy of intravitreal conbercept (IVC) plus modified grid laser photocoagulation (MGP) versus IVC alone for treatment of diffuse diabetic macular edema (DDME). Methods. In this retrospective study, 51 DDME patients were treated with either IVC alone (IVC group) or IVC plus MGP (combined group) with 12 months of follow-up. The clinical records of those patients were reviewed. Results. 26 patients (31 eyes) received IVC alone and 25 patients (30 eyes) received combined therapy. At month 12, the mean best-corrected visual acuity (BCVA) letter score improvement was 9.1 ± 4.5 and 7.5 ± 4.2 in the IVC group and the combined group, and the mean central retinal thickness (CRT) reduction was 145.1 ± 69.9 μm and 168.5 ± 53.6 μm, respectively. There was no statistically significant difference in BCVA improvement (P = 0.164) or CRT decrease (P = 0.149) between the two groups. The mean number of injections delivered was significantly higher (P < 0.001) in the IVC group (5.6 ± 0.8 per eye) than in the combined group (3.3 ± 1.2 per eye). Conclusions. IVC alone or combined with MGP appeared to be effective for treatment of DDME, achieving similar clinical efficacy. Moreover, MGP helps to reduce the number of injections.
NASA Astrophysics Data System (ADS)
Suharsono; Nurdian, S. W.; Palupi, I. R.
2016-11-01
Relocating hypocenters is one way to improve the subsurface velocity model, and grid search is one method for doing so. The relocated hypocenters are then used as a reference for tomographic imaging of the subsurface velocity distribution and for analysis of volcanic and major structural patterns, such as those in Central Java. The main data of this study are earthquakes recorded from 1952 to 2012, comprising 9162 P-wave arrivals from 2426 events recorded by 30 stations in the vicinity of Central Java. The grid search method has the advantage of relocating hypocenters more accurately because it divides the model space into lattice blocks and each grid block can be occupied by only one hypocenter. Tomography is then carried out on the travel-time data from the relocated events using the pseudo-bending inversion method. Grid search relocation shows that the hypocenters are shallower than before and shifted toward the south, and the hypocenter distribution delineates the subduction zone between the Eurasian and Indo-Australian plates with an average dip angle of 14°. The tomography results show low velocity anomalies of -8% to -10% beneath the volcanoes, while the pattern of the main fault structures in Central Java is described by high velocity anomalies of 8% to 10% trending northwest and northeast-southwest.
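The grid search step can be illustrated with a short Python sketch that scans candidate hypocenter blocks and keeps the one minimizing the travel-time residual. The uniform half-space velocity, station layout, and arrival picks below are assumptions for illustration only.

```python
# Minimal sketch: grid search for a hypocenter by minimizing RMS P travel-time
# residuals on a block grid. A uniform half-space velocity is assumed purely
# for illustration; stations and picks are synthetic.
import numpy as np

vp = 6.0                                     # assumed P velocity (km/s)
stations = np.array([[0, 0, 0], [40, 5, 0], [10, 50, 0], [60, 60, 0]], float)
true_hypo = np.array([30.0, 25.0, 12.0])     # km
true_t0 = 4.0                                # origin time (s)
obs = true_t0 + np.linalg.norm(stations - true_hypo, axis=1) / vp

best = (np.inf, None)
for x in np.arange(0, 71, 1.0):
    for y in np.arange(0, 71, 1.0):
        for z in np.arange(0, 31, 1.0):
            travel = np.linalg.norm(stations - np.array([x, y, z]), axis=1) / vp
            t0 = np.mean(obs - travel)       # least-squares origin time
            rms = np.sqrt(np.mean((obs - travel - t0) ** 2))
            if rms < best[0]:
                best = (rms, (x, y, z, t0))
print("best grid point (x, y, z, t0):", best[1], "rms:", round(best[0], 4))
```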
Multi-Grid detector for neutron spectroscopy: results obtained on time-of-flight spectrometer CNCS
NASA Astrophysics Data System (ADS)
Anastasopoulos, M.; Bebb, R.; Berry, K.; Birch, J.; Bryś, T.; Buffet, J.-C.; Clergeau, J.-F.; Deen, P. P.; Ehlers, G.; van Esch, P.; Everett, S. M.; Guerard, B.; Hall-Wilton, R.; Herwig, K.; Hultman, L.; Höglund, C.; Iruretagoiena, I.; Issa, F.; Jensen, J.; Khaplanov, A.; Kirstein, O.; Lopez Higuera, I.; Piscitelli, F.; Robinson, L.; Schmidt, S.; Stefanescu, I.
2017-04-01
The Multi-Grid detector technology has evolved from the proof-of-principle and characterisation stages. Here we report on the performance of the Multi-Grid detector, the MG.CNCS prototype, which has been installed and tested at the Cold Neutron Chopper Spectrometer, CNCS, at SNS. This has allowed a side-by-side comparison with the performance of 3He detectors on an operational instrument. The demonstrator has an active area of 0.2 m² and is specifically tailored to the specifications of CNCS. The detector was installed in June 2016 and has operated since then, collecting neutron scattering data in parallel with the 3He detectors of CNCS. In this paper, we present a comprehensive analysis of these data, in particular on instrument energy resolution, rate capability, background and relative efficiency. Stability, gamma-ray and fast neutron sensitivity have also been investigated. The effect of scattering in the detector components has been measured and provides input for comparison with Monte Carlo simulations. All data are presented in comparison with those measured simultaneously by the 3He detectors, showing that all features recorded by one detector are also recorded by the other. The energy resolution matches closely. We find that the Multi-Grid is able to match the data collected by 3He, and we see an indication of a considerable advantage in count rate capability. Based on these results, we are confident that the Multi-Grid detector will be capable of producing high quality scientific data on chopper spectrometers utilising the unprecedented neutron flux of the ESS.
The effect of high-frequency electrical pulses on organic tissue in root canals.
Lendini, M; Alemanno, E; Migliaretti, G; Berutti, E
2005-08-01
To evaluate debris and smear layer scores after application of high-frequency electrical pulses produced by the Endox Endodontic System (Lysis Srl, Nova Milanese, Italy) on intact pulp tissue and organic and inorganic residues after endodontic instrumentation. The study comprised 75 teeth planned for extraction. The teeth were randomly divided into two groups (60 teeth) and a control group (15 teeth): group 1 (30 teeth) was not subjected to instrumentation; group 2 (30 teeth) was instrumented by Hero Shaper instruments and apical stops were prepared to size 40. Each group was subdivided into subgroups A and B (15 teeth); two electrical pulses were applied to subgroups 1A and 2A (one in the apical third and one in the middle third, respectively, at 3 and 6 mm from the root apices); four electrical pulses were applied to subgroups 1B and 2B (two in the apical third, two in the middle third). The control group (15 teeth) was prepared with Hero Shapers and irrigated with 5 mL of EDTA (10%) and 5 mL of 5% NaOCl at 50 °C but not subjected to the electrical pulse treatment. Roots were split longitudinally and canal walls were examined at 80x, 200x, 750x, 1500x and 15,000x magnifications, using a scanning electron microscope. Smear layer and debris scores were recorded at the 3 and 6 mm levels using a five-step scoring scale and a 200-μm grid. Means were tested for significance using the one-way ANOVA model and the Bonferroni post-hoc test. The differences between groups were considered to be statistically significant when P < 0.05. The mean value for debris scores for the three groups varied from 1.80 (+/-0.77) to 4.50 (+/-0.68). The smear layer scores for group 2 and the control specimens varied from 2.00 (+/-0.91) to 2.33 (+/-0.99). A significant difference was found in mean debris scores at the 3 and 6 mm levels between the three groups (P < 0.001). The Bonferroni post-hoc test confirmed that the difference was due to group 1. In the two subgroups treated with four high-frequency pulses (1B and 2B) a substantial reduction in mean debris scores was found at the 3 and 6 mm level; subgroup 2B was practically free of organic residue. No significant differences for mean smear layer and debris scores were recorded between group 2 and the control group at the two levels; a significant difference was found only for mean smear layer scores at the 3 mm level between subgroup 2B and the control group (P < 0.05). The Endox device used with four electrical pulses had optimal efficacy when used after mechanical instrumentation. Traditional canal shaping and cleaning was essential to ensure an effective use of high-frequency electrical pulses in eliminating residues of pulp tissue and inorganic debris.
ERIC Educational Resources Information Center
Rapp, John T.; Carroll, Regina A.; Stangeland, Lindsay; Swanson, Greg; Higgins, William J.
2011-01-01
The authors evaluated the extent to which interobserver agreement (IOA) scores, using the block-by-block method for events scored with continuous duration recording (CDR), were higher when the data from the same sessions were converted to discontinuous methods. Sessions with IOA scores of 89% or less with CDR were rescored using 10-s partial…
Absence of Visual Input Results in the Disruption of Grid Cell Firing in the Mouse.
Chen, Guifen; Manson, Daniel; Cacucci, Francesca; Wills, Thomas Joseph
2016-09-12
Grid cells are spatially modulated neurons within the medial entorhinal cortex whose firing fields are arranged at the vertices of tessellating equilateral triangles [1]. The exquisite periodicity of their firing has led to the suggestion that they represent a path integration signal, tracking the organism's position by integrating speed and direction of movement [2-10]. External sensory inputs are required to reset any errors that the path integrator would inevitably accumulate. Here we probe the nature of the external sensory inputs required to sustain grid firing, by recording grid cells as mice explore familiar environments in complete darkness. The absence of visual cues results in a significant disruption of grid cell firing patterns, even when the quality of the directional information provided by head direction cells is largely preserved. Darkness alters the expression of velocity signaling within the entorhinal cortex, with changes evident in grid cell firing rate and the local field potential theta frequency. Short-term (<1.5 s) spike timing relationships between grid cell pairs are preserved in the dark, indicating that network patterns of excitatory and inhibitory coupling between grid cells exist independently of visual input and of spatially periodic firing. However, we find no evidence of preserved hexagonal symmetry in the spatial firing of single grid cells at comparable short timescales. Taken together, these results demonstrate that visual input is required to sustain grid cell periodicity and stability in mice and suggest that grid cells in mice cannot perform accurate path integration in the absence of reliable visual cues. Copyright © 2016 The Author(s). Published by Elsevier Ltd.. All rights reserved.
A bilateral integrative health-care knowledge service mechanism based on 'MedGrid'.
Liu, Chao; Jiang, Zuhua; Zhen, Lu; Su, Hai
2008-04-01
Current health-care organizations face a perceived paucity of medical knowledge. This paper classifies medical knowledge under new scopes. The notion of a health-care 'knowledge flow' motivates a bilateral integrative health-care knowledge service: medical knowledge is made to 'flow' around and gain comprehensive effectiveness through six operations (such as knowledge refreshing...). Responding to the demands of health-care reform in China, this paper presents 'MedGrid', a platform providing a medical ontology and knowledge content services. Each level and its detailed contents are described in the MedGrid info-structure. Moreover, a new diagnosis and treatment mechanism is formed by technically connecting MedGrid with electronic health-care records (EHRs).
2008-09-01
explosions (UNEs) at the Semipalatinsk Test Site and regional earthquakes recorded by station WMQ (Urumchi, China). Measurements from the grids are... Semipalatinsk , Lop Nor, Novaya Zemlya, and Nevada Test Sites (STS, LNTS, NZTS, NTS, respectively) and regional earthquakes. We used phase-specific window...stations (triangles) within 2000 km of STS and LNTS. Semipalatinsk Test Site Figure 2 shows Pn/Lg spectral ratios, corrected for site and distance
Globally-Gridded Interpolated Night-Time Marine Air Temperatures 1900-2014
NASA Astrophysics Data System (ADS)
Junod, R.; Christy, J. R.
2016-12-01
Over the past century, climate records have pointed to an increase in global near-surface average temperature. Near-surface air temperature over the oceans is a relatively unused parameter in understanding the current state of climate, but it is useful as an independent temperature metric over the oceans and serves as a geographical and physical complement to near-surface air temperature over land. Though versions of this dataset exist (i.e., HadMAT1 and HadNMAT2), it has been strongly recommended that various groups generate climate records independently. This University of Alabama in Huntsville (UAH) study began with the construction of monthly night-time marine air temperature (UAHNMAT) values from the early twentieth century through to the present era. Data from the International Comprehensive Ocean and Atmosphere Data Set (ICOADS) were used to compile a time series of gridded UAHNMAT (20°S-70°N). This time series was homogenized to correct for the many biases such as increasing ship height, solar deck heating, etc. The time series of UAHNMAT, once adjusted to a standard reference height, is gridded to 1.25° pentad grid boxes and interpolated using the kriging interpolation technique. This study will present results that quantify the variability and trends and compare them with trends from other related datasets, including HadNMAT2 and sea-surface temperatures (HadISST & ERSSTv4).
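A minimal Python sketch of the gridding step, binning irregular ship reports into 1.25°, 5-day (pentad) boxes, is shown below. The synthetic reports stand in for ICOADS data, and simple box averaging is not the study's full homogenization and kriging procedure.

```python
# Minimal sketch: average irregular ship reports into 1.25-degree pentad boxes.
# The input arrays are synthetic stand-ins for ICOADS night-time reports.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 10_000
reports = pd.DataFrame({
    "time": pd.to_datetime("2000-01-01") + pd.to_timedelta(rng.integers(0, 365, n), "D"),
    "lat": rng.uniform(-20, 70, n),
    "lon": rng.uniform(-180, 180, n),
    "nmat": rng.normal(18, 6, n),            # night-time marine air temperature (degC)
})

reports["pentad"] = reports["time"].dt.dayofyear.sub(1).floordiv(5)
reports["lat_box"] = np.floor((reports["lat"] + 90) / 1.25).astype(int)
reports["lon_box"] = np.floor((reports["lon"] + 180) / 1.25).astype(int)

gridded = (reports.groupby(["pentad", "lat_box", "lon_box"])["nmat"]
           .mean()
           .rename("nmat_mean"))
print(gridded.head())
```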
PLOT3D- DRAWING THREE DIMENSIONAL SURFACES
NASA Technical Reports Server (NTRS)
Canright, R. B.
1994-01-01
PLOT3D is a package of programs to draw three-dimensional surfaces of the form z = f(x,y). The function f and the boundary values for x and y are the input to PLOT3D. The surface thus defined may be drawn after arbitrary rotations. However, it is designed to draw only functions in rectangular coordinates expressed explicitly in the above form. It cannot, for example, draw a sphere. Output is by off-line incremental plotter or on-line microfilm recorder. This package, unlike other packages, will plot any function of the form z = f(x,y) and portrays continuous and bounded functions of two independent variables. With curve fitting, however, it can draw experimental data and pictures which cannot be expressed in the above form. The method used is division of the given x and y ranges into a uniform rectangular grid. The values of the supplied function at the grid points (x, y) are calculated and stored; this defines the surface. The surface is portrayed by connecting successive (y,z) points with straight-line segments for each x value on the grid and, in turn, connecting successive (x,z) points for each fixed y value on the grid. These lines are then projected by parallel projection onto the fixed yz-plane for plotting. This program has been implemented on the IBM 360/67 with an on-line CDC microfilm recorder.
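A modern analogue of this plotting method can be written in a few lines of Python; the sketch below evaluates a sample f(x, y) on a uniform rectangular grid and draws the surface as a wireframe. It is not the original IBM 360/67 implementation, and the sample function is arbitrary.

```python
# Minimal modern analogue: evaluate z = f(x, y) on a uniform rectangular grid
# and draw the surface as line segments along constant-x and constant-y rows.
import numpy as np
import matplotlib.pyplot as plt

def f(x, y):                      # any explicit surface z = f(x, y)
    return np.sin(x) * np.cos(y)

x = np.linspace(-3, 3, 40)
y = np.linspace(-3, 3, 40)
X, Y = np.meshgrid(x, y)
Z = f(X, Y)

ax = plt.figure().add_subplot(projection="3d")
ax.plot_wireframe(X, Y, Z, rstride=1, cstride=1, linewidth=0.5)
plt.savefig("surface.png")        # stand-in for the plotter/microfilm output
```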
Digital terrain tapes: user guide
1980-01-01
DMATC's digital terrain tapes are a by-product of the agency's efforts to streamline the production of raised-relief maps. In the early 1960s DMATC developed the Digital Graphics Recorder (DGR) system, which introduced new digitizing techniques and processing methods into the field of three-dimensional mapping. The DGR system consisted of an automatic digitizing table and a computer system that recorded a grid of terrain elevations from traces of the contour lines on standard topographic maps. A sequence of computer accuracy checks was performed and then the elevations of grid points not intersected by contour lines were interpolated. The DGR system produced computer magnetic tapes which controlled the carving of plaster forms used to mold raised-relief maps. It was realized almost immediately that this relatively simple tool for carving plaster molds had enormous potential for storing, manipulating, and selectively displaying (either graphically or numerically) a vast number of terrain elevations. As the demand for the digital terrain tapes increased, DMATC began developing increasingly advanced digitizing systems and now operates the Digital Topographic Data Collection System (DTDCS). With DTDCS, two types of data (elevations as contour lines and points, and stream and ridge lines) are sorted, matched, and resorted to obtain a grid of elevation values for every 0.01 inch on each map (approximately 200 feet on the ground). Undefined points on the grid are found by either linear or planar interpolation.
Evaluation of the Spies™ modalities image quality
Emiliani, Esteban; Talso, Michele; Baghdadi, Mohammed; Barreiro, Aarón; Orosa, Andrea; Serviàn, Pol; Gavrilov, Pavel; Proietti, Silvia; Traxer, Olivier
2017-01-01
Introduction The Spies™ system (Karl-Storz®) was introduced into digital ureteroscopy to improve endoscopic vision. To date, there are no data either indicating which of the Spies modalities is better for improving diagnosis and treatment procedures or comparing the modalities in terms of image quality. The aim of this study was to evaluate and compare the image quality of five Spies™ modalities (SM) with standard white light in an in-vitro model. Materials and Methods Two standardized grids and 3 stones of different composition were recorded in white light and the 5 SM (Clara, Chroma, Clara+Chroma, Spectra A and Spectra B) using 4 standardized aqueous scenarios. Twelve templates were created so that the same object could be compared simultaneously across the different modalities. Six urologists, five medical students, five urology residents, and five persons not involved with urology evaluated each video on a scale of 1 (very bad) to 5 (very good). Results Comparing white light to the SM, subjects rated the quality of Clara and Clara+Chroma higher than white light (p=0.0139 and p<0.05) and rated Spectra A and B lower (p=0.0005 and p=0.0023). When comparing Clara to the other SM, it was ranked equivalent to Clara+Chroma (p=0.67) and obtained a higher rank than Chroma, Spectra A and B (p<0.05, p=0.0001 and p=0.0001). In the multivariate analysis, mean scores were higher among urologists. Conclusion In all analyzed scenarios, the subjects ranked Clara and Clara+Chroma as the modalities with better image quality compared with white light. PMID:28338307
Evaluation of the Spies™ modalities image quality.
Emiliani, Esteban; Talso, Michele; Baghdadi, Mohammed; Barreiro, Aaron; Orosa, Andrea; Serviàn, Pol; Gavrilov, Pavel; Proietti, Silvia; Traxer, Olivier
2017-01-01
The Spies™ system (Karl-Storz®) was introduced into digital ureteroscopy to improve endoscopic vision. To date, there are no data either indicating which of the Spies modalities is better for improving diagnosis and treatment procedures or comparing the modalities in terms of image quality. The aim of this study was to evaluate and compare the image quality of five Spies™ modalities (SM) with standard white light in an in-vitro model. Two standardized grids and 3 stones of different composition were recorded in white light and the 5 SM (Clara, Chroma, Clara+Chroma, Spectra A and Spectra B) using 4 standardized aqueous scenarios. Twelve templates were created so that the same object could be compared simultaneously across the different modalities. Six urologists, five medical students, five urology residents, and five persons not involved with urology evaluated each video on a scale of 1 (very bad) to 5 (very good). Comparing white light to the SM, subjects rated the quality of Clara and Clara+Chroma higher than white light (p=0.0139 and p<0.05) and rated Spectra A and B lower (p=0.0005 and p=0.0023). When comparing Clara to the other SM, it was ranked equivalent to Clara+Chroma (p=0.67) and obtained a higher rank than Chroma, Spectra A and B (p<0.05, p=0.0001 and p=0.0001). In the multivariate analysis, mean scores were higher among urologists. In all analyzed scenarios, the subjects ranked Clara and Clara+Chroma as the modalities with better image quality compared with white light. Copyright® by the International Brazilian Journal of Urology.
Potential for unreliable interpretation of EEG recorded with microelectrodes.
Stacey, William C; Kellis, Spencer; Greger, Bradley; Butson, Christopher R; Patel, Paras R; Assaf, Trevor; Mihaylova, Temenuzhka; Glynn, Simon
2013-08-01
Recent studies in epilepsy, cognition, and brain machine interfaces have shown the utility of recording intracranial electroencephalography (iEEG) with greater spatial resolution. Many of these studies utilize microelectrodes connected to specialized amplifiers that are optimized for such recordings. We recently measured the impedances of several commercial microelectrodes and demonstrated that they will distort iEEG signals if connected to clinical EEG amplifiers commonly used in most centers. In this study we demonstrate the clinical implications of this effect and identify some of the potential difficulties in using microelectrodes. Human iEEG data were digitally filtered to simulate the signal recorded by a hybrid grid (two macroelectrodes and eight microelectrodes) connected to a standard EEG amplifier. The filtered iEEG data were read by three trained epileptologists, and high frequency oscillations (HFOs) were detected with a well-known algorithm. The filtering method was verified experimentally by recording an injected EEG signal in a saline bath with the same physical acquisition system used to generate the model. Several electrodes underwent scanning electron microscopy (SEM). Macroelectrode recordings were unaltered compared to the source iEEG signal, but microelectrodes attenuated low frequencies. The attenuated signals were difficult to interpret: all three clinicians changed their clinical scoring of slowing and seizures when presented with the same data recorded on different sized electrodes. The HFO detection algorithm was oversensitive with microelectrodes, classifying many more HFOs than when the same data were recorded with macroelectrodes. In addition, during experimental recordings the microelectrodes produced much greater noise as well as large baseline fluctuations, creating sharply contoured transients, and superimposed "false" HFOs. SEM of these microelectrodes demonstrated marked variability in exposed electrode surface area, lead fractures, and sharp edges. Microelectrodes should not be used with low impedance (<1 GΩ) amplifiers due to severe signal attenuation and variability that changes clinical interpretations. The current method of preparing microelectrodes can leave sharp edges and nonuniform amounts of exposed wire. Even when recorded with higher impedance amplifiers, microelectrode data are highly prone to artifacts that are difficult to interpret. Great care must be taken when analyzing iEEG from high impedance microelectrodes. Wiley Periodicals, Inc. © 2013 International League Against Epilepsy.
Entorhinal cortex receptive fields are modulated by spatial attention, even without movement
König, Peter; König, Seth; Buffalo, Elizabeth A
2018-01-01
Grid cells in the entorhinal cortex allow for the precise decoding of position in space. Along with potentially playing an important role in navigation, grid cells have recently been hypothesized to make a general contribution to mental operations. A prerequisite for this hypothesis is that grid cell activity does not critically depend on physical movement. Here, we show that movement of covert attention, without any physical movement, also elicits spatial receptive fields with a triangular tiling of space. In monkeys trained to maintain central fixation while covertly attending to a stimulus moving in the periphery we identified a significant population (20/141, 14% neurons at a FDR <5%) of entorhinal cells with spatially structured receptive fields. This contrasts with recordings obtained in the hippocampus, where grid-like representations were not observed. Our results provide evidence that neurons in macaque entorhinal cortex do not rely on physical movement. PMID:29537964
The FORBIO Climate data set for climate analyses
NASA Astrophysics Data System (ADS)
Delvaux, C.; Journée, M.; Bertrand, C.
2015-06-01
In the framework of the interdisciplinary FORBIO Climate research project, the Royal Meteorological Institute of Belgium is in charge of providing high resolution gridded past climate data (i.e. temperature and precipitation). This climate data set will be linked to the measurements on seedlings, saplings and mature trees to assess the effects of climate variation on tree performance. This paper explains how the gridded daily temperature (minimum and maximum) data set was generated from a consistent station network between 1980 and 2013. After station selection, data quality control procedures were developed and applied to the station records to ensure that only valid measurements will be involved in the gridding process. Thereafter, the set of unevenly distributed validated temperature data was interpolated on a 4 km × 4 km regular grid over Belgium. The performance of different interpolation methods has been assessed. The method of kriging with external drift using correlation between temperature and altitude gave the most relevant results.
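The gridding step can be illustrated with the Python sketch below, which uses simple linear interpolation from scipy as a stand-in for the kriging-with-external-drift method assessed by the authors; the station data and the domain extent are synthetic.

```python
# Minimal sketch: interpolate one day's station Tmax onto a 4 km x 4 km grid.
# Linear interpolation is used here as a simplified stand-in for kriging with
# external drift; stations and values are synthetic.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(3)
stations_xy = rng.uniform(0, 250_000, size=(60, 2))             # metres, toy domain
tmax = 20 + 0.00002 * stations_xy[:, 0] + rng.normal(0, 1, 60)  # one day's Tmax (degC)

gx, gy = np.meshgrid(np.arange(0, 250_000, 4_000),              # 4 km grid spacing
                     np.arange(0, 250_000, 4_000))
tmax_grid = griddata(stations_xy, tmax, (gx, gy), method="linear")
print("grid shape:", tmax_grid.shape, "valid cells:",
      int(np.isfinite(tmax_grid).sum()))
```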
Evaluation model of distribution network development based on ANP and grey correlation analysis
NASA Astrophysics Data System (ADS)
Ma, Kaiqiang; Zhan, Zhihong; Zhou, Ming; Wu, Qiang; Yan, Jun; Chen, Genyong
2018-06-01
The existing distribution network evaluation system cannot scientifically and comprehensively reflect the development status of distribution networks. Furthermore, the evaluation model is monotonous and unsuitable for horizontal comparison across many regional power grids. For these reasons, this paper constructs a universally adaptable evaluation index system and model for distribution network development. First, the evaluation system is built from power supply capability, grid structure, technical equipment, intelligence level, grid efficiency and grid development benefit. Then the comprehensive weights of the indices are calculated by combining the AHP with grey correlation analysis. Finally, the index scoring functions are obtained by fitting curves to the index evaluation criteria, and a multiply-and-add (weighted sum) operator yields the overall evaluation result for each sample. The example analysis shows that the model can reflect the development of a distribution network and identify its strengths and weaknesses, and it provides suggestions for distribution network development and construction.
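The grey correlation step can be sketched as follows in Python; the sample scores, the 0.5 resolution coefficient, and the AHP-style weights are illustrative assumptions, and the combination rule is one common variant rather than the paper's exact operator.

```python
# Minimal sketch: grey relational grades of indices against an ideal reference,
# blended with subjective (AHP-style) weights. Data and parameters are illustrative.
import numpy as np

scores = np.array([                 # rows: regional grids, cols: indices (0-1 scaled)
    [0.8, 0.6, 0.9, 0.7],
    [0.5, 0.9, 0.6, 0.8],
    [0.7, 0.7, 0.8, 0.6],
])
reference = scores.max(axis=0)      # ideal value of each index
rho = 0.5                           # resolution (distinguishing) coefficient

delta = np.abs(scores - reference)
coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
grey_grade = coeff.mean(axis=0)     # grey relational grade per index

ahp_weights = np.array([0.4, 0.2, 0.25, 0.15])   # illustrative subjective weights
combined = ahp_weights * grey_grade
combined /= combined.sum()
print("combined index weights:", np.round(combined, 3))
```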
Baguena, N; Thomas-Antérion, C; Sciessere, K; Truche, A; Extier, C; Guyot, E; Paris, N
2006-06-01
To assess executive functions in an everyday-life activity by evaluating brain-injured subjects with script generation and execution tasks. We compared a script generation task with a script execution task in which subjects had to prepare a cooked dish. Two scoring grids, one qualitative and one quantitative, were used, together with the calculation of an anosognosia score. We checked whether the execution task was more sensitive to a dysexecutive disorder than the script generation task and compared the scores obtained in this evaluation with those from classical frontal tests. Twelve subjects with brain injury sustained 6 +/- 4.79 years earlier and 12 healthy control subjects were tested. The subjects carried out a script generation task in which they had to explain the stages necessary to make a chocolate cake, and a script execution task corresponding to actually making the cake. The two scoring grids were operational and complementary, with the quantitative grid more sensitive to a dysexecutive disorder. The brain-injured subjects made more errors in the execution task. It is important to evaluate the executive functions of subjects with brain injury in everyday-life tasks, not just in psychometric or script-generation tests. Indeed, the ecological performance of a very simple task can reveal executive-function difficulties, such as planning or sequencing of actions, that are under-evaluated in laboratory tests.
The Moss Flora of Akdağ Mountain (Amasya, Turkey)
Canli, Kerem; Çetin, Barbaros
2014-01-01
The moss flora of Akdağ Mountain (Amasya, Turkey) was investigated. As a result of the identification of 1500 moss specimens collected from the research area, 178 taxa belonging to 69 genera and 26 families were determined. Among them, 94 taxa are new for the A3 grid square according to the grid system for Turkey adopted by Henderson. The location data of Grimmia crinitoleucophaea Cardot and Barbula enderesii Garov. are the first records for Turkey, and Encalypta spathulata Müll. Hal., Schistidium dupretii (Thér.) W. A. Weber, Weissia condensa var. armata (Thér. & Trab.) M. J. Cano, Ros & J. Guerra, Tortella bambergeri (Schimp.), Barbula enderesii Garov., Hedwigia ciliata var. leucophaea Bruch & Schimp., and Campyliadelphus elodes (Lindb.) Kanda are recorded for the second time for the bryoflora of Turkey. PMID:25587573
NASA Astrophysics Data System (ADS)
Sefton-Nash, E.; Williams, J.-P.; Greenhagen, B. T.; Aye, K.-M.; Paige, D. A.
2017-12-01
An approach is presented to efficiently produce high quality gridded data records from the large, global point-based dataset returned by the Diviner Lunar Radiometer Experiment aboard NASA's Lunar Reconnaissance Orbiter. The need to minimize data volume and processing time in production of science-ready map products is increasingly important with the growth in data volume of planetary datasets. Diviner makes on average >1400 observations per second of radiance that is reflected and emitted from the lunar surface, using 189 detectors divided into 9 spectral channels. Data management and processing bottlenecks are amplified by modeling every observation as a probability distribution function over the field of view, which can increase the required processing time by 2-3 orders of magnitude. Geometric corrections, such as projection of data points onto a digital elevation model, are numerically intensive and therefore it is desirable to perform them only once. Our approach reduces bottlenecks through parallel binning and efficient storage of a pre-processed database of observations. Database construction is via subdivision of a geodesic icosahedral grid, with a spatial resolution that can be tailored to suit the field of view of the observing instrument. Global geodesic grids with high spatial resolution are normally impractically memory intensive. We therefore demonstrate a minimum storage and highly parallel method to bin very large numbers of data points onto such a grid. A database of the pre-processed and binned points is then used for production of mapped data products, which is significantly faster than if unprocessed points were used. We explore quality controls in the production of gridded data records by conditional interpolation, allowed only where data density is sufficient. The resultant effects on the spatial continuity and uncertainty in maps of lunar brightness temperatures are illustrated. We identify four binning regimes based on trades between the spatial resolution of the grid, the size of the FOV and the on-target spacing of observations. Our approach may be applicable and beneficial for many existing and future point-based planetary datasets.
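The minimum-storage binning idea can be illustrated with the Python sketch below, which streams points into per-cell sum and count accumulators and applies a simple density cut; the equal-angle lat/lon indexing is an assumption for illustration and not the geodesic icosahedral scheme used for Diviner.

```python
# Minimal sketch: stream point observations into per-cell accumulators so the
# full-resolution points never need to be held in memory at once. The simple
# equal-angle cell index stands in for a geodesic icosahedral grid.
import numpy as np
from collections import defaultdict

def cell_index(lat, lon, deg=0.5):
    return (int((lat + 90) // deg), int((lon + 180) // deg))

sums = defaultdict(float)
counts = defaultdict(int)

rng = np.random.default_rng(4)
for _ in range(100_000):                       # stand-in for streaming observations
    lat, lon = rng.uniform(-90, 90), rng.uniform(-180, 180)
    tb = rng.normal(250, 40)                   # brightness temperature (K), synthetic
    key = cell_index(lat, lon)
    sums[key] += tb
    counts[key] += 1

# Keep only cells with sufficient data density, echoing conditional mapping.
means = {k: sums[k] / counts[k] for k in counts if counts[k] >= 3}
print("cells with enough samples:", len(means))
```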
Svetnik, Vladimir; Ma, Junshui; Soper, Keith A.; Doran, Scott; Renger, John J.; Deacon, Steve; Koblan, Ken S.
2007-01-01
Objective: To evaluate the performance of 2 automated systems, Morpheus and Somnolyzer24X7, with various levels of human review/editing, in scoring polysomnographic (PSG) recordings from a clinical trial using zolpidem in a model of transient insomnia. Methods: 164 all-night PSG recordings from 82 subjects collected during 2 nights of sleep, one under placebo and one under zolpidem (10 mg) treatment were used. For each recording, 6 different methods were used to provide sleep stage scores based on Rechtschaffen & Kales criteria: 1) full manual scoring, 2) automated scoring by Morpheus 3) automated scoring by Somnolyzer24X7, 4) automated scoring by Morpheus with full manual review, 5) automated scoring by Morpheus with partial manual review, 6) automated scoring by Somnolyzer24X7 with partial manual review. Ten traditional clinical efficacy measures of sleep initiation, maintenance, and architecture were calculated. Results: Pair-wise epoch-by-epoch agreements between fully automated and manual scores were in the range of intersite manual scoring agreements reported in the literature (70%-72%). Pair-wise epoch-by-epoch agreements between automated scores manually reviewed were higher (73%-76%). The direction and statistical significance of treatment effect sizes using traditional efficacy endpoints were essentially the same whichever method was used. As the degree of manual review increased, the magnitude of the effect size approached those estimated with fully manual scoring. Conclusion: Automated or semi-automated sleep PSG scoring offers valuable alternatives to costly, time consuming, and intrasite and intersite variable manual scoring, especially in large multicenter clinical trials. Reduction in scoring variability may also reduce the sample size of a clinical trial. Citation: Svetnik V; Ma J; Soper KA; Doran S; Renger JJ; Deacon S; Koblan KS. Evaluation of automated and semi-automated scoring of polysomnographic recordings from a clinical trial using zolpidem in the treatment of insomnia. SLEEP 2007;30(11):1562-1574. PMID:18041489
Complications of invasive video-EEG monitoring with subdural grid electrodes.
Hamer, H M; Morris, H H; Mascha, E J; Karafa, M T; Bingaman, W E; Bej, M D; Burgess, R C; Dinner, D S; Foldvary, N R; Hahn, J F; Kotagal, P; Najm, I; Wyllie, E; Lüders, H O
2002-01-08
To evaluate the risk factors, type, and frequency of complications during video-EEG monitoring with subdural grid electrodes. The authors retrospectively reviewed the records of all patients who underwent invasive monitoring with subdural grid electrodes (n = 198 monitoring sessions on 187 patients; median age: 24 years; range: 1 to 50 years) at the Cleveland Clinic Foundation from 1980 to 1997. From 1980 to 1997, the complication rate decreased (p = 0.003). In the last 5 years, 19/99 patients (19%) had complications, including two patients (2%) with permanent sequelae. In the last 3 years, the complication rate was 13.5% (n = 5/37) without permanent deficits. Overall, complications occurred during 52 monitoring sessions (26.3%): infection (n = 24; 12.1%), transient neurologic deficit (n = 22; 11.1%), epidural hematoma (n = 5; 2.5%), increased intracranial pressure (n = 5; 2.5%), and infarction (n = 3; 1.5%). One patient (0.5%) died during grid insertion. Complication occurrence was associated with greater number of grids/electrodes (p = 0.021/p = 0.052; especially >60 electrodes), longer duration of monitoring (p = 0.004; especially >10 days), older age of the patient (p = 0.005), left-sided grid insertion (p = 0.01), and burr holes in addition to the craniotomy (p = 0.022). No association with complications was found for number of seizures, IQ, anticonvulsants, or grid localization. Invasive monitoring with grid electrodes was associated with significant complications. Most of them were transient. Increased complication rates were related to left-sided grid insertion and longer monitoring with a greater number of electrodes (especially more than 60 electrodes). Improvements in grid technology, surgical technique, and postoperative care resulted in significant reductions in the complication rate.
Grid-cell representations in mental simulation
Bellmund, Jacob LS; Deuker, Lorena; Navarro Schröder, Tobias; Doeller, Christian F
2016-01-01
Anticipating the future is a key motif of the brain, possibly supported by mental simulation of upcoming events. Rodent single-cell recordings suggest the ability of spatially tuned cells to represent subsequent locations. Grid-like representations have been observed in the human entorhinal cortex during virtual and imagined navigation. However, hitherto it remains unknown if grid-like representations contribute to mental simulation in the absence of imagined movement. Participants imagined directions between building locations in a large-scale virtual-reality city while undergoing fMRI without re-exposure to the environment. Using multi-voxel pattern analysis, we provide evidence for representations of absolute imagined direction at a resolution of 30° in the parahippocampal gyrus, consistent with the head-direction system. Furthermore, we capitalize on the six-fold rotational symmetry of grid-cell firing to demonstrate a 60° periodic pattern-similarity structure in the entorhinal cortex. Our findings imply a role of the entorhinal grid-system in mental simulation and future thinking beyond spatial navigation. DOI: http://dx.doi.org/10.7554/eLife.17089.001 PMID:27572056
Lias, Courtney; Vigersky, Robert; Clarke, William; Parkes, Joan Lee; Sacks, David B.; Kirkman, M. Sue; Kovatchev, Boris
2014-01-01
Introduction: Currently used error grids for assessing clinical accuracy of blood glucose monitors are based on out-of-date medical practices. Error grids have not been widely embraced by regulatory agencies for clearance of monitors, but this type of tool could be useful for surveillance of the performance of cleared products. Diabetes Technology Society together with representatives from the Food and Drug Administration, the American Diabetes Association, the Endocrine Society, and the Association for the Advancement of Medical Instrumentation, and representatives of academia, industry, and government, have developed a new error grid, called the surveillance error grid (SEG) as a tool to assess the degree of clinical risk from inaccurate blood glucose (BG) monitors. Methods: A total of 206 diabetes clinicians were surveyed about the clinical risk of errors of measured BG levels by a monitor. The impact of such errors on 4 patient scenarios was surveyed. Each monitor/reference data pair was scored and color-coded on a graph per its average risk rating. Using modeled data representative of the accuracy of contemporary meters, the relationships between clinical risk and monitor error were calculated for the Clarke error grid (CEG), Parkes error grid (PEG), and SEG. Results: SEG action boundaries were consistent across scenarios, regardless of whether the patient was type 1 or type 2 or using insulin or not. No significant differences were noted between responses of adult/pediatric or 4 types of clinicians. Although small specific differences in risk boundaries between US and non-US clinicians were noted, the panel felt they did not justify separate grids for these 2 types of clinicians. The data points of the SEG were classified in 15 zones according to their assigned level of risk, which allowed for comparisons with the classic CEG and PEG. Modeled glucose monitor data with realistic self-monitoring of blood glucose errors derived from meter testing experiments plotted on the SEG when compared to the data plotted on the CEG and PEG produced risk estimates that were more granular and reflective of a continuously increasing risk scale. Discussion: The SEG is a modern metric for clinical risk assessments of BG monitor errors that assigns a unique risk score to each monitor data point when compared to a reference value. The SEG allows the clinical accuracy of a BG monitor to be portrayed in many ways, including as the percentages of data points falling into custom-defined risk zones. For modeled data the SEG, compared with the CEG and PEG, allows greater precision for quantifying risk, especially when the risks are low. This tool will be useful to allow regulators and manufacturers to monitor and evaluate glucose monitor performance in their surveillance programs. PMID:25562886
Klonoff, David C; Lias, Courtney; Vigersky, Robert; Clarke, William; Parkes, Joan Lee; Sacks, David B; Kirkman, M Sue; Kovatchev, Boris
2014-07-01
Currently used error grids for assessing clinical accuracy of blood glucose monitors are based on out-of-date medical practices. Error grids have not been widely embraced by regulatory agencies for clearance of monitors, but this type of tool could be useful for surveillance of the performance of cleared products. Diabetes Technology Society together with representatives from the Food and Drug Administration, the American Diabetes Association, the Endocrine Society, and the Association for the Advancement of Medical Instrumentation, and representatives of academia, industry, and government, have developed a new error grid, called the surveillance error grid (SEG) as a tool to assess the degree of clinical risk from inaccurate blood glucose (BG) monitors. A total of 206 diabetes clinicians were surveyed about the clinical risk of errors of measured BG levels by a monitor. The impact of such errors on 4 patient scenarios was surveyed. Each monitor/reference data pair was scored and color-coded on a graph per its average risk rating. Using modeled data representative of the accuracy of contemporary meters, the relationships between clinical risk and monitor error were calculated for the Clarke error grid (CEG), Parkes error grid (PEG), and SEG. SEG action boundaries were consistent across scenarios, regardless of whether the patient was type 1 or type 2 or using insulin or not. No significant differences were noted between responses of adult/pediatric or 4 types of clinicians. Although small specific differences in risk boundaries between US and non-US clinicians were noted, the panel felt they did not justify separate grids for these 2 types of clinicians. The data points of the SEG were classified in 15 zones according to their assigned level of risk, which allowed for comparisons with the classic CEG and PEG. Modeled glucose monitor data with realistic self-monitoring of blood glucose errors derived from meter testing experiments plotted on the SEG when compared to the data plotted on the CEG and PEG produced risk estimates that were more granular and reflective of a continuously increasing risk scale. The SEG is a modern metric for clinical risk assessments of BG monitor errors that assigns a unique risk score to each monitor data point when compared to a reference value. The SEG allows the clinical accuracy of a BG monitor to be portrayed in many ways, including as the percentages of data points falling into custom-defined risk zones. For modeled data the SEG, compared with the CEG and PEG, allows greater precision for quantifying risk, especially when the risks are low. This tool will be useful to allow regulators and manufacturers to monitor and evaluate glucose monitor performance in their surveillance programs. © 2014 Diabetes Technology Society.
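The zone-percentage reporting described above can be sketched generically in Python; the linear risk function and the zone boundaries below are placeholders, not the published SEG risk surface.

```python
# Minimal sketch: score (reference, meter) pairs with a risk function and report
# the percentage of points in custom risk zones. The risk function and zones are
# illustrative placeholders, not the published SEG risk surface.
import numpy as np

rng = np.random.default_rng(5)
reference = rng.uniform(40, 400, 2_000)                 # mg/dL
meter = reference * (1 + rng.normal(0, 0.07, 2_000))    # simulated meter error

def risk(ref, meas):
    # placeholder: relative error scaled to a 0-4 "risk" range
    return np.clip(np.abs(meas - ref) / ref * 10, 0, 4)

r = risk(reference, meter)
zones = [(0.0, 0.5), (0.5, 1.5), (1.5, 2.5), (2.5, 4.01)]   # custom-defined zones
for low, high in zones:
    pct = 100 * np.mean((r >= low) & (r < high))
    print(f"risk {low:.1f}-{high:.1f}: {pct:.1f}% of points")
```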
Sefton, Gerri; Lane, Steven; Killen, Roger; Black, Stuart; Lyon, Max; Ampah, Pearl; Sproule, Cathryn; Loren-Gosling, Dominic; Richards, Caitlin; Spinty, Jean; Holloway, Colette; Davies, Coral; Wilson, April; Chean, Chung Shen; Carter, Bernie; Carrol, E D
2017-05-01
Pediatric Early Warning Scores are advocated to assist health professionals to identify early signs of serious illness or deterioration in hospitalized children. Scores are derived from the weighting applied to recorded vital signs and clinical observations reflecting deviation from a predetermined "norm." Higher aggregate scores trigger an escalation in care aimed at preventing critical deterioration. Process errors made while recording these data, including plotting or calculation errors, have the potential to impede the reliability of the score. To test this hypothesis, we conducted a controlled study of documentation using five clinical vignettes. We measured the accuracy of vital sign recording, score calculation, and time taken to complete documentation using a handheld electronic physiological surveillance system, VitalPAC Pediatric, compared with traditional paper-based charts. We explored the user acceptability of both methods using a Web-based survey. Twenty-three staff participated in the controlled study. The electronic physiological surveillance system improved the accuracy of vital sign recording, 98.5% versus 85.6%, P < .02, Pediatric Early Warning Score calculation, 94.6% versus 55.7%, P < .02, and saved time, 68 versus 98 seconds, compared with paper-based documentation, P < .002. Twenty-nine staff completed the Web-based survey. They perceived that the electronic physiological surveillance system offered safety benefits by reducing human error while providing instant visibility of recorded data to the entire clinical team.
Kellis, Spencer; Sorensen, Larry; Darvas, Felix; Sayres, Conor; O'Neill, Kevin; Brown, Richard B; House, Paul; Ojemann, Jeff; Greger, Bradley
2016-01-01
Electrocorticography grids have been used to study and diagnose neural pathophysiology for over 50 years, and recently have been used for various neural prosthetic applications. Here we provide evidence that micro-scale electrodes are better suited for studying cortical pathology and function, and for implementing neural prostheses. This work compares dynamics in space, time, and frequency of cortical field potentials recorded by three types of electrodes: electrocorticographic (ECoG) electrodes; non-penetrating micro-ECoG (μECoG) electrodes, which use microelectrodes with tighter interelectrode spacing; and penetrating microelectrode arrays (MEA), which penetrate the cortex to record single- or multiunit activity (SUA or MUA) and local field potentials (LFP). While the finest spatial scales are found in LFPs recorded intracortically, we found that LFPs recorded from μECoG electrodes demonstrate scales of linear similarity (i.e., correlation, coherence, and phase) closer to the intracortical electrodes than the clinical ECoG electrodes. We conclude that LFPs can be recorded intracortically and epicortically at finer scales than clinical ECoG electrodes are capable of capturing. Recorded with appropriately scaled electrodes and grids, field potentials expose a more detailed representation of cortical network activity, enabling advanced analyses of cortical pathology and demanding applications such as brain-computer interfaces. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
76 FR 3089 - Roundtable on Federal Government Engagement in Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-19
... of a Smart Grid, secure and interoperable electronic health records, cybersecurity, cloud computing... government engage in sectors where there is a compelling national interest? How are existing public-private...
Fast, deep record length, time-resolved visible spectroscopy of plasmas using fiber grids
NASA Astrophysics Data System (ADS)
Brockington, Samuel; Case, Andrew; Cruz, Edward; Witherspoon, F. Douglas; Horton, Robert; Klauser, Ruth; Hwang, D. Q.
2016-10-01
HyperV Technologies is developing a fiber-coupled, deep-record-length, low-light camera head for performing high time resolution spectroscopy on visible emission from plasma events. New solid-state Silicon Photo-Multiplier (SiPM) chips are capable of single photon event detection and high speed data acquisition. By coupling the output of a spectrometer to an imaging fiber bundle connected to a bank of amplified SiPMs, time-resolved spectroscopic imagers of 100 to 1,000 pixels can be constructed. Target pixel performance is 10 Megaframes/sec with record lengths of up to 256,000 frames, yielding 25.6 milliseconds of record at 10 Megasamples/sec resolution. Pixel resolutions of 8 to 12 bits are possible. Pixel pitch can be refined by using grids of 100 μm to 1000 μm diameter fibers. A prototype 32-pixel spectroscopic imager employing this technique was constructed and successfully tested at the University of California at Davis Compact Toroid Injection Experiment (CTIX) as a full demonstration of the concept. Experimental results will be discussed, along with future plans for the Phase 2 project, and potential applications to plasma experiments. Work supported by USDOE SBIR Grant DE-SC0013801.
The effects of blood vessels on electrocorticography
NASA Astrophysics Data System (ADS)
Bleichner, M. G.; Vansteensel, M. J.; Huiskamp, G. M.; Hermes, D.; Aarnoutse, E. J.; Ferrier, C. H.; Ramsey, N. F.
2011-08-01
Electrocorticography, primarily used in a clinical context, is becoming increasingly important for fundamental neuroscientific research, as well as for brain-computer interfaces. Recordings from these implanted electrodes have a number of advantages over non-invasive recordings in terms of bandwidth, spatial resolution, lower vulnerability to artifacts, and overall signal quality. However, an unresolved issue is that signals vary greatly across electrodes. Here, we examine the effect of blood vessels lying between an electrode and the cortex on signals recorded from subdural grid electrodes. Blood vessels of different sizes cover extensive parts of the cortex, causing variations in the electrode-cortex connection across grids. The power spectral density of electrodes located on the cortex and electrodes located on blood vessels obtained from eight epilepsy patients is compared. We find that blood vessels affect the power spectral density of the recorded signal in a frequency-band-specific way, in that frequencies between 30 and 70 Hz are attenuated the most. Here, the signal is attenuated on average by 30-40% compared to electrodes directly on the cortex. For lower frequencies this attenuation effect is less pronounced. We conclude that blood vessels influence the signal properties in a non-uniform manner.
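A minimal sketch of the band-limited power comparison described above, assuming numpy and scipy are available; the sampling rate, synthetic signals, and resulting attenuation figure are illustrative and not taken from the study.

import numpy as np
from scipy.signal import welch

def band_power(x, fs, band=(30.0, 70.0)):
    # Welch power spectral density integrated over a frequency band.
    f, pxx = welch(x, fs=fs, nperseg=int(2 * fs))
    mask = (f >= band[0]) & (f <= band[1])
    return np.trapz(pxx[mask], f[mask])

# Hypothetical comparison: percent attenuation of an electrode lying over a
# vessel relative to an electrode lying directly on the cortex.
fs = 512.0
rng = np.random.default_rng(0)
cortex = rng.standard_normal(int(60 * fs))
vessel = 0.7 * cortex + 0.1 * rng.standard_normal(int(60 * fs))
print(100.0 * (1.0 - band_power(vessel, fs) / band_power(cortex, fs)))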
Quality of qualitative studies centred on patients in family practice: a systematic review.
Cambon, Benoit; Vorilhon, Philippe; Michel, Laurence; Cadwallader, Jean-Sébastien; Aubin-Auger, Isabelle; Pereira, Bruno; Vaillant Roussel, Hélène
2016-12-01
Qualitative research is often used in the field of general medicine. Our objective was to evaluate the quality of published qualitative studies conducted using individual interviews or focus groups centred on patients monitored in general practice. We undertook a review of the literature in the PubMed and Embase databases of articles up to February 2014. The selection criteria were qualitative studies conducted using individual interviews or focus groups, centred on patients monitored in general practice. The articles chosen were analysed and evaluated using a score established from the Relevance, Appropriateness, Transparency and Soundness (RATS) grid. The average score of the 52 studies chosen was 28 out of 42. The criteria least often present were the description of the patients who chose not to participate in the study, the justification of the end of data collection, the discussion of the influence of the researchers, and the discussion of the confidentiality of the data. The criteria most frequently described were an explicit research question, justified and related to existing knowledge, the agreement of the ethics committee, and the presence of quotations. The number of studies and the score increased from year to year. The score was independent of the impact factor of the journal. Even though the qualitative research was published in journals with a low impact factor, our results suggest that this research met the quality criteria of the RATS grid. Evaluation using the RATS score could be useful for authors or reviewers and for literature reviews. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Finan, Patrick H; Richards, Jessica M; Gamaldo, Charlene E; Han, Dingfen; Leoutsakos, Jeannie Marie; Salas, Rachel; Irwin, Michael R; Smith, Michael T
2016-11-15
To evaluate the validity of an ambulatory electroencephalographic (EEG) monitor for the estimation of sleep continuity and architecture in healthy adults. Healthy, good sleeping participants (n = 14) were fit with both an ambulatory EEG monitor (Sleep Profiler) and a full polysomnography (PSG) montage. EEG recordings were gathered from both devices on the same night, during which sleep was permitted uninterrupted for eight hours. The study was set in an inpatient clinical research suite. PSG and Sleep Profiler records were scored by a neurologist board certified in sleep medicine, blinded to record identification. Agreement between the scored PSG record, the physician-scored Sleep Profiler record, and the Sleep Profiler record scored by an automatic algorithm was evaluated for each sleep stage, with the PSG record serving as the reference. Results indicated strong percent agreement across stages. Kappa was strongest for Stage N3 and REM. Specificity was high for all stages; sensitivity was low for Wake and Stage N1, and high for Stage N2, Stage N3, and REM. Agreement indices improved for the manually scored Sleep Profiler record relative to the autoscore record. Overall, the Sleep Profiler yields an EEG record with comparable sleep architecture estimates to PSG. Future studies should evaluate agreement between devices with a clinical sample that has greater periods of wake in order to better understand utility of this device for estimating sleep continuity indices, such as sleep onset latency and wake after sleep onset. © 2016 American Academy of Sleep Medicine
Lęski, Szymon; Kublik, Ewa; Swiejkowski, Daniel A; Wróbel, Andrzej; Wójcik, Daniel K
2010-12-01
Local field potentials have good temporal resolution but are blurred due to the slow spatial decay of the electric field. For simultaneous recordings on regular grids, the current source density (CSD) can be reconstructed efficiently using the inverse Current Source Density method (iCSD). It is possible to decompose the resultant spatiotemporal information about the current dynamics into functional components using Independent Component Analysis (ICA). We show on test data modeling recordings of evoked potentials on a grid of 4 × 5 × 7 points that meaningful results are obtained with spatial ICA decomposition of the reconstructed CSD. The components obtained through decomposition of the CSD are better defined and allow easier physiological interpretation than the results of a similar analysis of the corresponding evoked potentials in the thalamus. We show that spatiotemporal ICA decompositions can perform better for certain types of sources, but this does not seem to be the case for the experimental data studied. Having found the appropriate approach to decomposing neural dynamics into functional components, we use the technique to study the somatosensory evoked potentials recorded on a grid spanning a large part of the forebrain. We discuss two example components associated with the first waves of activation of the somatosensory thalamus. We show that the proposed method brings up new, more detailed information on the timing and spatial location of specific activity conveyed through various parts of the somatosensory thalamus in the rat.
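To make the spatial ICA step concrete, here is a hedged sketch using scikit-learn's FastICA on placeholder data shaped like the 4 × 5 × 7 recording grid; the component count and random data are assumptions, and the real input would be the iCSD-reconstructed CSD rather than noise.

import numpy as np
from sklearn.decomposition import FastICA

# Placeholder CSD: current source density on the 4 x 5 x 7 grid (flattened to
# 140 spatial points) by n_time samples; real data would come from iCSD.
n_space, n_time = 4 * 5 * 7, 2000
rng = np.random.default_rng(1)
csd = rng.standard_normal((n_space, n_time))

# Spatial ICA: grid points are treated as samples, so the recovered sources
# are spatial maps and the mixing matrix holds their time courses.
ica = FastICA(n_components=5, random_state=0)
spatial_maps = ica.fit_transform(csd)                 # shape (140, 5)
time_courses = ica.mixing_                            # shape (n_time, 5)
component0 = spatial_maps[:, 0].reshape(4, 5, 7)      # back onto the recording grid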
Mulas, Marcello; Waniek, Nicolai; Conradt, Jörg
2016-01-01
After the discovery of grid cells, which are an essential component for understanding how the mammalian brain encodes spatial information, three main classes of computational models were proposed to explain their working principles. Among them, the one based on continuous attractor networks (CAN) is promising in terms of biological plausibility and suitable for robotic applications. However, in its current formulation, it is unable to reproduce important electrophysiological findings and cannot be used to perform path integration for long periods of time. In fact, in the absence of an appropriate resetting mechanism, the accumulation of errors over time, due to the noise intrinsic in velocity estimation and neural computation, prevents CAN models from reproducing stable spatial grid patterns. In this paper, we propose an extension of the CAN model that uses Hebbian plasticity to anchor grid cell activity to environmental landmarks. To validate our approach, we used both artificial data and real data recorded from a robotic setup as input to the neural simulations. The additional neural mechanism can not only anchor grid patterns to external sensory cues but also recall grid patterns generated in previously explored environments. These results might be instrumental for next-generation bio-inspired robotic navigation algorithms that take advantage of neural computation in order to cope with complex and dynamic environments. PMID:26924979
NASA Astrophysics Data System (ADS)
Qu, Yue; Slootsky, Michael; Forrest, Stephen
2015-10-01
We demonstrate a method for extracting waveguided light trapped in the organic and indium tin oxide layers of bottom-emission organic light emitting devices (OLEDs) using a patterned planar grid layer (sub-anode grid) between the anode and the substrate. The scattering layer consists of two transparent materials with different refractive indices on a period sufficiently large to avoid diffraction and other unwanted wavelength-dependent effects. The position of the sub-anode grid outside of the OLED active region allows complete freedom in varying its dimensions and the materials from which it is made without impacting the electrical characteristics of the device itself. Full-wave electromagnetic simulation is used to study the dependence of efficiency on the refractive indices and geometric parameters of the grid. We show the fabrication process and characterization of OLEDs with two different grids: a buried sub-anode grid consisting of two dielectric materials, and an air sub-anode grid consisting of a dielectric material and gridline voids. Using a sub-anode grid, the substrate-plus-air-mode quantum efficiency of an OLED is enhanced from (33+/-2)% to (40+/-2)%, resulting in an increase in external quantum efficiency from (14+/-1)% to (18+/-1)%, with electrical characteristics identical to those of a conventional device. By varying the thickness of the electron transport layer (ETL) of sub-anode grid OLEDs, we find that all power launched into the waveguide modes is scattered into the substrate. We also demonstrate that a sub-anode grid combined with a thick ETL significantly reduces surface plasmon polaritons, resulting in an increase in substrate-plus-air modes of >50% compared with a conventional OLED. The independence from wavelength, viewing angle, and molecular orientation provided by this approach makes this an attractive and general solution to the problem of extracting waveguided light and reducing plasmon losses in OLEDs.
An efficient grid layout algorithm for biological networks utilizing various biological attributes
Kojima, Kaname; Nagasaki, Masao; Jeong, Euna; Kato, Mitsuru; Miyano, Satoru
2007-01-01
Background Clearly visualized biopathways provide a great help in understanding biological systems. However, manual drawing of large-scale biopathways is time consuming. We previously proposed a grid layout algorithm that can handle gene-regulatory networks and signal transduction pathways by considering edge-edge crossing, node-edge crossing, distance measure between nodes, and subcellular localization information from Gene Ontology. Consequently, the layout algorithm succeeded in drastically reducing these crossings in the apoptosis model. However, for larger-scale networks, we encountered three problems: (i) the initial layout is often very far from any local optimum because nodes are initially placed at random, (ii) from a biological viewpoint, human layouts still exceed automatic layouts in comprehensibility because, apart from subcellular localization, the algorithm does not fully utilize biological information of pathways, and (iii) it employs a local search strategy in which the neighborhood is obtained by moving one node at each step, and automatic layouts suggest that simultaneous movements of multiple nodes are necessary for better layouts, although such an extension may worsen the time complexity. Results We propose a new grid layout algorithm. To address problem (i), we devised a new force-directed algorithm whose output is suitable as the initial layout. For (ii), we considered that an appropriate alignment of nodes having the same biological attribute is one of the most important factors for comprehension, and we defined a new score function that gives an advantage to such configurations. To solve problem (iii), we developed a search strategy that considers swapping nodes as well as moving a node, while keeping the order of the time complexity. Although a naïve implementation would increase the time complexity by one order, we solved this difficulty by devising a method that caches differences between the scores of a layout and its possible updates. Conclusion Layouts from the new grid layout algorithm are compared with those of the previous algorithm and a human layout in an endothelial cell model three times as large as the apoptosis model. The total cost of the result from the new grid layout algorithm is similar to that of the human layout. In addition, its convergence time is drastically reduced (40% reduction). PMID:17338825
Lausch, V; Hermann, P; Laue, M; Bannert, N
2014-06-01
Successive application of negative staining transmission electron microscopy (TEM) and tip-enhanced Raman spectroscopy (TERS) is a new correlative approach that could be used to rapidly and specifically detect and identify single pathogens including bioterrorism-relevant viruses in complex samples. Our objective is to evaluate the TERS-compatibility of commonly used electron microscopy (EM) grids (sample supports), chemicals and negative staining techniques and, if required, to devise appropriate alternatives. While phosphotungstic acid (PTA) is suitable as a heavy metal stain, uranyl acetate, paraformaldehyde in HEPES buffer and alcian blue are unsuitable due to their relatively high Raman scattering. Moreover, the low thermal stability of the carbon-coated pioloform film on copper grids (pioloform grids) negates their utilization. The silicon in the cantilever of the silver-coated atomic force microscope tip used to record TERS spectra suggested that Si-based grids might be employed as alternatives. Of all evaluated Si-based TEM grids, the silicon nitride (SiN) grid was found to be best suited, with almost no background Raman signals in the relevant spectral range, a low surface roughness and good particle adhesion properties that could be further improved by glow discharge. Charged SiN grids have excellent particle adhesion properties. The use of these grids in combination with PTA for contrast in the TEM is suitable for subsequent analysis by TERS. The study reports fundamental modifications and optimizations of the negative staining EM method that allow a combination with near-field Raman spectroscopy to acquire a spectroscopic signature from nanoscale biological structures. This should facilitate a more precise diagnosis of single viral particles and other micro-organisms previously localized and visualized in the TEM. © 2014 The Society for Applied Microbiology.
Methodological Caveats in the Detection of Coordinated Replay between Place Cells and Grid Cells.
Trimper, John B; Trettel, Sean G; Hwaun, Ernie; Colgin, Laura Lee
2017-01-01
At rest, hippocampal "place cells," neurons with receptive fields corresponding to specific spatial locations, reactivate in a manner that reflects recently traveled trajectories. These "replay" events have been proposed as a mechanism underlying memory consolidation, or the transfer of a memory representation from the hippocampus to neocortical regions associated with the original sensory experience. Accordingly, it has been hypothesized that hippocampal replay of a particular experience should be accompanied by simultaneous reactivation of corresponding representations in the neocortex and in the entorhinal cortex, the primary interface between the hippocampus and the neocortex. Recent studies have reported that coordinated replay may occur between hippocampal place cells and medial entorhinal cortex grid cells, cells with multiple spatial receptive fields. Assessing replay in grid cells is problematic, however, as the cells exhibit regularly spaced spatial receptive fields in all environments and, therefore, coordinated replay between place cells and grid cells may be detected by chance. In the present report, we adapted analytical approaches utilized in recent studies of grid cell and place cell replay to determine the extent to which coordinated replay is spuriously detected between grid cells and place cells recorded from separate rats. For a subset of the employed analytical methods, coordinated replay was detected spuriously in a significant proportion of cases in which place cell replay events were randomly matched with grid cell firing epochs of equal duration. More rigorous replay evaluation procedures and minimum spike count requirements greatly reduced the amount of spurious findings. These results provide insights into aspects of place cell and grid cell activity during rest that contribute to false detection of coordinated replay. The results further emphasize the need for careful controls and rigorous methods when testing the hypothesis that place cells and grid cells exhibit coordinated replay.
Sefton, Gerri; Lane, Steven; Killen, Roger; Black, Stuart; Lyon, Max; Ampah, Pearl; Sproule, Cathryn; Loren-Gosling, Dominic; Richards, Caitlin; Spinty, Jean; Holloway, Colette; Davies, Coral; Wilson, April; Chean, Chung Shen; Carter, Bernie; Carrol, E.D.
2017-01-01
Pediatric Early Warning Scores are advocated to assist health professionals to identify early signs of serious illness or deterioration in hospitalized children. Scores are derived from the weighting applied to recorded vital signs and clinical observations reflecting deviation from a predetermined “norm.” Higher aggregate scores trigger an escalation in care aimed at preventing critical deterioration. Process errors made while recording these data, including plotting or calculation errors, have the potential to impede the reliability of the score. To test this hypothesis, we conducted a controlled study of documentation using five clinical vignettes. We measured the accuracy of vital sign recording, score calculation, and time taken to complete documentation using a handheld electronic physiological surveillance system, VitalPAC Pediatric, compared with traditional paper-based charts. We explored the user acceptability of both methods using a Web-based survey. Twenty-three staff participated in the controlled study. The electronic physiological surveillance system improved the accuracy of vital sign recording, 98.5% versus 85.6%, P < .02, Pediatric Early Warning Score calculation, 94.6% versus 55.7%, P < .02, and saved time, 68 versus 98 seconds, compared with paper-based documentation, P < .002. Twenty-nine staff completed the Web-based survey. They perceived that the electronic physiological surveillance system offered safety benefits by reducing human error while providing instant visibility of recorded data to the entire clinical team. PMID:27832032
Wood, T J; Avery, G; Balcam, S; Needler, L; Smith, A; Saunderson, J R; Beavis, A W
2015-01-01
Objective: The aim of this study was to investigate via simulation a proposed change to clinical practice for chest radiography. The validity of using a scatter rejection grid across the diagnostic energy range (60–125 kVp), in conjunction with appropriate tube current–time product (mAs) for imaging with a computed radiography (CR) system was investigated. Methods: A digitally reconstructed radiograph algorithm was used, which was capable of simulating CR chest radiographs with various tube voltages, receptor doses and scatter rejection methods. Four experienced image evaluators graded images with a grid (n = 80) at tube voltages across the diagnostic energy range and varying detector air kermas. These were scored against corresponding images reconstructed without a grid, as per current clinical protocol. Results: For all patients, diagnostic image quality improved with the use of a grid, without the need to increase tube mAs (and therefore patient dose), irrespective of the tube voltage used. Increasing tube mAs by an amount determined by the Bucky factor made little difference to image quality. Conclusion: A virtual clinical trial has been performed with simulated chest CR images. Results indicate that the use of a grid improves diagnostic image quality for average adults, without the need to increase tube mAs, even at low tube voltages. Advances in knowledge: Validated with images containing realistic anatomical noise, it is possible to improve image quality by utilizing grids for chest radiography with CR systems without increasing patient exposure. Increasing tube mAs by an amount determined by the Bucky factor is not justified. PMID:25571914
2007-07-02
TYPE Final Report 3. DATES COVERED (From - To) 26-Sep-01 to 26-Jun-07 4. TITLE AND SUBTITLE OBTAINING UNIQUE, COMPREHENSIVE DEEP SEISMIC ... seismic records from 12 major Deep Seismic Sounding (DSS) projects acquired in 1970-1980’s in the former Soviet Union. The data include 3-component...records from 22 Peaceful Nuclear Explosions (PNEs) and over 500 chemical explosions recorded by a grid of linear, reversed seismic profiles covering a
Narrower grid structure of artificial reef enhances initial survival of in situ settled coral.
Suzuki, Go; Kai, Sayaka; Yamashita, Hiroshi; Suzuki, Kiyoshi; Iehisa, Yukihiro; Hayashibara, Takeshi
2011-12-01
The initial factors that cause a decline in the survival of in situ settled corals remain poorly understood. In this study, we demonstrated through field experiments that the design of artificial grid plates may influence the initial survival of Acropora corals, with narrower grids being the most effective. In fact, grid plates with a 2.5-cm mesh presented the highest recorded survival rate (14%) at 6 months after settlement (representing approximately 50 corals per 0.25 m(2) of plate). This is the first study where such high survival rates, matching those of cultures under aquarium conditions, were obtained in the field without using additional protective measures, such as guard nets against fish grazing after seeding. Therefore, our results provide a foundation for establishing new and effective coral restoration techniques for larval seeding, in parallel to clarifying the details of the early life stages of reef-building corals. Copyright © 2011 Elsevier Ltd. All rights reserved.
Patient doses from chest radiography in Victoria.
Cardillo, I; Boal, T J; Einsiedel, P F
1997-06-01
This survey examines doses from PA chest radiography at radiology practices, private hospitals and public hospitals throughout metropolitan and country Victoria. Data were collected from 111 individual X-ray units at 86 different practices. Entrance skin doses in air were measured for exposure factors used by the centre for a 23 cm thick male chest. A CDRH LucA1 chest phantom was used when making these measurements. About half of the centres used grid technique and half used non-grid technique. There was a factor of greater than 10 difference in the entrance dose delivered between the highest dose centre and the lowest dose centre for non-grid centres; and a factor of about 5 for centres using grids. Factors contributing to the high doses recorded at some centres were identified. Guidance levels for chest radiography based on the third quartile value of the entrance doses from this survey have been recommended and compared with guidance levels recommended in other countries.
Local transformations of the hippocampal cognitive map.
Krupic, Julija; Bauza, Marius; Burton, Stephen; O'Keefe, John
2018-03-09
Grid cells are neurons active in multiple fields arranged in a hexagonal lattice and are thought to represent the "universal metric for space." However, they become nonhomogeneously distorted in polarized enclosures, which challenges this view. We found that local changes to the configuration of the enclosure induce individual grid fields to shift in a manner inversely related to their distance from the reconfigured boundary. The grid remained primarily anchored to the unchanged stable walls and showed a nonuniform rescaling. Shifts in simultaneously recorded colocalized grid fields were strongly correlated, which suggests that the readout of the animal's position might still be intact. Similar field shifts were also observed in place and boundary cells-albeit of greater magnitude and more pronounced closer to the reconfigured boundary-which suggests that there is no simple one-to-one relationship between these three different cell types. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
GPS Spoofing Attack Characterization and Detection in Smart Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blum, Rick S.; Pradhan, Parth; Nagananda, Kyatsandra
The problem of global positioning system (GPS) spoofing attacks on smart grids endowed with phasor measurement units (PMUs) is addressed, taking into account the dynamical behavior of the states of the system. First, it is shown how GPS spoofing introduces a timing synchronization error in the phasor readings recorded by the PMUs and alters the measurement matrix of the dynamical model. Then, a generalized likelihood ratio-based hypotheses testing procedure is devised to detect changes in the measurement matrix when the system is subjected to a spoofing attack. Monte Carlo simulations are performed on the 9-bus, 3-machine test grid to demonstrate the implication of the spoofing attack on dynamic state estimation and to analyze the performance of the proposed hypotheses test.
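The snippet below is a toy generalized likelihood ratio test in Python, included only to illustrate the flavor of the detection step; the single-phase-offset spoofing model, Gaussian noise assumption, and chi-square threshold are simplifications and not the paper's dynamical-model formulation.

import numpy as np
from scipy.stats import chi2

def glrt_spoof_detect(z, H, x_hat, sigma, alphas, p_fa=0.01):
    # z: complex phasor measurements; H: nominal measurement matrix;
    # x_hat: current state estimate; sigma: noise standard deviation;
    # alphas: candidate spoof-induced phase offsets maximized over under H1.
    r0 = z - H @ x_hat
    ll0 = -np.sum(np.abs(r0) ** 2) / sigma ** 2
    ll1 = max(-np.sum(np.abs(z - np.exp(1j * a) * (H @ x_hat)) ** 2) / sigma ** 2
              for a in alphas)
    stat = 2.0 * (ll1 - ll0)                            # GLR statistic
    return stat, stat > chi2.ppf(1.0 - p_fa, df=1)      # detect if above threshold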
Development and validation of the FRAGIRE tool for assessment an older person's risk for frailty.
Vernerey, Dewi; Anota, Amelie; Vandel, Pierre; Paget-Bailly, Sophie; Dion, Michele; Bailly, Vanessa; Bonin, Marie; Pozet, Astrid; Foubert, Audrey; Benetkiewicz, Magdalena; Manckoundia, Patrick; Bonnetain, Franck
2016-11-17
Frailty is highly prevalent in elderly people. While significant progress has been made in understanding its pathogenesis, few validated questionnaires exist to assess the multidimensional concept of frailty and to detect people who are frail or at risk of becoming frail. The objectives of this study were to construct and validate a new frailty-screening instrument named Frailty Groupe Iso-Ressource Evaluation (FRAGIRE) that accurately predicts the risk of frailty in older adults. A prospective multicenter recruitment of elderly patients was undertaken in France. The subjects were classified into a financially-helped group (FH, with financial assistance) and a non-financially-helped group (NFH, without any financial assistance), on the assumption that FH subjects are more frail than the NFH group and thus represent an acceptable surrogate population for frailty. Psychometric properties of the FRAGIRE grid were assessed, including discrimination between the FH and NFH groups. Item reduction was performed according to statistical analyses and the experts' point of view. The association between item responses and tests with "help requested status" was assessed in univariate and multivariate unconditional logistic regression analyses, and a prognostic score for becoming frail was finally proposed for each subject. Between May 2013 and July 2013, 385 subjects were included: 338 (88%) in the FH group and 47 (12%) in the NFH group. The initial FRAGIRE grid included 65 items. After conducting the item selection, the final grid of the FRAGIRE was reduced to 19 items. The final grid showed fair discrimination ability to predict frailty (area under the curve (AUC) = 0.85) and good calibration (Hosmer-Lemeshow P-value = 0.580), reflecting good agreement between the predictions of the final model and actual observations. The Cronbach's alpha for the developed tool scored as high as 0.69 (95% Confidence Interval: 0.64 to 0.74). The final prognostic score was excellent, with an AUC of 0.756. Moreover, it facilitated significant separation of patients into individuals requesting help from others (P-value < 0.0001), with sensitivity of 81%, specificity of 61%, positive predictive value of 93%, negative predictive value of 34%, and a global predictive value of 78%. The FRAGIRE seems to have considerable potential as a reliable and effective tool for identifying frail elderly individuals by a public health social worker without medical training.
NASA Astrophysics Data System (ADS)
Davis, Sean M.; Rosenlof, Karen H.; Hassler, Birgit; Hurst, Dale F.; Read, William G.; Vömel, Holger; Selkirk, Henry; Fujiwara, Masatomo; Damadeo, Robert
2016-09-01
In this paper, we describe the construction of the Stratospheric Water and Ozone Satellite Homogenized (SWOOSH) database, which includes vertically resolved ozone and water vapor data from a subset of the limb profiling satellite instruments operating since the 1980s. The primary SWOOSH products are zonal-mean monthly-mean time series of water vapor and ozone mixing ratio on pressure levels (12 levels per decade from 316 to 1 hPa). The SWOOSH pressure level products are provided on several independent zonal-mean grids (2.5, 5, and 10°), and additional products include two coarse 3-D griddings (30° long × 10° lat, 20° × 5°) as well as a zonal-mean isentropic product. SWOOSH includes both individual satellite source data as well as a merged data product. A key aspect of the merged product is that the source records are homogenized to account for inter-satellite biases and to minimize artificial jumps in the record. We describe the SWOOSH homogenization process, which involves adjusting the satellite data records to a "reference" satellite using coincident observations during time periods of instrument overlap. The reference satellite is chosen based on the best agreement with independent balloon-based sounding measurements, with the goal of producing a long-term data record that is both homogeneous (i.e., with minimal artificial jumps in time) and accurate (i.e., unbiased). This paper details the choice of reference measurements, homogenization, and gridding process involved in the construction of the combined SWOOSH product and also presents the ancillary information stored in SWOOSH that can be used in future studies of water vapor and ozone variability. Furthermore, a discussion of uncertainties in the combined SWOOSH record is presented, and examples of the SWOOSH record are provided to illustrate its use for studies of ozone and water vapor variability on interannual to decadal timescales. The version 2.5 SWOOSH data are publicly available at doi:10.7289/V5TD9VBX.
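A much-simplified sketch of the overlap-based adjustment idea, assuming numpy; the actual SWOOSH homogenization operates on gridded monthly fields and accounts for multiple instruments and overlap statistics, which this constant-offset toy ignores.

import numpy as np

def homogenize_to_reference(source, reference):
    # source, reference: 1-D zonal-mean monthly-mean series on a common time
    # axis, with NaN where an instrument did not report.
    overlap = ~np.isnan(source) & ~np.isnan(reference)
    if not overlap.any():
        raise ValueError("records do not overlap in time")
    offset = np.mean(source[overlap] - reference[overlap])
    return source - offset        # source record adjusted toward the reference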
Davis, Sean M.; Rosenlof, Karen H.; Hassler, Birgit; Hurst, Dale F.; Read, William G.; Vömel, Holger; Selkirk, Henry; Fujiwara, Masatomo; Damadeo, Robert
2017-01-01
In this paper, we describe the construction of the Stratospheric Water and Ozone Satellite Homogenized (SWOOSH) database, which includes vertically resolved ozone and water vapor data from a subset of the limb profiling satellite instruments operating since the 1980s. The primary SWOOSH products are zonal-mean monthly-mean time series of water vapor and ozone mixing ratio on pressure levels (12 levels per decade from 316 to 1 hPa). The SWOOSH pressure level products are provided on several independent zonal-mean grids (2.5, 5, and 10°), and additional products include two coarse 3-D griddings (30° long × 10° lat, 20° × 5°) as well as a zonal-mean isentropic product. SWOOSH includes both individual satellite source data as well as a merged data product. A key aspect of the merged product is that the source records are homogenized to account for inter-satellite biases and to minimize artificial jumps in the record. We describe the SWOOSH homogenization process, which involves adjusting the satellite data records to a “reference” satellite using coincident observations during time periods of instrument overlap. The reference satellite is chosen based on the best agreement with independent balloon-based sounding measurements, with the goal of producing a long-term data record that is both homogeneous (i.e., with minimal artificial jumps in time) and accurate (i.e., unbiased). This paper details the choice of reference measurements, homogenization, and gridding process involved in the construction of the combined SWOOSH product and also presents the ancillary information stored in SWOOSH that can be used in future studies of water vapor and ozone variability. Furthermore, a discussion of uncertainties in the combined SWOOSH record is presented, and examples of the SWOOSH record are provided to illustrate its use for studies of ozone and water vapor variability on interannual to decadal timescales. The version 2.5 SWOOSH data are publicly available at doi:10.7289/V5TD9VBX. PMID:28966693
Davis, Sean M; Rosenlof, Karen H; Hassler, Birgit; Hurst, Dale F; Read, William G; Vömel, Holger; Selkirk, Henry; Fujiwara, Masatomo; Damadeo, Robert
2016-01-01
In this paper, we describe the construction of the Stratospheric Water and Ozone Satellite Homogenized (SWOOSH) database, which includes vertically resolved ozone and water vapor data from a subset of the limb profiling satellite instruments operating since the 1980s. The primary SWOOSH products are zonal-mean monthly-mean time series of water vapor and ozone mixing ratio on pressure levels (12 levels per decade from 316 to 1 hPa). The SWOOSH pressure level products are provided on several independent zonal-mean grids (2.5, 5, and 10°), and additional products include two coarse 3-D griddings (30° long × 10° lat, 20° × 5°) as well as a zonal-mean isentropic product. SWOOSH includes both individual satellite source data as well as a merged data product. A key aspect of the merged product is that the source records are homogenized to account for inter-satellite biases and to minimize artificial jumps in the record. We describe the SWOOSH homogenization process, which involves adjusting the satellite data records to a "reference" satellite using coincident observations during time periods of instrument overlap. The reference satellite is chosen based on the best agreement with independent balloon-based sounding measurements, with the goal of producing a long-term data record that is both homogeneous (i.e., with minimal artificial jumps in time) and accurate (i.e., unbiased). This paper details the choice of reference measurements, homogenization, and gridding process involved in the construction of the combined SWOOSH product and also presents the ancillary information stored in SWOOSH that can be used in future studies of water vapor and ozone variability. Furthermore, a discussion of uncertainties in the combined SWOOSH record is presented, and examples of the SWOOSH record are provided to illustrate its use for studies of ozone and water vapor variability on interannual to decadal timescales. The version 2.5 SWOOSH data are publicly available at doi:10.7289/V5TD9VBX.
Assuring the privacy and security of transmitting sensitive electronic health information.
Peng, Charlie; Kesarinath, Gautam; Brinks, Tom; Young, James; Groves, David
2009-11-14
The interchange of electronic health records between healthcare providers and public health organizations has become an increasingly desirable tool in reducing healthcare costs, improving healthcare quality, and protecting population health. Assuring privacy and security in nationwide sharing of Electronic Health Records (EHR) in an environment such as GRID has become a top challenge and concern. The Centers for Disease Control and Prevention (CDC) and the Science Applications International Corporation (SAIC) have jointly conducted a proof-of-concept study to find and build a common secure and reliable messaging platform (the SRM Platform) to handle this challenge. The SRM Platform is built on the open standards of OASIS, World Wide Web Consortium (W3C) web-services standards, and Web Services Interoperability (WS-I) specifications to provide the secure transport of sensitive EHR or electronic medical records (EMR). Transmitted data may be in any digital form including text, data, and binary files, such as images. This paper identifies the business use cases, architecture, test results, and new connectivity options for disparate health networks among PHIN, NHIN, Grid, and others.
Brinberg, Miriam; Fosco, Gregory M; Ram, Nilam
2017-12-01
Family systems theorists have forwarded a set of theoretical principles meant to guide family scientists and practitioners in their conceptualization of patterns of family interaction-intra-family dynamics-that, over time, give rise to family and individual dysfunction and/or adaptation. In this article, we present an analytic approach that merges state space grid methods adapted from the dynamic systems literature with sequence analysis methods adapted from molecular biology into a "grid-sequence" method for studying inter-family differences in intra-family dynamics. Using dyadic data from 86 parent-adolescent dyads who provided up to 21 daily reports about connectedness, we illustrate how grid-sequence analysis can be used to identify a typology of intrafamily dynamics and to inform theory about how specific types of intrafamily dynamics contribute to adolescent behavior problems and family members' mental health. Methodologically, grid-sequence analysis extends the toolbox of techniques for analysis of family experience sampling and daily diary data. Substantively, we identify patterns of family level microdynamics that may serve as new markers of risk/protective factors and potential points for intervention in families. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
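A hedged sketch of the two ingredients named above, grid coding of dyadic reports followed by a sequence dissimilarity; the bin edges, rating scale, and mismatch-rate dissimilarity are placeholders (published grid-sequence work typically uses optimal-matching edit distances before clustering dyads into a typology).

import numpy as np

def grid_sequence(parent, adolescent, edges):
    # Code each day's dyadic report as one cell of a state-space grid,
    # yielding a categorical sequence for the dyad.
    p = np.digitize(parent, edges)
    a = np.digitize(adolescent, edges)
    return p * (len(edges) + 1) + a                  # one integer cell label per day

def dissimilarity(seq_a, seq_b):
    # Positional mismatch rate between two equal-length day sequences.
    seq_a, seq_b = np.asarray(seq_a), np.asarray(seq_b)
    return float(np.mean(seq_a != seq_b))

# Hypothetical dyad: 7 daily connectedness ratings on a 1-9 scale, split into
# low/medium/high bins to form a 3 x 3 state-space grid.
seq = grid_sequence([3, 4, 8, 8, 2, 6, 7], [2, 5, 9, 8, 3, 6, 6], edges=[4, 7])

Pairwise dissimilarities computed this way (or with edit distances) can then be clustered to recover the kind of typology of intrafamily dynamics described above.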
From grid cells to place cells with realistic field sizes
2017-01-01
While grid cells in the medial entorhinal cortex (MEC) of rodents have multiple, regularly arranged firing fields, place cells in the cornu ammonis (CA) regions of the hippocampus mostly have single spatial firing fields. Since there are extensive projections from MEC to the CA regions, many models have suggested that a feedforward network can transform grid cell firing into robust place cell firing. However, these models generate place fields that are consistently too small compared to those recorded in experiments. Here, we argue that it is implausible that grid cell activity alone can be transformed into place cells with robust place fields of realistic size in a feedforward network. We propose two solutions to this problem. Firstly, weakly spatially modulated cells, which are abundant throughout EC, provide input to downstream place cells along with grid cells. This simple model reproduces many place cell characteristics as well as results from lesion studies. Secondly, the recurrent connections between place cells in the CA3 network generate robust and realistic place fields. Both mechanisms could work in parallel in the hippocampal formation and this redundancy might account for the robustness of place cell responses to a range of disruptions of the hippocampal circuitry. PMID:28750005
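To make the feedforward transformation under discussion concrete, the sketch below sums idealized grid-cell rate maps and applies a threshold; the three-plane-wave rate model, spacings, and random weights are standard idealizations rather than the authors' simulations, and the paper's point is precisely that this construction tends to yield unrealistically small place fields.

import numpy as np

def grid_rate_map(x, y, spacing, orientation, phase):
    # Idealized grid-cell rate map: sum of three plane waves 60 degrees apart,
    # rectified; "spacing" is the grid period and "phase" a 2-D spatial offset.
    rate = np.zeros_like(x)
    for k in range(3):
        th = orientation + k * np.pi / 3.0
        freq = 4.0 * np.pi / (np.sqrt(3.0) * spacing)
        rate += np.cos(freq * ((x - phase[0]) * np.cos(th) + (y - phase[1]) * np.sin(th)))
    return np.maximum(rate, 0.0)

# Feedforward sum of many grid inputs followed by a threshold.
xs, ys = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
rng = np.random.default_rng(2)
summed = sum(rng.random() * grid_rate_map(xs, ys, s, o, rng.random(2))
             for s in (0.30, 0.42, 0.59) for o in rng.uniform(0, np.pi / 3, 4))
place_like = np.maximum(summed - np.percentile(summed, 90), 0.0)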
El Mansouri, Bouchra; Amarir, Fatima; Hajli, Yamina; Fellah, Hajiba; Sebti, Faiza; Delouane, Bouchra; Sadak, Abderrahim; Adlaoui, El Bachir; Rhajaoui, Mohammed
2017-01-01
The aim of our study was to assess a standardized supervisory grid as a new supervision tool for use in leishmaniasis laboratories. We conducted a pilot trial to evaluate the ongoing performance of seven provincial laboratories, in four provinces in Morocco, over the period between 2006 and 2014. This study details the situation in provincial laboratories before and after the implementation of the supervisory grid. A total of twenty-one grids were analyzed. In 2006, the results clearly showed poor performance of the laboratories: need for training (41.6%), staff performing skin biopsy (25%), shortage of materials and reagents (65%), and non-compliant documentation and local management (85%). Several corrective actions were conducted by the National Reference Laboratory (LNRL) of Leishmaniasis during the study period. In 2014, the LNRL recorded a net improvement in the performance of the laboratories. The need for training, the quality of biopsies, and the supply of tools and reagents were addressed, and effective coordination was established between the LNRL and the provincial laboratories. This trial shows the effectiveness of the grid as a high-quality supervisory tool and as a cornerstone for making progress in control programs against leishmaniases.
Working with interpreters: The challenges of introducing Option Grid patient decision aids.
Wood, Fiona; Phillips, Katie; Edwards, Adrian; Elwyn, Glyn
2017-03-01
We aimed to observe how an Option Grid™ decision aid for clinical encounters might be used where an interpreter is present, and to assess the impact of its use on shared decision making. Data were available from three clinical consultations between patient, clinician (a physiotherapist), and interpreter about knee osteoarthritis. Clinicians were trained in the use of an Option Grid decision aid and the tool was used. Consultations were audio-recorded, transcribed, and translated by independent translators into English. Analysis revealed the difficulties with introducing a written decision aid into an interpreted consultation. The extra discussion needed between the clinician and interpreter around the principles and purpose of shared decision making and instructions regarding the Option Grid decision aid proved challenging and difficult to manage. Discussion of treatment options while using an Option Grid decision aid was predominantly done between clinician and interpreter. The patient appeared to have little involvement in discussion of treatment options. Patients were not active participants within the discussion. Further work needs to be done on how shared decision making can be achieved within interpreted consultations. Option Grid decision aids are not being used as intended in interpreted consultations. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
A novel high nitrogen nickel-free coronary stents system: evaluation in a porcine model.
Zhang, Bin; Chen, Ming; Zheng, Bo; Wang, Xin Gang; Wang, Xi Ting; Fan, Yuan Yuan; Huo, Yong
2014-04-01
To study the safety of the novel high nitrogen nickel-free austenitic stainless steel bare metal stents (BMS) in a recognized porcine coronary model and to select the better grid structure. Three types of stents were randomly implanted in different coronary arteries of the same pig: 316 L stainless steel BMS (316 L-BMS) (n=12), novel high nitrogen nickel-free stents Grid A (NF-A-BMS) (n=12), and novel high nitrogen nickel-free stents Grid B (NF-B-BMS) (n=12). In total, eighteen animals underwent successful random placement of 36 oversized stents in the coronary arteries. Coronary angiography was performed 36 d after stent implantation. Nine animals each were sacrificed after 14 d and 36 d for histomorphologic analysis. Quantitative coronary angiography (QCA) showed similar luminal loss (LL) in the three groups: (0.21 ± 0.17) mm for 316 L-BMS, (0.16 ± 0.12) mm for NF-A-BMS, and (0.24 ± 0.15) mm for NF-B-BMS (P>0.05). Histomorphometric analysis after 15 d and 36 d revealed that there was also no significant difference among the three groups in neointimal area (NA), with similar injury scores. High magnification histomorphologic examination showed similar inflammation scores in the three groups, but the NF-A-BMS group had poorer endothelialization scores compared with the NF-B-BMS group, 2.00 ± 0.63 vs. 2.83 ± 0.41 (P=0.015) at 15 d, which was also confirmed by scanning electron microscopy. However, this difference was no longer observed at 36 d. The novel NF-BMS showed safety similar to 316 L-BMS during the short-term study. NF-B-BMS had better endothelialization than NF-A-BMS, which may be owing to its specific strut units. Copyright © 2014 The Editorial Board of Biomedical and Environmental Sciences. Published by China CDC. All rights reserved.
SU-F-T-513: Dosimetric Validation of Spatially Fractionated Radiotherapy Using Gel Dosimetry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Papanikolaou, P; Watts, L; Kirby, N
2016-06-15
Purpose: Spatially fractionated radiation therapy, also known as GRID therapy, is used to treat large solid tumors by irradiating the target with a single dose of 10–20 Gy through spatially distributed beamlets. We have investigated the use of a 3D gel for dosimetric characterization of GRID therapy. Methods: GRID therapy is an external beam analog of volumetric brachytherapy, whereby we produce a distribution of hot and cold dose columns inside the tumor volume. Such a distribution can be produced with a block or by using a checker-like pattern with the MLC. We have studied both types of GRID delivery. A cube-shaped acrylic phantom was filled with polymer gel and served as a 3D dosimeter. The phantom was scanned and the CT images were used to produce two plans in Pinnacle, one with the grid block and one with the MLC-defined grid. A 6 MV beam was used for the plan with a prescription of 1500 cGy at dmax. The irradiated phantom was scanned in a 3T MRI scanner. Results: 3D dose maps were derived from the MR scans of the gel dosimeter and were found to be in good agreement with the predicted dose distribution from the RTP system. Gamma analysis showed a passing rate of 93% for 5% dose and 2 mm DTA scoring criteria. Both relative and absolute dose profiles are in good agreement, except in the peripheral beamlets where the gel measured a slightly higher dose, possibly because of changing head scatter conditions that the RTP is not fully accounting for. Our results have also been benchmarked against ionization chamber measurements. Conclusion: We have investigated the use of a polymer gel for the 3D dosimetric characterization and evaluation of GRID therapy. Our results demonstrated that the planning system can predict fairly accurately the dose distribution for GRID-type therapy.
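For readers unfamiliar with the 5% dose / 2 mm DTA criterion quoted above, here is a brute-force 2-D gamma pass-rate sketch in Python; the wrap-around shifting, global dose normalization, and 10% low-dose threshold are simplifications, not the analysis software actually used in this work.

import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm, dd=0.05, dta_mm=2.0, cutoff=0.10):
    # Brute-force global gamma: for every shift within the DTA search radius,
    # combine squared dose difference (relative to the maximum reference dose)
    # with squared distance, then keep the minimum per pixel.
    dmax = ref.max()
    search = int(np.ceil(dta_mm / spacing_mm)) + 1
    gamma_sq = np.full(ref.shape, np.inf)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            dist_sq = (dy ** 2 + dx ** 2) * spacing_mm ** 2
            shifted = np.roll(np.roll(meas, dy, axis=0), dx, axis=1)  # wraps at edges
            dose_sq = ((shifted - ref) / (dd * dmax)) ** 2
            gamma_sq = np.minimum(gamma_sq, dose_sq + dist_sq / dta_mm ** 2)
    gamma = np.sqrt(gamma_sq)
    evaluated = ref > cutoff * dmax                  # ignore the low-dose region
    return 100.0 * np.mean(gamma[evaluated] <= 1.0)  # percent of points passing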
Grid cells on steeply sloping terrain: evidence for planar rather than volumetric encoding
Hayman, Robin M. A.; Casali, Giulio; Wilson, Jonathan J.; Jeffery, Kate J.
2015-01-01
Neural encoding of navigable space involves a network of structures centered on the hippocampus, whose neurons –place cells – encode current location. Input to the place cells includes afferents from the entorhinal cortex, which contains grid cells. These are neurons expressing spatially localized activity patches, or firing fields, that are evenly spaced across the floor in a hexagonal close-packed array called a grid. It is thought that grids function to enable the calculation of distances. The question arises as to whether this odometry process operates in three dimensions, and so we queried whether grids permeate three-dimensional (3D) space – that is, form a lattice – or whether they simply follow the environment surface. If grids form a 3D lattice then this lattice would ordinarily be aligned horizontally (to explain the usual hexagonal pattern observed). A tilted floor would transect several layers of this putative lattice, resulting in interruption of the hexagonal pattern. We model this prediction with simulated grid lattices, and show that the firing of a grid cell on a 40°-tilted surface should cover proportionally less of the surface, with smaller field size, fewer fields, and reduced hexagonal symmetry. However, recording of real grid cells as animals foraged on a 40°-tilted surface found that firing of grid cells was almost indistinguishable, in pattern or rate, from that on the horizontal surface, with if anything increased coverage and field number, and preserved field size. It thus appears unlikely that the sloping surface transected a lattice. However, grid cells on the slope displayed slightly degraded firing patterns, with reduced coherence and slightly reduced symmetry. These findings collectively suggest that the grid cell component of the metric representation of space is not fixed in absolute 3D space but is influenced both by the surface the animal is on and by the relationship of this surface to the horizontal, supporting the hypothesis that the neural map of space is “multi-planar” rather than fully volumetric. PMID:26236245
NASA Astrophysics Data System (ADS)
Vidmar, David; Narayan, Sanjiv M.; Krummen, David E.; Rappel, Wouter-Jan
2016-11-01
We present a general method of utilizing bioelectric recordings from a spatially sparse electrode grid to compute a dynamic vector field describing the underlying propagation of electrical activity. This vector field, termed the wave-front flow field, permits quantitative analysis of the magnitude of rotational activity (vorticity) and focal activity (divergence) at each spatial point. We apply this method to signals recorded during arrhythmias in human atria and ventricles using a multipolar contact catheter and show that the flow fields correlate with corresponding activation maps. Further, regions of elevated vorticity and divergence correspond to sites identified as clinically significant rotors and focal sources where therapeutic intervention can be effective. These flow fields can provide quantitative insights into the dynamics of normal and abnormal conduction in humans and could potentially be used to enhance therapies for cardiac arrhythmias.
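A minimal numpy sketch of the divergence and vorticity computation implied above; the grid spacings and array conventions are assumptions, and the interpolation of the sparse electrode grid onto a regular grid (part of the described method) is omitted.

import numpy as np

def flow_field_metrics(vx, vy, dx=1.0, dy=1.0):
    # vx, vy: 2-D arrays of the wave-front flow vector at each grid point
    # (rows indexed by y, columns by x); dx, dy: grid spacing.
    dvx_dy, dvx_dx = np.gradient(vx, dy, dx)
    dvy_dy, dvy_dx = np.gradient(vy, dy, dx)
    divergence = dvx_dx + dvy_dy       # elevated near focal sources
    vorticity = dvy_dx - dvx_dy        # large magnitude near rotors
    return divergence, vorticity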
Kessler, Richard; Strain, R.E.; Marlowe, J. I.; Currin, K.B.
1996-01-01
A ground-penetrating radar survey was conducted at the Monroe Crossroads Battlefield site at Fort Bragg, North Carolina, to determine possible locations of subsurface archaeological features. An electromagnetic survey also was conducted at the site to verify and augment the ground-penetrating radar data. The surveys were conducted over a 67,200-square-foot grid with a grid point spacing of 20 feet. During the ground-penetrating radar survey, 87 subsurface anomalies were detected based on visual inspection of the field records. These anomalies were flagged in the field as they appeared on the ground-penetrating radar records and were located by a land survey. The electromagnetic survey produced two significant readings at ground-penetrating radar anomaly locations. The National Park Service excavated 44 of the 87 anomaly locations at the Civil War battlefield site. Four of these excavations produced significant archaeological features, including one at an abandoned well.
Aristizabal, F; Nieto, J; Yamout, S; Snyder, J
2014-07-01
Obesity and gastric ulceration are highly prevalent in horses. Management modifications for preventing squamous gastric ulceration include frequent feeding and free access to pasture; however, these practices may predispose horses to obesity. To compare the percentage of hay consumed, intragastric pH and horse activity between feeding from the ground and a hay grid feeder. Crossover experimental study. A pH electrode was inserted into the stomach to record the intragastric pH for 48 h. Horses received 1% of their body weight in grass hay twice a day. Horses were assigned to be fed from the ground or a commercial hay grid feeder for 24 h and then switched to the opposite protocol for an additional 24 h. Horses were continuously video-recorded and the percentage of time spent eating or drinking, walking or standing, and lying down were calculated. Two point data were compared by paired t test and pH over time was compared by repeated measures ANOVA. Horses consumed significantly greater amounts of grass hay when fed on the ground compared with a hay grid feeder (n = 9; P<0.001). There were no significant differences between the groups for mean intragastric pH values (n = 6; P = 0.97), mean intragastric pH over time (n = 6; P = 0.45) the length of time the pH was below 4.0 (n = 6; P = 0.54), and the percentage of time horses spent eating or drinking (n = 9; P = 0.52), walking or standing (n = 9; P = 0.3), or lying down (n = 9; P = 0.4). Within each group horses spent more time eating during the day compared with the night (n = 9; hay grid feeder P = 0.003; ground feeding P = 0.007). The hay grid feeder studied may be used to reduce the amount of hay ingested by horses without reducing the time horses spend eating. © 2013 EVJ Ltd.
Perrin, Maxine; Robillard, Manon; Roy-Charland, Annie
2017-12-01
This study examined eye movements during a visual search task as well as cognitive abilities within three age groups. The aim was to explore scanning patterns across symbol grids and to better understand the impact of symbol location in AAC displays on speed and accuracy of symbol selection. For the study, 60 students were asked to locate a series of symbols on 16 cell grids. The EyeLink 1000 was used to measure eye movements, accuracy, and response time. Accuracy was high across all cells. Participants had faster response times, longer fixations, and more frequent fixations on symbols located in the middle of the grid. Group comparisons revealed significant differences for accuracy and reaction times. The Leiter-R was used to evaluate cognitive abilities. Sustained attention and cognitive flexibility scores predicted the participants' reaction time and accuracy in symbol selection. Findings suggest that symbol location within AAC devices and individuals' cognitive abilities influence the speed and accuracy of retrieving symbols.
On non-parametric maximum likelihood estimation of the bivariate survivor function.
Prentice, R L
The likelihood function for the bivariate survivor function F, under independent censorship, is maximized to obtain a non-parametric maximum likelihood estimator F̂. F̂ may or may not be unique depending on the configuration of singly- and doubly-censored pairs. The likelihood function can be maximized by placing all mass on the grid formed by the uncensored failure times, or half lines beyond the failure time grid, or in the upper right quadrant beyond the grid. By accumulating the mass along lines (or regions) where the likelihood is flat, one obtains a partially maximized likelihood as a function of parameters that can be uniquely estimated. The score equations corresponding to these point mass parameters are derived, using a Lagrange multiplier technique to ensure unit total mass, and a modified Newton procedure is used to calculate the parameter estimates in some limited simulation studies. Some considerations for the further development of non-parametric bivariate survivor function estimators are briefly described.
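As a loose illustration of the constrained-maximization idea (unit total mass over point masses on the failure-time grid), the sketch below works through a simplified univariate analogue; it is not the paper's bivariate estimator or its modified Newton procedure.

```python
# Minimal univariate sketch (assumed example, not the paper's algorithm):
# maximize a likelihood over point masses on the grid of uncensored failure
# times, plus a tail mass beyond the grid, subject to unit total mass.
import numpy as np
from scipy.optimize import minimize

times  = np.array([2.0, 3.5, 3.5, 5.0, 6.2, 7.1])   # observed times (hypothetical)
events = np.array([1,   1,   0,   1,   0,   1])      # 1 = failure, 0 = censored

grid = np.unique(times[events == 1])                 # mass points: uncensored times
n_mass = len(grid) + 1                               # + one tail mass beyond the grid

def neg_loglik(p):
    ll = 0.0
    for t, d in zip(times, events):
        if d == 1:
            ll += np.log(p[np.searchsorted(grid, t)])
        else:                                        # censored: mass strictly beyond t
            ll += np.log(p[:-1][grid > t].sum() + p[-1])
    return -ll

cons = ({"type": "eq", "fun": lambda p: p.sum() - 1.0},)   # unit total mass
res = minimize(neg_loglik, np.full(n_mass, 1.0 / n_mass),
               bounds=[(1e-9, 1.0)] * n_mass, constraints=cons, method="SLSQP")
print(dict(zip(list(grid) + ["tail"], np.round(res.x, 3))))
```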
NASA Astrophysics Data System (ADS)
Crespi, Alice; Brunetti, Michele; Maugeri, Maurizio
2017-04-01
The availability of gridded high-resolution spatial climatologies and corresponding secular records has acquired increasing importance in recent years, both for research purposes and as decision-support tools in the management of natural resources and economic activities. High-resolution monthly precipitation climatologies for Italy were computed by gridding on a 30-arc-second-resolution Digital Elevation Model (DEM) the precipitation normals (1961-1990) obtained from a quality-controlled dataset of about 6200 stations covering the Italian surface and part of the Northern neighbouring regions. Starting from the assumption that the precipitation distribution is strongly influenced by orography, especially elevation, a local weighted linear regression (LWLR) of precipitation versus elevation was performed at each DEM cell. The regression coefficients for each cell were estimated from the stations with the highest weights, where the weights take into account both the distance and the level of orographic similarity between the station cells and the considered grid cell. An optimisation procedure was then set up in order to define, for each month and for each grid cell, the most suitable decreasing coefficients for the weighting factors which enter the LWLR scheme. The model was validated by comparison with the results provided by inverse distance weighting (IDW) applied both to station normals and to the residuals of a global regression of station normals versus elevation. In both cases, the LWLR leave-one-out reconstructions show the best agreement with the observed station normals, especially when considering specific station clusters (high-elevation sites, for example). After producing the high-resolution precipitation climatological field, the temporal component on the high-resolution grid was obtained by following the anomaly method. It is based on the assumption that the spatio-temporal structure of the signal of a meteorological variable over a certain area can be described by the superimposition of two independent fields: the climatologies and the anomalies, i.e. the departures from the normal values. The secular precipitation anomaly records were thus estimated for each cell of the grid by averaging the anomaly values of neighbouring stations, by means of Gaussian weighting functions, taking into account both the distance and the elevation differences between the stations and the considered grid cell. The local secular precipitation records were then obtained by multiplying the local estimated anomalies by the corresponding 1961-1990 normals. To compute the anomaly field, a different dataset was used, selecting the stations with the longest series and extending them both into the past, retrieving data from non-digitised archives, and to the more recent decades. In particular, after a careful procedure of updating, quality checking and homogenisation of the series, this methodology was applied to two Italian areas characterised by very different orography: the Sardinia region and the Alpine areas within the Adda basin.
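A minimal sketch of the LWLR idea for a single grid cell is given below; the Gaussian distance weighting, its decay constant and the station values are illustrative assumptions, and the orographic-similarity terms and the optimisation step described above are omitted.

```python
# Hedged sketch of a local weighted linear regression (LWLR) for one grid cell:
# station precipitation normals are regressed on station elevation, with
# Gaussian weights that decay with horizontal distance from the cell.
import numpy as np

def lwlr_cell_estimate(cell_xy, cell_elev, st_xy, st_elev, st_precip, d0=25e3):
    d = np.hypot(st_xy[:, 0] - cell_xy[0], st_xy[:, 1] - cell_xy[1])
    w = np.exp(-(d / d0) ** 2)                      # Gaussian distance weights
    X = np.column_stack([np.ones_like(st_elev), st_elev])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ st_precip)   # weighted least squares
    return beta[0] + beta[1] * cell_elev            # precipitation normal at the cell

# toy example: five stations around a cell at (0, 0) and 1200 m elevation
xy = np.array([[5e3, 2e3], [-8e3, 4e3], [12e3, -6e3], [-3e3, -9e3], [20e3, 15e3]])
elev = np.array([450.0, 900.0, 1500.0, 700.0, 1100.0])
precip = np.array([60.0, 85.0, 120.0, 72.0, 95.0])   # monthly normals, mm
print(round(lwlr_cell_estimate((0.0, 0.0), 1200.0, xy, elev, precip), 1))
```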
VP Structure of Mount St. Helens, Washington, USA, imaged with local earthquake tomography
Waite, G.P.; Moran, S.C.
2009-01-01
We present a new P-wave velocity model for Mount St. Helens using local earthquake data recorded by the Pacific Northwest Seismograph Stations and Cascades Volcano Observatory since the 18 May 1980 eruption. These data were augmented with records from a dense array of 19 temporary stations deployed during the second half of 2005. Because the distribution of earthquakes in the study area is concentrated beneath the volcano and within two nearly linear trends, we used a graded inversion scheme to compute a coarse-grid model that focused on the regional structure, followed by a fine-grid inversion to improve spatial resolution directly beneath the volcanic edifice. The coarse-grid model results are largely consistent with earlier geophysical studies of the area; we find high-velocity anomalies NW and NE of the edifice that correspond with igneous intrusions and a prominent low-velocity zone NNW of the edifice that corresponds with the linear zone of high seismicity known as the St. Helens Seismic Zone. This low-velocity zone may continue past Mount St. Helens to the south at depths below 5 km. Directly beneath the edifice, the fine-grid model images a low-velocity zone between about 2 and 3.5 km below sea level that may correspond to a shallow magma storage zone. And although the model resolution is poor below about 6 km, we found low velocities that correspond with the aseismic zone between about 5.5 and 8 km that has previously been modeled as the location of a large magma storage volume. © 2009 Elsevier B.V.
Jiménez-García, Brian; Pons, Carles; Fernández-Recio, Juan
2013-07-01
pyDockWEB is a web server for the rigid-body docking prediction of protein-protein complex structures using a new version of the pyDock scoring algorithm. We use here a new custom parallel FTDock implementation, with adjusted grid size for optimal FFT calculations, and a new version of pyDock, which dramatically speeds up calculations while keeping the same predictive accuracy. Given the 3D coordinates of two interacting proteins, pyDockWEB returns the best docking orientations as scored mainly by electrostatics and desolvation energy. The server does not require registration by the user and is freely accessible for academics at http://life.bsc.es/servlet/pydock. Supplementary data are available at Bioinformatics online.
Damage mapping in structural health monitoring using a multi-grid architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mathews, V. John
2015-03-31
This paper presents a multi-grid architecture for tomography-based damage mapping of composite aerospace structures. The system employs an array of piezo-electric transducers bonded on the structure. Each transducer may be used as an actuator as well as a sensor. The structure is excited sequentially using the actuators and the guided waves arriving at the sensors in response to the excitations are recorded for further analysis. The sensor signals are compared to their baseline counterparts and a damage index is computed for each actuator-sensor pair. These damage indices are then used as inputs to the tomographic reconstruction system. Preliminary damage maps are reconstructed on multiple coordinate grids defined on the structure. These grids are shifted versions of each other where the shift is a fraction of the spatial sampling interval associated with each grid. These preliminary damage maps are then combined to provide a reconstruction that is more robust to measurement noise in the sensor signals and the ill-conditioned problem formulation for single-grid algorithms. Experimental results on a composite structure with complexity that is representative of aerospace structures included in the paper demonstrate that for sufficiently high sensor densities, the algorithm of this paper is capable of providing damage detection and characterization with accuracy comparable to traditional C-scan and A-scan-based ultrasound non-destructive inspection systems quickly and without human supervision.
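The fusion step can be pictured with the sketch below, which averages preliminary maps defined on fractionally shifted grids after resampling them to a common fine grid; the grid sizes and the random stand-in maps are assumptions, not the paper's reconstruction.

```python
# Illustrative sketch of the multi-grid idea: preliminary damage maps are
# reconstructed on grids shifted by a fraction of the grid spacing, then
# resampled to a common fine grid and averaged.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

nx, ny, dx = 20, 20, 10.0                         # coarse grid (mm spacing, assumed)
shifts = [(0.0, 0.0), (0.5 * dx, 0.0), (0.0, 0.5 * dx), (0.5 * dx, 0.5 * dx)]

rng = np.random.default_rng(0)
fine_x = np.linspace(0, (nx - 1) * dx, 4 * nx)    # common fine grid for fusion
fine_y = np.linspace(0, (ny - 1) * dx, 4 * ny)
FX, FY = np.meshgrid(fine_x, fine_y, indexing="ij")

fused = np.zeros_like(FX)
for sx, sy in shifts:
    x = np.arange(nx) * dx + sx                   # shifted grid coordinates
    y = np.arange(ny) * dx + sy
    prelim_map = rng.random((nx, ny))             # stand-in for a tomographic map
    interp = RegularGridInterpolator((x, y), prelim_map,
                                     bounds_error=False, fill_value=None)
    fused += interp(np.column_stack([FX.ravel(), FY.ravel()])).reshape(FX.shape)

fused /= len(shifts)                              # averaged, more robust damage map
print(fused.shape, float(fused.mean()))
```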
Automatic measurement of skin textures of the dorsal hand in evaluating skin aging.
Gao, Qian; Yu, Jiaming; Wang, Fang; Ge, Tiantian; Hu, Liwen; Liu, Yang
2013-05-01
Changes in skin textures have been used to evaluate skin aging in many studies. In our previous study, we built some skin texture parameters, which can be used to evaluate skin aging of the human dorsal hand. However, obtaining this information from digital skin images by manual work is time-consuming and laborious. We therefore aimed to build a simple and effective method to automatically count some of those skin texture parameters by using digital image-processing technology. A total of 100 subjects aged 30 years and above were involved. Sun exposure history and demographic information were collected by using a questionnaire. The skin image of subjects' dorsal hand was obtained by using a portable skin detector. The number of grids, which is one of the skin texture parameters built in our previous study, was measured manually and automatically. An automated image analysis program was developed using Matlab 7.1 software. The number of grids counted automatically (NGA) was significantly correlated with the number of grids counted manually (NGM) (r = 0.9287, P < 0.0001). In each age group, there were no significant differences between NGA and NGM. The NGA was negatively correlated with age and lifetime sun exposure, and decreased with increasing Beagley-Gibson score from 3 to 6. In addition, even after adjusting for NGA, the standard deviation of grid areas for each image was positively correlated with age, sun exposure, and Beagley-Gibson score. The method introduced in the present study can be used to measure some skin aging parameters automatically and objectively. It will save time, reduce labor, and avoid measurement errors between different investigators when evaluating large numbers of skin images in a short time. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.
Bioacoustic Signal Classification in Cat Auditory Cortex
1991-06-14
Studies Preparations for the setup to record from awake animals in a behavioral setting were initiated with the help of Dr. William Jenkins, our...temporal muscle over the right hemisphere was then retracted and the lateral cortex exposed by a craniotomy. The dura overlaying the middle ectosylvian...sites. For recording topographically identified single neurons, a wire mesh was placed over the craniotomy and the space between the grid and cortex was
Guidelines for Documenting Historic Military Structures
1994-03-01
Field records are incorporated into the field notebook. They are customarily recorded on grid vellum paper with graphite or ink, and must be legibly...concentration, linkage, or continuity of sites, buildings, structures, or objects united by past events or aesthetically by plan or physical development...into the slots precut into the mount cards.) Color negatives and prints, as well as black and white prints on resin-coated paper, are not acceptable
NASA Technical Reports Server (NTRS)
Peng, G.; Meier, W. N.; Scott, D. J.; Savoie, M. H.
2013-01-01
A long-term, consistent, and reproducible satellite-based passive microwave sea ice concentration climate data record (CDR) is available for climate studies, monitoring, and model validation with an initial operation capability (IOC). The daily and monthly sea ice concentration data are on the National Snow and Ice Data Center (NSIDC) polar stereographic grid with nominal 25 km × 25 km grid cells in both the Southern and Northern Hemisphere polar regions from 9 July 1987 to 31 December 2007. The data files are available in the NetCDF data format at http://nsidc.org/data/g02202.html and archived by the National Climatic Data Center (NCDC) of the National Oceanic and Atmospheric Administration (NOAA) under the satellite climate data record program (http://www.ncdc.noaa.gov/cdr/operationalcdrs.html). The description and basic characteristics of the NOAA/NSIDC passive microwave sea ice concentration CDR are presented here. The CDR provides similar spatial and temporal variability as the heritage products to the user communities with the additional documentation, traceability, and reproducibility that meet current standards and guidelines for climate data records. The data set, along with detailed data processing steps and error source information, can be found at http://dx.doi.org/10.7265/N5B56GN3.
NASA Astrophysics Data System (ADS)
Gilgen, H.; Roesch, A.; Wild, M.; Ohmura, A.
2009-05-01
Decadal changes in shortwave irradiance at the Earth's surface are estimated for the period from approximately 1960 through to 2000 from pyranometer records stored in the Global Energy Balance Archive. For this observational period, estimates could be calculated for a total of 140 cells of the International Satellite Cloud Climatology Project grid (an equal area 2.5° × 2.5° grid at the equator) using regression models allowing for station effects. In large regions worldwide, shortwave irradiance decreases in the first half of the observational period, recovers from the decrease in the 1980s, and thereafter increases, in line with previous reports. Years of trend reversals are determined for the grid cells which are best described with a second-order polynomial model. This reversal of the trend is observed in the majority of the grid cells in the interior of Europe and in Japan. In China, shortwave irradiance recovers during the 1990s in the majority of the grid cells in the southeast and northeast from the decrease observed in the period from 1960 through to 1990. A reversal of the trend in the 1980s or early 1990s is also observed for two grid cells in North America, and for the grid cells containing the Kuala Lumpur (Malaysia), Singapore, Casablanca (Morocco), Valparaiso (Chile) sites, and, noticeably, the remote South Pole and American Samoa sites. Negative trends persist, i.e., shortwave radiation decreases, for the observational period 1960 through to 2000 at the European coasts, in central and northwest China, and for three grid cells in India and two in Africa.
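The trend-reversal estimate for a single grid cell can be illustrated as below, where a second-order polynomial is fitted to an annual series and the vertex year is taken as the reversal; the series is synthetic rather than GEBA data.

```python
# Hedged sketch: fit a second-order polynomial to an annual shortwave
# irradiance series for one grid cell and take the vertex as the year of
# trend reversal. The series below is synthetic, not pyranometer data.
import numpy as np

years = np.arange(1960, 2001)
ssr = 180 - 0.4 * (years - 1960) + 0.012 * (years - 1960) ** 2 \
      + np.random.default_rng(1).normal(0, 1.5, years.size)   # W m-2, synthetic

c2, c1, c0 = np.polyfit(years, ssr, deg=2)        # second-order polynomial model
reversal_year = -c1 / (2 * c2)                    # vertex of the parabola
print(f"estimated trend reversal around {reversal_year:.0f}")
```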
Comparing Correlations between Four-Quadrant and Five-Factor Personality Assessments
ERIC Educational Resources Information Center
Jones, Cathleen S.; Hartley, Nell T.
2013-01-01
For decades,some of the most popular devices used in educating students and employees to the values of diversity are those that are based on a four-grid identification of behavior style. The results from the scoring of the instruments provide individual profiles in terms of a person's assertiveness, responsiveness, and preferred tone of…
Score-Informed Musical Source Separation and Reconstruction
ERIC Educational Resources Information Center
Han, Yushen
2013-01-01
A systematic approach to retrieve individual parts in a monaural music recording with its score is introduced. We are interested in isolating the accompaniment part by removing the solo part from a recording of concerto music in which a solo instrument is accompanied by an orchestra. We require the music audio, the score, and optionally a sample…
Energy dependent features of X-ray signals in a GridPix detector
NASA Astrophysics Data System (ADS)
Krieger, C.; Kaminski, J.; Vafeiadis, T.; Desch, K.
2018-06-01
We report on the calibration of an argon/isobutane (97.7%/2.3%)-filled GridPix detector with soft X-rays (277 eV to 8 keV) using the variable energy X-ray source of the CAST Detector Lab at CERN. We study the linearity and energy resolution of the detector using both the number of pixels hit and the total measured charge as energy measures. For the latter, the energy resolution σE / E is better than 10% (20%) for energies above 2 keV (0.5 keV). Several characteristics of the recorded events are studied.
NOTE: MCDE: a new Monte Carlo dose engine for IMRT
NASA Astrophysics Data System (ADS)
Reynaert, N.; DeSmedt, B.; Coghe, M.; Paelinck, L.; Van Duyse, B.; DeGersem, W.; DeWagter, C.; DeNeve, W.; Thierens, H.
2004-07-01
A new accurate Monte Carlo code for IMRT dose computations, MCDE (Monte Carlo dose engine), is introduced. MCDE is based on BEAMnrc/DOSXYZnrc and consequently the accurate EGSnrc electron transport. DOSXYZnrc is reprogrammed as a component module for BEAMnrc. In this way both codes are interconnected elegantly, while maintaining the BEAM structure, and only minimal changes to BEAMnrc.mortran are necessary. The treatment head of the Elekta SLiplus linear accelerator is modelled in detail. CT grids consisting of up to 200 slices of 512 × 512 voxels can be introduced and up to 100 beams can be handled simultaneously. The beams and CT data are imported from the treatment planning system GRATIS via a DICOM interface. To enable the handling of up to 50 × 10^6 voxels, the system was programmed in Fortran 95 with dynamic memory management. All region-dependent arrays (dose, statistics, transport arrays) were redefined. A scoring grid was introduced and superimposed on the geometry grid to limit the number of scoring voxels. The whole system uses approximately 200 MB of RAM and runs on a PC cluster consisting of 38 1.0 GHz processors. A set of in-house scripts handles the parallelization and the centralization of the Monte Carlo calculations on a server. As an illustration of MCDE, a clinical example is discussed and compared with collapsed cone convolution calculations. At present, the system is still rather slow and is intended to be a tool for reliable verification of IMRT treatment planning in the presence of tissue inhomogeneities such as air cavities.
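The separation between geometry and scoring grids can be sketched as a simple index mapping, as below; the grid dimensions and the scoring function are assumptions for illustration, not MCDE code.

```python
# Illustrative sketch (not MCDE source code): map fine geometry-grid voxel
# indices to a coarser scoring grid so that dose can be accumulated in far
# fewer scoring voxels than CT voxels.
import numpy as np

geom_shape = (512, 512, 200)       # CT geometry grid (voxels)
score_shape = (64, 64, 50)         # coarser scoring grid (assumed size)
factor = np.array(geom_shape) // np.array(score_shape)

score_dose = np.zeros(score_shape)

def score(ix, iy, iz, edep):
    """Accumulate an energy deposition from a geometry voxel into its scoring voxel."""
    sx, sy, sz = ix // factor[0], iy // factor[1], iz // factor[2]
    score_dose[sx, sy, sz] += edep

score(300, 120, 77, 0.35)          # hypothetical energy deposition event
print(score_dose.max(), score_dose.nbytes / 1e6, "MB")
```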
Misclassification of OSA severity with automated scoring of home sleep recordings.
Aurora, R Nisha; Swartz, Rachel; Punjabi, Naresh M
2015-03-01
The advent of home sleep testing has allowed for the development of an ambulatory care model for OSA that most health-care providers can easily deploy. Although automated algorithms that accompany home sleep monitors can identify and classify disordered breathing events, it is unclear whether manual scoring followed by expert review of home sleep recordings is of any value. Thus, this study examined the agreement between automated and manual scoring of home sleep recordings. Two type 3 monitors (ApneaLink Plus [ResMed] and Embletta [Embla Systems]) were examined in distinct study samples. Data from manual and automated scoring were available for 200 subjects. Two thresholds for oxygen desaturation (≥ 3% and ≥ 4%) were used to define disordered breathing events. Agreement between manual and automated scoring was examined using Pearson correlation coefficients and Bland-Altman analyses. Automated scoring consistently underscored disordered breathing events compared with manual scoring for both sleep monitors irrespective of whether a ≥ 3% or ≥ 4% oxygen desaturation threshold was used to define the apnea-hypopnea index (AHI). For the ApneaLink Plus monitor, Bland-Altman analyses revealed an average AHI difference between manual and automated scoring of 6.1 (95% CI, 4.9-7.3) and 4.6 (95% CI, 3.5-5.6) events/h for the ≥ 3% and ≥ 4% oxygen desaturation thresholds, respectively. Similarly for the Embletta monitor, the average difference between manual and automated scoring was 5.3 (95% CI, 3.2-7.3) and 8.4 (95% CI, 7.2-9.6) events/h, respectively. Although agreement between automated and manual scoring of home sleep recordings varies based on the device used, modest agreement was observed between the two approaches. However, manual review of home sleep test recordings can decrease the misclassification of OSA severity, particularly for those with mild disease. ClinicalTrials.gov; No.: NCT01503164; www.clinicaltrials.gov.
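A minimal sketch of the Bland-Altman comparison reported here is shown below, computing the mean manual-minus-automated AHI difference and its 95% limits of agreement from invented values.

```python
# Hedged sketch of a Bland-Altman comparison: mean difference and 95% limits
# of agreement between manually and automatically scored AHI.
# The AHI values are invented for illustration.
import numpy as np

manual_ahi    = np.array([12.0, 25.3, 8.4, 40.1, 15.6, 5.2, 33.0, 19.8])
automated_ahi = np.array([ 7.5, 20.1, 4.0, 31.2, 10.9, 3.1, 27.4, 14.3])

diff = manual_ahi - automated_ahi
bias = diff.mean()                               # average AHI difference
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
print(f"bias = {bias:.1f} events/h, 95% limits of agreement = "
      f"({loa[0]:.1f}, {loa[1]:.1f})")
```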
Exploring the use of Option Grid™ patient decision aids in a sample of clinics in Poland.
Scalia, Peter; Elwyn, Glyn; Barr, Paul; Song, Julia; Zisman-Ilani, Yaara; Lesniak, Monika; Mullin, Sarah; Kurek, Krzysztof; Bushell, Matt; Durand, Marie-Anne
2018-05-29
Research on the implementation of patient decision aids to facilitate shared decision making in clinical settings has steadily increased across Western countries. A study which implements decision aids and measures their impact on shared decision making has yet to be conducted in the Eastern part of Europe. To study the use of Option Grid™ patient decision aids in a sample of Grupa LUX MED clinics in Warsaw, Poland, and measure their impact on shared decision making. We conducted a pre-post interventional study. Following a three-month period of usual care, clinicians from three Grupa LUX MED clinics received a one-hour training session on how to use three Option Grid™ decision aids and were provided with copies for use for four months. Throughout the study, all eligible patients were asked to complete the three-item CollaboRATE patient-reported measure of shared decision making after their clinical encounter. CollaboRATE enables patients to assess the efforts clinicians make to: (i) inform them about their health issues; (ii) listen to 'what matters most'; (iii) integrate their treatment preference in future plans. A hierarchical logistic regression model was performed to understand which variables had an effect on CollaboRATE. 2,048 patients participated in the baseline phase; 1,889 patients participated in the intervention phase. Five of the thirteen study clinicians had a statistically significant increase in their CollaboRATE scores (p < .05) when comparing the baseline phase to the intervention phase. All five clinicians were located at the same clinic, the only clinic where an overall increase (non-significant) in the mean CollaboRATE top score percentage occurred from the baseline phase (M = 60%, SD = 0.49; 95% CI [57-63%]) to the intervention phase (M = 62%, SD = 0.49; 95% CI [59-65%]). Only three of those five clinicians who had a statistically significant increase had a clinically significant difference. The implementation of Option Grid™ helped some clinicians practice shared decision making as reflected in CollaboRATE scores, but most clinicians did not have a significant increase in their scores. Our study indicates that the effect of these interventions may be dependent on clinic contexts and clinician engagement. Copyright © 2018. Published by Elsevier GmbH.
Methodological Caveats in the Detection of Coordinated Replay between Place Cells and Grid Cells
Trimper, John B.; Trettel, Sean G.; Hwaun, Ernie; Colgin, Laura Lee
2017-01-01
At rest, hippocampal “place cells,” neurons with receptive fields corresponding to specific spatial locations, reactivate in a manner that reflects recently traveled trajectories. These “replay” events have been proposed as a mechanism underlying memory consolidation, or the transfer of a memory representation from the hippocampus to neocortical regions associated with the original sensory experience. Accordingly, it has been hypothesized that hippocampal replay of a particular experience should be accompanied by simultaneous reactivation of corresponding representations in the neocortex and in the entorhinal cortex, the primary interface between the hippocampus and the neocortex. Recent studies have reported that coordinated replay may occur between hippocampal place cells and medial entorhinal cortex grid cells, cells with multiple spatial receptive fields. Assessing replay in grid cells is problematic, however, as the cells exhibit regularly spaced spatial receptive fields in all environments and, therefore, coordinated replay between place cells and grid cells may be detected by chance. In the present report, we adapted analytical approaches utilized in recent studies of grid cell and place cell replay to determine the extent to which coordinated replay is spuriously detected between grid cells and place cells recorded from separate rats. For a subset of the employed analytical methods, coordinated replay was detected spuriously in a significant proportion of cases in which place cell replay events were randomly matched with grid cell firing epochs of equal duration. More rigorous replay evaluation procedures and minimum spike count requirements greatly reduced the amount of spurious findings. These results provide insights into aspects of place cell and grid cell activity during rest that contribute to false detection of coordinated replay. The results further emphasize the need for careful controls and rigorous methods when testing the hypothesis that place cells and grid cells exhibit coordinated replay. PMID:28824388
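The chance-level control described above can be sketched as a simple shuffling procedure, as below; spike counts are synthetic and the statistic is a plain correlation rather than the studies' replay scores.

```python
# Illustrative sketch of a chance-level control: place-cell replay events are
# randomly matched with grid-cell firing epochs of equal duration, and a
# correlation statistic is recomputed to estimate how often "coordinated
# replay" would be detected by chance. All spike data here are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_events, epoch_len = 200, 50                     # replay events, time bins per epoch
place = rng.poisson(0.5, size=(n_events, epoch_len))   # binned place-cell spikes
grid  = rng.poisson(0.5, size=(n_events, epoch_len))   # binned grid-cell spikes

def event_corr(a, b):
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

observed = event_corr(place, grid)
null = np.array([event_corr(place, grid[rng.permutation(n_events)])
                 for _ in range(1000)])           # shuffle the event pairing
p_value = (np.sum(np.abs(null) >= abs(observed)) + 1) / (null.size + 1)
print(f"observed r = {observed:.3f}, shuffle p = {p_value:.3f}")
```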
Smith, Iain M; Naumann, David N; Guyver, Paul; Bishop, Jonathan; Davies, Simon; Lundy, Jonathan B; Bowley, Douglas M
2015-01-01
Anatomic measures of injury burden provide key information for studies of prehospital and in-hospital trauma care. The military version of the Abbreviated Injury Scale [AIS(M)] is used to score injuries in deployed military hospitals. Estimates of total trauma burden are derived from this. These scores are used for categorization of patients, assessment of care quality, and research studies. Scoring is normally performed retrospectively from chart review. We compared data recorded in the UK Joint Theatre Trauma Registry (JTTR) and scores calculated independently at the time of surgery by the operating surgeons to assess the concordance between surgeons and trauma nurse coordinators in assigning injury severity scores. Trauma casualties treated at a deployed Role 3 hospital were assigned AIS(M) scores by surgeons between 24 September 2012 and 16 October 2012. JTTR records from the same period were retrieved. The AIS(M), Injury Severity Score (ISS), and New Injury Severity Score (NISS) were compared between datasets. Among 32 matched casualties, 214 injuries were recorded in the JTTR, whereas surgeons noted 212. Percentage agreement for number of injuries was 19%. Surgeons scored 75 injuries as "serious" or greater compared with 68 in the JTTR. Percentage agreement for the maximum AIS(M), ISS, and NISS assigned to cases was 66%, 34%, and 28%, respectively, although the distributions of scores were not statistically different (median ISS: surgeons: 20 [interquartile range (IQR), 9-28] versus JTTR: 17.5 [IQR, 9-31.5], p = .7; median NISS: surgeons: 27 [IQR, 12-42] versus JTTR: 25.5 [IQR, 11.5-41], p = .7). There are discrepancies in the recording of AIS(M) between surgeons directly involved in the care of trauma casualties and trauma nurse coordinators working by retrospective chart review. Increased accuracy might be achieved by actively collaborating in this process.
Mobile health technology transforms injury severity scoring in South Africa.
Spence, Richard Trafford; Zargaran, Eiman; Hameed, S Morad; Navsaria, Pradeep; Nicol, Andrew
2016-08-01
The burden of data collection associated with injury severity scoring has limited its application in areas of the world with the highest incidence of trauma. Since January 2014, electronic records (electronic Trauma Health Records [eTHRs]) replaced all handwritten records at the Groote Schuur Hospital Trauma Unit in South Africa. Data fields required for Glasgow Coma Scale, Revised Trauma Score, Kampala Trauma Score, Injury Severity Score (ISS), and Trauma Score-Injury Severity Score calculations are now prospectively collected. Fifteen months after implementation of eTHR, the injury severity scores were compared as predictors of mortality on three accounts: (1) ability to discriminate (area under the receiver operating characteristic curve, ROC); (2) ability to calibrate (observed versus expected ratio, O/E); and (3) feasibility of data collection (rate of missing data). A total of 7460 admissions were recorded by eTHR from April 1, 2014 to July 7, 2015, including 770 severely injured patients (ISS > 15) and 950 operations. The mean age was 33.3 y (range 13-94), 77.6% were male, and the mechanism of injury was penetrating in 39.3% of cases. The cohort experienced a mortality rate of 2.5%. Patient reserve predictors required by the scores were 98.7% complete, physiological injury predictors were 95.1% complete, and anatomic injury predictors were 86.9% complete. The discrimination and calibration of Trauma Score-Injury Severity Score were superior for all admissions (ROC 0.9591 and O/E 1.01) and operatively managed patients (ROC 0.8427 and O/E 0.79). In the severely injured cohort, the discriminatory ability of Revised Trauma Score was superior (ROC 0.8315), but no score provided adequate calibration. Emerging mobile health technology enables reliable and sustainable injury severity scoring in a high-volume trauma center in South Africa. Copyright © 2016 Elsevier Inc. All rights reserved.
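A hedged sketch of the two model checks, discrimination as ROC area and calibration as the observed-to-expected ratio, is given below using simulated outcomes and predicted risks.

```python
# Hedged sketch of the two model checks reported: discrimination as the area
# under the ROC curve and calibration as the observed-to-expected mortality
# ratio. Outcomes and predicted probabilities are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
p_death = rng.beta(1, 30, size=2000)              # e.g. TRISS-style predicted risk
died = rng.binomial(1, p_death)                   # observed in-hospital mortality

auc = roc_auc_score(died, p_death)                # ability to discriminate
o_e = died.sum() / p_death.sum()                  # ability to calibrate (O/E ratio)
print(f"ROC AUC = {auc:.3f}, O/E = {o_e:.2f}")
```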
van Staaveren, Nienke; Teixeira, Dayane Lemos; Hanlon, Alison; Boyle, Laura Ann
2015-01-01
Research is needed to validate lesions recorded at meat inspection as indicators of pig welfare on farm. The aims were to determine the influence of mixing pigs on carcass lesions and to establish whether such lesions correlate with pig behaviour and lesions scored on farm. Aggressive and mounting behaviour of pigs in three single-sex pens was recorded on Day −5, −2, and −1 relative to slaughter (Day 0). On Day 0 pigs were randomly allocated to 3 treatments (n = 20/group) over 5 replicates: males mixed with females (MF), males mixed with males (MM), and males unmixed (MUM). Aggressive and mounting behaviours were recorded on Day 0 at holding on farm and lairage. Skin/tail lesions were scored according to severity at the farm (Day −1), lairage, and on the carcass (Day 0). Effects of treatment and time on behaviour and lesions were analysed by mixed models. Spearman rank correlations between behaviour and lesion scores and between scores recorded at different stages were determined. In general, MM performed more aggressive behaviour (50.4 ± 10.72) than MUM (20.3 ± 9.55, P < 0.05) and more mounting (30.9 ± 9.99) than MF (11.4 ± 3.76) and MUM (9.8 ± 3.74, P < 0.05). Skin lesion scores increased between farm (Day −1) and lairage (P < 0.001), but this tended to be significant only for MF and MM (P = 0.08). There was no effect of treatment on carcass lesions and no associations were found with fighting/mounting. Mixing entire males prior to slaughter stimulated mounting and aggressive behaviour but did not influence carcass lesion scores. Carcass skin/tail lesion scores were correlated with scores recorded on farm (r_skin = 0.21 and r_tail = 0.18, P < 0.01), suggesting that information recorded at meat inspection could be used as indicators of pig welfare on farm. PMID:25830336
Correlations and Functional Connections in a Population of Grid Cells
Roudi, Yasser
2015-01-01
We study the statistics of spike trains of simultaneously recorded grid cells in freely behaving rats. We evaluate pairwise correlations between these cells and, using a maximum entropy kinetic pairwise model (kinetic Ising model), study their functional connectivity. Even when we account for the covariations in firing rates due to overlapping fields, both the pairwise correlations and functional connections decay as a function of the shortest distance between the vertices of the spatial firing pattern of pairs of grid cells, i.e. their phase difference. They take positive values between cells with nearby phases and approach zero or negative values for larger phase differences. We find similar results also when, in addition to correlations due to overlapping fields, we account for correlations due to theta oscillations and head directional inputs. The inferred connections between neurons in the same module and those from different modules can be both negative and positive, with a mean close to zero, but with the strongest inferred connections found between cells of the same module. Taken together, our results suggest that grid cells in the same module do indeed form a local network of interconnected neurons with a functional connectivity that supports a role for attractor dynamics in the generation of grid pattern. PMID:25714908
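The basic quantities, pairwise spike-count correlations tabulated against grid-phase differences, can be computed as in the sketch below; spike trains and phases are random stand-ins, and the kinetic Ising inference itself is not shown.

```python
# Illustrative sketch: compute pairwise Pearson correlations between binned
# grid-cell spike trains and pair them with the cells' phase differences.
# Spike counts and phases here are random stand-ins, not recorded data.
import numpy as np

rng = np.random.default_rng(11)
n_cells, n_bins = 12, 5000
spikes = rng.poisson(1.0, size=(n_cells, n_bins))     # binned spike counts
phases = rng.uniform(0, 1, size=(n_cells, 2))          # 2-D spatial phase per cell

results = []
for i in range(n_cells):
    for j in range(i + 1, n_cells):
        r = np.corrcoef(spikes[i], spikes[j])[0, 1]
        # shortest distance between field vertices, i.e. the phase difference
        dphi = np.linalg.norm(phases[i] - phases[j])
        results.append((dphi, r))

print(sorted(results)[:3])   # smallest phase differences and their correlations
```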
Xia, Zhouhui; Gao, Peng; Sun, Teng; Wu, Haihua; Tan, Yeshu; Song, Tao; Lee, Shuit-Tong; Sun, Baoquan
2018-04-25
Silicon (Si)/organic heterojunction solar cells based on poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate) (PEDOT:PSS) and n-type Si have attracted wide interest because they promise cost-effectiveness and high efficiency. However, the limited conductivity of PEDOT:PSS leads to inefficient hole transport in the heterojunction device. Therefore, a dense top-contact metal grid electrode is required to ensure efficient charge collection. Unfortunately, a large metal grid coverage ratio leads to undesirable optical loss. Here, we develop a strategy to balance PEDOT:PSS conductivity and grid optical transmittance via a buried molybdenum oxide/silver grid electrode. In addition, the grid electrode coverage ratio is optimized to reduce its light-shading effect. The buried electrode dramatically reduces the device series resistance, which leads to a higher fill factor (FF). With the optimized buried electrode, a record FF of 80% is achieved for flat Si/PEDOT:PSS heterojunction devices. With further enhancement of adhesion between the PEDOT:PSS film and the Si substrate by a chemically cross-linkable silane, a power conversion efficiency of 16.3% for organic/textured Si heterojunction devices is achieved. Our results provide a path to overcome the inferior properties of the organic semiconductor and enhance organic/Si heterojunction solar cells.
Wasike, Chrilukovian B; Magothe, Thomas M; Kahi, Alexander K; Peters, Kurt J
2011-01-01
Animal recording in Kenya is characterised by erratic producer participation and high drop-out rates from the national recording scheme. This study evaluates factors influencing efficiency of the beef and dairy cattle recording system. Factors influencing efficiency of animal identification and registration, pedigree and performance recording, and genetic evaluation and information utilisation were generated using qualitative and participatory methods. Pairwise comparison of factors was done by strengths, weaknesses, opportunities and threats (SWOT)-analytic hierarchy process (AHP) analysis, and priority scores determining their relative importance to the system were calculated using the Eigenvalue method. For identification and registration, and evaluation and information utilisation, external factors had high priority scores. For pedigree and performance recording, threats and weaknesses had the highest priority scores. Strength factors could not sustain the required efficiency of the system. Weaknesses of the system predisposed it to threats. Available opportunities could be explored as interventions to restore efficiency in the system. Defensive strategies such as reorienting the system to offer utility benefits to recording, forming symbiotic and binding collaboration between recording organisations and NARS, and development of institutions to support recording were feasible.
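The Eigenvalue method referred to above can be illustrated as below: priority scores are taken from the principal eigenvector of a reciprocal pairwise comparison matrix; the comparison values are invented.

```python
# Hedged sketch of the eigenvalue method used in SWOT-AHP: priority scores are
# taken from the principal eigenvector of a pairwise comparison matrix.
import numpy as np

# reciprocal pairwise comparison matrix for four factor groups
# (strengths, weaknesses, opportunities, threats), Saaty-style 1-9 scale
A = np.array([
    [1.0, 1/3, 1/2, 1/4],
    [3.0, 1.0, 2.0, 1/2],
    [2.0, 1/2, 1.0, 1/3],
    [4.0, 2.0, 3.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                       # principal eigenvalue
priority = np.abs(eigvecs[:, k].real)
priority /= priority.sum()                        # normalised priority scores
print(np.round(priority, 3))
```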
Multiproxy summer precipitation reconstructions for Asia during the past 530 years
NASA Astrophysics Data System (ADS)
Feng, S.; Hu, Q. S.; Wu, Q.
2011-12-01
The Asian summer monsoons and the monsoon circulation affect the weather and climate in most of the tropics and extra-tropics of the Eastern Hemisphere, where more than 60% of the earth's population live. Thus it is of paramount importance to understand variations of the Asian summer monsoons from a long-term perspective. This study reconstructed a 0.5°×0.5° gridded summer (June-August) precipitation in Asia (5°-55°N, 60°-135°E) during the past 530 years based on annually resolved predictors from natural and human archives. There are 221 proxy records with temporally stable and significant correlations with the summer precipitation in the study region. Most of the proxy records only cover the last 300-400 years, and a few proxy records were available before 1470AD. The missing values in the proxy data were infilled using analogue techniques. Then the regularized expectation maximization method is used to reconstruct the summer precipitation back to 1470AD. The reduction of error (RE) between the reconstructed values and observations suggests that the reconstructions are reliable, with RE>0.0 on all grid points for the study region. The reconstruction skill is very high (RE>0.4) over regions with denser proxy records (e.g. East China, Mongolia and Central Asia), and slightly lower in northeastern and southeastern Asia with RE usually less than 0.2. The reconstructed gridded summer precipitation data allow us to identify and analyze the regional variations of drought and flood during the last 530 years. These analysis results show that the severe droughts that affected China during the Little Ice Age (e.g. the mega-drought during the late 1630s to early 1640s that triggered the collapse of the Ming Dynasty) shared a similar spatial extent with the modern droughts in northern and central China.
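The reduction of error statistic used for verification can be computed as in the sketch below; the observed and reconstructed series are synthetic.

```python
# Hedged sketch of the reduction of error (RE) statistic: RE > 0 means the
# reconstruction beats a constant calibration-period mean as a predictor.
import numpy as np

def reduction_of_error(obs, recon, calib_mean):
    sse = np.sum((obs - recon) ** 2)
    ss0 = np.sum((obs - calib_mean) ** 2)         # error of the "climatology" guess
    return 1.0 - sse / ss0

rng = np.random.default_rng(5)
obs = 100 + 20 * rng.standard_normal(50)          # verification-period precipitation
recon = obs + 8 * rng.standard_normal(50)         # imperfect reconstruction
print(round(reduction_of_error(obs, recon, calib_mean=100.0), 2))
```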
NASA Astrophysics Data System (ADS)
Kim, Youngwook; Kimball, John S.; Glassy, Joseph; Du, Jinyang
2017-02-01
The landscape freeze-thaw (FT) signal determined from satellite microwave brightness temperature (Tb) observations has been widely used to define frozen temperature controls on land surface water mobility and ecological processes. Calibrated 37 GHz Tb retrievals from the Scanning Multichannel Microwave Radiometer (SMMR), Special Sensor Microwave Imager (SSM/I), and SSM/I Sounder (SSMIS) were used to produce a consistent and continuous global daily data record of landscape FT status at 25 km grid cell resolution. The resulting FT Earth system data record (FT-ESDR) is derived from a refined classification algorithm and extends over a larger domain and longer period (1979-2014) than prior FT-ESDR releases. The global domain encompasses all land areas affected by seasonal frozen temperatures, including urban, snow- and ice-dominant and barren land, which were not represented by prior FT-ESDR versions. The FT retrieval is obtained using a modified seasonal threshold algorithm (MSTA) that classifies daily Tb variations in relation to grid-cell-wise FT thresholds calibrated using surface air temperature data from model reanalysis. The resulting FT record shows respective mean annual spatial classification accuracies of 90.3 and 84.3 % for evening (PM) and morning (AM) overpass retrievals relative to global weather station measurements. Detailed data quality metrics are derived characterizing the effects of sub-grid-scale open water and terrain heterogeneity, as well as algorithm uncertainties on FT classification accuracy. The FT-ESDR results are also verified against other independent cryospheric data, including in situ lake and river ice phenology, and satellite observations of Greenland surface melt. The expanded FT-ESDR enables new investigations encompassing snow- and ice-dominant land areas, while the longer record and favorable accuracy allow for refined global change assessments that can better distinguish transient weather extremes, landscape phenological shifts, and climate anomalies from longer-term trends extending over multiple decades. The dataset is freely available online (doi:10.5067/MEASURES/CRYOSPHERE/nsidc-0477.003).
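A minimal sketch of a seasonal-threshold style FT classification for one grid cell is given below; the brightness temperatures, the calibrated threshold and the frozen-when-colder assumption are illustrative, not the MSTA algorithm itself.

```python
# Illustrative sketch of a seasonal-threshold style classification: a daily
# 37 GHz brightness temperature series for one grid cell is classified frozen
# or thawed against a cell-specific calibrated threshold. Values are synthetic.
import numpy as np

rng = np.random.default_rng(2)
doy = np.arange(365)
tb_37v = 255 + 15 * np.sin(2 * np.pi * (doy - 100) / 365) + rng.normal(0, 2, 365)

cell_threshold = 258.0                             # calibrated per grid cell (assumed)
frozen = tb_37v < cell_threshold                   # True = frozen, False = thawed
print(f"{frozen.sum()} frozen days, first thawed day index: {np.argmin(frozen)}")
```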
Quantification of myocardial fibrosis by digital image analysis and interactive stereology.
Daunoravicius, Dainius; Besusparis, Justinas; Zurauskas, Edvardas; Laurinaviciene, Aida; Bironaite, Daiva; Pankuweit, Sabine; Plancoulaine, Benoit; Herlin, Paulette; Bogomolovas, Julius; Grabauskiene, Virginija; Laurinavicius, Arvydas
2014-06-09
Cardiac fibrosis disrupts the normal myocardial structure and has a direct impact on heart function and survival. Despite already available digital methods, the pathologist's visual score is still widely considered as ground truth and used as a primary method in histomorphometric evaluations. The aim of this study was to compare the accuracy of digital image analysis tools and the pathologist's visual scoring for evaluating fibrosis in human myocardial biopsies, based on reference data obtained by point counting performed on the same images. Endomyocardial biopsy material from 38 patients diagnosed with inflammatory dilated cardiomyopathy was used. The extent of total cardiac fibrosis was assessed by image analysis on Masson's trichrome-stained tissue specimens using automated Colocalization and Genie software, by Stereology grid count and manually by Pathologist's visual score. A total of 116 slides were analyzed. The mean results obtained by the Colocalization software (13.72 ± 12.24%) were closest to the reference value of stereology (RVS), while the Genie software and Pathologist score gave a slight underestimation. RVS values correlated strongly with values obtained using the Colocalization and Genie (r>0.9, p<0.001) software as well as the pathologist visual score. Differences in fibrosis quantification by Colocalization and RVS were statistically insignificant. However, significant bias was found in the results obtained by using Genie versus RVS and pathologist score versus RVS with mean difference values of: -1.61% and 2.24%. Bland-Altman plots showed a bidirectional bias dependent on the magnitude of the measurement: Colocalization software overestimated the area fraction of fibrosis in the lower end, and underestimated in the higher end of the RVS values. Meanwhile, Genie software as well as the pathologist score showed more uniform results throughout the values, with a slight underestimation in the mid-range for both. Both applied digital image analysis methods revealed almost perfect correlation with the criterion standard obtained by stereology grid count and, in terms of accuracy, outperformed the pathologist's visual score. Genie algorithm proved to be the method of choice with the only drawback of a slight underestimation bias, which is considered acceptable for both clinical and research evaluations. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/9857909611227193.
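The stereology grid count used as the reference method can be illustrated as below, estimating an area fraction from regularly spaced grid points over a binary mask; the mask and grid spacing are assumptions.

```python
# Hedged sketch of a stereology-style grid point count: the fibrosis area
# fraction is estimated as the share of regularly spaced grid points that fall
# on fibrotic (collagen-positive) tissue in a binary mask. The mask is synthetic.
import numpy as np

rng = np.random.default_rng(4)
mask = rng.random((1200, 1600)) < 0.14            # True where "fibrosis" (toy mask)

step = 50                                          # grid spacing in pixels (assumed)
grid_points = mask[::step, ::step]                 # sample the mask on the point grid
area_fraction = grid_points.mean() * 100
print(f"estimated fibrosis fraction: {area_fraction:.1f}%")
```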
Turkoglu, Elif Betul; Celık, Erkan; Aksoy, Nilgun; Bursalı, Ozlem; Ucak, Turgay; Alagoz, Gursoy
2015-01-01
To compare the changes in vision-related quality of life (VR-QoL) in patients with diabetic macular edema (DME) undergoing intravitreal ranibizumab (IVR) injection or focal/grid laser. In this prospective study, 70 patients with clinically significant macular edema (CSME) were randomized to undergo IVR injection (n=35) or focal/grid laser (n=35). If necessary, the laser or ranibizumab injections were repeated. Distance and near visual acuities, central retinal thickness (CRT) and the 25-item Visual Function Questionnaire (VFQ-25) were used to measure treatment effectiveness and VR-QoL at baseline and 6 months after IVR or laser treatment. The demographic and clinical findings before the treatments were similar in both main groups. The improvements in distance and near visual acuities were higher in the IVR group than in the laser group (p<0.01). The reduction in CRT in the IVR group was higher than that in the laser treatment group (p<0.01). In both groups, the VFQ-25 composite score tended to improve from baseline to 6 months. At 6 months, the changes in composite score were significantly higher in the IVR group than in the laser group (p<0.05). The improvements in overall composite scores were 6.3 points for the IVR group compared with 3.0 points in the laser group. Patients treated with IVR and laser had large improvements in VFQ-25 composite scores, general vision, and near and distance visual acuities at 6 months compared with baseline, and the IVR group also improved on the mental health subscale. Our study revealed that IVR improved not only visual acuity and CRT, but also vision-related quality of life, more than laser treatment in DME. These patient-reported outcomes may play an important role in clinicians' treatment choices for DME. Copyright © 2015 Elsevier Inc. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-05
... easements, restrictions and rights-of-way of record. The bearings shown above are based on the grid bearing..., on September 21, 2012. John L. Mayfield, Jr., Manager, Detroit Airports District Office FAA, Great...
Ferroli, Paolo; Broggi, Morgan; Schiavolin, Silvia; Acerbi, Francesco; Bettamio, Valentina; Caldiroli, Dario; Cusin, Alberto; La Corte, Emanuele; Leonardi, Matilde; Raggi, Alberto; Schiariti, Marco; Visintini, Sergio; Franzini, Angelo; Broggi, Giovanni
2015-12-01
OBJECT The Milan Complexity Scale-a new practical grading scale designed to estimate the risk of neurological clinical worsening after performing surgery for tumor removal-is presented. METHODS A retrospective study was conducted on all elective consecutive surgical procedures for tumor resection between January 2012 and December 2014 at the Second Division of Neurosurgery at Fondazione IRCCS Istituto Neurologico Carlo Besta of Milan. A prospective database dedicated to reporting complications and all clinical and radiological data was retrospectively reviewed. The Karnofsky Performance Scale (KPS) was used to classify each patient's health status. Complications were divided into major and minor and recorded based on etiology and required treatment. A logistic regression model was used to identify possible predictors of clinical worsening after surgery in terms of changes between the preoperative and discharge KPS scores. Statistically significant predictors were rated based on their odds ratios in order to build an ad hoc complexity scale. For each patient, a corresponding total score was calculated, and ANOVA was performed to compare the mean total scores between the improved/unchanged and worsened patients. Relative risk (RR) and chi-square statistics were employed to provide the risk of worsening after surgery for each total score. RESULTS The case series was composed of 746 patients (53.2% female; mean age 51.3 ± 17.1). The most common tumors were meningiomas (28.6%) and glioblastomas (24.1%). The mortality rate was 0.94%, the major complication rate was 9.1%, and the minor complication rate was 32.6%. Of 746 patients, 523 (70.1%) patients improved or remained unchanged, and 223 (29.9%) patients worsened. The following factors were found to be statistically significant predictors of the change in KPS scores: tumor size larger than 4 cm, cranial nerve manipulation, major brain vessel manipulation, posterior fossa location, and eloquent area involvement (Nagelkerke R(2) = 0.286). A grading scale was obtained with scores ranging between 0 and 8. Worsened patients showed mean total scores that were significantly higher than the improved/unchanged scores (3.24 ± 1.55 vs 1.47 ± 1.58; p < 0.001). Finally, a grid was developed to show the risk of worsening after surgery for each total score: scores higher than 3 are suggestive of worse clinical outcome. CONCLUSIONS Through the evaluation of the 5 aforementioned parameters-the Big Five-the Milan Complexity Scale enables neurosurgeons to estimate the risk of a negative clinical course after brain tumor surgery and share these data with the patient. Furthermore, the Milan Complexity Scale could be used for research and educational purposes and better health system management.
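As a rough illustration of how odds ratios from a logistic model can be turned into an additive grading scale of this kind, the sketch below fits a logistic regression to simulated binary predictors and assigns points per factor; it does not reproduce the Milan Complexity Scale weights.

```python
# Illustrative sketch of turning logistic-regression results into an additive
# grading scale: binary predictors of postoperative worsening are fitted, and
# each predictor receives points proportional to its odds ratio. Data are
# simulated; the variables only mimic the five factors named in the abstract.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(9)
n = 746
X = rng.binomial(1, [0.3, 0.25, 0.2, 0.15, 0.35], size=(n, 5))  # "Big Five" flags
logit = -1.5 + X @ np.array([0.9, 0.7, 0.6, 0.8, 0.5])
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))                    # worsened at discharge

model = LogisticRegression().fit(X, y)
odds_ratios = np.exp(model.coef_[0])
points = np.maximum(1, np.round(odds_ratios).astype(int))        # crude point weights
total_score = X @ points                                         # per-patient score
print("points per factor:", points, "| example scores:", total_score[:5])
```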
PDF added value of a high resolution climate simulation for precipitation
NASA Astrophysics Data System (ADS)
Soares, Pedro M. M.; Cardoso, Rita M.
2015-04-01
General Circulation Models (GCMs) are models suitable to study the global atmospheric system, its evolution and response to changes in external forcing, namely to increasing emissions of CO2. However, the resolution of GCMs, of the order of 1°, is not sufficient to reproduce finer-scale features of the atmospheric flow related to complex topography, coastal processes and boundary layer processes, and higher-resolution models are needed to describe observed weather and climate. The latter are known as Regional Climate Models (RCMs) and are widely used to downscale GCM results for many regions of the globe, being able to capture physically consistent regional and local circulations. Most RCM evaluations rely on the comparison of their results with observations, either from weather station networks or regular gridded datasets, revealing the ability of RCMs to describe local climatic properties, and most of the time assuming their higher performance in comparison with the forcing GCMs. The additional climatic detail given by RCMs when compared with the results of the driving models is usually named added value, and its evaluation is still scarce and controversial in the literature. Recently, some studies have proposed different methodologies, for different applications and processes, to characterize the added value of specific RCMs. A number of examples reveal that some RCMs do add value to GCMs for some properties or regions, and also the opposite, showing that RCMs may add value to GCM results, but that improvements depend on the type of application, model setup, atmospheric property and location. Precipitation can be characterized by histograms of daily precipitation, also known as probability density functions (PDFs). There are different strategies to evaluate the quality of both GCMs and RCMs in describing the precipitation PDFs when compared to observations. Here, we present a new method to measure the PDF added value obtained from dynamical downscaling, based on simple PDF skill scores. The measure can assess the full quality of the PDFs and at the same time integrates a flexible manner to weight the PDF tails differently. In this study we apply this method to characterize the PDF added value of a high-resolution simulation with the WRF model. Results are shown from a WRF climate simulation centred on the Iberian Peninsula with two nested grids, a larger one at 27 km and a smaller one at 9 km. This simulation is forced by ERA-Interim. The observational data used range from rain gauge precipitation records to regular observational grids of daily precipitation. Two regular gridded precipitation datasets are used: a Portuguese gridded precipitation dataset developed at 0.2° × 0.2° from observed daily rain gauge precipitation, and the ENSEMBLES observational gridded dataset for Europe, which includes daily precipitation values at 0.25°. The analysis shows an important PDF added value from the higher-resolution simulation, regarding both the full PDF and the extremes. The method has high potential to be applied to other simulation exercises and to evaluate other variables.
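One simple PDF skill score of the kind referred to above is the overlap of the normalised observed and modelled histograms, optionally re-weighted toward the wet tail; a sketch with synthetic daily precipitation follows (the bin width, tail threshold and weights are assumptions, not the paper's score).

```python
# Hedged sketch of a simple PDF skill score: the overlap between the observed
# and modelled daily-precipitation histograms (both normalised), optionally
# with larger weight on the wet tail. Data are synthetic.
import numpy as np

rng = np.random.default_rng(6)
obs = rng.gamma(0.8, 6.0, size=10000)             # observed daily precipitation (mm)
mod = rng.gamma(0.9, 5.0, size=10000)             # simulated daily precipitation (mm)

bins = np.arange(0, 101, 1.0)
f_obs, _ = np.histogram(obs, bins=bins, density=True)
f_mod, _ = np.histogram(mod, bins=bins, density=True)
f_obs, f_mod = f_obs / f_obs.sum(), f_mod / f_mod.sum()

skill = np.minimum(f_obs, f_mod).sum()            # 1 = identical PDFs, 0 = disjoint
tail_weight = np.where(bins[:-1] >= 20, 3.0, 1.0) # emphasise the extreme tail
weighted_skill = (tail_weight * np.minimum(f_obs, f_mod)).sum() / \
                 (tail_weight * f_obs).sum()
print(round(skill, 3), round(weighted_skill, 3))
```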
NASA Astrophysics Data System (ADS)
Sparks, T. H.; Huber, K.; Croxton, P. J.
2006-05-01
In 1944, John Willis produced a summary of his meticulous record keeping of weather and plants over the 30 years 1913-1942. This publication contains fixed-date, fixed-subject photography taken on the 1st of each month from January to May, using as subjects snowdrop Galanthus nivalis, daffodil Narcissus pseudo-narcissus, horse chestnut Aesculus hippocastanum and beech Fagus sylvatica. We asked 38 colleagues to assess rapidly the plant development in each of these photographs according to a supplied five-point score. The mean scores from this exercise were assessed in relation to mean monthly weather variables preceding the date of the photograph and the consistency of scoring was examined according to the experience of the recorders. Plant development was more strongly correlated with mean temperature than with minimum or maximum temperatures or sunshine. No significant correlations with rainfall were detected. Whilst mean scores were very similar, botanists were more consistent in their scoring of developmental stages than non-botanists. However, there was no overall pattern for senior staff to be more consistent in scoring than junior staff. These results suggest that scoring of plant development stages on fixed dates could be a viable method of assessing the progress of the season. We discuss whether such recording could be more efficient than traditional phenology, especially in those sites that are not visited regularly and hence are less amenable to frequent or continuous observation to assess when a plant reaches a particular growth stage.
NASA Astrophysics Data System (ADS)
Ichinose, G. A.; Saikia, C. K.
2007-12-01
We applied the moment tensor (MT) analysis scheme to identify seismic sources using regional seismograms, based on the representation theorem for the elastic wave displacement field. This method is applied to estimate the isotropic (ISO) and deviatoric MT components of earthquake, volcanic, and isotropic sources within the Basin and Range Province (BRP) and the western US. The ISO components from Hoya, Bexar, Montello and Junction were compared to recent, well-recorded earthquakes near Little Skull Mountain, Scotty's Junction, Eureka Valley, and Fish Lake Valley within southern Nevada. We also examined "dilatational" sources near Mammoth Lakes Caldera and two mine collapses, including the August 2007 event in Utah recorded by USArray. Using our formulation, we first implemented the full MT inversion method on long-period filtered regional data. We also applied a grid-search technique to solve for the percentages of deviatoric and ISO moment. With the grid-search technique, high-frequency waveforms are used with calibrated velocity models. We modeled the ISO and deviatoric components (spall and tectonic release) as separate events delayed in time or offset in space. Calibrated velocity models helped resolve the ISO components and decreased the variance relative to the average, initial or background velocity models. The centroid location and time shifts are velocity-model dependent. Models can be improved, as was done in previously published work in which we used an iterative waveform inversion method with regional seismograms from four well-recorded and well-constrained earthquakes. The resulting velocity models reduced the variance between predicted synthetics and recorded data by about 50 to 80% for frequencies up to 0.5 Hz. Tests indicate that the individual path-specific models perform better at recovering the earthquake MT solutions than the average or initial models, even when using a sparser distribution of stations.
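A minimal sketch of the grid-search idea for the isotropic fraction, assuming synthetics for a pure ISO source and a pure deviatoric source have been precomputed on the same time base as the data; the names and the plain L2 misfit are illustrative, not the authors' implementation.

```python
import numpy as np

def grid_search_iso_fraction(data, synth_iso, synth_dev, n_steps=101):
    """Scan the ISO moment fraction from 0 to 1 and return the value that
    minimizes the L2 waveform misfit (toy illustration only)."""
    fractions = np.linspace(0.0, 1.0, n_steps)
    misfits = np.array([np.sum((data - (f * synth_iso + (1.0 - f) * synth_dev)) ** 2)
                        for f in fractions])
    best = int(np.argmin(misfits))
    return fractions[best], misfits[best]
```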
NASA Astrophysics Data System (ADS)
Venable, N. B. H.; Fassnacht, S. R.; Adyabadam, G.
2014-12-01
Precipitation data in semi-arid and mountainous regions are often spatially and temporally sparse, yet precipitation is a key variable needed to drive hydrological models. Gridded precipitation datasets provide a spatially and temporally coherent alternative to the use of point-based station data, but in the case of Mongolia they may not be constructed from all data available from government sources, or may only be available at coarse resolutions. To examine the uncertainty associated with the use of gridded and/or point precipitation data, monthly water balance models of three river basins across forest steppe (the Khoid Tamir River at Ikhtamir), steppe (the Baidrag River at Bayanburd), and desert steppe (the Tuin River at Bogd) ecozones in the Khangai Mountain Region of Mongolia were compared. The models were forced over a 10-year period (2001-2010) with gridded temperature and precipitation data at 0.5° × 0.5° resolution. These results were compared to modeling using an interpolated hybrid of the gridded data and additional point data recently gathered from government sources, and to modeling with point data from the meteorological station nearest to the streamflow gage of choice. Goodness-of-fit measures, including the Nash-Sutcliffe Efficiency statistic, the percent bias, and the RMSE-observations standard deviation ratio, were used to assess model performance. The results were mixed, with smaller differences between the two gridded products than between the gridded products and station data. The largest differences in precipitation inputs and modeled runoff amounts occurred between the two gridded datasets and station data in the desert steppe (Tuin), and the smallest differences occurred in the forest steppe (Khoid Tamir) and steppe (Baidrag). Mean differences between water balance model results are generally smaller than mean differences in the initial input data over the period of record. Seasonally, larger differences in gridded versus station-based precipitation products and modeled outputs occur in summer in the desert steppe, and in spring in the forest steppe. The choice of precipitation data source, gridded or point-based, directly affects model outcomes, with greater uncertainty noted on a seasonal basis across ecozones of the Khangai.
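For reference, the three goodness-of-fit measures named above can be computed as below; the sign convention for percent bias follows the common Moriasi et al. definition (positive values indicating model underestimation), which is an assumption about the convention used in the study.

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe Efficiency: 1 is perfect; 0 means no better than the observed mean."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(sim, obs):
    """Percent bias of simulated runoff relative to observations."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def rsr(sim, obs):
    """RMSE-observations standard deviation ratio."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return np.sqrt(np.mean((obs - sim) ** 2)) / obs.std()
```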
1975-03-01
Loss Relationships 199 109 37-Tube, 4.5 Area Ratio Nozzle, Premerged Jet Turbulence Noise 200 110 37-Tube Nozzle Premerged Jet Noise Peak...were obtained with the tunnel oil and at 165 knots. The tunnel air flows through a large, rectangular bell-mouth inlet, a flow straightening grid... ratio conditions on a fourteen-track analog tape recorder for subsequent analysis after test completion. Basic analysis of the recorded acoustic
Improving quality in an internal medicine residency program through a peer medical record audit.
Asao, Keiko; Mansi, Ishak A; Banks, Daniel
2009-12-01
This study examined the effectiveness of a quality improvement project of a limited didactic session, a medical record audit by peers, and casual feedback within a residency program. Residents audited their peers' medical records from the clinic of a university hospital in March, April, August, and September 2007. A 24-item quality-of-care score was developed for five common diagnoses, expressed from 0 to 100, with 100 as complete compliance. Audit scores were compared by month and experience of the resident as an auditor. A total of 469 medical records, audited by 12 residents, for 80 clinic residents, were included. The mean quality-of-care score was 89 (95% CI = 88-91); the scores in March, April, August, and September were 88 (95% CI = 85-91), 94 (95% CI = 90-96), 87 (95% CI = 85-89), and 91 (95% CI = 89-93), respectively. The mean score of 58 records of residents who had experience as auditors was 94 (95% CI = 89-96) compared with 89 (95% CI = 87-90) for those who did not. The score significantly varied (P = .0009) from March to April and from April to August, but it was not significantly associated with experience as an auditor with multivariate analysis. Residents' compliance with the standards of care was generally high. Residents responded to the project well, but their performance dropped after a break in the intervention. Continuation of the audit process may be necessary for a sustained effect on quality.
NASA Astrophysics Data System (ADS)
Brunner, Raimund; Schmidtke, Gerhard; Konz, Werner; Pfeffer, Wilfried
A low-cost monitor to measure the EUV and plasma environment in space is presented. The device consists of three (or more) isolated spheres: a metallic sphere and one or more highly transparent inner and outer grids. Each one is connected to a sensitive floating electrometer. By setting different potentials on the grids as well as on the sphere and varying one or more of their voltages, measurements of spectral solar EUV irradiance (15-200 nm) and of local plasma parameters, such as electron and ion densities, electron energies and temperatures, as well as ion compositions and debris events, can be derived from the current recordings. This detector does not require any (solar) pointing device. The primary goal is to study the impact of solar activity events (e.g. CMEs) as well as subsequent reactions of the ionospheric/thermospheric systems (including space weather occurrences). The capability of SEPS for measuring EUV photon fluxes as well as plasma parameters in the energy range from 0 to ±70 eV is demonstrated by laboratory measurements performed in the IPM laboratory, at the BESSY-PTB electron synchrotron in Berlin and in the ESA/ESTEC plasma chamber. Based on the laboratory recording of plasma recombination EUV emission, the sensor is also suitable to detect auroral and airglow radiation. The state of the art in the development of this device is reported.
A stand-alone tidal prediction application for mobile devices
NASA Astrophysics Data System (ADS)
Tsai, Cheng-Han; Fan, Ren-Ye; Yang, Yi-Chung
2017-04-01
It is essential for people conducting fishing, leisure, or research activities at the coast to have timely and handy tidal information. Although tidal information can easily be found on the internet or through mobile device applications, it applies only to certain specific locations, not anywhere on the coast, and it requires an internet connection. We have developed an application for Android devices that allows the user to obtain hourly tidal heights anywhere on the coast for the next 24 hours without an internet connection. All the information needed for the tidal height calculation is stored in the application. To develop this application, we first simulated tides in the Taiwan Sea using the hydrodynamic model MIKE21 HD developed by DHI. The simulation domain covers the whole coast of Taiwan and the surrounding seas with a grid size of 1 km by 1 km. This grid size allows us to calculate tides with high spatial resolution. The boundary conditions for the simulation domain were obtained from the Tidal Model Driver of Oregon State University, using its tidal constants of eight constituents: M2, S2, N2, K2, K1, O1, P1, and Q1. The simulation calculates tides for 183 days so that the tidal constants of the above eight constituents can be extracted for each water grid cell by harmonic analysis. Using the calculated tidal constants, we can predict the tides in each grid cell of our simulation domain, which is useful when one needs tidal information for any location in the Taiwan Sea. For the mobile application, however, we only store the eight tidal constants for the water grid cells on the coast. Once the user activates the application, it reads the longitude and latitude from the GPS sensor in the mobile device and finds the nearest coastal grid cell that has our tidal constants. The application then calculates the tidal height variation based on the harmonic analysis. The application also allows the user to input a location and time to obtain tides for any historic or future date at the input location. The predicted tides have been verified against the historic tidal records of several tidal stations. The verification shows that the tides predicted by the application match the measured records well.
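The harmonic prediction step is straightforward once the constants are stored. The sketch below sums the eight constituents for one coastal grid cell; the constituent angular speeds are standard published values, while the function and variable names are assumptions for illustration and not taken from the application's source code.

```python
import numpy as np

# Angular speeds of the eight constituents in degrees per hour (standard values).
SPEED_DEG_PER_HR = {"M2": 28.9841042, "S2": 30.0000000, "N2": 28.4397295, "K2": 30.0821373,
                    "K1": 15.0410686, "O1": 13.9430356, "P1": 14.9589314, "Q1": 13.3986609}

def tidal_height(hours_since_ref, amplitudes, phases_deg, mean_level=0.0):
    """h(t) = z0 + sum_i A_i * cos(omega_i * t - g_i), summed over the eight
    constituents, with A_i and g_i being the harmonic constants stored for one
    coastal grid cell and t measured from the phase reference epoch."""
    t = np.asarray(hours_since_ref, dtype=float)
    h = np.full_like(t, mean_level)
    for name, speed in SPEED_DEG_PER_HR.items():
        omega = np.deg2rad(speed)            # rad per hour
        g = np.deg2rad(phases_deg[name])     # phase lag in radians
        h += amplitudes[name] * np.cos(omega * t - g)
    return h

# Hourly prediction for the next 24 hours at one (hypothetical) grid cell:
# heights = tidal_height(np.arange(25), amp_dict, phase_dict, mean_level=0.1)
```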
Multisource feedback analysis of pediatric outpatient teaching
2013-01-01
Background This study aims to evaluate the outpatient communication skills of medical students via multisource feedback, which may be useful to map future directions in improving physician-patient communication. Methods Family respondents of patients, a nurse, a clinical teacher, and a research assistant evaluated video-recorded medical students’ interactions with outpatients by using multisource feedback questionnaires; students also assessed their own skills. The questionnaire was answered based on the video-recorded interactions between outpatients and the medical students. Results A total of 60 family respondents of the 60 patients completed the questionnaires; 58 (96.7%) of them agreed with the video recording. The two reasons for reluctance were “personal privacy” issues and “simply disagree” with the video recording. The average satisfaction score of the 58 students was 85.1 points, indicating that students’ performance was in the category between satisfied and very satisfied. The family respondents were most satisfied with the “teacher’s attitude”, followed by “teaching quality”. In contrast, the family respondents were least satisfied with “being open to questions”. Among the 6 assessment domains of communication skills, the students scored highest on “explaining” and lowest on “giving recommendations”. In the detailed assessment by family respondents, the students scored lowest on “asking about life/school burden”. In the multisource analysis, the nurses’ mean score was much higher and the students’ mean self-assessment score was lower than the average scores on all domains. Conclusion The willingness and satisfaction of family respondents were high in this study. Students scored the lowest on giving recommendations to patients. Multisource feedback with video recording is useful in providing a more accurate evaluation of students’ communication competence and in identifying the areas of communication that require enhancement. PMID:24180615
NASA Astrophysics Data System (ADS)
Henn, Brian; Clark, Martyn P.; Kavetski, Dmitri; Newman, Andrew J.; Hughes, Mimi; McGurk, Bruce; Lundquist, Jessica D.
2018-01-01
Given uncertainty in precipitation gauge-based gridded datasets over complex terrain, we use multiple streamflow observations as an additional source of information about precipitation, in order to identify spatial and temporal differences between a gridded precipitation dataset and precipitation inferred from streamflow. We test whether gridded datasets capture across-crest and regional spatial patterns of variability, as well as year-to-year variability and trends in precipitation, in comparison to precipitation inferred from streamflow. We use a Bayesian model calibration routine with multiple lumped hydrologic model structures to infer the most likely basin-mean, water-year total precipitation for 56 basins with long-term (>30 year) streamflow records in the Sierra Nevada mountain range of California. We compare basin-mean precipitation derived from this approach with basin-mean precipitation from a precipitation gauge-based, 1/16° gridded dataset that has been used to simulate and evaluate trends in Western United States streamflow and snowpack over the 20th century. We find that the long-term average spatial patterns differ: in particular, there is less precipitation in the gridded dataset in higher-elevation basins whose aspect faces prevailing cool-season winds, as compared to precipitation inferred from streamflow. In a few years and basins, there is less gridded precipitation than there is observed streamflow. Lower-elevation, southern, and east-of-crest basins show better agreement between gridded and inferred precipitation. Implied actual evapotranspiration (calculated as precipitation minus streamflow) then also varies between the streamflow-based estimates and the gridded dataset. Absolute uncertainty in precipitation inferred from streamflow is substantial, but the signals of basin-to-basin and year-to-year differences are likely more robust. The findings suggest that considering streamflow when spatially distributing precipitation in complex terrain may improve its representation, particularly for basins whose orientations (e.g., windward-facing) are favored for orographic precipitation enhancement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pokharel, S; Rana, S
Purpose: The purpose of this study is to evaluate the effect of grid size in the Eclipse Acuros XB (AXB) dose calculation algorithm for SBRT lung. Methods: Five previously treated SBRT lung cases were chosen for the present study. Four of the plans were 5-field conventional IMRT and one was a RapidArc plan. All five cases were calculated with the five grid sizes (1, 1.5, 2, 2.5 and 3 mm) available for the AXB algorithm, with the same plan normalization. Dosimetric indices relevant to SBRT, along with MUs and calculation time, were recorded for the different grid sizes. The maximum difference was calculated as a percentage of the mean of all five values. All plans underwent IMRT QA with portal dosimetry. Results: The maximum difference in MUs was within 2%. The calculation time increased by as much as a factor of 7 from the largest (3 mm) to the smallest (1 mm) grid size. The largest differences in PTV minimum, maximum and mean dose were 7.7%, 1.5% and 1.6%, respectively. The highest D2-Max difference was 6.1%. The highest differences in ipsilateral lung mean dose, V5Gy, V10Gy and V20Gy were 2.6%, 2.4%, 1.9% and 3.8%, respectively. The maximum differences in heart, cord and esophagus dose were 6.5%, 7.8% and 4.02%, respectively. The IMRT gamma passing rate at 2%/2mm remained within 1.5% across grid sizes, with at least 98% of points passing for all grid sizes. Conclusion: This work indicates that the smallest grid size of 1 mm available in AXB is not necessarily required for accurate dose calculation. No significant change in the IMRT passing rate was observed when the grid size was reduced below 2 mm. Although the maximum percentage differences of some of the dosimetric indices appear large, most of them are clinically insignificant in absolute dose values. We therefore conclude that a 2 mm grid size is the best compromise between dose calculation accuracy and calculation time.
Long-Term Effects of Safinamide on Mood Fluctuations in Parkinson's Disease.
Cattaneo, Carlo; Müller, Thomas; Bonizzoni, Erminio; Lazzeri, Gabriele; Kottakis, Ioannis; Keywood, Charlotte
2017-01-01
Mood disorders are very frequent in Parkinson's Disease (PD), and their effective treatment is still a major unresolved issue: growing evidence suggests that glutamatergic system dysfunction is directly involved. Safinamide is a drug with an innovative mechanism of action, dopaminergic and non-dopaminergic, that includes the reversible inhibition of the monoamine oxidase-B (MAO-B) enzyme and the modulation of excessive glutamate release through the use- and state-dependent blockade of the sodium channels. To investigate the effects of safinamide on mood over two years of treatment in PD patients with motor fluctuations. This was a post-hoc analysis of the data from studies 016 and 018. The analysis focused on outcomes related to mood, namely: scores of the "Emotional well-being" domain of the Parkinson's Disease Questionnaire (PDQ-39), scores of the GRID Hamilton Rating Scale for Depression (GRID-HAMD) and the proportion of patients reporting depression as an adverse event over the entire treatment period. Safinamide, compared to placebo, significantly improved the PDQ-39 "Emotional well-being" domain after 6 months (p = 0.0067) and 2 years (p = 0.0006), as well as the GRID-HAMD (p = 0.0408 after 6 months and p = 0.0027 after 2 years). Significantly fewer patients in the safinamide group, compared to placebo, experienced depression as an adverse event (p = 0.0444 after 6 months and p = 0.0057 after 2 years). The favorable effect of safinamide on mood may be explained by the improvement in wearing-off and by its modulation of glutamatergic hyperactivity and reversible MAO-B inhibition. Prospective studies are warranted to investigate this potential benefit.
Long-Term Effects of Safinamide on Mood Fluctuations in Parkinson’s Disease
Cattaneo, Carlo; Müller, Thomas; Bonizzoni, Erminio; Lazzeri, Gabriele; Kottakis, Ioannis; Keywood, Charlotte
2017-01-01
Background: Mood disorders are very frequent in Parkinson’s Disease (PD), and their effective treatment is still a major unresolved issue: growing evidence suggests that glutamatergic system dysfunction is directly involved. Safinamide is a drug with an innovative mechanism of action, dopaminergic and non-dopaminergic, that includes the reversible inhibition of the monoamine oxidase-B (MAO-B) enzyme and the modulation of excessive glutamate release through the use- and state-dependent blockade of the sodium channels. Objective: To investigate the effects of safinamide on mood over two years of treatment in PD patients with motor fluctuations. Methods: This was a post-hoc analysis of the data from studies 016 and 018. The analysis focused on outcomes related to mood, namely: scores of the “Emotional well-being” domain of the Parkinson’s Disease Questionnaire (PDQ-39), scores of the GRID Hamilton Rating Scale for Depression (GRID-HAMD) and the proportion of patients reporting depression as an adverse event over the entire treatment period. Results: Safinamide, compared to placebo, significantly improved the PDQ-39 “Emotional well-being” domain after 6 months (p = 0.0067) and 2 years (p = 0.0006), as well as the GRID-HAMD (p = 0.0408 after 6 months and p = 0.0027 after 2 years). Significantly fewer patients in the safinamide group, compared to placebo, experienced depression as an adverse event (p = 0.0444 after 6 months and p = 0.0057 after 2 years). Conclusion: The favorable effect of safinamide on mood may be explained by the improvement in wearing-off and by its modulation of glutamatergic hyperactivity and reversible MAO-B inhibition. Prospective studies are warranted to investigate this potential benefit. PMID:28777756
Precipitation From a Multiyear Database of Convection-Allowing WRF Simulations
NASA Astrophysics Data System (ADS)
Goines, D. C.; Kennedy, A. D.
2018-03-01
Convection-allowing models (CAMs) have become frequently used for operational forecasting and, more recently, have been utilized for general circulation model downscaling. CAM forecasts have typically been analyzed for a few case studies or over short time periods, which limits the ability to judge the overall skill of deterministic simulations. Analysis over long time periods can yield a better understanding of systematic model error. Four years of warm-season (April-August, 2010-2013) simulated precipitation have been accumulated from two Weather Research and Forecasting (WRF) models with 4 km grid spacing. The simulations were provided by the National Centers for Environmental Prediction (NCEP) and the National Severe Storms Laboratory (NSSL), each with different dynamic cores and parameterization schemes. These simulations are evaluated against the NCEP Stage-IV precipitation dataset with similar 4 km grid spacing. The spatial distribution and diurnal cycle of precipitation in the central United States are analyzed using Hovmöller diagrams, grid point correlations, and traditional verification skill scores such as the Equitable Threat Score (ETS). Although NCEP-WRF had a high positive error in total precipitation, its spatial characteristics were similar to observations. For example, the spatial distribution of NCEP-WRF precipitation correlated better than that of NSSL-WRF for the Northern Plains. Hovmöller results exposed a delay in the initiation and decay of diurnal precipitation by NCEP-WRF, while both models had difficulty in reproducing the timing and location of propagating precipitation. ETS was highest for NSSL-WRF in all domains at all times. ETS was also higher in areas of propagating precipitation than in areas of unorganized, scattered diurnal precipitation. Monthly analysis identified unique differences between the two models in their ability to correctly simulate the spatial distribution and zonal motion of precipitation through the warm season.
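The Equitable Threat Score used for the verification is computed from a contingency table of forecast and observed exceedances of a precipitation threshold; a minimal implementation is sketched below (the function name is illustrative).

```python
def equitable_threat_score(hits, misses, false_alarms, total):
    """ETS (Gilbert skill score): hits expected by chance are removed, so a
    perfect forecast scores 1 and values at or below 0 indicate no skill.
    `total` is hits + misses + false alarms + correct negatives."""
    hits_random = (hits + misses) * (hits + false_alarms) / float(total)
    denom = hits + misses + false_alarms - hits_random
    return (hits - hits_random) / denom if denom != 0 else float("nan")
```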
Diagnostic accuracy of sleep bruxism scoring in absence of audio-video recording: a pilot study.
Carra, Maria Clotilde; Huynh, Nelly; Lavigne, Gilles J
2015-03-01
Based on the most recent polysomnographic (PSG) research diagnostic criteria, sleep bruxism is diagnosed when more than two rhythmic masticatory muscle activity (RMMA) episodes per hour of sleep are scored on the masseter and/or temporalis muscles. These criteria have not yet been validated for portable PSG systems. This pilot study aimed to assess the diagnostic accuracy of scoring sleep bruxism in the absence of audio-video recordings. Ten subjects (mean age 24.7 ± 2.2) with a clinical diagnosis of sleep bruxism spent one night in the sleep laboratory. PSG was performed with a portable system (type 2) while audio-video was recorded. Sleep studies were scored by the same examiner three times: (1) without, (2) with, and (3) without audio-video, in order to test the intra-scoring and intra-examiner reliability of RMMA scoring. The RMMA event-by-event concordance rate between scoring without audio-video and with audio-video was 68.3%. Overall, the RMMA index was overestimated by 23.8% without audio-video. However, the intra-class correlation coefficient (ICC) between scorings with and without audio-video was good (ICC = 0.91; p < 0.001), and the intra-examiner reliability was high (ICC = 0.97; p < 0.001). The clinical diagnosis of sleep bruxism was confirmed in 8/10 subjects based on scoring without audio-video and in 6/10 subjects with audio-video. Despite the absence of audio-video recording, the diagnostic accuracy of assessing RMMA with portable PSG systems appeared to remain good, supporting their use for both research and clinical purposes. However, the risk of moderate overestimation in the absence of audio-video must be taken into account.
7 CFR 1730.21 - Inspections and tests.
Code of Federal Regulations, 2010 CFR
2010-01-01
... reliability and security of the electric power grid, cause significant risk to the safety and health of the... AGRICULTURE ELECTRIC SYSTEM OPERATIONS AND MAINTENANCE Operations and Maintenance Requirements § 1730.21... parts of its electric system, annually exercise its ERP, and maintain records of such inspections and...
Gross, Brooks A.; Walsh, Christine M.; Turakhia, Apurva A.; Booth, Victoria; Mashour, George; Poe, Gina R.
2009-01-01
Manual state scoring of physiological recordings in sleep studies is time-consuming, resulting in a data backlog, research delays and increased personnel costs. We developed MATLAB-based software to automate scoring of sleep/waking states in rats, potentially extendable to other animals, from a variety of recording systems. The software contains two programs, Sleep Scorer and Auto-Scorer, for manual and automated scoring. Auto-Scorer is a logic-based program that displays power spectral densities of an electromyographic signal and σ, δ, and θ frequency bands of an electroencephalographic signal, along with the δ/θ ratio and σ × θ, for every epoch. The user defines thresholds from the training file state definitions, which the Auto-Scorer uses with logic to discriminate the state of every epoch in the file. Auto-Scorer was evaluated by comparing its output to manually scored files from 6 rats under 2 experimental conditions by 3 users. Each user generated a training file, set thresholds, and autoscored the 12 files into 4 states (waking, non-REM, transition-to-REM, and REM sleep) in ¼ the time required to manually score the file. Overall performance comparisons between Auto-Scorer and manual scoring resulted in a mean agreement of 80.24 ± 7.87%, comparable to the average agreement among 3 manual scorers (83.03 ± 4.00%). There was no significant difference between user-user and user-Auto-Scorer agreement ratios. These results support the use of our open-source Auto-Scorer, coupled with user review, to rapidly and accurately score sleep/waking states from rat recordings. PMID:19615408
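The logic-based discrimination described above can be pictured as a small decision rule over band powers; the following is a toy sketch under assumed threshold names and rule ordering, not the published Auto-Scorer code.

```python
def score_epoch(emg_power, delta, theta, sigma, thr):
    """Classify one epoch from EMG power and EEG band powers using
    user-defined thresholds `thr` taken from a manually scored training file
    (illustrative logic only)."""
    if emg_power > thr["emg_wake"]:
        return "WAKE"                              # high muscle tone
    if delta / theta > thr["delta_theta_nrem"]:
        return "NREM"                              # delta-dominated sleep, low EMG
    if sigma * theta > thr["sigma_theta_trans"]:
        return "TRANSITION_TO_REM"                 # sigma and theta both elevated
    return "REM"                                   # low EMG, theta-dominated
```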
NASA Astrophysics Data System (ADS)
Nightingale, M. P. S.; Kugel, H.; Gee, S. J.; Price, M. N.
1999-01-01
Theoretical modeling of 1-2 MW positive hydrogen ion neutral injectors developed at Oak Ridge National Laboratory (ORNL) has suggested that the plasma grid temperature could rise by up to 180 °C at pulse lengths above 0.5 s, leading to a grid deformation on the order of 5 mm, with a consequent change in focal length (from 4 to 2 m) and beamlet focusing. One of these injectors (on loan from ORNL) was used to achieve record β values on the Small Tight Aspect Ratio Tokamak at Culham, and two more are to be used on the Mega-Ampere Spherical Tokamak (MAST) at pulse lengths of up to 5 s. Since the grid modeling has never been tested experimentally, a method for diagnosing changes in beam transport as a function of pulse length using light emitted by the beam is now under development at Culham to see if grid modifications are required for MAST. Initial experimental results, carried out using a 50 A 30 keV hydrogen beam, are presented (including comparison with thermocouple data using an EK98 graphite beam stop). These confirm that emission measurement should allow the accelerator focal length and beamlet divergence to be determined to accuracies of better than ±0.45 m and ±0.2°, respectively (compared to nominal values of 4 m and 1.2°).
Siciliano, Mattia; Raimo, Simona; Tufano, Dario; Basile, Giuseppe; Grossi, Dario; Santangelo, Franco; Trojano, Luigi; Santangelo, Gabriella
2016-03-01
The Addenbrooke's Cognitive Examination Revised (ACE-R) is a rapid screening battery including five sub-scales that explore different cognitive domains: attention/orientation, memory, fluency, language and visuospatial. The ACE-R is considered useful in discriminating cognitively normal subjects from patients with mild dementia. The aim of the present study was to provide normative values for the ACE-R total score and sub-scale scores in a large sample of Italian healthy subjects. Five hundred twenty-six Italian healthy subjects (282 women and 246 men) of different ages (age range 20-93 years) and educational levels (from primary school to university) underwent the ACE-R and the Montreal Cognitive Assessment (MoCA). Multiple linear regression analysis revealed that age and education significantly influenced performance on the ACE-R total score and sub-scale scores. A significant effect of gender was found only on the attention/orientation sub-scale. From the derived linear equation, a correction grid for raw scores was built. Inferential cut-off scores were estimated using a non-parametric technique, and equivalent scores (ES) were computed. Correlation analysis showed a good, significant correlation between ACE-R adjusted scores and MoCA adjusted scores (r = 0.612, p < 0.001). The present study provides normative data for the ACE-R in an Italian population, useful for both clinical and research purposes.
Personality scores and smoking behaviour. A longitudinal study.
Cherry, N; Kiernan, K
1976-01-01
The personality scores at 16 years of age of 2753 people, all members of the National Survey of Health and Development, were related, in a follow-up study, to cigarette smoking behaviour in their young adult years. Survey members who recorded high neuroticism scores were found to be more likely to smoke than those with low scores and, among the smokers, deep inhalers formed the most neurotic group. Extraverts were more likely to smoke than introverts, the mean extraversion score being greatest for the male smokers with a high daily consumption of cigarettes. The personality scores were found to have some power in predicting changes in smoking behaviour. Neurotics and extraverts who had not started to smoke by the time of completing the personality inventory at 16 were more likely than the stable and introverted to take up the habit subsequently. Among survey members who were regular smokers at the time of completing the personality inventory the proportion giving up smoking by the time they reached the age of 25 years was related to consumption level recorded at 20 years and the personality scores recorded at 16, stable extraverts among the men being most likely to stop smoking. PMID:953376
77 FR 68123 - Privacy Act of 1974; System of Records Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-15
... test records, including registrant's first and last name, evaluation data, pretest and posttest scores..., pretest and posttest scores, and registration information, will be disclosed to accrediting bodies (such... educational information, training, best practices, and tools to health professionals as one initiative to help...
Grid site availability evaluation and monitoring at CMS
Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; ...
2017-10-01
The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from hundreds to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC, the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site issues from central service issues and to make evaluations more transparent and informative to site support staff are planned.
Grid site availability evaluation and monitoring at CMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe
The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from hundreds to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC, the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site issues from central service issues and to make evaluations more transparent and informative to site support staff are planned.
Grid site availability evaluation and monitoring at CMS
NASA Astrophysics Data System (ADS)
Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; Lammel, Stephan; Sciabà, Andrea
2017-10-01
The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from hundreds to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC, the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Enhancements to better distinguish site issues from central service issues and to make evaluations more transparent and informative to site support staff are planned.
Nast, Justin B
2014-11-01
In 2011, over 3,000 active duty U.S. Air Force (USAF) members were prescribed a phosphodiesterase inhibitor (PDEI). PDEIs are first-line therapy for treating erectile dysfunction and can have significant side effects that could impact aircrew performance. In total, 200 eligible subject records were randomly sampled from the active duty USAF population of those males filling a prescription for a PDEI in June 2011; 100 of those records were from aviators. The electronic records were reviewed and scored to determine if USAF aeromedical standards for prescribing PDEIs were followed, with a minimum score of 0 for no standards met and a maximum of 3 for all standards met. The average score for both groups was 1, with no significant difference between the group scores. A proper aeromedical disposition was documented in 67% of the aviator records. Although there was no significant difference in standard of care for aviators and nonaviators, the overall documented standard of care was poor. Lack of documentation was the primary reason for the low scores and the low percentage of properly rendered aeromedical dispositions. Proper medical record documentation is important for evaluating quality of care and ensuring compliance with regulations in an Air Force aviator population. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.
van Staaveren, N; Teixeira, D L; Hanlon, A; Boyle, L A
2017-01-01
Tail lesions are important pig welfare indicators that could be recorded during meat inspection as they are more visible on the carcass than on the live animal. Tail biting is associated with reduced performance in the bitten pig, but it is not clear whether problems with tail biting are reflected in general farm performance figures. Farm advisory services aim to improve farm productivity which could be associated with improvements in pig welfare. Record keeping forms an integral part of such advisory services. The aim of this study was to examine the influence of record keeping in the Teagasc eProfit Monitor (ePM herds) on the prevalence of tail lesion severity scores in Irish slaughter pigs. In addition, we investigated associations between the prevalence of tail lesion scores and production parameters at farm level in ePM herds. Pigs were observed after scalding/dehairing and tail lesion score (0 to 4), sex and farm identification were recorded. Tail lesion scores were collapsed into none/mild lesions (score ⩽1), moderate lesions (score 2) and severe lesions (score ⩾3). The effect of record keeping (ePM herd) on the different tail lesion outcomes was analysed at batch level using the events/trials structure in generalized linear mixed models (PROC GLIMMIX). Spearman's rank correlations were calculated between average tail lesion score of a batch and production parameters. A total of 13 133 pigs were assessed from 73 batches coming from 61 farms. In all, 23 farms were identified as ePM herds. The average prevalence of moderate tail lesions was 26.8% and of severe tail lesions was 3.4% in a batch. Batches coming from ePM herds had a lower prevalence of moderate tail lesions than non-ePM herds (P<0.001). Average tail lesion score was negatively associated with age (P<0.05) and weight (P<0.05) at sale/transfer of weaners, and tended to be positively associated with the number of finishing days (P=0.06). In addition, the prevalence of severe tail lesions was negatively associated with average daily gain in weaners (P<0.05) and tended to do so with average daily gain in finishers (P=0.08). This study provides the first indication that record keeping through an advisory service may help to lower the risk of tail biting, which is associated with improved farm performance.
The Impact of Sika Deer on Vegetation in Japan: Setting Management Priorities on a National Scale
NASA Astrophysics Data System (ADS)
Ohashi, Haruka; Yoshikawa, Masato; Oono, Keiichi; Tanaka, Norihisa; Hatase, Yoriko; Murakami, Yuhide
2014-09-01
Irreversible shifts in ecosystems caused by large herbivores are becoming widespread around the world. We analyzed data derived from the 2009-2010 Sika Deer Impact Survey, which assessed the geographical distribution of deer impacts on vegetation through a questionnaire, on a scale of 5-km grid-cells. Our aim was to identify areas facing irreversible ecosystem shifts caused by deer overpopulation and in need of management prioritization. Our results demonstrated that the areas with heavy impacts on vegetation were widely distributed across Japan from north to south and from the coastal to the alpine areas. Grid-cells with heavy impacts are especially expanding in the southwestern part of the Pacific side of Japan. The intensity of deer impacts was explained by four factors: (1) the number of 5-km grid-cells with sika deer in neighboring 5 km-grid-cells in 1978 and 2003, (2) the year sika deer were first recorded in a grid-cell, (3) the number of months in which maximum snow depth exceeded 50 cm, and (4) the proportion of urban areas in a particular grid-cell. Based on our model, areas with long-persistent deer populations, short snow periods, and fewer urban areas were predicted to be the most vulnerable to deer impact. Although many areas matching these criteria already have heavy deer impact, there are some areas that remain only slightly impacted. These areas may need to be designated as having high management priority because of the possibility of a rapid intensification of deer impact.
The impact of Sika deer on vegetation in Japan: setting management priorities on a national scale.
Ohashi, Haruka; Yoshikawa, Masato; Oono, Keiichi; Tanaka, Norihisa; Hatase, Yoriko; Murakami, Yuhide
2014-09-01
Irreversible shifts in ecosystems caused by large herbivores are becoming widespread around the world. We analyzed data derived from the 2009-2010 Sika Deer Impact Survey, which assessed the geographical distribution of deer impacts on vegetation through a questionnaire, on a scale of 5-km grid-cells. Our aim was to identify areas facing irreversible ecosystem shifts caused by deer overpopulation and in need of management prioritization. Our results demonstrated that the areas with heavy impacts on vegetation were widely distributed across Japan from north to south and from the coastal to the alpine areas. Grid-cells with heavy impacts are especially expanding in the southwestern part of the Pacific side of Japan. The intensity of deer impacts was explained by four factors: (1) the number of 5-km grid-cells with sika deer in neighboring 5 km-grid-cells in 1978 and 2003, (2) the year sika deer were first recorded in a grid-cell, (3) the number of months in which maximum snow depth exceeded 50 cm, and (4) the proportion of urban areas in a particular grid-cell. Based on our model, areas with long-persistent deer populations, short snow periods, and fewer urban areas were predicted to be the most vulnerable to deer impact. Although many areas matching these criteria already have heavy deer impact, there are some areas that remain only slightly impacted. These areas may need to be designated as having high management priority because of the possibility of a rapid intensification of deer impact.
Constraining earthquake source inversions with GPS data: 1. Resolution-based removal of artifacts
Page, M.T.; Custodio, S.; Archuleta, R.J.; Carlson, J.M.
2009-01-01
We present a resolution analysis of an inversion of GPS data from the 2004 Mw 6.0 Parkfield earthquake. This earthquake was recorded at thirteen 1-Hz GPS receivers, which provides for a truly coseismic data set that can be used to infer the static slip field. We find that the resolution of our inverted slip model is poor at depth and near the edges of the modeled fault plane that are far from GPS receivers. The spatial heterogeneity of the model resolution in the static field inversion leads to artifacts in poorly resolved areas of the fault plane. These artifacts look qualitatively similar to asperities commonly seen in the final slip models of earthquake source inversions, but in this inversion they are caused by a surplus of free parameters. The location of the artifacts depends on the station geometry and the assumed velocity structure. We demonstrate that a nonuniform gridding of model parameters on the fault can remove these artifacts from the inversion. We generate a nonuniform grid with a grid spacing that matches the local resolution length on the fault and show that it outperforms uniform grids, which either generate spurious structure in poorly resolved regions or lose recoverable information in well-resolved areas of the fault. In a synthetic test, the nonuniform grid correctly averages slip in poorly resolved areas of the fault while recovering small-scale structure near the surface. Finally, we present an inversion of the Parkfield GPS data set on the nonuniform grid and analyze the errors in the final model. Copyright 2009 by the American Geophysical Union.
NASA Astrophysics Data System (ADS)
Petersson, Anders; Rodgers, Arthur
2010-05-01
The finite difference method on a uniform Cartesian grid is a highly efficient and easy to implement technique for solving the elastic wave equation in seismic applications. However, the spacing in a uniform Cartesian grid is fixed throughout the computational domain, whereas the resolution requirements in realistic seismic simulations usually are higher near the surface than at depth. This can be seen from the well-known formula h ≤ L/P, which relates the grid spacing h to the wave length L and the required number of grid points per wavelength P for obtaining an accurate solution. The compressional and shear wave lengths in the earth generally increase with depth and are often a factor of ten larger below the Moho discontinuity (at about 30 km depth) than in sedimentary basins near the surface. A uniform grid must have a grid spacing based on the small wave lengths near the surface, which results in over-resolving the solution at depth. As a result, the number of points in a uniform grid is unnecessarily large. In the wave propagation project (WPP) code, we address the over-resolution-at-depth issue by generalizing our previously developed single grid finite difference scheme to work on a composite grid consisting of a set of structured rectangular grids of different spacings, with hanging nodes on the grid refinement interfaces. The computational domain in a regional seismic simulation often extends to depth 40-50 km. Hence, using a refinement ratio of two, we need about three grid refinements from the bottom of the computational domain to the surface, to keep the local grid size in approximate parity with the local wave lengths. The challenge of the composite grid approach is to find a stable and accurate method for coupling the solution across the grid refinement interface. Of particular importance is the treatment of the solution at the hanging nodes, i.e., the fine grid points which are located in between coarse grid points. WPP implements a new, energy conserving, coupling procedure for the elastic wave equation at grid refinement interfaces. When used together with our single grid finite difference scheme, it results in a method which is provably stable, without artificial dissipation, for arbitrary heterogeneous isotropic elastic materials. The new coupling procedure is based on satisfying the summation-by-parts principle across refinement interfaces. From a practical standpoint, an important advantage of the proposed method is the absence of tunable numerical parameters, which seldom are appreciated by application experts. In WPP, the composite grid discretization is combined with a curvilinear grid approach that enables accurate modeling of free surfaces on realistic (non-planar) topography. The overall method satisfies the summation-by-parts principle and is stable under a CFL time step restriction. A feature of great practical importance is that WPP automatically generates the composite grid based on the user provided topography and the depths of the grid refinement interfaces. The WPP code has been verified extensively, for example using the method of manufactured solutions, by solving Lamb's problem, by solving various layer-over-half-space problems and comparing to semi-analytic (FK) results, and by simulating scenario earthquakes where results from other seismic simulation codes are available. WPP has also been validated against seismographic recordings of moderate earthquakes.
WPP performs well on large parallel computers and has been run on up to 32,768 processors using about 26 billion grid points (78 billion DOF) and 41,000 time steps. WPP is an open-source code that is available under the GNU General Public License.
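The points-per-wavelength rule quoted above translates directly into a minimum grid spacing; a small sketch with made-up material values illustrates why a slow near-surface basin forces a roughly ten times finer grid than the material below the Moho.

```python
def max_grid_spacing(min_wave_speed_m_s, max_frequency_hz, points_per_wavelength):
    """h <= L / P with L = c_min / f_max: the coarsest spacing that still
    resolves the shortest wavelength with P grid points."""
    shortest_wavelength = min_wave_speed_m_s / max_frequency_hz
    return shortest_wavelength / points_per_wavelength

# Illustrative numbers only: a slow sedimentary layer versus sub-Moho material.
h_surface = max_grid_spacing(500.0, 2.0, 8)    # 31.25 m
h_deep = max_grid_spacing(5000.0, 2.0, 8)      # 312.5 m
```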
Grid mapping: a novel method of signal quality evaluation on a single lead electrocardiogram.
Li, Yanjun; Tang, Xiaoying
2017-12-01
Diagnosis from long-term electrocardiogram (ECG) recordings calls for automatic and accurate methods of ECG signal quality estimation, not only to lighten the burden on doctors but also to avoid misdiagnoses. In this paper, a novel waveform-based method of phase-space reconstruction for signal quality estimation on a single-lead ECG was proposed by projecting the amplitude of the ECG and its first-order difference into grid cells. The waveform of a single-lead ECG was divided into non-overlapping episodes (T s = 10, 20, 30 s), and the number of grid cells in both the width and the height of each map is in the range [20, 100] (N X = N Y = 20, 30, 40, … 90, 100). The blank pane ratio (BPR) and the entropy were calculated from the distribution of ECG sampling points projected into the grid cells. The signal quality indices (SQIs) bSQI and eSQI were calculated from the BPR and the entropy, respectively. The MIT-BIH Noise Stress Test Database was used to test the performance of bSQI and eSQI for ECG signal quality estimation. The signal-to-noise ratio (SNR) during the noisy segments of the ECG records in the database is 24, 18, 12, 6, 0 and −6 dB, respectively. For the quantitative SQI analysis, the records were divided into three groups: a good-quality group (24, 18 dB), a moderate group (12, 6 dB) and a bad-quality group (0, −6 dB). The classification among the good-, moderate- and bad-quality groups was made by a linear support-vector machine with the combination of the BPR, the entropy, the bSQI and the eSQI. The classification accuracy was 82.4% and Cohen's kappa coefficient was 0.74 for N X = 40 and T s = 20 s. In conclusion, the novel grid mapping offers an intuitive and simple approach to signal quality estimation on a single-lead ECG.
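A compact way to picture the grid mapping is to bin each sample's (amplitude, first-order difference) pair into an N_X by N_Y grid and derive the two statistics from the occupancy counts; the sketch below is an interpretation of that idea, not the authors' code, and the normalization choices are assumptions.

```python
import numpy as np

def grid_map_quality(ecg, n_x=40, n_y=40):
    """Return the blank pane ratio (fraction of empty cells) and the Shannon
    entropy of the cell-occupancy distribution for one ECG episode."""
    x = np.asarray(ecg, dtype=float)
    dx = np.diff(x)
    x = x[:-1]                                            # align samples with their differences
    ix = np.clip(((x - x.min()) / (np.ptp(x) + 1e-12) * n_x).astype(int), 0, n_x - 1)
    iy = np.clip(((dx - dx.min()) / (np.ptp(dx) + 1e-12) * n_y).astype(int), 0, n_y - 1)
    counts = np.zeros((n_x, n_y))
    np.add.at(counts, (ix, iy), 1)                        # occupancy of each grid cell
    bpr = float(np.mean(counts == 0))
    p = counts[counts > 0] / counts.sum()
    entropy = float(-np.sum(p * np.log2(p)))
    return bpr, entropy
```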
NASA Astrophysics Data System (ADS)
Zhu, Kefeng; Xue, Ming
2016-11-01
On 21 July 2012, an extreme rainfall event, with a recorded maximum rainfall amount of 460 mm over 24 hours, occurred in Beijing, China. Most operational models failed to predict such an extreme amount. In this study, a convection-permitting ensemble forecast system (CEFS), at 4-km grid spacing, covering the entire mainland of China, is applied to this extreme rainfall case. CEFS consists of 22 members and uses multiple physics parameterizations. For the event, the predicted maximum is 415 mm per day in the probability-matched ensemble mean. The predicted high-probability heavy-rain region is located in southwest Beijing, as was observed. Ensemble-based verification scores are then investigated. For a small verification domain covering Beijing and its surrounding areas, the precipitation rank histogram of CEFS is much flatter than that of a reference global ensemble. CEFS has a lower Brier score and a higher resolution than the global ensemble for precipitation, indicating more reliable probabilistic forecasting by CEFS. Additionally, forecasts of different ensemble members are compared and discussed. Most of the extreme rainfall comes from convection in the warm sector east of an approaching cold front. A few members of CEFS successfully reproduce such precipitation, and orographic lift of highly moist low-level flows with a significant southeasterly component is suggested to have played an important role in producing the initial convection. Comparisons between good and bad forecast members indicate a strong sensitivity of the extreme rainfall to the mesoscale environmental conditions and, to a lesser extent, to the model physics.
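The Brier score referred to above is the mean squared difference between the ensemble-derived event probability and the binary outcome; a short sketch, with the exceedance threshold chosen purely for illustration, is given below.

```python
import numpy as np

def brier_score(forecast_probs, outcomes):
    """Brier score for a probabilistic event forecast; lower is better."""
    p = np.asarray(forecast_probs, dtype=float)
    o = np.asarray(outcomes, dtype=float)       # 1 if the event occurred, else 0
    return float(np.mean((p - o) ** 2))

# With a 22-member ensemble, the forecast probability of 24-h rain exceeding a
# threshold at each grid point is simply the fraction of members above it:
# probs = (member_rain_24h > 100.0).mean(axis=0)
```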
Lunar Dust Monitor to BE Onboard the Next Japanese Lunar Mission SELENE-2
NASA Astrophysics Data System (ADS)
Ohashi, Hideo
The next Japanese lunar mission SELENE-2, following the successful Kaguya mission (the SELENE project), is planned to be launched in the mid-2010s and consists of a lander, a rover, and an orbiter that serves as a relay satellite to the Earth. A dust particle detector, the LDM (Lunar Dust Monitor), is proposed to be onboard the orbiter. The LDM is an impact ionization detector with dimensions 25 cm × 25 cm × 30 cm, and it has a sensor part (LDM-S, upper module) and an electronics part (LDM-E, lower module). The LDM-S has a large target (gold-plated Al) of 400 cm2, to which a high voltage of +500 V is applied. The LDM-S also has two meshed grids parallel to the target. The grids are etched stainless steel with 90% transparency: the inner grid is 2 cm from the target and the outer grid is 15 cm from the target. When a charged dust particle passes through the outer and inner grids, it induces electric signals on the grids separated by a certain time interval, determined by the velocity of the incident particle and the distance between the outer and inner grids. By measuring the time interval, we can calculate the velocity of the particle, with some ambiguity due to its trajectory to the target. When the incident particle impacts the target, a plasma of electrons and ions is generated. The electrons of the plasma are collected by the target and the ions are accelerated toward the inner grid by the electric field. Some of the ions drift through the inner grid and reach the outer grid. The outer and inner grids and the target are connected to charge-sensitive amplifiers, which convert the charge signals induced by the electrons and ions into voltage signals that are fed to a flash ADC sampling at 10 MHz. The waveforms from the two grids and the target can be stored and sent back to the ground for data analysis. We can deduce the mass and velocity of the incident dust particle from the recorded waveforms. The orbiter of SELENE-2 is planned to operate for one year or more, and the LDM will observe circumlunar dust for as long as possible. We report the scientific importance of dust measurements around the Moon and the current status of the LDM at this conference.
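The velocity measurement reduces to timing the induced signals on the two grids, which are nominally 13 cm apart according to the description above; the conversion is sketched below with hypothetical numbers.

```python
def particle_speed_km_s(t_outer_s, t_inner_s, grid_separation_m=0.13):
    """Dust grain speed from the delay between the signals induced on the outer
    grid (15 cm from the target) and the inner grid (2 cm from the target)."""
    dt = t_inner_s - t_outer_s          # time of flight between the two grids
    return grid_separation_m / dt / 1000.0

# Example: a 6.5 microsecond delay over 13 cm corresponds to 20 km/s.
# particle_speed_km_s(0.0, 6.5e-6)  # -> 20.0
```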
Apply lightweight recognition algorithms in optical music recognition
NASA Astrophysics Data System (ADS)
Pham, Viet-Khoi; Nguyen, Hai-Dang; Nguyen-Khac, Tung-Anh; Tran, Minh-Triet
2015-02-01
The problems of digitizing and transforming musical scores into machine-readable format need to be solved, since solutions help people to enjoy music, to learn music, to conserve music sheets, and even to assist music composers. However, the results of existing methods still require improvement for higher accuracy. Therefore, the authors propose lightweight algorithms for Optical Music Recognition to help people recognize and automatically play musical scores. In our proposal, after removing staff lines and extracting symbols, each music symbol is represented as a grid of identical M × N cells, and the features are extracted and classified with multiple lightweight SVM classifiers. Through experiments, the authors find that a size of 10 × 12 cells yields the highest precision. Experimental results on a dataset consisting of 4929 music symbols taken from 18 modern music sheets in the Synthetic Score Database show that the proposed method is able to classify printed musical scores with accuracy up to 99.56%.
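A sketch of the grid-cell feature idea is given below, using the foreground-pixel fraction in each cell of a 10 by 12 grid and a linear SVM from scikit-learn as a stand-in for the multiple lightweight classifiers described in the paper; the feature definition and classifier choice are assumptions for illustration.

```python
import numpy as np
from sklearn.svm import LinearSVC

def grid_features(symbol_img, n_rows=12, n_cols=10):
    """Fraction of foreground pixels in each cell of a grid laid over a
    binary symbol image, flattened into a feature vector."""
    img = np.asarray(symbol_img, dtype=float)
    rows = np.array_split(np.arange(img.shape[0]), n_rows)
    cols = np.array_split(np.arange(img.shape[1]), n_cols)
    return np.array([img[np.ix_(r, c)].mean() for r in rows for c in cols])

# Hypothetical training on already-segmented symbol images and labels:
# X = np.stack([grid_features(img) for img in symbol_images])
# clf = LinearSVC().fit(X, labels)      # one-vs-rest linear classifiers
# predicted = clf.predict(X)
```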
PepArML: A Meta-Search Peptide Identification Platform
Edwards, Nathan J.
2014-01-01
The PepArML meta-search peptide identification platform provides a unified search interface to seven search engines; a robust cluster, grid, and cloud computing scheduler for large-scale searches; and an unsupervised, model-free, machine-learning-based result combiner, which selects the best peptide identification for each spectrum, estimates false-discovery rates, and outputs pepXML format identifications. The meta-search platform supports Mascot; Tandem with native, k-score, and s-score scoring; OMSSA; MyriMatch; and InsPecT with MS-GF spectral probability scores — reformatting spectral data and constructing search configurations for each search engine on the fly. The combiner selects the best peptide identification for each spectrum based on search engine results and features that model enzymatic digestion, retention time, precursor isotope clusters, mass accuracy, and proteotypic peptide properties, requiring no prior knowledge of feature utility or weighting. The PepArML meta-search peptide identification platform often identifies 2–3 times more spectra than individual search engines at 10% FDR. PMID:25663956
Digging into Archaeology Projects.
ERIC Educational Resources Information Center
Grambo, Greg
1996-01-01
Suggestions are offered for a classroom project of planning and conducting an archaeological dig on or near school property. Principles of archaeological practice such as making drawings of the site and using a grid frame to record locations are explained. Also suggested is a simulation activity in which students pick imbedded "findings" out of…
2008-09-01
DEMONSTRATOR’S FIELD PERSONNEL Geophysicist: Craig Hyslop Geophysicist: John Jacobsen Geophysicist: Rob Mehl 3.7 DEMONSTRATOR’S FIELD SURVEYING...Yuma Proving Ground Soil Survey Report, May 2003. 5. Practical Nonparametric Statistics, W.J. Conover, John Wiley & Sons, 1980 , pages 144 through
The ANKLe Score: An Audit of Otolaryngology Emergency Clinic Record Keeping
Dexter, Sara C; Hayashi, Daichi; Tysome, James R
2008-01-01
INTRODUCTION Accurate and legible medical records are essential to good quality patient care. Guidelines from The Royal College of Surgeons of England (RCSE) state the content required to form a complete medical record, but do not address legibility. An audit of otolaryngology emergency clinic record keeping was performed using a new scoring system. PATIENTS AND METHODS The Adjusted Note Keeping and Legibility (ANKLe) score was developed as an objective and quantitative method to assess both the content and legibility of case notes, incorporating the RCSE guidelines. Twenty consecutive otolaryngology emergency clinic case notes from each of 7 senior house officers were audited against standards for legibility and content using the ANKLe score. A proforma was introduced to improve documentation and handwriting advice was given. A further set of 140 notes (20 notes for each of the 7 doctors) was audited in the same way to provide feedback. RESULTS The introduction of a proforma and advice on handwriting significantly increased the quality of case note entries in terms of content, legibility and overall ANKLe score. CONCLUSIONS Accurate note keeping can be improved by the use of a proforma. The legibility of handwriting can be improved using simple advice. The ANKLe score is an objective assessment tool of the overall quality of medical note documentation which can be adapted for use in other specialties. PMID:18430339
Discovering amino acid patterns on binding sites in protein complexes
Kuo, Huang-Cheng; Ong, Ping-Lin; Lin, Jung-Chang; Huang, Jen-Peng
2011-01-01
Discovering amino acid (AA) patterns on protein binding sites has recently become popular. We propose a method to discover association relationships among AAs on binding sites; such knowledge is very helpful in predicting protein-protein interactions. In this paper we focus on protein complexes that exhibit protein-protein recognition. Association rule mining is used to discover geographically adjacent amino acids on a binding site of a protein complex. When mining, instead of treating all AAs of a binding site as one transaction, we geographically partition the AAs of binding sites in a protein complex, and the AAs in each partition are treated as a transaction. For the partitioning, AAs on a binding site are projected from three dimensions to two and then, with the aid of a circular grid, placed into grid cells. The circular grid has ten rings: a central ring, a second ring with 6 sectors, a third ring with 12 sectors, and later rings with four more sectors added in turn. As for the radius of each ring, we examined the complexes and found 10 Å to be a suitable value, which can be set by the user. After placing the recognition complexes on the circular grid, we obtain mining records (i.e. transactions) from each sector; a sector is regarded as a record. Finally, we apply association rule mining to these records to find frequent AA patterns: if the support of an AA pattern exceeds the predetermined minimum support (i.e. threshold), it is called a frequent pattern. These discovered patterns offer biologists a novel point of view that can improve the prediction accuracy of protein-protein recognition. In our experiments we found that arginine (arg) appears most frequently on the binding sites of the two proteins in the recognition complexes, while cysteine (cys) appears least frequently. In addition, if we further discriminate binding-site shape between concave and convex, we find that the patterns {arg, glu, asp} and {arg, ser, asp} on the concave binding sites of one protein more frequently (i.e. with higher probability) make contact with {lys} or {arg} on the convex binding sites of another protein, with a confidence of at least 78%. Conversely, {val, gly, lys} on the convex binding sites of one protein is more frequently in contact with {asp} on the concave site of another protein, with a confidence of over 81%. Applying data mining in biology can reveal facts that might otherwise be overlooked or not easily discovered by eye. Furthermore, more relationships among AAs on binding sites can be discovered by appropriately rotating the residues when projecting them from three dimensions to two. We designed a circular grid to hold the data, which total 463 records of AAs, and used association rules to mine these records for relationships. The proposed method provides insight into the characteristics of binding sites in recognition complexes.
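The circular-grid partitioning lends itself to a short sketch. The code below assigns projected 2-D residue coordinates to (ring, sector) cells and groups them into transactions; the 10 Å ring increment follows the abstract, while the exact sector counts for the outer rings are an assumption based on the description rather than the authors' layout.

```python
# Sketch: assign 2-D projected amino-acid coordinates (x, y, in angstroms) to
# cells of a circular grid. The ring increment is 10 A as in the abstract; the
# sector counts (1, 6, 12, then four more per ring) are an illustrative reading
# of the description, not the authors' exact layout.
import math

RING_WIDTH = 10.0

def sectors_in_ring(ring: int) -> int:
    if ring == 0:
        return 1            # central ring is a single cell
    if ring == 1:
        return 6
    if ring == 2:
        return 12
    return 12 + 4 * (ring - 2)   # later rings add four sectors each (assumption)

def cell_of(x: float, y: float) -> tuple:
    """Return (ring index, sector index) of a projected residue position."""
    r = math.hypot(x, y)
    ring = int(r // RING_WIDTH)
    n = sectors_in_ring(ring)
    angle = math.atan2(y, x) % (2 * math.pi)
    return ring, int(angle / (2 * math.pi) * n)

# Group residues by cell; each cell then becomes one transaction for rule mining.
residues = [("arg", 3.0, 4.0), ("asp", 12.0, 1.0), ("glu", 14.0, 2.5)]
transactions = {}
for aa, x, y in residues:
    transactions.setdefault(cell_of(x, y), set()).add(aa)
print(transactions)
```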
NASA Astrophysics Data System (ADS)
Ramsdale, Jason; Balme, Matthew; Conway, Susan
2015-04-01
An International Space Science Institute (ISSI) team project has been convened to study the northern plains of Mars. The northern plains are younger and at lower elevation than the majority of the martian surface and are thought to be the remnants of an ancient ocean. Understanding the surface geology and geomorphology of the northern plains is complex, because the surface has been subtly modified many times, making traditional unit boundaries hard to define. Our ISSI team project aims to answer the following questions: 1) "What is the distribution of ice-related landforms in the northern plains, and can it be related to distinct latitude bands or different geological or geomorphological units?" 2) "What is the relationship between the latitude dependent mantle (LDM; a draping unit believed to comprise ice and dust deposited during periods of high axial obliquity) and (i) landforms indicative of ground ice, and (ii) other geological units in the northern plains?" 3) "What are the distributions and associations of recent landforms indicative of thaw of ice or snow?" With increasing coverage of high-resolution images of the martian surface, we are able to identify increasing numbers and varieties of small-scale landforms on Mars. Many such landforms are too small to represent on regional maps, yet determining their presence or absence across large areas can form the observational basis for developing hypotheses on the nature and history of an area. The combination of improved spatial resolution with near-continuous coverage increases the time required to analyse the data. This becomes problematic when attempting regional or global-scale studies of metre-scale landforms. Here, we describe an approach to mapping small features across large areas. Rather than traditional mapping with points, lines and polygons, we used a grid "tick box" approach to locate specific landforms. The mapping strips were divided into a 15×150 grid of squares, each approximately 20×20 km, for each study area. Orbital images at 6-15 m/pix were then viewed systematically for each grid square and the presence or absence of each of the basic suite of landforms recorded. The landforms were recorded as being either "present", "dominant", "possible", or "absent" in each grid square. The result is a series of coarse-resolution "rasters" showing the distribution of the different types of landforms across the strip. We have found this approach to be efficient, scalable and appropriate for teams of people mapping remotely. It is easily scalable because carrying the "absent" values forward from the coarser grid to finer grids means that only areas with positive values for that landform need to be examined to increase the resolution for the whole strip. Because each grid square only requires the presence or absence of a landform to be ascertained, the method removes an individual's decision as to where to draw boundaries, making it efficient and repeatable.
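A minimal sketch of how such a grid "tick box" record might be stored and refined; the category codes, array contents, and landform name are illustrative rather than the team's actual schema.

```python
# Sketch: store grid "tick box" landform mapping as a coarse raster of
# categorical codes and carry "absent" cells forward when refining the grid.
# Category codes and the example landform are illustrative only.
import numpy as np

ABSENT, POSSIBLE, PRESENT, DOMINANT = 0, 1, 2, 3

# A 15 x 150 strip of ~20 km squares for one landform (here filled randomly).
rng = np.random.default_rng(1)
polygonised_ground = rng.choice([ABSENT, POSSIBLE, PRESENT, DOMINANT],
                                size=(15, 150), p=[0.7, 0.1, 0.15, 0.05])

def refine(coarse: np.ndarray, factor: int = 2):
    """Split each square into factor x factor sub-squares and flag which need review.

    Cells marked ABSENT at the coarse level are carried forward as ABSENT,
    so only the remaining cells need re-examination at the finer scale.
    """
    fine = np.repeat(np.repeat(coarse, factor, axis=0), factor, axis=1)
    needs_review = fine != ABSENT
    return fine, needs_review

fine, needs_review = refine(polygonised_ground)
print("fraction of fine cells needing re-examination:", needs_review.mean())
```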
Mapping Seabird Sensitivity to Offshore Wind Farms
Bradbury, Gareth; Trinder, Mark; Furness, Bob; Banks, Alex N.; Caldow, Richard W. G.; Hume, Duncan
2014-01-01
We present a Geographic Information System (GIS) tool, SeaMaST (Seabird Mapping and Sensitivity Tool), to provide evidence on the use of sea areas by seabirds and inshore waterbirds in English territorial waters, mapping their relative sensitivity to offshore wind farms. SeaMaST is a freely available evidence source for use by all connected to the offshore wind industry and will assist statutory agencies in assessing potential risks to seabird populations from planned developments. Data were compiled from offshore boat and aerial observer surveys spanning the period 1979–2012. The data were analysed using distance analysis and Density Surface Modelling to produce predicted bird densities across a grid covering English territorial waters at a resolution of 3 km×3 km. Coefficients of Variation were estimated for each grid cell density, as an indication of confidence in predictions. Offshore wind farm sensitivity scores were compiled for seabird species using English territorial waters. The comparative risks to each species of collision with turbines and displacement from operational turbines were reviewed and scored separately, and the scores were multiplied by the bird density estimates to produce relative sensitivity maps. The sensitivity maps reflected well the amassed distributions of the most sensitive species. SeaMaST is an important new tool for assessing potential impacts on seabird populations from offshore development at a time when multiple large areas of development are proposed which overlap with many seabird species’ ranges. It will inform marine spatial planning as well as identifying priority areas of sea usage by marine birds. Example SeaMaST outputs are presented. PMID:25210739
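The core SeaMaST calculation (per-cell density multiplied by a species-level sensitivity score, summed over species) reduces to a few lines; the sketch below uses made-up densities and scores purely to show the arithmetic, not values from the tool.

```python
# Sketch: combine per-species gridded density estimates (birds per 3 km x 3 km
# cell) with species-level wind-farm sensitivity scores to obtain a relative
# sensitivity surface, as described for SeaMaST. All values are illustrative.
import numpy as np

density = {                       # predicted density per grid cell
    "northern gannet": np.array([[1.2, 0.4], [0.0, 2.1]]),
    "common guillemot": np.array([[5.0, 3.2], [0.7, 0.1]]),
}
collision_score = {"northern gannet": 32, "common guillemot": 9}   # illustrative scores

sensitivity = sum(density[sp] * collision_score[sp] for sp in density)
print(sensitivity)   # higher values = greater relative collision sensitivity
```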
Oral health status and alveolar bone loss in treated leprosy patients of central India.
Rawlani, S M; Rawlani, S; Degwekar, S; Bhowte, R R; Motwani, M
2011-01-01
A descriptive cross-sectional study was carried out in a group of 160 leprosy patients treated with multi-drug therapy. Patients aged 25 to 60 years were considered. Of the 160 patients, 50 were selected by simple random sampling for radiological assessment. Intra-oral periapical radiographs (6 per patient) were taken using the paralleling long-cone technique, and the radiographs were overlaid with grids to enable measurement of bone height; the grid had 1 mm markings and was placed directly over the film. Clinical examination revealed that the prevalence of dental caries was 76.25% and of periodontal disease 78.75%. The mean DMFT score was 2.26, the mean OHI-S score was 3.50, the gingival index score was 1.60, and the average loss of gingival attachment was 1.2 mm. Radiographic findings showed a mean alveolar bone loss of 5.05 mm in the maxillary anterior region and 4.92 mm in the maxillary posterior region; alveolar bone loss was 4.35 mm in the mandibular anterior region and 5.14 mm in the mandibular posterior region. The overall dental health status of the leprosy patients was poor, and more attention to dental care is needed. There was also a generalized increase in alveolar bone loss, which could be due to an advanced stage of the disease or late presentation to the rehabilitation centre. These patients also had peripheral neuropathy leading to hand and foot deformities, such as claw hand or hand ulcers, which made maintenance of oral hygiene difficult.
Functional redundancy of ventral spinal locomotor pathways.
Loy, David N; Magnuson, David S K; Zhang, Y Ping; Onifer, Stephen M; Mills, Michael D; Cao, Qi-lin; Darnall, Jessica B; Fajardo, Lily C; Burke, Darlene A; Whittemore, Scott R
2002-01-01
Identification of long tracts responsible for the initiation of spontaneous locomotion is critical for spinal cord injury (SCI) repair strategies. Pathways derived from the mesencephalic locomotor region and pontomedullary medial reticular formation responsible for fictive locomotion in decerebrate preparations project to the thoracolumbar levels of the spinal cord via reticulospinal axons in the ventrolateral funiculus (VLF). However, white matter regions critical for spontaneous over-ground locomotion remain unclear because cats, monkeys, and humans display varying degrees of locomotor recovery after ventral SCIs. We studied the contributions of myelinated tracts in the VLF and ventral columns (VC) to spontaneous over-ground locomotion in the adult rat using demyelinating lesions. Animals received ethidium bromide plus photon irradiation producing discrete demyelinating lesions sufficient to stop axonal conduction in the VLF, VC, VLF-VC, or complete ventral white matter (CV). Behavior [open-field Basso, Beattie, and Bresnahan (BBB) scores and grid walking] and transcranial magnetic motor-evoked potentials (tcMMEP) were studied at 1, 2, and 4 weeks after lesion. VLF lesions resulted in complete loss or severe attenuation of tcMMEPs, with mean BBB scores of 18.0, and no grid walking deficits. VC lesions produced behavior similar to VLF-lesioned animals but did not significantly affect tcMMEPs. VC-VLF and CV lesions resulted in complete loss of tcMMEP signals with mean BBB scores of 12.7 and 6.5, respectively. Our data support a diffuse arrangement of axons within the ventral white matter that may comprise a system of multiple descending pathways subserving spontaneous over-ground locomotion in the intact animal.
Gustafson, William Jr; Vogelmann, Andrew; Endo, Satoshi; Toto, Tami; Xiao, Heng; Li, Zhijin; Cheng, Xiaoping; Kim, Jinwon; Krishna, Bhargavi
2015-08-31
The Alpha 2 release is the second release from the LASSO Pilot Phase and builds upon the Alpha 1 release. Alpha 2 contains additional diagnostics in the data bundles and focuses on cases from spring-summer 2016. A data bundle is a unified package consisting of LASSO LES input and output, observations, evaluation diagnostics, and model skill scores. LES inputs include model configuration information and forcing data. LES outputs include profile statistics and full-domain fields of cloud and environmental variables. Model evaluation data consist of LES output and ARM observations co-registered on the same grid and sampling frequency. Model performance is quantified by skill scores and diagnostics in terms of cloud and environmental variables.
Monthly and seasonally verification of precipitation in Poland
NASA Astrophysics Data System (ADS)
Starosta, K.; Linkowska, J.
2009-04-01
The national meteorological service of Poland, the Institute of Meteorology and Water Management (IMWM), joined COSMO, the Consortium for Small-Scale Modelling, in July 2004. In Poland the COSMO_PL model version 3.5 ran until June 2007; since July 2007 version 4.0 has been running. The model runs in operational mode at 14-km grid spacing, twice a day (00 UTC, 12 UTC). For scientific research a model with 7-km grid spacing is also run. Monthly and seasonal verification of the 24-hour (06 UTC - 06 UTC) accumulated precipitation is presented in this paper. The COSMO_LM precipitation field was verified against a rain gauge network (308 points). Verification was performed for every month and all seasons from December 2007 to December 2008, for three forecast days and for the selected thresholds 0.5, 1, 2.5, 5, 10, 20, 25 and 30 mm. The following indices from the contingency table were calculated: FBI (frequency bias), POD (probability of detection), PON (probability of detection of a non-event), FAR (false alarm rate), TSS (true skill statistic), HSS (Heidke skill score) and ETS (equitable threat score). Percentile ranks and the ROC (relative operating characteristic) are also presented. The ROC is a graph of the hit rate (Y-axis) against the false alarm rate (X-axis) for different decision thresholds.
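For reference, the listed contingency-table indices can be computed from hits, false alarms, misses and correct negatives at a single threshold; the sketch below uses the standard formulas with illustrative counts, not values from this verification.

```python
# Sketch: categorical verification scores for one precipitation threshold,
# computed from the 2 x 2 contingency table (hits a, false alarms b,
# misses c, correct negatives d). Counts below are illustrative.
def verification_scores(a: int, b: int, c: int, d: int) -> dict:
    n = a + b + c + d
    fbi = (a + b) / (a + c)                      # frequency bias
    pod = a / (a + c)                            # probability of detection
    far = b / (a + b)                            # false alarm ratio
    pofd = b / (b + d)                           # probability of false detection
    tss = pod - pofd                             # true skill statistic
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    a_random = (a + b) * (a + c) / n             # hits expected by chance
    ets = (a - a_random) / (a + b + c - a_random)
    return {"FBI": fbi, "POD": pod, "FAR": far, "TSS": tss, "HSS": hss, "ETS": ets}

print(verification_scores(a=120, b=40, c=30, d=810))
```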
BioAcoustica: a free and open repository and analysis platform for bioacoustics
Baker, Edward; Price, Ben W.; Rycroft, S. D.; Smith, Vincent S.
2015-01-01
We describe an online open repository and analysis platform, BioAcoustica (http://bio.acousti.ca), for recordings of wildlife sounds. Recordings can be annotated using a crowdsourced approach, allowing voice introductions and sections with extraneous noise to be removed from analyses. This system is based on the Scratchpads virtual research environment, the BioVeL portal and the Taverna workflow management tool, which allows for analysis of recordings using a grid computing service. At present the analyses include spectrograms, oscillograms and dominant frequency analysis. Further analyses can be integrated to meet the needs of specific researchers or projects. Researchers can upload and annotate their recordings to supplement traditional publication. Database URL: http://bio.acousti.ca PMID:26055102
Initial Thrust Measurements of Marshall's Ion-ioN Thruster
NASA Technical Reports Server (NTRS)
Caruso, Natalie R. S.; Scogin, Tyler; Liu, Thomas M.; Walker, Mitchell L. R.; Polzin, Kurt A.; Dankanich, John W.
2015-01-01
Electronegative ion thrusters are a variation of traditional gridded ion thruster technology differentiated by the production and acceleration of both positive and negative ions. Benefits of electronegative ion thrusters include the elimination of lifetime-limiting cathodes from the thruster architecture and the ability to generate appreciable thrust from both charge species. While much progress has been made in the development of electronegative ion thruster technology, direct thrust measurements are required to unambiguously demonstrate the efficacy of the concept and support continued development. In the present work, direct measurements of the thrust produced by the MINT (Marshall's Ion-ioN Thruster) are performed using an inverted-pendulum thrust stand in the High-Power Electric Propulsion Laboratory's Vacuum Test Facility-1 at the Georgia Institute of Technology, at operating pressures ranging from 4.8 × 10^-5 to 5.7 × 10^-5 torr. Thrust is recorded while operating with a propellant volumetric mixture ratio of 5:1 argon to nitrogen at total volumetric flow rates of 6, 12, and 24 sccm (0.17, 0.34, and 0.68 mg/s). Plasma is generated using a helical antenna at 13.56 MHz and radio frequency (RF) power levels of 150 and 350 W. The acceleration grid assembly is operated using both sinusoidal and square waveform biases of +/-350 V at frequencies of 4, 10, 25, 125, and 225 kHz. Thrust is recorded for two separate thruster configurations: with and without the magnetic filter. No thrust is discernible during thruster operation without the magnetic filter for any volumetric flow rate, RF forward power level, or acceleration grid biasing scheme. For the full thruster configuration, with the magnetic filter installed, a brief burst of thrust of approximately 3.75 mN (+/- 3 mN error) is observed at the start of grid operation for a volumetric flow rate of 24 sccm at 350 W RF power using a sinusoidal waveform grid bias at 125 kHz and +/- 350 V. Similar bursts in thrust are observed using a square waveform grid bias at 10 kHz and +/- 350 V for volumetric flow rates of 6, 10, and 12 sccm at 150, 350, and 350 W, respectively. The only operating condition that exhibits repeated thrust spikes throughout thruster operation is the 24 sccm condition with a 5:1 mixture ratio at 150 W RF power using the 10 kHz square waveform acceleration grid bias; thrust spikes for this condition measure 3 mN with an error of +/- 2.5 mN. None of the operating conditions tested shows continuous thrust production.
Łęski, Szymon; Pettersen, Klas H; Tunstall, Beth; Einevoll, Gaute T; Gigg, John; Wójcik, Daniel K
2011-12-01
The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multi-electrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets.
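A highly simplified sketch of the forward-modelling idea behind inverse CSD is given below: build a matrix mapping assumed sources at the electrode grid nodes to recorded potentials, then invert it with Tikhonov regularisation. This is an illustration only, with point-like sources and arbitrary parameter values; it is not the published iCSD 2D method or the accompanying MATLAB toolbox, which use spatially extended source basis functions.

```python
# Simplified sketch of forward-model-based CSD estimation on a 2-D electrode
# grid: assume point-like sources at the grid nodes, build the forward matrix
# mapping source amplitudes to potentials, and invert it with Tikhonov
# regularisation. Units and parameter values are arbitrary but consistent.
import numpy as np

def forward_matrix(xy: np.ndarray, sigma: float = 0.3, h: float = 0.5) -> np.ndarray:
    """Potential at each electrode from a unit source at each grid node.

    xy : (n, 2) electrode/node positions; sigma : conductivity;
    h : assumed out-of-plane source offset (avoids the 1/0 singularity).
    """
    diff = xy[:, None, :] - xy[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1) + h ** 2)
    return 1.0 / (4 * np.pi * sigma * dist)

def inverse_csd(potentials: np.ndarray, xy: np.ndarray, lam: float = 1e-3) -> np.ndarray:
    """Regularised least-squares estimate of source amplitudes."""
    F = forward_matrix(xy)
    return np.linalg.solve(F.T @ F + lam * np.eye(F.shape[0]), F.T @ potentials)

# 8 x 8 electrode grid, 0.2 unit spacing; a dipole-like pair of sources.
xs, ys = np.meshgrid(np.arange(8) * 0.2, np.arange(8) * 0.2)
xy = np.column_stack([xs.ravel(), ys.ravel()])
true_csd = np.zeros(64)
true_csd[27], true_csd[36] = 1.0, -1.0
phi = forward_matrix(xy) @ true_csd
print(np.round(inverse_csd(phi, xy), 2).reshape(8, 8))
```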
First Gridded Spatial Field Reconstructions of Snow from Tree Rings
NASA Astrophysics Data System (ADS)
Coulthard, B. L.; Anchukaitis, K. J.; Pederson, G. T.; Alder, J. R.; Hostetler, S. W.; Gray, S. T.
2017-12-01
Western North America's mountain snowpacks provide critical water resources for human populations and ecosystems. Warmer temperatures and changing precipitation patterns will increasingly alter the quantity, extent, and persistence of snow in coming decades. A comprehensive understanding of the causes and range of long-term variability in this system is required for forecasting future anomalies, but snowpack observations are limited and sparse. While individual tree ring-based annual snowpack reconstructions have been developed for specific regions and mountain ranges, we present here the first collection of spatially-explicit gridded field reconstructions of seasonal snowpack within the American Rocky Mountains. Capitalizing on a new western North American snow-sensitive network of over 700 tree-ring chronologies, as well as recent advances in PRISM-based snow modeling, our gridded reconstructions offer a full space-time characterization of snow and associated water resource fluctuations over several centuries. The quality of reconstructions is evaluated against existing observations, proxy-records, and an independently-developed first-order monthly snow model.
Distribution and Validation of CERES Irradiance Global Data Products Via Web Based Tools
NASA Technical Reports Server (NTRS)
Rutan, David; Mitrescu, Cristian; Doelling, David; Kato, Seiji
2016-01-01
The CERES SYN1deg product provides climate-quality, 3-hourly, globally gridded and temporally complete maps of top-of-atmosphere, in-atmosphere, and surface fluxes. This product requires efficient release to the public and validation to maintain quality assurance. The CERES team developed web tools for the distribution of both the global gridded products and the grid boxes containing long-term validation sites that maintain high-quality flux observations at the Earth's surface; these are found at http://ceres.larc.nasa.gov/order_data.php. In this poster we explore the various tools available to users to subset, download, and validate the SYN1deg and Surface-EBAF products against surface observations. We also analyze differences found in long-term records from well-maintained land surface sites, such as the ARM central facility, and from high-quality buoy radiometers, which due to their isolated nature cannot be maintained in the same manner as their land-based counterparts.
The Validity of Computer Audits of Simulated Cases Records.
ERIC Educational Resources Information Center
Rippey, Robert M.; And Others
This paper describes the implementation of a computer-based approach to scoring open-ended problem lists constructed to evaluate student and practitioner clinical judgment from real or simulated records. Based on 62 previously administered and scored problem lists, the program was written in BASIC for a Heathkit H11A computer (equivalent to DEC…
ERIC Educational Resources Information Center
Wiggins, J. D.; Weslander, Darrell
1977-01-01
Expressed vocational choices were more predictive of employment status four years after high school graduation for males than were scores on either the Vocational Preference Inventory or the Kuder Preference Record--Vocational. Predictions for males were more accurate than for females on all measures. (Author)
USDA-ARS?s Scientific Manuscript database
Genetic merits in first vs. later parity with correlations <1 were compared to official repeatability models using 88 million lactation records of 34 million cows for yield traits and fewer records for somatic cell score (SCS) and 2 cow fertility traits. Estimated genetic correlations of first with ...
Crowdsourcing: a valid alternative to expert evaluation of robotic surgery skills.
Polin, Michael R; Siddiqui, Nazema Y; Comstock, Bryan A; Hesham, Helai; Brown, Casey; Lendvay, Thomas S; Martino, Martin A
2016-11-01
Robotic-assisted gynecologic surgery is common, but requires unique training. A validated assessment tool for evaluating trainees' robotic surgery skills is Robotic-Objective Structured Assessments of Technical Skills. We sought to assess whether crowdsourcing can be used as an alternative to expert surgical evaluators in scoring Robotic-Objective Structured Assessments of Technical Skills. The Robotic Training Network produced the Robotic-Objective Structured Assessments of Technical Skills, which evaluate trainees across 5 dry lab robotic surgical drills. Robotic-Objective Structured Assessments of Technical Skills were previously validated in a study of 105 participants, where dry lab surgical drills were recorded, de-identified, and scored by 3 expert surgeons using the Robotic-Objective Structured Assessments of Technical Skills checklist. Our methods-comparison study uses these previously obtained recordings and expert surgeon scores. Mean scores per participant from each drill were separated into quartiles. Crowdworkers were trained and calibrated on Robotic-Objective Structured Assessments of Technical Skills scoring using a representative recording of a skilled and novice surgeon. Following this, 3 recordings from each scoring quartile for each drill were randomly selected. Crowdworkers evaluated the randomly selected recordings using Robotic-Objective Structured Assessments of Technical Skills. Linear mixed effects models were used to derive mean crowdsourced ratings for each drill. Pearson correlation coefficients were calculated to assess the correlation between crowdsourced and expert surgeons' ratings. In all, 448 crowdworkers reviewed videos from 60 dry lab drills, and completed a total of 2517 Robotic-Objective Structured Assessments of Technical Skills assessments within 16 hours. Crowdsourced Robotic-Objective Structured Assessments of Technical Skills ratings were highly correlated with expert surgeon ratings across each of the 5 dry lab drills (r ranging from 0.75-0.91). Crowdsourced assessments of recorded dry lab surgical drills using a validated assessment tool are a rapid and suitable alternative to expert surgeon evaluation. Copyright © 2016 Elsevier Inc. All rights reserved.
Cartoon distraction alleviates anxiety in children during induction of anesthesia.
Lee, Jeongwoo; Lee, Jihye; Lim, Hyungsun; Son, Ji-Seon; Lee, Jun-Rae; Kim, Dong-Chan; Ko, Seonghoon
2012-11-01
We performed this study to determine the beneficial effects of viewing an animated cartoon and playing with a favorite toy on preoperative anxiety in children aged 3 to 7 years in the operating room before anesthesia induction. One hundred thirty children aged 3 to 7 years with ASA physical status I or II were enrolled. Subjects were randomly assigned to 1 of 3 groups: group 1 (control), group 2 (toy), and group 3 (animated cartoon). The children in group 2 were asked to bring their favorite toy and were allowed to play with it until anesthesia induction. The children in group 3 watched their selected animated cartoon until anesthesia induction. Children's preoperative anxiety was determined by the modified Yale Preoperative Anxiety Scale (mYPAS) and parent-recorded anxiety Visual Analog Scale (VAS) the night before surgery, in the preanesthetic holding room, and just before anesthesia induction. In the preanesthetic holding room, the group 2 mYPAS and parent-recorded anxiety VAS scores were significantly lower than those of groups 1 and 3 (mYPAS: P = 0.007; parent-recorded anxiety VAS: P = 0.02). In the operating room, the children in group 3 had the lowest mYPAS and parent-recorded anxiety VAS scores among the 3 groups (mYPAS: P < 0.001; parent-recorded anxiety VAS: P < 0.001). In group 3, the mYPAS and parent-recorded anxiety VAS scores of only 3 and 5 children were increased in the operating room compared with their scores in the preanesthetic holding room, whereas the anxiety scores of 32 and 34 children in group 1 and 25 and 32 children in group 2 had increased (P < 0.001). The number of children whose scores indicated no anxiety (mYPAS score <30) in the operating room was 3 (7%), 9 (23%), and 18 (43%) in groups 1, 2, and 3, respectively (P < 0.001). Allowing the viewing of animated cartoons by pediatric surgical patients is a very effective method to alleviate preoperative anxiety. Our study suggests that this intervention is an inexpensive, easy to administer, and comprehensive method for anxiety reduction in the pediatric surgical population.
Effects of three hypnotics on the sleep-wakefulness cycle in sleep-disturbed rats.
Shinomiya, Kazuaki; Shigemoto, Yuki; Omichi, Junji; Utsu, Yoshiaki; Mio, Mitsunobu; Kamei, Chiaki
2004-04-01
A new sleep disturbance model in rats is useful for estimating the characteristics of some hypnotics. The present study was undertaken to investigate the utility of a sleep disturbance model, in which rats are placed on a grid suspended over water, using three kinds of hypnotics: a short-acting benzodiazepine (triazolam), an intermediate-acting benzodiazepine (flunitrazepam) and a long-acting barbiturate (phenobarbital). Electrodes for measurement of EEG and EMG were implanted into the frontal cortex and the dorsal neck muscle of rats. EEG and EMG were recorded with an electroencephalograph, and SleepSign ver.2.0 was used for EEG and EMG analysis. Total times of wakefulness, non-REM and REM sleep were measured from 0900 to 1500 hours. In rats placed on the grid suspended over water, with the water level up to 1 cm below the grid surface, not only triazolam but also flunitrazepam and phenobarbital shortened sleep latency. Both flunitrazepam and phenobarbital increased total non-REM sleep time in rats placed on sawdust or on the grid, and the effects of both drugs were larger in rats placed on the grid than in rats placed on sawdust. Measurement of hourly non-REM sleep time was useful for investigating the peak time and duration of effect of the three hypnotics. Phenobarbital decreased total REM sleep time in rats placed on the grid, whereas triazolam and flunitrazepam had no effect. The present insomnia model can be used as a sleep disturbance model for testing not only the sleep-inducing effects but also the sleep-maintaining effects of hypnotics, including effects on non-REM and REM sleep.
Refining area of occupancy to address the modifiable areal unit problem in ecology and conservation.
Moat, Justin; Bachman, Steven P; Field, Richard; Boyd, Doreen S
2018-05-23
The 'modifiable areal unit problem' is prevalent across many aspects of spatial analysis within ecology and conservation. The problem is particularly manifest when calculating metrics for extinction risk estimation, for example, area of occupancy (AOO). Although embedded into the International Union for the Conservation of Nature (IUCN) Red List criteria, AOO is often not used or is poorly applied. Here we evaluate new and existing methods for calculating AOO from occurrence records and present a method for determining the minimum AOO using a uniform grid. We evaluate the grid cell shape, grid origin and grid rotation with both real-world and simulated data, reviewing the effects on AOO values, and possible impacts for species already assessed on the IUCN Red List. We show that AOO can vary by up to 80% and a ratio of cells to points of 1:1.21 gives the maximum variation in the number of occupied cells. These findings potentially impact 3% of existing species on the IUCN Red List, as well as species not yet assessed. We show that a new method that combines both grid rotation and moving grid origin gives fast, robust and reproducible results and, in the majority of cases, achieves the minimum AOO. As well as reporting minimum AOO, we outline a confidence interval which should be incorporated into existing tools that support species risk assessment. We also make further recommendations for reporting AOO and other areal measurements within ecology, leading to more robust methods for future species risk assessment. This article is protected by copyright. All rights reserved. © 2018 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
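A minimal sketch of the minimum-AOO idea described here: count occupied grid cells over a range of grid origins and rotations of the occurrence records and keep the smallest count. The 2 km cell size is the common IUCN default; the numbers of origin and rotation steps below are arbitrary choices for illustration, and the occurrence coordinates are hypothetical.

```python
# Sketch: minimum area of occupancy (AOO) by counting occupied grid cells for
# many grid origins and rotations of the point set, keeping the smallest count.
import math

def occupied_cells(points, cell=2000.0, dx=0.0, dy=0.0):
    return len({(math.floor((x - dx) / cell), math.floor((y - dy) / cell))
                for x, y in points})

def rotate(points, angle):
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def min_aoo(points, cell=2000.0, origin_steps=10, rotation_steps=12):
    """Minimum AOO (km^2) over a set of grid origins and rotations."""
    best = math.inf
    for k in range(rotation_steps):
        pts = rotate(points, k * (math.pi / 2) / rotation_steps)
        for i in range(origin_steps):
            for j in range(origin_steps):
                n = occupied_cells(pts, cell, i * cell / origin_steps, j * cell / origin_steps)
                best = min(best, n)
    return best * (cell / 1000.0) ** 2

# Hypothetical occurrence records in projected metres.
records = [(1000, 1500), (2500, 1700), (5200, 4100), (5300, 4200)]
print("minimum AOO:", min_aoo(records), "km^2")
```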
Yoshida, Motoharu; Jochems, Arthur; Hasselmo, Michael E
2013-01-01
Mechanisms underlying grid cell firing in the medial entorhinal cortex (MEC) still remain unknown. Computational modeling studies have suggested that cellular properties such as spike frequency adaptation and persistent firing might underlie the grid cell firing. Recent in vivo studies also suggest that cholinergic activation influences grid cell firing. Here we investigated the anatomical distribution of firing frequency adaptation, the medium spike after hyperpolarization potential (mAHP), subthreshold membrane potential oscillations, sag potential, input resistance and persistent firing, in MEC layer II principal cells using in vitro whole-cell patch clamp recordings in rats. Anatomical distributions of these properties were compared along both the dorso-ventral and medio-lateral axes, both with and without the cholinergic receptor agonist carbachol. We found that spike frequency adaptation is significantly stronger in ventral than in dorsal neurons both with and without carbachol. Spike frequency adaptation was significantly correlated with the duration of the mAHP, which also showed a gradient along the dorso-ventral axis. In carbachol, we found that about 50% of MEC layer II neurons show persistent firing which lasted more than 30 seconds. Persistent firing of MEC layer II neurons might contribute to grid cell firing by providing the excitatory drive. Dorso-ventral differences in spike frequency adaptation we report here are opposite from previous predictions by a computational model. We discuss an alternative mechanism as to how dorso-ventral differences in spike frequency adaptation could contribute to different scales of grid spacing.
ERIC Educational Resources Information Center
Allan, George
1999-01-01
A student-centered learning model for a course on information systems project management consisted of individual study and group discussion with facilitator guidance. Data from session records, repertory grids, and a learning network diagram showed that interactive learning was more effective and students took responsibility, although some…
76 FR 60478 - Record of Decision, Texas Clean Energy Project
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-29
... the plant with one or both of the nearby power grids; process water supply pipelines; a natural gas... per year. The CO 2 will be delivered through a regional pipeline network to existing oil fields in the... proposed Fort Stockton Holdings water supply pipeline; Possible changes in discharges to Monahans Draw and...
Al Harrach, M; Afsharipour, B; Boudaoud, S; Carriou, V; Marin, F; Merletti, R
2016-08-01
The brachialis (BR) lies beneath the biceps brachii (BB), deep in the upper arm; therefore, detection of its surface electromyogram (sEMG) is a complex task. The BR is an important elbow flexor, but it is usually not considered in sEMG-based force estimation. The aim of this study was to attempt to separate the sEMG activities of the BR and the BB by using a high-density sEMG (HD-sEMG) grid placed on the upper arm together with canonical correlation analysis (CCA). For this purpose, we recorded sEMG signals from seven subjects with two 8 × 4 electrode grids placed over the BB and the BR. Four isometric voluntary contraction levels (5, 10, 30 and 50 %MVC) were recorded at a 90° elbow angle. Using CCA and image processing tools, the sources of each muscle's activity were separated. Finally, the corresponding sEMG signals were reconstructed using the remaining canonical components in order to retrieve the activity of the BB and BR muscles.
Multi-terabyte EIDE disk arrays running Linux RAID5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanders, D.A.; Cremaldi, L.M.; Eschenburg, V.
2004-11-01
High-energy physics experiments are currently recording large amounts of data and in a few years will be recording prodigious quantities of data. New methods must be developed to handle this data and make analysis at universities possible. Grid Computing is one method; however, the data must be cached at the various Grid nodes. We examine some storage techniques that exploit recent developments in commodity hardware. Disk arrays using RAID level 5 (RAID-5) include both parity and striping. The striping improves access speed. The parity protects data in the event of a single disk failure, but not in the case of multiple disk failures. We report on tests of dual-processor Linux Software RAID-5 arrays and Hardware RAID-5 arrays using a 12-disk 3ware controller, in conjunction with 250 and 300 GB disks, for use in offline high-energy physics data analysis. The price of IDE disks is now less than $1/GB. These RAID-5 disk arrays can be scaled to sizes affordable to small institutions and used when fast random access at low cost is important.
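The RAID-5 parity protection mentioned above rests on a simple XOR relationship; the sketch below illustrates the principle of rebuilding a lost block from the surviving blocks and is not the Linux md or 3ware implementation.

```python
# Sketch of the RAID-5 parity idea: the parity block of a stripe is the XOR of
# its data blocks, so any single missing block can be reconstructed by
# XOR-ing the survivors. Principle only; block sizes here are tiny.
from functools import reduce

def xor_blocks(blocks):
    return bytes(reduce(lambda a, b: a ^ b, byte_tuple) for byte_tuple in zip(*blocks))

# A stripe across three data disks plus one parity disk.
data = [b"high", b"ener", b"gy!!"]
parity = xor_blocks(data)

# Disk 1 fails; rebuild its block from the remaining data blocks and parity.
rebuilt = xor_blocks([data[0], data[2], parity])
assert rebuilt == data[1]
print("rebuilt block:", rebuilt)
```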
Soo, M; Sneddon, N W; Lopez-Villalobos, N; Worth, A J
2015-03-01
To use estimated breeding value (EBV) analysis to investigate the genetic trend of the total hip score (to assess canine hip dysplasia) in four populous breeds of dogs using the records from the New Zealand Veterinary Association (NZVA) Canine Hip Dysplasia Scheme database (1991 to 2011). Estimates of heritability and EBV for the NZVA total hip score of individual dogs from the German Shepherd, Labrador Retriever, Golden Retriever and Rottweiler breeds were obtained using restricted maximum likelihood procedures with a within-breed linear animal model. The model included the fixed effects of gender, birth year, birth season, age at scoring and the random effect of animal. The pedigree file included animals recorded between 1990 and 2011. A total of 2,983 NZVA hip score records, from a pedigree of 3,172 animals, were available for genetic evaluation. Genetic trends of the NZVA total hip score were calculated as the regression coefficient of the EBV (weighted by reliabilities) on year of birth. The estimates of heritability for hip score were 0.32 (SE 0.08) in German Shepherd, 0.37 (SE 0.08) in Labrador Retriever, 0.29 (SE 0.08) in Golden Retriever and 0.52 (SE 0.18) in Rottweiler breeds. Genetic trend analysis revealed that only the German Shepherd breed exhibited a genetic trend towards better hip conformation over time, with a decline of 0.13 (SE 0.04) NZVA total hip score units per year (p<0.001). The genetic trends of total hip score for the remaining three breeds were not significantly different from zero (p>0.1). Despite moderate heritability of the NZVA total hip score, there has not been substantial improvement of this trait for the four breeds analysed in the study period. Greater improvement in reducing the prevalence of canine hip dysplasia may be possible if screening were to be compulsory as a requirement for registration of pedigree breeding stock, greater selection pressure were to be applied and selection of breeding stock made on the basis on an individual's EBV rather than the NZVA total hip score alone.
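The genetic trend described here, the regression of EBV on birth year weighted by reliability, can be sketched directly; the years, EBVs and reliabilities below are illustrative and not from the NZVA dataset.

```python
# Sketch: weighted least-squares regression slope of EBV on birth year, with
# each animal's reliability used as its weight. Values are illustrative only.
import numpy as np

def weighted_trend(birth_year, ebv, reliability):
    """Weighted regression slope of EBV on birth year (EBV units per year)."""
    w = np.asarray(reliability, dtype=float)
    x = np.asarray(birth_year, dtype=float)
    y = np.asarray(ebv, dtype=float)
    xm = np.average(x, weights=w)
    ym = np.average(y, weights=w)
    return np.sum(w * (x - xm) * (y - ym)) / np.sum(w * (x - xm) ** 2)

years = [1995, 1998, 2001, 2004, 2007, 2010]
ebvs = [1.0, 0.6, 0.5, 0.1, -0.2, -0.6]        # lower total hip score = better hips
rels = [0.4, 0.5, 0.6, 0.6, 0.7, 0.8]
print(f"genetic trend: {weighted_trend(years, ebvs, rels):.3f} score units per year")
```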
Navier-Stokes Analysis of the Flowfield Characteristics of an Ice Contaminated Aircraft Wing
NASA Technical Reports Server (NTRS)
Chung, J.; Choo, Y.; Reehorst, A.; Potapczuk, M.; Slater, J.
1999-01-01
An analytical study was performed as part of the NASA Lewis support of a National Transportation Safety Board (NTSB) aircraft accident investigation. The study focused on the performance degradation associated with ice contamination on the wing of a commercial turboprop-powered aircraft. Based upon the results of an earlier numerical study conducted by the authors, a prominent ridged-ice formation on the subject aircraft wing was selected for detailed flow analysis using 2-dimensional (2-D) as well as 3-dimensional (3-D) Navier-Stokes computations. This configuration was selected because it caused the largest lift decrease and drag increase among all the ice shapes investigated in the earlier study. A grid sensitivity test was performed to find the influence of grid spacing on the lift, drag, and the angle of attack at maximum lift (Cl,max). This study showed that grid resolution is important and that a sensitivity analysis is an essential element of the process in order to assure that the final solution is independent of the grid. The 2-D results suggested that a severe stability and control difficulty could have occurred at a slightly higher angle of attack (AOA) than the one recorded by the Flight Data Recorder (FDR). This stability and control problem was thought to have resulted from a decreased differential lift on the wings with respect to the normal loading for the configuration. The analysis also indicated that this stability and control problem could have occurred whether or not natural ice shedding took place. Numerical results using an assumed 3-D ice shape showed that the angle at which this phenomenon occurred increased by about 4 degrees. As in the 2-D case, trailing-edge separation was observed, but it started only when the AOA was very close to the angle at which maximum lift occurred.
Paciorek, Christopher J; Goring, Simon J; Thurman, Andrew L; Cogbill, Charles V; Williams, John W; Mladenoff, David J; Peters, Jody A; Zhu, Jun; McLachlan, Jason S
2016-01-01
We present a gridded 8 km-resolution data product of the estimated composition of tree taxa at the time of Euro-American settlement of the northeastern United States and the statistical methodology used to produce the product from trees recorded by land surveyors. Composition is defined as the proportion of stems larger than approximately 20 cm diameter at breast height for 22 tree taxa, generally at the genus level. The data come from settlement-era public survey records that are transcribed and then aggregated spatially, giving count data. The domain is divided into two regions, eastern (Maine to Ohio) and midwestern (Indiana to Minnesota). Public Land Survey point data in the midwestern region (ca. 0.8-km resolution) are aggregated to a regular 8 km grid, while data in the eastern region, from Town Proprietor Surveys, are aggregated at the township level in irregularly-shaped local administrative units. The product is based on a Bayesian statistical model fit to the count data that estimates composition on the 8 km grid across the entire domain. The statistical model is designed to handle data from both the regular grid and the irregularly-shaped townships and allows us to estimate composition at locations with no data and to smooth over noise caused by limited counts in locations with data. Critically, the model also allows us to quantify uncertainty in our composition estimates, making the product suitable for applications employing data assimilation. We expect this data product to be useful for understanding the state of vegetation in the northeastern United States prior to large-scale Euro-American settlement. In addition to specific regional questions, the data product can also serve as a baseline against which to investigate how forests and ecosystems change after intensive settlement. The data product is being made available at the NIS data portal as version 1.0.
NASA Astrophysics Data System (ADS)
Peng, L.; Sheffield, J.; Verbist, K. M. J.
2016-12-01
Hydrological predictions at regional-to-global scales are often hampered by the lack of meteorological forcing data. The use of large-scale gridded meteorological data can overcome this limitation, but these data are subject to regional biases and unrealistic values at the local scale. This is especially challenging in regions such as Chile, where the climate exhibits high spatial heterogeneity as a result of the country's long latitudinal span and dramatic elevation changes. However, regional station-based observational datasets are not fully exploited and have the potential to constrain biases and spatial patterns. This study aims at adjusting precipitation and temperature estimates from the Princeton University global meteorological forcing (PGF) gridded dataset to improve hydrological simulations over Chile, by assimilating 982 gauges from the Dirección General de Aguas (DGA). To merge station data with the gridded dataset, we use a state-space estimation method to produce optimal gridded estimates, considering both the error of the station measurements and that of the gridded PGF product. The PGF daily precipitation and maximum and minimum temperature at 0.25° spatial resolution are adjusted for the period 1979-2010. Precipitation and temperature gauges with long and continuous records (>70% temporal coverage) are selected, while the remaining stations are used for validation. Leave-one-out cross-validation verifies the robustness of this data assimilation approach. The merged dataset is then used to force the Variable Infiltration Capacity (VIC) hydrological model over Chile at a daily time step, and the simulations are compared to streamflow observations. Our initial results show that the station-merged PGF precipitation effectively captures drizzle and the spatial pattern of storms. Overall, the merged dataset shows significant improvements over the original PGF, with reduced biases and stronger inter-annual variability. The invariant spatial pattern of errors between the station data and the gridded product opens up the possibility of merging real-time satellite and intermittent gauge observations to produce more accurate real-time hydrological predictions.
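As a minimal sketch of the error-weighted merging underlying such assimilation, simplified to a single grid cell with inverse-variance weights: the error variances below are made up, and the study's actual state-space scheme is more elaborate than this.

```python
# Minimal sketch: combine a gridded (PGF-style) estimate with a co-located
# station observation using inverse-variance weights for one grid cell.
# The error variances are illustrative only.
def merge(gridded_value, gridded_var, station_value, station_var):
    """Inverse-variance-weighted estimate and its variance."""
    w_g = 1.0 / gridded_var
    w_s = 1.0 / station_var
    merged = (w_g * gridded_value + w_s * station_value) / (w_g + w_s)
    return merged, 1.0 / (w_g + w_s)

# Daily precipitation (mm): the gauge is trusted more than the gridded product.
value, variance = merge(gridded_value=12.0, gridded_var=9.0,
                        station_value=7.5, station_var=1.0)
print(f"merged estimate: {value:.2f} mm (variance {variance:.2f})")
```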
Solving Navigational Uncertainty Using Grid Cells on Robots
Milford, Michael J.; Wiles, Janet; Wyeth, Gordon F.
2010-01-01
To successfully navigate their habitats, many mammals use a combination of two mechanisms, path integration and calibration using landmarks, which together enable them to estimate their location and orientation, or pose. In large natural environments, both these mechanisms are characterized by uncertainty: the path integration process is subject to the accumulation of error, while landmark calibration is limited by perceptual ambiguity. It remains unclear how animals form coherent spatial representations in the presence of such uncertainty. Navigation research using robots has determined that uncertainty can be effectively addressed by maintaining multiple probabilistic estimates of a robot's pose. Here we show how conjunctive grid cells in dorsocaudal medial entorhinal cortex (dMEC) may maintain multiple estimates of pose using a brain-based robot navigation system known as RatSLAM. Based both on rodent spatially-responsive cells and functional engineering principles, the cells at the core of the RatSLAM computational model have similar characteristics to rodent grid cells, which we demonstrate by replicating the seminal Moser experiments. We apply the RatSLAM model to a new experimental paradigm designed to examine the responses of a robot or animal in the presence of perceptual ambiguity. Our computational approach enables us to observe short-term population coding of multiple location hypotheses, a phenomenon which would not be easily observable in rodent recordings. We present behavioral and neural evidence demonstrating that the conjunctive grid cells maintain and propagate multiple estimates of pose, enabling the correct pose estimate to be resolved over time even without uniquely identifying cues. While recent research has focused on the grid-like firing characteristics, accuracy and representational capacity of grid cells, our results identify a possible critical and unique role for conjunctive grid cells in filtering sensory uncertainty. We anticipate our study to be a starting point for animal experiments that test navigation in perceptually ambiguous environments. PMID:21085643
ERIC Educational Resources Information Center
Powers, Donald E.; Kaufman, James C.
2004-01-01
The objective of the study reported here was to explore the relationship of Graduate Record Examinations (GRE) General Test scores to selected personality traits--conscientiousness, rationality, ingenuity, quickness, creativity, and depth. A sample of 342 GRE test takers completed short personality inventory scales for each trait. Analyses…
Treatment Outcome and Follow-Up Evaluation Based on Client Case Records in a Mental Health Center.
ERIC Educational Resources Information Center
Simons, Lynn S.; And Others
1978-01-01
Evaluated the application of Goal Attainment Scaling (GAS) to client case records as a measure of treatment effectiveness and examined its correspondence to other measures of outcome. Findings were that GAS scores converged significantly with therapist ratings of global improvement and GAS scores obtained from client reports at follow-up.…
ERIC Educational Resources Information Center
Burton, Nancy W.; Ramist, Leonard
2001-01-01
Studies predicting success in college for students graduating since 1980 are reviewed. SAT scores and high school records are the most common predictors, but a few studies of other predictors are included. The review establishes that SAT scores and high school records predict academic performance, nonacademic accomplishments, leadership in…
Velez, Vicente J; Kaw, Roop; Hu, Bo; Frankel, Richard M; Windover, Amy K; Bokar, Dan; Rish, Julie M; Rothberg, Michael B
2017-06-01
Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) scores measure patient satisfaction with hospital care. It is not known if these reflect the communication skills of the attending physician on record. The Four Habits Coding Scheme (4HCS) is a validated instrument that measures bedside physician communication skills according to 4 habits, namely: investing in the beginning, eliciting the patient's perspective, demonstrating empathy, and investing in the end. To investigate whether the 4HCS correlates with provider HCAHPS scores. Using a cross-sectional design, consenting hospitalist physicians (n = 28), were observed on inpatient rounds during 3 separate encounters. We compared hospitalists' 4HCS scores with their doctor communication HCAHPS scores to assess the degree to which these correlated with inpatient physician communication skills. We performed sensitivity analysis excluding scores returned by patients cared for by more than 1 hospitalist. A total of 1003 HCAHPS survey responses were available. Pearson correlation between 4HCS and doctor communication scores was not significant, at 0.098 (-0.285, 0.455; P = 0.619). Also, no significant correlations were found between each habit and HCAHPS. When including only scores attributable to 1 hospitalist, Pearson correlation between the empathy habit and the HCAHPS respect score was 0.515 (0.176, 0.745; P = 0.005). Between empathy and overall doctor communication, it was 0.442 (0.082, 0.7; P = 0.019). Attending-of-record HCAHPS scores do not correlate with 4HCS. After excluding patients cared for by more than 1 hospitalist, demonstrating empathy did correlate with the doctor communication and respect HCAHPS scores. Journal of Hospital Medicine 2017;12:421-427. © 2017 Society of Hospital Medicine
The Metadata Coverage Index (MCI): A standardized metric for quantifying database metadata richness.
Liolios, Konstantinos; Schriml, Lynn; Hirschman, Lynette; Pagani, Ioanna; Nosrat, Bahador; Sterk, Peter; White, Owen; Rocca-Serra, Philippe; Sansone, Susanna-Assunta; Taylor, Chris; Kyrpides, Nikos C; Field, Dawn
2012-07-30
Variability in the extent of the descriptions of data ('metadata') held in public repositories forces users to assess the quality of records individually, which rapidly becomes impractical. The scoring of records on the richness of their description provides a simple, objective proxy measure for quality that enables filtering to support downstream analysis. Pivotally, such scoring should also spur improvements in the descriptions themselves. Here, we introduce such a measure - the 'Metadata Coverage Index' (MCI): the percentage of available fields actually filled in a record or description. MCI scores can be calculated across a database, for individual records or for their component parts (e.g., fields of interest). There are many potential uses for this simple metric: for example, to filter, rank or search for records; to assess the metadata availability of an ad hoc collection; to determine the frequency with which fields in a particular record type are filled, especially with respect to standards compliance; to assess the utility of specific tools and resources, and of data capture practice more generally; to prioritize records for further curation; to serve as performance metrics of funded projects; or to quantify the value added by curation. Here we demonstrate the utility of MCI scores using metadata from the Genomes Online Database (GOLD), including records compliant with the 'Minimum Information about a Genome Sequence' (MIGS) standard developed by the Genomic Standards Consortium. We discuss challenges and address further applications of MCI scores: to show improvements in annotation quality over time, to inform the work of standards bodies and repository providers on the usability and popularity of their products, and to assess and credit the work of curators. Such an index provides a step towards putting metadata capture practices, and in the future standards compliance, into a quantitative and objective framework.
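As an illustration of the metric described above, a minimal sketch of an MCI-style calculation follows; the field list and example records are invented placeholders, not the actual GOLD schema.

```python
# Minimal sketch of a Metadata Coverage Index (MCI) style calculation:
# MCI = percentage of available metadata fields actually filled in a record.
# The field list and example records are illustrative, not the GOLD schema.

AVAILABLE_FIELDS = ["organism", "habitat", "latitude", "longitude",
                    "collection_date", "sequencing_method"]

def mci(record, fields=AVAILABLE_FIELDS):
    """Percentage of `fields` that are non-empty in `record`."""
    filled = sum(1 for f in fields if record.get(f) not in (None, "", "NA"))
    return 100.0 * filled / len(fields)

def database_mci(records, fields=AVAILABLE_FIELDS):
    """Average MCI over a collection of records."""
    return sum(mci(r, fields) for r in records) / len(records)

if __name__ == "__main__":
    records = [
        {"organism": "E. coli", "habitat": "soil", "latitude": 55.7},
        {"organism": "B. subtilis", "habitat": "", "collection_date": "2011-06-01"},
    ]
    for r in records:
        print(f"record MCI = {mci(r):.1f}%")
    print(f"database MCI = {database_mci(records):.1f}%")
```

The same function applied to a single field of interest across all records gives the per-field fill frequency mentioned in the abstract.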
The Influence of Training and Experience on Rater Performance in Scoring Spoken Language
ERIC Educational Resources Information Center
Davis, Larry
2016-01-01
Two factors were investigated that are thought to contribute to consistency in rater scoring judgments: rater training and experience in scoring. Also considered were the relative effects of scoring rubrics and exemplars on rater performance. Experienced teachers of English (N = 20) scored recorded responses from the TOEFL iBT speaking test prior…
A Unified Air-Sea Visualization System: Survey on Gridding Structures
NASA Technical Reports Server (NTRS)
Anand, Harsh; Moorhead, Robert
1995-01-01
The goal is to develop a Unified Air-Sea Visualization System (UASVS) to enable the rapid fusion of observational, archival, and model data for verification and analysis. To design and develop UASVS, modelers were polled to determine the gridding structures and visualization systems used, and their needs with respect to visual analysis. A basic UASVS requirement is to allow a modeler to explore multiple data sets within a single environment, or to interpolate multiple datasets onto one unified grid. From this survey, the UASVS should be able to visualize 3D scalar/vector fields; render isosurfaces; visualize arbitrary slices of the 3D data; visualize data defined on spectral element grids with the minimum number of interpolation stages; render contours; produce 3D vector plots and streamlines; provide unified visualization of satellite images, observations and model output overlays; display the visualization on a projection of the users choice; implement functions so the user can derive diagnostic values; animate the data to see the time-evolution; animate ocean and atmosphere at different rates; store the record of cursor movement, smooth the path, and animate a window around the moving path; repeatedly start and stop the visual time-stepping; generate VHS tape animations; work on a variety of workstations; and allow visualization across clusters of workstations and scalable high performance computer systems.
RainyDay: An Online, Open-Source Tool for Physically-based Rainfall and Flood Frequency Analysis
NASA Astrophysics Data System (ADS)
Wright, D.; Yu, G.; Holman, K. D.
2017-12-01
Flood frequency analysis in ungaged or changing watersheds typically requires rainfall intensity-duration-frequency (IDF) curves combined with hydrologic models. IDF curves only depict point-scale rainfall depth, while true rainstorms exhibit complex spatial and temporal structures. Floods result from these rainfall structures interacting with watershed features such as land cover, soils, and variable antecedent conditions as well as river channel processes. Thus, IDF curves are traditionally combined with a variety of "design storm" assumptions such as area reduction factors and idealized rainfall space-time distributions to translate rainfall depths into inputs that are suitable for flood hydrologic modeling. The impacts of such assumptions are relatively poorly understood. Meanwhile, modern precipitation estimates from gridded weather radar, grid-interpolated rain gages, satellites, and numerical weather models provide more realistic depictions of rainfall space-time structure. Usage of such datasets for rainfall and flood frequency analysis, however, are hindered by relatively short record lengths. We present RainyDay, an open-source stochastic storm transposition (SST) framework for generating large numbers of realistic rainfall "scenarios." SST "lengthens" the rainfall record by temporal resampling and geospatial transposition of observed storms to extract space-time information from regional gridded rainfall data. Relatively short (10-15 year) records of bias-corrected radar rainfall data are sufficient to estimate rainfall and flood events with much longer recurrence intervals including 100-year and 500-year events. We describe the SST methodology as implemented in RainyDay and compare rainfall IDF results from RainyDay to conventional estimates from NOAA Atlas 14. Then, we demonstrate some of the flood frequency analysis properties that are possible when RainyDay is integrated with a distributed hydrologic model, including robust estimation of flood hazards in a changing watershed. The U.S. Bureau of Reclamation is supporting the development of a web-based variant of RainyDay, a "beta" version of which is available at http://her.cee.wisc.edu/projects/rainyday/.
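To make the stochastic storm transposition idea above concrete, the following is a heavily simplified sketch (not RainyDay's actual code): a catalog of observed storm grids is resampled and randomly shifted over a domain, watershed-average depths give synthetic annual maxima, and empirical return periods follow from their ranks. All inputs are synthetic placeholders.

```python
# Simplified sketch of stochastic storm transposition (SST) for rainfall
# frequency analysis. A catalog of observed storms (2-D rainfall grids) is
# resampled and randomly shifted over the domain; watershed-average depths
# give synthetic "annual maxima" from which return periods are estimated.
# All inputs here are synthetic placeholders, not RainyDay's internals.
import numpy as np

rng = np.random.default_rng(0)
ny, nx = 50, 50                               # transposition domain (grid cells)
catalog = [rng.gamma(2.0, 10.0, size=(ny, nx)) for _ in range(30)]  # fake storms
watershed = np.zeros((ny, nx), dtype=bool)
watershed[20:30, 20:30] = True                # fake watershed mask

def transpose(storm, dy, dx):
    """Shift a storm field by (dy, dx) cells (wraps at the edges, for simplicity)."""
    return np.roll(np.roll(storm, dy, axis=0), dx, axis=1)

def synthetic_annual_max(n_storms_per_year=10):
    depths = []
    for _ in range(rng.poisson(n_storms_per_year)):
        storm = catalog[rng.integers(len(catalog))]
        shifted = transpose(storm, rng.integers(-15, 16), rng.integers(-15, 16))
        depths.append(shifted[watershed].mean())
    return max(depths) if depths else 0.0

n_years = 1000
annual_max = np.sort([synthetic_annual_max() for _ in range(n_years)])[::-1]
# Empirical return period of the k-th largest annual maximum: T = (n + 1) / rank
for rank in (10, 100):
    print(f"~{(n_years + 1) / rank:.0f}-yr event: {annual_max[rank - 1]:.1f} (input units)")
```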
Jones, P D. [University of East Anglia, Norwich, United Kingdom; Wigley, T. M. L. [University of East Anglia, Norwich, United Kingdom; Briffa, K. R. [University of East Anglia, Norwich, United Kingdom
2012-01-01
Gridded monthly mean pressure data, combining real measurements and reconstructions, have been compiled for Europe for 1780 through 1980 and for North America for 1858 through 1980. The reconstructions use early pressure, temperature, and precipitation data from a variety of sources including World Weather Records, meteorological and national archives, circulation maps, and daily chart series. Each record contains the year, monthly mean pressure, quality code, and annual mean pressure. These reconstructed gridded monthly pressures provide a reliable historical record of mean sea-level pressures for Europe and North America. The data are in two files: pressure reconstructions for Europe (1.47 MB) and for North America (0.72 MB).
Qualities of dental chart recording and coding.
Chantravekin, Yosananda; Tasananutree, Munchulika; Santaphongse, Supitcha; Aittiwarapoj, Anchisa
2013-01-01
Chart recording and coding are important processes in healthcare informatics, but only a few reports exist in the dentistry field. The objectives of this study were to assess the quality of dental chart recording and coding, as well as the effectiveness of a lecture/workshop on this topic. The study was performed by auditing patients' charts at the TU Dental Student Clinic from July 2011 to August 2012. The mean chart recording scores ranged from 51.0-55.7%, whereas errors in the coding process occurred more often in the coder's part than in the doctor's part. The lecture/workshop improved the scores only in some topics.
NASA Astrophysics Data System (ADS)
Mohammed Anzar, Sharafudeen Thaha; Sathidevi, Puthumangalathu Savithri
2014-12-01
In this paper, we have considered the utility of multi-normalization and ancillary measures for the optimal score level fusion of fingerprint and voice biometrics. An efficient matching score preprocessing technique based on multi-normalization is employed for improving the performance of the multimodal system under various noise conditions. Ancillary measures derived from the feature space and the score space are used in addition to the matching score vectors for weighing the modalities, based on their relative degradation. Reliability (dispersion) and separability (inter-/intra-class distance and d-prime statistics) measures under various noise conditions are estimated from the individual modalities during the training/validation stage. The 'best integration weights' are then computed by algebraically combining these measures using the weighted sum rule. The computed integration weights are then optimized against the recognition accuracy using techniques such as grid search, genetic algorithm and particle swarm optimization. The experimental results show that the proposed biometric solution leads to considerable improvement in the recognition performance even under low signal-to-noise ratio (SNR) conditions and reduces the false acceptance rate (FAR) and false rejection rate (FRR), making the system useful for security as well as forensic applications.
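A minimal sketch of weighted-sum score-level fusion with a grid search over the integration weight, in the spirit of the abstract above; the score distributions, normalization choice, and accuracy criterion are illustrative assumptions, not the authors' exact procedure.

```python
# Minimal sketch of weighted-sum score-level fusion with a grid search over
# the integration weight. Score arrays and the accuracy criterion are
# illustrative placeholders, not the actual fingerprint/voice matchers.
import numpy as np

def min_max_norm(s):
    """Map raw matching scores to [0, 1] (one of several possible normalizations)."""
    return (s - s.min()) / (s.max() - s.min())

def accuracy(fused, labels, threshold=0.5):
    return np.mean((fused >= threshold) == labels)

rng = np.random.default_rng(1)
labels = rng.integers(0, 2, size=200).astype(bool)       # genuine vs impostor
finger = rng.normal(labels * 1.0, 0.6)                    # fake fingerprint scores
voice = rng.normal(labels * 1.0, 0.9)                     # fake (noisier) voice scores
f_n, v_n = min_max_norm(finger), min_max_norm(voice)

# Grid search the integration weight w in [0, 1]; fused = w*finger + (1-w)*voice.
best_w, best_acc = max(
    ((w, accuracy(w * f_n + (1 - w) * v_n, labels)) for w in np.linspace(0, 1, 101)),
    key=lambda t: t[1],
)
print(f"best weight = {best_w:.2f}, validation accuracy = {best_acc:.3f}")
```

In the study, such weights are additionally informed by the reliability and separability measures and refined with genetic or particle swarm optimization; the grid search above is only the simplest of the listed options.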
Orzol, Leonard L.
1997-01-01
MODTOOLS uses the particle data calculated by MODPATH to construct several types of GIS output. MODTOOLS uses particle information recorded by MODPATH such as the row, column, or layer of the model grid, to generate a set of characteristics associated with each particle. The user can choose from the set of characteristics associated with each particle and use the capabilities of the GIS to selectively trace the movement of water discharging from specific cells in the model grid. MODTOOLS allows the hydrogeologist to utilize the capabilities of the GIS to graphically combine the results of the particle-tracking analysis, which facilitates the analysis and understanding of complex ground-water flow systems.
NASA Astrophysics Data System (ADS)
Brooker, LM; Balme, MR; Conway; Hagermann, A.; Collins, GS
2015-10-01
Lyot crater, a 215 km diameter, Hesperian-aged martian impact crater, contains many landforms that appear to have formed by glacial, periglacial and fluvial processes [1-3]. Around Lyot are large channels potentially formed by groundwater release during the impact event [1,3]. Hence, the landscape of Lyot crater appears to record the action of both ancient water sourced from underground, and more recent water sourced from the atmosphere. We have used a grid mapping approach [5] to describe the distribution of these landforms and landscapes in and around Lyot crater. These data are presented here and potential avenues of future work discussed.
Hassan, Elham A; Hassan, Marwa H; Torad, Faisal A
2018-05-18
The aim of the study was to correlate the clinical severity of pectus excavatum with its type and degree based on objective radiographic evaluation. Twelve brachycephalic dogs were included. Grading of the clinical severity was done based on a 6-point grading score. Thoracic radiographs were used to calculate the frontosagittal and vertebral indices at the tenth thoracic vertebra and the vertebra overlying the excavatum. Correlation between the clinical severity score and frontosagittal and vertebral indices was evaluated using Pearson's correlation coefficient. Typical pectus excavatum was recorded in the caudal sternum in seven dogs, with a mean clinical severity score of 1.7 ± 1.4, whereas in five dogs, atypical mid-sternal deviation was recorded with a mean clinical severity score of 3.8 ± 0.7. A strong correlation (r=0.7) was recorded between the clinical severity score and vertebral index in the atypical form, whereas a weak correlation (r=0.02) was recorded in the typical form (P<0.05). The clinical severity and degree of pectus excavatum was poorly correlated (r=0.3) in the typical form of pectus excavatum, whereas it was strongly correlated (r=0.9) in the atypical form. Pectus excavatum in dogs is associated with compressive cardiopulmonary dysfunction, which depends mainly on the site/type of deviation rather than the degree of deviation.
NASA Astrophysics Data System (ADS)
Rahmani, Elham; Friederichs, Petra; Keller, Jan; Hense, Andreas
2016-05-01
The main purpose of this study is to develop an easy-to-use weather generator (WG) for the downscaling of gridded data to point measurements at regional scale. The WG is applied to daily averaged temperatures and annual growing degree days (GDD) of wheat. This particular choice of variables is motivated by future investigations on temperature impacts as the most important climate variable for wheat cultivation under irrigation in Iran. The proposed statistical downscaling relates large-scale ERA-40 reanalysis to local daily temperature and annual GDD. Long-term local observations in Iran are used at 16 synoptic stations from 1961 to 2001, which is the common period with ERA-40 data. We perform downscaling using two approaches: the first is a linear regression model that uses the ERA-40 fingerprints (FP) defined by the squared correlation with local variability, and the second employs a linear multiple regression (MR) analysis to relate the large-scale information at the neighboring grid points to the station data. Extending the usual downscaling, we implement a WG providing uncertainty information and realizations of the local temperatures and GDD by adding a Gaussian random noise. ERA-40 reanalysis well represents the local daily temperature as well as the annual GDD variability. For 2-m temperature, the FPs are more localized during the warm compared with the cold season. While MR is slightly superior for daily temperature time series, FP seems to perform best for annual GDD. We further assess the quality of the WGs applying probabilistic verification scores like the continuous ranked probability score (CRPS) and the respective skill score. They clearly demonstrate the superiority of WGs compared with a deterministic downscaling.
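As an illustration of the probabilistic verification mentioned above, the following sketch evaluates the closed-form CRPS of a Gaussian predictive distribution and the corresponding skill score against a climatological reference; the numbers are made up and the Gaussian assumption is ours, not necessarily the authors'.

```python
# Closed-form CRPS for a Gaussian predictive distribution N(mu, sigma^2):
#   CRPS = sigma * ( z*(2*Phi(z) - 1) + 2*phi(z) - 1/sqrt(pi) ),  z = (y - mu)/sigma
# The skill score compares against a reference forecast (e.g., climatology).
# Example values below are made up for illustration.
import numpy as np
from scipy.stats import norm

def crps_gaussian(mu, sigma, y):
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

obs = np.array([14.2, 15.1, 13.8, 16.0])                   # observed daily temperatures
mu_wg, sd_wg = np.array([14.0, 15.5, 13.5, 15.8]), 0.8     # WG predictive mean / spread
mu_clim, sd_clim = obs.mean(), obs.std(ddof=1)             # climatological reference

crps_fc = crps_gaussian(mu_wg, sd_wg, obs).mean()
crps_ref = crps_gaussian(mu_clim, sd_clim, obs).mean()
print(f"CRPS = {crps_fc:.3f}, CRPSS = {1 - crps_fc / crps_ref:.3f}")
```

A CRPSS above zero indicates that the probabilistic weather generator beats the deterministic or climatological reference, which is the comparison the abstract reports.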
Álvarez-Camacho, M; Martínez-Michel, L; Gonella, S; Scrimger, R A; Chu, K P; Wismer, W V
2016-06-01
Dietary advice for post treatment head and neck cancer (HNC) patients emphasizes food characteristics of nutritional value and texture, and not patients' characterization of food. The aim of this study was to determine patients' characterization of food. Repertory grid interviews were conducted with 19 orally-fed HNC patients between 4 and 10 months post-treatment to characterize foods commonly eaten, avoided and eaten sometimes. Patients compared and rated 12 foods using their own descriptors. Data were analyzed by General Procrustes Analysis (GPA). Socio-demographic status, taste and smell alterations, appetite and food intake data were also collected. Patient physical symptom burden was defined by University of Washington-Quality of Life Physical Function domain scores and used to stratify patients with "less physical symptom burden" (n = 11, score ≥ 61.7) or "greater physical symptom burden" (n = 8, score < 61.7). All patients used descriptors of taste, ease of eating, convenience, texture, potential to worsen symptoms and liking to characterize foods. Overall, avoided foods were characterized as having dry texture, while foods commonly eaten were characterized by their ease of eating and low potential to worsen symptoms. Descriptors of nutrition and smell were significant only for patients with greater physical symptom burden. Physical symptom burden influenced the characterization of foods among post-treatment HNC patients. Nutrition counseling must consider patients' physical symptom burden and the subsequent characterization of food that drive food selection or avoidance to facilitate dietary advice for adequate, appropriate and enjoyable food intake. Copyright © 2016 Elsevier Ltd. All rights reserved.
Miranda, Dinis Reis; Nap, Raoul; de Rijk, Angelique; Schaufeli, Wilmar; Iapichino, Gaetano
2003-02-01
The instruments used for measuring nursing workload in the intensive care unit (e.g., Therapeutic Intervention Scoring System-28) are based on therapeutic interventions related to severity of illness. Many nursing activities are not necessarily related to severity of illness, and cost-effectiveness studies require the accurate evaluation of nursing activities. The aim of the study was to determine the nursing activities that best describe workload in the intensive care unit and to attribute weights to these activities so that the score describes average time consumption instead of severity of illness. To define by consensus a list of nursing activities, to determine the average time consumption of these activities by use of a 1-wk observational cross-sectional study, and to compare these results with those of the Therapeutic Intervention Scoring System-28. A total of 99 intensive care units in 15 countries. Consecutive admissions to the intensive care units. Daily recording of nursing activities at a patient level and random multimoment recording of these activities. A total of five new items and 14 subitems describing nursing activities in the intensive care unit (e.g., monitoring, care of relatives, administrative tasks) were added to the list of therapeutic interventions in Therapeutic Intervention Scoring System-28. Data from 2,041 patients (6,451 nursing days and 127,951 multimoment recordings) were analyzed. The new activities accounted for 60% of the average nursing time; the new scoring system (Nursing Activities Score) explained 81% of the nursing time (vs. 43% in Therapeutic Intervention Scoring System-28). The weights in the Therapeutic Intervention Scoring System-28 are not derived from the use of nursing time. Our study suggests that the Nursing Activities Score measures the consumption of nursing time in the intensive care unit. These results should be validated in independent databases.
Chopra, Radhika; Marwaha, Mohita; Bansal, Kalpana; Mittal, Meenu
2016-01-01
Failure of inferior alveolar nerve block in achieving profound anesthesia of the pulp due to various reasons has led to the introduction of more potent local anesthetic agents like articaine. This study was conducted to compare the efficacy of buccal infiltration with articaine in achieving pulpal anesthesia of primary molars as compared to inferior alveolar nerve block with lignocaine. 30 patients (4-8 years) with indication of pulp therapy in at least two mandibular primary molars were selected. Patients were randomly assigned to receive nerve block with lignocaine or infiltration with articaine on first appointment and the other solution on second appointment. All the pulpotomies and pulpectomies were performed by a pediatric dentist. Two researchers standing at a distance of 1.5 m recorded the Pain Scores and Sound, Eye, Motor (SEM) scores. After the completion of procedure, the patient was asked to record the Facial Image score and Heft-Parker Visual Analogue Score (HP-VAS). Pain Score recorded at the time of injection showed significantly more movements with block as compared to infiltration (p<0.001). SEM scores at time of pulp extirpation were also higher for block than infiltration (p<0.001). Articaine infiltration has the potential to replace inferior alveolar nerve block for primary mandibular molars.
A scoring system for ascertainment of incident stroke; the Risk Index Score (RISc).
Kass-Hout, T A; Moyé, L A; Smith, M A; Morgenstern, L B
2006-01-01
The main objective of this study was to develop and validate a computer-based statistical algorithm that could be translated into a simple scoring system in order to ascertain incident stroke cases using hospital admission medical records data. The Risk Index Score (RISc) algorithm was developed using data collected prospectively by the Brain Attack Surveillance in Corpus Christi (BASIC) project, 2000. The validity of RISc was evaluated by estimating the concordance of scoring system stroke ascertainment to stroke ascertainment by physician and/or abstractor review of hospital admission records. RISc was developed on 1718 randomly selected patients (training set) and then statistically validated on an independent sample of 858 patients (validation set). A multivariable logistic model was used to develop RISc and subsequently evaluated by goodness-of-fit and receiver operating characteristic (ROC) analyses. The higher the value of RISc, the higher the patient's risk of potential stroke. The study showed RISc was well calibrated and discriminated those who had potential stroke from those that did not on initial screening. In this study we developed and validated a rapid, easy, efficient, and accurate method to ascertain incident stroke cases from routine hospital admission records for epidemiologic investigations. Validation of this scoring system was achieved statistically; however, clinical validation in a community hospital setting is warranted.
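A generic sketch of how a multivariable logistic model can be turned into a simple integer point score and checked with ROC analysis, in the spirit of RISc; the predictors and data below are fabricated placeholders rather than the BASIC project variables.

```python
# Generic sketch of deriving a simple point score from a multivariable logistic
# model and evaluating it with ROC analysis, as a scoring system like RISc
# might be built. Predictors and data below are fabricated placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 1718                                     # size similar to the training set above
X = rng.integers(0, 2, size=(n, 4))          # binary chart findings (placeholders)
true_beta = np.array([1.5, 1.0, 0.7, 0.3])
p = 1 / (1 + np.exp(-(-2.0 + X @ true_beta)))
y = rng.random(n) < p                        # "stroke" outcome

model = LogisticRegression().fit(X, y)
# Convert coefficients to integer points (smallest coefficient = 1 point).
points = np.rint(model.coef_[0] / np.abs(model.coef_[0]).min()).astype(int)
risc_like = X @ points                       # each patient's index score
print("points per predictor:", points)
print(f"ROC AUC of the point score: {roc_auc_score(y, risc_like):.3f}")
```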
NASA Astrophysics Data System (ADS)
Oriani, F.; Stisen, S.; Demirel, C.
2017-12-01
The spatial representation of rainfall is of primary importance to correctly study the uncertainty of basin recharge and its propagation to the surface and underground circulation. We consider here the daily grid rainfall product provided by the Danish Meteorological Institute as input to the National Water Resources Model of Denmark. Due to a drastic reduction in the rain gauge network (from approximately 500 stations in the period 1996-2006, to 250 in the period 2007-2014), the grid rainfall product, based on the interpolation of these data, is much less reliable. The research is focused on the Skjern catchment (1,050 km², western Jutland), where the complete rain-gauge database from the Danish Hydrological Observatory is available and the distributed hydrological response can be computed at the 1-km scale. To give a better estimation of the gridded rainfall input, we start from ground measurements by simulating the missing data with a stochastic data-mining approach, and then recompute the grid interpolation. To maximize the predictive power of the technique, combinations of station time-series that are the most informative to each other are selected on the basis of their correlation and available historical data. Then, the missing data inside these time-series are simulated together using the direct sampling technique (DS) [1, 2]. DS simulates a datum by sampling the historical record of the same stations where a similar data pattern occurs, preserving their complex statistical relation. The simulated data are reinjected into the whole dataset and also used as conditioning data to progressively fill the gaps at other stations. The results show that the proposed methodology, tested on the period 1995-2012, can increase the realism of the grid rainfall product by regenerating the missing ground measurements. The hydrological response is analyzed considering the observations at 5 hydrological stations. The presented methodology can be used in many regions to regenerate missing data using the information contained in the historical record and to propagate the uncertainty of the prediction to the hydrological response. [1] G. Mariethoz et al. (2010), Water Resour. Res., 10.1029/2008WR007621. [2] F. Oriani et al. (2014), Hydrol. Earth Syst. Sc., 10.5194/hessd-11-3213-2014.
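A very reduced, univariate sketch of the direct sampling idea described above: to fill a gap, scan the historical record for a similar pattern of neighbouring values and copy the value found there. The real DS technique is multivariate and considerably more elaborate; all data here are synthetic.

```python
# Very reduced, univariate sketch of direct-sampling-style gap filling:
# to simulate a missing value, scan the historical record for a location
# where the pattern of neighbouring (known) values is similar, and copy the
# value found there. The real DS approach is multivariate and more elaborate.
import numpy as np

rng = np.random.default_rng(3)

def ds_fill(series, train, n_neighbors=3, threshold=0.5, max_scan=500):
    """Fill NaNs in `series` by pattern sampling from the complete `train` record."""
    filled = series.copy()
    for t in np.where(np.isnan(filled))[0]:
        # Pattern: the n nearest informed values before the gap.
        left = filled[:t][~np.isnan(filled[:t])][-n_neighbors:]
        if left.size < n_neighbors:
            continue                                   # not enough context yet
        best_val, best_dist = np.nan, np.inf
        for s in rng.integers(n_neighbors, len(train) - 1, size=max_scan):
            cand = train[s - n_neighbors:s]
            dist = np.mean(np.abs(cand - left))
            if dist < best_dist:
                best_val, best_dist = train[s], dist
            if dist <= threshold:                      # accept the first good match
                break
        filled[t] = best_val
    return filled

train = rng.gamma(0.4, 5.0, size=3000)                 # synthetic daily rainfall record
gauge = train[:400].copy()
gauge[rng.integers(0, 400, size=40)] = np.nan          # knock out ~10% of the data
print("NaNs remaining after filling:", np.isnan(ds_fill(gauge, train)).sum())
```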
NASA Astrophysics Data System (ADS)
Takagi, R.; Obara, K.; Uchida, N.
2017-12-01
Understanding slow earthquake activity improves our knowledge of slip behavior in the brittle-ductile transition zone and of the subduction process, including megathrust earthquakes. To understand the overall picture of slow slip activity, it is important to build a comprehensive catalog of slow slip events (SSEs). Although short-term SSEs have been detected systematically from GNSS and tiltmeter records, analysis of long-term slow slip events still relies on individual slip inversions. We develop an algorithm to systematically detect long-term SSEs and estimate their source parameters using GNSS data. The algorithm is similar to GRiD-MT (Tsuruoka et al., 2009), a grid-based automatic determination of moment tensor solutions. Instead of fitting moment tensors to long-period seismic records, we estimate the parameters of a single rectangular fault to fit GNSS displacement time series. First, we construct a two-dimensional grid covering the possible locations of SSEs. Second, we estimate the best-fit parameters (length, width, slip, and rake) of the rectangular fault at each grid point by an iterative damped least squares method. Depth, strike, and dip are fixed on the plate boundary. A ramp function with a duration of 300 days is used to express the time evolution of the fault slip. Third, the grid point maximizing the variance reduction is selected as a candidate long-term SSE. We also search for the onset of the ramp function using a grid search. We applied the method to GNSS data in southwest Japan to detect long-term SSEs in the Nankai subduction zone. With the current selection criteria, we found 13 events with Mw 6.2-6.9 in Hyuga-nada, the Bungo channel, and central Shikoku from 1998 to 2015, which include previously unreported events. A key finding is the along-strike migration of long-term SSEs from Hyuga-nada to the Bungo channel and from the Bungo channel to central Shikoku. In particular, three successive events migrating northward in Hyuga-nada preceded the 2003 Bungo channel SSE, and one event in central Shikoku followed the 2003 SSE in the Bungo channel. The space-time dimensions of the possible along-strike migration are about 300 km in length and 6 years in time. Systematic detection assuming various durations for the time evolution of SSEs may further improve the picture of SSE activity and of possible interactions with neighboring SSEs.
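A schematic sketch of the grid-search and variance-reduction step described above; the Green's functions, data, and single-amplitude ramp model are placeholders, whereas the actual method iteratively fits fault length, width, slip, and rake.

```python
# Schematic sketch of the grid-search detection idea: at each candidate grid
# point (and onset time), fit the amplitude of a 300-day ramp to GNSS
# displacement time series by least squares and keep the grid point that
# maximizes variance reduction. Green's functions and data are fake
# placeholders; the real method fits length, width, slip, and rake iteratively.
import numpy as np

rng = np.random.default_rng(4)
n_days, n_sta, n_grid = 2000, 15, 25
t = np.arange(n_days)

def ramp(t, onset, duration=300):
    """Unit ramp: 0 before onset, linear rise over `duration` days, then 1."""
    return np.clip((t - onset) / duration, 0.0, 1.0)

# Fake Green's functions: displacement at each station per unit slip at each node.
G = rng.normal(0, 1, size=(n_grid, n_sta))
# Fake data: an SSE at grid node 7 starting on day 900, plus noise.
d = 5.0 * np.outer(ramp(t, 900), G[7]) + rng.normal(0, 1.0, size=(n_days, n_sta))

best = (-np.inf, None, None)
for k in range(n_grid):
    for onset in range(0, n_days - 300, 50):          # coarse grid search over onset
        r = ramp(t, onset)
        # Least-squares slip amplitude for the model d ~ a * outer(r, G[k]).
        a = np.sum(d * np.outer(r, G[k])) / (np.sum(r ** 2) * np.sum(G[k] ** 2))
        resid = d - a * np.outer(r, G[k])
        vr = 1.0 - np.sum(resid ** 2) / np.sum(d ** 2)   # variance reduction
        if vr > best[0]:
            best = (vr, k, onset)
print(f"best grid node = {best[1]}, onset day = {best[2]}, VR = {best[0]:.2f}")
```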
Manganelli, P; Salaffi, F; Nervetti, A; Chierici, P; Ferraccioli, G F; Ambanelli, U
1987-01-01
Fifty-five patients (30 with rheumatoid arthritis (RA) and 25 with osteoarthritis (OA)), with knee synovial effusion and popliteal cysts visualized through arthrograms, were studied. A relationship was sought between radiological findings and the area of the cysts, measured with a millimeter grid. Ten radiological parameters were graded and summed to obtain a "total knee score". A "total geode score" was also obtained by scoring the geodes separately. In addition, two specific indexes were used for comparison: the erosive index, modified after Berens and Lin, in RA, and Kellgren's index in OA. In RA a statistically significant, inverse correlation was found between the x-ray scores and the area of the cysts, while such a relationship was not observed in OA. However, only a third of the cysts accounted for the inverse relationship in RA. Furthermore, two control groups of RA and OA patients revealed a striking association between the degree of radiological damage and the frequency of popliteal cysts. Therefore, the hypothesis that popliteal cysts might have a protective effect against articular-bone damage in RA can be held in only a few cases.
A Novel Approach for Lie Detection Based on F-Score and Extreme Learning Machine
Gao, Junfeng; Wang, Zhao; Yang, Yong; Zhang, Wenjia; Tao, Chunyi; Guan, Jinan; Rao, Nini
2013-01-01
A new machine learning method referred to as F-score_ELM was proposed to classify the lying and truth-telling using the electroencephalogram (EEG) signals from 28 guilty and innocent subjects. Thirty-one features were extracted from the probe responses from these subjects. Then, a recently-developed classifier called extreme learning machine (ELM) was combined with F-score, a simple but effective feature selection method, to jointly optimize the number of the hidden nodes of ELM and the feature subset by a grid-searching training procedure. The method was compared to two classification models combining principal component analysis with back-propagation network and support vector machine classifiers. We thoroughly assessed the performance of these classification models including the training and testing time, sensitivity and specificity from the training and testing sets, as well as network size. The experimental results showed that the number of the hidden nodes can be effectively optimized by the proposed method. Also, F-score_ELM obtained the best classification accuracy and required the shortest training and testing time. PMID:23755136
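The sketch below shows the Fisher-style F-score commonly used for feature selection (e.g., the Chen and Lin formulation); whether this exact variant was used in the study is an assumption, and the data are random placeholders.

```python
# Sketch of the F-score often used for feature selection: for each feature,
# the ratio of between-class scatter to within-class scatter. Whether this
# exact variant was used in the study above is an assumption; the data here
# are random placeholders, not the actual EEG probe-response features.
import numpy as np

def f_score(X, y):
    """F-score of each column of X for a binary label vector y (0/1)."""
    Xp, Xn = X[y == 1], X[y == 0]
    num = (Xp.mean(0) - X.mean(0)) ** 2 + (Xn.mean(0) - X.mean(0)) ** 2
    den = Xp.var(0, ddof=1) + Xn.var(0, ddof=1)
    return num / den

rng = np.random.default_rng(5)
y = rng.integers(0, 2, size=100)
X = rng.normal(0, 1, size=(100, 31))      # 31 features, as in the study
X[:, 0] += 2.0 * y                        # make feature 0 informative
scores = f_score(X, y)
ranking = np.argsort(scores)[::-1]
print("top-5 features by F-score:", ranking[:5])
```

In the paper, the ranked features and the number of ELM hidden nodes are then optimized jointly by a grid search over both quantities.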
The ShakeOut earthquake scenario: Verification of three simulation sets
Bielak, J.; Graves, R.W.; Olsen, K.B.; Taborda, R.; Ramirez-Guzman, L.; Day, S.M.; Ely, G.P.; Roten, D.; Jordan, T.H.; Maechling, P.J.; Urbanic, J.; Cui, Y.; Juve, G.
2010-01-01
This paper presents a verification of three simulations of the ShakeOut scenario, an Mw 7.8 earthquake on a portion of the San Andreas fault in southern California, conducted by three different groups at the Southern California Earthquake Center using the SCEC Community Velocity Model for this region. We conducted two simulations using the finite difference method, and one by the finite element method, and performed qualitative and quantitative comparisons between the corresponding results. The results are in good agreement with each other; only small differences occur both in amplitude and phase between the various synthetics at ten observation points located both near the fault and away from it, as far as 150 km from the fault. Using an available goodness-of-fit criterion, all the comparisons scored above 8, with most above 9.2. This score would be regarded as excellent if the measurements were between recorded and synthetic seismograms. We also report results of comparisons based on time-frequency misfit criteria. Results from these two criteria can be used for calibrating the two methods for comparing seismograms. In those cases in which noticeable discrepancies occurred between the seismograms generated by the three groups, we found that they were the product of inherent characteristics of the various numerical methods used and their implementations. In particular, we found that the major source of discrepancy lies in the difference between mesh and grid representations of the same material model. Overall, however, even the largest differences in the synthetic seismograms are small. Thus, given the complexity of the simulations used in this verification, it appears that the three schemes are consistent, reliable and sufficiently accurate and robust for use in future large-scale simulations. © 2009 The Authors. Journal compilation © 2009 RAS.
Ambiguity assessment of small-angle scattering curves from monodisperse systems.
Petoukhov, Maxim V; Svergun, Dmitri I
2015-05-01
A novel approach is presented for an a priori assessment of the ambiguity associated with spherically averaged single-particle scattering. The approach is of broad interest to the structural biology community, allowing the rapid and model-independent assessment of the inherent non-uniqueness of three-dimensional shape reconstruction from scattering experiments on solutions of biological macromolecules. One-dimensional scattering curves recorded from monodisperse systems are nowadays routinely utilized to generate low-resolution particle shapes, but the potential ambiguity of such reconstructions remains a major issue. At present, the (non)uniqueness can only be assessed by a posteriori comparison and averaging of repetitive Monte Carlo-based shape-determination runs. The new a priori ambiguity measure is based on the number of distinct shape categories compatible with a given data set. For this purpose, a comprehensive library of over 14,000 shape topologies has been generated containing up to seven beads closely packed on a hexagonal grid. The computed scattering curves rescaled to keep only the shape topology rather than the overall size information provide a `scattering map' of this set of shapes. For a given scattering data set, one rapidly obtains the number of neighbours in the map and the associated shape topologies such that in addition to providing a quantitative ambiguity measure the algorithm may also serve as an alternative shape-analysis tool. The approach has been validated in model calculations on geometrical bodies and its usefulness is further demonstrated on a number of experimental X-ray scattering data sets from proteins in solution. A quantitative ambiguity score (a-score) is introduced to provide immediate and convenient guidance to the user on the uniqueness of the ab initio shape reconstruction from the given data set.
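A minimal sketch of the neighbour-counting idea behind such an ambiguity assessment; the library curves are random placeholders rather than precomputed bead models, and reporting log10 of the neighbour count as the a-score is an assumption, since the abstract does not give the exact formula.

```python
# Minimal sketch of the neighbour-counting idea: normalize the experimental
# curve, count library curves that agree within a tolerance, and report an
# ambiguity score. Using log10 of the neighbour count is an assumption; the
# library curves below are random placeholders, not precomputed bead models.
import numpy as np

rng = np.random.default_rng(6)
n_points, n_shapes = 50, 14000
library = np.abs(rng.normal(1.0, 0.2, size=(n_shapes, n_points)))   # fake scattering map

def normalize(curve):
    """Rescale a curve so comparisons reflect shape topology, not overall size."""
    return curve / curve[0]

def ambiguity_score(data_curve, library, tol=0.05):
    d = normalize(data_curve)
    lib = library / library[:, :1]
    chi = np.sqrt(np.mean((lib - d) ** 2, axis=1))      # simple discrepancy measure
    n_neighbors = int(np.sum(chi < tol))
    return n_neighbors, np.log10(max(n_neighbors, 1))

data_curve = library[42] * 3.0          # a library shape at a different overall scale
n, a = ambiguity_score(data_curve, library)
print(f"compatible shapes: {n}, assumed a-score: {a:.2f}")
```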
Efficacy of postoperative pain management in head and neck cancer patients.
Hinther, Ashley; Nakoneshny, Steven C; Chandarana, Shamir P; Wayne Matthews, T; Dort, Joseph C
2018-05-02
Our study quantifies the effectiveness of perioperative pain control in a cohort of patients undergoing major head and neck surgery with free flap reconstruction. Our long-term goal is to improve pain control and thereby increase mobility, decrease postoperative complications and decrease hospital stay. A retrospective analysis was performed at a tertiary, academic head and neck surgical oncology program in Calgary, Alberta, Canada from January 1, 2015 - December 31, 2015. Pain scores were recorded prospectively. Primary outcomes were frequency of postoperative pain assessments and pain intensity using the numeric rating scale. The cohort included 41 patients. Analysis was limited to pain scores recorded from postoperative days 1-14. There was an average of 7.3 pain measurements per day (SD 4.6, range 1-24) with the most frequent monitoring on postoperative days 1-4. Median pain scores ranged from 0 to 4.5 with the highest median score on postoperative day 6. The daily maximum pain scores recorded ranged from 8 to 10 with scores of 10 recorded on postoperative days 1, 2, 3, 5, 7, 8, and 10. Patients most frequently had inadequate pain control on postoperative days 1, 2, 4, and 5 with the majority occurring on postoperative day 1. Postoperative pain control could be improved at our centre. The frequency of pain assessments is also highly variable. Ongoing measurement, audit, and feedback of analgesic protocol effectiveness is an excellent first step in improving perioperative pain management in patients undergoing major head and neck cancer surgery with free flap reconstruction.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
... subject to all easements, restrictions and rights-of-way of record. The bearings shown herein are based on the Grid Bearing of North 79° 49' 22'' West for the centerline of Dublin-Granville Road, as.... Mayfield, Jr., Manager, Detroit Airports District Office, FAA, Great Lakes Region. [FR Doc. 2011-33469...
Regional Data Assimilation of AIRS Profiles and Radiances at the SPoRT Center
NASA Technical Reports Server (NTRS)
Zavodsky, Brad; Chou, Shih-hung; Jedlovec, Gary
2009-01-01
This slide presentation reviews the Short Term Prediction Research and Transition (SPoRT) Center's mission to improve short-term weather prediction at the regional and local scale. It includes information on the cold bias in the Weather Research and Forecasting (WRF) model, troposphere recordings from the Atmospheric Infrared Sounder (AIRS), and the vertical resolution of the analysis grid.
Isaia, Federica; Gyurko, Robert; Roomian, Tamar C; Hawley, Charles E
2018-04-06
The Root Coverage Esthetic Score (RES) was published in 2009 as an esthetic scoring system to measure visible final outcomes of root coverage procedures performed on Miller I and II recession defects. The aim of this study was to evaluate the intra-examiner, intra-group, and inter-examiner reliability of the RES when used by periodontal faculty, post-graduate students in periodontology, and pre-doctoral DMD students at Tufts University School of Dental Medicine (TUSDM). Thirty-three participants (12 second-year DMD students, 11 periodontal residents, and 10 faculty members) were assembled to evaluate 25 baseline and 6-month post-treatment outcomes of mucogingival surgeries using the RES. Each projection was shown for 30 seconds during which the participants were asked to use the RES scoring system to evaluate the surgical outcomes. The results were then recorded on a standardized worksheet grid. To test intra-examiner reliability, 7 of the 25 projections were shown twice. Intra-examiner reliability and inter-examiner reliability were assessed using the intraclass correlation coefficient (ICC) with a two-way mixed effects model, and stratified by education level. PG residents had the highest tendency to agree with each other, with an ICC of 0.53 (95% CI: 0.36 - 0.74). DMD students had an ICC of 0.51 (95% CI: 0.33 - 0.75), and PG faculty members produced an ICC of 0.41 (95% CI: 0.24 - 0.64). There was no statistically significant difference in ICC among the three groups of participants (Kruskal-Wallis test, P = 0.2440). When the data for each RES element were then combined, the mean ICC for the total interrater agreement for RES was 0.48 (95% CI: 0.32-0.71). This corresponds to an overall moderate agreement among all participants using the RES to evaluate the 25 surgical outcomes. The intra-examiner reliability within each of the three groups was quite high. The highest mean ICC was produced by the PG faculty (0.908). The mean ICC for PG residents was 0.867, and the mean ICC for DMD students was 0.855. The Kruskal-Wallis test (p = 0.46) failed to find any statistical difference in intra-examiner reliability between the three groups of participants. CONCLUSIONS: The RES is a "moderately" reliable scoring system for mucogingival treatments in a dental school setting and can be used even by operators with different levels of periodontal experience. This scoring system can be repeated by the same examiner with reliable results. This article is protected by copyright. All rights reserved. © 2018 American Academy of Periodontology.
Tsai, David; John, Esha; Chari, Tarun; Yuste, Rafael; Shepard, Kenneth
2015-01-01
We present a system for large-scale electrophysiological recording and stimulation of neural tissue with a planar topology. The recording system has 65,536 electrodes arranged in a 256 × 256 grid, with 25.5 μm pitch, covering an area of approximately 42.6 mm². The recording chain has 8.66 μV rms input-referred noise over a 100 Hz to 10 kHz bandwidth while providing up to 66 dB of voltage gain. When recording from all electrodes in the array, it is capable of 10-kHz sampling per electrode. All electrodes can also perform patterned electrical microstimulation. The system produces ~1 GB/s of data when recording from the full array. To handle, store, and perform nearly real-time analyses of this large data stream, we developed a framework based around Xilinx FPGAs, Intel x86 CPUs and NVIDIA Streaming Multiprocessors to interface with the electrode array.
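A quick back-of-the-envelope check of the quoted data rate; the sample width is an assumption, since the abstract does not state it.

```python
# Quick check of the quoted ~1 GB/s figure: 65,536 electrodes sampled at 10 kHz.
# The bytes-per-sample value is an assumption (the abstract does not state it).
electrodes = 256 * 256          # 65,536 electrodes
rate_hz = 10_000                # 10-kHz sampling per electrode
bytes_per_sample = 2            # assumed 16-bit samples
throughput = electrodes * rate_hz * bytes_per_sample
print(f"{throughput / 1e9:.2f} GB/s")   # ~1.3 GB/s, consistent with "~1 GB/s"
```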
[Results of training in the electronic health records in a tertiary care hospital].
Alva Espinosa, Carlos; Fuentes Domínguez, Marco Antonio; Garibay Huarte, Tania
2014-12-01
To assess the user evaluation of the electronic health records system together with its training program and to investigate the relation between the number of training sessions and the corresponding evaluation scores given by the participants. An anonymous survey was conducted among the medical, nursing and social work personnel. The survey included seven multiple-choice questions with a numerical scale from 1 to 10 and an additional open question. IBM SPSS Statistics v18 software was used to perform analysis of variance (ANOVA). In total, 340 workers participated in this study; 317 were included in the statistical analysis, out of which 76% had one or two training sessions, 13.9% received three or more sessions and 10% had no training. The mean global training evaluation by the participants was 5.9 ± 2.3, median 6.3, while the electronic records system evaluation was 5.2 ± 2.3, median 5.5. For both the training and the electronic records system, higher evaluation scores were obtained with an increasing number of training sessions (p < 0.001). For the electronic records system, personnel with no training evaluated the system with a mean score of 3.9 ± 2.7, while those who received three or more training sessions evaluated the system with a mean score of 6.1 ± 1.8 (p < 0.001).
Shay, Christopher F.; Ferrante, Michele; Chapman, G. William; Hasselmo, Michael E.
2015-01-01
Rebound spiking properties of medial entorhinal cortex (mEC) stellate cells induced by inhibition may underlie their functional properties in awake behaving rats, including the temporal phase separation of distinct grid cells and differences in grid cell firing properties. We investigated rebound spiking properties using whole cell patch recording in entorhinal slices, holding cells near spiking threshold and delivering sinusoidal inputs, superimposed with realistic inhibitory synaptic inputs to test the capacity of cells to selectively respond to specific phases of inhibitory input. Stellate cells showed a specific phase range of hyperpolarizing inputs that elicited spiking, but non-stellate cells did not show phase specificity. In both cell types, the phase range of spiking output occurred between the peak and subsequent descending zero crossing of the sinusoid. The phases of inhibitory inputs that induced spikes shifted earlier as the baseline sinusoid frequency increased, while spiking output shifted to later phases. Increases in magnitude of the inhibitory inputs shifted the spiking output to earlier phases. Pharmacological blockade of h-current abolished the phase selectivity of hyperpolarizing inputs eliciting spikes. A network computational model using cells possessing similar rebound properties as found in vitro produces spatially periodic firing properties resembling grid cell firing when a simulated animal moves along a linear track. These results suggest that the ability of mEC stellate cells to fire rebound spikes in response to a specific range of phases of inhibition could support complex attractor dynamics that provide completion and separation to maintain spiking activity of specific grid cell populations. PMID:26385258
Record keeping in Norwegian general practice.
Lönberg, N C; Bentsen, B G
1984-11-01
Routines of medical record keeping were studied in a random sample of 50 out of 228 general practitioners in two counties, Møre og Romsdal and Sør-Trøndelag. One doctor refused to participate and one had retired. The 48 physicians were interviewed and a questionnaire was completed with details about their record keeping. The standard of the records was assessed according to legibility, quality of notes, past history and tidiness using a score system. All general practitioners had records for every patient, but the quality of the records varied considerably. More than 50 per cent used handwriting in progress notes, which varied from diagnostic labels to extended reports. Few records contained accessible background information about the patient concerned, and many records contained large amounts of old and irrelevant papers. The record scores varied from 3 to a maximum of 10, with an average of 6.7. Higher standards of recording in general practice are called for, since the quality of records affects not only the individual patient but, in the end, the quality of medical care in general.
Childhood IQ and In-Service Mortality in Scottish Army Personnel during World War II
ERIC Educational Resources Information Center
Corley, Janie; Crang, Jeremy A.; Deary, Ian J.
2009-01-01
The Scottish Mental Survey of 1932 (SMS1932) provides a record of intelligence test scores for almost a complete year-of-birth group of children born in 1921. By linking UK Army personnel records, the Scottish National War Memorial data, and the SMS1932 dataset it was possible to examine the effect of childhood intelligence scores on wartime…
Naro, Daniel; Rummel, Christian; Schindler, Kaspar; Andrzejak, Ralph G
2014-09-01
The rank-based nonlinear predictability score was recently introduced as a test for determinism in point processes. We here adapt this measure to time series sampled from time-continuous flows. We use noisy Lorenz signals to compare this approach against a classical amplitude-based nonlinear prediction error. Both measures show an almost identical robustness against Gaussian white noise. In contrast, when the amplitude distribution of the noise has a narrower central peak and heavier tails than the normal distribution, the rank-based nonlinear predictability score outperforms the amplitude-based nonlinear prediction error. For this type of noise, the nonlinear predictability score has a higher sensitivity for deterministic structure in noisy signals. It also yields a higher statistical power in a surrogate test of the null hypothesis of linear stochastic correlated signals. We show the high relevance of this improved performance in an application to electroencephalographic (EEG) recordings from epilepsy patients. Here the nonlinear predictability score again appears of higher sensitivity to nonrandomness. Importantly, it yields an improved contrast between signals recorded from brain areas where the first ictal EEG signal changes were detected (focal EEG signals) versus signals recorded from brain areas that were not involved at seizure onset (nonfocal EEG signals).
Elastic extension of a local analysis facility on external clouds for the LHC experiments
NASA Astrophysics Data System (ADS)
Ciaschini, V.; Codispoti, G.; Rinaldi, L.; Aiftimiei, D. C.; Bonacorsi, D.; Calligola, P.; Dal Pra, S.; De Girolamo, D.; Di Maria, R.; Grandi, C.; Michelotto, D.; Panella, M.; Taneja, S.; Semeria, F.
2017-10-01
The computing infrastructures serving the LHC experiments have been designed to cope at most with the average amount of data recorded. The usage peaks, as already observed in Run-I, may however originate large backlogs, thus delaying the completion of the data reconstruction and ultimately the data availability for physics analysis. In order to cope with the production peaks, the LHC experiments are exploring the opportunity to access Cloud resources provided by external partners or commercial providers. In this work we present the proof of concept of the elastic extension of a local analysis facility, specifically the Bologna Tier-3 Grid site, for the LHC experiments hosted at the site, on an external OpenStack infrastructure. We focus on the Cloud Bursting of the Grid site using DynFarm, a newly designed tool that allows the dynamic registration of new worker nodes to LSF. In this approach, the dynamically added worker nodes instantiated on an OpenStack infrastructure are transparently accessed by the LHC Grid tools and at the same time they serve as an extension of the farm for the local usage.
Jiang, Yuyuan; Bebee, Brian; Mendoza, Alvaro; Robinson, Alice K; Zhang, Xiaying; Rosso, Diego
2018-01-01
Mixing is the driver of the energy footprint of water resource recovery in lagoons. With the availability of solar-powered equipment, one potential measure to decrease the environmental impacts of treatment is to transition to off-the-grid treatment. We studied the comparative scenarios of an existing grid-powered mixer and a solar-powered mixer. Testing was conducted to monitor water quality and to verify that effluent concentrations remained equivalent between the two scenarios. Meanwhile, energy consumption was recorded with the wastewater treatment utility's electrical energy monitor, and carbon emission changes were calculated using the emission intensity of the power utility. The results show that after the replacement, both energy usage and energy costs were significantly reduced, with energy usage decreasing by 70% and its cost by 47%. Additionally, carbon-equivalent emissions from electricity importation dropped by 64%, reducing their share of the overall carbon emissions (i.e., including all other contributions from the process) from 3.8% to 1.5%. Copyright © 2017 Elsevier Ltd. All rights reserved.
Nimbus-7 Stratospheric and Mesospheric Sounder (SAMS) experiment data user's guide
NASA Technical Reports Server (NTRS)
Taylor, F. W.; Rodgers, C. D.; Nutter, S. T.; Oslik, N.
1989-01-01
The Stratospheric and Mesospheric Sounder (SAMS) aboard Nimbus-7 observes infrared radiation from the atmospheric limb. Global upper atmosphere temperature profiles and vertical concentrations of H2O, NO, N2O, CH4 and CO2 are derived from these measurements. The status of all channels was carefully monitored. Temperature and composition were retrieved from the measurements by linearizing the direct equation about an a priori profile and using an optimum statistical estimator to find the most likely solution. The derived temperature and composition profiles are archived on two tape products whose file structure and record formats are described in detail. The gridded retrieved temperature tape (GRID-T) contains daily day and night average temperatures at 62 pressure levels in a 2.5 degree latitude by 10 degree longitude grid extending from 67.5 degrees N to 50 degrees S. The zonal mean methane and nitrous oxide composition tape (ZMT-G) contains zonal mean day and night average CH4 and N2O mixing ratios at 31 pressure levels for 2.5 degrees latitude zones extending from 67.5 degrees N to 50 degrees S.
Rabelo-Silva, Eneida Rejane; Dantas Cavalcanti, Ana Carla; Ramos Goulart Caldas, Maria Cristina; Lucena, Amália de Fátima; Almeida, Miriam de Abreu; Linch, Graciele Fernanda da Costa; da Silva, Marcos Barragan; Müller-Staub, Maria
2017-02-01
To assess the quality of the advanced nursing process in nursing documentation in two hospitals. Various standardised terminologies are employed by nurses worldwide, whether for teaching, research or patient care. These systems can improve the quality of nursing records, enable care continuity, consistency in written communication and enhance safety for patients and providers alike. Cross-sectional study. A total of 138 records from two facilities (69 records from each facility) were analysed, one using the NANDA-International and Nursing Interventions Classification terminology (Centre 1) and one the International Classification for Nursing Practice (Centre 2), by means of the Quality of Diagnoses, Interventions, and Outcomes instrument. Quality of Diagnoses, Interventions, and Outcomes scores range from 0-58 points. Nursing records were dated 2012-2013 for Centre 1 and 2010-2011 for Centre 2. Centre 1 had a Quality of Diagnoses, Interventions, and Outcomes score of 35·46 (±6·45), whereas Centre 2 had a Quality of Diagnoses, Interventions, and Outcomes score of 31·72 (±4·62) (p < 0·001). Centre 2 had higher scores in the 'Nursing Diagnoses as Process' dimension, whereas in the 'Nursing Diagnoses as Product', 'Nursing Interventions' and 'Nursing Outcomes' dimensions, Centre 1 exhibited superior performance; acceptable reliability values were obtained for both centres, except for the 'Nursing Interventions' domain in Centre 1 and the 'Nursing Diagnoses as Process' and 'Nursing Diagnoses as Product' domains in Centre 2. The quality of nursing documentation was superior at Centre 1, although both facilities demonstrated moderate scores considering the maximum potential score of 58 points. Reliability analyses showed satisfactory results for both standardised terminologies. Nursing leaders should use a validated instrument to investigate the quality of nursing records after implementation of standardised terminologies. © 2016 John Wiley & Sons Ltd.
Sex Differences in Reported Pain Across 11,000 Patients Captured in Electronic Medical Records
Ruau, David; Liu, Linda Y.; Clark, J. David; Angst, Martin S.; Butte, Atul J.
2011-01-01
Clinically recorded pain scores are abundant in patient health records but are rarely used in research. The use of this information could help improve clinical outcomes. For example, a recent report by the Institute of Medicine stated that ineffective use of clinical information contributes to under-treatment of patient subpopulations — especially women. This study used diagnosis-associated pain scores from a large hospital database to document sex differences in reported pain. We used de-identified electronic medical records from Stanford Hospital and Clinics for more than 72,000 patients. Each record contained at least one disease-associated pain score. We found over 160,000 pain scores in more than 250 primary diagnoses, and analyzed differences in disease-specific pain reported by men and women. After filtering for diagnoses with minimum encounter numbers, we found diagnosis-specific sex differences in reported pain. The most significant differences occurred in patients with disorders of the musculoskeletal, circulatory, respiratory and digestive systems, followed by infectious diseases, and injury and poisoning. We also discovered sex-specific differences in pain intensity in previously unreported diseases, including disorders of the cervical region, and acute sinusitis (p = 0.01, 0.017, respectively). Pain scores were collected during hospital encounters. No information about the use of pre-encounter over-the-counter medications was available. To our knowledge, this is the largest data-driven study documenting sex differences of disease-associated pain. It highlights the utility of EMR data to corroborate and expand on results of smaller clinical studies. Our findings emphasize the need for future research examining the mechanisms underlying differences in pain. PMID:22245360
Investigating the Written Exam Scores' Prediction Power of TEOG Exam Scores
ERIC Educational Resources Information Center
Kontas, Hakki; Özpolat, Esen Turan
2017-01-01
The purpose of this study was to investigate exam scores' predicting Transition from Primary to Secondary Education (TEOG) exam scores. The research data were obtained from the records of 1035 students studying at the first term of eighth grade in 2015-2016 academic year in e-school system. The research was on relational screening model. Linear…
Eslamian, Ladan; Borzabadi-Farahani, Ali; Gholami, Hadi
2016-05-01
To compare the analgesic effect of topical benzocaine (5%) and ketoprofen (1.60 mg/mL) after 2 mm activation of 7 mm long delta loops used for maxillary en-masse orthodontic space closure. Twenty patients (seven males, 13 females, 15-25 years of age, mean age of 19.5 years) participated in a randomised crossover, double-blind trial. After appliance activation, participants were instructed to use analgesic gels and record pain perception at 2, 6, 24 hours and 2, 3 and 7 days (at 18.00 hrs), using a visual analogue scale ruler (VAS, 0-4). Each patient received all three gels (benzocaine, ketoprofen, and a control (placebo)) randomly, but at three different appliance activation visits following a wash-over gap of one month. After the first day, the patients were instructed to repeat gel application twice a day at 10:00 and 18:00 hrs for three days. The recorded pain scores were subjected to non-parametric analysis. The highest pain was recorded at 2 and 6 hours. Pain scores were significantly different between the three groups (Kruskal-Wallis test, p < 0.01). The overall mean (SD) pain scores for the benzocaine 5%, ketoprofen, and control (placebo) groups were 0.89 (0.41), 0.68 (0.34), and 1.15 (0.81), respectively. The pain scores were significantly different between the ketoprofen and control groups (mean difference = 0.47, p = 0.005). All groups demonstrated significant differences in pain scores at the six different time intervals (p < 0.05) and there was no gender difference (p > 0.05). A significant pain reduction was observed following the use of ketoprofen when tested against a control gel (placebo). The highest pain scores were experienced in patients administered the placebo and the lowest scores in patients who applied ketoprofen gel. Benzocaine had an effect mid-way between ketoprofen and the placebo. The highest pain scores were recorded 2 hours following force application, which decreased to the lowest scores after 7 days.
Conroy, S B; Drennan, M J; Kenny, D A; McGee, M
2009-11-01
This study examined the relationship of muscular and skeletal scores and ultrasound measurements in the live animal, and carcass conformation and fat scores with carcass composition and value using 336 steers, slaughtered at 2 years of age. Live animal scores and measurements were recorded at 8 to 12 months of age and pre-slaughter. Following slaughter, each carcass was classified for conformation and fatness and the right side dissected into meat, fat and bone. Carcass conformation scores and fat scores were both measured on a continuous 15-point scale and ranged from 2.0 to 12.0 and from 2.8 to 13.3, respectively. Pre-slaughter muscular scores showed positive correlations (P < 0.001) ranging from 0.31 to 0.86 with carcass meat proportion, proportion of high-value cuts in the carcass, conformation score and carcass value, significant negative correlations with carcass fat (r = -0.13) and bone (r = -0.81) proportions, and generally low non-significant relationships with the proportion of high-value cuts in meat and carcass fat score. Pre-slaughter ultrasound muscle depth and carcass conformation score showed similar correlations with carcass traits to those using the pre-slaughter muscular scoring procedure. Pre-slaughter ultrasound fat depth showed positive correlations (P < 0.001) with carcass fat proportion (r = 0.59) and fat score (r = 0.63), and significant negative correlations (-0.23 to -0.50) with carcass meat and bone proportions, high-value cuts in the carcass and in meat, and carcass value. Pre-slaughter skeletal scores generally showed poor correlations ranging from -0.38 to 0.52 with the various carcass traits. Corresponding correlations (-0.26 to 0.44) involving records collected at 8 to 12 months of age were lower than those using pre-slaughter records. A one-unit increase in carcass conformation score increased carcass meat proportion and value by 11.2 g/kg and 5.6 cents/kg, respectively. Corresponding values for fat score were -8.2 g/kg and -5.1 cents/kg. In conclusion, both pre-slaughter live animal scores/measurements and carcass classification scores, explained an appreciable amount of the total variation in carcass meat, fat and bone proportions and carcass value, and a moderate amount of the variation in proportion of high-value meat cuts in the carcass.
Jin, H; Wu, S; Vidyanti, I; Di Capua, P; Wu, B
2015-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". Depression is a common and often undiagnosed condition for patients with diabetes. It is also a condition that significantly impacts healthcare outcomes, use, and cost as well as elevating suicide risk. Therefore, a model to predict depression among diabetes patients is a promising and valuable tool for providers to proactively assess depressive symptoms and identify those with depression. This study seeks to develop a generalized multilevel regression model, using a longitudinal data set from a recent large-scale clinical trial, to predict depression severity and presence of major depression among patients with diabetes. Severity of depression was measured by the Patient Health Questionnaire PHQ-9 score. Predictors were selected from 29 candidate factors to develop a 2-level Poisson regression model that can make population-average predictions for all patients and subject-specific predictions for individual patients with historical records. Newly obtained patient records can be incorporated with historical records to update the prediction model. Root-mean-square errors (RMSE) were used to evaluate predictive accuracy of PHQ-9 scores. The study also evaluated the classification ability of using the predicted PHQ-9 scores to classify patients as having major depression. Two time-invariant and 10 time-varying predictors were selected for the model. Incorporating historical records and using them to update the model may improve both predictive accuracy of PHQ-9 scores and classification ability of the predicted scores. Subject-specific predictions (for individual patients with historical records) achieved RMSE about 4 and areas under the receiver operating characteristic (ROC) curve about 0.9 and are better than population-average predictions. The study developed a generalized multilevel regression model to predict depression and demonstrated that using generalized multilevel regression based on longitudinal patient records can achieve high predictive ability.
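The modelling approach described above can be illustrated with a minimal sketch. The code below is not the study's implementation: it fits only a population-average Poisson regression of PHQ-9 scores on a few hypothetical predictors (the published model was a two-level, subject-specific Poisson regression with 2 time-invariant and 10 time-varying predictors), and evaluates it with the same RMSE metric.

```python
# Minimal sketch, not the study's code: population-average Poisson regression
# of PHQ-9 scores on hypothetical predictors, evaluated by RMSE as in the paper.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
# Hypothetical predictors: baseline PHQ-9, HbA1c, age (illustrative stand-ins only)
X = np.column_stack([rng.integers(0, 27, n),
                     rng.normal(7.5, 1.5, n),
                     rng.integers(30, 80, n)])
y = rng.poisson(np.clip(0.5 * X[:, 0] + 0.3 * (X[:, 1] - 7.0), 0.1, None))  # synthetic PHQ-9

Xc = sm.add_constant(X)
fit = sm.GLM(y, Xc, family=sm.families.Poisson()).fit()

pred = fit.predict(Xc)
rmse = np.sqrt(np.mean((y - pred) ** 2))   # predictive accuracy metric used in the study
major = pred >= 10                         # crude illustrative cut-off for "major depression"
print(rmse, major.mean())
```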
NIMBUS 7 Earth Radiation Budget (ERB) Matrix User's Guide. Volume 2: Tape Specifications
NASA Technical Reports Server (NTRS)
Ray, S. N.; Vasanth, K. L.
1984-01-01
The ERB MATRIX tape is generated by an IBM 3081 computer program and is a 9-track, 1600 BPI tape. The gross format of the tape, given on Page 1, shows an initial standard header file followed by data files. The standard header file contains two standard header records. A trailing documentation file (TDF) is the last file on the tape. Pages 9 through 17 describe, in detail, the standard header file and the TDF. The data files contain data for 37 different ERB parameters. Each file has data based on either a daily, 6-day cyclic, or monthly time interval. There are three types of physical records in the data files: the world grid physical record, the documentation mercator/polar map projection physical record, and the monthly calibration physical record. The manner in which the data for the 37 ERB parameters are stored in the physical records comprising the data files is given in the gross format section.
A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables
NASA Astrophysics Data System (ADS)
Huang, Laura X.; Isaac, George A.; Sheng, Grant
2014-01-01
This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as background gridded data for generating the integrated nowcasts. Seven forecast variables of temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models among 15 sites, the integrated weighted model was found to produce more accurate forecasts for the 7 selected forecast variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
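The skill metric named above is standard and easy to sketch. Below is a minimal, generic example (not SNOW-V10 code) of the multi-categorical Heidke skill score computed from a K x K contingency table of forecast versus observed categories.

```python
# Generic sketch of the multi-categorical Heidke skill score (HSS) from a
# K x K contingency table: rows = forecast category, columns = observed category.
import numpy as np

def heidke_skill_score(table):
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_correct = np.trace(table) / n                                   # hit proportion
    p_chance = (table.sum(axis=1) * table.sum(axis=0)).sum() / n**2   # expected by chance
    return (p_correct - p_chance) / (1.0 - p_chance)

# Illustrative 3-category example (e.g. visibility classes) for one site
table = np.array([[30, 5, 2],
                  [6, 40, 8],
                  [1, 7, 25]])
print(round(heidke_skill_score(table), 3))
```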
Repressive coping and self-reports of parenting.
Myers, L B; Brewin, C R; Winter, D A
1999-03-01
To investigate whether women who possess a repressive coping style (repressors) self-report more positive judgments of their childhood on questionnaire and repertory grid measures compared with non-repressors. Repressors (low anxiety-high defensiveness) were compared with a composite group of non-repressors, containing some low anxious (low anxiety-low defensiveness), some high anxious (high anxiety-low defensiveness), some defensive high anxious (high anxiety-high defensiveness) and some non-extreme scorers. Participants completed the Parental Bonding Instrument (PBI; Parker, Tupling & Brown, 1979) and a 10 x 10 repertory grid, Self-Identification Form. On the PBI, repressors scored significantly higher than non-repressors on paternal care and significantly lower on paternal overprotection. There were no group differences for maternal measures. On the repertory grid, repressors compared with non-repressors perceived (a) themselves as significantly closer to their father, a woman they like, and their ideal partner, and significantly further from a woman they dislike, and a man they dislike; and (b) their father as significantly closer to a woman they like, a partner/person they admire, and an ideal partner. In addition, repressors were significantly tighter on construing than non-repressors. The results supported the hypothesis that repressors would rate their interactions with their fathers more positively than non-repressors when allowed to do so on self-report measures.
Visioli, Sonia; Lodi, Giovanni; Carrassi, Antonio; Zannini, Lucia
2009-08-01
This pilot study is based on observational research of lecturing skills during the annual Oral Medicine course at the Milan Dentistry School. Our goals were to explore how teachers exhibited desirable lecturing skills, to observe how their attitudes and lecturing skills affected students' attention and thereby learning, and to provide feedback. We prepared a structured observational grid divided into four categories: explaining, questioning, visual aids, and lecturer attitude. The grid was filled in by a participant, nonactive researcher. Two main types of lecture were observed: "traditional" and "interactive". Both of these can result in a high level of attention among students. Among the categories, only "lecturer attitude" appeared to affect student attention. In particular, the skills of "speaking aloud" and "sustaining verbal communication with vocal inflection" appeared to have the greatest impact on lecturer attitude. The data were then presented blindly to the five lecturers, who were able to identify their own lesson. Our grid proved to be a valid instrument although it was very expensive. When integrated with other strategies for improving lecturing, such as student scoring, peer evaluation, and microteaching, observational research can be a cost-effective method to stimulate guided reflection and to improve the lecturing skills of faculty members.
Profile and genetic parameters of dairy cattle locomotion score and lameness across lactation.
Kougioumtzis, A; Valergakis, G E; Oikonomou, G; Arsenos, G; Banos, G
2014-01-01
This study investigated the profile of locomotion score and lameness before the first calving and throughout the first (n=237) and second (n=66) lactation of 303 Holstein cows raised on a commercial farm. Weekly heritability estimates of locomotion score and lameness, and their genetic and phenotypic correlations with milk yield, body condition score, BW and reproduction traits were derived. Daughter future locomotion score and lameness predictions from their sires' breeding values for conformation traits were also calculated. First-lactation cows were monitored weekly from 6 weeks before calving to the end of lactation. Second-lactation cows were monitored weekly throughout lactation. Cows were locomotion scored on a scale from one (sound) to five (severely lame); a score greater than or equal to two defined presence of lameness. Cows' weekly body condition score and BW was also recorded. These records were matched to corresponding milk yield records, where the latter were 7-day averages on the week of inspection. The total number of repeated records amounted to 12 221. Data were also matched to the farm's reproduction database, from which five traits were derived. Statistical analyses were based on uni- and bivariate random regression models. The profile analysis showed that locomotion and lameness problems in first lactation were fewer before and immediately after calving, and increased as lactation progressed. The profile of the two traits remained relatively constant across the second lactation. Highest heritability estimates were observed in the weeks before first calving (0.66 for locomotion score and 0.54 for lameness). Statistically significant genetic correlations were found for first lactation weekly locomotion score and lameness with body condition score, ranging from -0.31 to -0.65 and from -0.44 to -0.76, respectively, suggesting that cows genetically pre-disposed for high body condition score have fewer locomotion and lameness issues. Negative (favourable) phenotypic correlations between first lactation weekly locomotion score/lameness and milk yield averaged -0.27 and -0.17, respectively, and were attributed to management factors. Also a phenotypic correlation between lameness and conception rate of -0.19 indicated that lame cows were associated with lower success at conceiving. First-lactation daughter locomotion score and/or lameness predictions from sires' estimated breeding values for conformation traits revealed a significant linear effect of rear leg side view, rear leg rear view, overall conformation, body condition score and locomotion, and a quadratic effect of foot angle.
Googe, Joseph; Brucker, Alexander J; Bressler, Neil M; Qin, Haijing; Aiello, Lloyd P; Antoszyk, Andrew; Beck, Roy W; Bressler, Susan B; Ferris, Frederick L; Glassman, Adam R; Marcus, Dennis; Stockdale, Cynthia R
2011-06-01
To evaluate 14-week effects of intravitreal ranibizumab or triamcinolone in eyes receiving focal/grid laser for diabetic macular edema and panretinal photocoagulation. Three hundred and forty-five eyes with a visual acuity of 20/320 or better, center-involved diabetic macular edema receiving focal/grid laser, and diabetic retinopathy receiving prompt panretinal photocoagulation were randomly assigned to sham (n = 123), 0.5-mg ranibizumab (n = 113) at baseline and 4 weeks, and 4-mg triamcinolone at baseline and sham at 4 weeks (n = 109). Treatment was at investigator discretion from 14 weeks to 56 weeks. Mean changes (±SD) in visual acuity letter score from baseline were significantly better in the ranibizumab (+1 ± 11; P < 0.001) and triamcinolone (+2 ± 11; P < 0.001) groups compared with those in the sham group (-4 ± 14) at the 14-week visit, mirroring retinal thickening results. These differences were not maintained when study participants were followed for 56 weeks for safety outcomes. One eye (0.9%; 95% confidence interval, 0.02%-4.7%) developed endophthalmitis after receiving ranibizumab. Cerebrovascular/cardiovascular events occurred in 4%, 7%, and 3% of the sham, ranibizumab, and triamcinolone groups, respectively. The addition of 1 intravitreal triamcinolone injection or 2 intravitreal ranibizumab injections in eyes receiving focal/grid laser for diabetic macular edema and panretinal photocoagulation is associated with better visual acuity and decreased macular edema by 14 weeks. Whether continued long-term intravitreal treatment is beneficial cannot be determined from this study.
Improving the quality of marine geophysical track line data: Along-track analysis
NASA Astrophysics Data System (ADS)
Chandler, Michael T.; Wessel, Paul
2008-02-01
We have examined 4918 track line geophysics cruises archived at the U.S. National Geophysical Data Center (NGDC) using comprehensive error checking methods. Each cruise was checked for observation outliers, excessive gradients, metadata consistency, and general agreement with satellite altimetry-derived gravity and predicted bathymetry grids. Thresholds for error checking were determined empirically through inspection of histograms for all geophysical values, gradients, and differences with gridded data sampled along ship tracks. Robust regression was used to detect systematic scale and offset errors found by comparing ship bathymetry and free-air anomalies to the corresponding values from global grids. We found many recurring error types in the NGDC archive, including poor navigation, inappropriately scaled or offset data, excessive gradients, and extended offsets in depth and gravity when compared to global grids. While ~5-10% of bathymetry and free-air gravity records fail our conservative tests, residual magnetic errors may exceed twice this proportion. These errors hinder the effective use of the data and may lead to mistakes in interpretation. To enable the removal of gross errors without over-writing original cruise data, we developed an errata system that concisely reports all errors encountered in a cruise. With such errata files, scientists may share cruise corrections, thereby preventing redundant processing. We have implemented these quality control methods in the modified MGD77 supplement to the Generic Mapping Tools software suite.
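The robust-regression check described above can be sketched generically. The example below is illustrative only (synthetic track data, hypothetical thresholds): ship bathymetry sampled along a track is regressed on the corresponding global-grid values with a Huber loss; a slope far from 1 suggests a scale error and an intercept far from 0 suggests an offset error.

```python
# Illustrative only (synthetic track, hypothetical thresholds): robust Huber
# regression of ship depths on grid-sampled depths; slope != 1 suggests a scale
# error (e.g. fathoms logged as metres), intercept != 0 suggests an offset error.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
grid_depth = rng.uniform(1000.0, 5000.0, 400)                  # depths sampled from a global grid (m)
ship_depth = 0.5468 * grid_depth + rng.normal(0.0, 30.0, 400)  # synthetic mis-scaled cruise depths

rlm = sm.RLM(ship_depth, sm.add_constant(grid_depth), M=sm.robust.norms.HuberT()).fit()
offset, scale = rlm.params
if abs(scale - 1.0) > 0.05 or abs(offset) > 100.0:             # hypothetical flagging thresholds
    print(f"flag cruise: scale={scale:.3f}, offset={offset:.1f} m")
```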
Visualization of grid-generated turbulence in He II using PTV
NASA Astrophysics Data System (ADS)
Mastracci, B.; Guo, W.
2017-12-01
Due to its low viscosity, cryogenic He II has potential use for simulating large-scale, high Reynolds number turbulent flow in a compact and efficient apparatus. To realize this potential, the behavior of the fluid in the simplest cases, such as turbulence generated by flow past a mesh grid, must be well understood. We have designed, constructed, and commissioned an apparatus to visualize the evolution of turbulence in the wake of a mesh grid towed through He II. Visualization is accomplished using the particle tracking velocimetry (PTV) technique, where μm-sized tracer particles are introduced to the flow, illuminated with a planar laser sheet, and recorded by a scientific imaging camera; the particles move with the fluid, and tracking their motion with a computer algorithm results in a complete map of the turbulent velocity field in the imaging region. In our experiment, this region is inside a carefully designed He II filled cast acrylic channel measuring approximately 16 × 16 × 330 mm. One of three different grids, which have mesh numbers M = 3, 3.75, or 5 mm, can be attached to the pulling system which moves it through the channel with constant velocity up to 600 mm/s. The consequent motion of the solidified deuterium tracer particles is used to investigate the energy statistics, effective kinematic viscosity, and quantized vortex dynamics in turbulent He II.
Evaluation of decadal hindcasts using satellite simulators
NASA Astrophysics Data System (ADS)
Spangehl, Thomas; Mazurkiewicz, Alex; Schröder, Marc
2013-04-01
The evaluation of dynamical ensemble forecast systems requires a solid validation of basic processes such as the global atmospheric water and energy cycle. The value of any validation approach strongly depends on the quality of the observational data records used. Current approaches utilize in situ measurements, remote sensing data and reanalyses. Related data records are subject to a number of uncertainties and limitations such as representativeness, spatial and temporal resolution and homogeneity. However, recently several climate data records with known and sufficient quality became available. In particular, the satellite data records offer the opportunity to obtain reference information on global scales including the oceans. Here we consider the simulation of satellite radiances from the climate model output enabling an evaluation in the instrument's parameter space to avoid uncertainties stemming from the application of retrieval schemes in order to minimise uncertainties on the reference side. Utilizing the CFMIP Observation Simulator Package (COSP) we develop satellite simulators for the Tropical Rainfall Measuring Mission precipitation radar (TRMM PR) and the Infrared Atmospheric Sounding Interferometer (IASI). The simulators are applied within the MiKlip project funded by BMBF (German Federal Ministry of Education and Research) to evaluate decadal climate predictions performed with the MPI-ESM developed at the Max Planck Institute for Meteorology. While TRMM PR enables the evaluation of the vertical structure of precipitation over tropical and sub-tropical areas, IASI is used to support the global evaluation of clouds and radiation. In a first step the reliability of the developed simulators needs to be explored. The simulation of radiances in the instrument space requires the generation of sub-grid scale variability from the climate model output. Furthermore, assumptions are made to simulate radiances such as, for example, the distribution of different hydrometeor types. Therefore, testing is performed to determine the extent to which the quality of the simulator results depends on the applied methods used to generate sub-grid variability (e.g. sub-grid resolution). Moreover, the sensitivity of results to the choice of different distributions of hydrometeors is explored. The model evaluation is carried out in a statistical manner using histograms of radar reflectivities (TRMM PR) and brightness temperatures (IASI). Finally, methods to deduce data suitable for probabilistic evaluation of decadal hindcasts such as simple indices are discussed.
Development of the TeamOBS-PPH - targeting clinical performance in postpartum hemorrhage.
Brogaard, Lise; Hvidman, Lone; Hinshaw, Kim; Kierkegaard, Ole; Manser, Tanja; Musaeus, Peter; Arafeh, Julie; Daniels, Kay I; Judy, Amy E; Uldbjerg, Niels
2018-06-01
This study aimed to develop a valid and reliable TeamOBS-PPH tool for assessing clinical performance in the management of postpartum hemorrhage (PPH). The tool was evaluated using video-recordings of teams managing PPH in both real-life and simulated settings. A Delphi panel consisting of 12 obstetricians from the UK, Norway, Sweden, Iceland, and Denmark achieved consensus on (i) the elements to include in the assessment tool, (ii) the weighting of each element, and (iii) the final tool. The validity and reliability were evaluated according to Cook and Beckman. (Level 1) Four raters scored four video-recordings of in situ simulations of PPH. (Level 2) Two raters scored 85 video-recordings of real-life teams managing patients with PPH ≥1000 mL in two Danish hospitals. (Level 3) Two raters scored 15 video-recordings of in situ simulations of PPH from a US hospital. The tool was designed with scores from 0 to 100. (Level 1) Teams of novices had a median score of 54 (95% CI 48-60), whereas experienced teams had a median score of 75 (95% CI 71-79; p < 0.001). (Level 2) The intra-rater [intra-class correlation (ICC) = 0.96] and inter-rater (ICC = 0.83) agreements for real-life PPH were strong. The tool was applicable in all cases: atony, retained placenta, and lacerations. (Level 3) The tool was easily adapted to in situ simulation settings in the USA (ICC = 0.86). The TeamOBS-PPH tool appears to be valid and reliable for assessing clinical performance in real-life and simulated settings. The tool will be shared as the free TeamOBS App. © 2018 Nordic Federation of Societies of Obstetrics and Gynecology.
Universite de Nancy (France) measurement report
NASA Astrophysics Data System (ADS)
Hadni, A.; Gerbaux, X.
1991-10-01
Measurements made by conventional Fourier transform spectroscopy using a polarizing wire grid interferometer with roof top reflectors and a rotating polarizing radiation chopper giving 10 Hz radiation modulation are presented. The radiation source used is a mercury vapor arc discharge lamp, and the detector a pumped liquid helium temperature silicon bolometer with a teflon input window and a low temperature quartz wedge acting as a low pass filter. The power transmission spectrum of each specimen measured is determined at nearly normal incidence with the specimen placed in a nominally collimated beam between the final analyzer grid and the output lens. The interferograms are recorded over a range of moving mirror positions about the position of zero path difference. No interferogram weighting function is used in the measurements. The spectral resolution of the measurements is 0.006 cm⁻¹.
Medical Data GRIDs as approach towards secure cross enterprise document sharing (based on IHE XDS).
Wozak, Florian; Ammenwerth, Elske; Breu, Micheal; Penz, Robert; Schabetsberger, Thomas; Vogl, Raimund; Wurz, Manfred
2006-01-01
Quality and efficiency of health care services are expected to be improved by the electronic processing and trans-institutional availability of medical data. A prototype architecture based on the IHE-XDS profile is currently being developed. Due to legal and organizational requirements, specific adaptations to the IHE-XDS profile have been made. In this work, the services of the health@net reference architecture, which have been developed with a focus on compliance with both the IHE-XDS profile and the legal situation in Austria, are described in detail. We expect to gain knowledge about the development of a shared electronic health record using Medical Data Grids as an Open Source reference implementation, and about how proprietary Hospital Information systems can be integrated into this environment.
Introducing MCgrid 2.0: Projecting cross section calculations on grids
NASA Astrophysics Data System (ADS)
Bothmann, Enrico; Hartland, Nathan; Schumann, Steffen
2015-11-01
MCgrid is a software package that provides access to interpolation tools for Monte Carlo event generator codes, allowing for the fast and flexible variation of scales, coupling parameters and PDFs in cutting edge leading- and next-to-leading-order QCD calculations. We present the upgrade to version 2.0 which has a broader scope of interfaced interpolation tools, now providing access to fastNLO, and features an approximated treatment for the projection of MC@NLO-type calculations onto interpolation grids. MCgrid 2.0 also now supports the extended information provided through the HepMC event record used in the recent SHERPA version 2.2.0. The additional information provided therein allows for the support of multi-jet merged QCD calculations in a future update of MCgrid.
Spatial and temporal variability of canopy microclimate in a Sierra Nevada riparian forest
T. Rambo; M. North
2008-01-01
Past riparian microclimate studies have measured changes horizontally from streams, but not vertically through the forest canopy. We recorded temperature and relative humidity for a year along a two-dimensional grid of 24 data-loggers arrayed up to 40 m height in four trees 2 - 30 m slope distance from a perennial second order stream in...
Comparison of methods for estimating the spread of a non-indigenous species
Patrick C. Tobin; Andrew M. Liebhold; E. Anderson Roberts
2007-01-01
Aim: To compare different quantitative approaches for estimating rates of spread in the exotic species gypsy moth, Lymantria dispar L., using county-level presence/absence data and spatially extensive trapping grids. Location: USA. Methods: We used county-level presence/absence records of the gypsy moth's distribution in the USA, which are available beginning in 1900,...
Campbell, Wallace H.
1995-01-01
The social uses of geomagnetism include the physics of the space environment, satellite damage, pipeline corrosion, electric power-grid failure, communication interference, global positioning disruption, mineral-resource detection, interpretation of the Earth's formation and structure, navigation, weather, and magnetoreception in organisms. The need for continuing observations of the geomagnetic field, together with careful archiving of these records and mechanisms for dissemination of these data, is emphasized.
NASA Astrophysics Data System (ADS)
Leibowitz, Elia
2017-01-01
During an intensive observational campaign over the nine-month duration of a Chandra X-ray Visionary Project conducted in 2012, 39 large X-ray flares of Sgr A* were recorded. An analysis of the times of the observed flares reveals that the 39 flares are separated in time by intervals that are grouped around integer multiples of 0.10333 days. This time interval is thus the period of a uniform grid of equally spaced points on the time axis. The grouping of the flares around tick marks of this grid is derived from the data with at least a 3.2 σ level of statistical significance. No signal of any period can be found among 22 flares recorded by Chandra in the years 2013-2014. If the 0.10333 day period is that of a nearly circular Keplerian orbit around the black hole at the center of the Galaxy, its radius is 7.6 Schwarzschild radii. Large flares were more likely to be triggered when the agent responsible for their outbursts was near the peri-center phase of its slightly eccentric orbit.
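The grid test described in this abstract can be illustrated with a toy calculation; the sketch below is not the paper's analysis code and uses synthetic flare times. It measures how far each inter-flare interval lies from the nearest integer multiple of a trial period.

```python
# Toy illustration, not the paper's analysis: distance of each inter-flare
# interval from the nearest integer multiple of a trial period P. For randomly
# timed flares the mean normalised distance is ~0.25; values near 0 support a grid.
import numpy as np

def grid_residuals(flare_times_days, period):
    intervals = np.diff(np.sort(flare_times_days))
    phase = np.mod(intervals, period) / period
    return np.minimum(phase, 1.0 - phase)        # distance to nearest tick, in [0, 0.5]

P = 0.10333                                      # trial period (days) from the abstract
rng = np.random.default_rng(2)
times = np.cumsum(rng.integers(1, 8, 39) * P)    # synthetic flares placed exactly on the grid
print(grid_residuals(times, P).mean())           # ~0 here; ~0.25 for random flare times
```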
Bleeding and starving: fasting and delayed refeeding after upper gastrointestinal bleeding.
Fonseca, Jorge; Meira, Tânia; Nunes, Ana; Santos, Carla Adriana
2014-01-01
Early refeeding after nonvariceal upper gastrointestinal bleeding is safe and reduces hospital stay/costs. The aim of this study was to obtain objective data on refeeding after nonvariceal upper gastrointestinal bleeding. From one year of records of nonvariceal upper gastrointestinal bleeding patients who underwent urgent endoscopy, the following were collected: clinical features; Rockall score; endoscopic data, including severity of lesions and therapy; and feeding-related records for seven days: liquid diet prescription, first liquid intake, soft/solid diet prescription, and first soft/solid intake. Of 133 patients (84 men), Rockall classification was possible in 126: 76 scored ≥5 and 50 scored <5. One had persistent bleeding, eight rebled, two underwent surgery, and 13 died. Ulcer was the major bleeding cause, and 63 patients underwent endoscopic therapy. Only 142 of 532 possible refeeding records were present, and 37% of patients had no record at all. Only 16% were fed during the first day, and half were only fed on the third day or later. Rockall <5 patients started an oral diet sooner than Rockall ≥5 patients. Patients who underwent endoscopic therapy were refed earlier than those without endotherapy. Most feeding records are missing. The data reveal delayed refeeding, especially in patients with low-risk lesions who should have been fed immediately. Nonvariceal upper gastrointestinal bleeding patients must be refed earlier, according to guidelines.
Automated extraction of clinical traits of multiple sclerosis in electronic medical records
Davis, Mary F; Sriram, Subramaniam; Bush, William S; Denny, Joshua C; Haines, Jonathan L
2013-01-01
Objectives The clinical course of multiple sclerosis (MS) is highly variable, and research data collection is costly and time consuming. We evaluated natural language processing techniques applied to electronic medical records (EMR) to identify MS patients and the key clinical traits of their disease course. Materials and methods We used four algorithms based on ICD-9 codes, text keywords, and medications to identify individuals with MS from a de-identified, research version of the EMR at Vanderbilt University. Using a training dataset of the records of 899 individuals, algorithms were constructed to identify and extract detailed information regarding the clinical course of MS from the text of the medical records, including clinical subtype, presence of oligoclonal bands, year of diagnosis, year and origin of first symptom, Expanded Disability Status Scale (EDSS) scores, timed 25-foot walk scores, and MS medications. Algorithms were evaluated on a test set validated by two independent reviewers. Results We identified 5789 individuals with MS. For all clinical traits extracted, precision was at least 87% and specificity was greater than 80%. Recall values for clinical subtype, EDSS scores, and timed 25-foot walk scores were greater than 80%. Discussion and conclusion This collection of clinical data represents one of the largest databases of detailed, clinical traits available for research on MS. This work demonstrates that detailed clinical information is recorded in the EMR and can be extracted for research purposes with high reliability. PMID:24148554
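The rule-based identification and the precision/recall evaluation described above can be sketched generically. The records, codes and rule combination below are hypothetical stand-ins, not the Vanderbilt algorithms.

```python
# Hypothetical stand-ins, not the Vanderbilt algorithms: flag possible MS cases
# from ICD-9 codes, MS medications and note keywords, then score the rule
# against a manually reviewed gold standard with precision and recall.
import re

MS_ICD9 = {"340"}                                             # ICD-9 code for multiple sclerosis
MS_DRUGS = {"interferon beta", "glatiramer", "natalizumab"}
KEYWORDS = re.compile(r"\b(multiple sclerosis|oligoclonal bands|EDSS)\b", re.I)

def is_ms_case(record):
    has_code = bool(MS_ICD9 & set(record.get("icd9", [])))
    has_drug = any(d in record.get("meds", "").lower() for d in MS_DRUGS)
    has_text = bool(KEYWORDS.search(record.get("notes", "")))
    return has_code and (has_drug or has_text)                # one possible rule combination

def precision_recall(pred, truth):
    tp = sum(p and t for p, t in zip(pred, truth))
    return tp / max(sum(pred), 1), tp / max(sum(truth), 1)

records = [{"icd9": ["340"], "meds": "natalizumab", "notes": "EDSS 3.5"},
           {"icd9": ["723.1"], "meds": "", "notes": "neck pain"}]
print(precision_recall([is_ms_case(r) for r in records], [True, False]))
```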
Lee, S W; Lee, J H; Sung, H H; Park, H J; Park, J K; Choi, S K; Kam, S C
2013-01-01
This study compared the prevalence of premature ejaculation (PE) diagnosed by the PE diagnostic tool (PEDT) score, self-reporting and stopwatch-recorded intravaginal ejaculation latency time (IELT). It examined the characteristics of males diagnosed with PE by each criterion. A questionnaire survey enrolled 2081 subjects from March to October, 2010. Stopwatch-recorded IELT was measured in 1035 of the 2081 subjects. We aimed to determine whether PE has an influence on the frequency and satisfaction of sexual intercourse, the degree of libido/erectile function and the satisfaction. These factors were evaluated according to different definitions of PE to assess whether the definition used yielded differences in the data. The prevalence of PE, based on a PEDT score of ≥11, self-reporting and stopwatch-recorded IELT of ≤1 min was 11.3%, 19.5% and 3%, respectively. The prevalence of PE diagnoses based on PEDT score and self-reporting increased with age, but stopwatch-recorded IELT-based diagnoses did not. Males experiencing PE showed lower levels of libido, erectile function and frequency and satisfaction of sexual intercourse compared with non-PE males. PE males felt that they did not satisfy their partners in terms of the partners' sexual satisfaction and frequency of orgasm, in comparison with non-PE males. PE is a highly prevalent sexual dysfunction in males. Regardless of whether the PE diagnosis was made on the basis of self-reporting, PEDT score or stopwatch-recorded IELT, subjective symptoms were similar among PE males.
Mallen, C A
1983-01-01
The sex-role stereotypes held by heterosexual and homosexual men were examined by comparing their Repertory Grid scores. It was found that homosexual men held less rigid sex-role stereotypes than heterosexuals. Degree of opposite-sex identification was marginally greater in homosexuals, but neither group showed strong masculine or feminine stereotypic identification. Homosexual men perceived themselves as psychologically more distant from their fathers than did their heterosexual counterparts; this was probably an effect of homosexuality rather than a cause.
NASA Astrophysics Data System (ADS)
Mizukami, N.; Smith, M. B.
2010-12-01
It is common for the error characteristics of long-term precipitation data to change over time due to various factors such as gauge relocation and changes in data processing methods. The temporal consistency of precipitation data error characteristics is as important as data accuracy itself for hydrologic model calibration and subsequent use of the calibrated model for streamflow prediction. In mountainous areas, the generation of precipitation grids relies on sparse gage networks, the makeup of which often varies over time. This causes a change in error characteristics of the long-term precipitation data record. We will discuss the diagnostic analysis of the consistency of gridded precipitation time series and illustrate the adverse effect of inconsistent precipitation data on a hydrologic model simulation. We used hourly 4 km gridded precipitation time series over a mountainous basin in the Sierra Nevada Mountains of California from October 1988 through September 2006. The basin is part of the broader study area that served as the focus of the second phase of the Distributed Model Intercomparison Project (DMIP-2), organized by the U.S. National Weather Service (NWS) of the National Oceanographic and Atmospheric Administration (NOAA). To check the consistency of the gridded precipitation time series, double mass analysis was performed using single pixel and basin mean areal precipitation (MAP) values derived from gridded DMIP-2 and Parameter-Elevation Regressions on Independent Slopes Model (PRISM) precipitation data. The analysis leads to the conclusion that over the entire study time period, a clear change in error characteristics in the DMIP-2 data occurred in the beginning of 2003. This matches the timing of one of the major gage network changes. The inconsistency of two MAP time series computed from the gridded precipitation fields over two elevation zones was corrected by adjusting hourly values based on the double mass analysis. We show that model simulations using the adjusted MAP data produce improved stream flow compared to simulations using the inconsistent MAP input data.
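Double mass analysis, as used above, is straightforward to sketch: cumulative precipitation from the gridded product is compared against cumulative precipitation from the reference (PRISM-based) series, and a change in slope marks the inconsistency. The data and break index below are synthetic; the slope ratio gives a simple adjustment factor of the kind described.

```python
# Illustrative double mass analysis (synthetic data, not DMIP-2 processing):
# cumulative gridded MAP is compared with cumulative reference (PRISM-based) MAP;
# a slope change at the suspected break marks inconsistent error characteristics,
# and the slope ratio gives a simple adjustment factor for the later period.
import numpy as np

def double_mass(map_grid, map_ref, break_idx):
    cg, cr = np.cumsum(map_grid), np.cumsum(map_ref)
    slope_before = cg[break_idx] / cr[break_idx]
    slope_after = (cg[-1] - cg[break_idx]) / (cr[-1] - cr[break_idx])
    return slope_before, slope_after, slope_before / slope_after

rng = np.random.default_rng(3)
ref = rng.gamma(2.0, 2.0, 216)                            # 216 months, Oct 1988 - Sep 2006 (synthetic)
grid = ref * np.where(np.arange(216) < 172, 1.0, 0.8)     # error characteristics change in early 2003
print(double_mass(grid, ref, 172))                        # slopes ~1.0 and ~0.8, adjustment ~1.25
```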
Mining and Integration of Environmental Data
NASA Astrophysics Data System (ADS)
Tran, V.; Hluchy, L.; Habala, O.; Ciglan, M.
2009-04-01
The project ADMIRE (Advanced Data Mining and Integration Research for Europe) is a 7th FP EU ICT project that aims to deliver a consistent and easy-to-use technology for extracting information and knowledge. The project is motivated by the difficulty of extracting meaningful information by data mining combinations of data from multiple heterogeneous and distributed resources. It will also provide an abstract view of data mining and integration, which will give users and developers the power to cope with the complexity and heterogeneity of services, data and processes. Data sets describing phenomena from domains like business, society and the environment often contain spatial and temporal dimensions. Integration of spatio-temporal data from different sources is a challenging task because of those dimensions. Different spatio-temporal data sets contain data at different resolutions (e.g. size of the spatial grid) and frequencies. This heterogeneity is the principal challenge of geo-spatial and temporal data set integration - the integrated data set should hold homogeneous data of the same resolution and frequency. Thus, to integrate heterogeneous spatio-temporal data from distinct sources, transformation of one or more data sets is necessary. The following transformation operations are required (a minimal grid-transformation sketch follows this record):
• transformation to a common spatial and temporal representation (e.g. transformation to a common coordinate system),
• spatial and/or temporal aggregation - data from the more detailed data source are aggregated to match the resolution of the other resources involved in the integration process,
• spatial and/or temporal record decomposition - records from the source with lower-resolution data are decomposed to match the granularity of the other data source. This operation decreases data quality (e.g. transformation of data from a 50 km grid to a 10 km grid) - data from the lower-resolution data set in the integrated schema are imprecise, but it allows us to preserve the higher-resolution data.
We can decompose spatio-temporal data integration into the following phases:
• pre-integration data processing - different data sets can be physically stored in different formats (e.g. relational databases, text files), so it might be necessary to pre-process the data sets to be integrated,
• identification of the transformation operations necessary to integrate data in the spatio-temporal dimensions,
• identification of the transformation operations to be performed on non-spatio-temporal attributes, and
• output data schema and set generation - given the prepared data and the set of transformation operations, the final integrated schema is produced.
The spatio-temporal dimension also brings its specifics to the problem of mining spatio-temporal data sets. Spatio-temporal relationships exist among records in such data sets and should be considered in the mining operation. This means that when analyzing a record in a spatio-temporal data set, the records in its spatial and/or temporal proximity should be taken into account. In addition, the relationships discovered in spatio-temporal data can differ when the same data are mined at different scales (e.g. mining the same data sets on a 50 km grid with daily data vs. a 10 km grid with hourly data). To be able to do effective data mining, we first needed to gather a sufficient amount of environmental data covering a similar area and time span.
For this purpose we have engaged in cooperation with several organizations working in the environmental domain in Slovakia, some of which are also our partners from previous research efforts. The organizations which volunteered some of their data are the Slovak Hydro-meteorological Institute (SHMU), the Slovak Water Enterprise (SVP), the Soil Science and Conservation Institute (VUPOP), and the Institute of Hydrology of the Slovak Academy of Sciences (UHSAV). We have prepared scenarios from general meteorology, as well as specialized in hydrology and soil protection.
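The aggregation and decomposition operations listed in the preceding record can be sketched with generic array code (this is illustrative numpy, not ADMIRE software): a fine 10 km grid is aggregated to a 50 km grid by block averaging, and a coarse grid is decomposed back to the fine lattice by repeating values, which preserves coverage but not precision.

```python
# Generic numpy illustration, not ADMIRE software: block-average a fine 10 km
# grid to a 50 km grid (aggregation), and expand a coarse grid back to the fine
# lattice by repetition (decomposition), which preserves coverage but not precision.
import numpy as np

def aggregate(fine, factor):
    h, w = fine.shape
    return fine.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def decompose(coarse, factor):
    return np.repeat(np.repeat(coarse, factor, axis=0), factor, axis=1)

fine = np.random.default_rng(4).random((50, 50))   # e.g. 10 km cells over a 500 km square
coarse = aggregate(fine, 5)                        # 50 km cells, matching the coarser source
back = decompose(coarse, 5)                        # back on the fine lattice, but imprecise
print(fine.shape, coarse.shape, back.shape)        # (50, 50) (10, 10) (50, 50)
```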
ERIC Educational Resources Information Center
Zimmermann, Judith; von Davier, Alina A.; Buhmann, Joachim M.; Heinimann, Hans R.
2018-01-01
Graduate admission has become a critical process in tertiary education, whereby selecting valid admissions instruments is key. This study assessed the validity of Graduate Record Examination (GRE) General Test scores for admission to Master's programmes at a technical university in Europe. We investigated the indicative value of GRE scores for the…
Goyal, Manoj
2015-11-04
In Jodhpur, a large number of people suffer from non-insulin-dependent diabetes mellitus (type 2 diabetes) and use medicinal plants along with modern medicine for the management of diabetes. The aim of this work is to document the anti-diabetic plants and determine the most relevant anti-diabetic plant species using the Disease Consensus Index (DCI). An ethnomedicinal survey was conducted for the selection of anti-diabetic plants. A structured questionnaire was developed for calculation of the Disease Consensus Index and administered to fifty type 2 diabetic patients to record their responses. Twenty-one species of anti-diabetic plants were recorded; Momordica charantia (score: 0.71), Azadirachta indica (score: 0.64), Trigonella foenum-graecum (score: 0.63), Capparis decidua (score: 0.60), Withania coagulans (score: 0.54), Gymnema sylvestre (score: 0.52) and Syzygium cumini (score: 0.51) were the most significant anti-diabetic plants of the study area, having a DCI greater than 0.5. Use of anti-diabetic plants is prevalent among diabetic patients of the area. C. decidua, W. coagulans and G. sylvestre are recommended for further phytochemical and pharmacological investigation owing to their high DCI scores and relatively unexplored status. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Efficacy of Electrocuting Devices to Catch Tsetse Flies (Glossinidae) and Other Diptera
Vale, Glyn A.; Hargrove, John W.; Cullis, N. Alan; Chamisa, Andrew; Torr, Stephen J.
2015-01-01
Background The behaviour of insect vectors has an important bearing on the epidemiology of the diseases they transmit, and on the opportunities for vector control. Two sorts of electrocuting device have been particularly useful for studying the behaviour of tsetse flies (Glossina spp), the vectors of the trypanosomes that cause sleeping sickness in humans and nagana in livestock. Such devices consist of grids on netting (E-net) to catch tsetse in flight, or on cloth (E-cloth) to catch alighting flies. Catches are most meaningful when the devices catch as many as possible of the flies potentially available to them, and when the proportion caught is known. There have been conflicting indications for the catching efficiency, depending on whether the assessments were made by the naked eye or assisted by video recordings. Methodology/Principal Findings Using grids of 0.5m2 in Zimbabwe, we developed catch methods of studying the efficiency of E-nets and E-cloth for tsetse, using improved transformers to supply the grids with electrical pulses of ~40kV. At energies per pulse of 35–215mJ, the efficiency was enhanced by reducing the pulse interval from 3200 to 1ms. Efficiency was low at 35mJ per pulse, but there seemed no benefit of increasing the energy beyond 70mJ. Catches at E-nets declined when the fine netting normally used became either coarser or much finer, and increased when the grid frame was moved from 2.5cm to 27.5cm from the grid. Data for muscoids and tabanids were roughly comparable to those for tsetse. Conclusion/Significance The catch method of studying efficiency is useful for supplementing and extending video methods. Specifications are suggested for E-nets and E-cloth that are ~95% efficient and suitable for estimating the absolute numbers of available flies. Grids that are less efficient, but more economical, are recommended for studies of relative numbers available to various baits. PMID:26505202
Time-marching multi-grid seismic tomography
NASA Astrophysics Data System (ADS)
Tong, P.; Yang, D.; Liu, Q.
2016-12-01
From the classic ray-based traveltime tomography to the state-of-the-art full waveform inversion, because of the nonlinearity of seismic inverse problems, a good starting model is essential for preventing the convergence of the objective function toward local minima. With a focus on building high-accuracy starting models, we propose the so-called time-marching multi-grid seismic tomography method in this study. The new seismic tomography scheme consists of a temporal time-marching approach and a spatial multi-grid strategy. We first divide the recording period of seismic data into a series of time windows. Sequentially, the subsurface properties in each time window are iteratively updated starting from the final model of the previous time window. There are at least two advantages of the time-marching approach: (1) the information included in the seismic data of previous time windows has been explored to build the starting models of later time windows; (2) seismic data of later time windows could provide extra information to refine the subsurface images. Within each time window, we use a multi-grid method to decompose the scale of the inverse problem. Specifically, the unknowns of the inverse problem are sampled on a coarse mesh to capture the macro-scale structure of the subsurface at the beginning. Because of the low dimensionality, it is much easier to reach the global minimum on a coarse mesh. After that, finer meshes are introduced to recover the micro-scale properties. That is to say, the subsurface model is iteratively updated on multi-grid in every time window. We expect that high-accuracy starting models should be generated for the second and later time windows. We will test this time-marching multi-grid method by using our newly developed eikonal-based traveltime tomography software package tomoQuake. Real application results in the 2016 Kumamoto earthquake (Mw 7.0) region in Japan will be demonstrated.
NASA Technical Reports Server (NTRS)
Omar, Ali H.; Liu, Z.; Tackett, J.; Vaughan, M.; Trepte, C.; Winker, D.; Yu, H.
2015-01-01
The lidar on the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) mission makes robust measurements of dust and has generated a length of record that is significant both seasonally and inter-annually. We exploit this record to determine a multi-year climatology of the properties of Asian and Saharan dust, in particular seasonal optical depths, layer frequencies, and layer heights of dust gridded in accordance with the Level 3 data products protocol, between 2006 and 2015. The data are screened using standard CALIPSO quality assurance flags, cloud aerosol discrimination (CAD) scores, overlying features and layer properties. To evaluate the effects of transport on the morphology, vertical extent and size of the dust layers, we compare probability distribution functions of the layer integrated volume depolarization ratios, geometric depths and integrated attenuated color ratios near the source to the same distributions in the far field or transport region. CALIPSO, a collaboration between NASA and the Centre National d'Études Spatiales (CNES), was launched in April 2006 to provide vertically resolved measurements of cloud and aerosol distributions. The primary instrument on the CALIPSO satellite is the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP), a near-nadir viewing two-wavelength polarization-sensitive instrument. The unique nature of CALIOP measurements makes it quite challenging to validate backscatter profiles, aerosol type, and cloud phase, all of which are used to retrieve extinction and optical depth. To evaluate the uncertainty in the lidar ratios, we compare the values computed from dust layers overlying opaque water clouds, considered nominal, with the constant lidar ratio value used in the CALIOP algorithms for dust. We also explore the effects of noise on the CALIOP retrievals during daytime by comparing the distributions of the properties at daytime to the nighttime distributions.
Harvey, Colin E; Laster, Larry; Shofer, Frances S
2012-01-01
A total mouth periodontal score (TMPS) system in dogs has been described previously. Use of buccal and palatal/lingual surfaces of all teeth requires observation and recording of 120 gingivitis scores and 120 periodontitis scores. Although the result is a reliable, repeatable assessment of the extent of periodontal disease in the mouth, observing and recording 240 data points is time-consuming. Using data from a previously reported study of periodontal disease in dogs, correlation analysis was used to determine whether use of any of seven different subsets of teeth can generate TMPS subset gingivitis and periodontitis scores that are highly correlated with TMPS all-site, all-teeth scores. Overall, gingivitis scores were less highly correlated than periodontitis scores. The minimal tooth set with a significant intra-class correlation (≥0.9 for means of right and left sides) for both gingivitis scores and attachment loss measurements consisted of the buccal surface of the maxillary third incisor, canine, third premolar, fourth premolar and first molar teeth, and the mandibular canine, third premolar, fourth premolar and first molar teeth on one side (9 teeth, 15 root sites). Use of this subset of teeth, which reduces the number of data points per dog from 240 to 30 for gingivitis and periodontitis at each scoring episode, is recommended when calculating the gingivitis and periodontitis scores using the TMPS system.
Lod scores for gene mapping in the presence of marker map uncertainty.
Stringham, H M; Boehnke, M
2001-07-01
Multipoint lod scores are typically calculated for a grid of locus positions, moving the putative disease locus across a fixed map of genetic markers. Changing the order of a set of markers and/or the distances between the markers can make a substantial difference in the resulting lod score curve and the location and height of its maximum. The typical approach of using the best maximum likelihood marker map is not easily justified if other marker orders are nearly as likely and give substantially different lod score curves. To deal with this problem, we propose three weighted multipoint lod score statistics that make use of information from all plausible marker orders. In each of these statistics, the information conditional on a particular marker order is included in a weighted sum, with weight equal to the posterior probability of that order. We evaluate the type 1 error rate and power of these three statistics on the basis of results from simulated data, and compare these results to those obtained using the best maximum likelihood map and the map with the true marker order. We find that the lod score based on a weighted sum of maximum likelihoods improves on using only the best maximum likelihood map, having a type 1 error rate and power closest to that of using the true marker order in the simulation scenarios we considered. Copyright 2001 Wiley-Liss, Inc.
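The weighted-sum statistic described above can be sketched as follows, assuming that for each plausible marker order we already have its posterior probability and a lod-score curve over a grid of disease-locus positions; the curves and weights below are hypothetical.

```python
# Sketch of a weighted multipoint lod statistic: for each plausible marker order k,
# a lod curve over candidate disease-locus positions is assumed given, together with
# the posterior probability of that order; likelihoods (not lod scores) are combined.
# All curves and weights below are hypothetical.
import numpy as np

def weighted_lod(lod_curves, order_posteriors):
    """lod_curves: (n_orders, n_positions); order_posteriors: (n_orders,)."""
    w = np.asarray(order_posteriors, float)
    w = w / w.sum()
    likelihood_ratios = 10.0 ** np.asarray(lod_curves, float)   # back to likelihood-ratio scale
    return np.log10(w @ likelihood_ratios)                      # weighted sum, then back to lod

positions = np.linspace(0.0, 50.0, 51)                           # cM along the fixed marker map
lods = np.vstack([3.0 * np.exp(-((positions - c) / 8.0) ** 2) for c in (20.0, 24.0, 35.0)])
posteriors = np.array([0.6, 0.3, 0.1])                           # posterior probability of each order
curve = weighted_lod(lods, posteriors)
print(positions[np.argmax(curve)], round(curve.max(), 2))
```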
Spillmann, Brigitte; van Noordwijk, Maria A; Willems, Erik P; Mitra Setia, Tatang; Wipfli, Urs; van Schaik, Carel P
2015-07-01
The long call is an important vocal communication signal in the widely dispersed, semi-solitary orangutan. Long calls affect individuals' ranging behavior and mediate social relationships and regulate encounters between dispersed individuals in a dense rainforest. The aim of this study was to test the utility of an Acoustic Location System (ALS) for recording and triangulating the loud calls of free-living primates. We developed and validated a data extraction protocol for an ALS used to record wild orangutan males' long calls at the Tuanan field site (Central Kalimantan). We installed an ALS in a grid of 300 ha, containing 20 SM2+ recorders placed in a regular lattice at 500 m intervals, to monitor the distribution of calling males in the area. The validated system had the following main features: (i) a user-trained software algorithm (Song Scope) that reliably recognized orangutan long calls from sound files at distances up to 700 m from the nearest recorder, resulting in a total area of approximately 900 ha that could be monitored continuously; (ii) acoustic location of calling males up to 200 m outside the microphone grid, which meant that within an area of approximately 450 ha, call locations could be calculated through triangulation. The mean accuracy was 58 m, an error that is modest relative to orangutan mobility and average inter-individual distances. We conclude that an ALS is a highly effective method for detecting long-distance calls of wild primates and triangulating their position. In combination with conventional individual focal follow data, an ALS can greatly improve our knowledge of orangutans' social organization, and is readily adaptable for studying other highly vocal animals. © 2015 Wiley Periodicals, Inc.
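The abstract does not give the triangulation algorithm, so the sketch below shows one common, generic approach to locating a caller from time-synchronized recorders: hyperbolic (time-difference-of-arrival) positioning solved by nonlinear least squares. Recorder spacing follows the 500 m lattice described; the call position and sound speed are assumptions for the example.

```python
# Generic TDOA (time-difference-of-arrival) triangulation sketch; the paper's own
# localization procedure is not reproduced here. Recorders sit on the 500 m lattice
# described; the true call position and sound speed are assumptions for the example.
import numpy as np
from scipy.optimize import least_squares

C = 340.0  # assumed speed of sound in air, m/s

def locate(recorders, tdoa):
    """recorders: (n, 2) positions in metres; tdoa: (n,) arrival times relative to recorder 0."""
    def residuals(p):
        d = np.linalg.norm(recorders - p, axis=1)
        return (d - d[0]) / C - tdoa
    return least_squares(residuals, x0=recorders.mean(axis=0)).x

recorders = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 500.0], [500.0, 500.0]])
true_pos = np.array([180.0, 320.0])                       # hypothetical calling male
tdoa = (np.linalg.norm(recorders - true_pos, axis=1)
        - np.linalg.norm(recorders[0] - true_pos)) / C
print(locate(recorders, tdoa))                            # recovers ~[180, 320]
```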
NASA Astrophysics Data System (ADS)
Hitt, N. T.; Cobb, K. M.; Sayani, H. R.; Grothe, P. R.; Atwood, A. R.; O'Connor, G.; Chen, T.; Hagos, M. M.; Deocampo, D.; Edwards, R. L.; Cheng, H.; Lu, Y.; Thompson, D. M.
2016-12-01
Sea-surface temperature (SST) variability in the central tropical Pacific drives global-scale responses through atmospheric teleconnections, so the response of this region to anthropogenic forcing has important implications for regional climate responses in many areas. However, quantification of anthropogenic SST trends in the central tropical Pacific is complicated by the fact that instrumental SST observations in this region are extremely limited prior to 1950, with trends of opposite sign observed across the various gridded instrumental datasets (Deser et al., 2010). Researchers have turned to multi-century coral records to reconstruct ocean temperatures through time, but the paucity of such records prohibits the generation of uncertainty estimates. In this study, we use a large collection of U/Th-dated fossil corals to investigate a new ensemble approach to reconstructing temperature in the Central Pacific over the late 20th century. Here we combine monthly-resolved d18O and Sr/Ca from eight 5-14 year long coral records from Christmas Island (2°N, 157°W) to quantify temperature and hydrological trends in this region from 1930 to present. We compare our fossil coral ensemble reconstruction to a long modern coral core from this site that extends back to 1940, as well as to gridded SST datasets. We also provide the first well-replicated coral d18O and Sr/Ca records across both the 1997/98 and 2015/2016 El Nino events, comparing the strength of these two events in the context of long-term temperature trends observed in our longer reconstruction. We conclude that the fossil coral ensemble approach provides a robust means of reconstructing 20th century climate trends. Deser et al., 2010, GRL, doi: 10.1029/2010GL043321
Jenny, J-Y; Adamczewski, B; De Thomasson, E; Godet, J; Bonfait, H; Delaunay, C
2016-04-01
The diagnosis of periprosthetic joint infection can be challenging, in part because there is no universal diagnostic test. Current recommendations include several diagnostic criteria, and are mainly based on the results of deep microbiological samples; however, these only provide a diagnosis after surgery. A predictive infection score would improve the management of revision arthroplasty cases. The purpose of this study was to define a composite infection score using standard clinical, radiological and laboratory data that can be used to predict whether an infection is present before a total hip arthroplasty (THA) revision procedure. We hypothesized that the infection score would make it possible to differentiate correctly between infected and non-infected patients in 75% of cases. One hundred and four records from patients who underwent THA revision for any reason were analysed retrospectively: 43 with infection and 61 without infection. There were 54 men and 50 women with an average age of 70±12 years (range 30-90). A univariate analysis was performed to look for individual discriminating factors between the data in the medical records of infected and non-infected patients. A multivariate analysis subsequently integrated these factors together. A composite score was defined and its diagnostic effectiveness was evaluated as the percentage of correctly classified records, along with its sensitivity and specificity. The score consisted of the following individually weighted factors: body mass index, presence of diabetes, mechanical complication, wound healing disturbance and fever. This composite infection score was able to distinguish correctly between the infected patients (positive score) and non-infected patients (negative score) in 78% of cases; the sensitivity was 57% and the specificity 93%. Once this score is evaluated prospectively, it could be an important tool for defining the medical-surgical strategy during THA revision, no matter the reason for revision. Level IV - retrospective study. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Effectiveness of dental checkups incorporating tooth brushing instruction.
Furusawa, Masahiro; Takahashi, Jun-ichi; Isoyama, Motoko; Kitamura, Yoshiko; Kashima, Tomoko; Ueshima, Fumie; Nakahama, Noriko; Araki, Misako; Rokukawa, Yasuko; Takahashi, Yoshikazu; Makiishi, Takemi; Yatabe, Ken-ichi
2011-01-01
The purpose of this study was to compare the effectiveness of dental checkups incorporating tooth-brushing instruction (TBI) with that of conventional dental checkups. A team consisting of one dentist and three dental hygienists saw an average of 60 employees per day on-site at an airline company. The patient's teeth were stained with a disclosing tablet and the results recorded on a Plaque Control Record (PCR) chart. The patient was then given TBI. After recording the relevant data, including TBI given and PCR scores, the charts were stored. Checkups were performed in a total of 3,854 patients between 2001 and 2005 and changes in annual scores were investigated. In addition, annual shifts in mean score in patients receiving checkups over all five years were compared with those in patients receiving checkups for the first time in each of the five years. The mean score in patients receiving a checkup in 2001 was 35.1%, declining by 2.6 points to 32.5% in 2005. Among patients receiving checkups over all five years, the mean score in 2001 was 34.0%, declining by 11.2 points to 22.8% in 2005. Over the five-year period, the mean score in patients receiving checkups was 34.1%. In patients receiving checkups over all five years, the proportion with PCR scores <30% increased each year. This was because the number of patients with PCR scores ≥60% decreased each year. These findings suggest that TBI is effective in reducing poor plaque control. When compared with patients who had not received TBI, five consecutive years of checkups was clearly effective. These results indicate that checkups incorporating TBI are more effective than conventional dental checkups that simply check for caries. In future, this type of checkup should contribute to improved preventative dentistry with minimal intervention.
Kittel, T.G.F.; Rosenbloom, N.A.; Royle, J. Andrew; Daly, Christopher; Gibson, W.P.; Fisher, H.H.; Thornton, P.; Yates, D.N.; Aulenbach, S.; Kaufman, C.; McKeown, R.; Bachelet, D.; Schimel, D.S.; Neilson, R.; Lenihan, J.; Drapek, R.; Ojima, D.S.; Parton, W.J.; Melillo, J.M.; Kicklighter, D.W.; Tian, H.; McGuire, A.D.; Sykes, M.T.; Smith, B.; Cowling, S.; Hickler, T.; Prentice, I.C.; Running, S.; Hibbard, K.A.; Post, W.M.; King, A.W.; Smith, T.; Rizzo, B.; Woodward, F.I.
2004-01-01
Analysis and simulation of biospheric responses to historical forcing require surface climate data that capture those aspects of climate that control ecological processes, including key spatial gradients and modes of temporal variability. We developed a multivariate, gridded historical climate dataset for the conterminous USA as a common input database for the Vegetation/Ecosystem Modeling and Analysis Project (VEMAP), a biogeochemical and dynamic vegetation model intercomparison. The dataset covers the period 1895-1993 on a 0.5° latitude/longitude grid. Climate is represented at both monthly and daily timesteps. Variables are: precipitation, minimum and maximum temperature, total incident solar radiation, daylight-period irradiance, vapor pressure, and daylight-period relative humidity. The dataset was derived from US Historical Climate Network (HCN), cooperative network, and snowpack telemetry (SNOTEL) monthly precipitation and mean minimum and maximum temperature station data. We employed techniques that rely on geostatistical and physical relationships to create the temporally and spatially complete dataset. We developed a local kriging prediction model to infill discontinuous and limited-length station records based on spatial autocorrelation structure of climate anomalies. A spatial interpolation model (PRISM) that accounts for physiographic controls was used to grid the infilled monthly station data. We implemented a stochastic weather generator (modified WGEN) to disaggregate the gridded monthly series to dailies. Radiation and humidity variables were estimated from the dailies using a physically-based empirical surface climate model (MTCLIM3). Derived datasets include a 100 yr model spin-up climate and a historical Palmer Drought Severity Index (PDSI) dataset. The VEMAP dataset exhibits statistically significant trends in temperature, precipitation, solar radiation, vapor pressure, and PDSI for US National Assessment regions. The historical climate and companion datasets are available online at data archive centers. © Inter-Research 2004.
Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data
NASA Technical Reports Server (NTRS)
Rompala, John T.
2005-01-01
A ground-based lightning detection system employs a grid of sensors, which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength and polarity. Determination of the location of the lightning strike uses algorithms based on long-used techniques of triangulation. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior together with the strength of the signal received by detecting sites permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids and with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers. Much of that area is covered by rain forests. Thus knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.
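As a rough illustration of the range-normalization step described above, the following Python sketch extrapolates an event's original signal strength from the amplitude received at each detector, assuming simple geometric spreading plus exponential environmental attenuation; the reference range, spreading exponent and attenuation length are placeholder assumptions, not values calibrated for any actual network.

```python
import numpy as np

def source_strength(received_amp, range_km, spreading_exp=1.0, atten_length_km=1000.0):
    """Extrapolate the signal strength back to a reference range of 100 km.

    Assumes geometric spreading ~ 1/r**spreading_exp and exponential
    environmental attenuation with e-folding length atten_length_km.
    Both constants are illustrative placeholders, not network calibration values.
    """
    ref_km = 100.0
    geometric = (range_km / ref_km) ** spreading_exp
    attenuation = np.exp((range_km - ref_km) / atten_length_km)
    return received_amp * geometric * attenuation

# Amplitudes reported by three detectors at different ranges from one event
amps = np.array([15.4, 7.5, 2.7])         # arbitrary sensor units
ranges = np.array([180.0, 320.0, 650.0])  # km
estimates = source_strength(amps, ranges)
print("Range-normalized source strength estimates:", np.round(estimates, 1))
print("Mean estimate:", round(float(estimates.mean()), 1))
```

Detectors that agree closely after this normalization support a single consistent event solution; systematic disagreement is one way the limited detection efficiency of a sparse grid shows up.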
Finite-Difference Algorithm for Simulating 3D Electromagnetic Wavefields in Conductive Media
NASA Astrophysics Data System (ADS)
Aldridge, D. F.; Bartel, L. C.; Knox, H. A.
2013-12-01
Electromagnetic (EM) wavefields are routinely used in geophysical exploration for detection and characterization of subsurface geological formations of economic interest. Recorded EM signals depend strongly on the current conductivity of geologic media. Hence, they are particularly useful for inferring fluid content of saturated porous bodies. In order to enhance understanding of field-recorded data, we are developing a numerical algorithm for simulating three-dimensional (3D) EM wave propagation and diffusion in heterogeneous conductive materials. Maxwell's equations are combined with isotropic constitutive relations to obtain a set of six, coupled, first-order partial differential equations governing the electric and magnetic vectors. An advantage of this system is that it does not contain spatial derivatives of the three medium parameters electric permittivity, magnetic permeability, and current conductivity. Numerical solution methodology consists of explicit, time-domain finite-differencing on a 3D staggered rectangular grid. Temporal and spatial FD operators have order 2 and N, where N is user-selectable. We use an artificially-large electric permittivity to maximize the FD timestep, and thus reduce execution time. For the low frequencies typically used in geophysical exploration, accuracy is not unduly compromised. Grid boundary reflections are mitigated via convolutional perfectly matched layers (C-PMLs) imposed at the six grid flanks. A shared-memory-parallel code implementation via OpenMP directives enables rapid algorithm execution on a multi-thread computational platform. Good agreement is obtained in comparisons of numerically-generated data with reference solutions. EM wavefields are sourced via point current density and magnetic dipole vectors. Spatially-extended inductive sources (current carrying wire loops) are under development. We are particularly interested in accurate representation of high-conductivity sub-grid-scale features that are common in industrial environments (borehole casing, pipes, railroad tracks). Present efforts are oriented toward calculating the EM responses of these objects via a First Born Approximation approach. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the US Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
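The following Python sketch illustrates the leapfrog, staggered-grid time-domain finite-difference idea in one dimension for a conductive medium, including a semi-implicit treatment of the conduction term and an artificially large permittivity to relax the timestep; it is only a toy version of the approach outlined above (second order, reflecting boundaries, no C-PML), and all material values are arbitrary.

```python
import numpy as np

# 1D staggered-grid FDTD sketch: Ez and Hy leapfrog in time, with a
# conductive loss term sigma*Ez.  Illustrative parameters only.
nx, nt = 400, 800
dx = 5.0                      # m
eps0, mu0 = 8.854e-12, 4e-7 * np.pi
eps = np.full(nx, 10 * eps0)  # artificially large permittivity (larger stable timestep)
sigma = np.full(nx, 1e-3)     # S/m, current conductivity
dt = 0.5 * dx * np.sqrt(eps.min() * mu0)   # stable timestep (Courant factor 0.5)

Ez = np.zeros(nx)
Hy = np.zeros(nx - 1)         # magnetic field lives on the half-grid
for n in range(nt):
    # update magnetic field from the spatial derivative of Ez
    Hy += (dt / (mu0 * dx)) * (Ez[1:] - Ez[:-1])
    # semi-implicit electric-field update including conduction loss
    ca = (1 - sigma[1:-1] * dt / (2 * eps[1:-1])) / (1 + sigma[1:-1] * dt / (2 * eps[1:-1]))
    cb = (dt / (eps[1:-1] * dx)) / (1 + sigma[1:-1] * dt / (2 * eps[1:-1]))
    Ez[1:-1] = ca * Ez[1:-1] + cb * (Hy[1:] - Hy[:-1])
    # soft source: Gaussian current pulse injected at the grid centre
    Ez[nx // 2] += np.exp(-((n - 60) / 20.0) ** 2)

print("Peak |Ez| after propagation:", float(np.abs(Ez).max()))
```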
NASA Astrophysics Data System (ADS)
Cook, E. R.
2007-05-01
The North American Drought Atlas describes a detailed reconstruction of drought variability from tree rings over most of North America for the past 500-1000 years. The first version of it, produced over three years ago, was based on a network of 835 tree-ring chronologies and a 286-point grid of instrumental Palmer Drought Severity Indices (PDSI). These gridded PDSI reconstructions have now been used in numerous published studies that range from modeling fire in the American West, to the impact of drought on palaeo-Indian societies, to the determination of the primary causes of drought over North America through climate modeling experiments. Some examples of these applications will be described to illustrate the scientific value of these large-scale reconstructions of drought. Since the development and free public release of Version 1 of the North American Drought Atlas (see http://iridl.ldeo.columbia.edu/SOURCES/.LDEO/.TRL/.NADA2004/.pdsi-atlas.html), great improvements have been made in the critical tree-ring network used to reconstruct PDSI at each grid point. This network has now been enlarged to 1743 annual tree-ring chronologies, which greatly improves the density of tree-ring records in certain parts of the grid, especially in Canada and Mexico. In addition, the number of tree-ring records that extend back before AD 1400 has been substantially increased. These developments justify the creation of Version 2 of the North American Drought Atlas. In this talk I will describe this new version of the drought atlas and some of its properties that make it a significant improvement over the previous version. The new product provides enhanced resolution of the spatial and temporal variability of prolonged drought such as the late 16th century event that impacted regions of both Mexico and the United States. I will also argue for the North American Drought Atlas being used as a template for the development of large-scale drought reconstructions in other land areas of the Northern Hemisphere where sufficient tree-ring data exist. By doing so, the importance of this product to the modeling community will be significantly enhanced.
Ma, Irene W Y; Zalunardo, Nadia; Brindle, Mary E; Hatala, Rose; McLaughlin, Kevin
2015-09-01
Blinded assessments of technical skills using video-recordings may offer more objective assessments than direct observations. This study seeks to compare these two modalities. Two trained assessors independently assessed 18 central venous catheterization performances by direct observation and video-recorded assessments using two tools. Although sound quality was deemed adequate in all videos, portions of the video for wire handling and drape handling were frequently out of view (n = 13, 72% for wire-handling; n = 17, 94% for drape-handling). There were no differences in summary global rating scores, checklist scores, or pass/fail decisions for either modality (p > 0.05). Inter-rater reliability was acceptable for both modalities. Of the 26 discrepancies identified between direct observation and video-recorded assessments, three discrepancies (12%) were due to inattention during video review, while one (4%) discrepancy was due to inattention during direct observation. In conclusion, although scores did not differ between the two assessment modalities, techniques of video-recording may significantly impact individual items of assessments. © The Author(s) 2014.
Rethans, J J; Martin, E; Metsemakers, J
1994-01-01
BACKGROUND. Review of clinical notes is used extensively as an indirect method of assessing doctors' performance. However, to be acceptable it must be valid. AIM. This study set out to examine the extent to which clinical notes in medical records of general practice consultations reflected doctors' actual performance during consultations. METHOD. Thirty nine general practitioners in the Netherlands were consulted by four simulated patients who were indistinguishable from real patients and who reported on the consultations. The complaints presented by the simulated patients were tension headache, acute diarrhoea and pain in the shoulder, and one presented for a check up for non-insulin dependent diabetes. Later, the doctors forwarded their medical records of these patients to the researchers. Content of consultations was measured against accepted standards for general practice and then compared with content of clinical notes. An index, or content score, was calculated as the measure of agreement between actions which had actually been recorded and actions which could have been recorded in the clinical notes. A high content score reflected a consultation which had been recorded well in the medical record. The correlation between number of actions across the four complaints recorded in the clinical notes and number of actions taken during the consultations was also calculated. RESULTS. The mean content score (interquartile range) for the four types of complaint was 0.32 (0.27-0.37), indicating that of all actions undertaken, only 32% had been recorded. However, mean content scores for the categories 'medication and therapy' and 'laboratory examination' were much higher than for the categories 'history' and 'guidance and advice' (0.68 and 0.64, respectively versus 0.29 and 0.22, respectively). The correlation between number of actions across the four complaints recorded in the clinical notes and number of actions taken during the consultations was 0.54 (P < 0.05). CONCLUSION. The use of clinical notes to audit doctors' performance in Dutch general practice is invalid. However, the use of clinical notes to rank doctors according to those who perform many or a few actions in a consultation may be justified. PMID:8185988
Mauntel, Timothy C; Padua, Darin A; Stanley, Laura E; Frank, Barnett S; DiStefano, Lindsay J; Peck, Karen Y; Cameron, Kenneth L; Marshall, Stephen W
2017-11-01
The Landing Error Scoring System (LESS) can be used to identify individuals with an elevated risk of lower extremity injury. The limitation of the LESS is that raters identify movement errors from video replay, which is time-consuming and, therefore, may limit its use by clinicians. A markerless motion-capture system may be capable of automating LESS scoring, thereby removing this obstacle. To determine the reliability of an automated markerless motion-capture system for scoring the LESS. Cross-sectional study. United States Military Academy. A total of 57 healthy, physically active individuals (47 men, 10 women; age = 18.6 ± 0.6 years, height = 174.5 ± 6.7 cm, mass = 75.9 ± 9.2 kg). Participants completed 3 jump-landing trials that were recorded by standard video cameras and a depth camera. Their movement quality was evaluated by expert LESS raters (standard video recording) using the LESS rubric and by software that automates LESS scoring (depth-camera data). We recorded an error for a LESS item if it was present on at least 2 of 3 jump-landing trials. We calculated κ statistics, prevalence- and bias-adjusted κ (PABAK) statistics, and percentage agreement for each LESS item. Interrater reliability was evaluated between the 2 expert rater scores and between a consensus expert score and the markerless motion-capture system score. We observed reliability between the 2 expert LESS raters (average κ = 0.45 ± 0.35, average PABAK = 0.67 ± 0.34; percentage agreement = 0.83 ± 0.17). The markerless motion-capture system had similar reliability with consensus expert scores (average κ = 0.48 ± 0.40, average PABAK = 0.71 ± 0.27; percentage agreement = 0.85 ± 0.14). However, reliability was poor for 5 LESS items in both LESS score comparisons. A markerless motion-capture system had the same level of reliability as expert LESS raters, suggesting that an automated system can accurately assess movement. Therefore, clinicians can use the markerless motion-capture system to reliably score the LESS without being limited by the time requirements of manual LESS scoring.
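For readers unfamiliar with the agreement statistics reported above, the short sketch below computes percentage agreement, Cohen's kappa and the prevalence- and bias-adjusted kappa (PABAK = 2 x observed agreement - 1) for two raters' binary error calls on a single LESS item; the ratings are synthetic and the use of scikit-learn is an assumption about tooling.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(5)

# Binary "error present" calls by two raters for one LESS item across 57 participants
rater_a = rng.random(57) < 0.3
rater_b = rater_a.copy()
flip = rng.random(57) < 0.15          # the second rater disagrees on ~15% of participants
rater_b[flip] = ~rater_b[flip]

agreement = float(np.mean(rater_a == rater_b))
kappa = cohen_kappa_score(rater_a, rater_b)
pabak = 2 * agreement - 1             # prevalence- and bias-adjusted kappa for binary ratings
print(f"% agreement = {agreement:.2f}, kappa = {kappa:.2f}, PABAK = {pabak:.2f}")
```

PABAK is higher than kappa when one rating category is rare, which is the same pattern seen in the averages quoted above.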
Quasi-Supervised Scoring of Human Sleep in Polysomnograms Using Augmented Input Variables
Yaghouby, Farid; Sunderam, Sridhar
2015-01-01
The limitations of manual sleep scoring make computerized methods highly desirable. Scoring errors can arise from human rater uncertainty or inter-rater variability. Sleep scoring algorithms either come as supervised classifiers that need scored samples of each state to be trained, or as unsupervised classifiers that use heuristics or structural clues in unscored data to define states. We propose a quasi-supervised classifier that models observations in an unsupervised manner but mimics a human rater wherever training scores are available. EEG, EMG, and EOG features were extracted in 30s epochs from human-scored polysomnograms recorded from 42 healthy human subjects (18 to 79 years) and archived in an anonymized, publicly accessible database. Hypnograms were modified so that: 1. Some states are scored but not others; 2. Samples of all states are scored but not for transitional epochs; and 3. Two raters with 67% agreement are simulated. A framework for quasi-supervised classification was devised in which unsupervised statistical models—specifically Gaussian mixtures and hidden Markov models—are estimated from unlabeled training data, but the training samples are augmented with variables whose values depend on available scores. Classifiers were fitted to signal features incorporating partial scores, and used to predict scores for complete recordings. Performance was assessed using Cohen's K statistic. The quasi-supervised classifier performed significantly better than an unsupervised model and sometimes as well as a completely supervised model despite receiving only partial scores. The quasi-supervised algorithm addresses the need for classifiers that mimic scoring patterns of human raters while compensating for their limitations. PMID:25679475
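A minimal sketch of the augmented-input idea, under simple assumptions: each epoch is a feature vector, an extra column carries the available human score (or a neutral value where no score exists), and an unsupervised Gaussian mixture is fitted to the augmented matrix. The features are synthetic, the hidden Markov stage is omitted, and scikit-learn is an assumed tool rather than the authors' implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic stand-in for per-epoch EEG/EMG/EOG features (e.g. band powers)
n_epochs, n_feat = 600, 4
features = np.vstack([
    rng.normal(0.0, 1.0, (200, n_feat)),   # "wake"-like epochs
    rng.normal(2.0, 1.0, (200, n_feat)),   # "sleep"-like epochs
    rng.normal(4.0, 1.0, (200, n_feat)),   # "SWS"-like epochs
])

# Partial human scores: only 30% of epochs carry a label (0, 1, or 2)
true_state = np.repeat([0, 1, 2], 200)
scored = rng.random(n_epochs) < 0.30

# Augmented input variable: the available score where present, a neutral
# value (here the mean of the scored labels) where absent
augment = np.where(scored, true_state, true_state[scored].mean())
X = np.column_stack([features, augment])

# Unsupervised mixture fitted to the augmented matrix; the score column
# pulls scored epochs of the same state toward the same component
gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
predicted = gmm.predict(X)
print("Cluster sizes:", np.bincount(predicted))
```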
Fine resolution 3D temperature fields off Kerguelen from instrumented penguins
NASA Astrophysics Data System (ADS)
Charrassin, Jean-Benoît; Park, Young-Hyang; Le Maho, Yvon; Bost, Charles-André
2004-12-01
The use of diving animals as autonomous vectors of oceanographic instruments is rapidly increasing, because this approach yields cost-efficient new information and can be used in previously poorly sampled areas. However, methods for analyzing the collected data are still under development. In particular, difficulties may arise from the heterogeneous data distribution linked to animals' behavior. Here we show how raw temperature data collected by penguin-borne loggers were transformed to a regular gridded dataset that provided new information on the local circulation off Kerguelen. A total of 16 king penguins (Aptenodytes patagonicus) were equipped with satellite-positioning transmitters and with temperature-time-depth recorders (TTDRs) to record dive depth and sea temperature. The penguins' foraging trips recorded during five summers ranged from 140 to 600 km from the colony and 11,000 dives >100 m were recorded. Temperature measurements recorded during diving were used to produce detailed 3D temperature fields of the area (0-200 m). The data treatment included dive location, determination of the vertical profile for each dive, averaging and gridding of those profiles onto 0.1°×0.1° cells, and optimal interpolation in both the horizontal and vertical using an objective analysis. Horizontal fields of temperature at the surface and 100 m are presented, as well as a vertical section along the main foraging direction of the penguins. Compared to conventional temperature databases (Levitus World Ocean Atlas and historical stations available in the area), the 3D temperature fields collected from penguins are much more finely resolved, by one order of magnitude. Although TTDRs were less accurate than conventional instruments, such a high spatial resolution of penguin-derived data provided unprecedented detailed information on the upper level circulation pattern east of Kerguelen, as well as the iron-enrichment mechanism leading to a high primary production over the Kerguelen Plateau.
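The gridding step (averaging profile values onto 0.1°×0.1° cells, before any objective analysis) can be illustrated with a few lines of Python; the coordinates and temperatures below are synthetic stand-ins for the dive-derived data.

```python
import numpy as np
from scipy.stats import binned_statistic_2d

rng = np.random.default_rng(6)

# Synthetic dive-derived temperatures at one depth level:
# longitude, latitude (degrees) and temperature (deg C) for each dive
n = 5000
lon = 70.0 + 5.0 * rng.random(n)
lat = -52.0 + 4.0 * rng.random(n)
temp = 4.0 + 0.5 * (lon - 70.0) + rng.normal(0, 0.3, n)

# Average the measurements onto 0.1 x 0.1 degree cells
lon_edges = np.arange(70.0, 75.01, 0.1)
lat_edges = np.arange(-52.0, -47.99, 0.1)
grid_mean, _, _, _ = binned_statistic_2d(lon, lat, temp, statistic="mean",
                                         bins=[lon_edges, lat_edges])
print("Gridded cells with data:", int(np.count_nonzero(~np.isnan(grid_mean))))
```

Cells left empty by the uneven sampling along foraging tracks are what the subsequent optimal interpolation has to fill.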
ERIC Educational Resources Information Center
Maidana, Nora L.; da Fonseca, Monaliza; Barros, Suelen F.; Vanin, Vito R.
2016-01-01
The Virtual Laboratory was created as a complementary educational activity, with the aim of working abstract concepts from an experimental point of view. In this work, the motion of a ring rolling and slipping in front of a grid printed panel was recorded. The frames separated from this video received a time code, and the resulting set of images…
NASA Astrophysics Data System (ADS)
Hardman, M.; Brodzik, M. J.; Long, D. G.
2017-12-01
Beginning in 1978, the satellite passive microwave data record has been a mainstay of remote sensing of the cryosphere, providing twice-daily, near-global spatial coverage for monitoring changes in hydrologic and cryospheric parameters that include precipitation, soil moisture, surface water, vegetation, snow water equivalent, sea ice concentration and sea ice motion. Historical versions of the gridded passive microwave data sets were produced as flat binary files described in human-readable documentation. This format is error-prone and makes it difficult to reliably include all processing and provenance. Funded by NASA MEaSUREs, we have completely reprocessed the gridded data record that includes SMMR, SSM/I-SSMIS and AMSR-E. The new Calibrated Enhanced-Resolution Brightness Temperature (CETB) Earth System Data Record (ESDR) files are self-describing. Our approach to the new data set was to create netCDF4 files that use standard metadata conventions and best practices to incorporate file-level, machine- and human-readable contents, geolocation, processing and provenance metadata. We followed the flexible and adaptable Climate and Forecast (CF-1.6) Conventions with respect to their coordinate conventions and map projection parameters. Additionally, we made use of Attribute Conventions for Dataset Discovery (ACDD-1.3) that provided file-level conventions with spatio-temporal bounds that enable indexing software to search for coverage. Our CETB files also include temporal coverage and spatial resolution in the file-level metadata for human-readability. We made use of the JPL CF/ACDD Compliance Checker to guide this work. We tested our file format with real software, for example, netCDF Command-line Operators (NCO) power tools for unlimited control on spatio-temporal subsetting and concatenation of files. The GDAL tools understand the CF metadata and produce fully-compliant geotiff files from our data. ArcMap can then reproject the geotiff files on-the-fly and work with other geolocated data such as coastlines, with no special work required. We expect this combination of standards and well-tested interoperability to significantly improve the usability of this important ESDR for the Earth Science community.
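As a hedged sketch of the kind of self-describing file described above, the Python snippet below writes a small netCDF4 file with CF-1.6 and ACDD-1.3 style global attributes and a gridded brightness-temperature variable; the variable names, grid and attribute values are placeholders and do not reproduce the actual CETB file layout.

```python
import numpy as np
from netCDF4 import Dataset

with Dataset("cetb_sketch.nc", "w", format="NETCDF4") as nc:
    # File-level metadata following CF-1.6 / ACDD-1.3 conventions (placeholder values)
    nc.Conventions = "CF-1.6, ACDD-1.3"
    nc.title = "Example gridded brightness temperature"
    nc.summary = "Illustrative self-describing grid, not an actual CETB product"
    nc.geospatial_lat_min, nc.geospatial_lat_max = 60.0, 90.0
    nc.geospatial_lon_min, nc.geospatial_lon_max = -180.0, 180.0
    nc.time_coverage_start = "2003-01-01T00:00:00Z"
    nc.time_coverage_end = "2003-01-01T23:59:59Z"

    ny, nx = 180, 180
    nc.createDimension("y", ny)
    nc.createDimension("x", nx)

    # CF grid-mapping variable describing the projection
    crs = nc.createVariable("polar_stereographic", "i4")
    crs.grid_mapping_name = "polar_stereographic"

    tb = nc.createVariable("brightness_temperature", "f4", ("y", "x"), zlib=True)
    tb.units = "K"
    tb.long_name = "brightness temperature"
    tb.grid_mapping = "polar_stereographic"
    tb[:] = 200.0 + 50.0 * np.random.random((ny, nx))
```

Because the coverage and projection information live inside the file, generic tools (NCO, GDAL, ArcMap) can subset and reproject it without consulting external documentation, which is the interoperability gain described above.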
NASA Technical Reports Server (NTRS)
Hall, Dorothy K.; Box, Jason E.; Koenig, Lora S.; DiGirolamo, Nicolo E.; Comiso, Josefino C.; Shuman, Christopher A.
2011-01-01
Surface temperatures on the Greenland Ice Sheet have been studied on the ground, using automatic weather station (AWS) data from the Greenland-Climate Network (GC-Net), and from analysis of satellite sensor data. Using Advanced Very High Frequency Radiometer (AVHRR) weekly surface temperature maps, warming of the surface of the Greenland Ice Sheet has been documented since 1981. We extended and refined this record using higher-resolution Moderate-Resolution Imaging Spectroradiometer (MODIS) data from March 2000 to the present. We developed a daily and monthly climate-data record (CDR) of the "clear-sky" surface temperature of the Greenland Ice Sheet using an ice-surface temperature (1ST) algorithm developed for use with MODIS data. Validation of this CDR is ongoing. MODIS Terra swath data are projected onto a polar stereographic grid at 6.25-km resolution to develop binary, gridded daily and mean-monthly 1ST maps. Each monthly map also has a color-coded image map that is available to download. Also included with the monthly maps is an accompanying map showing number of days in the month that were used to calculate the mean-monthly 1ST. This is important because no 1ST decision is made by the algorithm for cells that are considered cloudy by the internal cloud mask, so a sufficient number of days must be available to produce a mean 1ST for each grid cell. Validation of the CDR consists of several facets: 1) comparisons between ISTs and in-situ measurements; 2) comparisons between ISTs and AWS data; and 3) comparisons of ISTs with surface temperatures derived from other satellite instruments such as the Thermal Emission and Reflection Radiometer (ASTER) and Enhanced Thematic Mapper Plus (ETM+). Previous work shows that Terra MODIS ISTs are about 3 C lower than in-situ temperatures measured at Summit Camp, during the winter of 2008-09 under clear skies. In this work we begin to compare surface temperatures derived from AWS data with ISTs from the MODIS CDR.
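The mean-monthly compositing with an accompanying clear-day count, as described above, can be sketched in a few lines of NumPy; the grids below are synthetic, and the minimum-day threshold is an illustrative assumption rather than the value used for the CDR.

```python
import numpy as np

# Daily clear-sky IST grids for one month: cloudy cells are NaN.
days, ny, nx = 30, 100, 100
rng = np.random.default_rng(1)
daily_ist = rng.normal(240.0, 10.0, (days, ny, nx))
cloudy = rng.random((days, ny, nx)) < 0.4          # ~40% of cells cloudy each day
daily_ist[cloudy] = np.nan

clear_days = np.sum(~np.isnan(daily_ist), axis=0)  # days contributing to each cell
monthly_mean = np.nanmean(daily_ist, axis=0)       # mean over clear days only

# Require a minimum number of clear days before reporting a monthly value
min_days = 5
monthly_mean[clear_days < min_days] = np.nan
print("Cells with a valid monthly IST:", int(np.count_nonzero(~np.isnan(monthly_mean))))
```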
Ballestros Peña, Sendoa; Lorrio Palomino, Sergio; Ariz Zubiaur, Mónica
2012-11-01
BACKGROUND: The Prehospital Care and Transfer Record (PCTR) is an out-of-hospital medical record. This study was conducted to assess and compare the level of completion of the basic parameters of the PCTRs produced by the Life Support Units with nurses (LSUwN) and without nurses (Basic Life Support Units, BLSU) of SAMUR Bilbao in 2010. A descriptive, retrospective and comparative study was performed by analysing a randomized sample of 660 PCTRs (precision 3%), checking the completion of the basic data. 98.33% of all records were legible. Overall, the completion rate for all basic parameters was 90.31% (CI 89.24-91.37%) for LSUwN PCTRs and 84.81% (CI 83.56-86%) for BLSU. 34.1% of PCTRs were completely and correctly filled in. The LSUwN scored significantly better (p < 0.000). Recording failures occurred in "date and time", "address" and "physical examination". There were differences between the recording of clinical and administrative information (88.64% vs 86.72%, p = 0.02). For a parameter to be considered optimal, it has to reach 100% completion. If it does not and its score is no higher than 80%, it should be reviewed. On this basis, the results would be considered acceptable, but the administrative items of BLSU records, and allergies in both types of unit, should be reinforced. The LSUwN obtained better scores. The need to record clinical information must be instilled as evidence of quality care.
Privacy Impact Assessment for the Lead-based Paint System of Records
The Lead-based Paint System of Records collects personally identifiable information, test scores, and submitted fees. Learn how this data is collected, how it will be used, access to the data, the purpose of data collection, and record retention policies.
NASA Astrophysics Data System (ADS)
Zhou, Lijun; Liu, Jisheng
2017-03-01
Tourism safety is gradually gaining more attention due to the rapid development of the tourism industry in China. Changbai Mountain is one of the most famous mountainous scenic areas in Northeast Asia. Assessing tourism safety risk in the Changbai Mountain scenic area can help identify the factors that influence tourism safety and classify risk levels, thereby reducing and preventing associated tourism safety risks. This paper uses the Changbai Mountain scenic area as the study subject. By means of expert scoring and the analytic hierarchy process applied to quantified evaluation indicators, a grid GIS method is used to vectorize the relevant data on a 1000 m grid. The main indicators of tourism safety risk in the Changbai Mountain scenic area are analyzed in turn, including hazard, exposure, vulnerability and the capacity to prevent and mitigate disasters. An integrated tourism safety risk model is then used to comprehensively evaluate tourism safety risk in the Changbai Mountain scenic area.
NASA Astrophysics Data System (ADS)
Hanshaw, M. N.; Schmidt, K. M.; Jorgensen, D. P.; Stock, J. D.
2007-12-01
Constraining the distribution of rainfall is essential to evaluating the post-fire mass-wasting response of steep soil-mantled landscapes. As part of a pilot early-warning project for flash floods and debris flows, NOAA deployed a portable truck-mounted Shared Mobile Atmospheric Research and Teaching Radar (SMART-R) to the 2006 Day fire in the Transverse Ranges of Southern California. In conjunction with a dense array of ground-based instruments, including 8 tipping-bucket rain gages located within an area of 170 km2, this C-band mobile Doppler radar provided 200-m grid cell estimates of precipitation data at fine temporal and spatial scales in burned steeplands at risk from hazardous flash floods and debris flows. To assess the utility of using this data in process models for flood and debris flow initiation, we converted grids of radar reflectivity to hourly time-steps of precipitation using an empirical relationship for convective storms, sampling the radar data at the locations of each rain gage as determined by GPS. The SMART-R was located 14 km from the farthest rain gage, but <10 km away from our intensive research area, where 5 gages are located within <1-2 km of each other. Analyses of the nine storms imaged by radar throughout the 2006/2007 winter produced similar cumulative rainfall totals between the gages and their SMART-R grid location over the entire season which correlate well on the high side, with gages recording the most precipitation agreeing to within 11% of the SMART-R. In contrast, on the low rainfall side, totals between the two recording systems are more variable, with a 62% variance between the minimums. In addition, at the scale of individual storms, a correlation between ground-based rainfall measurements and radar-based rainfall estimates is less evident, with storm totals between the gages and the SMART-R varying between 7 and 88%, a possible result of these being relatively small, fast-moving storms in an unusually dry winter. The SMART-R also recorded higher seasonal cumulative rainfall than the terrestrial gages, perhaps indicating that not all precipitation reached the ground. For one storm in particular, time-lapse photographs of the ground document snow. This could explain, in part, the discrepancy between storm-specific totals when the rain gages recorded significantly lower totals than the SMART-R. For example, during the storm where snow was observed, the SMART-R recorded a maximum of 66% higher rainfall than the maximum recorded by the gages. Unexpectedly, the highest elevation gage, located in a pre-fire coniferous vegetation community, consistently recorded the lowest precipitation, whereas gages in the lower elevation pre-fire chaparral community recorded the highest totals. The spatial locations of the maximum rainfall inferred by the SMART-R and the terrestrial gages are also offset by 1.6 km, with terrestrial values shifted easterly. The observation that the SMART-R images high rainfall intensities recorded by rain gages suggests that this technology has the ability to quantitatively estimate the spatial distribution over larger areas at a high resolution. Discrepancies on the storm scale, however, need to be investigated further, but we are optimistic that such high resolution data from the SMART-R and the terrestrial gages may lead to the effective application of a prototype debris-flow warning system where such processes put lives at risk.
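A hedged sketch of the reflectivity-to-rain-rate conversion step: a power-law Z-R relation is applied to hourly reflectivity values for one grid cell. The coefficients shown (Z = 300 R^1.4, a commonly quoted convective pair) are only a stand-in for whatever empirical relationship was actually used in the study.

```python
import numpy as np

def rain_rate_from_dbz(dbz, a=300.0, b=1.4):
    """Convert reflectivity (dBZ) to rain rate (mm/h) via Z = a * R**b.

    a=300, b=1.4 is a commonly used convective Z-R pair; the coefficients
    applied in the Day fire deployment are not specified here.
    """
    z = 10.0 ** (dbz / 10.0)        # reflectivity factor, mm^6 m^-3
    return (z / a) ** (1.0 / b)

# Hourly reflectivity samples at one 200-m grid cell (illustrative values)
dbz_hourly = np.array([18.0, 32.0, 41.0, 25.0, 0.0])
rates = rain_rate_from_dbz(dbz_hourly)             # mm/h
print("Hourly rain rates (mm/h):", np.round(rates, 2))
print("Storm total (mm):", round(float(rates.sum()), 1))  # 1-h timesteps
```

Sampling the resulting precipitation grids at the GPS positions of the rain gages is then a straightforward nearest-cell lookup, which is how the gage-radar comparisons above were made.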
2014-01-01
Introduction Intensive care unit (ICU) patients are known to experience severely disturbed sleep, with possible detrimental effects on short- and long-term outcomes. Investigation into the exact causes and effects of disturbed sleep has been hampered by cumbersome and time-consuming methods of measuring and staging sleep. We introduce a novel method for ICU depth of sleep analysis, the ICU depth of sleep index (IDOS index), using single channel electroencephalography (EEG) and apply it to outpatient recordings. A proof of concept is shown in non-sedated ICU patients. Methods Polysomnographic (PSG) recordings of five ICU patients and 15 healthy outpatients were analyzed using the IDOS index, based on the ratio between gamma and delta band power. Manual selection of thresholds was used to classify data as either wake, sleep or slow wave sleep (SWS). This classification was compared to visual sleep scoring by Rechtschaffen & Kales criteria in normal outpatient recordings and ICU recordings to illustrate face validity of the IDOS index. Results When reduced to two or three classes, the scoring of sleep by IDOS index and manual scoring show high agreement for normal sleep recordings. The obtained overall agreements, as quantified by the kappa coefficient, were 0.84 for sleep/wake classification and 0.82 for classification into three classes (wake, non-SWS and SWS). Sensitivity and specificity were highest for the wake state (93% and 93%, respectively) and lowest for SWS (82% and 76%, respectively). For ICU recordings, agreement was similar to agreement between visual scorers previously reported in literature. Conclusions Besides the most satisfying visual resemblance with manually scored normal PSG recordings, the established face-validity of the IDOS index as an estimator of depth of sleep was excellent. This technique enables real-time, automated, single channel visualization of depth of sleep, facilitating the monitoring of sleep in the ICU. PMID:24716479
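A minimal sketch of the band-power ratio underlying an IDOS-like index, assuming a sampling rate, band edges and a synthetic single-channel EEG epoch; Welch's method from SciPy is used for the spectral estimate, and the manual thresholding onto wake/sleep/SWS described above is not reproduced.

```python
import numpy as np
from scipy.signal import welch

fs = 128                      # Hz, assumed sampling rate
epoch_len = 30 * fs           # 30-s epochs

def band_power(freqs, psd, lo, hi):
    # rectangle-rule integral of the PSD over [lo, hi)
    mask = (freqs >= lo) & (freqs < hi)
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])

def idos_like_ratio(epoch, fs=fs):
    """Gamma (30-48 Hz) to delta (0.5-4 Hz) power ratio for one epoch."""
    freqs, psd = welch(epoch, fs=fs, nperseg=4 * fs)
    return band_power(freqs, psd, 30, 48) / band_power(freqs, psd, 0.5, 4)

rng = np.random.default_rng(2)
# Synthetic single-channel EEG: white noise plus a strong 2 Hz (delta) component
t = np.arange(epoch_len) / fs
epoch = rng.normal(0, 1, epoch_len) + 5 * np.sin(2 * np.pi * 2 * t)
print("Gamma/delta ratio:", float(idos_like_ratio(epoch)))
```

A low ratio (delta-dominated) corresponds to deeper sleep, a high ratio (gamma-dominated) to wakefulness, which is why simple thresholds on this single number can already separate the classes above.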
Baskan, Semih; Cankaya, Deniz; Unal, Hidayet; Yoldas, Burak; Taspinar, Vildan; Deveci, Alper; Tabak, Yalcin; Baydar, Mustafa
2017-01-01
This study compared the efficacy of continuous interscalene block (CISB) and subacromial infusion of local anesthetic (CSIA) for postoperative analgesia after open shoulder surgery. This randomized, prospective, double-blinded, single-center study included 40 adult patients undergoing open shoulder surgery. All patients received a standardized general anesthetic. The patients were separated into group CISB and group CSIA. A loading dose of 40 mL 0.25% bupivacaine was administered, and patient-controlled analgesia with 0.1% bupivacaine was applied via the catheter in both groups postoperatively, as a 5 mL/h basal infusion over 24 h with a 2 mL bolus dose and a 20-min lockout time. Visual analog scale (VAS) scores, additional analgesia need, local anesthetic consumption, complications, and side effects were recorded during the first 24 h postoperatively. The range of motion (ROM) score was recorded preoperatively and in the first and third weeks postoperatively. A statistically significant difference was determined between the groups in respect of consumption of local anesthetic, VAS scores, additional analgesia consumption, complications, and side effects, with lower values recorded in the CISB group. There were no significant differences in ROM scoring between the two groups preoperatively and in the postoperative third week, but there were significant differences in ROM scoring in the postoperative first week, with higher ROM scoring values in the group CISB patients. The results of this study have shown that continuous interscalene infusion of bupivacaine is an effective and safe method of postoperative analgesia after open shoulder surgery.
Inaccurate, inadequate and inconsistent: A content analysis of burn first aid information online.
Burgess, J D; Cameron, C M; Cuttle, L; Tyack, Z; Kimble, R M
2016-12-01
With the popularity of the Internet as a primary source of health-related information, the aim of this website content analysis was to assess the accuracy and quality of burn first aid information available on the Internet. Using the search term 'burn first aid' in four popular search engines, the first 10 websites from each search engine were recorded. From a total of 40 websites recorded, 14 websites were evaluated after removing duplicates. Websites were assessed on content accuracy by four independent reviewers with checks conducted on inter-rater reliability. Website quality was recorded based on Health on the Net Code of Conduct (HONcode) principles. Country of origin for the 14 websites was the US (7), Australia (6), and New Zealand (1). The mean content accuracy score was 5.6 out of 10. The mean website quality score was 6.6 out of 12. Australasian websites scored lower for quality but higher for accuracy. The US websites scored higher for quality than accuracy. Website usability and accuracy in a crisis situation were also assessed. The median crisis usability score was 3 out of 5, and the median crisis accuracy score was 3.5 out of 5. The inaccurate and inconsistent burn first aid treatments that appear online are reflected in the often-incorrect burn first aid treatments seen in patients attending emergency departments. Global consistency in burn first aid information is needed to avoid confusion by members of the public. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.
The impact of foot arch height on quality of life in 6-12 year olds.
López López, Daniel; Bouza Prego, M de Los Ángeles; Requeijo Constenla, Ana; Saleta Canosa, Jesús Luis; Bautista Casasnovas, Adolfo; Tajes, Francisco Alonso
2014-01-01
To determine whether arch height has an effect on the health-related quality of life of schoolchildren. One hundred and thirteen schoolchildren attended an out-patient centre where self-reported data were recorded, their feet were classified into one of three groups according to their arch index (high, normal or low) and the scores obtained from the Foot Health Status Questionnaire (FHSQ - Spanish version) were compared. The groups with high, low and normal arch recorded lower scores in Section One for the general foot health and footwear domains and higher scores in foot pain and foot function. In Section Two they obtained lower scores in general health and higher scores in physical activity, social capacity and vigour. Comparison of the scores obtained reveals that arch height has a negative impact on quality of life. Given the limited extent of available evidence in respect of the aetiology and treatment of foot diseases and deformities, these findings reveal the need to implement programmes to promote foot health and carry out further research into this commonly occurring disabling condition.
Mortality in Code Blue; can APACHE II and PRISM scores be used as markers for prognostication?
Bakan, Nurten; Karaören, Gülşah; Tomruk, Şenay Göksu; Keskin Kayalar, Sinem
2018-03-01
Code blue (CB) is an emergency call system developed to respond to cardiac and respiratory arrest in hospitals. However, in literature, no scoring system has been reported that can predict mortality in CB procedures. In this study, we aimed to investigate the effectiveness of estimated APACHE II and PRISM scores in the prediction of mortality in patients assessed using CB and to retrospectively analyze CB calls. We retrospectively examined 1195 patients who were evaluated by the CB team at our hospital between 2009 and 2013. The demographic data of the patients, diagnosis and relevant departments, reasons for CB, cardiopulmonary resuscitation duration, mortality calculated from the APACHE II and PRISM scores, and the actual mortality rates were retrospectively recorded from CB notification forms and the hospital database. In all age groups, there was a significant difference between the actual mortality rate and the expected mortality rate as estimated using APACHE II and PRISM scores in CB calls (p<0.05). The actual mortality rate was significantly lower than the expected mortality. APACHE and PRISM scores with the available parameters will not help predict mortality in CB procedures. Therefore, novel scoring systems using different parameters are needed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lim, Kyo-Sun; Hong, Song You; Yoon, Jin-Ho
2014-10-01
The most recent version of the Simplified Arakawa-Schubert (SAS) cumulus scheme in the National Center for Environmental Prediction (NCEP) Global Forecast System (GFS) (GFS SAS) has been implemented into the Weather Research and Forecasting (WRF) model with a modification of the triggering condition and convective mass flux so that they depend on the model's horizontal grid spacing. The East Asian Summer Monsoon of 2006 from June to August is selected to evaluate the performance of the modified GFS SAS scheme. Simulated monsoon rainfall with the modified GFS SAS scheme shows better agreement with observation compared to the original GFS SAS scheme. The original GFS SAS scheme simulates a similar ratio of subgrid-scale precipitation, which is calculated from a cumulus scheme, against total precipitation regardless of the model's horizontal grid spacing. This is counter-intuitive because the portion of resolved clouds in a grid box should increase as the model grid spacing decreases. This counter-intuitive behavior of the original GFS SAS scheme is alleviated by the modified GFS SAS scheme. Further, three different cumulus schemes (Grell and Freitas, Kain and Fritsch, and Betts-Miller-Janjic) are chosen to investigate the role of horizontal resolution on simulated monsoon rainfall. The performance of high-resolution modeling is not always enhanced as the spatial resolution becomes higher. Even though the improvement of the probability density function of rain rate and longwave fluxes by the higher-resolution simulation is robust regardless of the choice of cumulus parameterization scheme, the overall skill score of surface rainfall is not monotonically increasing with spatial resolution.
Use of force sensors to detect and analyse lameness in dairy cows.
Kujala, M; Pastell, M; Soveri, T
2008-03-22
Force sensors were used to detect lameness in dairy cows in two trials. In the first trial, leg weights were recorded during approximately 12,000 milkings with balances built into the floor of the milking robot. Cows that put less weight on one leg or kicked frequently during milking were checked first with a locomotion scoring system and then with a clinical inspection. A locomotion score of more than 2 was considered lame, and these cows' hooves were examined at hoof trimming to determine the cause and to identify any hoof lesions. In the second trial 315 locomotion scores were recorded and compared with force sensor data. The force sensors proved to be a good method for recognising lameness. Computer curves drawn from force sensor data helped to find differences between leg weights, thus indicating lameness and its duration. Sole ulcers and white line disease were identified more quickly by force sensors than by locomotion scoring, but joint problems were more easily detected by locomotion scoring.
Signal amplification of FISH for automated detection using image cytometry.
Truong, K; Boenders, J; Maciorowski, Z; Vielh, P; Dutrillaux, B; Malfoy, B; Bourgeois, C A
1997-05-01
The purpose of this study was to improve the detection of FISH signals, in order that spot counting by a fully automated image cytometer be comparable to that obtained visually under the microscope. Two systems of spot scoring, visual and automated counting, were investigated in parallel on stimulated human lymphocytes with FISH using a biotinylated centromeric probe for chromosome 3. Signal characteristics were first analyzed on images recorded with a charge-coupled device (CCD) camera. The number of spots per nucleus was scored visually on these recorded images versus automatically with a DISCOVERY image analyzer. Several fluorochromes, amplification methods and pretreatments were tested. Our results for both visual and automated scoring show that the tyramide amplification system (TSA) gives the best amplification of signal if pepsin treatment is applied prior to FISH. Accuracy of the automated scoring, however, remained low (58% of nuclei containing two spots) compared to the visual scoring because of the high intranuclear variation between FISH spots.
Improving the modeling of geomagnetically induced currents in Spain
NASA Astrophysics Data System (ADS)
Torta, J. M.; Marcuello, A.; Campanyà, J.; Marsal, S.; Queralt, P.; Ledo, J.
2017-05-01
Vulnerability assessments of the risk posed by geomagnetically induced currents (GICs) to power transmission grids benefit from accurate knowledge of the geomagnetic field variations at each node of the grid, the Earth's geoelectrical structures beneath them, and the topology and relative resistances of the grid elements in the precise instant of a storm. The results of previous analyses on the threat posed by GICs to the Spanish 400 kV grid are improved in this study by resorting to different strategies to progress in the three aspects identified above. First, although at midlatitude regions the source fields are rather uniform, we have investigated the effect of their spatial changes by interpolating the field from the records of several close observatories with different techniques. Second, we have performed a magnetotelluric (MT) sounding in the vicinity of one of the transformers where GICs are measured to determine the geoelectrical structure of the Earth, and we have identified the importance of estimating the MT impedance tensor when predicting GIC, especially where the effect of lateral heterogeneities is important. Finally, a sensitivity analysis to network changes has allowed us to assess the reliability of both the information about the network topology and resistances, and the assumptions made when all the details or the network status are not available. In our case, the most essential issue to improve the coincidence between model predictions and actual observations came from the use of realistic geoelectric information involving local MT measurements.
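The frequency-domain step of turning horizontal magnetic-field variations into a geoelectric field through an impedance tensor can be sketched as below; a uniform, frequency-independent 2x2 impedance in the practical unit (mV/km)/nT is assumed purely for illustration, whereas the study relies on a measured, frequency-dependent tensor, and the subsequent mapping of the electric field onto network GICs is not shown.

```python
import numpy as np

# Synthetic 1-minute horizontal magnetic-field variations (nT), 6 hours long
n = 360
t = np.arange(n)
rng = np.random.default_rng(3)
bx = 50 * np.sin(2 * np.pi * t / 90.0) + rng.normal(0, 5, n)   # northward component
by = 30 * np.sin(2 * np.pi * t / 60.0) + rng.normal(0, 5, n)   # eastward component

# Illustrative, frequency-independent impedance tensor in (mV/km)/nT.
# A real MT sounding yields a frequency-dependent Z(omega) that would be
# applied bin by bin in the loop below.
Z = np.array([[0.1, 1.2],
              [-1.0, -0.1]])

# Frequency-domain relation E(omega) = Z * B(omega), then back to the time domain
Bx, By = np.fft.rfft(bx), np.fft.rfft(by)
Ex = np.fft.irfft(Z[0, 0] * Bx + Z[0, 1] * By, n)
Ey = np.fft.irfft(Z[1, 0] * Bx + Z[1, 1] * By, n)
print("Peak |E| (mV/km):", round(float(np.max(np.hypot(Ex, Ey))), 1))
```

With a measured, frequency-dependent tensor, off-diagonal and diagonal terms can differ strongly between sites, which is why the abstract stresses local MT measurements where lateral heterogeneities matter.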
NASA Technical Reports Server (NTRS)
Gong, Gavin; Entekhabi, Dara; Salvucci, Guido D.
1994-01-01
Simulated climates using numerical atmospheric general circulation models (GCMs) have been shown to be highly sensitive to the fraction of GCM grid area assumed to be wetted during rain events. The model hydrologic cycle and land-surface water and energy balance are influenced by the parameter bar-kappa, which is the dimensionless fractional wetted area for GCM grids. Hourly precipitation records for over 1700 precipitation stations within the contiguous United States are used to obtain observation-based estimates of fractional wetting that exhibit regional and seasonal variations. The spatial parameter bar-kappa is estimated from the temporal raingauge data using conditional probability relations. Monthly bar-kappa values are estimated for rectangular grid areas over the contiguous United States as defined by the Goddard Institute for Space Studies 4 deg x 5 deg GCM. A bias in the estimates is evident due to the unavoidably sparse raingauge network density, which causes some storms to go undetected by the network. This bias is corrected by deriving the probability of a storm escaping detection by the network. A Monte Carlo simulation study is also conducted that consists of synthetically generated storm arrivals over an artificial grid area. It is used to confirm the bar-kappa estimation procedure and to test the nature of the bias and its correction. These monthly fractional wetting estimates, based on the analysis of station precipitation data, provide an observational basis for assigning the influential parameter bar-kappa in GCM land-surface hydrology parameterizations.
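A simplified illustration of the conditional-probability idea behind bar-kappa: for each hour in which at least one station in the grid cell reports rain, take the fraction of stations that are wet, and average over the month. The data are synthetic, and the kriging infill and network-detection bias correction described above are not included.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hourly rain indicators for 25 stations in one grid cell over one month
# (True = station recorded rain in that hour).  Purely synthetic data.
n_hours, n_stations = 24 * 30, 25
storm_hours = rng.random(n_hours) < 0.08             # hours with a storm somewhere in the cell
wet = np.zeros((n_hours, n_stations), dtype=bool)
wet[storm_hours] = rng.random((int(storm_hours.sum()), n_stations)) < 0.35

# Fractional wetted area: mean wetted fraction, conditioned on the grid being "raining"
raining = wet.any(axis=1)
kappa_bar = wet[raining].mean(axis=1).mean()
print(f"Estimated monthly fractional wetting (bar-kappa): {kappa_bar:.2f}")
```

Storms that fall entirely between stations never register in such an estimate, which is the detection bias the Monte Carlo study above was designed to quantify and correct.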
Ensuring excision of intraductal lesions: marker placement at time of ductography.
Woodward, Suzanne; Daly, Caroline P; Patterson, Stephanie K; Joe, Annette I; Helvie, Mark A
2010-11-01
To propose grid coordinate marker placement for patients with suspicious ductogram findings occult on routine workup. To compare the success of marker placement and wire localization (WL) with ductogram-guided WL. A retrospective search of radiology records identified all patients referred for ductography between January 2001 and May 2008. Results for 16 patients referred for ductogram-guided WL and 5 patients with grid coordinate marker placement at the time of ductography and subsequent WL were reviewed. Surgical pathology results and clinical follow-up were reviewed for concordance. Nine of 16 patients (56.3%) underwent successful ductogram-guided WL. Eight of nine patients had papillomas, one of which also had atypical ductal hyperplasia (ADH). One of nine patients had ectatic ducts with inspissated debris. Seven patients who failed ductogram-guided WL eventually underwent open surgical biopsy. Four of seven patients had papillomas, one of which also had lobular carcinoma in situ. Remaining patients had ADH (1/7) and fibrocystic changes with chronic inflammation (3/7). All five (100%) patients with grid coordinate marker placement underwent successful WL and marker excision. Pathology results included three papillomas, papillary intraductal hyperplasia, and fibrocystic change. Grid coordinate marker placement at the time of abnormal ductogram provided an accurate method of localizing ductal abnormalities that are occult on routine workup, thus facilitating future WL. Marker placement obviated the need for repeat ductogram on the day of surgery and ensured surgical removal of the ductogram abnormality. Copyright © 2010 AUR. Published by Elsevier Inc. All rights reserved.
Evaluating Mesoscale Simulations of the Coastal Flow Using Lidar Measurements
NASA Astrophysics Data System (ADS)
Floors, R.; Hahmann, A. N.; Peña, A.
2018-03-01
The atmospheric flow in the coastal zone is investigated using lidar and mast measurements and model simulations. Novel dual-Doppler scanning lidars were used to investigate the flow over a 7 km transect across the coast, and vertically profiling lidars were used to study the vertical wind profile at offshore and onshore positions. The Weather Research and Forecasting (WRF) model is set up in 12 different configurations using 2 planetary boundary layer schemes, 3 horizontal grid spacings and varied sources of land use, and initial and lower boundary conditions. All model simulations describe the observed mean wind profile well at different onshore and offshore locations from the surface up to 500 m. The simulated mean horizontal wind speed gradient across the shoreline is close to that observed, although all simulations show wind speeds that are slightly higher than those observed. Inland at the lowest observed height, the model has the largest deviations compared to the observations. Taylor diagrams show that using ERA-Interim data as boundary conditions improves the model skill scores. Simulations with 0.5 and 1 km horizontal grid spacing show poorer model performance compared to those with a 2 km spacing, partially because smaller resolved wavelengths degrade standard error metrics. Modeled and observed velocity spectra were compared and showed that simulations with the finest horizontal grid spacing resolved more high-frequency atmospheric motion.
Comparison of the GOALS and MISTELS scores for the evaluation of surgeons on training benches.
Wolf, Rémi; Medici, Maud; Fiard, Gaëlle; Long, Jean-Alexandre; Moreau-Gaudry, Alexandre; Cinquin, Philippe; Voros, Sandrine
2018-01-01
Evaluation of surgical technical abilities is a major issue in minimally invasive surgery. Devices such as training benches offer specific scores to evaluate surgeons, but these do not transfer to the operating room (OR). Conversely, several scores measure performance in the OR but have not been evaluated on training benches. Our aim was to demonstrate that the GOALS score, which can effectively grade in the OR the abilities involved in laparoscopy, can be used for evaluation on a laparoscopic testbench (MISTELS). This could lead to training systems that can identify more precisely the skills that have been acquired or must still be worked on. 32 volunteers (surgeons, residents and medical students) performed the 5 tasks of the MISTELS training bench and were simultaneously video-recorded. Their performance was evaluated with the MISTELS score and with the GOALS score based on the review of the recording by two experienced, blinded laparoscopic surgeons. The concurrent validity of the GOALS score was assessed using Pearson and Spearman correlation coefficients with the MISTELS score. The construct validity of the GOALS score was assessed with k-means clustering and accuracy rates. Lastly, the abilities explored by each MISTELS task were identified with multiple linear regression. GOALS and MISTELS scores are strongly correlated (Pearson correlation coefficient = 0.85 and Spearman correlation coefficient = 0.82 for the overall score). The GOALS score shows construct validity for the tasks of the training bench, with a better accuracy rate between skill-level groups after k-means clustering when compared to the original MISTELS score (accuracy rates, respectively, 0.75 and 0.56). The GOALS score is well suited for the evaluation of the performance of surgeons of different levels during the completion of the tasks of the MISTELS training bench.
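The concurrent-validity computation reported above amounts to correlating paired scores; a small sketch with synthetic paired GOALS-like and MISTELS-like scores is shown below using SciPy's Pearson and Spearman routines.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(7)

# Synthetic paired overall scores for 32 participants on the two instruments
mistels = rng.normal(500, 120, 32)
goals = 0.04 * mistels + rng.normal(0, 2.0, 32)   # correlated GOALS-like score

r, p_r = pearsonr(mistels, goals)
rho, p_rho = spearmanr(mistels, goals)
print(f"Pearson r = {r:.2f} (p = {p_r:.3f}); Spearman rho = {rho:.2f} (p = {p_rho:.3f})")
```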
Evaluation of Voice Acoustics as Predictors of Clinical Depression Scores.
Hashim, Nik Wahidah; Wilkes, Mitch; Salomon, Ronald; Meggs, Jared; France, Daniel J
2017-03-01
The aim of the present study was to determine if acoustic measures of voice, characterizing specific spectral and timing properties, predict clinical ratings of depression severity measured in a sample of patients using the Hamilton Depression Rating Scale (HAMD) and Beck Depression Inventory (BDI-II). This is a prospective study. Voice samples and clinical depression scores were collected prospectively from consenting adult patients who were referred to psychiatry from the adult emergency department or primary care clinics. The patients were audio-recorded as they read a standardized passage in a nearly closed-room environment. Mean Absolute Error (MAE) between actual and predicted depression scores was used as the primary outcome measure. The average MAE between predicted and actual HAMD scores was approximately two points for both men and women, and the MAE for the BDI-II scores was approximately one point for men and eight points for women. Timing features were predictive of HAMD scores in female patients, while a combination of timing features and spectral features was predictive of scores in male patients. Timing features were predictive of BDI-II scores in male patients. Voice acoustic features extracted from read speech demonstrated variable effectiveness in predicting clinical depression scores in men and women. Voice features were highly predictive of HAMD scores in men and women, and of BDI-II scores in men. The methodology is feasible for diagnostic applications in diverse clinical settings as it can be implemented during a standard clinical interview in a normal closed room and without strict control on the recording environment. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
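The primary outcome measure, mean absolute error between clinician-rated and model-predicted scores, is straightforward to compute; a small Python sketch with hypothetical HAMD values (not the study's data):

import numpy as np

def mean_absolute_error(actual, predicted):
    """MAE between clinician-rated and model-predicted depression scores."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return np.mean(np.abs(actual - predicted))

# Hypothetical HAMD ratings and model predictions, illustrative only
print(mean_absolute_error([18, 25, 9, 14], [16.5, 23.0, 11.0, 15.5]))  # -> 1.75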
ERIC Educational Resources Information Center
Silvey, Brian A.; Montemayor, Mark; Baumgartner, Christopher M.
2017-01-01
The purpose of this study was to investigate undergraduate instrumental music education majors' score study practices as they related to the effectiveness of their simulated conducting. Participants (N = 30) were video recorded in two sessions in which they completed a 20-min score study session and a simulated conducting performance. In the first…
ERIC Educational Resources Information Center
Ortiz, Arlene; Clinton, Amanda; Schaefer, Barbara A.
2015-01-01
Convergent and discriminant validity evidence was examined for scores on the Spanish Record Form of the Bracken School Readiness Assessment, Third Edition (BSRA-3). Participants included a sample of 68 Hispanic, Spanish-speaking children ages 4 to 5 years enrolled in preschool programs in Puerto Rico. Scores obtained from the BSRA-3 Spanish Record…
Ugur, Kadriye Serife; Karabayirli, Safinaz; Demircioğlu, Rüveyda İrem; Ark, Nebil; Kurtaran, Hanifi; Muslu, Bunyamin; Sert, Hüseyin
2013-11-01
To investigate and compare the effectiveness of preincisional peritonsillar infiltration of ketamine and tramadol for post-operative pain in children following adenotonsillectomy. Prospective randomized double-blind controlled study. Seventy-five children aged 3-10 years undergoing adenotonsillectomy were included in the study. Patients received peritonsillar fossa injections of tramadol (2 mg/kg in 2 ml), ketamine (0.5 mg/kg in 2 ml) or 2 ml physiological saline. During the operation, heart rate, oxygen saturation and mean blood pressure were recorded every 5 min. Operation and anesthesia durations, the time to reach an Aldrete score of 9-10, patient satisfaction and analgesic requirements were recorded. Nausea, vomiting, sedation, dysphagia and bleeding scores were recorded at 0, 10, 30 and 60 min and at 2, 4, 8, 12, 18 and 24 h postoperatively. Pain was evaluated using the modified Children's Hospital of Eastern Ontario Pain Scale (mCHEOPS) at fixed intervals after the procedure (15 min and 1, 4, 12, 16 and 24 h postoperatively). The recordings of heart rate, mean arterial pressure, nausea, vomiting, sedation and bleeding scores were similar in all groups (p>0.05). The mCHEOPS scores at 10 min, 30 min, 1 h and 8 h were significantly lower in both the tramadol and ketamine groups when compared with the control group (p<0.05). Use of additional analgesia at 10 min and 18 h was higher in the control group than in the ketamine and tramadol groups (p<0.05). Dysphagia scores were significantly lower for both the ketamine and tramadol groups when compared with the control group (p<0.05). mCHEOPS, additional analgesia, dysphagia and patient satisfaction scores were similar in the tramadol and ketamine groups (p>0.05). Preincisional injection of ketamine and tramadol prior to tonsillectomy is a safe and effective method, and the two drugs are equivalent with respect to post-tonsillectomy pain, patient satisfaction, postoperative nausea, vomiting and dysphagia. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
de Biase, Stefano; Gigli, Gian Luigi; Lorenzut, Simone; Bianconi, Claudio; Sfreddo, Patrizia; Rossato, Gianluca; Basaldella, Federica; Fuccaro, Matteo; Corica, Antonio; Tonon, Davide; Barbone, Fabio; Valente, Mariarosaria
2014-04-01
The aim of our study was to evaluate the importance of sleep recordings and stimulus-related evoked potentials (EPs) in patients with prolonged disorders of consciousness (DOCs) by correlating neurophysiologic variables with clinical evaluation obtained using specific standardized scales. There were 27 vegetative state (VS) and 5 minimally conscious state (MCS) patients who were evaluated from a clinical and neurophysiologic perspective. Clinical evaluation included the Coma Recovery Scale-Revised (CRS-R), Disability Rating Scale (DRS), and Glasgow Coma Scale (GCS). Neurophysiologic evaluation included 24-h polysomnography (PSG), somatosensory EPs (SEPs), brainstem auditory EPs (BAEPs), and visual EPs (VEPs). Patients with preservation of each single sleep element (sleep-wake cycle, sleep spindles, K-complexes, and rapid eye movement [REM] sleep) always showed better clinical scores compared to those who did not have preservation. Statistical significance was only achieved for REM sleep. In 7 patients PSG showed the presence of all considered sleep elements, and they had a CRS-R score of 8.29±1.38. In contrast, 25 patients who lacked one or more of the sleep elements had a CRS-R score of 4.84±1.46 (P<.05). Our multivariate analysis clarified that concurrent presence of sleep spindles and REM sleep were associated with a much higher CRS-R score (positive interaction, P<.0001). On the other hand, no significant associations were found between EPs and CRS-R scores. PSG recordings have proved to be a reliable tool in the neurophysiologic assessment of patients with prolonged DOCs, correlating more adequately than EPs with the clinical evaluation and the level of consciousness. The main contribution to higher clinical scores was determined by the concomitant presence of REM sleep and sleep spindles. PSG recordings may be considered inexpensive, noninvasive, and easy-to-perform examinations to provide supplementary information in patients with prolonged DOCs. Copyright © 2014 Elsevier B.V. All rights reserved.
Khalili, Hosseinali; Sadraei, Nazanin; Niakan, Amin; Ghaffarpasand, Fariborz; Sadraei, Amin
2016-10-01
To determine the role of intracranial pressure (ICP) monitoring in management of patients with severe traumatic brain injury (TBI) admitted to a large level I trauma center in Southern Iran. This was a cohort study performed during a 2-year period in a level I trauma center in Southern Iran including all adult patients (>16 years) with severe TBI (Glasgow Coma Scale [GCS] score, 3-8) who underwent ICP monitoring through ventriculostomy. The management was based on the recorded ICP values with threshold of 20 mm Hg. Decompressive craniectomy was performed in patients with intractable intracranial hypertension (persistent ICP ≥25 mm Hg). In unresponsive patients, barbiturate coma was induced. Patients were followed for 6 months and Glasgow Outcome Scale Extended was recorded. The determinants of favorable and unfavorable outcome were also determined. Overall, we included 248 patients with mean age of 34.6 ± 16.6 years, among whom there were 216 men (87.1%) and 32 women (12.9%). Eighty-five patients (34.2%) had favorable and 163 (65.8%) unfavorable outcomes. Those with favorable outcome had significantly lower age (P = 0.004), higher GCS score on admission (P < 0.001), lower Rotterdam score (P = 0.035), fewer episodes of intracranial hypertension (P < 0.001), and lower maximum recorded ICP (P = 0.041). These factors remained statistically significant after elimination of confounders by multivariate logistic regression model. Age, GCS score on admission, Rotterdam score, intracranial hypertension, and maximum recorded ICP are important determinants of outcome in patients with severe TBI. ICP monitoring assisted us in targeted therapy and management of patients with severe TBI. Copyright © 2016 Elsevier Inc. All rights reserved.
Winston P. Smith; Howard E. Hunt; W. Kent Townley
2001-01-01
To characterize bird species composition, relative abundance, and habitat affinities, spot-mapping and strip-count censuses were conducted in an old-growth stand and adjacent second-growth tracts in Moro Bottoms Natural Area, Arkansas, during 1991 and 1992. More species were recorded on the old-growth site (S = 35) as compared to the second-growth grid (S = 32). Similarly...
G. L. Wooldridge; R. C. Musselman; R. A. Sommerfeld; D. G. Fox; B. H. Connell
1996-01-01
Deformations of Engelmann spruce and subalpine fir trees were surveyed for the purpose of determining climatic wind speeds and directions and snow depths in the Glacier Lakes Ecosystem Experiments Site (GLEES) in the Snowy Range of southeastern Wyoming, USA. Tree deformations were recorded at 50- and 100-m grid intervals over areas of c. 30 ha and 300 ha,...
Modelling the spread of ragweed: Effects of habitat, climate change and diffusion
NASA Astrophysics Data System (ADS)
Vogl, G.; Smolik, M.; Stadler, L.-M.; Leitner, M.; Essl, F.; Dullinger, S.; Kleinbauer, I.; Peterseil, J.
2008-07-01
Ragweed (Ambrosia artemisiifolia L.) is an annual plant native to North America that has been invading Central Europe for 150 years. Driven by the warming of the European climate, its spread has accelerated in the last few decades. The pollen of ragweed evokes severe allergies and, perhaps more importantly, because the plant blooms rather late in summer it causes a second wave of allergy after other pollen allergies have decayed. We have reconstructed the invasion process of ragweed in Austria by collecting all records until the year 2005. Austria was subdivided into more than 2600 grid cells of ≈35 km² each. Ragweed records were related to environmental descriptors (average temperatures, land use, etc.) by means of logistic regression models, and the suitability of grid cells as habitat for ragweed was determined. This enabled modelling of the diffusive spread of ragweed from 1990 to 2005. The results of the simulations were compared with the observed data, and thus the model was optimised. We then incorporated regional climate change models, in particular increased July mean temperatures of +2.3 °C in 2050, which considerably increase future habitat suitability. This is used to predict the drastic dispersal of ragweed during the forthcoming decades.
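The habitat-suitability step, relating ragweed presence to environmental descriptors through logistic regression, can be illustrated with a short Python sketch. The predictors and coefficients below are hypothetical placeholders, not the fitted Austrian model:

import numpy as np

def habitat_suitability(july_temp, urban_fraction, beta):
    """Logistic-regression style suitability of a grid cell for ragweed.
    beta = (intercept, coef_temp, coef_urban); all coefficients are hypothetical."""
    z = beta[0] + beta[1] * july_temp + beta[2] * urban_fraction
    return 1.0 / (1.0 + np.exp(-z))

beta = (-12.0, 0.6, 1.5)                          # illustrative values only
print(habitat_suitability(18.0, 0.2, beta))       # current July mean temperature
print(habitat_suitability(20.3, 0.2, beta))       # +2.3 degC climate-change scenario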
Agarwal, Rahul; Chen, Zhe; Kloosterman, Fabian; Wilson, Matthew A; Sarma, Sridevi V
2016-07-01
Pyramidal neurons recorded from the rat hippocampus and entorhinal cortex, such as place and grid cells, have diverse receptive fields, which are either unimodal or multimodal. Spiking activity from these cells encodes information about the spatial position of a freely foraging rat. At fine timescales, a neuron's spike activity also depends significantly on its own spike history. However, due to limitations of current parametric modeling approaches, it remains a challenge to estimate complex, multimodal neuronal receptive fields while incorporating spike history dependence. Furthermore, efforts to decode the rat's trajectory in one- or two-dimensional space from hippocampal ensemble spiking activity have mainly focused on spike history-independent neuronal encoding models. In this letter, we address these two important issues by extending a recently introduced nonparametric neural encoding framework that allows modeling both complex spatial receptive fields and spike history dependencies. Using this extended nonparametric approach, we develop novel algorithms for decoding a rat's trajectory based on recordings of hippocampal place cells and entorhinal grid cells. Results show that both encoding and decoding models derived from our new method performed significantly better than state-of-the-art encoding and decoding models on 6 minutes of test data. In addition, our model's performance remains invariant to the apparent modality of the neuron's receptive field.
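For context, a conventional spike history-independent Bayesian (Poisson) decoder over binned positions can be sketched as follows; the nonparametric encoding models described above would replace the fixed tuning curves assumed here. The tuning curves and spike counts are illustrative only:

import numpy as np

def decode_position(spike_counts, tuning_curves, dt):
    """Maximum a posteriori position from one time bin of ensemble spiking.
    spike_counts: (n_cells,) spikes observed in the bin
    tuning_curves: (n_cells, n_positions) expected firing rate (Hz) per position
    dt: bin width in seconds
    """
    rates = np.clip(tuning_curves * dt, 1e-12, None)              # expected counts
    log_post = spike_counts @ np.log(rates) - rates.sum(axis=0)   # Poisson log-likelihood, flat prior
    return np.argmax(log_post)

# Two hypothetical place cells over 5 position bins
tuning = np.array([[1.0, 8.0, 2.0, 0.5, 0.2],
                   [0.2, 0.5, 2.0, 8.0, 1.0]])
print(decode_position(np.array([3, 0]), tuning, dt=0.25))   # most likely bin: 1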
A simple teaching tool for training the pelvic organ prolapse quantification system.
Geiss, Ingrid M; Riss, Paul A; Hanzal, Engelbert; Dungl, Andrea
2007-09-01
The pelvic organ prolapse quantification (POPQ) system is currently the most common and specific system describing different prolapse stages. Nevertheless, its use is not yet accepted worldwide in routine care. Our aim was to develop a simple teaching tool for the POPQ system capable of simulating different stages of uterovaginal prolapse for use in medical education with hands-on training. We constructed a moveable and flexible tool from an inverted Santa Claus cap, in which the cap simulated the vaginal cuff and the tassel at the end represented the cervix. A wooden embroidery frame fixed the cap and served as the hymen, the reference point for all measurements. Inside the cap, we sewed buttons to define the anatomic landmark points Aa and Ap, located 3 cm distal to the frame. After explaining the device to the students, we used the three-by-three grid for recording the quantitative description of the pelvic organ support. First, each student had to demonstrate a specific prolapse with his cap device. Then, a prolapse was simulated on the cap, and the student had to take the relevant measurements and record them in the POPQ grid. The main training benefit for understanding the POPQ system seems to be that each trainee can simulate a three-dimensional prolapse with this flexible vaginal model.
Validation of Predictors of Fall Events in Hospitalized Patients With Cancer.
Weed-Pfaff, Samantha H; Nutter, Benjamin; Bena, James F; Forney, Jennifer; Field, Rosemary; Szoka, Lynn; Karius, Diana; Akins, Patti; Colvin, Christina M; Albert, Nancy M
2016-10-01
A seven-item cancer-specific fall risk tool (Cleveland Clinic Capone-Albert [CC-CA] Fall Risk Score) was shown to have a strong concordance index for predicting falls; however, validation of the model is needed. The aims of this study were to validate that the CC-CA Fall Risk Score, made up of six factors, predicts falls in patients with cancer and to determine if the CC-CA Fall Risk Score performs better than the Morse Fall Tool. Using a prospective, comparative methodology, data were collected from electronic health records of patients hospitalized for cancer care in four hospitals. Risk factors from each tool were recorded, when applicable. Multivariable models were created to predict the probability of a fall. A concordance index for each fall tool was calculated. The CC-CA Fall Risk Score provided higher discrimination than the Morse Fall Tool in predicting fall events in patients hospitalized for cancer management.
ERIC Educational Resources Information Center
Balthazar, Earl E.
The scoring form for functional independence skills for the mentally retarded includes a section for recording subjects' demographic characteristics as well as tests used, date administered, and raw score. Other sections provide for a brief description of the program being used, an item scoring sheet for the Eating Scales (dependent feeding,…
Yamaguchi, T; Abe, S; Rompré, P H; Manzini, C; Lavigne, G J
2012-01-01
Clinicians and investigators need a simple and reliable recording device to diagnose or monitor sleep bruxism (SB). The aim of this study was to compare recordings made with an ambulatory electromyographic telemetry recorder (TEL-EMG) with those made with standard sleep laboratory polysomnography with synchronised audio-visual recording (PSG-AV). Eight volunteer subjects without current history of tooth grinding spent one night in a sleep laboratory. Simultaneous bilateral masseter EMG recordings were made with a TEL-EMG and standard PSG. All types of oromotor activity and rhythmic masseter muscle activity (RMMA), typical of SB, were independently scored by two individuals. Correlation and the intra-class correlation coefficient (ICC) were estimated for scores on each system. The TEL-EMG was highly sensitive for detecting RMMA (0.988), but had a low positive predictive value (0.231) because of a high rate of oromotor activity detection (e.g. swallowing and scratching). Almost 72% of false-positive oromotor activity scored with the TEL-EMG occurred during transient wake periods of sleep. A non-significant correlation between recording systems was found (r = 0.49). Because of the high frequency of wake periods during sleep, the ICC was low (0.47), and removal of the influence of wake periods improved the detection reliability of the TEL-EMG (ICC = 0.88). The TEL-EMG is sensitive for detecting RMMA in normal subjects. However, it obtained a high rate of false-positive detections because of the presence of frequent oromotor activities and transient wake periods of sleep. New algorithms are needed to improve the validity of TEL-EMG recordings. © 2011 Blackwell Publishing Ltd.
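The reported detection statistics follow from standard definitions; a small Python sketch, with event counts chosen only to illustrate how sensitivity and positive predictive value are derived (they are not the study's raw counts):

def sensitivity_and_ppv(true_pos, false_neg, false_pos):
    """Detection sensitivity and positive predictive value from event counts."""
    sensitivity = true_pos / (true_pos + false_neg)
    ppv = true_pos / (true_pos + false_pos)
    return sensitivity, ppv

# Illustrative counts only (the abstract reports sensitivity 0.988 and PPV 0.231)
print(sensitivity_and_ppv(true_pos=83, false_neg=1, false_pos=276))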
NASA Astrophysics Data System (ADS)
Morávek, Zdenek; Rickhey, Mark; Hartmann, Matthias; Bogner, Ludwig
2009-08-01
Treatment plans for intensity-modulated proton therapy may be sensitive to several sources of uncertainty. One source is correlated with approximations in the algorithms applied in the treatment planning system, and another depends on how robust the optimization is with regard to intra-fractional tissue movements. The delivered dose distribution may deviate substantially from the planned one when systematic errors occur in the dose algorithm. These can influence proton ranges and lead to improper modeling of Bragg peak degradation in heterogeneous structures, of particle scatter, or of the nuclear interaction component. Additionally, systematic errors influence the optimization process, which leads to the convergence error. Uncertainties with regard to organ movements are related to the robustness of a chosen beam setup to tissue movements during irradiation. We present the inverse Monte Carlo treatment planning system IKO for protons (IKO-P), which tries to minimize the errors described above to a large extent. Additionally, robust planning is introduced by beam angle optimization according to an objective function penalizing beam paths that traverse strong longitudinal and transversal tissue heterogeneities. The same score function is applied to optimize spot planning through the selection of a robust set of spots. As spots can be positioned on different energy grids or on geometric grids with different space-filling factors, a variety of grids were used to investigate the influence on the spot-weight distribution resulting from optimization. A tighter distribution of spot weights was assumed to result in a plan more robust to movements. IKO-P is described in detail and demonstrated on a test case as well as a lung cancer case. Different options for spot planning and grid types are evaluated, yielding superior plan quality when dose is delivered to the spots from all beam directions rather than from optimized beam directions only. This option shows a tighter spot-weight distribution and should therefore be less sensitive to movements than optimized directions. However, accepting a slight loss in plan quality, the latter choice could potentially improve robustness even further by accepting only spots from the most suitable direction. The choice of a geometric grid instead of an energy grid for spot positioning has only a minor influence on plan quality, at least for the investigated lung case.
Botti, F; Alexander, A; Drygajlo, A
2004-12-02
This paper deals with a procedure to compensate for mismatched recording conditions in forensic speaker recognition, using a statistical score normalization. Bayesian interpretation of the evidence in forensic automatic speaker recognition depends on three sets of recordings in order to perform forensic casework: reference (R) and control (C) recordings of the suspect, and a potential population database (P), as well as a questioned recording (QR). The requirement of similar recording conditions between the suspect control database (C) and the questioned recording (QR) is often not satisfied in real forensic cases. The aim of this paper is to investigate a procedure of normalization of scores, which is based on an adaptation of the Test-normalization (T-norm) [2] technique used in the speaker verification domain, to compensate for the mismatch. The Polyphone IPSC-02 database and ASPIC (an automatic speaker recognition system developed by EPFL and IPS-UNIL in Lausanne, Switzerland) were used in order to test the normalization procedure. Experimental results for three different recording condition scenarios are presented using Tippett plots and the effect of the compensation on the evaluation of the strength of the evidence is discussed.
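T-norm itself is a simple standardization of a raw comparison score against a cohort of impostor scores obtained under the same recording conditions. A minimal Python sketch, with hypothetical scores:

import numpy as np

def t_norm(raw_score, cohort_scores):
    """Test-normalization: scale a raw similarity score by the mean and standard
    deviation of scores obtained by comparing the questioned recording against a
    cohort of other (impostor) speaker models."""
    cohort_scores = np.asarray(cohort_scores, dtype=float)
    return (raw_score - cohort_scores.mean()) / cohort_scores.std(ddof=1)

# Hypothetical numbers: suspect score 2.1, cohort scores from 20 other speakers
rng = np.random.default_rng(2)
print(t_norm(2.1, rng.normal(0.0, 0.8, 20)))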
NASA Astrophysics Data System (ADS)
Semeniuk, T. A.; Bruintjes, R. T.; Salazar, V.; Breed, D. W.; Jensen, T. L.; Buseck, P. R.
2014-03-01
An airborne study of cloud microphysics provided an opportunity to collect aerosol particles in ambient and updraft conditions of natural convection systems for transmission electron microscopy (TEM). Particles were collected simultaneously on lacey carbon and calcium-coated carbon (Ca-C) TEM grids, providing information on particle morphology and chemistry and a unique record of the particle's physical state on impact. In total, 22 particle categories were identified, including single, coated, aggregate, and droplet types. The fine fraction comprised up to 90% mixed cation sulfate (MCS) droplets, while the coarse fraction comprised up to 80% mineral-containing aggregates. Insoluble (dry), partially soluble (wet), and fully soluble particles (droplets) were recorded on Ca-C grids. Dry particles were typically silicate grains; wet particles were mineral aggregates with chloride, nitrate, or sulfate components; and droplets were mainly aqueous NaCl and MCS. Higher numbers of droplets were present in updrafts (80% relative humidity (RH)) compared with ambient conditions (60% RH), and almost all particles activated at cloud base (100% RH). The greatest changes in size and shape were observed in NaCl-containing aggregates (>0.3 µm diameter) along updraft trajectories. Their abundance was associated with high numbers of cloud condensation nuclei (CCN) and cloud droplets, as well as large droplet sizes in updrafts. Thus, compositional dependence was observed in activation behavior recorded for coarse and fine fractions. Soluble salts from local pollution and natural sources clearly affected aerosol-cloud interactions, enhancing the spectrum of particles forming CCN and forming giant CCN from aggregates, thus making cloud seeding with hygroscopic flares ineffective in this region.
A Comprehensive Precipitation Data Set for Global Land Areas (TR-051)
Eischeid, J. K. [Univ. of Colorado, Boulder, CO (United States) Cooperative Inst. for Research in Environmental Sciences (CIRES); NOAA; Diaz, H. F. [Univ. of Colorado, Boulder, CO (United States). Cooperative Inst. for Research in Environmental Sciences (CIRES); NOAA; Bradley, R. S. [University of Massachusetts, Amherst, MA (USA); Jones, P. D. [University of East Anglia, Norwich, United Kingdom
1994-01-01
An expanded and updated compilation of long-term station precipitation data, together with a new set of gridded monthly mean fields for global land areas, are described. The present data set contains 5328 station records of monthly total precipitation, covering the period from the mid-1800s to the late 1980s. The station data were individually tested and visually inspected for the presence of spurious trends, jumps, and other measurement biases. The quality control procedure which was used to check the station records for nonclimatic discontinuities and other biases is detailed. We also discuss some of the problems which typically contribute to potential inhomogeneities in precipitation records. The station data were interpolated onto a 4° latitude by 5° longitude uniform grid. Comparisons of these data with two other global-scale precipitation climatologies are presented. We find good agreement among the three global-scale climatologies over the common areas in each set. Three different indices of long-term precipitation variations over the global land areas all indicate a general increase of annual precipitation since the 1940s, although a decline is evident over the last decade. There is some indication that the last few decades of the 19th century may have been as wet as the recent ones. An interesting feature of this study is the presence of relatively large differences in seasonal trends, with March-May and September-November becoming wetter in the last few decades. The December-February and June-August seasons exhibit smaller overall trends, although the northern winter season does exhibit large decadal-scale fluctuations.
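One simple way to place irregular station records onto a 4° latitude by 5° longitude grid is to average the stations falling in each cell; the report's actual interpolation scheme is not described in this abstract, so the Python sketch below is only a generic illustration with hypothetical station values:

import numpy as np

def grid_stations(lats, lons, values, cell_lat=4.0, cell_lon=5.0):
    """Average station values falling in each cell of a 4 deg x 5 deg global grid."""
    lat_edges = np.arange(-90, 90 + cell_lat, cell_lat)
    lon_edges = np.arange(-180, 180 + cell_lon, cell_lon)
    sums = np.zeros((len(lat_edges) - 1, len(lon_edges) - 1))
    counts = np.zeros_like(sums)
    for lat, lon, v in zip(lats, lons, values):
        i = np.searchsorted(lat_edges, lat, side="right") - 1
        j = np.searchsorted(lon_edges, lon, side="right") - 1
        sums[i, j] += v
        counts[i, j] += 1
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(counts > 0, sums / counts, np.nan)

# Three hypothetical stations: latitude, longitude, monthly precipitation (mm)
grid = grid_stations([51.5, 52.0, -33.9], [-0.1, 0.5, 18.4], [55.0, 60.0, 20.0])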
NASA Astrophysics Data System (ADS)
Vargas, Marco; Miura, Tomoaki; Csiszar, Ivan; Zheng, Weizhong; Wu, Yihua; Ek, Michael
2017-04-01
The first Joint Polar Satellite System (JPSS) mission, the Suomi National Polar-orbiting Partnership (S-NPP) satellite, was successfully launched in October 2011, and it will be followed by JPSS-1, slated for launch in 2017. JPSS provides operational continuity of satellite-based observations and products for NOAA's Polar Operational Environmental Satellites (POES). Vegetation products derived from satellite measurements are used for weather forecasting, land modeling, climate research, and monitoring the environment, including drought, ecosystem health, crops and forest fires. The operationally produced S-NPP VIIRS Vegetation Index (VI) Environmental Data Record (EDR) includes two vegetation indices: the Top of the Atmosphere (TOA) Normalized Difference Vegetation Index (NDVI), and the Top of the Canopy (TOC) Enhanced Vegetation Index (EVI). For JPSS-1, the S-NPP Vegetation Index EDR algorithm has been updated to include the TOC NDVI. The current JPSS operational VI products are generated in granule format at 375 m resolution at nadir, but products in granule format cannot be ingested into NOAA operational monitoring and decision-making systems. For that reason, the NOAA JPSS Land Team is developing a new global gridded Vegetation Index (VI) product suite for operational use by the NOAA National Centers for Environmental Prediction (NCEP). The new global gridded VIs will be used in the Multi-Physics (MP) version of the Noah land surface model (Noah-MP) in the NCEP NOAA Environmental Modeling System (NEMS) for plant growth and data assimilation and to describe vegetation coverage and density in order to model the correct surface energy partition. The new 4 km resolution global gridded VI products (TOA NDVI, TOC NDVI and TOC EVI) are being designed so that vegetation index variables can be ingested directly, without the need to develop local gridding and compositing procedures. These VI products will be consistent with the already operational S-NPP VIIRS Green Vegetation Fraction (GVF) product, which is also global, gridded and at 4 km resolution. The ultimate goal is a consistent set of global gridded land products at 1 km resolution to enable consistent use of the products in the full suite of global and regional NCEP land models. The new JPSS vegetation products system is scheduled to transition to operations in the fall of 2017.
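The two indices in the product suite are computed from standard band-ratio formulas; a short Python sketch using the usual NDVI definition and the commonly cited MODIS/VIIRS-style EVI coefficients, with illustrative reflectance values:

import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def evi(nir, red, blue, g=2.5, c1=6.0, c2=7.5, l=1.0):
    """Enhanced Vegetation Index with the standard coefficients."""
    return g * (nir - red) / (nir + c1 * red - c2 * blue + l)

# Illustrative surface reflectances (NIR, red, blue)
print(ndvi(0.45, 0.08), evi(0.45, 0.08, 0.04))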
Pallett, Edward J; Rentowl, Patricia; Johnson, Mark I; Watson, Paul J
2014-03-01
The efficacy of transcutaneous electrical nerve stimulation (TENS) for pain relief has not been reliably established. Inconclusive findings could be due to inadequate TENS delivery and inappropriate outcome assessment. Electronic monitoring devices were used to determine patient compliance with a TENS intervention and outcome assessment protocol, to record pain scores before, during, and after TENS, and measure electrical output settings. Patients with chronic back pain consented to use TENS daily for 2 weeks and to report pain scores before, during, and after 1-hour treatments. A ≥ 30% reduction in pain scores was used to classify participants as TENS responders. Electronic monitoring devices "TLOG" and "TSCORE" recorded time and duration of TENS use, electrical settings, and pain scores. Forty-two patients consented to participate. One of 35 (3%) patients adhered completely to the TENS use and pain score reporting protocol. Fourteen of 33 (42%) were TENS responders according to electronic pain score data. Analgesia onset occurred within 30 to 60 minutes for 13/14 (93%) responders. It was not possible to correlate TENS amplitude, frequency, or pulse width measurements with therapeutic response. Findings from TENS research studies depend on the timing of outcome assessment; pain should be recorded during stimulation. TENS device sophistication might be an issue and parameter restriction should be considered. Careful protocol design is required to improve adherence and monitoring is necessary to evaluate the validity of findings. This observational study provides objective evidence to support concerns about poor implementation fidelity in TENS research.
Drennan, M J; McGee, M; Keane, M G
2008-05-01
The objective was to determine the relationship of muscular and skeletal scores taken on the live animal and carcass conformation and fat scores with carcass composition and value. Bulls (n = 48) and heifers (n = 37) of 0.75 to 1.0 late-maturing breed genotypes slaughtered at 16 and 20 months of age, respectively, were used. At 8 months of age (weaning) and immediately pre-slaughter, visual muscular scores were recorded for each animal and additionally skeletal scores were recorded pre-slaughter. Carcass weight, kidney and channel fat weight, carcass conformation and fat scores, fat depth over the longissimus dorsi muscle at the 12th (bulls) or 10th (heifers) rib and carcass length were recorded post-slaughter. Each carcass was subsequently dissected into meat, fat and bone using a commercial dissection procedure. Muscular scores taken pre-slaughter showed positive correlations with killing-out rate (r ≈ 0.65), carcass meat proportion (r ≈ 0.60), value (r ≈ 0.55) and conformation score (r ≈ 0.70), and negative correlations with carcass bone (r ≈ -0.60) and fat (r ≈ -0.4) proportions. Corresponding correlations with muscular scores at weaning were lower. Correlations of skeletal scores taken pre-slaughter, carcass length and carcass weight with killing-out rate and the various carcass traits were mainly not significant. Carcass fat depth and kidney and channel fat weight were negatively correlated with carcass meat proportion and value, and positively correlated with fat proportion. Correlations of carcass conformation score were positive (r = 0.50 to 0.68) with killing-out rate, carcass meat proportion and carcass value and negative with bone (r ≈ -0.56) and fat (r ≈ -0.40) proportions. Corresponding correlations with carcass fat score were mainly negative except for carcass fat proportion (r ≈ 0.79). A one-unit (scale 1 to 15) increase in carcass conformation score increased carcass meat proportion by 8.9 and 8.1 g/kg, decreased fat proportion by 4.0 and 2.9 g/kg and decreased bone proportion by 4.9 and 5.2 g/kg in bulls and heifers, respectively. Corresponding values per unit increase in carcass fat score were -11.9 and -9.7 g/kg, 12.4 and 9.9 g/kg, and -0.5 and -0.2 g/kg. Carcass conformation and fat scores explained 0.70 and 0.55 of the total variation in meat yield for bulls and heifers, respectively. It is concluded that live animal muscular scores, and carcass conformation and fat scores, are useful indicators of carcass meat proportion and value.
Use of Synchronized Phasor Measurements for Model Validation in ERCOT
NASA Astrophysics Data System (ADS)
Nuthalapati, Sarma; Chen, Jian; Shrestha, Prakash; Huang, Shun-Hsien; Adams, John; Obadina, Diran; Mortensen, Tim; Blevins, Bill
2013-05-01
This paper discusses experiences in the use of synchronized phasor measurement technology in the Electric Reliability Council of Texas (ERCOT) interconnection, USA. Implementation of synchronized phasor measurement technology in the region is a collaborative effort involving ERCOT, ONCOR, AEP, SHARYLAND, EPG, CCET, and UT-Arlington. As several phasor measurement units (PMUs) have been installed in the ERCOT grid in recent years, phasor data with a resolution of 30 samples per second are being used to monitor power system status and record system events. Post-event analyses using recorded phasor data have successfully verified ERCOT dynamic stability simulation studies. The real-time monitoring software "RTDMS"® enables ERCOT to analyze small-signal stability conditions by monitoring the phase angles and oscillations. The recorded phasor data also enable ERCOT to validate the existing dynamic models of conventional and/or wind generators.
Weiler, Richard; van Mechelen, Willem; Fuller, Colin; Ahmed, Osman Hassan; Verhagen, Evert
2018-01-01
To determine if baseline Sport Concussion Assessment Tool, Third Edition (SCAT3) scores differ between athletes with and without disability. Cross-sectional comparison of preseason baseline SCAT3 scores for a range of England international footballers. Team doctors and physiotherapists supporting England football teams recorded players' SCAT3 baseline tests from August 1, 2013, to July 31, 2014. A convenience sample of 249 England footballers, of whom 185 were players without disability (male: 119; female: 66) and 64 were players with disability (male learning disability: 17; male cerebral palsy: 28; male blind: 10; female deaf: 9). Between-group comparisons of median SCAT3 total and section scores were made using the nonparametric Mann-Whitney-Wilcoxon rank-sum test. All footballers with disability recorded higher symptom severity scores compared with male players without disability. Male footballers with learning disability demonstrated no significant difference in the total number of symptoms, but recorded significantly lower scores on immediate memory and delayed recall compared with male players without disability. Male blind footballers scored significantly higher for total concentration and delayed recall, and male footballers with cerebral palsy scored significantly higher on balance testing and immediate memory, when compared with male players without disability. Female footballers with deafness scored significantly higher for total concentration and balance testing than female footballers without disability. This study suggests that significant differences exist between SCAT3 baseline section scores for footballers with and without disability. Concussion consensus guidelines should recognize these differences and produce guidelines that are specific for the growing number of athletes living with disability.
Novel chemistries and materials for grid-scale energy storage: Quinones and halogen catalysis
NASA Astrophysics Data System (ADS)
Huskinson, Brian Thomas
In this work I describe various approaches to electrochemical energy storage at the grid-scale. Chapter 1 provides an introduction to energy storage and an overview of the history and development of flow batteries. Chapter 2 describes work on the hydrogen-chlorine regenerative fuel cell, detailing its development and the record-breaking performance of the device. Chapter 3 dives into catalyst materials for such a fuel cell, focusing on ruthenium oxide based alloys to be used as chlorine redox catalysts. Chapter 4 introduces and details the development of a performance model for a hydrogen-bromine cell. Chapter 5 delves into the more recent work I have done, switching to applications of quinone chemistries in flow batteries. It focuses on the pairing of one particular quinone (2,7-anthraquinone disulfonic acid) with bromine, and highlights the promising performance characteristics of a device based on this type of chemistry.
NASA Technical Reports Server (NTRS)
Kemp, James Herbert (Inventor); Talukder, Ashit (Inventor); Lambert, James (Inventor); Lam, Raymond (Inventor)
2008-01-01
A computer-implemented system and method of intra-oral analysis for measuring plaque removal is disclosed. The system includes hardware for real-time image acquisition and software to store the acquired images on a patient-by-patient basis. The system implements algorithms to segment teeth of interest from surrounding gum, and uses a real-time image-based morphing procedure to automatically overlay a grid onto each segmented tooth. Pattern recognition methods are used to classify plaque from surrounding gum and enamel, while ignoring glare effects due to the reflection of camera light and ambient light from enamel regions. The system integrates these components into a single software suite with an easy-to-use graphical user interface (GUI) that allows users to do an end-to-end run of a patient record, including tooth segmentation of all teeth, grid morphing of each segmented tooth, and plaque classification of each tooth image.
Flexible Residential Smart Grid Simulation Framework
NASA Astrophysics Data System (ADS)
Xiang, Wang
Different scheduling and coordination algorithms controlling household appliances' operations can potentially lead to energy consumption reduction and/or load balancing in conjunction with different electricity pricing methods used in smart grid programs. In order to easily implement different algorithms and evaluate their efficiency against other ideas, a flexible simulation framework is desirable in both research and business fields. However, such a platform is currently lacking or underdeveloped. In this thesis, we provide a simulation framework to focus on demand side residential energy consumption coordination in response to different pricing methods. This simulation framework, equipped with an appliance consumption library using realistic values, aims to closely represent the average usage of different types of appliances. The simulation results of traditional usage yield close matching values compared to surveyed real life consumption records. Several sample coordination algorithms, pricing schemes, and communication scenarios are also implemented to illustrate the use of the simulation framework.
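A toy example of the kind of demand-side coordination such a framework simulates is shifting a deferrable appliance run to the cheapest allowed window under a time-of-use tariff. The Python sketch below is illustrative only and is not part of the thesis framework; all prices and time limits are hypothetical:

def cheapest_start(prices, duration, earliest, latest):
    """Return the start hour in [earliest, latest] that minimizes energy cost
    for an appliance that runs `duration` consecutive hours."""
    best_hour, best_cost = None, float("inf")
    for start in range(earliest, latest + 1):
        cost = sum(prices[start:start + duration])
        if cost < best_cost:
            best_hour, best_cost = start, cost
    return best_hour, best_cost

tou_prices = [0.10] * 7 + [0.25] * 12 + [0.15] * 5      # $/kWh for each of 24 hours
print(cheapest_start(tou_prices, duration=2, earliest=6, latest=21))   # -> (19, 0.30)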
Evaluation of gridding procedures for air temperature over Southern Africa
NASA Astrophysics Data System (ADS)
Eiselt, Kai-Uwe; Kaspar, Frank; Mölg, Thomas; Krähenmann, Stefan; Posada, Rafael; Riede, Jens O.
2017-06-01
Africa is considered to be highly vulnerable to climate change, yet the availability of observational data and derived products is limited. As one element of the SASSCAL initiative (Southern African Science Service Centre for Climate Change and Adaptive Land Management), a cooperation of Angola, Botswana, Namibia, Zambia, South Africa and Germany, networks of automatic weather stations have been installed or improved (http://www.sasscalweathernet.org). The increased availability of meteorological observations improves the quality of gridded products for the region. Here we compare interpolation methods for monthly minimum and maximum temperatures, which were calculated from hourly measurements. Due to a lack of long-term records we focused on data ranging from September 2014 to August 2016. The best interpolation results were achieved by combining multiple linear regression (elevation, a continentality index and latitude as predictors) with three-dimensional inverse-distance-weighted interpolation.
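The best-performing procedure combines a regression background field with interpolation of the station residuals. A simplified two-dimensional Python sketch (the study used three-dimensional inverse distance weighting, and the predictor names and values below are hypothetical):

import numpy as np

def fit_background(predictors, temps):
    """Least-squares regression of monthly temperature on elevation,
    a continentality index and latitude."""
    X = np.column_stack([np.ones(len(temps)), predictors])
    coef, *_ = np.linalg.lstsq(X, temps, rcond=None)
    return coef

def idw(xy_stations, residuals, xy_target, power=2.0):
    """Inverse-distance-weighted interpolation of the regression residuals."""
    d = np.linalg.norm(xy_stations - xy_target, axis=1)
    if np.any(d == 0):
        return residuals[np.argmin(d)]
    w = 1.0 / d ** power
    return np.sum(w * residuals) / np.sum(w)

# Hypothetical stations: columns are elevation (m), continentality index, latitude
preds = np.array([[1200.0, 0.4, -22.5], [450.0, 0.7, -19.0], [900.0, 0.5, -25.1]])
temps = np.array([24.1, 29.3, 22.8])
coef = fit_background(preds, temps)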
The global gridded crop model intercomparison: Data and modeling protocols for Phase 1 (v1.0)
Elliott, J.; Müller, C.; Deryng, D.; ...
2015-02-11
We present protocols and input data for Phase 1 of the Global Gridded Crop Model Intercomparison, a project of the Agricultural Model Intercomparison and Improvement Project (AgMIP). The project consists of global simulations of yields, phenologies, and many land-surface fluxes using 12–15 modeling groups for many crops, climate forcing data sets, and scenarios over the historical period from 1948 to 2012. The primary outcomes of the project include (1) a detailed comparison of the major differences and similarities among global models commonly used for large-scale climate impact assessment, (2) an evaluation of model and ensemble hindcasting skill, (3) quantification of key uncertainties from climate input data, model choice, and other sources, and (4) a multi-model analysis of the agricultural impacts of large-scale climate extremes from the historical record.
NASA Astrophysics Data System (ADS)
1982-02-01
Performance data for January 1982 for a grid-connected photovoltaic power supply in Massachusetts are presented. Data include: monthly and daily electrical energy produced; monthly and daily solar energy incident on the array; monthly and daily array efficiency; plots of energy produced as a function of power level, voltage, cell temperature and time of day; power conditioner input, output and efficiency for each of two individual units and for the total power conditioning system; photovoltaic system efficiency; capacity factor; PV-system-to-load and grid-to-load energies and corresponding dollar values; daily energy supplied to the load by the PV system; daily PV system availability; monthly and hourly insolation; monthly and hourly temperature averages; monthly and hourly wind speed; wind direction distribution; average heating and cooling degree days; number of freeze/thaw cycles; and the data acquisition mode and recording interval plot.
Associating extreme precipitation events to parent cyclones in gridded data
NASA Astrophysics Data System (ADS)
Rhodes, Ruari; Shaffrey, Len; Gray, Sue
2015-04-01
When analysing the relationship of regional precipitation to its parent cyclone, it is insufficient to consider the cyclone's region of influence as a fixed radius from the centre due to the irregular shape of rain bands. A new method is therefore presented which allows the use of objective feature tracking data in the analysis of regional precipitation. Utilising the spatial extent of precipitation in gridded datasets, the most appropriate cyclone(s) may be associated with regional precipitation events. This method is applied in the context of an analysis of the influence of clustering and stalling of extra-tropical cyclones in the North Atlantic on total precipitation accumulations over England and Wales. Cyclone counts and residence times are presented for historical records (ERA-Interim) and future projections (HadGEM2-ES) of extreme (> 98th percentile) precipitation accumulations over England and Wales, for accumulation periods ranging from one day to one month.
Poscia, Andrea; Cambieri, Andrea; Tucceri, Chiara; Ricciardi, Walter; Volpe, Massimo
2015-01-01
In the current economic context, with increasing health needs, efficiency and efficacy are fundamental keywords for ensuring successful use of resources and the best health outcomes. At the same time, a completely and correctly compiled medical record is an essential tool in the patient's diagnostic and therapeutic path, and it is becoming ever more important for administrative reporting and legal claims. Nevertheless, even though improving medical record quality and hospital stay appropriateness are priorities for every health organization, they can be difficult to achieve. This study presents the methodology and preliminary results of a training and improvement process carried out by the Hospital Management of a third-level Italian teaching hospital through audit cycles designed to actively involve its health professionals. A self-assessment of medical record quality and hospital stay appropriateness (inpatient admissions and Day Hospital) was conducted through a retrospective evaluation of medical records. It started in 2012, and a random sample of 2295 medical records was examined: quality was assessed using a 48-item evaluation grid modified from the Lombardy Region manual of the medical record, while the appropriateness of each hospital day was assessed using the Italian version of the Appropriateness Evaluation Protocol (AEP), 2002 edition. The overall assessment was presented through departmental audits: the audits were designed according to the indications of the Italian and English Ministries of Health, in order to share the methodology and results with all the professionals involved (doctors and nurses) and to implement the improvement strategies summarized in this paper. The quality and appropriateness assessments revealed several deficiencies: 40% of records did not fully satisfy the minimum level of acceptability, and 30% of hospitalization days were inappropriate. Furthermore, there were large discrepancies among departments and among Care Units: the greatest problems were concentrated in the Day Hospitals, which were generally deficient on both profiles. Finally, our audit model, which could be considered a good project according to NHS criteria (score of 20/25), involved 480 professionals from different Care Units over 34 audit sessions; participants were satisfied and motivated to continue improving quality and appropriateness under these arrangements. The tools used in the project proved their value for measuring the minimum quality of healthcare documentation and organizational appropriateness; moreover, the audit was shown to be an effective methodology for introducing them, because it ensures their acceptability among the staff and creates the basis for rapid and quantifiable improvement that, by promoting accountability and transparency, could support risk management activities and ensure greater efficiency in hospitalization.
ERIC Educational Resources Information Center
JUSTMAN, JOSEPH
Changes in academic aptitude and achievement test scores of pupils attending public schools in disadvantaged areas in New York City were investigated. An attempt was made to determine whether varying degrees of mobility were associated with variation in changes in test scores. The cumulative record cards of sixth-grade pupils were examined to…
ERIC Educational Resources Information Center
Breyer, F. Jay; Attali, Yigal; Williamson, David M.; Ridolfi-McCulla, Laura; Ramineni, Chaitanya; Duchnowski, Matthew; Harris, April
2014-01-01
In this research, we investigated the feasibility of implementing the "e-rater"® scoring engine as a check score in place of all-human scoring for the "Graduate Record Examinations"® ("GRE"®) revised General Test (rGRE) Analytical Writing measure. This report provides the scientific basis for the use of e-rater as a…
A Guide to the Computerized Medical Data Resources of the Naval Health Research Center.
1987-04-09
Selection Test Score o Mental group o Education certificate o SCREEN Score The GCT Score is designed to measure ability to understand verbal relationships...available on some members before that date. For female members this field will contain Armed Forces Women’s Selection Test Scores. Norms provided for 16...the Board are recorded in this file. Finally, disposition by the Board is indicated. Physical Evaluation Board File. Selected data elements in the
ERIC Educational Resources Information Center
Neal, Menka E.
This study investigated the relationship between graduate grade-point-average (GGPA) and the total score, as well as the scores on each part, of the Graduate Record Examination (GRE). It also investigated the relationship between GGPA and the total score, as well as the scores on each part, of the Test of English as a Foreign Language (TOEFL). The…
Lin, Chung-Ying; Hwang, Jing-Shiang; Wang, Wen-Chung; Lai, Wu-Wei; Su, Wu-Chou; Wu, Tzu-Yi; Yao, Grace; Wang, Jung-Der
2018-04-13
Quality of life (QoL) assessment is important for clinicians to evaluate how cancer survivors judge their sense of well-being, and the WHOQOL-BREF may be a good tool for clinical use. However, at least three issues remain unresolved: (1) evidence on the psychometric properties of the WHOQOL-BREF for cancer patients is insufficient; (2) the scoring method used for the WHOQOL-BREF needs to be clarified; and (3) it is unknown whether different types of cancer patients interpret the WHOQOL-BREF similarly. We recruited 1000 outpatients with head/neck cancer, 1000 with colorectal cancer, 965 with liver cancer, 1438 with lung cancer and 1299 with gynecologic cancers in a medical center. Data analyses included Rasch models, confirmatory factor analysis (CFA), and Pearson correlations. The mean WHOQOL-BREF domain scores were between 13.34 and 14.77 among all participants. CFA supported construct validity; Rasch models revealed that almost all items were embedded in their expected domains and were interpreted similarly across five types of cancer patients; all correlation coefficients between Rasch scores and original domain scores were above 0.9. The linear relationship between Rasch scores and domain scores suggested that the current calculations for domain scores were applicable and without serious bias. Clinical practitioners may regularly collect and record the WHOQOL-BREF domain scores into electronic health records. Copyright © 2018. Published by Elsevier B.V.
Hammond, Kenric W; Ben-Ari, Alon Y; Laundry, Ryan J; Boyko, Edward J; Samore, Matthew H
2015-12-01
Free text in electronic health records resists large-scale analysis. Text records facts of interest not found in encoded data, and text mining enables their retrieval and quantification. The U.S. Department of Veterans Affairs (VA) clinical data repository affords an opportunity to apply text-mining methodology to study clinical questions in large populations. To assess the feasibility of text mining, investigation of the relationship between exposure to adverse childhood experiences (ACEs) and recorded diagnoses was conducted among all VA-treated Gulf War veterans, utilizing all progress notes recorded from 2000-2011. Text processing extracted ACE exposures recorded among 44.7 million clinical notes belonging to 243,973 veterans. The relationship of ACE exposure to adult illnesses was analyzed using logistic regression. Bias considerations were assessed. ACE score was strongly associated with suicide attempts and serious mental disorders (ORs = 1.84 to 1.97 per unit), and less so with behaviorally mediated and somatic conditions (ORs = 1.02 to 1.36 per unit). Bias adjustments did not remove persistent associations between ACE score and most illnesses. Text mining to detect ACE exposure in a large population was feasible. Analysis of the relationship between ACE score and adult health conditions yielded patterns of association consistent with prior research. Copyright © 2015 International Society for Traumatic Stress Studies.
Use of a new high-speed digital data acquisition system in airborne ice-sounding
Wright, David L.; Bradley, Jerry A.; Hodge, Steven M.
1989-01-01
A high-speed digital data acquisition and signal averaging system for borehole, surface, and airborne radio-frequency geophysical measurements was designed and built by the US Geological Survey. The system permits signal averaging at rates high enough to achieve significant signal-to-noise enhancement in profiling, even in airborne applications. The first field use of the system took place in Greenland in 1987 for recording data on a 150 by 150 km grid centered on the summit of the Greenland ice sheet. About 6000 line-km were flown and recorded using the new system. The data can be used to aid in siting a proposed scientific corehole through the ice sheet.
The American Academy of Sleep Medicine Inter-scorer Reliability Program: Respiratory Events
Rosenberg, Richard S.; Van Hout, Steven
2014-01-01
Study Objectives: The American Academy of Sleep Medicine (AASM) Inter-scorer Reliability program provides a unique opportunity to compare a large number of scorers with varied levels of experience to determine agreement in the scoring of respiratory events. The objective of this paper is to examine areas of disagreement to inform future revisions of the AASM Manual for the Scoring of Sleep and Associated Events. Methods: The sample included 15 monthly records, 200 epochs each. The number of scorers increased steadily during the period of data collection, reaching more than 3,600 scorers by the final record. Scorers were asked to identify whether an obstructive, mixed, or central apnea; a hypopnea; or no event was seen in each of the 200 epochs. The “correct” respiratory event score was defined as the score endorsed by the most scorers. Percentage agreement with the majority score was determined for each epoch and the mean agreement determined. Results: The overall agreement for scoring of respiratory events was 93.9% (κ = 0.92). There was very high agreement on epochs without respiratory events (97.4%), and the majority score for most of the epochs (87.8%) was no event. For the 364 epochs scored as having a respiratory event, overall agreement that some type of respiratory event occurred was 88.4% (κ = 0.77). The agreement for epochs scored as obstructive apnea by the majority was 77.1% (κ = 0.71), and the most common disagreement was hypopnea rather than obstructive apnea (14.4%). The agreement for hypopnea was 65.4% (κ = 0.57), with 16.4% scoring no event and 14.8% scoring obstructive apnea. The agreement for central apnea was 52.4% (κ = 0.41). A single epoch was scored as a mixed apnea by a plurality of scorers. Conclusions: The study demonstrated excellent agreement among a large sample of scorers for epochs with no respiratory events. Agreement for some type of event was good, but disagreements in scoring of apnea vs. hypopnea and type of apnea were common. A limitation of the analysis is that most of the records had normal breathing. A review of controversial events yielded no consistent bias that might be resolved by a change of scoring rules. Citation: Rosenberg RS; Van Hout S. The American Academy of Sleep Medicine inter-scorer reliability program: respiratory events. J Clin Sleep Med 2014;10(4):447-454. PMID:24733993
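The agreement statistic used throughout is the fraction of scorers matching the majority ("correct") label for each epoch. A minimal Python sketch with a hypothetical panel of ten scorers:

from collections import Counter

def epoch_agreement(epoch_scores):
    """Majority ('correct') score for one epoch and the fraction of scorers
    who agreed with it. epoch_scores is a list of event labels, one per scorer."""
    label, count = Counter(epoch_scores).most_common(1)[0]
    return label, count / len(epoch_scores)

# Hypothetical epoch scored by ten scorers
print(epoch_agreement(["obstructive"] * 7 + ["hypopnea"] * 2 + ["none"]))
# -> ('obstructive', 0.7)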
Henry, Stephen G.; Jerant, Anthony; Iosif, Ana-Maria; Feldman, Mitchell D.; Cipri, Camille; Kravitz, Richard L.
2015-01-01
Objective: To identify factors associated with participant consent to record visits; to estimate effects of recording on patient-clinician interactions. Methods: Secondary analysis of data from a randomized trial studying communication about depression; participants were asked for optional consent to audio record study visits. Multiple logistic regression was used to model likelihood of patient and clinician consent. Multivariable regression and propensity score analyses were used to estimate effects of audio recording on 6 dependent variables: discussion of depressive symptoms, preventive health, and depression diagnosis; depression treatment recommendations; visit length; visit difficulty. Results: Of 867 visits involving 135 primary care clinicians, 39% were recorded. For clinicians, only working in academic settings (P=0.003) and having worked longer at their current practice (P=0.02) were associated with increased likelihood of consent. For patients, white race (P=0.002) and diabetes (P=0.03) were associated with increased likelihood of consent. Neither multivariable regression nor propensity score analyses revealed any significant effects of recording on the variables examined. Conclusion: Few clinician or patient characteristics were significantly associated with consent. Audio recording had no significant effect on any dependent variables. Practice Implications: Benefits of recording clinic visits likely outweigh the risks of bias in this setting. PMID:25837372
Differences in Error Detection Skills by Band and Choral Preservice Teachers
ERIC Educational Resources Information Center
Stambaugh, Laura A.
2016-01-01
Band and choral preservice teachers (N = 44) studied band and choral scores, listened to recordings of school ensembles, and identified errors in the recordings. Results indicated that preservice teachers identified significantly more errors when listening to recordings of their primary area (band majors listening to band, p = 0.045; choral majors…
Sensory integration functions of children with cochlear implants.
Koester, AnjaLi Carrasco; Mailloux, Zoe; Coleman, Gina Geppert; Mori, Annie Baltazar; Paul, Steven M; Blanche, Erna; Muhs, Jill A; Lim, Deborah; Cermak, Sharon A
2014-01-01
OBJECTIVE. We investigated sensory integration (SI) function in children with cochlear implants (CIs). METHOD. We analyzed deidentified records from 49 children ages 7 mo to 83 mo with CIs. Records included Sensory Integration and Praxis Tests (SIPT), Sensory Processing Measure (SPM), Sensory Profile (SP), Developmental Profile 3 (DP-3), and Peabody Developmental Motor Scales (PDMS), with scores depending on participants' ages. We compared scores with normative population mean scores and with previously identified patterns of SI dysfunction. RESULTS. One-sample t tests revealed significant differences between children with CIs and the normative population on the majority of the SIPT items associated with the vestibular and proprioceptive bilateral integration and sequencing (VPBIS) pattern. Available scores for children with CIs on the SPM, SP, DP-3, and PDMS indicated generally typical ratings. CONCLUSION. SIPT scores in a sample of children with CIs reflected the VPBIS pattern of SI dysfunction, demonstrating the need for further examination of SI functions in children with CIs during occupational therapy assessment and intervention planning. Copyright © 2014 by the American Occupational Therapy Association, Inc.
[Inter-rater concordance of the "Nursing Activities Score" in intensive care].
Valls-Matarín, Josefa; Salamero-Amorós, Maria; Roldán-Gil, Carmen; Quintana-Riera, Salvador
2015-01-01
To evaluate inter-rater concordance in the rating of the "Nursing Activities Score". Cross-sectional descriptive study conducted from December 2012 until June 2013 in a general intensive care unit with twelve beds. Three evaluator nurses, simultaneously and independently, scored the nursing workload from the patients' daily charts using the Nursing Activities Score scale in all admitted patients over 18 years of age. Three hundred and thirty-nine records were collected. The intra-class correlation coefficient (ICC) between evaluators was 0.92 (0.89-0.94). Perfect concordance was obtained in 39.1% of the items, with 52.2% showing high and 8.7% lower concordance, the latter corresponding to two of the items with multiple scoring options. Significant differences between two of the evaluators (P=.049) were found. Although the inter-rater concordance was high, more accurate records are needed to reduce the variability of the items with multiple options and to allow more accuracy in the interpretation and measurement of the data regarding nursing workload. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.
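The study above summarises agreement between three raters with an intra-class correlation coefficient. The exact ICC model used is not stated here, so the sketch below implements one common choice, ICC(2,1) from Shrout and Fleiss (two-way random effects, absolute agreement, single rater), on invented toy scores.

```python
import numpy as np

def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater
    (Shrout & Fleiss, 1979). ratings: (n_subjects, n_raters) array."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    col_means = ratings.mean(axis=0)
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Toy example: 10 patients scored independently by 3 nurses (hypothetical values).
rng = np.random.default_rng(1)
true_load = rng.uniform(30, 90, size=(10, 1))          # "true" workload per patient
scores = true_load + rng.normal(0, 3, size=(10, 3))    # three raters with noise
print(round(icc2_1(scores), 2))
```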
Can outcome of pancreatic pseudocysts be predicted? Proposal for a new scoring system.
Şenol, Kazım; Akgül, Özgür; Gündoğdu, Salih Burak; Aydoğan, İhsan; Tez, Mesut; Coşkun, Faruk; Tihan, Deniz Necdet
2016-03-01
The spontaneous resolution rate of pancreatic pseudocysts (PPs) is 86%, and the serious complication rate is 3-9%. The aim of the present study was to develop a scoring system that would predict spontaneous resolution of PPs. Medical records of 70 patients were retrospectively reviewed. Two patients were excluded. Demographic data and laboratory measurements were obtained from patient records. Mean age of the 68 patients included was 56.6 years. Female:male ratio was 1.34:1. Causes of pancreatitis were stones (48.5%), alcohol consumption (26.5%), and unknown etiology (25%). Mean size of PP was 71 mm. Pseudocysts disappeared in 32 patients (47.1%). With univariate analysis, serum direct bilirubin level (>0.95 mg/dL), cyst carcinoembryonic antigen (CEA) level (>1.5), and cyst diameter (>55 mm) were found to be significantly different between patients with and without spontaneous resolution. In multivariate analysis, these variables were statistically significant. Scores were calculated with points assigned to each variable. Final scores predicted spontaneous resolution in approximately 80% of patients. The scoring system developed to predict resolution of PPs is simple and useful, but requires validation.
NASA Astrophysics Data System (ADS)
Beck, H.; Vergopolan, N.; Pan, M.; Levizzani, V.; van Dijk, A.; Weedon, G. P.; Brocca, L.; Huffman, G. J.; Wood, E. F.; William, L.
2017-12-01
We undertook a comprehensive evaluation of 22 gridded (quasi-)global (sub-)daily precipitation (P) datasets for the period 2000-2016. Twelve non-gauge-corrected P datasets were evaluated using daily P gauge observations from 76,086 gauges worldwide. Another ten gauge-corrected ones were evaluated using hydrological modeling, by calibrating the conceptual model HBV against streamflow records for each of 9053 small to medium-sized (<50,000 km2) catchments worldwide, and comparing the resulting performance. Marked differences in spatio-temporal patterns and accuracy were found among the datasets. Among the uncorrected P datasets, the satellite- and reanalysis-based MSWEP-ng V1.2 and V2.0 datasets generally showed the best temporal correlations with the gauge observations, followed by the reanalyses (ERA-Interim, JRA-55, and NCEP-CFSR), the estimates based primarily on passive microwave remote sensing of rainfall (CMORPH V1.0, GSMaP V5/6, and TMPA 3B42RT V7) or near-surface soil moisture (SM2RAIN-ASCAT), and finally, estimates based primarily on thermal infrared imagery (GridSat V1.0, PERSIANN, and PERSIANN-CCS). Two of the three reanalyses (ERA-Interim and JRA-55) unexpectedly obtained lower trend errors than the satellite datasets. Among the corrected P datasets, the ones directly incorporating daily gauge data (CPC Unified and MSWEP V1.2 and V2.0) generally provided the best calibration scores, although the good performance of the fully gauge-based CPC Unified is unlikely to translate to sparsely or ungauged regions. Next best results were obtained with P estimates directly incorporating temporally coarser gauge data (CHIRPS V2.0, GPCP-1DD V1.2, TMPA 3B42 V7, and WFDEI-CRU), which in turn outperformed those indirectly incorporating gauge data through other multi-source datasets (PERSIANN-CDR V1R1 and PGF). Our results highlight large differences in estimation accuracy, and hence, the importance of P dataset selection in both research and operational applications. The good performance of MSWEP emphasizes that careful data merging can exploit the complementary strengths of gauge-, satellite- and reanalysis-based P estimates.
Novy, Diane M; Engle, Mitchell P; Lai, Emily A; Cook, Christina; Martin, Emily C; Trahan, Lisa; Yu, Jun; Koyyalagunta, Dhanalakshmi
2016-07-01
The effectiveness of splanchnic nerve neurolysis (SNN) for cancer-related abdominal pain has been investigated using numeric pain intensity rating as an outcome variable. The outcome variable in this study used the grid method for obtaining a targeted pain drawing score on 60 patients with pain from pancreatic or gastro-intestinal primary cancers or metastatic disease to the abdominal region. Results demonstrate excellent inter-rater agreement (intra-class correlation [ICC] coefficient at pre-SNN = 0.97 and ICC at within one month post-SNN = 0.98) for the grid method of scoring the pain drawing and demonstrate psychometric generalizability among patients with cancer-related pain. Using the Wilcoxon signed rank test and associated effect sizes, results show significant improvement in dispersion of pain following SNN. Effect sizes for the difference in pre-SNN to 2 post-SNN time points were higher for the pain drawing than for pain intensity rating. Specifically, the effect size difference from pre- to within one month post-SNN was r = 0.42 for pain drawing versus r = 0.23 for pain intensity rating. Based on a smaller subset of patients who were seen within 1 - 6 months following SNN, the effect size difference from pre-SNN was r = 0.46 for pain drawing versus r = 0.00 for pain intensity rating. Collectively, these data support the use of the pain drawing as a reliable outcome measure among patients with cancer pain for procedures such as SNN that target specific location and dispersion of pain.
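The effect sizes quoted above follow the common convention r = z / sqrt(N) for the Wilcoxon signed-rank test. The sketch below shows that calculation on invented paired pain-drawing scores; recovering z from the two-sided p-value of the normal approximation is one standard shortcut, not necessarily the exact computation used in the study.

```python
import numpy as np
from scipy import stats

# Hypothetical paired pain-drawing grid scores before and after SNN.
rng = np.random.default_rng(2)
pre = rng.poisson(30, 60).astype(float)                 # grid cells marked pre-procedure
post = np.clip(pre - rng.poisson(8, 60), 0, None)       # fewer cells marked afterwards

stat, p = stats.wilcoxon(pre, post)                     # paired, two-sided by default
z = stats.norm.isf(p / 2.0)                             # z implied by the two-sided p-value
r = z / np.sqrt(len(pre))                               # effect size r = z / sqrt(N)
print(f"W={stat:.0f}, p={p:.4g}, r={r:.2f}")
```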
Experience in Grid Site Testing for ATLAS, CMS and LHCb with HammerCloud
NASA Astrophysics Data System (ADS)
Elmsheuser, Johannes; Medrano Llamas, Ramón; Legger, Federica; Sciabà, Andrea; Sciacca, Gianfranco; Úbeda García, Mario; van der Ster, Daniel
2012-12-01
Frequent validation and stress testing of the network, storage and CPU resources of a grid site is essential to achieve high performance and reliability. HammerCloud was previously introduced with the goals of enabling VO- and site-administrators to run such tests in an automated or on-demand manner. The ATLAS, CMS and LHCb experiments have all developed VO plugins for the service and have successfully integrated it into their grid operations infrastructures. This work will present the experience in running HammerCloud at full scale for more than 3 years and present solutions to the scalability issues faced by the service. First, we will show the particular challenges faced when integrating with CMS and LHCb offline computing, including customized dashboards to show site validation reports for the VOs and a new API to tightly integrate with the LHCbDIRAC Resource Status System. Next, a study of the automatic site exclusion component used by ATLAS will be presented along with results for tuning the exclusion policies. A study of the historical test results for ATLAS, CMS and LHCb will be presented, including comparisons between the experiments’ grid availabilities and a search for site-based or temporal failure correlations. Finally, we will look to future plans that will allow users to gain new insights into the test results; these include developments to allow increased testing concurrency, increased scale in the number of metrics recorded per test job (up to hundreds), and increased scale in the historical job information (up to many millions of jobs per VO).
Acoustic wave simulation using an overset grid for the global monitoring system
NASA Astrophysics Data System (ADS)
Kushida, N.; Le Bras, R.
2017-12-01
The International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been monitoring hydro-acoustic and infrasound waves over the globe. Because of the complex nature of the oceans and the atmosphere, computer simulation can play an important role in understanding the observed signals. In this regard, methods that depend on partial differential equations and require minimal modelling are preferable. So far, to the best of our knowledge, acoustic wave propagation simulations based on partial differential equations on such a large scale have not been performed (pp 147-161 of ref [1], [2]). The main difficulties in building such simulation codes are: (1) accounting for the inhomogeneity of the medium, including background flows, (2) the high aspect ratio of the computational domain, and (3) stability during long time integration. To overcome these difficulties, we employ a two-dimensional finite-difference (FDM) scheme in spherical coordinates with the Yin-Yang overset grid [3], solving the governing equations of acoustic waves introduced by Ostashev et al. [4]. A comparison with real hydro-acoustic recordings will be presented at the conference. [1] Paul C. Etter: Underwater Acoustic Modeling and Simulation, Fourth Edition, CRC Press, 2013. [2] Lian Wang et al.: Review of Underwater Acoustic Propagation Models, NPL Report AC 12, 2014. [3] A. Kageyama and T. Sato: "Yin-Yang grid": An overset grid in spherical geometry, Geochem. Geophys. Geosyst., 5, Q09005, 2004. [4] Vladimir E. Ostashev et al.: Equations for finite-difference, time-domain simulation of sound propagation in moving inhomogeneous media and numerical implementation, Acoustical Society of America. DOI: 10.1121/1.1841531, 2005.
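To make the kind of time-domain solver described above concrete, here is a heavily simplified sketch: a 2-D staggered-grid pressure/velocity finite-difference update on a plain Cartesian grid, with a constant medium, no background flow, no absorbing boundaries, and no Yin-Yang spherical overset grid. All parameters are illustrative assumptions, not the CTBTO implementation.

```python
import numpy as np

# Minimal 2-D acoustic finite-difference sketch (Cartesian, first-order
# pressure/velocity formulation); a toy analogue of the simulations described.
nx, nz, dx, dt, nsteps = 200, 200, 10.0, 1e-3, 500
c = np.full((nx, nz), 1500.0)           # sound speed (m/s), e.g. sea water
rho = 1000.0                            # density (kg/m^3)
p = np.zeros((nx, nz))                  # pressure field
vx = np.zeros((nx + 1, nz))             # staggered particle velocities
vz = np.zeros((nx, nz + 1))
p[nx // 2, nz // 2] = 1.0               # impulsive point source

for _ in range(nsteps):                 # CFL = c*dt/dx = 0.15, stable
    # velocity update from the pressure gradient
    vx[1:-1, :] -= dt / (rho * dx) * (p[1:, :] - p[:-1, :])
    vz[:, 1:-1] -= dt / (rho * dx) * (p[:, 1:] - p[:, :-1])
    # pressure update from the velocity divergence
    p -= dt * rho * c**2 / dx * (vx[1:, :] - vx[:-1, :] + vz[:, 1:] - vz[:, :-1])

print(p.max(), p.min())
```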
Use of rectangular grid miniplates for fracture fixation at the mandibular angle.
Hochuli-Vieira, Eduardo; Ha, Thi Khanh Linh; Pereira-Filho, Valfrido Antonio; Landes, Constantin Alexander
2011-05-01
The aim of this study was to evaluate the clinical outcome of patients with mandibular angle fractures treated by intraoral access and a rectangular grid miniplate with 4 holes and stabilized with monocortical screws. This study included 45 patients with mandibular angle fractures from the Department of Oral and Maxillofacial Surgery São Paulo State University, Araraquara, Brazil, and from the Clinic of Oral and Maxillofacial Surgery at the University of Frankfurt, Germany. The 45 fractures of the mandibular angle were treated with a rectangular grid miniplate of a 2.0-mm system by an intraoral approach with monocortical screws. Clinical evaluations were postoperatively performed at 15 and 30 days and 3 and 6 months, and the complications encountered were recorded and treated. The infection rate was 4.44% (2 patients), and in 1 patient it was necessary to replace hardware. This patient also had a fracture of the left mandibular body; 3 patients (6.66%) had minor occlusal changes that have been resolved with small occlusal adjustments. Before surgery, 15 patients (33.33%) presented with hypoesthesia of the inferior alveolar nerve; 4 (8.88%) had this change until the last clinical control, at 6 months. The rectangular grid miniplate used in this study was stable for the treatment of simple mandibular angle fractures through intraoral access, with low complication rates, easy handling, and easy adjustment, with a low cost. Concomitant mandibular fracture may increase the rate of complications. This plate should be indicated in fractures with sufficient interfragmentary contact. Copyright © 2011 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Physiological scoring: an aid to emergency medical services transport decisions?
Challen, Kirsty; Walter, Darren
2010-01-01
Attendance at UK emergency departments is rising steadily despite the proliferation of alternative unscheduled care providers. Evidence is mixed on the willingness of emergency medical services (EMS) providers to decline to transport patients and the safety of incorporating such an option into EMS provision. Physiologically based Early Warning Scores are in use in many hospitals and emergency departments, but have not yet been proven to be of benefit in the prehospital arena. The use of a physiological-social scoring system could safely identify patients calling EMS who might be diverted from the emergency department to an alternative, unscheduled care provider. This was a retrospective cohort study of patients with a presenting complaint of "shortness of breath" or "difficulty breathing" transported to the emergency department by EMS. Retrospective calculation of a physiological social score (PMEWS) based on first recorded data from EMS records was performed. The outcome measures were hospital admission and the need for physiologically stabilizing treatment in the emergency department. A total of 215 records were analyzed. One hundred thirty-nine (65%) patients were admitted from the emergency department or received physiologically stabilizing treatment in the emergency department. The Area Under the Receiver Operating Characteristic Curve (AUROC) for hospital admission was 0.697, and for admission or physiologically stabilizing treatment it was 0.710. No patient scoring <2 was admitted or received stabilizing treatment. Despite significant over-triage, this system could have safely diverted 79 patients from the emergency department to alternative, unscheduled care providers.
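The discrimination statistics above (AUROC, and the "no admissions below a score of 2" check) can be reproduced mechanically once a score and a binary outcome are available. The sketch below shows that calculation on invented data; the PMEWS items themselves are not reproduced, and the simulated scores and outcomes are assumptions.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical PMEWS-style data: one prehospital score per patient plus a binary
# outcome (admitted or given physiologically stabilising treatment in the ED).
rng = np.random.default_rng(3)
scores = rng.integers(0, 10, size=215)
outcome = rng.binomial(1, p=np.clip(scores / 10, 0.05, 0.95))

auroc = roc_auc_score(outcome, scores)                 # discrimination of the score
safe_threshold = 2
missed = outcome[scores < safe_threshold].sum()        # adverse outcomes below the cut-off
print(f"AUROC = {auroc:.3f}, admissions missed below {safe_threshold}: {missed}")
```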
Sundus, Ayesha; Haider, Mohammad Nadir; Ibrahim, Mohammad Faisal; Younus, Nida; Farooqui, Mohammad Talha; Iftikhar, Fatiha; Siddique, Osama; Aziz, Sina
2014-02-01
To compare the expected (perceptions of their environment at the beginning of their 1st year) versus actual perceptions (perceptions at the end of 1st year) of 1st year students at Dow University of Health Sciences. The 'expected' perceptions of the students were recorded at the beginning of their 1st year (n = 411) of medical education, when they entered the medical school, using the Dundee Ready Educational Environment Measure (DREEM). DREEM is a validated, self-administered inventory which focuses on learning, teachers, self-confidence, and the academic as well as social environment. The 'actual' perceptions were then recorded at the end of their first year (n = 405) of education, when they had received adequate exposure to their environment. The two records were then compared. The total expected DREEM score was 118/200 and the total actual DREEM score was 113/200. The expected domain (students' perceptions of learning, students' perceptions of teachers, students' academic self-perceptions, students' perceptions of atmosphere, and students' social self-perceptions) scores were 28/48, 26/44, 20/32, 28/48, and 16/28. The actual domain scores were 27/48, 23/44, 19/32, 27/48, and 16/28. However, both the actual and expected scores indicated a satisfactory learning environment. Significant differences (p < 0.0001) were found between the two samples. In general, the results showed that the students perceived the environment positively, but the significant difference between the two samples demonstrated that their expectations were not met.
Relationship between team assists and win-loss record in The National Basketball Association.
Melnick, M J
2001-04-01
Using research methodology for analysis of secondary data, statistical data for five National Basketball Association (NBA) seasons (1993-1994 to 1997-1998) were examined to test for a relationship between team assists (a behavioral measure of teamwork) and win-loss record. Rank-difference correlation indicated a significant relationship between the two variables, the coefficients ranging from .42 to .71. Team assist totals produced higher correlations with win-loss record than assist totals for the five players receiving the most playing time ("the starters"). A comparison of "assisted team points" and "unassisted team points" in relationship to win-loss record favored the former and strongly suggested that how a basketball team scores points is more important than the number of points it scores. These findings provide circumstantial support for the popular dictum in competitive team sports that "Teamwork Means Success-Work Together, Win Together."
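Rank-difference correlation is the Spearman rank correlation, which is straightforward to compute once season-level assists and wins are tabulated. The sketch below uses invented team totals purely to show the calculation; the values are not NBA data.

```python
from scipy import stats

# Hypothetical season-level totals for a 29-team league (values invented).
assists = [2210, 1985, 2340, 2050, 1890, 2120, 2290, 1960, 2180, 2015,
           2400, 1875, 2255, 2090, 1940, 2310, 2030, 2155, 1995, 2270,
           1920, 2360, 2075, 2140, 1905, 2235, 2060, 2195, 1970]
wins    = [  55,   38,   60,   44,   31,   47,   57,   36,   50,   42,
             62,   29,   53,   45,   34,   58,   41,   48,   37,   54,
             33,   61,   43,   46,   30,   52,   40,   49,   35]

rho, p = stats.spearmanr(assists, wins)   # rank-difference (Spearman) correlation
print(f"rho = {rho:.2f}, p = {p:.4f}")
```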
Patient-reported allergies cause inferior outcomes after total knee arthroplasty.
Hinarejos, Pedro; Ferrer, Tulia; Leal, Joan; Torres-Claramunt, Raul; Sánchez-Soler, Juan; Monllau, Joan Carles
2016-10-01
The main objective of this study was to analyse the outcomes after total knee arthroplasty (TKA) of a group of patients with at least one self-reported allergy and a group of patients without reported allergies. We hypothesized there is a significant negative influence on clinical outcome scores after TKA in patients with self-reported allergies. Four-hundred and seventy-five patients who had undergone TKA were analysed preoperatively and 1 year after surgery. The WOMAC, KSS and SF-36 scores were obtained. The patients' Yesavage depression questionnaire score was also recorded. The scores of the 330 (69.5 %) patients without self-reported allergies were compared to the scores of the 145 (30.5 %) patients with at least one self-reported allergy in the medical record. Preoperative scores were similar in both groups. The WOMAC post-operative scores (23.6 vs 20.4; p = 0.037) and the KSS-Knee score (91.1 vs 87.6; p = 0.027) were worse in the group of patients with self-reported allergies than in the group without allergies. The scores from the Yesavage depression questionnaire and in the SF-36 were similar in both groups. Patients with at least one self-reported allergy have worse post-operative outcomes in terms of the WOMAC and KSS-Knee scores after TKA than patients without allergies. These poor outcomes do not seem to be related to depression. Therefore, more research is needed to explain them. Reported allergies could be considered a prognostic factor and used when counselling TKA patients. I.
Gomei, Sayaka; Hitosugi, Masahito; Ikegami, Keiichi; Tokudome, Shogo
2013-10-01
The objective of this study was to clarify the relationship between injury severity in bicyclists involved in traffic accidents and patient outcome or type of vehicle involved in order to propose effective measures to prevent fatal bicycle injuries. Hospital records were reviewed for all patients from 2007 to 2010 who had been involved in a traffic accident while riding a bicycle and were subsequently transferred to the Shock Trauma Center of Dokkyo Medical University Koshigaya Hospital. Patient outcomes and type of vehicle that caused the injury were examined. The mechanism of injury, Abbreviated Injury Scale (AIS) score, and Injury Severity Score (ISS) of the patient were determined. A total of 115 patients' records were reviewed. The mean patient age was 47.1 ± 27.4 years. The average ISS was 23.9, with an average maximum AIS (MAIS) score of 3.7. The ISS, MAIS score, head AIS score, and chest AIS score were well correlated with patient outcome. The head AIS score was significantly higher in patients who had died (mean of 4.4); however, the ISS, MAIS score, and head AIS score did not differ significantly according to the type of vehicle involved in the accident. The mean head AIS scores were as high as 2.4 or more for accidents involving any type of vehicle. This study provides useful information for forensic pathologists who suspect head injuries in bicyclists involved in traffic accidents. To effectively reduce bicyclist fatalities from traffic accidents, helmet use should be required for all bicyclists.
The e-CRABEL score: an updated method for auditing medical records.
Myuran, Tharsika; Turner, Oliver; Ben Doostdar, Bijan; Lovett, Bryony
2017-01-01
In 2001 the CRABEL score was devised in order to obtain a numerical score of the standard of medical note keeping. With the advent of electronic discharge letters, many components of the CRABEL score are now redundant as computers automatically include some documentation. The CRABEL score was modified to form the e-CRABEL score. "Patient details on discharge letter" and "Admission and discharge dates on discharge letter" were replaced with "Summary of investigations on discharge letter" and "Documentation of VTE prophylaxis on the drug chart". The new e-CRABEL score has been used as a monthly audit tool in a busy surgical unit to monitor long-term standards of medical note keeping, with interventions of presenting in the departmental audit meeting, and giving a teaching session to a group of junior doctors at two points. Following discussion with stakeholders: junior doctors, consultants, and the audit department; it was decided that the e-CRABEL tool was sufficiently compact to be completed on a monthly basis. Critique and interventions included using photographic examples, case note selection and clarification of the e-CRABEL criteria in a teaching session. Tools used for audit need to be updated in order to accurately represent what they measure, hence the modification of the CRABEL score to make the new e-CRABEL score. Preliminary acquisition and presentation of data using the e-CRABEL score has shown promise in improving the quality of medical record keeping. The tool is sufficiently compact as to conduct on a monthly basis, maintaining standards to a high level and also provides data on VTE documentation.
Atorvastatin in the management of tinnitus with hyperlipidemias.
Hameed, Mirza Khizer; Sheikh, Zeeshan Ayub; Ahmed, Azeema; Najam, Atif
2014-12-01
To determine the role of atorvastatin in the management of tinnitus in patients with hyperlipidemia. Quasi-experimental study. ENT Department, Combined Military Hospital, Rawalpindi, from July 2011 to August 2012. Ninety-eight patients with tinnitus and sensorineural hearing loss who had hyperlipidemia were included in the study. Their pre-therapy serum cholesterol levels were measured, and tinnitus scores were recorded on a 'Tinnitus Handicap Questionnaire'. They were administered atorvastatin 40 mg once daily with a low-fat diet for 8 months. After 8 months of therapy, patients were divided into responsive and unresponsive groups depending on serum cholesterol levels. Post-therapy serum cholesterol levels and tinnitus scores were also recorded after 8 months and compared with pre-therapy records. Serum cholesterol came to within normal limits in 51 (52%) patients (responsive group), while it remained high in 47 (48%) patients (unresponsive group). Improvement in tinnitus score in the responsive group was seen in 36 (70.5%) patients and in 2 (4.2%) patients of the unresponsive group. Improvement in tinnitus scores was compared between the two groups using Fisher's exact test and was found to be significantly greater in the responsive group (p < 0.001). Tinnitus in patients with hyperlipidemia can be successfully managed by treating the hyperlipidemia with the lipid-lowering agent atorvastatin.
NASA Technical Reports Server (NTRS)
Lansdowne, Chatwin; Schlensinger, Adam
2011-01-01
Sinusoidal jitter is produced by simply modulating a clock frequency sinusoidally with a given frequency and amplitude. This jitter can be expressed as phase jitter, frequency jitter, or cycle-to-cycle jitter, rms or peak, in absolute units, or normalized to the base clock frequency. Jitter using other waveforms requires calculating and downloading these waveforms to an arbitrary waveform generator, and helping the user manage relationships among phase jitter crest factor, frequency jitter crest factor, and cycle-to-cycle jitter (CCJ) crest factor. Software was developed for managing these relationships, automatically configuring the generator, and saving test results documentation. Tighter management of clock jitter and jitter sensitivity is required by new codes that further extend the already high performance of space communication links, completely correcting symbol error rates higher than 10 percent, and therefore typically requiring demodulation and symbol synchronization hardware to operate at signal-to-noise ratios of less than one. To accomplish this, greater demands are also made on transmitter performance, and measurement techniques are needed to confirm performance. It was discovered early that sinusoidal jitter can be stepped on a grid such that one can connect points by constant phase jitter, constant frequency jitter, or constant cycle-to-cycle jitter. The tool automates adherence to a grid while also allowing adjustments off-grid. Also, the jitter can be set by the user on any dimension and the others are calculated. The calculations are all recorded, allowing the data to be rapidly plotted or re-plotted against different interpretations just by changing pointers to columns. A key advantage is taking data on a carefully controlled grid, which allowed a single data set to be post-analyzed many different ways. Another innovation was building a software tool to provide very tight coupling between the generator and the recorded data product, and the operator's worksheet. Together, these allowed the operator to sweep the jitter stimulus quickly along any of three dimensions and focus on the response of the system under test (response was jitter transfer ratio, or performance degradation to the symbol or codeword error rate). Additionally, managing multi-tone and noise waveforms automated a tedious manual process, and provided almost instantaneous decision-making control over test flow. The code was written in LabVIEW, and calls Agilent instrument drivers to write to the generator hardware.
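The relationships the tool manages (phase jitter versus period and cycle-to-cycle jitter for a sinusoidally modulated clock) can be illustrated numerically. The sketch below is not the LabVIEW tool; it simply generates jittered clock edges and derives the other jitter measures from them, with arbitrary example frequencies and amplitude.

```python
import numpy as np

# Toy illustration: a clock whose phase is modulated sinusoidally, from which
# period jitter and cycle-to-cycle jitter are derived numerically.
f0 = 10e6            # base clock frequency (Hz), arbitrary
fm = 50e3            # jitter (modulation) frequency (Hz), arbitrary
A  = 0.02            # peak phase jitter in unit intervals (UI), arbitrary

n = np.arange(20000)                                     # edge indices
t_ideal = n / f0
t_edges = t_ideal + (A / f0) * np.sin(2 * np.pi * fm * t_ideal)   # jittered edge times

periods = np.diff(t_edges)                               # instantaneous periods
period_jitter = periods - 1 / f0                         # deviation from nominal period
ccj = np.diff(periods)                                   # cycle-to-cycle jitter

print("peak phase jitter   :", A, "UI")
print("peak period jitter  :", np.abs(period_jitter).max(), "s")
print("peak cycle-to-cycle :", np.abs(ccj).max(), "s")
print("rms cycle-to-cycle  :", ccj.std(), "s")
```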
Installation Restoration Program. Phase I. Records Search, Brooks AFB, Texas
1985-03-01
decay of the cadavers occurred. The waste was packaged in plastic bags, placed in seven 55-gallon drums and buried in a hole 7 to 8 feet deep. The drums...Receptors subscore (I x factor score subtotal/maximum score subtotal) 44 - II. WASTE CHARACTERISTICS A. Select the factor score based on the estimated quantity, the degree of hazard, and the confidence level of the
ERIC Educational Resources Information Center
Kadane, Joseph B.; And Others
This paper offers a preliminary analysis of the effects of a semi-segregated school system on the IQs of its students. The basic data consist of IQ scores for fourth, sixth, and eighth grades and associated environmental data obtained from their school records. A statistical model is developed to analyze longitudinal data when both process error…
NASA Astrophysics Data System (ADS)
Contractor, S.; Donat, M.; Alexander, L. V.
2017-12-01
Reliable observations of precipitation are necessary to determine past changes in precipitation and validate models, allowing for reliable future projections. Existing gauge-based gridded datasets of daily precipitation and satellite-based observations contain artefacts and have a short length of record, making them unsuitable for analysing precipitation extremes. The largest limiting factor for the gauge-based datasets is the availability of a dense and reliable station network. Currently, there are two major archives of global in situ daily rainfall data: the Global Historical Climatology Network (GHCN-Daily), hosted by the National Oceanic and Atmospheric Administration (NOAA), and the archive of the Global Precipitation Climatology Centre (GPCC), part of the Deutscher Wetterdienst (DWD). We combine the two data archives and use automated quality control techniques to create a reliable long-term network of raw station data, which we then interpolate using block kriging to create a global gridded dataset of daily precipitation going back to 1950. We compare our interpolated dataset with existing global gridded data of daily precipitation: NOAA Climate Prediction Center (CPC) Global V1.0 and GPCC Full Data Daily Version 1.0, as well as various regional datasets. We find that our raw station density is much higher than that of other datasets. To avoid artefacts due to station network variability, we provide multiple versions of our dataset based on various completeness criteria, as well as provide the standard deviation, kriging error and number of stations for each grid cell and timestep to encourage responsible use of our dataset. Despite our efforts to increase the raw data density, the in situ station network remains sparse in India after the 1960s and in Africa throughout the timespan of the dataset. Our dataset would allow for more reliable global analyses of rainfall including its extremes and pave the way for better global precipitation observations with lower and more transparent uncertainties.
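The interpolation step above uses block kriging of daily gauge totals. The sketch below shows the core idea with plain point ordinary kriging on a tiny invented gauge network; the exponential variogram, its parameters, the toy data, and the omission of block averaging are all simplifying assumptions rather than the dataset's actual configuration.

```python
import numpy as np

# Simplified point ordinary kriging of "gauge" values onto a regular grid.
def variogram(h, nugget=0.1, sill=1.0, corr_len=2.0):
    gamma = nugget + (sill - nugget) * (1.0 - np.exp(-h / corr_len))
    return np.where(h == 0.0, 0.0, gamma)               # semivariance is 0 at zero lag

def ordinary_kriging(xy_obs, z_obs, xy_tgt):
    n = len(xy_obs)
    d_oo = np.linalg.norm(xy_obs[:, None] - xy_obs[None, :], axis=-1)
    A = np.ones((n + 1, n + 1))                          # kriging system with unbiasedness row
    A[:n, :n] = variogram(d_oo)
    A[n, n] = 0.0
    out = np.empty(len(xy_tgt))
    for i, p in enumerate(xy_tgt):
        b = np.append(variogram(np.linalg.norm(xy_obs - p, axis=-1)), 1.0)
        w = np.linalg.solve(A, b)[:n]                    # kriging weights (sum to 1)
        out[i] = w @ z_obs
    return out

# Toy example: 30 gauges with random positions and rainfall, gridded to 10 x 10.
rng = np.random.default_rng(4)
xy_obs = rng.uniform(0, 10, (30, 2))
z_obs = rng.gamma(2.0, 3.0, 30)                          # fake daily totals (mm)
gx, gy = np.meshgrid(np.linspace(0, 10, 10), np.linspace(0, 10, 10))
grid = ordinary_kriging(xy_obs, z_obs, np.column_stack([gx.ravel(), gy.ravel()]))
print(grid.reshape(10, 10).round(1))
```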
Earth Observations taken by the Expedition 35 Crew
2013-03-16
ISS035-E-005438 (16 March 2013) --- One of the Expedition 35 crew members on the International Space Station used a still camera with a 400 millimeter lens to record this nocturnal image of the Phoenix, Arizona area. Like many large urban areas of the central and western United States, the Phoenix metropolitan area is laid out along a regular grid of city blocks and streets. While visible during the day, this grid is most evident at night, when the pattern of street lighting is clearly visible from above – in the case of this photograph, from the low Earth orbit vantage point of the International Space Station. The urban grid form encourages growth of a city outwards along its borders, by providing optimal access to new real estate. Fueled by the adoption of widespread personal automobile use during the 20th century, the Phoenix metropolitan area today includes 25 other municipalities (many of them largely suburban and residential in character) linked by a network of surface streets and freeways. The image area includes parts of several cities in the metropolitan area including Phoenix proper (right), Glendale (center), and Peoria (left). While the major street grid is oriented north-south, the northwest-southeast oriented Grand Avenue cuts across it at image center. Grand Avenue is a major transportation corridor through the western metropolitan area; the lighting patterns of large industrial and commercial properties are visible along its length. Other brightly lit properties include large shopping centers, strip centers, and gas stations which tend to be located at the intersections of north-south and east-west trending streets. While much of the land area highlighted in this image is urbanized, there are several noticeably dark areas. The Phoenix Mountains at upper right are largely public park and recreational land. To the west (image lower left), agricultural fields provide a sharp contrast to the lit streets of neighboring residential developments. The Salt River channel appears as a dark ribbon within the urban grid at lower right.
Optimum Image Formation for Spaceborne Microwave Radiometer Products.
Long, David G; Brodzik, Mary J
2016-05-01
This paper considers some of the issues of radiometer brightness image formation and reconstruction for use in the NASA-sponsored Calibrated Passive Microwave Daily Equal-Area Scalable Earth Grid 2.0 Brightness Temperature Earth System Data Record project, which generates a multisensor multidecadal time series of high-resolution radiometer products designed to support climate studies. Two primary reconstruction algorithms are considered: the Backus-Gilbert approach and the radiometer form of the scatterometer image reconstruction (SIR) algorithm. These are compared with the conventional drop-in-the-bucket (DIB) gridded image formation approach. Tradeoff study results for the various algorithm options are presented to select optimum values for the grid resolution, the number of SIR iterations, and the BG gamma parameter. We find that although both approaches are effective in improving the spatial resolution of the surface brightness temperature estimates compared to DIB, SIR requires significantly less computation. The sensitivity of the reconstruction to the accuracy of the measurement spatial response function (MRF) is explored. The partial reconstruction of the methods can tolerate errors in the description of the sensor measurement response function, which simplifies the processing of historic sensor data for which the MRF is not known as well as it is for modern sensors. Simulation tradeoff results are confirmed using actual data.
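The conventional drop-in-the-bucket baseline mentioned above is simple: each measurement is assigned to the grid cell containing its geolocation and each cell reports the mean of the measurements it received. The sketch below illustrates that step only; the grid geometry and sample data are invented and do not follow the EASE-Grid 2.0 definition, and the BG and SIR reconstructions are not implemented here.

```python
import numpy as np

# Drop-in-the-bucket (DIB) gridding of brightness temperatures (toy example).
def dib_grid(lon, lat, tb, lon_edges, lat_edges):
    sums, _, _ = np.histogram2d(lon, lat, bins=[lon_edges, lat_edges], weights=tb)
    counts, _, _ = np.histogram2d(lon, lat, bins=[lon_edges, lat_edges])
    with np.errstate(invalid="ignore"):
        return np.where(counts > 0, sums / counts, np.nan)   # cell mean, NaN if empty

rng = np.random.default_rng(5)
lon = rng.uniform(-10, 10, 5000)
lat = rng.uniform(40, 60, 5000)
tb = 250 + 10 * np.sin(np.radians(lat)) + rng.normal(0, 2, 5000)   # fake Tb (K)
grid = dib_grid(lon, lat, tb, np.linspace(-10, 10, 21), np.linspace(40, 60, 21))
print(np.nanmean(grid))
```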
Improving general practitioner clinical records with a quality assurance minimal intervention.
Del Mar, C B; Lowe, J B; Adkins, P; Arnold, E; Baade, P
1998-01-01
BACKGROUND: Although good medical records have been associated with good care, there is considerable room for their improvement in general practice. AIM: To improve the quality of general practice medical records at minimal cost. METHOD: A total of 150 randomly sampled general practitioners (GPs) in suburban Brisbane, Australia, were randomized in a controlled trial to receive or not receive an intervention. The intervention consisted of 6 to 12 one-hour monthly meetings when the pairs of GPs assessed samples of each other's medical records using a 12-item instrument. This was developed previously by a process of consensus of general practice teachers. Mean scores of 10 medical records selected at random from before the intervention started and one year later were compared. RESULTS: After the intervention, the increase in the total score (for which the maximum possible was 18) for the intervention GPs (from a baseline of 11.5 to 12.3) was not significantly greater than for the controls (from 11.4 to 11.7). Legibility and being able to determine the doctor's assessment of the consultation were significantly improved. The post-intervention increase of 1.06 (9.3%) of the total scores of the 47% of intervention GPs who complied with the intervention was significantly greater than that for the controls. CONCLUSION: The quality assurance activity improved some components of the quality of GPs' clinical records. However, the improvement was small, and the search for activities for Australian GPs that demonstrate an improvement in the quality of their practice must continue. PMID:9747547
Christofidis, Melany J; Hill, Andrew; Horswill, Mark S; Watson, Marcus O
2016-01-01
To systematically evaluate the impact of several design features on chart-users' detection of patient deterioration on observation charts with early-warning scoring-systems. Research has shown that observation chart design affects the speed and accuracy with which abnormal observations are detected. However, little is known about the contribution of individual design features to these effects. A 2 × 2 × 2 × 2 mixed factorial design, with data-recording format (drawn dots vs. written numbers), scoring-system integration (integrated colour-based system vs. non-integrated tabular system) and scoring-row placement (grouped vs. separate) varied within-participants and scores (present vs. absent) varied between-participants by random assignment. 205 novice chart-users, tested between March 2011-March 2014, completed 64 trials where they saw real patient data presented on an observation chart. Each participant saw eight cases (four containing abnormal observations) on each of eight designs (which represented a factorial combination of the within-participants variables). On each trial, they assessed whether any of the observations were physiologically abnormal, or whether all observations were normal. Response times and error rates were recorded for each design. Participants responded faster (scores present and absent) and made fewer errors (scores absent) using drawn-dot (vs. written-number) observations and an integrated colour-based (vs. non-integrated tabular) scoring-system. Participants responded faster using grouped (vs. separate) scoring-rows when scores were absent, but separate scoring-rows when scores were present. Our findings suggest that several individual design features can affect novice chart-users' ability to detect patient deterioration. More broadly, the study further demonstrates the need to evaluate chart designs empirically. © 2015 John Wiley & Sons Ltd.
Concurrent Tumor Segmentation and Registration with Uncertainty-based Sparse non-Uniform Graphs
Parisot, Sarah; Wells, William; Chemouny, Stéphane; Duffau, Hugues; Paragios, Nikos
2014-01-01
In this paper, we present a graph-based concurrent brain tumor segmentation and atlas-to-diseased-patient registration framework. Both segmentation and registration problems are modeled using a unified pairwise discrete Markov Random Field model on a sparse grid superimposed on the image domain. Segmentation is addressed based on pattern classification techniques, while registration is performed by maximizing the similarity between volumes and is modular with respect to the matching criterion. The two problems are coupled by relaxing the registration term in the tumor area, corresponding to areas of high classification score and high dissimilarity between volumes. In order to overcome the main shortcomings of discrete approaches regarding appropriate sampling of the solution space as well as important memory requirements, content-driven samplings of the discrete displacement set and the sparse grid are considered, based on the local segmentation and registration uncertainties recovered by the min-marginal energies. State-of-the-art results on a substantial low-grade glioma database demonstrate the potential of our method, while our proposed approach shows maintained performance and strongly reduced complexity of the model. PMID:24717540
Wilde, M C; Boake, C; Sherer, M
2000-01-01
Final broken configuration errors on the Wechsler Adult Intelligence Scale-Revised (WAIS-R; Wechsler, 1981) Block Design subtest were examined in 50 moderate and severe nonpenetrating traumatically brain injured adults. Patients were divided into left (n = 15) and right hemisphere (n = 19) groups based on a history of unilateral craniotomy for treatment of an intracranial lesion and were compared to a group with diffuse or negative brain CT scan findings and no history of neurosurgery (n = 16). The percentage of final broken configuration errors was related to injury severity, Benton Visual Form Discrimination Test (VFD; Benton, Hamsher, Varney, & Spreen, 1983) total score and the number of VFD rotation and peripheral errors. The percentage of final broken configuration errors was higher in the patients with right craniotomies than in the left or no craniotomy groups, which did not differ. Broken configuration errors did not occur more frequently on designs without an embedded grid pattern. Right craniotomy patients did not show a greater percentage of broken configuration errors on nongrid designs as compared to grid designs.
Evans, Heather L; O'Shea, Dylan J; Morris, Amy E; Keys, Kari A; Wright, Andrew S; Schaad, Douglas C; Ilgen, Jonathan S
2016-02-01
This pilot study assessed the feasibility of using first person (1P) video recording with Google Glass (GG) to assess procedural skills, as compared with traditional third person (3P) video. We hypothesized that raters reviewing 1P videos would visualize more procedural steps with greater inter-rater reliability than 3P rating vantages. Seven subjects performed simulated internal jugular catheter insertions. Procedures were recorded by both Google Glass and an observer's head-mounted camera. Videos were assessed by 3 expert raters using a task-specific checklist (CL) and both an additive- and summative-global rating scale (GRS). Mean scores were compared by t-tests. Inter-rater reliabilities were calculated using intraclass correlation coefficients. The 1P vantage was associated with a significantly higher mean CL score than the 3P vantage (7.9 vs 6.9, P = .02). Mean GRS scores were not significantly different. Mean inter-rater reliabilities for the CL, additive-GRS, and summative-GRS were similar between vantages. 1P vantage recordings may improve visualization of tasks for behaviorally anchored instruments (e.g., CLs), while maintaining similar global ratings and inter-rater reliability when compared with conventional 3P vantage recordings. Copyright © 2016 Elsevier Inc. All rights reserved.
A technique for global monitoring of net solar irradiance at the ocean surface. II - Validation
NASA Technical Reports Server (NTRS)
Chertock, Beth; Frouin, Robert; Gautier, Catherine
1992-01-01
The generation and validation of the first satellite-based long-term record of surface solar irradiance over the global oceans are addressed. The record is generated using Nimbus-7 earth radiation budget (ERB) wide-field-of-view plentary-albedo data as input to a numerical algorithm designed and implemented based on radiative transfer theory. The mean monthly values of net surface solar irradiance are computed on a 9-deg latitude-longitude spatial grid for November 1978-October 1985. The new data set is validated in comparisons with short-term, regional, high-resolution, satellite-based records. The ERB-based values of net surface solar irradiance are compared with corresponding values based on radiance measurements taken by the Visible-Infrared Spin Scan Radiometer aboard GOES series satellites. Errors in the new data set are estimated to lie between 10 and 20 W/sq m on monthly time scales.
High-resolution behavioral mapping of electric fishes in Amazonian habitats.
Madhav, Manu S; Jayakumar, Ravikrishnan P; Demir, Alican; Stamper, Sarah A; Fortune, Eric S; Cowan, Noah J
2018-04-11
The study of animal behavior has been revolutionized by sophisticated methodologies that identify and track individuals in video recordings. Video recording of behavior, however, is challenging for many species and habitats including fishes that live in turbid water. Here we present a methodology for identifying and localizing weakly electric fishes on the centimeter scale with subsecond temporal resolution based solely on the electric signals generated by each individual. These signals are recorded with a grid of electrodes and analyzed using a two-part algorithm that identifies the signals from each individual fish and then estimates the position and orientation of each fish using Bayesian inference. Interestingly, because this system involves eavesdropping on electrocommunication signals, it permits monitoring of complex social and physical interactions in the wild. This approach has potential for large-scale non-invasive monitoring of aquatic habitats in the Amazon basin and other tropical freshwater systems.
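The localization idea described above (recovering position from the pattern of signal amplitudes on an electrode grid) can be illustrated with a much cruder toy than the published method. The sketch below treats the fish as a point source whose amplitude falls off with distance and recovers its position by a maximum-likelihood grid search; the real pipeline uses per-fish EOD signatures, a dipole forward model, and Bayesian inference, and every parameter here is an invented assumption.

```python
import numpy as np

# Toy localization from a 5x5 grid of electrode amplitudes.
rng = np.random.default_rng(6)
electrodes = np.array([(x, y) for x in range(5) for y in range(5)], float)

def forward(pos, gain=1.0, eps=0.05):
    """Predicted amplitude at each electrode for a point source at `pos`."""
    r = np.linalg.norm(electrodes - pos, axis=1)
    return gain / (r**2 + eps)

true_pos = np.array([2.3, 1.7])
measured = forward(true_pos) + rng.normal(0, 0.01, len(electrodes))   # noisy "recording"

# Gaussian log-likelihood evaluated over a fine grid of candidate positions.
xs = ys = np.linspace(0, 4, 81)
best, best_ll = None, -np.inf
for x in xs:
    for y in ys:
        resid = measured - forward(np.array([x, y]))
        ll = -0.5 * np.sum(resid**2) / 0.01**2
        if ll > best_ll:
            best, best_ll = (x, y), ll
print("true:", true_pos, "estimated:", best)
```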
Northern Galápagos Corals Reveal Twentieth Century Warming in the Eastern Tropical Pacific
NASA Astrophysics Data System (ADS)
Jimenez, Gloria; Cole, Julia E.; Thompson, Diane M.; Tudhope, Alexander W.
2018-02-01
Models and observations disagree regarding sea surface temperature (SST) trends in the eastern tropical Pacific. We present a new Sr/Ca-SST record that spans 1940-2010 from two Wolf Island corals (northern Galápagos). Trend analysis of the Wolf record shows significant warming on multiple timescales, which is also present in several other records and gridded instrumental products. Together, these data sets suggest that most of the eastern tropical Pacific has warmed over the twentieth century. In contrast, recent decades have been characterized by warming during boreal spring and summer (especially north of the equator), and subtropical cooling during boreal fall and winter (especially south of the equator). These SST trends are consistent with the effects of radiative forcing, mitigated by cooling due to wind forcing during boreal winter, as well as intensified upwelling and a strengthened Equatorial Undercurrent.
NASA Technical Reports Server (NTRS)
Hall, Dorothy K.; Box, Jason E.; Koenig, Lora S.; DiGirolamo, Nicolo E.; Comiso, Josefino C.; Shuman, Christopher A.
2011-01-01
Surface temperatures on the Greenland Ice Sheet have been studied on the ground, using automatic weather station (AWS) data from the Greenland Climate Network (GC-Net), and from analysis of satellite sensor data. Using Advanced Very High Resolution Radiometer (AVHRR) weekly surface temperature maps, warming of the surface of the Greenland Ice Sheet has been documented since 1981. We extended and refined this record using higher-resolution Moderate-Resolution Imaging Spectroradiometer (MODIS) data from March 2000 to the present. We developed a daily and monthly climate-data record (CDR) of the "clear-sky" surface temperature of the Greenland Ice Sheet using an ice-surface temperature (IST) algorithm developed for use with MODIS data. Validation of this CDR is ongoing. MODIS Terra swath data are projected onto a polar stereographic grid at 6.25-km resolution to develop binary, gridded daily and mean-monthly IST maps. Each monthly map also has a color-coded image map that is available to download. Also included with the monthly maps is an accompanying map showing the number of days in the month that were used to calculate the mean-monthly IST. This is important because no IST decision is made by the algorithm for cells that are considered cloudy by the internal cloud mask, so a sufficient number of days must be available to produce a mean IST for each grid cell. Validation of the CDR consists of several facets: 1) comparisons between ISTs and in-situ measurements; 2) comparisons between ISTs and AWS data; and 3) comparisons of ISTs with surface temperatures derived from other satellite instruments such as the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and Enhanced Thematic Mapper Plus (ETM+). Previous work shows that Terra MODIS ISTs are about 3°C lower than in-situ temperatures measured at Summit Camp during the winter of 2008-09 under clear skies. In this work we begin to compare surface temperatures derived from AWS data with ISTs from the MODIS CDR. The Greenland Ice Sheet IST CDR will be useful for monitoring surface-temperature trends and can be used as input or for validation of climate models. The CDR can be extended into the future using MODIS Terra, Aqua and NPOESS Preparatory Project Visible Infrared Imaging Radiometer Suite (VIIRS) data.
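The monthly compositing rule described above (report a mean IST only where enough clear-sky days exist) reduces to a simple masked average. The sketch below illustrates that step with an invented grid size, cloud fraction, and minimum-day threshold; it is not the MODIS CDR processing code.

```python
import numpy as np

# Sketch of the monthly compositing step: average daily clear-sky ISTs per grid
# cell, but only report a monthly mean where at least `min_days` cloud-free
# observations exist.
def monthly_ist(daily_ist, min_days=10):
    """daily_ist: (n_days, ny, nx) array with NaN where the cell was cloudy."""
    n_clear = np.sum(~np.isnan(daily_ist), axis=0)     # clear-sky days per cell
    mean_ist = np.nanmean(daily_ist, axis=0)           # mean over cloud-free days
    mean_ist[n_clear < min_days] = np.nan              # insufficient sampling
    return mean_ist, n_clear

rng = np.random.default_rng(7)
ist = 240 + rng.normal(0, 5, size=(31, 50, 50))        # toy ISTs in kelvin
ist[rng.random(ist.shape) < 0.4] = np.nan              # ~40% of cells cloudy each day
monthly_mean, n_days = monthly_ist(ist, min_days=10)
print(np.nanmin(monthly_mean), np.nanmax(monthly_mean))
```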
Prevalence of orthorexia nervosa among ashtanga yoga practitioners: a pilot study.
Herranz Valera, Jesus; Acuña Ruiz, Patricia; Romero Valdespino, Borja; Visioli, Francesco
2014-12-01
Orthorexia nervosa (ON, i.e., fixation on righteous eating) is a poorly defined disordered eating behavior that results from a pathological obsession with food, its purported nutritional value, composition, origin, etc. We investigated the prevalence of ON in a local ashtanga yoga community, by using a validated questionnaire (ORTO-15) that sets a threshold of ON diagnosis at ≤40. Among the 136 respondents, the mean ORTO-15 score (which was normally distributed) was 35.27 ± 3.69, i.e., 86% of respondents had an ORTO-15 score lower than 40, and no significant association with age or BMI was recorded. When we analyzed the differential distribution of orthorexia in our cohort, we recorded an association between ORTO-15 score and vegetarianism, i.e., the ORTO-15 score was lower among vegetarians. The results of this pilot study suggest that ashtanga yoga teachers should avoid excessive reference to a healthy diet, which is a natural component of yoga practice.
Validation of a new ENT emergencies course for first-on-call doctors.
Swords, C; Smith, M E; Wasson, J D; Qayyum, A; Tysome, J R
2017-02-01
First-on-call ENT cover is often provided by junior doctors with limited ENT experience; yet, they may have to manage life-threatening emergencies. An intensive 1-day simulation course was developed to teach required skills to junior doctors. A prospective, single-blinded design was used. Thirty-seven participants rated their confidence before the course, immediately following the course and after a two-month interval. Blinded assessors scored participant performance in two video-recorded simulated scenarios before and after the course. Participant self-rated confidence was increased in the end-of-course survey (score of 27.5 vs 53.0; p < 0.0001), and this was maintained two to four months after the course (score of 50.5; p < 0.0001). Patient assessment and management in video-recorded emergency scenarios was significantly improved following course completion (score of 9.75 vs 18.75; p = 0.0093). This course represents an effective method of teaching ENT emergency management to junior doctors. ENT induction programmes benefit from the incorporation of a simulation component.
Shah, Mehul A; Agrawal, Rupesh; Teoh, Ryan; Shah, Shreya M; Patel, Kashyap; Gupta, Satyam; Gosai, Siddharth
2017-05-01
To introduce and validate the pediatric ocular trauma score (POTS), a mathematical model to predict visual outcome in children with traumatic cataract. METHODS: In this retrospective cohort study, medical records of consecutive children aged 18 years and below with traumatic cataract were retrieved and analysed. Data recorded on a precoded data information sheet included age, gender, visual acuity, anterior segment and posterior segment findings, nature of surgery, treatment for amblyopia, follow-up, and final outcome. POTS was derived based on the ocular trauma score (OTS), adjusting for age of patient and location of the injury. Visual outcome was predicted using the OTS and the POTS and assessed using receiver operating characteristic (ROC) curves. Outcomes predicted by the POTS were more accurate than those of the OTS (p = 0.014). The POTS is a more sensitive and specific score with more accurate predicted outcomes than the OTS, and is a viable tool to predict visual outcomes of pediatric ocular trauma with traumatic cataract.
Psychometric properties of the Italian version of the Cognitive Reserve Scale (I-CRS).
Altieri, Manuela; Siciliano, Mattia; Pappacena, Simona; Roldán-Tapia, María Dolores; Trojano, Luigi; Santangelo, Gabriella
2018-05-04
The original definition of cognitive reserve (CR) refers to the individual differences in cognitive performance after a brain damage or pathology. Several proxies were proposed to evaluate CR (education, occupational attainment, premorbid IQ, leisure activities). Recently, some scales were developed to measure CR taking into account several cognitively stimulating activities. The aim of this study is to adapt the Cognitive Reserve Scale (I-CRS) for the Italian population and to explore its psychometric properties. I-CRS was administered to 547 healthy participants, ranging from 18 to 89 years old, along with neuropsychological and behavioral scales to evaluate cognitive functioning, depressive symptoms, and apathy. Cronbach's α, corrected item-total correlations, and the inter-item correlation matrix were calculated to evaluate the psychometric properties of the scale. Linear regression analysis was performed to build a correction grid of the I-CRS according to demographic variables. Correlational analyses were performed to explore the relationships between I-CRS and neuropsychological and behavioral scales. We found that age, sex, and education influenced the I-CRS score. Young adults and adults obtained higher I-CRS scores than elderly adults; women and participants with high educational attainment scored higher on I-CRS than men and participants with low education. I-CRS score correlated poorly with cognitive and depression scale scores, but moderately with apathy scale scores. I-CRS showed good psychometric properties and seemed to be a useful tool to assess CR in every adult life stage. Moreover, our findings suggest that apathy rather than depressive symptoms may interfere with the building of CR across the lifespan.
Testing & Validating: 3D Seismic Travel Time Tomography (Detailed Shallow Subsurface Imaging)
NASA Astrophysics Data System (ADS)
Marti, David; Marzan, Ignacio; Alvarez-Marron, Joaquina; Carbonell, Ramon
2016-04-01
A detailed, full 3-dimensional P-wave seismic velocity model was constrained by a high-resolution seismic tomography experiment. A regular and dense grid of shots and receivers was used to image a 500x500x200 m volume of the shallow subsurface. Ten Geode recorders, providing a 240-channel recording system, and a 250 kg weight drop were used for the acquisition. The recording geometry consisted of a 10x20 m geophone grid spacing and a 20x20 m staggered source spacing, for a total of 1200 receivers and 676 source points. The study area is located within the Iberian Meseta, in Villar de Cañas (Cuenca, Spain). The lithological/geological target consisted of a Neogene sedimentary sequence formed, from bottom to top, by a transition from gypsum to siltstones. The main objectives were to resolve the underground structure (contacts/discontinuities) and to constrain the 3D geometry of the lithology (possible cavities, faults/fractures). These targets were achieved by mapping the 3D distribution of the physical properties (P-wave velocity). The regularly spaced, dense acquisition grid forced the survey to be acquired in different stages and under a variety of weather conditions; therefore, careful quality control was required. More than half a million first arrivals were inverted to provide a 3D Vp velocity model that reached depths of 120 m in the areas with the highest ray coverage. An extended borehole campaign, which included borehole geophysical measurements in some wells, provided tight constraints on the lithology and a validation scheme for the tomographic results. The final image reveals a laterally variable structure consisting of four different lithological units. In this methodological validation test, travel-time tomography demonstrates a high capacity for imaging lithological contrasts in detail for complex structures located at very shallow depths.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mothiram, Ursula; Brennan, Patrick C; Robinson, John
2013-12-15
Following X-ray exposure, radiographers receive immediate feedback on detector exposure in the form of the exposure index (EI). To identify whether radiographers are meeting manufacturer-recommended EI (MREI) ranges for routine chest, abdomen and pelvis X-ray examinations under a variety of conditions and to examine factors affecting the EI. Data on 5000 adult X-ray examinations including the following variables were collected: examination parameters, EI values, patient gender, date of birth, date and time of examination, grid usage and the presence of an implant or prosthesis. Descriptive statistics were used to summarize each data set and the Mann–Whitney U test was used to determine significant differences, with P < 0.05 indicating significance for all tests. Most examinations demonstrated EI values that were outside the MREI ranges, with significantly higher median EI values recorded for female patient radiographs than those for male patients for all manufacturers, indicating higher detector exposures for all units except for Philips digital radiography (DR), where increased EI values indicate lower exposure (P = 0.01). Median EI values for out of hours radiography were also significantly higher compared with normal working hours for all technologies (P ≤ 0.02). Significantly higher median EI values were demonstrated for Philips DR chest X-rays without as compared to those with the employment of a grid (P = 0.03), while significantly lower median EI values were recorded for Carestream Health computed radiography (CR) chest X-rays when an implant or prosthesis was present (P = 0.02). Non-adherence to MREIs has been demonstrated with EI value discrepancies being dependent on patient gender, time/day of exposure, grid usage and the presence of an implant or prosthesis. Retrospective evaluation of EI databases is a valuable tool to assess the need for quality improvement in routine DR.
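The group comparisons above rely on the Mann-Whitney U test applied to EI values split by a binary factor. The sketch below shows that comparison on invented EI values (real EIs are vendor-specific and not reproduced here); the in-hours versus out-of-hours split is used purely as an example grouping.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Toy comparison of EI distributions between two exposure conditions.
rng = np.random.default_rng(8)
ei_in_hours = rng.lognormal(mean=6.0, sigma=0.3, size=300)
ei_out_of_hours = rng.lognormal(mean=6.15, sigma=0.3, size=120)

u, p = mannwhitneyu(ei_in_hours, ei_out_of_hours, alternative="two-sided")
print(f"median in-hours={np.median(ei_in_hours):.0f}, "
      f"out-of-hours={np.median(ei_out_of_hours):.0f}, U={u:.0f}, p={p:.3f}")
```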
Vogel, Camille; Kopp, Dorothée; Méhault, Sonia
2017-01-15
On January 1st, 2016, the French mixed Nephrops and hake fishery of the Grande Vasière, an area located in the Bay of Biscay, fell under the discard ban implemented as part of the new European Common Fisheries Policy. The fleet records historically high levels of discards despite numerous gear selectivity studies. Together with high discard survival, new technological solutions to minimize catches of undersized individuals could justify local exemptions from the discard ban. Our study focuses on the effects of two selective devices, a square mesh cylinder (SMC) and a grid, on the escapement of undersized individuals and discard reduction. The relative catch probability of the modified gear compared with the traditional gear was modelled using the catch comparison method. Potential losses from the commercial fraction of the catch were taken into account to assess their influence on the economic viability of fishing with the modified gears. The two devices had similar effects on undersized Nephrops escapement and on discard reduction, with median values of 26.5% and 23.6% for the SMC and of 30.4% and 21.4% for the grid, respectively. Only the grid was efficient for undersized hake, recording median values of escapement and discard reduction equal to 25.0% and 20.6%, respectively. Some loss from the commercial fraction of the catch was to be expected with both devices, which could be compensated for in the long term by the contribution of undersized individuals to the stock biomass. Our results support the use of selective gear technology as part of an integrated framework including control and management measures to mitigate the effect of the discard ban both for fishers and for the ecosystem. Further work is needed to quantify the effect of additional escapement from the gear on stock dynamics. Copyright © 2016 Elsevier Ltd. All rights reserved.
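The catch comparison method models, for each length class, the proportion of the total catch taken by the modified gear. A hedged sketch of such a model, fitted here as a simple binomial GLM on invented counts (the study's actual model specification and data are not reproduced), is shown below.

```python
# Hedged sketch: catch-comparison analysis as a binomial GLM of the proportion of the
# catch-at-length taken by the modified gear. Counts are invented; the authors' model
# may differ (e.g. a GAMM with haul-level random effects).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
length = np.arange(15, 46)                      # length classes (cm)
n_modified = rng.poisson(18, length.size)       # counts caught with the selective device
n_standard = rng.poisson(24, length.size)       # counts caught with the standard gear

X = sm.add_constant(length)
y = np.column_stack([n_modified, n_standard])   # (successes, failures) per length class
fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()

# Predicted proportions below 0.5 indicate escapement from the modified gear at that length.
print(np.round(fit.predict(X), 2))
```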
NASA Astrophysics Data System (ADS)
MacAlister, E.; Skalbeck, J.; Stewart, E.
2016-12-01
Since the late 1800s, geologic studies have been completed in Wisconsin in pursuit of understanding the basement topography and locating economically viable mineral resources. The doubly plunging Baraboo Syncline located in Columbia and Sauk Counties provides a classic record of Precambrian deformation. A similar buried structure is thought to exist in adjacent Dodge County based on a prominent aeromagnetic anomaly. For this study, 3-D modeling of gravity and aeromagnetic survey data was used to approximate the structure of the Precambrian basement topography beneath Dodge County, Wisconsin. The aim of the research was to determine a suitable basement topography grid using potential field data and then use this grid as the base for groundwater flow models. Geosoft Oasis Montaj GM-SYS 3D modeling software was used to build grids of subsurface layers, and the model was constrained by well records of basement rock elevations located throughout the county. The study demonstrated that there is a complex network of crystalline basement structures that were folded through tectonic activity during the Precambrian. A thick layer of iron-rich sedimentary material was deposited on top of the basement rocks, causing a distinct magnetic signature that outlines the basement structure in the magnetic survey. Preliminary results reveal an iron-rich layer with a density of 3.7 g/cm³ and a magnetic susceptibility of 8000 × 10⁻⁶ cgs that is approximately 500 feet thick and ranges in elevation from 300 meters below to 400 meters above sea level. The 3-D model depths are consistent with depths from recent core drilling operations performed by the Wisconsin Geological and Natural History Survey. Knowing the depth to and structure of basement rock throughout Dodge County and Wisconsin plays an important role in understanding the geologic history of the region. Better resolution of the basement topography can also enhance the accuracy of future groundwater flow models.
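As a rough plausibility check on the gravity effect of such an iron-rich layer, the infinite Bouguer slab formula, Δg = 2πGΔρt, can be applied. The numbers below assume a country-rock density of about 2.67 g/cm³, which is an assumption not stated in the abstract.

```python
# Hedged back-of-envelope check: Bouguer slab gravity effect of a ~500 ft thick layer of
# density 3.7 g/cm3 against an assumed 2.67 g/cm3 background (the background density is
# an illustrative assumption, not a value from the study).
import math

G = 6.674e-11                                  # gravitational constant (m^3 kg^-1 s^-2)
rho_layer, rho_background = 3700.0, 2670.0     # densities (kg/m^3)
thickness = 500 * 0.3048                       # 500 ft in metres

delta_g = 2 * math.pi * G * (rho_layer - rho_background) * thickness   # m/s^2
print(f"slab anomaly ~ {delta_g / 1e-5:.1f} mGal")                     # 1 mGal = 1e-5 m/s^2
```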
Sussman, Jeremy B; Wiitala, Wyndy L; Zawistowski, Matthew; Hofer, Timothy P; Bentley, Douglas; Hayward, Rodney A
2017-09-01
Accurately estimating cardiovascular risk is fundamental to good decision-making in cardiovascular disease (CVD) prevention, but risk scores developed in one population often perform poorly in dissimilar populations. We sought to examine whether a large integrated health system can use its electronic health data to better predict individual patients' risk of developing CVD. We created a cohort of all patients ages 45-80 who used Department of Veterans Affairs (VA) ambulatory care services in 2006 with no history of CVD, heart failure, or loop diuretic use. Our outcome variable was new-onset CVD in 2007-2011. We then developed a series of recalibrated scores, including a fully refit "VA Risk Score-CVD (VARS-CVD)." We tested the different scores using standard measures of prediction quality. For the 1,512,092 patients in the study, the atherosclerotic cardiovascular disease risk score had similar discrimination as the VARS-CVD (c-statistic of 0.66 in men and 0.73 in women), but the atherosclerotic cardiovascular disease model had poor calibration, predicting 63% more events than observed. Calibration was excellent in the fully recalibrated VARS-CVD tool, but the simpler techniques tested proved less reliable. We found that local electronic health record data can be used to estimate CVD risk better than an established risk score based on research populations. Recalibration improved estimates dramatically, and the type of recalibration was important. Such tools can also easily be integrated into a health system's electronic health record and can be more readily updated.
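Recalibration strategies range from refitting only the intercept and slope on the logit of the published predicted risk to fully refitting all coefficients, as the VARS-CVD did. The sketch below shows only the simpler logistic recalibration on simulated data; it is not the VA implementation.

```python
# Hedged sketch: logistic recalibration of an external risk score to a local cohort by
# refitting intercept and slope on the logit of the published predicted risk. Data are
# simulated; the actual VARS-CVD refit used the full set of predictors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
p_external = np.clip(rng.beta(2, 8, 5000), 1e-4, 1 - 1e-4)   # external-score predicted risks
events = rng.binomial(1, 0.6 * p_external)                    # local cohort: score overpredicts

logit = np.log(p_external / (1 - p_external))
recal = sm.GLM(events, sm.add_constant(logit), family=sm.families.Binomial()).fit()
p_local = recal.predict(sm.add_constant(logit))

print("mean risk: external =", round(p_external.mean(), 3),
      "recalibrated =", round(p_local.mean(), 3),
      "observed =", round(events.mean(), 3))
```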
[Scoring systems in intensive care medicine: principles, models, application and limits].
Fleig, V; Brenck, F; Wolff, M; Weigand, M A
2011-10-01
Scoring systems are used in all diagnostic areas of medicine. Several parameters are evaluated and rated with points according to their values in order to summarize a complex clinical situation in a single score. Applications range from classifying disease severity, through determining staffing levels for the intensive care unit (ICU), to evaluating new therapies under study conditions. Since the introduction of scoring systems in the 1980s, a variety of different score models has been developed. The scoring systems employed in intensive care and discussed in this article can be categorized into prognostic scores, expenditure scores and disease-specific scores. Since the introduction of compulsory recording of two scoring systems for accounting in the German diagnosis-related groups (DRG) system, these tools have gained importance for all intensive care physicians. Problems remain in the valid calculation of scores and the interpretation of the results.
ERIC Educational Resources Information Center
Fahle, Erin M.; Reardon, Sean F.
2017-01-01
This paper provides the first population-based evidence on how much standardized test scores vary among public school districts within each state and how segregation explains that variation. Using roughly 300 million standardized test score records in math and ELA for grades 3 through 8 from every U.S. public school district during the 2008-09 to…
Measuring human remains in the field: Grid technique, total station, or MicroScribe?
Sládek, Vladimír; Galeta, Patrik; Sosna, Daniel
2012-09-10
Although three-dimensional (3D) coordinates of human intra-skeletal landmarks are among the most important data that anthropologists have to record in the field, little is known about the reliability of the various measuring techniques. We compared the reliability of three techniques used for 3D measurement of human remains in the field: the grid technique (GT), the total station (TS), and the MicroScribe (MS). We measured 365 field osteometric points on 12 skeletal sequences excavated at the Late Medieval/Early Modern churchyard in Všeruby, Czech Republic. We compared intra-observer, inter-observer, and inter-technique variation using the mean difference (MD), mean absolute difference (MAD), standard deviation of the difference (SDD), and limits of agreement (LA). All three measuring techniques can be used when the accepted error range is measured in centimeters. When a range of accepted error measurable in millimeters is needed, MS offers the best solution. TS can achieve the same reliability as MS, but only when the laser beam is accurately pointed into the center of the prism. When the prism is not accurately oriented, TS produces unreliable data. TS is more sensitive to initialization than is MS. GT measures the human skeleton with acceptable reliability for general purposes but is insufficient when highly accurate skeletal data are needed. We observed high inter-technique variation, indicating that just one technique should be used when spatial data from one individual are recorded. Subadults are measured with slightly lower error than adults. The effect of maximum excavated skeletal length has little practical significance in field recording. When MS is not available, we offer practical suggestions that can help to increase reliability when measuring the human skeleton in the field. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
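The agreement statistics used in the comparison (MD, MAD, SDD and the 95% limits of agreement) are straightforward to compute from paired measurements; a minimal sketch with invented coordinate values follows.

```python
# Hedged sketch: MD, MAD, SDD and 95% limits of agreement (LA) for paired measurements of
# the same landmarks by two techniques. All values are invented for illustration.
import numpy as np

rng = np.random.default_rng(4)
microscribe = rng.normal(100.0, 10.0, 365)                 # reference technique (cm)
grid_tech = microscribe + rng.normal(0.3, 0.8, 365)        # second technique, with bias and noise

d = grid_tech - microscribe
md, mad, sdd = d.mean(), np.abs(d).mean(), d.std(ddof=1)
la = (md - 1.96 * sdd, md + 1.96 * sdd)
print(f"MD={md:.2f}  MAD={mad:.2f}  SDD={sdd:.2f}  LA=({la[0]:.2f}, {la[1]:.2f}) cm")
```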
Márquez, G.; Pinto, A.; Alamo, L.; Baumann, B.; Ye, F.; Winkler, H.; Taylor, K.; Padrón, R.
2014-01-01
Myosin interacting-heads (MIH) motifs are visualized in 3D-reconstructions of thick filaments from striated muscle. These reconstructions are calculated by averaging methods using images from electron micrographs of grids prepared from numerous filament preparations. Here we propose an alternative method to calculate the 3D-reconstruction of a single thick filament using only a tilt series of images recorded by electron tomography. Relaxed thick filaments, prepared from tarantula leg muscle homogenates, were negatively stained. Single-axis tilt series of single isolated thick filaments were obtained with the electron microscope at a low electron dose, and recorded on a CCD camera by electron tomography. An IHRSR 3D-reconstruction was calculated from the tilt series images of a single thick filament. The reconstruction was enhanced by including dual-tilt image segments in the search stage, whereas usually only a single tilt along the filament axis is used, and by applying a band-pass filter just before the back projection. The reconstruction from a single filament has a 40 Å resolution and clearly shows the presence of MIH motifs. In contrast, the electron tomogram 3D-reconstruction of the same thick filament, calculated without any image averaging or imposition of helical symmetry, only reveals MIH motifs infrequently. This is, to our knowledge, the first application of the IHRSR method to calculate a 3D reconstruction from tilt series images. This single-filament IHRSR reconstruction method (SF-IHRSR) should provide a new tool to assess structural differences between well-ordered thick (or thin) filaments in a grid by recording their electron tomograms separately. PMID:24727133
NASA Astrophysics Data System (ADS)
Gailler, Audrey; Hébert, Hélène; Loevenbruck, Anne
2013-04-01
Improvements in the availability of sea-level observations and advances in numerical modeling techniques are increasing the potential for tsunami warnings to be based on numerical model forecasts. Numerical tsunami propagation and inundation models are well developed and have now reached an impressive level of accuracy, especially in locations such as harbors where the tsunami waves are most amplified. In the framework of tsunami warning under real-time operational conditions, the main obstacle to the routine use of such numerical simulations remains the slowness of the numerical computation, which is compounded when detailed grids are required for the precise modeling of the coastline response on the scale of an individual harbor. In fact, when facing the problem of the interaction of the tsunami wavefield with a shoreline, any numerical simulation must be performed over an increasingly fine grid, which in turn mandates a reduced time step and the use of a fully non-linear code. Such calculations then become prohibitively time-consuming, which is clearly unacceptable in the framework of real-time warning. Thus only tsunami offshore propagation modeling tools using a single sparse bathymetric computation grid are presently included within the French Tsunami Warning Center (CENALT), providing rapid estimation of tsunami wave heights in the high seas and tsunami warning maps at the scale of the western Mediterranean and NE Atlantic basins. We present here preliminary work that performs quick estimates of the inundation at individual harbors from these deep-water wave height simulations. The method involves an empirical correction relation derived from Green's law, expressing conservation of wave energy flux, to extend the gridded wave field into the harbor with respect to the nearby deep-water grid node. The main limitation of this method is that its application to a given coastal area would require a large database of previous observations in order to define the empirical parameters of the correction equation. As no such data (i.e., historical tide gauge records of significant tsunamis) are available for the western Mediterranean and NE Atlantic basins, a set of synthetic mareograms is calculated for both hypothetical and well-known historical tsunamigenic earthquakes in the area. This synthetic dataset is obtained through accurate numerical tsunami propagation and inundation modeling using several nested bathymetric grids characterized by a coarse resolution over deep-water regions and an increasingly fine resolution close to the shores (down to a grid cell size of 3 m in some Mediterranean harbors). This synthetic dataset is then used to approximate the empirical parameters of the correction equation. Results of inundation estimates in several French Mediterranean harbors obtained with the fast, Green's law derived method are presented and compared with values given by time-consuming nested-grid simulations.
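The starting point of the empirical correction is Green's law, which, by conservation of wave energy flux, scales the tsunami amplitude with the fourth root of the depth ratio. The sketch below illustrates only that scaling; the site-specific empirical coefficients fitted from the synthetic mareogram database are not reproduced.

```python
# Hedged sketch: Green's law shoaling amplification, the physical relation underlying the
# empirical correction described above. Depth and amplitude values are purely illustrative.
def greens_law_amplitude(a_offshore_m: float, depth_offshore_m: float, depth_coastal_m: float) -> float:
    """A_coast = A_offshore * (h_offshore / h_coastal) ** 0.25 (energy-flux conservation)."""
    return a_offshore_m * (depth_offshore_m / depth_coastal_m) ** 0.25

# 10 cm offshore amplitude over 2500 m depth, extended toward a 10 m deep harbour node.
print(f"{greens_law_amplitude(0.10, 2500.0, 10.0):.2f} m")   # ~0.40 m
```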
Cochlear implantation for single-sided deafness and tinnitus suppression.
Holder, Jourdan T; O'Connell, Brendan; Hedley-Williams, Andrea; Wanna, George
To quantify the potential effectiveness of cochlear implantation for tinnitus suppression in patients with single-sided deafness using the Tinnitus Handicap Inventory. The study included 12 patients with unilateral tinnitus who were undergoing cochlear implantation for single-sided deafness. The Tinnitus Handicap Inventory was administered at the patient's cochlear implant candidacy evaluation appointment prior to implantation and at every cochlear implant follow-up appointment (except activation) following implantation. Patient demographics and speech recognition scores were also retrospectively recorded using the electronic medical record. A significant reduction was found when comparing the Tinnitus Handicap Inventory score preoperatively (61.2±27.5) to the score after three months of cochlear implant use (24.6±28.2, p=0.004) and the score beyond 6 months of CI use (13.3±18.9, p=0.008). Further, 45% of patients reported total tinnitus suppression. Mean CNC word recognition score improved from 2.9% (SD 9.4) pre-operatively to 40.8% (SD 31.7) by 6 months post-activation, which was significantly improved from pre-operative scores (p=0.008). The present data are in agreement with previously published studies that have shown an improvement in tinnitus following cochlear implantation for the large majority of patients with single-sided deafness. Copyright © 2017 Elsevier Inc. All rights reserved.
Trevisanuto, Daniele; Bertuola, Federica; Lanzoni, Paolo; Cavallin, Francesco; Matediana, Eduardo; Manzungu, Olivier Wingi; Gomez, Ermelinda; Da Dalt, Liviana; Putoto, Giovanni
2015-01-01
We assessed the effect of an adapted neonatal resuscitation program (NRP) course on healthcare providers' performances in a low-resource setting through the use of video recording. A video recorder, mounted to the radiant warmers in the delivery rooms at Beira Central Hospital, Mozambique, was used to record all resuscitations. One-hundred resuscitations (50 before and 50 after participation in an adapted NRP course) were collected and assessed based on a previously published score. All 100 neonates received initial steps; from these, 77 and 32 needed bag-mask ventilation (BMV) and chest compressions (CC), respectively. There was a significant improvement in resuscitation scores in all levels of resuscitation from before to after the course: for "initial steps", the score increased from 33% (IQR 28-39) to 44% (IQR 39-56), p<0.0001; for BMV, from 20% (20-40) to 40% (40-60), p = 0.001; and for CC, from 0% (0-10) to 20% (0-50), p = 0.01. Times of resuscitative interventions after the course were improved in comparison to those obtained before the course, but remained non-compliant with the recommended algorithm. Although resuscitations remained below the recommended standards in terms of quality and time of execution, clinical practice of healthcare providers improved after participation in an adapted NRP course. Video recording was well-accepted by the staff, useful for objective assessment of performance during resuscitation, and can be used as an educational tool in a low-resource setting.
Heart sounds analysis using probability assessment.
Plesinger, F; Viscor, I; Halamek, J; Jurco, J; Jurak, P
2017-07-31
This paper describes a method for automated discrimination of heart sounds recordings according to the Physionet Challenge 2016. The goal was to decide if the recording refers to normal or abnormal heart sounds or if it is not possible to decide (i.e. 'unsure' recordings). Heart sounds S1 and S2 are detected using amplitude envelopes in the band 15-90 Hz. The averaged shape of the S1/S2 pair is computed from amplitude envelopes in five different bands (15-90 Hz; 55-150 Hz; 100-250 Hz; 200-450 Hz; 400-800 Hz). A total of 53 features are extracted from the data. The largest group of features is extracted from the statistical properties of the averaged shapes; other features are extracted from the symmetry of averaged shapes, and the last group of features is independent of S1 and S2 detection. Generated features are processed using logical rules and probability assessment, a prototype of a new machine-learning method. The method was trained using 3155 records and tested on 1277 hidden records. It resulted in a training score of 0.903 (sensitivity 0.869, specificity 0.937) and a testing score of 0.841 (sensitivity 0.770, specificity 0.913). The revised method led to a test score of 0.853 in the follow-up phase of the challenge. The presented solution achieved 7th place out of 48 competing entries in the Physionet Challenge 2016 (official phase). In addition, the PROBAfind software for probability assessment was introduced.
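S1/S2 detection in this method rests on band-limited amplitude envelopes. A hedged sketch of computing one such envelope (the 15-90 Hz band) with a Butterworth band-pass filter and the Hilbert transform is given below, using a toy signal rather than Physionet data.

```python
# Hedged sketch: amplitude envelope of a phonocardiogram in the 15-90 Hz band, the kind of
# representation used above for S1/S2 detection. Signal and sampling rate are toy values.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 2000                                              # sampling rate (Hz), assumed
t = np.arange(0, 5.0, 1.0 / fs)
pcg = np.sin(2 * np.pi * 40 * t) * (np.sin(2 * np.pi * 1.2 * t) > 0.9)   # toy S1-like bursts

b, a = butter(4, [15 / (fs / 2), 90 / (fs / 2)], btype="band")
envelope = np.abs(hilbert(filtfilt(b, a, pcg)))

# Crude burst count: rising edges where the envelope exceeds half its maximum.
above = (envelope > 0.5 * envelope.max()).astype(int)
print("detected bursts:", int(np.sum(np.diff(above) == 1)))
```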
Distributing and storing data efficiently by means of special datasets in the ATLAS collaboration
NASA Astrophysics Data System (ADS)
Köneke, Karsten; ATLAS Collaboration
2011-12-01
With the start of the LHC physics program, the ATLAS experiment started to record vast amounts of data. This data has to be distributed and stored on the world-wide computing grid in a smart way in order to enable an effective and efficient analysis by physicists. This article describes how the ATLAS collaboration chose to create specialized reduced datasets in order to efficiently use computing resources and facilitate physics analyses.
iElectrodes: A Comprehensive Open-Source Toolbox for Depth and Subdural Grid Electrode Localization.
Blenkmann, Alejandro O; Phillips, Holly N; Princich, Juan P; Rowe, James B; Bekinschtein, Tristan A; Muravchik, Carlos H; Kochen, Silvia
2017-01-01
The localization of intracranial electrodes is a fundamental step in the analysis of invasive electroencephalography (EEG) recordings in research and clinical practice. The conclusions reached from the analysis of these recordings rely on the accuracy of electrode localization in relationship to brain anatomy. However, currently available techniques for localizing electrodes from magnetic resonance (MR) and/or computerized tomography (CT) images are time consuming and/or limited to particular electrode types or shapes. Here we present iElectrodes, an open-source toolbox that provides robust and accurate semi-automatic localization of both subdural grids and depth electrodes. Using pre- and post-implantation images, the method takes 2-3 min to localize the coordinates in each electrode array and automatically number the electrodes. The proposed pre-processing pipeline allows one to work in a normalized space and to automatically obtain anatomical labels of the localized electrodes without neuroimaging experts. We validated the method with data from 22 patients implanted with a total of 1,242 electrodes. We show that localization distances were within 0.56 mm of those achieved by experienced manual evaluators. iElectrodes provided additional advantages in terms of robustness (even with severe perioperative cerebral distortions), speed (less than half the operator time compared to expert manual localization), simplicity, utility across multiple electrode types (surface and depth electrodes) and all brain regions.
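The validation figure quoted (localization distances within 0.56 mm of expert manual placement) is a per-electrode Euclidean distance; a minimal sketch of that comparison with invented coordinates follows.

```python
# Hedged sketch: per-electrode Euclidean distance between toolbox-derived and manual
# electrode coordinates, the validation metric reported above. Coordinates are invented.
import numpy as np

rng = np.random.default_rng(5)
auto_xyz = rng.normal(0.0, 30.0, (1242, 3))                    # toolbox coordinates (mm)
manual_xyz = auto_xyz + rng.normal(0.0, 0.3, (1242, 3))        # expert manual coordinates (mm)

dist = np.linalg.norm(auto_xyz - manual_xyz, axis=1)
print(f"median localization distance = {np.median(dist):.2f} mm")
```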
Golkari, A; Sabokseir, A; Blane, D; Sheiham, A; Watt, RG
2017-01-01
Statement of Problem: Early childhood is a crucial period of life as it affects one’s future health. However, precise data on adverse events during this period is usually hard to access or collect, especially in developing countries. Objectives: This paper first reviews the existing methods for retrospective data collection in health and social sciences, and then introduces a new method/tool for obtaining more accurate general and oral health related information from early childhood retrospectively. Materials and Methods: The Early Childhood Events Life-Grid (ECEL) was developed to collect information on the type and time of health-related adverse events during the early years of life, by questioning the parents. The validity of ECEL and the accuracy of information obtained by this method were assessed in a pilot study and in a main study of 30 parents of 8 to 11 year old children from Shiraz (Iran). Responses obtained from parents using the final ECEL were compared with the recorded health insurance documents. Results: There was an almost perfect agreement between the health insurance and ECEL data sets (Kappa value=0.95 and p < 0.001). Interviewees remembered the important events more accurately (100% exact timing match in case of hospitalization). Conclusions: The Early Childhood Events Life-Grid method proved to be highly accurate when compared with recorded medical documents. PMID:28959773
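Agreement between the ECEL responses and the insurance records was summarized with Cohen's kappa; the sketch below shows that computation on a few invented event labels.

```python
# Hedged sketch: Cohen's kappa between parent-reported (ECEL) and insurance-recorded event
# categories, the agreement statistic reported above. The labels are invented examples.
from sklearn.metrics import cohen_kappa_score

ecel      = ["hospitalization", "none", "infection", "none", "trauma", "none"]
insurance = ["hospitalization", "none", "infection", "none", "trauma", "infection"]

print("kappa =", round(cohen_kappa_score(ecel, insurance), 2))
```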
Katsahian, Sandrine; Simond Moreau, Erica; Leprovost, Damien; Lardon, Jeremy; Bousquet, Cedric; Kerdelhué, Gaétan; Abdellaoui, Redhouane; Texier, Nathalie; Burgun, Anita; Boussadi, Abdelali; Faviez, Carole
2015-01-01
Suspected adverse drug reactions (ADRs) reported by patients through social media can be a complementary tool to existing ADR signal detection processes. However, several studies have shown that the quality of medical information published online varies drastically, whatever the health topic addressed. The aim of this study is to use an existing rating tool on a set of social network websites in order to assess the capability of such tools to guide experts in selecting the social network website best suited to mining ADRs. First, we reviewed and rated 132 Internet forums and social networks according to three major criteria: the number of visits, the notoriety of the forum, and the number of messages posted in relation to health and drug therapy. Second, the pharmacist reviewed the topic-oriented message boards with a small number of drug names to ensure that they were not off topic. Six experts were chosen to assess the selected Internet forums using a French scoring tool, the Net Scoring. Three different scores were computed, together with the agreement between experts for each set of scores, using weighted kappa pooled by the mean. Three Internet forums were chosen at the end of the selection step. Some criteria received high scores (3-4) regardless of the website evaluated, such as accessibility (45-46) or design (34-36); conversely, some criteria consistently received low scores, such as the quantitative aspects (40-42), ethical aspects (43-44) and hyperlink updating (30-33). Kappa values were positive but very small, corresponding to weak agreement between experts. The personal opinion of the expert seems to have a major impact, undermining the relevance of the criteria. Our future work is to collect the results given by this evaluation grid and to propose a new scoring tool for assessing Internet social networks.
Evans, Lauren Jayne; Beck, Alison; Burdett, Mark
2017-09-01
This study explores whether improvements, as measured by the CORE-OM/10, as a result of psychological therapy were related to length of treatment in weeks, number of treatment sessions, or treatment intensity, as well as any effect of diagnostic group. Pre- and post-therapy CORE-OM/10 scores were extracted from the clinical records of all secondary care adult psychological therapy team patients who undertook psychological therapy between 2010 and 2013 in one mental health trust. Of the 4,877 patients identified, 925 had complete records. Length of therapy was divided by the number of sessions to create 'treatment intensity' (sessions per week). Nonparametric analyses were used, initial score was controlled for, and diagnostic group was explored. No relationship was found between change in score and the number of sessions, therapy length, or treatment intensity; however, change in score was positively correlated with first-session score. Patients with higher initial scores had longer therapies; however, treatment intensity was similar for patients with lower pre-therapy distress. There were differences in treatment length (weeks) between diagnostic groups. Demographic differences were found between patients with and without complete records, prompting caution in terms of generalizability. These findings are consistent with the responsive regulation model (Barkham et al., 1996) which proposes that patients vary in their response to treatment, resulting in no associations between session numbers or treatment intensity and therapeutic gain with aggregated scores. Patients with higher CORE scores at the outset of psychological therapy had longer not more intensive therapy. There was variation in treatment intensity between diagnostic clusters. Number of sessions, length of therapy (in weeks), and treatment intensity (the number of sessions per week between the first and last therapy sessions) were not related to therapeutic gains. These results fit with a responsive regulation model of therapy duration, suggesting an individualized approach to therapy cessation as opposed to therapy session limits as the number of sessions a patient experienced was not generally associated with outcome. We found that clients with a diagnosis of a behavioural syndrome (F50-59) had less 'intensive' therapy; they experienced the same number of sessions over a longer time frame. Despite this, there were no associations between diagnosis category and change in score. © 2017 The British Psychological Society.
Soukup, Benjamin; Mashhadi, Syed A; Bulstrode, Neil W
2012-03-01
This study aims to assess the health-related quality-of-life benefit following auricular reconstruction using autologous costal cartilage in children. In addition, key aspects of the surgical reconstruction are assessed. After auricular reconstruction, patients completed two questionnaires. The first was a postinterventional health-related quality-of-life assessment tool, the Glasgow Benefit Inventory. A score of 0 signifies no change in health-related quality-of-life, +100 indicates maximal improvement, and -100 indicates maximal negative impact. The second questionnaire assessed surgical outcomes in auricular reconstruction across three areas: facial integration, aesthetic auricular units, and costal reconstruction. These were recorded on a five-point ordinal scale and are presented as mean scores of a total of 5. The mean total Glasgow Benefit Inventory score was 48.1; significant improvements were seen in all three Glasgow Benefit Inventory subscales (p < 0.0001). A mean integration score of 3.8 and a mean aesthetic auricular unit reconstruction score of 3.4 were recorded. Skin color matching (4.3) of the ear was most successfully reconstructed and auricular cartilage reconstruction scored lowest (3.5). Of the aesthetic units, the helix scored highest (3.6) and the tragus/antitragus scored lowest (3.3). Donor-site reconstruction scored 3.9. Correlation analysis revealed that higher reconstruction scores are associated with a greater health-related quality-of-life gain (r = 0.5). Ninety-six percent of patients would recommend the procedure to a friend. Auricular reconstruction with autologous cartilage results in significant improvements in health-related quality-of-life. In addition, better surgical outcomes lead to a greater improvement in health-related quality-of-life. Comparatively poorer reconstructed areas of the ear were identified so that surgical techniques may be improved. Therapeutic, IV.
Aakre, Christopher Ansel; Kitson, Jaben E; Li, Man; Herasevich, Vitaly
2017-05-18
The new sepsis definition has increased the need for frequent sequential organ failure assessment (SOFA) score recalculation, and the clerical burden of information retrieval makes this score ideal for automated calculation. The aim of this study was to (1) estimate the clerical workload of manual SOFA score calculation through a time-motion analysis and (2) describe a user-centered design process for an electronic medical record (EMR) integrated, automated SOFA score calculator with a subsequent usability evaluation study. First, we performed a time-motion analysis by recording time-to-task-completion for the manual calculation of 35 baseline and 35 current SOFA scores by 14 internal medicine residents over a 2-month period. Next, we used an agile development process to create a user interface for a previously developed automated SOFA score calculator. The final user interface usability was evaluated by clinician end users with the Computer Systems Usability Questionnaire. The overall mean (standard deviation, SD) time to complete manual SOFA score calculation was 61.6 (33) s. Among the 24% (12/50) of usability survey respondents, our user-centered user interface design process resulted in >75% favorability of survey items in the domains of system usability, information quality, and interface quality. Early stakeholder engagement in our agile design process resulted in a user interface for an automated SOFA score calculator that reduced clinician workload and met clinicians' needs at the point of care. Emerging interoperable platforms may facilitate dissemination of similarly useful clinical score calculators and decision support algorithms as "apps." A user-centered design process and usability evaluation should be considered during creation of these tools. ©Christopher Ansel Aakre, Jaben E Kitson, Man Li, Vitaly Herasevich. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 18.05.2017.
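As context for the clerical burden being automated, several SOFA components are simple threshold lookups on routine laboratory values. The sketch below scores only three of the six organ systems using commonly published cutoffs; it is a simplified illustration, not the study's EMR-integrated calculator, and the cutoffs should be checked against the local SOFA definition before any use.

```python
# Hedged sketch: three of the six SOFA components (coagulation, liver, renal) scored from
# commonly published cutoffs. Simplified illustration only; respiratory, cardiovascular and
# CNS components are omitted, and thresholds should be verified locally.
def coagulation_points(platelets_k_per_uL: float) -> int:
    cutoffs = [(20, 4), (50, 3), (100, 2), (150, 1)]           # platelets x10^3/uL
    return next((pts for limit, pts in cutoffs if platelets_k_per_uL < limit), 0)

def liver_points(bilirubin_mg_dL: float) -> int:
    cutoffs = [(1.2, 0), (2.0, 1), (6.0, 2), (12.0, 3)]        # total bilirubin, mg/dL
    return next((pts for limit, pts in cutoffs if bilirubin_mg_dL < limit), 4)

def renal_points(creatinine_mg_dL: float) -> int:
    cutoffs = [(1.2, 0), (2.0, 1), (3.5, 2), (5.0, 3)]         # creatinine, mg/dL
    return next((pts for limit, pts in cutoffs if creatinine_mg_dL < limit), 4)

partial_sofa = coagulation_points(85) + liver_points(2.4) + renal_points(1.6)
print("partial SOFA (3 of 6 systems):", partial_sofa)          # 2 + 2 + 1 = 5
```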
A cohort study of low Apgar scores and cognitive outcomes
Odd, D E; Rasmussen, F; Gunnell, D; Lewis, G; Whitelaw, A
2008-01-01
Objective: To investigate the association of brief (0–5 minutes) and prolonged (>5 minutes) low Apgar scores (<7) in non-encephalopathic infants with educational achievement at age 15–16 and intelligence quotients (IQs) at age 18. Design: Population-based record-linkage cohort study of 176 524 male infants born throughout Sweden between 1973 and 1976. Patients and methods: Data from the Medical Birth Register were linked to Population and Housing Censuses, conscription medical records (IQ), and school registers (summary school grade). Infants were classified according to the time for their Apgar score to reach 7 or above. Premature infants and those with encephalopathy were excluded. Results: Infants with brief (OR = 1.14 (1.03–1.27)) or prolonged (OR = 1.35 (1.07–1.69)) low Apgar scores were more likely to have a low IQ score. There was an increased risk of a low IQ score (p = 0.003) the longer it took the infant to achieve a normal Apgar score. There was no association between brief (OR = 0.96 (0.87–1.06)) or prolonged (OR = 1.01 (0.81–1.26)) low Apgar scores and a low summary school grade at age 15–16, or evidence for a trend in the risk of a low school grade (p = 0.61). The estimated proportion with an IQ score below 81 due to transiently low Apgar scores was only 0.7%. Conclusions: Infants in poor condition at birth have increased risk of poor functioning in cognitive tests in later life. This supports the idea of a “continuum of reproductive casualty”, although the small individual effect suggests that these mild degrees of fetal compromise are not of clinical importance. PMID:17916594