Sample records for csv working group

  1. Molecular, serological and biological variation among chickpea chlorotic stunt virus isolates from five countries of North Africa and West Asia.

    PubMed

    Abraham, A D; Menzel, W; Varrelmann, M; Vetten, H Josef

    2009-01-01

    Chickpea chlorotic stunt virus (CpCSV), a proposed new member of the genus Polerovirus (family Luteoviridae), has been reported only from Ethiopia. In attempts to determine the geographical distribution and variability of CpCSV, a pair of degenerate primers derived from conserved domains of the luteovirus coat protein (CP) gene was used for RT-PCR analysis of various legume samples originating from five countries and containing unidentified luteoviruses. Sequencing of the amplicons provided evidence for the occurrence of CpCSV also in Egypt, Morocco, Sudan, and Syria. Phylogenetic analysis of the CP nucleotide sequences of 18 samples from the five countries revealed the existence of two geographic groups of CpCSV isolates differing in CP sequences by 8-10%. Group I included isolates from Ethiopia and Sudan, while group II comprised those from Egypt, Morocco and Syria. For distinguishing these two groups, a simple RFLP test using HindIII and/or PvuII for cleavage of CP-gene-derived PCR products was developed. In ELISA and immunoelectron microscopy, however, isolates from these two groups could not be distinguished with rabbit antisera raised against a group-I isolate from Ethiopia (CpCSV-Eth) and a group-II isolate from Syria (CpCSV-Sy). Since none of the ten monoclonal antibodies (MAbs) that had been produced earlier against CpCSV-Eth reacted with group-II isolates, further MAbs were produced. Of the seven MAbs raised against CpCSV-Sy, two reacted only with CpCSV-Sy and two others with both CpCSV-Sy and -Eth. This indicated that there are group I- and II-specific and common (species-specific) epitopes on the CpCSV CP and that the corresponding MAbs are suitable for specific detection and discrimination of CpCSV isolates. Moreover, CpCSV-Sy (group II) caused more severe stunting and yellowing in faba bean than CpCSV-Eth (group I). 
In conclusion, our data indicate the existence of a geographically associated variation in the molecular, serological and presumably biological properties of CpCSV.

  2. GeoCSV: tabular text formatting for geoscience data

    NASA Astrophysics Data System (ADS)

    Stults, M.; Arko, R. A.; Davis, E.; Ertz, D. J.; Turner, M.; Trabant, C. M.; Valentine, D. W., Jr.; Ahern, T. K.; Carbotte, S. M.; Gurnis, M.; Meertens, C.; Ramamurthy, M. K.; Zaslavsky, I.; McWhirter, J.

    2015-12-01

    The GeoCSV design was developed within the GeoWS project as a way to provide a baseline of compatibility between tabular text data sets from various sub-domains in geoscience. Funded through NSF's EarthCube initiative, the GeoWS project aims to develop common web service interfaces for data access across hydrology, geodesy, seismology, marine geophysics, atmospheric science and other areas. The GeoCSV format is an essential part of delivering data via simple web services for discovery and utilization by both humans and machines. As most geoscience disciplines have developed and use data formats specific for their needs, tabular text data can play a key role as a lowest common denominator useful for exchanging and integrating data across sub-domains. The design starts with a core definition compatible with best practices described by the W3C - CSV on the Web Working Group (CSVW). Compatibility with CSVW is intended to ensure the broadest usability of data expressed as GeoCSV. An optional, simple, but limited metadata description mechanism was added to allow inclusion of important metadata with comma separated data, while staying with the definition of a "dialect" by CSVW. The format is designed both for creating new datasets and to annotate data sets already in a tabular text format such that they are compliant with GeoCSV.
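    The "dialect" described above — ordinary comma-separated data preceded by an optional, simple metadata mechanism — can be sketched in a few lines of Python. The "#" comment prefix and the header keys used here are illustrative assumptions, not taken from the GeoCSV specification:

```python
import csv
import io

# Hypothetical GeoCSV-style content: '#'-prefixed metadata lines
# followed by ordinary comma-separated data (keys are illustrative).
SAMPLE = """\
# dataset: example time series
# field_unit: ISO-8601, degrees_Celsius
time,temperature
2015-12-01T00:00:00Z,3.2
2015-12-01T01:00:00Z,2.9
"""

def parse_geocsv_like(text):
    """Split '#' metadata lines from CSV rows; return (metadata, rows)."""
    metadata = {}
    data_lines = []
    for line in text.splitlines():
        if line.startswith("#"):
            key, _, value = line.lstrip("# ").partition(":")
            metadata[key.strip()] = value.strip()
        elif line.strip():
            data_lines.append(line)
    rows = list(csv.DictReader(io.StringIO("\n".join(data_lines))))
    return metadata, rows

meta, rows = parse_geocsv_like(SAMPLE)
print(meta["dataset"])        # metadata survives alongside the table
print(rows[0]["temperature"])
```

    A generic CSV reader that ignores "#" lines still consumes the data rows unchanged, which is the compatibility property the abstract emphasizes.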

  3. Phase I Study of Oral Vinorelbine in Combination with Erlotinib in Advanced Non-Small Cell Lung Cancer (NSCLC) Using Two Different Schedules

    PubMed Central

    Sutiman, Natalia; Zhang, Zhenxian; Tan, Eng Huat; Ang, Mei Kim; Tan, Shao-Weng Daniel; Toh, Chee Keong; Ng, Quan Sing; Chowbay, Balram; Lim, Wan-Teck

    2016-01-01

    Purpose: This study aimed to evaluate the safety, tolerability and pharmacokinetics of the combination of oral vinorelbine with erlotinib using the conventional (CSV) and metronomic (MSV) dosing schedules in patients with advanced non-small cell lung cancer (NSCLC). Methods: This was an open-label, multiple dose-escalation phase I study. An alternating 3+3 phase I design was employed to allow each schedule to enroll three patients sequentially at each dose level. Thirty patients with Stage IIIB/IV NSCLC were treated with escalating doses of oral vinorelbine starting at 40 mg/m2 on days 1 and 8 in the CSV group (N = 16) and at 100 mg/week in the MSV group (N = 14). Erlotinib was administered orally daily. Results: The maximum tolerated dose was vinorelbine 80 mg/m2 with erlotinib 100 mg in the CSV group and vinorelbine 120 mg/week with erlotinib 100 mg in the MSV group. Grade 3/4 toxicities included neutropenia (N = 2; 13%) and hyponatremia (N = 1; 6%) in the CSV group, and neutropenia (N = 5; 36%) in the MSV group. Objective response was achieved in 38% and 29% in the CSV and MSV groups respectively. Vinorelbine co-administration did not significantly affect the pharmacokinetics of erlotinib and OSI-420 after the initial dose. However, at steady-state, significantly higher Cmax, higher Cmin and lower CL/F of erlotinib were observed with increasing dose levels of vinorelbine in the CSV group. Significantly higher steady-state Cmin, Cavg and AUCss of erlotinib were observed with increasing dose levels of vinorelbine in the MSV group. Conclusions: Combination of oral vinorelbine with erlotinib is feasible and tolerable in both the CSV and MSV groups. Trial Registration: ClinicalTrials.gov NCT00702182 PMID:27135612

  4. The American College of Surgeons Children's Surgery Verification and Quality Improvement Program: implications for anesthesiologists.

    PubMed

    Houck, Constance S; Deshpande, Jayant K; Flick, Randall P

    2017-06-01

    The Task Force for Children's Surgical Care, an ad-hoc multidisciplinary group of invited leaders in pediatric perioperative medicine, was assembled in May 2012 to consider approaches to optimize delivery of children's surgical care in today's competitive national healthcare environment. Over the subsequent 3 years, with support from the American College of Surgeons (ACS) and Children's Hospital Association (CHA), the group established principles regarding perioperative resource standards, quality improvement and safety processes, data collection, and verification that were used to develop an ACS-sponsored Children's Surgery Verification and Quality Improvement Program (ACS CSV). The voluntary ACS CSV was officially launched in January 2017 and more than 125 pediatric surgical programs have expressed interest in verification. ACS CSV-verified programs have specific requirements for pediatric anesthesia leadership, resources, and the availability of pediatric anesthesiologists or anesthesiologists with pediatric expertise to care for infants and young children. The present review outlines the history of the ACS CSV, key elements of the program, and the standards specific to pediatric anesthesiology. As with the pediatric trauma programs initiated more than 40 years ago, this program has the potential to significantly improve surgical care for infants and children in the United States and Canada.

  5. One year follow-up of contrast sensitivity following conventional laser in situ keratomileusis and laser epithelial keratomileusis.

    PubMed

    Townley, Deirdre; Kirwan, Caitriona; O'Keefe, Michael

    2012-02-01

    To determine the effect of conventional laser in situ keratomileusis (LASIK) and laser epithelial keratomileusis (LASEK) for myopia on contrast sensitivity (CS) using the Pelli-Robson and Vector Vision CSV-1000E CS tests. A prospective, comparative study was conducted on 36 eyes of 36 patients with myopia undergoing LASIK (18 eyes) and LASEK (18 eyes). Surgery was performed using the Technolas 217z laser (Bausch & Lomb). CS was recorded preoperatively and at 3, 6 and 12 months postoperatively. No statistically significant difference was found in LogMAR uncorrected visual acuity post-LASIK (-0.02 ± 0.16) and LASEK (-0.04 ± 0.14). Using the Pelli-Robson test, CS was significantly lower in the LASIK group than in the LASEK group 3 and 6 months postoperatively. No significant postoperative reduction in CS was observed in either treatment group. Using the CSV-1000E test, CS was significantly reduced post-LASIK at 3 (p = 0.05) and 6 (p = 0.05) cycles/degree under photopic conditions. No significant postoperative change occurred in the LASEK group under photopic or scotopic conditions. There was no significant difference in postoperative CS between the LASIK and LASEK groups at 3, 6, 12 or 18 cycles/degree using the CSV-1000E test. One year postoperatively, there was no difference in CS between the two treatment groups using the Pelli-Robson and CSV-1000E tests. CS was reduced postoperatively in the LASIK group at the lower spatial frequencies under photopic conditions. No postoperative change was detected in CS following LASIK or LASEK using the Pelli-Robson test. © 2010 The Authors. Journal compilation © 2010 Acta Ophthalmol.

  6. Activation of an Aquareovirus, Chum Salmon Reovirus (CSV), by the Ciliates Tetrahymena thermophila and T. canadensis.

    PubMed

    Pinheiro, Marcel D O; Bols, Niels C

    2018-03-05

    For the first time, ciliates have been found to activate rather than inactivate a virus, chum salmon reovirus (CSV). Activation was seen as an increase in viral titre upon incubation of CSV at 22 °C with Tetrahymena canadensis and two strains of T. thermophila: wild type (B1975) and a temperature conditional mutant for phagocytosis (NP1). The titre increase was not likely due to replication because CSV had no visible effects on the ciliates and no vertebrate virus has ever been shown unequivocally to replicate in ciliates. When incubated with B1975 and NP1 at 30 °C, CSV was activated only by B1975. Therefore, activation required CSV internalization because at 30 °C only B1975 exhibited phagocytosis. CSV replicated in fish cells at 18 to 26 °C but not at 30 °C. Collectively, these observations point to CSV activation being distinct from replication. Activation is attributed to the CSV capsid being modified in the ciliate phagosomal-lysosomal system and released in a more infectious form. When allowed to swim in CSV-infected fish cell cultures, collected, washed, and transferred to uninfected cultures, T. canadensis caused a CSV infection. Overall, the results suggest that ciliates could have roles in the environmental dissemination of some fish viral diseases. © 2018 The Author(s) Journal of Eukaryotic Microbiology © 2018 International Society of Protistologists.

  7. Discovery of cell surface vimentin targeting mAb for direct disruption of GBM tumor initiating cells.

    PubMed

    Noh, Hyangsoon; Yan, Jun; Hong, Sungguan; Kong, Ling-Yuan; Gabrusiewicz, Konrad; Xia, Xueqing; Heimberger, Amy B; Li, Shulin

    2016-11-01

    Intracellular vimentin overexpression has been associated with epithelial-mesenchymal transition, metastasis, invasion, and proliferation, but cell surface vimentin (CSV) is less understood. Furthermore, it remains unknown whether CSV can serve as a therapeutic target in CSV-expressing tumor cells. We found that CSV was present on glioblastoma multiforme (GBM) cancer stem cells and that CSV expression was associated with spheroid formation in those cells. A newly developed monoclonal antibody against CSV, 86C, specifically and significantly induced apoptosis and inhibited spheroid formation in GBM cells in vitro. The addition of 86C to GBM cells in vitro also led to rapid internalization of vimentin and decreased GBM cell viability. These findings were associated with an increase in caspase-3 activity, indicating activation of apoptosis. Finally, treatment with 86C inhibited GBM progression in vivo. In conclusion, CSV-expressing GBM cells have properties of tumor initiating cells, and targeting CSV with the monoclonal antibody 86C is a promising approach in the treatment of GBM.

  8. Wadeable Streams Assessment Data

    EPA Pesticide Factsheets

    The Wadeable Streams Assessment (WSA) is a first-ever statistically-valid survey of the biological condition of small streams throughout the U.S. The U.S. Environmental Protection Agency (EPA) worked with the states to conduct the assessment in 2004-2005. Data for each parameter sampled in the Wadeable Streams Assessment (WSA) are available for downloading in a series of files as comma separated values (*.csv). Each *.csv data file has a companion text file (*.txt) that lists a dataset label and individual descriptions for each variable. Users should view the *.txt files first to help guide their understanding and use of the data.

  9. EPA FRS Facilities State Single File CSV Download

    EPA Pesticide Factsheets

    This page provides state comma separated value (CSV) files containing key information of all facilities and sites within the Facility Registry System (FRS). Each state zip file contains a single CSV file of key facility-level information.

  10. A Preliminary Investigation of Reversing RML: From an RDF dataset to its Column-Based data source

    PubMed Central

    Gougousis, Alexandros

    2015-01-01

    Abstract Background A large percentage of scientific data with tabular structure are published on the Web of Data as interlinked RDF datasets. When it comes to the long-term preservation of such RDF-based digital objects, it is important to provide full support for reusing them in the future. In particular, preservation should serve users who have no familiarity with the RDF data model and who work only with the native format of the data, while still providing them with sufficient information. To achieve this, we need mechanisms to bring the data back to their original format and structure. New information In this paper, we investigate how to perform the reverse process for column-based data sources. In particular, we devise an algorithm, RML2CSV, and exemplify its implementation in transforming an RDF dataset into its CSV tabular structure, through the use of the same RML mapping document that was used to generate the set of RDF triples. Through a set of content-based criteria, we attempt a comparative evaluation to measure the similarity between the rebuilt CSV and the original one. The results are promising and show that, under certain assumptions, RML2CSV reconstructs the same data with the same structure, offering more advanced digital preservation services. PMID:26312054
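    The reversal the paper describes can be illustrated in a drastically simplified form by grouping triples back into rows. The toy triples and the predicate-to-column mapping below are hypothetical stand-ins for what a real RML mapping document encodes:

```python
import csv
import io

# Toy RDF triples (subject, predicate, object) and an invented mapping
# from predicates back to source column names -- a drastic simplification
# of the information carried by an RML mapping document.
TRIPLES = [
    ("row1", "ex:name", "alice"),
    ("row1", "ex:age", "30"),
    ("row2", "ex:name", "bob"),
    ("row2", "ex:age", "25"),
]
PREDICATE_TO_COLUMN = {"ex:name": "name", "ex:age": "age"}

def triples_to_csv(triples, pred_to_col):
    """Group triples by subject and emit one CSV row per subject."""
    rows = {}
    for subject, predicate, obj in triples:
        rows.setdefault(subject, {})[pred_to_col[predicate]] = obj
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(pred_to_col.values()))
    writer.writeheader()
    for subject in sorted(rows):
        writer.writerow(rows[subject])
    return buf.getvalue()

print(triples_to_csv(TRIPLES, PREDICATE_TO_COLUMN))
```

    The real algorithm must additionally recover column order, handle missing values, and invert URI templates, which is where the paper's "certain assumptions" come in.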

  11. Molecular Characterization of Watermelon Chlorotic Stunt Virus (WmCSV) from Palestine

    PubMed Central

    Ali-Shtayeh, Mohammed S.; Jamous, Rana M.; Mallah, Omar B.; Abu-Zeitoun, Salam Y.

    2014-01-01

    The incidence of watermelon chlorotic stunt disease and the molecular characterization of the Palestinian isolate of Watermelon chlorotic stunt virus (WmCSV-[PAL]) are described in this study. Symptomatic leaf samples obtained from watermelon (Citrullus lanatus (Thunb.)) and cucumber (Cucumis sativus L.) plants were tested for WmCSV-[PAL] infection by polymerase chain reaction (PCR) and Rolling Circle Amplification (RCA). Disease incidence ranged from 25% to 98% in watermelon fields in the studied area, and 77% of leaf samples collected from Jenin were found to be mixed infections of WmCSV-[PAL] and SLCV. The full-length DNA-A and DNA-B genomes of WmCSV-[PAL] were amplified and sequenced, and the sequences were deposited in the GenBank. Sequence analysis of the virus genomes showed that DNA-A and DNA-B had 97.6%–99.42% and 93.16%–98.26% nucleotide identity, respectively, with other virus isolates in the region. Sequence analysis also revealed that the Palestinian isolate of WmCSV shared the highest nucleotide identity with an isolate from Israel, suggesting that the virus was introduced to Palestine from Israel. PMID:24956181

  12. Differential Responses to a Visual Self-Motion Signal in Human Medial Cortical Regions Revealed by Wide-View Stimulation

    PubMed Central

    Wada, Atsushi; Sakano, Yuichi; Ando, Hiroshi

    2016-01-01

    Vision is important for estimating self-motion, which is thought to involve optic-flow processing. Here, we investigated the fMRI response profiles in visual area V6, the precuneus motion area (PcM), and the cingulate sulcus visual area (CSv)—three medial brain regions recently shown to be sensitive to optic-flow. We used wide-view stereoscopic stimulation to induce robust self-motion processing. Stimuli included static, randomly moving, and coherently moving dots (simulating forward self-motion). We varied the stimulus size and the presence of stereoscopic information. A combination of univariate and multi-voxel pattern analyses (MVPA) revealed that fMRI responses in the three regions differed from each other. The univariate analysis identified optic-flow selectivity and an effect of stimulus size in V6, PcM, and CSv, among which only CSv showed a significantly lower response to random motion stimuli compared with static conditions. Furthermore, MVPA revealed an optic-flow specific multi-voxel pattern in the PcM and CSv, where the discrimination of coherent motion from both random motion and static conditions showed above-chance prediction accuracy, but that of random motion from static conditions did not. Additionally, while area V6 successfully classified different stimulus sizes regardless of motion pattern, this classification was only partial in PcM and was absent in CSv. This may reflect the known retinotopic representation in V6 and the absence of such clear visuospatial representation in CSv. We also found significant correlations between the strength of subjective self-motion and univariate activation in all examined regions except for primary visual cortex (V1). This neuro-perceptual correlation was significantly higher for V6, PcM, and CSv when compared with V1, and higher for CSv when compared with the visual motion area hMT+. 
Our convergent results suggest the significant involvement of CSv in self-motion processing, which may give rise to its percept. PMID:26973588

  13. VizieR Online Data Catalog: Parenago Catalog of Stars in Orion Nebula (Parenago 1954)

    NASA Astrophysics Data System (ADS)

    Parenago, P. P.

    1997-10-01

    The present catalogue is a machine-readable version of the catalogue of stars in the area of the Orion nebula published by P.P. Parenago (1954). That work investigated the sky area containing the Orion nebula, between 5h 24m and 5h 36m in right ascension (1900.0) and between -4 and -7 degrees in declination (1900.0). Ten variable stars in the original Parenago (1954) catalogue had CSV numbers (Kukarkin et al., 1951), but since that time all of them have been confirmed as variables and included in the GCVS (Kholopov et al., 1985a&b, 1987). In the machine-readable version we replaced the CSV numbers with GCVS names for the following stars:

    Number in the catalogue   CSV number   GCVS name
    -----------------------   ----------   ---------
    1605                      606          V372 ORI
    1613                      607          V373 ORI
    1635                      608          V374 ORI
    1713                      609          V375 ORI
    1748                      610          V387 ORI
    1762                      100569       V376 ORI
    1974                      617          V377 ORI
    2183                      625          V388 ORI
    2393                      630          V380 ORI
    2478                      634          V381 ORI

    (1 data file).

  14. Childhood Sexual Violence in Indonesia: A Systematic Review.

    PubMed

    Rumble, Lauren; Febrianto, Ryan Fajar; Larasati, Melania Niken; Hamilton, Carolyn; Mathews, Ben; Dunne, Michael P

    2018-01-01

    There has been relatively little research into the prevalence of childhood sexual violence (CSV) as well as the risk and protective factors for CSV in low- and middle-income countries including Indonesia. Systematic searches conducted in English and Bahasa Indonesia in this review identified 594 records published between 2006 and 2016 in peer-reviewed journals and other literature including 299 Indonesian records. Fifteen studies, including nine prevalence studies, met the quality appraisal criteria developed for this review. The review found that CSV research is scarce: Only one study included nationally representative prevalence estimates. Varying definitions for CSV, survey methods, and sample characteristics limited the generalizability of the data. The available evidence points to significant risk of sexual violence affecting both girls and boys across many geographical and institutional settings. Married adolescent girls are vulnerable to sexual violence by partners in their homes. Children in schools are vulnerable to CSV by peers and adults. Victims seldom disclose incidents and rarely seek support. In addition, early childhood experiences of trauma were strongly associated with later perpetration of sexual violence and revictimization. Limited information is available about protective factors. This review synthesizes evidence about what is currently known about CSV in Indonesia and identifies the strengths and weaknesses of the existing research. A more robust evidence base regarding CSV is required to better inform policy and justify investment into prevention programs.

  15. Voltammetric determination of arsenic in high iron and manganese groundwaters.

    PubMed

    Gibbon-Walsh, Kristoff; Salaün, Pascal; Uroic, M Kalle; Feldmann, Joerg; McArthur, John M; van den Berg, Constant M G

    2011-09-15

    Determination of the speciation of arsenic in groundwaters, using cathodic stripping voltammetry (CSV), is severely hampered by high levels of iron and manganese. Experiments showed that the interference is eliminated by addition of EDTA, making it possible to determine the arsenic speciation on-site by CSV. This work presents the CSV method to determine As(III) in high-iron or -manganese groundwaters in the field with only minor sample treatment. The method was field-tested in West Bengal (India) on a series of groundwater samples. Total arsenic was subsequently determined after acidification to pH 1 by anodic stripping voltammetry (ASV). Comparative measurements by ICP-MS as reference method for total As, and by HPLC for its speciation, were used to corroborate the field data in stored samples. Most of the arsenic (78±0.02%) was found to occur as inorganic As(III) in the freshly collected waters, in accordance with previous studies. The data show that the modified on-site CSV method for As(III) is a good measure of water contamination with As. The EDTA was also found to be effective in stabilising the arsenic speciation for long-term sample storage at room temperature. Without sample preservation, in water exposed to air and sunlight, the As(III) was found to become oxidised to As(V), and Fe(II) oxidised to Fe(III), removing the As(V) by adsorption on precipitating Fe(III)-hydroxides within a few hours. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. Elements of a next generation time-series ASCII data file format for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Webster, C. J.

    2015-12-01

    Data in ASCII comma separated value (CSV) format are recognized as the most simple, straightforward and readable type of data present in the geosciences. Many scientific workflows developed over the years rely on data using this simple format. However, there is a need for a lightweight ASCII header format standard that is easy to create and easy to work with. Current OGC grade XML standards are complex and difficult to implement for researchers with few resources. Ideally, such a format should provide the data in CSV for easy consumption by generic applications such as spreadsheets. The format should use an existing time standard. The header should be easily human readable as well as machine parsable. The metadata format should be extendable to allow vocabularies to be adopted as they are created by external standards bodies. The creation of such a format will increase the productivity of software engineers and scientists because fewer translators and checkers would be required.
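    As a sketch of the properties called for above — plain CSV readable by generic tools, with an existing time standard (ISO 8601) for timestamps — consider the following minimal round trip; the column names and sample values are invented for illustration:

```python
import csv
import io
from datetime import datetime, timezone

# Invented example rows using ISO 8601 (an existing time standard),
# readable by any spreadsheet and parsable by standard libraries.
rows = [
    {"time": "2015-12-01T00:00:00+00:00", "value": "1.5"},
    {"time": "2015-12-01T00:10:00+00:00", "value": "1.7"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["time", "value"])
writer.writeheader()
writer.writerows(rows)

# Round trip: a generic CSV reader plus datetime.fromisoformat is all
# a consumer needs -- no custom translator required.
parsed = [
    (datetime.fromisoformat(r["time"]), float(r["value"]))
    for r in csv.DictReader(io.StringIO(buf.getvalue()))
]
assert parsed[0][0].tzinfo == timezone.utc
```

    Because both the writer and the reader come from standard libraries, no per-discipline translator or checker is needed, which is exactly the productivity argument the abstract makes.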

  17. Assessing the Adaptability to Irregular Rest-Work Rhythms in Military Personnel

    DTIC Science & Technology

    2000-03-01

    Aeronautica Militare Italiana, CSV Reparto Medicina Aerospaziale, Pratica di Mare, 00040 Pomezia (Roma), Italy; Dipartimento di Psicologia, Università… characteristics to adapt to work at unusual hours; increased errors and impaired social and family relationships. In addition, there is an increased mortality… DUTY: a brief questionnaire on the sleep-wake cycle and vigilance filled in… critical evaluation. Journal of Personality and Social Psychology. 58: 844

  18. Generating Ship-to-Shore Bulk Fuel Delivery Schedules for the Marine Expeditionary Unit

    DTIC Science & Technology

    2017-06-01

    Amphibious Ready Group; Amphibious Connectors; Fuel Containers… ARG Amphibious Ready Group; BLT Battalion Landing Team; COMPHIBRON Commander, Amphibious Squadron; CSV Comma Separated Values; LCAC Landing Craft Air… in the world. The MEU and the Amphibious Ready Group (ARG) create a highly capable amphibious force able to strike and conduct operations from the sea

  19. The crystal structures of potassium and cesium trivanadates

    USGS Publications Warehouse

    Evans, H.T.; Block, S.

    1966-01-01

    Potassium and cesium trivanadates are monoclinic and isomorphous, space group P21/m, with the following dimensions (Z = 2): KV3O8, a = 7.640 Å, b = 8.380 Å, c = 4.979 Å, β = 96° 57′; CsV3O8, a = 8.176 Å, b = 8.519 Å, c = 4.988 Å, β = 95° 32′. The crystal structure of KV3O8 has been determined from hk0, 0kl, and h0l Weissenberg data with an R factor of 0.15. The structure of CsV3O8 has been refined with 1273 hkl Weissenberg data to an R factor of 0.089. The structures consist of corrugated sheets based on a linkage of distorted VO6 octahedra. Two of the vanadium atoms lie in double, square-pyramid groups V2O8, which are linked through opposite basal corners into chains along the b axis. The chains are joined laterally along the c axis into sheets by the third vanadium atom in VO groups, also forming part of a square-pyramid coordination. Various aspects of these structures are compared with other known oxovanadate structures.

  20. Primary mental health care for survivors of collective sexual violence in Rwanda.

    PubMed

    Zraly, Maggie; Rubin-Smith, Julia; Betancourt, Theresa

    2011-01-01

    This paper draws attention to the obligation and opportunity to respond to the mental health impacts of collective sexual violence (CSV) among genocide-rape survivors in post-genocide Rwanda. Qualitative data gathered from CSV survivors who were members of Rwandan women's genocide survivor associations are presented to illustrate how they strive to overcome adversity while seeking access to quality mental health care and using informal community mental health services. The results reveal that a system of high quality, holistic health and mental health care is yet needed to meet Rwandan CSV survivors' complex and serious health and mental health needs. Given that a rural health system, modelled on community-based, comprehensive HIV/AIDS care and treatment, is currently being implemented in Rwanda, we recommend enhancements to this model that would contribute to meeting the mental health care needs of CSV survivors while benefiting the health and mental health system as a whole within Rwanda.

  1. Induction of antiviral genes, Mx and vig-1, by dsRNA and Chum salmon reovirus in rainbow trout monocyte/macrophage and fibroblast cell lines.

    PubMed

    DeWitte-Orr, Stephanie J; Leong, Jo-Ann C; Bols, Niels C

    2007-09-01

    The expression of potential antiviral genes, Mx1, Mx2, Mx3 and vig-1, was studied in two rainbow trout cell lines: monocyte/macrophage RTS11 and fibroblast-like RTG-2. Transcripts were monitored by RT-PCR; Mx protein by Western blotting. In unstimulated cultures Mx1 and vig-1 transcripts were seen occasionally in RTS11 but rarely in RTG-2. A low level of Mx protein was seen in unstimulated RTS11 but not in RTG-2. In both cell lines, Mx and vig-1 transcripts were induced by a dsRNA, poly inosinic: poly cytidylic acid (poly IC), and by Chum salmon reovirus (CSV). Medium conditioned by cells previously exposed to poly IC or CSV and assumed to contain interferon (IFN) induced the antiviral genes in RTS11. However, RTG-2 responded only to medium conditioned by RTG-2 exposed previously to CSV. In both cell lines, poly IC and CSV induced Mx transcripts in the presence of cycloheximide, suggesting a direct induction mechanism, independent of IFN, was also possible. For CSV, ribavirin blocked induction in RTS11 but not in RTG-2, suggesting viral RNA synthesis was required for induction only in RTS11. In both RTS11 and RTG-2 cultures, Mx protein showed enhanced accumulation by 24h after exposure to poly IC and CSV, but subsequently Mx protein levels declined back to control levels in RTS11 but not in RTG-2. These results suggest that Mx can be regulated differently in macrophages and fibroblasts.

  2. Role Discovery in Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-08-14

    RolX takes the features from Re-FeX or any other feature matrix as input and outputs role assignments (clusters). The output of RolX is a CSV file containing the node-role memberships and a CSV file containing the role-feature definitions.
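    As a sketch of how such output files might be consumed, the snippet below derives each node's dominant role from a node-role membership table. The file contents and column layout are assumptions for illustration, not the tool's documented schema:

```python
import csv
import io

# Hypothetical contents of the two output files described above;
# the column layout here is an assumption for illustration only.
NODE_ROLE_CSV = "node,role0,role1\na,0.9,0.1\nb,0.2,0.8\n"
ROLE_FEATURE_CSV = "role,degree,clustering\nrole0,0.7,0.3\nrole1,0.1,0.9\n"

def dominant_roles(node_role_csv):
    """Map each node to the role with the highest membership weight."""
    result = {}
    for row in csv.DictReader(io.StringIO(node_role_csv)):
        node = row.pop("node")
        result[node] = max(row, key=lambda r: float(row[r]))
    return result

print(dominant_roles(NODE_ROLE_CSV))  # e.g. {'a': 'role0', 'b': 'role1'}
```

    The role-feature file could be joined on the role column in the same way to explain what each role means in terms of the input features.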

  3. Complete Nucleotide Sequence of Watermelon Chlorotic Stunt Virus Originating from Oman

    PubMed Central

    Khan, Akhtar J.; Akhtar, Sohail; Briddon, Rob W.; Ammara, Um; Al-Matrooshi, Abdulrahman M.; Mansoor, Shahid

    2012-01-01

    Watermelon chlorotic stunt virus (WmCSV) is a bipartite begomovirus (genus Begomovirus, family Geminiviridae) that causes economic losses to cucurbits, particularly watermelon, across the Middle East and North Africa. Recently squash (Cucurbita moschata) grown in an experimental field in Oman was found to display symptoms such as leaf curling, yellowing and stunting, typical of a begomovirus infection. Sequence analysis of the virus isolated from squash showed 97.6–99.9% nucleotide sequence identity to previously described WmCSV isolates for the DNA A component and 93–98% identity for the DNA B component. Agrobacterium-mediated inoculation to Nicotiana benthamiana resulted in the development of symptoms fifteen days post inoculation. This is the first bipartite begomovirus identified in Oman. Overall the Oman isolate showed the highest levels of sequence identity to a WmCSV isolate originating from Iran, which was confirmed by phylogenetic analysis. This suggests that WmCSV present in Oman has been introduced from Iran. The significance of this finding is discussed. PMID:22852046

  4. Complete nucleotide sequence of watermelon chlorotic stunt virus originating from Oman.

    PubMed

    Khan, Akhtar J; Akhtar, Sohail; Briddon, Rob W; Ammara, Um; Al-Matrooshi, Abdulrahman M; Mansoor, Shahid

    2012-07-01

    Watermelon chlorotic stunt virus (WmCSV) is a bipartite begomovirus (genus Begomovirus, family Geminiviridae) that causes economic losses to cucurbits, particularly watermelon, across the Middle East and North Africa. Recently squash (Cucurbita moschata) grown in an experimental field in Oman was found to display symptoms such as leaf curling, yellowing and stunting, typical of a begomovirus infection. Sequence analysis of the virus isolated from squash showed 97.6-99.9% nucleotide sequence identity to previously described WmCSV isolates for the DNA A component and 93-98% identity for the DNA B component. Agrobacterium-mediated inoculation to Nicotiana benthamiana resulted in the development of symptoms fifteen days post inoculation. This is the first bipartite begomovirus identified in Oman. Overall the Oman isolate showed the highest levels of sequence identity to a WmCSV isolate originating from Iran, which was confirmed by phylogenetic analysis. This suggests that WmCSV present in Oman has been introduced from Iran. The significance of this finding is discussed.

  5. Carbon Source-Dependent Inducible Metabolism of Veratryl Alcohol and Ferulic Acid in Pseudomonas putida CSV86

    PubMed Central

    Mohan, Karishma

    2017-01-01

    ABSTRACT Pseudomonas putida CSV86 degrades lignin-derived metabolic intermediates, viz., veratryl alcohol, ferulic acid, vanillin, and vanillic acid, as the sole sources of carbon and energy. Strain CSV86 also degraded lignin sulfonate. Cell respiration, enzyme activity, biotransformation, and high-pressure liquid chromatography (HPLC) analyses suggest that veratryl alcohol and ferulic acid are metabolized to vanillic acid by two distinct carbon source-dependent inducible pathways. Vanillic acid was further metabolized to protocatechuic acid and entered the central carbon pathway via the β-ketoadipate route after ortho ring cleavage. Genes encoding putative enzymes involved in the degradation were found to be present at the fer, ver, and van loci. Transcriptional analysis suggests a carbon source-dependent cotranscription of these loci, substantiating the metabolic studies. Biochemical and quantitative real-time (qRT)-PCR studies revealed the presence of two distinct O-demethylases, viz., VerAB and VanAB, involved in the oxidative demethylation of veratric acid and vanillic acid, respectively. This report describes the various steps involved in metabolizing lignin-derived aromatic compounds at the biochemical level, identifies the genes involved in degrading veratric acid, and establishes the arrangement of phenylpropanoid metabolic genes as three distinct inducible transcription units/operons. This study provides insight into the bacterial degradation of lignin-derived aromatics and the potential of P. putida CSV86 as a suitable candidate for producing valuable products. IMPORTANCE Pseudomonas putida CSV86 metabolizes lignin and its metabolic intermediates as a carbon source. Strain CSV86 displays the unique property of preferentially utilizing aromatics, including phenylpropanoids, over glucose.
This report unravels veratryl alcohol metabolism and genes encoding veratric acid O-demethylase, hitherto unknown in pseudomonads, thereby providing new insight into the metabolic pathway and gene pool for lignin degradation in bacteria. The biochemical and genetic characterization of phenylpropanoid metabolism makes strain CSV86 a prospective system for producing valuable products, such as vanillin and vanillic acid, from lignocellulose. This study supports the immense potential of P. putida CSV86 as a suitable candidate for bioremediation and biorefinery. PMID:28188206

  6. Recursive Feature Extraction in Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-08-14

    ReFeX extracts recursive topological features from graph data. The input is a graph as a csv file and the output is a csv file containing feature values for each node in the graph. The features are based on topological counts in the neighborhoods of each node, as well as recursive summaries of neighbors' features.
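    The recursive idea can be illustrated in a few lines: compute a local feature (degree) from an edge-list CSV, then summarize each node's neighbors' feature, which is the kind of recursive aggregation ReFeX performs. The input layout is an illustrative assumption:

```python
import csv
import io
from collections import defaultdict

# Hypothetical edge-list input: one "source,target" pair per line.
edges_csv = """source,target
a,b
a,c
b,c
c,d
"""

# Build an undirected adjacency structure.
graph = defaultdict(set)
for row in csv.DictReader(io.StringIO(edges_csv)):
    graph[row["source"]].add(row["target"])
    graph[row["target"]].add(row["source"])

# Base (local) feature: node degree.
degree = {n: len(nbrs) for n, nbrs in graph.items()}

# One recursive round: summarize each node's neighborhood by the
# mean degree of its neighbors.
mean_nbr_degree = {
    n: sum(degree[m] for m in nbrs) / len(nbrs) for n, nbrs in graph.items()
}
print(degree, mean_nbr_degree)
```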

  7. A Prototype Web-based system for GOES-R Space Weather Data

    NASA Astrophysics Data System (ADS)

    Sundaravel, A.; Wilkinson, D. C.

    2010-12-01

    The Geostationary Operational Environmental Satellite-R Series (GOES-R) makes use of advanced instruments and technologies to monitor the Earth's surface and provide accurate space weather data. The first GOES-R series satellite is scheduled to be launched in 2015. The data from the satellite will be widely used by scientists for space weather modeling and predictions. This project looks into ways these datasets can be made available to scientists on the Web to assist them in their research. We are working to develop a prototype web-based system that allows users to browse, search and download these data. The GOES-R datasets will be archived in NetCDF (Network Common Data Form) and CSV (Comma Separated Values) format. NetCDF is a self-describing data format that contains both the metadata and the data; the data is stored in an array-oriented fashion. The web-based system will offer services in two ways: via a web application (portal) and via web services. Using the web application, users can download data in NetCDF or CSV format and can also plot a graph of the data. The web page displays the various categories of data and the time intervals for which the data is available. The web application (client) sends the user query to the server, which then connects to the data sources to retrieve the data and delivers it to the users. Data access will also be provided via SOAP (Simple Object Access Protocol) and REST (Representational State Transfer) web services. These provide functions which can be used by other applications to fetch data and use the data for further processing. To build the prototype system, we are making use of proxy data from existing GOES and POES space weather datasets. Java is the programming language used in developing tools that format data to NetCDF and CSV. For the web technology we have chosen Grails to develop both the web application and the services.
Grails is an open source web application framework based on the Groovy language. We are also making use of the THREDDS (Thematic Realtime Environmental Distributed Data Services) server to publish and access the NetCDF files. We have completed software tools to generate NetCDF and CSV data files, as well as tools to translate NetCDF to CSV. The current phase of the project involves designing and developing the web interface.
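    The NetCDF-to-CSV translation mentioned above amounts to flattening array-oriented variables into one row per index. A minimal sketch of that step (variable names and values are illustrative, not actual GOES-R data or the project's Java tools):

```python
import csv
import io

# NetCDF stores each variable as a named array plus metadata;
# a CSV export needs one header row and one data row per time step.
# This toy dataset stands in for the parsed variables.
dataset = {
    "time":        ["2010-12-01T00:00Z", "2010-12-01T00:05Z"],
    "proton_flux": [1.2, 1.5],
    "xray_flux":   [3.1e-6, 2.9e-6],
}

def to_csv(ds):
    """Flatten a dict of equal-length variable arrays into CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(ds.keys())           # header from variable names
    writer.writerows(zip(*ds.values()))  # one row per array index
    return buf.getvalue()

print(to_csv(dataset))
```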

  8. The Last Word on TLE (Briefing Charts)

    DTIC Science & Technology

    2015-05-12

    hist, show, plot, figure import pandas as pd from os import chdir 16 File processing • filename="TPSrun.csv" arr = pd.read_csv(filename) r... pandas theta = atan2(model.beta.x, 1.0) tArrX=[ ] ; tArrY=[ ] # define arrays for i in range(len(arr.y)): # how to put stuff into an array! append

  9. Matter-induced charge-symmetry-violating NN potential

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biswas, Subhrajyoti; Roy, Pradip; Dutt-Mazumder, Abhee K.

    2010-01-15

    We construct a density-dependent, Class III, charge-symmetry-violating (CSV) potential due to ρ-ω meson mixing with off-shell corrections. Here, in addition to the usual vacuum contribution, the matter-induced ρ-ω mixing is also included. It is observed that the contribution of the density-dependent CSV potential is comparable to that of the vacuum contribution.

  10. Storm Prediction Center Today's Storm Reports

    Science.gov Websites

    )(?) Time Location County State Lat Lon Comments 2056 7 N BUFORD ALBANY WY 4121 10530 TORNADO SPOTTED NORTH 4236 10503 ON THE GROUND AT THIS TIME. (CYS) 2215 15 N CHEYENNE LARAMIE WY 4136 10479 TORNADO MOVING TOWARDS I-25 BETWEEN MM 25 AND 35. (CYS) Hail Reports (CSV) (Raw Hail CSV)(?) Time Size Location County

  11. Genotypic variability enhances the reproducibility of an ecological study.

    PubMed

    Milcu, Alexandru; Puga-Freitas, Ruben; Ellison, Aaron M; Blouin, Manuel; Scheu, Stefan; Freschet, Grégoire T; Rose, Laura; Barot, Sebastien; Cesarz, Simone; Eisenhauer, Nico; Girin, Thomas; Assandri, Davide; Bonkowski, Michael; Buchmann, Nina; Butenschoen, Olaf; Devidal, Sebastien; Gleixner, Gerd; Gessler, Arthur; Gigon, Agnès; Greiner, Anna; Grignani, Carlo; Hansart, Amandine; Kayler, Zachary; Lange, Markus; Lata, Jean-Christophe; Le Galliard, Jean-François; Lukac, Martin; Mannerheim, Neringa; Müller, Marina E H; Pando, Anne; Rotter, Paula; Scherer-Lorenzen, Michael; Seyhun, Rahme; Urban-Mead, Katherine; Weigelt, Alexandra; Zavattaro, Laura; Roy, Jacques

    2018-02-01

    Many scientific disciplines are currently experiencing a 'reproducibility crisis' because numerous scientific findings cannot be repeated consistently. A novel but controversial hypothesis postulates that stringent levels of environmental and biotic standardization in experimental studies reduce reproducibility by amplifying the impacts of laboratory-specific environmental factors not accounted for in study designs. A corollary to this hypothesis is that a deliberate introduction of controlled systematic variability (CSV) in experimental designs may lead to increased reproducibility. To test this hypothesis, we had 14 European laboratories run a simple microcosm experiment using grass (Brachypodium distachyon L.) monocultures and grass and legume (Medicago truncatula Gaertn.) mixtures. Each laboratory introduced environmental and genotypic CSV within and among replicated microcosms established in either growth chambers (with stringent control of environmental conditions) or glasshouses (with more variable environmental conditions). The introduction of genotypic CSV led to 18% lower among-laboratory variability in growth chambers, indicating increased reproducibility, but had no significant effect in glasshouses where reproducibility was generally lower. Environmental CSV had little effect on reproducibility. Although there are multiple causes for the 'reproducibility crisis', deliberately including genetic variability may be a simple solution for increasing the reproducibility of ecological studies performed under stringently controlled environmental conditions.

  12. ρ-ω mixing and spin-dependent charge-symmetry-violating potential

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biswas, Subhrajyoti; Roy, Pradip; Dutt-Mazumder, Abhee K.

    2008-10-15

    We construct the charge-symmetry-violating (CSV) nucleon-nucleon potential induced by ρ⁰-ω mixing due to the neutron-proton mass difference driven by the NN loop. An analytical expression for the two-body CSV potential is presented, containing both the central and noncentral NN interactions. We show that the ρNN tensor interaction can significantly enhance the charge-symmetry-violating NN interaction even if the momentum-dependent off-shell ρ⁰-ω mixing amplitude is considered. It is also shown that the inclusion of form factors removes the divergence arising from the contact interaction. Consequently, we see that the precise size of the computed scattering length difference depends on how the short-range aspects of the CSV potential are treated.

  13. Facilitating the analysis of the multifocal electroretinogram using the free software environment R.

    PubMed

    Bergholz, Richard; Rossel, Mirjam; Dutescu, Ralf M; Vöge, Klaas P; Salchow, Daniel J

    2018-01-01

    The large amount of data rendered by the multifocal electroretinogram (mfERG) can be analyzed and visualized in various ways. The evaluation and comparison of more than one examination is time-consuming and error-prone. Using the free software environment R, we developed a solution to average the data of multiple examinations and to allow a comparison of different patient groups. Data of single mfERG recordings, as exported in .csv format from a RETIport 21 system (version 7/03, Roland Consult), or manually compiled .csv files are the basis for the calculations. The R software extracts response densities and implicit times of N1 and P1 for the sum response, each ring eccentricity, and each single hexagon. Averages can be calculated for as many subjects as needed. The mentioned parameters can then be compared to another group of patients or healthy subjects. Application of the software is illustrated by comparing 11 patients with chloroquine maculopathy to a control group of 7 healthy subjects. The software scripts display response density and implicit time 3D plots of each examination as well as of the group averages. Differences between the group averages are presented as 3D and grayscale 2D plots. The two groups are compared using the t-test with Bonferroni correction. The group comparison is furthermore illustrated by the average waveforms and by boxplots of each eccentricity. This software solution, based on the programming language R, facilitates the clinical and scientific use of the mfERG and aids in interpretation and analysis.
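    The per-ring averaging step can be sketched as follows. This is a Python illustration of the idea rather than the authors' R scripts, and the column names are assumptions:

```python
import csv
import io

# Two hypothetical exported exams: P1 response density per ring
# eccentricity (column names are illustrative, not the RETIport format).
exam_csvs = [
    "ring,p1_density\n1,120.0\n2,60.0\n3,30.0\n",
    "ring,p1_density\n1,110.0\n2,58.0\n3,34.0\n",
]

def average_by_ring(texts):
    """Average P1 response density per ring across several exams."""
    sums, counts = {}, {}
    for text in texts:
        for row in csv.DictReader(io.StringIO(text)):
            ring = int(row["ring"])
            sums[ring] = sums.get(ring, 0.0) + float(row["p1_density"])
            counts[ring] = counts.get(ring, 0) + 1
    return {ring: sums[ring] / counts[ring] for ring in sums}

print(average_by_ring(exam_csvs))
```

The same accumulate-then-divide pattern extends to implicit times and to per-hexagon values, and group averages computed this way can then be compared between patient cohorts.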

  14. Analysis of Forensic Super Timelines

    DTIC Science & Technology

    2012-06-14

    Components of Incident Response (Mandia, Prosise & Pepe, 2003). Detection of an incident can be complex. It can occur through the use of an intrusion ...ECHO =================================== REM - Convert DirList.txt to CSV File, DirList.CSV REM ...Directory Processing REM - NOTE: Must use !_dir! instead of %_dir% since it’s in the executing line of a loop FOR /F "tokens=1,2,3,4,5*" %%G IN

  15. Transcriptional Modulation of Transport- and Metabolism-Associated Gene Clusters Leading to Utilization of Benzoate in Preference to Glucose in Pseudomonas putida CSV86

    PubMed Central

    Choudhary, Alpa; Modak, Arnab; Apte, Shree K.

    2017-01-01

    ABSTRACT The effective elimination of xenobiotic pollutants from the environment can be achieved by efficient degradation by microorganisms even in the presence of sugars or organic acids. Soil isolate Pseudomonas putida CSV86 displays a unique ability to utilize aromatic compounds prior to glucose. The draft genome and transcription analyses revealed that glucose uptake and benzoate transport and metabolism genes are clustered at the glc and ben loci, respectively, as two distinct operons. When grown on glucose plus benzoate, CSV86 displayed significantly higher expression of the ben locus in the first log phase and of the glc locus in the second log phase. Kinetics of substrate uptake and metabolism matched the transcription profiles. The inability of succinate to suppress benzoate transport and metabolism resulted in coutilization of succinate and benzoate. When challenged with succinate or benzoate, glucose-grown cells showed rapid reduction in glc locus transcription, glucose transport, and metabolic activity, with succinate being more effective at the functional level. Benzoate and succinate failed to interact with or inhibit the activities of glucose transport components or metabolic enzymes. The data suggest that succinate and benzoate suppress glucose transport and metabolism at the transcription level, enabling P. putida CSV86 to preferentially metabolize benzoate. This strain thus has the potential to be an ideal host to engineer diverse metabolic pathways for efficient bioremediation. IMPORTANCE Pseudomonas strains play an important role in carbon cycling in the environment and display a hierarchy in carbon utilization: organic acids first, followed by glucose, and aromatic substrates last. This limits their exploitation for bioremediation. 
This study demonstrates the substrate-dependent modulation of ben and glc operons in Pseudomonas putida CSV86, wherein benzoate suppresses glucose transport and metabolism at the transcription level, leading to preferential utilization of benzoate over glucose. Interestingly, succinate and benzoate are cometabolized. These properties are unique to this strain compared to other pseudomonads and open up avenues to unravel novel regulatory processes. Strain CSV86 can serve as an ideal host to engineer and facilitate efficient removal of recalcitrant pollutants even in the presence of simpler carbon sources. PMID:28733285

  16. Effects of natural antimicrobials with modified atmosphere packaging on the growth kinetics of Listeria monocytogenes in ravioli at various temperatures

    PubMed Central

    Ro, Eun Young; Kim, Geun Su; Kwon, Do Young; Park, Young Min; Cho, Sang Woo; Lee, Sang Yun; Yeo, Ik Hyun

    2017-01-01

    Abstract The objective of this study was to investigate the antimicrobial effects of a cultured sugar/vinegar (CSV) blend and nisin to control the risk of Listeria monocytogenes in ready-to-cook (RTC) ravioli. Ravioli dough was prepared with 0.1, 0.3, 0.5, and 1% CSV blend and 0.1, 0.2, and 0.3% nisin. Spinach or artichoke raviolis inoculated with 2.0 ± 0.5 log cfu/g of L. monocytogenes were packed aerobically or using modified atmosphere packaging (MAP), and then stored at 4, 10, 17, and 24 °C for 60 days. Growth kinetic parameters of the observed data fit well to the Baranyi equation. Ravioli with spinach filling yielded a higher risk than that with artichoke. L. monocytogenes was able to survive in ravioli with artichoke, but did not grow. The addition of 1% CSV blend or 0.3% nisin in spinach ravioli, combined with MAP, effectively controlled the growth of L. monocytogenes at temperatures below 10 °C. The organoleptic quality of spinach ravioli was also not affected by the application of 1% CSV blend. Therefore, the CSV blend can be recommended to improve the microbial safety and quality of natural RTC ravioli at the retail market. Practical applications The risk of ravioli was affected by the filling materials of ravioli at the retail market. Addition of 1% cultured sugar/vinegar blend in dough substantially contributes to the extension of shelf-life of MAP spinach raviolis. Classification and regression tree analysis results indicate that refrigeration temperature is the main control factor affecting lag time and growth rate, while packaging method is critical for maximum population density. PMID:29456276
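    The Baranyi equation referred to above is the Baranyi-Roberts primary growth model widely used in predictive microbiology. A minimal sketch follows; the parameter values are illustrative assumptions, not the study's fitted estimates:

```python
import math

def baranyi(t, y0, ymax, mu, h0):
    """Baranyi-Roberts model: log cell concentration at time t.

    y0, ymax: initial and maximum log counts; mu: maximum specific
    growth rate; h0: initial physiological state (sets the lag phase).
    """
    # Adjustment function A(t) delays growth during the lag phase.
    a = t + (1.0 / mu) * math.log(
        math.exp(-mu * t) + math.exp(-h0) - math.exp(-mu * t - h0)
    )
    # Logistic-style braking term caps growth at ymax.
    return y0 + mu * a - math.log(
        1.0 + (math.exp(mu * a) - 1.0) / math.exp(ymax - y0)
    )

# Example curve: lag, exponential growth, then a plateau at ymax.
curve = [round(baranyi(t, y0=2.0, ymax=9.0, mu=0.5, h0=2.0), 2)
         for t in (0, 10, 50, 200)]
print(curve)
```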

  17. pi-eta mixing and charge symmetry violating NN potential in matter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biswas, Subhrajyoti; Roy, Pradip; Dutt-Mazumder, Abhee K.

    2010-06-15

    We construct density-dependent class III charge symmetry violating (CSV) potential caused by the mixing of pi-eta mesons with off-shell corrections. The density dependence enters through the nonvanishing pi-eta mixing driven by both the neutron-proton mass difference and their asymmetric density distribution. The contribution of density-dependent mixing to the CSV potential is found to be appreciably larger than that of the vacuum part.

  18. Transparency, Accountability, and Engagement: A Recipe for Building Trust in Policing

    DTIC Science & Technology

    2017-06-01

    Toward Community-orientated Policing: Potential, Basic Requirements, and Threshold Questions,” Crime and Delinquency 33 (1987): 6–30. 49 More, Current...States,” in Sourcebook of Criminal Justice Statistics Online, accessed June 4, 2017, http://www.albany.edu/sourcebook/csv/ t2332011.csv. 89 Gary...to-date crime statistics , and empowered them to think creatively to develop individualized plans to address crime trends and conditions. His focus

  19. UAV Swarm Tactics: An Agent-Based Simulation and Markov Process Analysis

    DTIC Science & Technology

    2013-06-01

    CRN Common Random Numbers CSV Comma Separated Values DoE Design of Experiment GLM Generalized Linear Model HVT High Value Target JAR Java ARchive JMF... Java Media Framework JRE Java runtime environment Mason Multi-Agent Simulator Of Networks MOE Measure Of Effectiveness MOP Measures Of Performance...with every set several times, and to write a CSV file with the results. Rather than scripting the agent behavior deterministically, the agents should

  20. Determination of genotoxic effects of methidathion alkaline hydrolysis in human lymphocytes using the micronucleus assay and square-wave voltammetry.

    PubMed

    Stivaktakis, Polychronis D; Giannakopoulos, Evangelos; Vlastos, Dimitris; Matthopoulos, Demetrios P

    2017-02-01

    The interaction of pesticides with environmental factors, such as pH, may result in alterations of their physicochemical properties and should be taken into consideration with regard to their classification. This study investigates the genotoxicity of methidathion and its alkaline hydrolysis by-products in cultured human lymphocytes, using square-wave voltammetry (the square-wave adsorptive cathodic stripping voltammetry (SW-AdCSV) technique) and the cytokinesis-block micronucleus assay (CBMN assay). According to the SW-AdCSV data, the alkaline hydrolysis of methidathion results in two new molecules, one non-electro-active and a second, electro-active one that is more genotoxic than methidathion itself in cultured human lymphocytes, inducing higher micronuclei frequencies. The present study confirms the SW-AdCSV technique as a voltammetric method that can successfully simulate the electrodynamics of the cellular membrane. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Mynodbcsv: lightweight zero-config database solution for handling very large CSV files.

    PubMed

    Adaszewski, Stanisław

    2014-01-01

    Volumes of data used in science and industry are growing rapidly. When researchers face the challenge of analyzing them, their format is often the first obstacle. The lack of standardized ways of exploring different data layouts requires an effort each time to solve the problem from scratch. The possibility of accessing data in a rich, uniform manner, e.g. using Structured Query Language (SQL), would offer expressiveness and user-friendliness. Comma-separated values (CSV) is one of the most common data storage formats. Despite its simplicity, handling it becomes non-trivial as file size grows. Importing CSVs into existing databases is time-consuming and troublesome, or even impossible if their horizontal dimension reaches thousands of columns. Most databases are optimized for handling large numbers of rows rather than columns; therefore, performance for datasets with non-typical layouts is often unacceptable. Other challenges include schema creation, updates and repeated data imports. To address the above-mentioned problems, I present a system for accessing very large CSV-based datasets by means of SQL. It is characterized by: a "no copy" approach (data stay mostly in the CSV files); "zero configuration" (no need to specify a database schema); a small footprint (written in C++ with boost [1], SQLite [2] and Qt [3], it does not require installation); query rewriting, dynamic creation of indices for appropriate columns, and static data retrieval directly from CSV files, which together ensure efficient plan execution; effortless support for millions of columns; per-value typing, which makes using mixed text/number data easy; and a very simple network protocol that provides an efficient interface for MATLAB and reduces implementation time for other languages. The software is available as freeware along with educational videos on its website [4]. It does not need any prerequisites to run, as all of the libraries are included in the distribution package.
I test it against existing database solutions using a battery of benchmarks and discuss the results.
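    The central idea, SQL access to CSV-derived data, can be sketched with Python's standard-library SQLite. Unlike Mynodbcsv's "no copy" design, this toy version does import the rows into an in-memory table, and all table and column names are illustrative:

```python
import csv
import io
import sqlite3

# Toy CSV standing in for a large data file.
measurements_csv = """sample,value
a,1.5
b,2.5
c,4.0
"""

# Load the CSV into an in-memory SQLite table (the "copy" approach
# that a no-copy engine like Mynodbcsv is designed to avoid).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE measurements (sample TEXT, value REAL)")
rows = [(r["sample"], float(r["value"]))
        for r in csv.DictReader(io.StringIO(measurements_csv))]
conn.executemany("INSERT INTO measurements VALUES (?, ?)", rows)

# Arbitrary SQL now works over the CSV-derived data.
avg, = conn.execute("SELECT AVG(value) FROM measurements").fetchone()
print(avg)
```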

  2. Mynodbcsv: Lightweight Zero-Config Database Solution for Handling Very Large CSV Files

    PubMed Central

    Adaszewski, Stanisław

    2014-01-01

    Volumes of data used in science and industry are growing rapidly. When researchers face the challenge of analyzing them, their format is often the first obstacle. The lack of standardized ways of exploring different data layouts requires an effort each time to solve the problem from scratch. The possibility of accessing data in a rich, uniform manner, e.g. using Structured Query Language (SQL), would offer expressiveness and user-friendliness. Comma-separated values (CSV) is one of the most common data storage formats. Despite its simplicity, handling it becomes non-trivial as file size grows. Importing CSVs into existing databases is time-consuming and troublesome, or even impossible if their horizontal dimension reaches thousands of columns. Most databases are optimized for handling large numbers of rows rather than columns; therefore, performance for datasets with non-typical layouts is often unacceptable. Other challenges include schema creation, updates and repeated data imports. To address the above-mentioned problems, I present a system for accessing very large CSV-based datasets by means of SQL. It is characterized by: a “no copy” approach (data stay mostly in the CSV files); “zero configuration” (no need to specify a database schema); a small footprint (written in C++ with boost [1], SQLite [2] and Qt [3], it does not require installation); query rewriting, dynamic creation of indices for appropriate columns, and static data retrieval directly from CSV files, which together ensure efficient plan execution; effortless support for millions of columns; per-value typing, which makes using mixed text/number data easy; and a very simple network protocol that provides an efficient interface for MATLAB and reduces implementation time for other languages. The software is available as freeware along with educational videos on its website [4]. It does not need any prerequisites to run, as all of the libraries are included in the distribution package.
I test it against existing database solutions using a battery of benchmarks and discuss the results. PMID:25068261

  3. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  4. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE PAGES

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  5. Effect due to charge symmetry violation on the Paschos-Wolfenstein relation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding Yong; Ma Boqiang; CCAST

    2006-03-01

    The modification of the Paschos-Wolfenstein relation is investigated when the charge symmetry violations of valence and sea quark distributions in the nucleon are taken into account. We also study qualitatively the impact of the charge symmetry violation (CSV) effect on the extraction of sin²θ_W from deep-inelastic neutrino- and antineutrino-nucleus scattering within the light-cone meson-baryon fluctuation model. We find that the effect of CSV is too small to give a sizable contribution to the NuTeV result with various choices of mass difference inputs, which is consistent with the prediction that the strange-antistrange asymmetry can largely account for the NuTeV deviation in this model. It is noticeable that the effect of CSV might contribute to the NuTeV deviation when a larger difference between the internal momentum scales α_p of the proton and α_n of the neutron is considered.

  6. Interventions in the commercial sex industry during the rise in syphilis rates among men who have sex with men (MSM).

    PubMed

    Taylor, Melanie; Montoya, Jorge A; Cantrell, Russell; Mitchell, Samuel J; Williams, Mark; Jordahl, Lori; Freeman, Millicent; Brown, James; Broussard, Dawn; Roland, Eric

    2005-10-01

    Describe sexually transmitted disease/human immunodeficiency virus prevention interventions targeting men who have sex with men (MSM) in commercial sex venues (CSV). Compilation of descriptive and evaluation data from the CDC 8-city MSM Syphilis Response on interventions conducted in bathhouses/sex clubs, circuit parties, the Internet, male sex workers, and the adult film industry. Interventions in the commercial sex industry (CSI) often involved multiple collaborative efforts between public health departments (PHD), community-based organizations (CBO), and CSV owners and managers. Education and condoms were provided at multiple venues, including circuit parties, bathhouses, and sex clubs. CBO staff reported one-on-one street and CSV outreach to engage MSM at risk. Evaluation data demonstrate that MSM exposed to media campaigns were more aware of syphilis and more likely to have been tested for syphilis than MSM who did not see the campaigns. PHD and CBO are using multiple means of reaching MSM in the CSI. Evaluations are needed to determine which of these efforts decreases syphilis transmission.

  7. A new version of Visual tool for estimating the fractal dimension of images

    NASA Astrophysics Data System (ADS)

    Grossu, I. V.; Felea, D.; Besliu, C.; Jipa, Al.; Bordeianu, C. C.; Stan, E.; Esanu, T.

    2010-04-01

    This work presents a new version of a Visual Basic 6.0 application for estimating the fractal dimension of images (Grossu et al., 2009 [1]). The earlier version was limited to bi-dimensional sets of points, stored in bitmap files. The application was extended to work also with comma separated values files and three-dimensional images.
    New version program summary
    Program title: Fractal Analysis v02
    Catalogue identifier: AEEG_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 9999
    No. of bytes in distributed program, including test data, etc.: 4 366 783
    Distribution format: tar.gz
    Programming language: MS Visual Basic 6.0
    Computer: PC
    Operating system: MS Windows 98 or later
    RAM: 30 M
    Classification: 14
    Catalogue identifier of previous version: AEEG_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 1999
    Does the new version supersede the previous version?: Yes
    Nature of problem: Estimating the fractal dimension of 2D and 3D images.
    Solution method: Optimized implementation of the box-counting algorithm.
    Reasons for new version: The previous version was limited to bitmap image files. The new application was extended in order to work with objects stored in comma separated values (csv) files. The main advantages are: easier integration with other applications (csv is a widely used, simple text file format); fewer resources consumed and improved performance (only the information of interest, the "black points", is stored); higher resolution (the point coordinates are loaded into Visual Basic double variables [2]); and the possibility of storing three-dimensional objects (e.g. the 3D Sierpinski gasket).
In this version the optimized box-counting algorithm [1] was extended to the three-dimensional case. Summary of revisions: The application interface was changed from SDI (single-document interface) to MDI (multi-document interface). One form was added to provide a graphical user interface for the new functionality (fractal analysis of 2D and 3D images stored in CSV files). Additional comments: User-friendly graphical interface; easy deployment mechanism. Running time: To a first approximation, the algorithm is linear. References: [1] I.V. Grossu, C. Besliu, M.V. Rusu, Al. Jipa, C.C. Bordeianu, D. Felea, Comput. Phys. Comm. 180 (2009) 1999-2001. [2] F. Balena, Programming Microsoft Visual Basic 6.0, Microsoft Press, US, 1999.
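The box-counting procedure at the core of the program is compact enough to sketch. The following Python version is an illustrative re-implementation, not the distributed VB6 code: it loads 2D or 3D point coordinates from a CSV file (one point per row, matching the "black points" storage idea), counts occupied boxes at several scales, and estimates the fractal dimension as the slope of log N(eps) versus log(1/eps):

```python
import csv
import math

def load_points(path):
    """Read 2D or 3D point coordinates from a CSV file, one point per row."""
    with open(path, newline="") as f:
        return [tuple(float(x) for x in row) for row in csv.reader(f) if row]

def box_count(points, eps):
    """Number of eps-sized boxes occupied by at least one point."""
    return len({tuple(math.floor(c / eps) for c in p) for p in points})

def fractal_dimension(points, scales):
    """Least-squares slope of log N(eps) versus log(1/eps) over the scales."""
    xs = [math.log(1.0 / e) for e in scales]
    ys = [math.log(box_count(points, e)) for e in scales]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

For points sampled densely along a unit segment the estimated slope approaches 1, as expected for a one-dimensional set; the same code handles 3-tuples (e.g., a 3D Sierpinski gasket) without change.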

  8. Effects of feeding strategy and age on live animal performance, carcass characteristics, and economics of short-term feeding programs for culled beef cows.

    PubMed

    Sawyer, J E; Mathis, C P; Davis, B

    2004-12-01

To evaluate production and economic effects of feeding management strategy and age on intensively managed culled beef cows, a study was conducted using 125 cows of British breeding blocked by age (Young = 3 and 4 yr olds; LowMid = 5 and 6 yr olds; HighMid = 7 and 8 yr olds; and Aged = 9 yr and older) and assigned to one of three steam-flaked corn-based feeding strategies. Treatments were as follows: Conservative (CSV), 30% roughage throughout; Standard (STD), decrease roughage from 30 to 10% over 20 d; and Aggressive (AGR), decrease roughage from 30 to 10% over 10 d. There were four pens per treatment in a randomized complete block design. Cows were fed for a total of 54 d, and BW was measured on d 0, 14, 28, 42, and 54. Half the cows from each pen were randomly selected and slaughtered at a commercial abattoir, and carcass data were collected. Average daily gain, daily DMI, and G:F during each weigh period and across the entire feeding period were calculated. Over the 54-d feeding period, strategies that employed more energy-dense diets numerically increased ADG (1.28, 1.63, and 1.55 +/- 0.14 kg/d for CSV, STD, and AGR; P = 0.26) and decreased DMI (11.91, 10.74, and 10.89 +/- 0.27 kg/d for CSV, STD, and AGR; P = 0.05), such that G:F was lower for CSV than for STD or AGR (0.105, 0.150, and 0.141 +/- 0.010; P = 0.05). Carcass weight was least for the CSV strategy (298 kg) and greatest for STD (328 kg); AGR resulted in intermediate carcass weight (317 +/- 6 kg; P = 0.04). Total cost of gain was over 30% greater for the CSV strategy than for the STD or AGR strategies (P < 0.01). In many cases, block effects (age) had a greater effect on responses than treatments. Average daily gain, DMI, and G:F decreased linearly with age (P < 0.01). Hot carcass weight, dressing percent, and fat thickness decreased linearly with age (P < 0.03); yield grade decreased and carcass maturity attributes increased linearly with age (P < 0.02). 
Performance and intake differences resulted in linear increases in total cost of gain (P < 0.01) and breakeven price (P = 0.03) with increasing age. These data indicate advantages to more aggressive feeding management strategies for culled beef cows, although maximal intake may be achieved with higher-roughage diets. Despite management effects, an increase in market price above purchase price may be required for intensive feeding of culled beef cows to be a profitable enterprise.

  9. The measurement of peripheral blood volume reactions to tilt test by the electrical impedance technique after exercise in athletes

    NASA Astrophysics Data System (ADS)

    Melnikov, A. A.; Popov, S. G.; Nikolaev, D. V.; Vikulov, A. D.

    2013-04-01

We investigated the distribution of peripheral blood volumes in different regions of the body in response to the tilt test in endurance-trained athletes after aerobic exercise. Distribution of peripheral blood volumes (ml/beat) was assessed simultaneously in six regions of the body (two legs, two arms, abdomen, and neck), together with the ECG, in response to the tilt test using the impedance method (impedance change rate, dZ/dt). Before and after the exercise session, cardiac stroke volume (CSV) and blood volumes in the legs, arms, and neck were higher in athletes in both lying and standing positions. Before exercise, the increase of heart rate and the decrease of neck blood volume in response to tilting were lower (p < 0.05), but the decrease of leg blood volumes was higher (p < 0.001), in athletes. The reactions of arm and abdomen blood volumes were similar. Also, the neck blood volumes as a percentage of CSV (%/CSV) did not change in the controls but increased in athletes (p < 0.05) in response to the tilt test. After the aerobic bicycle exercise (mean HR = 156±8 beats/min, duration 30 min; 10 min recovery), blood volumes in the neck and arms in response to tilting were reduced equally, but abdomen (p < 0.05) and leg blood volumes (p < 0.001) were lowered more significantly in athletes. The neck blood flow (%/CSV) did not change in athletes but decreased in controls (p < 0.01), which was offset by greater tachycardia in response to the tilt test in controls after exercise. The data demonstrate greater orthostatic tolerance in athletes both before and after fatiguing exercise, which is due to effective distribution of blood flows aimed at maintaining cerebral blood flow.

  10. FORMATOMATIC: a program for converting diploid allelic data between common formats for population genetic analysis.

    PubMed

    Manoukis, Nicholas C

    2007-07-01

    There has been a great increase in both the number of population genetic analysis programs and the size of data sets being studied with them. Since the file formats required by the most popular and useful programs are variable, automated reformatting or conversion between them is desirable. formatomatic is an easy to use program that can read allelic data files in genepop, raw (csv) or convert formats and create data files in nine formats: raw (csv), arlequin, genepop, immanc/bayesass +, migrate, newhybrids, msvar, baps and structure. Use of formatomatic should greatly reduce time spent reformatting data sets and avoid unnecessary errors.
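As a sketch of the kind of reformatting formatomatic automates, the following Python fragment converts a raw CSV table into a minimal genepop-style block. The raw layout assumed here (an id column, a population column, then two allele columns per locus) and the two-digit allele coding are illustrative assumptions, not formatomatic's actual conventions:

```python
import csv
import io

def raw_csv_to_genepop(raw_text, title="converted dataset"):
    """Convert a raw CSV table (id, pop, then two allele columns per locus)
    into a minimal genepop-style text block: title line, one locus name per
    line, then 'Pop' separators with individuals and concatenated two-digit
    allele codes."""
    rows = list(csv.reader(io.StringIO(raw_text)))
    header = rows[0]
    # Locus names assumed to appear as e.g. LocA_1, LocA_2 in the header.
    loci = [header[i].rsplit("_", 1)[0] for i in range(2, len(header), 2)]
    out = [title] + loci
    last_pop = None
    for row in rows[1:]:
        ind, pop, alleles = row[0], row[1], row[2:]
        if pop != last_pop:               # new population starts a 'Pop' block
            out.append("Pop")
            last_pop = pop
        codes = ["%02d%02d" % (int(alleles[i]), int(alleles[i + 1]))
                 for i in range(0, len(alleles), 2)]
        out.append("%s ,  %s" % (ind, " ".join(codes)))
    return "\n".join(out)
```

Automating even this toy mapping shows why hand-reformatting large data sets invites the "unnecessary errors" the abstract mentions.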

  11. Nucleotide sequence of a chickpea chlorotic stunt virus relative that infects pea and faba bean in China.

    PubMed

    Zhou, Cui-Ji; Xiang, Hai-Ying; Zhuo, Tao; Li, Da-Wei; Yu, Jia-Lin; Han, Cheng-Gui

    2012-07-01

    We determined the genome sequence of a new polerovirus that infects field pea and faba bean in China. Its entire nucleotide sequence (6021 nt) was most closely related (83.3% identity) to that of an Ethiopian isolate of chickpea chlorotic stunt virus (CpCSV-Eth). With the exception of the coat protein (encoded by ORF3), amino acid sequence identities of all gene products of this virus to those of CpCSV-Eth and other poleroviruses were <90%. This suggests that it is a new member of the genus Polerovirus, and the name pea mild chlorosis virus is proposed.

  12. File format for normalizing radiological concentration, exposure rate, and dose rate data for the effects of radioactive decay and weathering processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kraus, Terrence D.

    2017-04-01

This report specifies the electronic file format agreed upon for the normalized radiological data produced by the software tool developed under this TI project. The NA-84 Technology Integration (TI) Program project (SNL17-CM-635, Normalizing Radiological Data for Analysis and Integration into Models) investigators held a teleconference on December 7, 2017 to discuss the tasks to be completed under the TI program project. During this teleconference, the TI project investigators determined that the comma-separated values (CSV) file format is the most suitable file format for the normalized radiological data that will be output from the normalizing tool developed under this TI project. The CSV file format was selected because it provides the requisite flexibility to manage different types of radiological data (i.e., activity concentration, exposure rate, dose rate) from other sources [e.g., Radiological Assessment and Monitoring System (RAMS), Aerial Measuring System (AMS), monitoring and sampling]. The CSV file format is also suitable for the normalized radiological data because these data can then be ingested by other software [e.g., RAMS, Visual Sampling Plan (VSP)] used by NA-84's Consequence Management Program.
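A minimal sketch of what writing such a normalized CSV file looks like with Python's standard csv module; the column names here are hypothetical, since the report itself defines the actual schema:

```python
import csv
import io

# Hypothetical schema for illustration only; the report defines the real columns.
HEADER = ["timestamp", "latitude", "longitude", "data_type", "value", "units"]

def write_normalized_csv(records):
    """Serialize normalized radiological records (one list of field values per
    HEADER entry) to CSV text, with fields quoted automatically as needed."""
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(HEADER)
    writer.writerows(records)
    return buf.getvalue()
```

Because the data type travels in its own column, activity concentration, exposure rate, and dose rate rows can share one file, which is part of the flexibility the investigators cite.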

  13. Web-Based Customizable Viewer for Mars Network Overflight Opportunities

    NASA Technical Reports Server (NTRS)

    Gladden, Roy E.; Wallick, Michael N.; Allard, Daniel A.

    2012-01-01

This software displays a full summary of information regarding the overflight opportunities between any set of lander and orbiter pairs that the user has access to view. The information display can be customized, allowing the user to choose which fields to view/hide and filter. The software works from a Web browser on any modern operating system. A full summary of information pertaining to an overflight is available, including the proposed, tentative, requested, planned, and implemented states. This gives the user a chance to quickly check for inconsistencies and fix any problems. Overflights from multiple lander/orbiter pairs can be compared instantly, and information can be filtered through the query and shown/hidden, giving the user a customizable view of the data. The information can be exported to a CSV (comma-separated value) or XML (extensible markup language) file. The software only grants access to users who are authorized to view the information. This application is an addition to the MaROS Web suite. Prior to this addition, information pertaining to overflight opportunities included a limited amount of data (displayed graphically) and could only be shown in strict temporal ordering. This new display shows more information, allows direct comparisons between overflights, and allows the data to be manipulated in ways that were not possible in the past. The previous software solution was to view the overflight opportunities from CSV files.
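The dual CSV/XML export can be sketched with the standard library; the record fields used here are hypothetical stand-ins for the actual MaROS overflight schema:

```python
import csv
import io
import xml.etree.ElementTree as ET

def export_csv(overflights, fields):
    """Write overflight records (dicts) to CSV text with a header row."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields, lineterminator="\n")
    writer.writeheader()
    writer.writerows(overflights)
    return buf.getvalue()

def export_xml(overflights, fields):
    """Write the same records as a simple XML document."""
    root = ET.Element("overflights")
    for rec in overflights:
        node = ET.SubElement(root, "overflight")
        for field in fields:
            ET.SubElement(node, field).text = str(rec[field])
    return ET.tostring(root, encoding="unicode")
```

Keeping both exporters behind the same record/field interface is what lets a viewer offer either download format from one filtered result set.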

  14. Photoelectron emission from LiF surfaces by ultrashort electromagnetic pulses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acuna, M. A.; Gravielle, M. S.; Departamento de Fisica, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires

    2011-03-15

Energy- and angle-resolved electron emission spectra produced by incidence of ultrashort electromagnetic pulses on a LiF(001) surface are studied by employing a distorted-wave method named the crystal surface-Volkov (CSV) approximation. The theory makes use of the Volkov phase to describe the action of the external electric field on the emitted electron, while the electron-surface interaction is represented within the tight-binding model. The CSV approach is applied to investigate the effects introduced by the crystal lattice when the electric field is oriented parallel to the surface plane. These effects are essentially governed by the vector potential of the external field, while the influence of the crystal orientation was found to be negligible.

  15. Metal–organic complexation in the marine environment

    PubMed Central

    Luther, George W; Rozan, Timothy F; Witter, Amy; Lewis, Brent

    2001-01-01

    We discuss the voltammetric methods that are used to assess metal–organic complexation in seawater. These consist of titration methods using anodic stripping voltammetry (ASV) and cathodic stripping voltammetry competitive ligand experiments (CSV-CLE). These approaches and a kinetic approach using CSV-CLE give similar information on the amount of excess ligand to metal in a sample and the conditional metal ligand stability constant for the excess ligand bound to the metal. CSV-CLE data using different ligands to measure Fe(III) organic complexes are similar. All these methods give conditional stability constants for which the side reaction coefficient for the metal can be corrected but not that for the ligand. Another approach, pseudovoltammetry, provides information on the actual metal–ligand complex(es) in a sample by doing ASV experiments where the deposition potential is varied more negatively in order to destroy the metal–ligand complex. This latter approach gives concentration information on each actual ligand bound to the metal as well as the thermodynamic stability constant of each complex in solution when compared to known metal–ligand complexes. In this case the side reaction coefficients for the metal and ligand are corrected. Thus, this method may not give identical information to the titration methods because the excess ligand in the sample may not be identical to some of the actual ligands binding the metal in the sample. PMID:16759421

  16. morph

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodall, John; Iannacone, Mike; Athalye, Anish

    2013-08-01

    Morph is a framework and domain-specific language (DSL) that helps parse and transform structured documents. It currently supports several file formats including XML, JSON, and CSV, and custom formats are usable as well.
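A toy example of the parse-and-transform pattern that morph generalizes, hard-coded in Python rather than expressed in morph's DSL: parse CSV input, rename columns via a declarative mapping, and emit JSON. The mapping style is an illustration, not morph's actual rule syntax:

```python
import csv
import io
import json

def csv_to_json(text, mapping):
    """Parse CSV text and emit a JSON array of records, renaming columns via
    `mapping` (output_field -> input_column), roughly the kind of transform a
    morph rule would declare."""
    rows = csv.DictReader(io.StringIO(text))
    out = [{dst: row[src] for dst, src in mapping.items()} for row in rows]
    return json.dumps(out, sort_keys=True)
```

Separating the mapping from the parsing code is the design point: the same engine then serves XML, JSON, CSV, or a custom format by swapping the parser.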

  17. Exploration Gap Assessment (FY13 Update)

    DOE Data Explorer

    Dan Getman

    2013-09-30

This submission contains an update to the previous Exploration Gap Assessment funded in 2012, which identified high-potential hydrothermal areas where critical data are needed (a gap analysis on exploration data). The uploaded data are contained in two data files for each data category: a shape (SHP) file containing the grid, and a data file (CSV) containing the individual layers that intersected with the grid. This CSV can be joined with the map to retrieve a list of datasets that are available at any given site. A grid of the contiguous U.S. was created with 88,000 10-km by 10-km grid cells, and each cell was populated with the status of data availability corresponding to five data types: (1) well data, (2) geologic maps, (3) fault maps, (4) geochemistry data, and (5) geophysical data.
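The join described above, from the layer CSV back to the grid cells, reduces to grouping rows by cell id. A minimal Python sketch, with assumed column names cell_id and dataset (the submission's actual headers may differ):

```python
import csv
import io
from collections import defaultdict

def datasets_by_cell(csv_text):
    """Group layer rows by grid-cell id so each 10-km cell can be joined back
    to the shapefile grid and report which datasets are available there."""
    by_cell = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        by_cell[row["cell_id"]].append(row["dataset"])
    return dict(by_cell)
```

A GIS tool would perform the same join keyed on the cell identifier shared by the SHP grid and the CSV.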

  18. BP Spill Sampling and Monitoring Data

    EPA Pesticide Factsheets

This dataset analyzes waste from the British Petroleum Deepwater Horizon Rig Explosion Emergency Response, providing the opportunity to query data sets by metadata criteria and find resulting raw datasets in CSV format. The data query tool allows users to download EPA's air, water, and sediment sampling and monitoring data that have been collected in response to the BP oil spill. All sampling and monitoring data collected to date are available for download as raw structured data. The query tool enables CSV file creation to be refined based on the following search criteria: date range (between April 28, 2010 and 9/29/2010); location by zip, city, or county; media (solid waste, weathered oil, air, surface water, liquid waste, tar, sediment, water); substance categories (based on media selection); and substances (based on substance category selection).
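The kind of refinement the query tool performs can be sketched as a CSV filter; the column names (sample_date, media) are assumptions for illustration, not the tool's actual field names:

```python
import csv
import io
from datetime import date

def filter_samples(text, start, end, media=None):
    """Keep rows whose sample_date falls within [start, end] and, optionally,
    whose media matches; mirrors the date-range and media search criteria."""
    kept = []
    for row in csv.DictReader(io.StringIO(text)):
        d = date.fromisoformat(row["sample_date"])
        if start <= d <= end and (media is None or row["media"] == media):
            kept.append(row)
    return kept
```

Chaining further predicates for location and substance category would reproduce the full refinement pipeline the tool exposes.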

  19. Atlantic salmon endothelial cells from the heart were more susceptible than fibroblasts from the bulbus arteriosus to four RNA viruses but protected from two viruses by dsRNA pretreatment.

    PubMed

    Pham, Phuc H; Tong, Winnie W L; Misk, Ehab; Jones, Ginny; Lumsden, John S; Bols, Niels C

    2017-11-01

    Heart diseases caused by viruses are major causes of Atlantic salmon aquaculture loss. Two Atlantic salmon cardiovascular cell lines, an endothelial cell line (ASHe) from the heart and a fibroblast cell line (BAASf) from the bulbus arteriosus, were evaluated for their response to four fish viruses, CSV, IPNV, VHSV IVa and VHSV IVb, and the innate immune agonist, double-stranded RNA mimic poly IC. All four viruses caused cytopathic effects in ASHe and BAASf. However, ASHe was more susceptible to all four viruses than BAASf. When comparing between the viruses, ASHe cells were found to be moderately susceptible to CSV and VHSV IVb, but highly susceptible to IPNV and VHSV IVa induced cell death. All four viruses were capable of propagating in the ASHe cell line, leading to increases in virus titre over time. In BAASf, CSV and IPNV produced more than one log increase in titre from initial infection, but VHSV IVb and IVa did not. When looking at the antiviral response of both cell lines, Mx proteins were induced in ASHe and BAASf by poly IC. All four viruses induced Mx proteins in BAASf, while only CSV and VHSV IVb induced Mx proteins in ASHe. IPNV and VHSV IVa suppressed Mx proteins expression in ASHe. Pretreatment of ASHe with poly IC to allow for Mx proteins accumulation protected the culture from subsequent infections with IPNV and VHSV IVa, resulting in delayed cell death, reduced virus titres and reduced viral proteins expression. These data suggest that endothelial cells potentially can serve as points of infections for viruses in the heart and that two of the four viruses, IPNV and VHSV IVa, have mechanisms to avoid or downregulate antiviral responses in ASHe cells. Furthermore, the high susceptibility of the ASHe cell line to IPNV and VHSV IVa can make it a useful tool for studying antiviral compounds against these viruses and for general detection of fish viruses. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Resistivity variations related to the large March 9, 1998 eruption at La Fournaise volcano inferred by continuous MT monitoring

    NASA Astrophysics Data System (ADS)

    Wawrzyniak, Pierre; Zlotnicki, Jacques; Sailhac, Pascal; Marquis, Guy

    2017-11-01

The 2645 m-high La Fournaise volcano, located in the southwest of Réunion Island (Indian Ocean), is a basaltic shield volcano where effusive eruptions generally occur along long fissures starting from the summit, alongside major fractures that characterize the eruptions' dynamism and effusivity. Between 1992 and 1998, the volcano underwent a quiet period during which few earthquakes were recorded. Minor seismic activity returned after 1997 and picked up in March 1998 during the 35 h preceding the March 9 eruption. From 1996, two autonomous stations (CSV and BAV) were installed on the volcano. CSV was located inside the Enclos Fouqué caldera, while BAV was positioned 8.2 km NW of the volcano summit. Horizontal components of the electric and magnetic fields were sampled every 20 s. Continuous time series were available from 1996 to 1999 at CSV, and from 1997 to March 1998 at BAV. Data were processed using both single-station and remote-reference processing. Both results show apparent resistivity variations synchronous with the eruption. Time-lapse impedance estimates were computed on overlapping time windows of about two days at both stations. The only major decrease of the observed impedance coincided with the March 1998 eruption. At CSV, the resistivity started to drop about five days before the eruption, reached several local minima until April, and then slowly increased as the volcanic crisis declined in activity. After the end of the crisis in September 1998, the apparent resistivity recovered its pre-crisis value. The time-lapse results also show variability in directionality: sharp and elongated phase-tensor ellipse residuals appear during the eruption with a N105° orientation, suggesting the emergence of an almost NS-striking dyke. 
A 1D background model built from MT soundings performed during the quiet period (1996 to February 1998) on which a 3D NS-striking dyke was added shows a good agreement with phase tensor residuals and spatial distribution of the resistivity variations observed during the eruption.

  1. [Human bioaging acceleration as Chernobyl radiation consequence].

    PubMed

    Neĭfakh, E A; Liuman, G K

    2013-01-01

A method to monitor human bioaging as an integral health index, using blood plasma markers expressed as a molar ratio of the biochemically coupled monomers of lipofuscin, an intracellular polymeric aging pigment with free-radical cross-links, has been developed. Lipofuscin includes cell debris with catabolites of the lipoperoxidation cascade and lipid antioxidants. The latter were detected in the plasma samples of normal adults and children, as well as in Chernobyl clean-up workers (24-62 years old by 1990) with external total gamma doses of 0.9-145 cSv over 4.2 years. Dynamics of the bioaging markers, as the molar ratio of blood levels of lipoperoxidation catabolites to their antioxidants, reflected normal physiological peculiarities of the studied age periods: oxygen stress in newborns, adaptation during childhood, stability in middle age, and increased lipoperoxidation (mainly in aging men) due to age-related weakening of antioxidant control. The ratio of the fractions of malondialdehyde (MDA), a final lipoperoxidation catabolite, showed an increase of its binding by plasma proteins in proportion to calendar age in the norm, as is the case for lipofuscin. The graph of the normal age-dependent molar ratio of protein-bound MDA to free MDA was pre-set for calibration in the developed computer program to calculate relative aging velocities (Wrel) from bioage increments during the period of human exposure to radiation from the CAPS damage. Wrel increased logarithmically with the received dose when total radiation exceeded 4 cSv, exceeding normal velocities 10-fold or more at 50 cSv. A slowing of Wrel relative to the calendar-age increment was found when total doses were below 4 cSv. Levels of the studied plasma metabolites, as their bioage Moles/Moles markers relative to their norms, are dynamically stationary, in contrast to the irreversible intracellular accumulation of lipofuscin. 
Earlier it was shown that the decreased vitamin E and A levels together with the increased blood levels of lipoperoxidation metabolites, which indicate health consequences for the irradiated CAPS personnel with related cytogenetic deviations, as well as for the adult population and children from radio-polluted regions, were restored to norms or corrected by adequate peroral therapy with bioantioxidants.

  2. Performance of Leak Compensation in All-Age ICU Ventilators During Volume-Targeted Neonatal Ventilation: A Lung Model Study.

    PubMed

    Itagaki, Taiga; Bennett, Desmond J; Chenelle, Christopher T; Fisher, Daniel F; Kacmarek, Robert M

    2017-01-01

Volume-targeted ventilation is increasingly used in low birthweight infants because of the potential for reducing volutrauma and avoiding hypocapnea. However, it is not known what level of air leak is acceptable during neonatal volume-targeted ventilation when leak compensation is activated concurrently. Four ICU ventilators (Servo-i, PB980, V500, and Avea) were compared in available invasive volume-targeted ventilation modes (pressure control continuous spontaneous ventilation [PC-CSV] and pressure control continuous mandatory ventilation [PC-CMV]). The Servo-i and PB980 were tested with (+) and without (-) their proximal flow sensor. The V500 and Avea were tested with their proximal flow sensor as indicated by their manufacturers. An ASL 5000 lung model was used to simulate 4 neonatal scenarios (body weight 0.5, 1, 2, and 4 kg). The ASL 5000 was ventilated via an endotracheal tube with 3 different leaks. Two minutes of data were collected after each change in leak level, and the asynchrony index was calculated. Tidal volume (VT) before and after the change in leak was assessed. The differences in delivered VT between before and after the change in leak were within ±5% in all scenarios with the PB980 (-/+) and V500. With the Servo-i (-/+), baseline VT was ≥10% greater than set VT during PC-CSV, and delivered VT markedly changed with leak. The Avea demonstrated persistent high VT in all leak scenarios. Across all ventilators, the median asynchrony index was 1% (interquartile range 0-27%) in PC-CSV and 1.8% (0-45%) in PC-CMV. The median asynchrony index was significantly higher in the Servo-i (-/+) than in the PB980 (-/+) and V500 in 1 and 2 kg scenarios during PC-CSV and PC-CMV. The PB980 and V500 were the only ventilators to acclimate to all leak scenarios and achieve targeted VT. Further clinical investigation is needed to validate the use of leak compensation during neonatal volume-targeted ventilation. Copyright © 2017 by Daedalus Enterprises.

  3. BP Spill Sampling and Monitoring Data April-September 2010 - Data Download Tool

    EPA Pesticide Factsheets

This dataset analyzes waste from the British Petroleum Deepwater Horizon Rig Explosion Emergency Response, providing the opportunity to query data sets by metadata criteria and find resulting raw datasets in CSV format. The data query tool allows users to download air, water, and sediment sampling and monitoring data that have been collected in response to the BP oil spill. All sampling and monitoring data collected to date are available for download as raw structured data. The query tool enables CSV file creation to be refined based on the following search criteria: date range (between April 28, 2010 and 9/29/2010); location by zip, city, or county; media (solid waste, weathered oil, air, surface water, liquid waste, tar, sediment, water); substance categories (based on media selection); and substances (based on substance category selection).

  4. Characterization and optimization of cell seeding in scaffolds by factorial design: quality by design approach for skeletal tissue engineering.

    PubMed

    Chen, Yantian; Bloemen, Veerle; Impens, Saartje; Moesen, Maarten; Luyten, Frank P; Schrooten, Jan

    2011-12-01

    Cell seeding into scaffolds plays a crucial role in the development of efficient bone tissue engineering constructs. Hence, it becomes imperative to identify the key factors that quantitatively predict reproducible and efficient seeding protocols. In this study, the optimization of a cell seeding process was investigated using design of experiments (DOE) statistical methods. Five seeding factors (cell type, scaffold type, seeding volume, seeding density, and seeding time) were selected and investigated by means of two response parameters, critically related to the cell seeding process: cell seeding efficiency (CSE) and cell-specific viability (CSV). In addition, cell spatial distribution (CSD) was analyzed by Live/Dead staining assays. Analysis identified a number of statistically significant main factor effects and interactions. Among the five seeding factors, only seeding volume and seeding time significantly affected CSE and CSV. Also, cell and scaffold type were involved in the interactions with other seeding factors. Within the investigated ranges, optimal conditions in terms of CSV and CSD were obtained when seeding cells in a regular scaffold with an excess of medium. The results of this case study contribute to a better understanding and definition of optimal process parameters for cell seeding. A DOE strategy can identify and optimize critical process variables to reduce the variability and assists in determining which variables should be carefully controlled during good manufacturing practice production to enable a clinically relevant implant.

  5. Lyme Disease Data

    MedlinePlus

County-level Lyme disease data from 2000-2016 are available for download as a Microsoft Excel or CSV file (209 KB).

  6. Demonstration of New OLAF Capabilities and Technologies

    NASA Astrophysics Data System (ADS)

    Kingston, C.; Palmer, E.; Stone, J.; Neese, C.; Mueller, B.

    2017-06-01

    Upgrades to the On-Line Archiving Facility (OLAF) PDS tool are leading to improved usability and additional functionality by integration of JavaScript web app frameworks. Also included is the capability to upload tabular data as CSV files.

  7. Fallon, Nevada FORGE Analogue Outcrop Samples

    DOE Data Explorer

    Blankenship, Doug; Bauer, Steve J.; Barrow, P.; Robbins, A.; Hileman, M.

    2018-03-12

    Compilation of results for mechanical and fluid flow properties of analogue outcrop samples - experimental data for compressional and shear wave velocities, tensile strengths, and compressive strengths. Outcrop location and sample orientation data are documented in a separate csv file.

  8. FRS EZ Query

    EPA Pesticide Factsheets

This page is the starting point for EZ Query. This page describes how to select key data elements from EPA's Facility Information Database and Geospatial Reference Database to build a tabular report or Comma-Separated Value (CSV) files for downloading.

  9. VizieR Online Data Catalog: Horizontal temperature at Venus upper atmosphere (Peralta+, 2016)

    NASA Astrophysics Data System (ADS)

    Peralta, J.; Lopez-Valverde, M. A.; Gilli, G.; Piccialli, A.

    2015-11-01

The dayside atmospheric temperatures in the UMLT of Venus (displayed in Figure 7A of this article) are listed as a CSV data file. These values consist of averages in bins of 5° in latitude and 0.25 h in local time from dayside temperatures covering five years of data (from 2006/05/14 to 2011/06/05). These temperatures were inferred from the CO2 NLTE nadir spectra measured by the instrument VIRTIS-H onboard Venus Express (see article for a full description of the procedure), and are representative of the atmospheric region between 10^-2 and 10^-5 mb. Along with the temperatures, we also provide the corresponding error and the number of temperatures averaged in each bin. The format of the CSV file follows the format expected for the data files to be provided in the future version of the Venus International Reference Atmosphere (VIRA). (1 data file).
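The binning described (5° in latitude by 0.25 h in local time, with a mean, an error, and a count per bin) is straightforward to reproduce. A small Python sketch, using the standard error of the mean as the per-bin error; the article's exact error definition may differ:

```python
import math
from collections import defaultdict

def bin_temperatures(samples, dlat=5.0, dlt=0.25):
    """Average (latitude, local_time, temperature) samples in dlat x dlt bins;
    return bin -> (mean, standard error of the mean, count), mirroring the
    three quantities the catalogue provides per bin."""
    bins = defaultdict(list)
    for lat, lt, t in samples:
        key = (math.floor(lat / dlat), math.floor(lt / dlt))
        bins[key].append(t)
    out = {}
    for key, ts in bins.items():
        n = len(ts)
        mean = sum(ts) / n
        var = sum((t - mean) ** 2 for t in ts) / n
        out[key] = (mean, math.sqrt(var / n), n)
    return out
```

Each bin key identifies a 5° x 0.25 h cell, so writing the result as CSV rows of (latitude bin, local-time bin, mean, error, count) reproduces the catalogue layout.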

  10. Definition and maintenance of a telemetry database dictionary

    NASA Technical Reports Server (NTRS)

    Knopf, William P. (Inventor)

    2007-01-01

A telemetry dictionary database includes a component for receiving spreadsheet workbooks of telemetry data over a web-based interface from other computer devices. Another component routes the spreadsheet workbooks to a specified directory on the host processing device. A process then checks the received spreadsheet workbooks for errors and, if no errors are detected, the spreadsheet workbooks are routed to another directory to await initiation of a remote database loading process. The loading process first converts the spreadsheet workbooks to comma-separated value (CSV) files. Next, a network connection with the computer system that hosts the telemetry dictionary database is established, and the CSV files are ported to the computer system that hosts the telemetry dictionary database. This is followed by a remote initiation of a database loading program. Upon completion of loading, a flatfile generation program is manually initiated to generate a flatfile to be used in a mission operations environment by the core ground system.
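The error-check and CSV-conversion steps of the loading process can be sketched as follows; the required column names are hypothetical, since the patent abstract does not enumerate the dictionary's actual fields:

```python
import csv
import io

REQUIRED = ["mnemonic", "subsystem", "data_type"]  # hypothetical columns

def check_rows(rows):
    """Return (workbook_row_number, message) pairs for rows with missing
    required fields; an empty list means the workbook passes the check."""
    errors = []
    for i, row in enumerate(rows, start=2):  # row 1 is the header
        for field in REQUIRED:
            if not row.get(field, "").strip():
                errors.append((i, "missing " + field))
    return errors

def rows_to_csv(rows):
    """Convert validated workbook rows to CSV text for the database loader."""
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(REQUIRED)
    for row in rows:
        writer.writerow([row[field] for field in REQUIRED])
    return buf.getvalue()
```

Gating the conversion on an empty error list mirrors the described flow, where workbooks advance to the loading directory only after the error check passes.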

  11. Filtering NetCDF Files by Using the EverVIEW Slice and Dice Tool

    USGS Publications Warehouse

    Conzelmann, Craig; Romañach, Stephanie S.

    2010-01-01

    Network Common Data Form (NetCDF) is a self-describing, machine-independent file format for storing array-oriented scientific data. It was created to provide a common interface between applications and real-time meteorological and other scientific data. Over the past few years, there has been a growing movement within the community of natural resource managers in The Everglades, Fla., to use NetCDF as the standard data container for datasets based on multidimensional arrays. As a consequence, a need surfaced for additional tools to view and manipulate NetCDF datasets, specifically to filter the files by creating subsets of large NetCDF files. The U.S. Geological Survey (USGS) and the Joint Ecosystem Modeling (JEM) group are working to address these needs with applications like the EverVIEW Slice and Dice Tool, which allows users to filter grid-based NetCDF files, thus targeting those data most important to them. The major functions of this tool are as follows: (1) to create subsets of NetCDF files temporally, spatially, and by data value; (2) to view the NetCDF data in table form; and (3) to export the filtered data to a comma-separated value (CSV) file format. The USGS and JEM will continue to work with scientists and natural resource managers across The Everglades to solve complex restoration problems through technological advances.

  12. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool.

    PubMed

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data. FQC is implemented in Python 3 and JavaScript, and is maintained under an MIT license. Documentation and source code are available at: https://github.com/pnnl/fqc . joseph.brown@pnnl.gov. © The Author(s) 2017. Published by Oxford University Press.

  13. In Situ Air Temperature and Humidity Measurements Over Diverse Land Covers in Greenbelt, Maryland, November 2013-November 2015

    NASA Technical Reports Server (NTRS)

    Carroll, Mark L.; Brown, Molly E.; Wooten, Margaret R.; Donham, Joel E.; Hubbard, Alfred B.; Ridenhour, William B.

    2016-01-01

    As our climate changes through time there is an ever-increasing need to quantify how and where it is changing so that mitigation strategies can be implemented. Urban areas have a disproportionate amount of warming due, in part, to the conductive properties of concrete and asphalt surfaces, surface albedo, heat capacity, lack of water, etc. that make up an urban environment. The NASA Climate Adaptation Science Investigation working group at Goddard Space Flight Center in Greenbelt, MD, conducted a study to collect temperature and humidity data at 15 min intervals from 12 sites at the center. These sites represent the major surface types at the center: asphalt, building roof, grass field, forest, and rain garden. The data show a strong distinction in the thermal properties of these surfaces at the center and the difference between the average values for the center compared to a local meteorological station. The data have been submitted to Oak Ridge National Laboratory Distributed Active Archive Center (ORNL-DAAC) for archival in comma separated value (csv) file format (Carroll et al., 2016) and can be found by following this link: http://daac.ornl.gov/cgi-bin/dsviewer.pl?ds_id=1319.

  14. Differential responses in dorsal visual cortex to motion and disparity depth cues

    PubMed Central

    Arnoldussen, David M.; Goossens, Jeroen; van den Berg, Albert V.

    2013-01-01

    We investigated how interactions between monocular motion parallax and binocular cues to depth vary in human motion areas for wide-field visual motion stimuli (110 × 100°). We used fMRI with an extensive 2 × 3 × 2 factorial blocked design in which we combined two types of self-motion (translational motion and translational + rotational motion), with three categories of motion inflicted by the degree of noise (self-motion, distorted self-motion, and multiple object-motion), and two different view modes of the flow patterns (stereo and synoptic viewing). Interactions between disparity and motion category revealed distinct contributions to self- and object-motion processing in 3D. For cortical areas V6 and CSv, but not the anterior part of MT+ with bilateral visual responsiveness (MT+/b), we found a disparity-dependent effect of rotational flow and noise: When self-motion perception was degraded by adding rotational flow and moderate levels of noise, the BOLD responses were reduced compared with translational self-motion alone, but this reduction was cancelled by adding stereo information which also rescued the subject's self-motion percept. At high noise levels, when the self-motion percept gave way to a swarm of moving objects, the BOLD signal strongly increased compared to self-motion in areas MT+/b and V6, but only for stereo in the latter. BOLD response did not increase for either view mode in CSv. These different response patterns indicate different contributions of areas V6, MT+/b, and CSv to the processing of self-motion perception and the processing of multiple independent motions. PMID:24339808

  15. Raw Pressure Data from Observation Wells at Brady's Hot Springs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Lim

    These .csv files contain the raw water pressure data from three observation wells during pumping tests performed in the Spring of 2016. Included is a "read me" file explaining the details of where and how the data were collected.

  16. BRAD BRDY and BRD1 GPS Station RINEX Files 09-17-2015

    DOE Data Explorer

    Corne Kreemer

    2015-09-17

    CSV files with links to RINEX data for stations BRAD and BRDY for all days after those reported previously (i.e., since 21-JAN-2015). Also included are links to websites that show the position time series of both stations.

  17. COPPER COMPLEXATION BY NATURAL ORGANIC MATTER IN CONTAMINATED AND UNCONTAMINATED GROUND WATER

    EPA Science Inventory

    Ground-water samples were collected from an uncontaminated and a contaminated site. Copper complexation was characterized by ion-selective electrode (ISE), fluorescence quenching (FQ), and cathodic stripping voltammetric (CSV) titrations. All of the samples were titrated at their...

  18. Optimal error analysis of the intraseasonal convection due to uncertainties of the sea surface temperature in a coupled model

    NASA Astrophysics Data System (ADS)

    Li, Xiaojing; Tang, Youmin; Yao, Zhixiong

    2017-04-01

    The predictability of the convection related to the Madden-Julian Oscillation (MJO) is studied using the coupled model CESM (Community Earth System Model) and the climatically relevant singular vector (CSV) approach. The CSV approach is an ensemble-based strategy for calculating the optimal initial error on climate scales. In this study, we focus on the optimal initial error of the sea surface temperature in the Indian Ocean, where the MJO onset occurs. Six MJO events are chosen from 10 years of model simulation output. The results show that the large values of the SVs are mainly located in the Bay of Bengal and the south central Indian Ocean (around 25°S, 90°E), forming a meridional dipole-like pattern. The fast error growth of the CSVs has important impacts on the prediction of the convection related to the MJO. Initial perturbations with the SV pattern cause the deep convection to damp more quickly in the east Pacific Ocean. Moreover, sensitivity studies of the CSVs show that different initial fields do not affect the CSVs appreciably, whereas the perturbation domain is a more influential factor. The rapid growth of the CSVs is found to be related to the western Bay of Bengal, where the wind stress starts to be perturbed due to the CSV initial error. These results contribute to the establishment of an ensemble prediction system, as well as an optimal observation network. In addition, the analysis of the error growth provides some insight into the relationship between SST and the intraseasonal convection related to the MJO.

  19. Study of Copper and Purine-Copper Complexes on Modified Carbon Electrodes by Cyclic and Elimination Voltammetry

    PubMed Central

    Trnkova, Libuse; Zerzankova, Lenka; Dycka, Filip; Mikelova, Radka; Jelen, Frantisek

    2008-01-01

    Using a paraffin impregnated graphite electrode (PIGE) and a mercury-modified pyrolytic graphite electrode with basal orientation (Hg-PGEb), copper(II) and Cu(II)-DNA purine base solutions have been studied by cyclic voltammetry (CV) and linear sweep voltammetry (LSV) in connection with elimination voltammetry with linear scan (EVLS). In chloride and bromide solutions (pH 6), the redox process of Cu(II) proceeded on PIGE with two cathodic and two anodic signals separated in potential. According to the elimination function E4, the first cathodic peak corresponds to the reduction Cu(II) + e- → Cu(I), with the possibility of fast disproportionation 2Cu(I) → Cu(II) + Cu(0). The E4 of the second cathodic peak signaled an electrode process controlled by a surface reaction. The electrode system of Cu(II) on Hg-PGEb in borate buffer (pH 9.2) was characterized by one cathodic and one anodic peak. Anodic stripping voltammetry (ASV) on PIGE and cathodic stripping voltammetry (CSV) on Hg-PGEb were carried out at potentials where the reduction of copper ions took place and Cu(I)-purine complexes were formed. By using ASV and CSV in combination with EVLS, the sensitivity of Cu(I)-purine complex detection was enhanced relative to either ASV or CSV alone, resulting in peak currents more than one order of magnitude higher. The statistical treatment of CE data was used to determine the reproducibility of measurements. Our results show that EVLS in connection with the stripping procedure is useful for both qualitative and quantitative microanalysis of purine derivatives and can also reveal details of the studied electrode processes. PMID:27879715

  20. NGDS Data Archiver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2013-08-01

    This is a Node.js command line utility for scraping XML metadata from CSW and WFS, downloading linkage data from CSW and WFS, pinging hosts and returning status codes, pinging data linkages and returning status codes, writing ping status to CSV files, and uploading data to Amazon S3.
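The ping-and-record step described above can be sketched as follows. The original utility is written in Node.js; this is a hedged Python analogue for illustration, in which `fetch` is an injected callable rather than part of any real tool:

```python
import csv
import io

def ping_to_csv(urls, fetch):
    """Ping each URL via `fetch(url) -> status_code` and return CSV text.

    `fetch` is injected so the sketch stays testable; in practice it would
    issue an HTTP request and return the response status code.
    """
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(["url", "status"])
    for url in urls:
        try:
            status = fetch(url)
        except Exception:
            status = "error"  # unreachable host: record a sentinel value
        writer.writerow([url, status])
    return buf.getvalue()
```

The resulting CSV could then be uploaded to object storage, as the utility does with Amazon S3.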

  1. Brady's Geothermal Field - Analysis of Pressure Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, David

    *This submission provides corrections to GDR Submissions 844 and 845* Poroelastic Tomography (PoroTomo) by Adjoint Inverse Modeling of Data from Hydrology. The 3 *csv files containing pressure data are the corrected versions of the pressure dataset found in Submission 844. The dataset has been corrected in the sense that the atmospheric pressure has been subtracted from the total pressure measured in the well. Also, the transducers used at wells 56A-1 and SP-2 are sensitive to surface temperature fluctuations; these temperature effects have been removed from the corrected datasets. The 4th *csv file contains a corrected version of the pumping data found in Submission 845. The data have been corrected in the sense that data from several wells used during the PoroTomo deployment pumping tests, which were not included in the original dataset, have been added. In addition, several other minor changes have been made to the pumping records due to flow rate instrument calibration issues that were discovered.

  2. Estimates of Annual Fossil-Fuel CO2 Emitted for Each State in the U.S.A. and the District of Columbia for Each Year from 1960 through 2001

    DOE Data Explorer

    Blasing, T. J. [Carbon Dioxide Information Analysis Center (CDIAC), Oak Ridge National Laboratory (ORNL), Oak Ridge, Tennessee (USA); Marland, Gregg [Carbon Dioxide Information Analysis Center (CDIAC), Oak Ridge National Laboratory (ORNL), Oak Ridge, Tennessee (USA); Broniak, Christine [Oregon State Univ., Corvallis, OR (United States)

    2004-01-01

    Consumption data for coal, petroleum, and natural gas are multiplied by their respective thermal conversion factors, which are in units of heat energy per unit of fuel consumed (i.e., per cubic foot, barrel, or ton), to calculate the amount of heat energy derived from fuel combustion. The thermal conversion factors are given in Appendix A of each issue of Monthly Energy Review, published by the Energy Information Administration (EIA) of the U.S. Department of Energy (DOE). Results are expressed in terms of heat energy obtained from each fuel type. These energy values were obtained from the State Energy Data Report (EIA, 2003a), ( http://www.eia.doe.gov/emeu/states/sep_use/total/csv/use_csv.html), and served as our basic input. The energy data are also available in hard copy from the Energy Information Administration, U.S. Department of Energy, as the State Energy Data Report (EIA, 2003a,b).

  3. Principles determining the structure of high-pressure forms of metals: The structures of cesium(IV) and cesium(V)

    PubMed Central

    Pauling, Linus

    1989-01-01

    Consideration of the relation between bond length and bond number and the average atomic volume for different ways of packing atoms leads to the conclusion that the average ligancy of atoms in a metal should increase when a phase change occurs on increasing the pressure. Minimum volume for each value of the ligancy results from triangular coordination polyhedra (with triangular faces), such as the icosahedron and the Friauf polyhedron. Electron transfer may permit atoms of an element to assume different ligancies. Application of these principles to Cs(IV) and Cs(V), which were previously assigned structures with ligancy 8 and 6, respectively, has led to the assignment to Cs(IV) of a primitive cubic unit cell with a = 16.11 Å and with about 122 atoms in the cube and to Cs(V) of a primitive cubic unit cell resembling that of Mg32(Al,Zn)49, with a = 16.97 Å and with 162 atoms in the cube. PMID:16578839

  4. Converting CSV Files to RKSML Files

    NASA Technical Reports Server (NTRS)

    Trebi-Ollennu, Ashitey; Liebersbach, Robert

    2009-01-01

    A computer program converts, into a format suitable for processing on Earth, files of downlinked telemetric data pertaining to the operation of the Instrument Deployment Device (IDD), which is a robot arm on either of the Mars Exploration Rovers (MERs). The raw downlinked data files are in comma-separated-value (CSV) format. The present program converts the files into Rover Kinematics State Markup Language (RKSML), which is an Extensible Markup Language (XML) format that facilitates representation of operations of the IDD and enables analysis of the operations by means of the Rover Sequencing Validation Program (RSVP), which is used to build sequences of commanded operations for the MERs. After conversion by means of the present program, the downlinked data can be processed by RSVP, enabling the MER downlink operations team to play back the actual IDD activity represented by the telemetric data against the planned IDD activity. Thus, the present program enhances the diagnosis of anomalies that manifest themselves as differences between actual and planned IDD activities.
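The general CSV-to-XML conversion pattern such a program follows can be sketched with Python's standard library. The RKSML schema itself is not given in the abstract, so the tag names here are hypothetical placeholders, not the actual format:

```python
import csv
import io
import xml.etree.ElementTree as ET

def csv_to_xml(csv_text, root_tag="ArmState", row_tag="Sample"):
    """Convert CSV text with a header row into a simple XML document.

    Each data row becomes one <row_tag> element whose children are named
    after the header columns (which must therefore be valid XML names).
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, data = rows[0], rows[1:]
    root = ET.Element(root_tag)
    for row in data:
        sample = ET.SubElement(root, row_tag)
        for name, value in zip(header, row):
            ET.SubElement(sample, name).text = value
    return ET.tostring(root, encoding="unicode")
```

A real converter would additionally validate the telemetry fields against the target schema before emitting XML.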

  5. Brady Well Coordinates and Observation Sensor Depths

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Lim

    Contains metadata associated with the wells used in the 2016 Spring Campaign led partially by UW - Madison, LBNL, and LLNL scientists. Included with the well coordinates are the depths to the pressure sensors used in observation and pumping wells. Read me files are included for each .csv file.

  6. Clinical Skills Verification, Formative Feedback, and Psychiatry Residency Trainees

    ERIC Educational Resources Information Center

    Dalack, Gregory W.; Jibson, Michael D.

    2012-01-01

    Objective: The authors describe the implementation of Clinical Skills Verification (CSV) in their program as an in-training assessment intended primarily to provide formative feedback to trainees, strengthen the supervisory experience, and identify the need for remediation of interviewing skills, and secondarily to demonstrate resident competence…

  7. Rooftop Energy Potential of Low Income Communities in America (REPLICA)

    DOE Data Explorer

    Mooney, Meghan (ORCID:0000000309406958); Sigrin, Ben

    1970-01-01

    The Rooftop Energy Potential of Low Income Communities in America (REPLICA) data set provides estimates of residential rooftop solar technical potential at the tract level, with emphasis on estimates for Low and Moderate Income (LMI) populations. In addition to technical potential, REPLICA comprises 10 additional datasets at the tract level that provide socio-demographic and market context. The model year vintage of REPLICA is 2015. The LMI solar potential estimates are made at the tract level, grouped by Area Median Income (AMI), income, tenure, and building type. These estimates are based on LiDAR data of 128 metropolitan areas, statistical modeling, and ACS 2011-2015 demographic data. The remaining datasets are supplemental datasets that can be used in conjunction with the technical potential data for general LMI solar analysis, planning, and policy making. The core dataset is a wide-format CSV file (seeds_ii_replica.csv) that can be tagged to a tract geometry using the GEOID or GISJOIN fields. In addition, users can download geographic shapefiles for the main or supplemental datasets. This dataset was generated as part of the larger NREL-led SEEDS II (Solar Energy Evolution and Diffusion Studies) project, and specifically for the NREL technical report titled "Rooftop Solar Technical Potential for Low-to-Moderate Income Households in the United States" by Sigrin and Mooney (2018). This dataset is intended to give researchers, planners, advocates, and policy-makers access to credible data to analyze low-income solar issues and potentially perform cost-benefit analysis for program design. To explore the data in an interactive web mapping environment, use the NREL SolarForAll app.

  8. In situ air temperature and humidity measurements over diverse land covers in Greenbelt, Maryland, November 2013-November 2015

    NASA Astrophysics Data System (ADS)

    Carroll, Mark L.; Brown, Molly E.; Wooten, Margaret R.; Donham, Joel E.; Hubbard, Alfred B.; Ridenhour, William B.

    2016-09-01

    As our climate changes through time there is an ever-increasing need to quantify how and where it is changing so that mitigation strategies can be implemented. Urban areas have a disproportionate amount of warming due, in part, to the conductive properties of concrete and asphalt surfaces, surface albedo, heat capacity, lack of water, etc. that make up an urban environment. The NASA Climate Adaptation Science Investigation working group at Goddard Space Flight Center in Greenbelt, MD, conducted a study to collect temperature and humidity data at 15 min intervals from 12 sites at the center. These sites represent the major surface types at the center: asphalt, building roof, grass field, forest, and rain garden. The data show a strong distinction in the thermal properties of these surfaces at the center and the difference between the average values for the center compared to a local meteorological station. The data have been submitted to Oak Ridge National Laboratory Distributed Active Archive Center (ORNL-DAAC) for archival in comma separated value (csv) file format (Carroll et al., 2016) and can be found by following this link: http://daac.ornl.gov/cgi-bin/dsviewer.pl?ds_id=1319.

  9. Fallon, Nevada FORGE Geodetic Data

    DOE Data Explorer

    Blankenship, Doug; Eneva, Mariana; Hammond, William

    2018-02-01

    Fallon FORGE InSAR and geodetic GPS deformation data. InSAR shapefiles are packaged together as .MPK (ArcMap map package, compatible with other GIS platforms), and as .CSV comma-delimited plaintext. GPS data and additional metadata are linked to the Nevada Geodetic Laboratory database at the Univ. of Nevada, Reno (UNR).

  10. EarthServer2 : The Marine Data Service - Web based and Programmatic Access to Ocean Colour Open Data

    NASA Astrophysics Data System (ADS)

    Clements, Oliver; Walker, Peter

    2017-04-01

    The ESA Ocean Colour - Climate Change Initiative (ESA OC-CCI) has produced a long-term, high-quality global dataset with associated per-pixel uncertainty data. This dataset has now grown to several hundred terabytes (uncompressed) and is freely available to download. However, the sheer size of the dataset can act as a barrier to many users; large network bandwidth, local storage and processing requirements can prevent researchers without the backing of a large organisation from taking advantage of the raw data. The EC H2020 project, EarthServer2, aims to create a federated data service providing access to more than 1 petabyte of earth science data. Within this federation the Marine Data Service already provides an innovative on-line tool-kit for filtering, analysing and visualising OC-CCI data. Data are made available, filtered and processed at source through a standards-based interface, the Open Geospatial Consortium Web Coverage Service and Web Coverage Processing Service. This work was initiated in the EC FP7 EarthServer project, where it was found that the unfamiliarity and complexity of these interfaces themselves created a barrier to wider uptake. The continuation project, EarthServer2, addresses these issues by providing higher-level tools for working with these data. We will present some examples of these tools. Many researchers wish to extract time series data from discrete points of interest. We will present a web-based interface, based on NASA/ESA WebWorldWind, for selecting points of interest and plotting time series from a chosen dataset. In addition, a CSV file of locations and times, such as a ship's track, can be uploaded; these points are extracted and returned in a CSV file, allowing researchers to work with the extract locally, for example in a spreadsheet. We will also present a set of Python and JavaScript APIs that have been created to complement and extend the web-based GUI. These APIs allow the selection of single points and areas for extraction. The extracted data are returned as structured data (for instance a Python array) which can then be passed directly to local processing code. We will highlight how the libraries can be used by the community and integrated into existing systems, for instance by the use of Jupyter notebooks to share Python code examples which can then be used by other researchers as a basis for their own work.
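The ship's-track workflow (upload a CSV of locations and times, get back a CSV of values) can be sketched as a nearest-cell lookup. This is an illustrative standard-library analogue under assumed inputs, not the Marine Data Service API; the real service evaluates the request server-side via WCPS:

```python
import csv
import io

def extract_track(track_csv, lats, lons, grid):
    """For each time,lat,lon row of a track, return CSV of nearest-cell values.

    `lats` and `lons` are the grid axis coordinates; `grid[i][j]` is the
    value at (lats[i], lons[j]). The track CSV is assumed headerless.
    """
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    writer.writerow(["time", "lat", "lon", "value"])
    for time, lat, lon in csv.reader(io.StringIO(track_csv)):
        lat, lon = float(lat), float(lon)
        # Nearest-neighbour lookup along each grid axis.
        i = min(range(len(lats)), key=lambda k: abs(lats[k] - lat))
        j = min(range(len(lons)), key=lambda k: abs(lons[k] - lon))
        writer.writerow([time, lat, lon, grid[i][j]])
    return out.getvalue()
```

The returned CSV is ready to open in a spreadsheet, mirroring the workflow the abstract describes.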

  11. Brady's Geothermal Field - DTS Raw Data

    DOE Data Explorer

    Thomas Coleman

    2016-03-26

    The submitted data correspond to the complete raw temperature datasets captured by the distributed temperature sensing (DTS) horizontal and vertical arrays during the PoroTomo Experiment. Files in each submitted resource include: .xml (level 0), data that include Stokes, Anti-Stokes, and temperature values; .csv (level 1), data that include temperature; and PT100 reference probe data.

  12. Clinical Skills Verification in General Psychiatry: Recommendations of the ABPN Task Force on Rater Training

    ERIC Educational Resources Information Center

    Jibson, Michael D.; Broquet, Karen E.; Anzia, Joan Meyer; Beresin, Eugene V.; Hunt, Jeffrey I.; Kaye, David; Rao, Nyapati Raghu; Rostain, Anthony Leon; Sexson, Sandra B.; Summers, Richard F.

    2012-01-01

    Objective: The American Board of Psychiatry and Neurology (ABPN) announced in 2007 that general psychiatry training programs must conduct Clinical Skills Verification (CSV), consisting of observed clinical interviews and case presentations during residency, as one requirement to establish graduates' eligibility to sit for the written certification…

  13. BLAST for Behind-the-Meter Applications Lite Tool | Transportation Research

    Science.gov Websites

    Solar generation data can be provided by NREL's PV Watts calculator; see the BLAST documentation for proper CSV formatting. A generic utility rate structure framework makes it possible to define rate structure values: demand charges and energy costs that best represent your utility rate structure of interest.

  14. Experimental Study of SBS Suppression via White Noise Phase Modulation

    DTIC Science & Technology

    2014-02-10

    fiber optical parametric amplifiers,” Opt. Communications 283, 2607-2610 (2010). [8] Coles, J. B., Kuo, B. P.-P., Alie, N., Moro, S., Bres, C.-S...V., Farley, K., Leveille, R., Galipeau, J., Majid, I., Christensen, S., Samson, B., Tankala, K. “kW level narrow linewidth Yb fiber amplifiers for

  15. Brady's Geothermal Field - Metadata for InSAR Holdings

    DOE Data Explorer

    Ali, Tabrez

    2016-07-29

    List of synthetic aperture radar (SAR) images acquired by TerraSAR-X and TanDEM-X satellite missions and archived at UNAVCO's WINSAR facility. See file "Bradys TSX Holdings.csv" for individual links. NOTE: The user must create an account in order to access the data (See "Instructions for Creating an Account" below).

  16. Processing of Egomotion-Consistent Optic Flow in the Rhesus Macaque Cortex

    PubMed Central

    Cottereau, Benoit R.; Smith, Andrew T.; Rima, Samy; Fize, Denis; Héjja-Brichard, Yseult; Renaud, Luc; Lejards, Camille; Vayssière, Nathalie; Trotter, Yves; Durand, Jean-Baptiste

    2017-01-01

    Abstract The cortical network that processes visual cues to self-motion was characterized with functional magnetic resonance imaging in 3 awake behaving macaques. The experimental protocol was similar to previous human studies in which the responses to a single large optic flow patch were contrasted with responses to an array of 9 similar flow patches. This distinguishes cortical regions where neurons respond to flow in their receptive fields regardless of surrounding motion from those that are sensitive to whether the overall image arises from self-motion. In all 3 animals, significant selectivity for egomotion-consistent flow was found in several areas previously associated with optic flow processing, and notably dorsal middle superior temporal area, ventral intra-parietal area, and VPS. It was also seen in areas 7a (Opt), STPm, FEFsem, FEFsac and in a region of the cingulate sulcus that may be homologous with human area CSv. Selectivity for egomotion-compatible flow was never total but was particularly strong in VPS and putative macaque CSv. Direct comparison of results with the equivalent human studies reveals several commonalities but also some differences. PMID:28108489

  17. Crystal structure and crystal chemistry of melanovanadite, a natural vanadium bronze.

    USGS Publications Warehouse

    Konnert, J.A.; Evans, H.T.

    1987-01-01

    The crystal structure of melanovanadite from Minas Ragra, Peru, has been determined in space group P1. The triclinic unit cell (non-standard) has a = 6.360(2), b = 18.090(9), c = 6.276(2) Å, α = 110.18(4)°, β = 101.62(3)°, γ = 82.86(4)°. A subcell with b' = b/2 was found by crystal-structure analysis to contain CaV4O10·5H2O. The subcell has a layer structure in which the vanadate sheet consists of corner-shared tetrahedral VO4 and double square-pyramidal V2O8 groups, similar to that previously found in synthetic CsV2O5. Refinement of the full structure (R = 0.056) showed that the Ca atom, which half-occupies a general position in the subcell, is 90% ordered at one of these sites in the whole unit cell. Bond length-bond strength estimates indicate that the tetrahedra contain V5+, and the square pyramids, V4+. -J.A.Z.

  18. Using Mauna Loa Atmospheric CO2 Data in Large General Education Geoscience Courses

    NASA Astrophysics Data System (ADS)

    Richardson, R. M.; Kapp, J. L.

    2007-12-01

    We have been using the Mauna Loa atmospheric CO2 dataset (http://scrippsco2.ucsd.edu/data/in_situ_co2/monthly_mlo.csv) in a large (up to 300) General Education Geoscience course, primarily in small breakout groups (30 students). The exercise is designed to address quantitative literacy including percentages, slopes and linear trends, issues of data completeness and bias, quality of extrapolations, as well as implications for climate change. We are significantly revising the course, which serves 600 students a semester, with help from a curriculum grant. A major goal is to improve student learning by incorporating inquiry based activities in the large lecture setting. Lectures now incorporate several activities throughout a given class period, in which students are asked to use critical thinking skills such as interpreting patterns in data and graphs, analyzing a scientific hypothesis for its coherence with the scientific method, and answering higher order synthesis questions in both verbal and written form. This differs from our past format where class periods were dominated by lecture, with a single short activity done individually about every other lecture. To test the effectiveness of the new course format we will give students the same atmospheric CO2 exercise in the lecture setting that they were given previously in breakout groups. Students will work in small groups in lecture after receiving a short introduction to the exercise by the instructor. They will plot CO2 concentrations, make extrapolations, and interpret patterns in the data. We will compare scores on the exercise with previous semesters. We expect that students will do better having had more experience with interpreting scientific data and practicing higher order thinking skills. We also expect working in small groups will foster better learning through peer teaching and discussion. We will incorporate responses from students who took part in the exercises from current and previous semesters. 
We administer a greenhouse effect concept inventory both before and after the CO2 exercise and other in-class greenhouse gas activities, and will present those results as well.
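The quantitative-literacy core of the exercise (fitting a linear trend to the monthly CO2 record) can be sketched in Python. The `year,month,value` column layout assumed here is an illustration and may differ from the actual Scripps file, which carries extra columns and header lines:

```python
import csv
import io

def co2_trend(csv_text):
    """Least-squares slope, in ppm per year, of a monthly CO2 series.

    Assumes rows of year,month,co2_ppm. Lines that do not parse (headers,
    comments) are skipped, as are negative values, a common missing-data
    flag in the Scripps files.
    """
    times, values = [], []
    for row in csv.reader(io.StringIO(csv_text)):
        try:
            year, month, ppm = int(row[0]), int(row[1]), float(row[2])
        except (ValueError, IndexError):
            continue  # header or comment line
        if ppm < 0:
            continue  # missing-data flag
        times.append(year + (month - 0.5) / 12.0)  # mid-month decimal year
        values.append(ppm)
    t_bar = sum(times) / len(times)
    v_bar = sum(values) / len(values)
    num = sum((t - t_bar) * (v - v_bar) for t, v in zip(times, values))
    den = sum((t - t_bar) ** 2 for t in times)
    return num / den
```

Students could compare the slope over different decades to see the acceleration of the trend.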

  19. Farm 2 Fly | National Agricultural Library

    Science.gov Websites

    Resources identify gaps where further research, development, or investment may be needed to facilitate readiness. Included are an FSRL checklist and report template for performing a feedstock evaluation of materials and residues, as well as examples. Farm2Fly Program files are available in csv and xlsx formats.

  20. Brady Geothermal Field InSAR Raw Data

    DOE Data Explorer

    Ali, Tabrez

    2015-03-31

    List of TerraSAR-X/TanDEM-X images acquired between 2015-01-01 and 2015-03-31, and archived at https://winsar.unavco.org. See file "BHS InSAR Data with URLs.csv" for individual links. NOTE: The user must create an account in order to access the data (See "Instructions for Creating an Account" below).

  1. Open Geoscience Database

    NASA Astrophysics Data System (ADS)

    Bashev, A.

    2012-04-01

Currently there is an enormous number of geoscience databases. Unfortunately, the only users of most of them are their developers. There are several reasons for this: incompatibility, specificity of tasks and objects, and so on. The main obstacles to wide usage of geoscience databases, however, are complexity for developers and complication for users. Complex architecture leads to high costs that block public access; complication prevents users from understanding when and how to use the database. Only databases associated with GoogleMaps avoid these drawbacks, but they can hardly be called "geoscience" databases. Nevertheless, an open and simple geoscience database is necessary, at least for educational purposes (see our abstract for ESSI20/EOS12). We developed a database and a web interface for working with it, now accessible at maps.sch192.ru. In this database a result is a value of a parameter (of any kind) at a station with a certain position, associated with metadata: the date the result was obtained; the type of station (lake, soil, etc.); and the contributor who sent the result. Each contributor has a profile, which allows users to estimate the reliability of the data. Results can be displayed on a GoogleMaps satellite image as points at their positions, coloured according to the value of the parameter. There are default colour scales, and each registered user can create their own. Results can also be extracted to a *.csv file. For both types of representation, the data can be filtered by date, object type, parameter type, area, and contributor. Data are uploaded in *.csv format: Name of the station; Latitude (dd.dddddd); Longitude (ddd.dddddd); Station type; Parameter type; Parameter value; Date (yyyy-mm-dd). The contributor is recognised at login. This is the minimal set of features required to connect a value of a parameter with a position and see the results. 
All complicated data treatment can be conducted in other programs after extracting the filtered data into a *.csv file, which makes the database understandable for non-experts. The database employs an open data format (*.csv) and widespread tools: PHP as the programming language, MySQL as the database management system, JavaScript for interaction with GoogleMaps, and jQuery UI for building the user interface. The database is multilingual: association tables connect translations with elements of the database. In total, development required about 150 hours. The database still has several problems. The main one is the reliability of the data; properly estimating reliability would require an expert system, but elaborating such a system would take more resources than the database itself. The second is stream selection: how to select stations that are connected with each other (for example, belonging to one water stream) and indicate their sequence. Currently the interface is in English and Russian, but it can easily be translated into other languages. Some problems we have already solved. One example is the "same station" problem (sometimes the distance between stations is smaller than the positional error): when a new station is added, the application automatically finds existing stations near that position. We also solved the problem of object and parameter types (how to treat "EC" and "electrical conductivity" as the same parameter) using the associative tables. If you would like to see the interface in your language, just contact us and we will send you the list of terms and phrases for translation. The main advantage of the database is that it is totally open: everybody can view and extract the data and use them for non-commercial purposes free of charge. Registered users can contribute to the database without payment. 
We hope that it will be widely used, first of all for educational purposes, though professional scientists could use it as well.
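The upload format described above fixes seven columns per record. A minimal validation sketch for such rows might look like the following; the comma delimiter, column order, and field names are assumptions based only on the format listed in the abstract:

```python
import csv
from datetime import datetime

# Assumed column order, following the upload format in the abstract.
FIELDS = ["station", "latitude", "longitude", "station_type",
          "parameter_type", "value", "date"]

def validate_row(row):
    """Return a cleaned dict for one upload row, or raise ValueError."""
    if len(row) != len(FIELDS):
        raise ValueError(f"expected {len(FIELDS)} fields, got {len(row)}")
    rec = dict(zip(FIELDS, row))
    lat = float(rec["latitude"])   # dd.dddddd
    lon = float(rec["longitude"])  # ddd.dddddd
    if not (-90 <= lat <= 90 and -180 <= lon <= 180):
        raise ValueError("coordinates out of range")
    rec["latitude"], rec["longitude"] = lat, lon
    rec["value"] = float(rec["value"])
    rec["date"] = datetime.strptime(rec["date"], "%Y-%m-%d").date()
    return rec

def load_uploads(path):
    """Read and validate a whole upload file."""
    with open(path, newline="", encoding="utf-8") as f:
        return [validate_row(r) for r in csv.reader(f)]
```

Rejecting malformed rows before they reach the database is one simple way to address the data-reliability concern the authors raise.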

  2. Merging clinical chemistry biomarker data with a COPD database - building a clinical infrastructure for proteomic studies.

    PubMed

    Eriksson, Jonatan; Andersson, Simone; Appelqvist, Roger; Wieslander, Elisabet; Truedsson, Mikael; Bugge, May; Malm, Johan; Dahlbäck, Magnus; Andersson, Bo; Fehniger, Thomas E; Marko-Varga, György

    2016-01-01

Data from biological samples and medical evaluations play an essential part in clinical decision making. These data are equally important in clinical studies, and it is critical to have an infrastructure that ensures their quality is preserved throughout their entire lifetime. We are running a 5-year longitudinal clinical study, KOL-Örestad, with the objective of identifying new COPD (Chronic Obstructive Pulmonary Disease) biomarkers in blood. In the study, clinical data and blood samples are collected from both private and public health-care institutions and stored at our research center in databases and biobanks, respectively. The blood is analyzed by mass spectrometry, and the results from this analysis are then linked to the clinical data. We built an infrastructure that allows us to efficiently collect and analyze the data. We chose REDCap as the EDC (Electronic Data Capture) tool for the study due to its short setup time, ease of use, and flexibility. REDCap allows users to easily design data collection modules based on existing templates. In addition, it provides two functions that allow users to import batches of data: through a web API (Application Programming Interface) and by uploading CSV (Comma Separated Values) files. We created a software tool, DART (Data Rapid Translation), that translates our biomarker data into a format that fits REDCap's CSV templates. In addition, DART is configurable to work with many other data formats. We use DART to import our clinical chemistry data into the REDCap database. We have shown that a powerful and internationally adopted EDC tool such as REDCap can be extended for efficient use in proteomic studies. In our study, we accomplish this by using DART to translate our clinical chemistry data into a format that fits REDCap's templates.
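DART's internals are not described in the abstract, but the general pattern of translating a lab export into an EDC-style import CSV can be sketched as a header-remapping pass. The column names below are hypothetical stand-ins, not REDCap's or DART's actual fields:

```python
import csv

# Hypothetical mapping from a lab export's columns to EDC field names.
FIELD_MAP = {"PatientID": "record_id", "CRP_mg_L": "crp", "Date": "sample_date"}

def translate(in_path, out_path, field_map=FIELD_MAP):
    """Rewrite a lab-export CSV with remapped headers, dropping unmapped columns."""
    with open(in_path, newline="") as fin, open(out_path, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=list(field_map.values()))
        writer.writeheader()
        for row in reader:
            writer.writerow({field_map[k]: v for k, v in row.items()
                             if k in field_map})
```

Keeping the mapping in a configuration table rather than in code mirrors the abstract's point that DART is configurable for other data formats.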

  3. Development of EnergyPlus Utility to Batch Simulate Building Energy Performance on a National Scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valencia, Jayson F.; Dirks, James A.

    2008-08-29

EnergyPlus is a simulation program that requires a large number of details to fully define and model a building; depending on the size of the building, hundreds or even thousands of lines in a text file are needed to run an EnergyPlus simulation. Manually creating these files is a time-consuming process that would not be practical for the thousands of buildings needed to simulate national building energy performance. To streamline the creation of EnergyPlus input files, two methods were created to work in conjunction with the National Renewable Energy Laboratory (NREL) Preprocessor; this reduced the hundreds of inputs needed to define a building in EnergyPlus to a small set of high-level parameters. The first method uses Java routines to perform all of the preprocessing on a Windows machine, while the second carries out all of the preprocessing on the Linux cluster using an in-house utility called Generalized Parametrics (GPARM). A comma-delimited (CSV) input file is created to define the high-level parameters for any number of buildings. Each method then takes this CSV file and uses the data entered for each parameter to populate an extensible markup language (XML) file used by the NREL Preprocessor to automatically prepare EnergyPlus input data files (idf) using automatic building routines and macro templates. Using the Linux utility "make", the idf files can then be automatically run through the Linux cluster and the desired data from each building aggregated into one table for analysis. Creating a large number of EnergyPlus input files makes it possible to batch-simulate building energy performance and scale the results to national energy consumption estimates.
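The batch fan-out step — one row of high-level parameters per building, expanded into one input file each — can be sketched as follows. This is only the fan-out pattern, not the NREL Preprocessor or GPARM; the `building_id` column name and the stub file contents are assumptions:

```python
import csv
from pathlib import Path

def generate_inputs(params_csv, out_dir):
    """Write one input stub per row of high-level building parameters.

    The real pipeline expands each row into a full EnergyPlus idf via
    XML and macro templates; this stub only shows the batch fan-out.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    with open(params_csv, newline="") as f:
        for row in csv.DictReader(f):
            path = out / f"{row['building_id']}.idf"
            # '!' begins a comment in idf syntax; emit parameters as comments.
            body = "\n".join(f"! {k} = {v}" for k, v in row.items())
            path.write_text(body + "\n")
            written.append(path)
    return written
```

The resulting file list is exactly what a `make`-driven cluster run would iterate over.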

  4. Genomic mutation consequence calculator.

    PubMed

    Major, John E

    2007-11-15

The genomic mutation consequence calculator (GMCC) is a tool that reliably and quickly calculates the consequence of arbitrary genomic mutations. GMCC also reports supporting annotations for the specified genomic region. A particular strength of GMCC is that it works in genomic space, not simply in spliced transcript space as some similar tools do. Within gene features, GMCC can report the effects on splice sites, UTRs and coding regions in all isoforms affected by the mutation. A considerable number of genomic annotations are also reported, including genomic conservation score, known SNPs, COSMIC mutations, disease associations and others. The manual interface also offers link-outs to various external databases and resources. In batch mode, GMCC returns a CSV file that can easily be parsed by the end user. GMCC is intended to support the many tumor resequencing efforts, but can be useful to any study investigating genomic mutations.

  5. SynTrack: DNA Assembly Workflow Management (SynTrack) v2.0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MENG, XIANWEI; SIMIRENKO, LISA

    2016-12-01

SynTrack is a dynamic, workflow-driven data management system that tracks the DNA build process: management of the hierarchical relationships of DNA fragments; monitoring of process tasks for the assembly of multiple DNA fragments into final constructs; creation of vendor order forms with selectable building blocks; organization of plate layout barcodes for vendor/pcr/fusion/chewback/bioassay/glycerol/master plate maps (default/condensed); creation or updating of pre-assembly/assembly process workflows with selected building blocks; generation of Echo pooling instructions based on plate maps; tracking of building-block orders, receipts, and final assemblies for delivery; bulk updating of colony or PCR amplification information, fusion PCR and chewback results; updating of QA/QC outcomes with .csv and .xlsx template files; re-work of assembly workflows before and after sequencing validation; and tracking of plate/well data changes and status updates, with reporting of master plate status and QC outcomes.

  6. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates large-scale quality control of FASTQ files by carrying out a QC protocol, parsing results, and aggregating quality metrics within and across experiments into an interactive dashboard. The dashboard utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  7. A Virtual Environment for Resilient Infrastructure Modeling and Design

    DTIC Science & Technology

    2015-09-01

Security CI Critical Infrastructure CID Center for Infrastructure Defense CSV Comma Separated Value DAD Defender-Attacker-Defender DHS Department...responses to disruptive events (e.g., cascading failure behavior) in a context-rich, controlled environment for exercises, education, and training...The general attacker-defender (AD) and defender-attacker-defender (DAD) models for CI are defined in Brown et al. (2006). These models help

  8. [Computerized system validation of clinical researches].

    PubMed

    Yan, Charles; Chen, Feng; Xia, Jia-lai; Zheng, Qing-shan; Liu, Daniel

    2015-11-01

Validation is a documented process that provides a high degree of assurance that a computer system does exactly and consistently what it is designed to do, in a controlled manner, throughout its life cycle. The validation process begins with the system proposal/requirements definition and continues through application and maintenance until system retirement and retention of the e-records according to regulatory rules. The objective is to clearly demonstrate that each application of information technology fulfills its purpose. Computer system validation (CSV) is essential in clinical studies according to the GCP standard, meeting the product's predetermined attributes of specifications, quality, safety and traceability. This paper describes how to perform the validation process and determine the relevant stakeholders within an organization in light of validation SOPs. Although specific accountabilities in the implementation of the validation process might be outsourced, the ultimate responsibility for CSV remains with the business process owner, the sponsor. To show that compliance of system validation has been properly attained, it is essential to set up comprehensive validation procedures and maintain adequate documentation as well as training records. The quality of system validation should be controlled using both QC and QA means.

  9. Speciation studies of nickel and chromium in wastewater from an electroplating plant.

    PubMed

    Kiptoo, Jackson K; Ngila, J Catherine; Sawula, Gerald M

    2004-09-08

A speciation scheme involving flame atomic absorption spectrometry (FAAS) and differential pulse adsorptive cathodic stripping voltammetry (DPAdCSV) was applied to studies of nickel and chromium in wastewater from a nickel-chrome electroplating plant. Dimethylglyoxime (DMG) and diethylenetriaminepentaacetic acid (DTPA) were employed as complexing agents for the adsorptive voltammetric determination of Ni and Cr, respectively. Cr(III) and Cr(VI) were determined by exploiting differences in their reactivity towards DTPA at the HMDE. Total dissolved metal content was in the range 2906-3141 and 30.7-31.2 mg l(-1) for Ni and Cr, respectively. A high percentage of the metal was present as labile species (mean values of 67.9% for Ni and 79.8% for Cr), suggesting that strongly binding ligands are not ubiquitous in the sample. About 77.8% of the Cr was found to exist in the higher oxidation state, Cr(VI). Results on the effect of dilution on the lability of the metal forms, obtained by DPAdCSV, showed slight peak shifts to more negative (cathodic) values, by -0.036 V for Ni and -0.180 V for Cr at a dilution factor of 100, while peak intensity (cathodic current) remained fairly constant.

  10. Surface Meteorology at Teller Site Stations, Seward Peninsula, Alaska, Ongoing from 2016

    DOE Data Explorer

    Bob Busey; Bob Bolton; Cathy Wilson; Lily Cohen

    2017-12-05

    Meteorological data are currently being collected at two locations at the Teller Site, Seward Peninsula. Teller Creek Station near TL_BSV (TELLER BOTTOM METEOROLOGICAL STATION) Station is located in the lower watershed in a tussock / willow transition zone and co-located with continuous snow depth measurements and subsurface measurements. Teller Creek Station near TL_IS_5 (TELLER TOP METEOROLOGICAL STATION) Station is located in the upper watershed and co-located with continuous snow depth measurements and subsurface measurements. Two types of data products are provided for these stations: First, meteorological and site characterization data grouped by sensor/measurement type (e.g., radiation or soil pit temperature and moisture). These are *.csv files. Second, a Data Visualization tool is provided for quick visualization of measurements over time at a station. Download the *_Visualizer.zip file, extract, and click on the 'index.html' file. Data values are the same in both products.
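Station releases like this one group readings into per-sensor *.csv files, which lend themselves to simple time-series summaries. The sketch below computes daily means from one such file; the `timestamp` and `air_temperature` column names are assumptions, since the abstract does not list the actual headers:

```python
import csv
from datetime import datetime

def daily_means(path, time_col="timestamp", value_col="air_temperature"):
    """Average a station CSV's readings by calendar day.

    Column names are assumed; rows with missing or non-numeric
    values are skipped rather than treated as zero.
    """
    sums, counts = {}, {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            day = datetime.fromisoformat(row[time_col]).date()
            try:
                v = float(row[value_col])
            except ValueError:  # missing or flagged value
                continue
            sums[day] = sums.get(day, 0.0) + v
            counts[day] = counts.get(day, 0) + 1
    return {d: sums[d] / counts[d] for d in sums}
```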

  11. [Determine and Implement Updates to Be Made to MODEAR (Mission Operations Data Enterprise Architecture Repository)

    NASA Technical Reports Server (NTRS)

    Fanourakis, Sofia

    2015-01-01

    My main project was to determine and implement updates to be made to MODEAR (Mission Operations Data Enterprise Architecture Repository) process definitions to be used for CST-100 (Crew Space Transportation-100) related missions. Emphasis was placed on the scheduling aspect of the processes. In addition, I was to complete other tasks as given. Some of the additional tasks were: to create pass-through command look-up tables for the flight controllers, finish one of the MDT (Mission Operations Directorate Display Tool) displays, gather data on what is included in the CST-100 public data, develop a VBA (Visual Basic for Applications) script to create a csv (Comma-Separated Values) file with specific information from spreadsheets containing command data, create a command script for the November MCC-ASIL (Mission Control Center-Avionics System Integration Laboratory) testing, and take notes for one of the TCVB (Terminal Configured Vehicle B-737) meetings. In order to make progress in my main project I scheduled meetings with the appropriate subject matter experts, prepared material for the meetings, and assisted in the discussions in order to understand the process or processes at hand. After such discussions I made updates to various MODEAR processes and process graphics. These meetings have resulted in significant updates to the processes that were discussed. In addition, the discussions have helped the departments responsible for these processes better understand the work ahead and provided material to help document how their products are created. I completed my other tasks utilizing resources available to me and, when necessary, consulting with the subject matter experts. 
Outputs resulting from my other tasks were: two completed and one partially completed pass-through command look-up tables for the flight controllers; significant updates to one of the MDT displays; a spreadsheet containing data on what is included in the CST-100 public data; a tool to create a csv file with specific information from spreadsheets containing command data; a command script for the November MCC-ASIL testing, which resulted in a successful test day identifying several potential issues; and notes from one of the TCVB meetings that were used to keep the teams up to date on what was discussed and decided. I have learned a great deal working at NASA these last four months. I was able to meet and work with amazing individuals, further develop my technical knowledge, expand my knowledge base regarding human spaceflight, and contribute to the CST-100 missions. My work at NASA has strengthened my desire to continue my education in order to make further contributions to the field, and has given me the opportunity to see the advantages of a career at NASA.

  12. HIPPO Unit Commitment Version 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-01-17

Developed for the Midcontinent Independent System Operator, Inc. (MISO), HIPPO Unit Commitment Version 1 solves the security-constrained unit commitment problem and was developed to solve MISO's cases. This version of the code includes an I/O module to read MISO's CSV files, modules to create a state-based mixed-integer programming formulation, and modules to test basic procedures for solving the MIP via HPC.

  13. 78 FR 28732 - Revisions to Electric Quarterly Report Filing Process; Availability of Draft XML Schema

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... posting CSV file samples. Order No. 770 revised the process for filing EQRs. Pursuant to Order No. 770, one of the new processes for filing allows EQRs to be filed using an XML file. The XML schema that is needed to file EQRs in this manner is now posted on the Commission's Web site at http://www.ferc.gov/docs...

  14. Psychiatric Residents' Attitudes toward and Experiences with the Clinical-Skills Verification Process: A Pilot Study on U.S. and International Medical Graduates

    ERIC Educational Resources Information Center

    Rao, Nyapati R.; Kodali, Rahul; Mian, Ayesha; Ramtekkar, Ujjwal; Kamarajan, Chella; Jibson, Michael D.

    2012-01-01

    Objective: The authors report on a pilot study of the experiences and perceptions of foreign international medical graduate (F-IMG), United States international medical graduate (US-IMG), and United States medical graduate (USMG) psychiatric residents with the newly mandated Clinical Skills Verification (CSV) process. The goal was to identify and…

  15. A Data Warehouse to Support Condition Based Maintenance (CBM)

    DTIC Science & Technology

    2005-05-01

Application (VBA) code sequence to import the original MAST-generated CSV and then create a single output table in DBASE IV format. The DBASE IV format...database architecture (Oracle, Sybase, MS-SQL, etc). This design includes table definitions, comments, specification of table attributes, primary and foreign...built queries and applications. Needs the application developers to construct data views. No SQL programming experience. b. Power Database User - knows

  16. Network Science Research Laboratory (NSRL) Telemetry Warehouse

    DTIC Science & Technology

    2016-06-01

    Functionality and architecture of the NSRL Telemetry Warehouse are also described as well as the web interface, data structure, security aspects, and...Experiment Controller 6 4.5 Telemetry Sensors 7 4.6 Custom Data Processing Nodes 7 5. Web Interface 8 6. Data Structure 8 6.1 Measurements 8...telemetry in comma-separated value (CSV) format from the web interface or via custom applications developed by researchers using the client application

  17. EPA FRS Facilities Combined File CSV Download for the Marshall Islands

    EPA Pesticide Factsheets

    The Facility Registry System (FRS) identifies facilities, sites, or places subject to environmental regulation or of environmental interest to EPA programs or delegated states. Using vigorous verification and data management procedures, FRS integrates facility data from program national systems, state master facility records, tribal partners, and other federal agencies and provides the Agency with a centrally managed, single source of comprehensive and authoritative information on facilities.

  18. EPA FRS Facilities Single File CSV Download for the Marshall Islands

    EPA Pesticide Factsheets

    The Facility Registry System (FRS) identifies facilities, sites, or places subject to environmental regulation or of environmental interest to EPA programs or delegated states. Using vigorous verification and data management procedures, FRS integrates facility data from program national systems, state master facility records, tribal partners, and other federal agencies and provides the Agency with a centrally managed, single source of comprehensive and authoritative information on facilities.

  19. Supporting Marine Corps Enhanced Company Operations: A Quantitative Analysis

    DTIC Science & Technology

    2010-06-01

by decomposition into simple independent parts. o Agents interact with each other in non-linear ways, and "adapt" to their local environment. (p...Center Co Company CoLT Company Landing Team CAS Complex Adaptive Systems CSV Comma-separated Value DO Distributed Operations DODIC Department...SUMMARY The modern irregular warfare environment has dramatically impacted the battle space assignments and mission scope of tactical units that now

  20. VStar: Variable star data visualization and analysis tool

    NASA Astrophysics Data System (ADS)

    VStar Team

    2014-07-01

VStar is a multi-platform, easy-to-use variable star data visualization and analysis tool. Data for a star can be read from the AAVSO (American Association of Variable Star Observers) database or from CSV and TSV files. VStar displays light curves and phase plots, can produce a mean curve, and performs time-frequency analysis with the Weighted Wavelet Z-Transform. It offers tools for period analysis, filtering, and other functions.

  1. EPA Enforcement and Compliance History Online

    EPA Pesticide Factsheets

The Environmental Protection Agency's Enforcement and Compliance History Online (ECHO) website provides customizable and downloadable information about environmental inspections, violations, and enforcement actions for EPA-regulated facilities related to the Clean Air Act, Clean Water Act, Resource Conservation and Recovery Act, and Safe Drinking Water Act. These data are updated weekly as part of the ECHO data refresh, and ECHO offers many user-friendly options to explore data, including: Facility Search, where ECHO information is searchable by varied criteria, including location, facility type, and compliance status, with customizable and downloadable results; Comparative Maps and State Dashboards, tools that offer aggregated information about facility compliance status, regulatory agency compliance monitoring, and enforcement activity at the national and state level; and Bulk Data Downloads, one of ECHO's most popular features, which allows users to work offline with large data sets. Users can take advantage of the ECHO Exporter, which provides summary information about each facility in comma-separated values (csv) file format, or download data sets by program as zip files.

  2. MODIS Interactive Subsetting Tool (MIST)

    NASA Astrophysics Data System (ADS)

    McAllister, M.; Duerr, R.; Haran, T.; Khalsa, S. S.; Miller, D.

    2008-12-01

In response to requests from the user community, NSIDC has teamed with the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) and the Moderate Resolution Data Center (MrDC) to provide time-series subsets of satellite data covering stations in the Greenland Climate Network (GC-Net) and the International Arctic Systems for Observing the Atmosphere (IASOA) network. To serve these data NSIDC created the MODIS Interactive Subsetting Tool (MIST). MIST works with 7 km by 7 km subset time series of certain Version 5 (V005) MODIS products over GC-Net and IASOA stations. User-selected data are delivered in a text Comma Separated Value (CSV) file format. MIST also provides online analysis capabilities that include generating time-series and scatter plots. Currently, MIST is a Beta prototype, and NSIDC intends that user requests will drive future development of the tool. The intent of this poster is to introduce MIST to the MODIS data user audience and illustrate some of the online analysis capabilities.

  3. Iron Speciation in the Subtropical Waters East of New Zealand using Multi Detection Window CLE-AdCSV Titrations.

    NASA Astrophysics Data System (ADS)

    Chandrasekhar, Anoop; Sander, Sylvia; Milnes, Angie; Boyd, Philip

    2015-04-01

Iron plays a significant role in ocean productivity as a micronutrient that facilitates the growth of marine phytoplankton and microbes. The bioavailability of iron in the ocean depends on its speciation. Iron is bioavailable in its dissolved form, and about 99.9% of dissolved iron in seawater is organically complexed with natural ligands. Competitive ligand equilibration - adsorptive cathodic stripping voltammetry (CLE-AdCSV) is the most widely used technique for examining Fe speciation, but the method has its limitations: the analytical window employed has a distinct impact on Fe speciation results (Buck, Moffett et al. 2012). Recently, Pizeta, Sander et al. (in preparation) have shown that the accuracy of complexometric titrations improves if multiple analytical windows (MAW) are solved as a unified dataset. Several programs now enable this approach; one of them is KMS (Kineteql.xls; Hudson 2014), an Excel-based speciation calculation application (Hudson, Rue et al. 2003; Sander, Hunter et al. 2011). In the present work, the unified MAW data analysis method is applied to determine iron speciation by CLE-AdCSV with salicylaldoxime (SA) (Abualhaija and van den Berg 2014) in seawater samples from the spring-bloom FeCycle III voyage, which took place in an anticyclonic eddy in subtropical waters east of New Zealand in spring 2012. Two analytical windows (5 and 15 µM SA) were applied to samples from depth profiles taken during this cruise, and the data were analysed using KMS (Kineteql.xls). Most samples returned only one Fe-binding ligand class. Higher ligand concentrations were observed in the upper water column, and the stability constants (log K) were above 22 (e.g., 22.25 ± 0.21 for station 63). Our results will be discussed in the context of microbial community distribution as well as other biogeochemical parameters. Abualhaija, M. M. and C. M. G. van den Berg (2014). 
"Chemical speciation of iron in seawater using catalytic cathodic stripping voltammetry with ligand competition against salicylaldoxime." Marine Chemistry 164(0): 60-74. Buck, K. N., J. Moffett, K. A. Barbeau, R. M. Bundy, Y. Kondo and J. Wu (2012). "The organic complexation of iron and copper: an intercomparison of competitive ligand exchange-adsorptive cathodic stripping voltammetry (CLE-ACSV) techniques." Limnology and Oceanography: Methods 10: 496-515. Hudson, R. J. M., E. L. Rue and K. W. Bruland (2003). "Modeling Complexometric Titrations of Natural Water Samples." Environ. Sci. Tech. 37: 1553-1562. Pizeta, I., S. G. Sander, O. Baars, K. Buck, R. Bundy, G. Carrasco, P. Croot, C. Garnier, L. Gerringa, M. Gledhill, K. Hirose, D. R. Hudson, Y. Kondo-Jacquot, L. Laglera, D. Omanovic, M. Rijkenberg, B. Twining and M. Wells (in preparation). "Intercomparison of estimating metal binding ligand parameters from simulated titration data using different fitting approaches." For Limnology and Oceanography: Methods. Sander, S. G., K. A. Hunter, H. Harms and M. Wells (2011). "Numerical approach to speciation and estimation of parameters used in modeling trace metal bioavailability." Environmental Science and Technology 45(15): 6388-6395.

  4. Storage and Database Management for Big Data

    DTIC Science & Technology

    2015-07-27

    and value ), each cell is actually a seven tuple where the column is broken into three parts, and there is an additional field for a timestamp as seen...questions require a careful understanding of the technology field in addition to the types of problems that are being solved. This chapter aims to address...formats such as comma separated values (CSV), JavaScript Object Notation (JSON) [21], or other proprietary sensor formats. Most often, this raw data

  5. Automap User’s Guide 2013

    DTIC Science & Technology

    2013-06-03

dairyFarm.txt Ted runs a dairy farm. He milks the cows, runs the office, and cleans the barn. 136 dairyFarmDeleteList.txt There are some...applying the delete list, the text appears in the display like this: Ted runs dairy farm. He milks cows, runs office, cleans...in the file. dairyFarmMeta.csv Ted,agent runs,task dairy,resource farm,location He,agent milks,task cows,resource office,location cleans

  6. Counting missing values in a metabolite-intensity data set for measuring the analytical performance of a metabolomics platform.

    PubMed

    Huan, Tao; Li, Liang

    2015-01-20

Metabolomics requires quantitative comparison of the individual metabolites present in an entire sample set. Unfortunately, missing intensity values in one or more samples are very common. Because missing values can have a profound influence on metabolomic results, the extent of missing values found in a metabolomic data set should be treated as an important parameter for measuring the analytical performance of a technique. In this work, we report a study of the scope of missing values and a robust method of filling them in a chemical isotope labeling (CIL) LC-MS metabolomics platform. Unlike conventional LC-MS, CIL LC-MS quantifies the concentration differences of individual metabolites in two comparative samples based on the mass spectral peak intensity ratio of a peak pair from a mixture of differentially labeled samples. We show that this peak-pair feature can be exploited as a unique means of extracting metabolite intensity information from raw mass spectra. In our approach, a peak-pair picking algorithm, IsoMS, is initially used to process the LC-MS data set to generate a CSV file or table that contains metabolite ID and peak-ratio information (i.e., a metabolite-intensity table). A zero-fill program, freely available from MyCompoundID.org, was developed to automatically find a missing value in the CSV file, go back to the raw LC-MS data to find the peak pair, and then calculate the intensity ratio and enter the ratio value into the table. Most of the missing values were found to be low-abundance peak pairs. We demonstrate the performance of this method in analyzing an experimental and technical replicate data set of the human urine metabolome. Furthermore, we propose a standardized approach of counting missing values in a replicate data set as a way of gauging the extent of missing values in a metabolomics platform. 
Finally, we illustrate that applying the zero-fill program, in conjunction with dansylation CIL LC-MS, can lead to a marked improvement in finding significant metabolites that differentiate bladder cancer patients and their controls in a metabolomics study of 109 subjects.
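The table-side half of the zero-fill step — scan the metabolite-intensity CSV for empty cells and fill each from a lookup — can be sketched as below. The lookup function here is a caller-supplied placeholder standing in for re-extraction of the peak pair from the raw LC-MS spectra; it is not the authors' IsoMS or zero-fill implementation:

```python
import csv

def zero_fill(table_path, out_path, lookup):
    """Fill empty intensity-ratio cells in a metabolite-intensity table.

    `lookup(metabolite_id, sample_name)` stands in for going back to
    the raw data to find the peak pair and compute its ratio.
    """
    with open(table_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)       # first column: metabolite ID; rest: samples
        rows = list(reader)
    for row in rows:
        for i, cell in enumerate(row[1:], start=1):
            if cell.strip() == "":
                row[i] = str(lookup(row[0], header[i]))
    with open(out_path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(header)
        w.writerows(rows)
```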

  7. Surface Meteorology at Kougarok Site Station, Seward Peninsula, Alaska, Ongoing from 2017

    DOE Data Explorer

    Bob Busey; Bob Bolton; Cathy Wilson; Lily Cohen

    2017-12-04

    Meteorological data are currently being collected at one location at the top of the Kougarok hill, Seward Peninsula. This December 18, 2017 release includes data for: Teller Creek Station near TL_BSV (TELLER BOTTOM METEOROLOGICAL STATION) Station is located in the lower watershed in a tussock / willow transition zone and co-located with continuous snow depth measurements and subsurface measurements. Teller Creek Station near TL_IS_5 (TELLER TOP METEOROLOGICAL STATION) Station is located in the upper watershed and co-located with continuous snow depth measurements and subsurface measurements. Two types of data products are provided for these stations: First, meteorological and site characterization data grouped by sensor/measurement type (e.g., radiation or soil pit temperature and moisture). These are *.csv files. Second, a Data Visualization tool is provided for quick visualization of measurements over time at a station. Download the *_Visualizer.zip file, extract, and click on the 'index.html' file. Data values are the same in both products.

  8. Analysis of the Impact of Data Normalization on Cyber Event Correlation Query Performance

    DTIC Science & Technology

    2012-03-01

    2003). Organizations use it in planning, target marketing, decision-making, data analysis, and customer services (Shin, 2003). Organizations that... Following this IP address is a router message sequence number. This is a globally unique number for each router terminal and can range from... Appendix G, invokes the PERL parser for the log files from a particular USAF base, and invokes the CTL file that loads the resultant CSV file into the

  9. Figure11

    EPA Pesticide Factsheets

    R script: ensemble_rrf_sigma_vs_mean_play.RData: ensemble_mean_sigma_rrf_allgrids_epismax_new_13runs.csv. Plot: boxplot_ensemble_rrf_sigma_vs_mean_nowater_new_13runs_epimax.pdf. This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. JOURNAL OF GEOPHYSICAL RESEARCH-ATMOSPHERES. American Geophysical Union, Washington, DC, USA, 120(23): 12,259–12,280, (2015).

  10. The development of method for continuous improvement of master file of the nursing practice terminology.

    PubMed

    Tsuru, Satoko; Okamine, Eiko; Takada, Aya; Watanabe, Chitose; Uchiyama, Makiko; Dannoue, Hideo; Aoyagi, Hisae; Endo, Akira

    2009-01-01

    Nursing Action Master and Nursing Observation Master were released from 2002 to 2008. Two kinds of format, an Excel format and a CSV format file, are prepared for maintaining them. The following were decided as the basic rules of maintenance: addition of new entries, revision, deletion, management numbering, and coding rules. The masters were developed based on these rules, and we assure the quality of the masters using them.

  11. PLEXOS Input Data Generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The PLEXOS Input Data Generator (PIDG) is a tool that enables PLEXOS users to better version their data, automate data processing, collaborate in developing inputs, and transfer data between different production cost modeling and other power systems analysis software. PIDG can process data that is in a generalized format from multiple input sources, including CSV files, PostgreSQL databases, and PSS/E .raw files and write it to an Excel file that can be imported into PLEXOS with only limited manual intervention.
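
    As a rough illustration of the kind of normalization such a tool performs, the sketch below flattens a generator CSV into per-property records; the column names and record shape are invented for the example and do not reflect PIDG's real input or output formats.

    ```python
    import csv
    import io

    def csv_to_plexos_records(csv_text, object_class):
        """Normalize an object CSV into flat (class, object, property, value)
        records of the kind a production-cost-model import sheet holds.

        The record shape here is illustrative only.
        """
        records = []
        for row in csv.DictReader(io.StringIO(csv_text)):
            name = row.pop("Name")               # object identifier column
            for prop, value in row.items():      # remaining columns become
                records.append((object_class, name, prop, value))  # properties
        return records

    # Invented sample input: two generators with two properties each.
    sample = "Name,Max Capacity,Fuel Price\nGen1,400,2.1\nGen2,150,3.4\n"
    recs = csv_to_plexos_records(sample, "Generator")
    ```

    Each CSV row becomes one record per property, which is the shape that is easy to version, diff, and write out to a spreadsheet for import.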

  12. FY2012 Annual Report for Director Operational Test & Evaluation (DOT&E)

    DTIC Science & Technology

    2012-01-01

    The JLTV FoV consists of two vehicle categories: the JLTV Combat Tactical Vehicle (CTV), designed to seat four passengers; and the JLTV Combat... Support Vehicle (CSV), designed to seat two passengers. • The JLTV CTV has a 3,500-pound payload and three mission package configurations: - Close... For example, a previous ground combat vehicle had KPPs that only required it seat nine passengers, be transportable by a C-130, and have a

  13. Variability in home mechanical ventilation prescription.

    PubMed

    Escarrabill, Joan; Tebé, Cristian; Espallargues, Mireia; Torrente, Elena; Tresserras, Ricard; Argimón, J

    2015-10-01

    Few studies have analyzed the prevalence and accessibility of home mechanical ventilation (HMV). The aim of this study was to characterize the prevalence of HMV and variability in prescriptions from administrative data. Prescribing rates of HMV in the 37 healthcare sectors of the Catalan Health Service were compared from billing data from 2008 to 2011. Crude accumulated activity rates (per 100,000 population) were calculated using systematic component of variation (SCV) and empirical Bayes (EB) methods. Standardized activity ratios (SAR) were described using a map of healthcare sectors. A crude rate of 23 HMV prescriptions per 100,000 population was observed. Rates increase with age and have increased by 39%. Statistics measuring variation not due to chance show a high variation in women (SCV=0.20 and EB=0.30) and in men (SCV=0.21 and EB=0.40), and were constant over time. In a multilevel Poisson model, hospitals with a chest unit were associated with a greater number of cases (beta=0.68, P<.0001). High variability in prescribing HMV can be explained, in part, by the attitude of professionals towards treatment and accessibility to specialist centers with a chest unit. Analysis of administrative data and variability mapping help identify unexplained variations and, in the absence of systematic records, are a feasible way of tracking treatment. Copyright © 2014 SEPAR. Published by Elsevier Espana. All rights reserved.

  14. Study on evaluation of starch, dietary fiber and mineral composition of cookies developed from 12 sorghum cultivars.

    PubMed

    Rao, B Dayakar; Kulkarni, Dhanashri B; C, Kavitha

    2018-01-01

    The study aimed to identify the cultivars best suited for sorghum cookies; accordingly, nutrient and mineral compositions were evaluated. Protein and fat content of the cookies ranged from 5.89±0.04 to 8.27±0.21% and from 21.03±0.01 to 23.08±0.03%, respectively. The starch content of the cookies ranged between 42.15±0.03 and 47.06±0.01%, and dietary fiber was highest in CSH14 (9.27±0.01%). The highest Mg (56.24±0.03 mg/100 g), P (255.54±0.03 mg/100 g), and K (124.26±0.02 mg/100 g) contents were found in the C43 cultivar. CSV18R had the highest iron content (1.23±0.01 mg/100 g). The sensory scores for overall acceptability of cookies were highest for the CSH23, CSH13R and CSV18R cultivars, which are rich in dietary fiber and minerals. Normally the hybrids are high yielders and the grain price per quintal is 20% lower than for varieties. This implies that the raw material costs of the two identified cultivars (CSH23 and CSH13R) would help the industry reduce the overall cost of production and offer better profit margins than the varieties. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Phylogenetic analyses indicate little variation among reticuloendotheliosis viruses infecting avian species, including the endangered Attwater's prairie chicken.

    PubMed

    Bohls, Ryan L; Linares, Jose A; Gross, Shannon L; Ferro, Pam J; Silvy, Nova J; Collisson, Ellen W

    2006-08-01

    Reticuloendotheliosis virus infection, which typically causes systemic lymphomas and high mortality in the endangered Attwater's prairie chicken, has been described as a major obstacle in repopulation efforts of captive breeding facilities in Texas. Although antigenic relationships among reticuloendotheliosis virus (REV) strains have been previously determined, phylogenetic relationships have not been reported. The pol and env regions of REV proviral DNA from prairie chickens (PC-R92 and PC-2404), from poxvirus lesions in domestic chickens, from the prototype poultry-derived REV-A and chick syncytial virus (CSV), and from duck-derived spleen necrosis virus (SNV) were PCR amplified and sequenced. The 5032-bp region, which included pol and most of the env gene, of PC-R92 and REV-A was 98% identical, and nucleotide sequence identities of smaller regions within pol and env from the REV strains examined ranged from 95 to 99% and 93 to 99%, respectively. The putative amino acid sequences were 97-99% identical in the polymerase and 90-98% in the envelope. Phylogenetic analyses of the nucleotide and amino acid sequences indicated the closest relationship among the recent fowlpox-associated chicken isolates, the prairie chicken isolates and the prototype CSV, while only SNV appeared distinctly divergent. While the origin of the naturally occurring viruses is not known, the avian poxvirus may be a critical component of transmission of these ubiquitous oncogenic viruses.

  16. Theoretical analysis on non-uniformity of water distribution and influence of construction parameters on settling efficiency.

    PubMed

    Huang, Ting-Lin; Li, Yu-Xian; Zhang, Hui

    2008-01-01

    During the reconstruction of horizontal flow tanks into inclined settling tanks in Chinese water plants, uniformity of water distribution has not been solved theoretically. Based on the concepts of hydraulics, a model of inclined tanks, including the ratio (L/B) of tank length (L) to width (B), diameter of inclined tubes (d) and height of the water distribution area (h1) and so on, was established to simulate and analyze the effects of these parameters on Non-Uniformity of Water Distribution (NUWD). The influences of NUWD on settling efficiency were also analyzed based on Yao's formula. Simulated results show that the ratio (L/B) has the greatest impact on NUWD, and the settling efficiency decreases with it. Under the conditions of q = 10 or 20 m/h and L/B ≥ 5 or 3, the total forces imposed on down-sliding flocs tend to be zero, which reduces the separating efficiency. Moreover, the critical settling velocity (CSV) of the first inclined tube will decrease with the increase of h1, and the optimal range of h1 is 1.2-1.6 m. The difference in CSV between the first tube and the average value of the tank u0 (denoted Δ(uF0 − u0)) will increase with d and surface load (q). Copyright IWA Publishing 2008.

  17. Interactive Visualization of National Airspace Data in 4D (IV4D)

    DTIC Science & Technology

    2010-08-01

    Research Laboratory) JView graphics engine. All of the software, IV4D/Viewer/JView, is written in Java and is platform independent, meaning that it... both parts. 11 3.3.1.1 Airspace Volumes Once appropriate CSV or ACES XML airspace boundary files are selected from a standard Java File Chooser... persistence mechanism, Hibernate, was replaced with JDBC-specific code and, over time, quite a bit of JDBC support code was added to the Viewer and to

  18. Catalog Descriptions Using VOTable Files

    NASA Astrophysics Data System (ADS)

    Thompson, R.; Levay, K.; Kimball, T.; White, R.

    2008-08-01

    Additional information is frequently required to describe database table contents and make them understandable to users. For this reason, the Multimission Archive at Space Telescope (MAST) creates "description files" for each table/catalog. After trying various XML and CSV formats, we finally chose VOTable. These files are easy to update via an HTML form, are easily read using an XML parser such as (in our case) the PHP5 SimpleXML extension, and have found multiple uses in our data access/retrieval process.
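
    MAST reads these files with the PHP5 SimpleXML extension; a rough Python analogue using the stdlib XML parser might look like the sketch below. The VOTable fragment is a minimal hand-written example, not an actual MAST description file, and real VOTables declare an XML namespace that the lookups would need to handle.

    ```python
    import xml.etree.ElementTree as ET

    # Minimal, namespace-free VOTable fragment (illustrative content only).
    vot = """<VOTABLE>
      <RESOURCE>
        <TABLE name="hut_catalog">
          <DESCRIPTION>HUT observation catalog</DESCRIPTION>
          <FIELD name="target" datatype="char" arraysize="*">
            <DESCRIPTION>Target name as given by the observer</DESCRIPTION>
          </FIELD>
          <FIELD name="exptime" datatype="float" unit="s">
            <DESCRIPTION>Exposure time</DESCRIPTION>
          </FIELD>
        </TABLE>
      </RESOURCE>
    </VOTABLE>"""

    def field_descriptions(votable_xml):
        """Map each FIELD's name to its DESCRIPTION text."""
        root = ET.fromstring(votable_xml)
        out = {}
        for field in root.iter("FIELD"):
            desc = field.find("DESCRIPTION")
            out[field.get("name")] = desc.text if desc is not None else ""
        return out

    descs = field_descriptions(vot)
    ```

    This is the appeal of VOTable for column documentation: per-column metadata (name, datatype, unit, description) is machine-readable with any generic XML parser.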

  19. Auxiliary Library Explorer (ALEX) Development

    DTIC Science & Technology

    2016-02-01

    non-empty cells. This is a laborious manual task and could probably have been avoided by using Java code to read the data directly from Excel. In fact... it might be even easier to leave the data as a comma-separated values (CSV) file and read the data in with Java, although this could create other... This is first implemented using the MakeFullDatabaseapp Java project, which performs an SQL query on the DSpace data to return a list of items for which

  20. Connecting the SISIS-SunRise Library System to the Central Identity Management

    NASA Astrophysics Data System (ADS)

    Ebner, Ralf; Pretz, Edwin

    We report on concepts and implementations for provisioning data from the personnel management systems of the Technische Universität München (TUM), via the central metadirectory at the Leibniz Supercomputing Centre (LRZ), into the SISIS-SunRise library system of the TUM University Library (TUB). Three implementation variants are discussed, ranging from the generation and transfer of simple CSV files, through an OpenLDAP-based concept as a backend for the SISIS database, to the final implementation with the OCLC IDM Connector.

  1. Analyzed Boise Data for Oscillatory Hydraulic Tomography

    DOE Data Explorer

    Lim, David

    2015-07-01

    The data here have been "pre-processed" and "analyzed" from the raw data submitted to the GDR previously (raw data files found at http://gdr.openei.org/submissions/479, doi:10.15121/1176944 after 30 September 2017). First, we submit .mat files, which are the "pre-processed" data (MATLAB software is required to use them). Second, the CSV files contain the submitted data in their final analyzed form, before being used for inversion. Specifically, they contain Fourier coefficients obtained from Fast Fourier Transform algorithms.
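
    As a stdlib-only illustration of what such a coefficient is, the sketch below extracts the Fourier coefficient at a known oscillation frequency from a synthetic pressure record with a direct DFT. The sampling numbers and signal are invented; this is not the submitted analysis code, which uses FFT routines.

    ```python
    import cmath
    import math

    def dft_coefficient(samples, k):
        """Discrete Fourier transform coefficient at integer frequency index k."""
        n = len(samples)
        return sum(x * cmath.exp(-2j * math.pi * k * i / n)
                   for i, x in enumerate(samples))

    # Synthetic pressure record: 64 samples of a pure oscillation at
    # frequency index 4 with amplitude 3 (values invented for illustration).
    n, k, amp = 64, 4, 3.0
    signal = [amp * math.cos(2 * math.pi * k * i / n) for i in range(n)]

    coeff = dft_coefficient(signal, k)
    amplitude = 2 * abs(coeff) / n   # recover the oscillation amplitude
    phase = cmath.phase(coeff)       # and its phase
    ```

    For oscillatory hydraulic tomography it is exactly this amplitude-and-phase pair at the stimulation frequency that feeds the inversion.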

  2. BOREAS TE-21 SSA Site Characteristics Data

    NASA Technical Reports Server (NTRS)

    Knox, Robert; Hall, Forrest G. (Editor); Papagno, Andrea (Editor)

    2000-01-01

    The Boreal Ecosystem-Atmospheric Study (BOREAS) TE-20 (Terrestrial Ecology) team collected several data sets for use in developing and testing models of forest ecosystem dynamics. This data set contains measurements of site characteristics conducted in the Southern Study Area (SSA) from 18 Jul 1994 to 30 Jul 1994. The data are stored in CSV files. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).

  3. Data are from Mars, Tools are from Venus

    NASA Technical Reports Server (NTRS)

    Lee, H. Joe

    2017-01-01

    During the data production phase, data producers will usually ensure that the products can be easily used by the specific power users the products serve. However, most data products are also posted for the general public to use, and it is not straightforward for data producers to anticipate which tools these general end users are likely to use. In this talk, we try to help fill this gap by going over various tools related to Earth science, how they work with the existing NASA HDF (Hierarchical Data Format) data products, and the reasons why some products cannot be visualized or analyzed by existing tools. One goal is to give data producers insights into how to make their data products more interoperable. On the other hand, we also provide some hints for end users on how to make tools work with existing HDF data products. (Tool category list:) HDF-EOS tools: HDFView HDF-EOS plugin, HEG, h4tonccf, hdf-eos2 dumper, NCL, MATLAB, IDL, etc.; netCDF-Java tools: Panoply, IDV, toolsUI, NcML, etc.; netCDF-C tools: ArcGIS Desktop, GrADS, NCL, NCO, etc.; GDAL tools: ArcGIS Desktop, QGIS, Google Earth, etc.; CSV tools: ArcGIS Online, MS Excel, Tableau, etc.

  4. DataUp 2.0: Improving On a Tool For Helping Researchers Archive, Manage, and Share Their Tabular Data

    NASA Astrophysics Data System (ADS)

    Strasser, C.; Borda, S.; Cruse, P.; Kunze, J.

    2013-12-01

    There are many barriers to data management and sharing among earth and environmental scientists; among the most significant are a lack of knowledge about best practices for data management, metadata standards, or appropriate data repositories for archiving and sharing data. Last year we developed an open source web application, DataUp, to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. With funding from the NSF via a supplemental grant to the DataONE project, we are working to improve upon DataUp. Our main goal for DataUp 2.0 is to ensure organizations and repositories are able to adopt and adapt DataUp to meet their unique needs, including connecting to analytical tools, adding new metadata schema, and expanding the list of connected data repositories. DataUp is a collaborative project between the California Digital Library, DataONE, the San Diego Supercomputing Center, and Microsoft Research Connections.
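
    A minimal notion of such a CSV-compatibility check — every row parses and carries the same number of columns — can be sketched with the stdlib csv module. The rules here are a simplification for illustration, not DataUp's actual validator.

    ```python
    import csv
    import io

    def is_csv_compatible(text):
        """Return (ok, problems): file parses and every row has the same
        column count as the header row."""
        rows = list(csv.reader(io.StringIO(text)))
        problems = []
        if not rows:
            return False, ["file is empty"]
        width = len(rows[0])
        for i, row in enumerate(rows[1:], start=2):
            if len(row) != width:
                problems.append(f"row {i}: {len(row)} columns, expected {width}")
        return not problems, problems

    # Invented sample data: one clean file, one with a ragged row.
    good = "site,depth_m,temp_c\nA,10,4.2\nB,25,3.8\n"
    bad = "site,depth_m\nA,10,extra\n"
    ok_good, _ = is_csv_compatible(good)
    ok_bad, probs = is_csv_compatible(bad)
    ```

    A real validator would also flag merged cells, embedded charts, and multiple header rows, which are the usual reasons a spreadsheet fails to round-trip through CSV.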

  5. A Customizable Importer for the Clinical Data Warehouses PaDaWaN and I2B2.

    PubMed

    Fette, Georg; Kaspar, Mathias; Dietrich, Georg; Ertl, Maximilian; Krebs, Jonathan; Stoerk, Stefan; Puppe, Frank

    2017-01-01

    In recent years, clinical data warehouses (CDW) storing routine patient data have become more and more popular to support scientific work in the medical domain. Although CDW systems provide interfaces to import new data, these interfaces have to be used by processing tools that are often not included in the systems themselves. In order to establish an extraction-transformation-load (ETL) workflow, already existing components have to be taken or new components have to be developed to perform the load part of the ETL. We present a customizable importer for the two CDW systems PaDaWaN and I2B2, which is able to import the most common import formats (plain text, CSV and XML files). In order to be run, the importer only needs a configuration file with the user credentials for the target CDW and a list of XML import configuration files, which determine how already exported data is intended to be imported. The importer is provided as a Java program, which has no further software requirements.
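
    As a toy sketch of the pattern — an XML configuration file that tells the importer how to map CSV columns onto warehouse concepts — consider the following. The configuration schema, column names, and concept identifiers below are entirely invented; the real importer's XML format is certainly richer.

    ```python
    import csv
    import io
    import xml.etree.ElementTree as ET

    # Invented miniature import configuration.
    config_xml = """<import source="labs.csv">
      <column name="pid" concept="Patient/ID"/>
      <column name="nt_probnp" concept="Lab/NT-proBNP" unit="pg/mL"/>
    </import>"""

    def load_rows(config_xml, csv_text):
        """Map CSV columns to warehouse concepts as described by the config."""
        mapping = {c.get("name"): c.get("concept")
                   for c in ET.fromstring(config_xml).iter("column")}
        facts = []
        for row in csv.DictReader(io.StringIO(csv_text)):
            # keep only configured columns, renamed to their concepts
            facts.append({mapping[col]: val for col, val in row.items()
                          if col in mapping})
        return facts

    facts = load_rows(config_xml, "pid,nt_probnp\nP001,125\n")
    ```

    Keeping the mapping in configuration rather than code is what makes such an importer customizable: a new export format needs a new XML file, not a new program.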

  6. Automatic and efficient methods applied to the binarization of a subway map

    NASA Astrophysics Data System (ADS)

    Durand, Philippe; Ghorbanzadeh, Dariush; Jaupi, Luan

    2015-12-01

    The purpose of this paper is the study of efficient methods for image binarization. The objective of the work is the binarization of metro maps; the goal is to binarize while preventing noise from disturbing the reading of subway stations. Different methods have been tested; among them, a method given by Otsu gives particularly interesting results. The difficulty of binarization is the choice of the threshold in order to reconstruct an image as faithful as possible to reality. Vectorization is a step subsequent to binarization. It consists of retrieving the coordinates of the points containing information and storing them in two matrices X and Y. Subsequently, these matrices can be exported to a 'CSV' (Comma Separated Value) file format, enabling us to deal with them in a variety of software including Excel. The algorithm requires quite a long computation time in Matlab because it is composed of two nested "for" loops. "for" loops are poorly supported by Matlab, especially when nested. This penalizes the computation time, but seems the only method to do this.
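
    Otsu's method chooses the threshold that maximizes the between-class variance of the grey-level histogram. A compact sketch, together with the vectorization step that collects foreground coordinates into X and Y, might look like this; the synthetic 10x10 image is invented for illustration, and this is not the authors' Matlab code.

    ```python
    def otsu_threshold(pixels, levels=256):
        """Otsu's method: threshold maximizing between-class variance."""
        hist = [0] * levels
        for p in pixels:
            hist[p] += 1
        total = len(pixels)
        sum_all = sum(i * h for i, h in enumerate(hist))
        sum_bg = 0.0
        w_bg = 0
        best_t, best_var = 0, -1.0
        for t in range(levels):
            w_bg += hist[t]
            if w_bg == 0:
                continue
            w_fg = total - w_bg
            if w_fg == 0:
                break
            sum_bg += t * hist[t]
            mean_bg = sum_bg / w_bg
            mean_fg = (sum_all - sum_bg) / w_fg
            var = w_bg * w_fg * (mean_bg - mean_fg) ** 2
            if var > best_var:
                best_var, best_t = var, t
        return best_t

    # Two well-separated grey populations: dark background, bright marks.
    image = [10] * 90 + [200] * 10          # 10x10 image, row-major
    t = otsu_threshold(image)

    # Vectorization step: collect coordinates of foreground pixels into
    # X and Y, ready for CSV export.
    X = [i % 10 for i, p in enumerate(image) if p > t]
    Y = [i // 10 for i, p in enumerate(image) if p > t]
    ```

    Because the histogram pass replaces per-pixel nested loops, a formulation like this also avoids the nested-"for" slowdown the abstract describes.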

  7. Mapping the Gaps: Building a pipeline for contributing and accessing crowdsourced bathymetry data

    NASA Astrophysics Data System (ADS)

    Rosenberg, A. M.; Jencks, J. H.; Robertson, E.; Reed, A.

    2017-12-01

    Both the Moon and Mars have been more comprehensively mapped than the Earth's oceans. Notably, less than 15% of the world's deep ocean and 50% of the world's coastal waters (<200m) have been measured directly. A knowledge of the depth and shape of the seafloor underpins the safe, sustainable, cost effective execution of almost every human activity that takes place at sea, yet most of the seafloor remains virtually unmapped, unobserved, and unexplored. Since 2014, the International Hydrographic Organization (IHO) has encouraged innovative supplementary data-gathering and data-maximizing initiatives to increase knowledge of the bathymetry of the seas, oceans and coastal waters including crowdsourced bathymetry (CSB). CSB can be used to identify areas where nautical charts are inadequate or applied to charts when the source and uncertainties of the data are well understood. The key to successful CSB efforts is volunteer observers who operate vessels-of-opportunity in places where charts are poor or where the seafloor is dynamic and hydrographic assets are not easily available. NOAA chairs the IHO CSB Working Group and hosts the IHO Data Centre for Digital Bathymetry (IHO DCDB) at NOAA's National Centers for Environmental Information (NCEI). NCEI has been working to enhance the infrastructure and interface of the DCDB to provide archiving, discovery, display and retrieval of CSB contributed from mariners around the world. NCEI, in partnership with NOAA's Office of Coast Survey and Rose Point Navigation Systems, established a citizen science pilot program in 2015 to harvest CSB from Electronic Navigation Systems. Today, data providers can submit xyz, csv, or geoJSON for automated ingest, while other formats can be accommodated with minimal system code changes. Like most marine geophysical datasets at NCEI, users can discover, filter, and request CSB data via a map viewer (https://maps.ngdc.noaa.gov/viewers/csb/). 
Now that the CSB pipeline has been established, NCEI has begun to plan future work that includes expanding the current infrastructure to account for increasing data volumes and implementing a point storage technology that would allow results to be dynamically generated and displayed through heat maps, while continuing to increase the number of data contributors to the IHO CSB initiative.
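
    As an illustration of the GeoJSON submission path mentioned above, the sketch below converts lon/lat/depth CSV soundings into a GeoJSON FeatureCollection. The column names and the 'depth' property are our own choices for the example, not the DCDB submission schema.

    ```python
    import csv
    import io
    import json

    def soundings_to_geojson(csv_text):
        """Convert lon,lat,depth CSV soundings to a GeoJSON FeatureCollection.

        GeoJSON orders coordinates [longitude, latitude]."""
        features = []
        for row in csv.DictReader(io.StringIO(csv_text)):
            features.append({
                "type": "Feature",
                "geometry": {
                    "type": "Point",
                    "coordinates": [float(row["lon"]), float(row["lat"])],
                },
                "properties": {"depth": float(row["depth"])},
            })
        return {"type": "FeatureCollection", "features": features}

    # Invented sample soundings.
    gj = soundings_to_geojson("lon,lat,depth\n-70.1,42.3,35.6\n-70.2,42.4,40.2\n")
    as_text = json.dumps(gj)   # serialized, ready to save as a .geojson file
    ```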

  8. Data display and analysis with μView

    NASA Astrophysics Data System (ADS)

    Tucakov, Ivan; Cosman, Jacob; Brewer, Jess H.

    2006-03-01

    The μView utility is a new Java applet version of the old db program, extended to include direct access to MUD data files, from which it can construct a variety of spectrum types, including complex and RRF-transformed spectra. By using graphics features built into all modern Web browsers, it provides full graphical display capabilities consistently across all platforms. It has the full command-line functionality of db as well as a more intuitive graphical user interface and extensive documentation, and can read and write db, csv and XML format files.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, Michael J.

    SchemaOnRead provides tools for implementing schema-on-read including a single function call (e.g., schemaOnRead("filename")) that reads text (TXT), comma separated value (CSV), raster image (BMP, PNG, GIF, TIFF, and JPG), R data (RDS), HDF5, NetCDF, spreadsheet (XLS, XLSX, ODS, and DIF), Weka Attribute-Relation File Format (ARFF), Epi Info (REC), Pajek network (PAJ), R network (NET), Hypertext Markup Language (HTML), SPSS (SAV), Systat (SYS), and Stata (DTA) files. It also recursively reads folders (e.g., schemaOnRead("folder")), returning a nested list of the contained elements.
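
    The schema-on-read idea — dispatch on file extension, recurse into folders — can be illustrated with a small Python analogue. SchemaOnRead itself is an R package; the readers below cover only a token subset of its formats and are not its actual implementation.

    ```python
    import csv
    import json
    import os
    import tempfile

    def read_csv(path):
        with open(path, newline="") as f:
            return list(csv.reader(f))

    def read_json(path):
        with open(path) as f:
            return json.load(f)

    def read_txt(path):
        with open(path) as f:
            return f.read()

    # Extension-dispatched readers (a token subset of the formats).
    READERS = {".csv": read_csv, ".json": read_json, ".txt": read_txt}

    def schema_on_read(path):
        """Read a file based on its extension, or recurse into a folder,
        returning a nested dict of the contained elements."""
        if os.path.isdir(path):
            return {name: schema_on_read(os.path.join(path, name))
                    for name in sorted(os.listdir(path))}
        reader = READERS.get(os.path.splitext(path)[1].lower())
        return reader(path) if reader else None

    # Demonstrate on a temporary folder containing one CSV file.
    d = tempfile.mkdtemp()
    with open(os.path.join(d, "t.csv"), "w", newline="") as f:
        f.write("a,b\n1,2\n")
    contents = schema_on_read(d)
    ```

    The convenience is the single entry point: callers never name a format, and folders come back as nested structures mirroring the directory tree.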

  10. ARC Cell Science Validation (CS-V) Payload Overview

    NASA Technical Reports Server (NTRS)

    Gilkerson, Nikita

    2017-01-01

    Automated cell biology system for laboratory and International Space Station (ISS) National Laboratory research. Enhanced cell culture platform that provides undisturbed culture maintenance, including feedback temperature control, medical grade gas supply, perfusion nutrient delivery and removal of waste, and automated experiment manipulations. Programmable manipulations include: media feeds change out, injections, fraction collections, fixation, flow rate, and temperature modification within a one-piece sterile barrier flow path. Cassette provides 3 levels of containment and allows Crew access to the bioculture chamber and flow path assembly for experiment initiation, refurbishment, or sample retrieval and preservation.

  11. SedMob: A mobile application for creating sedimentary logs in the field

    NASA Astrophysics Data System (ADS)

    Wolniewicz, Pawel

    2014-05-01

    SedMob is an open-source, mobile software package for creating sedimentary logs, targeted for use in tablets and smartphones. The user can create an unlimited number of logs, save data from each bed in the log as well as export and synchronize the data with a remote server. SedMob is designed as a mobile interface to SedLog: a free multiplatform package for drawing graphic logs that runs on PC computers. Data entered into SedMob are saved in the CSV file format, fully compatible with SedLog.

  12. Original data preprocessor for Femap/Nastran

    NASA Astrophysics Data System (ADS)

    Oanta, Emil M.; Panait, Cornel; Raicu, Alexandra

    2016-12-01

    Automatic data processing and visualization in the finite element analysis of structural problems is a long-standing concern in mechanical engineering. The paper presents the 'common database' concept, according to which the same information may be accessed from an analytical model as well as from a numerical one. In this way, input data expressed as comma-separated-value (CSV) files are loaded into the Femap/Nastran environment using original API codes, automatically generating the geometry of the model, the loads, and the constraints. The original API computer codes are general, making it possible to generate the input data of any model. In the next stages, the user may create the discretization of the model, set the boundary conditions, and perform a given analysis. If additional accuracy is needed, the analyst may delete the previous discretizations and, using the same automatically loaded information, perform other discretizations and analyses. Moreover, if new, more accurate information regarding the loads or constraints is acquired, it may be modelled and then implemented in the data-generating program which creates the 'common database'. This means that new, more accurate models may be easily generated. Another facility is the opportunity to control the CSV input files, so that several loading scenarios can be generated in Femap/Nastran. In this way, using original intelligent API instruments, the analyst can focus on accurately modelling the phenomena and on creative aspects, the repetitive and time-consuming activities being performed by the original computer-based instruments. Using this data processing technique we best apply Asimov's principle of 'minimum change required / maximum desired response'.

  13. A novel microfluidic valve controlled by induced charge electro-osmotic flow

    NASA Astrophysics Data System (ADS)

    Wang, Chengfa; Song, Yongxin; Pan, Xinxiang; Li, Dongqing

    2016-07-01

    In this paper, a novel microfluidic valve utilizing induced charge electro-osmotic flow (ICEOF) is proposed and analyzed. The key part of the microfluidic valve is a Y-shaped microchannel. A small metal plate is placed at each corner of the junction of the Y-shaped microchannel. When a DC electrical field is applied through the channels, electro-osmotic flows occur in the channels, and two vortices will be formed near each of the metal plates due to the ICEOF. The two vortices behave like virtual ‘blocking columns’ to restrain and direct the flow in the Y-channel. In this paper, the effects of the length of the metal plates, the applied voltages, the width of the microchannel, the zeta potential of the non-metal microchannel wall, and the orientation of the branch channels on the flow switching between the two outlet channels are numerically investigated. The results show that the flow switching between the two outlet channels can be flexibly achieved by adjusting the applied DC voltages. The critical switching voltage (CSV), under which one outlet channel is closed, decreases with the increase in the metal plate length and the orientation angle of the outlet channels. The CSV, however, increases with the increase in the inlet voltage, the width of the microchannel, and the absolute value of the zeta potential of the non-metal microchannel wall. Compared with other types of micro-valves, the proposed micro-valve is simple in structure without any moving parts. Only a DC power source is needed for its actuation, thus it can operate automatically by controlling the applied voltages.

  14. NASA Thesaurus Data File

    NASA Technical Reports Server (NTRS)

    2012-01-01

    The NASA Thesaurus contains the authorized NASA subject terms used to index and retrieve materials in the NASA Aeronautics and Space Database (NA&SD) and NASA Technical Reports Server (NTRS). The scope of this controlled vocabulary includes not only aerospace engineering, but all supporting areas of engineering and physics, the natural space sciences (astronomy, astrophysics, planetary science), Earth sciences, and the biological sciences. The NASA Thesaurus Data File contains all valid terms and hierarchical relationships, USE references, and related terms in machine-readable form. The Data File is available in the following formats: RDF/SKOS, RDF/OWL, ZThes-1.0, and CSV/TXT.

  15. VeriClick: an efficient tool for table format verification

    NASA Astrophysics Data System (ADS)

    Nagy, George; Tamhankar, Mangesh

    2012-01-01

    The essential layout attributes of a visual table can be defined by the location of four critical grid cells. Although these critical cells can often be located by automated analysis, some means of human interaction is necessary for correcting residual errors. VeriClick is a macro-enabled spreadsheet interface that provides ground-truthing, confirmation, correction, and verification functions for CSV tables. All user actions are logged. Experimental results of seven subjects on one hundred tables suggest that VeriClick can provide a ten- to twenty-fold speedup over performing the same functions with standard spreadsheet editing commands.
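
    One plausible reading of the critical-cell idea: two of the four cells give the top-left and bottom-right corners of the data region, which is enough to split a CSV grid into column-header, stub, and data blocks. The corner convention and block names below are our assumption for illustration, not VeriClick's specification.

    ```python
    def segment_table(grid, data_tl, data_br):
        """Split a CSV grid into column-header, stub, and data blocks,
        given the top-left and bottom-right cells of the data region
        as (row, col) indices."""
        r0, c0 = data_tl
        r1, c1 = data_br
        return {
            "col_header": [row[c0:c1 + 1] for row in grid[:r0]],
            "stub": [row[:c0] for row in grid[r0:r1 + 1]],
            "data": [row[c0:c1 + 1] for row in grid[r0:r1 + 1]],
        }

    # Invented example table: one header row, one stub column.
    grid = [
        ["", "2010", "2011"],
        ["North", "12", "15"],
        ["South", "7", "9"],
    ]
    parts = segment_table(grid, (1, 1), (2, 2))
    ```

    Under this reading, correcting a table is just moving the critical cells and re-running the split, which is why click-based verification can beat cell-by-cell spreadsheet editing.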

  16. Synthesizer: Expediting synthesis studies from context-free data with information retrieval techniques.

    PubMed

    Gandy, Lisa M; Gumm, Jordan; Fertig, Benjamin; Thessen, Anne; Kennish, Michael J; Chavan, Sameer; Marchionni, Luigi; Xia, Xiaoxin; Shankrit, Shambhavi; Fertig, Elana J

    2017-01-01

    Scientists have unprecedented access to a wide variety of high-quality datasets. These datasets, which are often independently curated, commonly use unstructured spreadsheets to store their data. Standardized annotations are essential to perform synthesis studies across investigators, but are often not used in practice. Therefore, accurately combining records in spreadsheets from differing studies requires tedious and error-prone human curation. These efforts result in a significant time and cost barrier to synthesis research. We propose an information retrieval inspired algorithm, Synthesize, that merges unstructured data automatically based on both column labels and values. Application of the Synthesize algorithm to cancer and ecological datasets had high accuracy (on the order of 85-100%). We further implement Synthesize in an open source web application, Synthesizer (https://github.com/lisagandy/synthesizer). The software accepts input as spreadsheets in comma separated value (CSV) format, visualizes the merged data, and outputs the results as a new spreadsheet. Synthesizer includes an easy to use graphical user interface, which enables the user to finish combining data and obtain perfect accuracy. Future work will allow detection of units to automatically merge continuous data and application of the algorithm to other data formats, including databases.
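
    A toy version of the label-matching half of such an algorithm can be written with the stdlib difflib module; the real Synthesize algorithm also compares column values, and the similarity cutoff here is an invented parameter.

    ```python
    import difflib

    def match_columns(cols_a, cols_b, cutoff=0.6):
        """Pair each column label in cols_a with its closest label in cols_b
        by string similarity (label matching only; an illustration, not the
        Synthesize algorithm itself)."""
        lowered = [b.lower() for b in cols_b]
        pairs = {}
        for a in cols_a:
            hit = difflib.get_close_matches(a.lower(), lowered,
                                            n=1, cutoff=cutoff)
            if hit:
                pairs[a] = cols_b[lowered.index(hit[0])]
        return pairs

    # Invented labels from two hypothetical ecological spreadsheets.
    pairs = match_columns(["Site Name", "Salinity (ppt)", "pH"],
                          ["site_name", "salinity", "ph"])
    ```

    Matching on values as well as labels is what lets the full algorithm merge columns whose headers share no text at all.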

  18. WaterML, an Information Standard for the Exchange of in-situ hydrological observations

    NASA Astrophysics Data System (ADS)

    Valentine, D.; Taylor, P.; Zaslavsky, I.

    2012-04-01

    The WaterML 2.0 Standards Working Group (SWG), working within the Open Geospatial Consortium (OGC) and in cooperation with the joint OGC-World Meteorological Organization (WMO) Hydrology Domain Working Group (HDWG), has developed an open standard for the exchange of water observation data: WaterML 2.0. The focus of the standard is time-series data, commonly generated from in-situ monitoring. This is high-value data for hydrological applications such as flood forecasting, environmental reporting and supporting hydrological infrastructure (e.g. dams, supply systems); it is commonly exchanged, but a lack of standards inhibits efficient reuse and automation. Developing WaterML 2.0 required a harmonization analysis of existing standards to identify overlapping concepts and come to agreement on harmonized definitions. Generally the formats captured similar requirements, all with subtle differences, such as how time-series point metadata was handled. The in-progress standard WaterML 2.0 incorporates the semantics of the hydrologic information: location, procedure, and observations. It is implemented as an application schema of the Geography Markup Language version 3.2.1, making use of the OGC Observations & Measurements standards. WaterML 2.0 is designed as an extensible schema to allow encoding of data for a variety of exchange scenarios. Example areas of usage are: exchange of data for operational hydrological monitoring programs; supporting operation of infrastructure (e.g. dams, supply systems); cross-border exchange of observational data; release of data for public dissemination; enhancing disaster management through data exchange; and exchange in support of national reporting. The first phase of WaterML 2.0 focused on structural definitions allowing for the transfer of time series, with less work on harmonization of vocabulary items such as quality codes.
Vocabularies from various organizations tend to be specific, and coming to agreement on them takes time; this will be continued in future work for the HDWG, along with extending the information model to cover additional types of hydrologic information: rating and gauging information, and water quality. Rating curves, gaugings and river cross sections are commonly exchanged alongside standard time-series data to convey information relating to conversions such as river level to discharge. Members of the HDWG plan to initiate this work in early 2012. Water quality data is varied in the way it is processed and in the number of phenomena it measures. It will require specific extensions to the WaterML 2.0 model, most likely making use of the specimen types within O&M and extensive use of controlled vocabularies. Other future work involves different target encodings for the WaterML 2.0 conceptual model, such as JSON, netCDF and CSV. These encodings are optimized for particular needs, such as efficiency in size of the encoding and in parsing of structure, but may not be capable of representing the full extent of the WaterML 2.0 information model. Certain encodings are best matched to particular needs; the community has begun investigating when and how best to implement these.

  19. A New Paradigm to Analyze Data Completeness of Patient Data.

    PubMed

    Nasir, Ayan; Gurupur, Varadraj; Liu, Xinliang

    2016-08-03

    There is a need to develop a tool that will measure data completeness of patient records using sophisticated statistical metrics. Patient data integrity is important in providing timely and appropriate care. Completeness is an important step, with an emphasis on understanding the complex relationships between data fields and their relative importance in delivering care. This tool will not only help understand where data problems are but also help uncover the underlying issues behind them. The objective was to develop a tool that can be used alongside a variety of health care database software packages to determine the completeness of individual patient records as well as aggregate patient records across health care centers and subpopulations. The methodology of this project is encapsulated within the Data Completeness Analysis Package (DCAP) tool, with the major components including concept mapping, CSV parsing, and statistical analysis. The results from testing DCAP with Healthcare Cost and Utilization Project (HCUP) State Inpatient Database (SID) data show that this tool is successful in identifying relative data completeness at the patient, subpopulation, and database levels. These results also solidify a need for further analysis and call for hypothesis-driven research to find underlying causes for data incompleteness. DCAP examines patient records and generates statistics that can be used to determine the completeness of individual patient data as well as the general thoroughness of record keeping in a medical database. DCAP uses a component that is customized to the settings of the software package used for storing patient data as well as a Comma Separated Values (CSV) file parser to determine the appropriate measurements. DCAP itself is assessed through a proof of concept exercise using hypothetical data as well as available HCUP SID patient data.
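The per-record completeness measure DCAP computes over CSV files can be illustrated with a small standard-library sketch; the field names, uniform weights, and scoring rule here are assumptions for illustration, not DCAP's actual implementation.

```python
import csv
import io

def completeness(rows, fields, weights=None):
    """Per-record fraction of non-empty fields, optionally weighted by importance."""
    weights = weights or {f: 1.0 for f in fields}
    total = sum(weights.values())
    return [
        sum(weights[f] for f in fields if (row.get(f) or "").strip()) / total
        for row in rows
    ]

# Hypothetical patient records with some missing cells.
data = io.StringIO(
    "id,diagnosis,discharge_status\n"
    "1,J18.9,home\n"
    "2,,home\n"
    "3,I10,\n"
)
rows = list(csv.DictReader(data))
scores = completeness(rows, ["id", "diagnosis", "discharge_status"])
print(scores)
```

Aggregating the per-record scores (mean, distribution by subpopulation) would give the database-level statistics the abstract describes.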

  20. Spreadsheet Toolkit for Ulysses Hi-Scale Measurements of Interplanetary Ions and Electrons

    NASA Astrophysics Data System (ADS)

    Reza, J. Z.; Lanzerotti, L. J.; Denker, C.; Patterson, D.; Amstrong, T. P.

    2004-05-01

    Throughout the entire Ulysses out-of-the-ecliptic solar polar mission, the Heliosphere Instrument for Spectra, Composition, and Anisotropy at Low Energies (HI-SCALE) has collected measurements of interplanetary ions and electrons. Time-series of electron and ion fluxes obtained since 1990 have been carefully calibrated and will be stored in a data management system, which will be publicly accessible via the WWW. The goal of the Virtual Solar Observatory (VSO) is to provide data uniformly and efficiently to a diverse user community. However, data dissemination can only be a first step, which has to be followed by a suite of data analysis tools that are tailored towards a diverse user community in science, technology, and education. The widespread use and familiarity of spreadsheets, which are available at low cost or open source for many operating systems, make them an interesting tool to investigate for the analysis of HI-SCALE data. The data are written in comma-separated value (CSV) format, which is commonly used in spreadsheet programs. CSV files can simply be linked as external data to spreadsheet templates, which in turn can be used to generate tables and figures of basic statistical properties and frequency distributions, temporal evolution of electron and ion spectra, comparisons of various energy channels, automatic detection of solar events, solar cycle variations, and space weather. Exploring spreadsheet-assisted data analysis in the context of information technology research, database information search and retrieval, and data visualization potentially impacts other VSO components, where diverse user communities are targeted. Finally, this presentation is the result of an undergraduate research project, which will allow us to evaluate the performance of user-based spreadsheet analysis "benchmarked" at the undergraduate skill level.
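The spreadsheet-style summary described above, basic statistics and simple event detection over CSV flux data, can be sketched in a few lines of Python; the file layout and the one-sigma event threshold are hypothetical, not the mission's actual criteria.

```python
import csv
import io
import statistics

# Hypothetical HI-SCALE-like export: elapsed time and one electron-channel flux.
data = io.StringIO(
    "time_hr,e1_flux\n"
    "0,10.1\n"
    "1,11.0\n"
    "2,55.3\n"
    "3,10.5\n"
)
rows = list(csv.DictReader(data))
flux = [float(r["e1_flux"]) for r in rows]

mean = statistics.mean(flux)
spread = statistics.stdev(flux)
# Flag samples more than one standard deviation above the mean
# (an illustrative event threshold, not a published criterion).
events = [r["time_hr"] for r, v in zip(rows, flux) if v > mean + spread]
print(mean, events)
```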

  1. A New Paradigm to Analyze Data Completeness of Patient Data

    PubMed Central

    Nasir, Ayan; Liu, Xinliang

    2016-01-01

    Background: There is a need to develop a tool that will measure data completeness of patient records using sophisticated statistical metrics. Patient data integrity is important in providing timely and appropriate care. Completeness is an important step, with an emphasis on understanding the complex relationships between data fields and their relative importance in delivering care. This tool will not only help understand where data problems are but also help uncover the underlying issues behind them. Objectives: Develop a tool that can be used alongside a variety of health care database software packages to determine the completeness of individual patient records as well as aggregate patient records across health care centers and subpopulations. Methods: The methodology of this project is encapsulated within the Data Completeness Analysis Package (DCAP) tool, with the major components including concept mapping, CSV parsing, and statistical analysis. Results: The results from testing DCAP with Healthcare Cost and Utilization Project (HCUP) State Inpatient Database (SID) data show that this tool is successful in identifying relative data completeness at the patient, subpopulation, and database levels. These results also solidify a need for further analysis and call for hypothesis-driven research to find underlying causes for data incompleteness. Conclusion: DCAP examines patient records and generates statistics that can be used to determine the completeness of individual patient data as well as the general thoroughness of record keeping in a medical database. DCAP uses a component that is customized to the settings of the software package used for storing patient data as well as a Comma Separated Values (CSV) file parser to determine the appropriate measurements. DCAP itself is assessed through a proof of concept exercise using hypothetical data as well as available HCUP SID patient data. PMID:27484918

  2. The cortical spatiotemporal correlate of otolith stimulation: Vestibular evoked potentials by body translations.

    PubMed

    Ertl, M; Moser, M; Boegle, R; Conrad, J; Zu Eulenburg, P; Dieterich, M

    2017-07-15

    The vestibular organ senses linear and rotational acceleration of the head during active and passive motion. These signals are necessary for bipedal locomotion, navigation, and the coordination of eye and head movements in 3D space. The temporal dynamics of vestibular processing in cortical structures have hardly been studied in humans, let alone with natural stimulation. The aim was to investigate the cortical vestibular network related to natural otolith stimulation using a hexapod motion platform. We conducted two experiments: 1. to estimate the sources of the vestibular evoked potentials (VestEPs) by means of distributed source localization (n=49), and 2. to reveal modulations of the VestEPs through the underlying acceleration intensity (n=24). For both experiments subjects were accelerated along the main axes (left/right, up/down, fore/aft) while the EEG was recorded. We were able to identify five VestEPs (P1, N1, P2, N2, P3) with latencies between 38 and 461 ms, as well as an evoked beta-band response peaking with a latency of 68 ms, in all subjects and for all acceleration directions. Source localization gave the cingulate sulcus visual (CSv) area and the opercular-insular region as the main origin of the evoked potentials. No lateralization effects due to handedness could be observed. In the second experiment, area CSv was shown to be integral in the processing of acceleration intensities as sensed by the otolith organs, hinting at its potential role in ego-motion detection. These robust VestEPs could be used to investigate the mechanisms of inter-regional interaction in the natural context of vestibular processing and multisensory integration. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Rosetta: Ensuring the Preservation and Usability of ASCII-based Data into the Future

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Arms, S. C.

    2015-12-01

    Field data obtained from dataloggers often take the form of comma separated value (CSV) ASCII text files. While ASCII based data formats have positive aspects, such as the ease of accessing the data from disk and the wide variety of tools available for data analysis, there are some drawbacks, especially when viewing the situation through the lens of data interoperability and stewardship. The Unidata data translation tool, Rosetta, is a web-based service that provides an easy, wizard-based interface for data collectors to transform their datalogger generated ASCII output into Climate and Forecast (CF) compliant netCDF files following the CF-1.6 discrete sampling geometries. These files are complete with metadata describing what data are contained in the file, the instruments used to collect the data, and other critical information that otherwise may be lost in one of many README files. The choice of the machine readable netCDF data format and data model, coupled with the CF conventions, ensures long-term preservation and interoperability, and that future users will have enough information to responsibly use the data. However, with the understanding that the observational community appreciates the ease of use of ASCII files, methods for transforming the netCDF back into a CSV or spreadsheet format are also built-in. One benefit of translating ASCII data into a machine readable format that follows open community-driven standards is that they are instantly able to take advantage of data services provided by the many open-source data server tools, such as the THREDDS Data Server (TDS). While Rosetta is currently a stand-alone service, this talk will also highlight efforts to couple Rosetta with the TDS, thus allowing self-publishing of thoroughly documented datasets by the data producers themselves.
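Rosetta's central move, attaching machine-readable metadata to bare datalogger columns, can be illustrated with a standard-library sketch in which a JSON structure stands in for netCDF; the column names and CF attributes are hypothetical, and this is not Rosetta's actual code.

```python
import csv
import io
import json

# Hypothetical datalogger output plus CF-style attributes for its one variable.
raw = io.StringIO(
    "timestamp,airtemp\n"
    "2015-07-01T00:00:00Z,21.4\n"
    "2015-07-01T01:00:00Z,20.9\n"
)
cf_attributes = {"standard_name": "air_temperature", "units": "degC"}

rows = list(csv.DictReader(raw))
dataset = {
    "dimensions": {"time": len(rows)},
    "variables": {
        "airtemp": {
            "attributes": cf_attributes,
            "data": [float(r["airtemp"]) for r in rows],
        },
        "time": {"data": [r["timestamp"] for r in rows]},
    },
}
print(json.dumps(dataset, indent=2))
```

The point of the structure is the same as in the abstract: once the units and standard names travel with the values, the data no longer depend on a README to be interpretable.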

  4. PylotDB - A Database Management, Graphing, and Analysis Tool Written in Python

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnette, Daniel W.

    2012-01-04

    PylotDB, written completely in Python, provides a user interface (UI) with which to interact with, analyze, graph data from, and manage open source databases such as MySQL. The UI spares the user from needing in-depth knowledge of the database application programming interface (API). PylotDB allows the user to generate various kinds of plots from user-selected data; generate statistical information on text as well as numerical fields; back up and restore databases; compare database tables across different databases as well as across different servers; extract information from any field to create new fields; generate, edit, and delete databases, tables, and fields; generate or read CSV data into a table; and similar operations. Since much of the database information is brought under control of the Python computer language, PylotDB is not intended for huge databases for which MySQL and Oracle, for example, are better suited. PylotDB is better suited for smaller databases that might typically be needed in a small research group situation. PylotDB can also be used as a learning tool for database applications in general.
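Reading CSV data into a database table, one of the operations listed above, can be sketched with Python's standard library; sqlite3 stands in here for the MySQL back end PylotDB targets, and the table layout is hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical benchmark results in CSV form; sqlite3 stands in for MySQL.
raw = io.StringIO("run,walltime_s\nbaseline,12.5\ntuned,9.8\n")
reader = csv.reader(raw)
header = next(reader)

con = sqlite3.connect(":memory:")
con.execute(f"CREATE TABLE results ({header[0]} TEXT, {header[1]} REAL)")
con.executemany("INSERT INTO results VALUES (?, ?)", reader)

rows = con.execute(
    "SELECT run, walltime_s FROM results ORDER BY walltime_s"
).fetchall()
print(rows)
```

Parameterized `executemany` over the CSV reader keeps the load step one line and avoids building SQL strings from the data.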

  5. A high-speed drug interaction search system for ease of use in the clinical environment.

    PubMed

    Takada, Masahiro; Inada, Hiroshi; Nakazawa, Kazuo; Tani, Shoko; Iwata, Michiaki; Sugimoto, Yoshihisa; Nagata, Satoru

    2012-12-01

    With the advancement of pharmaceutical development, drug interactions have become increasingly complex. As a result, a computer-based drug interaction search system is required to organize the whole of drug interaction data. To overcome problems faced with the existing systems, we developed a drug interaction search system using a hash table, which offers higher processing speeds and easier maintenance operations compared with relational databases (RDB). In order to compare the performance of our system and MySQL RDB in terms of search speed, drug interaction searches were repeated for all 45 possible combinations of two out of a group of 10 drugs for two cases: 5,604 and 56,040 drug interaction data. As the principal result, our system was able to process the search approximately 19 times faster than the system using the MySQL RDB. Our system also has several other merits such as that drug interaction data can be created in comma-separated value (CSV) format, thereby facilitating data maintenance. Although our system uses the well-known method of a hash table, it is expected to resolve problems common to existing systems and to be an effective system that enables the safe management of drugs.
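The hash-table lookup the paper describes can be sketched directly in Python, whose dict is a hash table; the CSV layout and drug pairs below are hypothetical, and keying on a frozenset makes the lookup order-independent.

```python
import csv
import io

# Hypothetical interaction table: two drug names and a severity note per row.
raw = io.StringIO(
    "drug_a,drug_b,interaction\n"
    "warfarin,aspirin,increased bleeding risk\n"
    "simvastatin,clarithromycin,myopathy risk\n"
)

# A dict keyed on an order-independent frozenset gives O(1) average lookup.
table = {
    frozenset((r["drug_a"], r["drug_b"])): r["interaction"]
    for r in csv.DictReader(raw)
}

def check(d1, d2):
    """Return the interaction note for a drug pair, or None if none is listed."""
    return table.get(frozenset((d1, d2)))

print(check("aspirin", "warfarin"))  # argument order does not matter
```

Because the table is rebuilt straight from the CSV file at load time, maintaining the interaction data stays a matter of editing the CSV, as the abstract emphasizes.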

  6. Analytical methods for determination of free metal ion concentration, labile species fraction and metal complexation capacity of environmental waters: a review.

    PubMed

    Pesavento, Maria; Alberti, Giancarla; Biesuz, Raffaela

    2009-01-12

    Different experimental approaches have been suggested in the last few decades to determine metal species in complex matrices of unknown composition, such as environmental waters. The methods are mainly focused on the determination of single species or groups of species. The more recent developments in trace element speciation are reviewed, focusing on methods for labile and free metal determination. Electrochemical procedures with low detection limits, such as anodic stripping voltammetry (ASV) and competing ligand exchange with adsorption cathodic stripping voltammetry (CLE-AdCSV), have been widely employed in metal distribution studies in natural waters. Other electrochemical methods, such as stripping chronopotentiometry and AGNES, seem promising for evaluating the free metal concentration at the low levels of environmental samples. Separation techniques based on ion exchange (IE) and complexing resins (CR), and micro separation methods such as the Donnan membrane technique (DMT), diffusive gradients in thin-film gels (DGT) and the permeation liquid membrane (PLM), are among the non-electrochemical methods largely used in this field and reviewed in the text. Under appropriate conditions such techniques make possible the evaluation of free metal ion concentration.

  7. Providing web-based tools for time series access and analysis

    NASA Astrophysics Data System (ADS)

    Eberle, Jonas; Hüttich, Christian; Schmullius, Christiane

    2014-05-01

    Time series information is widely used in environmental change analyses and is also essential information for stakeholders and governmental agencies. However, a challenging issue is the processing of raw data and the execution of time series analysis. In most cases, data has to be found, downloaded, processed and even converted into the correct data format prior to executing time series analysis tools. Data has to be prepared to use it in different existing software packages. Several packages like TIMESAT (Jönsson & Eklundh, 2004) for phenological studies, BFAST (Verbesselt et al., 2010) for breakpoint detection, and GreenBrown (Forkel et al., 2013) for trend calculations are provided as open-source software and can be executed from the command line. This is needed if data pre-processing and time series analysis are to be automated. To bring both parts, automated data access and data analysis, together, a web-based system was developed to provide access to satellite-based time series data and to the above-mentioned analysis tools. Users of the web portal are able to specify a point or a polygon and an available dataset (e.g., Vegetation Indices and Land Surface Temperature datasets from NASA MODIS). The data is then processed and provided as a time series CSV file. Afterwards the user can select an analysis tool that is executed on the server. The final data (CSV, plot images, GeoTIFFs) is visualized in the web portal and can be downloaded for further usage. As a first use case, we built up a complimentary web-based system with NASA MODIS products for Germany and parts of Siberia based on the Earth Observation Monitor (www.earth-observation-monitor.net). The aim of this work is to make time series analysis with existing tools as easy as possible so that users can focus on the interpretation of the results. References: Jönsson, P. and L. Eklundh (2004). TIMESAT - a program for analysing time-series of satellite sensor data.
Computers and Geosciences 30, 833-845. Verbesselt, J., R. Hyndman, G. Newnham and D. Culvenor (2010). Detecting trend and seasonal changes in satellite image time series. Remote Sensing of Environment, 114, 106-115. DOI: 10.1016/j.rse.2009.08.014 Forkel, M., N. Carvalhais, J. Verbesselt, M. Mahecha, C. Neigh and M. Reichstein (2013). Trend Change Detection in NDVI Time Series: Effects of Inter-Annual Variability and Methodology. Remote Sensing 5, 2113-2144.
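A minimal version of a trend calculation over such a time-series CSV file, an ordinary least-squares slope computed with the standard library, might look like this; the column names and values are hypothetical.

```python
import csv
import io

# Hypothetical time-series CSV as the portal might return it.
raw = io.StringIO("t,ndvi\n0,0.30\n1,0.34\n2,0.37\n3,0.42\n")
pts = [(float(r["t"]), float(r["ndvi"])) for r in csv.DictReader(raw)]

# Ordinary least-squares slope over the series.
n = len(pts)
sx = sum(t for t, _ in pts)
sy = sum(v for _, v in pts)
sxx = sum(t * t for t, _ in pts)
sxy = sum(t * v for t, v in pts)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
print(round(slope, 4))
```

The packages cited above go well beyond this, handling seasonality, breakpoints and significance testing, but all consume exactly this kind of two-column series.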

  8. Determination of the spinel group end-members based on electron microprobe analyses

    NASA Astrophysics Data System (ADS)

    Ferracutti, Gabriela R.; Gargiulo, M. Florencia; Ganuza, M. Luján; Bjerg, Ernesto A.; Castro, Silvia M.

    2015-04-01

    The spinel group minerals have been the focus of many studies, not only because of their economic interest, but also because they are very useful as petrogenetic indicators. The application End-Members Generator (EMG) makes it possible to establish, based on electron microprobe analyses (EMPA), the 19 end-members of the spinel group: MgAl2O4 (Spinel sensu stricto, s.s.), FeAl2O4 (Hercynite), MnAl2O4 (Galaxite), ZnAl2O4 (Gahnite), MgFe2O4 (Magnesioferrite), Fe3O4 (Magnetite), MnFe2O4 (Jacobsite), ZnFe2O4 (Franklinite), NiFe2O4 (Trevorite), MgCr2O4 (Magnesiochromite), FeCr2O4 (Chromite), MnCr2O4 (Manganochromite), ZnCr2O4 (Zincochromite), NiCr2O4 (Nichromite), MgV2O4 (Magnesiocoulsonite), FeV2O4 (Coulsonite), MnV2O4 (Vuorelainenite), Mg2TiO4 (Qandilite) and Fe2TiO4 (Ulvöspinel). EMG is an application that does not require an installation process and was created with the purpose of performing calculations to obtain: cation proportions (per formula unit, p.f.u.), end-members of the spinel group, redistribution proportions for the corresponding end-members in the Magnetite prism or Ulvöspinel prism, and a data validation section to check the results. EMG accepts .csv data files and the results obtained can be used to represent a given dataset with the SpinelViz program or any other 2D and/or 3D graph plotting software.
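The first step of such a recalculation, converting oxide weight percent to molar proportions, can be sketched as follows; the analysis values are hypothetical, and the full normalisation to cations per formula unit that EMG performs is omitted.

```python
# Molar masses in g/mol for a few spinel-forming oxides.
MOLAR_MASS = {"MgO": 40.304, "Al2O3": 101.961, "FeO": 71.844, "Cr2O3": 151.990}

def molar_proportions(wt_pct):
    """Convert oxide weight percent to molar proportions."""
    return {ox: wt / MOLAR_MASS[ox] for ox, wt in wt_pct.items()}

# Hypothetical analysis close to ideal spinel sensu stricto, MgAl2O4.
analysis = {"MgO": 28.3, "Al2O3": 71.2}
moles = molar_proportions(analysis)
ratio = moles["Al2O3"] / moles["MgO"]  # near 1 for ideal MgAl2O4
print(round(ratio, 2))
```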

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allan, Benjamin A.

    We report on the use and design of a portable, extensible performance data collection tool motivated by modeling needs of the high performance computing systems co-design community. The lightweight performance data collectors with Eiger support are intended as a tailorable tool, not a shrink-wrapped library product, as profiling needs vary widely. A single code markup scheme is reported which, based on compilation flags, can send performance data from parallel applications to CSV files, to an Eiger mysql database, or (in a non-database environment) to flat files for later merging and loading on a host with mysql available. The tool supports C, C++, and Fortran applications.

  10. Working with HITRAN Database Using Hapi: HITRAN Application Programming Interface

    NASA Astrophysics Data System (ADS)

    Kochanov, Roman V.; Hill, Christian; Wcislo, Piotr; Gordon, Iouli E.; Rothman, Laurence S.; Wilzewski, Jonas

    2015-06-01

    A HITRAN Application Programming Interface (HAPI) has been developed to give users much more flexibility and power on their local machines. HAPI is a programming interface for the main data-searching capabilities of the new "HITRANonline" web service (http://www.hitran.org). It provides the possibility to query spectroscopic data from the HITRAN database in a flexible manner using either functions or a query language. Some of the prominent current features of HAPI are: a) Downloading line-by-line data from the HITRANonline site to a local machine; b) Filtering and processing the data in SQL-like fashion; c) Conventional Python structures (lists, tuples, and dictionaries) for representing spectroscopic data; d) Possibility to use a large set of third-party Python libraries to work with the data; e) Python implementation of the HT lineshape which can be reduced to a number of conventional line profiles; f) Python implementation of total internal partition sums (TIPS-2011) for spectra simulations; g) High-resolution spectra calculation accounting for pressure, temperature and optical path length; h) Providing instrumental functions to simulate experimental spectra; i) Possibility to extend HAPI's functionality by custom line profiles, partition sums and instrumental functions. Currently the API is a module written in Python that uses the Numpy library, providing fast array operations. The API is designed to deal with data in multiple formats such as ASCII, CSV, HDF5 and XSAMS. This work has been supported by NASA Aura Science Team Grant NNX14AI55G and NASA Planetary Atmospheres Grant NNX13AI59G. L.S. Rothman et al. JQSRT, Volume 130, 2013, Pages 4-50. N.H. Ngo et al. JQSRT, Volume 129, November 2013, Pages 89-100. A. L. Laraia et al. Icarus, Volume 215, Issue 1, September 2011, Pages 391-400.
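The SQL-like filtering idea in feature (b) can be illustrated with a generic standard-library sketch; the records and predicate below are made up, and this is not hapi's actual API.

```python
# Made-up line records with HITRAN-like fields (molecule id, wavenumber,
# line intensity); the predicate-based filter mimics the SQL-like idea.
lines = [
    {"molec_id": 1, "nu": 3657.1, "sw": 2.6e-20},
    {"molec_id": 1, "nu": 3755.9, "sw": 1.0e-19},
    {"molec_id": 2, "nu": 2349.1, "sw": 3.5e-18},
]

def select(records, where):
    """Return the records satisfying a predicate."""
    return [r for r in records if where(r)]

strong_h2o = select(lines, lambda r: r["molec_id"] == 1 and r["sw"] > 5e-20)
print([r["nu"] for r in strong_h2o])
```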

  11. The mzqLibrary – An open source Java library supporting the HUPO‐PSI quantitative proteomics standard

    PubMed Central

    Zhang, Huaizhong; Fan, Jun; Perkins, Simon; Pisconti, Addolorata; Simpson, Deborah M.; Bessant, Conrad; Hubbard, Simon; Jones, Andrew R.

    2015-01-01

    The mzQuantML standard has been developed by the Proteomics Standards Initiative for capturing, archiving and exchanging quantitative proteomic data, derived from mass spectrometry. It is a rich XML-based format, capable of representing data about two-dimensional features from LC-MS data, and peptides, proteins or groups of proteins that have been quantified from multiple samples. In this article we report the development of an open source Java-based library of routines for mzQuantML, called the mzqLibrary, and associated software for visualising data called the mzqViewer. The mzqLibrary contains routines for mapping (peptide) identifications on quantified features, inference of protein (group)-level quantification values from peptide-level values, normalisation and basic statistics for differential expression. These routines can be accessed via the command line, via a Java programming interface, or via a basic graphical user interface. The mzqLibrary also contains several file format converters, including import converters (to mzQuantML) from OpenMS, Progenesis LC-MS and MaxQuant, and exporters (from mzQuantML) to other standards or useful formats (mzTab, HTML, csv). The mzqViewer contains in-built routines for viewing the tables of data (about features, peptides or proteins), and connects to the R statistical library for more advanced plotting options. The mzqLibrary and mzqViewer packages are available from https://code.google.com/p/mzq-lib/. PMID:26037908

  12. The mzqLibrary--An open source Java library supporting the HUPO-PSI quantitative proteomics standard.

    PubMed

    Qi, Da; Zhang, Huaizhong; Fan, Jun; Perkins, Simon; Pisconti, Addolorata; Simpson, Deborah M; Bessant, Conrad; Hubbard, Simon; Jones, Andrew R

    2015-09-01

    The mzQuantML standard has been developed by the Proteomics Standards Initiative for capturing, archiving and exchanging quantitative proteomic data, derived from mass spectrometry. It is a rich XML-based format, capable of representing data about two-dimensional features from LC-MS data, and peptides, proteins or groups of proteins that have been quantified from multiple samples. In this article we report the development of an open source Java-based library of routines for mzQuantML, called the mzqLibrary, and associated software for visualising data called the mzqViewer. The mzqLibrary contains routines for mapping (peptide) identifications on quantified features, inference of protein (group)-level quantification values from peptide-level values, normalisation and basic statistics for differential expression. These routines can be accessed via the command line, via a Java programming interface, or via a basic graphical user interface. The mzqLibrary also contains several file format converters, including import converters (to mzQuantML) from OpenMS, Progenesis LC-MS and MaxQuant, and exporters (from mzQuantML) to other standards or useful formats (mzTab, HTML, csv). The mzqViewer contains in-built routines for viewing the tables of data (about features, peptides or proteins), and connects to the R statistical library for more advanced plotting options. The mzqLibrary and mzqViewer packages are available from https://code.google.com/p/mzq-lib/. © 2015 The Authors. PROTEOMICS Published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Automated Estimation of the Orbital Parameters of Jupiter's Moons

    NASA Astrophysics Data System (ADS)

    Western, Emma; Ruch, Gerald T.

    2016-01-01

    Every semester the Physics Department at the University of St. Thomas has the Physics 104 class complete a Jupiter lab. This involves taking around twenty images of Jupiter and its moons with the telescope at the University of St. Thomas Observatory over the course of a few nights. The students then take each image and find the distance from each moon to Jupiter and plot the distances versus the elapsed time for the corresponding image. Students use the plot to fit four sinusoidal curves of the moons of Jupiter. I created a script that automates this process for the professor. It takes the list of images and creates a region file used by the students to measure the distance from the moons to Jupiter, a png image that is the graph of all the data points and the fitted curves of the four moons, and a csv file that contains the list of images, the date and time each image was taken, the elapsed time since the first image, and the distances to Jupiter for Io, Europa, Ganymede, and Callisto. This is important because it lets the professor spend more time working with the students and answering questions as opposed to spending time fitting the curves of the moons on the graph, which can be time consuming.
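For a known orbital period, the curve fitting the script automates reduces to linear least squares on sine and cosine components; a sketch with synthetic data (the period and amplitude below are illustrative, not measured values) might look like this.

```python
import math

def fit_amplitude(ts, ys, period):
    """Least-squares amplitude of y = a*sin(wt) + b*cos(wt) for a known period."""
    w = 2 * math.pi / period
    s = [math.sin(w * t) for t in ts]
    c = [math.cos(w * t) for t in ts]
    # Solve the 2x2 normal equations for a and b.
    ss = sum(x * x for x in s)
    cc = sum(x * x for x in c)
    sc = sum(u * v for u, v in zip(s, c))
    sy = sum(u * y for u, y in zip(s, ys))
    cy = sum(v * y for v, y in zip(c, ys))
    det = ss * cc - sc * sc
    a = (sy * cc - cy * sc) / det
    b = (cy * ss - sy * sc) / det
    return math.hypot(a, b)

# Synthetic distances for an Io-like moon: 1.77-day period, amplitude 2.8.
ts = [0.2 * k for k in range(20)]
ys = [2.8 * math.sin(2 * math.pi * t / 1.77 + 0.4) for t in ts]
print(round(fit_amplitude(ts, ys, 1.77), 3))
```

Fixing the period turns the otherwise nonlinear sinusoid fit into a linear problem, which is why it can be solved in closed form per moon.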

  14. Teletesting at IABG - Technical Features and Security Issues

    NASA Astrophysics Data System (ADS)

    Goerner, E.

    2004-08-01

    In the space simulation department at IABG data handling systems are used to collect, evaluate and present all data gathered from different test chambers during thermal vacuum tests. In the year 2000 a redesign of the existing data handling systems gave us the opportunity to add some features like ethernet- based client / server systems and internet protocol TCP / IP. The results were state of the art internet-ready data handling systems. Based on this we started mid 2002 with a new project called teletesting to give our customers remote access to test data. For the realisation TCO (Total Cost of Ownership), QoS (Quality of Service), data confidentiality, restrictive access to test data and a plain and simple user interface with standard components, i.e. normal PC hardware and software, were mandatory. As a result of this project, our customers have now online access to their test data in CSV/EXCEL format, in display mode either in numerical or graphical form and through DynaWorks. ISDN teletesting is already used by our customers, internet teletesting is in test mode but some parts have already been approved and used. Although an extension to teleoperation is implemented in the control systems (WIN CC) of our test chambers, it is not yet in use.

  15. The Particle Physics Playground website: tutorials and activities using real experimental data

    NASA Astrophysics Data System (ADS)

    Bellis, Matthew; CMS Collaboration

    2016-03-01

    The CERN Open Data Portal provides access to data from the LHC experiments to anyone with the time and inclination to learn the analysis procedures. The CMS experiment has made a significant amount of data available in basically the same format the collaboration itself uses, along with software tools and a virtual environment in which to run those tools. These same data have also been mined for educational exercises that range from very simple .csv files that can be analyzed in a spreadsheet to more sophisticated formats that use ROOT, a dominant software package in experimental particle physics but not used as much in the general computing community. This talk will present the Particle Physics Playground website (http://particle-physics-playground.github.io/), a project that uses data from the CMS experiment, as well as the older CLEO experiment, in tutorials and exercises aimed at high school and undergraduate students and other science enthusiasts. The data are stored as text files and the users are provided with starter Python/Jupyter notebook programs and accessor functions which can be modified to perform fairly high-level analyses. The status of the project, success stories, and future plans for the website will be presented. This work was supported in part by NSF Grant PHY-1307562.
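A typical exercise of the kind described, computing an invariant mass from a simple .csv of particle four-vectors, can be written with the standard library alone; the column names and event values below are hypothetical, not the playground's actual files.

```python
import csv
import io
import math

# Hypothetical two-muon event: energy and momentum components in GeV.
raw = io.StringIO(
    "e1,px1,py1,pz1,e2,px2,py2,pz2\n"
    "45.6,1.2,3.4,45.4,45.6,-1.2,-3.4,-45.4\n"
)

for row in csv.DictReader(raw):
    e = float(row["e1"]) + float(row["e2"])
    px = float(row["px1"]) + float(row["px2"])
    py = float(row["py1"]) + float(row["py2"])
    pz = float(row["pz1"]) + float(row["pz2"])
    # Invariant mass of the pair: m^2 = E^2 - |p|^2 (natural units).
    mass = math.sqrt(e * e - px * px - py * py - pz * pz)
    print(round(mass, 1))
```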

  16. lcps: Light curve pre-selection

    NASA Astrophysics Data System (ADS)

    Schlecker, Martin

    2018-05-01

lcps searches for transit-like features (i.e., dips) in photometric data. Its main purpose is to restrict large sets of light curves to a number of files that show interesting behavior, such as drops in flux. While lcps is adaptable to any format of time series, its I/O module is designed specifically for photometry from the Kepler spacecraft. It extracts the pre-conditioned PDCSAP data from light curve files created by the standard Kepler pipeline, and can also handle CSV-formatted ASCII files. lcps uses a sliding-window technique to compare a section of a flux time series with its surroundings. A dip is detected if the flux within the window is lower than a threshold fraction of the surrounding fluxes.
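The sliding-window comparison described above can be sketched in a few lines of Python. This is an illustration of the idea, not lcps's actual implementation; the window size and threshold are arbitrary choices.

```python
def find_dips(flux, window=5, threshold=0.99):
    """Flag window positions whose mean flux falls below `threshold` times
    the mean flux of the flanking regions (illustrative parameters)."""
    dips = []
    for start in range(window, len(flux) - 2 * window):
        inside = flux[start:start + window]
        around = flux[start - window:start] + flux[start + window:start + 2 * window]
        if sum(inside) / len(inside) < threshold * sum(around) / len(around):
            dips.append(start)
    return dips

# A flat synthetic light curve with a 1.5% transit-like drop:
lc = [1.0] * 50
for i in range(22, 27):
    lc[i] = 0.985
print(find_dips(lc))  # window positions overlapping the dip
```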

  17. Two-dimensional thermography image retrieval from zig-zag scanned data with TZ-SCAN

    NASA Astrophysics Data System (ADS)

    Okumura, Hiroshi; Yamasaki, Ryohei; Arai, Kohei

    2008-10-01

TZ-SCAN is a simple and low-cost thermal imaging device which consists of a single-point radiation thermometer on a tripod with a pan-tilt rotator, a DC motor controller board with a USB interface, and a laptop computer for rotator control, data acquisition, and data processing. TZ-SCAN acquires a series of zig-zag scanned data and stores the data as a CSV file. A 2-D thermal distribution image can be retrieved by using the second quefrency peak calculated from the TZ-SCAN data. An experiment was conducted to confirm the validity of the thermal retrieval algorithm. The experimental result shows sufficient accuracy for 2-D thermal distribution image retrieval.
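The quefrency-based retrieval can be illustrated with a small numpy sketch. This is an assumption-laden simplification (rows serialized in one direction rather than a true zig-zag, and the strongest non-zero quefrency peak used as a proxy for the paper's second-peak criterion): the scan-line length reappears as a cepstrum peak, which lets the 1-D data stream be folded back into a 2-D image.

```python
import numpy as np

# Synthetic stand-in for scanned data: a 30-line x 40-sample temperature
# field serialized into a 1-D stream (rows laid end to end). The line
# length is treated as unknown and must be recovered.
rng = np.random.default_rng(1)
line = np.linspace(20.0, 30.0, 40)                    # per-line temperature profile
stream = (np.tile(line, (30, 1)) + rng.normal(0.0, 0.2, (30, 40))).ravel()

# Real cepstrum: periodic structure with period P in the stream shows up
# as peaks at quefrencies P, 2P, ...
spectrum = np.abs(np.fft.rfft(stream - stream.mean()))
cepstrum = np.abs(np.fft.irfft(np.log(spectrum + 1e-12)))

# Strongest quefrency peak away from zero ~ samples per scan line;
# folding the stream at that length recovers the 2-D image.
line_len = int(np.argmax(cepstrum[10:len(stream) // 2]) + 10)
image = stream[:len(stream) - len(stream) % line_len].reshape(-1, line_len)
print(line_len, image.shape)
```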

  18. Effects of milrinone on left ventricular cardiac function during cooling in an intact animal model.

    PubMed

    Tveita, Torkjel; Sieck, Gary C

    2012-08-01

Due to adverse effects of β-receptor agonists reported when applied during hypothermia, the left ventricular (LV) cardiac effects of milrinone, a PDE3 inhibitor whose mode of action is independent of the sarcolemmal β-receptor-G protein-PKA system, were tested during cooling to 15 °C. Sprague Dawley rats were instrumented to measure LV pressure-volume changes using a Millar pressure-volume conductance catheter. Core temperature was reduced from 37 to 15 °C over 60 min using internal and external heat exchangers. Milrinone, or saline placebo, was given as a continuous i.v. infusion for 30 min at 37 °C and during cooling. In normothermic controls, continuous milrinone infusion for 90 min elevated cardiac output (CO) and stroke volume (SV) significantly. Significant differences in cardiac functional variables between the milrinone group and the saline control group during cooling to 15 °C were found: compared to saline-treated animals, throughout cooling from 33 to 15 °C SV was significantly elevated in milrinone animals, the index of LV isovolumic relaxation, Tau, was significantly better preserved, and both HR and CO were significantly higher from 33 to 24 °C. Likewise, during cooling between 33 and 28 °C, LVdP/dt(max) was also significantly higher in the milrinone group. Milrinone preserved LV systolic and diastolic function at a significantly higher level than in saline controls during cooling to 15 °C. In clear contrast to our previous results using β-receptor agonists during hypothermia, the present experiment demonstrates positive inotropic effects of milrinone on LV cardiac function during cooling to 15 °C. Copyright © 2012 Elsevier Inc. All rights reserved.

  19. WE-D-204-06: An Open Source ImageJ CatPhan Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, G

    2015-06-15

Purpose: The CatPhan is a popular QA device for assessing CT image quality. There are a number of software options which perform analysis of the CatPhan, but they offer the user little ability to adjust the analysis if it isn't running properly, and they are all expensive. An open source tool is an effective solution. Methods: To use the software, the user imports the CT as an image sequence in ImageJ, scrolls to the slice with the lateral dots, and runs the plugin. If tolerance constraints are not already created, the user is prompted to enter them or to use generic tolerances. Upon completion of the analysis, the plugin calls pdfLaTeX to compile a PDF report. There is a CSV version of the report as well. A log of the results from all CatPhan scans is kept as a CSV file; the user can use this to baseline the machine. Results: The tool is capable of detecting the orientation of the phantom. If the CatPhan was scanned backwards, one can simply flip the stack of images horizontally and proceed with the analysis. The analysis includes Sensitometry (estimating the effective beam energy), HU values and linearity, Low Contrast Visibility (using LDPE and Polystyrene), Contrast Scale, Geometric Accuracy, Slice Thickness Accuracy, Spatial Resolution (giving the MTF using the line pairs as well as the point spread function), CNR, Low Contrast Detectability (including the raw data), and Uniformity (including the Cupping Effect). Conclusion: This is a robust tool that analyzes more components of the CatPhan than other software options (with the exception of ImageOwl). It produces an elegant PDF report and keeps a log of analyses for long-term tracking of the system. Because it is open source, users are able to customize any component of it.

  20. Gap Assessment (FY 13 Update)

    DOE Data Explorer

    Getman, Dan

    2013-09-30

To help guide its future data collection efforts, the DOE GTO funded a data gap analysis in FY2012 to identify high-potential hydrothermal areas where critical data are needed. This analysis was updated in FY2013, and the resulting datasets are represented by this metadata. The original process was published in FY2012 and is available at https://pangea.stanford.edu/ERE/db/GeoConf/papers/SGW/2013/Esposito.pdf. Though there are many types of data that can be used for hydrothermal exploration, five types of exploration data were targeted for this analysis. These data types were selected for their regional reconnaissance potential and include many of the primary exploration techniques currently used by the geothermal industry: (1) well data, (2) geologic maps, (3) fault maps, (4) geochemistry data, and (5) geophysical data. To determine data coverage, metadata for exploration data (including data type, data status, and coverage information) were collected and catalogued from nodes on the National Geothermal Data System (NGDS). It is the intention of this analysis that the data be updated from this source in a semi-automated fashion as new datasets are added to the NGDS nodes. In addition to this upload, an online tool was developed to allow all geothermal data providers to access this assessment, directly add metadata themselves, and view the results of the analysis via maps of data coverage in Geothermal Prospector (http://maps.nrel.gov/gt_prospector). A grid of the contiguous U.S. was created with 88,000 10-km by 10-km grid cells, and each cell was populated with the status of data availability corresponding to the five data types. Using these five data coverage maps and the USGS Resource Potential Map, sites were identified for future data collection efforts. These sites signify both that the USGS has indicated high favorability of occurrence of geothermal resources and that data gaps exist.
The uploaded data are contained in two data files for each data category. The first file contains the grid and is in the SHP (shapefile) format; each populated grid cell represents a 10-km by 10-km area within which data are known to exist. The second file is a CSV (comma-separated value) file that contains all of the individual layers that intersected with the grid. This CSV can be joined with the map to retrieve a list of datasets that are available at any given site. The attributes in the CSV include: (1) grid_id: the id of the grid cell that the data intersect with; (2) title: the name of the WFS service that intersected with this grid cell; (3) abstract: the description of the WFS service that intersected with this grid cell; (4) gap_type: the category of data availability that these data fall within (as the current processing pulls data from NGDS, this category universally represents data that are available in the NGDS and ready for acquisition for analytic purposes); (5) proprietary_type: whether the data are considered proprietary; (6) service_type: the type of service; (7) base_url: the service URL.
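The grid_id join described above is a one-pass grouping. A hedged sketch with made-up records (the titles and abstracts are invented; only the column names come from the attribute list):

```python
import csv
import io
from collections import defaultdict

# Made-up rows following the described layout.
layers_csv = io.StringIO(
    "grid_id,title,abstract,gap_type\n"
    "1042,Nevada well logs,state well database,available in NGDS\n"
    "1042,Quaternary fault map,mapped fault traces,available in NGDS\n"
    "2077,Regional gravity survey,gridded gravity data,available in NGDS\n"
)

# Group dataset titles by grid cell, mirroring the CSV-to-grid join:
# each 10-km cell ends up with the list of datasets that intersect it.
datasets_by_cell = defaultdict(list)
for row in csv.DictReader(layers_csv):
    datasets_by_cell[row["grid_id"]].append(row["title"])

print(datasets_by_cell["1042"])  # → ['Nevada well logs', 'Quaternary fault map']
```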

  1. Soil moisture datasets at five sites in the central Sierra Nevada and northern Coast Ranges, California

    USGS Publications Warehouse

    Stern, Michelle A.; Anderson, Frank A.; Flint, Lorraine E.; Flint, Alan L.

    2018-05-03

    In situ soil moisture datasets are important inputs used to calibrate and validate watershed, regional, or statewide modeled and satellite-based soil moisture estimates. The soil moisture dataset presented in this report includes hourly time series of the following: soil temperature, volumetric water content, water potential, and total soil water content. Data were collected by the U.S. Geological Survey at five locations in California: three sites in the central Sierra Nevada and two sites in the northern Coast Ranges. This report provides a description of each of the study areas, procedures and equipment used, processing steps, and time series data from each site in the form of comma-separated values (.csv) tables.

  2. Figure 3

    EPA Pesticide Factsheets

The Figure.tar.gz contains a directory for each WRF ensemble run. In these directories are *.csv files for each meteorology variable examined. These are comma-delimited text files that contain statistics for each observation site. Also provided is an R script that reads these files (the user would need to change the directory pointers), computes the variability of error and bias of the ensemble at each site, and plots these to reproduce figure 3. This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. Journal of Geophysical Research-Atmospheres. American Geophysical Union, Washington, DC, USA, 120(23): 12,259-12,280, (2015).
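The two per-site statistics can be computed in a few lines. The sketch below uses invented numbers and plain Python rather than the R script that ships with the dataset, taking bias as the mean model-minus-observation residual and error as the RMSE (a common convention that the paper may refine):

```python
import math

# Invented model/observation pairs for one site (e.g., temperature in K).
obs = [281.2, 283.0, 285.4, 284.1]
model = [281.9, 282.4, 286.0, 285.0]

residuals = [m - o for m, o in zip(model, obs)]
bias = sum(residuals) / len(residuals)                            # mean error
rmse = math.sqrt(sum(r * r for r in residuals) / len(residuals))  # root-mean-square error
print(round(bias, 3), round(rmse, 3))  # → 0.4 0.711
```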

  3. Sensor Fish Communicator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

The Sensor Fish collects information that can be used to evaluate conditions encountered by juvenile salmonids and other fish as they pass through hydroelectric dams on their way to the ocean. Sensor Fish are deployed in turbines, spillways, and sluiceways and measure changes in pressure, angular rate of change, and linear acceleration during passage. Software is needed to make the Sensor Fish fully functional and easy to use. Sensor Fish Communicator (SFC) links to the Sensor Fish, allowing users to control data collection settings and download data. It may also be used to convert native raw data (.raw2) files into Comma Separated Value (.csv) files and plot the results. The multiple capabilities of the SFC allow hardware communication, data conversion, and data plotting with one application.

  4. Qcorp: an annotated classification corpus of Chinese health questions.

    PubMed

    Guo, Haihong; Na, Xu; Li, Jiao

    2018-03-22

Health question-answering (QA) systems have become a typical application scenario of artificial intelligence (AI). An annotated question corpus is a prerequisite for training machines to understand the health information needs of users. Thus, we aimed to develop an annotated classification corpus of Chinese health questions (Qcorp) and make it openly accessible. We developed a two-layered classification schema and corresponding annotation rules on the basis of our previous work. Using the schema, we annotated 5000 questions that were randomly selected from 5 Chinese health websites within 6 broad sections. Eight annotators participated in the annotation task, and inter-annotator agreement was evaluated to ensure corpus quality. Furthermore, the distribution and relationships of the annotated tags were measured by descriptive statistics and a social network map. The questions were annotated using 7101 tags that cover 29 topic categories in the two-layered schema. In our released corpus, the distribution of questions over the top-layered categories was: treatment, 64.22%; diagnosis, 37.14%; epidemiology, 14.96%; healthy lifestyle, 10.38%; and health provider choice, 4.54%. Both the annotated health questions and the annotation schema are openly accessible on the Qcorp website; users can download the annotated Chinese questions in CSV, XML, and HTML formats. We developed a Chinese health question corpus including 5000 manually annotated questions. It is openly accessible and will contribute to the development of intelligent health QA systems.

  5. DataUp: Helping manage and archive data within the researcher's workflow

    NASA Astrophysics Data System (ADS)

    Strasser, C.

    2012-12-01

    There are many barriers to data management and sharing among earth and environmental scientists; among the most significant are lacks of knowledge about best practices for data management, metadata standards, or appropriate data repositories for archiving and sharing data. We have developed an open-source add-in for Excel and an open source web application intended to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. The researcher does not need a prior relationship with a data repository to use DataUp; the newly implemented ONEShare repository, a DataONE member node, is available for any researcher to archive and share their data. By meeting researchers where they already work, in spreadsheets, DataUp becomes part of the researcher's workflow and data management and sharing becomes easier. Future enhancement of DataUp will rely on members of the community adopting and adapting the DataUp tools to meet their unique needs, including connecting to analytical tools, adding new metadata schema, and expanding the list of connected data repositories. DataUp is a collaborative project between Microsoft Research Connections, the University of California's California Digital Library, the Gordon and Betty Moore Foundation, and DataONE.

  6. Open source software to control Bioflo bioreactors.

    PubMed

    Burdge, David A; Libourel, Igor G L

    2014-01-01

Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.

  7. Open Source Software to Control Bioflo Bioreactors

    PubMed Central

    Burdge, David A.; Libourel, Igor G. L.

    2014-01-01

Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW. PMID:24667828

  8. Efficient Delivery and Visualization of Long Time-Series Datasets Using Das2 Tools

    NASA Astrophysics Data System (ADS)

    Piker, C.; Granroth, L.; Faden, J.; Kurth, W. S.

    2017-12-01

    For over 14 years the University of Iowa Radio and Plasma Wave Group has utilized a network transparent data streaming and visualization system for most daily data review and collaboration activities. This system, called Das2, was originally designed in support of the Cassini Radio and Plasma Wave Science (RPWS) investigation, but is now relied on for daily review and analysis of Voyager, Polar, Cluster, Mars Express, Juno and other mission results. In light of current efforts to promote automatic data distribution in space physics it seems prudent to provide an overview of our open source Das2 programs and interface definitions to the wider community and to recount lessons learned. This submission will provide an overview of interfaces that define the system, describe the relationship between the Das2 effort and Autoplot and will examine handling Cassini RPWS Wideband waveforms and dynamic spectra as examples of dealing with long time-series data sets. In addition, the advantages and limitations of the current Das2 tool set will be discussed, as well as lessons learned that are applicable to other data sharing initiatives. Finally, plans for future developments including improved catalogs to support 'no-software' data sources and redundant multi-server fail over, as well as new adapters for CSV (Comma Separated Values) and JSON (Javascript Object Notation) output to support Cassini closeout and the HAPI (Heliophysics Application Programming Interface) initiative are outlined.

  9. JSATS Detector Field Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Eric Y.; Flory, Adam E.; Lamarche, Brian L.

    2014-06-01

The Juvenile Salmon Acoustic Telemetry System (JSATS) Detector is a software and hardware system that captures JSATS Acoustic Micro Transmitter (AMT) signals. The system uses hydrophones to capture acoustic signals in the water. This analog signal is then amplified and processed by the Analog to Digital Converter (ADC) and Digital Signal Processor (DSP) board in the computer. This board digitizes and processes the acoustic signal to determine if a possible JSATS tag is present. Upon detection, the data are saved to the computer for further analysis. This document details the features and functionality of the JSATS Detector software. It covers how to install, set up, and run the detector software, and also describes the raw binary waveform file format and the CSV files containing RMS values.

  10. A Standard for Sharing and Accessing Time Series Data: The Heliophysics Application Programmers Interface (HAPI) Specification

    NASA Astrophysics Data System (ADS)

    Vandegriff, J. D.; King, T. A.; Weigel, R. S.; Faden, J.; Roberts, D. A.; Harris, B. T.; Lal, N.; Boardsen, S. A.; Candey, R. M.; Lindholm, D. M.

    2017-12-01

    We present the Heliophysics Application Programmers Interface (HAPI), a new interface specification that both large and small data centers can use to expose time series data holdings in a standard way. HAPI was inspired by the similarity of existing services at many Heliophysics data centers, and these data centers have collaborated to define a single interface that captures best practices and represents what everyone considers the essential, lowest common denominator for basic data access. This low level access can serve as infrastructure to support greatly enhanced interoperability among analysis tools, with the goal being simplified analysis and comparison of data from any instrument, model, mission or data center. The three main services a HAPI server must perform are 1. list a catalog of datasets (one unique ID per dataset), 2. describe the content of one dataset (JSON metadata), and 3. retrieve numerical content for one dataset (stream the actual data). HAPI defines both the format of the query to the server, and the response from the server. The metadata is lightweight, focusing on use rather than discovery, and the data format is a streaming one, with Comma Separated Values (CSV) being required and binary or JSON streaming being optional. The HAPI specification is available at GitHub, where projects are also underway to develop reference implementation servers that data providers can adapt and use at their own sites. Also in the works are data analysis clients in multiple languages (IDL, Python, Matlab, and Java). Institutions which have agreed to adopt HAPI include Goddard (CDAWeb for data and CCMC for models), LASP at the University of Colorado Boulder, the Particles and Plasma Interactions node of the Planetary Data System (PPI/PDS) at UCLA, the Plasma Wave Group at the University of Iowa, the Space Sector at the Johns Hopkins Applied Physics Lab (APL), and the tsds.org site maintained at George Mason University. 
Over the next year, the adoption of a uniform way to access time series data is expected to significantly enhance interoperability within the Heliophysics data environment. https://github.com/hapi-server/data-specification
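The three required services map onto three URL patterns. Below is a sketch of the request URLs a minimal client would build; the server address and dataset id are placeholders, and the time.min/time.max parameter names follow the HAPI 2.x specification:

```python
from urllib.parse import urlencode

server = "https://example.org/hapi"    # hypothetical HAPI server
dataset = "spacecraft_mag"             # hypothetical dataset id

# 1. list the catalog of datasets (JSON response)
catalog_url = f"{server}/catalog"
# 2. describe the content of one dataset (JSON metadata)
info_url = f"{server}/info?" + urlencode({"id": dataset})
# 3. retrieve numerical content for one dataset (CSV stream by default)
data_url = f"{server}/data?" + urlencode({
    "id": dataset,
    "time.min": "2017-01-01T00:00:00Z",
    "time.max": "2017-01-02T00:00:00Z",
})
print(catalog_url)
print(data_url)
```

Nothing beyond HTTP GET and a CSV parser is needed on the client side, which is the interoperability point the abstract makes.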

  11. VizieR Online Data Catalog: OCARS catalog second version (Malkin, 2016)

    NASA Astrophysics Data System (ADS)

    Malkin, Z. M.

    2016-11-01

Unlike the first version, supported in 2007-2015, the second version of the OCARS catalog includes three files: ocars.txt, the main file, contains the source coordinates, source types, redshifts, and approximate magnitudes, together with commentary (this file corresponds to the first version of the OCARS catalog); ocars_m.txt contains photometric data in 13 bands (u, U, B, g, V, r, R, i, I, z, J, H, K); ocars_n.txt contains a table of corresponding source names in various catalogs (currently, only cross-identifications with IVS programs and the LQAC catalog [9] are included). The list of objects included in the OCARS catalog is formed from various astrometric and geodetic VLBI programs and catalogs in the following order: sources in the ICRF2 [2]; other sources observed in the framework of IVS programs; sources from the NASA Goddard VLBI group catalog; sources from the RFC catalog, the most complete astrometric catalog of radio sources, which is updated each quarter and contributed more than half the OCARS objects (the latest version of OCARS used the RFC-2016a catalog based on observations obtained in 1980-2015 as part of IVS and other radio astrometric programs [19-31]); and sources from the literature. Optical Characteristics of Astrometric Radio Sources (OCARS). Last revised: 27-NOV-2016. Latest update: removed 30+ RFC sources not identified in NED and optics; removed the rather long detailed statistics table, which seems not to be of interest to most users (it is always available on request); a few additions and amendments. E-mail alerts about updates are available on request. 
The URL of this file is http://www.gao.spb.ru/english/as/ac_vlbi/ocars.txt. Supplementary files: optical and IR magnitudes, http://www.gao.spb.ru/english/as/ac_vlbi/ocars_m.txt; cross-identification table, http://www.gao.spb.ru/english/as/ac_vlbi/ocars_n.txt; OCARS catalog in CSV format, http://www.gao.spb.ru/english/as/ac_vlbi/ocars.csv. Please send comments and requests to Zinovy Malkin, malkin(at)gao.spb.ru. If you use the OCARS catalog in your work, please cite: The Second Version of the OCARS Catalog of Optical Characteristics of Astrometric Radio Sources. Astronomy Reports, 2016, Vol. 60, No. 11, pp. 996-1005. DOI: 10.1134/S1063772916110032. Brief OCARS statistics (detailed statistics available on request): total number of sources, 11375 (declinations +30...+90: 3320, 29.2%; -30...+30: 6233, 54.8%; -90...-30: 1822, 16.0%); sources with known type, 6284 (55.2%), of which AGN 4704 (74.9%; comprising quasars 3027, 64.3%; BL Lac 992, 21.1%; Seyfert 384, 8.2%; blazars 89, 1.9%) and radio galaxies 1580 (25.1%); sources with redshift info, 5845 (51.4%) (+30...+90: 1803, 30.8%; -30...+30: 3329, 57.0%; -90...-30: 713, 12.2%; unreliable: 868, 14.9%); sources with known magnitude, 8133 (71.5%) (+30...+90: 2380, 29.3%; -30...+30: 4481, 55.1%; -90...-30: 1272, 15.6%); sources with both z and magnitude info, 5803 (51.0%). (3 data files).

  12. TCGA2BED: extracting, extending, integrating, and querying The Cancer Genome Atlas.

    PubMed

    Cumbo, Fabio; Fiscon, Giulia; Ceri, Stefano; Masseroli, Marco; Weitschek, Emanuel

    2017-01-03

Data extraction and integration methods are becoming essential to effectively access and take advantage of the huge amounts of heterogeneous genomics and clinical data increasingly available. In this work, we focus on The Cancer Genome Atlas (TCGA), a comprehensive archive of tumoral data containing the results of high-throughput experiments, mainly Next Generation Sequencing, for more than 30 cancer types. We propose TCGA2BED, a software tool to search and retrieve TCGA data and convert them into the structured BED format for seamless use and integration. Additionally, it supports conversion into the CSV, GTF, JSON, and XML standard formats. Furthermore, TCGA2BED extends TCGA data with information extracted from other genomic databases (i.e., NCBI Entrez Gene, HGNC, UCSC, and miRBase). We also provide and maintain an automatically updated data repository with publicly available Copy Number Variation, DNA-methylation, DNA-seq, miRNA-seq, and RNA-seq (V1, V2) experimental data of TCGA converted into the BED format, together with their associated clinical and biospecimen metadata in attribute-value text format. The availability of the valuable TCGA data in BED format reduces the time spent in taking advantage of them: it is possible to efficiently and effectively deal with huge amounts of cancer genomic data integratively, and to search, retrieve, and extend them with additional information. The BED format facilitates investigators in performing knowledge discovery analyses on all tumor types in TCGA, with the final aim of understanding pathological mechanisms and aiding cancer treatments.
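For readers unfamiliar with the target format: a BED record is a tab-separated line with a chromosome name followed by 0-based, half-open coordinates. The helper below is a hypothetical illustration of that convention, not TCGA2BED's own code:

```python
# Hypothetical helper illustrating the BED convention: tab-separated
# fields, start converted from 1-based inclusive to 0-based half-open.
def to_bed(chrom, start_1based, end, name, score=0, strand="."):
    fields = [chrom, str(start_1based - 1), str(end), name, str(score), strand]
    return "\t".join(fields)

print(to_bed("chr1", 1001, 1050, "some_feature"))
```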

  13. The use of an automated interactive voice response system to manage medication identification calls to a poison center.

    PubMed

    Krenzelok, Edward P; Mrvos, Rita

    2009-05-01

In 2007, medication identification requests (MIRs) accounted for 26.2% of all calls to U.S. poison centers. MIRs are documented with minimal information, but they still require an inordinate amount of work by specialists in poison information (SPIs). An analysis was undertaken to identify options to reduce the impact of MIRs on both human and financial resources. All MIRs (2003-2007) to a certified regional poison information center were analyzed to determine call patterns and staffing. The data were used to justify an efficient and cost-effective solution. MIRs represented 42.3% of the 2007 call volume. Optimal staffing would require hiring an additional four full-time-equivalent SPIs. An interactive voice response (IVR) system was developed to respond to the MIRs. The IVR was used to develop the Medication Identification System, which allowed the diversion of up to 50% of the MIRs, enhancing surge capacity and allowing specialists to address the more emergent poison exposure calls. This technology is an entirely voice-activated call management system that collects zip code, age, gender, and drug data and stores all responses as .csv files for reporting purposes. The query bank includes the 200 most common MIRs, and the system features text-to-voice synthesis that allows easy modification of the drug identification menu. Callers always have the option of engaging an SPI at any time during the IVR call flow. The IVR is an efficient and effective alternative that creates better staff utilization.

  14. A GRASS GIS module to obtain an estimation of glacier behavior under climate change: A pilot study on Italian glacier

    NASA Astrophysics Data System (ADS)

    Strigaro, Daniele; Moretti, Massimiliano; Mattavelli, Matteo; Frigerio, Ivan; Amicis, Mattia De; Maggi, Valter

    2016-09-01

The aim of this work is to integrate the Minimal Glacier Model into a Geographic Information System Python module in order to obtain spatial simulations of glacier retreat and to assess future scenarios with a spatial representation. Minimal Glacier Models are a simple yet effective way of estimating glacier response to climate fluctuations. This module can be useful to the scientific and glaciological community for evaluating glacier behavior driven by climate forcing. The module, called r.glacio.model, is developed in a GRASS GIS (GRASS Development Team, 2016) environment using the Python programming language combined with different libraries such as GDAL, OGR, CSV, math, etc. The module is applied and validated on the Rutor glacier, a glacier in the south-western region of the Italian Alps. This glacier is very large and features rather regular and lively dynamics. The simulation is calibrated by reconstructing the 3-dimensional dynamics of the flow line and analyzing the difference between the simulated flow line length variations and the observed glacier fronts derived from orthophotos and DEMs. These simulations are driven by the past mass balance record. Afterwards, the future assessment is estimated by using climatic drivers provided by a set of General Circulation Models participating in the Climate Model Inter-comparison Project 5 effort. The approach devised in r.glacio.model can be applied to most alpine glaciers to obtain a first-order spatial representation of glacier behavior under climate change.

  15. Characterizing nonconstant instrumental variance in emerging miniaturized analytical techniques.

    PubMed

    Noblitt, Scott D; Berg, Kathleen E; Cate, David M; Henry, Charles S

    2016-04-07

    Measurement variance is a crucial aspect of quantitative chemical analysis. Variance directly affects important analytical figures of merit, including detection limit, quantitation limit, and confidence intervals. Most reported analyses for emerging analytical techniques implicitly assume constant variance (homoskedasticity) by using unweighted regression calibrations. Despite the assumption of constant variance, it is known that most instruments exhibit heteroskedasticity, where variance changes with signal intensity. Ignoring nonconstant variance results in suboptimal calibrations, invalid uncertainty estimates, and incorrect detection limits. Three techniques where homoskedasticity is often assumed were covered in this work to evaluate whether heteroskedasticity had a significant quantitative impact: naked-eye, distance-based detection using paper-based analytical devices (PADs); cathodic stripping voltammetry (CSV) with disposable carbon-ink electrode devices; and microchip electrophoresis (MCE) with conductivity detection. Despite these techniques representing a wide range of chemistries and precision, heteroskedastic behavior was confirmed for each. The general variance forms were analyzed, and recommendations for accounting for nonconstant variance are discussed. Monte Carlo simulations of instrument responses were performed to quantify the benefits of weighted regression, and the sensitivity to uncertainty in the variance function was tested. Results show that heteroskedasticity should be considered during development of new techniques; even moderate uncertainty (30%) in the variance function still results in weighted regression outperforming unweighted regression. We recommend utilizing the power model of variance because it is easy to apply, requires little additional experimentation, and produces higher-precision results and more reliable uncertainty estimates than assuming homoskedasticity. Copyright © 2016 Elsevier B.V. All rights reserved.
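
    The weighted-regression idea at the heart of this abstract is easy to sketch: under a power model, sigma^2(y) = a * y^b, each calibration point is weighted by the inverse of its modeled variance. The pure-Python example below fits a straight-line calibration this way; the concentrations, noise level and power-model coefficients are illustrative assumptions, not values from the study.

```python
import math
import random

# Weighted linear calibration under a power model of variance,
# sigma^2(y) = a * y^b, as the abstract recommends. Data are synthetic.
def wls_line(x, y, weights):
    """Weighted least-squares fit of y = slope*x + intercept."""
    sw = sum(weights)
    mx = sum(w * xi for w, xi in zip(weights, x)) / sw
    my = sum(w * yi for w, yi in zip(weights, y)) / sw
    sxx = sum(w * (xi - mx) ** 2 for w, xi in zip(weights, x))
    sxy = sum(w * (xi - mx) * (yi - my) for w, xi, yi in zip(weights, x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

random.seed(1)
conc = [1, 2, 5, 10, 20, 50]
# Heteroskedastic signals: noise standard deviation grows with the mean.
signal = [2.0 * c + random.gauss(0, 0.05 * (2.0 * c)) for c in conc]
# Power-model weights w_i = 1 / (a * y_i^b); with b = 2 this is ~1/y^2.
w = [1.0 / (0.0025 * s ** 2) for s in signal]
slope, intercept = wls_line(conc, signal, w)
```

    With b = 0 the weights collapse to a constant and the fit reduces to ordinary unweighted regression, which is why the power model subsumes the homoskedastic case.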

  16. Computer image analysis in obtaining characteristics of images: greenhouse tomatoes in the process of generating learning sets of artificial neural networks

    NASA Astrophysics Data System (ADS)

    Zaborowicz, M.; Przybył, J.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.; Lewicki, A.; Przybył, K.

    2014-04-01

    The aim of the project was to develop software that extracts the characteristics of a greenhouse tomato from its image. Data gathered during image analysis and processing were used to build learning sets for artificial neural networks. The program processes pictures in JPEG format, acquires statistical information from each picture, and exports it to an external file. The software is intended to batch-analyze the collected research material, saving the obtained information as a CSV file. It analyzes 33 independent parameters that together describe the tested image. The application is dedicated to the processing and image analysis of greenhouse tomatoes, but it can also be used to analyze other fruits and vegetables of spherical shape.
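
    The batch workflow described above, computing statistics per image and appending them to a CSV for neural-network training, can be sketched with the standard library alone. The tiny 2x2 "image" and the two statistics per channel below are illustrative; the actual software computes 33 parameters per image.

```python
import csv
import io
import statistics

# Minimal sketch of the batch step: derive per-channel statistics from
# an image and append them as one CSV row. Synthetic 2x2 RGB "image".
image = [[(200, 40, 30), (210, 50, 35)],
         [(190, 45, 28), (205, 48, 32)]]   # rows of (R, G, B) pixels

def channel_stats(img):
    """Mean and population std. dev. for each of the R, G, B channels."""
    pixels = [p for row in img for p in row]
    stats = []
    for ch in range(3):
        values = [p[ch] for p in pixels]
        stats += [statistics.mean(values), statistics.pstdev(values)]
    return stats

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["r_mean", "r_std", "g_mean", "g_std", "b_mean", "b_std"])
writer.writerow(round(v, 2) for v in channel_stats(image))
csv_text = buf.getvalue()
```

    Run over a directory of images, one such row per image yields the spreadsheet-style learning set the abstract describes.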

  17. Database Performance Monitoring for the Photovoltaic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Katherine A.

    The Database Performance Monitoring (DPM) software (copyright in process) is being developed at Sandia National Laboratories to perform quality control analysis on time series data. The software loads time-indexed databases (currently CSV format), performs a series of quality control tests defined by the user, and creates reports which include summary statistics, tables, and graphics. DPM can be set up to run on an automated schedule defined by the user. For example, the software can be run once per day to analyze data collected on the previous day. HTML formatted reports can be sent via email or hosted on a website. To compare performance of several databases, summary statistics and graphics can be gathered in a dashboard view which links to detailed reporting information for each database. The software can be customized for specific applications.
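
    A minimal sketch of a DPM-style quality-control pass over a time-indexed CSV might look as follows. The column name, limits and data are hypothetical, not taken from the Sandia software.

```python
import csv
import io
import statistics

# Range-check QC pass over a time-indexed CSV, in the spirit of the DPM
# workflow described above. Column names and limits are made up.
raw = """timestamp,dc_power
2023-06-01 12:00,4200
2023-06-01 12:05,4350
2023-06-01 12:10,-50
2023-06-01 12:15,4100
"""

def range_test(rows, column, lower, upper):
    """Flag rows whose value falls outside [lower, upper]."""
    failures = []
    for row in rows:
        value = float(row[column])
        if not lower <= value <= upper:
            failures.append((row["timestamp"], value))
    return failures

rows = list(csv.DictReader(io.StringIO(raw)))
bad = range_test(rows, "dc_power", 0, 6000)       # the -50 W reading is flagged
values = [float(r["dc_power"]) for r in rows]
summary = {"n": len(values), "mean": statistics.mean(values)}
```

    A scheduler (e.g. cron) invoking such a script once per day, then rendering `summary` and `bad` into an HTML report, reproduces the automated daily analysis the abstract describes.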

  18. Atomic bomb health benefits.

    PubMed

    Luckey, T D

    2008-01-01

    Media reports of deaths and devastation produced by atomic bombs convinced people around the world that all ionizing radiation is harmful. This concentrated attention on fear of minuscule doses of radiation. Soon the linear no threshold (LNT) paradigm was converted into laws. Scientifically valid information about the health benefits from low dose irradiation was ignored. Presented here are studies which show increased health in Japanese survivors of atomic bombs. Parameters include decreased mutation, leukemia and solid tissue cancer mortality rates, and increased average lifespan. Each study exhibits a threshold that repudiates the LNT dogma. The average threshold for acute exposures to atomic bombs is about 100 cSv. Conclusions from these studies of atomic bomb survivors are: one burst of low dose irradiation elicits a lifetime of improved health; improved health from low dose irradiation negates the LNT paradigm; and effective triage should include radiation hormesis for survivor treatment.

  19. Identification and on-line monitoring of reduced sulphur species (RSS) by voltammetry in oxic waters.

    PubMed

    Superville, Pierre-Jean; Pižeta, Ivanka; Omanović, Dario; Billon, Gabriel

    2013-08-15

    Based on automatic on-line measurements on the Deûle River that showed daily variation of a peak around -0.56V (vs Ag|AgCl 3M), identification of Reduced Sulphur Species (RSS) in oxic waters was performed applying cathodic stripping voltammetry (CSV) with the hanging mercury drop electrode (HMDE). Pseudopolarographic studies accompanied by increasing concentrations of copper revealed the presence of elemental sulphur S(0), thioacetamide (TA) and reduced glutathione (GSH) as the main sulphur compounds in the Deûle River. In order to resolve these three species, a simple procedure was developed and integrated into an automatic on-line monitoring system. During one week of monitoring with hourly measurements, GSH and S(0) exhibited daily cycles whereas no consistent pattern was observed for TA. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Easy Leaf Area: Automated digital image analysis for rapid and accurate measurement of leaf area.

    PubMed

    Easlon, Hsien Ming; Bloom, Arnold J

    2014-07-01

    Measurement of leaf areas from digital photographs has traditionally required significant user input unless backgrounds are carefully masked. Easy Leaf Area was developed to batch process hundreds of Arabidopsis rosette images in minutes, removing background artifacts and saving results to a spreadsheet-ready CSV file. • Easy Leaf Area uses the color ratios of each pixel to distinguish leaves and calibration areas from their background and compares leaf pixel counts to a red calibration area to eliminate the need for camera distance calculations or manual ruler scale measurement that other software methods typically require. Leaf areas estimated by this software from images taken with a camera phone were more accurate than ImageJ estimates from flatbed scanner images. • Easy Leaf Area provides an easy-to-use method for rapid measurement of leaf area and nondestructive estimation of canopy area from digital images.
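
    The color-ratio scaling idea is simple enough to sketch in a few lines: classify each pixel as leaf or calibration by channel ratios, then convert the leaf pixel count to area using the red square of known size. The thresholds and the tiny synthetic image below are illustrative assumptions, not Easy Leaf Area's actual values.

```python
# Sketch of the pixel-ratio idea behind Easy Leaf Area: green-dominant
# pixels count as leaf, red-dominant pixels as the calibration square,
# and the known area of the square sets the cm^2-per-pixel scale.
def classify(pixel):
    r, g, b = pixel
    if g > 1.2 * r and g > 1.2 * b:
        return "leaf"
    if r > 1.5 * g and r > 1.5 * b:
        return "calibration"
    return "background"

def leaf_area(img, red_area_cm2):
    counts = {"leaf": 0, "calibration": 0, "background": 0}
    for row in img:
        for p in row:
            counts[classify(p)] += 1
    # cm^2 per pixel comes from the calibration square of known area.
    return counts["leaf"] * red_area_cm2 / counts["calibration"]

image = [[(30, 120, 40), (30, 130, 45), (200, 30, 25)],
         [(25, 110, 35), (90, 95, 92), (210, 35, 30)]]
area = leaf_area(image, red_area_cm2=4.0)   # 3 leaf px, 2 red px
```

    Because both counts come from the same photograph, camera distance cancels out, which is exactly why the method needs no ruler or distance calibration.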

  1. CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis.

    PubMed

    Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran

    2015-01-01

    Various disciplines are trying to solve one of the most noteworthy questions and broadly used concepts in biology: essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides numerical results in comma-separated value (CSV) file format or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/.
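
    As a small taste of what such a tool computes, the stdlib-only sketch below derives two of the simpler indices (degree and closeness centrality) for a toy four-node graph and writes them out in CSV form, mirroring CentiServer's export format. The graph is made up for illustration.

```python
import csv
import io
from collections import deque

# Degree and closeness centrality for a toy undirected graph, exported
# as CSV in the style of CentiServer's downloadable results.
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("B", "D")]
nodes = sorted({n for e in edges for n in e})
adj = {n: set() for n in nodes}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def closeness(node):
    """Classic closeness: (n-1) / sum of shortest-path distances (BFS)."""
    dist = {node: 0}
    queue = deque([node])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return (len(nodes) - 1) / sum(dist.values())

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["node", "degree", "closeness"])
for n in nodes:
    writer.writerow([n, len(adj[n]), round(closeness(n), 3)])
csv_text = buf.getvalue()
```

    Node B, which lies on most shortest paths here, scores highest on both indices; in a biological network such hub nodes are the candidates for essentiality.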

  2. Getting Your GIS Data into Google Earth: Data Conversion Tools and Tips

    NASA Astrophysics Data System (ADS)

    Nurik, R.; Marks, M.

    2009-12-01

    Google Earth is a powerful platform for displaying your data. You can easily visualize content using the Keyhole Markup Language (KML). But what if you don't have your data in KML format? GIS data comes in a wide variety of formats, including .shp files, CSV, and many others. What can you do? This session will walk you through some of the tools for converting data to KML format. We will explore a variety of tools, including: Google Earth Pro, GDAL/OGR, KML2KML, etc. This session will be paced so that you can follow along on your laptop if you wish. Should you want to follow along, bring a laptop, and install the trial versions of Google Earth Pro and KML2KML. It is also recommended that you download GDAL from gdal.org and install it on your system.
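
    For point data, the CSV-to-KML step these tools automate is small enough to do by hand, since KML is plain XML. The sketch below converts a two-row CSV of made-up station coordinates into a minimal KML document; for real datasets and other geometries, tools like ogr2ogr remain the better choice.

```python
import csv
import io

# Hand-rolled CSV-to-KML conversion for point data. Station names and
# coordinates are invented for illustration.
raw = """name,lat,lon
Station A,37.42,-122.08
Station B,36.78,-119.42
"""

def csv_to_kml(text):
    placemarks = []
    for row in csv.DictReader(io.StringIO(text)):
        placemarks.append(
            "  <Placemark><name>{}</name>"
            "<Point><coordinates>{},{}</coordinates></Point>"
            "</Placemark>".format(row["name"], row["lon"], row["lat"])
        )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n'
            + "\n".join(placemarks) + "\n</Document></kml>")

kml = csv_to_kml(raw)
```

    Note the coordinate order: KML expects longitude first, then latitude, which is a common stumbling block when converting GIS data by hand.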

  3. Relating Time-Dependent Acceleration and Height Using an Elevator

    NASA Astrophysics Data System (ADS)

    Kinser, Jason M.

    2015-04-01

    A simple experiment in relating a time-dependent linear acceleration function to height is explored through the use of a smartphone and an elevator. Given acceleration as a function of time, a(t), the velocity and position functions are determined through integration as in v(t) = ∫a(t)dt (1) and x(t) = ∫v(t)dt (2). Mobile devices such as smartphones or tablets have accelerometers that capture slowly evolving acceleration with respect to time and can deliver those measurements as a CSV file. A recent example measured the oscillations of the elevator as it starts its motion. In the application presented here the mobile device is used to estimate the height of the elevator ride. By estimating the functional form of the acceleration of an elevator ride, it is possible to estimate the height of the ride through Eqs. (1) and (2).
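
    With sampled accelerometer data rather than a closed-form a(t), Eqs. (1) and (2) become two cumulative numerical integrations. The sketch below applies the trapezoidal rule twice to a synthetic accelerate-cruise-decelerate profile standing in for the phone's CSV export; the profile and sampling rate are made up for illustration.

```python
# Integrate accelerometer samples twice (trapezoidal rule) to estimate
# the height of an elevator ride. The synthetic profile stands in for
# the CSV file a phone accelerometer app would export.
def cumtrapz(samples, dt):
    """Cumulative trapezoidal integral of evenly spaced samples."""
    out = [0.0]
    for i in range(1, len(samples)):
        out.append(out[-1] + 0.5 * (samples[i - 1] + samples[i]) * dt)
    return out

dt = 0.01                                          # 100 Hz sampling
accel = [1.0] * 200 + [0.0] * 400 + [-1.0] * 200   # m/s^2: up, cruise, brake
velocity = cumtrapz(accel, dt)                     # Eq. (1)
position = cumtrapz(velocity, dt)                  # Eq. (2)
height = position[-1]                              # ~12 m for this profile
```

    In practice the phone's vertical-axis data must first have gravity subtracted and noise filtered; the double integration itself is unchanged.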

  4. CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis

    PubMed Central

    Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran

    2015-01-01

    Various disciplines are trying to solve one of the most noteworthy questions and broadly used concepts in biology: essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides numerical results in comma-separated value (CSV) file format or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/ PMID:26571275

  5. Frapbot: An open-source application for FRAP data.

    PubMed

    Kohze, Robin; Dieteren, Cindy E J; Koopman, Werner J H; Brock, Roland; Schmidt, Samuel

    2017-08-01

    We introduce Frapbot, a free-of-charge open source software web application written in R, which provides manual and automated analyses of fluorescence recovery after photobleaching (FRAP) datasets. For automated operation, starting from data tables containing columns of time-dependent intensity values for various regions of interest within the images, a pattern recognition algorithm recognizes the relevant columns and identifies the presence or absence of prebleach values and the time point of photobleaching. Raw data, residuals, normalization, and boxplots indicating the distribution of half times of recovery (t1/2) of all uploaded files are visualized instantly in a batch-wise manner using a variety of user-definable fitting options. The fitted results are provided as a .zip file, which contains .csv formatted output tables. Alternatively, the user can manually control any of the options described earlier. © 2017 International Society for Advancement of Cytometry.
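
    One of the quantities such a pipeline reports, the half time of recovery t1/2, can be read directly off a normalized trace as the time at which it first crosses half of its plateau. The sketch below does this by linear interpolation on a synthetic single-exponential recovery; it illustrates the quantity only and is not Frapbot's R code.

```python
import math

# Estimate the half time of recovery t1/2 from a normalized FRAP trace
# by locating the half-plateau crossing and interpolating linearly.
def half_time(times, intensities):
    """Time at which recovery first crosses half its plateau value."""
    i0, plateau = intensities[0], intensities[-1]
    half = i0 + 0.5 * (plateau - i0)
    for i in range(1, len(intensities)):
        if intensities[i] >= half:
            # interpolate between the bracketing samples
            t0, t1 = times[i - 1], times[i]
            y0, y1 = intensities[i - 1], intensities[i]
            return t0 + (half - y0) * (t1 - t0) / (y1 - y0)
    return None

times = [0.1 * k for k in range(200)]
# Synthetic trace I(t) = 1 - exp(-k t) with k = ln(2)/2: true t1/2 is 2 s.
trace = [1.0 - math.exp(-math.log(2) / 2.0 * t) for t in times]
t_half = half_time(times, trace)
```

    Applied per region-of-interest column across all uploaded files, such estimates are what populate the batch-wise t1/2 boxplots described above.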

  6. MetaQuant: a tool for the automatic quantification of GC/MS-based metabolome data.

    PubMed

    Bunk, Boyke; Kucklick, Martin; Jonas, Rochus; Münch, Richard; Schobert, Max; Jahn, Dieter; Hiller, Karsten

    2006-12-01

    MetaQuant is a Java-based program for the automatic and accurate quantification of GC/MS-based metabolome data. In contrast to other programs, MetaQuant is able to quantify hundreds of substances simultaneously with minimal manual intervention. The integration of a self-acting calibration function allows fast, parallel calibration for several metabolites. Finally, MetaQuant is able to import GC/MS data in the common NetCDF format and to export the results of the quantification into Systems Biology Markup Language (SBML), Comma Separated Values (CSV) or Microsoft Excel (XLS) format. MetaQuant is written in Java and is available under an open source license. Precompiled packages for installation on Windows or Linux operating systems are freely available for download. The source code as well as the installation packages are available at http://bioinformatics.org/metaquant

  7. Atomic Bomb Health Benefits

    PubMed Central

    Luckey, T. D.

    2008-01-01

    Media reports of deaths and devastation produced by atomic bombs convinced people around the world that all ionizing radiation is harmful. This concentrated attention on fear of minuscule doses of radiation. Soon the linear no threshold (LNT) paradigm was converted into laws. Scientifically valid information about the health benefits from low dose irradiation was ignored. Presented here are studies which show increased health in Japanese survivors of atomic bombs. Parameters include decreased mutation, leukemia and solid tissue cancer mortality rates, and increased average lifespan. Each study exhibits a threshold that repudiates the LNT dogma. The average threshold for acute exposures to atomic bombs is about 100 cSv. Conclusions from these studies of atomic bomb survivors are: one burst of low dose irradiation elicits a lifetime of improved health; improved health from low dose irradiation negates the LNT paradigm; and effective triage should include radiation hormesis for survivor treatment. PMID:19088902

  8. Computer vision system for egg volume prediction using backpropagation neural network

    NASA Astrophysics Data System (ADS)

    Siswantoro, J.; Hilman, M. Y.; Widiasri, M.

    2017-11-01

    Volume is one of the aspects considered in the egg sorting process. A rapid and accurate volume measurement method is needed to develop an egg sorting system. A computer vision system (CVS) provides a promising solution to the volume measurement problem. Artificial neural networks (ANNs) have been used to predict the volume of eggs in several CVSs. However, volume prediction from an ANN can be less accurate due to inappropriate input features or an inappropriate ANN structure. This paper proposes a CVS for predicting the volume of an egg using an ANN. The CVS acquired an image of the egg from the top view and then processed the image to extract its 1D and 2D size features. The features were used as input for the ANN in predicting the volume of the egg. The experimental results show that the proposed CVS can predict the volume of an egg with good accuracy and low computation time.
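
    The paper's predictor is a trained ANN, which cannot be reproduced from the abstract alone. As a point of reference for what the same 1D/2D size features support, the sketch below uses the classical prolate-spheroid approximation V = (4/3)π(L/2)(W/2)², a common geometric baseline for egg volume; the dimensions are illustrative, not from the paper.

```python
import math

# Geometric baseline for egg volume from length/width features: treat
# the egg as a prolate spheroid, V = (4/3)*pi*(L/2)*(W/2)^2. This is a
# stand-in comparison, not the paper's ANN predictor.
def spheroid_volume(length_mm, width_mm):
    a = length_mm / 2.0      # semi-major axis (mm)
    b = width_mm / 2.0       # semi-minor axis (mm)
    return (4.0 / 3.0) * math.pi * a * b * b

v = spheroid_volume(57.0, 43.0)   # mm^3; roughly 55 cm^3 for these dims
```

    An ANN earns its keep precisely where this formula fails: real eggs deviate from a perfect spheroid, and learned features can absorb that shape variation.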

  9. Accessing National Water Model Output for Research and Application: An R package

    NASA Astrophysics Data System (ADS)

    Johnson, M.; Coll, J.

    2017-12-01

    With the National Water Model becoming operational in August of 2016, the need for an open source way to translate a huge amount of data into actionable intelligence and innovative research is apparent. The first step in doing this is to provide a package for accessing, managing, and writing data in a way that is interpretable, portable, and useful to the end user, both in the R environment and in other applications. This can be as simple as subsetting the outputs and writing to a CSV, but can also include converting discharge output to more meaningful statistics and measurements, and methods to visualize data in ways that are meaningful to a wider audience. The NWM R package presented here aims to serve this need through a suite of functions fit for researchers, first responders, and average citizens. A vignette of how this package can be applied to real-time flood mapping will be demonstrated.

  10. USEEIO Satellite Tables

    EPA Pesticide Factsheets

    These files contain the environmental data, as particular emissions or resources associated with BEA sectors, that are used in the USEEIO model. They are organized by emission or resource type, as described in the manuscript. The main files (without SI) show the final satellite tables in the 'Exchanges' sheet, which give emissions or resource use per USD for 2013. The other sheets in these files provide metadata for the creation of the tables, including general information, sources, etc. The 'export' sheet is used for saving the satellite table for CSV export. The data dictionary describes the fields in this sheet. The supporting files provide all the detailed data transformation and organization for the development of the satellite tables. This dataset is associated with the following publication: Yang, Y., W. Ingwersen, T. Hawkins, and D. Meyer. USEEIO: a New and Transparent United States Environmentally Extended Input-Output Model. JOURNAL OF CLEANER PRODUCTION. Elsevier Science Ltd, New York, NY, USA,

  11. Open Core Data approaches to exposing facility data to support FAIR principles

    NASA Astrophysics Data System (ADS)

    Fils, D.; Lehnert, K.; Noren, A. J.

    2017-12-01

    The Open Core Data (OCD) award from NSF is focused on exposing scientific drilling data from the JOIDES Resolution Science Operator (JRSO) and Continental Scientific Drilling Coordination Office (CSDCO) following guidance from the Force 11 FAIR principles and the W3C "best practices" recommendations and notes. The goal of this implementation is to provide the identification, access, citation and provenance of these data to support the research community. OCD employs Linked Open Data (LOD) patterns and HTML5 microdata publishing via JSON-LD using various vocabularies. These vocabularies include schema.org, GeoLink and other relevant community vocabularies. Attention is paid to enabling hypermedia navigation between resources to aid in fast and efficient harvesting of the metadata directly from the LOD approach using web architecture patterns. Further, the vocabularies are employed to address the need for both DOI assignment and creation of data citation entries following ESIP data citation recommendations. The use of LOD, community vocabularies and persistent identifiers has enabled linking between hosted and remote data resources. In addition to the semantic metadata and LOD pattern, OCD is implementing approaches to data packaging to facilitate data use. OCD is currently using the CSV on the Web approach but is moving to implement frictionless data packages. This data package model provides access to a large suite of tools, libraries and workbenches to support data utilization, validation and visualization. Further, a basic reference implementation of the W3C PROV-AQ pingback pattern is under testing. This work is done in coordination with the RDA Provenance Patterns WG and follows patterns already employed by Geoscience Australia. This development is also done in coordination with ESIP provenance work. As needed, more traditional Application Program Interfaces (APIs) are exposed following best practices in RESTful services.
All these capabilities are implemented in Open Core Data in the lightest possible manner to address the desired functions while being as easy to maintain as possible. The approaches, lessons learned and takeaways from this work at Open Core Data to date will be presented.

  12. Building a SuAVE browse interface to R2R's Linked Data

    NASA Astrophysics Data System (ADS)

    Clark, D.; Stocks, K. I.; Arko, R. A.; Zaslavsky, I.; Whitenack, T.

    2017-12-01

    The Rolling Deck to Repository program (R2R) is creating and evaluating a new browse portal based on the SuAVE platform and the R2R linked data graph. R2R manages the underway sensor data collected by the fleet of US academic research vessels, and provides a discovery and access point to those data at its website, www.rvdata.us. R2R has a database-driven search interface, but seeks a more capable and extensible browse interface that could be built on the substantial R2R linked data resources. R2R's Linked Data graph organizes its data holdings around key concepts (e.g. cruise, vessel, device type, operator, award, organization, publication), anchored by persistent identifiers where feasible. The "Survey Analysis via Visual Exploration" or SuAVE platform (suave.sdsc.edu) is a system for online publication, sharing, and analysis of images and metadata. It has been implemented as an interface to diverse data collections, but has not previously been driven by linked data. SuAVE supports several features of interest to R2R, including faceted searching, collaborative annotations, efficient subsetting, Google Maps-like navigation over an image gallery, and several types of data analysis. Our initial SuAVE-based implementation was through a CSV export from the R2R PostGIS-enabled PostgreSQL database. This served to demonstrate the utility of SuAVE but was static and required reloading as R2R data holdings grew. We are now working to implement a SPARQL (SPARQL Protocol and RDF Query Language) based service that directly leverages the R2R Linked Data graph and offers the ability to subset and/or customize output. We will show examples of SuAVE faceted searches on R2R linked data concepts, and discuss our experience to date with this work in progress.

  13. Contrast sensitivity measured by two different test methods in healthy, young adults with normal visual acuity.

    PubMed

    Koefoed, Vilhelm F; Baste, Valborg; Roumes, Corinne; Høvding, Gunnar

    2015-03-01

    This study reports contrast sensitivity (CS) reference values obtained by two different test methods in a strictly selected population of healthy, young adults with normal uncorrected visual acuity. Based on these results, the index of contrast sensitivity (ICS) is calculated, aiming to establish ICS reference values for this population and to evaluate the possible usefulness of ICS as a tool to compare the degree of agreement between different CS test methods. Military recruits with best eye uncorrected visual acuity 0.00 LogMAR or better, normal colour vision and age 18-25 years were included in a study to record contrast sensitivity using Optec 6500 (FACT) at spatial frequencies of 1.5, 3, 6, 12 and 18 cpd in photopic and mesopic light and CSV-1000E at spatial frequencies of 3, 6, 12 and 18 cpd in photopic light. Index of contrast sensitivity was calculated based on data from the three tests, and the Bland-Altman technique was used to analyse the agreement between ICS obtained by the different test methods. A total of 180 recruits were included. Contrast sensitivity frequency data for all tests were highly skewed, with a marked ceiling effect for the photopic tests. The median ICS for Optec 6500 at 85 cd/m2 was -0.15 (95th percentile 0.45), compared with -0.00 (95th percentile 1.62) for Optec at 3 cd/m2 and 0.30 (95th percentile 1.20) for CSV-1000E. The mean difference between ICSFACT 85 and ICSCSV was -0.43 (95% CI -0.56 to -0.30, p<0.00) with limits of agreement (LoA) within -2.10 and 1.22. The regression line of the differences against the averages was near zero (R2=0.03). The results provide reference CS and ICS values in a young, adult population with normal visual acuity. The agreement between the photopic tests indicated that they may be used interchangeably. There was little agreement between the mesopic and photopic tests. The mesopic test seemed best suited to differentiate between candidates and may therefore possibly be useful for medical selection purposes. 
© 2014 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  14. Integration Of Digital Methodologies (Field, Processing, and Presentation) In A Combined Sedimentology/Stratigraphy and Structure Course

    NASA Astrophysics Data System (ADS)

    Malinconico, L. L., Jr.; Sunderlin, D.; Liew, C. W.

    2015-12-01

    Over the course of the last three years we have designed, developed and refined two Apps for the iPad. GeoFieldBook and StratLogger allow for the real-time display of spatial (structural) and temporal (stratigraphic) field data as well as very easy in-field navigation. These Apps have dramatically advanced and simplified how we collect and analyze data while in the field. The Apps are not geologic mapping programs, but rather a way of bypassing the analog field book step to acquire digital data directly that can then be used in various analysis programs (GIS, Google Earth, Stereonet, spreadsheet and drawing programs). We now complete all of our fieldwork digitally. GeoFieldBook can be used to collect structural and other field observations. Each record includes location/date/time information, orientation measurements, formation names, text observations and photos taken with the tablet camera. Records are customizable, so users can add fields of their own choosing. Data are displayed on an image base in real time with oriented structural symbols. The image base is also used for in-field navigation. In StratLogger, the user records bed thickness, lithofacies, biofacies, and contact data in preset and modifiable fields. Each bed/unit record may also be photographed and geo-referenced. As each record is collected, a column diagram of the stratigraphic sequence is built in real time, complete with lithology color, lithology texture, and fossil symbols. The recorded data from any measured stratigraphic sequence can be exported both as the live-drawn column image and as a .csv formatted file for use in spreadsheet or other applications. Common to both Apps is the ability to export the data (via .csv files), photographs and maps or stratigraphic columns (images). Since the data are digital, they are easily imported into various processing programs (for example, for stereoplot analysis). 
Requiring that all maps, stratigraphic columns and cross-sections be produced digitally continues our integration of digital technologies throughout the curriculum. Initial evaluation suggests that students using the Apps progress more quickly towards synthesis and interpretation of the data, as well as a deeper understanding of complex 4D field relationships.

  15. Introduction

    NASA Astrophysics Data System (ADS)

    de Graauw, T.

    2010-01-01

    First of all, I would like to wish all of you a happy New Year, which I sincerely hope will bring you success, happiness and interesting new opportunities. For us in ALMA, the end of 2009 and the beginning of 2010 have been very exciting and this is once more a special moment in the development of our observatory. After transporting our third antenna to the high altitude Chajnantor plateau, at 5000 meters above sea level, our team successfully combined the outputs of these antennas using "phase closure", a standard method in interferometry. This achievement marks one more milestone along the way to the beginning of Commissioning and Science Verification, CSV, which, once completed, will mark the beginning of Early Science for ALMA. There was an official announcement about this milestone at the AAS meeting early January and we also wanted to share this good news with you through this newsletter, which contains the content of the announcement. In another area, this newsletter contains the progress on site and a presentation of the Atacama Compact Array (ACA). This is the second part of a two part series on antennas, a continuation of the article in the last newsletter. The ACA plays a crucial part in the imaging of extended sources with ALMA. Without the ACA, the ability to produce accurate images would be very restricted. Finally, as you know, we like to show the human face of this great endeavour we are building and this time, we decided to highlight the Department of Technical Services, another fundamental piece working actively to make ALMA the most powerful radio observatory ever built.

  16. An Open Software Platform for Sharing Water Resource Models, Code and Data

    NASA Astrophysics Data System (ADS)

    Knox, Stephen; Meier, Philipp; Mohamed, Khaled; Korteling, Brett; Matrosov, Evgenii; Huskova, Ivana; Harou, Julien; Rosenberg, David; Tilmant, Amaury; Medellin-Azuara, Josue; Wicks, Jon

    2016-04-01

    The modelling of managed water resource systems requires new approaches in the face of increasing future uncertainty. Water resources management models, even if applied to diverse problem areas, use common approaches such as representing the problem as a network of nodes and links. We propose a data management software platform, called Hydra, that uses this commonality to allow multiple models using a node-link structure to be managed and run using a single software system. Hydra's user interface allows users to manage network topology and associated data. Hydra feeds this data directly into a model, importing from and exporting to different file formats using Apps. An App connects Hydra to a custom model, a modelling system such as GAMS or MATLAB or to different file formats such as MS Excel, CSV and ESRI Shapefiles. Hydra allows users to manage their data in a single, consistent place. Apps can be used to run domain-specific models and allow users to work with their own required file formats. The Hydra App Store offers a collaborative space where model developers can publish, review and comment on Apps, models and data. Example Apps and open-source libraries are available in a variety of languages (Python, Java and .NET). The App Store can act as a hub for water resource modellers to view and share Apps, models and data easily. This encourages an ecosystem of development using a shared platform, resulting in more model integration and potentially greater unity within resource modelling communities. www.hydraplatform.org www.hydraappstore.com

  17. Rapid Screening Method for Detecting Ethinyl Estradiol in Natural Water Employing Voltammetry

    PubMed Central

    2016-01-01

    17α-Ethinyl estradiol (EE2), which is used worldwide in the treatment of some cancers and as a contraceptive, is often found in aquatic systems and is considered a pharmaceutically active compound (PhAC) in the environment. Current methods for the determination of this compound, such as chromatography, are expensive and lengthy and require large amounts of toxic organic solvents. In this work, a voltammetric procedure is developed and validated as a screening tool for detecting EE2 in water samples without prior extraction, clean-up, or derivatization steps. Application of the method we elaborate here to EE2 analysis is unprecedented. EE2 detection was carried out using differential pulse adsorptive cathodic stripping voltammetry (DP AdCSV) with a hanging mercury drop electrode (HMDE) in pH 7.0 Britton-Robinson buffer. The electrochemical process of EE2 reduction was investigated by cyclic voltammetry at different scan rates. Electroreduction of the hormone on a mercury electrode exhibited a peak at −1.16 ± 0.02 V versus Ag/AgCl. The experimental parameters were as follows: −0.7 V accumulation potential, 150 s accumulation time, and 60 mV s−1 scan rate. The limit of detection was 0.49 μg L−1 for a preconcentration time of 150 s. Relative standard deviations were less than 13%. The method was applied to the detection of EE2 in water samples with recoveries ranging from 93.7 to 102.5%. PMID:27738548

  18. Scalable Earth-observation Analytics for Geoscientists: Spacetime Extensions to the Array Database SciDB

    NASA Astrophysics Data System (ADS)

    Appel, Marius; Lahn, Florian; Pebesma, Edzer; Buytaert, Wouter; Moulds, Simon

    2016-04-01

    Today's amount of freely available data requires scientists to spend large parts of their work on data management. This is especially true in environmental sciences when working with large remote sensing datasets, such as those obtained from earth-observation satellites like the Sentinel fleet. Many frameworks like SpatialHadoop or Apache Spark address scalability but target programmers rather than data analysts, and are not dedicated to imagery or array data. In this work, we use the open-source data management and analytics system SciDB to bring large earth-observation datasets closer to analysts. Its underlying data representation as multidimensional arrays fits earth-observation datasets naturally, distributes storage and computational load over multiple instances by multidimensional chunking, and also enables efficient time-series-based analyses, which is usually difficult using file- or tile-based approaches. Existing interfaces to R and Python furthermore allow for scalable analytics with relatively little learning effort. However, interfacing SciDB with file-based earth-observation datasets that come as tiled temporal snapshots requires a lot of manual bookkeeping during ingestion, and SciDB natively supports loading data only from CSV-like and custom binary formatted files, which currently limits its practical use in earth-observation analytics. To make it easier to work with large multi-temporal datasets in SciDB, we developed software tools that enrich SciDB with earth observation metadata and allow working with commonly used file formats: (i) the SciDB extension library scidb4geo simplifies working with spatiotemporal arrays by adding relevant metadata to the database and (ii) the Geospatial Data Abstraction Library (GDAL) driver implementation scidb4gdal allows ingesting and exporting remote sensing imagery from and to a large number of file formats. 
Using added metadata on temporal resolution and coverage, the GDAL driver supports time-based ingestion of imagery into existing multi-temporal SciDB arrays. While our SciDB plugin works directly in the database, the GDAL driver has been specifically developed using a minimal set of external dependencies (i.e., cURL). Source code for both tools is available on GitHub [1]. We present these tools in a case study that demonstrates the ingestion of multi-temporal tiled earth-observation data into SciDB, followed by a time-series analysis using R and SciDBR. Through the exclusive use of open-source software, our approach supports reproducibility in scalable large-scale earth-observation analytics. In the future, these tools can be used in an automated way so that scientists work only on ready-to-use SciDB arrays, significantly reducing the data management workload for domain scientists. [1] https://github.com/mappl/scidb4geo and https://github.com/mappl/scidb4gdal
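    The multidimensional chunking that lets SciDB distribute storage and computation can be sketched in plain Python. This toy example only illustrates the concept; SciDB's actual chunk layout and storage format differ:

    ```python
    # A 2-D array is split into fixed-size chunks that an array database could
    # store and process on different instances. Chunk size is arbitrary here.
    def chunk_2d(array, chunk_rows, chunk_cols):
        """Yield ((row_offset, col_offset), chunk) pairs covering the array."""
        n_rows, n_cols = len(array), len(array[0])
        for r in range(0, n_rows, chunk_rows):
            for c in range(0, n_cols, chunk_cols):
                chunk = [row[c:c + chunk_cols] for row in array[r:r + chunk_rows]]
                yield (r, c), chunk

    grid = [[r * 4 + c for c in range(4)] for r in range(4)]  # 4x4 example array
    chunks = dict(chunk_2d(grid, 2, 2))                       # four 2x2 chunks
    ```

    Time-series analyses then become cheap because the values for one pixel across time live in a predictable set of chunks rather than scattered across thousands of scene files.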

  19. PAnalyzer: a software tool for protein inference in shotgun proteomics.

    PubMed

    Prieto, Gorka; Aloria, Kerman; Osinalde, Nerea; Fullaondo, Asier; Arizmendi, Jesus M; Matthiesen, Rune

    2012-11-05

    Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently, data-independent acquisition (DIA) approaches have emerged as an alternative to traditional data-dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used for one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However, currently available software does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore, inspection, comparison and reporting of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. In this paper we present PAnalyzer, a software tool focused on the protein inference process of shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are treated as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analyzed with ProteinLynx Global Server and the integration of technical replicates. PAnalyzer is an easy-to-use, multiplatform, free software tool.
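    The grouping problem PAnalyzer addresses can be illustrated with a deliberately simplified sketch. The three categories below are a reduction of peptide-evidence reasoning for illustration only and do not reproduce PAnalyzer's actual algorithm or category names:

    ```python
    # Simplified peptide-evidence categorization: a protein with a unique
    # peptide is conclusively identified; proteins sharing an identical
    # peptide set are indistinguishable; a protein whose peptides are all
    # shared (but whose set matches no other protein exactly) is ambiguous.
    def categorize(proteins):
        """proteins: dict mapping protein id -> set of identified peptides."""
        categories = {}
        for prot, peps in proteins.items():
            others = set()
            for q, p in proteins.items():
                if q != prot:
                    others |= p
            if peps - others:
                categories[prot] = "conclusive"        # at least one unique peptide
            elif any(peps == p for q, p in proteins.items() if q != prot):
                categories[prot] = "indistinguishable"  # same evidence as another protein
            else:
                categories[prot] = "ambiguous"          # only shared peptides
        return categories

    result = categorize({
        "P1": {"a", "b"},   # unique peptide "a"
        "P2": {"b", "c"},   # identical evidence to P3
        "P3": {"b", "c"},
        "P4": {"b"},        # all evidence shared
    })
    ```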

  20. PAnalyzer: A software tool for protein inference in shotgun proteomics

    PubMed Central

    2012-01-01

    Background: Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently, data-independent acquisition (DIA) approaches have emerged as an alternative to traditional data-dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used for one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However, currently available software does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore, inspection, comparison and reporting of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. Results: In this paper we present PAnalyzer, a software tool focused on the protein inference process of shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are treated as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. Conclusions: We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analyzed with ProteinLynx Global Server and the integration of technical replicates. PAnalyzer is an easy-to-use, multiplatform, free software tool. PMID:23126499

  1. IsoMS: automated processing of LC-MS data generated by a chemical isotope labeling metabolomics platform.

    PubMed

    Zhou, Ruokun; Tseng, Chiao-Li; Huan, Tao; Li, Liang

    2014-05-20

    A chemical isotope labeling or isotope coded derivatization (ICD) metabolomics platform uses a chemical derivatization method to introduce a mass tag to all of the metabolites having a common functional group (e.g., amine), followed by LC-MS analysis of the labeled metabolites. To apply this platform to metabolomics studies involving quantitative analysis of different groups of samples, automated data processing is required. Herein, we report a data processing method based on the use of a mass spectral feature unique to the chemical labeling approach, i.e., any differential-isotope-labeled metabolites are detected as peak pairs with a fixed mass difference in a mass spectrum. A software tool, IsoMS, has been developed to process the raw data generated from one or multiple LC-MS runs by peak picking, peak pairing, peak-pair filtering, and peak-pair intensity ratio calculation. The same peak pairs detected from multiple samples are then aligned to produce a CSV file that contains the metabolite information and peak ratios relative to a control (e.g., a pooled sample). This file can be readily exported for further data and statistical analysis, which is illustrated in an example of comparing the metabolomes of human urine samples collected before and after drinking coffee. To demonstrate that this method is reliable for data processing, five ¹³C₂-/¹²C₂-dansyl labeled metabolite standards were analyzed by LC-MS. IsoMS was able to detect these metabolites correctly. In addition, in the analysis of a ¹³C₂-/¹²C₂-dansyl labeled human urine sample, IsoMS detected 2044 peak pairs, and manual inspection of these peak pairs found 90 false peak pairs, representing a false positive rate of 4.4%. IsoMS for Windows running R is freely available for noncommercial use from www.mycompoundid.org/IsoMS.
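    The peak-pair detection at the core of IsoMS can be sketched as a search for m/z values separated by the fixed labeling mass difference. The 2.00671 Da spacing (two ¹³C vs. two ¹²C atoms) follows from the isotope masses, but the tolerance and the peak list below are invented for illustration, and real implementations must also handle charge states and intensity filtering:

    ```python
    # Find light/heavy peak pairs separated by the fixed labeling mass difference.
    DELTA = 2.00671   # mass difference of a 13C2- vs 12C2-dansyl tag (Da)
    TOL = 0.005       # illustrative matching tolerance (Da)

    def find_pairs(mzs, delta=DELTA, tol=TOL):
        """Return (light, heavy) m/z pairs whose difference matches delta."""
        mzs = sorted(mzs)
        pairs = []
        for i, light in enumerate(mzs):
            for heavy in mzs[i + 1:]:
                if abs((heavy - light) - delta) <= tol:
                    pairs.append((light, heavy))
        return pairs

    peaks = [300.1000, 302.1067, 310.5000, 415.2200, 417.2267]
    pairs = find_pairs(peaks)   # two peak pairs; 310.5 has no partner
    ```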

  2. Radiobiological foundation of crew radiation risk for mars mission

    NASA Astrophysics Data System (ADS)

    Shafirkin, A.

    The results of a comprehensive clinico-physiological study of 250 dogs chronically exposed to gamma-radiation 22 hours per day throughout their lives are presented. The exposure duration was 3 and 6 years. The dose rate varied between 25 and 150 cSv/year to simulate the galactic cosmic ray dose received by crew members during a Mars mission. Several groups of dogs received an additional acute dose of 10 or 50 cSv within a day, three times per year, to simulate stochastic irradiation caused by solar cosmic rays. Data on the status of the organism's regulatory systems, the dynamics of metabolic processes, and the organism's reaction to additional functional loads are also presented. Reactions and kinetic relations are considered in detail for the most radiosensitive, regenerating tissue systems of the organism, namely the blood-forming system and the spermatogenic epithelium. The results on life span reduction of the dogs and breed characteristics after the radiation exposure are discussed. Based on the results obtained in this study, on model experiments performed with large numbers of small laboratory animals exposed to a wide dose range, and on other published data, mathematical models were developed, e.g., a model of time-dependent radiation damage formation that takes recovery processes into account, and a model of the radiation mortality rate of mammals. Based on these models and an analysis of the radiation environment behind various shielding on the route to Mars, crew radiation risk was calculated for space missions of various durations. Total radiation risk values over a cosmonaut's lifetime after the missions were also estimated, together with the expected life span reduction.

  3. Radiobiological foundation of crew radiation risk for Mars mission

    NASA Astrophysics Data System (ADS)

    Aleksandr, Shafirkin; Grigoriev, Yurj

    The results of a comprehensive clinico-physiological study of 250 dogs chronically exposed to gamma-radiation 22 hours per day throughout their lives are presented. The exposure duration was 3 and 6 years. The dose rate varied between 25 and 150 cSv/year to simulate the galactic cosmic ray dose received by crew members during a Mars mission. Several groups of dogs received an additional acute dose of 10 or 50 cSv within a day, three times per year, to simulate stochastic irradiation caused by solar cosmic rays. Data on the status of the organism's regulatory systems, the dynamics of metabolic processes, and the organism's reaction to additional functional loads are also presented. Reactions and kinetic relations are considered in detail for the most radiosensitive, regenerating tissue systems of the organism, namely the blood-forming system and the spermatogenic epithelium. The results on life span reduction of the dogs and breed characteristics after the radiation exposure are discussed. Based on the results obtained in this study, on model experiments performed with large numbers of small laboratory animals exposed to a wide dose range, and on other published data, mathematical models were developed, e.g., a model of time-dependent radiation damage formation that takes recovery processes into account, and a model of the radiation mortality rate of mammals. Based on these models and an analysis of the radiation environment behind various shielding on the route to Mars, crew radiation risk was calculated for space missions of various durations. Total radiation risk values over a cosmonaut's lifetime after the missions were also estimated, together with the expected life span reduction.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Q

    Purpose: To meet clinical and research requirements, we developed and verified a function for automatically reading doses of interest from a dose volume histogram (DVH), replacing the traditional method of reading point by point with a mouse. Methods: The DVH automatic reading function was developed in an in-house radiotherapy information management system (RTIMS) based on Apache+PHP+MySQL. A DVH ASCII file is exported from Varian Eclipse V8.6 and includes the following contents: 1. basic patient information; 2. plan dose information; 3. structure dose information, including basic information and dose volume data for target volumes and organs at risk. The default exported dose volume data include relative doses in 1% steps with the corresponding absolute doses and cumulative relative volumes, the volumes given to four decimal places. Clinically, we often need to read the doses at integer percent volumes, such as D50 and D30. These cannot be read directly from the exported data, but can be obtained by linear interpolation between the neighboring volume-dose points: Dx = D2 − (V2 − Vx) × (D2 − D1)/(V2 − V1). We programmed a function to search, read and calculate the corresponding data, so that the doses for all preset volumes of interest of all structures can be read automatically, patient by patient, and saved as a CSV file. For verification, we selected 24 IMRT plans for prostate cancer; the doses of interest were PTV D98/D95/D5/D2, bladder D30/D50, and rectum D25/D50. Two groups of data, obtained with the automatic reading method (ARM) and the pointed dose method (PDM), were analyzed with SPSS 16: absolute difference = D_ARM − D_PDM; relative difference = absolute difference × 100% / prescription dose (7600 cGy). Results: The differences were as follows: PTV D98/D95/D5/D2: −0.04%/−0.04%/0.13%/0.19%; bladder D30/D50: −0.02%/0.01%; rectum D25/D50: 0.03%/0.01%. Conclusion: The error of this function is negligibly small, and it could greatly improve the efficiency of clinical work. Project supported by the National Natural Science Foundation of China (Grant No. 81101694)
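    The linear interpolation step in the abstract translates directly into code. A minimal sketch, with the bracketing DVH points and the example values invented for illustration (on a cumulative DVH the volume decreases as dose increases, so V1 > V2 while D1 < D2):

    ```python
    # Dx = D2 - (V2 - Vx) * (D2 - D1) / (V2 - V1), the interpolation formula
    # from the abstract, applied between two exported DVH points.
    def dose_at_volume(v1, d1, v2, d2, vx):
        """Linearly interpolate the dose at relative volume vx between
        the DVH points (v1, d1) and (v2, d2)."""
        return d2 - (v2 - vx) * (d2 - d1) / (v2 - v1)

    # Illustrative numbers only: bracketing points around V = 50% to read D50.
    d50 = dose_at_volume(50.1234, 7400.0, 49.8765, 7420.0, 50.0)
    ```

    Looping such a call over every preset volume of interest, structure, and patient, and writing the rows with the `csv` module, reproduces the batch behavior the abstract describes.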

  5. An Observation Analysis Tool for time-series analysis and sensor management in the FREEWAT GIS environment for water resources management

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo

    2017-04-01

    In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool (OAT) for time-series analysis. The tool imports time-series data into QGIS from local CSV files, from online sensors via the istSOS service, or from MODFLOW model result files, and enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can also be used as a pre-processor for calibration, creating calibration observations directly from sensor time-series. The tool consists of an expandable Python library of processing methods and an interface integrated into the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools and calibration capacity.
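    The kind of pre-processing such a tool performs on sensor time-series can be illustrated with the standard library: parse timestamp/value rows from CSV and aggregate them to a coarser time step. The column names and the aggregation choice below are invented for this sketch and are not OAT's actual interface:

    ```python
    import csv
    import io
    from statistics import mean

    # Hypothetical sensor export: sub-daily water levels to be aggregated
    # to daily means before feeding a model.
    raw = """timestamp,level_m
    2016-01-01T00:00,2.10
    2016-01-01T06:00,2.15
    2016-01-01T12:00,2.05
    2016-01-02T00:00,1.98
    2016-01-02T12:00,2.02
    """

    daily = {}
    for row in csv.DictReader(io.StringIO(raw)):
        day = row["timestamp"].strip()[:10]          # group by calendar day
        daily.setdefault(day, []).append(float(row["level_m"]))

    daily_mean = {day: round(mean(vals), 3) for day, vals in daily.items()}
    ```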

  6. Sexual Venue Choice and Sexual Risk-Taking Among Substance-Using Men Who have Sex with Men

    PubMed Central

    Fletcher, Jesse B.; Reback, Cathy J.

    2016-01-01

    Commercial sex venues (CSVs) and public sex environments (PSEs) offer men who have sex with men (MSM) sexual privacy and anonymity. Sociodemographic characteristics (e.g., race/ethnicity, sexual identity, age, HIV status) are correlated with individuals’ choice of sexual venue, potentially suggesting environmental associations with both sociodemographics and sexual risk. From March 2005 through March 2012, 1298 substance-using MSM provided information on their most recent sexual encounter; iterative logit models estimated associations between sociodemographics and sexual venue, and/or whether sexual venue was associated with sexual risk-taking while controlling for sociodemographics. More than a third of participants’ most recent sexual encounters took place in either a PSE (23.0%) or a CSV (11.3%); anonymous, HIV-serodiscordant, and/or sex while on methamphetamine and/or marijuana was significantly more likely to occur in CSVs/PSEs than in a private location, even when controlling for sociodemographics. Findings demonstrate that socioenvironmental factors were associated with sexual risk-taking among high-risk, urban MSM. PMID:27905014

  7. Sexual Venue Choice and Sexual Risk-Taking Among Substance-Using Men Who have Sex with Men.

    PubMed

    Rusow, Joshua A; Fletcher, Jesse B; Reback, Cathy J

    2017-04-01

    Commercial sex venues (CSVs) and public sex environments (PSEs) offer men who have sex with men (MSM) sexual privacy and anonymity. Sociodemographic characteristics (e.g., race/ethnicity, sexual identity, age, HIV status) are correlated with individuals' choice of sexual venue, potentially suggesting environmental associations with both sociodemographics and sexual risk. From March 2005 through March 2012, 1298 substance-using MSM provided information on their most recent sexual encounter; iterative logit models estimated associations between sociodemographics and sexual venue, and/or whether sexual venue was associated with sexual risk-taking while controlling for sociodemographics. More than a third of participants' most recent sexual encounters took place in either a PSE (23.0%) or a CSV (11.3%); anonymous, HIV-serodiscordant, and/or sex while on methamphetamine and/or marijuana was significantly more likely to occur in CSVs/PSEs than in a private location, even when controlling for sociodemographics. Findings demonstrate that socioenvironmental factors were associated with sexual risk-taking among high-risk, urban MSM.

  8. TOUGH3 v1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PAU, GEORGE; JUNG, YOOJIN; FINSTERLE, STEFAN

    2016-09-14

    TOUGH3 V1.0 simulates multi-dimensional, multi-phase, multi-component, non-isothermal flow and transport in fractured porous media, with applications in geosciences, reservoir engineering, and other areas. TOUGH3 V1.0 supports a number of different combinations of fluids and components (updated equation-of-state (EOS) modules from previous versions of TOUGH, including EOS1, EOS2, EOS3, EOS4, EOS5, EOS7, EOS7R, EOS7C, EOS7CA, EOS8, EOS9, EWASG, TMVOC, ECO2N, and ECO2M). This upgrade includes (a) an expanded list of updated equation-of-state (EOS) modules, (b) new hysteresis models, (c) a new implementation of parallel and solver functionalities, (d) new linear solver options based on PETSc libraries, (e) a new build system that automatically downloads and builds third-party libraries and TOUGH3, (f) new printout in CSV format, (g) dynamic memory allocation, (h) various user features, and (i) bug fixes.

  9. CFS MATLAB toolbox: An experiment builder for continuous flash suppression (CFS) task.

    PubMed

    Nuutinen, Mikko; Mustonen, Terhi; Häkkinen, Jukka

    2017-09-15

    CFS toolbox is an open-source collection of MATLAB functions that utilizes PsychToolbox-3 (PTB-3). It is designed to allow a researcher to create and run continuous flash suppression experiments using a variety of experimental parameters (i.e., stimulus types and locations, noise characteristics, and experiment window settings). In a CFS experiment, one eye is presented with a dynamically changing noise pattern, while the other eye is concurrently presented with a static target stimulus, such as a Gabor patch. Due to the strong interocular suppression created by the dominant noise-pattern mask, the target stimulus is rendered invisible for an extended duration. Very little knowledge of MATLAB is required to use the toolbox; experiments are generated by modifying CSV files with the required parameters, and result data are output to text files for further analysis. The open-source code is available on the project page under a Creative Commons License ( http://www.mikkonuutinen.arkku.net/CFS_toolbox/ and https://bitbucket.org/mikkonuutinen/cfs_toolbox ).

  10. The Brainomics/Localizer database.

    PubMed

    Papadopoulos Orfanos, Dimitri; Michel, Vincent; Schwartz, Yannick; Pinel, Philippe; Moreno, Antonio; Le Bihan, Denis; Frouin, Vincent

    2017-01-01

    The Brainomics/Localizer database exposes part of the data collected by the in-house Localizer project, which planned to acquire four types of data from volunteer research subjects: anatomical MRI scans, functional MRI data, behavioral and demographic data, and DNA sampling. Over the years, this local project has been collecting such data from hundreds of subjects. We had selected 94 of these subjects for their complete datasets, including all four types of data, as the basis for a prior publication; the Brainomics/Localizer database publishes the data associated with these 94 subjects. Since regulatory rules prevent us from making genetic data available for download, the database serves only anatomical MRI scans, functional MRI data, and behavioral and demographic data. To publish this set of heterogeneous data, we use dedicated software based on the open-source CubicWeb semantic web framework. Through genericity in the data model and flexibility in the display of data (web pages, CSV, JSON, XML), CubicWeb helps us expose these complex datasets in original and efficient ways. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Corexit 9500 inactivates two enveloped viruses of aquatic animals but enhances the infectivity of a nonenveloped fish virus.

    PubMed

    Pham, P H; Huang, Y J; Chen, C; Bols, N C

    2014-02-01

    The effects of Corexit 9500, a dispersant used to clean up oil spills, on invertebrates, lower vertebrates, birds, and human health have been examined, but there is a significant lack of study of the effect of this dispersant on aquatic viruses. In this study, the effects of Corexit 9500 on four aquatic viruses of differing structural composition were examined. Corexit 9500 reduced the titer of the enveloped viral hemorrhagic septicemia virus (VHSV) at all concentrations (10% to 0.001%) examined. The titer of frog virus 3 (FV3), a virus with both enveloped and nonenveloped virions, was reduced only at the high Corexit 9500 concentrations (10% to 0.1%). Corexit 9500 was unable to reduce the titer of nonenveloped infectious pancreatic necrosis virus (IPNV) but enhanced the titer of chum salmon reovirus (CSV) by 2 to 4 logs. With the ability to inactivate enveloped viruses and possibly enhance some nonenveloped viruses, Corexit 9500 has the potential to alter the aquatic virosphere.

  12. Nestly--a framework for running software with nested parameter choices and aggregating results.

    PubMed

    McCoy, Connor O; Gallagher, Aaron; Hoffman, Noah G; Matsen, Frederick A

    2013-02-01

    The execution of a software application or pipeline using various combinations of parameters and inputs is a common task in bioinformatics. In the absence of a specialized tool to organize, streamline and formalize this process, scientists must frequently write complex scripts to perform these tasks. We present nestly, a Python package to facilitate running tools with nested combinations of parameters and inputs. nestly provides three components. First, a module to build nested directory structures corresponding to choices of parameters. Second, the nestrun script to run a given command using each set of parameter choices. Third, the nestagg script to aggregate results of the individual runs into a CSV file, as well as support for more complex aggregation. We also include a module for easily specifying nested dependencies for the SCons build tool, enabling incremental builds. Source, documentation and tutorial examples are available at http://github.com/fhcrc/nestly. nestly can be installed from the Python Package Index via pip; it is open source (MIT license).
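    The first of nestly's three components, building a directory tree from nested parameter choices, can be mimicked with the standard library. This sketch shows the concept only; nestly's real API differs, and the parameter names below are invented:

    ```python
    import itertools
    import os
    import tempfile

    # One leaf directory per combination of parameter choices,
    # e.g. <base>/0.1/gtr, <base>/0.1/jc, <base>/0.5/gtr, <base>/0.5/jc.
    params = {"alpha": ["0.1", "0.5"], "model": ["gtr", "jc"]}

    base = tempfile.mkdtemp()
    leaves = []
    for combo in itertools.product(*params.values()):
        path = os.path.join(base, *combo)
        os.makedirs(path, exist_ok=True)
        leaves.append(dict(zip(params.keys(), combo)))
    ```

    A runner in the spirit of nestrun would then execute the target command once per leaf, and an aggregator in the spirit of nestagg would collect each leaf's result file into one CSV.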

  13. HDBStat!: a platform-independent software suite for statistical analysis of high dimensional biology data.

    PubMed

    Trivedi, Prinal; Edwards, Jode W; Wang, Jelai; Gadbury, Gary L; Srinivasasainagendra, Vinodh; Zakharkin, Stanislav O; Kim, Kyoungmi; Mehta, Tapan; Brand, Jacob P L; Patki, Amit; Page, Grier P; Allison, David B

    2005-04-06

    Many efforts in microarray data analysis are focused on providing tools and methods for the qualitative analysis of microarray data. HDBStat! (High-Dimensional Biology-Statistics) is a software package designed for analysis of high dimensional biology data such as microarray data. It was initially developed for the analysis of microarray gene expression data, but it can also be used for some applications in proteomics and other aspects of genomics. HDBStat! provides statisticians and biologists a flexible and easy-to-use interface to analyze complex microarray data using a variety of methods for data preprocessing, quality control analysis and hypothesis testing. Results generated from data preprocessing, quality control analysis and hypothesis testing methods are output in the form of Excel CSV tables, graphs and an HTML report summarizing the data analysis. HDBStat! is platform-independent software that is freely available to academic institutions and non-profit organizations. It can be downloaded from our website http://www.soph.uab.edu/ssg_content.asp?id=1164.

  14. BAMS2 Workspace: a comprehensive and versatile neuroinformatic platform for collating and processing neuroanatomical connections

    PubMed Central

    Bota, Mihail; Talpalaru, Ştefan; Hintiryan, Houri; Dong, Hong-Wei; Swanson, Larry W.

    2014-01-01

    We present in this paper a novel neuroinformatic platform, the BAMS2 Workspace (http://brancusi1.usc.edu), designed for storing and processing information about gray matter region axonal connections. This de novo constructed module allows registered users to directly collate their data by using a simple and versatile visual interface. It also allows construction and analysis of sets of connections associated with gray matter region nomenclatures from any designated species. The Workspace includes a set of tools allowing the display of data in matrix and network formats, and the uploading of processed information in visual, PDF, CSV, and Excel formats. Finally, the Workspace can be accessed anonymously by third party systems to create individualized connectivity networks. All features of the BAMS2 Workspace are described in detail, and are demonstrated with connectivity reports collated in BAMS and associated with the rat sensory-motor cortex, medial frontal cortex, and amygdalar regions. PMID:24668342

  15. Global Federation of Data Services in Seismology: Extending the Concept to Interdisciplinary Science

    NASA Astrophysics Data System (ADS)

    Ahern, Tim; Trabant, Chad; Stults, Mike; VanFossen, Mick

    2016-04-01

    The International Federation of Digital Seismograph Networks (FDSN) sets international standards, formats, and access protocols for global seismology. Recently the availability of an FDSN standard for web services has enabled the development of a federated model of data access. With a growing number of internationally distributed data centers supporting compatible web services, the task of federation is now fully realizable. The utility of this approach is already starting to bear fruit in seismology. This presentation will highlight the advances the seismological community has made in the past year towards federated access to seismological data including waveforms, earthquake event catalogs, and metadata describing seismic stations. It will include a discussion of an IRIS Federator as well as an emerging effort to develop an FDSN Federator that will allow seamless access to seismological information across multiple FDSN data centers. As part of the NSF EarthCube initiative as well as the US-European data coordination project (COOPEUS), IRIS and several partners, collectively called GeoWS, have been extending the concept of standard web services to other domains. Our primary partners include Lamont Doherty Earth Observatory (marine geophysics), Caltech (tectonic plate reconstructions), SDSC (hydrology), UNAVCO (geodesy), and Unidata (atmospheric sciences). Additionally, IRIS is working with partners at NOAA's National Centers for Environmental Information (NCEI), NEON, UTEP, WOVOdat, INTERMAGNET, Global Geodynamics Program, and the Ocean Observatory Initiative (OOI) to develop web services for those domains. The ultimate goal is to allow discovery, access, and utilization of cross-domain data sources. One of the significant outcomes of this effort is the development of a simple text and metadata representation for tabular data called GeoCSV, which allows straightforward interpretation of information from multiple domains by non-domain experts.
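    The GeoCSV idea, keyword metadata carried in '#'-prefixed comment lines ahead of an ordinary CSV table, can be read with a few lines of Python. The keyword names shown are illustrative; consult the GeoCSV specification for the actual ones:

    ```python
    import csv
    import io

    # A hypothetical GeoCSV-style document: comment lines hold metadata,
    # everything else is plain CSV that any parser can handle.
    text = """# dataset: GeoCSV 2.0
    # field_unit: ISO_8601,meters
    # field_type: datetime,float
    time,depth
    2015-01-01T00:00:00Z,10.5
    2015-01-01T01:00:00Z,11.0
    """

    metadata, table_lines = {}, []
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#"):
            key, _, value = line[1:].partition(":")
            metadata[key.strip()] = value.strip()
        elif line:
            table_lines.append(line)

    rows = list(csv.DictReader(io.StringIO("\n".join(table_lines))))
    ```

    Because the metadata lines are comments, a non-domain reader can ignore them entirely and still load the table, which is exactly the cross-domain property the abstract emphasizes.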

  16. Future Flows Hydrology: an ensemble of daily river flow and monthly groundwater levels for use for climate change impact assessment across Great Britain

    NASA Astrophysics Data System (ADS)

    Prudhomme, C.; Haxton, T.; Crooks, S.; Jackson, C.; Barkwith, A.; Williamson, J.; Kelvin, J.; Mackay, J.; Wang, L.; Young, A.; Watts, G.

    2012-12-01

    The dataset Future Flows Hydrology was developed as part of the project "Future Flows and Groundwater Levels" to provide a consistent set of transient daily river flow and monthly groundwater levels projections across England, Wales and Scotland to enable the investigation of the role of climate variability on river flow and groundwater levels nationally and how this may change in the future. Future Flows Hydrology is derived from Future Flows Climate, a national ensemble projection derived from the Hadley Centre's ensemble projection HadRM3-PPE to provide a consistent set of climate change projections for the whole of Great Britain at both space and time resolutions appropriate for hydrological applications. Three hydrological models and one groundwater level model were used to derive Future Flows Hydrology, with 30 river sites simulated by two hydrological models to enable assessment of hydrological modelling uncertainty in studying the impact of climate change on the hydrology. Future Flows Hydrology contains an 11-member ensemble of transient projections from January 1951 to December 2098, each associated with a single realisation from a different variant of HadRM3 and a single hydrological model. Daily river flows are provided for 281 river catchments and monthly groundwater levels at 24 boreholes as .csv files containing all 11 ensemble members. When separate simulations are done with two hydrological models, two separate .csv files are provided. Because of potential biases in the climate-hydrology modelling chain, catchment fact sheets are associated with each ensemble. These contain information on the uncertainty associated with the hydrological modelling when driven using observed climate and Future Flows Climate for a period representative of the reference time slice 1961-1990 as described by key hydrological statistics. Graphs of projected changes for selected hydrological indicators are also provided for the 2050s time slice. 
Limitations associated with the dataset are provided, along with practical recommendations for use. Future Flows Hydrology is freely available for non-commercial use under certain licensing conditions. For each study site, catchment averages of daily precipitation and monthly potential evapotranspiration, used to drive the hydrological models, are made available, so that hydrological modelling uncertainty under climate change conditions can be explored further. doi:10.5285/f3723162-4fed-4d9d-92c6-dd17412fa37b.

  17. Future Flows Hydrology: an ensemble of daily river flow and monthly groundwater levels for use for climate change impact assessment across Great Britain

    NASA Astrophysics Data System (ADS)

    Prudhomme, C.; Haxton, T.; Crooks, S.; Jackson, C.; Barkwith, A.; Williamson, J.; Kelvin, J.; Mackay, J.; Wang, L.; Young, A.; Watts, G.

    2013-03-01

    The dataset Future Flows Hydrology was developed as part of the project "Future Flows and Groundwater Levels" to provide a consistent set of transient daily river flow and monthly groundwater level projections across England, Wales and Scotland to enable the investigation of the role of climate variability on river flow and groundwater levels nationally and how this may change in the future. Future Flows Hydrology is derived from Future Flows Climate, a national ensemble projection derived from the Hadley Centre's ensemble projection HadRM3-PPE to provide a consistent set of climate change projections for the whole of Great Britain at both space and time resolutions appropriate for hydrological applications. Three hydrological models and one groundwater level model were used to derive Future Flows Hydrology, with 30 river sites simulated by two hydrological models to enable assessment of hydrological modelling uncertainty in studying the impact of climate change on the hydrology. Future Flows Hydrology contains an 11-member ensemble of transient projections from January 1951 to December 2098, each associated with a single realisation from a different variant of HadRM3 and a single hydrological model. Daily river flows are provided for 281 river catchments and monthly groundwater levels at 24 boreholes as .csv files containing all 11 ensemble members. When separate simulations are done with two hydrological models, two separate .csv files are provided. Because of potential biases in the climate-hydrology modelling chain, catchment fact sheets are associated with each ensemble. These contain information on the uncertainty associated with the hydrological modelling when driven using observed climate and Future Flows Climate for a period representative of the reference time slice 1961-1990 as described by key hydrological statistics. Graphs of projected changes for selected hydrological indicators are also provided for the 2050s time slice.
Limitations associated with the dataset are provided, along with practical recommendations for use. Future Flows Hydrology is freely available for non-commercial use under certain licensing conditions. For each study site, catchment averages of daily precipitation and monthly potential evapotranspiration, used to drive the hydrological models, are made available, so that hydrological modelling uncertainty under climate change conditions can be explored further. doi:10.5285/f3723162-4fed-4d9d-92c6-dd17412fa37b
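
The per-site .csv layout described above (all 11 ensemble members in one file) lends itself to simple scripted analysis. A minimal Python sketch, with invented column names and flow values standing in for the real ensemble members, computing a daily ensemble mean:

```python
import csv
import io
import statistics

# Illustrative only: a tiny stand-in for a Future Flows Hydrology .csv,
# one column per ensemble member (member names and values are invented).
sample = """date,member_a,member_b,member_c
1951-01-01,12.1,11.8,13.0
1951-01-02,11.5,11.9,12.4
"""

ensemble_mean = {}
for row in csv.DictReader(io.StringIO(sample)):
    members = [float(v) for k, v in row.items() if k != "date"]
    ensemble_mean[row["date"]] = statistics.mean(members)

print(ensemble_mean["1951-01-01"])
```

The same loop scales to the full 1951-2098 record; per-member spread (e.g. `statistics.stdev`) gives a first look at the climate-model uncertainty the fact sheets describe.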

  18. The Ocean Observatories Initiative: Data Acquisition Functions and Its Built-In Automated Python Modules

    NASA Astrophysics Data System (ADS)

    Smith, M. J.; Vardaro, M.; Crowley, M. F.; Glenn, S. M.; Schofield, O.; Belabbassi, L.; Garzio, L. M.; Knuth, F.; Fram, J. P.; Kerfoot, J.

    2016-02-01

    The Ocean Observatories Initiative (OOI), funded by the National Science Foundation, provides users with access to long-term datasets from a variety of oceanographic sensors. The Endurance Array in the Pacific Ocean consists of two separate lines off the coasts of Oregon and Washington. The Oregon line consists of 7 moorings, two cabled benthic experiment packages and 6 underwater gliders. The Washington line comprises 6 moorings and 6 gliders. Each mooring is outfitted with a variety of instrument packages. The raw data from these instruments are sent to shore via satellite communication and, in some cases, via fiber optic cable. The raw data are then sent to the cyberinfrastructure (CI) group at Rutgers, where they are aggregated, parsed into thousands of different data streams, and integrated into a software package called uFrame. The OOI CI delivers the data to the general public via a web interface that outputs data into commonly used scientific data file formats such as JSON, netCDF, and CSV. The Rutgers data management team has developed a series of command-line Python tools that streamline data acquisition in order to facilitate the QA/QC review process. The first step in the process is querying the uFrame database for a list of all available platforms. From this list, a user can choose a specific platform and automatically download all available datasets from the specified platform. The downloaded dataset is plotted using a generalized Python netCDF plotting routine that utilizes the matplotlib data visualization library. This routine loads each netCDF file separately and outputs plots for each available parameter. These Python tools have been uploaded to a GitHub repository that is openly available to help facilitate OOI data access and visualization.
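
The query-then-download step described above can be sketched as simple URL construction. The host, path pattern and parameter names below are placeholders for illustration, not the actual uFrame API:

```python
from urllib.parse import urlencode

# Hypothetical base endpoint; the real uFrame/M2M service differs.
BASE = "https://ooinet.example.org/api/sensor/inv"

def dataset_url(platform, node, sensor, start, end, fmt="application/netcdf"):
    """Build a request URL for one instrument data stream (sketch only)."""
    path = "/".join([BASE, platform, node, sensor])
    query = urlencode({"beginDT": start, "endDT": end, "format": fmt})
    return f"{path}?{query}"

url = dataset_url("CE01ISSM", "RID16", "03-CTDBPC000",
                  "2016-01-01T00:00:00Z", "2016-02-01T00:00:00Z")
print(url)
```

A real client would issue an HTTP GET against each such URL, then hand the returned netCDF files to the plotting routine.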

  19. Monitoring WLCG with lambda-architecture: a new scalable data store and analytics platform for monitoring at petabyte scale.

    NASA Astrophysics Data System (ADS)

    Magnoni, L.; Suthakar, U.; Cordeiro, C.; Georgiou, M.; Andreeva, J.; Khan, A.; Smith, D. R.

    2015-12-01

    Monitoring the WLCG infrastructure requires the gathering and analysis of a high volume of heterogeneous data (e.g. data transfers, job monitoring, site tests) coming from different services and experiment-specific frameworks to provide a uniform and flexible interface for scientists and sites. The current architecture, where relational database systems are used to store, process and serve monitoring data, has limitations in coping with the foreseen increase in the volume (e.g. higher LHC luminosity) and the variety (e.g. new data-transfer protocols and new resource types, such as cloud computing) of WLCG monitoring events. This paper presents a new scalable data store and analytics platform designed by the Support for Distributed Computing (SDC) group at the CERN IT department, which uses a variety of technologies, each one targeting specific aspects of large-scale distributed data processing (an approach commonly referred to as a lambda architecture). Results of data processing on Hadoop for WLCG data activities monitoring are presented, showing how the new architecture can easily analyze hundreds of millions of transfer logs in a few minutes. Moreover, a comparison of data partitioning, compression and file formats (e.g. CSV, Avro) is presented, with particular attention given to how the file structure impacts the overall MapReduce performance. In conclusion, the evolution of the current implementation, which focuses on data storage and batch processing, towards a complete lambda architecture is discussed, with consideration of candidate technology for the serving layer (e.g. Elasticsearch) and a description of a proof-of-concept implementation, based on Apache Spark and Esper, for the real-time part, which compensates for batch-processing latency and automates problem and failure detection.
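
The batch-aggregation idea (analyzing hundreds of millions of transfer logs) reduces, at its core, to a map-reduce style count. A toy Python version over invented CSV-like log lines, not the actual WLCG log schema:

```python
from collections import Counter

# Invented transfer-log lines: timestamp, source site, destination site, status.
log_lines = [
    "2015-06-01T00:00:01,CERN-PROD,BNL-ATLAS,OK",
    "2015-06-01T00:00:02,CERN-PROD,FZK-LCG2,OK",
    "2015-06-01T00:00:03,RAL-LCG2,BNL-ATLAS,FAILED",
]

# "Map": emit (destination, 1) per record; "reduce": sum counts per key.
per_destination = Counter(line.split(",")[2] for line in log_lines)

print(per_destination.most_common(1))
```

On Hadoop the same map and reduce functions run in parallel over file splits, which is why row-oriented CSV versus block-compressed Avro changes MapReduce performance.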

  20. The Surface Ocean CO2 Atlas: Stewarding Underway Carbon Data from Collection to Archival

    NASA Astrophysics Data System (ADS)

    O'Brien, K.; Smith, K. M.; Pfeil, B.; Landa, C.; Bakker, D. C. E.; Olsen, A.; Jones, S.; Shrestha, B.; Kozyr, A.; Manke, A. B.; Schweitzer, R.; Burger, E. F.

    2016-02-01

    The Surface Ocean CO2 Atlas (SOCAT, www.socat.info) is a quality-controlled, global surface ocean carbon dioxide (CO2) data set gathered on research vessels, Ships of Opportunity (SOOP) and buoys. To the degree feasible, SOCAT is comprehensive; it draws together and applies uniform QC procedures to all such observations made across the international community. The first version of SOCAT (version 1.5) was publicly released in September 2011 (Bakker et al., 2011) with 6.3 million observations. This was followed by the release of SOCAT version 2, expanded to over 10 million observations, in June 2013 (Bakker et al., 2013). Most recently, in September 2015, SOCAT version 3 was released, containing over 14 million observations spanning almost 60 years. The process of assembling, QC'ing and publishing V1.5 and V2 of SOCAT required an unsustainable level of manual effort. To ease the burden on data managers and data providers, the SOCAT community agreed to embark on an automated data ingestion process that would create a streamlined workflow to improve data stewardship from ingestion to quality control and from publishing to archival. To that end, for version 3 and beyond, the SOCAT automation team created a framework that is based on standards and conventions, yet at the same time allows scientists to work in the data formats they feel most comfortable with (i.e., CSV files). This automated workflow provides several advantages: 1) data ingestion into uniform and standards-based file formats; 2) ease of data integration into a standard quality control system; 3) data ingestion and quality control that can be performed in parallel; 4) a uniform method of archiving carbon data and generating digital object identifiers (DOIs). In this presentation, we will discuss and demonstrate the SOCAT data ingestion dashboard and the quality control system.
We will also discuss the standards, conventions, and tools that were leveraged to create a workflow that allows scientists to work in their own formats, yet provides a framework for creating high quality data products on an annual basis, while meeting or exceeding data requirements for access, documentation and archival.
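
One way to picture the "scientists keep their CSV, the system standardises it" workflow is a column-alias mapping applied at ingestion. The alias table and column names below are invented for illustration, not SOCAT's actual schema:

```python
import csv
import io

# Hypothetical mapping from a submitter's ad-hoc headers to standard names.
ALIASES = {"pco2w": "pCO2_water_uatm", "sst": "temperature_C",
           "sal": "salinity_psu"}

# Invented example of a submitted underway-data file.
submitted = "date,sst,sal,pco2w\n2014-07-01,18.2,35.1,384.5\n"

standardised = []
for row in csv.DictReader(io.StringIO(submitted)):
    # Rename known columns; pass unknown ones through untouched.
    standardised.append({ALIASES.get(k, k): v for k, v in row.items()})

print(standardised[0]["pCO2_water_uatm"])
```

A real ingestion dashboard would additionally validate units and ranges before the record enters the common QC system.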

  1. NCCA 2010 Water

    EPA Pesticide Factsheets

    Data from the National Aquatic Resource Surveys: The following data are available for download as comma-separated values (.csv) files. Sort the table using the pull-down menus or headers to more easily locate the data. Right-click on the file name and select Save Link As to save the file to your computer. Make sure to also download the companion metadata file (.txt) for the list of field labels. See the survey technical document for more information on the data analyses. This dataset is associated with the following publications: Yurista, P., J. Kelly, and J. Scharold. Great Lakes nearshore-offshore: Distinct water quality regions. JOURNAL OF GREAT LAKES RESEARCH. International Association for Great Lakes Research, Ann Arbor, MI, USA, 42: 375-385, (2016). Kelly, J., P. Yurista, M. Starry, J. Scharold, W. Bartsch, and A. Cotter. The first US National Coastal Condition Assessment survey in the Great Lakes: Development of the GIS frame and exploration of spatial variation in nearshore water quality results. JOURNAL OF GREAT LAKES RESEARCH. International Association for Great Lakes Research, Ann Arbor, MI, USA, 41: 1060-1074, (2015).
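
A short Python sketch of the recommended workflow: pairing a downloaded .csv with its companion metadata file of field labels. Both file contents below are invented stand-ins, not actual NCCA files:

```python
import csv
import io

# Hypothetical companion metadata (.txt): tab-separated field label lines.
metadata_txt = "SITE_ID\tUnique site identifier\nNTL\tTotal nitrogen (ug/L)\n"
labels = dict(line.split("\t") for line in metadata_txt.strip().splitlines())

# Hypothetical downloaded data file (.csv).
data_csv = "SITE_ID,NTL\nNCCA10-1001,512\n"
rows = list(csv.DictReader(io.StringIO(data_csv)))

# Use the metadata to interpret a column.
print(f"{labels['NTL']}: {rows[0]['NTL']}")
```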

  2. CBrowse: a SAM/BAM-based contig browser for transcriptome assembly visualization and analysis.

    PubMed

    Li, Pei; Ji, Guoli; Dong, Min; Schmidt, Emily; Lenox, Douglas; Chen, Liangliang; Liu, Qi; Liu, Lin; Zhang, Jie; Liang, Chun

    2012-09-15

    To address the impending need for exploring rapidly increased transcriptomics data generated for non-model organisms, we developed CBrowse, an AJAX-based web browser for visualizing and analyzing transcriptome assemblies and contigs. Designed in a standard three-tier architecture with a data pre-processing pipeline, CBrowse is essentially a Rich Internet Application that offers many seamlessly integrated web interfaces and allows users to navigate, sort, filter, search and visualize data smoothly. The pre-processing pipeline takes the contig sequence file in FASTA format and its relevant SAM/BAM file as the input; detects putative polymorphisms, simple sequence repeats and sequencing errors in contigs; and generates image, JSON and database-compatible CSV text files that are directly utilized by different web interfaces. CBrowse is a generic visualization and analysis tool that facilitates close examination of assembly quality, genetic polymorphisms, sequence repeats and/or sequencing errors in transcriptome sequencing projects. CBrowse is distributed under the GNU General Public License and is available at http://bioinfolab.muohio.edu/CBrowse/. Contact: liangc@muohio.edu or liangc.mu@gmail.com; glji@xmu.edu.cn. Supplementary data are available at Bioinformatics online.
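
The pipeline's first input is a FASTA contig file. A minimal sketch of that parsing step, with invented sequences (real pipelines would use a library such as Biopython and also ingest the SAM/BAM alignments):

```python
# Invented two-contig FASTA text standing in for an assembly file.
fasta = """>contig_1 length=12
ACGTACGTACGT
>contig_2 length=8
GGGGCCCC
"""

contigs = {}
name = None
for line in fasta.strip().splitlines():
    if line.startswith(">"):
        # Header line: contig name is the first whitespace-separated token.
        name = line[1:].split()[0]
        contigs[name] = ""
    else:
        # Sequence lines may wrap; concatenate them.
        contigs[name] += line.strip()

print(sorted(contigs))
```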

  3. bioWeb3D: an online webGL 3D data visualisation tool.

    PubMed

    Pettit, Jean-Baptiste; Marioni, John C

    2013-06-07

    Data visualization is critical for interpreting biological data. However, in practice it can prove to be a bottleneck for untrained researchers; this is especially true for three-dimensional (3D) data representation. Whilst existing software can provide all the necessary functionality to represent and manipulate biological 3D datasets, very few packages are easily accessible (browser-based), cross-platform and usable by non-expert users. An online HTML5/WebGL-based 3D visualisation tool has been developed to allow biologists to quickly and easily view interactive and customizable three-dimensional representations of their data along with multiple layers of information. Using the WebGL library Three.js, written in JavaScript, bioWeb3D allows the simultaneous visualisation of multiple large datasets input via a simple JSON, XML or CSV file, which can be read and analysed locally thanks to HTML5 capabilities. Using basic 3D representation techniques in a technologically innovative context, we provide a program that is not intended to compete with professional 3D representation software, but that instead enables a quick and intuitive representation of reasonably large 3D datasets.
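
A small Python sketch of preparing such an input: converting a CSV of 3D points into a JSON payload of the general shape a WebGL viewer could consume. The key names are invented, not bioWeb3D's actual input schema:

```python
import csv
import io
import json

# Invented point cloud in CSV form.
points_csv = "x,y,z\n0.1,0.2,0.3\n1.0,1.1,1.2\n"

points = [[float(r["x"]), float(r["y"]), float(r["z"])]
          for r in csv.DictReader(io.StringIO(points_csv))]

# Hypothetical JSON envelope for a browser-side 3D viewer.
payload = json.dumps({"dataset": "demo", "points": points})
print(payload)
```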

  4. Observations from the GOES Space Environment Monitor and Solar X-ray Imager are now available in a whole new way!

    NASA Astrophysics Data System (ADS)

    Wilkinson, D. C.

    2012-12-01

    NOAA's Geosynchronous Operational Environmental Satellites (GOES) have been observing the environment in near-Earth space for over 37 years. Those data are down-linked and processed by the Space Weather Prediction Center (SWPC) and form the cornerstone of their alert and forecast services. At the close of each UT day these data are ingested by the National Geophysical Data Center (NGDC), where they are merged into the national archive and made available to the user community in a uniform manner. In 2012 NGDC unveiled a RESTful web service for accessing these data. What does this mean? Users can now build a web-like URL using simple predefined constructs that allow their browser or custom software to directly access the relational archives and bundle the requested data into a variety of popular formats. The user can select precisely the data they need and the results are delivered immediately. NGDC understands that many users are perfectly happy retrieving data via pre-generated files and will continue to provide internally documented NetCDF and CSV files far into the future.
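
The "build a web-like URL from simple predefined constructs" idea can be sketched in a few lines. The host, path segments and parameter names below are placeholders, not NGDC's actual service:

```python
from urllib.parse import urlencode

# Hypothetical REST endpoint for satellite environment data.
BASE = "https://satdat.example.gov/sem/goes/data"

def goes_request(satellite, instrument, start, end, fmt="csv"):
    """Compose a request URL for one satellite/instrument/date range."""
    query = urlencode({"begin_date": start, "end_date": end, "format": fmt})
    return f"{BASE}/{satellite}/{instrument}?{query}"

url = goes_request("goes13", "xrs", "20120101", "20120102")
print(url)
```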

  5. Chapter 21: Programmatic Interfaces - STILTS

    NASA Astrophysics Data System (ADS)

    Fitzpatrick, M. J.

    STILTS is the Starlink Tables Infrastructure Library Tool Set developed by Mark Taylor of the former Starlink Project. STILTS is a command-line tool (see the NVOSS_HOME/bin/stilts command) providing access to the same functionality driving the TOPCAT application and can be run using either the STILTS-specific jar file, or the more general TOPCAT jar file (both are available in the NVOSS_HOME/java/lib directory and are included in the default software environment classpath). The heart of both STILTS and TOPCAT is the STIL Java library. STIL is designed to efficiently handle the input, output and processing of very large tabular datasets and the STILTS task interface makes it an ideal tool for the scripting environment. Multiple formats are supported (including FITS Binary Tables, VOTable, CSV, SQL databases and ASCII, amongst others) and while some tools will generically handle all supported formats, others are specific to the VOTable format. Converting a VOTable to a more script-friendly format is the first thing most users will encounter, but there are many other useful tools as well.
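
Converting a VOTable to CSV, the first task most users meet, is a one-line stilts invocation. A hedged sketch: the `tcopy` syntax is shown as the author recalls it (consult the STILTS manual for the authoritative form), and the command is only constructed here, not executed:

```python
import shlex

# Hypothetical conversion of catalog.vot to catalog.csv with the
# stilts table-copy task; file names are placeholders.
cmd = "stilts tcopy ifmt=votable ofmt=csv in=catalog.vot out=catalog.csv"

# Split into an argv list, ready to hand to subprocess.run(argv).
argv = shlex.split(cmd)
print(argv)
```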

  6. Data indicating temperature response of Ti-6Al-4V thin-walled structure during its additive manufacture via Laser Engineered Net Shaping.

    PubMed

    Marshall, Garrett J; Thompson, Scott M; Shamsaei, Nima

    2016-06-01

    An OPTOMEC Laser Engineered Net Shaping (LENS™) 750 system was retrofitted with a melt pool pyrometer and in-chamber infrared (IR) camera for nondestructive thermal inspection of the blown-powder, direct laser deposition (DLD) process. Data indicative of temperature and heat transfer within the melt pool and heat affected zone atop a thin-walled structure of Ti-6Al-4V during its additive manufacture are provided. Melt pool temperature data were collected via the dual-wavelength pyrometer while the dynamic, bulk part temperature distribution was collected using the IR camera. Such data are provided in Comma Separated Values (CSV) file format, containing a 752×480 matrix and a 320×240 matrix of temperatures corresponding to individual pixels of the pyrometer and IR camera, respectively. The IR camera and pyrometer temperature data are provided in blackbody-calibrated, raw forms. Provided thermal data can aid in generating and refining process-property-performance relationships between laser manufacturing and its fabricated materials.
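
Loading one such CSV temperature matrix and locating the hottest pixel is straightforward. The stand-in frame below is 2×3 with invented values rather than the real 320×240:

```python
import csv
import io

# Invented miniature IR frame: each CSV row is one row of pixel temperatures.
frame_csv = "410.2,988.7,655.0\n300.1,512.9,701.4\n"

frame = [[float(v) for v in row]
         for row in csv.reader(io.StringIO(frame_csv))]

# Peak temperature across the frame (e.g. near the melt pool).
peak = max(max(row) for row in frame)
print(peak)
```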

  7. Observations from the GOES Space Environment Monitor and Solar X-ray Imager are now available in a whole new way!

    NASA Astrophysics Data System (ADS)

    Wilkinson, D. C.

    2013-12-01

    NOAA's Geosynchronous Operational Environmental Satellites (GOES) have been observing the environment in near-Earth space for over 37 years. Those data are down-linked and processed by the Space Weather Prediction Center (SWPC) and form the cornerstone of their alert and forecast services. At the close of each UT day these data are ingested by the National Geophysical Data Center (NGDC), where they are merged into the national archive and made available to the user community in a uniform manner. In 2012 NGDC unveiled a RESTful web service for accessing these data. What does this mean? Users can now build a web-like URL using simple predefined constructs that allow their browser or custom software to directly access the relational archives and bundle the requested data into a variety of popular formats. The user can select precisely the data they need and the results are delivered immediately. NGDC understands that many users are perfectly happy retrieving data via pre-generated files and will continue to provide internally documented NetCDF and CSV files far into the future.

  8. Data indicating temperature response of Ti–6Al–4V thin-walled structure during its additive manufacture via Laser Engineered Net Shaping

    PubMed Central

    Marshall, Garrett J.; Thompson, Scott M.; Shamsaei, Nima

    2016-01-01

    An OPTOMEC Laser Engineered Net Shaping (LENS™) 750 system was retrofitted with a melt pool pyrometer and in-chamber infrared (IR) camera for nondestructive thermal inspection of the blown-powder, direct laser deposition (DLD) process. Data indicative of temperature and heat transfer within the melt pool and heat affected zone atop a thin-walled structure of Ti–6Al–4V during its additive manufacture are provided. Melt pool temperature data were collected via the dual-wavelength pyrometer while the dynamic, bulk part temperature distribution was collected using the IR camera. Such data are provided in Comma Separated Values (CSV) file format, containing a 752×480 matrix and a 320×240 matrix of temperatures corresponding to individual pixels of the pyrometer and IR camera, respectively. The IR camera and pyrometer temperature data are provided in blackbody-calibrated, raw forms. Provided thermal data can aid in generating and refining process-property-performance relationships between laser manufacturing and its fabricated materials. PMID:27054180

  9. Kelly et al. (2016): Simulating the phase partitioning of NH3, HNO3, and HCl with size-resolved particles over northern Colorado in winter

    EPA Pesticide Factsheets

    In this study, modeled gas- and aerosol-phase ammonia, nitric acid, and hydrogen chloride are compared to measurements taken during a field campaign conducted in northern Colorado in February and March 2011. We compare the modeled and observed gas-particle partitioning, and assess potential reasons for discrepancies between the model and measurements. This data set contains scripts and data used for each figure in the associated manuscript. Figures are generated using the R project statistical programming language. Data files are in either comma-separated value (CSV) format or netCDF, a standard self-describing binary data format commonly used in the earth and atmospheric sciences. This dataset is associated with the following publication: Kelly, J., K. Baker, C. Nolte, S. Napelenok, W.C. Keene, and A.A.P. Pszenny. Simulating the phase partitioning of NH3, HNO3, and HCl with size-resolved particles over northern Colorado in winter. ATMOSPHERIC ENVIRONMENT. Elsevier Science Ltd, New York, NY, USA, 131: 67-77, (2016).

  10. Spatial Aspects of Multi-Sensor Data Fusion: Aerosol Optical Thickness

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory; Zubko, V.; Gopalan, A.

    2007-01-01

    The Goddard Earth Sciences Data and Information Services Center (GES DISC) investigated the applicability and limitations of combining multi-sensor data through data fusion, to increase the usefulness of the multitude of NASA remote sensing data sets, and as part of a larger effort to integrate this capability in the GES-DISC Interactive Online Visualization and Analysis Infrastructure (Giovanni). This initial study focused on merging daily mean Aerosol Optical Thickness (AOT), as measured by the Moderate Resolution Imaging Spectroradiometer (MODIS) onboard the Terra and Aqua satellites, to increase spatial coverage and produce complete fields to facilitate comparison with models and station data. The fusion algorithm used the maximum likelihood technique to merge the pixel values where available. The algorithm was applied to two regional AOT subsets (with mostly regular and irregular gaps, respectively) and a set of AOT fields that differed only in the size and location of artificially created gaps. The Cumulative Semivariogram (CSV) was found to be sensitive to the spatial distribution of gap areas and, thus, useful for assessing the sensitivity of the fused data to spatial gaps.
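
Under a Gaussian error model, the per-pixel maximum-likelihood merge of two sensors is the inverse-variance weighted mean. A toy sketch with invented AOT values and assumed per-sensor error variances (`None` marks a gap; the general technique is standard, not the project's exact implementation):

```python
# Invented per-pixel AOT values from two sensors; None means no retrieval.
terra = [0.20, None, 0.35]
aqua = [0.24, 0.18, None]
VAR_TERRA, VAR_AQUA = 0.01, 0.02  # assumed error variances (illustrative)

def merge(a, b):
    """Maximum-likelihood merge of one pixel: fall back to the available
    value when one sensor has a gap, otherwise take the inverse-variance
    weighted mean."""
    if a is None:
        return b
    if b is None:
        return a
    w_a, w_b = 1.0 / VAR_TERRA, 1.0 / VAR_AQUA
    return (w_a * a + w_b * b) / (w_a + w_b)

fused = [merge(a, b) for a, b in zip(terra, aqua)]
print(fused)
```

Pixels where neither sensor reports remain gaps; in the study these residual gaps are what the Cumulative Semivariogram helps characterize.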

  11. AAVSO Target Tool: A Web-Based Service for Tracking Variable Star Observations (Abstract)

    NASA Astrophysics Data System (ADS)

    Burger, D.; Stassun, K. G.; Barnes, C.; Kafka, S.; Beck, S.; Li, K.

    2018-06-01

    (Abstract only) The AAVSO Target Tool is a web-based interface for bringing stars in need of observation to the attention of AAVSO's network of amateur and professional astronomers. The site currently tracks over 700 targets of interest, collecting data on them on a regular basis from AAVSO's servers and sorting them based on priority. While the target tool does not require a login, users can obtain visibility times for each target by signing up and entering a telescope location. Other key features of the site include filtering by AAVSO observing section, sorting by different variable types, formatting the data for printing, and exporting the data to a CSV file. The AAVSO Target Tool builds upon seven years of experience developing web applications for astronomical data analysis, most notably on Filtergraph (Burger, D., et al. 2013, Astronomical Data Analysis Software and Systems XXII, Astronomical Society of the Pacific, San Francisco, 399), and is built using the web2py web framework based on the Python programming language. The target tool is available at http://filtergraph.com/aavso.

  12. The sedimentological characteristics and geochronology of the marshes of Dauphin Island, Alabama

    USGS Publications Warehouse

    Ellis, Alisha M.; Smith, Christopher G.; Marot, Marci E.

    2018-03-22

    In August 2015, scientists from the U.S. Geological Survey, St. Petersburg Coastal and Marine Science Center collected 11 push cores from the marshes of Dauphin Island and Little Dauphin Island, Alabama. Sample site environments included high marshes, low salt marshes, and salt flats, and varied in distance from the shoreline. The sampling efforts were part of a larger study to assess the feasibility and sustainability of proposed restoration efforts for Dauphin Island, Alabama, and to identify trends in shoreline erosion and accretion. The data presented in this publication can provide a basis for assessing organic and inorganic sediment accumulation rates and temporal changes in accumulation rates over multiple decades at multiple locations across the island. This study was funded by the National Fish and Wildlife Foundation, via the Gulf Environmental Benefit Fund. This report serves as an archive for the sedimentological and geochemical data derived from the marsh cores. Downloadable data are available and include Microsoft Excel spreadsheets (.xlsx), comma-separated values (.csv) text files, JPEG files, and formal Federal Geographic Data Committee metadata in a U.S. Geological Survey data release.

  13. PuffinPlot: A versatile, user-friendly program for paleomagnetic analysis

    NASA Astrophysics Data System (ADS)

    Lurcock, P. C.; Wilson, G. S.

    2012-06-01

    PuffinPlot is a user-friendly desktop application for analysis of paleomagnetic data, offering a unique combination of features. It runs on several operating systems, including Windows, Mac OS X, and Linux; supports both discrete and long core data; and facilitates analysis of very weakly magnetic samples. As well as interactive graphical operation, PuffinPlot offers batch analysis for large volumes of data, and a Python scripting interface for programmatic control of its features. Available data displays include demagnetization/intensity, Zijderveld, equal-area (for sample, site, and suite level demagnetization data, and for magnetic susceptibility anisotropy data), a demagnetization data table, and a natural remanent magnetization intensity histogram. Analysis types include principal component analysis, Fisherian statistics, and great-circle path intersections. The results of calculations can be exported as CSV (comma-separated value) files; graphs can be printed, and can also be saved as publication-quality vector files in SVG or PDF format. PuffinPlot is free, and the program, user manual, and fully documented source code may be downloaded from http://code.google.com/p/puffinplot/.
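
The Fisherian statistics PuffinPlot offers reduce to averaging unit vectors on the sphere. A sketch of the standard mean-direction and precision-parameter calculation, applied to three invented (declination, inclination) pairs in degrees; this illustrates the textbook formula, not PuffinPlot's internals:

```python
import math

# Invented sample directions as (declination, inclination) in degrees.
directions = [(10.0, 45.0), (15.0, 50.0), (5.0, 47.0)]

# Sum the corresponding unit vectors.
X = Y = Z = 0.0
for dec, inc in directions:
    d, i = math.radians(dec), math.radians(inc)
    X += math.cos(i) * math.cos(d)
    Y += math.cos(i) * math.sin(d)
    Z += math.sin(i)

R = math.sqrt(X * X + Y * Y + Z * Z)  # resultant vector length
mean_dec = math.degrees(math.atan2(Y, X)) % 360.0
mean_inc = math.degrees(math.asin(Z / R))
k = (len(directions) - 1) / (len(directions) - R)  # Fisher precision

print(round(mean_dec, 1), round(mean_inc, 1))
```

Tight clusters give R close to N and hence large k; dispersed directions drive k toward 1.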

  14. Web Site on Marine Connectivity Around Australia

    NASA Astrophysics Data System (ADS)

    Condie, Scott

    2005-06-01

    The Commonwealth Scientific and Industrial Research Organisation (CSIRO), with support from the Western Australian Government, has developed an online tool for marine scientists and managers to investigate the large-scale patterns of spatial connectivity around Australia that are associated with ocean current transport (Figure 1). This tool, referred to as the Australian Connectivity Interface, or Aus-ConnIe, is expected to find applications in areas such as tracer dispersion studies (see the example by Ridgway and Condie [2004]), larval dispersion and recruitment, and the development of scenarios and preliminary risk assessments for contaminant dispersion in the marine environment. After selecting a region of interest, users can investigate where material carried into that region comes from, or where material originating in that region goes to, over a range of timescales (weeks to months). These connectivity statistics are based on large numbers of particle trajectories (one million at any given time) estimated from satellite altimeter data, coastal tide-gauge data, and winds from meteorological models. Users can save the results in a variety of formats (CSV, Excel, or XML) and, as an option, may save their sessions by first registering.
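
The connectivity statistic amounts to counting where particles released in one region end up. A toy Python version with four invented (source, destination) trajectories, illustrating the idea rather than Aus-ConnIe's implementation:

```python
from collections import Counter

# Invented particle trajectories as (source region, destination region).
trajectories = [("A", "B"), ("A", "B"), ("A", "C"), ("B", "C")]

counts = Counter(trajectories)

# Fraction of particles released in region A that reached region B.
from_a = sum(n for (src, _), n in counts.items() if src == "A")
p_a_to_b = counts[("A", "B")] / from_a
print(p_a_to_b)
```

Repeating this for every region pair yields the connectivity matrix the interface visualises.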

  15. GRIIDC: A Data Repository for Gulf of Mexico Science

    NASA Astrophysics Data System (ADS)

    Ellis, S.; Gibeaut, J. C.

    2017-12-01

    The Gulf of Mexico Research Initiative Information & Data Cooperative (GRIIDC) system is a data management solution appropriate for any researcher sharing Gulf of Mexico and oil spill science data. Our mission is to ensure a data and information legacy that promotes continual scientific discovery and public awareness of the Gulf of Mexico ecosystem. GRIIDC developed an open-source software solution to manage data from the Gulf of Mexico Research Initiative (GoMRI). The GoMRI program has over 2500 researchers from diverse fields of study with a variety of attitudes, experiences, and capacities for data sharing. The success of this solution is apparent through new partnerships to share data generated by RESTORE Act Centers of Excellence Programs, the National Academies of Sciences, and others. The GRIIDC data management system integrates dataset management planning, metadata creation, persistent identification, and data discoverability into an easy-to-use web application. No specialized software or program installations are required to support dataset submission or discovery. Furthermore, no data transformations are needed to submit data to GRIIDC; common file formats such as Excel, CSV, and text are all acceptable for submissions. To ensure data are properly documented using the GRIIDC implementation of the ISO 19115-2 metadata standard, researchers submit detailed descriptive information through a series of interactive forms, and no knowledge of metadata or XML formats is required. Once a dataset is documented and submitted, the GRIIDC team performs a review of the dataset package. This review ensures that files can be opened and contain data, and that data are completely and accurately described. This review does not include performing quality assurance or control of data points, as GRIIDC expects scientists to perform these steps during the course of their work.
Once approved, data are made public and searchable through the GRIIDC data discovery portal and the DataONE network.

  16. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    PubMed

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises the addition of chemical structures as well as the selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse, which simplifies the setup of QSAR datasets and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates the addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply.
This makes is easy to join, extend, combine datasets and hence work collectively, but also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible for the community.

  17. Complexation-Based Detection of Nickel(II) at a Graphene-Chelate Probe in the Presence of Cobalt and Zinc by Adsorptive Stripping Voltammetry

    PubMed Central

    Pokpas, Keagan; Jahed, Nazeem; Baker, Priscilla G.

    2017-01-01

    The adsorptive stripping voltammetric detection of nickel and cobalt in water samples at metal film electrodes has been extensively studied. In this work, a novel, environmentally friendly, metal-free electrochemical probe was constructed for the ultra-trace determination of Ni2+ in water samples by Adsorptive Cathodic Stripping Voltammetry (AdCSV). The electrochemical platform is based on the adsorptive accumulation of Ni2+ ions directly onto a glassy carbon electrode (GCE) modified with dimethylglyoxime (DMG) as chelating agent and a Nafion-graphene (NGr) nanocomposite to enhance electrode sensitivity. The Nafion-graphene dimethylglyoxime modified glassy carbon electrode (NGr-DMG-GCE) shows superior detection capabilities as a result of the improved surface-area-to-volume ratio and enhanced electron transfer kinetics following the incorporation of single-layer graphene, while limiting the toxic effects of the sensor by removal of the more common mercury, bismuth and lead films. Furthermore, for the first time, the NGr-DMG-GCE demonstrates good selectivity and preferential binding towards the detection of Ni2+ in water samples in the presence of the common interfering metal ions Co2+ and Zn2+. Structural and morphological characterisation of the synthesised single-layer graphene sheets was conducted by Raman spectrometry, HRTEM and HRSEM analysis. The instrumental parameters associated with the electrochemical response, including accumulation potential and accumulation time, were investigated and optimised, in addition to the influence of DMG and graphene concentrations. The NGr-DMG-GCE demonstrated well-resolved, reproducible peaks, with RSD (%) below 5% and a detection limit of 1.5 µg L−1 for Ni2+ reduction at an accumulation time of 120 s. The prepared electrochemical sensor exhibited good detection and quantitation of Ni2+ in tap water samples, well below the 0.1 mg L−1 limit set by WHO and EPA standards. 
This is comparable to the South African drinking water guidelines of 0.15 mg L−1. PMID:28757588

  18. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    PubMed Central

    2010-01-01

    Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises the addition of chemical structures as well as the selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies the setup of QSAR datasets and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates the addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence work collectively, but it also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible for the community. PMID:20591161
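    The plain-CSV export path mentioned above amounts to writing a descriptor matrix keyed by compound ID. A minimal sketch under assumed column names (this is an illustration, not the Bioclipse implementation):

```python
# Illustrative sketch of a plain-CSV descriptor-matrix export; the column
# layout and function name are assumptions, not the Bioclipse API.
import csv
import io

def export_descriptor_matrix(compounds, descriptor_names):
    """compounds: list of (compound_id, {descriptor_name: value}) pairs."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["compound_id"] + descriptor_names)
    for cid, values in compounds:
        # Missing descriptor values are left blank rather than invented.
        writer.writerow([cid] + [values.get(d, "") for d in descriptor_names])
    return buf.getvalue()
```

    A QSAR-ML export would additionally record the descriptor ontology identifiers and implementation versions, which is what makes the dataset setup reproducible.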

  19. Performance Analysis of Automatic Dependent Surveillance-Broadcast (ADS-B) and Breakdown of Anomalies

    NASA Astrophysics Data System (ADS)

    Tabassum, Asma

    This thesis work analyzes the performance of Automatic Dependent Surveillance-Broadcast (ADS-B) data received from Grand Forks International Airport, detects anomalies in the data and quantifies the associated potential risk. This work also assesses the severity associated with anomalous data in Detect and Avoid (DAA) operations for Unmanned Aircraft Systems (UAS). The received data were raw and archived in GDL-90 format. A Python module is developed to parse the raw data into readable data in a .csv file. The anomaly detection algorithm is based on the Federal Aviation Administration's (FAA) ADS-B performance assessment report. An extensive study is carried out on two main types of anomalies, namely dropouts and altitude deviations. A dropout is recorded when the update interval exceeds three seconds. Dropouts are of different durations and carry different levels of risk depending on how long ADS-B is unavailable as the surveillance system. Altitude deviation refers to the deviation between barometric and geometric altitude. Deviations ranging from 25 feet to 600 feet have been observed. As of now, barometric altitude has been used for separation and surveillance, while geometric altitude can be used in cases where barometric altitude is not available. Many UAS might not have both sensors installed on board due to size and weight constraints. There might be a chance of misinterpretation of vertical separation, especially while flying in the National Airspace System (NAS), if the ownship UAS and an intruding manned aircraft use two different altitude sources for the separation standard. The characteristics of and agreement between the two altitudes are investigated with a regression-based approach. Multiple risk matrices are established based on the severity of DAA well-clear violations. ADS-B is called the backbone of the FAA's Next Generation Air Transportation System (NextGen). NextGen is the series of inter-linked programs, systems, and policies that implement advanced technologies and capabilities. 
    ADS-B utilizes satellite-based Global Positioning System (GPS) technology to provide the pilot and Air Traffic Control (ATC) with more information, enabling efficient navigation of aircraft in increasingly congested airspace. The FAA has mandated that all aircraft, both manned and unmanned, be equipped with ADS-B Out by the year 2020 to fly within most controlled airspace. As a fundamental component of NextGen, it is crucial to understand the behavior of and potential risk associated with ADS-B systems.
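    The dropout rule described above, an update gap longer than three seconds, can be sketched in a few lines. The duration-based risk bins below are illustrative assumptions, not the thesis's actual classification:

```python
# Flag ADS-B dropouts: any gap between consecutive position updates that
# exceeds the three-second threshold. The risk bins are assumed examples.

def find_dropouts(timestamps, threshold=3.0):
    """Return (start_time, gap_seconds) pairs for gaps above the threshold."""
    dropouts = []
    for prev, curr in zip(timestamps, timestamps[1:]):
        gap = curr - prev
        if gap > threshold:
            dropouts.append((prev, gap))
    return dropouts

def classify(gap_seconds):
    """Map a dropout duration to a coarse, hypothetical risk level."""
    if gap_seconds <= 10:
        return "low"
    if gap_seconds <= 30:
        return "medium"
    return "high"
```

    For update times [0.0, 1.0, 2.0, 7.5, 8.5, 45.0] this reports a 5.5 s and a 36.5 s dropout, which the sketch bins as low and high risk respectively.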

  20. A cloud based brokering framework to support hydrology at global scale

    NASA Astrophysics Data System (ADS)

    Boldrini, E.; Pecora, S.; Bordini, F.; Nativi, S.

    2016-12-01

    This work presents the hydrology broker designed and deployed in the context of a collaboration between the Regional Agency for Environmental Protection in the Italian region Emilia-Romagna (ARPA-ER) and CNR-IIA (National Research Council of Italy). The hydrology brokering platform eases the task of discovering and accessing hydrological observation data, usually acquired and made available by national agencies by means of a set of heterogeneous services (e.g. CUAHSI HIS servers, OGC services, FTP servers) and formats (e.g. WaterML, O&M, ...). The hydrology broker makes all the already published data available according to one or more of the desired and well-known discovery protocols, access protocols, and formats. As a result, the user is able to search and access the available hydrological data through their preferred client (e.g. CUAHSI HydroDesktop, 52North SWE client). It is also easy to build a hydrological web portal on top of the broker, using the user-friendly JavaScript API. The hydrology broker has been deployed on the Amazon cloud to ensure scalability and tested in the context of the work of the Commission for Hydrology of WMO on three different scenarios: the La Plata river basin, the Sava river basin and the Arctic-HYCOS project. In each scenario the hydrology broker discovered and accessed heterogeneous data formats (e.g. WaterML 1.0/2.0, proprietary CSV documents) from the heterogeneous services (e.g. CUAHSI HIS servers, FTP service and agency proprietary services) managed by several national agencies and international commissions. The hydrology broker made it possible to present all the available data uniformly through the user's desired service type and format (e.g. an HIS server publishing WaterML 2.0), greatly improving both system interoperability and data exchange. 
    Interoperability tests were also successfully conducted with WMO Information System (WIS) nodes, making it possible for a specific Global Information System Centre (GISC) to gather the available hydrological records as ISO 19115:2007 metadata documents through the OAI-PMH interface exposed by the broker. The framework's flexibility also makes it easy to add other sources, as well as additional published interfaces, in order to cope with future standard requirements of the hydrological community.

  1. OOSTethys - Open Source Software for the Global Earth Observing Systems of Systems

    NASA Astrophysics Data System (ADS)

    Bridger, E.; Bermudez, L. E.; Maskey, M.; Rueda, C.; Babin, B. L.; Blair, R.

    2009-12-01

    An open source software project is much more than just picking the right license, hosting modular code and providing effective documentation. Success in advancing in an open, collaborative way requires that the process match the expected code functionality to the developers' personal expertise and organizational needs, as well as having an enthusiastic and responsive core lead group. We will present the lessons learned from OOSTethys, which is a community of software developers and marine scientists who develop open source tools, in multiple languages, to integrate ocean observing systems into an Integrated Ocean Observing System (IOOS). OOSTethys' goal is to dramatically reduce the time it takes to install, adopt and update standards-compliant web services. OOSTethys has developed servers, clients and a registry. Open source Perl, Python, Java and ASP toolkits and reference implementations are helping the marine community publish near real-time observation data in interoperable standard formats. In some cases, publishing an Open Geospatial Consortium (OGC) Sensor Observation Service (SOS) from NetCDF files, a database or even CSV text files could take only minutes, depending on the skills of the developer. OOSTethys is also developing an OGC standard registry, Catalogue Service for the Web (CSW). This open source CSW registry was implemented to easily register and discover SOSs using ISO 19139 service metadata. A web interface layer over the CSW registry simplifies the registration process by harvesting metadata describing the observations and sensors from the “GetCapabilities” response of the SOS. OpenIOOS is the web client, developed in Perl to visualize the sensors in the SOS services. 
    While the number of OOSTethys software developers is small, currently about 10 around the world, the number of OOSTethys toolkit implementers is larger and growing, and the ease of use has played a large role in spreading interoperable, standards-compliant web services widely in the marine community.

  2. Analysis of Stage and Clinical/Prognostic Factors for Lung Cancer from SEER Registries: AJCC Staging and Collaborative Stage Data Collection System

    PubMed Central

    Chen, Vivien W.; Ruiz, Bernardo A.; Hsieh, Mei-Chin; Wu, Xiao-Cheng; Ries, Lynn; Lewis, Denise R.

    2014-01-01

    Introduction The American Joint Committee on Cancer (AJCC) 7th edition introduced major changes in the staging of lung cancer, including the Tumor (T), Node (N), Metastasis (M) (TNM) system and new stage/prognostic site-specific factors (SSFs), collected under the Collaborative Stage Version 2 (CSv2) Data Collection System. The intent was to improve the stage precision, which could guide treatment options and ultimately lead to better survival. This report examines stage trends, the change in stage distributions from the AJCC 6th to the 7th edition, and findings of the prognostic SSFs for 2010 lung cancer cases. Methods Data were from the November 2012 submission of 18 Surveillance, Epidemiology, and End Results (SEER) Program population-based registries. A total of 344,797 cases of lung cancer, diagnosed in 2004–2010, were analyzed. Results The percentages of small tumors and early stage lung cancer cases increased from 2004 to 2010. The AJCC 7th edition, implemented for the 2010 diagnosis year, subclassified tumor size and reclassified multiple tumor nodules, pleural effusions, and involvement of tumors in the contralateral lung, resulting in a slight decrease in stage IB and stage IIIB and a small increase in stage IIA and stage IV. Overall, about 80% of cases remained in the same stage group in the AJCC 6th and 7th editions. About 21% of lung cancer patients had separate tumor nodules in the ipsilateral (same) lung, and 23% of the surgically resected patients had visceral pleural invasion, both adverse prognostic factors. Conclusion It is feasible for high quality population-based registries such as the SEER Program to collect more refined staging and prognostic SSFs that allow better categorization of lung cancer patients with different clinical outcomes and assessment of their survival. PMID:25412390

  3. Correlation between observation task performance and visual acuity, contrast sensitivity and environmental light in a simulated maritime study.

    PubMed

    Koefoed, Vilhelm F; Assmuss, Jörg; Høvding, Gunnar

    2018-03-25

    To examine the relevance of visual acuity (VA) and index of contrast sensitivity (ICS) as predictors of visual observation task performance in a maritime environment. Sixty naval cadets were recruited for a study on observation tasks in a simulated maritime environment under three different light settings. Their ICS was computed from contrast sensitivity (CS) data recorded with the Optec 6500 and CSV-1000E CS tests. The correlation between object identification distance and VA/ICS was examined by stepwise linear regression. The object detection distance was significantly correlated with the level of environmental light (p < 0.001), but not with the VA or ICS recorded in the test subjects. Female cadets had a significantly shorter target identification range than the male cadets. Neither CS nor VA was found to be significantly correlated with observation task performance. This apparent absence of proven predictive value of visual parameters for observation tasks in a maritime environment may presumably be ascribed to the normal and uniform visual capacity of all our study subjects. © 2018 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  4. bioWeb3D: an online webGL 3D data visualisation tool

    PubMed Central

    2013-01-01

    Background Data visualization is critical for interpreting biological data. However, in practice it can prove to be a bottleneck for untrained researchers; this is especially true for three-dimensional (3D) data representation. Whilst existing software can provide all the necessary functionality to represent and manipulate biological 3D datasets, very few tools are easily accessible (browser-based), cross-platform and usable by non-expert users. Results An online HTML5/WebGL-based 3D visualisation tool has been developed to allow biologists to quickly and easily view interactive and customizable three-dimensional representations of their data along with multiple layers of information. Using the WebGL library Three.js, written in JavaScript, bioWeb3D allows the simultaneous visualisation of multiple large datasets input via simple JSON, XML or CSV files, which can be read and analysed locally thanks to HTML5 capabilities. Conclusions Using basic 3D representation techniques in a technologically innovative context, we provide a program that is not intended to compete with professional 3D representation software, but that instead enables quick and intuitive representation of reasonably large 3D datasets. PMID:23758781
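    The CSV input path above implies a small conversion from tabular points to the JSON structure a WebGL scene can consume. A hedged sketch, assuming hypothetical x/y/z column names (bioWeb3D's actual schema may differ):

```python
# Parse a CSV point cloud (columns x, y, z - assumed names) into a JSON
# payload of the general shape a browser-based 3D viewer could load.
import csv
import io
import json

def csv_points_to_json(csv_text, name="dataset"):
    reader = csv.DictReader(io.StringIO(csv_text))
    points = [[float(r["x"]), float(r["y"]), float(r["z"])] for r in reader]
    return json.dumps({"name": name, "points": points})
```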

  5. PrimerZ: streamlined primer design for promoters, exons and human SNPs.

    PubMed

    Tsai, Ming-Fang; Lin, Yi-Jung; Cheng, Yu-Chang; Lee, Kuo-Hsi; Huang, Cheng-Chih; Chen, Yuan-Tsong; Yao, Adam

    2007-07-01

    PrimerZ (http://genepipe.ngc.sinica.edu.tw/primerz/) is a web application dedicated primarily to primer design for genes and human SNPs. PrimerZ accepts genes by gene name or Ensembl accession code, and SNPs by dbSNP rs or AFFY_Probe IDs. The promoter and exon sequence information of all gene transcripts fetched from the Ensembl database (http://www.ensembl.org) is processed before being passed on to Primer3 (http://frodo.wi.mit.edu/cgi-bin/primer3/primer3_www.cgi) for individual primer design. All results returned from Primer3 are organized and integrated in a specially designed web page for easy browsing. Besides the web page presentation, CSV text file export is also provided for enhanced user convenience. PrimerZ automates highly standard but tedious gene primer design to improve the success rate of PCR experiments. More than 2000 primers have been designed with PrimerZ at our institute since 2004, and the success rate is over 70%. The addition of several new features has made PrimerZ even more useful to the research community in facilitating primer design for promoters, exons and SNPs.

  6. Low-cost, email-based system for self blood pressure monitoring at home.

    PubMed

    Nakajima, Kazuki; Nambu, Masayuki; Kiryu, Tohru; Tamura, Toshiyo; Sasaki, Kazuo

    2006-01-01

    We have developed a low-cost monitoring system, which allows subjects to send blood pressure (BP) data obtained at home to health-care professionals by email. The system consists of a wrist BP monitor and a personal computer (PC) with an Internet connection. The wrist BP monitor includes an advanced positioning sensor to verify that the wrist is placed properly at heart level. Subjects at home can self-measure their BP every day, automatically transfer the BP data to their PC each week, and then send a comma-separated values (CSV) file to their health-care professional by email. In a feasibility study, 10 subjects used the system for a mean period of 207 days (SD 149). The mean percent achievement of measurement in the 10 subjects was 84% (SD 12). There was a seasonal variation in systolic and diastolic BP, which was inversely correlated with temperature. Eight of the 10 subjects evaluated the system favourably. The results of the present study demonstrate the feasibility of our email-based system for self-monitoring of blood pressure. Its low cost means that it may have widespread application in future home telecare studies.
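    The weekly comma-separated values (CSV) file described above can be produced in a few lines; the column names here are assumptions for illustration, not the system's actual schema:

```python
# Hypothetical sketch of the weekly CSV export the system emails to a
# health-care professional; column names are assumed, not the real schema.
import csv
import io

def readings_to_csv(readings):
    """Serialize (date, systolic, diastolic, pulse) tuples to CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["date", "systolic_mmHg", "diastolic_mmHg", "pulse_bpm"])
    writer.writerows(readings)
    return buf.getvalue()
```

    The resulting text can then be attached to an email with the standard library's email and smtplib modules.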

  7. Detection of circulating tumor cells from cryopreserved human sarcoma peripheral blood mononuclear cells.

    PubMed

    Li, Heming; Meng, Qing H; Noh, Hyangsoon; Batth, Izhar Singh; Somaiah, Neeta; Torres, Keila E; Xia, Xueqing; Wang, Ruoyu; Li, Shulin

    2017-09-10

    Circulating tumor cells (CTCs) enter the vasculature or lymphatic system after shedding from the primary tumor. CTCs may serve as "seed" cells for tumor metastasis. The utility of CTCs in clinical applications for sarcoma is not fully investigated, partly owing to the necessity for fresh blood samples and the lack of a CTC-specific antibody. To overcome these drawbacks, we developed a technique for sarcoma CTC capture and detection using cryopreserved peripheral blood mononuclear cells (PBMCs) and our proprietary cell-surface vimentin (CSV) antibody 84-1, which is specific to tumor cells. This technique was validated by a sarcoma cell spiking assay, matched CTC comparison between fresh and cryopreserved PBMCs, and independent tumor markers in multiple types of sarcoma patient blood samples. The reproducibility was maximized when cryopreserved PBMCs were prepared from fresh blood samples within 2 h of the blood draw. In summary, as far as we are aware, ours is the first report to capture and detect CTCs from cryopreserved PBMCs. Further validation in other tumor types may help boost the feasibility and utility of CTC-based diagnosis in a centralized laboratory. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Sequence Polishing Library (SPL) v10.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oberortner, Ernst

    The Sequence Polishing Library (SPL) is a suite of software tools for automating "Design for Synthesis and Assembly" workflows. Specifically: the SPL "Converter" tool converts files among the following sequence data exchange formats: CSV, FASTA, GenBank, and Synthetic Biology Open Language (SBOL). The SPL "Juggler" tool optimizes the codon usage of DNA coding sequences according to an optimization strategy, a user-specified codon usage table and genetic code; in addition, the "Juggler" can translate amino acid sequences into DNA sequences. The SPL "Polisher" verifies DNA sequences against DNA synthesis constraints, such as GC content, repeating k-mers, and restriction sites. In case of violations, the "Polisher" reports the violations in a comprehensive manner; it can also modify the violating regions according to an optimization strategy, a user-specified codon usage table and genetic code. The SPL "Partitioner" decomposes large DNA sequences into smaller building blocks with partial overlaps that enable efficient assembly. The "Partitioner" enables the user to configure the characteristics of the overlaps, which are mostly determined by the assembly protocol used, such as length, GC content, or melting temperature.
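    Two of the "Polisher" checks named above, GC content and repeating k-mers, can be sketched as follows; the thresholds and function names are assumptions for illustration, not SPL's API:

```python
# Illustrative sketch of synthesis-constraint checks in the spirit of the
# SPL "Polisher"; thresholds, k, and function names are assumptions.

def gc_content(seq):
    """Fraction of G and C bases in the sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def repeated_kmers(seq, k=8):
    """Return the set of k-mers occurring more than once."""
    seen, repeats = set(), set()
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        if kmer in seen:
            repeats.add(kmer)
        seen.add(kmer)
    return repeats

def check_constraints(seq, gc_min=0.25, gc_max=0.65, k=8):
    """Report (but do not fix) violations, as the "Polisher" does."""
    violations = []
    gc = gc_content(seq)
    if not gc_min <= gc <= gc_max:
        violations.append(f"GC content {gc:.2f} outside [{gc_min}, {gc_max}]")
    for kmer in sorted(repeated_kmers(seq, k)):
        violations.append(f"repeated {k}-mer: {kmer}")
    return violations
```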

  9. Automatic EEG spike detection.

    PubMed

    Harner, Richard

    2009-10-01

    Since the 1970s, advances in science and technology during each succeeding decade have renewed the expectation of efficient, reliable automatic epileptiform spike detection (AESD). But even when reinforced with better, faster tools, clinically reliable unsupervised spike detection remains beyond our reach. Expert-selected spike parameters were the first and are still the most widely used for AESD. Thresholds for amplitude, duration, sharpness, rise-time, fall-time, after-coming slow waves, background frequency, and more have been used. It is still unclear which of these wave parameters are essential, beyond peak-to-peak amplitude and duration. Wavelet parameters are very appropriate to AESD but need to be combined with other parameters to achieve desired levels of spike detection efficiency. Artificial Neural Network (ANN) and expert-system methods may have reached peak efficiency. Support Vector Machine (SVM) technology focuses on outliers rather than centroids of spike and nonspike data clusters and should improve AESD efficiency. An exemplary spike/nonspike database is suggested as a tool for assessing parameters and methods for AESD and is available in CSV or Matlab formats from the author at brainvue@gmail.com. Exploratory Data Analysis (EDA) is presented as a graphic method for finding better spike parameters and for the step-wise evaluation of the spike detection process.
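    A minimal amplitude-and-duration threshold detector in the spirit described above can be sketched as follows; this is a generic illustration with assumed parameter choices, not the reference method of the database mentioned:

```python
# Generic threshold-based spike candidate detector: a contiguous run of
# samples exceeding an amplitude threshold, whose duration falls within a
# physiologically plausible window. All parameter values are assumptions.

def detect_spikes(signal, fs, amp_thresh, min_dur, max_dur):
    """Return (start, end) sample indices of candidate spikes.

    A candidate is a run of samples with |x| > amp_thresh whose duration
    in seconds lies between min_dur and max_dur. A run still open at the
    end of the signal is ignored in this sketch.
    """
    spikes, start = [], None
    for i, x in enumerate(signal):
        if abs(x) > amp_thresh:
            if start is None:
                start = i
        elif start is not None:
            dur = (i - start) / fs
            if min_dur <= dur <= max_dur:
                spikes.append((start, i))
            start = None
    return spikes
```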

  10. NASA Tech Briefs, June 2009

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Topics covered include: Device for Measuring Low Flow Speed in a Duct, Measuring Thermal Conductivity of a Small Insulation Sample, Alignment Jig for the Precise Measurement of THz Radiation, Autoignition Chamber for Remote Testing of Pyrotechnic Devices, Microwave Power Combiners for Signals of Arbitrary Amplitude, Synthetic Foveal Imaging Technology, Airborne Antenna System for Minimum-Cycle-Slip GPS Reception, Improved Starting Materials for Back-Illuminated Imagers, Multi-Modulator for Bandwidth-Efficient Communication, Some Improvements in Utilization of Flash Memory Devices, GPS/MEMS IMU/Microprocessor Board for Navigation, T/R Multi-Chip MMIC Modules for 150 GHz, Pneumatic Haptic Interfaces, Device Acquires and Retains Rock or Ice Samples, Cryogenic Feedthrough Test Rig, Improved Assembly for Gas Shielding During Welding or Brazing, Two-Step Plasma Process for Cleaning Indium Bonding Bumps, Tool for Crimping Flexible Circuit Leads, Yb14MnSb11 as a High-Efficiency Thermoelectric Material, Polyimide-Foam/Aerogel Composites for Thermal Insulation, Converting CSV Files to RKSML Files, Service Management Database for DSN Equipment, Chemochromic Hydrogen Leak Detectors, Compatibility of Segments of Thermoelectric Generators, Complementary Barrier Infrared Detector, JPL Greenland Moulin Exploration Probe, Ultra-Lightweight Self-Deployable Nanocomposite Structure for Habitat Applications, and Room-Temperature Ionic Liquids for Electrochemical Capacitors.

  11. The distribution and stabilisation of dissolved Fe in deep-sea hydrothermal plumes

    NASA Astrophysics Data System (ADS)

    Bennett, Sarah A.; Achterberg, Eric P.; Connelly, Douglas P.; Statham, Peter J.; Fones, Gary R.; German, Christopher R.

    2008-06-01

    We have conducted a study of hydrothermal plumes overlying the Mid-Atlantic Ridge near 5° S to investigate whether there is a significant export flux of dissolved Fe from hydrothermal venting to the oceans. Our study combined measurements of plume-height Fe concentrations from a series of 6 CTD stations together with studies of dissolved Fe speciation in a subset of those samples. At 2.5 km down plume from the nearest known vent site, dissolved Fe concentrations were ~20 nM. This is much higher than would be predicted from a combination of plume dilution and dissolved Fe(II) oxidation rates, but consistent with stabilisation due to the presence of organic Fe complexes and Fe colloids. Using Competitive Ligand Exchange-Cathodic Stripping Voltammetry (CLE-CSV), stabilised dissolved Fe complexes were detected within the dissolved Fe fraction on the edges of one non-buoyant hydrothermal plume, with observed ligand concentrations high enough to account for stabilisation of ~4% of the total Fe emitted from the 5° S vent sites. If these results were representative of all hydrothermal systems, submarine venting could provide 12-22% of the global deep-ocean dissolved Fe budget.

  12. 76 FR 4365 - Renewal of the Trinity River Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... Management Working Group AGENCY: Office of the Secretary, Interior. ACTION: Notice. SUMMARY: The Secretary of... Trinity River Adaptive Management Working Group (Working Group) for 2 years. The Working Group provides... Road, Arcata, CA 95521; 707-822-7201. SUPPLEMENTARY INFORMATION: The Working Group conducts its...

  13. 78 FR 5830 - Renewal of the Trinity River Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-28

    ... Management Working Group AGENCY: Office of the Secretary, Interior. ACTION: Notice. SUMMARY: The Secretary of... Trinity River Adaptive Management Working Group (Working Group) for 2 years. The Working Group provides... Road, Arcata, CA 95521; 707-822-7201. SUPPLEMENTARY INFORMATION: The Working Group conducts its...

  14. Work-family conflict in work groups: social information processing, support, and demographic dissimilarity.

    PubMed

    Bhave, Devasheesh P; Kramer, Amit; Glomb, Theresa M

    2010-01-01

    We used social information processing theory to examine the effect of work-family conflict (WFC) at the work group level on individuals' experience of WFC. Consistent with hypotheses, results suggest that WFC at the work group level influences individual WFC over and above the shared work environment and job demands. It was also observed that work group support and demographic dissimilarity moderate this relationship. Moderator analyses suggest that work group social support buffers WFC for individuals but is also associated with a stronger effect of work group WFC on individuals' WFC. Moreover, the work group effect on individuals' WFC was shown to be stronger for individuals who were demographically dissimilar to the work group in terms of sex and number of dependents. The interpretations and implications of these findings are discussed. Copyright 2009 APA, all rights reserved.

  15. [Impact of work-related musculoskeletal disorders on work ability among workers].

    PubMed

    Zhang, Lei; Huang, Chunping; Lan, Yajia; Wang, Mianzhen; Shu, Liping; Zhang, Wenhui; Yu, Long; Yao, Shengcai; Liao, Yunhua

    2015-04-01

    To assess the impact of work-related musculoskeletal disorders (WRMDs) on work ability among workers. A total of 1686 workers in various occupations, such as administration and education, were enrolled as subjects using the random cluster sampling method. The WRMDs and work ability of all subjects were evaluated using the standardized Nordic questionnaires for the analysis of musculoskeletal symptoms and the Work Ability Index (WAI) scale, respectively. Comparison of work ability and its classification between the disease group and the non-disease group was performed by paired t test, R×C table χ2 test, and the Wilcoxon rank-sum test. The relationship between work duration and work ability was analyzed by the Spearman correlation test and a multi-level model. (1) The work ability of workers in the disease group was significantly lower than that in the non-disease group (P<0.01). (2) There were significant differences in work ability between workers with different work durations (<10 years, 10-20 years, and ≥20 years) (F=22.124, P<0.01). With the increase in work duration, the work ability of workers declined in both groups, and the work ability of workers in the disease group (Spearman coefficient rs=-0.172, P<0.01) had a more significant decline than that in the non-disease group (Spearman coefficient rs=-0.104, P<0.01). WRMDs were important risk factors for the decrease in work ability among workers. (3) There were significant differences in constituent ratios and levels of work ability classification between the disease group and the non-disease group (χ2=121.097, P<0.01; Z=-10.699, P<0.01). The proportions of workers with poor and medium work ability in the disease group were significantly higher than those in the non-disease group, while the proportion of workers with excellent work ability in the disease group was significantly lower than that in the non-disease group. Similar patterns in constituent ratios and levels of work ability classification were found between the disease group and the non-disease group across occupations (P<0.01). WRMDs have a harmful effect on the work ability of workers, and the work ability of workers declines substantially with increasing exposure time (work duration).
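    The Spearman rank correlation used in the analysis above can be computed without a statistics package; the following is a generic textbook sketch, not the study's own code:

```python
# Spearman's rank correlation: Pearson correlation computed on ranks,
# with tied values receiving the average of their positions.

def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # Group ties and assign them the average rank of the group.
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg_rank
        i = j + 1
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

    A perfectly monotone increasing pairing gives rs = 1 and a reversed one gives rs = -1; the negative coefficients reported above indicate work ability falling as work duration rises.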

  16. Group work as an incentive for learning – students’ experiences of group work

    PubMed Central

    Hammar Chiriac, Eva

    2014-01-01

    Group work is used as a means for learning at all levels in educational systems. There is strong scientific support for the benefits of having students learn and work in groups. Nevertheless, studies of what occurs in groups during group work, and of which factors actually influence students' ability to learn, are still lacking. Similarly, the question of why some group work is successful while other group work results in the opposite remains unsolved. The aim of this article is to add to the current level of knowledge and understanding regarding the essence behind successful group work in higher education. This research focuses on students' experiences of group work and learning in groups, an aspect almost absent from research on group work prior to the beginning of the 21st century. A primary aim is to give university students a voice in the matter by elucidating their positive and negative points of view and how they assess learning when working in groups. Furthermore, the students' explanations of why some group work ends up being a positive experience resulting in successful learning, while in other cases the result is the reverse, are of interest. Data were collected through a study-specific questionnaire with multiple-choice and open-ended questions. The questionnaires were distributed to students in different study programs at two universities in Sweden. The present results are based on a reanalysis in which qualitative analysis formed a key part of the study. The results indicate that most of the students' experiences involved group work that facilitated learning, especially in the area of academic knowledge. Three important prerequisites (learning, study-social function, and organization) for group work that serves as an effective pedagogy and as an incentive for learning were identified and discussed.
All three abstractions can facilitate or hamper students' learning, as well as shape their experiences of group work. PMID:24926282

  17. 75 FR 34476 - Glen Canyon Dam Adaptive Management Work Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-17

    ... DEPARTMENT OF THE INTERIOR Bureau of Reclamation Glen Canyon Dam Adaptive Management Work Group... Management Work Group. The purpose of the Adaptive Management Work Group is to advise and to provide... of the Glen Canyon Dam Adaptive Management Work Group is in the public interest in connection with...

  18. A Task Group Practitioner's Response to Waldo and Bauman's Article on Regrouping the Categorization of Group Work.

    ERIC Educational Resources Information Center

    Keel, Linda P.

    1998-01-01

    Argues that Waldo and Bauman's Goals and Process (GAP) matrix does not include task/work groups. Claims that it is not in the best interest of group work to undo or rework the Association for Specialists in Group Work's four core groups as a model. States that the field of group work needs a commonly shared framework/categorization from which to…

  19. Has Group Work Education Lost Its Social Group Work Essence? A Content Analysis of MSW Course Syllabi in Search of Mutual Aid and Group Conflict Content

    ERIC Educational Resources Information Center

    Sweifach, Jay Stephen

    2015-01-01

    This article presents the results of a content analysis of MSW group work course syllabi in an effort to better understand the extent to which mutual aid and group conflict, two important dimensions of social group work, are included and featured as prominent elements in MSW-level group work instruction.

  20. Theoretical Issues in Clinical Social Group Work.

    ERIC Educational Resources Information Center

    Randall, Elizabeth; Wodarski, John S.

    1989-01-01

    Reviews relevant issues in clinical social group practice including group versus individual treatment, group work advantages, approach rationale, group conditions for change, worker role in group, group composition, group practice technique and method, time as group work dimension, pretherapy training, group therapy precautions, and group work…

  1. Multicultural Group Work: A Force for Developing and Healing

    ERIC Educational Resources Information Center

    Anderson, Donald

    2007-01-01

    Multicultural group work represents a powerful tool for helping and healing in the context of human diversity. This article summarizes multicultural group work, including task, psychoeducational, counseling, and psychotherapy groups, and describes a group work model for multicultural assessment, diagnosis, and treatment planning. Group work…

  2. Group Work Publication-1991.

    ERIC Educational Resources Information Center

    Zimpfer, David G.

    1992-01-01

    Lists 21 new publications in group work, of which 9 are reviewed. Those discussed include publications on group counseling and psychotherapy, structured groups, support groups, psychodrama, and social group work. (Author/NB)

  3. Catalogue of Exoplanets in Multiple-Star-Systems

    NASA Astrophysics Data System (ADS)

    Schwarz, Richard; Funk, Barbara; Bazsó, Ákos; Pilat-Lohinger, Elke

    2017-07-01

    Cataloguing the data of exoplanetary systems is becoming more and more important, because such catalogues consolidate the observations and support theoretical studies. Since 1995 there has been a database listing most of the known exoplanets (The Extrasolar Planets Encyclopaedia, available at http://exoplanet.eu/ and described in Schneider et al. 2011). With the growing number of exoplanets detected in binary and multiple star systems, it became more important to mark them and separate them into a new database. We therefore started to compile a catalogue for binary and multiple star systems. Since 2013 the catalogue can be found at http://www.univie.ac.at/adg/schwarz/multiple.html (a description can be found in Schwarz et al. 2016); it is updated regularly and is linked to the Extrasolar Planets Encyclopaedia. The data of the binary catalogue can be downloaded as a file (.csv) and used for statistical purposes. Our database is divided into two parts: the data of the stars and of the planets, given in separate lists. Every column of the list can be sorted in two directions: ascending, from the lowest value to the highest, or descending. In addition, an introduction and help are given in the menu bar of the catalogue, including an example list.
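The column-sorting behaviour described above, applied to the downloaded .csv file, can be sketched with the standard library. The column names below are hypothetical, not the catalogue's actual schema.

```python
# Sketch: sort rows of a downloaded catalogue CSV by one column,
# ascending or descending. Column names and values are invented.
import csv, io

SAMPLE = """name,separation_au,planets
HD 41004,20,1
Kepler-444,66,5
55 Cnc,1065,5
"""

def sort_rows(text, column, descending=False):
    """Parse CSV text and return rows sorted numerically on `column`."""
    rows = list(csv.DictReader(io.StringIO(text)))
    return sorted(rows, key=lambda r: float(r[column]), reverse=descending)

asc = sort_rows(SAMPLE, "separation_au")
desc = sort_rows(SAMPLE, "separation_au", descending=True)
print([r["name"] for r in asc])  # ['HD 41004', 'Kepler-444', '55 Cnc']
```

Sorting on a different column (e.g. `planets`) works the same way, which is essentially what the two-direction sort in the web interface provides.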

  4. AgeFactDB--the JenAge Ageing Factor Database--towards data integration in ageing research.

    PubMed

    Hühne, Rolf; Thalheim, Torsten; Sühnel, Jürgen

    2014-01-01

    AgeFactDB (http://agefactdb.jenage.de) is a database aimed at the collection and integration of ageing phenotype data including lifespan information. Ageing factors are considered to be genes, chemical compounds or other factors such as dietary restriction, whose action results in a changed lifespan or another ageing phenotype. Any information related to the effects of ageing factors is called an observation and is presented on observation pages. To provide concise access to the complete information for a particular ageing factor, corresponding observations are also summarized on ageing factor pages. In a first step, ageing-related data were primarily taken from existing databases such as the Ageing Gene Database--GenAge, the Lifespan Observations Database and the Dietary Restriction Gene Database--GenDR. In addition, we have started to include new ageing-related information. Based on homology data taken from the HomoloGene Database, AgeFactDB also provides observation and ageing factor pages of genes that are homologous to known ageing-related genes. These homologues are considered as candidate or putative ageing-related genes. AgeFactDB offers a variety of search and browse options, and also allows the download of ageing factor or observation lists in TSV, CSV and XML formats.

  5. AgeFactDB—the JenAge Ageing Factor Database—towards data integration in ageing research

    PubMed Central

    Hühne, Rolf; Thalheim, Torsten; Sühnel, Jürgen

    2014-01-01

    AgeFactDB (http://agefactdb.jenage.de) is a database aimed at the collection and integration of ageing phenotype data including lifespan information. Ageing factors are considered to be genes, chemical compounds or other factors such as dietary restriction, whose action results in a changed lifespan or another ageing phenotype. Any information related to the effects of ageing factors is called an observation and is presented on observation pages. To provide concise access to the complete information for a particular ageing factor, corresponding observations are also summarized on ageing factor pages. In a first step, ageing-related data were primarily taken from existing databases such as the Ageing Gene Database—GenAge, the Lifespan Observations Database and the Dietary Restriction Gene Database—GenDR. In addition, we have started to include new ageing-related information. Based on homology data taken from the HomoloGene Database, AgeFactDB also provides observation and ageing factor pages of genes that are homologous to known ageing-related genes. These homologues are considered as candidate or putative ageing-related genes. AgeFactDB offers a variety of search and browse options, and also allows the download of ageing factor or observation lists in TSV, CSV and XML formats. PMID:24217911

  6. Updates to the Virtual Atomic and Molecular Data Centre

    NASA Astrophysics Data System (ADS)

    Hill, Christian; Tennyson, Jonathan; Gordon, Iouli E.; Rothman, Laurence S.; Dubernet, Marie-Lise

    2014-06-01

    The Virtual Atomic and Molecular Data Centre (VAMDC) has established a set of standards for the storage and transmission of atomic and molecular data and an SQL-based query language (VSS2) for searching online databases, known as nodes. The project has also created an online service, the VAMDC Portal, through which all of these databases may be searched and their results compared and aggregated. Since its inception four years ago, the VAMDC e-infrastructure has grown to encompass over 40 databases, including HITRAN, in more than 20 countries, and engages actively with scientists on six continents. Associated with the portal is a growing suite of software tools for the transformation of data from its native XML-based XSAMS format to a range of more convenient human-readable (such as HTML) and machine-readable (such as CSV) formats. The relational database for HITRAN, created as part of the VAMDC project, is a flexible and extensible data model that is able to represent a wider range of parameters than the current fixed-format text-based one. Over the next year, a new online interface to this database will be tested, released and fully documented; this web application, HITRANonline, will fully replace the ageing and incomplete JavaHAWKS software suite.

  7. GlycoExtractor: a web-based interface for high throughput processing of HPLC-glycan data.

    PubMed

    Artemenko, Natalia V; Campbell, Matthew P; Rudd, Pauline M

    2010-04-05

    Recently, an automated high-throughput HPLC platform has been developed that can be used to fully sequence and quantify low concentrations of N-linked sugars released from glycoproteins, supported by an experimental database (GlycoBase) and analytical tools (autoGU). However, commercial packages that support the operation of HPLC instruments and data storage lack platforms for the extraction of large volumes of data. The lack of resources and agreed formats in glycomics is now a major limiting factor that restricts the development of bioinformatic tools and automated workflows for high-throughput HPLC data analysis. GlycoExtractor is a web-based tool that interfaces with a commercial HPLC database/software solution to facilitate the extraction of large volumes of processed glycan profile data (peak number, peak areas, and glucose unit values). The tool allows the user to export a series of sample sets to a set of file formats (XML, JSON, and CSV) rather than a collection of disconnected files. This approach not only reduces the amount of manual refinement required to export data into a suitable format for data analysis but also opens the field to new approaches for high-throughput data interpretation and storage, including biomarker discovery and validation and monitoring of online bioprocessing conditions for next generation biotherapeutics.
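The multi-format export described above (XML, JSON, and CSV for a series of sample sets) can be sketched generically with the standard library. The record fields below are invented and are not GlycoExtractor's actual schema.

```python
# Sketch: serialise a set of glycan peak records (peak number, peak area,
# glucose unit value) to the CSV and JSON formats the tool exports.
import csv, io, json

peaks = [  # hypothetical processed glycan profile data
    {"peak": 1, "area": 12.4, "gu": 5.6},
    {"peak": 2, "area": 3.1, "gu": 6.9},
]

def to_csv(records):
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(records[0]))
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

def to_json(records):
    return json.dumps(records, indent=2)

csv_text = to_csv(peaks)
json_text = to_json(peaks)
print(csv_text.splitlines()[0])  # peak,area,gu
```

Exporting a whole sample set into one structured file, rather than many disconnected files, is what makes downstream high-throughput analysis straightforward.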

  8. Improvements in the Protein Identifier Cross-Reference service.

    PubMed

    Wein, Samuel P; Côté, Richard G; Dumousseau, Marine; Reisinger, Florian; Hermjakob, Henning; Vizcaíno, Juan A

    2012-07-01

    The Protein Identifier Cross-Reference (PICR) service is a tool that allows users to map protein identifiers, protein sequences and gene identifiers across over 100 different source databases. PICR takes input through an interactive website as well as Representational State Transfer (REST) and Simple Object Access Protocol (SOAP) services. It returns the results as HTML pages or as XLS and CSV files. It has been in production since 2007 and has recently been enhanced to add new functionality and increase the number of databases it covers. Protein subsequences can be searched with the Basic Local Alignment Search Tool (BLAST) against the UniProt Knowledgebase (UniProtKB) to provide an entry point to the standard PICR mapping algorithm. In addition, gene identifiers from UniProtKB and Ensembl can now be submitted as input or mapped to as output from PICR. We have also implemented a 'best-guess' mapping algorithm for UniProt. In this article, we describe the usefulness of PICR, how these changes have been implemented, and the corresponding additions to the web services. Finally, we explain that the number of source databases covered by PICR has increased from the initial 73 to the current 102. New resources include several new species-specific Ensembl databases as well as the Ensembl Genome ones. PICR can be accessed at http://www.ebi.ac.uk/Tools/picr/.

  9. Analysing News for Stock Market Prediction

    NASA Astrophysics Data System (ADS)

    Ramalingam, V. V.; Pandian, A.; Dwivedi, shivam; Bhatt, Jigar P.

    2018-04-01

    The stock market is the aggregation of all sellers and buyers of stocks representing ownership claims on businesses. To invest in these stocks with confidence, sound knowledge of them and of their pricing, both present and future, is essential. Large amounts of data are collected and parsed to obtain this information about fluctuations in the stock market. This data can be any news or public opinion in general. Recently, many methods, especially big unstructured-data methods, have been used to predict stock market values. We introduce another method, focusing on deriving the best statistical learning model for predicting future values. The data set used is a very large unstructured data collection from an online social platform, commonly known as Quindl. The data from this platform is then linked to a CSV file and cleaned to obtain the essential information for stock market prediction. The method consists of carrying out natural language processing (NLP) of the data to make it easier for the system to understand, and then finding and identifying the correlation between this data and stock market fluctuations. The model is implemented in the Python programming language throughout the entire project for flexibility and convenience.
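The text-to-signal step described above can be illustrated with a deliberately tiny sketch. This is not the paper's model: the word lists, headlines and price moves are invented, and a real system would use proper NLP rather than a bag-of-words count.

```python
# Toy sketch of scoring news text before correlating it with price moves.
POSITIVE = {"beats", "growth", "record"}
NEGATIVE = {"miss", "lawsuit", "decline"}

def sentiment(headline):
    """Crude score: +1 per positive word, -1 per negative word."""
    words = headline.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

rows = [  # (headline, next-day % change), as if read from the cleaned CSV
    ("Company beats estimates, record growth", 2.1),
    ("Regulator lawsuit triggers decline", -1.7),
    ("Quarterly results miss forecasts", -0.6),
]
scores = [sentiment(h) for h, _ in rows]
print(scores)  # [3, -2, -1]
```

Once each news item is reduced to a numeric score, the correlation with market fluctuations becomes an ordinary statistical-learning problem over paired columns.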

  10. Integration of g4tools in Geant4

    NASA Astrophysics Data System (ADS)

    Hřivnáčová, Ivana

    2014-06-01

    g4tools, originally part of the inlib and exlib packages, provides a very light and easy-to-install set of C++ classes that can be used to perform analysis in a Geant4 batch program. It allows the user to create and manipulate histograms and ntuples, and to write them in the supported file formats (ROOT, AIDA XML, CSV and HBOOK). It is integrated in Geant4 through analysis manager classes, which provide a uniform interface to the g4tools objects and also hide the differences between the classes for the different supported output formats. Moreover, additional features, such as histogram activation or support for Geant4 units, are implemented in the analysis classes following user requests. A set of Geant4 user interface commands allows the user to create histograms and set their properties interactively or in Geant4 macros. g4tools was first introduced in the Geant4 9.5 release, where its use was demonstrated in one basic example; it is already used in the majority of the Geant4 examples in the Geant4 9.6 release. In this paper, we give an overview and the present status of the integration of g4tools in Geant4 and report on upcoming new features.
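The histogram-to-CSV idea above can be sketched in a few lines. This is a generic illustration, not g4tools' C++ API or its exact CSV layout; the class and column names are invented.

```python
# Sketch: a minimal fixed-bin 1D histogram that serialises itself to CSV.
import csv, io

class Histogram1D:
    def __init__(self, nbins, lo, hi):
        self.nbins, self.lo, self.hi = nbins, lo, hi
        self.counts = [0] * nbins

    def fill(self, x):
        """Increment the bin containing x; out-of-range values are dropped."""
        if self.lo <= x < self.hi:
            i = int((x - self.lo) / (self.hi - self.lo) * self.nbins)
            self.counts[i] += 1

    def to_csv(self):
        buf = io.StringIO()
        w = csv.writer(buf)
        w.writerow(["bin_low", "bin_high", "count"])
        width = (self.hi - self.lo) / self.nbins
        for i, c in enumerate(self.counts):
            w.writerow([self.lo + i * width, self.lo + (i + 1) * width, c])
        return buf.getvalue()

h = Histogram1D(4, 0.0, 4.0)
for x in [0.5, 1.5, 1.7, 3.9]:
    h.fill(x)
print(h.counts)  # [1, 2, 0, 1]
```

An analysis-manager layer like Geant4's then only has to swap `to_csv` for a ROOT, XML or HBOOK writer to support the other output formats behind one interface.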

  11. CIG-P: Circular Interaction Graph for Proteomics.

    PubMed

    Hobbs, Christopher K; Leung, Michelle; Tsang, Herbert H; Ebhardt, H Alexander

    2014-10-31

    A typical affinity purification coupled to mass spectrometry (AP-MS) experiment includes the purification of a target protein (bait) using an antibody and subsequent mass spectrometry analysis of all proteins co-purifying with the bait (the prey proteins). Like any other systems biology approach, AP-MS experiments generate a lot of data, and visualization has been challenging, especially when integrating AP-MS experiments with orthogonal datasets. We present Circular Interaction Graph for Proteomics (CIG-P), which generates circular diagrams for a visually appealing final representation of AP-MS data. Through a Java-based GUI, the user inputs experimental and reference data as files in CSV format. The resulting circular representation can be manipulated live within the GUI before exporting the diagram as a vector graphic in PDF format. The strength of CIG-P is its ability to integrate orthogonal datasets with each other, e.g. affinity purification data of the kinase PRPF4B in relation to the functional components of the spliceosome. Further, various AP-MS experiments can be compared to each other. CIG-P helps present AP-MS data to a wider audience, and we envision that the tool will find other applications too, e.g. kinase-substrate relationships as a function of perturbation. CIG-P is available under: http://sourceforge.net/projects/cig-p/

  12. DOCKSCORE: a webserver for ranking protein-protein docked poses.

    PubMed

    Malhotra, Sony; Mathew, Oommen K; Sowdhamini, Ramanathan

    2015-04-24

    Proteins interact with a variety of other molecules, such as nucleic acids, small molecules and other proteins, inside the cell. Structure determination of protein-protein complexes is challenging for several reasons, such as the large molecular weights of these macromolecular complexes, their dynamic nature, and difficulties in purification and sample preparation. Computational docking permits an early understanding of the feasibility and mode of protein-protein interactions. However, docking algorithms propose a number of solutions, and it is a challenging task to select the native or near-native pose(s) from this pool. DockScore is an objective scoring scheme that can be used to rank protein-protein docked poses. It considers several interface parameters, namely surface area, evolutionary conservation, hydrophobicity, short contacts and spatial clustering at the interface, for scoring. We have implemented DockScore in the form of a web server for use by the scientific community. The DockScore web server can be employed, subsequent to docking, to score the docked solutions, starting from multiple poses as inputs. The results, with scores and ranks for all poses, can be downloaded as a CSV file, and a graphical view of the interface of the best-ranking poses is possible. The DockScore web server is made freely available for the scientific community at: http://caps.ncbs.res.in/dockscore/ .
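Post-processing the kind of per-pose score file the abstract mentions can be sketched as follows. The column names are hypothetical, not DockScore's actual output schema, and the scores are invented.

```python
# Sketch: read a per-pose score CSV and rank poses from best to worst,
# assuming (hypothetically) that a higher score is better.
import csv, io

SCORES_CSV = """pose,score
pose_03,0.82
pose_01,0.91
pose_02,0.47
"""

def rank_poses(text):
    """Return (rank, pose_id) pairs sorted by descending score."""
    rows = list(csv.DictReader(io.StringIO(text)))
    rows.sort(key=lambda r: float(r["score"]), reverse=True)
    return [(i + 1, r["pose"]) for i, r in enumerate(rows)]

print(rank_poses(SCORES_CSV))  # [(1, 'pose_01'), (2, 'pose_03'), (3, 'pose_02')]
```

Selecting near-native poses then reduces to inspecting the top of this ranking rather than the whole docking pool.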

  13. REX2000 Version 2.5: Improved DATA Handling and Enhanced User-Interface

    NASA Astrophysics Data System (ADS)

    Taguchi, Takeyoshi

    2007-02-01

    XAFS analysis can be applied to various fields, such as materials science, environmental studies and biological science, and is widely used for characterization in those fields. In the early days of the XAFS technique, scientists wrote their own code for XAFS data analysis. As the technique became popular and the XAFS community grew, several analysis codes and packages were developed and released for general use. REX2000, which is commercially available, is one of those XAFS analysis packages. Counting from its predecessor "REX", REX2000 has been used for more than 15 years in the XAFS community. Following the previous revision in 2003, a major change was made in 2006. For dynamical studies of advanced materials, many XAFS data sets are measured (quick XAFS and in-situ XAFS) and hundreds of data sets need to be processed. REX2000's data handling has been improved to cope with such large data volumes at once and to report the fitting results as a CSV file. The well-established user interface has been enhanced so that users can customize initial values for data analysis and specify options through a graphical interface. Many small changes have also been made and are described in this paper.

  14. PASMet: a web-based platform for prediction, modelling and analyses of metabolic systems

    PubMed Central

    Sriyudthsak, Kansuporn; Mejia, Ramon Francisco; Arita, Masanori; Hirai, Masami Yokota

    2016-01-01

    PASMet (Prediction, Analysis and Simulation of Metabolic networks) is a web-based platform for proposing and verifying mathematical models to understand the dynamics of metabolism. The advantages of PASMet include user-friendliness and accessibility, which enable biologists and biochemists to easily perform mathematical modelling. PASMet offers a series of user-functions to handle the time-series data of metabolite concentrations. The functions are organised into four steps: (i) Prediction of a probable metabolic pathway and its regulation; (ii) Construction of mathematical models; (iii) Simulation of metabolic behaviours; and (iv) Analysis of metabolic system characteristics. Each function contains various statistical and mathematical methods that can be used independently. Users who may not have enough knowledge of computing or programming can easily and quickly analyse their local data without software downloads, updates or installations. Users only need to upload their files in comma-separated values (CSV) format or enter their model equations directly into the website. Once the time-series data or mathematical equations are uploaded, PASMet automatically performs computation on server-side. Then, users can interactively view their results and directly download them to their local computers. PASMet is freely available with no login requirement at http://pasmet.riken.jp/ from major web browsers on Windows, Mac and Linux operating systems. PMID:27174940

  15. Large Drought-induced Variations in Oak Leaf Volatile Organic Compound Emissions during PINOT NOIR 2012

    EPA Pesticide Factsheets

    Leaf level oak isoprene emissions and CO2/H2O exchange in the Ozarks, USA. BAGeron.csv is the speciated biomass displayed in Figure 1. Biomass Dry Weights.xlsx is used to convert leaf area to dry leaf biomass and is used in Figure 2. Daly Ozarks leaf ISOP.txt and MOFLUX_Isoprene Summary_refined Tcurve data.xlsx are the leaf isoprene emission rate files shown in Figure 2. Harley Aug12_Chris.xls is the leaf isoprene emission rate file shown in Figure 3. Daly Ozarks leaf.txt is the BVOC emissions file used for Figure 7 and Table 4. Drought IS.txt is the review data given in Table 2. Fig4 Aug10 2012 Harley.txt is shown in Figure 4. Fig 5 Aug14 2012 Harley.txt is shown in Figure 5. Daly Ozarks Leaf.txt is used in Fig 7. Drought IS.txt is used in Fig 8. This dataset is associated with the following publication: Geron, C., R. Daly, P. Harley, R. Rasmussen, R. Seco, A. Guenther, T. Karl, and L. Gu. Large Drought-Induced Variations in Oak Leaf Volatile Organic Compound Emissions during PINOT NOIR 2012. CHEMOSPHERE. Elsevier Science Ltd, New York, NY, USA, 146: 8-21, (2016).

  16. Substance Identification Information from EPA's Substance Registry

    EPA Pesticide Factsheets

    The Substance Registry Services (SRS) is the authoritative resource for basic information about substances of interest to the U.S. EPA and its state and tribal partners. Substances, particularly chemicals, can have many valid synonyms. For example, toluene, methyl benzene, and phenyl methane are commonly used names for the same chemical. EPA programs collect environmental data for this chemical using each of these names, plus others. This diversity leads to problems when a user is looking for programmatic data for toluene but is unaware that the data is stored under the synonym methyl benzene. For each substance, the SRS identifies the statutes, EPA programs, and organizations external to EPA that track or regulate that substance, along with the synonym used by that statute, EPA program or external organization. Besides standardized information for each chemical, such as the Chemical Abstracts Service name, the Chemical Abstracts Number and the EPA Registry Name (the EPA standard name), the SRS also includes additional information, such as molecular weight and molecular formula. Additionally, an SRS Internal Tracking Number uniquely identifies each substance, enabling cross-walking between synonyms. EPA is providing a large .ZIP file containing the SRS data in CSV format, and a separate small metadata file in XML containing the field names and definitions.
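Working with a CSV table delivered inside a .ZIP archive, as in the SRS download described above, can be done entirely with the standard library. The file name and column names below are invented stand-ins, not the actual SRS schema.

```python
# Sketch: read a CSV member straight out of a ZIP archive without
# extracting it to disk. A small in-memory ZIP stands in for the download.
import csv, io, zipfile

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("srs_subset.csv",
               "itn,registry_name,cas_number\n"
               "12345,Toluene,108-88-3\n")

with zipfile.ZipFile(buf) as z:
    with z.open("srs_subset.csv") as f:
        # z.open yields bytes; wrap it to decode text for the csv module
        reader = csv.DictReader(io.TextIOWrapper(f, encoding="utf-8"))
        rows = list(reader)

print(rows[0]["registry_name"])  # Toluene
```

An internal tracking number column like the hypothetical `itn` here is what makes cross-walking between synonyms possible: two synonym rows resolving to the same number denote the same substance.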

  17. Health & Demographic Surveillance System Profile: The Birbhum population project (Birbhum HDSS).

    PubMed

    Ghosh, Saswata; Barik, Anamitra; Majumder, Saikat; Gorain, Ashoke; Mukherjee, Subrata; Mazumdar, Saibal; Chatterjee, Kajal; Bhaumik, Sunil Kumar; Bandyopadhyay, Susanta Kumar; Satpathi, BiswaRanjan; Majumder, Partha P; Chowdhury, Abhijit

    2015-02-01

    The Birbhum HDSS was established in 2008 and covers 351 villages in four administrative blocks in rural areas of Birbhum district of West Bengal, India. The project currently follows 54 585 individuals living in 12 557 households. The population being followed up is economically underprivileged and socially marginalized. The HDSS, a prospective longitudinal cohort study, has been designed to study changes in population demographics, health and healthcare utilization. In addition to collecting data on vital statistics and antenatal and postnatal tracking, verbal autopsies are being performed. Moreover, periodic surveys capturing socio-demographic and economic conditions have been conducted twice. Data on nutritional status (children as well as adults), non-communicable diseases, smoking etc. have also been collected in special surveys. Currently, intervention studies on anaemia, undernutrition and common preschool childhood morbidities through behavioural changes are under way. For access to the data, a researcher needs to send a request to the Data Manager [suri.shds@gmail.com]. Data are shared in common formats such as comma-separated values (csv), Microsoft Excel (xlsx) or Microsoft Access Database (mdb). The HDSS will soon upgrade its data management system to a more integrated platform, coordinated and guided by INDEPTH data sharing policy. © The Author 2014; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.

  18. A Launch Requirements Trade Study for Active Space Radiation Shielding for Long Duration Human Missions

    NASA Technical Reports Server (NTRS)

    Singleterry, Robert C., Jr.; Bollweg, Ken; Martin, Trent; Westover, Shayne; Battiston, Roberto; Burger, William J.; Meinke, Rainer

    2015-01-01

    A trade study for an active shielding concept based on magnetic fields in a solenoid configuration versus mass based shielding was developed. Monte Carlo simulations were used to estimate the radiation exposure for two values of the magnetic field strength and the mass of the magnetic shield configuration. For each field strength, results were reported for the magnetic region shielding (end caps ignored) and total region shielding (end caps included but no magnetic field protection) configurations. A value of 15 cSv was chosen to be the maximum exposure for an astronaut. The radiation dose estimate over the total shield region configuration cannot be used at this time without a better understanding of the material and mass present in the end cap regions through a detailed vehicle design. The magnetic shield region configuration, assuming the end cap regions contribute zero exposure, can be launched on a single Space Launch System rocket and up to a two year mission can be supported. The magnetic shield region configuration results in two versus nine launches for a comparable mass based shielding configuration. The active shielding approach is clearly more mass efficient because of the reduced number of launches than the mass based shielding for long duration missions.

  19. D0 Superconducting Solenoid Quench Data and Slow Dump Data Acquisition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markley, D.; /Fermilab

    1998-06-09

    This Dzero engineering note describes the method by which the 2 Tesla superconducting solenoid fast dump and slow dump data are accumulated, tracked and stored. The 2 Tesla solenoid has eleven data points that need to be tracked and then stored when a fast dump or a slow dump occurs. The TI555 (Texas Instruments) PLC (Programmable Logic Controller), which controls the DC power circuit that powers the solenoid, also has access to all the voltage taps and other equipment in the circuit. The TI555 constantly logs these eleven points in a rotating memory buffer. When either a fast dump (dump switch opens) or a slow dump (power supply turns off) occurs, the TI555 organizes the respective data and downloads it to a file on DO-CCRS2. The data in this file are moved over Ethernet and stored in a CSV (comma-separated values) file, which can easily be examined with Microsoft Excel or any other spreadsheet. The 2 Tesla solenoid control system also locks in first-fault information. The TI555 decodes the first fault and passes it along to the program collecting the data and storing it on DO-CCRS2. This first-fault information is then part of the file.
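The rotating-buffer-then-dump scheme described above can be sketched generically. This is not the TI555 firmware or the actual D0 data format; the channel names, buffer depth and fault string are invented.

```python
# Sketch: keep only the most recent samples of several channels in a
# fixed-depth ring buffer, then flush them to CSV when a dump occurs.
import collections, csv, io

class RotatingLogger:
    def __init__(self, channels, depth):
        self.channels = channels
        self.buffer = collections.deque(maxlen=depth)  # oldest samples fall off

    def sample(self, values):
        self.buffer.append(values)

    def dump_csv(self, fault=""):
        """Write the first-fault tag, a header row, and the buffered samples."""
        buf = io.StringIO()
        w = csv.writer(buf)
        w.writerow(["fault", fault])
        w.writerow(self.channels)
        w.writerows(self.buffer)
        return buf.getvalue()

log = RotatingLogger(["v_tap1", "v_tap2", "current"], depth=3)
for i in range(5):          # only the last 3 samples survive
    log.sample([i, i * 2, i * 10])
print(list(log.buffer))  # [[2, 4, 20], [3, 6, 30], [4, 8, 40]]
```

The point of the fixed-depth buffer is that the pre-event history is always available at the moment a fast or slow dump fires, without unbounded logging in between.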

  20. Group Selection and Learning for a Lab-Based Construction Management Course

    ERIC Educational Resources Information Center

    Solanki, Pranshoo; Kothari, Nidhi

    2014-01-01

    In construction industry projects, working in groups is normal practice. Group work in a classroom is defined as students working collaboratively in a group so that everyone can participate in a collective task. The results from the literature review indicate that group work is a more effective method of learning than individual work.…

  1. 75 FR 4440 - Meeting of the Working group on Environmental Cooperation Pursuant to the United States-Morocco...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-27

    ... DEPARTMENT OF STATE [Public Notice 6885] Meeting of the Working group on Environmental Cooperation... meeting of the Working Group on Environmental Cooperation (``Working Group'') in Rabat, Morocco on February 9, 2010, at a venue to be announced. Meetings of the Working Group were forecast in paragraph five...

  2. 75 FR 51525 - Railroad Safety Advisory Committee (RSAC); Working Group Activity Update

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-20

    .... 63] Railroad Safety Advisory Committee (RSAC); Working Group Activity Update AGENCY: Federal Railroad... Committee (RSAC) Working Group Activities. SUMMARY: The FRA is updating its announcement of RSAC's Working.... SUPPLEMENTARY INFORMATION: This notice serves to update FRA's last announcement of working group activities and...

  3. Long working hours and emotional well-being in Korean manufacturing industry employees.

    PubMed

    Lee, Kyoung-Hye; Kim, Jong-Eun; Kim, Young-Ki; Kang, Dong-Mug; Yun, Myeong-Ja; Park, Shin-Goo; Song, Jae-Seok; Lee, Sang-Gil

    2013-12-05

    Korea is well known for the long working hours of its employees. Because workers in the manufacturing industry are constantly exposed to extended work hours, this study examined how long work hours affect their emotional well-being. The analysis used secondary data from the Korean Working Condition Survey (KWCS). Long work hours were defined as more than 48 hours per week, subcategorized at the 52-hour and 60-hour marks. Based on the WHO (Five) Well-Being Index, emotional state was subdivided into three groups (reference group, low-mood group, and possible-depression group), with 28 points and 50 points as division points, and the groups were compared two at a time. The association between long work hours and emotional state was analyzed using binary and multinomial logistic regression. Extended working hours in the manufacturing industry showed a statistically significant increasing trend (t-test, p < 0.001) in the possible-depression group compared to the reference group and the low-mood group. When demographic characteristics, health behaviors, socioeconomic state, and work-related characteristics were controlled for, the odds ratio of the possible-depression group relative to the reference group increased with work hours: 2.73 for 48-52 hours and 4.09 for 60 hours or more, both statistically significant. In comparing the low-mood group and the possible-depression group, the odds ratios increased to 1.73, 2.39, and 4.16 for 48-52 hours, 53-60 hours, and 60 hours or more, all statistically significant. Multinomial logistic regression comparing the reference group and the possible-depression group likewise showed statistically significant odds ratios of 2.94 for 53-60 hours and 4.35 for 60 hours or more. 
Long work hours have an adverse effect on emotional well-being. More diversified research into the variables that affect long work hours and emotional well-being, how those variables interact, and their relationship to overall health is imperative.

  4. 75 FR 4904 - Railroad Safety Advisory Committee (RSAC); Working Group Activity Update

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-29

    ...-7257] Railroad Safety Advisory Committee (RSAC); Working Group Activity Update AGENCY: Federal Railroad... Committee (RSAC) Working Group Activities. SUMMARY: The FRA is updating its announcement of RSAC's Working... notice serves to update FRA's last announcement of working group activities and status reports of August...

  5. 77 FR 24257 - Railroad Safety Advisory Committee (RSAC); Working Group Activity Update

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-23

    .... 69] Railroad Safety Advisory Committee (RSAC); Working Group Activity Update AGENCY: Federal Railroad... Committee (RSAC) Working Group Activities. SUMMARY: The FRA is updating its announcement of the RSAC Working.... SUPPLEMENTARY INFORMATION: This notice serves to update FRA's last announcement of working group activities and...

  6. 78 FR 57672 - 91st Meeting: RTCA Special Committee 159, Global Positioning Systems (GPS)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-19

    ... include the following: Working Group Sessions October 7 Working Group 2C, GPS/Inertial, ARINC & A4A Rooms October 8 Working Group 2, GPS/WAAS, McIntosh-NBAA Room and Colson Board Room October 9 Working Group 2, GPS/WAAS, ARINC & A4A Rooms, Afternoon, 1:00 p.m.-5:00 p.m., Working Group 4, GPS/Precision Landing...

  7. Forward-looking infrared imaging predicts ultimate burn depth in a porcine vertical injury progression model.

    PubMed

    Miccio, Joseph; Parikh, Shruti; Marinaro, Xavier; Prasad, Atulya; McClain, Steven; Singer, Adam J; Clark, Richard A F

    2016-03-01

    Current methods of assessing burn depth are limited and are primarily based on visual assessments by burn surgeons. This technique has been shown to have only 60% accuracy, and a more accurate, simple, noninvasive method is needed to determine burn wound depth. Forward-looking infrared (FLIR) thermography is both noninvasive and user-friendly, with the potential to rapidly assess burn depth. The purpose of this paper is to determine if early changes in burn temperature (first 3 days) can predict burn depth as assessed by vertical scarring 28 days after injury. While under general anesthesia, 20 burns were created on the back of each of two female Yorkshire swine using a 2.5 cm × 2.5 cm × 7.5 cm, 150 g aluminum bar, for a total of 40 burns. FLIR imaging was performed at both early (1, 2 and 3 days) and late (7, 10, 14, 17, 21, 24 and 28 days) time points. Burns were imaged from a height of 12 inches above the skin surface. FLIR ExaminIR© software was used to examine the infrared thermographs. One hundred temperature points from edge to edge across the center of each burn were collected at all time points and exported as a comma-separated values (CSV) file. The CSV file was processed and analyzed using a MATLAB program. The temperature profiles through the center of the burns generated parabola-like curves. The lowest temperature (temperature minimum) and a line midway between the temperature minimum and the ambient skin temperature at the burn edges were defined, and the area of the curve below this line was calculated (the "temperature half-area"). Half-area values 2 days after burn had higher correlations with scar depth than did the minimum temperatures. However, burns that became warmer from 1 day to 2 days after injury had a lower scar depth than burns that became cooler, and this trend was best predicted by the temperature minima. When data were analyzed as a diagnostic test for sensitivity and specificity using >3 mm scarring, i.e. 
a full-thickness burn, as a clinically relevant criterion standard, temperature minima at 2 days after burn was found to be the most sensitive and specific test. FLIR imaging is a fast and simple tool that has been shown to predict burn wound outcome in a porcine vertical injury progression model. Data showed that more severe burn wounds get cooler between 1 and 2 days after burn. We found four analytic methods of FLIR images that were predictive of burn progression at 1 and 2 days after burn; however, temperature minima 2 days after burn appeared to be the best predictive test for injury progression to a full-thickness burn. Although these results must be validated in clinical studies, FLIR imaging has the potential to aid clinicians in assessing burn severity and thereby assisting in burn wound management. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.
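    The "temperature half-area" metric described above can be sketched as follows, assuming the profile is a list of temperatures sampled edge-to-edge across the burn (the study used 100 points per burn and a MATLAB program; the short profile, function name, and values here are illustrative only):

```python
# Sketch of the "temperature half-area": the area of the temperature-profile
# curve below a line midway between the temperature minimum and the ambient
# skin temperature at the burn edges.
def half_area(profile, dx=1.0):
    t_min = min(profile)                        # temperature minimum
    t_edge = (profile[0] + profile[-1]) / 2.0   # ambient skin temperature
    midline = (t_min + t_edge) / 2.0
    # Simple Riemann sum of the curve's depth below the midline.
    return sum(max(midline - t, 0.0) * dx for t in profile)

# Illustrative parabola-like edge-to-edge profile (degrees C).
profile = [34.0, 33.0, 31.0, 29.5, 29.0, 29.5, 31.0, 33.0, 34.0]
print(round(half_area(profile), 2))  # 7.5
```

    A deeper or cooler burn pushes the curve further below the midline and so yields a larger half-area, which is why the metric tracks injury severity.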

  8. 78 FR 56238 - Office of the Director, National Institutes of Health; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-12

    ... to discuss the findings of two of its working groups: the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) working group and the HeLa Genome Data Access working group. The BRAIN working group...

  9. The PEcAn Project: Accessible Tools for On-demand Ecosystem Modeling

    NASA Astrophysics Data System (ADS)

    Cowdery, E.; Kooper, R.; LeBauer, D.; Desai, A. R.; Mantooth, J.; Dietze, M.

    2014-12-01

    Ecosystem models play a critical role in understanding the terrestrial biosphere and forecasting changes in the carbon cycle; however, current forecasts have considerable uncertainty. The amount of data being collected and produced is increasing on a daily basis as we enter the "big data" era, but only a fraction of it is being used to constrain models. Until we can improve model accessibility and model-data communication, none of these resources can be used to their full potential. The Predictive Ecosystem Analyzer (PEcAn) is an ecoinformatics toolbox and a set of workflows that wrap around an ecosystem model and manage the flow of information in and out of regional-scale terrestrial biosphere models (TBMs). Here we present new modules developed in PEcAn to manage the processing of meteorological data, one of the primary driver dependencies for ecosystem models. The module downloads, reads, extracts, and converts meteorological observations to the Unidata Climate and Forecast (CF) NetCDF community standard, a convention used for most climate forecast and weather models. The module also automates the conversion from NetCDF to model-specific formats, including basic merging, gap-filling, and downscaling procedures. PEcAn currently supports tower-based micrometeorological observations at Ameriflux and FluxNET sites, site-level CSV-formatted data, and regional and global reanalysis products such as the North American Regional Reanalysis and CRU-NCEP. The workflow is easily extensible to additional products and processing algorithms. These meteorological workflows have been coupled with the PEcAn web interface and now allow anyone to run multiple ecosystem models for any location on Earth by simply clicking on an intuitive Google Maps-based interface. This will allow users to more readily compare models to observations at those sites, leading to better calibration and validation. 
Current work is extending these workflows to also process field, remotely-sensed, and historical observations of vegetation composition and structure. The processing of heterogeneous met and veg data within PEcAn is made possible using the Brown Dog cyberinfrastructure tools for unstructured data.
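    The CSV-to-standard conversion step can be illustrated with a toy sketch (the column names and mapping table are hypothetical, not PEcAn's actual code, and the real module writes CF-convention NetCDF files rather than dictionaries):

```python
import csv
import io

# Hypothetical mapping from site-level CSV columns to CF standard names,
# each paired with a unit conversion (degrees C -> K; mm/hr -> kg m-2 s-1).
CF_MAP = {
    "Tair_C": ("air_temperature", lambda c: c + 273.15),
    "precip_mm_hr": ("precipitation_flux", lambda p: p / 3600.0),
}

raw = "Tair_C,precip_mm_hr\n20.0,3.6\n"
reader = csv.DictReader(io.StringIO(raw))
converted = [
    {CF_MAP[col][0]: CF_MAP[col][1](float(val)) for col, val in row.items()}
    for row in reader
]
print(round(converted[0]["air_temperature"], 2))  # 293.15
```

    The value of funneling every source through one such mapping is that downstream model-specific converters only ever see one vocabulary and one unit system.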

  10. Beebook: light field mapping app

    NASA Astrophysics Data System (ADS)

    De Donatis, Mauro; Di Pietro, Gianfranco; Rinnone, Fabio

    2014-05-01

    In the last decade, mobile systems for digital field mapping were developed (see the Wikipedia entry for "digital geologic mapping"), despite many skeptical traditional geologists. Until now, hardware was often heavy (tablet PCs) and software sometimes difficult even for expert GIS users. At present, the advent of light tablets and applications makes things easier, but we are far from finding a complete solution for a complex survey like a geological one, where complexities such as information, hypotheses, data, and interpretation must be managed. Beebook, a new app for Android devices, has been developed for fast and easy mapping work in the field to try to solve this problem. The main features are: • off-line raster management, using GeoTIFF and other raster formats; • on-line map visualisation (Google Maps, OSM, WMS, WFS); • spatial reference management and conversion using PROJ.4; • vector file mash-up (KML and SQLite formats); • editing of vector data on the map (lines, points, polygons); • augmented reality using the "Mixare" platform; • export of vector data in KML, CSV, and SQLite (Spatialite) formats; • notes: GPS or manually inserted points linked to other application files (pictures, spreadsheets, etc.); • forms: creation, editing, and filling of customized forms; • GPS: status control, tracking, and positioning on the map; • sharing: synchronization and sharing of data, forms, positioning, and other information among users. The input methods range from the digital keyboard and finger touch to voice recording and the stylus. The most efficient way of inserting information is the stylus (or pen), since field geologists are familiar with annotations and sketches; we therefore suggest the use of devices with a stylus. The main point is that Beebook is the first "transparent" mobile GIS for tablets and smartphones, deriving from previous experience with traditional mapping and with the conception and development of earlier digital mapping software (MapIT, BeeGIS, Geopaparazzi). 
Drawing on those experiences, we developed a tool that is easy to use and applicable not only to geology but to any field survey.
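    The CSV and SQLite (Spatialite) export formats listed above can be sketched with Python's standard library (the schema and field names are illustrative, not Beebook's actual export code; Spatialite adds spatial types and functions on top of plain SQLite):

```python
import csv
import io
import sqlite3

# Hypothetical field-note points: label, latitude, longitude.
points = [("outcrop 1", 43.726, 12.636), ("fault trace", 43.731, 12.642)]

# CSV export: one header row, one row per point.
buf = io.StringIO()
w = csv.writer(buf)
w.writerow(["note", "lat", "lon"])
w.writerows(points)

# SQLite export: same records in a queryable table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE notes (note TEXT, lat REAL, lon REAL)")
con.executemany("INSERT INTO notes VALUES (?, ?, ?)", points)
n = con.execute("SELECT COUNT(*) FROM notes").fetchone()[0]
print(n)  # 2
```

    Offering both formats is a pragmatic choice: CSV for spreadsheets and quick sharing, SQLite for structured queries back on the desktop GIS.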

  11. Impact of work-life imbalance on job satisfaction and quality of life among hospital nurses in Japan

    PubMed Central

    MAKABE, Sachiko; TAKAGAI, Junko; ASANUMA, Yoshihiro; OHTOMO, Kazuo; KIMURA, Yutaka

    2014-01-01

    This study investigated the status of work-life imbalance among hospital nurses in Japan and the impact of work-life imbalance on job satisfaction and quality of life. A cross-sectional survey of 1,202 nurses (81% response rate) was conducted in three Japanese acute care hospitals. Participants were divided into four groups for actual work-life balance (Group A: 50/50, including other lower working proportion groups [e.g., 40/50]; Group B: 60/40; Group C: 70/30; and Group D: 80/20, including other higher working proportion groups [e.g., 90/10]). We also asked participants about desired work-life balance, and private and work-related perspectives. Satisfaction with job, private life, and work-life balance, as well as quality of life and stress-coping ability, were also measured. All data were compared among the four groups. Most nurses sensed that they had a greater proportion of working life than private life, and had a work-life imbalance. Actual WLB did not match desired WLB. When the actual working proportion greatly exceeds the private life proportion, nurses’ health could be in danger, and they may resign due to lower job satisfaction and QOL. Simultaneous progress by both management and individual nurses is necessary to improve work-life imbalance. PMID:25475095

  12. Impact of work-life imbalance on job satisfaction and quality of life among hospital nurses in Japan.

    PubMed

    Makabe, Sachiko; Takagai, Junko; Asanuma, Yoshihiro; Ohtomo, Kazuo; Kimura, Yutaka

    2015-01-01

    This study investigated the status of work-life imbalance among hospital nurses in Japan and the impact of work-life imbalance on job satisfaction and quality of life. A cross-sectional survey of 1,202 nurses (81% response rate) was conducted in three Japanese acute care hospitals. Participants were divided into four groups for actual work-life balance (Group A: 50/50, including other lower working proportion groups [e.g., 40/50]; Group B: 60/40; Group C: 70/30; and Group D: 80/20, including other higher working proportion groups [e.g., 90/10]). We also asked participants about desired work-life balance, and private and work-related perspectives. Satisfaction with job, private life, and work-life balance, as well as quality of life and stress-coping ability, were also measured. All data were compared among the four groups. Most nurses sensed that they had a greater proportion of working life than private life, and had a work-life imbalance. Actual WLB did not match desired WLB. When the actual working proportion greatly exceeds the private life proportion, nurses' health could be in danger, and they may resign due to lower job satisfaction and QOL. Simultaneous progress by both management and individual nurses is necessary to improve work-life imbalance.

  13. Emotions in Group Work: Insights from an Appraisal-Oriented Perspective

    ERIC Educational Resources Information Center

    Zschocke, Karen; Wosnitza, Marold; Bürger, Kathrin

    2016-01-01

    Small group work is common practice in higher education. However, empirical research on students' emotions related to group work is still relatively scarce. Particularly, little is known about students' appraisals of a group task as antecedents of emotions arising in the context of group work. This paper provides a first attempt to systematically…

  14. The effect of lifestyle modification on physical fitness and work ability in different workstyles.

    PubMed

    Ohta, Masanori; Okufuji, Tatsuya; Matsushima, Yasuyuki; Ikeda, Masaharu

    2004-12-01

    It is generally considered that physical fitness is affected by daily life activities including leisure time activity and working time activity. The aim of this study is to investigate the effects of different levels of physical activity at work on physical fitness, analyze the effects of 12-week lifestyle modification outside of working hours on physical fitness, work satisfaction and subjective symptoms, and to consider the role of lifestyle modification in occupational health. Lifestyle modification, consisting of aerobic exercise and diet counseling, was conducted for 12 weeks. The data before and after the intervention from 49 male workers were obtained. Physical fitness such as exercise endurance, flexibility, agility, balance, muscular strength, muscular endurance, and muscular power was measured before and after the intervention. The subjects were asked to fill out questionnaires about their work activities, subjective complaints, and work satisfaction. Subjects were divided into active work group (n = 14) and sedentary work group (n = 35) for analysis according to their work activities. As for differences in physical fitness due to different levels of physical activity, the active work group had superior exercise endurance and balance compared to the sedentary work group. In addition, the sedentary work group tended to experience greater fatigue than the active work group. In the active work group, flexibility and muscular strength were significantly increased with lifestyle modification and, in the sedentary work group, exercise endurance, flexibility and muscular endurance were significantly improved while balance also showed a tendency to improve. In the sedentary work group, lifestyle modification resulted in reduced fatigue and stiff neck as well as an increased work satisfaction. 
In the active work group, no change was observed in complaints or work satisfaction; however, improved physical fitness led to a reduction in subjective complaints and an increase in work satisfaction. The level of physical activity at work contributes to the physical fitness of the worker, and the addition of aerobic exercise in the worker's leisure time improves physical fitness and thereby contributes to increased work ability regardless of differences in the level of physical activity at work.

  15. Group processing in an undergraduate biology course for preservice teachers: Experiences and attitudes

    NASA Astrophysics Data System (ADS)

    Schellenberger, Lauren Brownback

    Group processing is a key principle of cooperative learning in which small groups discuss their strengths and weaknesses and set group goals or norms. However, group processing has not been well-studied at the post-secondary level or from a qualitative or mixed methods perspective. This mixed methods study uses a phenomenological framework to examine the experience of group processing for students in an undergraduate biology course for preservice teachers. The effect of group processing on students' attitudes toward future group work and group processing is also examined. Additionally, this research investigated preservice teachers' plans for incorporating group processing into future lessons. Students primarily experienced group processing as a time to reflect on past performance. Also, students experienced group processing as a time to increase communication among group members and become motivated for future group assignments. Three factors directly influenced students' experiences with group processing: (1) previous experience with group work, (2) instructor interaction, and (3) gender. Survey data indicated that group processing had a slight positive effect on students' attitudes toward future group work and group processing. Participants who were interviewed felt that group processing was an important part of group work and that it had increased their group's effectiveness as well as their ability to work effectively with other people. Participants held positive views on group work prior to engaging in group processing, and group processing did not alter their attitude toward group work. Preservice teachers who were interviewed planned to use group work and a modified group processing protocol in their future classrooms. They also felt that group processing had prepared them for their future professions by modeling effective collaboration and group skills. 
Based on this research, a new model for group processing has been created which includes extensive instructor interaction and additional group processing sessions. This study offers a new perspective on the phenomenon of group processing and informs science educators and teacher educators on the effective implementation of this important component of small-group learning.

  16. Development and Reliability of the Basic Skill Assessment Tool for Adolescents with Autism Spectrum Disorder

    PubMed Central

    Lersilp, Suchitporn; Suchart, Sumana

    2017-01-01

    The purpose of this study was to improve upon the first version of the basic work skills assessment tool for adolescents with autism spectrum disorder (ASD) and to examine interrater and intrarater reliability using the Intraclass Correlation Coefficient (ICC). The modified tool includes 2 components: (1) three tasks measuring work abilities and work attitudes and (2) a form to record the number of verbal and nonverbal prompts. Twenty-six participants were selected by purposive sampling and divided into 3 groups: group 1 (10 subjects, aged 11–13 years), group 2 (10, aged 14–16 years), and group 3 (6, aged 17–19 years). The results show that interrater reliabilities of work abilities and work attitudes were high in all groups, except that work attitude in group 1 was moderate. Intrarater reliabilities of work abilities in group 1 and group 2 were high; in group 3 they were moderate. Intrarater reliabilities of work attitudes in group 1 and group 3 were high, but in group 2 they were moderate. Nevertheless, interrater and intrarater reliabilities in the total scores of all groups were high, which implies that this tool is applicable for adolescents aged 11–19 years, with consideration of relevance for each group. PMID:28280769

  17. Small group learning: graduate health students' views of challenges and benefits.

    PubMed

    Jackson, Debra; Hickman, Louise D; Power, Tamara; Disler, Rebecca; Potgieter, Ingrid; Deek, Hiba; Davidson, Patricia M

    2014-07-19

    Background: For health-care professionals, particularly nurses, the ability to work productively and efficiently in small groups is a crucial skill required to meet the challenges of the contemporary health-care environment. Small group work is an educational technique that is used extensively in nurse education. The advantages of group work include facilitation of deep, active and collaborative learning. However, small group work can be problematic and present challenges for students. Many of the challenges occur because group work necessitates the coming together of collections of individuals, each with their own personalities and sets of experiences. Aim: This study aimed to identify challenges and benefits associated with small group work and to explore options for retaining the positive aspects of group work while reducing or eliminating the aspects the students experienced as negative. Method: Online survey; thematic analysis. Results: Overall, students experienced a range of challenges that necessitated the development of problem-solving strategies. However, they were able to elucidate some enjoyable and positive aspects of group work. Implications for teaching and learning are drawn from this study. Conclusion: The ability to work effectively in small groups and teams is essential for all health-care workers in the contemporary health environment. Findings of this study highlight the need for educators to explore novel and effective ways in which to engage nurses in group work.

  18. Small group learning: Graduate health students' views of challenges and benefits.

    PubMed

    Jackson, Debra; Hickman, Louise D; Power, Tamara; Disler, Rebecca; Potgieter, Ingrid; Deek, Hiba; Davidson, Patricia M

    2014-01-01

    Background: For health-care professionals, particularly nurses, the ability to work productively and efficiently in small groups is a crucial skill required to meet the challenges of the contemporary health-care environment. Small group work is an educational technique that is used extensively in nurse education. The advantages of group work include facilitation of deep, active and collaborative learning. However, small group work can be problematic and present challenges for students. Many of the challenges occur because group work necessitates the coming together of collections of individuals, each with their own personalities and sets of experiences. This study aimed to identify challenges and benefits associated with small group work and to explore options for retaining the positive aspects of group work while reducing or eliminating the aspects the students experienced as negative. Online survey; thematic analysis. Overall, students experienced a range of challenges that necessitated the development of problem-solving strategies. However, they were able to elucidate some enjoyable and positive aspects of group work. Implications for teaching and learning are drawn from this study. The ability to work effectively in small groups and teams is essential for all health-care workers in the contemporary health environment. Findings of this study highlight the need for educators to explore novel and effective ways in which to engage nurses in group work.

  19. 78 FR 11728 - Aviation Rulemaking Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-19

    ... Working Groups a. Airman Testing Standards and Training Working Group (ARAC) b. Flight Controls Harmonization Working Group (Transport Airplane and Engine Subcommittee [TAE]) c. Airworthiness Assurance Working Group (TAE) 3. New Tasks a. Engine Bird Ingestion Requirements--Revision of Section 33.76 b...

  20. Graduate Social Work Students' Experiences with Group Work in the Field and the Classroom

    ERIC Educational Resources Information Center

    Goodman, Harriet; Knight, Carolyn; Khudododov, Khudodod

    2014-01-01

    For decades, group work scholars have described a discrepancy between student preparation for group work practice and opportunities to work with groups in the field practicum and professional practice. Educators in related disciplines such as counseling and psychology have expressed similar concerns. This article reports findings of a study of MSW…

  1. The Impact of Instructor's Group Management Strategies on Students' Attitudes to Group Work and Generic Skill Development

    ERIC Educational Resources Information Center

    Natoli, Riccardo; Jackling, Beverley; Seelanatha, Lalith

    2014-01-01

    This paper examines the influence of two distinct group work management strategies on finance students' attitudes towards group work and their perceptions of generic skill development. Using quantitative and qualitative data, comparisons are made between students who experienced a supportive group work environment and students who experienced an…

  2. Students' Perceptions of Classroom Group Work as a Function of Group Member Selection

    ERIC Educational Resources Information Center

    Myers, Scott A.

    2012-01-01

    The purpose of this assessment was to examine whether differences exist between students who self-select their classroom work group members and students who are randomly assigned to their classroom work groups in terms of their use of organizational citizenship behaviors with their work group members; their commitment to, trust in, and relational…

  3. 78 FR 49595 - Aviation Rulemaking Advisory Committee-New Task

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... the new ARAC activity and solicits membership for the Maintenance Reliability Program Working Group... establish the Maintenance Reliability Program Working Group. The working group will serve as staff to ARAC... programs. The Maintenance Reliability Program Working Group will provide advice and recommendations on the...

  4. 75 FR 27618 - RTCA Government/Industry Air Traffic Management Advisory Committee (ATMAC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-17

    ... Plenary (Welcome and Introductions); Trajectory Operations (TOps) Work Group Presentation of Document... Work Group (NGIWG) Report, Discussion, and possible Next Steps; ADS-B Work Group Presentation of Legacy... Activity Airspace Program (NSAAP) Presentation Requested by Requirements and Planning Work Group, Airspace...

  5. 77 FR 36903 - Accelerating Broadband Infrastructure Deployment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-20

    ... coordination with the Chief Performance Officer (CPO). (b) The Working Group shall be composed of: (i) a... broadband infrastructure. Sec. 2. Broadband Deployment on Federal Property Working Group. (a) In order to... Property Working Group (Working Group), to be co-chaired by representatives designated by the Administrator...

  6. Group Work and the Change of Obstacles over Time: The Influence of Learning Style and Group Composition

    ERIC Educational Resources Information Center

    Soetanto, Danny; MacDonald, Matthew

    2017-01-01

    It is through working in groups that students develop cooperative learning skills and experience. However, group work activity often leads students into a difficult experience, especially for first-year students who are not familiar with group work activities at university. This study explores obstacles faced by first-year students during their…

  7. 76 FR 54487 - Charter Renewal, Glen Canyon Dam Adaptive Management Work Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-01

    ... Management Work Group AGENCY: Bureau of Reclamation, Interior. ACTION: Notice of renewal. SUMMARY: Following... Interior (Secretary) is renewing the charter for the Glen Canyon Dam Adaptive Management Work Group. The purpose of the Adaptive Management Work Group is to advise and to provide recommendations to the Secretary...

  8. ACHP | News | Native Hawaiian Federal Interagency Working Group Created

    Science.gov Websites

    Improving consultations on unique issues involving Native Hawaiian organizations is the purpose of a new interagency working group established by the

  9. 78 FR 36541 - Public Interface Control Working Group (ICWG) Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-18

    ... DEPARTMENT OF DEFENSE Department of the Air Force Public Interface Control Working Group (ICWG... Systems (GPS) Directorate will be hosting a Public Interface Control Working Group (ICWG) meeting for the....mil by August 7, 2013. Public Interface Control Working Group Meeting (ICWG) Date(s) and Times: 24-25...

  10. The long-term impact of employment on mental health service use and costs for persons with severe mental illness.

    PubMed

    Bush, Philip W; Drake, Robert E; Xie, Haiyi; McHugo, Gregory J; Haslett, William R

    2009-08-01

    Stable employment promotes recovery for persons with severe mental illness by enhancing income and quality of life, but its impact on mental health costs has been unclear. This study examined service cost over ten years among participants in a co-occurring disorders study. Latent-class growth analysis of competitive employment identified trajectory groups. The authors calculated annual costs of outpatient services and institutional stays for 187 participants and examined group differences in ten-year utilization and cost. A steady-work group (N=51) included individuals whose work hours increased rapidly and then stabilized to average 5,060 hours per person over ten years. A late-work group (N=57) and a no-work group (N=79) did not differ significantly in utilization or cost outcomes, so they were combined into a minimum-work group (N=136). More education, a bipolar disorder diagnosis (versus schizophrenia or schizoaffective disorder), work in the past year, and lower scores on the expanded Brief Psychiatric Rating Scale predicted membership in the steady-work group. These variables were controlled for in the outcomes analysis. Use of outpatient services for the steady-work group declined at a significantly greater rate than it did for the minimum-work group, while institutional (hospital, jail, or prison) stays declined for both groups without a significant difference. The average cost per participant for outpatient services and institutional stays for the minimum-work group exceeded that of the steady-work group by $166,350 over ten years. Highly significant reductions in service use were associated with steady employment. Given supported employment's well-established contributions to recovery, evidence of long-term reductions in the cost of mental health services should lead policy makers and insurers to promote wider implementation.

  11. Semi-automatic Data Integration using Karma

    NASA Astrophysics Data System (ADS)

    Garijo, D.; Kejriwal, M.; Pierce, S. A.; Houser, P. I. Q.; Peckham, S. D.; Stanko, Z.; Hardesty Lewis, D.; Gil, Y.; Pennington, D. D.; Knoblock, C.

    2017-12-01

Data integration applications are ubiquitous in scientific disciplines. A state-of-the-art data integration system accepts both a set of data sources and a target ontology as input, and semi-automatically maps the data sources in terms of concepts and relationships in the target ontology. Mappings can be both complex and highly domain-specific. Once such a semantic model, expressing the mapping in a community-wide standard, is acquired, the source data can be stored in a single repository or database using the semantics of the target ontology. However, acquiring the mapping is a labor-intensive process, and state-of-the-art artificial intelligence systems are unable to fully automate it using heuristics and algorithms alone. Instead, a more realistic goal is to develop adaptive tools that minimize user feedback (e.g., by offering good mapping recommendations), while at the same time making it intuitive and easy for the user to both correct errors and define complex mappings. We present Karma, a data integration system that has been developed over multiple years in the information integration group at the Information Sciences Institute, a research institute at the University of Southern California's Viterbi School of Engineering. Karma is a state-of-the-art data integration tool that supports an interactive graphical user interface, and has been applied in multiple domains over the last five years, including geospatial, biological, humanities, and bibliographic applications. Karma allows users to import their own ontologies and datasets using widely used formats such as RDF, XML, CSV and JSON, can be set up either locally or on a server, supports a native backend database for prototyping queries, and can even be seamlessly integrated into external computational pipelines, including those ingesting data via streaming data sources, Web APIs and SQL databases. 
We illustrate a Karma workflow at a conceptual level, along with a live demo, and show use cases of Karma specifically for the geosciences. In particular, we show how Karma can be used intuitively to obtain the mapping model between case study data sources and a publicly available and expressive target ontology that has been designed to capture a broad set of concepts in geoscience with standardized, easily searchable names.
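The column-to-ontology mapping at the heart of a system like Karma can be sketched in miniature. The snippet below is a hypothetical, stdlib-only illustration, not Karma's actual API: the CSV data, ontology URIs, and function names are all invented. It maps each source column to a target-ontology predicate and emits RDF-style triples, which is the essence of applying a semantic model to a tabular source.

```python
import csv
import io

# Hypothetical source data: a CSV of stream-gauge readings (illustrative only).
SOURCE = """site_id,river,discharge_m3s
G001,Rio Grande,12.4
G002,Colorado,88.0
"""

# A semantic model maps each source column to a term in a target ontology.
# These URIs are made up for illustration; a real mapping would use the
# vocabulary of an actual community ontology.
COLUMN_TO_PREDICATE = {
    "river": "http://example.org/geo#riverName",
    "discharge_m3s": "http://example.org/geo#dischargeCubicMetersPerSecond",
}

def map_csv_to_triples(text, id_column="site_id"):
    """Turn each CSV row into RDF-style (subject, predicate, object) triples
    according to the column-to-predicate mapping above."""
    triples = []
    for row in csv.DictReader(io.StringIO(text)):
        subject = "http://example.org/site/" + row[id_column]
        for column, predicate in COLUMN_TO_PREDICATE.items():
            triples.append((subject, predicate, row[column]))
    return triples

triples = map_csv_to_triples(SOURCE)
for t in triples:
    print(t)
```

In a system like Karma the mapping dictionary would be recommended semi-automatically and refined interactively by the user, and the resulting triples serialized into a shared repository; here the hand-written dictionary simply stands in for the learned semantic model.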

  12. Key Concepts of Teams in an Organisation. Information Bank Working Paper Number 2541.

    ERIC Educational Resources Information Center

    Marsh, D. T.

    Teams in an organization are more than cooperative working groups. Advantages of group work, as opposed to individual work, include producing a better end result, providing satisfaction for the individual and the organization, and assisting the organization through coordination and work allocation. Disadvantages of group work include producing a…

  13. Planning Self-Managed Work Groups. Features of Self-Managed Work Groups. Results of Using Self-Managed Work Groups. Issues and Implications in Using Self-Managed Work Groups. Status of Ohio Manufacturing Companies.

    ERIC Educational Resources Information Center

    Smylie, Patrick E.; Jacobs, Ronald L.

    A study was conducted to describe the present status of self-managed work groups in Ohio manufacturing companies. Data for the study were gathered through lengthy interviews and site visits with 45 manufacturing companies in the state, 24 employing 2,000-14,000 workers and 21 employing 300-1,900 workers. The results of the study are presented…

  14. 75 FR 76070 - Railroad Safety Advisory Committee (RSAC); Working Group Activity Update

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-07

    .... 65] Railroad Safety Advisory Committee (RSAC); Working Group Activity Update AGENCY: Federal Railroad... Committee (RSAC) Working Group Activities. SUMMARY: The FRA is updating its announcement of RSAC's Working....

  15. Qualitative Research in Group Work: Status, Synergies, and Implementation

    ERIC Educational Resources Information Center

    Rubel, Deborah; Okech, Jane E. Atieno

    2017-01-01

    The article aims to advance the use of qualitative research methods to understand group work. The first part of this article situates the use of qualitative research methods in relationship to group work research. The second part examines recent qualitative group work research using a framework informed by scoping and systematic review methods and…

  16. 76 FR 58078 - Thirteenth Meeting: RTCA Special Committee 214: Working Group 78: Standards for Air Traffic Data...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-19

    ... Committee 214: Working Group 78: Standards for Air Traffic Data Communication Services AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice of RTCA Special Committee 214: Working Group 78... public of a meeting of the RTCA Special Committee 214: Working Group 78: Standards for Air Traffic Data...

  17. 3 CFR 13486 - Executive Order 13486 of January 9, 2009. Strengthening Laboratory Biosecurity in the United States

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... agents and toxins. Sec. 2. Establishment and Operation of the Working Group. (a) There is hereby established, within the Department of Defense for administrative purposes only, the Working Group on Strengthening the Biosecurity of the United States (Working Group). (b) The Working Group shall consist...

  18. Student Collaboration in Group Work: Inclusion as Participation

    ERIC Educational Resources Information Center

    Forslund Frykedal, Karin; Hammar Chiriac, Eva

    2018-01-01

    Group work is an educational mode that promotes learning and socialisation among students. In this study, we focused on the inclusive processes when students work in small groups. The aim was to investigate and describe students' inclusive and collaborative processes in group work and how the teacher supported or impeded these transactions. Social…

  19. 77 FR 14584 - Eleventh Meeting: RTCA Special Committee 217, Joint With EUROCAE Working Group-44, Terrain and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-12

    ... Committee 217, Joint With EUROCAE Working Group--44, Terrain and Airport Mapping Databases AGENCY: Federal... Special Committee 217, Joint with EUROCAE Working Group--44, Terrain and Airport Mapping Databases... Committee 217, Joint with EUROCAE Working Group--44, Terrain and Airport Mapping Databases. DATES: The...

  20. 76 FR 8353 - Positioning Systems Directorate Will Be Hosting an Interface Control Working Group (ICWG) Meeting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-14

    ... an Interface Control Working Group (ICWG) Meeting for Document ICD-GPS-870 AGENCY: Interface Control Working Group (ICWG) meeting for document ICD-GPS-870. ACTION: Meeting Notice. SUMMARY: This notice... Working Group (ICWG) meeting for document ICD-GPS-870, Navstar Next Generation GPS Operational Control...

  1. 75 FR 1380 - National Drinking Water Advisory Council's Climate Ready Water Utilities Working Group Meeting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-11

    ... Ready Water Utilities Working Group Meeting Announcement AGENCY: Environmental Protection Agency. ACTION... meeting of the Climate Ready Water Utilities (CRWU) Working Group of the National Drinking Water Advisory Council (NDWAC). The purpose of this meeting is for the Working Group to discuss the attributes and...

  2. 75 FR 20352 - National Drinking Water Advisory Council's Climate Ready Water Utilities Working Group Meeting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-19

    ... Ready Water Utilities Working Group Meeting Announcement AGENCY: Environmental Protection Agency. ACTION...-person meeting of the Climate Ready Water Utilities (CRWU) Working Group of the National Drinking Water Advisory Council (NDWAC). The purpose of this meeting is for the Working Group to discuss key findings, the...

  3. 76 FR 62894 - Following Procedures When Going Between Rolling Equipment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-11

    ... Operations Fatality Analysis (SOFA) Working Group. In October 1999, the Working Group issued a report titled ``Findings and Recommendations of the SOFA Working Group.'' The report can be found on FRA's Web site at http... recommendation reads as follows: \\1\\ More recently, in March 2011, the SOFA Working Group issued a report titled...

  4. Geologic data management at AVO: building authoritative coverage with radical availability (Invited)

    NASA Astrophysics Data System (ADS)

    Cameron, C.; Snedigar, S. F.; Nye, C. J.

    2009-12-01

    In 2002, the Alaska Volcano Observatory (AVO) began building the Geologic Database of Information on Volcanoes in Alaska (GeoDIVA), a system intended to contain complete, flexible, timely, and accurate geologic and geographic information on Pleistocene and younger volcanoes in Alaska. This system was primarily intended to be a tool for scientific investigation, crisis response, and public information - delivered in a dynamic, digital format to both internal and external users. It is now the back-end of the AVO public website. GeoDIVA does not interface with our daily monitoring activities, however -- seismic and satellite data are handled by different database efforts. GeoDIVA also doesn’t store volcanic unrest data, although we hope WOVOdat will. GeoDIVA does include modules for the following datasets: bibliography (every subsequent piece of data in GeoDIVA is tied to a reference), basic volcano information (~137 edifices), historical eruption information (~550 events), images (~17,000), sample information (~4400), geochemistry (~1500; population in progress), petrography (very early stages of data creation), sample storage (~14,000), and Quaternary vent information (~1200 vents). Modules in progress include GIS data, tephra data, and geochronologic data. In recent years, we have been doing maintenance work on older modules (for example, adding new references to the bibliography, and creating new queries and data fields in response to user feedback) as well as developing, designing, and populating new modules. Population can be quite time consuming, as there are no pre-compiled comprehensive existing sources for most information on Alaskan volcanoes, and we carefully reference each item. Newer modules also require more complex data arrangements than older modules. 
To meet the needs of a diverse group of users on widely varying computer platforms, GeoDIVA data is primarily stored in a MySQL DBMS; PostGIS/PostgreSQL are currently used to store and search spatial point data such as sample and volcano locations. The spatial data storage system is evolving rapidly, and may change to a different DBMS in the future. Data upload is done via a web browser (one record at a time, which is tedious) or through automated .csv upload. Because we use open-source software and provide access through web browsers, AVO staff can view and update information from anywhere. In the future, we hope GeoDIVA will be a complete site for all geologic information about Alaskan volcanoes; because all data points are linked together (by references, sample IDs, volcanoes, geologists, etc.) we’ll be able to draw a box on a map and retrieve information on edifices, vents, samples, and all associated metadata, images, references, analytical data, and accompanying GIS files. As we look toward our goals, remaining challenges include: linking our data with other national and international efforts, creating easier ways for all to upload data, GIS development, and balancing the speed of new module development with the need for older module maintenance.
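The "draw a box on a map" retrieval described above reduces, at its simplest, to a bounding-box filter over point locations. The sketch below is a minimal pure-Python illustration; the vent names and coordinates are invented, and a production query would of course run against the PostGIS store rather than an in-memory list.

```python
# A bounding box is (min_lon, min_lat, max_lon, max_lat); records whose
# point falls inside it are returned. Sample vents and coordinates are
# invented for illustration only.
VENTS = [
    ("Vent A", -152.25, 61.30),
    ("Vent B", -158.10, 56.90),
    ("Vent C", -152.40, 61.45),
]

def vents_in_box(vents, min_lon, min_lat, max_lon, max_lat):
    """Return the (name, lon, lat) records lying inside the bounding box."""
    return [
        v for v in vents
        if min_lon <= v[1] <= max_lon and min_lat <= v[2] <= max_lat
    ]

selected = vents_in_box(VENTS, -153.0, 61.0, -152.0, 62.0)
print(selected)
```

In PostGIS the same filter would typically use the spatially indexed `&&` bounding-box operator instead of a linear scan, which is what makes box queries over thousands of samples and vents fast.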

  5. Validating a work group climate assessment tool for improving the performance of public health organizations

    PubMed Central

    Perry, Cary; LeMay, Nancy; Rodway, Greg; Tracy, Allison; Galer, Joan

    2005-01-01

    Background: This article describes the validation of an instrument to measure work group climate in public health organizations in developing countries. The instrument, the Work Group Climate Assessment Tool (WCA), was applied in Brazil, Mozambique, and Guinea to assess the intermediate outcomes of a program to develop leadership for performance improvement. Data were collected from 305 individuals in 42 work groups, who completed a self-administered questionnaire. Methods: The WCA was initially validated using Cronbach's alpha reliability coefficient and exploratory factor analysis. This article presents the results of a second validation study to refine the initial analyses to account for nested data, to provide item-level psychometrics, and to establish construct validity. Analyses included eigenvalue decomposition analysis, confirmatory factor analysis, and validity and reliability analyses. Results: This study confirmed the validity and reliability of the WCA across work groups with different demographic characteristics (gender, education, management level, and geographical location). The study showed that there is agreement between the theoretical construct of work climate and the items in the WCA tool across different populations. The WCA captures a single perception of climate rather than individual sub-scales of clarity, support, and challenge. Conclusion: The WCA is useful for comparing the climates of different work groups, tracking the changes in climate in a single work group over time, or examining differences among individuals' perceptions of their work group climate. Application of the WCA before and after a leadership development process can help work groups hold a discussion about current climate and select a target for improvement. The WCA provides work groups with a tool to take ownership of their own group climate through a process that is simple and objective and that protects individual confidentiality. PMID:16223447

  6. 78 FR 36778 - Pesticide Program Dialogue Committee; Notice of Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-19

    ... Pollinator Protection; PPDC Work Group on Integrated Pest Management; PPDC Work Group on Comparative Safety... Management Work Group, 9:30 a.m. to noon in Conference Room S-4370-80; Comparative Safety Statements Work...

  7. Transfer after Working Memory Updating Training

    PubMed Central

    Waris, Otto; Soveri, Anna; Laine, Matti

    2015-01-01

    During the past decade, working memory training has attracted much interest. However, the training outcomes have varied between studies and methodological problems have hampered the interpretation of results. The current study examined transfer after working memory updating training by employing an extensive battery of pre-post cognitive measures with a focus on near transfer. Thirty-one healthy Finnish young adults were randomized into either a working memory training group or an active control group. The working memory training group practiced with three working memory tasks, while the control group trained with three commercial computer games with a low working memory load. The participants trained thrice a week for five weeks, with one training session lasting about 45 minutes. Compared to the control group, the working memory training group showed strongest transfer to an n-back task, followed by working memory updating, which in turn was followed by active working memory capacity. Our results support the view that working memory training produces near transfer effects, and that the degree of transfer depends on the cognitive overlap between the training and transfer measures. PMID:26406319

  8. Transfer after Working Memory Updating Training.

    PubMed

    Waris, Otto; Soveri, Anna; Laine, Matti

    2015-01-01

    During the past decade, working memory training has attracted much interest. However, the training outcomes have varied between studies and methodological problems have hampered the interpretation of results. The current study examined transfer after working memory updating training by employing an extensive battery of pre-post cognitive measures with a focus on near transfer. Thirty-one healthy Finnish young adults were randomized into either a working memory training group or an active control group. The working memory training group practiced with three working memory tasks, while the control group trained with three commercial computer games with a low working memory load. The participants trained thrice a week for five weeks, with one training session lasting about 45 minutes. Compared to the control group, the working memory training group showed strongest transfer to an n-back task, followed by working memory updating, which in turn was followed by active working memory capacity. Our results support the view that working memory training produces near transfer effects, and that the degree of transfer depends on the cognitive overlap between the training and transfer measures.

  9. Health, work, social trust, and financial situation in persons with Usher syndrome type 1.

    PubMed

    Ehn, Mattias; Wahlqvist, Moa; Danermark, Berth; Dahlström, Örjan; Möller, Claes

    2018-05-28

    Research has demonstrated that persons with Usher syndrome type 1 (USH1) have significantly poorer physical and psychological health compared to a reference group. To explore the relation between work, health, social trust, and financial situation in USH1 compared to a reference group. Sixty-six persons (18-65 y) from the Swedish Usher database received a questionnaire and 47 were included, 23 working and 24 non-working. The reference group comprised 3,049 working and 198 non-working persons. The Swedish Health on Equal Terms questionnaire was used and statistical analysis with multiple logistic regression was conducted. The USH1 non-work group had higher odds ratios (95% CI) for poor psychological and physical health, low social trust, and poor financial situation compared to the USH1 work group and the reference groups. Age, gender, hearing, and vision impairment did not explain the differences. The relation between the USH1 work and non-work groups showed the same pattern as the reference groups, but the magnitude of problems was significantly higher. Both disability and unemployment increased the risk of poor health, low social trust, and poor financial situation in persons with USH1, but having employment seemed to counteract the risks related to disability.

  10. Group work: Facilitating the learning of international and domestic undergraduate nursing students.

    PubMed

    Shaw, Julie; Mitchell, Creina; Del Fabbro, Letitia

    2015-01-01

    Devising innovative strategies to address internationalization is a contemporary challenge for universities. A Participatory Action Research (PAR) project was undertaken to identify issues for international nursing students and their teachers. The findings identified group work as a teaching strategy potentially useful to facilitate international student learning. The educational intervention of structured group work was planned and implemented in one subject of a Nursing degree. Groups of four to five students were formed with one or two international students per group. Structural support was provided by the teacher until the student was learning independently, the traditional view of scaffolding. The group work also encouraged students to learn from one another, a contemporary understanding of scaffolding. Evaluation of the group work teaching strategy occurred via anonymous, self-completed student surveys. The student experience data were analysed using descriptive statistical techniques, and free text comments were analysed using content analysis. Over 85% of respondents positively rated the group work experience. Overwhelmingly, students reported that class discussions and sharing nursing experiences positively influenced their learning and facilitated exchange of knowledge about nursing issues from an international perspective. This evaluation of a structured group work process supports the use of group work in engaging students in learning, adding to our understanding of purposeful scaffolding as a pathway to enhance learning for both international and domestic students. By explicitly using group work within the curriculum, educators can promote student learning, a scholarly approach to teaching and internationalization of the curriculum.

  11. Does competitive employment improve nonvocational outcomes for people with severe mental illness?

    PubMed

    Bond, G R; Resnick, S G; Drake, R E; Xie, H; McHugo, G J; Bebout, R R

    2001-06-01

    The authors examined the cumulative effects of work on symptoms, quality of life, and self-esteem for 149 unemployed clients with severe mental illness receiving vocational rehabilitation. Nonvocational measures were assessed at 6-month intervals throughout the 18-month study period, and vocational activity was tracked continuously. On the basis of their predominant work activity over the study period, participants were classified into 4 groups: competitive work, sheltered work, minimal work, and no work. The groups did not differ at baseline on any of the nonvocational measures. Using mixed effects regression analysis to examine rates of change over time, the authors found that the competitive work group showed higher rates of improvement in symptoms; in satisfaction with vocational services, leisure, and finances; and in self-esteem than did participants in a combined minimal work-no work group. The sheltered work group showed no such advantage.

  12. A Demands-Resources Model of Work Pressure in IT Student Task Groups

    ERIC Educational Resources Information Center

    Wilson, E. Vance; Sheetz, Steven D.

    2010-01-01

    This paper presents an initial test of the group task demands-resources (GTD-R) model of group task performance among IT students. We theorize that demands and resources in group work influence formation of perceived group work pressure (GWP) and that heightened levels of GWP inhibit group task performance. A prior study identified 11 factors…

  13. From the inside Out: Group Work with Women of Color

    ERIC Educational Resources Information Center

    Short, Ellen L.; Williams, Wendi S.

    2014-01-01

    This article will present two models for conducting group work with Women of Color (WOC): the SisterCircle Approach and the Group Relations Model. The authors contend that the models, when used together, combine an internal and external focus ("inside out") of group work that can assist group workers to conduct individual and group-level…

  14. 78 FR 15110 - Aviation Rulemaking Advisory Committee; Engine Bird Ingestion Requirements-New Task

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-08

    ...: During the bird-ingestion rulemaking database (BRDB) working group's reevaluation of the current engine... engine core ingestion. If the BRDB working group's reevaluation determines that such requirements are... Task ARAC accepted the task and will establish the Engine Harmonization Working Group (EHWG), under the...

  15. Designing and Assessing Productive Group Work in Secondary Schools

    ERIC Educational Resources Information Center

    Vaca, Javier; Lapp, Diane; Fisher, Douglas

    2011-01-01

    A history teacher examines what is successful and not successful in group work in his high school classroom and gives concrete suggestions for improving group practice. Topics discussed include preparing students for group work, supporting collaboration, inviting critical analysis, and assessing both group and individual performance. (Contains 2…

  16. Dealing with Parasites in Group Projects.

    ERIC Educational Resources Information Center

    Carter, Judy H.

    While it is generally accepted that people working in groups can accomplish more than people working individually, it is equally accepted that parasites will attempt to feed on the other group members. Group work has been called by several names--group learning, cooperative learning, collaborative learning--all of which carry slightly different…

  17. 75 FR 34530 - Analysis by the President's Working Group on Financial Markets on the Long-Term Availability and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-17

    ... DEPARTMENT OF THE TREASURY Analysis by the President's Working Group on Financial Markets on the... insurance for terrorism risk. The President's Working Group on Financial Markets (established by Executive... his designee, is the Chairman of the President's Working Group on Financial Markets. As chair of the...

  18. 3 CFR 13650 - Executive Order 13650 of August 1, 2013. Improving Chemical Facility Safety and Security

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Working Group. (a) There is established a Chemical Facility Safety and Security Working Group (Working Group) co-chaired by the Secretary of Homeland Security, the Administrator of the Environmental... Secretary level or higher. In addition, the Working Group shall consist of the head of each of the following...

  19. Teachers' and Students' Negotiation Moves When Teachers Scaffold Group Work

    ERIC Educational Resources Information Center

    González, Gloriana; DeJarnette, Anna F.

    2015-01-01

    Group work has been a main activity recommended by mathematics education reform. We aim at describing the patterns of interaction between teachers and students during group work. We ask: How do teachers scaffold group work during a problem-based lesson? We use data from a problem-based lesson taught in six geometry class periods by two teachers…

  20. A Standards-Based Inventory of Foundation Competencies in Social Work with Groups

    ERIC Educational Resources Information Center

    Macgowan, Mark J.

    2012-01-01

    Objective: This article describes the development of a measure of foundation competencies in group work derived from the Standards for Social Work Practice with Groups. Developed by the Association for the Advancement of Social Work with Groups, the Standards have not been widely used. An instrument based on the Standards can help advance…

  1. A Complex Systems Investigation of Group Work Dynamics in L2 Interactive Tasks

    ERIC Educational Resources Information Center

    Poupore, Glen

    2018-01-01

    Working with Korean university-level learners of English, this study provides a detailed analytical comparison of 2 task work groups that were video-recorded, with 1 group scoring very high and the other relatively low based on the results of a Group Work Dynamic (GWD) measuring instrument. Adopting a complexity theory (CT) perspective and…

  2. Factors Affecting the Interest of Israeli Social Work Students in Working with Different Client Groups

    ERIC Educational Resources Information Center

    Krumer-Nevo, Michal; Weiss, Idit

    2006-01-01

    Employing a large-scale sample of 521 BSW students from 4 Israeli schools of social work, this research examines the factors affecting social work students' interest in working with a wide range of client groups. The results suggest that student interest in working with specific client groups is affected by factors related to desire for…

  3. Outcome in patients admitted outside regular hospital working hours: does time until regular working hours matter?

    PubMed

    Nakajima, Makoto; Inatomi, Yuichiro; Yonehara, Toshiro; Watanabe, Masaki; Ando, Yukio

    2015-01-01

    The aim of this study was to investigate whether stratifying patients according to the time period from admission to the start of regular working hours would help detect a weekend effect in acute stroke patients. Ischemic stroke patients admitted between October 2002 and March 2012 were analyzed. Working hours were defined as 9:00-17:00 on weekdays. Patients were divided into those admitted during working hours (no-wait group) and three other groups according to the time from admission to working hours: ≤24 h (short-wait group), 24-48 h (medium-wait group), and >48 h (long-wait group). The modified Rankin Scale (mRS) score and mortality at three months were compared among the groups. Of 5625 patients, 3323 (59%) were admitted outside working hours. The proportion of patients with an mRS score 0-1 at three months showed a decreasing trend with the time period before working hours: 47% (no-wait group), 42% (short-wait group), 42% (medium-wait group), and 38% (long-wait group), respectively (P < 0·001). When the no-wait group was used as a reference, the odds ratio for mRS score 0-1 was 0·88 (95% confidence interval, 0·75-1·04) in the short-wait group, 0·86 (0·69-1·07) in the medium-wait group, and 0·67 (0·53-0·85) in the long-wait group after adjusting for sex, age, premorbid mRS score, previous morbidity, stroke severity, and vascular risk factors. Mortality at three months did not differ between the no-wait group and the other groups. A weekend effect might be evident if patients were stratified according to the time period from admission until working hours. © 2014 World Stroke Organization.

  4. Work-family conflicts and work performance.

    PubMed

    Roth, Lawrence; David, Emily M

    2009-08-01

    Prior research indicates that work-family conflict interferes with family far more than it interferes with work. Conservation of resources provides a possible explanation: when shifting resources from family is no longer sufficient to maintain satisfactory work performance, then workers must acquire additional resources or reduce investments in work. One source of such additional resources could be high performance peers in the work group. The performance of workers with resource-rich peers may be less adversely affected by work-family conflict. In this study, 136 employees of a wholesale distribution firm (61% women, 62% minority) working in groups of 7 to 11 in manual labor and low-level administrative jobs rated their own work-to-family conflict. Their supervisors rated workers' performance. Hierarchical regression analysis indicated that work-to-family conflict increasingly adversely affected job performance as work group performance decreased. Hence, work group performance may be an important moderator of the effects of work-family conflict.

  5. 2010 Chemical Working Group Status

    NASA Technical Reports Server (NTRS)

    Reid, Concha M.

    2010-01-01

    The Steering Group for the Interagency Advanced Power Group (IAPG) held their business meeting on November 30-December 1st in McLean, Virginia. Status reports were presented from each of the IAPG's Working Groups. These charts contain a brief summary of the IAPG Chemical Working Group's activities during 2010 and its plans for 2011.

  6. Letting the Drama into Group Work: Using Conflict Constructively in Performing Arts Group Practice

    ERIC Educational Resources Information Center

    Crossley, Tracy

    2006-01-01

    The article examines conflict avoidance in performing arts group work and issues arising in relation to teaching and learning. In group theory, conflict is addressed largely in terms of its detrimental effects on group work, and its constructive potential is often marginalized. Similarly, undergraduate students usually interpret "effective…

  7. Exploring Students' Group Work Needs in the Context of Internationalisation Using a Creative Visual Method

    ERIC Educational Resources Information Center

    Cox, Andrew; Chiles, Prue; Care, Leo

    2012-01-01

    While UK universities see group work as essential to building higher order intellectual and team skills, many international students are unfamiliar with this way of studying. Group work is also a focus of home students' concerns. Cultural differences in the interpretation of space for learning or how spatial issues affect group work processes has…

  8. Causal Relationships between Communication Confidence, Beliefs about Group Work, and Willingness to Communicate in Foreign Language Group Work

    ERIC Educational Resources Information Center

    Fushino, Kumiko

    2010-01-01

    This article reports on the causal relationships between three factors in second language (L2) group work settings: communication confidence (i.e., confidence in one's ability to communicate), beliefs about group work, and willingness to communicate (WTC). A questionnaire was administered to 729 first-year university students in Japan. A model…

  9. 77 FR 6796 - Notification of Three Public Teleconferences of a Work Group of the Chartered Science Advisory Board

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-09

    ... Work Group of the Chartered Science Advisory Board AGENCY: Environmental Protection Agency (EPA... teleconferences of a work group of the Chartered Science Advisory Board to discuss the President's FY 2013 Budget...), 5 U.S.C., App. 2. Pursuant to FACA and EPA policy, notice is hereby given that a work group of the...

  10. Impact of dry eye on work productivity.

    PubMed

    Yamada, Masakazu; Mizuno, Yoshinobu; Shigeyasu, Chika

    2012-01-01

    The purpose of this study was to evaluate the impact of dry eye on work productivity of office workers, especially in terms of presenteeism. A total of 396 individuals aged ≥20 years (258 men and 138 women, mean age 43.4 ± 13.0 years) were recruited through an online survey. Data from 355 responders who did not have missing values were included in the analysis. They were classified into the following four groups according to the diagnostic status and subjective symptoms of dry eye: a definite dry eye group; a marginal dry eye group; a self-reported dry eye group; and a control group. The impact of dry eye on work productivity was evaluated using the Japanese version of the Work Limitations Questionnaire. The cost of work productivity loss associated with dry eye and the economic benefits of providing treatment for dry eye were also assessed. The degree of work performance loss was 5.65% in the definite dry eye group, 4.37% in the marginal dry eye group, 6.06% in the self-reported dry eye group, and 4.27% in the control group. Productivity in the self-reported dry eye group was significantly lower than that in the control group (P < 0.05). The annual cost of work productivity loss associated with dry eye was estimated to be USD 741 per person. Dry eye impairs work performance among office workers, which may lead to a substantial loss to industry. Management of symptoms of dry eye by providing treatment may contribute to improvement in work productivity.

  11. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, many processing packages are available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing one or a few spectra at a time, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to its capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus should run on any platform providing Java Runtime Environment version 1.6 or newer; however, it has currently been tested only on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use and is provided with source code upon request.
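    The abstract notes that ImatraNMR exports batch results as CSV for downstream analysis in spreadsheets or Matlab. A minimal Python sketch of such downstream processing, assuming a hypothetical export layout (the column names `spectrum`, `signal`, and `integral` are illustrative, not ImatraNMR's actual schema):

```python
import csv
import io
from collections import defaultdict

# Hypothetical batch-integration export: one row per (spectrum, signal) pair.
CSV_DATA = """spectrum,signal,integral
sample_01,lactate,1.52
sample_01,glucose,3.10
sample_02,lactate,1.48
sample_02,glucose,3.25
"""

def mean_integral_per_signal(text):
    """Average each signal's integral over all spectra in the export."""
    values = defaultdict(list)
    for row in csv.DictReader(io.StringIO(text)):
        values[row["signal"]].append(float(row["integral"]))
    return {sig: sum(vals) / len(vals) for sig, vals in values.items()}

means = mean_integral_per_signal(CSV_DATA)
print(means)
```

    Because the export is plain CSV, the same file loads unchanged into a spreadsheet or into Matlab's `readtable`, which is precisely the interoperability the authors cite.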

  12. The Microbe Directory: An annotated, searchable inventory of microbes' characteristics.

    PubMed

    Shaaban, Heba; Westfall, David A; Mohammad, Rawhi; Danko, David; Bezdan, Daniela; Afshinnekoo, Ebrahim; Segata, Nicola; Mason, Christopher E

    2018-01-05

    The Microbe Directory is a collective research effort to profile and annotate more than 7,500 unique microbial species from the MetaPhlAn2 database that includes bacteria, archaea, viruses, fungi, and protozoa. By collecting and summarizing data on various microbes' characteristics, the project comprises a database that can be used downstream of large-scale metagenomic taxonomic analyses, allowing one to interpret and explore their taxonomic classifications to have a deeper understanding of the microbial ecosystem they are studying. Such characteristics include, but are not limited to: optimal pH, optimal temperature, Gram stain, biofilm-formation, spore-formation, antimicrobial resistance, and COGEM class risk rating. The database has been manually curated by trained student-researchers from Weill Cornell Medicine and CUNY-Hunter College, and its analysis remains an ongoing effort with open-source capabilities so others can contribute. Available in SQL, JSON, and CSV (i.e. Excel) formats, the Microbe Directory can be queried for the aforementioned parameters by a microorganism's taxonomy. In addition to the raw database, The Microbe Directory has an online counterpart ( https://microbe.directory/) that provides a user-friendly interface for storage, retrieval, and analysis into which other microbial database projects could be incorporated. The Microbe Directory was primarily designed to serve as a resource for researchers conducting metagenomic analyses, but its online web interface should also prove useful to any individual who wishes to learn more about any particular microbe.
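    Since the Microbe Directory's CSV export can be queried by characteristics such as Gram stain or spore formation, a filter over such a file is a one-liner with Python's `csv` module. The rows and column names below are invented for illustration and do not reflect the published schema:

```python
import csv
import io

# Illustrative rows in the spirit of the directory's CSV export.
CSV_DATA = """species,gram_stain,spore_forming,optimal_temp_c
Bacillus subtilis,positive,yes,30
Escherichia coli,negative,no,37
Clostridium difficile,positive,yes,37
"""

def query(text, **criteria):
    """Return species whose row matches every column=value criterion."""
    return [row["species"]
            for row in csv.DictReader(io.StringIO(text))
            if all(row[k] == v for k, v in criteria.items())]

# All Gram-positive spore formers in the toy data:
print(query(CSV_DATA, gram_stain="positive", spore_forming="yes"))
```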

  13. ProtPhylo: identification of protein-phenotype and protein-protein functional associations via phylogenetic profiling.

    PubMed

    Cheng, Yiming; Perocchi, Fabiana

    2015-07-01

    ProtPhylo is a web-based tool to identify proteins that are functionally linked to either a phenotype or a protein of interest based on co-evolution. ProtPhylo infers functional associations by comparing protein phylogenetic profiles (co-occurrence patterns of orthology relationships) for more than 9.7 million non-redundant protein sequences from all three domains of life. Users can query any of 2048 fully sequenced organisms, including 1678 bacteria, 255 eukaryotes and 115 archaea. In addition, they can tailor ProtPhylo to a particular kind of biological question by choosing among four main orthology inference methods based either on pair-wise sequence comparisons (One-way Best Hits and Best Reciprocal Hits) or clustering of orthologous proteins across multiple species (OrthoMCL and eggNOG). Next, ProtPhylo ranks phylogenetic neighbors of query proteins or phenotypic properties using the Hamming distance as a measure of similarity between pairs of phylogenetic profiles. Candidate hits can be easily and flexibly prioritized by complementary clues on subcellular localization, known protein-protein interactions, membrane spanning regions and protein domains. The resulting protein list can be quickly exported into a csv text file for further analyses. ProtPhylo is freely available at http://www.protphylo.org. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
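    The ranking step the abstract describes, Hamming distance between phylogenetic profiles, can be sketched directly. Profiles are encoded here as 0/1 strings (1 = ortholog present in that organism); the protein names and profiles are toy data, not ProtPhylo output:

```python
def hamming(a, b):
    """Hamming distance between two equal-length phylogenetic profiles."""
    if len(a) != len(b):
        raise ValueError("profiles must cover the same set of organisms")
    return sum(x != y for x, y in zip(a, b))

# Toy profiles over six organisms (illustrative only):
query_protein = "110101"
candidates = {"protA": "110100", "protB": "001010", "protC": "110101"}

# Rank candidates by similarity to the query, as ProtPhylo ranks
# phylogenetic neighbors: smaller distance = more similar co-evolution.
ranked = sorted(candidates, key=lambda p: hamming(query_protein, candidates[p]))
print(ranked)  # protC (distance 0) first, then protA (1), then protB (6)
```

    The appeal of the Hamming distance here is that it rewards proteins that appear and disappear together across genomes, the core signal of co-evolution-based function prediction.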

  14. d-Omix: a mixer of generic protein domain analysis tools.

    PubMed

    Wichadakul, Duangdao; Numnark, Somrak; Ingsriswang, Supawadee

    2009-07-01

    Domain combination provides important clues to the roles of protein domains in protein function, interaction and evolution. We have developed a web server, d-Omix (a Mixer of Protein Domain Analysis Tools), intended as a unified platform to analyze, compare and visualize protein data sets in various aspects of protein domain combinations. With InterProScan files for protein sets of interest provided by users, the server incorporates four services for domain analyses. First, it constructs a protein phylogenetic tree based on a distance matrix calculated from protein domain architectures (DAs), allowing comparison with a sequence-based tree. Second, it calculates and visualizes the versatility, abundance and co-presence of protein domains via a domain graph. Third, it compares the similarity of proteins based on DA alignment. Fourth, it builds a putative protein network derived from domain-domain interactions from DOMINE. Users may select a variety of input data files and flexibly choose domain search tools (e.g. hmmpfam, superfamily) for a specific analysis. Results from d-Omix can be interactively explored and exported into various formats such as SVG, JPG, BMP and CSV. Users with only protein sequences can prepare an InterProScan file using a service provided by the server as well. The d-Omix web server is freely available at http://www.biotec.or.th/isl/Domix.

  15. Batch Conversion of 1-D FITS Spectra to Common Graphical Display Files

    NASA Astrophysics Data System (ADS)

    MacConnell, Darrell J.; Patterson, A. P.; Wing, R. F.; Costa, E.; Jedrzejewski, R. I.

    2008-09-01

    Authors DJM, RFW, and EC have accumulated about 1000 spectra of cool stars from CTIO, ESO, and LCO over the interval 1985 to 1994 and processed them with the standard IRAF tasks into FITS files of normalized intensity vs. wavelength. With the growth of the Web as a means of exchanging and preserving scientific information, we desired to put the spectra into a Web-readable format. We searched sites such as the Goddard FITS Image Viewer page, http://fits.gsfc.nasa.gov/fits_viewer.html, without success for a program to convert a large number of 1-d stellar spectra from FITS format into common formats such as PDF, PS, or PNG. Author APP has written a Python script to do this using the PyFITS module and plotting routines from Pylab. The program determines the wavelength calibration using header keywords and creates PNG plots with a legend read from a CSV file that may contain the star name, position, spectral type, etc. It could readily be adapted to perform almost any kind of simple batch processing of astronomical data. The program may be obtained from the first author (jack@stsci.edu). Support for DJM from the research program for CSC astronomers at STScI is gratefully acknowledged. The Space Telescope Science Institute is operated by the Association of Universities for Research in Astronomy Inc. under NASA contract NAS 5-26555.

  16. NRF2-ome: an integrated web resource to discover protein interaction and regulatory networks of NRF2.

    PubMed

    Türei, Dénes; Papp, Diána; Fazekas, Dávid; Földvári-Nagy, László; Módos, Dezső; Lenti, Katalin; Csermely, Péter; Korcsmáros, Tamás

    2013-01-01

    NRF2 is the master transcriptional regulator of oxidative and xenobiotic stress responses. NRF2 has important roles in carcinogenesis, inflammation, and neurodegenerative diseases. We developed an online resource, NRF2-ome, to provide an integrated and systems-level database for NRF2. The database contains manually curated and predicted interactions of NRF2 as well as data from external interaction databases. We integrated NRF2 interactome with NRF2 target genes, NRF2 regulating TFs, and miRNAs. We connected NRF2-ome to signaling pathways to allow mapping upstream NRF2 regulatory components that could directly or indirectly influence NRF2 activity totaling 35,967 protein-protein and signaling interactions. The user-friendly website allows researchers without computational background to search, browse, and download the database. The database can be downloaded in SQL, CSV, BioPAX, SBML, PSI-MI, and in a Cytoscape CYS file formats. We illustrated the applicability of the website by suggesting a posttranscriptional negative feedback of NRF2 by MAFG protein and raised the possibility of a connection between NRF2 and the JAK/STAT pathway through STAT1 and STAT3. NRF2-ome can also be used as an evaluation tool to help researchers and drug developers to understand the hidden regulatory mechanisms in the complex network of NRF2.

  17. Major- and Trace-Element Concentrations in Rock Samples Collected in 2006 from the Taylor Mountains 1:250,000-scale Quadrangle, Alaska

    USGS Publications Warehouse

    Klimasauskas, Edward P.; Miller, Marti L.; Bradley, Dwight C.

    2007-01-01

    Introduction The Kuskokwim mineral belt of Bundtzen and Miller (1997) forms an important metallogenic region in southwestern Alaska that has yielded more than 3.22 million ounces of gold and 400,000 ounces of silver. Precious-metal and related deposits in this region associated with Late Cretaceous to early Tertiary igneous complexes extend into the Taylor Mountains 1:250,000-scale quadrangle. The U.S. Geological Survey is in the process of conducting a mineral resource assessment of this region. This report presents analytical data collected during the third year of this multiyear study. A total of 138 rock geochemistry samples collected during the 2006 field season were analyzed using the ICP-AES/MS42, ICP-AES10, fire assay, and cold vapor atomic absorption methods described in more detail below. Analytical values are provided in percent (% or pct: 1 gram per 100 grams), parts per million (ppm: 1 gram per 1,000,000 grams), or parts per billion (ppb: 1 gram per 1,000,000,000 grams) as indicated in the column heading of the data table. Data are provided for download in Excel (*.xls), comma delimited (*.csv), dBase 4 (*.dbf) and as a point coverage in ArcInfo interchange (*.e00) formats available at http://pubs.usgs.gov/of/2007/1386/.
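    The report states its concentration units as mass fractions: percent (1 g per 100 g), ppm (1 g per 10^6 g), and ppb (1 g per 10^9 g). A small sketch of the conversions implied by those definitions (the 0.5% example value is illustrative, not from the data table):

```python
# Conversions between the report's mass-fraction units are powers of ten:
def pct_to_ppm(pct):
    return pct * 1e4      # 1% = 10,000 ppm

def ppm_to_ppb(ppm):
    return ppm * 1e3      # 1 ppm = 1,000 ppb

# e.g. a hypothetical 0.5% value equals 5,000 ppm, or 5,000,000 ppb
print(pct_to_ppm(0.5), ppm_to_ppb(pct_to_ppm(0.5)))
```

    Keeping the conversions explicit matters when merging columns from the data table, since the unit (pct, ppm, or ppb) varies by column heading.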

  18. Cytogenetic effects of space radiation in lymphocytes of MIR-18 crews

    NASA Technical Reports Server (NTRS)

    Yang, T. C.; George, K.; Johnson, A. S.; Tavakoli, A.; Durante, M.; Fedorenko, B. S.

    1997-01-01

    For assessing health risk, the measurement of the physical dose received during a space mission, as well as of the LETs, energies, and charges of particles, is important. It is also important to obtain quantitative information regarding the effectiveness of space radiation in causing damage to critical biological targets, e.g., chromosomes, since at present the estimated uncertainty of the biological effects of space radiation is more than a factor of two. Such large uncertainty makes accurate health risk assessment very difficult. For this reason, a study on the cytogenetic effects of space radiation in human lymphocytes was proposed and conducted for the MIR-18 mission. This study used the FISH technique to score chromosomal translocations and the C-banding method to determine dicentrics. The growth kinetics of cells and SCE were examined to ensure that chromosomal aberrations were scored in the first mitosis and were not induced by chemical mutagens. Our results showed that the chromosomal aberration frequency of post-flight samples was significantly higher than that of pre-flight ones and that SCE frequency was similar between pre- and post-flight samples. Based on a dose-response curve of preflight samples exposed to gamma rays, the absorbed dose received by crews during the mission was estimated to be about 14.5 cSv. Because the absorbed dose measured by physical dosimeters is 4.16 cGy for the entire mission, the RBE is about 3.5.

  19. Simplifying the Analysis of Data from Multiple Heliophysics Instruments and Missions

    NASA Astrophysics Data System (ADS)

    Bazell, D.; Vandegriff, J. D.

    2014-12-01

    Understanding the intertwined plasma, particles and fields connecting the Sun and the Earth requires combining data from many diverse sources, but there are still many technological barriers that complicate the merging of data from different instruments and missions. We present an emerging data serving capability that provides a uniform way to access heterogeneous and distributed data. The goal of our data server is to provide a standardized data access mechanism that is identical for data of any format and layout (CDF, custom binary, FITS, netCDF, CSV and other flavors of ASCII, etc). Data remain in their original format and location (i.e., at instrument team sites or existing data centers), and our data server delivers a dynamically reformatted view of the data. Scientists can then use tools (clients that talk to the server) that offer a single interface for browsing, analyzing or downloading many different contemporary and legacy heliophysics data sets. Our current server accesses many CDF data resources at CDAWeb, as well as multiple other instrument team sites. Our webservice will be deployed on the Amazon Cloud at http://datashop.elasticbeanstalk.com/. Two basic clients will also be demonstrated: one in Java and one in IDL. Python, Perl, and Matlab clients are also planned. Complex missions such as Solar Orbiter and Solar Probe Plus will benefit greatly from tools that enable multi-instrument and multi-mission data comparison.

  20. Distribution of dissolved zinc in the western and central subarctic North Pacific

    NASA Astrophysics Data System (ADS)

    Kim, T.; Obata, H.; Gamo, T.

    2016-02-01

    Zinc (Zn) is an essential micronutrient for bacteria and phytoplankton in the ocean, as it plays an important role in numerous enzyme systems involved in various metabolic processes. However, large-scale distributions of total dissolved Zn in the subarctic North Pacific have not yet been investigated. In this study, we investigated the distributions of total dissolved Zn to understand the biogeochemical cycling of Zn in the western and central subarctic North Pacific as part of a Japanese GEOTRACES project. Seawater samples were collected during the R/V Hakuho-maru KH-12-4 GEOTRACES GP02 cruise (from August to October 2012), using acid-cleaned Teflon-coated X-type Niskin samplers. Total dissolved Zn in seawater was determined using cathodic stripping voltammetry (CSV) after UV digestion. Total dissolved Zn concentrations in the western and central subarctic North Pacific commonly showed an increase in Zn from the surface to approximately 400-500 m, just above the oxygen minimum layer. However, in the western subarctic North Pacific, relatively higher Zn concentrations were also observed at intermediate depths (800-1200 m) in comparison with those observed in deep waters. The relationship between Zn and Si in the western subarctic North Pacific showed that Zn is slightly enriched at intermediate depths. These results may indicate that there are additional sources of Zn to the intermediate water of the western subarctic North Pacific.

  1. COEUS: “semantic web in a box” for biomedical applications

    PubMed Central

    2012-01-01

    Background As the “omics” revolution unfolds, the growth in data quantity and diversity is bringing about the need for pioneering bioinformatics software, capable of significantly improving the research workflow. To cope with these computer science demands, biomedical software engineers are adopting emerging semantic web technologies that better suit the life sciences domain. The latter’s complex relationships are easily mapped into semantic web graphs, enabling a superior understanding of collected knowledge. Despite increased awareness of semantic web technologies in bioinformatics, their use is still limited. Results COEUS is a new semantic web framework, aiming at a streamlined application development cycle and following a “semantic web in a box” approach. The framework provides a single package including advanced data integration and triplification tools, base ontologies, a web-oriented engine and a flexible exploration API. Resources can be integrated from heterogeneous sources, including CSV and XML files or SQL and SPARQL query results, and mapped directly to one or more ontologies. Advanced interoperability features include REST services, a SPARQL endpoint and LinkedData publication. These enable the creation of multiple applications for web, desktop or mobile environments, and empower a new knowledge federation layer. Conclusions The platform, targeted at biomedical application developers, provides a complete skeleton ready for rapid application deployment, enhancing the creation of new semantic information systems. COEUS is available as open source at http://bioinformatics.ua.pt/coeus/. PMID:23244467
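    The "triplification" the COEUS abstract describes, mapping tabular sources such as CSV into semantic web graphs, can be illustrated with a toy converter from CSV rows to RDF-style triples. The base URI, columns, and predicate below are invented for the example and are not COEUS's actual mapping rules:

```python
import csv
import io

BASE = "http://example.org/"  # hypothetical namespace for the toy graph

CSV_DATA = """gene,disease
BRCA1,breast_cancer
CFTR,cystic_fibrosis
"""

def triplify(text, predicate="associatedWith"):
    """Map each CSV row to one (subject, predicate, object) triple."""
    triples = []
    for row in csv.DictReader(io.StringIO(text)):
        triples.append((BASE + row["gene"],
                        BASE + predicate,
                        BASE + row["disease"]))
    return triples

# Serialize in N-Triples style, one statement per line:
for s, p, o in triplify(CSV_DATA):
    print(f"<{s}> <{p}> <{o}> .")
```

    A real framework additionally maps the generated URIs onto ontology classes and properties, which is what makes the resulting graph queryable through a SPARQL endpoint.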

  2. Biodosimetry results from space flight Mir-18.

    PubMed

    Yang, T C; George, K; Johnson, A S; Durante, M; Fedorenko, B S

    1997-11-01

    Astronauts are classified as radiation workers due to the presence of ionizing radiation in space. For the assessment of health risks, physical dosimetry has been indispensable. However, the change of the location of dosimeters on the crew members, the variation in dose rate with location inside the spacecraft and the unknown biological effects of microgravity can introduce significant uncertainties in estimating exposure. To circumvent such uncertainty, a study on the cytogenetic effects of space radiation in human lymphocytes was proposed and conducted for Mir-18, a 115-day mission. This study used fluorescence in situ hybridization (FISH) with whole-chromosome painting probes to score chromosomal exchanges and the Giemsa staining method to determine the frequency of dicentrics. The growth kinetics of cells and sister chromatid exchanges (SCEs) were examined to ensure that chromosomal aberrations were scored in the first mitosis and were induced primarily by space radiation. Our results showed that the frequency of chromosomal aberrations increased significantly in postflight samples compared to samples drawn prior to flight, and that the frequency of SCEs was similar for both pre- and postflight samples. Based on a dose-response curve for preflight samples exposed to gamma rays, the absorbed dose received by crew members during the mission was estimated to be about 14.75 cSv. Because the absorbed dose measured by physical dosimeters is 5.2 cGy for the entire mission, the RBE is about 2.8.
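    The RBE reported above follows from the figures in the abstract: the biologically estimated dose (from the preflight gamma-ray dose-response calibration) divided by the physically measured absorbed dose, treating cSv and cGy as numerically comparable for this ratio:

```python
# Values as reported in the Mir-18 abstract:
biological_dose_cSv = 14.75  # estimated from chromosome-aberration frequency
physical_dose_cGy = 5.2      # measured by the mission's physical dosimeters

# Relative biological effectiveness = biological estimate / physical dose
rbe = biological_dose_cSv / physical_dose_cGy
print(round(rbe, 1))  # 2.8, matching the reported value
```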

  3. Biodosimetry results from space flight Mir-18

    NASA Technical Reports Server (NTRS)

    Yang, T. C.; George, K.; Johnson, A. S.; Durante, M.; Fedorenko, B. S.

    1997-01-01

    Astronauts are classified as radiation workers due to the presence of ionizing radiation in space. For the assessment of health risks, physical dosimetry has been indispensable. However, the change of the location of dosimeters on the crew members, the variation in dose rate with location inside the spacecraft and the unknown biological effects of microgravity can introduce significant uncertainties in estimating exposure. To circumvent such uncertainty, a study on the cytogenetic effects of space radiation in human lymphocytes was proposed and conducted for Mir-18, a 115-day mission. This study used fluorescence in situ hybridization (FISH) with whole-chromosome painting probes to score chromosomal exchanges and the Giemsa staining method to determine the frequency of dicentrics. The growth kinetics of cells and sister chromatid exchanges (SCEs) were examined to ensure that chromosomal aberrations were scored in the first mitosis and were induced primarily by space radiation. Our results showed that the frequency of chromosomal aberrations increased significantly in postflight samples compared to samples drawn prior to flight, and that the frequency of SCEs was similar for both pre- and postflight samples. Based on a dose-response curve for preflight samples exposed to gamma rays, the absorbed dose received by crew members during the mission was estimated to be about 14.75 cSv. Because the absorbed dose measured by physical dosimeters is 5.2 cGy for the entire mission, the RBE is about 2.8.

  4. Connectivity Reveals Sources of Predictive Coding Signals in Early Visual Cortex During Processing of Visual Optic Flow.

    PubMed

    Schindler, Andreas; Bartels, Andreas

    2017-05-01

    Superimposed on the visual feed-forward pathway, feedback connections convey higher level information to cortical areas lower in the hierarchy. A prominent framework for these connections is the theory of predictive coding where high-level areas send stimulus interpretations to lower level areas that compare them with sensory input. Along these lines, a growing body of neuroimaging studies shows that predictable stimuli lead to reduced blood oxygen level-dependent (BOLD) responses compared with matched nonpredictable counterparts, especially in early visual cortex (EVC) including areas V1-V3. The sources of these modulatory feedback signals are largely unknown. Here, we re-examined the robust finding of relative BOLD suppression in EVC evident during processing of coherent compared with random motion. Using functional connectivity analysis, we show an optic flow-dependent increase of functional connectivity between BOLD suppressed EVC and a network of visual motion areas including MST, V3A, V6, the cingulate sulcus visual area (CSv), and precuneus (Pc). Connectivity decreased between EVC and 2 areas known to encode heading direction: entorhinal cortex (EC) and retrosplenial cortex (RSC). Our results provide first evidence that BOLD suppression in EVC for predictable stimuli is indeed mediated by specific high-level areas, in accord with the theory of predictive coding. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. COEUS: "semantic web in a box" for biomedical applications.

    PubMed

    Lopes, Pedro; Oliveira, José Luís

    2012-12-17

    As the "omics" revolution unfolds, the growth in data quantity and diversity is bringing about the need for pioneering bioinformatics software, capable of significantly improving the research workflow. To cope with these computer science demands, biomedical software engineers are adopting emerging semantic web technologies that better suit the life sciences domain. The latter's complex relationships are easily mapped into semantic web graphs, enabling a superior understanding of collected knowledge. Despite increased awareness of semantic web technologies in bioinformatics, their use is still limited. COEUS is a new semantic web framework, aiming at a streamlined application development cycle and following a "semantic web in a box" approach. The framework provides a single package including advanced data integration and triplification tools, base ontologies, a web-oriented engine and a flexible exploration API. Resources can be integrated from heterogeneous sources, including CSV and XML files or SQL and SPARQL query results, and mapped directly to one or more ontologies. Advanced interoperability features include REST services, a SPARQL endpoint and LinkedData publication. These enable the creation of multiple applications for web, desktop or mobile environments, and empower a new knowledge federation layer. The platform, targeted at biomedical application developers, provides a complete skeleton ready for rapid application deployment, enhancing the creation of new semantic information systems. COEUS is available as open source at http://bioinformatics.ua.pt/coeus/.

  6. Potential role of nuclear PD-L1 expression in cell-surface vimentin positive circulating tumor cells as a prognostic marker in cancer patients.

    PubMed

    Satelli, Arun; Batth, Izhar Singh; Brownlee, Zachary; Rojas, Christina; Meng, Qing H; Kopetz, Scott; Li, Shulin

    2016-07-01

    Although circulating tumor cells (CTCs) have potential as diagnostic biomarkers for cancer, determining their prognostic role in cancer patients undergoing treatment is a challenge. We evaluated the prognostic value of programmed death-ligand 1 (PD-L1) expression in CTCs in colorectal and prostate cancer patients undergoing treatment. Peripheral blood samples were collected from 62 metastatic colorectal cancer patients and 30 metastatic prostate cancer patients. CTCs were isolated from the samples using magnetic separation with the cell-surface vimentin (CSV)-specific 84-1 monoclonal antibody that detects epithelial-mesenchymal transitioned (EMT) CTCs. CTCs were enumerated and analyzed for PD-L1 expression using confocal microscopy. PD-L1 expression was detectable in CTCs and was localized in the membrane and/or cytoplasm and nucleus. CTC detection alone was not associated with poor progression-free or overall survival in colorectal cancer or prostate cancer patients, but nuclear PD-L1 (nPD-L1) expression in these patients was significantly associated with short survival durations. These results demonstrated that nPD-L1 has potential as a clinically relevant prognostic biomarker for colorectal and prostate cancer. Our data thus suggested that using CTC-based models of cancer for risk assessment can improve the standard cancer staging criteria, and supported incorporating the detection of nPD-L1 expression in CTCs into such models.

  7. An investigation of siderophore production by oceanic Synechococcus

    NASA Astrophysics Data System (ADS)

    Wisniewski, R. J.; Webb, E. A.; Moffett, J. W.

    2003-04-01

    Cyanobacteria are significant contributors to global primary production. They can be found in warm high-nutrient, low-chlorophyll regions where low concentrations of iron are thought to limit primary productivity. Determining how these organisms obtain iron is critical to understanding the biogeochemical cycle of iron and its role as a determinant of marine primary production. Siderophore production has been observed in halotolerant freshwater cyanobacteria (see C.G. Trick and co-authors) and marine heterotrophic bacteria (see A. Butler, M.G. Haygood and co-authors), but to date, siderophore production in truly marine cyanobacteria has not been demonstrated. We examined the response of two marine Synechococcus species (WH7803 and WH8102) to iron stress. Axenic cultures of both Synechococcus species were grown under iron-stressed and iron-replete conditions. The supernatants of these cultures were examined using competitive ligand exchange-cathodic stripping voltammetry (CLE-CSV), a sensitive method of quantitative ligand detection. Observing ligand accumulation in culture is an analytical challenge due to the low cell densities and reduced growth rates of iron stressed marine cyanobacteria. Preliminary results suggest the presence of an iron-binding ligand in the iron-stressed cultures which was not present under iron-replete conditions. The amount of ligand produced by Synechococcus was approximately 1 × 10-18 mol/cell, comparable with the amount produced by marine heterotrophic bacteria (K. Barbeau, pers. comm.).

  8. Space Technology Plasma Issues in 2001

    NASA Technical Reports Server (NTRS)

    Garrett, Henry (Editor); Feynman, Joan (Editor); Gabriel, Stephen (Editor)

    1986-01-01

    The purpose of the workshop was to identify and discuss plasma issues that need to be resolved during the next 10 to 20 years (circa 2001) to facilitate the development of the advanced space technology that will be required 20 or 30 years into the future. The workshop consisted of 2 days of invited papers and 2 sessions of contributed poster papers. During the third day the meeting broke into 5 working groups, each of which held discussions and then reported back to the conference as a whole. The five panels were: Measurements Technology and Active Experiments Working Group; Advanced High-Voltage, High-Power and Energy-Storage Space Systems Working Group; Large Structures and Tethers Working Group; Plasma Interactions and Surface/Materials Effects Working Group; and Beam Plasmas, Electronic Propulsion and Active Experiments Using Beams Working Group.

  9. Use of focus groups to study absenteeism due to illness.

    PubMed

    Høverstad, T; Kjølstad, S

    1991-10-01

    Reasons for sick leaves are often complex and influenced by nonmedical factors. We have used focus groups, a qualitative research method, to study the relationship between working conditions and absenteeism due to illness in both an industrial company and an insurance company. We organized 10 focus groups within each company, with participants randomly selected from departments having similar work tasks within each company. According to the groups, the most important working conditions that influenced absenteeism were (a) feeling of well-being at work (mainly defined as security in social relations), (b) the organization of the work, and (c) the department leader. Factors considered to be less important included: number of employees, male/female ratios, group norms for absenteeism, age distribution, work-related illness, substance abuse, and work loads. There was substantial agreement between the groups, indicating that our findings may be relevant to other companies.

  10. Food and nutrient intake among workers with different shift systems.

    PubMed

    Hemiö, Katri; Puttonen, Sampsa; Viitasalo, Katriina; Härmä, Mikko; Peltonen, Markku; Lindström, Jaana

    2015-07-01

    Over 20% of employees in Europe work in shifts. Shift work increases the risk for chronic diseases, but a healthy lifestyle may attenuate the adverse effects of shift work. The aim of this study was to explore food and nutrient intake differences between working time groups. The participants were 1478 employees (55% men) of an airline, divided into three working time groups: day work (n=608), shift work without in-flight work (n=541) and in-flight work (n=329). Measures included laboratory tests, physical measurements, a questionnaire, and food and nutrient intake estimations by a validated 16-item food intake questionnaire. Shift-working men were less likely to consume vegetables (p<0.001) and fruits (p=0.049) daily than male day and in-flight workers. In women, energy intake from saturated fat was higher among shift workers compared with day workers (12.6 vs 12.2 E%, p=0.023). In older female participants, energy intake from fat and saturated fat was higher in the shift work and in-flight work groups than in the day work group (p<0.001). In this study, shift work and working environment were associated with dietary habits, and this association was not explained by other characteristics such as workers' educational level. Shift workers' increased risk for chronic diseases should be taken into account, and lifestyle counselling including advice on nutrition should be incorporated into the routine occupational healthcare of shift workers. Published by the BMJ Publishing Group Limited.

  11. Individual and group-level job resources and their relationships with individual work engagement.

    PubMed

    Füllemann, Désirée; Brauchli, Rebecca; Jenny, Gregor J; Bauer, Georg F

    2016-06-16

    This study adds a multilevel perspective to the well-researched individual-level relationship between job resources and work engagement. In addition, we explored whether individual job resources cluster within work groups because of a shared psychosocial environment and investigated whether a resource-rich psychosocial work group environment is beneficial for employee engagement over and above the beneficial effect of individual job resources and independent of their variability within groups. Data of 1,219 employees nested in 103 work groups were obtained from a baseline employee survey of a large stress management intervention project implemented in six medium and large-sized organizations in diverse sectors. A variety of important job resources were assessed and grouped to an overall job resource factor with three subfactors (manager behavior, peer behavior, and task-related resources). Data were analyzed using multilevel random coefficient modeling. The results indicated that job resources cluster within work groups and can be aggregated to a group-level job resources construct. However, a resource-rich environment, indicated by high group-level job resources, did not additionally benefit employee work engagement but on the contrary, was negatively related to it. On the basis of this unexpected result, replication studies are encouraged and suggestions for future studies on possible underlying within-group processes are discussed. The study supports the presumed value of integrating work group as a relevant psychosocial environment into the motivational process and indicates a need to further investigate emergent processes involved in aggregation procedures across levels.
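Whether job resources really "cluster within work groups," and so can be aggregated to a group-level construct, is conventionally checked with an intraclass correlation such as ICC(1) = (MSB − MSW) / (MSB + (k − 1)·MSW) from a one-way ANOVA. A minimal sketch on invented scores (the data, group sizes, and interpretation threshold are assumptions, not the study's):

```python
# ICC(1) from a one-way ANOVA decomposition; invented data, equal group sizes.
groups = [
    [4.1, 4.3, 3.9, 4.2],   # work group A: individual job-resource scores (assumed)
    [2.8, 3.0, 2.9, 3.1],   # work group B
    [3.6, 3.4, 3.7, 3.5],   # work group C
]
k = len(groups[0])                       # members per group
n = len(groups)                          # number of groups
grand = sum(x for g in groups for x in g) / (n * k)

# Between-groups and within-groups mean squares.
msb = sum(k * (sum(g) / k - grand) ** 2 for g in groups) / (n - 1)
msw = sum((x - sum(g) / k) ** 2 for g in groups for x in g) / (n * (k - 1))

icc1 = (msb - msw) / (msb + (k - 1) * msw)
print(round(icc1, 2))  # → 0.94; values near 1 mean scores cluster tightly within groups
```

The study itself used multilevel random coefficient modeling, which estimates the same within/between variance partition while also fitting the group-level effect on engagement.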

  12. Reading Balint group work through Lacan's theory of the four discourses.

    PubMed

    Van Roy, Kaatje; Marché-Paillé, Anne; Geerardyn, Filip; Vanheule, Stijn

    2016-02-05

    In Balint groups, (para)medical professionals explore difficult interactions with patients by means of case presentations and discussions. As the process of Balint group work is not well understood, this article investigates Balint group meetings by making use of Lacan's theory of the four discourses. Five Balint group case presentations and their subsequent group discussion were studied, resulting in the observation of five crucial aspects of Balint group work. First, Balint group participants brought puzzlement to the group, which is indicative of the structural impossibility Lacan situates at the basis of all discourse (1). As for the group discussion, we emphasize 'hysterization' as a crucial process in Balint group work (2), the supporting role of the discourse of the analyst (3) and the centrality of discourse interactions (4). Finally, the potential transformation of the initial puzzlement is discussed (5). We conclude by putting forth the uniqueness of Balint group work as well as the potential usefulness of our analysis as a framework for Balint group leaders and professionals in charge of continuing medical education. © The Author(s) 2016.

  13. Work-Anxiety and Sickness Absence After a Short Inpatient Cognitive Behavioral Group Intervention in Comparison to a Recreational Group Meeting.

    PubMed

    Muschalla, Beate; Linden, Michael; Jöbges, Michael

    2016-04-01

    The aim of this study was to examine the effects of a short-term cognitive behavior therapy on work-anxiety and sickness absence in patients with work-anxiety. Three hundred forty-five inpatients who suffered from cardiologic, neurological, or orthopedic problems and additionally from work-anxiety were randomly assigned to two different group interventions. Patients received four sessions of a group intervention, which focused either on cognitive behavior therapy anxiety management (work-anxiety coping group, WAG) or on unspecific recreational activities (recreational group, RG). No differences were found between the WAG and RG for work-anxiety and subjective work ability. When looking at patients who were suffering only from work-anxiety, with no additional mental disorder, the duration of sickness absence until the 6-month follow-up was shorter in the WAG (WAG: 11 weeks, RG: 16 weeks, P = 0.050). A short-term WAG may help patients with work-anxiety return to work, as long as there is no comorbid mental disorder.

  14. "Journal for Specialists in Group Work" ("JSGW") Publication Pattern Review: A Meta-Study of Author and Article Characteristics from 1981-2010

    ERIC Educational Resources Information Center

    Byrd, Rebekah; Crockett, Stephanie A.; Erford, Bradley T.

    2012-01-01

    "The Journal for Specialists in Group Work" ("JSGW") is the journal of the Association for Specialists in Group Work (ASGW), a division of the American Counseling Association (ACA). "JSGW" publishes articles related to "group work theory, interventions, training, current issues, and research" (ASGW, 2011). "JSGW" was first published in 1976 and is…

  15. The Forest Genetic Resources Working Group of the North American Forestry Commission (FAO)

    Treesearch

    Ronald C. Schmidtling

    2002-01-01

    The Forest Genetic Resources Working Group (FGRWG) is one of seven working groups established by the North American Forestry Commission (NAFC). The NAFC is one of six forestry commissions established by the Food and Agriculture Organization (FAO). The FGRWG was established by the NAFC in 1961 as the Working Group on Forest Tree Improvement but went through several changes...

  16. How Much "Group" Is There in Online Group Work?

    ERIC Educational Resources Information Center

    Lowes, Susan

    2014-01-01

    The ability to work in groups across time and space has become a frequent requirement for the workplace and is increasingly common in higher education, but there is a surprising lack of research on how online groups work. This research applies analytic approaches used in studies of face-to-face classroom "talk" to multiple groups in two…

  17. 77 FR 17569 - United States-Canada Regulatory Cooperation Council (RCC)-Transportation-Dangerous Goods Working...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-26

    ... identified in the Joint Action Plan, the Transportation--Dangerous Goods Working Group led by senior...)-- Transportation--Dangerous Goods Working Group AGENCY: Pipeline and Hazardous Materials Safety Administration...--Dangerous Goods Working Group, of the United States-Canada Regulatory Cooperation Council (RCC). Comments...

  18. The working alliance in a randomized controlled trial comparing online with face-to-face cognitive-behavioral therapy for depression

    PubMed Central

    2011-01-01

    Background Although numerous efficacy studies in recent years have found internet-based interventions for depression to be effective, there has been scant consideration of therapeutic process factors in the online setting. In face-to-face therapy, the quality of the working alliance explains variance in treatment outcome. However, little is yet known about the impact of the working alliance in internet-based interventions, particularly as compared with face-to-face therapy. Methods This study explored the working alliance between client and therapist in the middle and at the end of a cognitive-behavioral intervention for depression. The participants were randomized to an internet-based treatment group (n = 25) or a face-to-face group (n = 28). Both groups received the same cognitive behavioral therapy over an 8-week timeframe. Participants completed the Beck Depression Inventory (BDI) post-treatment and the Working Alliance Inventory at mid- and post-treatment. Therapists completed the therapist version of the Working Alliance Inventory at post-treatment. Results With the exception of therapists' ratings of the tasks subscale, which were significantly higher in the online group, the two groups' ratings of the working alliance did not differ significantly. Further, significant correlations were found between clients' ratings of the working alliance and therapy outcome at post-treatment in the online group and at both mid- and post-treatment in the face-to-face group. Correlation analysis revealed that the working alliance ratings did not significantly predict the BDI residual gain score in either group. Conclusions Contrary to what might have been expected, the working alliance in the online group was comparable to that in the face-to-face group. However, the results showed no significant relations between the BDI residual gain score and the working alliance ratings in either group. Trial registration ACTRN12611000563965 PMID:22145768

  19. Student perception of group dynamics predicts individual performance: Comfort and equity matter

    PubMed Central

    Theobald, Elli J.; Eddy, Sarah L.; Grunspan, Daniel Z.; Wiggins, Benjamin L.

    2017-01-01

    Active learning in college classes and participation in the workforce frequently hinge on small group work. However, group dynamics vary, ranging from equitable collaboration to dysfunctional groups dominated by one individual. To explore how group dynamics impact student learning, we asked students in a large-enrollment university biology class to self-report their experience during in-class group work. Specifically, we asked students whether there was a friend in their group, whether they were comfortable in their group, and whether someone dominated their group. Surveys were administered after students participated in two different types of intentionally constructed group activities: 1) a loosely-structured activity wherein students worked together for an entire class period (termed the ‘single-group’ activity), or 2) a highly-structured ‘jigsaw’ activity wherein students first independently mastered different subtopics, then formed new groups to peer-teach their respective subtopics. We measured content mastery by the change in score on identical pre-/post-tests. We then investigated whether activity type or student demographics predicted the likelihood of reporting working with a dominator, being comfortable in their group, or working with a friend. We found that students who more strongly agreed that they worked with a dominator were 17.8% less likely to answer an additional question correct on the 8-question post-test. Similarly, when students were comfortable in their group, content mastery increased by 27.5%. Working with a friend was the single biggest predictor of student comfort, although working with a friend did not impact performance. Finally, we found that students were 67% less likely to agree that someone dominated their group during the jigsaw activities than during the single group activities. 
We conclude that group activities that rely on positive interdependence and include turn-taking and explicit prompts for students to explain their reasoning, such as our jigsaw, can help reduce the negative impact of inequitable groups. PMID:28727749
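Content mastery measured "by the change in score on identical pre-/post-tests" can be computed as a raw gain or, commonly, as a Hake-style normalized gain; the abstract does not say which convention the study used, so both are shown here on invented responses to the 8-question test:

```python
# Invented student responses to the 8-question pre-/post-test.
MAX_SCORE = 8
students = {"s1": (3, 6), "s2": (5, 7), "s3": (4, 4)}  # name -> (pre, post)

for name, (pre, post) in students.items():
    raw_gain = post - pre
    # Hake-style normalized gain: fraction of available headroom recovered.
    norm_gain = raw_gain / (MAX_SCORE - pre) if pre < MAX_SCORE else 0.0
    print(name, raw_gain, round(norm_gain, 2))
```

Normalizing by headroom keeps students who started near the ceiling (and so could gain few raw points) comparable with students who started low.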

  20. Health in Transportation Working Group 2016 Annual Report

    DOT National Transportation Integrated Search

    2017-06-30

    The Health in Transportation Working Group 2016 Annual Report provides an overview of the Working Group's activities and accomplishments in 2016, summarizes other USDOT health-related accomplishments, and documents its progress toward the recommend...

  1. The Physics and Applications of High Brightness Beams: Working Group C Summary on Applications to FELS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nuhn, Heinz-Dieter

    2003-03-19

    This is the summary of the activities in working group C, ''Application to FELs,'' which was based in the Bithia room at the Joint ICFA Advanced Accelerator and Beam Dynamics Workshop on July 1-6, 2002 in Chia Laguna, Sardinia, Italy. Working group C was small in relation to the other working groups at that workshop. Attendees included Enrica Chiadroni, University of Rome ''La Sapienza''; Luca Giannessi, ENEA; Steve Lidia, LBNL; Vladimir Litvinenko, Duke University; Patrick Muggli, UCLA; Alex Murokh, UCLA; Heinz-Dieter Nuhn, SLAC; Sven Reiche, UCLA; Jamie Rosenzweig, UCLA; Claudio Pellegrini, UCLA; Susan Smith, Daresbury Laboratory; Matthew Thompson, UCLA; and Alexander Varfolomeev, Russian Research Center; plus a small number of occasional visitors. The working group addressed a total of nine topics. Each topic was introduced by a presentation, which initiated a discussion of the topic during and after the presentation; the speaker of the introductory presentation facilitated the discussion. Six topics were treated in stand-alone sessions of working group C. In addition, there were two joint sessions: one with working group B, which included one topic, and one with working group D, which included two topics. The presentations given in the joint sessions are summarized in the working group summary reports for groups B and D, respectively. This summary discusses only the topics addressed in the stand-alone sessions, including Start-To-End Simulations, SASE Experiment, PERSEO, ''Optics Free'' FEL Oscillators, and VISA II.

  2. Work experiences among attendees of day centres for people with psychiatric disabilities.

    PubMed

    Eklund, Mona; Sandlund, Mikael

    2015-01-01

    It is possible that people with psychiatric disabilities who visit day centres have previous work experiences that may serve as resources for their current engagement in day centre activities. Research in this area seems to be lacking, however. The aim was to investigate work experiences among attendees at day centres for people with psychiatric disabilities and relationships with current type of day centre (work-oriented, meeting place-oriented or mixed), engagement in day centre activities, motivation, and socio-demographic and health-related factors. Seventy-seven attendees responded to questionnaires. The Global Assessment of Functioning (GAF) was also used. Work was categorised into Group I (professionals, semi-professionals), Group II (clerical support, services workers) and Group III (e.g. craft workers, elementary occupations). Almost everyone had previously had open-market employment; more than half for ≥ 10 years. Group I was more common in mixed centres, Group II in meeting place-oriented ones and Group III in work-oriented ones. Group I attendees more frequently had a college degree and were rated high on GAF functioning. Women were over-represented in Group II, and men in Group III and in meeting place-oriented centres. Attending a mixed centre was more likely for those with a college degree, a high GAF functioning rating and high engagement in activities. Attendees at work-oriented day centres were characterised by being motivated to spend time alone and by reporting a diagnosis of psychosis. The participants had unused working capacity. No clear-cut relationships were found between work experiences and the investigated correlates.

  3. Technologies That Assist in Online Group Work: A Comparison of Synchronous and Asynchronous Computer Mediated Communication Technologies on Students' Learning and Community

    ERIC Educational Resources Information Center

    Rockinson-Szapkiw, Amanda; Wendt, Jillian

    2015-01-01

    While the benefits of online group work completed using asynchronous CMC technology are documented, researchers have identified a number of challenges that result in ineffective and unsuccessful online group work. Fewer channels of communication and a lack of immediacy compared with face-to-face group work are a few of the noted limitations. Thus,…

  4. Can the Enhancement of Group Working in Classrooms Provide a Basis for Effective Communication in Support of School-Based Cognitive Achievement in Classrooms of Young Learners?

    ERIC Educational Resources Information Center

    Kutnick, Peter; Berdondini, Lucia

    2009-01-01

    This quasi-experimental study was part of the SPRinG project (Social Pedagogy Research into Group Work). The review notes group work in "authentic" classrooms rarely fulfils its interactive or attainment potential. SPRinG classes undertook a programme of relational training to enhance children's group working skills while control classes…

  5. Introduction to NASA Living With a Star (LWS) Institute GIC Working Group Special Collection

    NASA Technical Reports Server (NTRS)

    Pulkkinen, A.

    2017-01-01

    This paper is a brief introduction to the NASA Living With a Star (LWS) Institute GIC Working Group Special Collection, which is the product of work by a group of researchers from more than 20 different international organizations. In this introductory paper, I summarize the group's work in the context of the novel NASA LWS Institute element and introduce the individual contributions in the collection.

  6. Status of Laser/Lidar Working Group Requirements

    NASA Technical Reports Server (NTRS)

    Kavaya, Michael J.; Gentry, Bruce M.

    2006-01-01

    This viewgraph presentation reviews the status of the development of the requirements by the Laser/Lidar working group. Included in the presentation is another viewgraph report on the NASA Earth Science Technology Office (ESTO) Laser/Lidar working group, by the chairperson of the working group. Some of the uses of Laser and Lidar in earth sciences are reviewed and a roadmap for the future use of the technology is included.

  7. Strategies for Successful Group Work

    ERIC Educational Resources Information Center

    Nipp, Mary Beth; Palenque, Stephanie Maher

    2017-01-01

    The thought of group work, or CLC Groups often strikes fear and loathing in the hearts and minds of both students and instructors. According to Swan, Shen, and Hiltz (2006) collaborative work presents the possibilities of many difficulties including a largely unequal contribution of group participants, an inability of the students to manage the…

  8. 76 FR 584 - Glen Canyon Dam Adaptive Management Program Work Group (AMWG)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-05

    ... 2010 expenditures, (2) updates on High Flow Experimental Protocol and the Non-native Fish Control... Group (AMWG) AGENCY: Bureau of Reclamation, Interior. ACTION: Notice of public meeting. SUMMARY: The... committee, the Adaptive Management Work Group (AMWG), a technical work group (TWG), a Grand Canyon...

  9. Group Work as Facilitation of Spiritual Development for Drug and Alcohol Abusers.

    ERIC Educational Resources Information Center

    Page, Richard C.; Berkow, Daniel N.

    1998-01-01

    Describes group work designed to promote spiritual development with drug and alcohol abusers. Provides a definition of spirituality. Discusses research that relates to the spiritual development of members of drug and alcohol groups. Compares the ways that group work and Alcoholics Anonymous promote spiritual development. (Author/MKA)

  10. 77 FR 55903 - Confirmation, Portfolio Reconciliation, Portfolio Compression, and Swap Trading Relationship...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    ... and MSPs, trade associations, public interest groups, traders, and other interested parties. In... for the Proposed Rules The Working Group of Commercial Energy Firms (The Working Group) [[Page 55905... CEA. The Working Group believes that the Commission could meet its statutory mandate by publishing...

  11. Impact of Group Development Knowledge on Students' Perceived Importance and Confidence of Group Work Skills

    ERIC Educational Resources Information Center

    Coers, Natalie; Williams, Jennifer

    2010-01-01

    This study explored the impact of emphasis on the group development process on the perceived importance of and confidence in group work skills and students' perception of group work use in the collegiate classroom as developed by Tuckman and Jensen (1977). The purposive sample utilized in this study included 33 undergraduate students enrolled in…

  12. The Power and Promise of Group Work: Consumer Evaluation of Group Work Services in Gauteng, South Africa

    ERIC Educational Resources Information Center

    Rasool, Shahana; Ross, Eleanor

    2017-01-01

    Purpose: In light of the limited research into consumers' experiences of group work services in South Africa, the study evaluated groups offered by a range of social service agencies in Gauteng to determine whether group interventions were perceived by users as developmental and empowering. Methods: Program evaluation was employed to evaluate 47…

  13. 29 CFR 42.4 - Structure of the National Committee.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... responsibilities. (d) There shall be a National Committee staff level working group consisting of senior staff... Secretary shall be the director of the staff level working group. (f) The staff level working group shall...

  14. 77 FR 24759 - Aviation Rulemaking Advisory Committee Meeting on Transport Airplane and Engine Issues

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-25

    ... Harmonization Working Group Report. Materials Flammability Working Group Report. Aging Airplanes Working Group... meeting documents, please contact the person listed in the FOR FURTHER INFORMATION CONTACT section. Sign...

  15. 78 FR 54482 - Charter Renewal, Glen Canyon Dam Adaptive Management Work Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-04

    .... SUMMARY: Following consultation with the General Services Administration, notice is hereby given that the... Work Group. The purpose of the Adaptive Management Work Group is to provide advice and recommendations...

  16. Individual and group-level job resources and their relationships with individual work engagement

    PubMed Central

    Füllemann, Désirée; Brauchli, Rebecca; Jenny, Gregor J.; Bauer, Georg F.

    2016-01-01

    Objectives: This study adds a multilevel perspective to the well-researched individual-level relationship between job resources and work engagement. In addition, we explored whether individual job resources cluster within work groups because of a shared psychosocial environment and investigated whether a resource-rich psychosocial work group environment is beneficial for employee engagement over and above the beneficial effect of individual job resources and independent of their variability within groups. Methods: Data of 1,219 employees nested in 103 work groups were obtained from a baseline employee survey of a large stress management intervention project implemented in six medium and large-sized organizations in diverse sectors. A variety of important job resources were assessed and grouped to an overall job resource factor with three subfactors (manager behavior, peer behavior, and task-related resources). Data were analyzed using multilevel random coefficient modeling. Results: The results indicated that job resources cluster within work groups and can be aggregated to a group-level job resources construct. However, a resource-rich environment, indicated by high group-level job resources, did not additionally benefit employee work engagement but on the contrary, was negatively related to it. Conclusions: On the basis of this unexpected result, replication studies are encouraged and suggestions for future studies on possible underlying within-group processes are discussed. The study supports the presumed value of integrating work group as a relevant psychosocial environment into the motivational process and indicates a need to further investigate emergent processes involved in aggregation procedures across levels. PMID:27108639

  17. How much structuring is beneficial with regard to examination scores? A prospective study of three forms of active learning.

    PubMed

    Reinhardt, Claus H; Rosen, Evelyne N

    2012-09-01

    Many studies have demonstrated a superiority of active learning forms compared with traditional lecture. However, there is still debate as to what degree of structuring is necessary for high exam outcomes. Seventy-five students from a premedical school were randomly assigned to an active lecture group, a cooperative learning group, or a collaborative learning group. The active lecture group received lectures with questions to resolve at the end of each lecture. At the same time, the cooperative and collaborative groups had to work on a problem and prepare presentations of their answers. The collaborative group worked in a mostly self-directed manner; the cooperative group had to follow a time schedule. For the additional work of preparing the poster presentation, the collaborative and cooperative groups were allowed 50% more working time. In part 1, all groups worked on the citric acid cycle; in part 2, all groups worked on molecular genetics. At the end of each part, all three groups took the same exam. Additionally, in the collaborative and cooperative groups, the presentations were marked. All evaluations were performed by two independent examiners. Exam results of the active lecture group were highest. Results of the cooperative group were nonsignificantly lower than those of the active lecture group and significantly higher than those of the collaborative group. Presentation quality was nonsignificantly higher in the collaborative group than in the cooperative group. This study shows that active lecturing produced the highest exam results, which differed significantly from the collaborative learning results. The additional elaboration in the cooperative and collaborative learning settings yielded high presentation quality but apparently did not further improve exam scores. Cooperative learning seems to be a good compromise when both high exam and presentation scores are expected.
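Comparing mean exam scores between two of the three conditions can be sketched with Welch's t statistic, which does not assume equal variances. The abstract does not state which test the study used, and the scores below are invented:

```python
import math

# Invented exam scores for two of the three conditions (the study's raw
# data are not in the abstract); Welch's t is one standard comparison.
active_lecture = [82, 78, 90, 85, 88, 79]
collaborative  = [70, 75, 68, 72, 74, 71]

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

# Welch standard error uses each group's own variance and size.
se = math.sqrt(var(active_lecture) / len(active_lecture) +
               var(collaborative) / len(collaborative))
t = (mean(active_lecture) - mean(collaborative)) / se
print(round(t, 2))  # a large positive t favours the active-lecture group
```

With three groups the study more plausibly used an ANOVA with post-hoc pairwise comparisons; the pairwise sketch above is the building block of that follow-up step.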

  18. The Effect of Group Work on English Vocabulary Learning

    ERIC Educational Resources Information Center

    Lin, Su-Fei

    2018-01-01

    This study investigated the effectiveness of group work (GW) in EFL vocabulary learning by second year, non-English major, university students in Taiwan, in comparison with working individually (IW). The students (N = 44) worked in mixed ability groups of 3-4 or in IW to complete vocabulary exercises following reading activities. The classroom…

  19. 78 FR 26382 - Advisory Committee on Commercial Operations of Customs and Border Protection (COAC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-06

    ... Completed by the Export Mapping Working Group (EMWG) to date. 3. Review and Discuss the Global Supply Chain Subcommittee's Air Cargo Advance Screening (ACAS) Working Group and address Next Steps regarding Land Border... the Trusted Trader Subcommittee and the Work Completed by the Industry Standards Working Group (ISWG...

  20. 77 FR 66856 - Merchant Marine Personnel Advisory Committee: Intercessional Meeting AGENCY: Coast Guard, DHS

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-07

    ... Committee Working Group Meeting. SUMMARY: A working group of the Merchant Marine Personnel Advisory Committee (MERPAC) will meet to work on Task Statement 77 concerning the development of new performance...-Technical Ratings. This meeting will be open to the public. DATES: A MERPAC working group will meet on...

  1. 75 FR 52807 - Aviation Rulemaking Advisory Committee; Transport Airplane and Engine Issues-New Task

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-27

    ...'s Transport Airplane and Engine Issues and has established a new Materials Flammability Working... International Aircraft Materials Fire Test Working Group. The working group is sponsored by the FAA's William J... implementation. FAA will provide ARAC with the proposed approach. The ARAC working group is expected to produce a...

  2. 75 FR 35458 - National Drinking Water Advisory Council's Climate Ready Water Utilities Working Group Meeting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-22

    ... supportive environment in which a utility can take steps to be climate ready. In this meeting, the Working... Ready Water Utilities Working Group Meeting Announcement AGENCY: Environmental Protection Agency (EPA... fourth in-person meeting of the Climate Ready Water Utilities (CRWU) Working Group of the National...

  3. The Effects of Work Group Structure on Social Psychological Aspects of the Human Organization.

    ERIC Educational Resources Information Center

    Fine, B. D.

    To investigate the effects of work group structure on measures of organizational behavior, questionnaire data from employees in a department characterized by complex, unstable work group structure and variable supervisory reporting relationships were compared with data from similar employees in two departments characterized by stable work group…

  4. Consequences of work group manpower and expertise understaffing: A multilevel approach.

    PubMed

    Hudson, Cristina K; Shen, Winny

    2018-01-01

Complaints of chronic understaffing in organizations have become common among workers as employers face increasing pressure to do more with less. Unfortunately, despite its prevalence, there is currently limited research in the literature on the nature of workplace understaffing and its consequences. Taking a multilevel approach, this study introduces a new multidimensional conceptualization of subjective work group understaffing, comprising manpower and expertise understaffing, and examines both its performance and well-being consequences for individual workers (Study 1) and work groups (Study 2). Results show that the relationship between work group understaffing and both individual and work group emotional exhaustion is mediated by quantitative workload and role ambiguity at both levels of analysis. Work group understaffing was also related to individual job performance, but not group performance, and this relationship was mediated by role ambiguity. Results were generally similar for the 2 dimensions of understaffing. Implications for theory and research and future research directions are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  5. Health in Transportation Working Group 2015 Annual Report

    DOT National Transportation Integrated Search

    2016-06-30

The Health in Transportation Working Group 2015 Annual Report provides an overview of the Working Group's activities and accomplishments in 2015, summarizes other U.S. DOT health-related accomplishments, and documents its progress toward the reco...

  6. Coaching positively influences the effects of working memory training on visual working memory as well as mathematical ability.

    PubMed

    Nelwan, Michel; Vissers, Constance; Kroesbergen, Evelyn H

    2018-05-01

The goal of the present study was to test whether the amount of coaching influenced the results of working memory training on both visual and verbal working memory. Additionally, the effects of the working memory training on the amount of progress after specific training in mathematics were evaluated. In this study, 23 children between 9 and 12 years of age with both attentional and mathematical difficulties participated in a working memory training program with a high amount of coaching, while another 25 children received no working memory training. Results of these groups were compared to those of 21 children who completed the training with a lower amount of coaching. The quality of working memory, as well as mathematical skills, was measured three times using untrained transfer tasks. Bayesian statistics were used to test informative hypotheses. After receiving working memory training, the highly coached group performed better than the group that received less coaching on visual working memory and mathematics, but not on verbal working memory. The highly coached group retained their advantage in mathematics, even though the effect on visual working memory decreased. However, no added effect of working memory training was found on the learning curve during mathematical training. Moreover, the less-coached group was outperformed by the group that did not receive working memory training, both in visual working memory and mathematics. These results suggest that motivation and proper coaching might be crucial for ensuring compliance with and effects of working memory training, and that far transfer might be possible. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Support for Struggling Students in Algebra: Contributions of Incorrect Worked Examples

    ERIC Educational Resources Information Center

    Barbieri, Christina; Booth, Julie L.

    2016-01-01

    Middle school algebra students (N = 125) randomly assigned within classroom to a Problem-solving control group, a Correct worked examples control group, or an Incorrect worked examples group, completed an experimental classroom study to assess the differential effects of incorrect examples versus the two control groups on students' algebra…

  8. Report of the Working Group on Faculty Attraction and Retention.

    ERIC Educational Resources Information Center

    Alberta Learning, Edmonton.

    In July 2001, the Alberta Ministry of Learning established a working group to make recommendations on improving Alberta's ability to attract and retain faculty. This report presents the findings of this group's evaluation of the ability of the province's postsecondary institutions to attract and retain college faculty. The working group identified…

  9. Collecting School Counseling Group Work Data: Initiating Consensual Qualitative Research through Practitioner-Researcher Partnerships

    ERIC Educational Resources Information Center

    Springer, Sarah I.; Land, Christy W.; Moss, Lauren J.; Cinotti, Daniel

    2018-01-01

    Group counseling interventions can be complex to assess and research. Over the years, The "Journal for Specialists in Group Work" ("JSGW") has highlighted many of these challenges and offered valued approaches to designing projects that promote the efficacy and meaningfulness of group work in various settings. Similarly, school…

  10. Group Work: How to Use Groups Effectively

    ERIC Educational Resources Information Center

    Burke, Alison

    2011-01-01

    Many students cringe and groan when told that they will need to work in a group. However, group work has been found to be good for students and good for teachers. Employers want college graduates to have developed teamwork skills. Additionally, students who participate in collaborative learning get better grades, are more satisfied with their…

  11. Group Work with Survivors of the 2004 Asian Tsunami: Reflections of an American-Trained Counselor

    ERIC Educational Resources Information Center

    Fernando, Delini M.

    2009-01-01

    This article describes a support group for Sri Lankan women survivors of the 2004 Asian Tsunami. The article discusses unique leader challenges in doing group work in a diverse and foreign setting, and presents leader reflections, recommendations, and implications for group workers who may work with disaster survivors.

  12. A 6-hour working day--effects on health and well-being.

    PubMed

    Akerstedt, T; Olsson, B; Ingre, M; Holmgren, M; Kecklund, G

    2001-12-01

The effects of total work hours, and the benefits of shortening them, are frequently debated, but very few data are available. The present study compared a group (N = 41) that obtained a 9 h reduction of the working week (to a 6 h day) with a comparison group (N = 22) that retained normal work hours. Both groups consisted mainly of female health care and day care nursery personnel. The experimental group retained full pay, and extra personnel were employed to compensate for the loss of hours. Questionnaire data were obtained before and 1 year after the change. The data were analyzed using a two-factor ANOVA with the year*group interaction term as the main focus. The results showed a significant year*group interaction for social factors, sleep quality, mental fatigue, heart/respiratory complaints, and attitude to work hours. In all cases the experimental group improved whereas the control group did not change. It was concluded that shortened work hours have clear social effects and moderate effects on well-being.

  13. Evaluation of a randomized controlled trial on the effect on return to work with coaching combined with light therapy and pulsed electromagnetic field therapy for workers with work-related chronic stress.

    PubMed

    Nieuwenhuijsen, Karen; Schoutens, Antonius M C; Frings-Dresen, Monique H W; Sluiter, Judith K

    2017-10-02

Chronic work-related stress is quite prevalent in the working population and is in some cases accompanied by long-term sick leave. These stress complaints have a high impact on employees and are costly due to lost productivity and medical expenses. A new treatment platform combining light therapy and pulsed electromagnetic fields (PEMF) with coaching was used to assess whether it induced more positive effects on return to work, stress, work-related fatigue, and quality of life than coaching alone. A placebo-controlled trial was conducted with 96 workers, aged 18-65, who had work-related chronic stress complaints and were on sick leave (either part-time or full-time). Participants were divided into three arms at random. Group 1 (n = 28) received the treatment and coaching (intervention group), group 2 (n = 28) received the treatment with the device turned off plus coaching (placebo group), and group 3 (n = 28) received coaching only (control group). The data were collected at baseline and after 6, 12 and 24 weeks. The primary outcome was percentage return to work, and secondary outcomes were work-related fatigue (emotional exhaustion and need for recovery after work), stress (distress and hair cortisol), and quality of life (SF-36 dimensions: vitality, emotional role limitation, and social functioning). Eighty-four workers completed all measurements, 28 in each group. All groups improved significantly over time in the level of return to work, as well as on all secondary outcomes. No statistical differences between the three groups were found on the primary outcome or on any of the secondary outcomes. Light therapy combined with PEMF therapy thus has no additional effect on return to work, stress, fatigue, or quality of life compared to coaching alone. NTR4794, registration date: 18-sep-2014.

  14. Workdays, in-between workdays and the weekend: a diary study on effort and recovery.

    PubMed

    van Hooff, Madelon L M; Geurts, Sabine A E; Kompier, Michiel A J; Taris, Toon W

    2007-07-01

Effort-recovery theory (Meijman and Mulder in Handbook of work and organizational psychology, Psychology Press/Erlbaum, Hove, pp 5-33, 1998) proposes that effort expenditure may have adverse consequences for health in the absence of sufficient recovery opportunities. Thus, insight into the relationships between effort and recovery is imperative for understanding work-related health. This study therefore focused on the relation between work-related effort and recovery (1) during workdays, (2) in-between workdays and (3) in the weekend. For these three time periods, we compared a group of employees reporting relatively low levels of work-related effort ("low-effort group") with a group of employees reporting relatively high levels of work-related effort ("high-effort group") with respect to (1) activity patterns, (2) the experience of these activity patterns, and (3) health and well-being indicators. Data were collected among university staff members. Participants (N(high-effort group) = 24 and N(low-effort group) = 27) completed a general questionnaire and took part in a 7-day daily diary study covering five weekdays and the following weekend. Differences between the two effort groups were examined by means of analysis of variance. Compared to the low-effort group, the high-effort group (1) engaged less often in active leisure activities during the week and worked more overtime in the weekend, (2) considered both work and home activities as more effortful, but not as less pleasurable, and (3) reported higher levels of sleep complaints (weekdays only) and fatigue, more preoccupation with work (weekdays only) and lower motivation to start the next workweek during the weekend. Work-related effort is associated with various aspects of work time and (potential) recovery time in-between workdays and in the weekend.
High levels of work-related effort are associated with activity patterns that are less beneficial in terms of recovery, with higher effort expenditure during and after work time, and with diminished health and well-being.

  15. A phenomenological research study: Perspectives of student learning through small group work between undergraduate nursing students and educators.

    PubMed

    Wong, Florence Mei Fung

    2018-06-18

Small group work is an effective teaching-learning approach in nursing education to enhance students' learning in theoretical knowledge and skill development. Despite its potentially advantageous effects on learning, little is known about its actual effects on students' learning from students' and educators' perspectives. To understand students' learning through small group work from the perspectives of students and educators, a qualitative study with focus group interviews was carried out. Semi-structured interviews with open-ended questions were performed with 13 undergraduate nursing students and 10 educators. Four main themes, "initiative learning", "empowerment of interactive group dynamics", "factors for creating effective learning environment", and "barriers influencing students' learning", were derived regarding students' learning in small group work based on the perspectives of the participants. The results showed the importance of the learning attitudes of students in individual and group learning. Factors for creating an effective learning environment, including preference for forming groups, effective group size, and adequacy of discussion, facilitate students' learning by enhancing learning engagement in small group work. The identified barriers, such as "excessive group work", "conflicts", and "passive team members", can reduce students' motivation and enjoyment of learning. Small group work is recognized as an effective teaching method for knowledge enhancement and skill development in nursing education. All identified themes are important for understanding the initiatives of students and group learning, the factors influencing an effective learning environment, and the barriers hindering students' learning. Nurse educators should pay more attention to the factors that influence an effective learning environment and to the barriers that reduce students' commitment and group dynamics.
Moreover, students may need further support to reduce the barriers that impede their learning motivation and enjoyment. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. An exploratory study of phonological awareness and working memory differences and literacy performance of people that use AAC.

    PubMed

    Gómez Taibo, María Luisa; Vieiro Iglesias, Pilar; González Raposo, María del Salvador; Sotillo Méndez, María

    2010-11-01

Twelve adolescents and young adults with cerebral palsy and complex communication needs who used augmentative and alternative communication (AAC) were studied. They were classified by working memory capacity (high vs. low) into two groups of 6 participants, and separately by high vs. low phonological skills into two groups of 6 participants. These groups were compared on their performance in reading tests (orthographic knowledge, word reading, and pseudoword reading) and in the spelling of words, pseudowords and picture names. Statistical differences were found between the high vs. low phonological skills groups, and between the high and low working memory groups. The high working memory capacity group scored significantly higher than the low working memory group on the orthographic and word reading tests. The high phonological skills group outperformed the low phonological skills group on the word reading test and in the spelling of pseudowords and picture names. From a descriptive point of view, phonological skills and working memory, factors known to be highly predictive of literacy skills in people without disabilities, also hold as factors for the participants who used AAC in our study. Implications of the results are discussed.

  17. 76 FR 73689 - National Advisory Committee on Occupational Safety and Health (NACOSH)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-29

    .... In conjunction with the committee meeting, NACOSH Work Groups will meet on December 14, 2011. DATES... Recordkeeping Work Groups will meet in conjunction with the NACOSH meeting. Those Work Groups will meet from 9 a...

  18. 76 FR 70751 - Trinity Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-15

    ...] Trinity Adaptive Management Working Group AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of meeting. SUMMARY: The Trinity Adaptive Management Working Group (TAMWG) affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California) restoration...

  19. 76 FR 52345 - Trinity Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-22

    ...] Trinity Adaptive Management Working Group AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of meeting. SUMMARY: The Trinity Adaptive Management Working Group (TAMWG) affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California) restoration...

  20. 75 FR 17158 - Trinity Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-05

    ...] Trinity Adaptive Management Working Group AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of meeting. SUMMARY: The Trinity Adaptive Management Working Group (TAMWG) affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California) restoration...

  1. 76 FR 14044 - Trinity Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ...] Trinity Adaptive Management Working Group AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of meeting. SUMMARY: The Trinity Adaptive Management Working Group (TAMWG) affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California) restoration...

  2. 75 FR 70947 - Trinity Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-19

    ...] Trinity Adaptive Management Working Group AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of meeting. SUMMARY: The Trinity Adaptive Management Working Group (TAMWG) affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California) restoration...

  3. 76 FR 23621 - Trinity Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-27

    ...] Trinity Adaptive Management Working Group AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of meeting. SUMMARY: The Trinity Adaptive Management Working Group (TAMWG) affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California) restoration...

  4. 75 FR 10501 - Trinity Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-08

    ...] Trinity Adaptive Management Working Group AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of meeting. SUMMARY: The Trinity Adaptive Management Working Group (TAMWG) affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California) restoration...

  5. 76 FR 34248 - Trinity Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-13

    ...] Trinity Adaptive Management Working Group AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of meeting. SUMMARY: The Trinity Adaptive Management Working Group (TAMWG) affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California) restoration...

  6. 75 FR 51284 - Trinity Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-19

    ...] Trinity Adaptive Management Working Group AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of meeting. SUMMARY: The Trinity Adaptive Management Working Group (TAMWG) affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California) restoration...

  7. Strain of implants depending on occlusion types in mandibular implant-supported fixed prostheses

    PubMed Central

    Sohn, Byoung-Sup; Heo, Seong-Joo; Koak, Jai-Young; Kim, Seong-Kyun

    2011-01-01

PURPOSE This study investigated the strain of implants using a chewing simulator with strain gauges in mandibular implant-supported fixed prostheses under various dynamic loads. MATERIALS AND METHODS Three implant-supported 5-unit fixed prostheses were fabricated with three different occlusion types (Group I: canine-protected occlusion, Group II: unilaterally balanced occlusion, Group III: bilaterally balanced occlusion). Two strain gauges were attached to each implant abutment. Programmed dynamic loads (0 - 300 N) were applied using a chewing simulator (MTS 858 Mini Bionix II systems, MTS Systems Corp., Minn, USA) and the strains were monitored. The statistical analyses were performed using the paired t-test and ANOVA. RESULTS The mean strain values (MSV) for the working side were 151.83 µε, 176.23 µε, and 131.07 µε for Groups I, II, and III, respectively. There was a significant difference between Group II and Group III (P < .05). The MSV for the non-working side were 58.29 µε, 72.64 µε, and 98.93 µε for Groups I, II, and III, respectively; one was significantly different from the others with a 95% confidence interval (P < .05). CONCLUSION The MSV for the working side of Groups I and II were significantly different from those for the non-working side (Group I: t = 7.58, Group II: t = 6.25). The MSV for the working side of Group II was significantly larger than that of Group III (P < .01). Lastly, the MSV for the non-working side of Group III was significantly larger than those of Groups I and II (P < .01). PMID:21503186
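The working- vs non-working-side comparisons above rest on the paired t-test, whose statistic is the mean of per-pair differences divided by its standard error. A minimal sketch of that calculation (the microstrain readings below are hypothetical illustration values, not the study's raw data):

```python
import math

def paired_t(x, y):
    """Paired t statistic: mean of per-pair differences divided by
    the standard error of those differences."""
    assert len(x) == len(y) and len(x) > 1
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((v - mean_d) ** 2 for v in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Hypothetical working-side vs non-working-side strain readings (in
# microstrain) for the same five abutments, illustrative only.
working = [150.0, 160.0, 145.0, 155.0, 158.0]
non_working = [60.0, 72.0, 58.0, 63.0, 70.0]
t = paired_t(working, non_working)
```

A t value far from zero, referred to the t distribution with n - 1 degrees of freedom, is what yields significance levels like the study's t = 7.58 and t = 6.25.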

  8. A regular yoga intervention for staff nurse sleep quality and work stress: a randomised controlled trial.

    PubMed

    Fang, Ronghua; Li, Xia

    2015-12-01

Although many studies have assessed the efficacy of yoga in older individuals, minimal research has focused on how nurses use yoga to improve sleep quality and to reduce work stress after work hours. We used the Pittsburgh Sleep Quality Index in Chinese and the Questionnaire on Medical Worker's Stress in Chinese to determine the impact of yoga on the sleep quality and work stress of staff nurses employed by a general hospital in China; disturbances in the circadian rhythm interrupt an individual's pattern of sleep. Using a convenience sampling method, one hundred and twenty nurses were randomised into two groups: a yoga group and a non-yoga group. The yoga group performed yoga more than two times every week for 50-60 minutes each time after work hours; the non-yoga group did not participate in yoga. After six months, self-reported sleep quality and work stress were compared between the two groups, and linear regression was then used to identify the independent factors related to sleep quality. Nurses in the yoga group had better sleep quality and lower work stress compared with nurses in the non-yoga group. The linear regression model indicated that nursing experience, age and yoga intervention were significantly related to sleep quality. Regular yoga can improve sleep quality and reduce work stress in staff nurses. This study provides evidence that hospital management should pay attention to nurse sleep quality and work stress, taking corresponding measures to reduce work pressure and improve health outcomes. © 2015 John Wiley & Sons Ltd.
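The regression step above can be sketched with closed-form ordinary least squares for a single predictor; with a binary yoga indicator, the fitted slope is simply the between-group mean difference. All numbers below are hypothetical, not the study's data, and the study itself fit a multi-predictor model (experience, age, yoga):

```python
def ols_simple(x, y):
    """Closed-form least-squares fit of y = b0 + b1 * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b1 = sxy / sxx        # slope
    b0 = my - b1 * mx     # intercept
    return b0, b1

# Hypothetical sleep-quality scores (lower = better sleep);
# x = 1 if the nurse practiced yoga, 0 otherwise.
x = [0, 0, 0, 1, 1, 1]
y = [9.0, 8.0, 10.0, 5.0, 6.0, 4.0]
b0, b1 = ols_simple(x, y)  # b1 = -4.0: yoga group averages 4 points lower
```

This single-predictor form only illustrates the mechanics; adding covariates requires solving the full normal equations rather than the closed-form slope above.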

  9. The Association between Long Working Hours and Self-Rated Health.

    PubMed

    Song, Jun-Taek; Lee, Goeun; Kwon, Jongho; Park, Jung-Woo; Choi, Hyunrim; Lim, Sinye

    2014-01-20

This study was conducted to determine the number of hours worked per week by full-time wage workers, using data from the Korean Labor and Income Panel Study (KLIPS), which is representative of urban households in Korea, and to determine the association between weekly working hours and self-rated health. We used data from the 11th KLIPS, conducted in 2008. The subjects were 3,699 full-time wage workers between the ages of 25 and 64 years. The association between weekly working hours and self-rated health was analyzed considering socio-demographic characteristics, work environment, and health-related behaviors. Among the workers, 29.7% worked less than 40 hours per week; 39.7%, more than 40 to 52 hours; 19.7%, more than 52 to 60 hours; and 10.9%, more than 60 hours per week. After controlling for socio-demographic, work environment, and health-related behavior variables, the odds ratio (OR) for poor self-rated health in the group working more than 40 and up to 52 hours was 1.06 (95% confidence interval (CI), 0.89-1.27), with the group working less than 40 hours per week as the reference. The OR for the group working more than 60 hours was 1.42 (95% CI, 1.10-1.83), and that for the group working more than 52 and up to 60 hours was 1.07 (95% CI, 0.86-1.33). After stratification by gender and tenure, the ORs for female workers and for workers with a tenure of more than 1 year were significantly higher than those of the other groups. This study showed that workers working more than 60 hours per week have a significantly higher risk of poor self-rated health than workers working less than 40 hours per week, an effect that was more pronounced for female workers and for workers with a tenure of more than 1 year.
In the future, longitudinal studies may be needed to determine the association between long working hours and various health effects in Korean workers.
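The ORs above come from an adjusted model, but the basic calculation behind an odds ratio and its Woolf-method 95% CI can be sketched from an unadjusted 2x2 table (the counts below are hypothetical, not KLIPS figures):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (log-scale) confidence interval for a 2x2
    table: a/b = cases/non-cases among the exposed (long hours),
    c/d = cases/non-cases among the reference group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: poor vs good self-rated health among workers
# working >60 h/week (a, b) and <40 h/week (c, d).
or_, lo, hi = odds_ratio_ci(60, 340, 300, 3000)
```

A confidence interval whose lower bound stays above 1.0 (as for the >60-hour group's 1.10-1.83) is what marks the association as statistically significant.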

  10. The effect of skill mix in non-nursing assistants on work engagements among home visiting nurses in Japan.

    PubMed

    Naruse, Takashi; Taguchi, Atsuko; Kuwahara, Yuki; Nagata, Satoko; Sakai, Mahiro; Watai, Izumi; Murashima, Sachiyo

    2015-05-01

    This study evaluated the effect of a skill-mix programme intervention on work engagement in home visiting nurses. A skill-mix programme in which home visiting nurses are assisted by non-nursing workers is assumed to foster home visiting nurses' work engagement. Pre- and post-intervention evaluations of work engagement were conducted using self-administered questionnaires. A skill-mix programme was introduced in the intervention group of home visiting nurses. After 6 months, their pre- and post-intervention work engagement ratings were compared with those of a control group. Baseline questionnaires were returned by 174 home visiting nurses (44 in the intervention group, 130 in the control group). Post-intervention questionnaires were returned by 38 and 97 home visiting nurses from each group. The intervention group's average work engagement scores were 2.2 at baseline and 2.3 at post-intervention; the control group's were 3.3 and 2.6. Generalised linear regression showed significant between-group differences in score changes. The skill-mix programme might foster home visiting nurses' work engagement by improving the quality of care for each client. Future research is needed to explain the exact mechanisms that underlie its effectiveness. In order to improve the efficiency of services provided by home visiting nurses and foster their work engagement, skill-mix programmes might be beneficial. © 2014 John Wiley & Sons Ltd.

  11. Collaborative essay testing: group work that counts.

    PubMed

    Gallagher, Peggy A

    2009-01-01

    Because much of a nurse's work is accomplished through working in groups, nursing students need an understanding of group process as well as opportunities to problem-solve in groups. Despite an emphasis on group activities as critical for classroom learning, there is a lack of evidence in the nursing literature that describes collaborative essay testing as a teaching strategy. In this class, nursing students worked together in small groups to answer examination questions before submitting a common set of answers. In a follow-up survey, students reported that collaborative testing was a positive experience (e.g., promoting critical thinking, confidence in knowledge, and teamwork). Faculty were excited by the lively dialog heard during the testing in what appeared to be an atmosphere of teamwork. Future efforts could include providing nursing students with direct instruction on group process and more opportunities to work and test collaboratively.

  12. Group Work Education in Social Work: A Review of the Literature Reveals Possible Solutions

    ERIC Educational Resources Information Center

    LaRocque, Sarah E.

    2017-01-01

    This article examines the growing concerns in the literature that traditional group work education in social work is not providing the foundational knowledge, skills, evidence-based practice, professional uses of self, and adherence to practice standards necessary for effective group practice. An exploration of the best available evidence on group…

  13. The Use of Technology in Group-Work: A Situational Analysis of Students' Reflective Writing

    ERIC Educational Resources Information Center

    McKinney, Pamela; Sen, Barbara

    2016-01-01

    Group work is a powerful constructivist pedagogy for facilitating students' personal and professional development, but it can be difficult for students to work together in an academic context. The assessed reflective writings of undergraduate students studying Information Management are used as data in this exploration of the group work situation…

  14. The Multispectral Imaging Science Working Group. Volume 2: Working group reports

    NASA Technical Reports Server (NTRS)

    Cox, S. C. (Editor)

    1982-01-01

    Summaries of the various multispectral imaging science working groups are presented. Current knowledge of the spectral and spatial characteristics of the Earth's surface is outlined and the present and future capabilities of multispectral imaging systems are discussed.

  15. 76 FR 21936 - Aviation Rulemaking Advisory Committee-New Task

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-19

    ... ARAC activity and solicits membership for the new Rulemaking Prioritization Working Group. FOR FURTHER... Rulemaking Prioritization Working Group will specifically address, in part, Recommendation 22: ``The... available * * *.'' The objective of the Rulemaking Prioritization Working Group is to provide advice and...

  16. 77 FR 69916 - Aviation Rulemaking Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-21

    ... Expectations 3. Recommendation Reports a. Rulemaking Prioritization Working Group (RPWG) Recommendation Report (ARAC) b. Avionics Systems Harmonization Working Group--Low Speed Alerting, Phase 2 Recommendation Report (TAE) 4. Status Reports From Active Working Groups a. Airman Testing Standards and Training...

  17. Students' Use of the Interactive Whiteboard during Physics Group Work

    ERIC Educational Resources Information Center

    Mellingsaeter, Magnus Strøm; Bungum, Berit

    2015-01-01

    This paper presents a case study of how the interactive whiteboard (IWB) may facilitate collective meaning-making processes in group work in engineering education. In the case, first-year students attended group-work sessions as an organised part of a basic physics course at a Norwegian university college. Each student group was equipped with an…

  18. Power and Group Work in Physical Education: A Foucauldian Perspective

    ERIC Educational Resources Information Center

    Barker, Dean; Quennerstedt, Mikael

    2017-01-01

    Group work is used in physical education (PE) to encourage student-directed, collaborative learning. Aligned with this aim, group work is expected to shift some power from teacher to students and enable students to make decisions and co-construct meaning on their own. There are, however, very few investigations focusing on power in group work…

  19. Medical Team Training: Using Simulation as a Teaching Strategy for Group Work

    ERIC Educational Resources Information Center

    Moyer, Michael R.; Brown, Rhonda Douglas

    2011-01-01

    Described is an innovative approach currently being used to inspire group work, specifically a medical team training model, referred to as The Simulation Model, which includes as its major components: (1) Prior Training in Group Work of Medical Team Members; (2) Simulation in Teams or Groups; (3) Multidisciplinary Teamwork; (4) Team Leader…

  20. What Is Group Process?: Integrating Process Work into Psychoeducational Groups

    ERIC Educational Resources Information Center

    Mills, Bethany; McBride, Dawn Lorraine

    2016-01-01

    Process work has long been a tenet of successful counseling outcomes. However, there is little literature available that focuses on how to best integrate process work into group settings--particularly psychoeducational groups that are content heavy and most often utilized in a school setting. In this article, the authors provide an overview of the…

  1. Proceedings of the U.S. Geological Survey Interdisciplinary Microbiology Workshop, Estes Park, Colorado, October 15-17, 2008

    USGS Publications Warehouse

    Briggs, Kay Marano

    2010-01-01

Preface: A U.S. Geological Survey Interdisciplinary Microbiology Workshop was held in Estes Park, Colorado, on October 15-17, 2008. Participants came from all USGS regions and disciplines. This report contains abstracts from 36 presentations and 35 poster sessions and notes from 5 breakout sessions. The seven presentation topics were: ecology of wildlife and fish disease; mechanisms of fish and wildlife disease; microbial ecology; geographic patterns/visualization; public health and water quality; geomicrobiology; and ecosystem function. The six poster session topics were: wildlife disease; disease detection methods; water quality; microbial ecology; metabolic processes; and tools and techniques. Five working groups met in breakout sessions on October 16, 2008. The highlights for each working group are summarized in this report, and their goals were as follows. Working Group I: to plan a Fact Sheet on interdisciplinary microbiology in the USGS. Working Group II: to plan a USGS interdisciplinary microbiology Web site. Working Group III: to suggest ways to broadcast and publicize the types of microbiology conducted at the USGS. Working Group IV: to identify emerging issues in USGS interdisciplinary microbiology research. Working Group V: to identify potential opportunities for interdisciplinary microbiology work at the USGS. After the workshop, the USGS interdisciplinary microbiology Web site was activated in June 2009 at http://microbiology.usgs.gov/.

  2. Working Memory Profiles in HIV-Exposed, Uninfected and HIV-Infected Children: A Comparison with Neurotypical Controls

    PubMed Central

    Milligan, Robyn; Cockcroft, Kate

    2017-01-01

This study compared the working memory profiles of three groups of children, namely HIV-infected (HIV-I; n = 95), HIV-exposed, uninfected (HIV-EU; n = 86) and an HIV-unexposed, uninfected (HIV-UU; n = 92) neurotypical control group. Working memory, an executive function, plays an important role in frontal lobe-controlled behaviors, such as motivation, planning, decision making, and social interaction, and is a strong predictor of academic success in school children. Memory impairments have been identified in HIV-I children, particularly in visuospatial processing. Verbal working memory has not been commonly investigated in this population, while it is unknown how the working memory profiles of HIV-EU children compare to their HIV-I and HIV-UU peers. Of interest was whether the working memory profiles of the HIV-EU children would be more similar to the HIV-I group or to the uninfected control group. The results revealed no significant differences in working memory performance between the HIV-I and HIV-EU groups. However, this does not mean that the etiology of the working memory deficits is the same in the two groups, as these groups showed important differences when compared to the control group. In comparison to the controls, the HIV-I group experienced difficulties with processing tasks irrespective of whether they drew on a verbal or visuospatial modality. This appears to stem from a generalized executive function deficit that also interferes with working memory. In the HIV-EU group, difficulties occurred with verbally based tasks, irrespective of whether they required storage or processing. For this group, the dual demands of complex processing and using a second language seem to result in demand exceeding capacity on verbal tasks. Both groups experienced the greatest difficulties with verbal processing tasks for these different reasons.
Thus, disruption of different cognitive abilities could result in similar working memory profiles, as evidenced in this study. This has implications for the underlying developmental neurobiology of HIV-I and HIV-EU children, as well as the choice of appropriate measures to assist affected children. PMID:28729828

  3. The relationship between different exercise modes and visuospatial working memory in older adults: a cross-sectional study.

    PubMed

    Guo, Wei; Wang, Biye; Lu, Yue; Zhu, Qin; Shi, Zhihao; Ren, Jie

    2016-01-01

The purpose of the study was to investigate the relationship between different exercise modes and visuospatial working memory in healthy older adults. A cross-sectional design was adopted. A total of 111 healthy older adults were enrolled in the study. They were classified by the exercise-related questionnaire to be in an open-skill group, closed-skill group or sedentary group. In experiment 1, the participants performed a visuospatial working memory task. The results indicated that both the closed-skill (p < 0.05) and open-skill (p < 0.01) groups reached a higher accuracy than the sedentary group. Experiment 2 examined whether the exercise-induced benefit of working memory was manifested in passive maintenance or active manipulation of working memory, which were assessed by a visuospatial short-term memory task and a visuospatial mental rotation task, respectively. The results showed that the open-skill (p < 0.01) group was more accurate than the sedentary group in the visuospatial short-term memory task, whereas the group difference in the visuospatial mental rotation task was not significant. These findings combined to suggest that physical exercise was associated with better visuospatial working memory in older adults. Furthermore, open-skill exercises that demand higher cognitive processing showed selective benefit for passive maintenance of working memory.

  4. The mountains hold things in: the use of community research review work groups to address cancer disparities in Appalachia.

    PubMed

    Hutson, Sadie P; Dorgan, Kelly A; Phillips, Amber N; Behringer, Bruce

    2007-11-01

    To review regional findings about cancer disparities with grass roots community leaders in Appalachia and to identify perspectives about what makes the cancer experience unique in Appalachia. A community-based participatory approach that includes focus-group methodology. Work groups gathered in well-known community locations in northeastern Tennessee and southwestern Virginia. 22 lay adult community members (12 in Tennessee and 10 in Virginia), all of whom had a personal and community interest in cancer and were reputed as informal community leaders. Work groups engaged in a series of five sequential sessions designed to (a) review regional data about cancer disparities and identify perspectives about what makes the cancer experience unique in Appalachia, (b) promote dialogue between the work group members and healthcare providers to identify methods for improved collaboration, and (c) integrate the work group with regional efforts of the states' comprehensive cancer control plans. Four major themes emerged from the focus group sessions with each work group: cancer storytelling, cancer collectivism, healthcare challenges, and cancer expectations. The community research review work groups proved to be a successful method to disseminate information about regional cancer disparities. Study findings provide a unique foundation so that healthcare providers and researchers can begin to address cancer disparities in the Appalachian region. Nurses are in key positions to partner with trusted community leaders to address disparities across the cancer continuum in Appalachia.

  5. Albeni Falls Wildlife Mitigation : Annual Report 2002.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terra-Berns, Mary

The Albeni Falls Interagency Work Group continued to actively engage in implementing wildlife mitigation actions in 2002. Regular Work Group meetings were held to discuss budget concerns affecting the Albeni Falls Wildlife Mitigation Program, to present potential acquisition projects, and to discuss and evaluate other issues affecting the Work Group and Project. Work Group members protected 1,386.29 acres of wildlife habitat in 2002. To date, the Albeni Falls project has protected approximately 5,914.31 acres of wildlife habitat. About 21% of the total wildlife habitat lost has been mitigated. Administrative activities have increased as more properties are purchased and continue to center on restoration, operation and maintenance, and monitoring. In 2001, Work Group members focused on development of a monitoring and evaluation program as well as completion of site-specific management plans. This year the Work Group began implementation of the monitoring and evaluation program, performing population and plant surveys, data evaluation and storage, and map development, as well as developing management plans. Assuming that the current BPA budget restrictions will be lifted in the near future, the Work Group expects to increase mitigation properties this coming year with several potential projects.

  6. [Quality of life at work and quality of work].

    PubMed

    Bonnefond, Jean-Yves; Clot, Yves

    2011-10-01

    Unease at work is the consequence of a growing difficulty in carrying out high quality work based on performance criteria. Healthcare professionals are well placed to highlight these criteria which can be discussed within work groups. The aim of these groups is to work towards compromises combining efficiency and health.

  7. The teacher's role in promoting collaborative dialogue in the classroom.

    PubMed

    Webb, Noreen M

    2009-03-01

    Research on student-led small-group learning in schools going back nearly four decades has documented many types of student participation that promote learning. Less is known about how the teacher can foster effective groupwork behaviours. This paper reviews research that explores the role of the teacher in promoting learning in small groups. The focus is on how students can learn from their peers during small-group work, how teachers can prepare students for collaborative group work, and the role of teacher discourse and classroom norms in small-group dialogue. Studies selected for review focused on student-led small-group contexts for learning in which students were expected to collaborate, reported data from systematic observations of group work, and linked observational data to teacher practices and student learning outcomes. This review uncovered multiple dimensions of the teacher's role in fostering beneficial group dialogue, including preparing students for collaborative work, forming groups, structuring the group-work task, and influencing student interaction through teachers' discourse with small groups and with the class. Common threads through the research are the importance of students explaining their thinking, and teacher strategies and practices that may promote student elaboration of ideas.

  8. A Dozen Reasons Some Meetings Bomb--And Others Work Wonders.

    ERIC Educational Resources Information Center

    Bradford, Leland P.

    1978-01-01

Effective meetings require three simultaneous operations: task activity (working on the agenda), maintenance (keeping the group in good working order), and team building (strengthening the group's capacity to face future issues successfully). Discusses twelve reasons for group dysfunction during meetings. (EM)

  9. An Independent Evaluation of the Switching Operations Fatality Analysis 2010 Working Group's Processes

    DOT National Transportation Integrated Search

    2009-12-01

    The Switching Operations Fatality Analysis (SOFA) Working Group was formed to analyze the factors contributing to fatalities in switching operations. The 2010 Working Group invited an independent team of evaluators to assess the thoroughness of the S...

  10. Group Work

    ERIC Educational Resources Information Center

    Wilson, Kristy J.; Brickman, Peggy; Brame, Cynthia J.

    2018-01-01

    Science, technology, engineering, and mathematics faculty are increasingly incorporating both formal and informal group work in their courses. Implementing group work can be improved by an understanding of the extensive body of educational research studies on this topic. This essay describes an online, evidence-based teaching guide published by…

  11. 77 FR 50155 - Trinity Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-20

    ...-FF08EACT00] Trinity Adaptive Management Working Group AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of meeting. SUMMARY: The Trinity Adaptive Management Working Group (TAMWG) affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California...

  12. 77 FR 45370 - Trinity Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    ...-FF08EACT00] Trinity Adaptive Management Working Group AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of meeting. SUMMARY: The Trinity Adaptive Management Working Group (TAMWG) affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California...

  13. 77 FR 30314 - Trinity Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-22

    ...-FF08EACT00] Trinity Adaptive Management Working Group AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of meeting. SUMMARY: The Trinity Adaptive Management Working Group (TAMWG) affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California...

  14. 77 FR 10766 - Trinity Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-23

    ...-FF08EACT00] Trinity Adaptive Management Working Group AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of meeting. SUMMARY: The Trinity Adaptive Management Working Group (TAMWG) affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California...

  15. 77 FR 74203 - Trinity Adaptive Management Working Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-13

    ...-FF08EACT00] Trinity Adaptive Management Working Group AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of meeting. SUMMARY: The Trinity Adaptive Management Working Group (TAMWG) affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California...

  16. A health system program to reduce work disability related to musculoskeletal disorders.

    PubMed

    Abásolo, Lydia; Blanco, Margarita; Bachiller, Javier; Candelas, Gloria; Collado, Paz; Lajas, Cristina; Revenga, Marcelino; Ricci, Patricia; Lázaro, Pablo; Aguilar, Maria Dolores; Vargas, Emilio; Fernández-Gutiérrez, Benjamín; Hernández-García, César; Carmona, Loreto; Jover, Juan A

    2005-09-20

Musculoskeletal disorders (MSDs) are a frequent cause of work disability, accounting for productivity losses in industrialized societies equivalent to 1.3% of the U.S. gross national product. To evaluate whether a population-based clinical program offered to patients with recent-onset work disability caused by MSDs is cost-effective. Randomized, controlled intervention study. The inclusion and follow-up periods each lasted 12 months. Three health districts in Madrid, Spain. All patients with MSD-related temporary work disability in 1998 and 1999. The control group received standard primary care management, with referral to specialized care if needed. The intervention group received a specific program, administered by rheumatologists, in which care was delivered during regular visits and included 3 main elements: education, protocol-based clinical management, and administrative duties. Efficacy variables were 1) days of temporary work disability and 2) number of patients with permanent work disability. All analyses were done on an intention-to-treat basis. 13,077 patients were included in the study, 7,805 in the control group and 5,272 in the intervention group, generating 16,297 episodes of MSD-related temporary work disability. These episodes were shorter in the intervention group than in the control group (mean, 26 days compared with 41 days; P < 0.001), and the groups had similar numbers of episodes per patient. Fewer patients received long-term disability compensation in the intervention group (n = 38 [0.7%]) than in the control group (n = 99 [1.3%]) (P < 0.005). Direct and indirect costs were lower in the intervention group than in the control group. To save 1 day of temporary work disability, $6.00 had to be invested in the program. Each dollar invested generated a benefit of $11.00. The program's net benefit was in excess of $5 million. The study was unblinded.
Implementation of the program, offered to the general population, improves short- and long-term work disability outcomes and is cost-effective.

  17. Effects on employees of controlling working hours and working schedules.

    PubMed

    Kubo, T; Takahashi, M; Togo, F; Liu, X; Shimazu, A; Tanaka, K; Takaya, M

    2013-03-01

High levels of control over working time and low variability in working hours have been associated with improved health-related outcomes. The potential mechanisms for this association remain unclear. To examine how work-time control and variability of working times are associated with fatigue recovery, sleep quality, work-life balance, and 'near misses' at work. Manufacturing sector employees completed a questionnaire that assessed work-time control, work-time variability, fatigue recovery, sleep quality, work-life balance and the frequency of near misses in the past 6 months. Mixed model analysis of covariance and multiple logistic regression analysis tested the main effects of work-time control and variability and their interaction, while adjusting for age, sex, work schedules, and overtime work in the past month. Subscales of work-time control were also investigated (control over daily working hours and over days off). One thousand three hundred and seventy-two completed questionnaires were returned, a response rate of 69%. A significantly higher quality of sleep and better work-life balance were found in the 'high control with low variability' reference group than in the other groups. Significantly better recovery of fatigue was also observed in the group having control over days off with low variability. While near misses were more frequent in the group with high control over daily working hours coupled with high variability compared with the reference group, this difference was not significant. High work-time control and low variability were associated with favourable outcomes of health and work-life balance. This combined effect was not observed for the safety outcome addressed here.

  18. Group Selection Methods and Contribution to the West Point Leadership Development System (WPLDS)

    DTIC Science & Technology

    2015-08-01

Group work in an academic setting can consist of projects or problems students can work on collaboratively. Although pedagogical studies...helping students develop intangibles like communication, time management, organization, leadership, interpersonal, and relationship skills.

  19. Group Work with Adolescents: Principles and Practice. Second Edition. Social Work Practice with Children and Families

    ERIC Educational Resources Information Center

    Malekoff, Andrew

    2006-01-01

    This popular text provides essential knowledge and skills for conducting creative, strengths-based group work with adolescents. A rich introduction to the field, enlivened by numerous illustrations from actual sessions, the book provides principles and guidelines for practice in a wide range of settings. The book covers all phases of group work,…

  20. Rehabilitation using high-intensity physical training and long-term return-to-work in cancer survivors.

    PubMed

    Thijs, Karin M; de Boer, Angela G E M; Vreugdenhil, Gerard; van de Wouw, Agnès J; Houterman, Saskia; Schep, Goof

    2012-06-01

Due to large and increasing numbers of cancer survivors, long-term cancer-related health issues have become a major focus of attention. This study examined the relation between a high-intensity physical rehabilitation program and return-to-work in cancer survivors who had received chemotherapy. The intervention group, consisting of 72 cancer survivors from one hospital (8 men and 64 women, mean age 49 years), followed an 18-week rehabilitation program including strength and interval training, and home-based activities. An age-matched control group, consisting of 38 cancer survivors (9 men and 29 women), was recruited from two other hospitals. They received only standard medical care. All subjects were evaluated during a telephone interview on employment issues, conducted at ±3 years after diagnosis. The main outcomes were change in working hours per week and time until return-to-work. Patients in the intervention group showed significantly less reduction in working hours per week [-5.0 h/week vs. -10.8 h/week (P = .03)]. Multivariate analyses showed that the training intervention, the age of patients, and the number of working hours pre-diagnosis could explain the improvement in long-term participation at work. Time until (partial) return-to-work was 11.5 weeks for the intervention group versus 13.2 weeks for the control group (P = .40). On long-term follow-up, 78% of the participants from the intervention group versus 66% from the control group had returned to work on the pre-diagnosis level of working hours (P = .18). Rehabilitation using high-intensity physical training is useful for working patients to minimize the decreased ability to work resulting from cancer and its treatment.

  1. Are we on the same page? The performance effects of congruence between supervisor and group trust.

    PubMed

    Carter, Min Z; Mossholder, Kevin W

    2015-09-01

Taking a multiple-stakeholder perspective, we examined the effects of supervisor-work group trust congruence on groups' task and contextual performance using a polynomial regression and response surface analytical framework. We expected motivation experienced by work groups to mediate the positive influence of trust congruence on performance. Although hypothesized congruence effects on performance were more strongly supported for affective rather than for cognitive trust, we found significant indirect effects on performance (via work group motivation) for both types of trust. We discuss the performance effects of trust congruence and incongruence between supervisors and work groups, as well as implications for practice and future research.

  2. Cognitive Limitations at Work Among Employed Breast Cancer Survivors in China.

    PubMed

    Zeng, Yingchun; Cheng, Andy S K; Feuerstein, Michael

This study aimed to determine whether levels of distress (anxiety and depression) and cognitive symptoms at work are related to work productivity and quality of life (QOL) in Chinese breast cancer survivors (BCS), compared to a group of Chinese women without cancer but with work-related musculoskeletal pain. The study used a cross-sectional design. Working BCS were recruited in a tumor hospital's outpatient department, and women with no history of cancer (noncancer comparison [NCC] group) were recruited from a rehabilitation center. A total of 412 participants were included. Multiple regression analyses indicated that higher anxiety was associated with work limitations (B = .005, p = .014) and QOL (B = 2.417, p = .004) in the BCS group only. Cognitive limitations at work were associated with work limitations (B = .002, p = .001) and QOL (B = 1.022, p = .003) in the BCS group only. Depressive symptoms (B = .028, p = .017) were significantly associated with work limitations in the NCC group. Breast cancer survivors reported higher levels of cognitive limitations at work and anxiety, and lower levels of work productivity and QOL. When remaining at work is a viable option for the cancer survivor with cognitive limitations at work, the rehabilitation nurse should consider approaches to best accommodate the specific cognitive limitations and work tasks, as well as help the patient manage associated anxiety when present.

  3. Albeni Falls Wildlife Mitigation Project, 2008 Annual Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soults, Scott

The Albeni Falls Interagency Work Group (AFIWG) was actively involved in implementing wildlife mitigation activities in late 2007, but due to internal conflicts, the AFIWG has fractionated into a smaller group. Implementation of the monitoring and evaluation program continued across protected lands. As of 2008, the Albeni Falls Interagency Work Group (Work Group) is a coalition of wildlife managers from three tribal entities (Kalispel Tribe, Kootenai Tribe, Coeur d'Alene Tribe) and the US Army Corps of Engineers. The Work Group directs where wildlife mitigation implementation occurs in the Kootenai, Pend Oreille and Coeur d'Alene subbasins. The Work Group is unique in the Columbia Basin. The Columbia Basin Fish and Wildlife Authority (CBFWA) wildlife managers in 1995 approved what was one of the first two project proposals to implement mitigation on a programmatic basis. The maintenance of this kind of approach through time has allowed the Work Group to implement an effective and responsive habitat protection program by reducing administrative costs associated with site-specific project proposals. The core mitigation entities maintained approximately 9,335 acres of wetland/riparian habitats in 2008.

  4. Dyadic Collaboration among Preschool-Age Children and the Benefits of Working with a More Socially Advanced Peer

    ERIC Educational Resources Information Center

    Park, Jeongeon; Lee, Jeonghwa

    2015-01-01

    Research Findings: This study examined the learning effects of collaborative group work under heterogeneous group composition among 5-year-old children, especially in terms of their social skills. To this end, the study utilized an experimental research design wherein 3 groups of differently composed dyads and a group of students who worked alone…

  5. A Standards-Based Inventory for Assessing Perceived Importance of and Confidence in Using ASGW's Core Group Work Skills

    ERIC Educational Resources Information Center

    Wilson, F. Robert; Newmeyer, Mark D.

    2008-01-01

    Since the early 1980s, ASGW (Association for Specialists in Group Work) has promulgated standards for training group workers. Now, in their third revision, these standards establish core group work knowledge and skills to be included in all counselor training programs. To advance research on the relationship between mastery of ASGW's core…

  6. Keep It Positive: Using Student Goals and Appraisals to Inform Small Group Work in Science

    ERIC Educational Resources Information Center

    Woods-McConney, Amanda; Wosnitza, Marold; Donetta, Kevin

    2011-01-01

    In teaching science, small group work is often recommended and frequently used. In this study, we asked 130 students about their personal goals and views (appraisals) of small group work in science. We found significant relationships between students' personal goals and their views of doing science in small groups. We discuss the practical…

  7. A Comparative Study of Effectiveness of Peer Assessment of Individuals' Contributions to Group Projects in Undergraduate Construction Management Core Units

    ERIC Educational Resources Information Center

    Jin, Xiao-Hua

    2012-01-01

    In recent years, various forms of group work have been introduced in university courses across various subject domains, including construction management courses. Although the use of group work in higher education has sound pedagogical reasons and advantages, group work has its own drawbacks. Therefore, the acceptance by students and the success…

  8. Group Work in the MBA Classroom: Improving Pedagogical Practice and Maximizing Positive Outcomes with Part-Time MBA Students

    ERIC Educational Resources Information Center

    Rafferty, Patricia D.

    2013-01-01

    This article forms part of an exploration into how graduate students experience group work. A single case, embedded study was completed in 2011, which reveals insight and understanding into the manner in which part-time MBA students experience group work assignments and how these experiences contribute to their perception of positive group work…

  9. Working Conditions and Mental Health of Nursing Staff in Nursing Homes

    PubMed Central

    Zhang, Yuan; Punnett, Laura; Mawn, Barbara; Gore, Rebecca

    2018-01-01

    Nursing staff in nursing homes suffer from poor mental health, probably associated with stressful working conditions. Working conditions may distribute differently among nursing assistants, licensed practical nurses, and registered nurses due to their different levels in the organizational hierarchy. The objectives of this study were to evaluate the association between working conditions and mental health among different nursing groups, and examine the potential moderating effect of job group on this association. Self-administered questionnaires were collected with 1,129 nursing staff in 15 for-profit non-unionized nursing homes. Working conditions included both physical and psychosocial domains. Multivariate linear regression modeling found that mental health was associated with different working conditions in different nursing groups: physical safety (β = 2.37, p < 0.05) and work-family conflict (β = –2.44, p < 0.01) in NAs; work-family conflict (β = –4.17, p < 0.01) in LPNs; and physical demands (β = 10.54, p < 0.05) in RNs. Job group did not moderate the association between working conditions and mental health. Future workplace interventions to improve mental health should reach to nursing staff at different levels and consider tailored working condition interventions in different nursing groups. PMID:27104634

  11. 78 FR 48029 - Improving Chemical Facility Safety and Security

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-07

    ... responding to risks in chemical facilities (including during pre-inspection, inspection execution, post.... Sec. 2. Establishment of the Chemical Facility Safety and Security Working Group. (a) There is established a Chemical Facility Safety and Security Working Group (Working Group) co-chaired by the Secretary...

  12. Challenges Facing Group Work Online

    ERIC Educational Resources Information Center

    Chang, Bo; Kang, Haijun

    2016-01-01

    Online group work can be complicated because of its asynchronous characteristics and lack of physical presence, and its requirements for skills in handling technology, human relationships, and content-related tasks. This study focuses on the administrative, logistical and relationship-related challenges in online group work. Challenges in areas…

  13. 75 FR 21602 - Online Safety and Technology Working Group Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-26

    ... OSTWG is tasked with evaluating industry efforts to promote a safe online environment for children. The... and Technology Working Group Meeting AGENCY: National Telecommunications and Information... public meeting of the Online Safety and Technology Working Group (OSTWG). DATES: The meeting will be held...

  14. 78 FR 23329 - Aircraft Access to SWIM Working Group Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-18

    ... DEPARTMENT OF TRANSPORTATION Federal Aviation Administration Aircraft Access to SWIM Working Group... in FAA NextGen technologies to attend and participate in an Aircraft Access to SWIM Working Group... information environment. The AAtS initiative will utilize commercial air/ground network providers...

  15. Use of Beach Shoes for Foot Protection during the Bangkok Flood of 2011.

    PubMed

    Waikakul, Saranatra

    2013-03-01

Foot injury was common during the Bangkok flood of 2011. In the future, this type of injury should be prevented to lessen the burden during a disaster. This study was performed to ascertain what type of footwear is appropriate for volunteer rescue workers during a flood. It was carried out during the flood in November 2011 at Siriraj Hospital. Fifteen volunteers, aged 20-28 years, were enrolled; none had any foot deformity or injury before the study. Participants were divided into 3 groups of 5 volunteers: group A, the barefoot group; group B, the high-top shoe group; and group C, the beach shoe group. All volunteers worked in areas close to Siriraj Hospital and were followed up after 5 days of rescue work. Prevalence of foot and ankle injuries, satisfaction regarding work conditions, and willingness to use the shoes were subjectively evaluated. In group A, most volunteers were barely satisfied conducting rescue work in water or on wet surfaces with bare feet, and were only 'just satisfied' or not satisfied working on dry surfaces with bare feet. In group B, most volunteers held opinions similar to group A, except that they felt better while working on dry surfaces. In group C, most volunteers were significantly more satisfied under all three conditions. Foot injury occurred in 2 volunteers from group A. Beach shoes offered adequate foot protection, and wearing them during rescue was satisfactory during the early phase of the flood.

  16. Work-related self-efficacy as a moderator of the impact of a worksite stress management training intervention: Intrinsic work motivation as a higher order condition of effect.

    PubMed

    Lloyd, Joda; Bond, Frank W; Flaxman, Paul E

    2017-01-01

    Employees with low levels of work-related self-efficacy may stand to benefit more from a worksite stress management training (SMT) intervention. However, this low work-related self-efficacy/enhanced SMT benefits effect may be conditional on employees also having high levels of intrinsic work motivation. In the present study, we examined this proposition by testing three-way, or higher order, interaction effects. One hundred and fifty-three U.K. government employees were randomly assigned to a SMT intervention group (n = 68), or to a waiting list control group (n = 85). The SMT group received three half-day training sessions spread over two and a half months. Findings indicated that there were significant overall reductions in psychological strain, emotional exhaustion and depersonalization in the SMT group, in comparison to the control group. Furthermore, there were significant higher order Group (SMT vs. control) × Time 1 Work-Related Self-Efficacy × Time 1 Intrinsic Work Motivation interactions, such that reductions in emotional exhaustion and depersonalization at certain time points were experienced only by those who had low baseline levels of work-related self-efficacy and high baseline levels of intrinsic work motivation. Implications for work-related self-efficacy theory and research and SMT research and practice are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
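
    A higher-order (three-way) interaction of the kind tested above is built by entering the product of all three predictors alongside every lower-order term. A minimal numpy sketch on simulated data (illustrative only; the simulated effect sizes are invented and are not the study's estimates):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 160

    # Hypothetical data: binary condition plus two baseline moderators.
    smt = rng.integers(0, 2, size=n).astype(float)   # 1 = SMT group, 0 = control
    efficacy = rng.normal(size=n)                    # T1 work-related self-efficacy
    motivation = rng.normal(size=n)                  # T1 intrinsic work motivation

    # Simulate a change score so the SMT benefit (a more negative change in
    # exhaustion) appears only for low efficacy combined with high motivation,
    # mirroring the higher-order pattern described in the abstract.
    change = (-1.0 * smt
              + 1.5 * smt * efficacy * motivation
              + rng.normal(scale=1.0, size=n))

    # Model with all lower-order terms plus the three-way product.
    X = np.column_stack([
        np.ones(n), smt, efficacy, motivation,
        smt * efficacy, smt * motivation, efficacy * motivation,
        smt * efficacy * motivation,     # three-way interaction term
    ])
    beta, *_ = np.linalg.lstsq(X, change, rcond=None)
    print(beta[7])   # recovers the simulated three-way coefficient (~1.5)
    ```

    In practice the significant three-way term is then probed with simple-slope analyses at high/low values of each moderator, which is how "reductions ... only for low self-efficacy and high motivation" conclusions are reached.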

  17. New Developments in Group Counseling.

    ERIC Educational Resources Information Center

    Gladding, Samuel T., Ed.

    Group counseling is a rapidly changing field. This collection of 31 digests examines various aspects of group process and group counseling. The digests are arranged under different subject headings. In section 1, the nature of group work is examined, along with the evolution of group work training since 1990. The second section looks at…

  18. Post-Disaster Social Justice Group Work and Group Supervision

    ERIC Educational Resources Information Center

    Bemak, Fred; Chung, Rita Chi-Ying

    2011-01-01

    This article discusses post-disaster group counseling and group supervision using a social justice orientation for working with post-disaster survivors from underserved populations. The Disaster Cross-Cultural Counseling model is a culturally responsive group counseling model that infuses social justice into post-disaster group counseling and…

  19. Group relationships in early and late sessions and improvement in interpersonal problems.

    PubMed

    Lo Coco, Gianluca; Gullo, Salvatore; Di Fratello, Carla; Giordano, Cecilia; Kivlighan, Dennis M

    2016-07-01

    Groups are more effective when positive bonds are established and interpersonal conflicts resolved in early sessions and work is accomplished in later sessions. Previous research has provided mixed support for this group development model. We performed a test of this theoretical perspective using group members' (actors) and aggregated group members' (partners) perceptions of positive bonding, positive working, and negative group relationships measured early and late in interpersonal growth groups. Participants were 325 Italian graduate students randomly (within semester) assigned to 1 of 16 interpersonal growth groups. Groups met for 9 weeks with experienced psychologists using Yalom and Leszcz's (2005) interpersonal process model. Outcome was assessed pre- and posttreatment using the Inventory of Interpersonal Problems, and group relationships were measured at Sessions 3 and 6 using the Group Questionnaire. As hypothesized, early measures of positive bonding and late measures of positive working, for both actors and partners, were positively related to improved interpersonal problems. Also as hypothesized, late measures of positive bonding and early measures of positive working, for both actors and partners, were negatively related to improved interpersonal problems. We also found that early actor and partner positive bonding and negative relationships interacted to predict changes in interpersonal problems. The findings are consistent with group development theory and suggest that group therapists focus on group-as-a-whole positive bonding relationships in early group sessions and on group-as-a-whole positive working relationships in later group sessions. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
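
    One common way to separate "actor" from "partner" perceptions in group data (an assumption here; the study's exact scoring may differ) is to take each member's own rating as the actor score and the leave-one-out mean of the other members' ratings as the partner score:

    ```python
    import numpy as np

    # Hypothetical positive-bonding ratings from one 5-member group at a session.
    ratings = np.array([4.0, 3.5, 5.0, 2.5, 4.5])

    n = len(ratings)
    total = ratings.sum()
    actor = ratings                        # each member's own perception
    partner = (total - ratings) / (n - 1)  # mean of the other members' ratings

    print(partner)
    ```

    A useful sanity check on this scoring: the mean of the partner scores equals the group mean of the original ratings, so actor and partner effects capture individual deviation versus group-level perception rather than two different totals.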

  20. [A Group Cognitive-Behavioural Intervention to Prevent Depression Relapse in Individuals Having Recently Returned to Work: Protocol and Feasibility].

    PubMed

    Lecomte, Tania; Corbière, Marc

Workplace depression is one of the major causes of sick leave and loss of productivity at work. Many studies have investigated factors predicting return to work for people with depression, including studies evaluating return-to-work programs and organizational factors. Yet few studies have targeted the prevention of depressive relapses at work, even though more than half of those who have had a depression will have a depressive relapse in the near future. Objectives: This article describes a research protocol involving a novel group intervention based on cognitive behavioural principles, with the aim of optimizing return to work and diminishing the risk of depressive relapses. Method: This pilot study follows a randomized controlled trial design, with half the participants (N=25) receiving the group intervention and the other half (N=25) receiving usual services. The theoretical and empirical underpinnings of the intervention are described, along with a detailed presentation of the intervention and of the study's objectives.
The group intervention consists of 8 sessions in which cognitive behavioural therapy (CBT) principles and techniques are applied to the following themes: (1) Coping with stress at work; (2) Recognizing and modifying my dysfunctional beliefs linked to work; (3) Overcoming obstacles linked to work functioning and maintaining work; (4) Negotiating needed work adjustments with the support of the immediate supervisor; (5) Finding my strengths and competencies related to work; (6) Accepting criticism and asserting myself appropriately at work; (7) Uncovering my best coping strategies for work. Results: Qualitative feedback from participants in the first two cohorts on their group experience revealed that the intervention was perceived as very useful by all, with group support (namely harmony and interpersonal support) as well as CBT strategies being mentioned specifically. Conclusion: Finally, the potential relevance of the group intervention is brought forward.
