Sample records for sample curation facility

  1. Recovery, Transportation and Acceptance to the Curation Facility of the Hayabusa Re-Entry Capsule

    NASA Technical Reports Server (NTRS)

    Abe, M.; Fujimura, A.; Yano, H.; Okamoto, C.; Okada, T.; Yada, T.; Ishibashi, Y.; Shirai, K.; Nakamura, T.; Noguchi, T.

    2011-01-01

    The "Hayabusa" re-entry capsule was safely carried into the clean room of Sagamihara Planetary Sample Curation Facility in JAXA on June 18, 2010. After executing computed tomographic (CT) scanning, removal of heat shield, and surface cleaning of sample container, the sample container was enclosed into the clean chamber. After opening the sample container and residual gas sampling in the clean chamber, optical observation, sample recovery, sample separation for initial analysis will be performed. This curation work is continuing for several manths with some selected member of Hayabusa Asteroidal Sample Preliminary Examination Team (HASPET). We report here on the 'Hayabusa' capsule recovery operation, and transportation and acceptance at the curation facility of the Hayabusa re-entry capsule.

  2. Sample Transport for a European Sample Curation Facility

    NASA Astrophysics Data System (ADS)

    Berthoud, L.; Vrublevskis, J. B.; Bennett, A.; Pottage, T.; Bridges, J. C.; Holt, J. M. C.; Dirri, F.; Longobardo, A.; Palomba, E.; Russell, S.; Smith, C.

    2018-04-01

    This work examined the recovery of a Mars Sample Return capsule once it arrives on Earth. It covers possible landing sites, planetary protection requirements, and transportation from the landing site to a European Sample Curation Facility.

  3. EURO-CARES as Roadmap for a European Sample Curation Facility

    NASA Astrophysics Data System (ADS)

    Brucato, J. R.; Russell, S.; Smith, C.; Hutzler, A.; Meneghin, A.; Aléon, J.; Bennett, A.; Berthoud, L.; Bridges, J.; Debaille, V.; Ferrière, L.; Folco, L.; Foucher, F.; Franchi, I.; Gounelle, M.; Grady, M.; Leuko, S.; Longobardo, A.; Palomba, E.; Pottage, T.; Rettberg, P.; Vrublevskis, J.; Westall, F.; Zipfel, J.; Euro-Cares Team

    2018-04-01

    EURO-CARES is a three-year multinational project funded under the European Commission Horizon 2020 research program to develop a roadmap for a European Extraterrestrial Sample Curation Facility for samples returned from solar system missions.

  4. The Astromaterials X-Ray Computed Tomography Laboratory at Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Zeigler, R. A.; Coleff, D. M.; McCubbin, F. M.

    2017-01-01

    The Astromaterials Acquisition and Curation Office at NASA's Johnson Space Center (hereafter JSC curation) is the past, present, and future home of all of NASA's astromaterials sample collections. JSC curation currently houses all or part of nine different sample collections: (1) Apollo samples (1969), (2) Luna samples (1972), (3) Antarctic meteorites (1976), (4) Cosmic Dust particles (1981), (5) Microparticle Impact Collection (1985), (6) Genesis solar wind atoms (2004), (7) Stardust comet Wild-2 particles (2006), (8) Stardust interstellar particles (2006), and (9) Hayabusa asteroid Itokawa particles (2010). Each sample collection is housed in a dedicated clean room, or suite of clean rooms, that is tailored to the requirements of that sample collection. Our primary goals are to maintain the long-term integrity of the samples and ensure that the samples are distributed for scientific study in a fair, timely, and responsible manner, thus maximizing the return on each sample. Part of the curation process is planning for the future, and we also perform fundamental research in advanced curation initiatives. Advanced Curation is tasked with developing procedures, technology, and data sets necessary for curating new types of sample collections, or getting new results from existing sample collections [2]. We are (and have been) planning for future curation, including cold curation, extended curation of ices and volatiles, curation of samples with special chemical considerations such as perchlorate-rich samples, and curation of organically- and biologically-sensitive samples. As part of these advanced curation efforts we are augmenting our analytical facilities as well. A micro X-ray computed tomography (micro-XCT) laboratory dedicated to the study of astromaterials will come online this spring within the JSC curation office, and we plan to add additional facilities that will enable nondestructive (or minimally destructive) analyses of astromaterials in the near future (micro-XRF, confocal imaging Raman spectroscopy). These facilities will be available to: (1) develop sample handling and storage techniques for future sample return missions; (2) be utilized by preliminary examination teams (PET) for future sample return missions; (3) be used for retroactive PET-style analyses of our existing collections; and (4) be used for periodic assessments of the existing sample collections. Here we describe the new micro-XCT system, as well as some of the ongoing or anticipated applications of the instrument.

  5. Curating NASA's Past, Present, and Future Astromaterial Sample Collections

    NASA Technical Reports Server (NTRS)

    Zeigler, R. A.; Allton, J. H.; Evans, C. A.; Fries, M. D.; McCubbin, F. M.; Nakamura-Messenger, K.; Righter, K.; Zolensky, M.; Stansbery, E. K.

    2016-01-01

    The Astromaterials Acquisition and Curation Office at NASA Johnson Space Center (hereafter JSC curation) is responsible for curating all of NASA's extraterrestrial samples. JSC presently curates nine different astromaterials collections in seven different clean-room suites: (1) Apollo Samples (ISO (International Organization for Standardization) class 6 + 7); (2) Antarctic Meteorites (ISO 6 + 7); (3) Cosmic Dust Particles (ISO 5); (4) Microparticle Impact Collection (ISO 7; formerly called Space-Exposed Hardware); (5) Genesis Solar Wind Atoms (ISO 4); (6) Stardust Comet Particles (ISO 5); (7) Stardust Interstellar Particles (ISO 5); (8) Hayabusa Asteroid Particles (ISO 5); (9) OSIRIS-REx Spacecraft Coupons and Witness Plates (ISO 7). Additional cleanrooms are currently being planned to house samples from two new collections, Hayabusa 2 (2021) and OSIRIS-REx (2023). In addition to the labs that house the samples, we maintain a wide variety of infrastructure facilities required to support the clean rooms: HEPA-filtered air-handling systems, ultrapure dry gaseous nitrogen systems, an ultrapure water system, and cleaning facilities to provide clean tools and equipment for the labs. We also have sample preparation facilities for making thin sections, microtome sections, and even focused ion-beam sections. We routinely monitor the cleanliness of our clean rooms and infrastructure systems, including measurements of inorganic and organic contamination, weekly airborne particle counts, compositional and isotopic monitoring of liquid N2 deliveries, and daily UPW system monitoring. In addition to the physical maintenance of the samples, we track within our databases the current and ever-changing characteristics (weight, location, etc.) of more than 250,000 individually numbered samples across our various collections, as well as more than 100,000 images and countless "analog" records that document the processing history of each individual sample. JSC curation is co-located with JSC's Astromaterials Research Office, which houses a world-class suite of analytical instrumentation and scientists. We leverage these labs and personnel to better curate the samples. Part of the curation process is planning for the future, and we refer to these planning efforts as "advanced curation". Advanced Curation is tasked with developing procedures, technology, and data sets necessary for curating new types of collections as envisioned by NASA exploration goals. We are (and have been) planning for future curation, including cold curation, extended curation of ices and volatiles, curation of samples with special chemical considerations such as perchlorate-rich samples, and curation of organically- and biologically-sensitive samples.
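
    For context on the ISO classes cited in this record: ISO 14644-1 defines cleanroom class N by the maximum permitted concentration C_n of airborne particles of diameter D (in micrometers) and larger,

        C_n = 10^N \cdot \left( \frac{0.1}{D} \right)^{2.08} \ \text{particles/m}^3,

    so an ISO 5 lab (e.g., the Stardust or Hayabusa suites above) allows at most 10^5 x (0.1/0.5)^{2.08}, or roughly 3,520 particles of size 0.5 µm and larger per cubic meter of air, versus roughly 352,000 for ISO 7. (The formula is standard ISO 14644-1 background, not taken from the record itself.)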

  6. Mars Sample Handling and Requirements Panel (MSHARP)

    NASA Technical Reports Server (NTRS)

    Carr, Michael H.; McCleese, Daniel J.; Bada, Jeffrey L.; Bogard, Donald D.; Clark, Benton C.; DeVincenzi, Donald; Drake, Michael J.; Nealson, Kenneth H.; Papike, James J.; Race, Margaret S.

    1999-01-01

    In anticipation of the return of samples from Mars toward the end of the first decade of the next century, NASA's Office of Space Science chartered a panel to examine how Mars samples should be handled. The panel was to make recommendations in three areas: (1) sample collection and transport back to Earth; (2) certification of the samples as nonhazardous; and (3) sample receiving, curation, and distribution. This report summarizes the findings of that panel. The samples should be treated as hazardous until proven otherwise. They are to be sealed within a canister on Mars, and the canister is not to be opened until it is within a Biosafety Level 4 (BSL-4) containment facility here on Earth. This facility must also meet or exceed the cleanliness requirements of the Johnson Space Center (JSC) facility for curation of extraterrestrial materials. A containment facility meeting both these requirements does not yet exist. Hazard assessment and life detection experiments are to be done at the containment facility, while geochemical characterization is performed on a sterilized subset of the samples released to the science community. When and if the samples are proven harmless, they are to be transferred to a curation facility, such as that at JSC.

  7. Planning Related to the Curation and Processing of Returned Martian Samples

    NASA Astrophysics Data System (ADS)

    McCubbin, F. M.; Harrington, A. D.

    2018-04-01

    Many of the planning activities in the NASA Astromaterials Acquisition and Curation Office at JSC are centered around Mars Sample Return. The importance of contamination knowledge and the benefits of a mobile/modular receiving facility are discussed.

  8. EURO-CARES: European Roadmap for a Sample Return Curation Facility and Planetary Protection Implications.

    NASA Astrophysics Data System (ADS)

    Brucato, John Robert

    2016-07-01

    A maturing European planetary exploration program and evolving sample return mission plans are gathering the interest of a wider scientific community. The interest stems from studying extraterrestrial samples in the laboratory, which provides new opportunities to address fundamental issues on the origin and evolution of the Solar System, on primordial cosmochemistry, on the nature of the building blocks of terrestrial planets, and on the origin of life. Major space agencies are currently planning missions that will collect samples from a variety of Solar System environments: from primitive (carbonaceous) small bodies, from the Moon, Mars and its moons, and, finally, from icy moons of the outer planets. A dedicated sample return curation facility is seen as an essential requirement for the receiving, assessment, characterization and secure preservation of the collected extraterrestrial samples, and potentially for their safe distribution to the scientific community. EURO-CARES is a European Commission study funded under the Horizon 2020 program. The strategic objective of EURO-CARES is to create a roadmap for the implementation of a European Extraterrestrial Sample Curation Facility. The facility has to provide safe storage and handling of extraterrestrial samples and has to enable their preliminary characterization in order to achieve the required effectiveness and collaborative outcomes for the whole international scientific community. For example, samples returned from Mars could pose a threat to the Earth's biosphere if any living extraterrestrial organisms are present in the samples. Planetary protection is therefore an essential aspect of all Mars sample return missions and will affect retrieval and transport from the point of return, sample handling, infrastructure, methodology and management of a future curation facility. Analysis of the state of the art of planetary protection technology shows there are considerable possibilities to define and develop the technical and scientific features of a sample return mission and the infrastructural, procedural and legal issues that consequently bear on a curation facility. This specialist facility will be designed with consideration drawn from high-containment laboratories and cleanroom facilities, to protect the Earth from contamination with potential Martian organisms and to protect the samples from terrestrial contamination. This kind of integrated facility does not currently exist, which emphasises the need for an innovative, integrated and multidisciplinary design approach to enable the ultimate science goals of such exploration. The issues of how planetary protection considerations impact the system technologies and scientific measurements, with the final aim of prioritizing outstanding technology needs, are presented in the framework of sample return study missions and the Horizon 2020 EURO-CARES project.

  9. Improving the Discoverability and Availability of Sample Data and Imagery in NASA's Astromaterials Curation Digital Repository Using a New Common Architecture for Sample Databases

    NASA Technical Reports Server (NTRS)

    Todd, N. S.; Evans, C.

    2015-01-01

    The Astromaterials Acquisition and Curation Office at NASA's Johnson Space Center (JSC) is the designated facility for curating all of NASA's extraterrestrial samples. The suite of collections includes the lunar samples from the Apollo missions, cosmic dust particles falling into the Earth's atmosphere, meteorites collected in Antarctica, comet and interstellar dust particles from the Stardust mission, asteroid particles from the Japanese Hayabusa mission, and solar wind atoms collected during the Genesis mission. To support planetary science research on these samples, NASA's Astromaterials Curation Office hosts the Astromaterials Curation Digital Repository, which provides descriptions of the missions and collections, and critical information about each individual sample. Our office is implementing several informatics initiatives with the goal of better serving the planetary research community. One of these initiatives aims to increase the availability and discoverability of sample data and images through the use of a newly designed common architecture for Astromaterials Curation databases.
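
    The record above does not spell out the database design, so the following is only a sketch of what a "common architecture" across heterogeneous sample collections typically means: one shared core model that each collection-specific legacy database maps onto, so a single portal can search all of them uniformly. All names here are illustrative assumptions, not the actual JSC schema.

        from dataclasses import dataclass, field
        from typing import List, Optional

        # Hypothetical shared core record: every collection (Apollo, Antarctic
        # meteorites, Stardust, Hayabusa, ...) maps its legacy table onto this
        # one model so a single access point can query them all.
        @dataclass
        class Sample:
            collection: str               # e.g. "Apollo", "Hayabusa"
            sample_id: str                # collection-specific ID, e.g. "10084"
            parent_id: Optional[str]      # parent sample if this is a split
            mass_g: float                 # current mass in grams
            location: str                 # storage cabinet / laboratory
            images: List[str] = field(default_factory=list)  # image URLs

        def subdivide(parent: Sample, new_id: str, mass_g: float) -> Sample:
            """Split off a subsample while conserving mass in the parent record."""
            if not 0 < mass_g <= parent.mass_g:
                raise ValueError("subsample mass must be positive and <= parent mass")
            parent.mass_g -= mass_g
            return Sample(parent.collection, new_id, parent.sample_id,
                          mass_g, parent.location)

    The parent/child link is the key design choice in such a model: it lets every allocation or split be traced back to the originally returned sample, which is what curation databases of this kind must preserve.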

  10. Curating NASA's Astromaterials Collections: Past, Present, and Future

    NASA Technical Reports Server (NTRS)

    Zeigler, Ryan

    2015-01-01

    Planning for the curation of samples from future sample return missions must begin during the initial planning stages of a mission. Waiting until the samples have been returned to Earth, or even until construction of the spacecraft begins, is too late. A lack of proper planning could lead to irreversible contamination of the samples, which in turn would compromise the scientific integrity of the mission. For example, even though the Apollo missions first returned samples in 1969, planning for the curation facility began in the early 1960s, and construction of the Lunar Receiving Laboratory was completed in 1967. In addition to designing the receiving facility and laboratory in which the samples will be characterized and stored, there are many aspects of contamination that must be addressed during the planning and building of the spacecraft: planetary protection (both outbound and inbound); cataloging, documenting, and preserving the materials used to build the spacecraft (also known as coupons); near real-time monitoring, using witness plates, of the environment in which the spacecraft is being built for critical aspects of contamination (known as contamination control); and long-term monitoring and preservation of the environment in which the spacecraft is being built for most aspects of potential contamination through the use of witness plates (known as contamination knowledge). The OSIRIS-REx asteroid sample return mission, currently being built, is addressing all of these aspects of contamination in order to ensure it returns the best-preserved sample possible. Coupons and witness plates from OSIRIS-REx are currently being studied and stored (for future studies) at the Johnson Space Center. Similarly, planning for the clean room facility at Johnson Space Center that will house the OSIRIS-REx samples is well advanced, and construction of the facility should begin in early 2017 (despite a nominal 2023 return date for OSIRIS-REx samples). Similar development is being done, in concert with JAXA, for the return of Hayabusa 2 samples (nominally in 2020). We are also actively developing advanced techniques like cold curation and organically clean curation in anticipation of future sample return missions such as comet nucleus sample return and Mars sample return.

  11. Technical Tension Between Achieving Particulate and Molecular Organic Environmental Cleanliness: Data from Astromaterial Curation Laboratories

    NASA Technical Reports Server (NTRS)

    Allton, J. H.; Burkett, P. J.

    2011-01-01

    NASA Johnson Space Center operates clean curation facilities for Apollo lunar, Antarctic meteorite, stratospheric cosmic dust, Stardust comet and Genesis solar wind samples. Each of these collections is curated separately due to unique requirements. The purpose of this abstract is to highlight the technical tension between providing particulate cleanliness and molecular cleanliness, illustrated using data from curation laboratories. Strict control of three components is required for curating samples cleanly: a clean environment; clean containers and tools that touch samples; and use of non-shedding materials of cleanable chemistry and smooth surface finish. This abstract focuses on environmental cleanliness and the technical tension between achieving particulate and molecular cleanliness. The environment in which a sample is manipulated or stored can be a room, an enclosed glovebox (or robotic isolation chamber), or an individual sample container.

  12. 50th Anniversary of the World's First Extraterrestrial Sample Receiving Laboratory: The Apollo Program's Lunar Receiving Laboratory

    NASA Technical Reports Server (NTRS)

    Calaway, M. J.; Allton, J. H.; Zeigler, R. A.; McCubbin, F. M.

    2017-01-01

    The Apollo program's Lunar Receiving Laboratory (LRL), building 37 at NASA's Manned Spacecraft Center (MSC), now Johnson Space Center (JSC), in Houston, TX, was the world's first astronaut and extraterrestrial sample quarantine facility (Fig. 1). It was constructed by Warrior Construction Co. and Warrior-Natkin-National at a cost of $8.1M between August 10, 1966 and June 26, 1967. In 1969, the LRL received and curated the first collection of extraterrestrial samples returned to Earth: the rock and soil samples of the Apollo 11 mission. This year, the JSC Astromaterials Acquisition and Curation Office (hereafter JSC curation) celebrates 50 years since the opening of the LRL and its legacy of laying the foundation for modern curation of extraterrestrial samples.

  13. Evolution of the Lunar Receiving Laboratory to the Astromaterial Sample Curation Facility: Technical Tensions Between Containment and Cleanliness, Between Particulate and Organic Cleanliness

    NASA Technical Reports Server (NTRS)

    Allton, J. H.; Zeigler, R. A.; Calaway, M. J.

    2016-01-01

    The Lunar Receiving Laboratory (LRL) was planned and constructed in the 1960s to support the Apollo program in the context of landing on the Moon and safely returning humans. The enduring science return from that effort is a result of careful curation of planetary materials. Technical decisions for the first facility included the sample handling environment (vacuum vs. inert gas) and the instruments for making basic sample assessments, but the most difficult decision, and the most visible, was stringent biosafety vs. ultra-clean sample handling. Biosafety required handling of samples in negative-pressure gloveboxes and rooms for containment, and the use of sterilizing protocols and animal/plant models for hazard assessment. Ultra-clean sample handling worked best in positive-pressure nitrogen-environment gloveboxes in positive-pressure rooms, using cleanable tools of tightly controlled composition. The requirements for these two objectives were so different that the solution was to design and build a new facility for the specific purpose of preserving the scientific integrity of the samples. The resulting Lunar Curatorial Facility was designed and constructed, from 1972 to 1979, with advice and oversight by a very active committee of lunar sample scientists. The high-precision analyses required for planetary science are enabled by stringent contamination control of trace elements in the materials and protocols of construction (e.g., trace-element screening of paint and flooring materials) and in the equipment used in sample handling and storage. As other astromaterials, especially small particles and atoms, were added to the curated collections, the technical tension between particulate cleanliness and organic cleanliness was addressed in more detail. Techniques for minimizing particulate contamination in sample handling environments use high-efficiency air filtering that typically requires organic sealants, which offgas. Protocols for reducing adventitious carbon on sample handling surfaces often generate particles. Further work is needed to achieve both minimal particulate and minimal adventitious carbon contamination. This paper will discuss these facility topics and others in the historical context of nearly 50 years' curation experience for lunar rocks and regolith, meteorites, cosmic dust, comet particles, solar wind atoms, and asteroid particles at Johnson Space Center.

  14. Curation, Spacecraft Recovery and Preliminary Examination for the Stardust Mission: A Perspective from the Curatorial Facility

    NASA Technical Reports Server (NTRS)

    Zolensky, Michael; Nakamura-Messenger, Keiko; Fletcher, Lisa; See, Thomas

    2008-01-01

    We briefly describe some of the challenges to the Stardust mission, curation and sample preliminary analysis, from the perspective of the Curation Office at the Johnson Space Center. Our goal is to inform persons planning future sample returns, so that they may learn from both our successes and challenges (and avoid some of our mistakes). The Curation office played a role in the mission from its inception, most critically assisting in the design and implementation of the spacecraft contamination control plan, and in planning and documenting the recovery of the spacecraft reentry capsule in Utah. A unique class 100 cleanroom was built to maintain the returned comet and interstellar samples in clean comfort, and to permit dissection and allocation of samples for analysis.

  15. Apollo Lunar Sample Photographs: Digitizing the Moon Rock Collection

    NASA Technical Reports Server (NTRS)

    Lofgren, Gary E.; Todd, Nancy S.; Runco, S. K.; Stefanov, W. L.

    2011-01-01

    The Acquisition and Curation Office at JSC has undertaken a 4-year data restoration project for the lunar science community, funded by the LASER program (Lunar Advanced Science and Exploration Research), to digitize photographs of the Apollo lunar rock samples and create high-resolution digital images. These sample photographs are not easily accessible outside of JSC, and currently exist only on degradable film in the Curation Data Storage Facility.

  16. NASA Curation Preparation for Ryugu Sample Returned by JAXA's Hayabusa2 Mission

    NASA Technical Reports Server (NTRS)

    Nakamura-Messenger, Keiko; Righter, Kevin; Snead, Christopher J.; McCubbin, Francis M.; Pace, Lisa F.; Zeigler, Ryan A.; Evans, Cindy

    2017-01-01

    The NASA OSIRIS-REx and JAXA Hayabusa2 missions to near-Earth asteroids Bennu and Ryugu share similar mission goals of understanding the origins of primitive, organic-rich asteroids. Under an agreement between JAXA and NASA, there is an ongoing and productive collaboration between the science teams of the Hayabusa2 and OSIRIS-REx missions. Under this agreement, a portion of each of the returned sample masses will be exchanged between the agencies and the scientific results of their study will be shared. NASA's portion of the returned Hayabusa2 sample, consisting of 10% of the returned mass, will be jointly separated by NASA and JAXA. The sample will be legally and physically transferred to NASA's dedicated Hayabusa2 curation facility at Johnson Space Center (JSC) no later than one year after the return of the Hayabusa2 sample to Earth (December 2020). The JSC Hayabusa2 curation cleanroom facility design has now been completed. In the same manner, JAXA will receive 0.5% of the total returned OSIRIS-REx sample (minimum required sample return of 60 g, maximum sample return capacity of 2 kg). No later than one year after the return of the OSIRIS-REx sample to Earth (September 2023), legal, physical, and permanent custody of this sample subset will be transferred to JAXA, and the sample subset will be brought to JAXA's Extraterrestrial Sample Curation Center (ESCuC) at the Institute of Space and Astronautical Science, Sagamihara City, Japan.
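
    A quick arithmetic check on the exchange quantities stated in this record: JAXA's 0.5% share of the OSIRIS-REx return brackets as

        0.005 \times 60\ \text{g} = 0.3\ \text{g}, \qquad 0.005 \times 2\ \text{kg} = 10\ \text{g},

    i.e., between 0.3 g (at the minimum required return) and 10 g (at the maximum capacity of the sampling mechanism), depending on the mass actually collected.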

  17. Hayabusa Recovery, Curation and Preliminary Sample Analysis: Lessons Learned from Recent Sample Return Mission

    NASA Technical Reports Server (NTRS)

    Zolensky, Michael E.

    2011-01-01

    I describe lessons learned from my participation in the Hayabusa mission, which returned regolith grains from asteroid Itokawa in 2010 [1], comparing this with the recently returned Stardust spacecraft, which sampled the Jupiter-family comet Wild 2. Spacecraft Recovery Operations: The mission science and curation teams must actively participate in planning, testing and implementing spacecraft recovery operations. The crash of the Genesis spacecraft underscored the importance of thinking through multiple contingency scenarios and practicing field recovery for these potential circumstances. Having contingency supplies on hand was critical, and at least one full year of planning for the Stardust and Hayabusa recovery operations was necessary. Care must be taken to coordinate recovery operations with local organizations and to inform relevant government bodies well in advance. Recovery plans for both Stardust and Hayabusa had to be adjusted for unexpectedly wet landing site conditions. Documentation of every step of spacecraft recovery and deintegration was necessary, and collection and analysis of launch and landing site soils was critical. We found the operation of the Woomera Test Range (South Australia) to be excellent in the case of Hayabusa, and in many respects this site is superior to the Utah Test and Training Range (used for Stardust) in the USA. Recovery operations for all recovered spacecraft suffered from the lack of a hermetic seal for the samples. Mission engineers should be pushed to provide hermetic seals for returned samples. Sample Curation Issues: More than two full years were required to prepare curation facilities for Stardust and Hayabusa. Despite this seemingly adequate lead time, major changes to curation procedures were required once the actual state of the returned samples became apparent. Sample databases must be fully implemented before sample return; for Stardust we did not adequately think through all of the possible subsampling and analytical activities before settling on a database design, and Hayabusa has done a better job of this. Also, analysis teams must not be permitted to devise their own sample naming schemes. The sample handling and storage facilities for Hayabusa are the finest that exist, and we are now modifying Stardust curation to take advantage of the Hayabusa facilities. Remote storage of a sample subset is desirable. Preliminary Examination (PE) of Samples: There must be some determination of the state and quantity of the returned samples, to provide a necessary guide to persons requesting samples and to oversight committees tasked with sample curation oversight. Hayabusa's sample PE, called HASPET, was designed so that late additions to the analysis protocols were possible as new analytical techniques became available. A small but representative number of recovered grains are being subjected to in-depth characterization. The bulk of the recovered samples are being left untouched, to limit contamination. The HASPET plan takes maximum advantage of the unique strengths of sample return missions.

  18. Hayabusa Asteroidal Sample Preliminary Examination Team (HASPET) and the Astromaterial Curation Facility at JAXA/ISAS

    NASA Astrophysics Data System (ADS)

    Yano, H.; Fujiwara, A.

    After its successful launch in May 2003, the Hayabusa (MUSES-C) mission of JAXA/ISAS will collect surface materials (e.g., regolith) of several hundred mg to several g in total from the S-type near-Earth asteroid (25143) Itokawa in late 2005 and bring them back to ground laboratories in the summer of 2007. The retrieved samples will be given initial analysis at the JAXA/ISAS astromaterial curation facility, which is currently in preparation for construction, by the Hayabusa Asteroidal Sample Preliminary Examination Team (HASPET). HASPET consists of the ISAS Hayabusa team, international partners from NASA and Australia, and meteoritic scientists from across Japan selected to carry out outsourced parts of the initial analyses. The initial analysis to characterize general aspects of the returned samples may consume only 15% of their total mass and must complete the whole analysis, including database building, before the international AO for detailed analyses, within a maximum of 1 year. Confident execution of non-destructive micro-analyses, whenever possible, is thus vital for the HASPET analysis. To survey what kinds and levels of micro-analysis techniques are currently available in Japan in the respective fields, from major elements and mineralogy to trace elements, isotopes and organics, ISAS conducted open HASPET competitions in 2000-01 and 2004. The initial evaluation was made by multiple domestic peer reviews. Applicants were then provided two kinds of unknown asteroid-sample analogs in order to conduct their proposed analyses with self-claimed amounts of sample in self-claimed durations. After the completion of multiple international peer reviews, the Selection Committee compiled the evaluations and recommended the finalists of each round. The final members of HASPET will be appointed about 2 years prior to the Earth return. They will then conduct a test run of the whole initial analysis procedure at the ISAS astromaterial curation facility and at their respective analysis facilities. This talk also summarizes the curation facility design and the planned initial analysis procedure flow.

  19. The Hayabusa Curation Facility at Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Zolensky, M.; Bastien, R.; McCann, B.; Frank, D.; Gonzalez, C.; Rodriguez, M.

    2013-01-01

    The Japan Aerospace Exploration Agency (JAXA) Hayabusa spacecraft made contact with the asteroid 25143 Itokawa and collected regolith dust from the Muses Sea region of smooth terrain [1]. The spacecraft returned to Earth with more than 10,000 grains ranging in size from just over 300 µm to less than 10 µm [2, 3]. These grains represent the only collection of material returned from an asteroid by a spacecraft. As part of the joint agreement between JAXA and NASA for the mission, 10% of the Hayabusa grains are being transferred to NASA for parallel curation and allocation. In order to properly receive, process, and curate these samples, a new curation facility was established at Johnson Space Center (JSC). Since the Hayabusa samples within the JAXA curation facility have been stored free from exposure to terrestrial atmosphere and contamination [4], one of the goals of the new NASA curation facility was to continue this treatment. An existing lab space at JSC was transformed into a 120 sq. ft. ISO class 4 (equivalent to the original class 10 standard) clean room. Hayabusa samples are stored, observed, processed, and packaged for allocation inside a stainless steel glove box under dry N2. Construction of the clean laboratory was completed in 2012. Currently, 25 Itokawa particles are housed in NASA's Hayabusa Lab. Special care was taken during lab construction to remove or contain materials that might contribute contaminant particles in the same size range as the Hayabusa grains. Several witness plates of various materials are installed around the clean lab and within the glove box to permit characterization of local contaminants at regular intervals by SEM and mass spectrometry, and particle counts of the lab environment are frequently acquired. Of particular interest is anodized aluminum, which contains copious sub-mm grains of a multitude of different materials embedded in its upper surface. Unfortunately, the use of anodized aluminum was necessary in the construction of the clean room frame to strengthen it and eliminate corrosion and wear over time. All anodized aluminum interior to the lab was thus covered or replaced by minimally contaminating materials.

  20. Organic Contamination Baseline Study in NASA Johnson Space Center Astromaterials Curation Laboratories

    NASA Technical Reports Server (NTRS)

    Calaway, Michael J.; Allen, Carlton C.; Allton, Judith H.

    2014-01-01

    Future robotic and human spaceflight missions to the Moon, Mars, asteroids, and comets will require curating astromaterial samples with minimal inorganic and organic contamination to preserve the scientific integrity of each sample. 21st-century sample return missions will focus on strict protocols for reducing organic contamination that have not been seen since the Apollo manned lunar landing program. To properly curate these materials, the Astromaterials Acquisition and Curation Office under the Astromaterials Research and Exploration Science Directorate at NASA Johnson Space Center houses and protects all extraterrestrial materials brought back to Earth that are controlled by the United States government. During fiscal year 2012, we conducted a year-long project to compile historical documentation and laboratory tests involving organic investigations at these facilities. In addition, we developed a plan to determine the current state of organic cleanliness in curation laboratories housing astromaterials. This was accomplished by focusing on current procedures and protocols for cleaning, sample handling, and storage. While the intention of this report is to give a comprehensive overview of the current state of organic cleanliness in JSC curation laboratories, it also provides a baseline for determining whether our cleaning procedures and sample handling protocols need to be adapted and/or augmented to meet the new requirements for future human spaceflight and robotic sample return missions.

  1. Planetary Sample Analysis Laboratory at DLR

    NASA Astrophysics Data System (ADS)

    Helbert, J.; Maturilli, A.; de Vera, J. P.

    2018-04-01

    Building on its available infrastructure and long heritage, DLR is planning to create a Planetary Sample Analysis (PSA) laboratory, which can later be extended to a full sample curation facility in collaboration with the Robert Koch Institute.

  2. Technology Development and Advanced Planning for Curation of Returned Mars Samples

    NASA Technical Reports Server (NTRS)

    Lindstrom, David J.; Allen, Carlton C.

    2002-01-01

    NASA Johnson Space Center (JSC) curates extraterrestrial samples, providing the international science community with lunar rock and soil returned by the Apollo astronauts, meteorites collected in Antarctica, cosmic dust collected in the stratosphere, and hardware exposed to the space environment. Curation comprises initial characterization of new samples, preparation and allocation of samples for research, and clean, secure long-term storage. The foundations of this effort are the specialized cleanrooms (class 10 to 10,000) for each of the four types of materials, the supporting facilities, and the people, many of whom have been doing detailed work in clean environments for decades. JSC is also preparing to curate the next generation of extraterrestrial samples. These include samples collected from the solar wind, a comet, and an asteroid. Early planning and R&D are underway to support post-mission sample handling and curation of samples returned from Mars. One of the strong scientific reasons for returning samples from Mars is to search for evidence of current or past life in the samples. Because of the remote possibility that the samples may contain life forms that are hazardous to the terrestrial biosphere, the National Research Council has recommended that all samples returned from Mars be kept under strict biological containment until tests show that they can safely be released to other laboratories. It is possible that Mars samples may contain only scarce or subtle traces of life or prebiotic chemistry that could readily be overwhelmed by terrestrial contamination. Thus, the facilities used to contain, process, and analyze samples from Mars must have a combination of high-level biocontainment and organic/inorganic chemical cleanliness that is unprecedented. JSC has been conducting feasibility studies and developing designs for a sample receiving facility that would offer biocontainment at least the equivalent of current maximum containment BSL-4 (BioSafety Level 4) laboratories, while simultaneously maintaining cleanliness levels equaling those of state-of-the-art cleanrooms. Unique requirements for the processing of Mars samples have inspired a program to develop handling techniques that are much more precise and reliable than the approach (currently used for lunar samples) of employing gloved human hands in nitrogen-filled gloveboxes. Individual samples from Mars are expected to be much smaller than lunar samples, the total mass of samples returned by each mission being 0.5-1 kg, compared with many tens of kg of lunar samples returned by each of the six Apollo missions. Smaller samples require much more of the processing to be done under microscopic observation. In addition, the requirements for cleanliness and high-level containment would be difficult to satisfy while using traditional gloveboxes. JSC has constructed a laboratory to test concepts and technologies important to future sample curation. The Advanced Curation Laboratory includes a new-generation glovebox equipped with a robotic arm to evaluate the usability of robotic and teleoperated systems to perform curatorial tasks. The laboratory also contains equipment for precision cleaning and the measurement of trace organic contamination.

  3. Technology Development and Advanced Planning for Curation of Returned Mars Samples

    NASA Astrophysics Data System (ADS)

    Lindstrom, D. J.; Allen, C. C.

    2002-05-01

    NASA/Johnson Space Center (JSC) curates extraterrestrial samples, providing the international science community with lunar rock and soil returned by the Apollo astronauts, meteorites collected in Antarctica, cosmic dust collected in the stratosphere, and hardware exposed to the space environment. Curation comprises initial characterization of new samples, preparation and allocation of samples for research, and clean, secure long-term storage. The foundations of this effort are the specialized cleanrooms (class 10 to 10,000) for each of the four types of materials, the supporting facilities, and the people, many of whom have been doing detailed work in clean environments for decades. JSC is also preparing to curate the next generation of extraterrestrial samples. These include samples collected from the solar wind, a comet, and an asteroid. Early planning and R&D are underway to support post-mission sample handling and curation of samples returned from Mars. One of the strong scientific reasons for returning samples from Mars is to search for evidence of current or past life in the samples. Because of the remote possibility that the samples may contain life forms that are hazardous to the terrestrial biosphere, the National Research Council has recommended that all samples returned from Mars be kept under strict biological containment until tests show that they can safely be released to other laboratories. It is possible that Mars samples may contain only scarce or subtle traces of life or prebiotic chemistry that could readily be overwhelmed by terrestrial contamination. Thus, the facilities used to contain, process, and analyze samples from Mars must have a combination of high-level biocontainment and organic/inorganic chemical cleanliness that is unprecedented. JSC has been conducting feasibility studies and developing designs for a sample receiving facility that would offer biocontainment at least the equivalent of current maximum containment BSL-4 (BioSafety Level 4) laboratories, while simultaneously maintaining cleanliness levels equaling those of state-of-the-art cleanrooms. Unique requirements for the processing of Mars samples have inspired a program to develop handling techniques that are much more precise and reliable than the approach (currently used for lunar samples) of employing gloved human hands in nitrogen-filled gloveboxes. Individual samples from Mars are expected to be much smaller than lunar samples, the total mass of samples returned by each mission being 0.5-1 kg, compared with many tens of kg of lunar samples returned by each of the six Apollo missions. Smaller samples require much more of the processing to be done under microscopic observation. In addition, the requirements for cleanliness and high-level containment would be difficult to satisfy while using traditional gloveboxes. JSC has constructed a laboratory to test concepts and technologies important to future sample curation. The Advanced Curation Laboratory includes a new-generation glovebox equipped with a robotic arm to evaluate the usability of robotic and teleoperated systems to perform curatorial tasks. The laboratory also contains equipment for precision cleaning and the measurement of trace organic contamination.

  4. Astromaterials Acquisition and Curation Office (KT) Overview

    NASA Technical Reports Server (NTRS)

    Allen, Carlton

    2014-01-01

    The Astromaterials Acquisition and Curation Office has the unique responsibility to curate NASA's extraterrestrial samples - from past and forthcoming missions - into the indefinite future. Currently, curation includes documentation, preservation, physical security, preparation, and distribution of samples from the Moon, asteroids, comets, the solar wind, and the planet Mars. Each of these sample sets has a unique history and comes from a unique environment. The curation laboratories and procedures developed over 40 years have proven both necessary and sufficient to serve the evolving needs of a worldwide research community. A new generation of sample return missions to destinations across the solar system is being planned and proposed. The curators are developing the tools and techniques to meet the challenges of these new samples. Extraterrestrial samples pose unique curation requirements. These samples were formed and exist under conditions strikingly different from those on the Earth's surface. Terrestrial contamination would destroy much of the scientific significance of extraterrestrial materials. To preserve the research value of these precious samples, contamination must be minimized, understood, and documented. In addition, the samples must be preserved - as far as possible - from physical and chemical alteration. The elaborate curation facilities at JSC were designed and constructed, and have been operated for many years, to keep sample contamination and alteration to a minimum. Currently, JSC curates seven collections of extraterrestrial samples: (a) Lunar rocks and soils collected by the Apollo astronauts, (b) Meteorites collected on dedicated expeditions to Antarctica, (c) Cosmic dust collected by high-altitude NASA aircraft, (d) Solar wind atoms collected by the Genesis spacecraft, (e) Comet particles collected by the Stardust spacecraft, (f) Interstellar dust particles collected by the Stardust spacecraft, and (g) Asteroid soil particles collected by the Japan Aerospace Exploration Agency (JAXA) Hayabusa spacecraft. Each of these sample sets has a unique history and comes from a unique environment. We have developed specialized laboratories and practices over many years to preserve and protect the samples, not only for current research but for studies that may be carried out in the indefinite future.

  5. Sample Curation at a Lunar Outpost

    NASA Technical Reports Server (NTRS)

    Allen, Carlton C.; Lofgren, Gary E.; Treiman, A. H.; Lindstrom, Marilyn L.

    2007-01-01

    The six Apollo surface missions returned 2,196 individual rock and soil samples, with a total mass of 381.6 kg. Samples were collected based on visual examination by the astronauts and consultation with geologists in the science back room in Houston. The samples were photographed during collection, packaged in uniquely-identified containers, and transported to the Lunar Module. All samples collected on the Moon were returned to Earth. NASA's upcoming return to the Moon will be different. Astronauts will have extended stays at an outpost and will collect more samples than they will return. They will need curation and analysis facilities on the Moon in order to carefully select samples for return to Earth.

  6. Gains in efficiency and scientific potential of continental climate reconstruction provided by the LRC LacCore Facility, University of Minnesota

    NASA Astrophysics Data System (ADS)

    Noren, A.; Brady, K.; Myrbo, A.; Ito, E.

    2007-12-01

    Lacustrine sediment cores comprise an integral archive for the determination of continental paleoclimate, for their potentially high temporal resolution and for their ability to resolve spatial variability in climate across vast sections of the globe. Researchers studying these archives now have a large, nationally-funded, public facility dedicated to the support of their efforts. The LRC LacCore Facility, funded by NSF and the University of Minnesota, provides free or low-cost assistance to any portion of research projects, depending on the specific needs of the project. A large collection of field equipment (site survey equipment, coring devices, boats/platforms, water sampling devices) for nearly any lacustrine setting is available for rental, and Livingstone-type corers and drive rods may be purchased. LacCore staff can accompany field expeditions to operate these devices and curate samples, or provide training prior to device rental. The Facility maintains strong connections to experienced shipping agents and customs brokers, which vastly improves transport and importation of samples. In the lab, high-end instrumentation (e.g., multisensor loggers, high-resolution digital linescan cameras) provides a baseline of fundamental analyses before any sample material is consumed. LacCore staff provide support and training in lithological description, including smear-slide, XRD, and SEM analyses. The LRC botanical macrofossil reference collection is a valuable resource for both core description and detailed macrofossil analysis. Dedicated equipment and space for various subsample analyses streamlines these endeavors; subsamples for several analyses may be submitted for preparation or analysis by Facility technicians for a fee (e.g., carbon and sulfur coulometry, grain size, pollen sample preparation and analysis, charcoal, biogenic silica, LOI, freeze drying). The National Lacustrine Core Repository now curates ~9km of sediment cores from expeditions around the world, and stores metadata and analytical data for all cores processed at the facility. Any researcher may submit sample requests for material in archived cores. Supplies for field (e.g., polycarbonate pipe, endcaps), lab (e.g., sample containers, pollen sample spike), and curation (e.g., D-tubes) are sold at cost. In collaboration with facility users, staff continually develop new equipment, supplies, and procedures as needed in order to provide the best and most comprehensive set of services to the research community.

  7. Partnering with NASA JSC for Community Research Needs; Collaborative and Student Opportunities via Jacobs and PSAMS Initiative

    NASA Astrophysics Data System (ADS)

    Danielson, L. R.; Draper, D. S.

    2016-12-01

    NASA Johnson Space Center's (JSC) Astromaterials Research and Exploration Science Division houses a unique combination of laboratories and other assets for conducting cutting-edge planetary research. These facilities have been accessed for decades by outside scientists; over the past five years, the 16 full-time contract research and technical staff members in our division have hosted a total of 223 visiting researchers, representing 35 institutions. We intend to submit a proposal to NASA specifically for facilities support and establishment of our laboratories as a collective, PSAMS (Planetary Sample Analyses and Mission Science), which should result in substantial cost savings to PIs who wish to use our facilities. JSC is a recognized NASA center of excellence for curation, and in future will allow PIs easy access to samples in curation facilities that they have been approved to study. Our curation expertise could also be used for a collection of experimental run products and standards that could be shared and distributed to community members; these products could range from 1 bar controlled-atmosphere furnace, piston cylinder, and multi-anvil run products to shocked products. Coordinated analysis of samples is one of the major strengths of our division, where a single sample can be prepared with minimal destruction for a variety of chemical and structural analyses, from macro- to nano-scale. A CT scanner will be delivered in August 2016 and installed in the same building as all the other division experimental and analytical facilities, allowing users to construct a three-dimensional model of their run product and/or starting material before any destruction of their sample for follow-up analyses. The 3D printer may also be utilized to construct containers for diamond anvil cell experiments. Our staff scientists will work with PIs to maximize science return and serve the needs of the community. We welcome student visitors, and a graduate semester internship is available through Jacobs.

  8. Partnering With NASA JSC for Community Research Needs; Collaborative and Student Opportunities via Jacobs and PSAMS Initiative

    NASA Technical Reports Server (NTRS)

    Danielson, Lisa; Draper, David

    2016-01-01

    NASA Johnson Space Center's (JSC's) Astromaterials Research and Exploration Science (ARES) Division houses a unique combination of laboratories and other assets for conducting cutting-edge planetary research. These facilities have been accessed for decades by outside scientists; over the past five years, the 16 full-time contract research and technical staff members in our division have hosted a total of 223 visiting researchers, representing 35 institutions. In order to continue to provide this level of support to the planetary sciences community, and also expand our services and collaboration within the broader scientific community, we intend to submit a proposal to NASA specifically for facilities support and establishment of our laboratories as a collective, PSAMS (Planetary Sample Analyses and Mission Science). This initiative should result in substantial cost savings to PIs with NASA funding who wish to use our facilities. Another cost saving could be realized by aggregating visiting user experiments and analyses through COMPRES, which would be of particular interest to researchers in earth and materials sciences. JSC is a recognized NASA center of excellence for curation, and in future will allow PIs and mission teams easy access to samples in curation facilities that they have been approved to study. Our curation expertise could also be used for a collection of experimental run products that could be shared and distributed to COMPRES community members. These experimental run products could range from 1 bar controlled-atmosphere furnace, piston cylinder, multi-anvil, and CETUS (see companion abstract) run products to shocked products. Coordinated analysis of samples is one of the major strengths of our division, where a single sample can be prepared with minimal destruction for a variety of chemical and structural analyses, from macro- to nano-scale.

  9. Curation and Analysis of Samples from Comet Wild-2 Returned by NASA's Stardust Mission

    NASA Technical Reports Server (NTRS)

    Nakamura-Messenger, Keiko; Walker, Robert M.

    2015-01-01

    The NASA Stardust mission returned the first direct samples of a cometary coma, from comet 81P/Wild-2, in 2006. Intact capture of samples encountered at 6 km/s was enabled by the use of aerogel, an ultralow-density silica polymer. Approximately 1000 particles were captured, with micron and submicron materials distributed along mm-scale tracks. This sample collection method and the fine scale of the samples posed new challenges to the curation and cosmochemistry communities. Sample curation involved extensive, detailed photo-documentation and delicate micro-surgery to remove particles without loss from the aerogel tracks. This work had to be performed in a highly clean facility to minimize the potential for contamination. JSC Curation provided samples ranging from entire tracks to micrometer-sized particles to external investigators. From the analysis perspective, distinguishing cometary materials from aerogel and identifying potential alteration from the capture process were essential. Here, transmission electron microscopy (TEM) proved to be the key technique that made this possible. Based on TEM work by ourselves and others, a variety of surprising findings were reported, such as the observation of high-temperature phases resembling those found in meteorites, rare intact presolar grains, and scarce organic grains and submicrometer silicates. An important lesson from this experience is that curation and analysis teams must work closely together to understand the requirements and challenges of each task. The Stardust mission has also laid an important foundation for future sample returns, including OSIRIS-REx, Hayabusa2, and future cometary nucleus sample return missions.

  10. Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer Planning (OSIRIS-REx)

    NASA Technical Reports Server (NTRS)

    Nakamura-Messenger, Keiko; Messenger, Scott; Keller, Lindsay; Righter, Kevin

    2014-01-01

    Scientists at ARES are preparing to curate and analyze samples from the first U.S. mission to return samples from an asteroid. The Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer, or OSIRIS-REx, was selected by NASA as the third mission in its New Frontiers Program. The robotic spacecraft will launch in 2016 and rendezvous with the near-Earth asteroid Bennu in 2020. A robotic arm will collect at least 60 grams of material from the surface of the asteroid to be returned to Earth in 2023 for worldwide distribution by the NASA Astromaterials Curation Facility at ARES.

  11. NASA's Astromaterials Database: Enabling Research Through Increased Access to Sample Data, Metadata and Imagery

    NASA Technical Reports Server (NTRS)

    Evans, Cindy; Todd, Nancy

    2014-01-01

    The Astromaterials Acquisition & Curation Office at NASA's Johnson Space Center (JSC) is the designated facility for curating all of NASA's extraterrestrial samples. Today, the suite of collections includes the lunar samples from the Apollo missions, cosmic dust particles falling into the Earth's atmosphere, meteorites collected in Antarctica, comet and interstellar dust particles from the Stardust mission, asteroid particles from Japan's Hayabusa mission, solar wind atoms collected during the Genesis mission, and space-exposed hardware from several missions. To support planetary science research on these samples, JSC's Astromaterials Curation Office hosts NASA's Astromaterials Curation digital repository and data access portal [http://curator.jsc.nasa.gov/], providing descriptions of the missions and collections, and critical information about each individual sample. Our office is designing and implementing several informatics initiatives to better serve the planetary research community. First, we are re-hosting the basic database framework by consolidating legacy databases for individual collections and providing a uniform access point for information (descriptions, imagery, classification) on all of our samples. Second, we continue to upgrade and host digital compendia that summarize and highlight published findings on the samples (e.g., lunar samples, meteorites from Mars). We host high resolution imagery of samples as it becomes available, including newly scanned images of historical prints from the Apollo missions. Finally we are creating plans to collect and provide new data, including 3D imagery, point cloud data, micro CT data, and external links to other data sets on selected samples. Together, these individual efforts will provide unprecedented digital access to NASA's Astromaterials, enabling preservation of the samples through more specific and targeted requests, and supporting new planetary science research and collaborations on the samples.
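
    As a concrete illustration of what such a "uniform access point" enables, the sketch below queries a consolidated sample repository for records matching a collection and classification. The endpoint URL, route, and field names are hypothetical stand-ins for illustration only; the real portal is at http://curator.jsc.nasa.gov/ and its interface may differ entirely.

        import requests

        # Hypothetical consolidated endpoint, not the documented JSC API.
        BASE = "https://example.org/astromaterials/api"

        def find_samples(collection: str, classification: str) -> list:
            """Query one uniform access point across all sample collections."""
            resp = requests.get(
                BASE + "/samples",
                params={"collection": collection, "classification": classification},
                timeout=30,
            )
            resp.raise_for_status()  # fail loudly on HTTP errors
            return resp.json()

        # e.g., all Antarctic meteorites classified as shergottites
        for rec in find_samples("antarctic-meteorites", "shergottite"):
            print(rec["sample_id"], rec.get("mass_g"))

    The design point this illustrates is that a single query shape serves every collection once the legacy databases are consolidated, instead of one bespoke query per collection.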

  12. Advances in Astromaterials Curation: Supporting Future Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Evans, C. A.; Zeigler, R. A.; Fries, M. D.; Righter, K.; Allton, J. H.; Zolensky, M. E.; Calaway, M. J.; Bell, M. S.

    2015-01-01

    NASA's Astromaterials, curated at the Johnson Space Center in Houston, are the most extensive, best-documented, and least-contaminated extraterrestrial samples that are provided to the worldwide research community. These samples include lunar samples from the Apollo missions, meteorites collected over nearly 40 years of expeditions to Antarctica (providing samples of dozens of asteroid bodies, the Moon, and Mars), Genesis solar wind samples, cosmic dust collected by NASA's high altitude airplanes, Comet Wild 2 and interstellar dust samples from the Stardust mission, and asteroid samples from JAXA's Hayabusa mission. A full account of NASA's curation efforts for these collections is provided by Allen et al. [1]. On average, we annually allocate about 1500 individual samples from NASA's astromaterials collections to hundreds of researchers from around the world, including graduate students and post-doctoral scientists; our allocation rate has roughly doubled over the past 10 years. The curation protocols developed for the lunar samples returned from the Apollo missions remain relevant and are adapted to new and future missions. Several lessons from the Apollo missions, including the need for early involvement of curation scientists in mission planning [1], have been applied to all subsequent sample return campaigns. From the 2013 National Academy of Sciences report [2]: "Curation is the critical interface between sample return missions and laboratory research. Proper curation has maintained the scientific integrity and utility of the Apollo, Antarctic meteorite, and cosmic dust collections for decades. Each of these collections continues to yield important new science. In the past decade, new state-of-the-art curatorial facilities for the Genesis and Stardust missions were key to the scientific breakthroughs provided by these missions." The results speak for themselves: research on NASA's astromaterials results in hundreds of papers annually, yields fundamental discoveries about the evolution of the solar system (e.g. [3] and references contained therein), and serves the global scientific community as ground truth for current and planned missions such as NASA's Dawn mission to Vesta and Ceres, and the future OSIRIS-REx mission to asteroid Bennu [1,3].

  13. Curating NASA's future extraterrestrial sample collections: How do we achieve maximum proficiency?

    NASA Astrophysics Data System (ADS)

    McCubbin, Francis; Evans, Cynthia; Allton, Judith; Fries, Marc; Righter, Kevin; Zolensky, Michael; Zeigler, Ryan

    2016-07-01

    Introduction: The Astromaterials Acquisition and Curation Office (henceforth referred to herein as NASA Curation Office) at NASA Johnson Space Center (JSC) is responsible for curating all of NASA's extraterrestrial samples. Under the governing document, NASA Policy Directive (NPD) 7100.10E "Curation of Extraterrestrial Materials", JSC is charged with "The curation of all extraterrestrial material under NASA control, including future NASA missions." The Directive goes on to define Curation as including "…documentation, preservation, preparation, and distribution of samples for research, education, and public outreach." Here we describe some of the ongoing efforts to ensure that the future activities of the NASA Curation Office are working towards a state of maximum proficiency. Founding Principle: Curatorial activities began at JSC (Manned Spacecraft Center before 1973) as soon as design and construction planning for the Lunar Receiving Laboratory (LRL) began in 1964 [1], not with the return of the Apollo samples in 1969, nor with the completion of the LRL in 1967. This practice has since proven that curation begins as soon as a sample return mission is conceived, and this founding principle continues to return dividends today [e.g., 2]. The Next Decade: Part of the curation process is planning for the future, and we refer to these planning efforts as "advanced curation" [3]. Advanced Curation is tasked with developing procedures, technology, and data sets necessary for curating new types of collections as envisioned by NASA exploration goals. We are (and have been) planning for future curation, including cold curation, extended curation of ices and volatiles, curation of samples with special chemical considerations such as perchlorate-rich samples, curation of organically- and biologically-sensitive samples, and the use of minimally invasive analytical techniques (e.g., micro-CT, [4]) to characterize samples. These efforts will be useful for Mars Sample Return, Lunar South Pole-Aitken Basin Sample Return, and Comet Surface Sample Return, all of which were named in the NRC Planetary Science Decadal Survey 2013-2022. We are fully committed to pushing the boundaries of curation protocol as humans continue to push the boundaries of space exploration and sample return. However, to improve our ability to curate astromaterials collections of the future and to provide maximum protection to any returned samples, it is imperative that curation involvement commences at the time of mission conception. When curation involvement is at the ground floor of mission planning, it provides a mechanism by which the samples can be protected against project-level decisions that could undermine the scientific value of the returned samples. A notable example of one of the benefits of early curation involvement in mission planning is in the acquisition of contamination knowledge (CK). CK capture strategies are designed during the initial planning stages of a sample return mission, and they are implemented during all phases of the mission, from assembly, test, and launch operations (ATLO), through cruise and mission operations, to the point of preliminary examination after Earth return. CK is captured by witness materials and coupons exposed to the contamination environment in the assembly labs and on the spacecraft during launch, cruise, and operations. These materials, along with any procedural blanks and returned flight hardware, represent our CK capture for the returned samples and serve as a baseline against which analytical results can be vetted. Collection of CK is a critical part of being able to conduct and interpret data from organic geochemistry and biochemistry investigations of returned samples. The CK samples from a given mission are treated as part of the sample collection of that mission; hence they are part of the permanent archive that is maintained by the NASA Curation Office. We are in the midst of collecting witness plates and coupons for the OSIRIS-REx mission, and we are in the planning stages for similar activities for the Mars 2020 rover mission, which will be the first step in a multi-stage campaign to return martian samples to Earth. Concluding Remarks: The return of every extraterrestrial sample is a scientific investment, and the CK samples and any procedural blanks represent an insurance policy against imperfections in the sample-collection and sample-return process. The curation facilities and personnel are the primary managers of that investment, and the scientific community at large is the beneficiary. The NASA Curation Office at JSC has the assigned task of maintaining the long-term integrity of all of NASA's astromaterials and ensuring that the samples are distributed for scientific study in a fair, timely, and responsible manner. It is only through this openness and global collaboration in the study of astromaterials that the return on our scientific investments can be maximized. For information on requesting samples and becoming part of the global study of astromaterials, please visit curator.jsc.nasa.gov. References: [1] Mangus, S. & Larsen, W. (2004) NASA/CR-2004-208938, NASA, Washington, DC. [2] Allen, C. et al. (2011) Chemie der Erde - Geochemistry, 71, 1-20. [3] McCubbin, F.M. et al. (2016) 47th LPSC #2668. [4] Zeigler, R.A. et al. (2014) 45th LPSC #2665.
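
    The CK bookkeeping described in this abstract lends itself to a simple record structure. The following is a minimal sketch under our own assumed field names; it is not the NASA Curation Office's actual system.

    ```python
    # Illustrative tracking of contamination-knowledge (CK) witness materials
    # as first-class members of a mission's sample collection.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class WitnessCoupon:
        coupon_id: str
        material: str          # e.g. "Si wafer", "Al foil", "sapphire"
        mission_phase: str     # "ATLO", "cruise", "surface ops", "Earth return"
        exposure_start: str    # ISO 8601 dates
        exposure_end: str
        archived: bool = True  # CK is kept in the permanent archive

    def baseline(coupons: list[WitnessCoupon], phase: str) -> list[WitnessCoupon]:
        """CK coupons from a given phase, used to vet analyses of returned samples."""
        return [c for c in coupons if c.mission_phase == phase]
    ```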

  14. Lunar Processing Cabinet 2.0: Retrofitting Gloveboxes into the 21st Century

    NASA Technical Reports Server (NTRS)

    Calaway, M. J.

    2015-01-01

    In 2014, the Apollo 16 Lunar Processing Glovebox (cabinet 38) in the Lunar Curation Laboratory at NASA JSC received an upgrade that included new technology interfaces. A Jacobs Technology Innovation Project provided the primary resources to retrofit this glovebox for the 21st century. The NASA Astromaterials Acquisition & Curation Office continues the more than 40-year heritage of preserving lunar materials for future scientific studies in state-of-the-art facilities. This enhancement has not only modernized the contamination controls, but also provides new innovative tools for processing and characterizing lunar samples and supports real-time exchange of sample images and information with the scientific community throughout the world.

  15. Digital Management and Curation of the National Rock and Ore Collections at NMNH, Smithsonian

    NASA Astrophysics Data System (ADS)

    Cottrell, E.; Andrews, B.; Sorensen, S. S.; Hale, L. J.

    2011-12-01

    The National Museum of Natural History, Smithsonian Institution, is home to the world's largest curated rock collection. The collection houses 160,680 physical rock and ore specimen lots ("samples"), all of which already have a digital record that can be accessed by the public through a searchable web interface (http://collections.mnh.si.edu/search/ms/). In addition, there are 66 pending accessions that, when catalogued, will add approximately 60,000 specimen lots. NMNH's collections are digitally managed on the KE EMu platform, which has emerged as the premier system for managing collections in natural history museums worldwide. In 2010 the Smithsonian released an ambitious five-year Digitization Strategic Plan. In Mineral Sciences, new digitization efforts in the next five years will focus on integrating various digital resources for volcanic specimens. EMu sample records will link to the corresponding records for physical eruption information housed within the database of Smithsonian's Global Volcanism Program (GVP). Linkages are also planned between our digital records and geochemical databases (like EarthChem or PetDB) maintained by third parties. We anticipate that these linkages will increase the use of NMNH collections as well as engender new scholarly directions for research. Another large project the museum is currently undertaking involves the integration of the functionality of in-house designed Transaction Management software with the EMu database. This will allow access to the details (borrower, quantity, date, and purpose) of all loans of a given specimen through its catalogue record. We hope this will enable cross-referencing and fertilization of research ideas while avoiding duplicate efforts. While these digitization efforts are critical, we propose that the greatest challenge to sample curation is not posed by digitization and that a global sample registry alone will not ensure that samples are available for reuse. We suggest instead that the ability of the Earth science community to identify and preserve important collections and make them available for future study is limited by personnel and space resources from the level of the individual PI to the level of national facilities. Moreover, when it comes to specimen "estate planning," the cultural attitudes of scientists, institutions, and funding agencies are often inadequate to provide for long-term specimen curation, even if specimen discovery is enabled by digital registry. Timely access to curated samples requires that adequate resources be devoted to the physical care of specimens (facilities) and to the personnel costs associated with curation, from the conservation, storage, and inventory management of specimens to the dispersal of samples for research, education, and exhibition.

  16. Distribution and utilization of curative primary healthcare services in Lahej, Yemen.

    PubMed

    Bawazir, A A; Bin Hawail, T S; Al-Sakkaf, K A Z; Basaleem, H O; Muhraz, A F; Al-Shehri, A M

    2013-09-01

    No evidence-based data exist on the availability, accessibility and utilization of healthcare services in Lahej Governorate, Yemen. The aim of this study was to assess the distribution and utilization of curative services in primary healthcare units and centres in Lahej. Cross-sectional study (clustering sample). This study was conducted in three of the 15 districts in Lahej between December 2009 and August 2010. Household members were interviewed using a questionnaire to determine sociodemographic characteristics and types of healthcare services available in the area. The distribution of health centres, health units and hospitals did not match the size of the populations or areas of the districts included in this study. Geographical accessibility was the main obstacle to utilization. Factors associated with the utilization of curative services were significantly related to the time required to reach the nearest facility, seeking curative services during illness and awareness of the availability of health facilities (P < 0.01). There is an urgent need to look critically and scientifically at the distribution of healthcare services in the region in order to ensure accessibility and quality of services.

  17. Preliminary Examination of Particles Recovered from the Surface of the Asteroid Itokawa by the Hayabusa Mission

    NASA Technical Reports Server (NTRS)

    Tsuchiyama, A.; Ebihara, M.; Kimura, M.; Kitajima, F.; Kotsugi, M.; Ito, S.; Nagao, K.; Nakamura, T.; Naraoka, H.; Noguchi, T.

    2011-01-01

    The Hayabusa spacecraft arrived at S-type Asteroid 25143 Itokawa in November 2005 and revealed astounding features of the small asteroid (535 x 294 x 209 m). Near-infrared spectral shape indicates that the surface of this body has an olivine-rich mineral assemblage potentially similar to that of LL5 or LL6 chondrites with different degrees of space weathering. Based on the surface morphological features observed in high-resolution images of Itokawa's surface, two major types of boulders were distinguished: rounded and angular. Rounded boulders appear to be breccias, while angular boulders appear to record a severe impact origin. Although sample collection could not be carried out by normal operations, it was considered that some amount of sample, probably small particles of regolith, was collected from the MUSES-C regio on Itokawa's surface. The sample capsule was successfully recovered on Earth on June 13, 2010, and was opened at the curation facility of JAXA (Japan Aerospace Exploration Agency) in Sagamihara, Japan. A large number of small particles were found in the sample container. Preliminary analysis with SEM/EDX at the curation facility showed that more than 1,500 grains were identified as rocky particles, and most of them were judged to be of extraterrestrial origin and definitely from Asteroid Itokawa. The identified minerals (olivine, low-Ca pyroxene, high-Ca pyroxene, plagioclase, Fe sulfide, Fe-Ni metal, chromite, Ca phosphate), their roughly estimated modal abundances, and rough measurements of the chemical compositions of the silicates show that these particles are roughly similar to LL chondrites. Although the particles are mostly less than 10 μm in size, some larger particles of about 100 μm or larger were also identified. A part of the sample (probably several tens of particles) will be selected by the Hayabusa sample curation team and examined preliminarily in Japan within one year of sample recovery, prior to the detailed analysis phase. The Hayabusa Asteroidal Sample Preliminary Examination Team (HASPET) has been preparing for the preliminary examination in close cooperation with the curation team.

  18. The Mars Sample Return Lab(s) - Lessons from the Past and Implications for the Future

    NASA Technical Reports Server (NTRS)

    Allen, Carlton

    2012-01-01

    It has been widely understood for many years that an essential component of a Mars Sample Return mission is a Sample Receiving Facility (SRF). The purpose of such a facility would be to take delivery of the flight hardware that lands on Earth, open the spacecraft and extract the sample container and samples, and conduct an agreed-upon test protocol, all while ensuring strict containment and contamination control of the samples within the SRF. Any samples that are found to be non-hazardous (or are rendered non-hazardous by sterilization) would then be transferred to long-term curation. Although the general concept of an SRF is relatively straightforward, there has been considerable discussion about implementation planning.

  19. Reducing Organic Contamination in NASA JSC Astromaterial Curation Facility

    NASA Technical Reports Server (NTRS)

    Calaway, M. J.; Allen, C. C.; Allton, J. H.

    2013-01-01

    Future robotic and human spaceflight missions to the Moon, Mars, asteroids and comets will require handling and storing astromaterial samples with minimal inorganic and organic contamination to preserve the scientific integrity of each sample. Much was learned from the rigorous attempts to minimize and monitor organic contamination during Apollo, but those measures were not adequate for current analytical requirements [1]. OSIRIS-REx, Hayabusa 2, and future Mars sample return missions will require better protocols for reducing organic contamination. Future isolation containment systems for astromaterials, possibly nitrogen-enriched gloveboxes, must be able to reduce organic and inorganic cross-contamination. In 2012, a baseline study established the current state of organic cleanliness in gloveboxes used by the NASA JSC astromaterials curation labs that could be used as a benchmark for future mission designs [2, 3]. After standard ultra-pure water (UPW) cleaning, the majority of organic contaminants found were hydrocarbons, plasticizers, silicones, and solvents. Hydrocarbon loads (> C7) ranged from 1.9 to 11.8 ng/cm2 for TD-GC-MS wafer exposure analyses and 5.0 to 19.5 ng/L for TD-GC-MS adsorbent tube exposure. Plasticizers included < 0.6 ng/cm2 of DBP, DEP, TXIB, and DIBP. Silicones included < 0.5 ng/cm2 of cyclo(Me2SiO)x (x = 6, 8, 9, 10) and siloxane. Solvents included < 1.0 ng/cm2 of 3,5,5-trimethyl-2-cyclohexen-1-one (isophorone), N-formylpiperidine, and 2-(2-butoxyethoxy)ethanol. In addition, DBF, a rubber/polymer additive, was found at < 0.2 ng/cm2, and caprolactam, a nylon-6 precursor, at < 0.6 ng/cm2. Reducing Organics: The Apollo program was the last sample return mission to place high-level organic requirements and biological containment protocols on a curation facility. The high-vacuum complex F-201 glovebox in the Lunar Receiving Laboratory used ethyl alcohol (190 proof), 3:1 benzene/methanol (nano-grade solution), and heat sterilization at 130 °C for 48 hours to reduce organic contamination. In addition, both heat sterilization and peracetic acid sterilization were used in the atmospheric decontamination (R) cabinets. Later, lunar curation gloveboxes were degreased with a pressurized Freon 113 wash. Today, UPW has replaced Freon as the standard cleaning procedure, but it does not have the degreasing solvency power of Freon. Future Cleaning Studies: Cleaning experiments are currently being orchestrated to study how to degrease and reduce organics in a JSC curation glovebox below the established baseline. Several new chemicals in industry have replaced traditional degreasing solvents such as Freon and others that are now federally restricted. However, these new suites of chemicals remain untested for lowering organics in curation gloveboxes. 3M's HFE-7100DL and DuPont's Vertrel XF are currently being tested as replacements for Freon 113 as a degreaser at JSC curation facilities. In addition, the use of UPW as a final rinse is being tested, which presumably can maintain a lower total organic carbon load than the filtered purity of chemical solutions. References: [1] Allton J.H. et al. (2012) LPS XLIII, 2439; [2] Calaway M.
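
    For readers unfamiliar with wafer-exposure metrics, the arithmetic behind a ng/cm2 figure is simple; the sketch below uses an assumed wafer size and detected mass (the function and values are illustrative, chosen to land near the hydrocarbon range quoted above).

    ```python
    # Mass detected on a circular witness wafer divided by its exposed area
    # gives the areal contamination load in ng/cm^2.

    import math

    def surface_load_ng_per_cm2(mass_ng: float, wafer_diameter_cm: float) -> float:
        """Areal contamination load from a circular witness wafer."""
        area = math.pi * (wafer_diameter_cm / 2.0) ** 2
        return mass_ng / area

    # e.g. 240 ng of hydrocarbons (> C7) detected on a 5 cm wafer:
    load = surface_load_ng_per_cm2(240.0, 5.0)
    print(f"{load:.1f} ng/cm^2")  # ~12.2 ng/cm^2, just above the 1.9-11.8 range
    ```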

  20. Advanced Curation Preparation for Mars Sample Return and Cold Curation

    NASA Technical Reports Server (NTRS)

    Fries, M. D.; Harrington, A. D.; McCubbin, F. M.; Mitchell, J.; Regberg, A. B.; Snead, C.

    2017-01-01

    NASA Curation is tasked with the care and distribution of NASA's sample collections, such as the Apollo lunar samples and cometary material collected by the Stardust spacecraft. Curation is also mandated to perform Advanced Curation research and development, which includes improving the curation of existing collections as well as preparing for future sample return missions. Advanced Curation has identified a suite of technologies and techniques that will require attention ahead of Mars sample return (MSR) and missions with cold curation (CCur) requirements, perhaps including comet sample return missions.

  1. Does preventive health care have a chance in the changing health sector in Tanzania?

    PubMed

    Msuya, J M; Nyaruhucha, C N M; Kaswahili, J

    2003-03-01

    To investigate the status and practice of preventive health care (relative to curative) in the health delivery system at the time when the health sector reforms were taking place. A cross-sectional, descriptive study. The study was conducted in Morogoro District between January and May 1999. Eighty-six medical personnel and two hospital administrators from thirty-four health facilities. The health facilities included twenty-five dispensaries, five health centres and four hospitals. Care was also taken to include health facilities owned by various institutions and organisations, both governmental and non-governmental. Generally, preventive health care received little attention compared to curative health measures, whereby more than 80% of the medical personnel in some of the facilities were assigned to curative services. Health personnel reported spending an average of up to six hours per day providing curative services such as chemotherapy, surgical treatment, psychotherapy and radiography. In contrast, they spent about four hours or less on providing child immunisation and education on nutrition, health and family planning. As expected, the type of ownership of a health facility influenced the extent to which preventive measures were included. For example, while all the government-owned facilities provided child immunisation, nutrition education and family planning services, some non-governmental facilities lacked such services. It is obvious that while the provision of curative health care can be left in the hands of private suppliers, preventive health care needs strong government involvement. It is suggested that deliberate efforts be taken to shift resources from curative to preventive measures. One way in which such a strategy can be attained is for the government to set, as a condition for private operators, a minimum level of preventive measures to be provided by every operator before a permit is issued. However, caution should be taken to ensure that such deliberations do not discourage investors in the health sector.

  2. Curation of Samples from Mars

    NASA Astrophysics Data System (ADS)

    Lindstrom, D.; Allen, C.

    One of the strong scientific reasons for returning samples from Mars is to search for evidence of current or past life in the samples. Because of the remote possibility that the samples may contain life forms that are hazardous to the terrestrial biosphere, the National Research Council has recommended that all samples returned from Mars be kept under strict biological containment until tests show that they can safely be released to other laboratories. It is possible that Mars samples may contain only scarce or subtle traces of life or prebiotic chemistry that could readily be overwhelmed by terrestrial contamination. Thus, the facilities used to contain, process, and analyze samples from Mars must have a combination of high-level biocontainment and organic/inorganic chemical cleanliness that is unprecedented. We have been conducting feasibility studies and developing designs for a facility that would be at least as capable as current maximum-containment BSL-4 (BioSafety Level 4) laboratories, while simultaneously maintaining cleanliness levels exceeding those of the cleanest electronics manufacturing labs. Unique requirements for the processing of Mars samples have inspired a program to develop handling techniques that are much more precise and reliable than the approach (currently used for lunar samples) of employing gloved human hands in nitrogen-filled gloveboxes. Individual samples from Mars are expected to be much smaller than lunar samples, the total mass of samples returned by each mission being 0.5-1 kg, compared with many tens of kilograms of lunar samples returned by each of the six Apollo missions. Smaller samples require much more of the processing to be done under microscopic observation. In addition, the requirements for cleanliness and high-level containment would be difficult to satisfy while using traditional gloveboxes. JSC has constructed a laboratory to test concepts and technologies important to future sample curation. The Advanced Curation Laboratory includes a new-generation glovebox equipped with a robotic arm to evaluate the usability of robotic and teleoperated systems to perform curatorial tasks. The laboratory also contains equipment for precision cleaning and the measurement of trace organic contamination.

  3. Curating NASA's Extraterrestrial Samples - Past, Present, and Future

    NASA Technical Reports Server (NTRS)

    Allen, Carlton; Allton, Judith; Lofgren, Gary; Righter, Kevin; Zolensky, Michael

    2011-01-01

    Curation of extraterrestrial samples is the critical interface between sample return missions and the international research community. The Astromaterials Acquisition and Curation Office at the NASA Johnson Space Center (JSC) is responsible for curating NASA's extraterrestrial samples. Under the governing document, NASA Policy Directive (NPD) 7100.10E "Curation of Extraterrestrial Materials", JSC is charged with ". . . curation of all extraterrestrial material under NASA control, including future NASA missions." The Directive goes on to define Curation as including "documentation, preservation, preparation, and distribution of samples for research, education, and public outreach."

  4. Curating NASA's Extraterrestrial Samples - Past, Present, and Future

    NASA Technical Reports Server (NTRS)

    Allen, Carlton; Allton, Judith; Lofgren, Gary; Righter, Kevin; Zolensky, Michael

    2010-01-01

    Curation of extraterrestrial samples is the critical interface between sample return missions and the international research community. The Astromaterials Acquisition and Curation Office at the NASA Johnson Space Center (JSC) is responsible for curating NASA's extraterrestrial samples. Under the governing document, NASA Policy Directive (NPD) 7100.10E "Curation of Extraterrestrial Materials," JSC is charged with ". . . curation of all extraterrestrial material under NASA control, including future NASA missions." The Directive goes on to define Curation as including documentation, preservation, preparation, and distribution of samples for research, education, and public outreach.

  5. An Integrated Science Glovebox for the Gateway Habitat

    NASA Technical Reports Server (NTRS)

    Calaway, M. J.; Evans, C. A.; Garrison, D. H.; Bell, M. S.

    2018-01-01

    Next generation habitats for deep space exploration of cislunar space, the Moon, and ultimately Mars will benefit from on-board glovebox capability. Such a glovebox facility will maintain sample integrity for a variety of scientific endeavors, whether for life science, materials science, or astromaterials. Glovebox lessons learned from decades of astromaterials curation, ISS on-board sample handling, and robust analog missions provide key design and operational factors for inclusion in ongoing habitat development.

  6. Curation of Frozen Samples

    NASA Technical Reports Server (NTRS)

    Fletcher, L. A.; Allen, C. C.; Bastien, R.

    2008-01-01

    NASA's Johnson Space Center (JSC) and the Astromaterials Curator are charged by NPD 7100.10D with the curation of all of NASA's extraterrestrial samples, including those from future missions. This responsibility includes the development of new sample handling and preparation techniques; therefore, the Astromaterials Curator must begin developing procedures to preserve, prepare and ship samples at sub-freezing temperatures in order to enable future sample return missions. Such missions might include the return of frozen samples from permanently shadowed lunar craters, the nuclei of comets, the surface of Mars, etc. We are demonstrating the ability to curate samples under cold conditions by designing, installing and testing a cold curation glovebox. This glovebox will allow us to store, document, manipulate and subdivide frozen samples while quantifying and minimizing contamination throughout the curation process.

  7. Cleanroom Robotics: Appropriate Technology for a Sample Receiving Facility?

    NASA Technical Reports Server (NTRS)

    Bell, M. S.; Allen, C. C.

    2005-01-01

    NASA is currently pursuing a vigorous program that will collect samples from a variety of solar system environments. The Mars Exploration Program is expected to launch spacecraft that are designed to collect samples of martian soil, rocks, and atmosphere and return them to Earth, perhaps as early as 2016. International treaty obligations mandate that NASA conduct such a program in a manner that avoids cross-contamination of both Earth and Mars. Because of this requirement, Mars sample curation will require a high degree of biosafety, combined with extremely low levels of inorganic, organic, and biological contamination.

  8. Three Dimensional Structures of Particles Recovered from the Asteroid Itokawa by the Hayabusa Mission and a Role of X-Ray Microtomography in the Preliminary Examination

    NASA Technical Reports Server (NTRS)

    Tsuchiyama, A.; Uesugi, M.; Uesugi, K.; Nakano, T.; Nakamura, T.; Noguchi, T.; Noguchi, R.; Matsumoto, T.; Matsuno, J.; Nagano, T.

    2011-01-01

    Particles of regolith on S-type Asteroid 25143 Itokawa were successfully recovered by the Hayabusa mission of JAXA (Japan Aerospace Exploration Agency). Near-infrared spectral study of Itokawa's surface indicates that these particles are materials similar to LL5 or LL6 chondrites. High-resolution images of Itokawa's surface suggest that they may be breccias and some impact products. More than 1,500 particles were identified as being of Itokawa origin at the JAXA curation facility. Preliminary analysis with SEM/EDX at the curation facility shows that they are roughly similar to LL chondrites. Although most of them are less than 10 μm in size, some larger particles of about 100 μm or larger were also identified. A part of the sample (probably several tens of particles) will be selected by the Hayabusa sample curation team, and sequential examination will start in January 2011 by the Hayabusa Asteroidal Sample Preliminary Examination Team (HASPET). In the main analytical flow, each particle will first be examined by microtomography, XRD, and XRF as nondestructive analyses; the particle will then be cut by an ultra-microtome and examined by TEM, SEM, EPMA, SIMS, PEEM/XANES, and TOF-SIMS sequentially. Three-dimensional structures of the Itokawa particles will be obtained by the microtomography sub-team of HASPET. The results, together with XRD and XRF, will be used to design the later destructive analyses, such as determining the cutting direction and depth, to obtain as much information as possible from the small particles. Scientific results and the role of microtomography in the preliminary examination will be presented.
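
    The nondestructive-first ordering of this analytical flow can be expressed compactly; the sketch below simply transcribes the sequence from the abstract (the particle identifier style is illustrative only).

    ```python
    # Preliminary-examination flow for one particle: nondestructive,
    # whole-particle techniques run first so their 3D results (e.g. cutting
    # direction and depth) can guide the destructive steps that follow.

    NONDESTRUCTIVE = ["micro-CT", "XRD", "XRF"]
    DESTRUCTIVE = ["ultramicrotome sectioning", "TEM", "SEM", "EPMA",
                   "SIMS", "PEEM/XANES", "TOF-SIMS"]

    def examination_plan(particle_id: str) -> list[str]:
        """All nondestructive steps strictly precede any destructive step."""
        return [f"{particle_id}: {step}" for step in NONDESTRUCTIVE + DESTRUCTIVE]

    for step in examination_plan("particle-0001"):  # illustrative identifier
        print(step)
    ```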

  9. Priority Science Targets for Future Sample Return Missions within the Solar System Out to the Year 2050

    NASA Technical Reports Server (NTRS)

    McCubbin, F. M.; Allton, J. H.; Barnes, J. J.; Boyce, J. W.; Burton, A. S.; Draper, D. S.; Evans, C. A.; Fries, M. D.; Jones, J. H.; Keller, L. P.

    2017-01-01

    The Astromaterials Acquisition and Curation Office (henceforth referred to herein as NASA Curation Office) at NASA Johnson Space Center (JSC) is responsible for curating all of NASA's extraterrestrial samples. JSC presently curates 9 different astromaterials collections: (1) Apollo samples, (2) LUNA samples, (3) Antarctic meteorites, (4) Cosmic dust particles, (5) Microparticle Impact Collection [formerly called Space Exposed Hardware], (6) Genesis solar wind, (7) Stardust comet Wild-2 particles, (8) Stardust interstellar particles, and (9) Hayabusa asteroid Itokawa particles. In addition, the next missions bringing carbonaceous asteroid samples to JSC are Hayabusa 2 / asteroid Ryugu and OSIRIS-REx / asteroid Bennu, in 2021 and 2023, respectively. The Hayabusa 2 samples are provided as part of an international agreement with JAXA. The NASA Curation Office plans for the requirements of future collections in an "Advanced Curation" program. Advanced Curation is tasked with developing procedures, technology, and data sets necessary for curating new types of collections as envisioned by NASA exploration goals. Here we review the science value and sample curation needs of some potential targets for sample return missions over the next 35 years.

  10. Curating NASA's Past, Present, and Future Extraterrestrial Sample Collections

    NASA Technical Reports Server (NTRS)

    McCubbin, F. M.; Allton, J. H.; Evans, C. A.; Fries, M. D.; Nakamura-Messenger, K.; Righter, K.; Zeigler, R. A.; Zolensky, M.; Stansbery, E. K.

    2016-01-01

    The Astromaterials Acquisition and Curation Office (henceforth referred to herein as NASA Curation Office) at NASA Johnson Space Center (JSC) is responsible for curating all of NASA's extraterrestrial samples. Under the governing document, NASA Policy Directive (NPD) 7100.10E "Curation of Extraterrestrial Materials", JSC is charged with "...curation of all extraterrestrial material under NASA control, including future NASA missions." The Directive goes on to define Curation as including "...documentation, preservation, preparation, and distribution of samples for research, education, and public outreach." Here we describe some of the past, present, and future activities of the NASA Curation Office.

  11. Interview with Smithsonian NASM Spacesuit Curator Dr. Cathleen Lewis

    NASA Technical Reports Server (NTRS)

    Lewis, Cathleen; Wright, Rebecca

    2012-01-01

    Dr. Cathleen Lewis was interviewed by Rebecca Wright on May 14, 2012. Topics included the care, size, and history of the spacesuit collection at the Smithsonian and the recent move to the state-of-the-art permanent storage facility at the Udvar-Hazy Center in Virginia.

  12. The Importance of Contamination Knowledge in Curation - Insights into Mars Sample Return

    NASA Technical Reports Server (NTRS)

    Harrington, A. D.; Calaway, M. J.; Regberg, A. B.; Mitchell, J. L.; Fries, M. D.; Zeigler, R. A.; McCubbin, F. M.

    2018-01-01

    The Astromaterials Acquisition and Curation Office at NASA Johnson Space Center (JSC), in Houston, TX (henceforth Curation Office) manages the curation of extraterrestrial samples returned by NASA missions and shared collections from international partners, preserving their integrity for future scientific study while providing the samples to the international community in a fair and unbiased way. The Curation Office also curates flight and non-flight reference materials and other materials from spacecraft assembly (e.g., lubricants, paints and gases) of sample return missions that would have the potential to cross-contaminate a present or future NASA astromaterials collection.

  13. Curating NASA's Future Extraterrestrial Sample Collections: How Do We Achieve Maximum Proficiency?

    NASA Technical Reports Server (NTRS)

    McCubbin, Francis; Evans, Cynthia; Zeigler, Ryan; Allton, Judith; Fries, Marc; Righter, Kevin; Zolensky, Michael

    2016-01-01

    The Astromaterials Acquisition and Curation Office (henceforth referred to herein as NASA Curation Office) at NASA Johnson Space Center (JSC) is responsible for curating all of NASA's extraterrestrial samples. Under the governing document, NASA Policy Directive (NPD) 7100.10E "Curation of Extraterrestrial Materials", JSC is charged with "The curation of all extraterrestrial material under NASA control, including future NASA missions." The Directive goes on to define Curation as including "... documentation, preservation, preparation, and distribution of samples for research, education, and public outreach." Here we describe some of the ongoing efforts to ensure that the future activities of the NASA Curation Office are working towards a state of maximum proficiency.

  14. The French initiative for scientific cores virtual curating : a user-oriented integrated approach

    NASA Astrophysics Data System (ADS)

    Pignol, Cécile; Godinho, Elodie; Galabertier, Bruno; Caillo, Arnaud; Bernardet, Karim; Augustin, Laurent; Crouzet, Christian; Billy, Isabelle; Teste, Gregory; Moreno, Eva; Tosello, Vanessa; Crosta, Xavier; Chappellaz, Jérome; Calzas, Michel; Rousseau, Denis-Didier; Arnaud, Fabien

    2016-04-01

    Managing scientific data is probably one of the most challenging issues in modern science. The question is made even more sensitive by the need to preserve and manage high-value, fragile geological samples: cores. Large international scientific programs, such as IODP or ICDP, are leading an intense effort to solve this problem and propose detailed, high-standard workflows and dataflows for core handling and curation. However, most results derive from rather small-scale research programs in which data and sample management is generally handled only locally - when it is handled at all. The national excellence equipment program (Equipex) CLIMCOR aims at developing French facilities for coring and drilling investigations. It covers ice, marine and continental samples alike. As part of this initiative, we began a review of core curation and associated coring-data management. The aim of the project is to conserve all metadata from fieldwork in an integrated cyber-environment which will evolve toward laboratory-acquired data storage in the near future. To that end, our approach was developed in close collaboration with field operators as well as laboratory core curators, in order to propose user-oriented solutions. The national core curating initiative currently proposes a single web portal in which all scientific teams can store their field data. For legacy samples, this will require the establishment of dedicated core lists with associated metadata. For forthcoming samples, we propose a mobile application, running under Android, to capture technical and scientific metadata in the field. This application is linked to a unified coring-tool library and is adapted to most coring devices (gravity, drilling, percussion, etc.), including multi-section and multi-hole coring operations. These field data can be uploaded automatically to the national portal, but also referenced through international standards or persistent identifiers (IGSN, ORCID and INSPIRE) and displayed in international portals (currently, NOAA's IMLGS). In this paper, we present the architecture of the integrated system, future perspectives and the approach we adopted to reach our goals. At our poster, we will also present one of the three mobile applications, dedicated more particularly to continental drilling operations.
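
    As a sketch of the kind of field metadata record such a mobile application might capture (the schema and all identifier values below are our own placeholders, not CLIMCOR's), consider:

    ```python
    # Hypothetical field metadata record for one core section, pushed to the
    # national portal and keyed by a persistent identifier such as an IGSN.

    from dataclasses import dataclass

    @dataclass
    class CoreSectionRecord:
        igsn: str                 # persistent sample identifier (placeholder below)
        campaign: str             # coring campaign name
        site: str                 # hole/site designator
        section_number: int
        coring_device: str        # drawn from the shared coring-tool library
        latitude: float
        longitude: float
        depth_top_m: float
        depth_bottom_m: float
        operator_orcid: str       # links the section to the field operator

    record = CoreSectionRecord("XXXX00001", "demo-campaign", "H1", 3,
                               "gravity corer", 45.80, 6.13, 2.0, 3.0,
                               "0000-0000-0000-0000")  # placeholder ORCID
    ```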

  15. Starting a European Space Agency Sample Analogue Collection (ESA2C) and Curation Facility for Exploration Missions.

    NASA Astrophysics Data System (ADS)

    Smith, C. L.; Rumsey, M. S.; Manick, K.; Gill, S.-J.; Mavris, C.; Schroeven-Deceuninck, H.; Duvet, L.

    2017-09-01

    The ESA2C will support current and future technology development activities that are required for human and robotic exploration of Mars, Phobos, Deimos, C-type asteroids and the Moon. The long-term goal of this work is to produce a useful, useable and sustainable resource for engineers and scientists developing technologies for ESA space exploration missions.

  16. Advanced Curation Activities at NASA: Preparing to Receive, Process, and Distribute Samples Returned from Future Missions

    NASA Technical Reports Server (NTRS)

    McCubbin, Francis M.; Zeigler, Ryan A.

    2017-01-01

    The Astromaterials Acquisition and Curation Office (henceforth referred to herein as NASA Curation Office) at NASA Johnson Space Center (JSC) is responsible for curating all of NASA's extraterrestrial samples. Under the governing document, NASA Policy Directive (NPD) 7100.10F, JSC is charged with curation of all extraterrestrial material under NASA control, including future NASA missions. The Directive goes on to define Curation as including documentation, preservation, preparation, and distribution of samples for research, education, and public outreach. Here we briefly describe NASA's astromaterials collections and our ongoing efforts related to enhancing the utility of our current collections as well as our efforts to prepare for future sample return missions. We collectively refer to these efforts as advanced curation.

  17. Advanced Curation Activities at NASA: Implications for Astrobiological Studies of Future Sample Collections

    NASA Technical Reports Server (NTRS)

    McCubbin, F. M.; Evans, C. A.; Fries, M. D.; Harrington, A. D.; Regberg, A. B.; Snead, C. J.; Zeigler, R. A.

    2017-01-01

    The Astromaterials Acquisition and Curation Office (henceforth referred to herein as NASA Curation Office) at NASA Johnson Space Center (JSC) is responsible for curating all of NASA's extraterrestrial samples. Under the governing document, NASA Policy Directive (NPD) 7100.10F, JSC is charged with curation of all extraterrestrial material under NASA control, including future NASA missions. The Directive goes on to define Curation as including documentation, preservation, preparation, and distribution of samples for research, education, and public outreach. Here we briefly describe NASA's astromaterials collections and our ongoing efforts related to enhancing the utility of our current collections as well as our efforts to prepare for future sample return missions. We collectively refer to these efforts as advanced curation.

  18. Astromaterials Curation Online Resources for Principal Investigators

    NASA Technical Reports Server (NTRS)

    Todd, Nancy S.; Zeigler, Ryan A.; Mueller, Lina

    2017-01-01

    The Astromaterials Acquisition and Curation Office at NASA Johnson Space Center curates all of NASA's extraterrestrial samples, the most extensive set of astromaterials samples available to the research community worldwide. The office allocates 1500 individual samples to researchers and students each year and has served the planetary research community for 45+ years. The Astromaterials Curation Office provides access to its sample data repository and digital resources to support the research needs of sample investigators and to aid in the selection and request of samples for scientific study. These resources can be found on the Astromaterials Acquisition and Curation website at https://curator.jsc.nasa.gov. To better serve our users, we have engaged in several activities to enhance the data available for astromaterials samples, to improve the accessibility and performance of the website, and to address user feedback. We have also put plans in place for continuing improvements to our existing data products.

  19. Collaboration between Government and Non-Governmental Organizations (NGOs) in Delivering Curative Health Services in North Darfur State, Sudan- a National Report.

    PubMed

    I A Yagub, Abdallah

    2014-05-01

    North Darfur State has been affected by conflict since 2003 and the government has not been able to provide adequate curative health services to the people. The government has come to rely on Non-Governmental Organizations (NGOs) to provide curative health services. This study was conducted to examine the existing collaboration between government and NGOs in curative health service delivery in North Darfur State, and to identify the challenges that affect their collaboration. Documentary data were collected from government offices and medical organizations. Primary data were obtained through interviews with government and NGO representatives. The interviews were conducted with (1) expatriates working for international NGOs (N=15) and (2) health professionals and administrators working in the health sector (N=45). The collaboration between the government and NGOs has been very weak because of security issues and lack of trust. The NGOs collaborate by providing human and financial resources, material and equipment, and communication facilities. The NGOs supply 70% of curative health services, and contribute 52.9% of the health budget in North Darfur State. The NGOs have employed 1,390 health personnel, established 44 health centres and manage and support 83 health facilities across the State. The NGOs have played a positive role in collaborating with the government in North Darfur State in delivering curative health services, while the government's role has been negative. The problem facing the government in future is how health facilities will be run should a peaceful settlement be reached and NGOs leave the region.

  20. Collaboration between Government and Non-Governmental Organizations (NGOs) in Delivering Curative Health Services in North Darfur State, Sudan- a National Report

    PubMed Central

    I A YAGUB, Abdallah

    2014-01-01

    Background: North Darfur State has been affected by conflict since 2003 and the government has not been able to provide adequate curative health services to the people. The government has come to rely on Non-Governmental Organizations (NGOs) to provide curative health services. This study was conducted to examine the existing collaboration between government and NGOs in curative health service delivery in North Darfur State, and to identify the challenges that affect their collaboration. Methods: Documentary data were collected from government offices and medical organizations. Primary data were obtained through interviews with government and NGO representatives. The interviews were conducted with (1) expatriates working for international NGOs (N=15) and (2) health professionals and administrators working in the health sector (N=45). Results: The collaboration between the government and NGOs has been very weak because of security issues and lack of trust. The NGOs collaborate by providing human and financial resources, material and equipment, and communication facilities. The NGOs supply 70% of curative health services, and contribute 52.9% of the health budget in North Darfur State. The NGOs have employed 1,390 health personnel, established 44 health centres and manage and support 83 health facilities across the State. Conclusion: The NGOs have played a positive role in collaborating with the government in North Darfur State in delivering curative health services, while the government’s role has been negative. The problem facing the government in future is how health facilities will be run should a peaceful settlement be reached and NGOs leave the region. PMID:26056656

  1. Preparing to Receive and Handle Martian Samples When They Arrive on Earth

    NASA Technical Reports Server (NTRS)

    McCubbin, Francis M.

    2017-01-01

    The Astromaterials Acquisition and Curation Office at NASA Johnson Space Center (JSC) is responsible for curating all of NASA's extraterrestrial samples. Under the governing document, NASA Policy Directive (NPD) 7100.10F and its derivative NPR "Curation of Extraterrestrial Materials", JSC is charged with "The curation of all extraterrestrial material under NASA control, including future NASA missions." The Directive goes on to define Curation as including "...documentation, preservation, preparation, and distribution of samples for research, education, and public outreach."

  2. Improving the Acquisition and Management of Sample Curation Data

    NASA Technical Reports Server (NTRS)

    Todd, Nancy S.; Evans, Cindy A.; Labasse, Dan

    2011-01-01

    This paper discusses the current sample documentation processes used during and after a mission, examines the challenges and special considerations needed for designing effective sample curation data systems, and looks at the results of a simulated sample return mission and the lessons learned from this simulation. In addition, it introduces a new data architecture for an integrated sample curation data system being implemented at the NASA Astromaterials Acquisition and Curation department and discusses how it improves on existing data management systems.
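
    One way to picture the data-architecture problem is the lineage-plus-allocation relationship at its core; the minimal sketch below (table layout, names, and values are our own, illustrative only) shows how subsample parentage and allocations can stay traceable in one schema.

    ```python
    # Every subsample must trace to its parent, and every allocation to a sample.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE sample (
        sample_id TEXT PRIMARY KEY,
        parent_id TEXT REFERENCES sample(sample_id),  -- NULL for a parent sample
        mission   TEXT NOT NULL,
        mass_g    REAL                                -- masses here are illustrative
    );
    CREATE TABLE allocation (
        allocation_id INTEGER PRIMARY KEY,
        sample_id     TEXT NOT NULL REFERENCES sample(sample_id),
        investigator  TEXT NOT NULL,
        date_out      TEXT NOT NULL
    );
    """)
    # A split is recorded as parent -> child, keeping processing lineage queryable.
    conn.execute("INSERT INTO sample VALUES ('70017',   NULL,    'Apollo 17', 2000.0)")
    conn.execute("INSERT INTO sample VALUES ('70017,1', '70017', 'Apollo 17', 10.0)")
    for row in conn.execute("SELECT sample_id, parent_id FROM sample"):
        print(row)
    ```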

  3. Marine Curators Gather

    ERIC Educational Resources Information Center

    McCoy, Floyd W.

    1977-01-01

    Reports on a recent meeting of marine curators in which data dissemination, standardization of marine curating techniques and methods, responsibilities of curators, funding problems, and sampling equipment were the main areas of discussion. A listing of the major deep sea sample collections in the United States is also provided. (CP)

  4. JSC Advanced Curation: Research and Development for Current Collections and Future Sample Return Mission Demands

    NASA Technical Reports Server (NTRS)

    Fries, M. D.; Allen, C. C.; Calaway, M. J.; Evans, C. A.; Stansbery, E. K.

    2015-01-01

    Curation of NASA's astromaterials sample collections is a demanding and evolving activity that supports valuable science from NASA missions for generations, long after the samples are returned to Earth. For example, NASA continues to loan hundreds of Apollo program samples to investigators every year and those samples are often analyzed using instruments that did not exist at the time of the Apollo missions themselves. The samples are curated in a manner that minimizes overall contamination, enabling clean, new high-sensitivity measurements and new science results over 40 years after their return to Earth. As our exploration of the Solar System progresses, upcoming and future NASA sample return missions will return new samples with stringent contamination control, sample environmental control, and Planetary Protection requirements. Therefore, an essential element of a healthy astromaterials curation program is a research and development (R&D) effort that characterizes and employs new technologies to maintain current collections and enable new missions - an Advanced Curation effort. JSC's Astromaterials Acquisition & Curation Office is continually performing Advanced Curation research, identifying and defining knowledge gaps about research, development, and validation/verification topics that are critical to support current and future NASA astromaterials sample collections. The following are highlighted knowledge gaps and research opportunities.

  5. Investigating Astromaterials Curation Applications for Dexterous Robotic Arms

    NASA Technical Reports Server (NTRS)

    Snead, C. J.; Jang, J. H.; Cowden, T. R.; McCubbin, F. M.

    2018-01-01

    The Astromaterials Acquisition and Curation Office at NASA Johnson Space Center is currently investigating tools and methods that will enable the curation of future astromaterials collections. Size and temperature constraints for astromaterials to be collected by current and proposed future missions will require the development of new robotic sample and tool handling capabilities. NASA Curation has investigated the application of robot arms in the past, and robotic 3-axis micromanipulators are currently in use for small particle curation in the Stardust and Cosmic Dust laboratories. While 3-axis micromanipulators have been extremely successful for activities involving the transfer of isolated particles in the 5-20 micron range (e.g., from a microscope slide to an epoxy bullet tip or a beryllium SEM disk), their limited ranges of motion and lack of yaw, pitch, and roll degrees of freedom restrict their utility in other applications. For instance, curators removing particles from cosmic dust collectors by hand often employ scooping and rotating motions to free trapped particles from the silicone oil coatings. Similar scooping and rotating motions are also employed when isolating a specific particle of interest from an aliquot of crushed meteorite. While cosmic dust curators have been remarkably successful with these kinds of particle manipulations using handheld tools, operator fatigue limits the number of particles that can be removed during a given extraction session. The challenges for curation of small particles will be exacerbated by mission requirements that samples be processed in N2 sample cabinets (i.e., gloveboxes). We have been investigating the use of compact robot arms to facilitate sample handling within gloveboxes. Six-axis robot arms potentially have applications beyond small particle manipulation. For instance, future sample return missions may involve biologically sensitive astromaterials that can be easily compromised by physical interaction with a curator; other potential future returned samples may require cryogenic curation. Robot arms may be combined with high-resolution cameras within a sample cabinet and controlled remotely by a curator. Sophisticated robot arm and hand combination systems can be programmed to mimic the movements of a curator wearing a data glove; successful implementation of such a system may ultimately allow a curator to virtually operate in a nitrogen, cryogenic, or biologically sensitive environment with dexterity comparable to that of a curator physically handling samples in a glovebox.
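
    The degrees-of-freedom limitation is easy to see in code: a 3-axis stage commands only x/y/z, while a scooping motion needs orientation (roll, pitch, yaw) as well. A minimal sketch, with illustrative names and motion values:

    ```python
    # A 6-DOF pose carries orientation in addition to position; the scooping
    # trajectory below pitches the tool down while sliding it forward, which
    # a position-only (x/y/z) micromanipulator cannot express.

    from dataclasses import dataclass

    @dataclass
    class Pose6DOF:
        x: float; y: float; z: float           # position, mm
        roll: float; pitch: float; yaw: float  # orientation, degrees

    def scoop_trajectory(start: Pose6DOF, steps: int = 5) -> list[Pose6DOF]:
        """Advance and descend while pitching the tool down at each step."""
        return [Pose6DOF(start.x + 0.02 * i, start.y, start.z - 0.01 * i,
                         start.roll, start.pitch - 6.0 * i, start.yaw)
                for i in range(steps)]

    for p in scoop_trajectory(Pose6DOF(0, 0, 5, 0, 30, 0)):
        print(p)
    ```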

  6. The role of non-governmental organizations in providing curative health services in North Darfur State, Sudan.

    PubMed

    Yagub, Abdallah I A; Mtshali, Khondlo

    2015-09-01

    Conflict in North Darfur state, western Sudan, started in 2003, and delivering curative health services was becoming a greater challenge for the country's limited resources. NGOs have played an important role in providing curative health services. To examine the role that Non-Governmental Organizations (NGOs) have played in providing curative health services, as well as to identify the difficulties and challenges that affect NGOs in delivering curative health services. Secondary data were collected from different sources, including government offices and medical organizations in Sudan and in North Darfur state. Primary data were obtained through interviews with government and NGO representatives. The interviews were conducted with (1) expatriates working for international NGOs (N=15) and (2) health professionals and administrators working in the health sector (N=45), in the period from November 2010 to January 2011. The government in North Darfur state spent 70% of its financial budget on security, while it spent less than 1% on providing health services. The international NGOs have been providing 70% of curative health services to the state's population by contributing 52.9% of the health budget and 1,390 health personnel. Since 2003, NGOs have provided technical assistance to health staff. As a result, more than fifty nurses have been trained to provide care and treatment, more than twenty-three doctors have been trained in laboratory equipment operation, and approximately six senior doctors and hospital directors have received management training. NGOs have been managing and supporting 89 public health facilities, and have established 24 health centres in IDP camps and 20 health centres across all the districts in North Darfur state. The NGOs have played an important role in providing curative health services and in establishing good health facilities, but a future problem is how the government will run these health facilities after a peaceful settlement has been reached, which might cause NGOs to leave the region.

  7. Lunar and Meteorite Thin Sections for Undergraduate and Graduate Studies

    NASA Astrophysics Data System (ADS)

    Allen, J.; Allen, C.

    2012-12-01

    The Johnson Space Center (JSC) has the unique responsibility to curate NASA's extraterrestrial samples from past and future missions. Curation includes documentation, preservation, preparation, and distribution of samples for research, education, and public outreach. Studies of rock and soil samples from the Moon and meteorites continue to yield useful information about the early history of the Moon, the Earth, and the inner solar system. Petrographic Thin Section Packages containing polished thin sections of samples from either the Lunar or Meteorite collections have been prepared. Each set of twelve sections of Apollo lunar samples or twelve sections of meteorites is available for loan from JSC. The thin section sets are designed for use in domestic college and university courses in petrology. The loan period is strictly limited to two weeks. Contact Ms. Mary Luckey, Education Sample Curator (mary.k.luckey@nasa.gov). Each set of slides is accompanied by teaching materials and a sample disk of representative lunar or meteorite samples. It is important to note that the samples in these sets are not exactly the same as the ones listed here; this list represents one set of samples. A key education resource available on the Curation website is the Antarctic Meteorite Teaching Collection: Educational Meteorite Thin Sections, originally compiled by Bevan French, Glenn McPherson, and Roy Clarke and revised by Kevin Righter in 2010. College and university staff and students are encouraged to access the Lunar Petrographic Thin Section Set Publication and the Meteorite Petrographic Thin Section Package Resource, which feature many thin section images, detailed descriptions of the samples, and research results: http://curator.jsc.nasa.gov/Education/index.cfm. Research samples may be requested at http://curator.jsc.nasa.gov/ or via JSC-CURATION-EDUCATION-DISKS@mail.nasa.gov.

  8. NASA Johnson Space Center's Planetary Sample Analysis and Mission Science (PSAMS) Laboratory: A National Facility for Planetary Research

    NASA Technical Reports Server (NTRS)

    Draper, D. S.

    2016-01-01

    NASA Johnson Space Center's (JSC's) Astromaterials Research and Exploration Science (ARES) Division, part of the Exploration Integration and Science Directorate, houses a unique combination of laboratories and other assets for conducting cutting-edge planetary research. These facilities have been accessed for decades by outside scientists, most at no cost and on an informal basis. ARES has thus provided substantial leverage to many past and ongoing science projects at the national and international level. Here we propose to formalize that support via an ARES/JSC Planetary Sample Analysis and Mission Science Laboratory (PSAMS Lab). We maintain three major research capabilities: astromaterial sample analysis, planetary process simulation, and robotic-mission analog research. ARES scientists also support planning for eventual human exploration missions, including astronaut geological training. We outline our facility's capabilities and its potential service to the community at large which, taken together with longstanding ARES experience and expertise in curation and in applied mission science, enable multidisciplinary planetary research that is possible at no other institution. Comprehensive campaigns incorporating sample data, experimental constraints, and mission science data can be conducted under one roof.

  9. Antarctic meteorite newsletter. Volume 4: Number 1, February 1981: Antarctic meteorite descriptions, 1976, 1977, 1978, 1979

    NASA Technical Reports Server (NTRS)

    Stone, R.; Schwarz, C. M.; King, T. V. V.; Mason, B.; Bogard, D. D.; Gabel, E. M.

    1981-01-01

    This issue of the Newsletter is essentially a catalog of all Antarctic meteorites in the collections of the Johnson Space Center Curation Facility and the Smithsonian, except for 288 pebbles now being classified. It includes listings of all previously distributed data sheets plus a number of new ones for 1979. Indexes of samples include meteorite name/number, classification, and weathering category. Separate indexes list type 3 and 4 chondrites, all irons, all achondrites, and all carbonaceous chondrites.

  10. An Internationally Coordinated Science Management Plan for Samples Returned from Mars

    NASA Astrophysics Data System (ADS)

    Haltigin, T.; Smith, C. L.

    2015-12-01

    Mars Sample Return (MSR) remains a high priority of the planetary exploration community. Such an effort will undoubtedly be too large for any individual agency to conduct itself, and thus will require extensive global cooperation. To help prepare for an eventual MSR campaign, the International Mars Exploration Working Group (IMEWG) chartered the international Mars Architecture for the Return of Samples (iMARS) Phase II working group in 2014, consisting of representatives from 17 countries and agencies. The overarching task of the team was to provide recommendations for progressing towards campaign implementation, including a proposed science management plan. Building upon the iMARS Phase I (2008) outcomes, the Phase II team proposed the development of an International MSR Science Institute as part of the campaign governance, centering its deliberations around four themes: Organization: including an organizational structure for the Institute that outlines roles and responsibilities of key members and describes sample return facility requirements; Management: presenting issues surrounding scientific leadership, defining guidelines and assumptions for Institute membership, and proposing a possible funding model; Operations & Data: outlining a science implementation plan that details the preliminary sample examination flow, sample allocation process, and data policies; and Curation: introducing a sample curation plan that comprises sample tracking and routing procedures, sample sterilization considerations, and long-term archiving recommendations. This work presents a summary of the group's activities, findings, and recommendations, highlighting the role of international coordination in managing the returned samples.

  11. Measuring effective coverage of curative child health services in rural Burkina Faso: a cross-sectional study

    PubMed Central

    Koulidiati, Jean-Louis; Nesbitt, Robin C; Ouedraogo, Nobila; Hien, Hervé; Robyn, Paul Jacob; Compaoré, Philippe; Souares, Aurélia; Brenner, Stephan

    2018-01-01

    Objective To estimate both crude and effective curative health services coverage provided by rural health facilities to under 5-year-old (U5YO) children in Burkina Faso. Methods We surveyed 1298 child health providers and 1681 clinical cases across 494 primary-level health facilities, as well as 12 497 U5YO children across 7347 households in the facilities’ catchment areas. Facilities were scored based on a set of indicators along three quality-of-care dimensions: management of common childhood diseases, management of severe childhood diseases and general service readiness. Linking service quality to service utilisation, we estimated both crude and effective coverage of U5YO children by these selected curative services. Results Measured performance quality among facilities was generally low with only 12.7% of facilities surveyed reaching our definition of high and 57.1% our definition of intermediate quality of care. The crude coverage was 69.5% while the effective coverages indicated that 5.3% and 44.6% of children reporting an illness episode received services of only high or high and intermediate quality, respectively. Conclusion Our study showed that the quality of U5YO child health services provided by primary-level health facilities in Burkina Faso was low, resulting in relatively ineffective population coverage. Poor adherence to clinical treatment guidelines combined with the lack of equipment and qualified clinical staff that performed U5YO consultations seemed to be contributors to the gap between crude and effective coverage. PMID:29858415
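
    For readers unfamiliar with the method, the gap between crude and effective coverage comes from discounting care delivered at facilities below a quality threshold. A minimal sketch of that calculation follows, with invented data; only the method of linking utilisation to facility quality tiers follows the abstract:

        # Crude vs. effective coverage, illustrated with invented data.
        # child -> facility used for the illness episode (None = no care sought)
        episodes = {"c1": "f1", "c2": "f2", "c3": None, "c4": "f3", "c5": "f2"}
        # facility -> quality tier scored along the three quality-of-care dimensions
        quality = {"f1": "high", "f2": "intermediate", "f3": "low"}

        def coverage(episodes, quality, accepted_tiers=None):
            """Crude coverage if accepted_tiers is None; otherwise effective
            coverage, counting only care at facilities in the accepted tiers."""
            need = len(episodes)
            covered = sum(
                1 for fac in episodes.values()
                if fac is not None and (accepted_tiers is None
                                        or quality[fac] in accepted_tiers))
            return covered / need

        print(f"crude:                 {coverage(episodes, quality):.1%}")
        print(f"effective (high):      {coverage(episodes, quality, {'high'}):.1%}")
        print(f"effective (high+int.): {coverage(episodes, quality, {'high', 'intermediate'}):.1%}")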

  12. Planning Considerations for a Mars Sample Receiving Facility: Summary and Interpretation of Three Design Studies

    NASA Astrophysics Data System (ADS)

    Beaty, David W.; Allen, Carlton C.; Bass, Deborah S.; Buxbaum, Karen L.; Campbell, James K.; Lindstrom, David J.; Miller, Sylvia L.; Papanastassiou, Dimitri A.

    2009-10-01

    It has been widely understood for many years that an essential component of a Mars Sample Return mission is a Sample Receiving Facility (SRF). The purpose of such a facility would be to take delivery of the flight hardware that lands on Earth, open the spacecraft and extract the sample container and samples, and conduct an agreed-upon test protocol, while ensuring strict containment and contamination control of the samples while in the SRF. Any samples that are found to be non-hazardous (or are rendered non-hazardous by sterilization) would then be transferred to long-term curation. Although the general concept of an SRF is relatively straightforward, there has been considerable discussion about implementation planning. The Mars Exploration Program carried out an analysis of the attributes of an SRF to establish its scope, including minimum size and functionality, budgetary requirements (capital cost, operating costs, cost profile), and development schedule. The approach was to arrange for three independent design studies, each led by an architectural design firm, and compare the results. While there were many design elements in common identified by each study team, there were significant differences in the way human operators were to interact with the systems. In aggregate, the design studies provided insight into the attributes of a future SRF and the complex factors to consider for future programmatic planning.

  13. Planning considerations for a Mars Sample Receiving Facility: summary and interpretation of three design studies.

    PubMed

    Beaty, David W; Allen, Carlton C; Bass, Deborah S; Buxbaum, Karen L; Campbell, James K; Lindstrom, David J; Miller, Sylvia L; Papanastassiou, Dimitri A

    2009-10-01

    It has been widely understood for many years that an essential component of a Mars Sample Return mission is a Sample Receiving Facility (SRF). The purpose of such a facility would be to take delivery of the flight hardware that lands on Earth, open the spacecraft and extract the sample container and samples, and conduct an agreed-upon test protocol, while ensuring strict containment and contamination control of the samples while in the SRF. Any samples that are found to be non-hazardous (or are rendered non-hazardous by sterilization) would then be transferred to long-term curation. Although the general concept of an SRF is relatively straightforward, there has been considerable discussion about implementation planning. The Mars Exploration Program carried out an analysis of the attributes of an SRF to establish its scope, including minimum size and functionality, budgetary requirements (capital cost, operating costs, cost profile), and development schedule. The approach was to arrange for three independent design studies, each led by an architectural design firm, and compare the results. While there were many design elements in common identified by each study team, there were significant differences in the way human operators were to interact with the systems. In aggregate, the design studies provided insight into the attributes of a future SRF and the complex factors to consider for future programmatic planning.

  14. The demand for child curative care in two rural thanas of Bangladesh: effect of income and women's employment.

    PubMed

    Levin, A; Rahman, M A; Quayyum, Z; Routh, S; Barkat-e-Khuda

    2001-01-01

    This paper seeks to investigate the determinants of child health care seeking behaviours in rural Bangladesh. In particular, the effects of income, women's access to income, and the prices of obtaining child health care are examined. Data on the use of child curative care were collected in two rural areas of Bangladesh--Abhoynagar Thana of Jessore District and Mirsarai Thana of Chittagong District--in March 1997. In estimating the use of child curative care, the nested multinomial logit specification was used. The results of the analysis indicate that a woman's involvement in a credit union or income generation affected the likelihood that curative child care was used. Household wealth decreased the likelihood that the child had an illness episode and affected the likelihood that curative child care was sought. Among facility characteristics, travel time was statistically significant and was negatively associated with the use of a provider.
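
    As background on the estimation approach named above: a nested multinomial logit assigns each care alternative a probability equal to the probability of choosing its nest times the probability of the alternative within that nest, governed by a per-nest dissimilarity parameter. A small illustrative sketch follows, with invented utilities and nest structure (not the paper's estimates):

        # Nested multinomial logit choice probabilities (illustrative only).
        import math

        def nested_logit_probs(nests):
            """nests: {nest: (lambda_k, {alt: utility V_j})} -> {alt: P(alt)}."""
            inclusive = {}
            for name, (lam, alts) in nests.items():
                # inclusive value: IV_k = lambda_k * ln(sum_j exp(V_j / lambda_k))
                inclusive[name] = lam * math.log(
                    sum(math.exp(v / lam) for v in alts.values()))
            denom = sum(math.exp(iv) for iv in inclusive.values())
            probs = {}
            for name, (lam, alts) in nests.items():
                p_nest = math.exp(inclusive[name]) / denom
                within = sum(math.exp(v / lam) for v in alts.values())
                for alt, v in alts.items():
                    probs[alt] = p_nest * math.exp(v / lam) / within
            return probs

        # e.g., "no care" versus a "seek care" nest of correlated provider options
        print(nested_logit_probs({
            "no_care":   (1.0, {"home": 0.0}),
            "seek_care": (0.6, {"public_clinic": 0.4, "private_doctor": 0.1}),
        }))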

  15. Reflections on curative health care in Nicaragua.

    PubMed Central

    Slater, R G

    1989-01-01

    Improved health care in Nicaragua is a major priority of the Sandinista revolution; it has been pursued by major reforms of the national health care system, something few developing countries have attempted. In addition to its internationally recognized advances in public health, considerable progress has been made in health care delivery by expanding curative medical services through training more personnel and building more facilities to fulfill a commitment to free universal health coverage. The very uneven quality of medical care is the leading problem facing curative medicine now. Underlying factors include the difficulty of adequately training the greatly increased number of new physicians. Misdiagnosis and mismanagement continue to be major problems. The curative medical system is not well coordinated with the preventive sector. Recent innovations include initiation of a "medicina integral" residency, similar to family practice. Despite its inadequacies and the handicaps of war and poverty, the Nicaraguan curative medical system has made important progress. PMID:2705603

  16. Hayabusa Reentry and Recovery of Its Capsule -Quick Report

    NASA Astrophysics Data System (ADS)

    Kawaguchi, Junichiro; Yoshikawa, Makoto; Kuninaka, Hitoshi

    The Hayabusa spacecraft successfully returned to Earth and re-entered the atmosphere for sample recovery, following its successful touchdowns on NEO Itokawa in 2005. The reentry occurred on June 13th in the Woomera Prohibited Area (WPA) of Australia. This paper presents how the reentry and recovery operations were performed, and reports the current status of the sample curation activity. The Hayabusa mission aims at demonstrating key technologies required for future full-scale sample return missions. The spacecraft nevertheless followed an actual sample return flight sequence and was designed to make the world's first round trip to an extraterrestrial object, touching down and lifting off; it was propelled by onboard ion engines for interplanetary cruise. Launched in May 2003, Hayabusa reached NEO Itokawa in September 2005 via an Earth gravity assist in May 2004. It stayed at the asteroid for about two and a half months, performing detailed scientific observation, mapping, and determination of the shape. In November 2005, the spacecraft made two touchdowns and lift-offs, attempting collection of surface samples. At the second opportunity, the spacecraft commanded a projectile to be fired, but, presumably due to a programming problem, the projectile was not shot. The spacecraft may nevertheless have captured some small amount of sample particles in its onboard catcher when it actually touched down on the surface. The spacecraft suffered a fuel leak in December 2005, and communication resumed only after seven weeks of hiatus. By November 2009 the ion engines had all reached the end of their lives, and the project team devised an alternative drive configuration and successfully coped with the difficulty. Despite many hardships, the spacecraft completed its return cruise and made reentry for sample recovery this June. The sample catcher was retrieved at WPA and transported back to the curation facility of JAXA, where the curators have been examining and analyzing the recovered catcher. This presentation briefly reports the recent status of the spacecraft, the capsule, and the sample analysis.

  17. Rapid Classification of Ordinary Chondrites Using Raman Spectroscopy

    NASA Technical Reports Server (NTRS)

    Fries, M.; Welzenbach, L.

    2014-01-01

    Classification of ordinary chondrites is typically done through measurements of the composition of olivine and pyroxenes. Historically, this measurement has usually been performed via electron microprobe, oil immersion, or other methods that can be costly in terms of sample material lost during thin section preparation. Raman microscopy can perform the same measurements considerably faster and with much less sample preparation, allowing faster classification. Raman spectroscopy can thus facilitate more rapid classification of the large numbers of chondrites that are retrieved from North Africa and potentially Antarctica, are present in large collections, or are submitted to a curation facility by the public. With development, this approach may provide a completely automated classification method for all chondrite types.
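
    As an illustration of the classification logic, the sketch below maps an olivine fayalite (Fa) estimate to the standard compositional ranges for equilibrated ordinary chondrite groups (roughly Fa16-20 for H, Fa23-26 for L, Fa27-32 for LL). The Raman-to-Fa step uses placeholder linear coefficients; a working system would instead use a calibration fitted against microprobe-analyzed standards:

        # Sketch: classify an ordinary chondrite from olivine Raman doublet positions.
        def fa_from_raman(db1_cm1: float, db2_cm1: float) -> float:
            """Hypothetical linear calibration: the olivine doublet peak positions
            (in cm^-1) shift systematically with Fe content."""
            a, b, c = -2.0, -2.5, 3800.0   # placeholder coefficients
            return a * db1_cm1 + b * db2_cm1 + c

        def classify(fa_mol_percent: float) -> str:
            """Standard olivine Fa ranges for equilibrated ordinary chondrites."""
            if 16 <= fa_mol_percent <= 20:
                return "H"
            if 23 <= fa_mol_percent <= 26:
                return "L"
            if 27 <= fa_mol_percent <= 32:
                return "LL"
            return "outside OC ranges: check pyroxene composition and texture"

        fa = fa_from_raman(822.0, 852.8)
        print(f"Fa ~ {fa:.1f} mol% -> {classify(fa)}")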

  18. Clean and Cold Sample Curation

    NASA Technical Reports Server (NTRS)

    Allen, C. C.; Agee, C. B.; Beer, R.; Cooper, B. L.

    2000-01-01

    Curation of Mars samples includes both samples that are returned to Earth, and samples that are collected, examined, and archived on Mars. Both kinds of curation operations will require careful planning to ensure that the samples are not contaminated by the instruments that are used to collect and contain them. In both cases, sample examination and subdivision must take place in an environment that is organically, inorganically, and biologically clean. Some samples will need to be prepared for analysis under ultra-clean or cryogenic conditions. Inorganic and biological cleanliness are achievable separately by cleanroom and biosafety lab techniques. Organic cleanliness to the <50 ng/sq cm level requires material control and sorbent removal - techniques being applied in our Class 10 cleanrooms and sample processing gloveboxes.

  19. Organic Contamination Baseline Study: In NASA JSC Astromaterials Curation Laboratories. Summary Report

    NASA Technical Reports Server (NTRS)

    Calaway, Michael J.

    2013-01-01

    In preparation for OSIRIS-REx and other future sample return missions concerned with analyzing organics, we conducted an organic contamination baseline study for the JSC curation laboratories in FY12. The FY12 testing focused only on molecular organic contamination in JSC curation gloveboxes, since future collections (i.e., lunar, Mars, and asteroid missions) would presumably use isolation containment systems, rather than cleanrooms alone, for primary sample storage. This decision was made because of limited historical data on curation gloveboxes, limited IR&D funds, and the fact that Genesis routinely monitors organics in its ISO class 4 cleanrooms.

  20. COmet Nucleus Dust and Organics Return (CONDOR): a New Frontiers 4 Mission Proposal

    NASA Astrophysics Data System (ADS)

    Choukroun, M.; Raymond, C.; Wadhwa, M.

    2017-09-01

    CONDOR would collect and return a ≥ 50 g sample from the surface of 67P/Churyumov-Gerasimenko for detailed analysis in terrestrial laboratories. It would carry a simple payload comprising a narrow-angle camera and mm-wave radiometer to select a sampling site, and perform a gravity science investigation to survey changes of 67P since Rosetta. The proposed sampling system uses the BiBlade tool to acquire a sample down to 15 cm depth in a Touch-and-Go event. The Stardust-based sample return capsule is augmented with cooling and purge systems to maintain sample integrity during landing and until delivery to JSC's Astromaterials Curation Facility. Analysis of rock-forming minerals, organics, water and noble gases would probe the origin of these materials, and their evolution from the primordial molecular cloud to the 67P environment.

  1. The Astromaterials X-Ray Computed Tomography Laboratory at Johnson Space Center

    NASA Astrophysics Data System (ADS)

    Zeigler, R. A.; Blumenfeld, E. H.; Srinivasan, P.; McCubbin, F. M.; Evans, C. A.

    2018-04-01

    The Astromaterials Curation Office has recently begun incorporating X-ray CT data into the curation processes for lunar and meteorite samples, and long-term curation of that data and serving it to the public represent significant technical challenges.

  2. International Agreement on Planetary Protection

    NASA Technical Reports Server (NTRS)

    2000-01-01

    NASA maintains a planetary protection policy consistent with international agreements. Policy management resides in OSS, with Field Center support, and advice is provided by internal and external advisory groups (NRC, NAC Planetary Protection Task Force). Supporting activities include technology research and standards development in bioload characterization, and technology research and development in bioload reduction/sterilization. This presentation focuses on: forward contamination - research on the potential for Earth life to exist on other bodies, improved strategies for planetary navigation and collision avoidance, and improved procedures for sterile spacecraft assembly, cleaning, and/or sterilization; and backward contamination - development of sample transfer and container sealing technologies for Earth return, improvement in sample return landing target assessment and navigation strategy, planning for sample hazard determination requirements and procedures, safety certification (liaison to the NEO Program Office for compositional data on small bodies), and facility planning for sample recovery systems, quarantine, and long-term curation of returned samples.

  3. Lunar and Meteorite Thin Sections for Undergraduate and Graduate Studies

    NASA Technical Reports Server (NTRS)

    Allen, J.; Galindo, C.; Luckey, M.; Reustle, J.; Todd, N.; Allen, C.

    2012-01-01

    The Johnson Space Center (JSC) has the unique responsibility to curate NASA's extraterrestrial samples from past and future missions. Curation includes documentation, preservation, preparation, and distribution of samples for research, education, and public outreach. Between 1969 and 1972, six Apollo missions brought back 382 kilograms of lunar rocks, core samples, pebbles, sand, and dust from the lunar surface. JSC also curates meteorites collected on US expeditions to Antarctica, including rocks from the Moon, Mars, and many asteroids including Vesta. Studies of rock and soil samples from the Moon and meteorites continue to yield useful information about the early history of the Moon, the Earth, and the inner solar system.

  4. Sharing Responsibility for Data Stewardship Between Scientists and Curators

    NASA Astrophysics Data System (ADS)

    Hedstrom, M. L.

    2012-12-01

    Data stewardship is becoming increasingly important to support accurate conclusions from new forms of data, integration of and computation across heterogeneous data types, interactions between models and data, replication of results, data governance and long-term archiving. In addition to increasing recognition of the importance of data management, data science, and data curation by US and international scientific agencies, the National Academies of Science Board on Research Data and Information is sponsoring a study on Data Curation Education and Workforce Issues. Effective data stewardship requires a distributed effort among scientists who produce data, IT staff and/or vendors who provide data storage and computational facilities and services, and curators who enhance data quality, manage data governance, provide access to third parties, and assume responsibility for long-term archiving of data. The expertise necessary for scientific data management includes a mix of knowledge of the scientific domain; an understanding of domain data requirements, standards, ontologies and analytical methods; facility with leading-edge information technology; and knowledge of data governance, standards, and best practices for long-term preservation and access that are rarely found in a single individual. Rather than developing data science and data curation as new and distinct occupations, this paper examines the set of tasks required for data stewardship. The paper proposes an alternative model that embeds data stewardship in scientific workflows and coordinates hand-offs between instruments, repositories, analytical processing, publishers, distributors, and archives. This model forms the basis for defining knowledge and skill requirements for specific actors in the processes required for data stewardship and the corresponding educational and training needs.

  5. Advanced Curation: Solving Current and Future Sample Return Problems

    NASA Technical Reports Server (NTRS)

    Fries, M.; Calaway, M.; Evans, C.; McCubbin, F.

    2015-01-01

    Advanced Curation is a wide-ranging and comprehensive research and development effort at NASA Johnson Space Center that identifies and remediates sample-related issues. For current collections, Advanced Curation investigates new cleaning, verification, and analytical techniques to assess their suitability for improving curation processes. Specific needs are also assessed for future sample return missions. For each need, a written plan is drawn up to achieve the requirement. The plan draws upon current curation practices, input from curators, the analytical expertise of the Astromaterials Research and Exploration Science (ARES) team, and suitable standards maintained by ISO, IEST, NIST, and other institutions. Additionally, new technologies are adopted on the basis of need and availability. Implementation plans are tested using customized trial programs with statistically robust courses of measurement, and are iterated if necessary until an implementable protocol is established. Upcoming and potential NASA missions such as OSIRIS-REx, the Asteroid Retrieval Mission (ARM), sample return missions in the New Frontiers program, and Mars sample return (MSR) all feature new difficulties and specialized sample handling requirements. The Mars 2020 mission in particular poses a suite of challenges, since the mission will cache martian samples for possible return to Earth. In anticipation of future MSR, the following problems are among those under investigation: What is the most efficient means to achieve the less than 1.0 ng/sq cm total organic carbon (TOC) cleanliness required for all sample handling hardware? How do we maintain and verify cleanliness at this level? The Mars 2020 Organic Contamination Panel (OCP) predicts that organic carbon, if present, will be present at the "one to tens" of ppb level in martian near-surface samples. The same samples will likely contain wt% perchlorate salts, or approximately 1,000,000x as much perchlorate oxidizer as organic carbon. The chemical kinetics of perchlorate-organic reactions are poorly understood at present under the conditions of cached or curated martian samples. Among other parameters, what is the maximum temperature allowed during storage in order to preserve native martian organic compounds for analysis? What is the best means to collect headspace gases from cached martian (and other) samples? This gas will contain not only martian atmosphere but also off-gassed volatiles from the cached solids.
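
    The quoted ~1,000,000x figure follows directly from the units, as this back-of-envelope check shows (representative values taken from the abstract):

        # Oxidizer-to-organic mass ratio for a cached martian sample.
        PPB_PER_WT_PERCENT = 1e7       # 1 wt% = 10,000 ppm = 10^7 ppb by mass

        perchlorate_ppb = 1.0 * PPB_PER_WT_PERCENT   # ~1 wt% perchlorate salts
        organic_ppb = 10.0                           # "one to tens" of ppb TOC

        print(f"oxidizer : organic ~ {perchlorate_ppb / organic_ppb:,.0f} : 1")
        # -> 1,000,000 : 1, matching the approximate figure in the abstract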

  6. Pieces of Other Worlds - Extraterrestrial Samples for Education and Public Outreach

    NASA Technical Reports Server (NTRS)

    Allen, Carlton C.

    2010-01-01

    During the Year of the Solar System, spacecraft from NASA and our international partners will encounter two comets, orbit the asteroid Vesta, continue to explore Mars with rovers, and launch robotic explorers to the Moon and Mars. We have pieces of all these worlds in our laboratories, and their continued study provides incredibly valuable "ground truth" to complement space exploration missions. Extensive information about these unique materials, as well as actual lunar samples and meteorites, is available for display and education. The Johnson Space Center (JSC) has the unique responsibility to curate NASA's extraterrestrial samples from past and future missions. Curation includes documentation, preservation, preparation, and distribution of samples for research, education, and public outreach. At the current time JSC curates six types of extraterrestrial samples: (1) Moon rocks and soils collected by the Apollo astronauts; (2) meteorites collected on US expeditions to Antarctica (including rocks from the Moon, Mars, and many asteroids including Vesta); (3) "cosmic dust" (asteroid and comet particles) collected by high-altitude aircraft; (4) solar wind atoms collected by the Genesis spacecraft; (5) comet particles collected by the Stardust spacecraft; and (6) interstellar dust particles collected by the Stardust spacecraft. These rocks, soils, dust particles, and atoms continue to be studied intensively by scientists around the world. Descriptions of the samples, research results, thousands of photographs, and information on how to request research samples are on the JSC Curation website: http://curator.jsc.nasa.gov/ NASA provides a limited number of Moon rock samples for either short-term or long-term displays at museums, planetariums, expositions, and professional events that are open to the public. The JSC Public Affairs Office handles requests for such display samples. Requestors should apply in writing to Mr. Louis Parker, JSC Exhibits Manager. Mr. Parker will advise successful applicants regarding provisions for receipt, display, and return of the samples. All loans will be preceded by a signed loan agreement executed between NASA and the requestor's organization. Email address: louis.a.parker@nasa.gov Sets of twelve thin sections of Apollo lunar samples and sets of twelve thin sections of meteorites are available for short-term loan from JSC Curation. The thin sections are designed for use in college and university courses where petrographic microscopes are available for viewing. Requestors should contact Ms. Mary Luckey, Education Sample Curator. Email address: mary.k.luckey@nasa.gov

  7. Advances in Small Particle Handling of Astromaterials in Preparation for OSIRIS-REx and Hayabusa2: Initial Developments

    NASA Technical Reports Server (NTRS)

    Snead, C. J.; McCubbin, F. M.; Nakamura-Messenger, K.; Righter, K.

    2018-01-01

    The Astromaterials Acquisition and Curation Office at NASA Johnson Space Center has established an Advanced Curation program that is tasked with developing procedures, technologies, and data sets necessary for the curation of future astromaterials collections as envisioned by NASA exploration goals. One particular objective of the Advanced Curation program is the development of new methods for the collection, storage, handling, and characterization of small (less than 100 micrometers) particles. Astromaterials Curation currently maintains four small particle collections: cosmic dust that has been collected in Earth's stratosphere by ER2 and WB-57 aircraft, comet 81P/Wild 2 dust returned by NASA's Stardust spacecraft, interstellar dust that was returned by Stardust, and asteroid Itokawa particles that were returned by JAXA's Hayabusa spacecraft. NASA Curation is currently preparing for the anticipated return of two new astromaterials collections - asteroid Ryugu regolith to be collected by the Hayabusa2 spacecraft in 2021 (samples will be provided by JAXA as part of an international agreement), and asteroid Bennu regolith to be collected by the OSIRIS-REx spacecraft and returned in 2023. A substantial portion of these returned samples is expected to consist of small particle components, and mission requirements necessitate the development of new processing tools and methods in order to maximize the scientific yield from these valuable acquisitions. Here we describe initial progress towards the development of applicable sample handling methods for the successful curation of future small particle collections.

  8. Nuts and Bolts - Techniques for Genesis Sample Curation

    NASA Technical Reports Server (NTRS)

    Burkett, Patti J.; Rodriquez, M. C.; Allton, J. H.

    2011-01-01

    The Genesis curation staff at NASA Johnson Space Center provides samples and data for analysis to the scientific community, following allocation approval by the Genesis Oversight Committee, a sub-committee of CAPTEM (the Curation and Analysis Planning Team for Extraterrestrial Materials). We are often asked by investigators within the scientific community how we choose samples to best fit the requirements of a request. Here we demonstrate our techniques for characterizing samples and satisfying allocation requests. Even with a systematic approach, every allocation is unique. We also provide an updated status of the cataloging and characterization of solar wind collectors as of January 2011. The collection consists of 3721 inventoried samples, each a single fragment or multiple fragments, containerized pressed between post-it notes, or in jars or vials of various sizes.

  9. Processes to Open the Container and the Sample Catcher of the Hayabusa Returned Capsule in the Planetary Material Sample Curation Facility of JAXA

    NASA Technical Reports Server (NTRS)

    Fujimura, A.; Abe, M.; Yada, T.; Nakamura, T.; Noguchi, T.; Okazaki, R.; Ishibashi, Y.; Shirai, K.; Okada, T.; Yano, H.; hide

    2011-01-01

    The Japanese spacecraft Hayabusa, returning from the near-Earth asteroid Itokawa, successfully delivered its reentry capsule to Earth, landing in the Woomera Prohibited Area in Australia on June 13th, 2010, as detailed in another paper [1]. The capsule was introduced into the Planetary Material Sample Curation Facility on the Sagamihara campus of JAXA in the early morning of June 18th. Hereafter, we describe the series of processes applied to the returned capsule and its container to recover the gas and materials inside. The transportation box holding the recovered capsule was cleaned on its outer surface beforehand and introduced into the class 10,000 clean room of the facility. The capsule was then extracted from the box, its plastic bag was opened, and the outer surface of the capsule was checked and photographed. The capsule was composed of the container, a backside ablator, a side ablator, an electronics box, and a supporting frame. The container consists of an outer lid, an inner lid, a frame for latches, the container body, and a sample catcher, which comprises rooms A and B and a rotational cylinder. After the first check, the capsule was repacked in a plastic bag with N2 and transferred to the Chofu campus of JAXA, where the X-ray CT instrument is situated. The first X-ray CT analysis was performed on the whole returned capsule to confirm the condition of the latches and the O-ring seal of the container. The analysis showed that the latches of the container should have worked normally and that the double O-rings appeared to have sealed the sample catcher without problem. After the first X-ray CT, the capsule was sent back to Sagamihara and introduced into the clean room, where the electronics box and the side ablator were removed from the container with hand tools. The container with the backside ablator was then fixed firmly with special jigs, holding the lid tightly to the container, and set on a milling machine. The backside ablator was drilled by the machine to expose the heads of the bolts joining the ablator to the outer lid of the container; after the drilling was finished, all the bolts were unscrewed and the backside ablator was removed from the container. The container was then sent to the Chofu X-ray facility again for detailed examination with a micro X-ray CT instrument, to reconfirm that the condition of the latches of the container lid was normal and that the double O-rings had remained sealed since the first X-ray CT analysis.

  10. Research-Grade 3D Virtual Astromaterials Samples: Novel Visualization of NASA's Apollo Lunar Samples and Antarctic Meteorite Samples to Benefit Curation, Research, and Education

    NASA Technical Reports Server (NTRS)

    Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K. R.; Zeigler, R. A.; Righter, K.; Hanna, R. D.; Ketcham, R. A.

    2017-01-01

    NASA's vast and growing collections of astromaterials are both scientifically and culturally significant, requiring unique preservation strategies that need to be recurrently updated to contemporary technological capabilities and increasing accessibility demands. New technologies have made it possible to advance documentation and visualization practices that can enhance conservation and curation protocols for NASA's Astromaterials Collections. Our interdisciplinary team has developed a method to create 3D Virtual Astromaterials Samples (VAS) of the existing collections of Apollo Lunar Samples and Antarctic Meteorites. Research-grade 3D VAS will virtually put these samples in the hands of researchers and educators worldwide, increasing accessibility and visibility of these significant collections. With new sample return missions on the horizon, it is of primary importance to develop advanced curation standards for documentation and visualization methodologies.

  11. Curation of Microscopic Astromaterials by NASA: "Gathering Dust Since 1981"

    NASA Technical Reports Server (NTRS)

    Frank, D. R.; Bastien, R. K.; Rodriguez, M.; Gonzalez, C.; Zolensky, M. E.

    2013-01-01

    Employing the philosophy that "Small is Beautiful", NASA has been collecting and curating microscopic astromaterials since 1981. These active collections now include interplanetary dust collected in Earth's stratosphere by U-2, ER-2 and WB-57F aircraft (the Cosmic Dust Program - our motto is "Gathering dust since 1981"), comet Wild-2 coma dust (the Stardust Mission), modern interstellar dust (also the Stardust Mission), asteroid Itokawa regolith dust (the Hayabusa Mission - joint curation with JAXA-ISAS), and interplanetary dust impact features on recovered portions of the following spacecraft: Skylab, the Solar Maximum Satellite, the Palapa Satellite, the Long Duration Exposure Facility (LDEF), the MIR Space Station, the International Space Station, and the Hubble Space Telescope (all in the Space Exposed Hardware Laboratory).

  12. Initial analysis and curation plans for MUSES-C asteroidal samples

    NASA Astrophysics Data System (ADS)

    Yano, H.; Kushiro, I.; Fujiwara, A.

    In the MUSES-C mission, a sample return of several hundred mg to several g in total is expected from the surface of the S-type near-Earth asteroid 1998 SF36 in 2007. The MUSES-C samples are expected to be more similar to micrometeorites than to large pieces of rock. In addition, the initial analysis to characterize general aspects of the returned samples may consume only 15% of their total mass and must complete all analyses, including database building, before the international AO for detailed analyses opens in less than a year. Confident exercise of non-destructive micro-analyses whenever possible is thus vital for the 'MUSES-C Asteroidal Sample Preliminary Examination Team' (MASPET), which will be formed by the ISAS MUSES-C team, international partners from NASA and Australia, and 'all-Japan' meteoritic scientists to be selected for outsourced parts of the initial analyses. In 2000-2001, in order to survey what kinds and levels of micro-analysis techniques are presently available in Japan in the respective fields, from major elements and mineralogy to trace elements, isotopes, and organics, ISAS welcomed a total of 11 applications in the first round of open competition for MASPET candidates. The initial evaluation was made by multiple domestic peer reviews. Nine of the 11 applicants were then provided two kinds of 'asteroid sample analogs', unknown to the applicants in advance, by the Selection Committee (chair: I. Kushiro), in order to conduct their proposed analyses with a self-claimed amount of sample (100 mg max) in a self-claimed duration (6 months max). The proponents had to demonstrate that their technical capabilities, analytical precision, and the usefulness of the derived results for subsequent detailed analyses were worth inclusion in the MASPET studies. After the completion of multiple international peer reviews, the Selection Committee compiled the evaluations and recommended the finalists of this round of competition. However, it was also recognized that a few areas of expertise are still lacking among the recommended members. Thus, the competition shall be repeated one or two more times (in early 2003 after the launch, and possibly in 2005 after in-situ data are obtained) in order to assemble the best Japanese experts across the whole range of analyses at the time of sample return. The final members of MASPET will be appointed about two years prior to the Earth return. They will then conduct a 'test run' of the whole initial analysis procedure at the ISAS asteromaterial curation facility, to be newly built within the next few years, and at their respective analysis facilities. This talk also covers the current concepts for the facility and plans for the analysis procedure flow.

  13. Advanced Curation Protocols for Mars Returned Sample Handling

    NASA Astrophysics Data System (ADS)

    Bell, M.; Mickelson, E.; Lindstrom, D.; Allton, J.

    Introduction: Johnson Space Center has over 30 years of experience handling precious samples, including lunar rocks and Antarctic meteorites. However, we recognize that future curation of samples from missions such as Genesis, Stardust, and Mars Sample Return will require a high degree of biosafety combined with extremely low levels of inorganic, organic, and biological contamination. To satisfy these requirements, research in the JSC Advanced Curation Lab is currently focused on two major areas: preliminary examination techniques, and cleaning and verification techniques. Preliminary Examination Techniques: In order to minimize the number of paths for contamination, we are exploring the synergy between human and robotic sample handling in a controlled environment to help determine the limits of clean curation. Within the Advanced Curation Laboratory is a prototype, next-generation glovebox containing a robotic micromanipulator. The remotely operated manipulator has six degrees of freedom and can be programmed to perform repetitive sample handling tasks. Protocols are being tested and developed to perform curation tasks such as rock splitting, weighing, imaging, and storing. Techniques for sample transfer enabling more detailed remote examination without compromising the integrity of sample science are also being developed. The glovebox is equipped with a rapid transfer port through which samples can be passed without exposure. The transfer is accomplished using a unique seal and engagement system that allows passage between containers while maintaining a first seal to the outside environment and a second seal to prevent the outside of the container cover and port door from becoming contaminated by the material being transferred. Cleaning and Verification Techniques: As part of the contamination control effort, innovative cleaning techniques are being identified and evaluated in conjunction with sensitive cleanliness verification methods. Toward this end, cleaning techniques such as ultrasonication in ultra-pure water (UPW), oxygen (O2) plasma, and carbon dioxide (CO2) "snow" are being used to clean a variety of different contaminants on a variety of different surfaces. Additionally, once surfaces are cleaned, techniques to directly verify their cleanliness are being developed. These include X-ray photoelectron spectroscopy (XPS) quantification and screening with contact angle measurements, which can be correlated with XPS standards. Methods developed in the Advanced Curation Laboratory will determine the extent to which inorganic and biological contamination can be controlled and minimized.
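
    One plausible way to operationalize the contact-angle screening described above is to fit a calibration curve against XPS-quantified standards and then screen cleaned hardware against a cleanliness limit. The sketch below assumes a simple linear relation and uses invented calibration data; it illustrates the correlation idea only, not an established JSC procedure:

        # Correlating quick contact-angle measurements with XPS carbon standards.
        import numpy as np

        # calibration standards: water contact angle (deg) vs. XPS surface carbon (at.%)
        angle_deg  = np.array([5.0, 15.0, 30.0, 45.0, 60.0])
        xps_carbon = np.array([2.0,  6.0, 14.0, 22.0, 30.0])

        slope, intercept = np.polyfit(angle_deg, xps_carbon, 1)  # linear fit

        def screen(measured_angle_deg: float, limit_at_percent: float = 5.0) -> bool:
            """True if predicted surface carbon is below the cleanliness limit."""
            predicted = slope * measured_angle_deg + intercept
            return predicted < limit_at_percent

        print(screen(8.0))    # low angle: likely clean
        print(screen(50.0))   # high angle: fails screening, send to XPS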

  14. Apollo Missions to the Lunar Surface

    NASA Technical Reports Server (NTRS)

    Graff, Paige V.

    2018-01-01

    Six Apollo missions to the Moon, from 1969-1972, enabled astronauts to collect and bring lunar rocks and materials from the lunar surface to Earth. Apollo lunar samples are curated by NASA Astromaterials at the NASA Johnson Space Center in Houston, TX. Samples continue to be studied and provide clues about our early Solar System. Learn more and view collected samples at: https://curator.jsc.nasa.gov/lunar.

  15. The Internet of Samples in the Earth Sciences: Providing Access to Uncurated Collections

    NASA Astrophysics Data System (ADS)

    Carter, M. R.; Lehnert, K. A.

    2014-12-01

    Vast amounts of physical samples have been collected in the Earth Sciences for studies that address a wide range of scientific questions. Only a fraction of these samples are well curated and preserved long-term in sample repositories and museums. Many samples and collections are stored in the offices and labs of investigators, or in basements and sheds of institutions and investigators' homes. These 'uncurated' collections often contain samples that have been well studied, or are unique and irreplaceable. They may also include samples that could reveal new insights if re-analyzed using new techniques, or specimens that could have unanticipated relevance to research being conducted in fields other than the one for which they were collected. Currently, these samples cannot be accessed or discovered online by the broader science community. Investigators and departments often lack the resources to properly catalog and curate the samples and respond to requests for splits. Long-term preservation of and access to these samples is usually not provided for. iSamplES, a recently funded EarthCube Research Coordination Network (RCN), seeks to integrate scientific samples, including 'uncurated' samples, into digital data and information infrastructure in the Earth Sciences and to facilitate their curation, discovery, access, sharing, and analysis. The RCN seeks to develop and implement best practices that increase digital access to samples, with the goal of establishing a comprehensive infrastructure not only for the digital but also the physical curation of samples. The RCN will engage a broad group of individuals, from domain scientists to curators to publishers to computer scientists, to define, articulate, and address the needs and challenges of digital sample management and recommend community-endorsed best practices and standards for registering, describing, identifying, and citing physical specimens, drawing upon other initiatives and existing or emerging software tools for digital sample and collection management. Community engagement will include surveys, in-person workshops and outreach events, the creation of the iSamplES knowledge hub (semantic wiki), and a registry of collections. iSamplES will specifically engage early-career scientists, to help ensure that no samples go uncurated.

  16. Comprehensive Non-Destructive Conservation Documentation of Lunar Samples Using High-Resolution Image-Based 3D Reconstructions and X-Ray CT Data

    NASA Technical Reports Server (NTRS)

    Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K.; Zeigler, R. A.; Hanna, R. D.; Ketcham, R. A.

    2015-01-01

    Established contemporary conservation methods within the fields of Natural and Cultural Heritage encourage an interdisciplinary approach to preservation of heritage material (both tangible and intangible) that holds "Outstanding Universal Value" for our global community. NASA's lunar samples were acquired from the moon for the primary purpose of intensive scientific investigation. These samples, however, also invoke cultural significance, as evidenced by the millions of people per year that visit lunar displays in museums and heritage centers around the world. Being both scientifically and culturally significant, the lunar samples require a unique conservation approach. Government mandate dictates that NASA's Astromaterials Acquisition and Curation Office develop and maintain protocols for "documentation, preservation, preparation and distribution of samples for research, education and public outreach" for both current and future collections of astromaterials. Documentation, considered the first stage within the conservation methodology, has evolved many new techniques since curation protocols for the lunar samples were first implemented, and the development of new documentation strategies for current and future astromaterials is beneficial to keeping curation protocols up to date. We have developed and tested a comprehensive non-destructive documentation technique using high-resolution image-based 3D reconstruction and X-ray CT (XCT) data in order to create interactive 3D models of lunar samples that would ultimately be served to both researchers and the public. These data enhance preliminary scientific investigations, including targeted sample requests, and also provide a new visual platform for the public to experience and interact with the lunar samples. We intend to serve these data as they are acquired on NASA's Astromaterials Acquisition and Curation website at http://curator.jsc.nasa.gov/. Providing 3D interior and exterior documentation of astromaterial samples addresses the increasing demands for accessibility to data and contemporary techniques for documentation, which can be realized for both current collections and future sample return missions.
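
    A key step in fusing the two data sets is registering the exterior photogrammetry model to the surface extracted from the XCT volume. The abstract does not specify the registration method used; the sketch below shows one conventional approach, point-cloud ICP with the open-source Open3D library, with hypothetical file names and tolerances:

        # Align a photogrammetry mesh to a CT-derived isosurface mesh with ICP.
        import numpy as np
        import open3d as o3d

        photo_mesh = o3d.io.read_triangle_mesh("sample_photogrammetry.ply")
        ct_mesh    = o3d.io.read_triangle_mesh("sample_ct_isosurface.ply")

        # sample both surfaces to point clouds for registration
        source = photo_mesh.sample_points_uniformly(number_of_points=50_000)
        target = ct_mesh.sample_points_uniformly(number_of_points=50_000)

        result = o3d.pipelines.registration.registration_icp(
            source, target,
            0.5,        # max correspondence distance (mm), after rough pre-alignment
            np.eye(4),  # initial transform
            o3d.pipelines.registration.TransformationEstimationPointToPoint())

        print("fitness:", result.fitness)            # fraction of matched points
        photo_mesh.transform(result.transformation)  # exterior model now in the CT frame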

  17. An Interdisciplinary Method for the Visualization of Novel High-Resolution Precision Photography and Micro-XCT Data Sets of NASA's Apollo Lunar Samples and Antarctic Meteorite Samples to Create Combined Research-Grade 3D Virtual Samples for the Benefit of Astromaterials Collections Conservation, Curation, Scientific Research and Education

    NASA Technical Reports Server (NTRS)

    Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K.; Zeigler, R. A.; Hanna, R. D.; Ketcham, R. A.

    2016-01-01

    New technologies make possible the advancement of documentation and visualization practices that can enhance conservation and curation protocols for NASA's Astromaterials Collections. With increasing demands for accessibility to updated comprehensive data, and with new sample return missions on the horizon, it is of primary importance to develop new standards for contemporary documentation and visualization methodologies. Our interdisciplinary team has expertise in the fields of heritage conservation practices, professional photography, photogrammetry, imaging science, application engineering, data curation, geoscience, and astromaterials curation. Our objective is to create virtual 3D reconstructions of Apollo Lunar and Antarctic Meteorite samples that are a fusion of two state-of-the-art data sets: the interior view of the sample by collecting Micro-XCT data and the exterior view of the sample by collecting high-resolution precision photography data. These new data provide researchers an information-rich visualization of both compositional and textural information prior to any physical sub-sampling. Since January 2013 we have developed a process that resulted in the successful creation of the first image-based 3D reconstruction of an Apollo Lunar Sample correlated to a 3D reconstruction of the same sample's Micro-XCT data, illustrating that this technique is both operationally possible and functionally beneficial. In May of 2016 we began a 3-year research period during which we aim to produce Virtual Astromaterials Samples for 60 high-priority Apollo Lunar and Antarctic Meteorite samples and serve them on NASA's Astromaterials Acquisition and Curation website. Our research demonstrates that research-grade Virtual Astromaterials Samples are beneficial in preserving for posterity a precise 3D reconstruction of the sample prior to sub-sampling, which greatly improves documentation practices, provides unique and novel visualization of the sample's interior and exterior features, offers scientists a preliminary research tool for targeted sub-sample requests, and additionally is a visually engaging interactive tool for bringing astromaterials science to the public.

  18. An Interdisciplinary Method for the Visualization of Novel High-Resolution Precision Photography and Micro-XCT Data Sets of NASA's Apollo Lunar Samples and Antarctic Meteorite Samples to Create Combined Research-Grade 3D Virtual Samples for the Benefit of Astromaterials Collections Conservation, Curation, Scientific Research and Education

    NASA Astrophysics Data System (ADS)

    Blumenfeld, E. H.; Evans, C. A.; Zeigler, R. A.; Righter, K.; Beaulieu, K. R.; Oshel, E. R.; Liddle, D. A.; Hanna, R.; Ketcham, R. A.; Todd, N. S.

    2016-12-01

    New technologies make possible the advancement of documentation and visualization practices that can enhance conservation and curation protocols for NASA's Astromaterials Collections. With increasing demands for accessibility to updated comprehensive data, and with new sample return missions on the horizon, it is of primary importance to develop new standards for contemporary documentation and visualization methodologies. Our interdisciplinary team has expertise in the fields of heritage conservation practices, professional photography, photogrammetry, imaging science, application engineering, data curation, geoscience, and astromaterials curation. Our objective is to create virtual 3D reconstructions of Apollo Lunar and Antarctic Meteorite samples that are a fusion of two state-of-the-art data sets: the interior view of the sample by collecting Micro-XCT data and the exterior view of the sample by collecting high-resolution precision photography data. These new data provide researchers an information-rich visualization of both compositional and textural information prior to any physical sub-sampling. Since January 2013 we have developed a process that resulted in the successful creation of the first image-based 3D reconstruction of an Apollo Lunar Sample correlated to a 3D reconstruction of the same sample's Micro-XCT data, illustrating that this technique is both operationally possible and functionally beneficial. In May of 2016 we began a 3-year research period during which we aim to produce Virtual Astromaterials Samples for 60 high-priority Apollo Lunar and Antarctic Meteorite samples and serve them on NASA's Astromaterials Acquisition and Curation website. Our research demonstrates that research-grade Virtual Astromaterials Samples are beneficial in preserving for posterity a precise 3D reconstruction of the sample prior to sub-sampling, which greatly improves documentation practices, provides unique and novel visualization of the sample's interior and exterior features, offers scientists a preliminary research tool for targeted sub-sample requests, and additionally is a visually engaging interactive tool for bringing astromaterials science to the public.

  19. Utilizing the International GeoSample Number Concept during ICDP Expedition COSC

    NASA Astrophysics Data System (ADS)

    Conze, Ronald; Lorenz, Henning; Ulbricht, Damian; Gorgas, Thomas; Elger, Kirsten

    2016-04-01

    The concept of the International GeoSample Number (IGSN) was introduced to uniquely identify and register geo-related sample material and make it retrievable via electronic media (e.g., SESAR - http://www.geosamples.org/igsnabout). The general aim of the IGSN concept is to improve access to stored sample material worldwide, to enable exact identification of a sample, its origin, and its provenance, and to allow exact and complete citation of acquired samples throughout the literature. The ICDP expedition COSC (Collisional Orogeny in the Scandinavian Caledonides, http://cosc.icdp-online.org) was the first in ICDP's history to assign and register IGSNs during an ongoing drilling campaign. ICDP drilling expeditions commonly use the Drilling Information System DIS (http://doi.org/10.2204/iodp.sd.4.07.2007) for the inventory of recovered sample material. During COSC, IGSNs were assigned to every drill hole, core run, core section, and sample taken from core material. The original IGSN specification was extended to achieve the required uniqueness of IGSNs with our offline procedure. The ICDP name space indicator and the expedition ID (5054) form an extended prefix (ICDP5054). For every type of sample material, an encoded sequence of characters follows. This sequence is derived from the DIS naming convention, which is unique from the beginning; thereby every ICDP expedition has an unlimited name space for IGSN assignments. This direct derivation of IGSNs from the DIS database context ensures a distinct parent-child hierarchy among the IGSNs. In the case of COSC, this method of inventory-keeping of all drill cores was done routinely using the ExpeditionDIS during field work and the subsequent sampling party. After completion of the field campaign, all sample material was transferred to the "Nationales Bohrkernlager" in Berlin-Spandau, Germany. The corresponding data were subsequently imported into the CurationDIS used at that core storage facility. The CurationDIS assigns IGSNs to samples newly taken in the repository in the same fashion as in the field; thereby the parent-child linkage of the IGSNs is ensured consistently throughout the entire sampling process. The only difference between ExpeditionDIS and CurationDIS sample curation is the use of the name spaces ICDP and BGRB, respectively, as part of the corresponding ID string. To prepare the IGSN registry, a set of metadata is generated for every assigned IGSN using the DIS and then exported from the DIS into one common XML file. The XML file is based on the SESAR schema and a proposal of IGSN e.V. (http://schema.igsn.org). These systematics have recently been extended for drilling data to provide additional information for future retrieval options. The two allocation agents, GFZ Potsdam and PANGAEA, are currently involved in the registry of IGSNs for the COSC drill campaigns. An example of the IGSN registration for COSC-1 drill hole A (5054_1_A) is "ICDP5054EEW1001", which can be resolved using the URL http://hdl.handle.net/10273/ICDP5054EEW1001. The landing page for the complete COSC core material of this hole graphically displays a hierarchical tree entitled "Sample Family". An example of an IGSN citation associated with a COSC sample set is featured in an EGU 2016 poster presentation by Ulrich Harms, Johannes Hierold et al. (EGU2016-8646).
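
    To make the identifier construction concrete, the sketch below composes a COSC-style IGSN from the name space, expedition ID, and a DIS-derived sample code, and forms its Handle resolver URL. The helper functions are illustrative; the example string and URL are the ones quoted in the abstract:

        # Composing and resolving a COSC-style IGSN.
        def make_igsn(namespace: str, expedition_id: str, dis_code: str) -> str:
            """Extended prefix (e.g., ICDP5054) + unique DIS naming-convention code."""
            return f"{namespace}{expedition_id}{dis_code}"

        def handle_url(igsn: str) -> str:
            """IGSNs resolve through the Handle system under prefix 10273."""
            return f"http://hdl.handle.net/10273/{igsn}"

        hole = make_igsn("ICDP", "5054", "EEW1001")   # COSC-1 drill hole A (5054_1_A)
        print(hole, "->", handle_url(hole))
        # ICDP5054EEW1001 -> http://hdl.handle.net/10273/ICDP5054EEW1001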

  20. The importance of community building for establishing data management and curation practices for physical samples

    NASA Astrophysics Data System (ADS)

    Ramdeen, S.; Hangsterfer, A.; Stanley, V. L.

    2017-12-01

    There is growing enthusiasm for curation of physical samples in the Earth science community (see sessions at AGU, GSA, and ESIP). Multiple federally funded efforts aim to develop best practices for curation of physical samples; however, these efforts have not yet been consolidated. Harmonizing these concurrent efforts would enable the community as a whole to build the necessary tools and community standards to move forward together. Preliminary research indicates that the various groups focused on this topic are working in isolation and that the development of standards needs to come from the broadest view of 'community'. We will investigate the gaps between communities by collecting information about preservation policies and practices from curators, who can provide a diverse cross-section of the grand challenges facing the overall community. We will look at existing reports and study results to identify example cases, then develop a survey to gather large-scale data that reinforce or clarify those cases. We will target the various community groups working on similar issues and use the survey to improve the visibility of the best practices they have developed. Given that preservation and digital collection management for physical samples are both important and difficult at present (GMRWG, 2015; NRC, 2002), barriers to both need to be addressed in order to achieve open-science goals for the entire community. To address these challenges, EarthCube's iSamples, a research coordination network established to advance discoverability, access, and curation of physical samples using cyberinfrastructure, has formed a working group to collect use cases examining the breadth of Earth scientists' work with physical samples. The research team includes curators of state-survey and oceanographic geological collections and a researcher from information science. In our presentation, we will share our research and the design of the proposed survey. Our goal is to engage the audience in a discussion of next steps toward building this community. References: Geologic Materials Repository Working Group, 2015, USGS Circular 1410; National Research Council, 2002, Geoscience Data and Collections: National Resources in Peril.

  1. Advanced Curation of Current and Future Extraterrestrial Samples

    NASA Technical Reports Server (NTRS)

    Allen, Carlton C.

    2013-01-01

    Curation of extraterrestrial samples is the critical interface between sample return missions and the international research community. Curation includes documentation, preservation, preparation, and distribution of samples. The current collections of extraterrestrial samples include: lunar rocks and soils collected by the Apollo astronauts; meteorites, including samples of asteroids, the Moon, and Mars; "cosmic dust" (asteroid and comet particles) collected by high-altitude aircraft; solar wind atoms collected by the Genesis spacecraft; comet particles collected by the Stardust spacecraft; interstellar dust collected by the Stardust spacecraft; and asteroid particles collected by the Hayabusa spacecraft. These samples were formed in environments strikingly different from that on Earth. Terrestrial contamination can destroy much of the scientific significance of many extraterrestrial materials. In order to preserve the research value of these precious samples, contamination must be minimized, understood, and documented. In addition, the samples must be preserved - as far as possible - from physical and chemical alteration. In 2011 NASA selected the OSIRIS-REx mission, designed to return samples from the primitive asteroid 1999 RQ36 (Bennu). JAXA will sample C-class asteroid 1999 JU3 with the Hayabusa-2 mission. ESA is considering the near-Earth asteroid sample return mission Marco Polo-R. The Decadal Survey listed the first lander in a Mars sample return campaign as its highest-priority flagship-class mission, with sample return from the South Pole-Aitken basin and the surface of a comet among additional top priorities. The latest NASA budget proposal includes a mission to capture a 5-10 m asteroid and return it to the vicinity of the Moon as a target for future sampling. Samples, tools, containers, and contamination witness materials from any of these missions carry unique requirements for acquisition and curation, some of which represent significant advances over methods currently used. New analytical and screening techniques will increase the value of current sample collections, and improved web-based tools will make information on all samples more accessible to researchers and the public. Advanced curation of current and future extraterrestrial samples includes: contamination control (inorganic and organic); temperature of preservation (subfreezing and cryogenic); non-destructive preliminary examination (X-ray tomography, XRF mapping, Raman mapping); microscopic samples (handling, sectioning, transport); special samples (unopened lunar cores); and informatics (online catalogs, community-based characterization).

  2. Apollo Lunar Sample Integration into Google Moon: A New Approach to Digitization

    NASA Technical Reports Server (NTRS)

    Dawson, Melissa D.; Todd, Nancy S.; Lofgren, Gary E.

    2011-01-01

    The Google Moon Apollo Lunar Sample Data Integration project is part of a larger, LASER-funded, four-year lunar rock photo restoration project by NASA's Acquisition and Curation Office [1]. The objective of this project is to enhance the Apollo mission data already available on Google Moon with information about the lunar samples collected during the Apollo missions. To this end, we have combined rock sample data from various sources, including curation databases, mission documentation, and lunar sample catalogs, with newly available digital photography of rock samples to create a user-friendly, interactive tool for learning about the Apollo Moon samples.
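    Google Moon ingests layers as KML, so an integration of this kind ultimately reduces to emitting placemarks from sample metadata. The following is a minimal sketch under that assumption; the function, the description text, and the use of the Apollo 11 landing-site coordinates are illustrative and do not represent the project's actual pipeline.

    ```python
    # Minimal sketch: render one sample record as a KML placemark of the kind
    # a Google Moon layer consumes. Field values are illustrative examples;
    # the project's real data pipeline is not described in the abstract.
    from xml.sax.saxutils import escape

    def sample_placemark(sample_id: str, description: str,
                         lon: float, lat: float) -> str:
        """Return a KML Placemark for one lunar sample collection site."""
        return (
            "<Placemark>\n"
            f"  <name>{escape(sample_id)}</name>\n"
            f"  <description>{escape(description)}</description>\n"
            f"  <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
            "</Placemark>"
        )

    # Apollo 11 landing site (approx. 0.674 N, 23.473 E) as an example location.
    print(sample_placemark("10057", "Apollo 11 ilmenite basalt", 23.473, 0.674))
    ```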

  3. HEROD: a human ethnic and regional specific omics database.

    PubMed

    Zeng, Xian; Tao, Lin; Zhang, Peng; Qin, Chu; Chen, Shangying; He, Weidong; Tan, Ying; Liu, Hong Xia; Yang, Sheng Yong; Chen, Zhe; Jiang, Yu Yang; Chen, Yu Zong

    2017-10-15

    Genetic and gene expression variations within and between populations and across geographical regions have substantial effects on biological phenotypes, diseases, and therapeutic response. The development of precision medicines can be facilitated by OMICS studies of patients of specific ethnicity and geographic region. However, no adequate facility has existed for broadly and conveniently accessing ethnic- and region-specific OMICS data. Here we introduce a new free database, HEROD, a human ethnic and regional specific OMICS database. Its first version contains the gene expression data of 53 070 patients covering 169 diseases in seven ethnic populations from 193 cities/regions in 49 nations, curated from the Gene Expression Omnibus (GEO), the ArrayExpress Archive of Functional Genomics Data (ArrayExpress), The Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium (ICGC). Geographic region information for the curated patients was extracted mainly by hand from the publications referenced by each original study. These data can be accessed and downloaded via keyword search, world-map search, and menu-bar search by disease name, International Classification of Diseases code, geographical region, location of sample collection, ethnic population, gender, age, sample source organ, patient type (patient or healthy), sample type (disease or normal tissue), and assay type on the web interface. The HEROD database is freely accessible at http://bidd2.nus.edu.sg/herod/index.php. The database and web interface are implemented in MySQL, PHP and HTML with all major browsers supported. phacyz@nus.edu.sg. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
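    The abstract notes a MySQL backend behind the search interface; each of the faceted searches it lists reduces to a parameterized query. A minimal sketch with an entirely invented table layout (HEROD's real schema is not published in the record), using SQLite as a stand-in for MySQL:

    ```python
    # Hypothetical illustration of the faceted search described above.
    # Table and column names are invented; SQLite stands in for MySQL.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE expression_samples (
        disease TEXT, icd_code TEXT, region TEXT, ethnicity TEXT,
        gender TEXT, age INTEGER, organ TEXT, sample_type TEXT)""")

    # A menu-bar search by disease, ethnicity, and sample type becomes:
    rows = conn.execute(
        """SELECT region, COUNT(*) AS n_samples
           FROM expression_samples
           WHERE disease = ? AND ethnicity = ? AND sample_type = ?
           GROUP BY region""",
        ("hepatocellular carcinoma", "East Asian", "disease"),
    ).fetchall()
    ```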

  4. New Web Services for Broader Access to National Deep Submergence Facility Data Resources Through the Interdisciplinary Earth Data Alliance

    NASA Astrophysics Data System (ADS)

    Ferrini, V. L.; Grange, B.; Morton, J. J.; Soule, S. A.; Carbotte, S. M.; Lehnert, K.

    2016-12-01

    The National Deep Submergence Facility (NDSF) operates the Human Occupied Vehicle (HOV) Alvin, the Remotely Operated Vehicle (ROV) Jason, and the Autonomous Underwater Vehicle (AUV) Sentry. These vehicles are deployed throughout the global oceans to acquire sensor data and physical samples for a variety of interdisciplinary science programs. As part of the EarthCube Integrative Activity Alliance Testbed Project (ATP), new web services were developed to improve access to existing online NDSF data and metadata resources. These services make use of tools and infrastructure developed by the Interdisciplinary Earth Data Alliance (IEDA) and enable programmatic access to metadata and data resources as well as the development of new service-driven user interfaces. The Alvin Frame Grabber and Jason Virtual Van enable exploration of frame-grabbed images derived from video cameras on NDSF dives. Metadata available for each image include time and vehicle position, data from environmental sensors, and scientist-generated annotations; data are organized and accessible by cruise and/or dive. A new FrameGrabber web service and service-driven user interface were deployed to offer integrated access to these data resources through a single API, allowing users to search across content curated in both systems. In addition, a new NDSF Dive Metadata web service and service-driven user interface were deployed to provide consolidated access to basic information about each NDSF dive (e.g., vehicle name, dive ID, location), which is important for linking distributed data resources curated in different data systems.
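    Programmatic access of the kind described might look as follows. This is a hedged sketch only: the base URL, path, parameter names, and response fields are assumptions for illustration, not the documented IEDA API.

    ```python
    # Sketch of a client for the NDSF Dive Metadata service described above.
    # The endpoint and parameter names are hypothetical placeholders.
    import json
    import urllib.parse
    import urllib.request

    BASE = "https://services.example.org/ndsf/dives"  # hypothetical endpoint

    def fetch_dives(vehicle: str, cruise: str) -> list:
        """Query basic dive metadata (vehicle name, dive ID, location, ...)."""
        query = urllib.parse.urlencode({"vehicle": vehicle, "cruise": cruise})
        with urllib.request.urlopen(f"{BASE}?{query}") as resp:
            return json.load(resp)

    # Example (would require a live endpoint; field names are assumed):
    # for dive in fetch_dives("Alvin", "AT26-10"):
    #     print(dive["dive_id"], dive["lat"], dive["lon"])
    ```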

  5. A Draft Protocol for Detecting Possible Biohazards in Martian Samples Returned to Earth

    NASA Technical Reports Server (NTRS)

    Viso, M.; DeVincenzi, D. L.; Race, M. S.; Schad, P. J.; Stabekis, P. D.; Acevedo, S. E.; Rummel, J. D.

    2002-01-01

    In preparation for missions to Mars that will involve the return of samples, it is necessary to prepare for the safe receiving, handling, testing, distributing, and archiving of martian materials here on Earth. Previous groups and committees have studied selected aspects of sample return activities, but a specific protocol for handling and testing of returned samples from Mars remained to be developed. To refine the requirements for Mars sample hazard testing and to develop criteria for the subsequent release of sample materials from precautionary containment, the NASA Planetary Protection Officer, working in collaboration with CNES, convened a series of workshops to produce a Protocol by which returned martian sample materials could be assessed for biological hazards and examined for evidence of life (extant or extinct), while safeguarding the samples from possible terrestrial contamination. The Draft Protocol was then reviewed by an Oversight and Review Committee formed specifically for that purpose and composed of senior scientists. In order to preserve the scientific value of returned martian samples under safe conditions, while avoiding false indications of life within the samples, the Sample Receiving Facility (SRF) must allow handling and processing of the Mars samples in a way that prevents their terrestrial contamination while maintaining strict biological containment. It is anticipated that samples can be shipped among appropriate containment facilities when necessary, under procedures developed in cooperation with appropriate international institutions. The SRF will need to provide different types of laboratory environments for carrying out, beyond sample description and curation, the various aspects of the protocol: physical/chemical analysis, life detection testing, and biohazard testing. The main principles of these tests will be described, and the criteria for release will be discussed, as well as the requirements for the SRF and its personnel.

  6. Toward Lower Organic Environments in Astromaterial Sample Curation for Diverse Collections

    NASA Technical Reports Server (NTRS)

    Allton, J. H.; Allen, C. C.; Burkett, P. J.; Calaway, M. J.; Oehler, D. Z.

    2012-01-01

    Great interest was taken during the frenzied pace of the Apollo lunar sample returns in achieving and monitoring organic cleanliness. Yet the first mission resulted in higher organic contamination of samples than desired; improvements were accomplished by Apollo 12 [1]. Quarantine complicated the goal of achieving organic cleanliness by requiring negative-pressure glovebox containment environments, proximity of animal, plant, and microbial organic sources, and use of organic sterilants in protocols. A special low-organic laboratory was set up at the University of California Berkeley (UCB) to cleanly subdivide a subset of samples [2, 3, 4]. Nevertheless, the basic approach of handling rocks and regolith inside a positive-pressure stainless steel glovebox, and restricting the tool and container materials allowed in the gloveboxes, was established by the last Apollo sample return. In the last 40 years, the collections have grown to encompass Antarctic meteorites, Cosmic Dust, Genesis solar wind, Stardust comet grains, and Hayabusa asteroid grains. Each of these collections has unique curation requirements for organic contamination monitoring and control. Described here are some changes enabled by improved technology or driven by changes in environmental regulations and economics, concluding with comments on organic witness wafers. Future sample return missions (OSIRIS-REx; Mars; comets) will require extremely low levels of organic contamination during spacecraft collection and thus similarly low levels in curation. JSC Curation is undertaking a program to document organic baseline levels in current operations and devise ways to reduce those levels.

  7. Trustworthy Digital Repositories: Building Trust the Old Fashion Way, EARNING IT.

    NASA Astrophysics Data System (ADS)

    Kinkade, D.; Chandler, C. L.; Shepherd, A.; Rauch, S.; Groman, R. C.; Wiebe, P. H.; Glover, D. M.; Allison, M. D.; Copley, N. J.; Ake, H.; York, A.

    2016-12-01

    There are several drivers increasing the importance of high-quality data management and curation in today's research process (e.g., the OSTP PARR memo, journal publishers, funders, academic and private institutions), and proper management is necessary throughout the data lifecycle to enable reuse and reproducibility of results. Many digital data repositories are capable of satisfying the basic management needs of an investigator looking to share their data (i.e., publish data in the public domain), but repository services vary greatly, and not all provide mature services that facilitate discovery, access, and reuse of research data. Domain-specific repositories play a vital role in the data curation process by working closely with investigators to create robust metadata, perform first-order QC, and assemble and publish research data. In addition, they may employ technologies and services that enable increased discovery, access, and long-term archiving. However, smaller domain facilities operate in varying states of capacity and curation ability. Within this repository environment, individual investigators (driven by publishers, funders, or institutions) need to find trustworthy repositories for their data, and funders need to direct investigators to quality repositories to ensure return on their investment. So how can one determine the best home for valuable research data? Metrics can be applied to varying aspects of data curation, and many credentialing organizations offer services that assess and certify the trustworthiness of a given data management facility. Unfortunately, many of these certifications can be inaccessible to a small repository in cost, time, or scope. Are there alternatives? This presentation will discuss methods and approaches used by the Biological and Chemical Oceanography Data Management Office (BCO-DMO), a domain-specific, intermediate digital data repository, to demonstrate trustworthiness in the face of a daunting accreditation landscape.

  8. High-Resolution Imaged-Based 3D Reconstruction Combined with X-Ray CT Data Enables Comprehensive Non-Destructive Documentation and Targeted Research of Astromaterials

    NASA Technical Reports Server (NTRS)

    Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K.; Zeigler, R. A.; Righter, K.; Hanna, R. D.; Ketcham, R. A.

    2014-01-01

    Providing web-based data of complex and sensitive astromaterials (including meteorites and lunar samples) in novel formats enhances existing preliminary examination data on these samples and supports targeted sample requests and analyses. We have developed and tested a rigorous protocol for collecting highly detailed imagery of meteorites and complex lunar samples in non-contaminating environments. These data are reduced to create interactive 3D models of the samples. We intend to provide these data as they are acquired on NASA's Astromaterials Acquisition and Curation website at http://curator.jsc.nasa.gov/.

  9. EXTRACT: interactive extraction of environment metadata and term suggestion for metagenomic sample annotation.

    PubMed

    Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra; Pereira, Emiliano; Schnetzer, Julia; Arvanitidis, Christos; Jensen, Lars Juhl

    2016-01-01

    The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, manual sample annotation is a highly labor-intensive process and requires familiarity with the terminologies used. We have therefore developed an interactive annotation tool, EXTRACT, which helps curators identify and extract standard-compliant terms for annotation of metagenomic records and other samples. Behind its web-based user interface, the system combines published methods for named entity recognition of environment, organism, tissue, and disease terms. The evaluators in the BioCreative V Interactive Annotation Task found the system to be intuitive, useful, well documented, and sufficiently accurate to be helpful in spotting relevant text passages and extracting organism and environment terms. Comparison of fully manual and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15-25% and helps curators to detect terms that would otherwise have been missed. Database URL: https://extract.hcmr.gr/. © The Author(s) 2016. Published by Oxford University Press.
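    The core mechanism here, named entity recognition feeding curator-reviewed term suggestions, can be sketched compactly. The mini-lexicon below, including its ontology identifiers, is entirely illustrative; EXTRACT's real tagger, vocabularies, and API are not detailed in the abstract.

    ```python
    # Illustrative sketch of text-mining-assisted annotation in the EXTRACT
    # style: match known surface forms, suggest ontology terms, and leave the
    # final accept/reject decision to the curator. The lexicon and its
    # ontology IDs are placeholders, not EXTRACT's actual vocabularies.
    SAMPLE_DESCRIPTION = ("Sediment core from a deep-sea hydrothermal vent, "
                          "enriched for sulfate-reducing bacteria.")

    LEXICON = {  # surface form -> (placeholder ontology ID, entity type)
        "hydrothermal vent": ("ENVO:PLACEHOLDER-1", "environment"),
        "sediment": ("ENVO:PLACEHOLDER-2", "environment"),
        "sulfate-reducing bacteria": ("TAXON:PLACEHOLDER-3", "organism"),
    }

    def suggest_terms(text: str):
        """Yield (surface form, term ID, entity type) suggestions found in text."""
        lowered = text.lower()
        for surface, (term_id, entity_type) in LEXICON.items():
            if surface in lowered:
                yield surface, term_id, entity_type

    for suggestion in suggest_terms(SAMPLE_DESCRIPTION):
        print(suggestion)  # the curator reviews each suggestion before accepting
    ```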

  10. EXTRACT: Interactive extraction of environment metadata and term suggestion for metagenomic sample annotation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra

    The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, manual sample annotation is a highly labor-intensive process and requires familiarity with the terminologies used. We have therefore developed an interactive annotation tool, EXTRACT, which helps curators identify and extract standard-compliant terms for annotation of metagenomic records and other samples. Behind its web-based user interface, the system combines published methods for named entity recognition of environment, organism, tissue and disease terms. The evaluators in the BioCreative V Interactive Annotation Task found the system to be intuitive, useful, well documented and sufficiently accurate to be helpful in spotting relevant text passages and extracting organism and environment terms. Here the comparison of fully manual and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15–25% and helps curators to detect terms that would otherwise have been missed.

  11. EXTRACT: Interactive extraction of environment metadata and term suggestion for metagenomic sample annotation

    DOE PAGES

    Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra; ...

    2016-01-01

    The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, manual sample annotation is a highly labor-intensive process and requires familiarity with the terminologies used. We have therefore developed an interactive annotation tool, EXTRACT, which helps curators identify and extract standard-compliant terms for annotation of metagenomic records and other samples. Behind its web-based user interface, the system combines published methods for named entity recognition of environment, organism, tissue and disease terms. The evaluators in the BioCreative V Interactive Annotation Task found the system to be intuitive, useful, well documented and sufficiently accurate to be helpful in spotting relevant text passages and extracting organism and environment terms. Here the comparison of fully manual and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15–25% and helps curators to detect terms that would otherwise have been missed.

  12. Lunar Rocks: Available for Year of the Solar System Events

    NASA Astrophysics Data System (ADS)

    Allen, J. S.

    2010-12-01

    NASA is actively exploring the Moon with the Lunar Reconnaissance Orbiter, the GRAIL Discovery mission will launch next year, and the annual International Observe the Moon Night provides many events and lunar-science-focused opportunities to share rocks from the Moon with students and the public. In our laboratories, we have Apollo rocks and soil from six different places on the Moon, and their continued study provides incredibly valuable ground truth to complement space exploration missions. Extensive information and actual lunar samples are available for public display and education. The Johnson Space Center (JSC) has the unique responsibility to curate NASA's extraterrestrial samples from past and future missions. Curation includes documentation, preservation, preparation, and distribution of samples for research, education, and public outreach. The lunar rocks and soils continue to be studied intensively by scientists around the world. Descriptions of the samples, research results, thousands of photographs, and information on how to request research samples are on the JSC Curation website: http://curator.jsc.nasa.gov/ NASA is eager for scientists and the public to have access to these exciting Apollo samples through our various loan procedures. NASA provides a limited number of Moon rock samples for either short-term or long-term displays at museums, planetariums, expositions, and professional events that are open to the public. The JSC Public Affairs Office handles requests for such display samples. Requestors should apply in writing to Mr. Louis Parker, JSC Exhibits Manager. Mr. Parker will advise successful applicants regarding provisions for receipt, display, and return of the samples. All loans will be preceded by a signed loan agreement executed between NASA and the requestor's organization. Email address: louis.a.parker@nasa.gov Sets of twelve thin sections of Apollo lunar samples are available for short-term loan from JSC Curation. The thin sections may be requested for use in college and university courses where petrographic microscopes are available for viewing. Requestors should contact Ms. Mary Luckey, Education Sample Curator. Email address: mary.k.luckey@nasa.gov NASA also loans sets of Moon rocks for use in classrooms, libraries, museums, and planetariums through the Lunar Sample Education Program. Lunar samples (three soils and three rocks) are encapsulated in a six-inch-diameter clear plastic disk. A CD with PowerPoint presentations, analogue samples from Earth, a classroom activity guide, and additional printed material accompany the disks. Educators may qualify for the use of these disks by attending a content and security certification workshop sponsored by NASA's Aerospace Education Services Program (AESP). Contact Ms. Margaret Maher, AESP Director. Email address: mjm67@psu.edu NASA makes these precious samples available for the public and encourages the use of lunar rocks to highlight Year of the Solar System events. Surely these interesting specimens of another world will enhance the experience of all YSS participants, so please take advantage of these lunar samples and borrow them for events and classes.

  13. GeoLab: A Geological Workstation for Future Missions

    NASA Technical Reports Server (NTRS)

    Evans, Cynthia; Calaway, Michael; Bell, Mary Sue; Li, Zheng; Tong, Shuo; Zhong, Ye; Dahiwala, Ravi

    2014-01-01

    The GeoLab glovebox was, until November 2012, fully integrated into NASA's Deep Space Habitat (DSH) Analog Testbed. The conceptual design for GeoLab came from several sources, including current research instruments (Microgravity Science Glovebox) used on the International Space Station, existing Astromaterials Curation Laboratory hardware and clean room procedures, and mission scenarios developed for earlier programs. GeoLab allowed NASA scientists to test science operations related to contained sample examination during simulated exploration missions. The team demonstrated science operations that enhance the early scientific returns from future missions and ensure that the best samples are selected for Earth return. The facility was also designed to foster the development of instrument technology. Since 2009, when GeoLab design and construction began, the GeoLab team [a group of scientists from the Astromaterials Acquisition and Curation Office within the Astromaterials Research and Exploration Science (ARES) Directorate at JSC] has progressively developed and reconfigured the GeoLab hardware and software interfaces and developed test objectives, which were to 1) determine requirements and strategies for sample handling and prioritization for geological operations on other planetary surfaces, 2) assess the scientific contribution of selective in-situ sample characterization for mission planning, operations, and sample prioritization, 3) evaluate analytical instruments and tools for providing efficient and meaningful data in advance of sample return, and 4) identify science operations that leverage human presence with robotic tools. In the first year of tests (2010), GeoLab examined basic glovebox operations performed by one and two crewmembers and science operations performed by a remote science team. The 2010 tests also examined the efficacy of basic sample characterization [descriptions, microscopic imagery, X-ray fluorescence (XRF) analyses] and feedback to the science team. In year 2 (2011), the GeoLab team tested enhanced software and interfaces for the crew and science team (including web-based and mobile device displays) and demonstrated laboratory configurability with a new diagnostic instrument (the Multispectral Microscopic Imager from JPL and Arizona State University). In year 3 (2012), the GeoLab team installed and tested a robotic sample manipulator and evaluated robotic-human interfaces for science operations.

  14. Lunar Reference Suite to Support Instrument Development and Testing

    NASA Technical Reports Server (NTRS)

    Allen, Carlton; Sellar, Glenn; Nunez, Jorge I.; Winterhalter, Daniel; Farmer, Jack

    2010-01-01

    Astronauts on long-duration lunar missions will need the capability to "high-grade" their samples - to select the highest-value samples for transport to Earth and to leave others on the Moon. Instruments that may be useful for such high-grading are under development. Instruments are also being developed for possible use on future lunar robotic landers, for lunar field work, and for more sophisticated analyses at a lunar outpost. The Johnson Space Center Astromaterials Acquisition and Curation Office (JSC Curation) will support such instrument testing by providing lunar sample "ground truth".

  15. Pieces of Other Worlds - Enhance YSS Education and Public Outreach Events with Extraterrestrial Samples

    NASA Astrophysics Data System (ADS)

    Allen, C.

    2010-12-01

    During the Year of the Solar System, spacecraft will encounter two comets and orbit the asteroid Vesta, rovers will continue to explore Mars, and robotic explorers will launch to the Moon and Mars. We have pieces of all these worlds in our laboratories. Extensive information about these unique materials, as well as actual lunar samples and meteorites, is available for display and education. The Johnson Space Center (JSC) curates NASA's extraterrestrial samples to support research, education, and public outreach. At the current time JSC curates five types of extraterrestrial samples: Moon rocks and soils collected by the Apollo astronauts; meteorites collected on US expeditions to Antarctica (including rocks from the Moon, Mars, and many asteroids including Vesta); "cosmic dust" (asteroid and comet particles) collected by high-altitude aircraft; solar wind atoms collected by the Genesis spacecraft; and comet and interstellar dust particles collected by the Stardust spacecraft. These rocks, soils, dust particles, and atoms continue to be studied intensively by scientists around the world. Descriptions of the samples, research results, thousands of photographs, and information on how to request research samples are on the JSC Curation website: http://curator.jsc.nasa.gov/ NASA is eager for scientists and the public to have access to these exciting samples through our various loan procedures. NASA provides a limited number of Moon rock samples for either short-term or long-term displays at museums, planetariums, expositions, and professional events that are open to the public. The JSC Public Affairs Office handles requests for such display samples. Requestors should apply in writing to Mr. Louis Parker, JSC Exhibits Manager. He will advise successful applicants regarding provisions for receipt, display, and return of the samples. All loans will be preceded by a signed loan agreement executed between NASA and the requestor's organization. Email address: louis.a.parker@nasa.gov Sets of twelve thin sections of Apollo lunar samples and sets of twelve thin sections of meteorites are available for short-term loan from JSC Curation. The thin sections are designed for use in college and university courses where petrographic microscopes are available for viewing. Requestors should contact Ms. Mary Luckey, Education Sample Curator. Email address: mary.k.luckey@nasa.gov NASA also loans sets of Moon rocks and meteorites for use in classrooms, libraries, museums, and planetariums. Lunar samples (three soils and three rocks) are encapsulated in a six-inch-diameter clear plastic disk. Disks containing six different samples of meteorites are also available. A CD with PowerPoint presentations, a classroom activity guide, and additional printed material accompany the disks. Educators may qualify for the use of these disks by attending a security certification workshop sponsored by NASA's Aerospace Education Services Program (AESP). Contact Ms. Margaret Maher, AESP Director. Email address: mjm67@psu.edu Please take advantage of the wealth of data and the samples that we have from an exciting variety of solar system bodies.

  16. An Institutional Community-Driven effort to Curate and Preserve Geospatial Data using GeoBlacklight

    NASA Astrophysics Data System (ADS)

    Petters, J.; Coleman, S.; Andrea, O.

    2016-12-01

    A variety of geospatial data is produced or collected by both academic researchers and non-academic groups in the Virginia Tech community. In an effort to preserve, curate, and make this geospatial data discoverable, the University Libraries have been building a local implementation of GeoBlacklight, a multi-institutional open-source collaborative project to improve the discoverability and sharing of geospatial data. We will discuss the local implementation of GeoBlacklight at Virginia Tech, focusing on the efforts necessary to make it a sustainable resource for the institution and local community going forward. This includes technical challenges, such as the development of uniform workflows for geospatial data produced within and outside the course of research, but organizational and economic barriers must be overcome as well. In spearheading this GeoBlacklight effort, the Libraries have partnered with University Facilities and University IT. The IT group manages the storage and backup of geospatial data, allowing our group to focus on geospatial data collection and curation. Both IT and University Facilities are in possession of localized geospatial data of interest to Virginia Tech researchers that all parties agreed should be made discoverable and accessible. The interest and involvement of these and other university stakeholders is key to establishing the sustainability of the infrastructure and the capabilities it can provide to the Virginia Tech community and beyond.

  17. Lunar and Meteorite Sample Disk for Educators

    NASA Technical Reports Server (NTRS)

    Foxworth, Suzanne; Luckey, M.; McInturff, B.; Allen, J.; Kascak, A.

    2015-01-01

    NASA Johnson Space Center (JSC) has the unique responsibility to curate NASA's extraterrestrial samples from past and future missions. Curation includes documentation, preservation, preparation, and distribution of samples for research, education, and public outreach. Between 1969 and 1972, six Apollo missions brought back 382 kilograms of lunar rocks, core, and regolith samples from the lunar surface. JSC also curates meteorites collected through a US cooperative effort among NASA, the National Science Foundation (NSF), and the Smithsonian Institution that funds expeditions to Antarctica. The meteorites collected include rocks from the Moon, Mars, and many asteroids, including Vesta. The sample disks for educational use include these different samples. Active, relevant learning has always been important to teachers, and the Lunar and Meteorite Sample Disk Program provides this active style of learning for students and the general public. The Lunar and Meteorite Sample Disks permit students to conduct investigations comparable to those of actual scientists. The Lunar Sample Disk contains six samples: basalt, breccia, highland regolith, anorthosite, mare regolith, and orange soil. The Meteorite Sample Disk contains six samples: chondrite L3, chondrite H5, carbonaceous chondrite, basaltic achondrite, iron, and stony-iron. With the disks, teachers are given activities that adhere to their education standards. During a Sample Disk Certification Workshop, teachers participate in the activities as students do, gaining insight into the history, formation, and geologic processes of the Moon, asteroids, and meteorites.

  18. Geosamples.org: Shared Cyberinfrastructure for Geoscience Samples

    NASA Astrophysics Data System (ADS)

    Lehnert, Kerstin; Allison, Lee; Arctur, David; Klump, Jens; Lenhardt, Christopher

    2014-05-01

    Many scientific domains, specifically in the geosciences, rely on physical samples as basic elements for study and experimentation. Samples are collected to analyze properties of natural materials and features that are key to our knowledge of Earth's dynamical systems and evolution, and to preserve a record of our environment over time. Huge volumes of samples have been acquired over decades or even centuries and stored in a large number and variety of institutions, including museums, universities and colleges, state geological surveys, federal agencies, and industry. All of these collections represent highly valuable, often irreplaceable records of nature that need to be accessible so that they can be re-used in future research and for educational purposes. Many sample repositories are keen to use cyberinfrastructure capabilities to enhance access to their collections on the internet and to support and streamline collection management (accessioning of new samples, labeling, handling sample requests, etc.), but they encounter substantial challenges and barriers to integrating digital sample management into their daily routine. They lack the resources (staff, funding) and infrastructure (hardware, software, IT support) to develop and operate web-enabled databases, to migrate analog sample records into digital data management systems, and to transfer paper- or spreadsheet-based workflows to electronic systems. Use of commercial software is often not an option, as it incurs high license costs, requires IT expertise for installation and maintenance, and often does not match the needs of smaller repositories, being designed for large museums or for different types of collections (art, archeological, biological). Geosamples.org is an alliance of sample repositories (academic, US federal and state surveys, industry) and data facilities that aims to develop a cyberinfrastructure that will dramatically advance access to physical samples for the research community, government agencies, students, educators, and the general public, while supporting, simplifying, and standardizing the work of curators in repositories, museums, and universities, and even of individual investigators who manage personal or project-based sample collections in their labs. Geosamples.org builds upon best practices and cyberinfrastructure for sample identification, registration, and documentation developed by the IGSN e.V., the international organization that governs the International Geosample Number, a persistent unique identifier for physical samples. Geosamples.org will develop a Digital Environment for Sample Curation (DESC) that will facilitate the creation, identification, and registration of 'virtual samples' and network them into an 'Internet of Samples', allowing users to discover, access, and track online physical samples, the data derived from their study, and the publications that contain those data. DESC will provide easy-to-use software tools for curators to maintain digital catalogs of their collections, to provide online access to those catalogs for searching and requesting samples, to manage sample requests and users, and to track collection usage and impact. Geosamples.org will also work toward joint practices for the recognition of intellectual property, build mechanisms for sustainable business models that support the continuing maintenance and evolution of sample resources, and integrate the sample management life-cycle into the professional and cultural practice of science.
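    The 'virtual sample' networking that DESC describes amounts to a parent-child catalog keyed by IGSN. A minimal sketch of that structure follows; the class and method names are invented for illustration (the hole IGSN is reused from the COSC record above, and the child IGSNs are placeholders).

    ```python
    # Illustrative sketch only: DESC's actual data model is not specified in
    # the abstract. This shows the IGSN-keyed parent-child structure that an
    # 'Internet of Samples' catalog implies. Class and method names are
    # invented; child IGSNs below are placeholders.
    from dataclasses import dataclass, field

    @dataclass
    class VirtualSample:
        igsn: str
        kind: str                               # e.g. "hole", "core run", "sample"
        children: list = field(default_factory=list)

        def register_child(self, igsn: str, kind: str) -> "VirtualSample":
            """Create a child sample, preserving the parent-child hierarchy."""
            child = VirtualSample(igsn, kind)
            self.children.append(child)
            return child

    hole = VirtualSample("ICDP5054EEW1001", "hole")            # COSC-1 hole A
    run = hole.register_child("ICDP5054XXXXXXX", "core run")   # placeholder IGSN
    run.register_child("ICDP5054YYYYYYY", "sample")            # placeholder IGSN
    ```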

  19. Removing user fees for basic health services: a pilot study and national roll-out in Afghanistan

    PubMed Central

    Steinhardt, Laura C; Aman, Iqbal; Pakzad, Iqbalshah; Kumar, Binay; Singh, Lakhwinder P; Peters, David H

    2011-01-01

    Background User fees for primary care tend to suppress utilization, and many countries are experimenting with fee removal. Studies show that additional inputs are needed after removing fees, although well-documented experiences are lacking. This study presents data on the effects of fee removal on facility quality and utilization in Afghanistan, based on a pilot experiment and a subsequent nationwide ban on fees. Methods Data on utilization and on observed structural and perceived overall quality of health care were compared from before-and-after facility assessments, patient exit interviews, and catchment-area household surveys at eight facilities where fees were removed and 14 facilities where fee levels remained constant, as part of a larger health financing pilot study from 2005 to 2007. After a national user-fee ban was instituted in 2008, health facility administrative data were analysed to assess subsequent changes in utilization and quality. Results The pilot study analysis indicated that observed and perceived quality increased across facilities but did not differ by fee-removal status. Difference-in-difference analysis showed that utilization at facilities that had previously charged both service and drug fees increased by 400% more after fee removal, prompting additional inputs from service providers, than at facilities that had previously charged only service fees or had no change in fees (P = 0.001). Following the national fee ban, visits for curative care increased significantly (P < 0.001), but institutional deliveries did not. Services typically free before the ban (immunization and antenatal care) had immediate increases in utilization, but these were not sustained. Conclusion Both pilot and nationwide data indicated that curative care utilization increased following fee removal, without differential changes in quality. Concerns raised by non-governmental organizations, health workers, and community leaders over the effects of lost revenue and increased utilization require continued effort to raise revenues, monitor health worker and patient perceptions, and carefully manage health facility performance. PMID:22027924
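    For readers outside health economics, the difference-in-difference comparison used in the Results reduces to a single contrast of before/after means across the two groups. The notation below is introduced here for illustration and is not taken from the study.

    ```latex
    % Difference-in-differences estimator; \bar{Y} denotes mean utilization.
    % Notation introduced for illustration, not taken from the study.
    \widehat{DD} =
      \left( \bar{Y}^{\text{fee-removed}}_{\text{after}}
           - \bar{Y}^{\text{fee-removed}}_{\text{before}} \right)
      -
      \left( \bar{Y}^{\text{comparison}}_{\text{after}}
           - \bar{Y}^{\text{comparison}}_{\text{before}} \right)
    ```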

  20. 45 CFR 674.5 - Requirements for collection, handling, documentation, and curation of Antarctic meteorites.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...; and (v) Thawing in a clean, dry, non-reactive gas environment, such as nitrogen or argon. (2) Sample ... (45 CFR § 674.5, Requirements for collection, handling, documentation, and curation of Antarctic meteorites; Public Welfare, National Science Foundation, Antarctic Meteorites.)

  1. 45 CFR 674.5 - Requirements for collection, handling, documentation, and curation of Antarctic meteorites.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...; and (v) Thawing in a clean, dry, non-reactive gas environment, such as nitrogen or argon. (2) Sample ... (45 CFR § 674.5, Requirements for collection, handling, documentation, and curation of Antarctic meteorites; Public Welfare, National Science Foundation, Antarctic Meteorites.)

  2. 45 CFR 674.5 - Requirements for collection, handling, documentation, and curation of Antarctic meteorites.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...; and (v) Thawing in a clean, dry, non-reactive gas environment, such as nitrogen or argon. (2) Sample ... (45 CFR § 674.5, Requirements for collection, handling, documentation, and curation of Antarctic meteorites; Public Welfare, National Science Foundation, Antarctic Meteorites.)

  3. 45 CFR 674.5 - Requirements for collection, handling, documentation, and curation of Antarctic meteorites.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...; and (v) Thawing in a clean, dry, non-reactive gas environment, such as nitrogen or argon. (2) Sample ... (45 CFR § 674.5, Requirements for collection, handling, documentation, and curation of Antarctic meteorites; Public Welfare, National Science Foundation, Antarctic Meteorites.)

  4. The BioCyc collection of microbial genomes and metabolic pathways.

    PubMed

    Karp, Peter D; Billington, Richard; Caspi, Ron; Fulcher, Carol A; Latendresse, Mario; Kothari, Anamika; Keseler, Ingrid M; Krummenacker, Markus; Midford, Peter E; Ong, Quang; Ong, Wai Kit; Paley, Suzanne M; Subhraveti, Pallavi

    2017-08-17

    BioCyc.org is a microbial genome web portal that combines thousands of genomes with additional information inferred by computer programs, imported from other databases, and curated from the biomedical literature by biologist curators. BioCyc also provides an extensive range of query tools, visualization services, and analysis software. Recent advances in BioCyc include an expansion of BioCyc content, in terms of both the number of genomes and the types of information available for each genome; an expansion of the curated content within BioCyc; and new developments in the BioCyc software tools, including redesigned gene/protein pages and metabolite pages; new search tools; a new sequence-alignment tool; a new tool for visualizing groups of related metabolic pathways; and a facility called SmartTables, which enables biologists to perform analyses that previously would have required a programmer's assistance. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Distilling Design Patterns From Agile Curation Case Studies

    NASA Astrophysics Data System (ADS)

    Benedict, K. K.; Lenhardt, W. C.; Young, J. W.

    2016-12-01

    In previous work the authors have argued that there is a need to take a new look at the data management lifecycle. Our core argument is that the data management lifecycle needs, in essence, to be deconstructed and rebuilt. As part of this process, we also argue that much can be gained from applying ideas, concepts, and principles from agile software development methods. To be sure, we are not arguing for a rote application of these agile software approaches; however, given various trends related to data and technology, it is imperative to update our thinking about how to approach the data management lifecycle, recognize differing project scales and corresponding variations in structure, and consider alternative models for solving the problems of scientific data curation. In this paper we will describe what we term agile curation design patterns, borrowing the concept of design patterns from the software world, and present some initial thoughts on these patterns as informed by a sample of data curation case studies solicited from participants in agile data curation meeting sessions conducted in 2015-16.

  6. If we build it, will they come? Curation and use of the ESO telescope bibliography

    NASA Astrophysics Data System (ADS)

    Grothkopf, Uta; Meakins, Silvia; Bordelon, Dominic

    2015-12-01

    The ESO Telescope Bibliography (telbib) is a database of refereed papers published by the ESO users community. It links data in the ESO Science Archive with the published literature, and vice versa. Developed and maintained by the ESO library, telbib also provides insights into the organization's research output and impact as measured through bibliometric studies. Curating telbib is a multi-step process that involves extensive tagging of the database records. Based on selected use cases, this talk will explain how the rich metadata provide parameters for reports and statistics in order to investigate the performance of ESO's facilities and to understand trends and developments in the publishing behaviour of the user community.

  7. Socioeconomic and family influences on dental treatment needs among Brazilian underprivileged schoolchildren participating in a dental health program

    PubMed Central

    2013-01-01

    Background The objective of this study was to compare the socioeconomic and family characteristics of underprivileged schoolchildren with and without curative dental needs participating in a dental health program. Methods A random sample of 1411 8- to 10-year-old Brazilian schoolchildren was examined, and two sample groups were included in the cross-sectional study: 544 children with curative dental needs and 867 without. The schoolchildren were examined for the presence of caries lesions using the DMFT index, and their parents were asked to answer questions about socioenvironmental characteristics of their families. Logistic regression models were fitted, estimating odds ratios (OR), their 95% confidence intervals (CI), and significance levels. Results After adjusting for potential confounders, it was found that family income above one Brazilian minimum wage, fewer than four residents in the house, living in a home owned by the family, and living with both biological parents were protective factors against the presence of dental caries and, consequently, curative dental needs. Conclusions Socioeconomic status and family structure influence the curative dental needs of children from underprivileged communities. In this sense, dental health programs should plan and implement strategic efforts to reduce inequities in the oral health status of vulnerable schoolchildren and their families and in their access to oral health services. PMID:24138683

  8. Sustainable data policy for a data production facility: a work in (continual) progress

    NASA Astrophysics Data System (ADS)

    Ketcham, R. A.

    2017-12-01

    The University of Texas High-Resolution X-Ray Computed Tomography Facility (UTCT) has been producing volumetric data and data products of geological and other scientific specimens and engineering materials for over 20 years. Data volumes, both in terms of the size of individual data sets and overall facility production, have progressively grown, fluctuating near the upper boundary of what can be managed by contemporary workstations and lab-scale servers and network infrastructure, making data policy a preoccupation for our entire history. Although all projects have been archived since our first day of operation, policies on which data to keep (raw; reconstructed after corrections; processed) have varied and been periodically revisited in light of the cost of curation and the likelihood of revisiting and reprocessing data when better techniques become available, such as improved artifact corrections or iterative tomographic reconstruction. Advances in instrumentation regularly make old data obsolete and more advantageous to reacquire, but the simple act of getting a sample to a scanning facility is a practical barrier that cannot be overlooked. In our experience, the main times that raw data have been revisited using improved processing to improve image quality were predictable, high-impact, charismatic projects (e.g., Archaeopteryx, A. afarensis "Lucy"). These cases actually provided the impetus for development of the new techniques (ring and beam-hardening artifact reduction), which were subsequently incorporated into our data processing pipeline going forward but were rarely if ever retroactively applied to earlier data sets. The only other times raw data have been reprocessed were when reconstruction parameters were inappropriate, owing to unnoticed sample features or human error, which is usually recognized fairly quickly. The optimal data retention policy thus remains an open question, although erring on the side of caution remains the default position.

  9. The NASA Ames Research Center Institutional Scientific Collection: History, Best Practices and Scientific Opportunities

    NASA Technical Reports Server (NTRS)

    Rask, Jon C.; Chakravarty, Kaushik; French, Alison; Choi, Sungshin; Stewart, Helen

    2017-01-01

    The NASA Ames Life Sciences Institutional Scientific Collection (ISC), which is composed of the Ames Life Sciences Data Archive (ALSDA) and the Biospecimen Storage Facility (BSF), is managed by the Space Biosciences Division and has been operational since 1993. The ALSDA is responsible for archiving information and animal biospecimens collected from life science spaceflight experiments and matching ground control experiments. Both fixed and frozen spaceflight and ground tissues are stored in the BSF within the ISC. The ALSDA also manages a Biospecimen Sharing Program, performs curation and long-term storage operations, and makes biospecimens available to the scientific community for research purposes via the Life Science Data Archive public website (https://lsda.jsc.nasa.gov). As part of our best practices, a viability testing plan has been developed for the ISC to assess the quality of archived samples. We expect that results from the viability testing will catalyze sample use, enable broader science community interest, and improve the operational efficiency of the ISC. The current viability test plan focuses on generating disposition recommendations and is based on using ribonucleic acid (RNA) integrity number (RIN) scores as a criterion for measuring biospecimen viability for downstream functional analysis. The plan includes (1) sorting and identification of candidate samples, (2) conducting a statistically based power analysis to generate representative cohorts from the population of stored biospecimens, (3) completion of RIN analysis on selected samples, and (4) development of disposition recommendations based on the RIN scores. Results of this work will also support NASA open science initiatives and guide development of the NASA Scientific Collections Directive (a policy on best practices for curation of biological collections). Our RIN-based methodology for characterizing the quality of tissues stored in the ISC since the 1980s also creates unique scientific opportunities for temporal assessment across historical missions. Support from the NASA Space Biology Program and the NASA Human Research Program is gratefully acknowledged.
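    Step (2) is a standard sample-size calculation. A minimal sketch using statsmodels follows; the effect size, alpha, and power values are illustrative defaults, since the plan's actual parameters are not given in the abstract.

    ```python
    # Minimal sketch of the power analysis implied by step (2), using
    # statsmodels. Parameter values are illustrative, not the plan's.
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    n_per_cohort = analysis.solve_power(
        effect_size=0.5,   # assumed medium effect on RIN scores (Cohen's d)
        alpha=0.05,        # significance level
        power=0.8,         # desired statistical power
    )
    print(f"samples needed per cohort: {n_per_cohort:.0f}")
    ```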

  10. Lunar Sample Quarantine & Sample Curation

    NASA Technical Reports Server (NTRS)

    Allton, Judith H.

    2000-01-01

    The main goal of this presentation is to discuss the responsibilities of the lunar sample quarantine project: flying the mission safely and on schedule, protecting the Earth from biohazards, and preserving the scientific integrity of the samples.

  11. Solar System Samples for Research, Education, and Public Outreach

    NASA Technical Reports Server (NTRS)

    Allen, J.; Luckey, M.; McInturff, B.; Kascak, A.; Tobola, K.; Galindo, C.; Allen, C.

    2011-01-01

    In the next two years, during the NASA Year of the Solar System, spacecraft from NASA and our international partners will encounter a comet, orbit asteroid 4 Vesta, continue to explore Mars with rovers, and launch robotic explorers to the Moon and Mars. We have pieces of all these worlds in our laboratories, and their continued study provides incredibly valuable "ground truth" to complement space exploration missions. Extensive information about these unique materials, as well as actual lunar samples and meteorites, is available for display and education. The Johnson Space Center (JSC) has the unique responsibility to curate NASA's extraterrestrial samples from past and future missions. Curation includes documentation, preservation, preparation, and distribution of samples for research, education, and public outreach.

  12. ARES Biennial Report 2012 Final

    NASA Technical Reports Server (NTRS)

    Stansbery, Eileen

    2014-01-01

    Since the return of the first lunar samples, what is now the Astromaterials Research and Exploration Science (ARES) Directorate has had curatorial responsibility for all NASA-held extraterrestrial materials. Originating during the Apollo Program (1960s), this capability at Johnson Space Center (JSC) included scientists who were responsible for the science planning and training of astronauts for lunar surface activities as well as experts in the analysis and preservation of the precious returned samples. Today, ARES conducts research in basic and applied space and planetary science, and its scientific staff represents a broad diversity of expertise in the physical sciences (physics, chemistry, geology, astronomy), mathematics, and engineering, organized into three offices (figure 1): Astromaterials Research (KR), Astromaterials Acquisition and Curation (KT), and Human Exploration Science (KX). Scientists within the Astromaterials Acquisition and Curation Office preserve, protect, document, and distribute samples of the current astromaterials collections. Since the return of the first lunar samples, ARES has been assigned curatorial responsibility for all NASA-held extraterrestrial materials (Apollo lunar samples; Antarctic meteorites, some of which have been confirmed to have originated on the Moon and on Mars; cosmic dust; solar wind samples; comet and interstellar dust particles; and space-exposed hardware). The responsibilities of curation consist not only of the long-term care of the samples, but also of support and planning for future sample collection missions, and of research and technology to enable new sample types. Curation provides the foundation for research into the samples. The Lunar Sample Facility and other curation clean rooms, the data center, laboratories, and associated instrumentation are unique NASA resources that, together with our staff's fundamental understanding of the entire collection, provide a service to the external research community, which relies on access to the samples. The curation efforts are greatly enhanced by a strong group of planetary scientists who conduct peer-reviewed astromaterials research. Astromaterials Research Office scientists conduct peer-reviewed research as Principal or Co-Investigators in planetary science (e.g., cosmochemistry, origins of solar systems, Mars fundamental research, planetary geology and geophysics) and participate as Co-Investigators or Participating Scientists in many of NASA's robotic planetary missions. Since the last report, ARES has achieved several noteworthy milestones, some of which are documented in detail in the sections that follow. Within the Human Exploration Science Office, ARES is a world leader in orbital debris research, modeling and monitoring the debris environment, designing debris shielding, and developing policy to control and mitigate the orbital debris population. ARES has aggressively pursued refinements in knowledge of the debris environment and the hazard it presents to spacecraft. Additionally, the ARES Image Science and Analysis Group has been recognized as world class as a result of the high quality of its near-real-time analysis of ascent and on-orbit inspection imagery to identify debris shedding, anomalies, and associated potential damage during Space Shuttle missions. ARES Earth scientists manage and continuously update the database of astronaut photography that is predominantly from Shuttle and ISS missions but also includes the results of 40 years of human spaceflight.
The Crew Earth Observations Web site (http://eol.jsc.nasa.gov/Education/ESS/crew.htm) continues to receive several million hits per month. ARES scientists are also influencing decisions in the development of the next generation of human and robotic spacecraft and missions through laboratory tests on the optical qualities of materials for windows, micrometeoroid/orbital debris shielding technology, and analog activities to assess surface science operations. ARES serves as host to numerous students and visiting scientists as part of the services provided to the research community and conducts a robust education and outreach program. ARES scientists are recognized nationally and internationally by virtue of their success in publishing in peer-reviewed journals and winning competitive research proposals. ARES scientists have won every major award presented by the Meteoritical Society, including the Leonard Medal, the most prestigious award in planetary science and cosmochemistry; the Barringer Medal, recognizing outstanding work in the field of impact cratering; the Nier Prize, for outstanding research by a young scientist; and the Nininger Meteorite Award, which has gone to several ARES recipients. One of our scientists received the Department of Defense (DoD) Joint Meritorious Civilian Service Award, the highest civilian honor given by the DoD. ARES has established numerous partnerships with other NASA Centers, universities, and national laboratories. ARES scientists serve as journal editors, members of advisory panels and review committees, and society officers, and several have been elected as Fellows of their professional societies. This biennial report summarizes a subset of the accomplishments made by each of the ARES offices and highlights participation in ongoing human and robotic missions, development of new missions, and planning for future human and robotic exploration of the solar system beyond low Earth orbit.

  13. Contracting for health and curative care use in Afghanistan between 2004 and 2005

    PubMed Central

    Arur, Aneesa; Peters, David; Hansen, Peter; Mashkoor, Mohammad Ashraf; Steinhardt, Laura C.; Burnham, Gilbert

    2010-01-01

    Afghanistan has used several approaches to contracting as part of its national strategy to increase access to basic health services. This study compares changes in the utilization of outpatient curative services from 2004 to 2005 between the different approaches: contracting-out services to non-governmental service providers, contracting-in technical assistance at public sector facilities, and public sector facilities that did not use contracting. We find that both contracting-in and contracting-out approaches are associated with substantial double-difference increases in service use from 2004 to 2005 compared with non-contracted facilities. The double-difference increase in outpatient visits at contracting-out facilities is 29% (P < 0.01), while outpatient visits from female patients increased 41% (P < 0.01), use by the poorest quintile increased 68% (P < 0.01), and use by children aged under 5 years increased 27% (P < 0.05). Comparing the individual contracting-out approaches, we find similar increases in outpatient visits whether contracts are managed directly by the Ministry of Public Health or by an experienced international non-profit organization. Finally, contracting-in facilities show even larger increases in all the measures of utilization other than visits from children under 5. Although there are minor differences in the results between contracting-out approaches, these differences cannot be attributed to a specific contracting-out approach because of factors limiting the comparability of the groups. It is nonetheless clear that the government was able to manage contracts effectively despite early concerns about its lack of experience, and that contracting has helped to improve utilization of basic health services. PMID:19850664
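
    The "double difference" reported above is the standard difference-in-differences construction: the change over time in contracted facilities minus the change over the same period in non-contracted facilities. A minimal sketch in Python, using invented visit counts purely for illustration (none of these numbers come from the study):

        # Difference-in-differences (double difference) estimate, illustrated with
        # hypothetical outpatient-visit means (not the study's actual data).

        def double_difference(treat_before, treat_after, control_before, control_after):
            """DiD = (change in treated group) - (change in control group)."""
            return (treat_after - treat_before) - (control_after - control_before)

        # Hypothetical mean monthly outpatient visits per facility:
        did = double_difference(treat_before=100, treat_after=140,
                                control_before=100, control_after=110)
        print(did)  # 30 extra visits attributable to contracting, under DiD assumptions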

  14. Organic Contamination Baseline Study on NASA JSC Astromaterial Curation Gloveboxes

    NASA Technical Reports Server (NTRS)

    Calaway, Michael J.; Allton, J. H.; Allen, C. C.; Burkett, P. J.

    2013-01-01

    Future planned sample return missions to carbon-rich asteroids and Mars in the next two decades will require strict handling and curation protocols as well as new procedures for reducing organic contamination. After the Apollo program, astromaterials collections have mainly been concerned with inorganic contamination [1-4]. However, future isolation containment systems for astromaterials, possibly nitrogen-enriched gloveboxes, must be able to reduce organic as well as inorganic cross-contamination. In 2012, a baseline study was conducted to establish the current state of organic cleanliness in gloveboxes used by the NASA JSC astromaterials curation labs, to serve as a benchmark for future mission designs.

  15. The OSIRIS-REx Asteroid Sample Return: Mission Operations Design

    NASA Technical Reports Server (NTRS)

    Gal-Edd, Jonathan; Cheuvront, Allan

    2014-01-01

    The OSIRIS-REx mission employs a methodical, phased approach to ensure success in meeting the mission's science requirements. OSIRIS-REx launches in September 2016, with a backup launch period occurring one year later. Sampling occurs in 2019. The departure burn from Bennu occurs in March 2021. On September 24, 2023, the SRC lands at the Utah Test and Training Range (UTTR). Stardust heritage procedures are followed to transport the SRC to Johnson Space Center, where the samples are removed and delivered to the OSIRIS-REx curation facility. After a six-month preliminary examination period the mission will produce a catalog of the returned sample, allowing the worldwide community to request samples for detailed analysis. Traveling to and returning a sample from an asteroid that has not been explored before requires unique operations considerations. The Design Reference Mission (DRM) ties together spacecraft, instrument, and operations scenarios. The project implemented lessons learned from other small-body missions (APL NEAR, JPL Dawn, and ESA Rosetta); the key lesson was to expect the unexpected and to implement planning tools early in the lifecycle. In preparation for PDR, the project moved the asteroid arrival date one year earlier, providing additional time margin. STK is used for mission design and STK Scheduler for instrument coverage analysis.

  16. From field to database: a user-oriented approach to promote cyber-curating of scientific drilling cores

    NASA Astrophysics Data System (ADS)

    Pignol, C.; Arnaud, F.; Godinho, E.; Galabertier, B.; Caillo, A.; Billy, I.; Augustin, L.; Calzas, M.; Rousseau, D. D.; Crosta, X.

    2016-12-01

    Managing scientific data is probably one of the most challenging issues in modern science. In the paleosciences the question is made even more sensitive by the need to preserve and manage high-value, fragile geological samples: cores. Large international scientific programs, such as IODP and ICDP, have led intense efforts to solve this problem and have proposed detailed, high-standard workflows and dataflows for core handling and curating. However, many paleoscience results derive from small-scale research programs in which data and samples are too often managed only locally, when they are managed at all. In this paper we present a national effort led in France to develop an integrated system to curate ice and sediment cores. Under the umbrella of the national excellence equipment program CLIMCOR, we launched a review of core curating and the management of associated fieldwork data. Our aim was to conserve all data from fieldwork in an integrated cyber-environment that will evolve toward laboratory-acquired data storage in the near future. To do so, we worked in close collaboration with field operators as well as laboratory core curators in order to propose user-oriented solutions. The national core curating initiative proposes a single web portal in which all teams can store their fieldwork data. This portal is used as a national hub to attribute IGSNs. For legacy samples, this requires the establishment of a dedicated core list with associated metadata. For forthcoming core data, however, we developed a mobile application to capture technical and scientific data directly in the field. This application is linked to a single coring-tools library and is adapted to most coring devices (gravity, drilling, percussion, etc.), including coring operations with multiple sections and holes. These field data can be uploaded automatically to the national portal, and they are also referenced through international standards (IGSN and INSPIRE) and displayed in international portals (currently, NOAA's IMLGS). In this paper, we present the architecture of the integrated system, future perspectives, and the approach we adopted to reach our goals. We will also present our mobile application through didactic examples.
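
    As a rough sketch of the dataflow described above (a field record captured on a mobile device and pushed to a national portal that attributes IGSNs), the following Python fragment posts one hypothetical core-section record. The endpoint URL, payload fields, and response shape are placeholders invented for illustration, not the actual CLIMCOR portal API:

        # Hypothetical sketch: upload a core-section field record to a national
        # portal that assigns IGSNs. Endpoint, fields, and response format are
        # illustrative placeholders only.
        import json
        import urllib.request

        record = {
            "project": "EXAMPLE-LAKE-2016",       # hypothetical campaign name
            "site": {"lat": 45.72, "lon": 6.43},  # coring location
            "device": "gravity_corer",            # from the coring-tools library
            "hole": "A", "section": 1,
            "length_cm": 87.5,
        }

        req = urllib.request.Request(
            "https://example.org/core-portal/api/samples",  # placeholder URL
            data=json.dumps(record).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            igsn = json.load(resp).get("igsn")  # portal returns the attributed IGSN
        print("Registered as", igsn)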

  17. 78 FR 29393 - University of Missouri-Columbia Facility Operating License No. R-103

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-20

    ... Curators of the University of Missouri--Columbia (the licensee) to operate the Missouri University Research Reactor (MURR) at a maximum steady-state thermal power of 10 megawatts (MW). The renewed license would authorize the licensee to operate the MURR up to a steady-state thermal power of 10 MW for an additional 20...

  18. Cleaning Genesis Sample Return Canister for Flight: Lessons for Planetary Sample Return

    NASA Technical Reports Server (NTRS)

    Allton, J. H.; Hittle, J. D.; Mickelson, E. T.; Stansbery, Eileen K.

    2016-01-01

    Sample return missions require chemical contamination to be minimized and potential sources of contamination to be documented and preserved for future use. Genesis focused on and successfully accomplished the following:
    - Early involvement provided input to mission design: a) cleanable materials and cleanable design; b) mission operation parameters to minimize contamination during flight.
    - Established contamination control authority at a high level and developed knowledge and respect for contamination control across all institutions at the working level.
    - Provided state-of-the-art spacecraft assembly cleanroom facilities for science canister assembly and function testing. Both particulate and airborne molecular contamination were minimized.
    - Cleaned spacecraft components to a very high level using ultrapure water. Stainless steel components were cleaned to carbon monolayer levels (10^15 carbon atoms per square centimeter).
    - Established a long-term curation facility.
    Lessons learned and areas for improvement include:
    - Bare aluminum is not a cleanable surface and should not be used for components requiring extreme levels of cleanliness; the problem is the formation of oxides during rigorous cleaning.
    - Representative coupons of relevant spacecraft components (cut from the same block at the same time, with identical surface finish and cleaning history) should be acquired, documented, and preserved. Genesis experience suggests that creation of these coupons would be facilitated by specification on the engineering component drawings.
    - Component handling history is critical for interpretation of analytical results on returned samples. This set of relevant documents is not the same as the typical documentation for one-way missions and includes data from several institutions, which need to be unified. Dedicated resources need to be provided for acquiring and archiving the appropriate documents in one location, with easy access for decades.
    - Dedicated, knowledgeable contamination control oversight should be provided at sites of fabrication and integration. Numerous excellent Genesis chemists and analytical facilities participated in the contamination oversight; however, additional oversight at fabrication sites would have been helpful.

  19. An efficient field and laboratory workflow for plant phylotranscriptomic projects

    PubMed Central

    Yang, Ya; Moore, Michael J.; Brockington, Samuel F.; Timoneda, Alfonso; Feng, Tao; Marx, Hannah E.; Walker, Joseph F.; Smith, Stephen A.

    2017-01-01

    Premise of the study: We describe a field and laboratory workflow developed for plant phylotranscriptomic projects that involves cryogenic tissue collection in the field, RNA extraction and quality control, and library preparation. We also make recommendations for sample curation. Methods and Results: A total of 216 frozen tissue samples of Caryophyllales and other angiosperm taxa were collected from the field or botanical gardens. RNA was extracted, stranded mRNA libraries were prepared, and libraries were sequenced on Illumina HiSeq platforms. The samples included difficult mucilaginous tissues such as those of Cactaceae and Droseraceae. Conclusions: Our workflow is not only cost-effective (ca. $270 per sample, as of August 2016, from tissue to reads) and time-efficient (less than 50 h for 10–12 samples, including all laboratory work and sample curation), but has also proven robust for extraction of difficult samples such as tissues containing high levels of secondary compounds. PMID:28337391

  20. Oxygen and Magnesium Isotopic Compositions of Asteroidal Materials Returned from Itokawa by the Hayabusa Mission

    NASA Technical Reports Server (NTRS)

    Yurimoto, H; Abe, M.; Ebihara, M.; Fujimura, A.; Hashizume, K.; Ireland, T. R.; Itoh, S.; Kawaguchi, K.; Kitajima, F.; Mukai, T.; hide

    2011-01-01

    The Hayabusa spacecraft made two touchdowns on the surface of Asteroid 25143 Itokawa on November 20th and 26th, 2005. Itokawa is classified as an S-type asteroid and inferred to consist of materials similar to ordinary chondrites or primitive achondrites [1]. Near-infrared spectroscopy by the Hayabusa spacecraft suggested that the surface of this body has an olivine-rich mineral assemblage potentially similar to that of LL5 or LL6 chondrites, with different degrees of space weathering [2]. The spacecraft re-entered the Earth's atmosphere on June 12th, 2010, and the sample capsule was successfully recovered in Australia on June 13th, 2010. Although the sample collection on the Itokawa surface did not proceed as designed, more than 1,500 grains were identified as rocky particles in the sample curation facility of JAXA, and on November 17th, 2010, most of them were judged to be of extraterrestrial origin and definitely from Asteroid Itokawa [3]. Although their sizes are mostly less than 10 microns, some larger grains of about 100 microns or more were also included. The mineral assemblage is olivine, pyroxene, plagioclase, iron sulfide, and iron metal. The mean mineral compositions are consistent with the results of near-infrared spectroscopy from the Hayabusa spacecraft [2], but the variations suggest that the petrologic type may be lower than the spectroscopic results indicate. Several tens of the relatively large grains among the 1,500 will be selected by the Hayabusa sample curation team for preliminary examination [4]. Each grain will be subjected to one set of preliminary examinations, i.e., micro-tomography, XRD, XRF, TEM, SEM, EPMA, and SIMS, in this sequence. The preliminary examination will start in the last week of January 2011; isotope analyses for this study will therefore start in the last week of February 2011. By the time of the LPSC meeting we will have measured the oxygen and magnesium isotopic compositions of several grains, and we will present the first results of these analyses.

  1. The Index to Marine and Lacustrine Geological Samples (IMLGS): Linking Digital Data to Physical Samples for the Marine Community

    NASA Astrophysics Data System (ADS)

    Stroker, K. J.; Jencks, J. H.; Eakins, B.

    2016-12-01

    The Index to Marine and Lacustrine Geological Samples (IMLGS) is a community designed and maintained resource enabling researchers to locate and request seafloor and lakebed geologic samples curated by partner institutions. The Index was conceived in the dawn of the digital age by representatives from U.S. academic and government marine core repositories and the NOAA National Geophysical Data Center, now the National Centers for Environmental Information (NCEI), at a 1977 meeting convened by the National Science Foundation (NSF). The Index is based on core concepts of community oversight, common vocabularies, consistent metadata and a shared interface. The Curators Consortium, international in scope, meets biennially to share ideas and discuss best practices. NCEI serves the group by providing database access and maintenance, a list server, digitizing support and long-term archival of sample metadata, data and imagery. Over three decades, participating curators have performed the laborious task of creating and contributing metadata for over 205,000 sea floor and lake-bed cores, grabs, and dredges archived in their collections. Some partners use the Index for primary web access to their collections while others use it to increase exposure of more in-depth institutional systems. The IMLGS has a persistent URL/Digital Object Identifier (DOI), as well as DOIs assigned to partner collections for citation and to provide a persistent link to curator collections. The Index is currently a geospatially-enabled relational database, publicly accessible via Web Feature and Web Map Services, and text- and ArcGIS map-based web interfaces. To provide as much knowledge as possible about each sample, the Index includes curatorial contact information and links to related data, information and images: 1) at participating institutions, 2) in the NCEI archive, and 3) through a Linked Data interface maintained by the Rolling Deck to Repository (R2R) program. Over 43,000 International GeoSample Numbers (IGSNs) linking to the System for Earth Sample Registration (SESAR) are included in anticipation of opportunities for interconnectivity with Integrated Earth Data Applications (IEDA) systems. The paper will discuss the database with the goal of increasing the connections and links to related data at partner institutions.
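
    Because the Index is exposed through OGC Web Feature Services, sample metadata can in principle be retrieved programmatically. A minimal sketch using the OWSLib library; the service URL and layer name below are assumptions for illustration, not the documented IMLGS endpoint:

        # Sketch: query a Web Feature Service for sample records in a bounding box.
        # The URL and typename below are placeholders, not the real IMLGS service.
        from owslib.wfs import WebFeatureService

        wfs = WebFeatureService("https://example.noaa.gov/wfs", version="1.1.0")
        response = wfs.getfeature(
            typename="imlgs:samples",           # hypothetical layer name
            bbox=(-160.0, 18.0, -154.0, 23.0),  # lon/lat box around Hawaii
        )
        print(response.read()[:500])  # GML feature records for matching samples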

  2. 'First we go to the small doctor': first contact for curative health care sought by rural communities in Andhra Pradesh & Orissa, India.

    PubMed

    Gautham, Meenakshi; Binnendijk, Erika; Koren, Ruth; Dror, David M

    2011-11-01

    Against the backdrop of insufficient public supply of primary care and reports of informal providers, the present study sought to collect descriptive evidence on first-contact curative health care seeking choices among rural communities in two States of India, Andhra Pradesh (AP) and Orissa. The cross-sectional study design combined a Household Survey (1,810 households in AP; 5,342 in Orissa), 48 Focus Group Discussions (19 in AP; 29 in Orissa), and 61 Key Informant Interviews with healthcare providers (22 in AP; 39 in Orissa). In AP, 69.5 per cent of respondents accessed non-degree allopathic practitioners (NDAPs) practicing in or near their village; in Orissa, 40.2 per cent chose first curative contact with NDAPs and 36.2 per cent with traditional healers. In AP, all NDAPs were private practitioners; in Orissa, some pharmacists and nurses employed in health facilities also practiced privately. Respondents explained their choice by proximity and providers' readiness to make house calls when needed. Less than a quarter of respondents chose qualified doctors as their first point of call: mostly private practitioners in AP, and public practitioners in Orissa. Amongst those who chose a qualified practitioner, the most frequent reason was the doctor's quality rather than proximity. The results of this study show that most rural persons seek their first level of curative healthcare close to home, and pay for a convenient composite service of consulting-cum-dispensing of medicines. NDAPs fill a huge demand for primary curative care which the public system does not satisfy, and are the de facto first level of access in most cases.

  3. Asteroid Redirect Mission: EVA and Sample Collection

    NASA Technical Reports Server (NTRS)

    Abell, Paul; Stich, Steve

    2015-01-01

    Asteroid Redirect Mission (ARM) Overview (1) Notional Development Schedule, (2) ARV Crewed Mission Accommodations; Asteroid Redirect Crewed Mission (ARCM) Mission Summary; ARCM Accomplishments; Sample collection/curation plan (1) CAPTEM Requirements; SBAG Engagement Plan

  4. ECTFE (HALAR) as a New Material for Primary Sample Containment of Astromaterials

    NASA Technical Reports Server (NTRS)

    Calaway, Michael J.; McConnell, J.T.

    2014-01-01

    Fluoropolymers, such as Teflon® (PTFE, PFA, FEP) and Viton® (FKM), have been used for over 40 years in curating astromaterials at NASA JSC. In general, fluoropolymers have low outgassing and particle-shedding properties that reduce cross-contamination of curated samples. Ethylene-chlorotrifluoroethylene (ECTFE), commonly called Halar® (trademark of Solvay Solexis), is a partially fluorinated, semi-crystalline copolymer in the same class of fluoropolymers, with superior abrasion resistance and lower permeability to liquids, gases, and vapors than any other fluoropolymer (fig. 1). ECTFE coatings are becoming more popular in the nuclear, semiconductor, and biomedical industries for lining isolation containment gloveboxes and critical piping, as well as in other clean room applications. A study was conducted at NASA JSC to evaluate the potential use of Halar on future sample return missions as a material for primary sample containment.

  5. Identifying the Functional Requirements for an Arizona Astronomy Data Hub (AADH)

    NASA Astrophysics Data System (ADS)

    Stahlman, G.; Heidorn, P. B.

    2015-12-01

    Astronomy data represent a curation challenge for information managers as well as for astronomers. Extracting knowledge from these heterogeneous and complex datasets is particularly complicated and requires both interdisciplinary and domain expertise to accomplish true curation, with the overall goal of facilitating reproducible science through discoverability and persistence. A group of researchers and professional staff at the University of Arizona held several meetings during the spring of 2015 about astronomy data and the role of the university in the curation of those data. The group decided that it was critical to obtain a broader consensus on the needs of the community. With assistance from a Start for Success grant provided by the University of Arizona Office of Research and Discovery and funding from the American Astronomical Society (AAS), a workshop was held in early July 2015, with 28 participants plus 4 organizers in attendance. Representing university researchers as well as astronomical facilities and a scholarly society, the group verified that there is indeed a problem with the long-term curation of some astronomical data not associated with major facilities, and that a repository or "data hub" with the correct functionality could facilitate research and the preservation and use of astronomy data. The workshop members also identified a set of next steps, including the identification of possible data and metadata to be included in the Hub. The participants further helped to identify additional information that must be gathered before construction of the AADH could begin, including significant datasets that do not currently have sufficient preservation and dissemination infrastructure, as well as some data associated with journal publications and the broader context of the data beyond what is directly published in the journals. Workshop participants recommended that a set of grant proposals be developed that ensures community buy-in and participation. The project should be developed in an agile, incremental manner that will allow consistent community growth from the early stages of the project, building on existing iPlant infrastructure (www.iplantcollaborative.org) initially developed for the biology community.

  6. The Index to Marine and Lacustrine Geological Samples: Improving Sample Accessibility and Enabling Current and Future Research

    NASA Astrophysics Data System (ADS)

    Moore, C.

    2011-12-01

    The Index to Marine and Lacustrine Geological Samples is a community designed and maintained resource enabling researchers to locate and request sea floor and lakebed geologic samples archived by partner institutions. Conceived in the dawn of the digital age by representatives from U.S. academic and government marine core repositories and the NOAA National Geophysical Data Center (NGDC) at a 1977 meeting convened by the National Science Foundation (NSF), the Index is based on core concepts of community oversight, common vocabularies, consistent metadata and a shared interface. Form and content of underlying vocabularies and metadata continue to evolve according to the needs of the community, as do supporting technologies and access methodologies. The Curators Consortium, now international in scope, meets at partner institutions biennially to share ideas and discuss best practices. NGDC serves the group by providing database access and maintenance, a list server, digitizing support and long-term archival of sample metadata, data and imagery. Over three decades, participating curators have performed the herculean task of creating and contributing metadata for over 195,000 sea floor and lakebed cores, grabs, and dredges archived in their collections. Some partners use the Index for primary web access to their collections while others use it to increase exposure of more in-depth institutional systems. The Index is currently a geospatially-enabled relational database, publicly accessible via Web Feature and Web Map Services, and text- and ArcGIS map-based web interfaces. To provide as much knowledge as possible about each sample, the Index includes curatorial contact information and links to related data, information and images: 1) at participating institutions, 2) in the NGDC archive, and 3) at sites such as the Rolling Deck to Repository (R2R) and the System for Earth Sample Registration (SESAR). Over 34,000 International GeoSample Numbers (IGSNs) linking to SESAR are included in anticipation of opportunities for interconnectivity with Integrated Earth Data Applications (IEDA) systems. To promote interoperability and broaden exposure via the semantic web, NGDC is publishing lithologic classification schemes and terminology used in the Index as Simple Knowledge Organization System (SKOS) vocabularies, coordinating with R2R and the Consortium for Ocean Leadership for consistency. Availability in SKOS form will also facilitate use of the vocabularies in International Standards Organization (ISO) 19115-2 compliant metadata records. NGDC provides stewardship for the Index on behalf of U.S. repositories as the NSF designated "appropriate National Data Center" for data and metadata pertaining to sea floor samples as specified in the 2011 Division of Ocean Sciences Sample and Data Policy, and on behalf of international partners via a collocated World Data Center. NGDC operates on the Open Archival Information System (OAIS) reference model. Active Partners: Antarctic Marine Geology Research Facility, Florida State University; British Ocean Sediment Core Research Facility; Geological Survey of Canada; Integrated Ocean Drilling Program; Lamont-Doherty Earth Observatory; National Lacustrine Core Repository, University of Minnesota; Oregon State University; Scripps Institution of Oceanography; University of Rhode Island; U.S. Geological Survey; Woods Hole Oceanographic Institution.
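
    A lithologic term published as a SKOS concept can be produced with a few lines of rdflib; the vocabulary namespace and term below are invented for illustration and are not NGDC's actual published URIs:

        # Sketch: encode one lithologic vocabulary term as a SKOS concept.
        # The namespace and term are illustrative, not NGDC's published URIs.
        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF, SKOS

        LITH = Namespace("https://example.gov/vocab/lithology/")  # placeholder
        g = Graph()
        g.bind("skos", SKOS)

        term = LITH["terrigenous_sand"]
        g.add((term, RDF.type, SKOS.Concept))
        g.add((term, SKOS.prefLabel, Literal("terrigenous sand", lang="en")))
        g.add((term, SKOS.broader, LITH["unconsolidated_sediment"]))

        print(g.serialize(format="turtle"))  # Turtle rendering of the concept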

  7. Utilization of surgical treatment for local and locoregional esophageal cancer: Analysis of the National Cancer Data Base.

    PubMed

    Taylor, Lauren J; Greenberg, Caprice C; Lidor, Anne O; Leverson, Glen E; Maloney, James D; Macke, Ryan A

    2017-02-01

    Previous studies have suggested that esophagectomy is severely underused for patients with resectable esophageal cancer. The recent expansion of endoscopic local therapies, advances in surgical techniques, and improved postoperative outcomes have changed the therapeutic landscape. The impact of these developments and evolving treatment guidelines on national practice patterns is unknown. Patients diagnosed with clinical stage 0 to III esophageal cancer were identified from the National Cancer Database (2004-2013). The receipt of potentially curative surgical treatment over time was analyzed, and multivariate logistic regression was used to identify factors associated with surgical treatment. The analysis included 52,122 patients. From 2004 to 2013, the overall rate of potentially curative surgical treatment increased from 36.4% to 47.4% (P < .001). For stage 0 disease, the receipt of esophagectomy decreased from 23.8% to 17.9% (P < .001), whereas the use of local therapies increased from 34.3% to 58.8% (P < .001). The use of surgical treatment increased from 43.4% to 61.8% (P < .001), from 36.1% to 45.0% (P < .001), and from 30.8% to 38.6% (P < .001) for patients with stage I, II, and III disease, respectively. In the multivariate analysis, divergent practice patterns and adherence to national guidelines were noted between academic and community facilities. The use of potentially curative surgical treatment has increased for patients with stage 0 to III esophageal cancer. The expansion of local therapies has driven increased rates of surgical treatment for early-stage disease. Although the increased use of esophagectomy for more advanced disease is encouraging, significant variation persists at the patient and facility levels. Cancer 2017;123:410-419. © 2016 American Cancer Society.
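
    A multivariate logistic regression of the kind described (receipt of potentially curative surgery as the outcome, patient- and facility-level factors as covariates) can be sketched with statsmodels; all data and variable names below are fabricated for illustration:

        # Sketch: logistic regression for receipt of curative surgery, in the
        # spirit of the analysis described. All data here are fabricated.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 1000
        df = pd.DataFrame({
            "surgery": rng.integers(0, 2, n),                      # 1 = received surgery
            "stage": rng.choice(["0", "I", "II", "III"], n),       # clinical stage
            "facility": rng.choice(["academic", "community"], n),  # treating facility
            "age": rng.normal(65, 10, n),
        })

        model = smf.logit("surgery ~ C(stage) + C(facility) + age", data=df).fit()
        print(model.summary())  # odds ratios are np.exp(model.params)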

  8. St. Petersburg Coastal and Marine Science Center's Core Archive Portal

    USGS Publications Warehouse

    Reich, Chris; Streubert, Matt; Dwyer, Brendan; Godbout, Meg; Muslic, Adis; Umberger, Dan

    2012-01-01

    This Web site contains information on rock cores archived at the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center (SPCMSC). Archived cores consist of 3- to 4-inch-diameter coral cores, 1- to 2-inch-diameter rock cores, and a few unlabeled loose coral and rock samples. This document - and specifically the archive Web site portal - is intended to be a 'living' document that will be updated continually as additional cores are collected and archived. This document may also contain future references and links to a catalog of sediment cores. Sediment cores will include vibracores, pushcores, and other loose sediment samples collected for research purposes. This document will: (1) serve as a database for locating core material currently archived at the USGS SPCMSC facility; (2) provide a protocol for entry of new core material into the archive system; and (3) set the procedures necessary for checking out core material for scientific purposes. Core material may be loaned to other governmental agencies, academia, or non-governmental organizations at the discretion of the USGS SPCMSC curator.

  9. Preserving Samples and Their Scientific Integrity — Insights into MSR from the Astromaterials Acquisition and Curation Office at NASA Johnson Space Center

    NASA Astrophysics Data System (ADS)

    Calaway, M. J.; Regberg, A. B.; Mitchell, J. L.; Fries, M. D.; Zeigler, R. A.; McCubbin, F. M.; Harrington, A. D.

    2018-04-01

    Rigorous collection of samples for contamination knowledge, the information gained from the characterization of reference materials and witness plates in concurrence with sample return, is essential for MSR mission success.

  10. Extraterrestrial Samples at JSC

    NASA Technical Reports Server (NTRS)

    Allen, Carlton C.

    2007-01-01

    A viewgraph presentation on the curation of extraterrestrial samples at NASA Johnson Space Center is shown. The topics include: 1) Apollo lunar samples; 2) Meteorites from Antarctica; 3) Cosmic dust from the stratosphere; 4) Genesis solar wind ions; 5) Stardust comet and interstellar grains; and 6) Space-exposed hardware.

  11. Observations from a 4-year contamination study of a sample depth profile through Martian meteorite Nakhla.

    PubMed

    Toporski, Jan; Steele, Andrew

    2007-04-01

    Morphological, compositional, and biological evidence indicates the presence of numerous well-developed microbial hyphae structures distributed within four different sample splits of the Nakhla meteorite obtained from the British Museum (allocation BM1913,25). By examining depth profiles of the sample splits over time, we documented morphological changes displayed by the structures, as well as changes in their distribution on the samples, observations that indicate growth, decay, and reproduction of individual microorganisms. Biological staining with DNA-specific molecular dyes followed by epifluorescence microscopy showed that the hyphae structures contain DNA. Our observations demonstrate the potential of microbial interaction with extraterrestrial materials, emphasize the need for rapid investigation of Mars return samples as well as any other returned or impactor-delivered extraterrestrial materials, and suggest that appropriate storage conditions should be identified and applied immediately after samples retrieved from the field are received by a handling/curation facility. The observations are also relevant to planetary protection considerations, as they demonstrate that microorganisms may endure and reproduce in extraterrestrial materials over long (at least 4-year) time spans. The combination of microscopy images coupled with compositional and molecular staining techniques is proposed as a valid first-order method for detecting life forms in martian materials. Time-resolved in situ observations further allow observation of possible (bio)dynamics within the system.

  12. Site-based data curation based on hot spring geobiology

    PubMed Central

    Palmer, Carole L.; Thomer, Andrea K.; Baker, Karen S.; Wickett, Karen M.; Hendrix, Christie L.; Rodman, Ann; Sigler, Stacey; Fouke, Bruce W.

    2017-01-01

    Site-Based Data Curation (SBDC) is an approach to managing research data that prioritizes sharing and reuse of data collected at scientifically significant sites. The SBDC framework is based on geobiology research at natural hot spring sites in Yellowstone National Park as an exemplar case of high-value field data in contemporary, cross-disciplinary earth systems science. Through stakeholder analysis and investigation of data artifacts, we determined that meaningful and valid reuse of digital hot spring data requires systematic documentation of sampling processes and particular contextual information about the site of data collection. We propose a Minimum Information Framework for recording the necessary metadata on sampling locations, with anchor measurements and description of the hot spring vent distinct from the outflow system, and multi-scale field photography to capture vital information about hot spring structures. The SBDC framework can serve as a global model for the collection and description of hot spring systems field data that can be readily adapted for application to the curation of data from other kinds of scientifically significant sites. PMID:28253269
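
    A minimum-information record of the kind proposed might be sketched as follows; the field names are illustrative guesses at the framework's content (site, vent vs. outflow position, anchor measurements, multi-scale photography), not the published SBDC schema:

        # Sketch of a minimum-information record for a hot spring sample.
        # Field names are illustrative, not the SBDC framework's actual schema.
        from dataclasses import dataclass, field

        @dataclass
        class HotSpringSample:
            sample_id: str
            site: str                    # e.g. a named spring in Yellowstone
            position: str                # "vent" or "outflow", kept distinct
            distance_from_vent_m: float  # anchor measurement along the outflow
            temperature_c: float
            ph: float
            photos: list = field(default_factory=list)  # multi-scale photography

        s = HotSpringSample("YNP-001", "Example Spring", "outflow",
                            distance_from_vent_m=3.2, temperature_c=71.5, ph=8.1,
                            photos=["site_overview.jpg", "vent_closeup.jpg"])
        print(s)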

  13. Digital Curation of Earth Science Samples Starts in the Field

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Hsu, L.; Song, L.; Carter, M. R.

    2014-12-01

    Collection of physical samples in the field is an essential part of research in the Earth Sciences. Samples provide a basis for progress across many disciplines, from the study of global climate change now and over the Earth's history, to present and past biogeochemical cycles, to magmatic processes and mantle dynamics. The types of samples, methods of collection, and scope and scale of sampling campaigns are highly diverse, ranging from large-scale programs to drill rock and sediment cores on land, in lakes, and in the ocean, to environmental observation networks with continuous sampling, to single investigator or small team expeditions to remote areas around the globe or trips to local outcrops. Cyberinfrastructure for sample-related fieldwork needs to cater to the different needs of these diverse sampling activities, aligning with specific workflows, regional constraints such as connectivity or climate, and processing of samples. In general, digital tools should assist with capture and management of metadata about the sampling process (location, time, method) and the sample itself (type, dimension, context, images, etc.), management of the physical objects (e.g., sample labels with QR codes), and the seamless transfer of sample metadata to data systems and software relevant to the post-sampling data acquisition, data processing, and sample curation. In order to optimize CI capabilities for samples, tools and workflows need to adopt community-based standards and best practices for sample metadata, classification, identification and registration. This presentation will provide an overview and updates of several ongoing efforts that are relevant to the development of standards for digital sample management: the ODM2 project that has generated an information model for spatially-discrete, feature-based earth observations resulting from in-situ sensors and environmental samples, aligned with OGC's Observation & Measurements model (Horsburgh et al., AGU FM 2014); implementation of the IGSN (International Geo Sample Number) as a globally unique sample identifier via a distributed system of allocating agents and a central registry; and the EarthCube Research Coordination Network iSamples (Internet of Samples in the Earth Sciences) that aims to improve sharing and curation of samples through the use of CI.
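
    Producing the machine-readable labels mentioned above (QR codes that resolve to a sample's registered identifier) takes only a few lines with the qrcode package; the IGSN value below is a made-up example:

        # Sketch: generate a QR-code label that encodes a sample's IGSN landing page.
        # The IGSN below is a made-up example value.
        import qrcode

        igsn = "IEXMP0001"  # hypothetical IGSN
        img = qrcode.make(f"https://igsn.org/{igsn}")  # resolver URL encodes the ID
        img.save(f"label_{igsn}.png")                  # print and attach to the sample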

  14. Data Stewardship in the Ocean Sciences Needs to Include Physical Samples

    NASA Astrophysics Data System (ADS)

    Carter, M.; Lehnert, K.

    2016-02-01

    Across the Ocean Sciences, research involves the collection and study of samples collected above, at, and below the seafloor, including but not limited to rocks, sediments, fluids, gases, and living organisms. Many domains in the Earth Sciences have recently expressed the need for better discovery, access, and sharing of scientific samples and collections (EarthCube End-User Domain workshops, 2012 and 2013, http://earthcube.org/info/about/end-user-workshops), as has the US government (OSTP Memo, March 2014). iSamples (Internet of Samples in the Earth Sciences) is a Research Coordination Network within the EarthCube program that aims to advance the use of innovative cyberinfrastructure to support and advance the utility of physical samples and sample collections for science and ensure reproducibility of sample-based data and research results. iSamples strives to build, grow, and foster a new community of practice, in which domain scientists, curators of sample repositories and collections, computer and information scientists, software developers and technology innovators engage in and collaborate on defining, articulating, and addressing the needs and challenges of physical samples as a critical component of digital data infrastructure. A primary goal of iSamples is to deliver a community-endorsed set of best practices and standards for the registration, description, identification, and citation of physical specimens and define an actionable plan for implementation. iSamples conducted a broad community survey about sample sharing and has created 5 different working groups to address the different challenges of developing the internet of samples - from metadata schemas and unique identifiers to an architecture for a shared cyberinfrastructure to manage collections, to digitization of existing collections, to education, and ultimately to establishing the physical infrastructure that will ensure preservation and access of the physical samples. Repositories that curate marine sediment cores and dredge samples from the oceanic crust are participating in iSamples, but many other samples collected in the Ocean sciences are not yet represented. This presentation aims to engage a wider spectrum of Ocean scientists and sample curators in iSamples.

  15. The Principles for Successful Scientific Data Management Revisited

    NASA Astrophysics Data System (ADS)

    Walker, R. J.; King, T. A.; Joy, S. P.

    2005-12-01

    It has been 23 years since the National Research Council's Committee on Data Management and Computation (CODMAC) published its famous list of principles for successful scientific data management, which has provided the framework for modern space science data management. CODMAC outlined seven principles:
    1. Scientific Involvement in all aspects of space science missions.
    2. Scientific Oversight of all scientific data-management activities.
    3. Data Availability - Validated data should be made available to the scientific community in a timely manner. They should include appropriate ancillary data and complete documentation.
    4. Facilities - A proper balance between cost and scientific productivity should be maintained.
    5. Software - Transportable, well-documented software should be available to process and analyze the data.
    6. Scientific Data Storage - The data should be preserved in retrievable form.
    7. Data System Funding - Adequate data funding should be made available at the outset of missions and protected from overruns.
    In this paper we will review the lessons learned in trying to apply these principles to space-derived data. The Planetary Data System created the concept of data curation to carry out the CODMAC principles. Data curators are scientists and technologists who work directly with the mission scientists to create data products. The efficient application of the CODMAC principles requires that data curators and the mission team start early in a mission to plan for data access and archiving. To build the data products, the planetary discipline adopted data access and documentation standards and has adhered to them. The data curators and mission team work together to produce data products and make them available. However, even with early planning and agreement on standards, the needs of the science community frequently far exceed the available resources. This is especially true for smaller, principal-investigator-run missions. We will argue that one way to make data systems for small missions more effective is for the data curators to provide software tools to help develop the mission data system.

  16. Managing Data and Facilitating Science: A spectrum of activities in the Centre for Environmental Data Archival. (Invited)

    NASA Astrophysics Data System (ADS)

    Lawrence, B.; Bennett, V.; Callaghan, S.; Juckes, M. N.; Pepler, S.

    2013-12-01

    The UK Centre for Environmental Data Archival (CEDA) hosts a number of formal data centres, including the British Atmospheric Data Centre (BADC), and is a partner in a range of national and international data federations, including the InfraStructure for the European Network for Earth system Simulation, the Earth System Grid Federation, and the distributed IPCC Data Distribution Centres. The mission of CEDA is to formally curate data from, and facilitate the doing of, environmental science. The twin aims are symbiotic: data curation helps facilitate science, and facilitating science helps with data curation. Here we cover how CEDA delivers this strategy through established internal processes supplemented by short-term projects, supported by staff with a range of roles. We show how CEDA adds value to data in the curated archive and how it supports science, and we give examples of the aforementioned symbiosis. We begin by discussing curation: CEDA has the formal responsibility for curating the data products of atmospheric science and earth observation research funded by the UK Natural Environment Research Council (NERC). However, curation is not just about the provider community; the consumer communities matter too, and the consumers of these data cross the boundaries of science, including engineers and medics as well as the gamut of the environmental sciences. There is a small but growing cohort of non-science users. For both producers and consumers of data, information about data is crucial, and a range of CEDA staff have long worked on tools and techniques for creating, managing, and delivering metadata (as well as data). CEDA "science support" staff work with scientists to help them prepare and document data for curation. As one of a spectrum of activities, CEDA has worked on data Publication as a method of both adding value to some data and rewarding the effort put into the production of quality datasets. As such, we see this activity as both a curation and a facilitation activity. A range of more focused facilitation activities are carried out, from providing a computing platform suitable for big-data analytics (the Joint Analysis System, JASMIN), to working on distributed data analysis (EXARCH), to the acquisition of third-party data to support science and impact (e.g., in the context of the facility for Climate and Environmental Monitoring from Space, CEMS). We conclude by confronting the view of Parsons and Fox (2013) that metaphors such as Data Publication, Big Iron, and Science Support are limiting, and suggest that the CEDA experience is that these sorts of activities can and do co-exist, much as they conclude they should. However, we also believe that within co-existing metaphors, production systems need to be limited in their scope, even if they are on a road to a more joined-up infrastructure. We shouldn't confuse what we can do now with what we might want to do in the future.

  17. Data Curation for the Exploitation of Large Earth Observation Products Databases - The MEA system

    NASA Astrophysics Data System (ADS)

    Mantovani, Simone; Natali, Stefano; Barboni, Damiano; Cavicchi, Mario; Della Vecchia, Andrea

    2014-05-01

    National space agencies, under the umbrella of the European Space Agency, are working intensively to handle and provide solutions for Big Data and the management and exploitation of the related knowledge (metadata, software tools, and services). The continuously increasing amount of long-term and historic data in EO facilities, in the form of online datasets and archives; the incoming satellite observation platforms that will generate an impressive amount of new data; and the new EU approach to data distribution policy make it necessary to address technologies for the long-term management of these data sets, including their consolidation, preservation, distribution, continuation, and curation across multiple missions. The management of long EO data time series from continuing or historic missions - with more than 20 years of data available already today - requires technical solutions and technologies which differ considerably from the ones exploited by existing systems. Several tools, both open source and commercial, already provide technologies to handle data and metadata preparation, access, and visualization via OGC standard interfaces. This study describes the Multi-sensor Evolution Analysis (MEA) system and the Data Curation concept as approached and implemented within the ASIM and EarthServer projects, funded by the European Space Agency and the European Commission, respectively.

  18. CCDB: a curated database of genes involved in cervix cancer.

    PubMed

    Agarwal, Subhash M; Raghav, Dhwani; Singh, Harinder; Raghava, G P S

    2011-01-01

    The Cervical Cancer gene DataBase (CCDB, http://crdd.osdd.net/raghava/ccdb) is a manually curated catalog of experimentally validated genes that are thought, or are known, to be involved in the different stages of cervical carcinogenesis. Despite the large population of women presently affected by this malignancy, no database has until now existed that catalogs information on genes associated with cervical cancer. We have therefore compiled 537 genes in CCDB that are linked with cervical cancer causation processes such as methylation, gene amplification, mutation, polymorphism, and change in expression level, as evident from the published literature. Each record contains details related to the gene, such as architecture (exon-intron structure), location, function, sequences (mRNA/CDS/protein), ontology, interacting partners, homology to other eukaryotic genomes, structure, and links to other public databases, thus augmenting CCDB with external data. Manually curated literature references are also provided to support each gene's inclusion in the database and establish its association with cervix cancer. In addition, CCDB provides information on microRNAs altered in cervical cancer, as well as a search facility for querying, several browse options, and an online tool for sequence similarity search, thereby providing researchers with easy access to the latest information on genes involved in cervix cancer.

  19. Celebrated Moon Rocks

    NASA Astrophysics Data System (ADS)

    Martel, L. M. V.

    2009-12-01

    The Need for Lunar Samples and Simulants: Where Engineering and Science Meet sums up one of the sessions attracting attention at the annual meeting of the Lunar Exploration Analysis Group (LEAG), held November 16-19, 2009 in Houston, Texas. Speakers addressed the question of how the Apollo lunar samples can be used to facilitate NASA's return to the Moon while preserving the collection for scientific investigation. Here is a summary of the LEAG presentations of Dr. Gary Lofgren, Lunar Curator at the NASA Johnson Space Center in Houston, Texas, and Dr. Meenakshi (Mini) Wadhwa, Professor at Arizona State University and Chair of NASA's advisory committee called CAPTEM (Curation and Analysis Planning Team for Extraterrestrial Materials). Lofgren gave a status report of the collection of rocks and regolith returned to Earth by the Apollo astronauts from six different landing sites on the Moon in 1969-1972. Wadhwa explained the role of CAPTEM in lunar sample allocation.

  20. Ultrasonic Micro-Blades for the Rapid Extraction of Impact Tracks from Aerogel

    NASA Technical Reports Server (NTRS)

    Ishii, H. A.; Graham, G. A.; Kearsley, A. T.; Grant, P. G.; Snead, C. J.; Bradley, J. P.

    2005-01-01

    The science return of NASA's Stardust Mission, with its valuable cargo of cometary debris, hinges on the ability to efficiently extract particles from silica aerogel collectors. The current method for extracting cosmic dust impact tracks is a mature procedure involving sequential perforation of the aerogel with glass needles on computer-controlled micromanipulators. This method is highly successful at removing well-defined aerogel fragments of reasonable optical clarity while causing minimal damage to the surrounding aerogel collector tile. Such a system will be adopted by the JSC Astromaterials Curation Facility in anticipation of Stardust's arrival in early 2006. In addition to Stardust, aerogel is a possible collector for future sample return missions and is used for capture of hypervelocity ejecta in high-power laser experiments of interest to LLNL. Researchers will be eager to obtain Stardust samples for study as quickly as possible, and rapid extraction tools requiring little construction, training, or investment would be an attractive asset. To this end, we have experimented with micro-blades for the Stardust impact track extraction process. Our ultimate goal is a rapid extraction system in a clean electron beam environment, such as an SEM or dual-beam FIB, for in situ sample preparation, mounting, and analysis.

  1. Physicians' evaluations of patients' decisions to refuse oncological treatment

    PubMed Central

    van Kleffens, T; van Leeuwen, E

    2005-01-01

    Objective: To gain insight into the standards of rationality that physicians use when evaluating patients' treatment refusals. Design of the study: Qualitative design with in-depth interviews. Participants: The study sample included 30 patients with cancer and 16 physicians (oncologists and general practitioners). All patients had refused a recommended oncological treatment. Results: Patients base their treatment refusals mainly on personal values and/or experience. Physicians mainly emphasise the medical perspective when evaluating patients' treatment refusals. From a medical perspective, a patient's treatment refusal based on personal values and experience is generally evaluated as irrational and difficult to accept, especially when it concerns a curative treatment. Physicians have a different attitude towards non-curative treatments and have less difficulty accepting a patient's refusal of these treatments. Thus, an important factor in the physician's evaluation of a treatment refusal is whether the treatment refused is curative or non-curative. Conclusion: Physicians mainly use goal oriented and patients mainly value oriented rationality, but in the case of non-curative treatment refusal, physicians give more emphasis to value oriented rationality. A consensus between the value oriented approaches of patient and physician may then emerge, leading to the patient's decision being understood and accepted by the physician. The physician's acceptance is crucial to his or her attitude towards the patient. It contributes to the patient's feeling free to decide, and being understood and respected, and thus to a better physician–patient relationship. PMID:15738431

  2. Go Digital! Making Physical Samples a Valued Part of the Online Record of Science

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Lehnert, K.

    2016-12-01

    Physical samples, at first glance, seem to be the opposite of the virtual world of the internet. Yet, like anything not natively digital, physical samples can have a digital representation that is accessible through the internet. Most museums and other institutions have many more objects in their collections than they could ever put on display, and many samples exist outside of formal curation workflows. Nevertheless, these objects can be of importance to science, maybe because a particular fossil is a holotype that defines an extinct animal species, or because a mineral sample was used to derive a reference optical reflectance spectrum used in the interpretation of remote sensing data from satellites. As these examples show, the value of a scientific collection lies not only in its objects but also in how these objects are integrated into the record of science. Fundamental to this are, of course, catalogues of the samples held in a collection. Significant value can be added to a collection if its catalogue is web accessible, and even more if its catalogue can be harvested into disciplinary portals to aid the discovery of samples. Sample curation in the digital age, however, must go beyond simply labeling and cataloguing. In the same way that publications and datasets can now be identified and accessed over the web, steps are now being taken to do the same for physical samples. Globally unique, resolvable identifiers of samples, datasets and literature can serve as nodes to link these resources together, cross-linking the scientific interpretation in the literature, the data interpreted in those works, and the samples from which the data were derived. These linkages must not only be recorded in the metadata but must also be machine actionable, to allow integration of these digital assets into the ever-growing body and richness of the scientific record. This presentation will discuss cyberinfrastructures for samples and sample curation through case studies that illustrate how the life cycle of a sample relates to other digital objects in literature and data, and how added value is generated through these linkages.

  3. HPMCD: the database of human microbial communities from metagenomic datasets and microbial reference genomes.

    PubMed

    Forster, Samuel C; Browne, Hilary P; Kumar, Nitin; Hunt, Martin; Denise, Hubert; Mitchell, Alex; Finn, Robert D; Lawley, Trevor D

    2016-01-04

    The Human Pan-Microbe Communities (HPMC) database (http://www.hpmcd.org/) provides a manually curated, searchable, metagenomic resource to facilitate investigation of human gastrointestinal microbiota. Over the past decade, the application of metagenome sequencing to elucidate the microbial composition and functional capacity present in the human microbiome has revolutionized many concepts in basic biology. When sufficient high-quality reference genomes are available, whole-genome metagenomic sequencing can provide direct biological insights and high-resolution classification. The HPMC database provides species-level, standardized phylogenetic classification of over 1,800 human gastrointestinal metagenomic samples. This is achieved by combining a manually curated list of bacterial genomes from human faecal samples with over 21,000 additional reference genomes representing bacteria, viruses, archaea and fungi, with manually curated species classification and enhanced sample metadata annotation. A user-friendly, web-based interface provides the ability to search for (i) microbial groups associated with health or disease state, (ii) health or disease states and community structure associated with a microbial group, (iii) the enrichment of a microbial gene or sequence and (iv) enrichment of a functional annotation. The HPMC database enables detailed analysis of human microbial communities and supports research from basic microbiology and immunology to therapeutic development in human health and disease. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  4. Opportunities and Challenges of Linking Scientific Core Samples to the Geoscience Data Ecosystem

    NASA Astrophysics Data System (ADS)

    Noren, A. J.

    2016-12-01

    Core samples generated in scientific drilling and coring are critical for the advancement of the Earth Sciences. The scientific themes enabled by analysis of these samples are diverse, and include plate tectonics, ocean circulation, Earth-life system interactions (paleoclimate, paleobiology, paleoanthropology), Critical Zone processes, geothermal systems, deep biosphere, and many others, and substantial resources are invested in their collection and analysis. Linking core samples to researchers, datasets, publications, and funding agencies through registration of globally unique identifiers such as International Geo Sample Numbers (IGSNs) offers great potential for advancing several frontiers. These include maximizing sample discoverability, access, reuse, and return on investment; a means for credit to researchers; and documentation of project outputs to funding agencies. Thousands of kilometers of core samples and billions of derivative subsamples have been generated through thousands of investigators' projects, yet the vast majority of these samples are curated at only a small number of facilities. These numbers, combined with the substantial similarity in sample types, make core samples a compelling target for IGSN implementation. However, differences between core sample communities and other geoscience disciplines continue to create barriers to implementation. Core samples involve parent-child relationships spanning 8 or more generations, an exponential increase in sample numbers between levels in the hierarchy, concepts related to depth/position in the sample, requirements for associating data derived from core scanning and lithologic description with data derived from subsample analysis, and publications based on tens of thousands of co-registered scan data points and thousands of analyses of subsamples. These characteristics require specialized resources for accurate and consistent assignment of IGSNs, and a community of practice to establish norms, workflows, and infrastructure to support implementation.
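
    A minimal sketch of the parent-child hierarchy described above, assuming hypothetical IGSN values and simplified depth bookkeeping (real implementations track much richer positional metadata across 8 or more generations):

      # Sketch of the core-sample hierarchy: a hole yields cores, cores yield
      # sections, sections yield subsamples, and so on. IGSNs are hypothetical.
      from dataclasses import dataclass, field
      from typing import List, Optional

      @dataclass
      class CoreSample:
          igsn: str                            # globally unique identifier
          kind: str                            # e.g. "hole", "core", "section", "subsample"
          top_depth_m: Optional[float] = None  # position within the parent, if applicable
          parent: Optional["CoreSample"] = None
          children: List["CoreSample"] = field(default_factory=list)

          def register_child(self, child: "CoreSample") -> "CoreSample":
              child.parent = self
              self.children.append(child)
              return child

          def lineage(self) -> List[str]:
              """Walk up to the top-level parent, e.g. for provenance or citation."""
              node, chain = self, []
              while node:
                  chain.append(node.igsn)
                  node = node.parent
              return chain

      hole = CoreSample("HYP0000001", "hole")
      core = hole.register_child(CoreSample("HYP0000002", "core"))
      section = core.register_child(CoreSample("HYP0000003", "section", top_depth_m=1.5))
      sub = section.register_child(CoreSample("HYP0000004", "subsample", top_depth_m=0.42))
      print(sub.lineage())   # ['HYP0000004', 'HYP0000003', 'HYP0000002', 'HYP0000001']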

  5. Managing the explosion of high resolution topography in the geosciences

    NASA Astrophysics Data System (ADS)

    Crosby, Christopher; Nandigam, Viswanath; Arrowsmith, Ramon; Phan, Minh; Gross, Benjamin

    2017-04-01

    Centimeter- to decimeter-scale 2.5D to 3D sampling of the Earth's surface topography, coupled with the potential for photorealistic coloring of point clouds and texture mapping of meshes, enables a wide range of science applications. Not only is the configuration and state of the surface as imaged valuable, but repeat surveys enable quantification of topographic change (erosion, deposition, and displacement) caused by various geologic processes. We are in an era of ubiquitous point clouds that come from active sources such as laser scanners and radar as well as from passive scene reconstruction via structure from motion (SfM) photogrammetry. With the decreasing costs of high-resolution topography (HRT) data collection, via methods such as SfM and UAS-based laser scanning, the number of researchers collecting these data is increasing. These "long-tail" topographic data are of modest size but great value, and challenges exist to making them widely discoverable, shared, annotated, cited, managed and archived. Presently, there are no central repositories or services to support storage and curation of these datasets. The U.S. National Science Foundation funded OpenTopography (OT) Facility employs cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures, to provide efficient online access to large HRT (mostly lidar) datasets, metadata, and processing tools. With over 225 datasets and 15,000 registered users, OT is well positioned to provide curation for community-collected high-resolution topographic data. OT has developed a "Community DataSpace", a service built on a low-cost storage cloud (e.g. AWS S3), to make it easy for researchers to upload, curate, annotate and distribute their datasets. The system's ingestion workflow will extract metadata from uploaded data; validate it; assign a digital object identifier (DOI); and create a searchable catalog entry, before publishing via the OT portal. The OT Community DataSpace enables wider discovery and utilization of these HRT datasets via the OT portal and sources that federate the OT data catalog, promotes citation, and, most importantly, increases the impact of investments in data by catalyzing scientific discovery.
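
    A schematic sketch of the ingestion workflow described above (extract metadata, validate, assign a DOI, create a catalog entry); the function names, metadata fields, and DOI value are illustrative assumptions, not the actual OpenTopography implementation:

      # Sketch of an upload-ingestion pipeline: extract, validate, mint DOI,
      # catalog. All names and the stub metadata are hypothetical.

      REQUIRED_FIELDS = {"title", "creator", "acquisition_date", "coordinate_system"}

      def extract_metadata(upload_path: str) -> dict:
          # A real system would parse point-cloud headers or sidecar files;
          # here we return a stub record for illustration.
          return {"title": "Example SfM survey", "creator": "A. Researcher",
                  "acquisition_date": "2016-08-01", "coordinate_system": "EPSG:32612"}

      def validate(metadata: dict) -> None:
          missing = REQUIRED_FIELDS - metadata.keys()
          if missing:
              raise ValueError(f"incomplete metadata, missing: {sorted(missing)}")

      def mint_doi(metadata: dict) -> str:
          # Stand-in for a call to a DOI registration service such as DataCite.
          return "10.5069/EXAMPLE"

      def ingest(upload_path: str, catalog: list) -> str:
          metadata = extract_metadata(upload_path)
          validate(metadata)
          metadata["doi"] = mint_doi(metadata)
          catalog.append(metadata)          # searchable catalog entry
          return metadata["doi"]

      catalog: list = []
      print(ingest("/uploads/survey.laz", catalog))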

  6. Optimizing Resources for Trustworthiness and Scientific Impact of Domain Repositories

    NASA Astrophysics Data System (ADS)

    Lehnert, K.

    2017-12-01

    Domain repositories, i.e. data archives tied to specific scientific communities, are widely recognized and trusted by their user communities for ensuring a high level of data quality and for enhancing data value, access, and reuse through a unique combination of disciplinary and digital curation expertise. Their data services are guided by the practices and values of the specific community they serve and are designed to support the advancement of its science. Domain repositories need to meet user expectations for scientific utility in order to be successful, but they also need to fulfill the requirements for trustworthy repository services to be acknowledged by scientists, funders, and publishers as reliable facilities that curate and preserve data following international standards. Domain repositories therefore need to carefully plan and balance investments to optimize the scientific impact of their data services and user satisfaction on the one hand, while maintaining reliable and robust operation of the repository infrastructure on the other. Staying abreast of evolving repository standards in order to certify as a trustworthy repository, and conducting regular self-assessment and certification, alone require resources that compete with the demands of improving data holdings or the usability of systems. The Interdisciplinary Earth Data Alliance (IEDA), a data facility funded by the US National Science Foundation, operates repositories for geochemical, marine geoscience, and Antarctic research data, while also maintaining data products (global syntheses) and data visualization and analysis tools that are of high value to the science community and have demonstrated considerable scientific impact. Balancing the investments in the growth and utility of the syntheses against the resources required for certification of IEDA's repository services has been challenging, and a major self-assessment effort has been difficult to accommodate. IEDA is exploring a partnership model to share generic repository functions (e.g. metadata registration, long-term archiving) with other repositories. This could substantially reduce the effort of certification and allow effort to focus on domain-specific data curation and value-added services.

  7. [Determination of barium in natural curative waters by ICP-OES technique. Part I. Waters taken on the area of health resorts in Poland].

    PubMed

    Garboś, Sławomir; Swiecicka, Dorota

    2011-01-01

    The maximum admissible concentration level (MACL) of barium in natural mineral waters, natural spring waters and potable waters was set at 1 mg/l, while the MACLs for this element in natural curative waters intended for drinking therapies and inhalations were set at 1.0 mg/l and 10.0 mg/l, respectively. These requirements relate to therapies applied for longer than one month. The above-mentioned maximum admissible concentration levels of barium in consumed waters were established taking into account the current criteria of the World Health Organization, which set the guideline value for this element in water intended for human consumption at 0.7 mg/l. In this work, a developed and validated method for the determination of barium by inductively coupled plasma optical emission spectrometry was applied to determine this element in 45 natural curative waters sampled from 24 spa districts in Poland. The determined barium concentrations ranged from 0.0036 mg/l to 24.0 mg/l. Natural curative waters with barium concentrations in the ranges of 0.0036-0.073 mg/l, 0.0036-1.31 mg/l and 0.0036-24.0 mg/l were applied to drinking therapy, inhalations and balneotherapy, respectively (some of the waters analyzed were simultaneously applied to drinking therapy, inhalations and balneotherapy). In the case of 11 natural curative waters, concentrations exceeding the 1 mg/l limit were observed; however, these were classified mainly as waters applied to balneotherapy and, in two cases, to inhalation therapies (barium concentrations of 1.08 mg/l and 1.31 mg/l). The procedure for classifying curative waters for appropriate therapies, based among other things on barium concentration, meets the requirements of the Decree of the Minister of Health of 13 April 2006 on the range of studies indispensable for establishing the medicinal properties of natural curative materials and the curative properties of climate, the criteria for their assessment, and the specimen certificate confirming those properties.
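
    As a worked illustration of the classification logic implied by the quoted limits (1.0 mg/l for drinking therapies, 10.0 mg/l for inhalations; no barium MACL for balneotherapy is quoted), the following sketch checks measured concentrations against those thresholds; real classification of curative waters weighs many constituents, not barium alone:

      # Illustrative check of barium concentrations against the MACLs quoted
      # in the abstract. A sketch only, not a regulatory procedure.

      MACL_MG_L = {"drinking": 1.0, "inhalation": 10.0}

      def permissible_therapies(barium_mg_l: float) -> list:
          """Therapy types whose barium MACL the sample does not exceed."""
          return [t for t, limit in MACL_MG_L.items() if barium_mg_l <= limit]

      # Values chosen to span the reported 0.0036-24.0 mg/l range; a water
      # over both limits would, per the abstract, fall to balneotherapy.
      for conc in (0.0036, 1.31, 24.0):
          print(conc, "mg/l ->", permissible_therapies(conc) or ["balneotherapy only"])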

  8. Sampling and Analysis of Impact Crater Residues Found on the Wide Field Planetary Camera-2 Radiator

    NASA Astrophysics Data System (ADS)

    Anz-Meador, P. D.; Liou, J.-C.; Ross, D.; Robinson, G. A.; Opiela, J. N.; Kearsley, A. T.; Grime, G. W.; Colaux, J. L.; Jeynes, C.; Palitsin, V. V.; Webb, R. P.; Griffin, T. J.; Reed, B. B.; Gerlach, L.

    2013-08-01

    After nearly 16 years in low Earth orbit (LEO), the Wide Field Planetary Camera-2 (WFPC2) was recovered from the Hubble Space Telescope (HST) in May 2009, during the 12-day shuttle mission designated STS-125. The WFPC2 radiator had been struck by approximately 700 impactors producing crater features 300 μm and larger in size. Following optical inspection in 2009, agreement was reached in 2011 for a joint NASA-ESA study of crater residues. Over 480 impact features were extracted at NASA Johnson Space Center's (JSC) Space Exposed Hardware clean-room and curation facility during 2012, and were shared between NASA and ESA. We describe analyses conducted using scanning electron microscopy (SEM) with energy dispersive X-ray spectrometry (EDX), by NASA at JSC's Astromaterials Research and Exploration Science (ARES) Division and for ESA at the Natural History Museum (NHM), together with ion beam analysis (IBA) using a scanned proton microbeam at the University of Surrey Ion Beam Centre (IBC).

  9. Sampling and Analysis of Impact Crater Residues Found on the Wide Field Planetary Camera-2 Radiator

    NASA Technical Reports Server (NTRS)

    Kearsley, A. T.; Grime, G. W.; Colaux, J. L.; Jeynes, C.; Palitsin, V. V.; Webb, R. P.; Griffin, T. J.; Reed, B. B.; Anz-Meador, P. D.; Liou, J.-C.; et al.

    2013-01-01

    After nearly 16 years in low Earth orbit (LEO), the Wide Field Planetary Camera-2 (WFPC2) was recovered from the Hubble Space Telescope (HST) in May 2009, during the 12-day shuttle mission designated STS-125. The WFPC2 radiator had been struck by approximately 700 impactors producing crater features 300 microns and larger in size. Following optical inspection in 2009, agreement was reached in 2011 for a joint NASA-ESA study of crater residues. Over 480 impact features were extracted at NASA Johnson Space Center's (JSC) Space Exposed Hardware clean-room and curation facility during 2012, and were shared between NASA and ESA. We describe analyses conducted using scanning electron microscopy (SEM) with energy dispersive X-ray spectrometry (EDX), by NASA at JSC's Astromaterials Research and Exploration Science (ARES) Division and for ESA at the Natural History Museum (NHM), together with ion beam analysis (IBA) using a scanned proton microbeam at the University of Surrey Ion Beam Centre (IBC).

  10. X-Ray Micro-Computed Tomography of Apollo Samples as a Curation Technique Enabling Better Research

    NASA Technical Reports Server (NTRS)

    Ziegler, R. A.; Almeida, N. V.; Sykes, D.; Smith, C. L.

    2014-01-01

    X-ray micro-computed tomography (micro-CT) is a technique that has long been used in meteorite research, and it is now becoming a more common tool for the curation of meteorites and Apollo samples. Micro-CT is ideally suited to the characterization of astromaterials in the curation process as it can provide textural and compositional information at small spatial resolution rapidly, nondestructively, and without compromising the cleanliness of the samples (e.g., samples can be scanned sealed in Teflon bags). These data can then inform scientists and curators when making and processing future sample requests for meteorites and Apollo samples. Here we present some preliminary results on micro-CT scans of four Apollo regolith breccias. Methods: Portions of four Apollo samples were used in this study: 14321, 15205, 15405, and 60639. All samples were 8-10 cm in their longest dimension and approximately equant. These samples were micro-CT scanned on the Nikon HMXST 225 System at the Natural History Museum in London. Scans were made at 205-220 kV and 135-160 microamps beam current, with an effective voxel size of 21-44 microns. Results: Initial examination of the data identifies a variety of mineral clasts (including sub-voxel FeNi metal grains) and lithic clasts within the regolith breccias. Textural information within some of the lithic clasts was also discernible. Of particular interest was a large basalt clast (approx. 1.3 cc) found within sample 60639, which appears to have a sub-ophitic texture. Additionally, internal void space (e.g., fractures) is readily identifiable. Discussion: It is clear from the preliminary data that micro-CT analyses are able to identify important "new" clasts within the Apollo breccias and to better characterize previously described clasts or igneous samples. For example, the 60639 basalt clast was previously believed to be quite small based on its approx. 0.5 sq cm exposure on the surface of the main mass. These scans, however, show the clast to be approx. 4.5 g (assuming a density of approx. 3.5 g/cc). This is large enough for detailed studies including multiple geochronometers. This basalt clast is of particular interest as it is the largest Apollo 16 basalt and the only mid-TiO2 basalt in the Apollo sample suite. By identifying the location of interesting clasts or grains within a sample, we will be able to make more informed decisions about where to cut a sample in order to best expose clasts of interest for future study. Moreover, knowing the location of internal defects (e.g., fractures) will allow more precise chipping and extraction of clasts or grains. By combining micro-CT scans with compositional techniques like micro X-ray fluorescence (particularly on sawn slabs), we will be able to provide even more comprehensive information to scientists trying to select samples that best fit their scientific needs.
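
    The quoted clast mass follows directly from the scanned volume and the assumed density; as a check on the numbers above:

      m = \rho V \approx 3.5\ \mathrm{g\,cm^{-3}} \times 1.3\ \mathrm{cm^{3}} \approx 4.5\ \mathrm{g}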

  11. Sample Handling Considerations for a Europa Sample Return Mission: An Overview

    NASA Technical Reports Server (NTRS)

    Fries, M. D.; Calaway, M. L.; Evans, C. A.; McCubbin, F. M.

    2015-01-01

    The intent of this abstract is to provide a basic overview of mission requirements for a generic Europan plume sample return mission, based on NASA Curation experience in NASA sample return missions ranging from Apollo to OSIRIS-REx. This should be useful for mission conception and early stage planning. We will break the mission down into Outbound and Return legs and discuss them separately.

  12. GEOMetaCuration: a web-based application for accurate manual curation of Gene Expression Omnibus metadata

    PubMed Central

    Li, Zhao; Li, Jin; Yu, Peng

    2018-01-01

    Metadata curation has become increasingly important for biological discovery and biomedical research because a large amount of heterogeneous biological data is currently freely available. To facilitate efficient metadata curation, we developed an easy-to-use web-based curation application, GEOMetaCuration, for curating the metadata of Gene Expression Omnibus datasets. It can eliminate mechanical operations that consume precious curation time and can help coordinate curation efforts among multiple curators. It improves the curation process by introducing various features that are critical to metadata curation, such as a back-end curation management system and a curator-friendly front-end. The application is based on a commonly used web development framework of Python/Django and is open-sourced under the GNU General Public License V3. GEOMetaCuration is expected to benefit the biocuration community and to contribute to computational generation of biological insights using large-scale biological data. An example use case can be found at the demo website: http://geometacuration.yubiolab.org. Database URL: https://bitbucket.com/yubiolab/GEOMetaCuration PMID:29688376
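
    By way of illustration only (this is not the actual GEOMetaCuration code), a back-end curation management system in Django might track per-curator assignments with a model along these lines; the model and field names are assumptions:

      # Hypothetical sketch of a Django model for coordinating curation work
      # among multiple curators; not taken from the GEOMetaCuration source.
      from django.db import models

      class CurationAssignment(models.Model):
          geo_series = models.CharField(max_length=32)   # e.g. "GSE12345"
          curator = models.CharField(max_length=64)      # assigned curator
          status = models.CharField(
              max_length=16,
              choices=[("pending", "pending"), ("in_review", "in_review"),
                       ("done", "done")],
              default="pending",
          )
          updated = models.DateTimeField(auto_now=True)  # audit trail

          class Meta:
              unique_together = ("geo_series", "curator")  # avoid duplicate effort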

  13. [Cost-effectiveness research in elderly residents in long-term care: prevention is better than cure, but not always cheaper].

    PubMed

    Achterberg, Wilco P; Gussekloo, Jacobijn; van den Hout, Wilbert B

    2015-01-01

    Cost-effectiveness research in elderly residents of long-term care facilities is based on the general principles of cost-effectiveness research; these have been developed primarily from the perspective of relatively healthy adults in curative medicine. These principles are, however, inadequate when evaluating interventions for the frail elderly in long-term care, both in terms of the value attached to the health of patients and the specific decision-making context of the institution. Here we discuss the pitfalls of cost-effectiveness research in long-term care facilities, illustrated by two preventive interventions for prevalent conditions in nursing homes: pressure ulcers and urinary tract infections. These turned out to be effective, but not cost-effective.

  14. Mouse Genome Database: From sequence to phenotypes and disease models

    PubMed Central

    Richardson, Joel E.; Kadin, James A.; Smith, Cynthia L.; Blake, Judith A.; Bult, Carol J.

    2015-01-01

    The Mouse Genome Database (MGD, www.informatics.jax.org) is the international scientific database for genetic, genomic, and biological data on the laboratory mouse to support the research requirements of the biomedical community. To accomplish this goal, MGD provides broad data coverage, serves as the authoritative standard for mouse nomenclature for genes, mutants, and strains, and curates and integrates many types of data from literature and electronic sources. Among the key data sets MGD supports are: the complete catalog of mouse genes and genome features, comparative homology data for mouse and vertebrate genes, the authoritative set of Gene Ontology (GO) annotations for mouse gene functions, a comprehensive catalog of mouse mutations and their phenotypes, and a curated compendium of mouse models of human diseases. Here, we describe the data acquisition process, specifics about MGD's key data areas, methods to access and query MGD data, and outreach and user help facilities. genesis 53:458–473, 2015. © 2015 The Authors. Genesis Published by Wiley Periodicals, Inc. PMID:26150326

  15. Virtual Microscope Views of the Apollo 11 and 12 Lunar Samples

    NASA Technical Reports Server (NTRS)

    Gibson, E. K.; Tindle, A. G.; Kelley, S. P.; Pillinger, J. M.

    2016-01-01

    The Apollo Virtual Microscope, a means of viewing over the Internet polished thin sections of every rock in the Apollo lunar sample collections via software that duplicates many of the functions of a petrological microscope, is described. Images from the Apollo 11 and 12 missions may be viewed at: www.virtualmicroscope.org/content/apollo. Introduction: During the six NASA missions to the Moon from 1969-72, a total of 382 kilograms of rocks and soils, often referred to as "the legacy of Apollo", were collected and returned to Earth. A unique collection of polished thin sections (PTSs) was made from over 400 rocks by the Lunar Sample Curatorial Facility at the Johnson Space Center (JSC), Houston. These materials have been available for loan to approved PIs, but of course they cannot be simultaneously investigated by several researchers unless they are co-located or the sample is passed back and forth between them by mail or hand-carrying, which is inefficient and very risky for irreplaceable material. When The Open University (OU), the world's largest distance-learning higher education establishment, found itself facing a comparable problem (how to supply thousands of undergraduate students with an interactive petrological microscope and a personal set of thin sections), it decided to develop a software tool called the Virtual Microscope (VM). As a result, it is now able to make the unique and precious collection of Apollo specimens universally available as a resource for concurrent study by anybody in the world's Earth and planetary sciences community. Herein, we describe the first steps of a collaborative project between OU and the JSC Curatorial Facility to record a PTS for every lunar rock, beginning with those collected by the Apollo 11 and 12 missions. Method: Production of a virtual microscope dedicated to a particular theme divides into four main parts - photography, image processing, building and assembly of virtual microscope components, and publication on a website. Two large research-quality microscopes are used to collect all the images required for a virtual microscope. The first is part of an integrated package that utilizes Leica PowerMosaic software and a motorised XYZ stage to generate large-area mosaics. It includes a fast acquisition camera and, depending on the PTS size, is normally used to produce seamless mosaic images consisting of 100-500 individual photographs. If the sample is suitable, three mosaics of each sample are recorded - in plane polarised light, between crossed polars, and in reflected light. For the VM to be a true petrological microscope it is necessary to recreate the features of a rotating stage and to perform observations using filters to produce polarised light. Thus the petrological VM includes the capability of seeing changes in optical properties (pleochroism and birefringence) during rotation, allowing mineral identification. The second microscope in the system provides the functions of the rotating stage. To this microscope we have added a robotically controlled motor to acquire seventy-two images (at 5-degree intervals) in plane polarised light and between crossed polars. Processing the images acquired from the two microscopes involves a combination of proprietary software (Photoshop) and our own in-house code. The final stage involves assembling all the components in an HTML5 environment.
Pathfinder investigations: We have undertaken a number of pilot studies to demonstrate the efficacy of the petrological microscope with lunar samples. The first was to make available online the images collected from the Educational Package of Apollo samples provided by NASA to the UK STFC (Science and Technology Facilities Council) for loan as educational material, e.g. for schools. The real PTSs of the samples are now no longer sent out to schools, removing the risks associated with transport and accidental breakage and eliminating the possibility of loss. The availability of lunar sample VM-related material was further extended to include twenty-eight specimens from all of the Apollo missions. Some of these samples were made more generally available through an ibook entitled "Moon Rocks: an introduction to the Geology of the Moon," free from the Apple Bookstore. Research possibilities: Although the Virtual Microscope was originally conceived as a teaching aid, and was later recognised as a means of public outreach and engagement, we now realize that it also has enormous potential as a high-level research tool. Following discussions with the JSC curators, we have received Curation and Analysis Planning Team for Extraterrestrial Materials (CAPTEM) permission to embark on a programme of digitizing the entire lunar sample PTS collection for all three of the above purposes. By the time of the 47th Lunar and Planetary Science Conference (LPSC) we will have completed 81 rocks collected during the Apollo 11 and 12 missions, and the data, with cross-links to the Lunar Sample Compendium, will go live on the Web at the 47th LPSC. The VM images of the Apollo 11 (41 VM images) and 12 (40 VM images) missions can be viewed at: http://www.virtualmicroscope.org/content/apollo. The lunar sample VM will enable large numbers of skilled and unskilled microscopists (professional and amateur researchers, educators and students, enthusiasts and the simply curious) to share the information from a single sample. It will mean that all the PTSs already cut, even historical ones, could be available for new joint investigations or private study. The scientific return from the collection will increase substantially as a result of further debate and discussion. Simultaneously, the VM will remove the need for unnecessary multiple samplings, avoid consignment of delicate, breakable specimens (all of which are priceless) to insecure mail and courier services, and reduce the direct labour and indirect costs, travel budgets, and unproductive travelling time necessary for co-location of collaborating researchers. For the future we have already recognized further potential for virtual technology. There is nothing that a petrologist likes more than to see the original rock as a hand specimen. It is entirely possible to recreate virtual hand specimens with 3-D hardware and software, already developed for viewing fossils, located within the Curatorial Facility, http://curator.jsc.nasa.gov/lunar/lsc/index.cfm.
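
    A sketch of the rotating-stage acquisition loop described above: seventy-two frames at 5-degree intervals, captured in plane-polarised light and between crossed polars. The stage and camera driver callables are hypothetical stand-ins for the robotic controller:

      # Sketch of the 72-frame rotation series (5-degree steps) per polarisation
      # mode. The rotate_stage/grab_frame callables are hypothetical drivers.

      ANGLES = [i * 5 for i in range(72)]      # 0, 5, ..., 355 degrees

      def capture_rotation_series(rotate_stage, grab_frame, modes=("ppl", "xpl")):
          """Return {mode: [frame, ...]} for each polarisation mode."""
          series = {mode: [] for mode in modes}
          for angle in ANGLES:
              rotate_stage(angle)              # move the motorised stage
              for mode in modes:
                  series[mode].append(grab_frame(mode=mode, angle=angle))
          return series

      # Example with dummy drivers that just name the frames:
      frames = capture_rotation_series(lambda a: None,
                                       lambda mode, angle: f"{mode}_{angle:03d}.png")
      print(len(frames["ppl"]), "plane-polarised frames")   # 72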

  16. The IGSN Experience: Successes and Challenges of Implementing Persistent Identifiers for Samples

    NASA Astrophysics Data System (ADS)

    Lehnert, Kerstin; Arko, Robert

    2016-04-01

    Physical samples collected and studied in the Earth sciences represent both a research resource and a research product. As such they need to be properly managed, curated, documented, and cited to ensure re-usability and utility for future science, reproducibility of the data generated by their study, and credit for the funding agencies and researchers who invested substantial resources and intellectual effort in their collection and curation. The use of persistent and unique identifiers and the deposition of metadata in a persistent registry are therefore as important for physical samples as they are for digital data. The International Geo Sample Number (IGSN) is a persistent, globally unique identifier. Its adoption by individual investigators, repository curators, publishers, and data managers is growing rapidly worldwide. This presentation will provide an analysis of the development and implementation path of the IGSN and relevant insights and experiences gained along the way. Development of the IGSN started in 2004 as part of a US NSF-funded project to establish a registry for sample metadata, the System for Earth Sample Registration (SESAR). The initial system provided a centralized solution for users to submit information about their samples and obtain IGSNs and bar codes. Challenges encountered during this initial phase related to defining the scope of the registry, the granularity of registered objects, the responsibilities of the relevant actors, and workflows, and to designing the registry's metadata schema, its user interfaces, and the identifier itself, including its syntax. The most challenging task, though, was to make the IGSN an integral part of personal and institutional sample management, digital management of sample-based data, and data publication on a global scale. Besides convincing individual researchers, curators, editors and publishers, as well as data managers in US and non-US academia and state and federal agencies, the PIs of the SESAR project needed to identify ways to organize, operate, and govern the global registry in the short and the long term. A major breakthrough was achieved at an international workshop in February 2011, at which participants designed a new distributed and scalable architecture for the IGSN with international governance by a membership organization modeled after the DataCite consortium. The founding of the international governing body and implementation organization for the IGSN, the IGSN e.V., took place at the AGU Fall Meeting 2011. Recent progress came at a workshop in September 2015, where stakeholders from both geoscience and life science disciplines drafted a standard IGSN metadata schema for describing samples, to complement the existing schema for registering samples. Consensus was achieved on an essential set of properties to describe a sample's origin and classification, creating a "birth certificate" for the sample. Further consensus was achieved in clarifying that an IGSN may represent exactly one physical sample, sampling feature, or collection of samples, and in aligning the IGSN schema with the existing Observations Data Model (ODM-2). The resulting schema was published online at schema.igsn.org and presented at the AGU Fall Meeting 2015.
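
    A sketch of the kind of "birth certificate" record such a descriptive schema aims at, covering a sample's origin and classification; the field names follow the spirit of the schema published at schema.igsn.org but are illustrative rather than the normative element names:

      # Illustrative "birth certificate" for a sample; values are hypothetical.
      birth_certificate = {
          "igsn": "HYP0000042",                 # hypothetical identifier
          "sample_type": "individual sample",   # one sample, feature, or collection
          "material": "basalt",                 # classification
          "collection_method": "dredge",        # origin: how it was collected
          "collector": "R/V Example, Leg 7",
          "collection_date": "2015-06-14",
          "location": {"latitude": -23.5, "longitude": -115.2, "elevation_m": -2700},
      }
      print(sorted(birth_certificate))          # the essential property set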

  17. Laying the groundwork for NEON's continental-scale ecological research

    NASA Astrophysics Data System (ADS)

    Dethloff, G.; Denslow, M.

    2013-12-01

    The National Ecological Observatory Network (NEON) is designed to examine a suite of ecological issues. Field-collected data from 96 terrestrial and aquatic sites across the U.S. will be combined with remotely sensed data and existing continental-scale data sets. Field collections will include a range of physical and biological types, including soil, sediment, surface water, groundwater, precipitation, plants, animals, insects, and microbes, as well as biological sub-samples such as leaf material, blood and tissue samples, and DNA extracts. Initial data analyses and identifications of approximately 175,000 samples per year will occur at numerous external laboratories when all sites are fully staffed in 2017. Additionally, NEON will archive biotic and abiotic specimens at collections facilities where they will be curated and available for additional analyses by the scientific community. The number of archived specimens is currently estimated to exceed 130,000 per year by 2017. We will detail how NEON is addressing the complexities and challenges around this set of analyses and specimens and how the resulting high-quality data can impact ecological understanding. The raw data returned from external laboratories, quality checked and served by NEON, will be the foundation for many NEON data products. For example, sequence-quality nucleic acids extracted from surface waters, benthic biofilms, and soil samples will be building blocks for data products on microbial diversity. The raw sequence data will also be available for uses such as evolutionary investigations, and the extracts will be archived so others can acquire them for additional research. Currently, NEON is establishing contracts for the analysis and archiving of field-collected samples through 2017. During this period, NEON will gather information on the progress and success of this large-scale effort in order to determine the most effective course to pursue with external facilities. Two areas NEON already knows it must evaluate are the need for geographic expertise in taxonomic identifications and the capacity necessary to handle the volume of samples. NEON is also addressing challenges associated with external entities and the logistics of sample movement, data formatting, data ingestion, and reporting. For example, NEON is considering tools, such as web APIs, that could allow efficient transfer of data from external facilities. Having a standard format in place for those data will be critical to transfer success and quality assessment. NEON is also working on the implementation of quality control measures for diverse analytical and taxonomic processes across laboratories, and is developing an external audit process. Additionally, given NEON's open-access approach, the Network is focused on selecting a sample identification protocol that aids in tracking samples with more involved analytical needs and also allows maximum utility for the scientific community. Given the complex nature and breadth of the project, NEON will be developing novel sample management systems as well as metadata schemas. These efforts ensure integrity and quality from field to external facility to archive for each sample taken, providing high-quality data now and confidence in future research stemming from raw data generated by NEON and its collection specimens.
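
    A sketch of the standard-format checking that becomes critical when external laboratories return results via a web API; the identifier pattern and field names are assumptions for illustration, not NEON's actual ingest schema:

      # Hypothetical validation of a lab-returned record before ingestion.
      import re

      SAMPLE_ID_PATTERN = re.compile(r"^NEON\.[A-Z0-9.]+$")   # assumed ID format

      def validate_lab_record(record: dict) -> list:
          """Return a list of quality problems; an empty list means the record ingests."""
          problems = []
          if not SAMPLE_ID_PATTERN.match(record.get("sampleID", "")):
              problems.append("malformed sampleID")
          if record.get("analysisDate") is None:
              problems.append("missing analysisDate")
          if not isinstance(record.get("value"), (int, float)):
              problems.append("non-numeric result value")
          return problems

      record = {"sampleID": "NEON.D01.HARV.123", "analysisDate": "2017-05-02", "value": 8.1}
      print(validate_lab_record(record))   # [] -> passes the format check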

  18. Influence of Network Structure on Glass Transition Temperature of Elastomers

    PubMed Central

    Bandzierz, Katarzyna; Reuvekamp, Louis; Dryzek, Jerzy; Dierkes, Wilma; Blume, Anke; Bielinski, Dariusz

    2016-01-01

    It is generally believed that only intermolecular, elastically effective crosslinks influence elastomer properties, while the role of intramolecular modifications of the polymer chains is marginalized. The aim of our study was the characterization of the structural parameters of cured elastomers and the determination of their influence on the behavior of the polymer network. For this purpose, styrene-butadiene rubbers (SBR) cured with various curatives, such as DCP, TMTD, TBzTD, Vulcuren®, DPG/S8, CBS/S8, MBTS/S8 and ZDT/S8, were investigated. In every series of samples a broad range of crosslink densities was obtained, in addition to diverse crosslink structures, as determined by equilibrium swelling and thiol-amine analysis. Differential scanning calorimetry (DSC) and dynamic mechanical analysis (DMA) were used to study the glass transition process, and positron annihilation lifetime spectroscopy (PALS) to investigate the size of the free volumes. For all samples, the value of the glass transition temperature (Tg) increased with rising crosslink density, while the free volume size proportionally decreased. The changes in Tg and free volume size show significant differences between the series crosslinked with various curatives. These variations are explained on the basis of the effect of the curatives' structure. Furthermore, basic structure-property relationships are provided, enabling prediction of the effect of curatives on the structural parameters of the network and on some of the resulting properties. It is demonstrated that the applied techniques (DSC, DMA, and PALS) can provide information about the modifications to the polymer chains. Moreover, on the basis of the obtained results, and considering the diverse curatives available nowadays, the usability of the "parts per hundred rubber" (phr) unit is questioned. PMID:28773731
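
    The reported monotonic rise of Tg with crosslink density is consistent with the commonly cited empirical Fox-Loshaek form, shown here as a hedged illustration (this equation is not from the abstract; the constant K is curative- and polymer-dependent, which is where the series-to-series differences above would enter):

      T_g \;=\; T_g^{0} + K\,\nu

    where $T_g^{0}$ is the glass transition temperature of the corresponding uncrosslinked polymer and $\nu$ is the crosslink density.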

  19. MiDAS 2.0: an ecosystem-specific taxonomy and online database for the organisms of wastewater treatment systems expanded for anaerobic digester groups

    PubMed Central

    McIlroy, Simon Jon; Kirkegaard, Rasmus Hansen; McIlroy, Bianca; Nierychlo, Marta; Kristensen, Jannie Munk; Karst, Søren Michael; Albertsen, Mads

    2017-01-01

    Wastewater is increasingly viewed as a resource, with anaerobic digester technology being routinely implemented for biogas production. Characterising the microbial communities involved in wastewater treatment facilities and their anaerobic digesters is considered key to their optimal design and operation. Amplicon sequencing of the 16S rRNA gene allows high-throughput monitoring of these systems. The MiDAS field guide is a public resource providing amplicon sequencing protocols and an ecosystem-specific taxonomic database optimized for use with wastewater treatment facility samples. The curated taxonomy endeavours to provide a genus-level classification for abundant phylotypes, and the online field guide links this identity to published information regarding their ecology, function and distribution. This article describes the expansion of the database resources to cover the organisms of anaerobic digester systems fed primary sludge and surplus activated sludge. The updated database includes descriptions of the abundant genus-level taxa in influent wastewater, activated sludge and anaerobic digesters. Abundance information is also included to allow assessment of the role of emigration in the ecology of each phylotype. MiDAS is intended as a collaborative resource for the progression of research into the ecology of wastewater treatment, by providing a public repository for knowledge that is accessible to all interested in these biotechnologically important systems. Database URL: http://www.midasfieldguide.org PMID:28365734

  20. MortalityPredictors.org: a manually-curated database of published biomarkers of human all-cause mortality

    PubMed Central

    Winslow, Ksenia; Ho, Andrew; Fortney, Kristen; Morgen, Eric

    2017-01-01

    Biomarkers of all-cause mortality are of tremendous clinical and research interest. Because of the long potential duration of prospective human lifespan studies, such biomarkers can play a key role in quantifying human aging and quickly evaluating any potential therapies. Decades of research into mortality biomarkers have resulted in numerous associations documented across hundreds of publications. Here, we present MortalityPredictors.org, a manually-curated, publicly accessible database, housing published, statistically-significant relationships between biomarkers and all-cause mortality in population-based or generally healthy samples. To gather the information for this database, we searched PubMed for appropriate research papers and then manually curated relevant data from each paper. We manually curated 1,576 biomarker associations, involving 471 distinct biomarkers. Biomarkers ranged in type from hematologic (red blood cell distribution width) to molecular (DNA methylation changes) to physical (grip strength). Via the web interface, the resulting data can be easily browsed, searched, and downloaded for further analysis. MortalityPredictors.org provides comprehensive results on published biomarkers of human all-cause mortality that can be used to compare biomarkers, facilitate meta-analysis, assist with the experimental design of aging studies, and serve as a central resource for analysis. We hope that it will facilitate future research into human mortality and aging. PMID:28858850

  1. MortalityPredictors.org: a manually-curated database of published biomarkers of human all-cause mortality.

    PubMed

    Peto, Maximus V; De la Guardia, Carlos; Winslow, Ksenia; Ho, Andrew; Fortney, Kristen; Morgen, Eric

    2017-08-31

    Biomarkers of all-cause mortality are of tremendous clinical and research interest. Because of the long potential duration of prospective human lifespan studies, such biomarkers can play a key role in quantifying human aging and quickly evaluating any potential therapies. Decades of research into mortality biomarkers have resulted in numerous associations documented across hundreds of publications. Here, we present MortalityPredictors.org, a manually-curated, publicly accessible database, housing published, statistically-significant relationships between biomarkers and all-cause mortality in population-based or generally healthy samples. To gather the information for this database, we searched PubMed for appropriate research papers and then manually curated relevant data from each paper. We manually curated 1,576 biomarker associations, involving 471 distinct biomarkers. Biomarkers ranged in type from hematologic (red blood cell distribution width) to molecular (DNA methylation changes) to physical (grip strength). Via the web interface, the resulting data can be easily browsed, searched, and downloaded for further analysis. MortalityPredictors.org provides comprehensive results on published biomarkers of human all-cause mortality that can be used to compare biomarkers, facilitate meta-analysis, assist with the experimental design of aging studies, and serve as a central resource for analysis. We hope that it will facilitate future research into human mortality and aging.

  2. Extending the Reach of IGSN Beyond Earth: Implementing IGSN Registration to Link Nasa's Apollo Lunar Samples and Their Data

    NASA Technical Reports Server (NTRS)

    Todd, Nancy S.

    2016-01-01

    The rock and soil samples returned from the Apollo missions from 1969-72 have supported 46 years of research leading to advances in our understanding of the formation and evolution of the inner Solar System. NASA has been engaged in several initiatives that aim to restore, digitize, and make available to the public existing published and unpublished research data for the Apollo samples. One of these initiatives is a collaboration with IEDA (Interdisciplinary Earth Data Alliance) to develop MoonDB, a lunar geochemical database modeled after PetDB (Petrological Database of the Ocean Floor). In support of this initiative, NASA has adopted the use of IGSN (International Geo Sample Number) to generate persistent, unique identifiers for lunar samples that scientists can use when publishing research data. To facilitate the IGSN registration of the original 2,200 samples and over 120,000 subdivided samples, NASA has developed an application that retrieves sample metadata from the Lunar Curation Database and uses the SESAR API to automate the generation of IGSNs and registration of samples into SESAR (System for Earth Sample Registration). This presentation will describe the work done by NASA to map existing sample metadata to the IGSN metadata and integrate the IGSN registration process into the sample curation workflow, the lessons learned from this effort, and how this work can be extended in the future to help deal with the registration of large numbers of samples.
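
    In the spirit of the automation described above, a batch registration client might look like the following sketch; the endpoint URL, payload fields, and the assumption that the service returns the new IGSN in its JSON response are all illustrative, not the documented SESAR API contract:

      # Hypothetical sketch of batch sample registration against a SESAR-like
      # service; endpoint, fields, and response shape are assumptions.
      import json
      import urllib.request

      REGISTER_URL = "https://example.org/sesar-like/register"   # hypothetical

      def register_sample(metadata: dict, token: str) -> str:
          payload = json.dumps(metadata).encode("utf-8")
          req = urllib.request.Request(
              REGISTER_URL, data=payload, method="POST",
              headers={"Content-Type": "application/json",
                       "Authorization": f"Bearer {token}"},
          )
          with urllib.request.urlopen(req) as resp:
              return json.load(resp)["igsn"]   # assume the service returns the new IGSN

      def register_batch(samples: list, token: str) -> dict:
          """Map each curation-database record to its newly minted identifier."""
          return {s["sample_name"]: register_sample(s, token) for s in samples}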

  3. Extending the Reach of IGSN Beyond Earth: Implementing IGSN Registration to Link NASA's Apollo Lunar Samples and their Data

    NASA Astrophysics Data System (ADS)

    Todd, N. S.

    2016-12-01

    The rock and soil samples returned from the Apollo missions from 1969-72 have supported 46 years of research leading to advances in our understanding of the formation and evolution of the inner Solar System. NASA has been engaged in several initiatives that aim to restore, digitize, and make available to the public existing published and unpublished research data for the Apollo samples. One of these initiatives is a collaboration with IEDA (Interdisciplinary Earth Data Alliance) to develop MoonDB, a lunar geochemical database modeled after PetDB. In support of this initiative, NASA has adopted the use of IGSN (International Geo Sample Number) to generate persistent, unique identifiers for lunar samples that scientists can use when publishing research data. To facilitate the IGSN registration of the original 2,200 samples and over 120,000 subdivided samples, NASA has developed an application that retrieves sample metadata from the Lunar Curation Database and uses the SESAR API to automate the generation of IGSNs and registration of samples into SESAR (System for Earth Sample Registration). This presentation will describe the work done by NASA to map existing sample metadata to the IGSN metadata and integrate the IGSN registration process into the sample curation workflow, the lessons learned from this effort, and how this work can be extended in the future to help deal with the registration of large numbers of samples.

  4. The NASA Ames Life Sciences Data Archive: Biobanking for the Final Frontier

    NASA Technical Reports Server (NTRS)

    Rask, Jon; Chakravarty, Kaushik; French, Alison J.; Choi, Sungshin; Stewart, Helen J.

    2017-01-01

    The NASA Ames Institutional Scientific Collection comprises the Ames Life Sciences Data Archive (ALSDA) and a biospecimen repository, which are responsible for archiving information and non-human biospecimens collected from spaceflight and matching ground control experiments. The ALSDA also manages a biospecimen sharing program, performs curation and long-term storage operations, and facilitates distribution of biospecimens for research purposes via a public website (https://lsda.jsc.nasa.gov). As part of our best practices, a tissue viability testing plan has been developed for the repository, which will assess the quality of samples subjected to long-term storage. We expect that the test results will confirm the usability of the samples, enable broader science community interest, and verify the operational efficiency of the archives. This work will also support NASA open science initiatives and guide the development of NASA directives and policy for the curation of biological collections.

  5. Additive treatment improves survival in elderly patients after non-curative endoscopic resection for early gastric cancer.

    PubMed

    Jung, Da Hyun; Lee, Yong Chan; Kim, Jie-Hyun; Lee, Sang Kil; Shin, Sung Kwan; Park, Jun Chul; Chung, Hyunsoo; Park, Jae Jun; Youn, Young Hoon; Park, Hyojin

    2017-03-01

    Endoscopic resection (ER) is accepted as a curative treatment option for selected cases of early gastric cancer (EGC). Although additional surgery is often recommended for patients who have undergone non-curative ER, clinicians are cautious when managing elderly patients with GC because of comorbid conditions. The aim of the study was to investigate clinical outcomes in elderly patients following non-curative ER with and without additive treatment. Subjects included 365 patients (>75 years old) who were diagnosed with EGC and underwent ER between 2007 and 2015. Clinical outcomes of three patient groups [curative ER (n = 246), non-curative ER with additive treatment (n = 37), non-curative ER without additive treatment (n = 82)] were compared. Among the patients who underwent non-curative ER with additive treatment, 28 received surgery, three underwent a repeat ER, and six received argon plasma coagulation. Patients who underwent non-curative ER alone were significantly older than those who underwent additive treatment. Overall 5-year survival rates in the curative ER, non-curative ER with treatment, and non-curative ER without treatment groups were 84%, 86%, and 69%, respectively. No significant difference in overall survival was found between patients in the curative ER and non-curative ER with additive treatment groups. The non-curative ER groups were categorized by lymph node metastasis risk factors into a high-risk group that exhibited positive lymphovascular invasion (LVI) or deep submucosal invasion greater than SM2, and a low-risk group without these risk factors. The overall 5-year survival rate was lowest (60%) in the high-risk group with non-curative ER and no additive treatment. Elderly patients who underwent non-curative ER with additive treatment showed better survival outcomes than those without treatment. Therefore, especially with LVI or deep submucosal invasion, additive treatment is recommended in patients undergoing non-curative ER, even if they are older than 75 years.

  6. Post-treatment management options for patients with lung cancer.

    PubMed Central

    Virgo, K S; McKirgan, L W; Caputo, M C; Mahurin, D M; Chao, L C; Caputo, N A; Naunheim, K S; Flye, M W; Gillespie, K N; Johnson, F E

    1995-01-01

    OBJECTIVES: The first objective was to identify variations in patient management practice patterns after potentially curative lung cancer surgery. Patient management practice patterns were expected to range from intensive follow-up to no active surveillance. The second objective was to measure whether intensity of follow-up was related to patient outcomes. METHODS: An 18-month retrospective analysis was conducted of 182 patients with low TNM stage (≤ IIIA) lung cancer who were surgically treated with curative intent over the 11-year period from 1982 through 1992 at the St. Louis Department of Veterans Affairs Medical Center. RESULTS: Patients were followed for a mean of 3.3 years, until death or the end of the study. Analyses of diagnostic test and outpatient visit frequency distributions and cluster analyses facilitated the identification of 62 nonintensively followed patients and 120 intensively followed patients. Both groups were comparable at baseline, and there were no significant differences in patient outcomes attributable to intensity of follow-up. Intensively followed patients did, however, live an average of 192 days longer than nonintensively followed patients. CONCLUSIONS: Significant variations in follow-up practice patterns can exist within a single health care facility. In this analysis, variations in test and visit frequency did not result in statistically significant differences in patient outcomes, though the survival difference between groups suggests that some benefit might exist. Only well-designed prospective trials are likely to answer the question of what constitutes optimal follow-up after potentially curative lung cancer treatment. PMID:8526576

  7. Using Data From Ontario's Episode-Based Funding Model to Assess Quality of Chemotherapy.

    PubMed

    Kaizer, Leonard; Simanovski, Vicky; Lalonde, Carlin; Tariq, Huma; Blais, Irene; Evans, William K

    2016-10-01

    A new episode-based funding model for ambulatory systemic therapy was implemented in Ontario, Canada on April 1, 2014, after a comprehensive knowledge transfer and exchange strategy with providers and administrators. An analysis of the data from the first year of the new funding model provided an opportunity to assess the quality of chemotherapy, which was not possible under the old funding model. Options for chemotherapy regimens given with adjuvant/curative intent or palliative intent were informed by input from disease site groups. Bundles were developed and priced to enable evidence-informed best practice. Analysis of systemic therapy utilization after model implementation was performed to assess the concordance rate of the treatments chosen with recommended practice. The actual number of cycles of treatment delivered was also compared with expert recommendations. Significant improvement compared with baseline was seen in the proportion of adjuvant/curative regimens that aligned with disease site group-recommended options (98% v 90%). Similar improvement was seen for palliative regimens (94% v 89%). However, overall, the number of cycles of adjuvant/curative therapy delivered was lower than recommended best practice in 57.5% of patients. There was significant variation by disease site and between facilities. Linking funding to quality, supported by knowledge transfer and exchange, resulted in a rapid improvement in the quality of systemic treatment in Ontario. This analysis has also identified further opportunities for improvement and the need for model refinement.

  8. Canto: an online tool for community literature curation.

    PubMed

    Rutherford, Kim M; Harris, Midori A; Lock, Antonia; Oliver, Stephen G; Wood, Valerie

    2014-06-15

    Detailed curation of published molecular data is essential for any model organism database. Community curation enables researchers to contribute data from their papers directly to databases, supplementing the activity of professional curators and improving coverage of a growing body of literature. We have developed Canto, a web-based tool that provides an intuitive curation interface for both curators and researchers, to support community curation in the fission yeast database, PomBase. Canto supports curation using OBO ontologies, and can be easily configured for use with any species. Canto code and documentation are available under an Open Source license from http://curation.pombase.org/. Canto is a component of the Generic Model Organism Database (GMOD) project (http://www.gmod.org/). © The Author 2014. Published by Oxford University Press.

  9. Patterns of brachytherapy practice for patients with carcinoma of the cervix (1996-1999): a patterns of care study.

    PubMed

    Erickson, Beth; Eifel, Patricia; Moughan, Jennifer; Rownd, Jason; Iarocci, Thomas; Owen, Jean

    2005-11-15

    To analyze the details of brachytherapy practice in patients treated for carcinoma of the cervix in the United States between 1996 and 1999. Radiation facilities were selected from a stratified random sample. Patients were randomly selected from lists of eligible patients treated at each facility. A total of 442 patients' records were reviewed in 59 facilities to obtain data about patients' characteristics, evaluation, tumor extent, and treatment. National estimates were made using weights that reflected the relative contribution of each institution and of each patient within the sampled institutions. From our survey we estimate that 16,375 patients were treated in the United States during this study period. Unless otherwise specified, brachytherapy practice was based on the 408 patients who had their brachytherapy or all their treatment at the surveyed facility. A total of 91.5% of patients underwent brachytherapy at the initial treating institution; 8.5% were referred to a second site for brachytherapy. Forty-two percent of U.S. facilities referred at least some patients to a second facility for brachytherapy. Of U.S. facilities that treated ≤2 eligible patients per year, 61% referred all of their patients to a second facility for brachytherapy or treated with external RT alone; none of the U.S. facilities with larger experience (>2 eligible patients per year) referred all their patients to a second facility for brachytherapy treatment, but 28% referred some patients to an outside facility for brachytherapy. Overall, 94% of patients who completed treatment with curative intent received brachytherapy. Of these patients who had brachytherapy, 77.8%, 13.3%, and 0.9%, respectively, were treated with low-dose-rate (LDR), high-dose-rate (HDR), or a combination of HDR and LDR brachytherapy; 7.9% had interstitial brachytherapy (5.7% LDR, 1.9% HDR, and 0.3% mixed). In facilities that treated >2 patients per year, 15.5% and 9.4% of brachytherapy procedures included HDR or interstitial treatment, respectively; in facilities that treated fewer patients, 3.4% had HDR brachytherapy, and only 1.2% had interstitial brachytherapy. Patients treated with LDR intracavitary radiotherapy had one (23.5%), two (74.1%), or three (2.4%) implants. For patients treated with curative intent who completed radiation therapy with LDR intracavitary radiation therapy without hysterectomy, the median brachytherapy dose to Point A was 40.3 Gy, and the median total dose to Point A was 82.9 Gy. Patients were treated with HDR intracavitary radiation therapy using a variety of treatment schedules of 1-2 fractions (7.5%), 3-4 fractions (17.4%), 5-6 fractions (38.5%), 7-9 fractions (33.5%), or 12 fractions (3%). Fraction sizes were <500 cGy (29.5%), 500-<600 cGy (25.2%), 600 cGy (28.1%), >600 cGy (8%), or unknown (9.2%). For patients treated with HDR, the median total dose to Point A (corrected for fraction size using an alpha/beta = 10) was 85.8 Gy (range: 56.2-116.1 Gy). At institutions treating <500 new patients per year, the percentage of patients receiving a brachytherapy dose <40 Gy was significantly higher than at institutions treating ≥500 new patients per year (p < 0.0001). For LDR intracavitary radiation therapy, 5.8% had neither bladder nor rectal doses recorded for any of their implants, whereas for HDR intracavitary radiation therapy, 73.4% had neither bladder nor rectal doses recorded for any of their implants. 
The median total duration of radiation therapy was identical for patients who had HDR or LDR intracavitary radiation therapy (57 days). For LDR, at institutions treating <500 new patients per year the percentage of patients with treatment duration >56 days was significantly greater than at institutions treating ≥500 new patients per year (p = 0.002). Of the patients who had LDR intracavitary radiation therapy implants, 65% were treated using tandem and shielded Fletcher-Suit-Delclos colpostats; other patients had mini ovoids (10.9%), cylinders (3.9%), Henschke applicators (3.7%), or other/mixed applicators (16.5%). In contrast, of patients treated with HDR intracavitary radiation therapy, 68.7% had tandem and rings, 18.2% Fletcher-Suit-Delclos ovoids, 7.5% mini ovoids, 2.3% cylinders, and 3.2% other or mixed applicators. The median duration of treatment and median Point A dose were very similar for patients treated with HDR or LDR. Patients with HDR were treated using a variety of treatment schedules. Different applicator types were favored for LDR vs. HDR. Of patients treated with HDR, 73.4% had no brachytherapy bladder or rectal doses recorded, suggesting that full dosimetric calculations were performed only for the first fraction in many institutions. Facility size had a significant impact on referral to another institution for brachytherapy, on brachytherapy dose, and on treatment duration.
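
    The fraction-size correction quoted above for HDR Point A doses is conventionally performed with the linear-quadratic model, converting a schedule of n fractions of size d into an equivalent dose in 2-Gy fractions; the abstract does not spell out its formula, so the standard form is assumed here:

      \mathrm{EQD_2} \;=\; n\,d \cdot \frac{d + \alpha/\beta}{2\,\mathrm{Gy} + \alpha/\beta}

    For example, six fractions of 6 Gy with alpha/beta = 10 Gy give 36 Gy × (16/12) = 48 Gy, which would then be added to the external-beam contribution to obtain a corrected total Point A dose.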

  10. Genesis Contingency Planning and Mishap Recovery: The Sample Curation View

    NASA Technical Reports Server (NTRS)

    Stansbery, E. K.; Allton, J. H.; Allen, C. C.; McNamara, K. M.; Calaway, M.; Rodriques, M. C.

    2007-01-01

    Planning for sample preservation and curation was part of mission design from the beginning. One of the scientific objectives for Genesis included collecting samples of three regimes of the solar wind in addition to collecting bulk solar wind during the mission. Collectors were fabricated in different thicknesses for each regime of the solar wind and attached to separate frames exposed to the solar wind during specific periods of solar activity associated with each regime. The original plan to determine the solar regime sampled for specific collectors was to identify to which frame the collector was attached. However, the collectors were dislodged during the hard landing, making identification by frame attachment impossible. Because regimes were also identified by thickness of the collector, the regime sampled is identified by measuring fragment thickness. A variety of collector materials and thin films applied to substrates were selected and qualified for flight. This diversity provided elemental measurement in more than one material, mitigating effects of diffusion rates and/or radiation damage. It also mitigated against different material and substrate strengths resulting in differing effects of the hard landing. For example, silicon crystal substrates broke into smaller fragments than sapphire-based substrates, and diamond surfaces were more resilient to flying debris damage than gold. The primary responsibility of the curation team for recovery was process documentation. Contingency planning for the recovery phase expanded this responsibility to include equipment not only to document, but also to gather, contain, and identify samples from the landing area and the recovered spacecraft. The team developed contingency plans for various scenarios as part of mission planning that included topographic maps to aid in site recovery and identification of different modes of transport and purge capability depending on damage. A clean tent, set up at the Utah Test & Training Range for processing the sample return capsule and cleanly installing a nitrogen purge on the canister, was also used to control the environment while extracting collector fragments from the damaged canister and to document and package over 10,000 collector fragments.
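    Because regime identification reduces to a thickness lookup, the recovery logic can be sketched in a few lines; the thickness bands below are purely hypothetical placeholders, since the abstract does not give the flight wafer thicknesses:

        # Classify a Genesis collector fragment by measured thickness.
        # The regime -> thickness mapping is ILLUSTRATIVE ONLY; the actual
        # flight values are not stated in the abstract.
        HYPOTHETICAL_BANDS_UM = {
            "bulk solar wind": (690.0, 710.0),
            "fast (coronal hole)": (640.0, 660.0),
            "slow (interstream)": (590.0, 610.0),
        }

        def classify_fragment(thickness_um: float, tolerance_um: float = 5.0) -> str:
            """Return the solar-wind regime whose band contains the measurement."""
            for regime, (low, high) in HYPOTHETICAL_BANDS_UM.items():
                if low - tolerance_um <= thickness_um <= high + tolerance_um:
                    return regime
            return "unidentified (outside all bands)"

        print(classify_fragment(652.3))  # -> "fast (coronal hole)" under these assumptions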

  11. The Role of Community-Driven Data Curation for Enterprises

    NASA Astrophysics Data System (ADS)

    Curry, Edward; Freitas, Andre; O'Riáin, Sean

    With increased utilization of data within their operational and strategic processes, enterprises need to ensure data quality and accuracy. Data curation is a process that can ensure the quality of data and its fitness for use. Traditional approaches to curation are struggling with increased data volumes and near-real-time demands for curated data. In response, curation teams have turned to community crowd-sourcing and semi-automated metadata tools for assistance. This chapter provides an overview of data curation, discusses the business motivations for curating data and investigates the role of community-based data curation, focusing on internal communities and pre-competitive data collaborations. The chapter is supported by case studies from Wikipedia, The New York Times, Thomson Reuters, Protein Data Bank and ChemSpider, from which best practices for both social and technical aspects of community-driven data curation are derived.

  12. The Opera Instrument: An Advanced Curation Development for Mars Sample Return Organic Contamination Monitoring

    NASA Technical Reports Server (NTRS)

    Fries, M. D.; Fries, W. D.; McCubbin, F. M.; Zeigler, R. A.

    2018-01-01

    Mars Sample Return (MSR) requires strict organic contamination control (CC) and contamination knowledge (CK) as outlined by the Mars 2020 Organic Contamination Panel (OCP). This includes a need to monitor surficial organic contamination to a ng/sq. cm sensitivity level. Achieving and maintaining this degree of surface cleanliness is difficult but has been demonstrated. MSR's CK effort will be very important because all returned samples will be studied thoroughly and in minute detail. Consequently, accurate CK must be collected and characterized to best interpret scientific results from the returned samples. The CK data are not only required to make accurate measurements and interpretations for carbon-depleted martian samples, but also to strengthen the validity of science investigations performed on the samples. The Opera instrument prototype is intended to fulfill a CC/CK role in the assembly, cleaning, and overall contamination history of hardware used in the MSR effort, from initial hardware assembly through post-flight sample curation. Opera is intended to monitor particulate and organic contamination using quartz crystal microbalances (QCMs), in a self-contained, portable, cleanroom-compliant package. The Opera prototype is in initial development, capable of approximately 100 ng/sq. cm organic contamination sensitivity, with additional development planned to achieve 1 ng/sq. cm. The Opera prototype was funded by the 2017 NASA Johnson Space Center Innovation Charge Account (ICA), which provides funding for small, short-term projects.
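    The abstract does not give Opera's crystal parameters, but the scale of the sensitivity challenge can be illustrated with the standard Sauerbrey relation, here evaluated for an assumed 5 MHz AT-cut quartz crystal:

        \Delta f = -\frac{2 f_0^2}{\sqrt{\rho_q \mu_q}}\,\frac{\Delta m}{A}
        \approx -C_f\,\frac{\Delta m}{A},
        \qquad
        C_f \approx 56.6\ \mathrm{Hz\,cm^2/\mu g}\ \text{at } f_0 = 5\ \mathrm{MHz},

    so a 1 ng/sq. cm organic load corresponds to a frequency shift of only about 0.06 Hz, which suggests why reaching that target demands substantially more development than the current approximately 100 ng/sq. cm capability.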

  13. A curated compendium of monocyte transcriptome datasets of relevance to human monocyte immunobiology research

    PubMed Central

    Rinchai, Darawan; Boughorbel, Sabri; Presnell, Scott; Quinn, Charlie; Chaussabel, Damien

    2016-01-01

    Systems-scale profiling approaches have become widely used in translational research settings. The resulting accumulation of large-scale datasets in public repositories represents a critical opportunity to promote insight and foster knowledge discovery. However, resources that can serve as an interface between biomedical researchers and such vast and heterogeneous dataset collections are needed in order to fulfill this potential. Recently, we have developed an interactive data browsing and visualization web application, the Gene Expression Browser (GXB). This tool can be used to overlay deep molecular phenotyping data with rich contextual information about analytes, samples and studies along with ancillary clinical or immunological profiling data. In this note, we describe a curated compendium of 93 public datasets generated in the context of human monocyte immunological studies, representing a total of 4,516 transcriptome profiles. Datasets were uploaded to an instance of GXB along with study description and sample annotations. Study samples were arranged in different groups. Ranked gene lists were generated based on relevant group comparisons. This resource is publicly available online at http://monocyte.gxbsidra.org/dm3/landing.gsp. PMID:27158452

  14. The Origin of Amino Acids in Lunar Regolith Samples

    NASA Technical Reports Server (NTRS)

    Cook, Jamie E.; Callahan, Michael P.; Dworkin, Jason P.; Glavin, Daniel P.; McLain, Hannah L.; Noble, Sarah K.; Gibson, Everett K., Jr.

    2016-01-01

    We analyzed the amino acid content of seven lunar regolith samples returned by the Apollo 16 and Apollo 17 missions and stored under NASA curation since collection using ultrahigh-performance liquid chromatography with fluorescence detection and time-of-flight mass spectrometry. Consistent with results from initial analyses shortly after collection in the 1970s, we observed amino acids at low concentrations in all of the curated samples, ranging from 0.2 parts-per-billion (ppb) to 42.7 ppb in hot-water extracts and 14.5 ppb to 651.1 ppb in 6M HCl acid-vapor-hydrolyzed, hot-water extracts. Amino acids identified in the Apollo soil extracts include glycine, D- and L-alanine, D- and L-aspartic acid, D- and L-glutamic acid, D- and L-serine, L-threonine, and L-valine, all of which had previously been detected in lunar samples, as well as several compounds not previously identified in lunar regoliths: α-aminoisobutyric acid (AIB), D- and L-α-amino-n-butyric acid (α-ABA), DL-β-amino-n-butyric acid, γ-amino-n-butyric acid, β-alanine, and ε-amino-n-caproic acid. We observed an excess of the L enantiomer in most of the detected proteinogenic amino acids, but racemic alanine and racemic β-ABA were present in some samples.

  15. HEALTH CARE SERVICES IN SAUDI ARABIA: PAST, PRESENT AND FUTURE

    PubMed Central

    Sebai, Zohair A.; Milaat, Waleed A.; Al-Zulaibani, Abdulmohsen A.

    2001-01-01

    Health services in Saudi Arabia have developed enormously over the last two decades, as evidenced by the availability of health facilities throughout all parts of the vast Kingdom. The Saudi Ministry of Health (MOH) provides over 60% of these services, while the rest are shared among other government agencies and the private sector. A series of development plans in Saudi Arabia have established the infrastructure for the expansion of curative services all over the country. Rapid development in medical education and the training of future Saudi health manpower have also taken place. Future challenges facing the Saudi health system are to be addressed in order to achieve the ambitious goals set by the most recent health development plan. These include the optimum utilization of current health resources with competent health managerial skills, the search for alternative means of financing these services, the maintenance of a balance between curative and preventive services, the expansion of training of Saudi health manpower to meet the increasing demand, and the implementation of a comprehensive primary health care program. PMID:23008647

  16. CoINcIDE: A framework for discovery of patient subtypes across multiple datasets.

    PubMed

    Planey, Catherine R; Gevaert, Olivier

    2016-03-09

    Patient disease subtypes have the potential to transform personalized medicine. However, many patient subtypes derived from unsupervised clustering analyses on high-dimensional datasets are not replicable across multiple datasets, limiting their clinical utility. We present CoINcIDE, a novel methodological framework for the discovery of patient subtypes across multiple datasets that requires no between-dataset transformations. We also present a high-quality database collection, curatedBreastData, with over 2,500 breast cancer gene expression samples. We use CoINcIDE to discover novel breast and ovarian cancer subtypes with prognostic significance and novel hypothesized ovarian therapeutic targets across multiple datasets. CoINcIDE and curatedBreastData are available as R packages.

  17. Perceptions of health stakeholders on task shifting and motivation of community health workers in different socio demographic contexts in Kenya (nomadic, peri-urban and rural agrarian)

    PubMed Central

    2014-01-01

    Background The shortage of health professionals in low income countries is recognized as a crisis. Community health workers are part of a “task-shift” strategy to address this crisis. Task shifting in this paper refers to the delegation of tasks from health professionals to lay, trained volunteers. In Kenya, there is a debate as to whether these volunteers should be compensated, and what motivation strategies would be effective in different socio-demographic contexts, based on the type of tasks shifted. The purpose of this study was to find out, from stakeholders’ perspectives, the type of tasks to be shifted to community health workers and the appropriate strategies to motivate and retain them. Methods This was an analytical comparative study employing qualitative methods: key informant interviews with health policy makers, managers, and service providers, and focus group discussions with community health workers and service consumers, to explore their perspectives on tasks to be shifted and appropriate motivation strategies. Results The study found that there were tasks to be shifted and motivation strategies that were common to all three contexts. Common tasks were promotive, preventive, and simple curative services. Common motivation strategies were supportive supervision, means of identification, equitable allocation of resources, training, compensation, recognition, and evidence-based community dialogue. Further, in the nomadic and peri-urban sites, community health workers had assumed curative services beyond the range provided for in the Kenyan task shifting policy. This was explained to be influenced by lack of access to care due to distance to health facilities, population movement, and scarcity of health providers in the nomadic setting, and by the harsh economic realities of the peri-urban setting. Therefore, their motivation strategies included training on curative skills, technical support, and resources for curative care. Data collection was viewed as an important task in the rural site, but was not recognized as a priority in the nomadic and peri-urban sites, where workers sought monetary compensation for data collection. Conclusions The study concluded that inclusion of curative tasks for community health workers, particularly in nomadic contexts, is inevitable but raises the need for accreditation of their training and regulation of their tasks. PMID:25079588

  18. IEDA: Making Small Data BIG Through Interdisciplinary Partnerships Among Long-tail Domains

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Carbotte, S. M.; Arko, R. A.; Ferrini, V. L.; Hsu, L.; Song, L.; Ghiorso, M. S.; Walker, D. J.

    2014-12-01

    The Big Data world in the Earth Sciences so far exists primarily for disciplines that generate massive volumes of observational or computed data using large-scale, shared instrumentation such as global sensor networks, satellites, or high-performance computing facilities. These data are typically managed and curated by well-supported community data facilities that also provide the tools for exploring the data through visualization or statistical analysis. In many other domains, especially those where data are primarily acquired by individual investigators or small teams (known as 'Long-tail data'), data are poorly shared and integrated, lacking a community-based data infrastructure that ensures persistent access, quality control, standardization, and integration of data, as well as appropriate tools to fully explore and mine the data within the context of broader Earth Science datasets. IEDA (Integrated Earth Data Applications, www.iedadata.org) is a data facility funded by the US NSF to develop and operate data services that support data stewardship throughout the full life cycle of observational data in the solid earth sciences, with a focus on the data management needs of individual researchers. IEDA builds on a strong foundation of mature disciplinary data systems for marine geology and geophysics, geochemistry, and geochronology. These systems have dramatically advanced data resources in those long-tail Earth science domains. IEDA has strengthened these resources by establishing a consolidated, enterprise-grade infrastructure that is shared by the domain-specific data systems, and implementing joint data curation and data publication services that follow community standards. In recent years, other domain-specific data efforts have partnered with IEDA to take advantage of this infrastructure and improve data services to their respective communities with formal data publication, long-term preservation of data holdings, and better sustainability. IEDA hopes to foster such partnerships with streamlined data services, including user-friendly, single-point interfaces for data submission, discovery, and access across the partner systems to support interdisciplinary science.

  19. Successful treatment of sodium oxalate induced urolithiasis with Helichrysum flowers.

    PubMed

    Onaran, Metin; Orhan, Nilüfer; Farahvash, Amirali; Ekin, Hasya Nazlı; Kocabıyık, Murat; Gönül, İpek Işık; Şen, İlker; Aslan, Mustafa

    2016-06-20

    Helichrysum (Asteraceae) flowers, known as "altın otu, yayla çiçeği, kudama çiçeği", are widely used to remove kidney stones and for their diuretic properties in Turkey. To determine the curative effect of infusions prepared from capitulums of Helichrysum graveolens (M. Bieb.) Sweet (HG) and H. stoechas ssp. barellieri (Ten.) Nyman (HS) on sodium oxalate induced kidney stones. Infusions prepared from the capitulums of HG and HS were tested for their curative effect on calcium oxalate deposition induced by sodium oxalate (70 mg/kg i.p.). Following the injection of sodium oxalate for 5 days, plant extracts were administered to rats at two different doses. Potassium citrate was used as positive control. Water intake, urine volume, and body, liver and kidney weights were measured; biochemical and hematological analyses were conducted on urine and blood samples. Additionally, histopathological examinations were done on kidney samples. The H. stoechas extract showed a prominent effect at the 156 mg/kg dose (stone formation score: 0.33), whereas the number of kidney stones was highest in the sodium oxalate group (stone formation score: 2.33). The reduction in the uric acid and oxalate levels of urine samples and the elevation in the urine citrate levels are significant and promising in the extract groups. Some hematological, biochemical and enzymatic markers were also ameliorated by the extracts. This is the first report on the curative effect of immortelle flowers. Our preliminary study indicated that Helichrysum extracts may be used for the treatment of urolithiasis and may be an alternative therapy to potassium citrate for patients suffering from kidney stones. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. Data Curation Is for Everyone! The Case for Master's and Baccalaureate Institutional Engagement with Data Curation

    ERIC Educational Resources Information Center

    Shorish, Yasmeen

    2012-01-01

    This article describes the fundamental challenges to data curation, how these challenges may be compounded for smaller institutions, and how data management is an essential and manageable component of data curation. Data curation is often discussed within the confines of large research universities. As a result, master's and baccalaureate…

  1. MiDAS 2.0: an ecosystem-specific taxonomy and online database for the organisms of wastewater treatment systems expanded for anaerobic digester groups.

    PubMed

    McIlroy, Simon Jon; Kirkegaard, Rasmus Hansen; McIlroy, Bianca; Nierychlo, Marta; Kristensen, Jannie Munk; Karst, Søren Michael; Albertsen, Mads; Nielsen, Per Halkjær

    2017-01-01

    Wastewater is increasingly viewed as a resource, with anaerobic digester technology being routinely implemented for biogas production. Characterising the microbial communities involved in wastewater treatment facilities and their anaerobic digesters is considered key to their optimal design and operation. Amplicon sequencing of the 16S rRNA gene allows high-throughput monitoring of these systems. The MiDAS field guide is a public resource providing amplicon sequencing protocols and an ecosystem-specific taxonomic database optimized for use with wastewater treatment facility samples. The curated taxonomy endeavours to provide a genus-level classification for abundant phylotypes, and the online field guide links this identity to published information regarding their ecology, function and distribution. This article describes the expansion of the database resources to cover the organisms of anaerobic digester systems fed primary sludge and surplus activated sludge. The updated database includes descriptions of the abundant genus-level taxa in influent wastewater, activated sludge and anaerobic digesters. Abundance information is also included to allow assessment of the role of emigration in the ecology of each phylotype. MiDAS is intended as a collaborative resource for the progression of research into the ecology of wastewater treatment, by providing a public repository for knowledge that is accessible to all interested in these biotechnologically important systems. http://www.midasfieldguide.org. © The Author(s) 2017. Published by Oxford University Press.

  2. Recommendations for Locus-Specific Databases and Their Curation

    PubMed Central

    Cotton, R.G.H.; Auerbach, A.D.; Beckmann, J.S.; Blumenfeld, O.O.; Brookes, A.J.; Brown, A.F.; Carrera, P.; Cox, D.W.; Gottlieb, B.; Greenblatt, M.S.; Hilbert, P.; Lehvaslaiho, H.; Liang, P.; Marsh, S.; Nebert, D.W.; Povey, S.; Rossetti, S.; Scriver, C.R.; Summar, M.; Tolan, D.R.; Verma, I.C.; Vihinen, M.; den Dunnen, J.T.

    2009-01-01

    Expert curation and complete collection of mutations in genes that affect human health is essential for proper genetic healthcare and research. Expert curation is given by the curators of gene-specific mutation databases or locus-specific databases (LSDBs). While there are over 700 such databases, they vary in their content, completeness, time available for curation, and the expertise of the curator. Curation and LSDBs have been discussed, written about, and protocols have been provided for over 10 years, but there have been no formal recommendations for the ideal form of these entities. This work initiates a discussion on this topic to assist future efforts in human genetics. Further discussion is welcome. PMID:18157828

  3. Recommendations for locus-specific databases and their curation.

    PubMed

    Cotton, R G H; Auerbach, A D; Beckmann, J S; Blumenfeld, O O; Brookes, A J; Brown, A F; Carrera, P; Cox, D W; Gottlieb, B; Greenblatt, M S; Hilbert, P; Lehvaslaiho, H; Liang, P; Marsh, S; Nebert, D W; Povey, S; Rossetti, S; Scriver, C R; Summar, M; Tolan, D R; Verma, I C; Vihinen, M; den Dunnen, J T

    2008-01-01

    Expert curation and complete collection of mutations in genes that affect human health is essential for proper genetic healthcare and research. Expert curation is given by the curators of gene-specific mutation databases or locus-specific databases (LSDBs). While there are over 700 such databases, they vary in their content, completeness, time available for curation, and the expertise of the curator. Curation and LSDBs have been discussed, written about, and protocols have been provided for over 10 years, but there have been no formal recommendations for the ideal form of these entities. This work initiates a discussion on this topic to assist future efforts in human genetics. Further discussion is welcome. (c) 2007 Wiley-Liss, Inc.

  4. Data Curation: Improving Environmental Health Data Quality.

    PubMed

    Yang, Lin; Li, Jiao; Hou, Li; Qian, Qing

    2015-01-01

    With the growing recognition of the influence of climate change on human health, scientists have turned their attention to analyzing the relationship between meteorological factors and adverse health effects. However, the paucity of high-quality integrated data is one of the great challenges, especially when scientific studies rely on data-intensive computing. This paper aims to design an appropriate curation process to address this problem. We present a data curation workflow that: (i) follows the guidance of the DCC Curation Lifecycle Model; (ii) combines manual curation with automatic curation; and (iii) solves the environmental health data curation problem. The workflow was applied to a medical knowledge service system and showed that it was capable of improving work efficiency and data quality.
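    A minimal sketch of the manual-plus-automatic split that the workflow combines, with invented validation rules and field names (the paper does not publish its rule set):

        # Route records through automatic checks first; anything that fails
        # goes to a manual-review queue for a human curator.
        # The range check and field name below are hypothetical examples.
        from typing import Callable

        def temperature_in_range(record: dict) -> bool:
            value = record.get("temperature_c")
            return value is not None and -90.0 <= value <= 60.0

        AUTOMATIC_CHECKS: list[Callable[[dict], bool]] = [temperature_in_range]

        def curate(records: list[dict]) -> tuple[list[dict], list[dict]]:
            accepted, manual_review = [], []
            for rec in records:
                if all(check(rec) for check in AUTOMATIC_CHECKS):
                    accepted.append(rec)       # passed every automatic check
                else:
                    manual_review.append(rec)  # needs human curation
            return accepted, manual_review

        ok, queue = curate([{"temperature_c": 21.5}, {"temperature_c": 310.0}])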

  5. Over 5,600 Japanese collection of Antarctic meteorites: Recoveries, curation and distribution

    NASA Technical Reports Server (NTRS)

    Yanai, K.; Kojima, H.

    1986-01-01

    The history of recovery of meteorite fragments in the Yamato Mountains, Allan Hills, and Victoria Land, Antarctica is reviewed. The Japanese collection of Antarctic meteorites were numbered, weighed, photographed, identified, and classified. Sample distribution of the Japanese Antarctic meteorites is described.

  6. Advancing Site-Based Data Curation for Geobiology: The Yellowstone Exemplar (Invited)

    NASA Astrophysics Data System (ADS)

    Palmer, C. L.; Fouke, B. W.; Rodman, A.; Choudhury, G. S.

    2013-12-01

    While advances in the management and archiving of scientific digital data are proceeding apace, there is an urgent need for data curation services to collect and provide access to high-value data fit for reuse. The Site-Based Data Curation (SBDC) project is establishing a framework of guidelines and processes for the curation of research data generated at scientifically significant sites. The project is a collaboration among information scientists, geobiologists, data archiving experts, and resource managers at Yellowstone National Park (YNP). Based on our previous work with the Data Conservancy on indicators of value for research data, several factors made YNP an optimal site for developing the SBDC framework, including unique environmental conditions, a permitting process for data collection, and opportunities for geo-located longitudinal data and multiple data sources for triangulation and context. Stakeholder analysis is informing the SBDC requirements, through engagement with geologists, geochemists, and microbiologists conducting research at YNP and personnel from the Yellowstone Center for Resources and other YNP units. To date, results include data value indicators specific to site-based research, minimum and optimal parameters for data description and metadata, and a strategy for organizing data around sampling events. New value indicators identified by the scientists include ease of access to park locations for verification and correction of data, and stable environmental conditions important for controlling variables. Researchers see high potential for data aggregated from the many individual investigators conducting permitted research at YNP; however, reuse is clearly contingent on detailed and consistent sampling records. Major applications of SBDC include identifying connections in dynamic systems, spatial-temporal synthesis, analyzing variability within and across geological features, tracking site evolution, assessing anomalies, and greater awareness of complementary research and opportunities for collaboration. Moreover, making evident the range of available YNP data will inform what should be explored next, even beyond YNP. Like funding agencies and policy makers, YNP researchers and resource managers are invested in data curation for strategic purposes related to the big picture and efficiency of science. For the scientists, YNP represents an ideal, protected natural system that can serve as an indicator of world events, and SBDC provides the ability to ask and answer broader research questions and leverage an extensive store of highly applicable data. SBDC affords YNP improved coordination and transparency of data collection activities, and easier identification of trends and connections across projects. SBDC capabilities that support broader inquiry and better coordination of scientific effort have clear implications for data curation at other research-intensive sites, and may also inform how data systems can provide strategic assistance to science more generally.

  7. Digital Curation and Digital Literacy: Evaluating the Role of Curation in Developing Critical Literacies for Participation in Digital Culture

    ERIC Educational Resources Information Center

    Mihailidis, Paul

    2015-01-01

    Despite the increased role of digital curation tools and platforms in the daily life of social network users, little research has focused on the competencies and dispositions that young people develop to effectively curate content online. This paper details the results of a mixed method study exploring the curation competencies of young people in…

  8. [Effect of jian-gan-le on advanced schistosomiasis].

    PubMed

    He, Zheng-Wen; Wang, You-Bin; Huang, Wen-Jun

    2011-06-01

    A total of 80 cases of advanced schistosomiasis were selected and divided into an experiment group and a control group, 40 cases each, by the random sampling method. The patients in the experiment group were administered Jian-gan-le, and the patients in the control group received compound purple granules. In the experiment group, the curative rate was 25.0%, the improving rate was 70.0%, the inefficacy rate was 5.0%, and the efficiency rate was 95.0%. In the control group, the curative rate was 12.5%, the improving rate was 75.0%, and the inefficacy rate was 12.5%. There was no statistically significant difference between the two groups (all P > 0.05). Treatment costs were lower in the experiment group than in the control group.

  9. Apollo Lunar Sample Photograph Digitization Project Update

    NASA Technical Reports Server (NTRS)

    Todd, N. S.; Lofgren, G. E.

    2012-01-01

    This is an update on the progress of a 4-year data restoration project, funded by the LASER program and undertaken by the Astromaterials Acquisition and Curation Office at JSC, to digitize photographs of the Apollo lunar rock samples and create high-resolution digital images [1]. The project is currently in its last year of funding. We also provide an update on the derived products that make use of the digitized photos, including the Lunar Sample Catalog and Photo Database [2] and Apollo sample data files for GoogleMoon [3].

  10. Dataset of breath research manuscripts curated using PubMed search strings from 1995-2016.

    PubMed

    Geer Wallace, M Ariel; Pleil, Joachim D

    2018-06-01

    The data contained in this article are PubMed search strings and search string builders used to curate breath research manuscripts published from 1995-2016, together with the respective number of articles found that satisfied the search requirements for selected categories. Breath sampling is a non-invasive technique that has gained usefulness for public health, clinical, diagnostic, and environmental exposure assessment applications over the years. This data article includes search strings that were used to retrieve publications through the PubMed database for breath research topics related to the analysis of exhaled breath, exhaled breath condensate (EBC), and exhaled breath aerosol (EBA), as well as the analysis of cellular headspace. Manuscripts were curated for topics including EBC, EBA, Direct MS, GC-MS, LC-MS, alcohol, and sensors. A summary of the number of papers published per year for the data retrieved using each of the search strings is also included. These data can be used to discern trends in the number of breath research publications in each of the different topics over time. A supplementary Appendix A, containing the titles, author lists, journal names, publication dates, PMID numbers, and EntrezUID numbers for each of the journal articles curated using the finalized search strings for the seven breath research topics, can also be found within this article. The selected manuscripts can be used to explore the impact that breath research has had on expanding the scientific knowledge in each of the investigated topics.
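    Purely as an illustration of running such a search string against PubMed, here is a sketch using Biopython's E-utilities wrapper; the query term below is a guess at the general shape of such a string, not one of the published strings from the appendix:

        # Execute a PubMed search string via NCBI E-utilities (Biopython).
        # The term is illustrative, NOT one of the article's curated strings.
        from Bio import Entrez

        Entrez.email = "curator@example.org"  # NCBI requires a contact address

        term = '"exhaled breath condensate"[TIAB] AND ("1995"[PDAT] : "2016"[PDAT])'
        handle = Entrez.esearch(db="pubmed", term=term, retmax=200)
        result = Entrez.read(handle)
        handle.close()

        print(result["Count"])   # total number of matching manuscripts
        print(result["IdList"])  # their PMIDs, up to retmax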

  11. The OSIRIS-REx Asteroid Sample Return Mission Operations Design

    NASA Technical Reports Server (NTRS)

    Gal-Edd, Jonathan S.; Cheuvront, Allan

    2015-01-01

    OSIRIS-REx is an acronym that captures the scientific objectives: Origins, Spectral Interpretation, Resource Identification, and Security-Regolith Explorer. OSIRIS-REx will thoroughly characterize near-Earth asteroid (101955) Bennu (previously known as 1999 RQ36). The OSIRIS-REx Asteroid Sample Return Mission delivers its science using five instruments and radio science along with the Touch-And-Go Sample Acquisition Mechanism (TAGSAM). All of the instruments and data analysis techniques have direct heritage from flown planetary missions. The OSIRIS-REx mission employs a methodical, phased approach to ensure success in meeting the mission's science requirements. OSIRIS-REx launches in September 2016, with a backup launch period occurring one year later. Sampling occurs in 2019. The departure burn from Bennu occurs in March 2021. On September 24, 2023, the Sample Return Capsule (SRC) lands at the Utah Test and Training Range (UTTR). Stardust heritage procedures are followed to transport the SRC to Johnson Space Center, where the samples are removed and delivered to the OSIRIS-REx curation facility. After a six-month preliminary examination period the mission will produce a catalog of the returned sample, allowing the worldwide community to request samples for detailed analysis. Traveling to and returning a sample from an asteroid that has not been explored before requires unique operations considerations. The Design Reference Mission (DRM) ties together spacecraft, instrument and operations scenarios. Asteroid Touch and Go (TAG) has various options, ranging from ground-controlled to fully automated (natural feature tracking). Spacecraft constraints, such as thermal limits and high-gain antenna pointing, impact the timeline. The mission is sensitive to navigation errors, so a late command update has been implemented. The project implemented lessons learned from other "small body" missions. The key lesson learned was 'expect the unexpected' and implement planning tools early in the lifecycle. This paper summarizes the ground and spacecraft design as presented at the OSIRIS-REx Critical Design Review (CDR), held in April 2014.

  12. genenames.org: the HGNC resources in 2011

    PubMed Central

    Seal, Ruth L.; Gordon, Susan M.; Lush, Michael J.; Wright, Mathew W.; Bruford, Elspeth A.

    2011-01-01

    The HUGO Gene Nomenclature Committee (HGNC) aims to assign a unique gene symbol and name to every human gene. The HGNC database currently contains almost 30 000 approved gene symbols, over 19 000 of which represent protein-coding genes. The public website, www.genenames.org, displays all approved nomenclature within Symbol Reports that contain data curated by HGNC editors and links to related genomic, phenotypic and proteomic information. Here we describe improvements to our resources, including a new Quick Gene Search, a new List Search, an integrated HGNC BioMart and a new Statistics and Downloads facility. PMID:20929869

  13. Curation of Federally Owned Archeological Collections at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Eastman, John Arnold (Compiler)

    1995-01-01

    As a Federal agency, NASA has a moral and legal obligation to the public to manage the archeological heritage resources under its control. Archeological sites are unique, nonrenewable resources that must be preserved so that future generations may experience and interpret the material remains of the past. These sites are protected by a wide array of federal regulations. These regulations are intended to ensure that our nation's cultural heritage is preserved for the study and enjoyment of future generations. Once a site has been excavated, all that remains of it are the artifacts and associated records which, taken together, allow researchers to reconstruct the past. With the contextual information provided by associated records such as field notes, maps and photographs, archeological collections can provide important information about life in the past. An integral component of the federal archeology program is the curation of these databases so that qualified scholars will have access to them in years to come. Standards for the maintenance of archeological collections have been codified by various professional organizations and by the federal government. These guidelines focus on providing secure, climate-controlled archival storage conditions for the collections and an adequate study area in which researchers can examine the artifacts and documents. In the 1970s and early 1980s, a group of NASA employees formed the LRC Historical and Archeological Society (LRCHAS) in order to pursue studies of the colonial plantations that had been displaced by Langley Research Center (LaRC). They collected data on family histories and land ownership as well as conducting archeological surveys and excavations at two important 17th-20th century plantation sites in LaRC, Cloverdale and Chesterville. The excavations produced a wealth of information in the form of artifacts, photographs, maps and other documents. Unfortunately, interest on the part of the LRCHAS membership waned before a report was written, and since 1982 the artifacts have moldered in a flimsy trailer with no climate controls, which had once served as a field laboratory but which threatened to become a tomb for the collection. A recent analysis of Langley's cultural resources by Gray & Pape, Inc. recommended that the collection be organized, cataloged, and placed in a proper curation facility in accordance with Federal regulations. The project for the LARSS program was to research curation standards, organize the collection, catalog it, and prepare it for transfer to a facility which could provide adequate long-term curation conditions for the artifacts and documents. The first phase was to organize the artifacts, which were lying about the lab in various stages of cleaning, analysis, and conservation. Once all of the artifacts from the various excavation units and levels had been regrouped, they were cleaned and/or repackaged in archivally-stable materials. A basic catalog was prepared which will provide interested parties with a rough idea of what we have and where it can be found. Another aspect of the project was to organize the records left by the LRCHAS. Bundles of papers, photographs, and field data found in every corner and drawer of the laboratory trailer were put into order and, where appropriate, copies were made on acid-free Permabond paper for long-term storage. 
Finally, the entire collection and most of the lab equipment was transferred into a secure, climate controlled room which will serve as an archive and study space for qualified scholars interested in exploring LaRC's rich historical heritage.

  14. Sample Curation in Support of the OSIRIS-REx Asteroid Sample Return Mission

    NASA Technical Reports Server (NTRS)

    Righter, Kevin; Nakamura-Messenger, Keiko

    2017-01-01

    The OSIRIS-REx asteroid sample return mission launched to asteroid Bennu on Sept. 8, 2016. The spacecraft will arrive at Bennu in late 2019, orbit and map the asteroid, and perform a touch and go (TAG) sampling maneuver in July 2020. After the sample is stowed and confirmed, the spacecraft will return to Earth, and the sample return capsule (SRC) will land in Utah in September 2023. Samples will be recovered from Utah [2] and then transported and stored in a new sample cleanroom at NASA Johnson Space Center in Houston [3]. The materials curated for the mission are described here. a) Materials Archive and Witness Plate Collection: The SRC and TAGSAM were built between March 2014 and Summer of 2015, and instruments (OTES, OVIRS, OLA, OCAMS, REXIS) were integrated from Summer 2015 until May 2016. A total of 395 items were received for the materials archive at NASA-JSC, with archiving finishing 30 days after launch (with the final archived items being related to launch operations) [4]. The materials fall into several general categories including metals (stainless steel, aluminum, titanium alloys, brass and BeCu alloy), epoxies, paints, polymers, lubricants, non-volatile-residue samples (NVR), sapphire, and various miscellaneous materials. All through the ATLO process (from March 2015 until late August 2016) contamination knowledge witness plates (Si wafer and Al foil) were deployed in the various cleanrooms in Denver and KSC to provide an additional record of particle counts and volatiles that is archived for current and future scientific studies. These plates were deployed in roughly monthly increments with each unit containing 4 Si wafers and 4 Al foils. We archived 128 individual witness plates (64 Si wafers and 64 Al foils); one of each witness plate (Si and Al) was analyzed immediately by the science team after archiving, while the remaining 3 of each are archived indefinitely. Information about each material archived is stored in an extensive database at NASA-JSC, and key summary information for each will be presented in an online catalog. b) Bulk Asteroid sample: The Touch and Go Sampling Mechanism (TAGSAM) head will contain up to 1.5 kg of asteroid material. Upon return to Earth, the TAGSAM head with the sample canister will be subjected to a nitrogen purge and then opened in a nitrogen cabinet in Houston. Once the TAGSAM head is removed from the canister, it will be disassembled slowly and carefully under nitrogen until the sample can be removed for processing in a dedicated nitrogen glovebox. Bennu surface samples are expected to be sub-cm sized, based on thermal infrared and radar polarization ratio measurements [1]. The upper limit on material collected by the TAGSAM head is 2 cm. Therefore, we will be prepared to handle, subdivide, and characterize materials of a wide grain size (from 10 μm to 2 cm), and for both organic (UV fluorescence) and inorganic (SEM, FTIR, optical) properties. Representative portions of the bulk sample will be prepared for JAXA (0.5%; see also [5]) and the Canadian Space Agency (4%), with the remainder divided between the science team (<25%) and archived for future studies (NASA) (>75%). c) Contact Pad samples: The base of the TAGSAM head contains 24 contact pads that are designed to trap the upper surface layer of material and thus offer an opportunity to study asteroid samples that have resided at the very top surface of the regolith. 
Asteroid material is trapped on the pads in spring steel Velcro hooks, and material will have to be removed from these pads by curation specialists in the lab. d) Hardware: Some canister and SRC hardware items will contain information that will be important to understanding the collected samples, including the canister gas filter, temperature strips, flight witness plates, and the TAGSAM and canister parts that might have adhering dust grains. Some challenges remaining for both bulk sample and contact pad samples include: i) working with intermediate size range (200 to 500 micron) samples, a size range NASA has not previously worked with in such detail; ii) techniques for removal of contact pad material from the spring steel hooks; iii) static electrical effects of dust-sized particles during sample handling and curation, which are likely to be significant; and iv) the TAGSAM head and associated canister hardware, which will undoubtedly be coated with fine adhering dust grains from Bennu. In the case of collection of a large bulk sample mass, the adhering dust grains may be of lower priority. If a small sample mass is returned, the adhering dust may attain a higher priority, so recovery of adhering dust grains is an additional challenge to consider. In the year leading up to sample return we plan a variety of sample handling rehearsals that will enable the curation team to be prepared for many new aspects posed by this sample suite.
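    Taking the allocation percentages quoted above at face value, and reading the science-team and archive shares as fractions of what remains after the international partners' portions (our reading, not a statement from the abstract), the arithmetic for a hypothetical full 1.5 kg return works out as follows:

        # Sample-mass allocation sketch for a hypothetical 1.5 kg TAGSAM return.
        total_g = 1500.0

        jaxa_g = total_g * 0.005            # 0.5% to JAXA
        csa_g = total_g * 0.04              # 4% to the Canadian Space Agency
        remainder_g = total_g - jaxa_g - csa_g

        science_g = remainder_g * 0.25      # science team: less than 25%
        archive_g = remainder_g * 0.75      # archived (NASA): more than 75%

        for label, grams in [("JAXA", jaxa_g), ("CSA", csa_g),
                             ("science team (max)", science_g),
                             ("archive (min)", archive_g)]:
            print(f"{label}: {grams:.1f} g")
        # JAXA: 7.5 g, CSA: 60.0 g, science (max): 358.1 g, archive (min): 1074.4 g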

  15. Can we replace curation with information extraction software?

    PubMed

    Karp, Peter D

    2016-01-01

    Can we use programs for automated or semi-automated information extraction from scientific texts as practical alternatives to professional curation? I show that error rates of current information extraction programs (IEPs) are too high to replace professional curation today. Furthermore, current IEPs extract single narrow slivers of information, such as individual protein interactions; they cannot extract the large breadth of information extracted by professional curators for databases such as EcoCyc. They also cannot arbitrate among conflicting statements in the literature as curators can. Therefore, funding agencies should not hobble the curation efforts of existing databases on the assumption that a problem that has stymied Artificial Intelligence researchers for more than 60 years will be solved tomorrow. Semi-automated extraction techniques appear to have significantly more potential based on a review of recent tools that enhance curator productivity. But a full cost-benefit analysis for these tools is lacking. Without such analysis it is possible to expend significant effort developing information-extraction tools that automate small parts of the overall curation workflow without achieving a significant decrease in curation costs. © The Author(s) 2016. Published by Oxford University Press.

  16. The Role of the Curator in Modern Hospitals: A Transcontinental Perspective.

    PubMed

    Moss, Hilary; O'Neill, Desmond

    2016-12-13

    This paper explores the role of the curator in hospitals. The arts play a significant role in every society; however, recent studies indicate a neglect of the aesthetic environment of healthcare. This international study explores the complex role of the curator in modern hospitals. Semi-structured interviews were conducted with ten arts specialists in hospitals across five countries and three continents for a qualitative, phenomenological study. Five themes arose from the data: (1) patient involvement and influence on the arts programme in hospital; (2) understanding the role of the curator in hospital; (3) influences on arts programming in hospital; (4) types of arts programmes; and (5) limitations to effective curation in hospital. Recommendations arising from the research included recognition of the specialised role of the curator in hospitals; building positive links with clinical staff to effect positive hospital arts programmes; and increasing formal involvement of patients in arts planning in hospital. Hospital curation can be a vibrant arena for arts development, and the role of the hospital curator is a ground-breaking specialist role that can bring benefits to hospital life. The role of curator in hospital deserves to be supported and developed by both the arts and health sectors.

  17. Study on patient-induced radioactivity during proton treatment in hengjian proton medical facility.

    PubMed

    Wu, Qingbiao; Wang, Qingbin; Liang, Tianjiao; Zhang, Gang; Ma, Yinglin; Chen, Yu; Ye, Rong; Liu, Qiongyao; Wang, Yufei; Wang, Huaibao

    2016-09-01

    At present, more and more proton medical facilities are being established globally for better curative effect and fewer side effects in tumor treatment. Compared with electrons and photons, protons deliver more energy and dose at the end of their range (the Bragg peak) and show less lateral scattering because of their much larger mass. However, protons produce neutrons and induced radioactivity much more readily, which makes radiation protection for proton accelerators more difficult than for electron accelerators. This study focuses on the problem of patient-induced radioactivity during proton treatment, which has been ignored for years. Using FLUKA simulations and activation formula calculations for the Hengjian Proton Medical Facility (HJPMF), whose energy ranges from 130 to 230 MeV, we confirmed that it is a vital factor in radiation protection for both patient escorts and positioning technicians. Furthermore, new formulas for calculating the activity buildup process under periodic irradiation were derived and used to study the relationship between the saturation degree and the half-life of nuclides. Finally, suggestions are put forward to lessen the radiation hazard from patient-induced radioactivity. Copyright © 2016 Elsevier Ltd. All rights reserved.
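    The abstract does not reproduce the derived formulas, but the standard textbook starting point for induced activity under periodic irradiation, from which a saturation degree follows, has this form (a generic sketch, not necessarily identical to the authors' new formulas): for n cycles of irradiation time t_i and cooling time t_c with period T = t_i + t_c, the activity at the end of the n-th irradiation is

        A_n = R\,\bigl(1 - e^{-\lambda t_i}\bigr)\,
        \frac{1 - e^{-n\lambda T}}{1 - e^{-\lambda T}},
        \qquad
        A_\infty = \frac{R\,\bigl(1 - e^{-\lambda t_i}\bigr)}{1 - e^{-\lambda T}},

    so the saturation degree A_n / A_infinity = 1 - e^{-n lambda T} depends only on the number of elapsed periods and the nuclide's decay constant lambda = ln 2 / T_1/2, consistent with the abstract's stated interest in how saturation relates to half-life.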

  18. PDB data curation.

    PubMed

    Wang, Yanchao; Sunderraman, Rajshekhar

    2006-01-01

    In this paper, we propose two architectures for curating PDB data to improve its quality. The first, the PDB Data Curation System, is developed by adding two parts, a Checking Filter and a Curation Engine, between the User Interface and the Database. This architecture supports basic PDB data curation. The other, the PDB Data Curation System with XCML, is designed for further curation and adds four more parts, PDB-XML, PDB, OODB, and Protein-OODB, to the previous one. This architecture uses the XCML language to automatically check for errors in PDB data, making PDB data more consistent and accurate. These two tools can be used for cleaning existing PDB files and creating new PDB files. We also present some ideas on how to add constraints and assertions with XCML to obtain better data. In addition, we discuss data provenance issues that may affect data accuracy and consistency.
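    As a sketch of the first architecture's flow (a Checking Filter in front of a Curation Engine, feeding the Database), with invented checks and record handling since the paper does not define a public interface:

        # "Checking Filter -> Curation Engine -> Database" flow, illustrated.
        # The specific checks and the normalization step are hypothetical.
        def checking_filter(line: str) -> bool:
            """Accept only plausibly well-formed ATOM records."""
            return line.startswith("ATOM") and len(line) >= 54

        def curation_engine(line: str) -> str:
            """Normalize a record before storage."""
            return line.rstrip().ljust(80)  # pad to PDB's fixed 80-column width

        def ingest(lines: list[str], database: list[str]) -> list[str]:
            rejected = []
            for line in lines:
                if checking_filter(line):
                    database.append(curation_engine(line))
                else:
                    rejected.append(line)  # held back for further curation
            return rejected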

  19. Architecture for the Interdisciplinary Earth Data Alliance

    NASA Astrophysics Data System (ADS)

    Richard, S. M.

    2016-12-01

    The Interdisciplinary Earth Data Alliance (IEDA) is leading an EarthCube (EC) Integrative Activity to develop a governance structure and technology framework that enables partner data systems to share technology, infrastructure, and practice for documenting, curating, and accessing heterogeneous geoscience data. The IEDA data facility provides capabilities in an extensible framework that enables domain-specific requirements for each partner system in the Alliance to be integrated into standardized cross-domain workflows. The shared technology infrastructure includes a data submission hub, a domain-agnostic file-based repository, an integrated Alliance catalog, and a Data Browser for data discovery across all partner holdings, as well as services for registering identifiers for datasets (DOI) and samples (IGSN). The submission hub will be a platform that facilitates acquisition of cross-domain resource documentation and channels users into domain and resource-specific workflows tailored for each partner community. We are exploring an event-based message bus architecture with a standardized plug-in interface for adding capabilities. This architecture builds on the EC CINERGI metadata pipeline as well as the message-based architecture of the SEAD project. Plug-in components will handle file introspection to match entities to a data type registry (extending EC Digital Crust and Research Data Alliance work), extraction of standardized keywords (using CINERGI components), and extraction of location, cruise, personnel, and other metadata linkage information (building on GeoLink and existing IEDA partner components). The submission hub will feed submissions to appropriate partner repositories and service endpoints targeted by domain and resource type for distribution. The Alliance governance will adopt patterns (vocabularies, operations, resource types) for self-describing data services using standard HTTP protocol for simplified data access (building on EC GeoWS and other 'RESTful' approaches). Exposure of resource descriptions (datasets and service distributions) for harvesting by commercial search engines as well as geoscience-data focused crawlers (like the EC B-Cube crawler) will increase discoverability of IEDA resources with minimal effort by curators.
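    The event-based message bus with a standardized plug-in interface can be pictured as a small publish/subscribe core; the topic names and plug-in duties below are invented for illustration and are not IEDA's actual design:

        # Minimal event-bus sketch for the plug-in architecture described above.
        from collections import defaultdict
        from typing import Callable

        class MessageBus:
            def __init__(self) -> None:
                self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

            def subscribe(self, topic: str, plugin: Callable[[dict], None]) -> None:
                self._subscribers[topic].append(plugin)

            def publish(self, topic: str, event: dict) -> None:
                for plugin in self._subscribers[topic]:
                    plugin(event)  # each plug-in reacts independently

        def keyword_extractor(event: dict) -> None:
            print("extracting keywords from", event["filename"])

        def doi_registrar(event: dict) -> None:
            print("registering a DOI for", event["filename"])

        bus = MessageBus()
        bus.subscribe("dataset.submitted", keyword_extractor)
        bus.subscribe("dataset.accepted", doi_registrar)
        bus.publish("dataset.submitted", {"filename": "cruise_ctd.csv"})

    New capabilities are added by subscribing another plug-in to an existing topic, which is the extensibility property the abstract attributes to this architecture.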

  20. Surgical outcomes for liposarcoma of the lower limbs with synchronous pulmonary metastases.

    PubMed

    Illuminati, Giulio; Ceccanei, Gianluca; Pacilè, Maria Antonietta; Calio, Francesco G; Migliano, Francesco; Mercurio, Valentina; Pizzardi, Giulia; Nigri, Giuseppe

    2010-12-01

    Surgical resection of pulmonary metastases from soft tissue sarcomas has typically yielded disparate results, owing to the histologic heterogeneity of various series and the presentation times relative to primary tumor discovery. It was our hypothesis that with expeditious, curative surgical resection of both primary and metastatic disease, patients with liposarcoma of the lower limb and synchronous, resectable, pulmonary metastases might achieve satisfactory outcomes. A consecutive sample clinical study, with a mean follow-up duration of 30 months. Twenty-two patients (mean age, 50 years), each presenting with a liposarcoma of the lower limb and synchronous, resectable, pulmonary metastases, underwent curative resection of both the primary mass and all pulmonary metastases within a mean of 18 days from presentation (range 9-32 days). Mean overall survival was 28 months, disease-related survival (SE) was 9% at 5 years (±9.7%), and disease-free survival was 9% at 5 years (±7.6%). Expeditious, curative resection of both primary and metastatic lesions yields acceptable near-term results, with potential for long-term survival, in patients with liposarcoma of the lower limb and synchronous pulmonary metastases. 2010 Wiley-Liss, Inc.

  1. Survival from colorectal cancer in Victoria: 10-year follow up of the 1987 management survey.

    PubMed

    McLeish, John A; Thursfield, Vicky J; Giles, Graham G

    2002-05-01

    In 1987, the Victorian Cancer Registry identified a population-based sample of patients who underwent surgery for colorectal cancer for an audit of management following resection. Over 10 years have passed since this survey, and data on the survival of these patients (incorporating various prognostic indicators collected at the time of the survey) are now discussed in the present report. Relative survival analysis was conducted for each prognostic indicator separately and then combined in a multivariate model. Relative survival at 5 years for patients undergoing curative resections was 76%, compared with 7% for those whose treatment was considered palliative. Survival at 10 years was little changed (73% and 7%, respectively). Survival did not differ significantly by sex or age, irrespective of treatment intention. In the curative group, only stage was a significant predictor of survival. Multivariate analysis was performed only for the curative group. Adjusting for all variables simultaneously, stage was the only significant predictor of survival. Patients with Dukes' stage C disease were at a significantly greater risk (OR 5.5 (1.7-17.6)) than those with Dukes' A. Neither tumour site, sex, age, surgeon activity level nor adjuvant therapies made a significant contribution to the model.

  2. A nomogram to predict brain metastasis as the first relapse in curatively resected non-small cell lung cancer patients.

    PubMed

    Won, Young-Woong; Joo, Jungnam; Yun, Tak; Lee, Geon-Kook; Han, Ji-Youn; Kim, Heung Tae; Lee, Jin Soo; Kim, Moon Soo; Lee, Jong Mog; Lee, Hyun-Sung; Zo, Jae Ill; Kim, Sohee

    2015-05-01

    Development of brain metastasis results in a significant reduction in overall survival. However, there is no an effective tool to predict brain metastasis in non-small cell lung cancer (NSCLC) patients. We conducted this study to develop a feasible nomogram that can predict metastasis to the brain as the first relapse site in patients with curatively resected NSCLC. A retrospective review of NSCLC patients who had received curative surgery at National Cancer Center (Goyang, South Korea) between 2001 and 2008 was performed. We chose metastasis to the brain as the first relapse site after curative surgery as the primary endpoint of the study. A nomogram was modeled using logistic regression. Among 1218 patients, brain metastasis as the first relapse developed in 87 patients (7.14%) during the median follow-up of 43.6 months. Occurrence rates of brain metastasis were higher in patients with adenocarcinoma or those with a high pT and pN stage. Younger age appeared to be associated with brain metastasis, but this result was not statistically significant. The final prediction model included histology, smoking status, pT stage, and the interaction between adenocarcinoma and pN stage. The model showed fairly good discriminatory ability with a C-statistic of 69.3% and 69.8% for predicting brain metastasis within 2 years and 5 years, respectively. Internal validation using 2000 bootstrap samples resulted in C-statistics of 67.0% and 67.4% which still indicated good discriminatory performances. The nomogram presented here provides the individual risk estimate of developing metastasis to the brain as the first relapse site in patients with NSCLC who have undergone curative surgery. Surveillance programs or preventive treatment strategies for brain metastasis could be established based on this nomogram. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
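    The modelling-and-validation recipe described here (logistic regression, C-statistic, 2000 bootstrap samples) can be sketched on synthetic stand-in data; this is a simplified version of bootstrap validation, not the study's actual code or cohort:

        # Logistic model with a bootstrap-checked C-statistic (AUC).
        # Data are synthetic; columns stand in for histology, smoking, pT, pN.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 1218                                   # cohort size from the abstract
        X = rng.normal(size=(n, 4))
        logit = 0.8 * X[:, 0] - 0.5 * X[:, 2] - 2.5
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))   # ~7% event rate

        model = LogisticRegression().fit(X, y)
        apparent_auc = roc_auc_score(y, model.predict_proba(X)[:, 1])

        boot_aucs = []
        for _ in range(2000):                      # 2000 bootstrap samples
            idx = rng.integers(0, n, size=n)       # resample patients with replacement
            m = LogisticRegression().fit(X[idx], y[idx])
            boot_aucs.append(roc_auc_score(y, m.predict_proba(X)[:, 1]))

        print(f"apparent C-statistic:  {apparent_auc:.3f}")
        print(f"bootstrap mean C-stat: {np.mean(boot_aucs):.3f}")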

  3. Morbidity of curative cancer surgery and suicide risk.

    PubMed

    Jayakrishnan, Thejus T; Sekigami, Yurie; Rajeev, Rahul; Gamblin, T Clark; Turaga, Kiran K

    2017-11-01

    Curative cancer operations lead to debility and loss of autonomy in a population vulnerable to suicide death. The extent to which operative intervention impacts suicide risk is not well studied. To examine the effects of morbidity of curative cancer surgeries and prognosis of disease on the risk of suicide in patients with solid tumors. Retrospective cohort study using Surveillance, Epidemiology, and End Results data from 2004 to 2011; multilevel systematic review. General US population. Participants were 482 781 patients diagnosed with malignant neoplasm between 2004 and 2011 who underwent curative cancer surgeries. Death by suicide or self-inflicted injury. Among 482 781 patients that underwent curative cancer surgery, 231 committed suicide (16.58/100 000 person-years [95% confidence interval, CI, 14.54-18.82]). Factors significantly associated with suicide risk included male sex (incidence rate [IR], 27.62; 95% CI, 23.82-31.86) and age >65 years (IR, 22.54; 95% CI, 18.84-26.76). When stratified by 30-day overall postoperative morbidity, a significantly higher incidence of suicide was found for high-morbidity surgeries (IR, 33.30; 95% CI, 26.50-41.33) vs moderate morbidity (IR, 24.27; 95% CI, 18.92-30.69) and low morbidity (IR, 9.81; 95% CI, 7.90-12.04). Unit increase in morbidity was significantly associated with death by suicide (odds ratio, 1.01; 95% CI, 1.00-1.03; P = .02) and decreased suicide-specific survival (hazards ratio, 1.02; 95% CI, 1.00-1.03, P = .01) in prognosis-adjusted models. In this sample of cancer patients in the Surveillance, Epidemiology, and End Results database, patients that undergo high-morbidity surgeries appear most vulnerable to death by suicide. The identification of this high-risk cohort should motivate health care providers and particularly surgeons to adopt screening measures during the postoperative follow-up period for these patients. Copyright © 2016 John Wiley & Sons, Ltd.
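    As a quick consistency check on the headline rate (assuming the quoted incidence rate is exact), the person-year denominator can be back-calculated:

        \mathrm{IR} = \frac{\text{events}}{\text{person-years}} \times 10^5
        \;\Rightarrow\;
        \text{person-years} = \frac{231}{16.58} \times 10^5 \approx 1.39 \times 10^6,

    i.e., the 482 781 patients contributed roughly 1.4 million person-years of follow-up, or just under three years per patient on average.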

  4. Extracorporeal shock waves as curative therapy for varicose veins?

    PubMed Central

    Angehrn, Fiorenzo; Kuhn, Christoph; Sonnabend, Ortrud; Voss, Axel

    2008-01-01

    In this prospective study, the effects of low-energy, partially focused, extracorporeally generated shock waves (ESW) on a subcutaneously located varicose vein – the left vena saphena magna (VSM) – were investigated. The treatment consisted of 4 ESW applications within 21 days. The varicose VSM of both sides were removed by surgery, and treated and untreated samples were compared by means of histopathology. No damage to the treated varicose vein, and in particular no mechanical destruction of its wall, could be demonstrated. However, an induction of neo-collagenogenesis was observed, and the thickness of the varicose vein’s wall increased. Optimization of critical application parameters through investigation of a larger number of patients may turn ESW into a non-invasive curative treatment for varicose veins. PMID:18488887

  5. Accelerating literature curation with text-mining tools: a case study of using PubTator to curate genes in PubMed abstracts

    PubMed Central

    Lu, Zhiyong

    2012-01-01

    Today’s biomedical research has become heavily dependent on access to the biological knowledge encoded in expert curated biological databases. As the volume of biological literature grows rapidly, it becomes increasingly difficult for biocurators to keep up with the literature because manual curation is an expensive and time-consuming endeavour. Past research has suggested that computer-assisted curation can improve efficiency, but few text-mining systems have been formally evaluated in this regard. Through participation in the interactive text-mining track of the BioCreative 2012 workshop, we developed PubTator, a PubMed-like system that assists with two specific human curation tasks: document triage and bioconcept annotation. On the basis of evaluation results from two external user groups, we find that the accuracy of PubTator-assisted curation is comparable with that of manual curation and that PubTator can significantly increase human curatorial speed. These encouraging findings warrant further investigation with a larger number of publications to be annotated. Database URL: http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/PubTator/ PMID:23160414
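
    A toy illustration of the "document triage" task the abstract says PubTator assists with: rank abstracts for curation by the number of distinct gene mentions a text-mining step has found. The mention lists below are hypothetical stand-ins for real tagger output, not PubTator's actual API.

    ```python
    # Rank documents for curation by count of distinct tagged genes.
    tagged = {
        "PMID:111": {"TP53", "EGFR", "KRAS"},
        "PMID:222": {"BRCA1"},
        "PMID:333": set(),           # no genes found -> lowest priority
        "PMID:444": {"TP53", "MDM2"},
    }

    queue = sorted(tagged, key=lambda pmid: len(tagged[pmid]), reverse=True)
    for pmid in queue:
        print(pmid, sorted(tagged[pmid]))
    ```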

  6. The MIntAct project—IntAct as a common curation platform for 11 molecular interaction databases

    PubMed Central

    Orchard, Sandra; Ammari, Mais; Aranda, Bruno; Breuza, Lionel; Briganti, Leonardo; Broackes-Carter, Fiona; Campbell, Nancy H.; Chavali, Gayatri; Chen, Carol; del-Toro, Noemi; Duesbury, Margaret; Dumousseau, Marine; Galeota, Eugenia; Hinz, Ursula; Iannuccelli, Marta; Jagannathan, Sruthi; Jimenez, Rafael; Khadake, Jyoti; Lagreid, Astrid; Licata, Luana; Lovering, Ruth C.; Meldal, Birgit; Melidoni, Anna N.; Milagros, Mila; Peluso, Daniele; Perfetto, Livia; Porras, Pablo; Raghunath, Arathi; Ricard-Blum, Sylvie; Roechert, Bernd; Stutz, Andre; Tognolli, Michael; van Roey, Kim; Cesareni, Gianni; Hermjakob, Henning

    2014-01-01

    IntAct (freely available at http://www.ebi.ac.uk/intact) is an open-source, open data molecular interaction database populated by data either curated from the literature or from direct data depositions. IntAct has developed a sophisticated web-based curation tool, capable of supporting both IMEx- and MIMIx-level curation. This tool is now utilized by multiple additional curation teams, all of whom annotate data directly into the IntAct database. Members of the IntAct team supply appropriate levels of training, perform quality control on entries and take responsibility for long-term data maintenance. Recently, the MINT and IntAct databases decided to merge their separate efforts to make optimal use of limited developer resources and maximize the curation output. All data manually curated by the MINT curators have been moved into the IntAct database at EMBL-EBI and are merged with the existing IntAct dataset. Both IntAct and MINT are active contributors to the IMEx consortium (http://www.imexconsortium.org). PMID:24234451
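
    IntAct and the other IMEx databases distribute interaction records in the tab-delimited PSI-MI TAB (MITAB) format. A minimal sketch of reading interactor identifier pairs from a MITAB 2.5 file (15 tab-separated columns, the first two being the unique identifiers of interactors A and B); the file name is hypothetical.

    ```python
    # Read (interactor A, interactor B) ID pairs from a MITAB 2.5 file.
    import csv

    def read_pairs(path):
        """Yield (interactor_A, interactor_B) identifier pairs."""
        with open(path, newline="") as fh:
            for row in csv.reader(fh, delimiter="\t"):
                if len(row) < 15 or row[0].startswith("#"):
                    continue          # skip header/comment or malformed lines
                yield row[0], row[1]  # columns 1-2: unique IDs of A and B

    # for a, b in read_pairs("intact_subset.mitab.txt"):   # hypothetical file
    #     print(a, b)
    ```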

  7. Place of residence and primary treatment of prostate cancer: examining trends in rural and nonrural areas in Wisconsin.

    PubMed

    Cetnar, Jeremy P; Hampton, John M; Williamson, Amy A; Downs, Tracy; Wang, Dian; Owen, Jean B; Crouse, Byron; Jones, Nathan; Wilson, J Frank; Trentham-Dietz, Amy

    2013-03-01

    To determine whether rural residents were at a disadvantage compared with urban residents with regard to the receipt of curative therapy for prostate cancer. Using the Breast and Prostate Cancer Data Quality and Patterns of Care Study II, patients with prostate cancer who were diagnosed in 2004 were identified. Registrars reviewed the medical records of randomly selected patients with incident prostate cancer (n = 1906). The patients' residential address was geocoded and linked to the census tract from the 2000 U.S. Census. The place of residence was defined as rural or nonrural according to the census tract and the rural-urban commuting area (RUCA) categorization. The distance from the residence to the nearest radiation oncology facility was calculated. The odds ratios and 95% confidence intervals associated with receipt of noncurative treatment were calculated from logistic regression models and adjusted for several potential confounders. Of the incident patients, 39.1% lived in urban census tracts, 41.5% lived in mixed tracts, and 19.4% lived in rural tracts. Hormone-only treatment or active surveillance was received by 15.4% of the patients. Relative to the urban patients, the odds ratio for noncurative treatment was 1.01 (95% confidence interval 0.59-1.74) for those living in mixed tracts and 0.96 (95% confidence interval 0.52-1.77) for those living in rural tracts. No association was found between noncurative treatment and the RUCA categorization, and no linear trend was found between noncurative treatment and the distance to the nearest radiation oncology facility (P = .92). The choice of curative treatment did not significantly depend on the patient's place of residence, suggesting a lack of geographic disparity in the primary treatment of prostate cancer. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Biocuration workflows and text mining: overview of the BioCreative 2012 Workshop Track II.

    PubMed

    Lu, Zhiyong; Hirschman, Lynette

    2012-01-01

    Manual curation of data from the biomedical literature is a rate-limiting factor for many expert curated databases. Despite the continuing advances in biomedical text mining and the pressing needs of biocurators for better tools, few existing text-mining tools have been successfully integrated into production literature curation systems such as those used by the expert curated databases. To close this gap and better understand all aspects of literature curation, we invited submissions of written descriptions of curation workflows from expert curated databases for the BioCreative 2012 Workshop Track II. We received seven qualified contributions, primarily from model organism databases. Based on these descriptions, we identified commonalities and differences across the workflows, the common ontologies and controlled vocabularies used and the current and desired uses of text mining for biocuration. Compared to a survey done in 2009, our 2012 results show that many more databases are now using text mining in parts of their curation workflows. In addition, the workshop participants identified text-mining aids for finding gene names and symbols (gene indexing), prioritization of documents for curation (document triage) and ontology concept assignment as those most desired by the biocurators. Database URL: http://www.biocreative.org/tasks/bc-workshop-2012/workflow/.

  9. On expert curation and scalability: UniProtKB/Swiss-Prot as a case study

    PubMed Central

    Arighi, Cecilia N; Magrane, Michele; Bateman, Alex; Wei, Chih-Hsuan; Lu, Zhiyong; Boutet, Emmanuel; Bye-A-Jee, Hema; Famiglietti, Maria Livia; Roechert, Bernd; UniProt Consortium, The

    2017-01-01

    Motivation: Biological knowledgebases, such as UniProtKB/Swiss-Prot, constitute an essential component of daily scientific research by offering distilled, summarized and computable knowledge extracted from the literature by expert curators. While knowledgebases play an increasingly important role in the scientific community, their ability to keep up with the growth of biomedical literature is under scrutiny. Using UniProtKB/Swiss-Prot as a case study, we address this concern via multiple literature triage approaches. Results: With the assistance of the PubTator text-mining tool, we tagged more than 10 000 articles to assess the ratio of papers relevant for curation. We first show that curators read and evaluate many more papers than they curate, and that measuring the number of curated publications is insufficient to provide a complete picture, as demonstrated by the fact that 8000–10 000 papers are curated in UniProt each year while curators evaluate 50 000–70 000 papers per year. We show that 90% of the papers in PubMed are out of the scope of UniProt, that a maximum of 2–3% of the papers indexed in PubMed each year are relevant for UniProt curation, and that, despite appearances, expert curation in UniProt is scalable. Availability and implementation: UniProt is freely available at http://www.uniprot.org/. Contact: sylvain.poux@sib.swiss. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:29036270

  10. The curation of genetic variants: difficulties and possible solutions.

    PubMed

    Pandey, Kapil Raj; Maden, Narendra; Poudel, Barsha; Pradhananga, Sailendra; Sharma, Amit Kumar

    2012-12-01

    The curation of genetic variants from biomedical articles is required for various clinical and research purposes. Nowadays, the establishment of variant databases that include overall information about variants is becoming quite popular. These databases have immense utility, serving as a user-friendly information storehouse of variants for information seekers. While manual curation is the gold standard method for the curation of variants, it can turn out to be time-consuming on a large scale, necessitating automation. Curation of variants described in the biomedical literature may not be straightforward, mainly due to various nomenclature and expression issues. Though current papers on variants increasingly follow standard nomenclature, so that the variants can easily be retrieved, the literature holds a massive store of variants reported under non-standard names, which the predominantly used online search engines may not be capable of finding. For effective curation of variants, knowledge about the overall process of curation, the nature and types of difficulties in curation, and ways to tackle those difficulties during the task are crucial. Only through effective curation can variants be correctly interpreted. This paper presents the process and difficulties of the curation of genetic variants, with possible solutions and suggestions drawn from our work experience in the field, including literature support. The paper also highlights aspects of the interpretation of genetic variants and the importance of writing papers on variants following standard and retrievable methods. Copyright © 2012. Published by Elsevier Ltd.

  11. The Curation of Genetic Variants: Difficulties and Possible Solutions

    PubMed Central

    Pandey, Kapil Raj; Maden, Narendra; Poudel, Barsha; Pradhananga, Sailendra; Sharma, Amit Kumar

    2012-01-01

    The curation of genetic variants from biomedical articles is required for various clinical and research purposes. Nowadays, the establishment of variant databases that include overall information about variants is becoming quite popular. These databases have immense utility, serving as a user-friendly information storehouse of variants for information seekers. While manual curation is the gold standard method for the curation of variants, it can turn out to be time-consuming on a large scale, necessitating automation. Curation of variants described in the biomedical literature may not be straightforward, mainly due to various nomenclature and expression issues. Though current papers on variants increasingly follow standard nomenclature, so that the variants can easily be retrieved, the literature holds a massive store of variants reported under non-standard names, which the predominantly used online search engines may not be capable of finding. For effective curation of variants, knowledge about the overall process of curation, the nature and types of difficulties in curation, and ways to tackle those difficulties during the task are crucial. Only through effective curation can variants be correctly interpreted. This paper presents the process and difficulties of the curation of genetic variants, with possible solutions and suggestions drawn from our work experience in the field, including literature support. The paper also highlights aspects of the interpretation of genetic variants and the importance of writing papers on variants following standard and retrievable methods. PMID:23317699

  12. Data Management and Rescue at a State Geological Survey

    NASA Astrophysics Data System (ADS)

    Hills, D. J.; McIntyre-Redden, M. R.

    2015-12-01

    As new technologies are developed to utilize data more fully, and as shrinking budgets mean more needs to be done with less, well-documented and discoverable legacy data is vital for continued research and economic growth. Many governmental agencies are mandated to maintain scientific data, and the Geological Survey of Alabama (GSA) is no different. As part of the mandate to explore for, characterize, and report Alabama's mineral, energy, water, and biological resources for the betterment of Alabama's citizens, communities, and businesses, the GSA has increasingly been called upon to make our data (including samples) more accessible to stakeholders. The GSA has been involved in several data management, preservation, and rescue projects, including the National Geothermal Data System and the National Geological and Geophysical Data Preservation Program. GSA staff utilizes accepted standards for metadata, such as those found at the US Geoscience Information Network (USGIN). Through the use of semi-automated workflows, these standards can be applied to legacy data records. As demand for more detailed information on samples increases, especially so that a researcher can do a preliminary assessment prior to a site visit, it has become critical for the efficiency of the GSA to have better systems in place for sample tracking and data management. Thus, GSA is in the process of registering cores and related samples for International Geo Sample Numbers (IGSNs) through the System for Earth Sample Registration. IGSNs allow the GSA to use asset management software to better curate the physical samples and provide more accurate information to stakeholders. Working with other initiatives, such as EarthCube's iSamples project, will ensure that GSA continues to use best practices and standards for sample identification, documentation, citation, curation, and sharing.
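
    An illustrative sketch of the sample-registration bookkeeping described above: pair each physical core with an IGSN-style identifier and the minimal metadata an asset-management system would track. The identifier, field names, and values are hypothetical examples, not real SESAR registrations.

    ```python
    # Minimal record linking a physical sample to an IGSN-style identifier.
    from dataclasses import dataclass

    @dataclass
    class CoreSample:
        igsn: str            # e.g. "IEGSA0001" (hypothetical identifier)
        name: str
        sample_type: str
        latitude: float
        longitude: float
        repository: str = "Geological Survey of Alabama"

    core = CoreSample(
        igsn="IEGSA0001",
        name="Well 1234 core, 1500-1510 ft",   # hypothetical sample
        sample_type="Core",
        latitude=32.36,
        longitude=-86.28,
    )
    print(core)
    ```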

  13. Gateways to the FANTOM5 promoter level mammalian expression atlas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lizio, Marina; Harshbarger, Jayson; Shimoji, Hisashi

    The FANTOM5 project investigates transcription initiation activities in more than 1,000 human and mouse primary cells, cell lines and tissues using CAGE. Based on manual curation of sample information and the development of an ontology for sample classification, we assemble the resulting data into a centralized data resource (http://fantom.gsc.riken.jp/5/). This resource contains web-based tools and data-access points for the research community to search and extract data related to samples, genes, promoter activities, transcription factors and enhancers across the FANTOM5 atlas.

  14. Gateways to the FANTOM5 promoter level mammalian expression atlas

    DOE PAGES

    Lizio, Marina; Harshbarger, Jayson; Shimoji, Hisashi; ...

    2015-01-05

    The FANTOM5 project investigates transcription initiation activities in more than 1,000 human and mouse primary cells, cell lines and tissues using CAGE. Based on manual curation of sample information and the development of an ontology for sample classification, we assemble the resulting data into a centralized data resource (http://fantom.gsc.riken.jp/5/). This resource contains web-based tools and data-access points for the research community to search and extract data related to samples, genes, promoter activities, transcription factors and enhancers across the FANTOM5 atlas.

  15. The 45th Annual Meteoritical Society Meeting

    NASA Technical Reports Server (NTRS)

    Jones, P. (Compiler); Turner, L. (Compiler)

    1982-01-01

    Impact craters and shock effects, chondrite formation and evolution, meteorites, chondrules, irons, nebular processes and meteorite parent bodies, regoliths and breccias, Antarctic meteorite curation, isotopic studies of meteorites and lunar samples, organics and terrestrial weathering, refractory inclusions, cosmic dust, particle irradiations before and after compaction, and mineralogic studies and analytical techniques are discussed.

  16. A curated collection of tissue microarray images and clinical outcome data of prostate cancer patients

    PubMed Central

    Zhong, Qing; Guo, Tiannan; Rechsteiner, Markus; Rüschoff, Jan H.; Rupp, Niels; Fankhauser, Christian; Saba, Karim; Mortezavi, Ashkan; Poyet, Cédric; Hermanns, Thomas; Zhu, Yi; Moch, Holger; Aebersold, Ruedi; Wild, Peter J.

    2017-01-01

    Microscopy image data of human cancers provide detailed phenotypes of spatially and morphologically intact tissues at single-cell resolution, thus complementing large-scale molecular analyses, e.g., next generation sequencing or proteomic profiling. Here we describe a high-resolution tissue microarray (TMA) image dataset from a cohort of 71 prostate tissue samples, which was hybridized with bright-field dual colour chromogenic and silver in situ hybridization probes for the tumour suppressor gene PTEN. These tissue samples were digitized and supplemented with expert annotations, clinical information, statistical models of PTEN genetic status, and computer source codes. For validation, we constructed an additional TMA dataset for 424 prostate tissues, hybridized with FISH probes for PTEN, and performed survival analysis on a subset of 339 radical prostatectomy specimens with overall, disease-specific and recurrence-free survival (maximum 167 months). For application, we further produced 6,036 image patches derived from two whole slides. Our curated collection of prostate cancer data sets provides reuse potential for both biomedical and computational studies. PMID:28291248

  17. Space Art "Stardust"

    NASA Image and Video Library

    2008-01-08

    Artist Paul Henry Ramirez symbolically captured the Stardust mission in this piece titled "Stardust". The Stardust mission in January of 2006 completed a seven-year, 2.8 billion mile journey to fly by a comet and return samples to Earth. The material is a first sample of pristine cometary material, which will increase human understanding of interstellar dust. Stardust, 2007. Acrylic Micaceous Iron Oxide, Aluminum and crystal, hologram glitter Mylar, 20" round canvas. Copyrighted: for more information, contact the Curator, NASA Art Program.

  18. Annotation of phenotypic diversity: decoupling data curation and ontology curation using Phenex.

    PubMed

    Balhoff, James P; Dahdul, Wasila M; Dececchi, T Alexander; Lapp, Hilmar; Mabee, Paula M; Vision, Todd J

    2014-01-01

    Phenex (http://phenex.phenoscape.org/) is a desktop application for semantically annotating the phenotypic character matrix datasets common in evolutionary biology. Since its initial publication, we have added new features that address several major bottlenecks in the efficiency of the phenotype curation process: allowing curators during the data curation phase to provisionally request terms that are not yet available from a relevant ontology; supporting quality control against annotation guidelines to reduce later manual review and revision; and enabling the sharing of files for collaboration among curators. We decoupled data annotation from ontology development by creating an Ontology Request Broker (ORB) within Phenex. Curators can use the ORB to request a provisional term for use in data annotation; the provisional term can be automatically replaced with a permanent identifier once the term is added to an ontology. We added a set of annotation consistency checks to prevent common curation errors, reducing the need for later correction. We facilitated collaborative editing by improving the reliability of Phenex when used with online folder sharing services, via file change monitoring and continual autosave. With the addition of these new features, and in particular the Ontology Request Broker, Phenex users have been able to focus more effectively on data annotation. Phenoscape curators using Phenex have reported a smoother annotation workflow, with much reduced interruptions from ontology maintenance and file management issues.
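
    A minimal sketch of the Ontology Request Broker (ORB) idea described above: hand out provisional term identifiers during data annotation, then swap in permanent ontology identifiers once the requested terms are accepted. The class, ID formats, and example identifiers are illustrative, not Phenex's actual implementation.

    ```python
    class OntologyRequestBroker:
        """Toy broker: issue provisional term IDs now, swap in permanent ones later."""

        def __init__(self):
            self._counter = 0
            self._labels = {}     # provisional ID -> requested label
            self._resolved = {}   # provisional ID -> permanent ontology ID

        def request_term(self, label):
            self._counter += 1
            pid = f"PROVISIONAL:{self._counter:07d}"
            self._labels[pid] = label   # usable immediately in annotations
            return pid

        def resolve(self, provisional_id, permanent_id):
            self._resolved[provisional_id] = permanent_id

        def finalize(self, annotations):
            # Replace any resolved provisional IDs; leave the rest untouched.
            return [self._resolved.get(a, a) for a in annotations]

    orb = OntologyRequestBroker()
    tmp = orb.request_term("serrated dorsal-fin spine")
    annotations = [tmp, "UBERON:0002102"]      # mix of provisional and permanent
    orb.resolve(tmp, "UBERON:9999999")         # hypothetical permanent ID
    print(orb.finalize(annotations))           # both IDs now permanent
    ```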

  19. A Window to the World: Lessons Learned from NASA's Collaborative Metadata Curation Effort

    NASA Astrophysics Data System (ADS)

    Bugbee, K.; Dixon, V.; Baynes, K.; Shum, D.; le Roux, J.; Ramachandran, R.

    2017-12-01

    Well-written descriptive metadata adds value to data by making them easier to discover, and increases the use of data by providing context and clarifying appropriateness of use. While many data centers acknowledge the importance of correct, consistent and complete metadata, allocating resources to curate existing metadata is often difficult. To lower resource costs, many data centers seek guidance on best practices for curating metadata but struggle to identify those recommendations. In order to assist data centers in curating metadata and to develop best practices for creating and maintaining metadata, NASA has formed a collaborative effort to improve the Earth Observing System Data and Information System (EOSDIS) metadata in the Common Metadata Repository (CMR). This effort has taken significant steps in building consensus around metadata curation best practices. However, it has also revealed gaps in EOSDIS enterprise policies and procedures within the core metadata curation task. This presentation will explore the mechanisms used for building consensus on metadata curation, the gaps identified in policies and procedures, the lessons learned from collaborating with both the data centers and metadata curation teams, and the proposed next steps for the future.

  20. Discovering New Global Climate Patterns: Curating a 21-Year High Temporal (Hourly) and Spatial (40km) Resolution Reanalysis Dataset

    NASA Astrophysics Data System (ADS)

    Hou, C. Y.; Dattore, R.; Peng, G. S.

    2014-12-01

    The National Center for Atmospheric Research's Global Climate Four-Dimensional Data Assimilation (CFDDA) Hourly 40km Reanalysis dataset is a dynamically downscaled dataset with high temporal and spatial resolution. The dataset contains three-dimensional hourly analyses in netCDF format for the global atmospheric state from 1985 to 2005 on a 40km horizontal grid (0.4° grid increment) with 28 vertical levels, providing good representation of local forcing and diurnal variation of processes in the planetary boundary layer. This project aimed to make the dataset publicly available, accessible, and usable in order to provide a unique resource that allows and promotes studies of new climate characteristics. When the curation project started, it had been five years since the data files were generated. Also, although the Principal Investigator (PI) had generated a user document at the end of the project in 2009, the document had not been maintained. Furthermore, the PI had moved to a new institution, and the remaining team members were reassigned to other projects. These factors made data curation especially challenging in the areas of verifying data quality, harvesting metadata descriptions, and documenting provenance information. As a result, the project's curation process found that: the data curator's skill and knowledge helped make decisions, such as on file format, structure, and workflow documentation, that had a significant, positive impact on the ease of the dataset's management and long-term preservation; the use of data curation tools, such as the Data Curation Profiles Toolkit's guidelines, revealed important information for promoting the data's usability and enhancing preservation planning; and involving data curators during each stage of the data curation life cycle, instead of only at the end, could improve the curation process's efficiency. Overall, the project showed that proper resources invested in the curation process give datasets the best chance to fulfill their potential to help with the discovery of new climate patterns.
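
    A quick sketch of the kind of inspection a curator might run on one of the hourly netCDF analysis files described above, to verify data quality: list the variables and confirm grid and time coverage. The file and variable names are hypothetical; actual CFDDA files may differ.

    ```python
    # Inspect a netCDF analysis file: variables, grid shape, time axis.
    from netCDF4 import Dataset

    with Dataset("cfdda_1985010100.nc") as nc:     # hypothetical file name
        print(list(nc.variables))                  # available fields
        for name in ("lat", "lon", "lev", "time"):
            if name in nc.variables:
                print(name, nc.variables[name].shape)
    ```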

  1. The origin of amino acids in lunar regolith samples

    NASA Astrophysics Data System (ADS)

    Elsila, Jamie E.; Callahan, Michael P.; Dworkin, Jason P.; Glavin, Daniel P.; McLain, Hannah L.; Noble, Sarah K.; Gibson, Everett K.

    2016-01-01

    We analyzed the amino acid content of seven lunar regolith samples returned by the Apollo 16 and Apollo 17 missions and stored under NASA curation since collection using ultrahigh-performance liquid chromatography with fluorescence detection and time-of-flight mass spectrometry. Consistent with results from initial analyses shortly after collection in the 1970s, we observed amino acids at low concentrations in all of the curated samples, ranging from 0.2 parts-per-billion (ppb) to 42.7 ppb in hot-water extracts and 14.5-651.1 ppb in 6 M HCl acid-vapor-hydrolyzed, hot-water extracts. Amino acids identified in the Apollo soil extracts include glycine, D- and L-alanine, D- and L-aspartic acid, D- and L-glutamic acid, D- and L-serine, L-threonine, and L-valine, all of which had previously been detected in lunar samples, as well as several compounds not previously identified in lunar regoliths: α-aminoisobutyric acid (AIB), D- and L-β-amino-n-butyric acid (β-ABA), DL-α-amino-n-butyric acid, γ-amino-n-butyric acid, β-alanine, and ε-amino-n-caproic acid. We observed an excess of the L enantiomer in most of the detected proteinogenic amino acids, but racemic alanine and racemic β-ABA were present in some samples. We also examined seven samples from Apollo 15, 16, and 17 that had been previously allocated to a non-curation laboratory, as well as two samples of terrestrial dunite from studies of lunar module engine exhaust that had been stored in the same laboratory. The amino acid content of these samples suggested that contamination had occurred during non-curatorial storage. We measured the compound-specific carbon isotopic ratios of glycine, β-alanine, and L-alanine in Apollo regolith sample 70011 and found values of -21‰ to -33‰. These values are consistent with those seen in terrestrial biology and, together with the enantiomeric compositions of the proteinogenic amino acids, suggest that terrestrial biological contamination is a primary source of the amino acids in these samples. However, the presence of the non-proteinogenic amino acids such as AIB and β-ABA suggests the possibility of some contribution from exogenous sources. We did not observe a correlation of amino acid content with proximity to the Apollo 17 lunar module, implying that lunar module exhaust was not a primary source of amino acid precursors. Solar-wind-implanted precursors such as HCN also appear to be at most a minor contributor, given a lack of correlation between amino acid content and soil maturity (as measured by Is/FeO ratio) and the differences between the δ13C values of the amino acids and the solar wind.
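
    For context, the δ13C values quoted above use standard stable-isotope delta notation, reported in per mil (‰); for carbon, the reference ratio is conventionally that of the VPDB standard:

    ```latex
    \delta^{13}\mathrm{C} =
      \left(
        \frac{\bigl({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\bigr)_{\mathrm{sample}}}
             {\bigl({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\bigr)_{\mathrm{standard}}}
        - 1
      \right) \times 1000
    ```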

  2. Can contracted out health facilities improve access, equity, and quality of maternal and newborn health services? Evidence from Pakistan.

    PubMed

    Zaidi, Shehla; Riaz, Atif; Rabbani, Fauziah; Azam, Syed Iqbal; Imran, Syeda Nida; Pradhan, Nouhseen Akber; Khan, Gul Nawaz

    2015-11-25

    The case for contracting out government health services to non-governmental organizations (NGOs) has been weak for maternal, newborn, and child health (MNCH) services, with documented gains being mainly in curative services. We present an in-depth assessment of the comparative advantages of contracting out for MNCH access, quality, and equity, using a case study from Pakistan. An end-line, cross-sectional assessment was conducted of government facilities contracted out to a large national NGO, with government-managed centres serving as controls, in two remote rural districts of Pakistan. Contracting out was specific to augmenting MNCH services but carried no contractual performance incentives. A household survey, a health facility survey, and focus group discussions with clients and spouses were used for the assessment. Contracted-out facilities had significantly higher utilization than control facilities for antenatal care, delivery, postnatal care, emergency obstetric care, and neonatal illness. Contracted facilities had comparatively better quality of MNCH services, but not in all aspects. Better household practices were also seen in the district where contracting involved administrative control over outreach programs. Contracting also faced certain drawbacks. Facility utilization was inequitably higher amongst more educated and affluent clients. Contracted-out catchments had higher out-of-pocket expenses for MNCH services, driven by steeper transport costs and user charges for additional diagnostics. Contracting out did not raise MNCH service coverage rates across the catchment. Physical distances, inadequate transport, and low demand for facility-based care in non-emergency settings were key client-reported barriers. Contracting out MNCH services at government health facilities can improve facility utilization and bring some improvement in the quality of services. However, contracting out of health facilities is insufficient to increase service access across the catchment in remote rural contexts and requires accompanying measures for demand enhancement, transportation access, and targeting of the more disadvantaged clientele.

  3. How should the completeness and quality of curated nanomaterial data be evaluated?

    NASA Astrophysics Data System (ADS)

    Marchese Robinson, Richard L.; Lynch, Iseult; Peijnenburg, Willie; Rumble, John; Klaessig, Fred; Marquardt, Clarissa; Rauscher, Hubert; Puzyn, Tomasz; Purian, Ronit; Åberg, Christoffer; Karcher, Sandra; Vriens, Hanne; Hoet, Peter; Hoover, Mark D.; Hendren, Christine Ogilvie; Harper, Stacey L.

    2016-05-01

    Nanotechnology is of increasing significance. Curation of nanomaterial data into electronic databases offers opportunities to better understand and predict nanomaterials' behaviour. This supports innovation in, and regulation of, nanotechnology. It is commonly understood that curated data need to be sufficiently complete and of sufficient quality to serve their intended purpose. However, assessing data completeness and quality is non-trivial in general and is arguably especially difficult in the nanoscience area, given its highly multidisciplinary nature. The current article, part of the Nanomaterial Data Curation Initiative series, addresses how to assess the completeness and quality of (curated) nanomaterial data. In order to address this key challenge, a variety of related issues are discussed: the meaning and importance of data completeness and quality, existing approaches to their assessment and the key challenges associated with evaluating the completeness and quality of curated nanomaterial data. Considerations which are specific to the nanoscience area and lessons which can be learned from other relevant scientific disciplines are considered. Hence, the scope of this discussion ranges from physicochemical characterisation requirements for nanomaterials and interference of nanomaterials with nanotoxicology assays to broader issues such as minimum information checklists, toxicology data quality schemes and computational approaches that facilitate evaluation of the completeness and quality of (curated) data. This discussion is informed by a literature review and a survey of key nanomaterial data curation stakeholders. Finally, drawing upon this discussion, recommendations are presented concerning the central question: how should the completeness and quality of curated nanomaterial data be evaluated? Electronic supplementary information (ESI) available: (1) Detailed information regarding issues raised in the main text; (2) original survey responses. See DOI: 10.1039/c5nr08944a

  4. Directly e-mailing authors of newly published papers encourages community curation

    PubMed Central

    Bunt, Stephanie M.; Grumbling, Gary B.; Field, Helen I.; Marygold, Steven J.; Brown, Nicholas H.; Millburn, Gillian H.

    2012-01-01

    Much of the data within Model Organism Databases (MODs) comes from manual curation of the primary research literature. Given limited funding and an increasing density of published material, a significant challenge facing all MODs is how to efficiently and effectively prioritize the most relevant research papers for detailed curation. Here, we report recent improvements to the triaging process used by FlyBase. We describe an automated method to directly e-mail corresponding authors of new papers, requesting that they list the genes studied and indicate (‘flag’) the types of data described in the paper using an online tool. Based on the author-assigned flags, papers are then prioritized for detailed curation and channelled to appropriate curator teams for full data extraction. The overall response rate has been 44% and the flagging of data types by authors is sufficiently accurate for effective prioritization of papers. In summary, we have established a sustainable community curation program, with the result that FlyBase curators now spend less time triaging and can devote more effort to the specialized task of detailed data extraction. Database URL: http://flybase.org/ PMID:22554788
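
    A minimal sketch of the automated author-contact step described above: compose a triage e-mail asking the corresponding author to list genes and flag data types via an online tool. The addresses, URL, and PMID are hypothetical placeholders; actually sending the message (e.g., via smtplib) is deliberately omitted.

    ```python
    # Compose a triage request e-mail to a paper's corresponding author.
    from email.message import EmailMessage

    def triage_email(author_addr, pmid):
        msg = EmailMessage()
        msg["To"] = author_addr
        msg["From"] = "curators@example.org"   # hypothetical address
        msg["Subject"] = f"Your recent paper (PMID:{pmid}): help us curate it"
        msg.set_content(
            f"Please list the genes studied in PMID:{pmid} and flag the data\n"
            f"types it describes at https://example.org/flag/{pmid}\n"
        )
        return msg

    print(triage_email("author@example.edu", "12345678"))
    ```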

  5. Auditing the Assignments of Top-Level Semantic Types in the UMLS Semantic Network to UMLS Concepts

    PubMed Central

    He, Zhe; Perl, Yehoshua; Elhanan, Gai; Chen, Yan; Geller, James; Bian, Jiang

    2018-01-01

    The Unified Medical Language System (UMLS) is an important terminological system. By the policy of its curators, each concept of the UMLS should be assigned the most specific Semantic Types (STs) in the UMLS Semantic Network (SN). Hence, the Semantic Types of most UMLS concepts are assigned at or near the bottom (leaves) of the UMLS Semantic Network. While most ST assignments are correct, some errors do occur. Therefore, Quality Assurance efforts of UMLS curators for ST assignments should concentrate on automatically detected sets of UMLS concepts with higher error rates than random sets. In this paper, we investigate the assignments of top-level semantic types in the UMLS semantic network to concepts, identify potential erroneous assignments, define four categories of errors, and thus provide assistance to curators of the UMLS in avoiding these assignment errors. Human experts analyzed samples of concepts assigned 10 of the top-level semantic types and categorized the erroneous ST assignments into these four logical categories. Two thirds of the concepts assigned these 10 top-level semantic types are erroneous. Our results demonstrate that reviewing top-level semantic type assignments to concepts provides an effective way to perform UMLS quality assurance, compared to reviewing a random selection of semantic type assignments. PMID:29375930
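
    A toy sketch of the audit strategy the abstract describes: in a small mock semantic network, flag concepts assigned a top-level semantic type (one with no parent) as candidates for curator review, since policy says each concept should receive the most specific type available. The network and concept IDs are made up.

    ```python
    # Flag concepts assigned top-level semantic types for curator review.
    parents = {                      # child ST -> parent ST (mock network)
        "Organic Chemical": "Chemical",
        "Pharmacologic Substance": "Chemical",
        "Chemical": None,            # top-level
        "Finding": None,             # top-level
    }
    assignments = {
        "C0001": "Organic Chemical",
        "C0002": "Chemical",         # top-level -> review
        "C0003": "Finding",          # top-level -> review
    }
    to_review = [c for c, st in assignments.items() if parents.get(st) is None]
    print(to_review)                 # ['C0002', 'C0003']
    ```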

  6. Auditing the Assignments of Top-Level Semantic Types in the UMLS Semantic Network to UMLS Concepts.

    PubMed

    He, Zhe; Perl, Yehoshua; Elhanan, Gai; Chen, Yan; Geller, James; Bian, Jiang

    2017-11-01

    The Unified Medical Language System (UMLS) is an important terminological system. By the policy of its curators, each concept of the UMLS should be assigned the most specific Semantic Types (STs) in the UMLS Semantic Network (SN). Hence, the Semantic Types of most UMLS concepts are assigned at or near the bottom (leaves) of the UMLS Semantic Network. While most ST assignments are correct, some errors do occur. Therefore, Quality Assurance efforts of UMLS curators for ST assignments should concentrate on automatically detected sets of UMLS concepts with higher error rates than random sets. In this paper, we investigate the assignments of top-level semantic types in the UMLS semantic network to concepts, identify potential erroneous assignments, define four categories of errors, and thus provide assistance to curators of the UMLS in avoiding these assignment errors. Human experts analyzed samples of concepts assigned 10 of the top-level semantic types and categorized the erroneous ST assignments into these four logical categories. Two thirds of the concepts assigned these 10 top-level semantic types are erroneous. Our results demonstrate that reviewing top-level semantic type assignments to concepts provides an effective way to perform UMLS quality assurance, compared to reviewing a random selection of semantic type assignments.

  7. BioSurfDB: knowledge and algorithms to support biosurfactants and biodegradation studies

    PubMed Central

    Oliveira, Jorge S.; Araújo, Wydemberg; Lopes Sales, Ana Isabela; de Brito Guerra, Alaine; da Silva Araújo, Sinara Carla; de Vasconcelos, Ana Tereza Ribeiro; Agnez-Lima, Lucymara F.; Freitas, Ana Teresa

    2015-01-01

    Crude oil extraction, transportation and use cause the contamination of countless ecosystems. Therefore, bioremediation through surfactant mobilization or biodegradation is an important subject, both economically and environmentally. Bioremediation research received a great boost from recent advances in metagenomics, which enabled the sequencing of uncultured microorganisms and provided new insights into surfactant-producing and/or oil-degrading bacteria. Many research studies are making available genomic data from unknown organisms obtained through metagenomic analysis of oil-contaminated environmental samples. These new datasets demand the development of new tools and data repositories tailored to biological analysis in the context of bioremediation. This work presents BioSurfDB, www.biosurfdb.org, a curated relational information system integrating data from: (i) metagenomes; (ii) organisms; (iii) biodegradation-relevant genes, proteins and their metabolic pathways; (iv) bioremediation experiment results, with specific pollutant treatment efficiencies by surfactant-producing organisms; and (v) a curated biosurfactant list, grouped by producing organism, surfactant name, class and reference. The main goal of this repository is to gather information on the characterization of biological compounds and mechanisms involved in biosurfactant production and/or biodegradation and to make it available in a curated way, associated with a number of computational tools to support studies of genomic and metagenomic data. Database URL: www.biosurfdb.org PMID:25833955
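
    The abstract describes a curated relational system linking organisms, genes, surfactants and experiments. A minimal sqlite sketch of tables with that flavor; the column names are illustrative, not BioSurfDB's actual schema (the rhamnolipid/Pseudomonas aeruginosa example is a well-known biosurfactant pairing).

    ```python
    # Toy relational schema linking organisms to the surfactants they produce.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
    CREATE TABLE organism   (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE surfactant (id INTEGER PRIMARY KEY, name TEXT, class TEXT,
                             organism_id INTEGER REFERENCES organism(id));
    """)
    db.execute("INSERT INTO organism VALUES (1, 'Pseudomonas aeruginosa')")
    db.execute("INSERT INTO surfactant VALUES (1, 'rhamnolipid', 'glycolipid', 1)")

    query = """SELECT o.name, s.name, s.class
               FROM surfactant s JOIN organism o ON s.organism_id = o.id"""
    for row in db.execute(query):
        print(row)
    ```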

  8. PFR²: a curated database of planktonic foraminifera 18S ribosomal DNA as a resource for studies of plankton ecology, biogeography and evolution.

    PubMed

    Morard, Raphaël; Darling, Kate F; Mahé, Frédéric; Audic, Stéphane; Ujiié, Yurika; Weiner, Agnes K M; André, Aurore; Seears, Heidi A; Wade, Christopher M; Quillévéré, Frédéric; Douady, Christophe J; Escarguel, Gilles; de Garidel-Thoron, Thibault; Siccha, Michael; Kucera, Michal; de Vargas, Colomban

    2015-11-01

    Planktonic foraminifera (Rhizaria) are ubiquitous marine pelagic protists producing calcareous shells with conspicuous morphology. They play an important role in the marine carbon cycle, and their exceptional fossil record serves as the basis for biochronostratigraphy and past climate reconstructions. A major worldwide sampling effort over the last two decades has resulted in the establishment of multiple large collections of cryopreserved individual planktonic foraminifera samples. Thousands of 18S rDNA partial sequences have been generated, representing all major known morphological taxa across their worldwide oceanic range. This comprehensive data coverage provides an opportunity to assess patterns of molecular ecology and evolution in a holistic way for an entire group of planktonic protists. We combined all available published and unpublished genetic data to build PFR², the Planktonic foraminifera Ribosomal Reference database. The first version of the database includes 3322 reference 18S rDNA sequences belonging to 32 of the 47 known morphospecies of extant planktonic foraminifera, collected from 460 oceanic stations. All sequences have been rigorously taxonomically curated using a six-rank annotation system fully resolved to the morphological species level and linked to a series of metadata. The PFR² website, available at http://pfr2.sb-roscoff.fr, allows downloading the entire database or specific sections, as well as the identification of new planktonic foraminiferal sequences. Its novel, fully documented curation process integrates advances in morphological and molecular taxonomy, allowing the database's taxonomic resolution to increase over time while maintaining integrity through complete contingency tracking of annotations, which ensures that the annotations remain internally consistent. © 2015 John Wiley & Sons Ltd.
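
    A small sketch of what a multi-rank taxonomic annotation resolved to the morphospecies level might look like in practice. Both the rank labels and the example record are illustrative; they are not PFR²'s actual annotation scheme.

    ```python
    # Illustrative six-rank annotation record for a reference sequence.
    RANKS = ("order", "superfamily", "family", "genus",
             "species_group", "morphospecies")

    reference = {
        "pfr_seq_00001": ("Globigerinida", "Globigerinoidea", "Globigerinidae",
                          "Globigerinoides", "ruber_group",
                          "Globigerinoides ruber"),
    }

    for seq_id, annotation in reference.items():
        print(seq_id, dict(zip(RANKS, annotation)))
    ```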

  9. The Genesis Mission: Contamination Control and Curation

    NASA Technical Reports Server (NTRS)

    Stansbery, E. K.

    2002-01-01

    The Genesis mission, launched in August 2001, is collecting samples of the solar wind and will return to Earth in 2004. Genesis can be viewed as the most fundamental of NASA's sample return missions because it is expected to provide insight into the initial elemental and isotopic composition of the solar nebula from which all other planetary objects formed. The data from this mission will have a large impact on understanding the origins and diversity of planetary materials. The collectors consist of clean, pure materials into which the solar wind will embed. Science and engineering issues such as bulk purity, cleanliness, retention of solar wind, and the ability to withstand launch and entry drove material choices. Most of the collector materials are installed on array frames that are deployed from a clean science canister. Two of the arrays are continuously exposed for collecting the bulk solar wind; the other three are only exposed during specific solar wind regimes as measured by ion and electron monitors. Other materials are housed as targets at the focal point of an electrostatic mirror, or "concentrator", designed to enhance the flux of specific solar wind species. Johnson Space Center (JSC) has two principal responsibilities for the Genesis mission: contamination control and curation. Precise and accurate measurements of the composition of the solar atoms require that the collector materials be extremely clean and well characterized before launch and during the mission. Early involvement of JSC curation personnel in concept development resulted in a mission designed to minimize contaminants from the spacecraft and operations. A major goal of the Genesis mission is to provide a reservoir of materials for the 21st century. When the collector materials are returned to Earth, they must be handled in a clean manner and their condition well documented. Information gained in preliminary examination of the arrays and detailed surveys of each collector will be used to guide sample allocations to the scientific community. Samples allocated for analysis are likely to be small sections of individual collectors; therefore, subdividing the materials must take place in a clean, well-characterized way. A major focus of current research at JSC includes identifying and characterizing the contamination, waste, and alteration of the sample when using different subdividing, transport, and storage techniques, and developing protocols for reducing their impact on the scientific integrity of the mission.

  10. Gene regulation knowledge commons: community action takes care of DNA binding transcription factors

    PubMed Central

    Tripathi, Sushil; Vercruysse, Steven; Chawla, Konika; Christie, Karen R.; Blake, Judith A.; Huntley, Rachael P.; Orchard, Sandra; Hermjakob, Henning; Thommesen, Liv; Lægreid, Astrid; Kuiper, Martin

    2016-01-01

    A large gap remains between the amount of knowledge in the scientific literature and the fraction that gets curated into standardized databases, despite many curation initiatives. Yet the availability of comprehensive knowledge in databases is crucial for exploiting existing background knowledge, both for designing follow-up experiments and for interpreting new experimental data. Structured resources also underpin the computational integration and modeling of regulatory pathways, which further aids our understanding of regulatory dynamics. We argue how cooperation between the scientific community and professional curators can increase the capacity for capturing precise knowledge from the literature. We demonstrate this with a project in which we mobilize biological domain experts to curate large numbers of DNA-binding transcription factors, and show that they, although new to the field of curation, can make valuable contributions by harvesting reported knowledge from scientific papers. Such community curation can enhance the scientific epistemic process. Database URL: http://www.tfcheckpoint.org PMID:27270715

  11. Antarctic Meteorite Newsletter, Volume 8, Number 2

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Requests for samples are welcomed from research scientists of all countries, regardless of their current state of funding for meteorite studies. All sample requests will be reviewed by the Meteorite Working Group (MWG), a peer-review committee that guides the collection, curation, allocation, and distribution of the U.S. Antarctic meteorites. Issuance of samples does not imply a commitment by any agency to fund the proposed research. Requests for financial support must be submitted separately to the appropriate funding agencies. As a matter of policy, U.S. Antarctic meteorites are the property of the National Science Foundation, and all allocations are subject to recall.

  12. Mineralogy and Petrology of COMET WILD2 Nucleus Samples

    NASA Technical Reports Server (NTRS)

    Zolensky, Michael; Bland, Phil; Bradley, John; Brearley, Adrian; Brennan, Sean; Bridges, John; Brownlee, Donald; Butterworth, Anna; Dai, Zurong; Ebel, Denton

    2006-01-01

    The sample return capsule of the Stardust spacecraft will be recovered in northern Utah on January 15, 2006, and under nominal conditions it will be delivered to the new Stardust Curation Laboratory at the Johnson Space Center two days later. Within the first week we plan to begin harvesting the aerogel cells, and the comet nucleus samples they contain, for detailed analysis. By the time of the LPSC meeting, we will have been analyzing selected grains removed from these cells for more than one month. This presentation will give the first results from the mineralogical and petrological analyses that will have been performed.

  13. Osiris-Rex and Hayabusa2 Sample Cleanroom Design and Construction Planning at NASA-JSC

    NASA Technical Reports Server (NTRS)

    Righter, Kevin; Pace, Lisa F.; Messenger, Keiko

    2018-01-01

    The OSIRIS-REx asteroid sample return mission launched to asteroid Bennu on September 8, 2016. The spacecraft will arrive at Bennu in late 2019, orbit and map the asteroid, and perform a touch-and-go (TAG) sampling maneuver in July 2020. After confirmation of successful sample stowage, the spacecraft will return to Earth, and the sample return capsule (SRC) will land in Utah in September 2023. Samples will be recovered from Utah and then transported to and stored in a new sample cleanroom at NASA Johnson Space Center in Houston. All curation-specific examination and documentation activities related to Bennu samples will be conducted in the dedicated OSIRIS-REx sample cleanroom to be built at NASA-JSC.

  14. How should the completeness and quality of curated nanomaterial data be evaluated?

    PubMed Central

    Marchese Robinson, Richard L.; Lynch, Iseult; Peijnenburg, Willie; Rumble, John; Klaessig, Fred; Marquardt, Clarissa; Rauscher, Hubert; Puzyn, Tomasz; Purian, Ronit; Åberg, Christoffer; Karcher, Sandra; Vriens, Hanne; Hoet, Peter; Hoover, Mark D.; Hendren, Christine Ogilvie; Harper, Stacey L.

    2016-01-01

    Nanotechnology is of increasing significance. Curation of nanomaterial data into electronic databases offers opportunities to better understand and predict nanomaterials’ behaviour. This supports innovation in, and regulation of, nanotechnology. It is commonly understood that curated data need to be sufficiently complete and of sufficient quality to serve their intended purpose. However, assessing data completeness and quality is non-trivial in general and is arguably especially difficult in the nanoscience area, given its highly multidisciplinary nature. The current article, part of the Nanomaterial Data Curation Initiative series, addresses how to assess the completeness and quality of (curated) nanomaterial data. In order to address this key challenge, a variety of related issues are discussed: the meaning and importance of data completeness and quality, existing approaches to their assessment and the key challenges associated with evaluating the completeness and quality of curated nanomaterial data. Considerations which are specific to the nanoscience area and lessons which can be learned from other relevant scientific disciplines are considered. Hence, the scope of this discussion ranges from physicochemical characterisation requirements for nanomaterials and interference of nanomaterials with nanotoxicology assays to broader issues such as minimum information checklists, toxicology data quality schemes and computational approaches that facilitate evaluation of the completeness and quality of (curated) data. This discussion is informed by a literature review and a survey of key nanomaterial data curation stakeholders. Finally, drawing upon this discussion, recommendations are presented concerning the central question: how should the completeness and quality of curated nanomaterial data be evaluated? PMID:27143028

  15. Where Will All Your Samples Go?

    NASA Astrophysics Data System (ADS)

    Lehnert, K.

    2017-12-01

    Even in the digital age, physical samples remain an essential component of Earth and space science research. Geoscientists collect samples, sometimes locally, often in remote locations during expensive field expeditions, or at sample repositories and museums. They take these samples to their labs to describe and analyze them. When the analyses are completed and the results are published, the samples get stored away in sheds, basements, or desk drawers, where they remain unknown and inaccessible to the broad science community. In some cases, they will get re-analyzed or shared with other researchers, who know of their existence through personal connections. The sad end comes when the researcher retires: There are many stories of samples and entire collections being discarded to free up space for new samples or other purposes, even though these samples may be unique and irreplaceable. Institutions do not feel obligated and do not have the resources to store samples in perpetuity. Only samples collected in large sampling campaigns such as the Ocean Discovery Program or cores taken on ships find a home in repositories that curate and preserve them for reuse in future science endeavors. In the era of open, transparent, and reproducible science, preservation and persistent access to samples must be considered a mandate. Policies need to be developed that guide investigators, institutions, and funding agencies to plan and implement solutions for reliably and persistently curating and providing access to samples. Registration of samples in online catalogs and use of persistent identifiers such as the International Geo Sample Number are first steps to ensure discovery and access of samples. But digital discovery and access loses its value if the physical objects are not preserved and accessible. It is unreasonable to expect that every sample ever collected can be archived. Selection of those samples that are worth preserving requires guidelines and policies. We also need to define standards that institutions must comply with to function as a trustworthy sample repository similar to trustworthy digital repositories. The iSamples Research Coordination Network of the EarthCube program aims to address some of these questions in workshops planned for 2018. This panel session offers an opportunity to ignite the discussion.

  16. Integrative Functional Genomics for Systems Genetics in GeneWeaver.org.

    PubMed

    Bubier, Jason A; Langston, Michael A; Baker, Erich J; Chesler, Elissa J

    2017-01-01

    The abundance of existing functional genomics studies permits an integrative approach to interpreting and resolving the results of diverse systems genetics studies. However, a major challenge lies in assembling and harmonizing heterogeneous data sets across species for facile comparison to the positional candidate genes and coexpression networks that come from systems genetics studies. GeneWeaver is an online database and suite of tools at www.geneweaver.org that allows for fast aggregation and analysis of gene set-centric data. GeneWeaver contains curated experimental data together with resource-level data such as GO annotations, MP annotations, and KEGG pathways, along with persistent stores of user-entered data sets. These can be entered directly into GeneWeaver or transferred from widely used resources such as GeneNetwork.org. Data are analyzed using statistical tools and advanced graph algorithms to discover new relations, prioritize candidate genes, and generate functional hypotheses. Here we use GeneWeaver to find genes common to multiple gene sets, prioritize candidate genes from a quantitative trait locus, and characterize a set of differentially expressed genes. Coupling a large multispecies repository of curated and empirical functional genomics data to fast computational tools allows for the rapid integrative analysis of heterogeneous data for interpreting and extrapolating systems genetics results.
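
    The gene-set-centric analysis the abstract mentions can be shown in miniature: find the genes common to multiple gene sets, and rank genes by how many sets contain them as a simple candidate-prioritization heuristic. The gene sets below are made up for illustration.

    ```python
    # Intersect gene sets and rank genes by set membership count.
    from collections import Counter

    gene_sets = {
        "QTL_candidates": {"Drd2", "Comt", "Bdnf", "Gria1"},
        "diff_expressed": {"Bdnf", "Gria1", "Fos"},
        "GO_behavior":    {"Drd2", "Bdnf", "Htr1a"},
    }

    common = set.intersection(*gene_sets.values())
    counts = Counter(g for s in gene_sets.values() for g in s)
    print("in all sets:", common)            # {'Bdnf'}
    print("ranked:", counts.most_common())
    ```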

  17. MET network in PubMed: a text-mined network visualization and curation system.

    PubMed

    Dai, Hong-Jie; Su, Chu-Hsien; Lai, Po-Ting; Huang, Ming-Siang; Jonnagaddala, Jitendra; Rose Jue, Toni; Rao, Shruti; Chou, Hui-Jou; Milacic, Marija; Singh, Onkar; Syed-Abdul, Shabbir; Hsu, Wen-Lian

    2016-01-01

    Metastasis is the dissemination of a cancer/tumor from one organ to another, and it is the most dangerous stage during cancer progression, causing more than 90% of cancer deaths. Improving the understanding of the complicated cellular mechanisms underlying metastasis requires investigations of the signaling pathways. To this end, we developed a METastasis (MET) network visualization and curation tool to assist metastasis researchers retrieve network information of interest while browsing through the large volume of studies in PubMed. MET can recognize relations among genes, cancers, tissues and organs of metastasis mentioned in the literature through text-mining techniques, and then produce a visualization of all mined relations in a metastasis network. To facilitate the curation process, MET is developed as a browser extension that allows curators to review and edit concepts and relations related to metastasis directly in PubMed. PubMed users can also view the metastatic networks integrated from the large collection of research papers directly through MET. For the BioCreative 2015 interactive track (IAT), a curation task was proposed to curate metastatic networks among PubMed abstracts. Six curators participated in the proposed task and a post-IAT task, curating 963 unique metastatic relations from 174 PubMed abstracts using MET. Database URL: http://btm.tmu.edu.tw/metastasisway. © The Author(s) 2016. Published by Oxford University Press.
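
    The core data structure behind such a tool can be pictured as a directed graph from a primary cancer to target organs, with each edge carrying its supporting literature. The sketch below illustrates that idea only; the relation syntax and the example relations and PubMed IDs are invented placeholders, not MET's implementation or curated data.

      # Sketch of a text-mined metastasis network: directed edges from a
      # primary cancer to target organs, each keeping its evidence PMIDs.
      from collections import defaultdict

      network = defaultdict(lambda: defaultdict(set))

      def add_relation(cancer, target_organ, pmid):
          """Record one mined relation and the abstract supporting it."""
          network[cancer][target_organ].add(pmid)

      add_relation("breast cancer", "bone", "PMID:0000001")
      add_relation("breast cancer", "lung", "PMID:0000002")
      add_relation("colorectal cancer", "liver", "PMID:0000003")

      for cancer, targets in network.items():
          for organ, pmids in targets.items():
              print(f"{cancer} -> {organ} (evidence: {', '.join(sorted(pmids))})")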

  18. Agile Data Curation Case Studies Leading to the Identification and Development of Data Curation Design Patterns

    NASA Astrophysics Data System (ADS)

    Benedict, K. K.; Lenhardt, W. C.; Young, J. W.; Gordon, L. C.; Hughes, S.; Santhana Vannan, S. K.

    2017-12-01

    The planning for and development of efficient workflows for the creation, reuse, sharing, documentation, publication, and preservation of research data is a general challenge that research teams of all sizes face. In response to (1) requirements from funding agencies for full-lifecycle data management plans that will result in well documented, preserved, and shared research data products; (2) increasing requirements from publishers for shared data in conjunction with submitted papers; (3) interdisciplinary research teams' needs for efficient data sharing within projects; and (4) increasing reuse of research data for replication and for new, unanticipated research, policy development, and public use, alternative strategies to traditional data life cycle approaches must be developed and shared that enable research teams to meet these requirements while meeting the core science objectives of their projects within the available resources. In support of achieving these goals, the concept of Agile Data Curation has been developed through parallel activities in support of (1) identifying a set of shared values and principles that underlie the objectives of agile data curation, (2) soliciting case studies from the Earth science and other research communities that illustrate aspects of what the contributors consider agile data curation methods and practices, and (3) identifying or developing design patterns, i.e., high-level abstractions from successful data curation practice that address common data curation problems for which common solution strategies may be employed. This paper provides a collection of case studies contributed by the Earth science community and an initial analysis that maps those case studies to emerging shared data curation problems and their potential solutions. Following this initial analysis, existing design patterns from software engineering and related disciplines are identified as a starting point for the development of a catalog of data curation design patterns that may be reused in the design and execution of new data curation processes.

  19. Natural Language Processing in aid of FlyBase curators

    PubMed Central

    Karamanis, Nikiforos; Seal, Ruth; Lewin, Ian; McQuilton, Peter; Vlachos, Andreas; Gasperin, Caroline; Drysdale, Rachel; Briscoe, Ted

    2008-01-01

    Background Despite increasing interest in applying Natural Language Processing (NLP) to biomedical text, whether this technology can facilitate tasks such as database curation remains unclear. Results PaperBrowser is the first NLP-powered interface that was developed under a user-centered approach to improve the way in which FlyBase curators navigate an article. In this paper, we first discuss how observing curators at work informed the design and evaluation of PaperBrowser. Then, we present how we appraised PaperBrowser's navigational functionalities in a user-based study using a text highlighting task and evaluation criteria of Human-Computer Interaction. Our results show that PaperBrowser reduces the number of interactions between two highlighting events and therefore improves navigational efficiency by about 58% compared to the navigational mechanism that was previously available to the curators. Moreover, PaperBrowser is shown to provide curators with enhanced navigational utility by over 74% irrespective of the different ways in which they highlight text in the article. Conclusion We show that state-of-the-art performance in certain NLP tasks such as Named Entity Recognition and Anaphora Resolution can be combined with the navigational functionalities of PaperBrowser to support curation quite successfully. PMID:18410678

  20. Text Mining to Support Gene Ontology Curation and Vice Versa.

    PubMed

    Ruch, Patrick

    2017-01-01

    In this chapter, we explain how text mining can support the curation of molecular biology databases dealing with protein functions. We also show how curated data can play a disruptive role in the development of text mining methods. We review a decade of efforts to improve the automatic assignment of Gene Ontology (GO) descriptors, the reference ontology for the characterization of genes and gene products. To illustrate the high potential of this approach, we report the performance of an automatic text categorizer and show a large improvement of +225% in both precision and recall on benchmarked data. We argue that automatic text categorization functions can ultimately be embedded into a Question-Answering (QA) system to answer questions related to protein functions. Because GO descriptors can be relatively long and specific, traditional QA systems cannot answer such questions. A new type of QA system, so-called Deep QA, which uses machine learning methods trained with curated contents, is thus emerging. Finally, future advances of text mining instruments are directly dependent on the availability of high-quality annotated contents at every curation step. Database workflows must start recording explicitly all the data they curate and ideally also some of the data they do not curate.
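
    The categorization step this chapter discusses, assigning GO descriptors to text, can be illustrated with a toy scorer. The sketch below matches a passage against descriptor terms by token overlap; the GO IDs and descriptors are real vocabulary entries, but the scoring is a deliberately simple stand-in for the trained categorizers the chapter reviews.

      # Sketch of automatic text categorization over GO descriptors:
      # score each candidate descriptor by token overlap with the passage.
      def tokenize(text):
          return set(text.lower().split())

      go_terms = {
          "GO:0006914": "autophagy",
          "GO:0006915": "apoptotic process",
          "GO:0016567": "protein ubiquitination",
      }

      def categorize(passage, top_k=2):
          words = tokenize(passage)
          scored = []
          for go_id, descriptor in go_terms.items():
              overlap = len(words & tokenize(descriptor))
              scored.append((overlap, go_id, descriptor))
          scored.sort(reverse=True)
          return [(g, d) for s, g, d in scored[:top_k] if s > 0]

      print(categorize("the protein undergoes ubiquitination before degradation"))
      # -> [('GO:0016567', 'protein ubiquitination')]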

  1. Practices of research data curation in institutional repositories: A qualitative view from repository staff

    PubMed Central

    Stvilia, Besiki

    2017-01-01

    The importance of managing research data has been emphasized by the government, funding agencies, and scholarly communities. Increased access to research data increases the impact and efficiency of scientific activities and funding. Thus, many research institutions have established or plan to establish research data curation services as part of their Institutional Repositories (IRs). However, in order to design effective research data curation services in IRs, and to build active research data providers and user communities around those IRs, it is essential to study current data curation practices and provide rich descriptions of the sociotechnical factors and relationships shaping those practices. Based on 13 interviews with 15 IR staff members from 13 large research universities in the United States, this paper provides a rich, qualitative description of research data curation and use practices in IRs. In particular, the paper identifies data curation and use activities in IRs, as well as their structures, roles played, skills needed, contradictions and problems present, solutions sought, and workarounds applied. The paper can inform the development of best practice guides, infrastructure and service templates, as well as education in research data curation in Library and Information Science (LIS) schools. PMID:28301533

  2. Practices of research data curation in institutional repositories: A qualitative view from repository staff.

    PubMed

    Lee, Dong Joon; Stvilia, Besiki

    2017-01-01

    The importance of managing research data has been emphasized by the government, funding agencies, and scholarly communities. Increased access to research data increases the impact and efficiency of scientific activities and funding. Thus, many research institutions have established or plan to establish research data curation services as part of their Institutional Repositories (IRs). However, in order to design effective research data curation services in IRs, and to build active research data providers and user communities around those IRs, it is essential to study current data curation practices and provide rich descriptions of the sociotechnical factors and relationships shaping those practices. Based on 13 interviews with 15 IR staff members from 13 large research universities in the United States, this paper provides a rich, qualitative description of research data curation and use practices in IRs. In particular, the paper identifies data curation and use activities in IRs, as well as their structures, roles played, skills needed, contradictions and problems present, solutions sought, and workarounds applied. The paper can inform the development of best practice guides, infrastructure and service templates, as well as education in research data curation in Library and Information Science (LIS) schools.

  3. Development and Validation of Pathogen Environmental Monitoring Programs for Small Cheese Processing Facilities.

    PubMed

    Beno, Sarah M; Stasiewicz, Matthew J; Andrus, Alexis D; Ralyea, Robert D; Kent, David J; Martin, Nicole H; Wiedmann, Martin; Boor, Kathryn J

    2016-12-01

    Pathogen environmental monitoring programs (EMPs) are essential for food processing facilities of all sizes that produce ready-to-eat food products exposed to the processing environment. We developed, implemented, and evaluated EMPs targeting Listeria spp. and Salmonella in nine small cheese processing facilities, including seven farmstead facilities. Individual EMPs with monthly sample collection protocols were designed specifically for each facility. Salmonella was detected in only one facility, with likely introduction from the adjacent farm indicated by pulsed-field gel electrophoresis data. Listeria spp. were isolated from all nine facilities during routine sampling. The overall Listeria spp. (other than Listeria monocytogenes) and L. monocytogenes prevalences in the 4,430 environmental samples collected were 6.03 and 1.35%, respectively. Molecular characterization and subtyping data suggested persistence of a given Listeria spp. strain in seven facilities and persistence of L. monocytogenes in four facilities. To assess routine sampling plans, validation sampling for Listeria spp. was performed in seven facilities after at least 6 months of routine sampling. This validation sampling was performed by independent individuals and included collection of 50 to 150 samples per facility, based on statistical sample size calculations. Two of the facilities had a significantly higher frequency of detection of Listeria spp. during the validation sampling than during routine sampling, whereas two other facilities had significantly lower frequencies of detection. This study provides a model for a science- and statistics-based approach to developing and validating pathogen EMPs.
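
    One common form of the statistical sample size calculation mentioned above asks how many samples are needed to detect at least one positive with a given confidence, assuming independent samples at a fixed true prevalence. The sketch below applies that standard formula to the prevalence values reported in the abstract; the 95% confidence level is an assumption for illustration, not a figure from the study.

      # Sketch: samples needed so that P(at least one positive) >= confidence.
      import math

      def samples_needed(prevalence, confidence=0.95):
          return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

      for p in (0.0603, 0.0135):   # Listeria spp. and L. monocytogenes rates
          print(f"prevalence {p:.2%}: need {samples_needed(p)} samples")
      # -> roughly 49 and 221 samples, respectively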

  4. Ultra Pure Water Cleaning Baseline Study on NASA JSC Astromaterial Curation Gloveboxes

    NASA Technical Reports Server (NTRS)

    Calaway, Michael J.; Burkett, P. J.; Allton, J. H.; Allen, C. C.

    2013-01-01

    Future sample return missions will require strict protocols and procedures for reducing inorganic and organic contamination in isolation containment systems. In 2012, a baseline study was orchestrated to establish the current state of organic cleanliness in gloveboxes used by NASA JSC astromaterials curation labs [1, 2]. As part of this in-depth organic study, the current curatorial technical support procedure (TSP) 23 was used for cleaning the gloveboxes with ultra pure water (UPW) [3-5]. Particle counts and identification were obtained that could be used as a benchmark for future mission designs that require glovebox decontamination. The UPW baseline study demonstrates that TSP 23 works well for gloveboxes that have been thoroughly degreased. However, TSP 23 could be augmented to provide even better glovebox decontamination. JSC 03243 could be used as a starting point for further investigating optimal cleaning techniques and procedures. DuPont Vertrel XF or other chemical substitutes to replace Freon-113, mechanical scrubbing, and newer technology could be used to enhance glovebox cleanliness in addition to high purity UPW final rinsing. Future sample return missions will significantly benefit from further cleaning studies to reduce inorganic and organic contamination.

  5. Text Mining Effectively Scores and Ranks the Literature for Improving Chemical-Gene-Disease Curation at the Comparative Toxicogenomics Database

    PubMed Central

    Johnson, Robin J.; Lay, Jean M.; Lennon-Hopkins, Kelley; Saraceni-Richards, Cynthia; Sciaky, Daniela; Murphy, Cynthia Grondin; Mattingly, Carolyn J.

    2013-01-01

    The Comparative Toxicogenomics Database (CTD; http://ctdbase.org/) is a public resource that curates interactions between environmental chemicals and gene products, and their relationships to diseases, as a means of understanding the effects of environmental chemicals on human health. CTD provides a triad of core information in the form of chemical-gene, chemical-disease, and gene-disease interactions that are manually curated from scientific articles. To increase the efficiency, productivity, and data coverage of manual curation, we have leveraged text mining to help rank and prioritize the triaged literature. Here, we describe our text-mining process that computes and assigns each article a document relevancy score (DRS), wherein a high DRS suggests that an article is more likely to be relevant for curation at CTD. We evaluated our process by first text mining a corpus of 14,904 articles triaged for seven heavy metals (cadmium, cobalt, copper, lead, manganese, mercury, and nickel). Based upon initial analysis, a representative subset corpus of 3,583 articles was then selected from the 14,904 articles and sent to five CTD biocurators for review. The resulting curation of these 3,583 articles was analyzed for a variety of parameters, including article relevancy, novel data content, interaction yield rate, mean average precision, and biological and toxicological interpretability. We show that for all measured parameters, the DRS is an effective indicator for scoring and improving the ranking of literature for the curation of chemical-gene-disease information at CTD. Here, we demonstrate how fully incorporating text mining-based DRS scoring into our curation pipeline enhances manual curation by prioritizing more relevant articles, thereby increasing data content, productivity, and efficiency. PMID:23613709
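
    The ranking step itself is simple once a relevancy score exists. As a minimal sketch, the snippet below assigns a toy keyword-weighted score to each abstract and sorts the corpus by it; the keywords, weights, and example articles are invented, and CTD's actual DRS comes from a trained text-mining pipeline rather than keyword counts.

      # Sketch of DRS-style triage: score each article, rank the corpus.
      keyword_weights = {"cadmium": 2.0, "exposure": 1.0, "gene": 1.0,
                         "expression": 1.0, "toxicity": 1.5}

      articles = {
          "PMID:111": "cadmium exposure alters gene expression in liver",
          "PMID:222": "a survey of museum curation practices",
          "PMID:333": "nickel toxicity and metallothionein gene induction",
      }

      def drs(abstract):
          return sum(keyword_weights.get(w, 0.0)
                     for w in abstract.lower().split())

      ranked = sorted(articles, key=lambda pmid: drs(articles[pmid]), reverse=True)
      for pmid in ranked:
          print(f"{pmid}  DRS={drs(articles[pmid]):.1f}")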

  6. The Nanomaterial Data Curation Initiative: A Collaborative Approach to Assessing, Evaluating, and Advancing the State of the Field

    EPA Science Inventory

    The Nanomaterial Data Curation Initiative (NDCI) explores the critical aspect of data curation within the development of informatics approaches to understanding nanomaterial behavior. Data repositories and tools for integrating and interrogating complex nanomaterial datasets are...

  7. The art and science of data curation: Lessons learned from constructing a virtual collection

    NASA Astrophysics Data System (ADS)

    Bugbee, Kaylin; Ramachandran, Rahul; Maskey, Manil; Gatlin, Patrick

    2018-03-01

    A digital, or virtual, collection is a value-added service developed by libraries that curates information and resources around a topic, theme or organization. Adoption of the virtual collection concept as an Earth science data service improves the discoverability, accessibility and usability of data not only within individual data centers but also across data centers and disciplines. In this paper, we introduce a methodology for systematically and rigorously curating Earth science data and information into a cohesive virtual collection. This methodology builds on the geocuration model of searching, selecting and synthesizing Earth science data, metadata and other information into a single and useful collection. We present our experiences curating a virtual collection for one of NASA's twelve Distributed Active Archive Centers (DAACs), the Global Hydrology Resource Center (GHRC), and describe lessons learned as a result of this curation effort. We also provide recommendations and best practices for data centers and data providers who wish to curate virtual collections for the Earth sciences.

  8. Non-synonymous variations in cancer and their effects on the human proteome: workflow for NGS data biocuration and proteome-wide analysis of TCGA data.

    PubMed

    Cole, Charles; Krampis, Konstantinos; Karagiannis, Konstantinos; Almeida, Jonas S; Faison, William J; Motwani, Mona; Wan, Quan; Golikov, Anton; Pan, Yang; Simonyan, Vahan; Mazumder, Raja

    2014-01-27

    Next-generation sequencing (NGS) technologies have resulted in petabytes of scattered data, decentralized in archives, databases and sometimes in isolated hard-disks which are inaccessible for browsing and analysis. It is expected that curated secondary databases will help organize some of this Big Data, thereby allowing users to better navigate, search and compute on it. To address the above challenge, we have implemented a NGS biocuration workflow and are analyzing short read sequences and associated metadata from cancer patients to better understand the human variome. Curation of variation and other related information from control (normal tissue) and case (tumor) samples will provide comprehensive background information that can be used in genomic medicine research and application studies. Our approach includes a CloudBioLinux Virtual Machine which is used upstream of an integrated High-performance Integrated Virtual Environment (HIVE) that encapsulates Curated Short Read archive (CSR) and a proteome-wide variation effect analysis tool (SNVDis). As a proof-of-concept, we have curated and analyzed control and case breast cancer datasets from the NCI cancer genomics program - The Cancer Genome Atlas (TCGA). Our efforts include reviewing and recording in CSR available clinical information on patients, mapping of the reads to the reference genome, followed by identification of non-synonymous Single Nucleotide Variations (nsSNVs), and integrating the data with tools that allow analysis of the effects of nsSNVs on the human proteome. Furthermore, we have also developed a novel phylogenetic analysis algorithm that uses SNV positions and can be used to classify the patient population. The workflow described here lays the foundation for analysis of short read sequence data to identify rare and novel SNVs that are not present in dbSNP and therefore provides a more comprehensive understanding of the human variome. Variation results for single genes as well as the entire study are available from the CSR website (http://hive.biochemistry.gwu.edu/dna.cgi?cmd=csr). Availability of thousands of sequenced samples from patients provides a rich repository of sequence information that can be utilized to identify individual level SNVs and their effect on the human proteome beyond what the dbSNP database provides.
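
    One step of such a workflow, keeping only non-synonymous variants that are absent from dbSNP so that rare or novel calls remain, can be sketched as a simple filter. The variant records and the known-ID set below are illustrative placeholders, not the CSR data model.

      # Sketch: filter for novel non-synonymous SNVs not present in dbSNP.
      variants = [
          {"id": "rs121913529", "gene": "KRAS",  "effect": "missense"},
          {"id": None,          "gene": "TP53",  "effect": "missense"},
          {"id": None,          "gene": "BRCA1", "effect": "synonymous"},
      ]

      known_dbsnp = {"rs121913529"}

      def is_novel_nssnv(v):
          """Non-synonymous and not already catalogued in dbSNP."""
          in_dbsnp = v["id"] in known_dbsnp if v["id"] else False
          return v["effect"] != "synonymous" and not in_dbsnp

      novel = [v for v in variants if is_novel_nssnv(v)]
      print([v["gene"] for v in novel])   # -> ['TP53']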

  9. Non-synonymous variations in cancer and their effects on the human proteome: workflow for NGS data biocuration and proteome-wide analysis of TCGA data

    PubMed Central

    2014-01-01

    Background Next-generation sequencing (NGS) technologies have resulted in petabytes of scattered data, decentralized in archives, databases and sometimes in isolated hard-disks which are inaccessible for browsing and analysis. It is expected that curated secondary databases will help organize some of this Big Data, thereby allowing users to better navigate, search and compute on it. Results To address the above challenge, we have implemented a NGS biocuration workflow and are analyzing short read sequences and associated metadata from cancer patients to better understand the human variome. Curation of variation and other related information from control (normal tissue) and case (tumor) samples will provide comprehensive background information that can be used in genomic medicine research and application studies. Our approach includes a CloudBioLinux Virtual Machine which is used upstream of an integrated High-performance Integrated Virtual Environment (HIVE) that encapsulates Curated Short Read archive (CSR) and a proteome-wide variation effect analysis tool (SNVDis). As a proof-of-concept, we have curated and analyzed control and case breast cancer datasets from the NCI cancer genomics program - The Cancer Genome Atlas (TCGA). Our efforts include reviewing and recording in CSR available clinical information on patients, mapping of the reads to the reference genome, followed by identification of non-synonymous Single Nucleotide Variations (nsSNVs), and integrating the data with tools that allow analysis of the effects of nsSNVs on the human proteome. Furthermore, we have also developed a novel phylogenetic analysis algorithm that uses SNV positions and can be used to classify the patient population. The workflow described here lays the foundation for analysis of short read sequence data to identify rare and novel SNVs that are not present in dbSNP and therefore provides a more comprehensive understanding of the human variome. Variation results for single genes as well as the entire study are available from the CSR website (http://hive.biochemistry.gwu.edu/dna.cgi?cmd=csr). Conclusions Availability of thousands of sequenced samples from patients provides a rich repository of sequence information that can be utilized to identify individual level SNVs and their effect on the human proteome beyond what the dbSNP database provides. PMID:24467687

  10. The yellow Fever epidemic in Western mali, september-november 1987: why did epidemiological surveillance fail?

    PubMed

    Kurz, X

    1990-03-01

    Recent yellow fever epidemics in West Africa have underlined the discrepancy between the official number of cases and deaths and those estimated by a retrospective epidemiological investigation. During the yellow fever epidemic that broke out in western Mali in September 1987, a total of 305 cases and 145 deaths were officially notified, but estimates revealed true figures about five times higher. This paper discusses the factors that hindered early case detection and more complete reporting. They were, first, the insufficient training on the clinical diagnosis, the blood sampling method for laboratory confirmation, and the curative treatment of patients (resulting in low utilization of services); second, the lack of an action plan to prepare in advance a quick response to the epidemic, affecting reporting procedures at the peripheral level and active case-finding during the outbreak; and third, the lack of laboratory facilities for a quick confirmation of the disease. The difficulties experienced during the yellow fever epidemic in Mali demonstrated the importance of a preparedness strategy for epidemic control, based on an integrated approach of epidemiological surveillance within basic health service activities. The need for regional collaboration and for institutionalized funds in the donor community that could be mobilized for epidemic preparedness activities is also emphasized.

  11. Preserving the Science Legacy from the Apollo Missions to the Moon

    NASA Technical Reports Server (NTRS)

    Evans, Cindy; Zeigler, Ryan; Lehnert, Kerstin; Todd, Nancy; Blumenfeld, Erika

    2015-01-01

    Six Apollo missions landed on the Moon from 1969-72, returning to Earth 382 kg of lunar rock, soil, and core samples, among the best documented and preserved samples on Earth, which have supported a robust research program for 45 years. From mission planning through sample collection, preliminary examination, and subsequent research, strict protocols and procedures are followed for handling and allocating Apollo subsamples. Even today, hundreds of samples are allocated for research each year, building on the science foundation laid down by the early Apollo sample studies and combining new data from today's instrumentation, lunar remote sensing missions and lunar meteorites. Today's research includes advances in our understanding of lunar volatiles, lunar formation and evolution, and the origin of evolved lunar lithologies. Much sample information is available to researchers at curator.jsc.nasa.gov. Decades of analyses on lunar samples are published in LPSC proceedings volumes and other peer-reviewed journals, and tabulated in lunar sample compendia entries. However, for much of the 1969-1995 period, the processing documentation, individual and consortia analyses, and unpublished results exist only in analog forms or primitive digital formats that are either inaccessible or at risk of being lost forever because critical data from early investigators remain unpublished. We have initiated several new efforts to rescue some of the early Apollo data, including unpublished analytical data. We are scanning NASA documentation that is related to the Apollo missions and sample processing, and we are collaborating with IEDA to establish a geochemical database called Moon DB. To populate this database, we are working with prominent lunar PIs to organize and transcribe years of both published and unpublished data. Other initiatives include micro-CT scanning of complex lunar samples to document their interior structure (e.g. clasts, vesicles); linking high-resolution scans of Apollo film products to samples; and new procedures for systematic high resolution photography of samples before additional processing, enabling detailed 3D reconstructions of the samples. All of these efforts will provide comprehensive access to Apollo samples and support better curation of the samples for decades to come.

  12. Sample Return Primer and Handbook

    NASA Technical Reports Server (NTRS)

    Barrow, Kirk; Cheuvront, Allan; Faris, Grant; Hirst, Edward; Mainland, Nora; McGee, Michael; Szalai, Christine; Vellinga, Joseph; Wahl, Thomas; Williams, Kenneth; hide

    2007-01-01

    This three-part Sample Return Primer and Handbook provides a road map for conducting the terminal phase of a sample return mission. The main chapters describe element-by-element analyses and trade studies, as well as required operations plans, procedures, contingencies, interfaces, and corresponding documentation. Based on the experiences of the lead Stardust engineers, the topics include systems engineering (in particular range safety compliance), mission design and navigation, spacecraft hardware and entry, descent, and landing certification, flight and recovery operations, mission assurance and system safety, test and training, and the very important interactions with external support organizations (non-NASA tracking assets, landing site support, and science curation).

  13. Effect of an exercise training intervention with resistance bands on blood cell counts during chemotherapy for lung cancer: a pilot randomized controlled trial.

    PubMed

    Karvinen, Kristina H; Esposito, David; Raedeke, Thomas D; Vick, Joshua; Walker, Paul R

    2014-01-01

    Chemotherapy for lung cancer can have a detrimental effect on white blood cell (WBC) and red blood cell (RBC) counts. Physical exercise may have a role in improving WBCs and RBCs, although few studies have examined cancer patients receiving adjuvant therapies. The purpose of this pilot trial was to examine the effects of an exercise intervention utilizing resistance bands on WBCs and RBCs in lung cancer patients receiving curative intent chemotherapy. A sample of lung cancer patients scheduled for curative intent chemotherapy was randomly assigned to the exercise intervention (EX) condition or usual care (UC) condition. The EX condition participated in a three times weekly exercise program using resistance bands for the duration of chemotherapy. A total of 14 lung cancer patients completed the trial. EX condition participants completed 79% of planned exercise sessions. The EX condition was able to maintain WBCs over the course of the intervention compared to declines in the UC condition (p = .008; d = 1.68). There were no significant differences in change scores in RBCs. Exercise with resistance bands may help attenuate declines in WBCs in lung cancer patients receiving curative intent chemotherapy. Larger trials are warranted to validate these findings. Ultimately these findings could be informative for the development of supportive care strategies for lung cancer patients receiving chemotherapy. Clinical Trials Registration #: NCT01130714.

  14. Preserving the Science Legacy from the Apollo Missions to the Moon

    NASA Astrophysics Data System (ADS)

    Todd, N. S.; Evans, C. A.; Zeigler, R. A.; Lehnert, K. A.

    2015-12-01

    Six Apollo missions landed on the Moon from 1969-72, returning to Earth 382 kg of lunar rock, soil, and core samples—among the best documented and preserved samples on Earth that have supported a robust research program for 45 years. From mission planning through sample collection, preliminary examination, and subsequent research, strict protocols and procedures are followed for handling and allocating Apollo subsamples. Even today, hundreds of samples are allocated for research each year, building on the science foundation laid down by the early Apollo sample studies and combining new data from today's instrumentation, lunar remote sensing missions and lunar meteorites. Today's research includes advances in our understanding of lunar volatiles, lunar formation and evolution, and the origin of evolved lunar lithologies. Much sample information is available to researchers at curator.jsc.nasa.gov. Decades of analyses on lunar samples are published in LPSC proceedings volumes and other peer-reviewed journals, and tabulated in lunar sample compendia entries. However, for much of the 1969-1995 period, the processing documentation, individual and consortia analyses, and unpublished results exist only in analog forms or primitive digital formats that are either inaccessible or at risk of being lost forever because critical data from early investigators remain unpublished. We have initiated several new efforts to rescue some of the early Apollo data, including unpublished analytical data. We are scanning NASA documentation that is related to the Apollo missions and sample processing, and we are collaborating with IEDA to establish a geochemical database called Moon DB. To populate this database, we are working with prominent lunar PIs to organize and transcribe years of both published and unpublished data. Other initiatives include micro-CT scanning of complex lunar samples to document their interior structure (e.g. clasts, vesicles); linking high-resolution scans of Apollo film products to samples; and new procedures for systematic high resolution photography of samples before additional processing, enabling detailed 3D reconstructions of the samples. All of these efforts will provide comprehensive access to Apollo samples and support better curation of the samples for decades to come.

  15. SolCyc: a database hub at the Sol Genomics Network (SGN) for the manual curation of metabolic networks in Solanum and Nicotiana specific databases

    PubMed Central

    Foerster, Hartmut; Bombarely, Aureliano; Battey, James N D; Sierro, Nicolas; Ivanov, Nikolai V; Mueller, Lukas A

    2018-01-01

    SolCyc is the entry portal to pathway/genome databases (PGDBs) for major species of the Solanaceae family hosted at the Sol Genomics Network. Currently, SolCyc comprises six organism-specific PGDBs for tomato, potato, pepper, petunia, tobacco and one Rubiaceae, coffee. The metabolic networks of those PGDBs have been computationally predicted by the PathoLogic component of the Pathway Tools software using the manually curated multi-domain database MetaCyc (http://www.metacyc.org/) as reference. SolCyc has been recently extended by taxon-specific databases, i.e. the family-specific SolanaCyc database, containing only curated data pertinent to species of the nightshade family, and NicotianaCyc, a genus-specific database that stores all relevant metabolic data of the Nicotiana genus. Through manual curation of the published literature, new metabolic pathways have been created in those databases, which are complemented by the continuously updated, relevant species-specific pathways from MetaCyc. At present, SolanaCyc comprises 199 pathways and 29 superpathways and NicotianaCyc accounts for 72 pathways and 13 superpathways. Curator-maintained, taxon-specific databases such as SolanaCyc and NicotianaCyc are characterized by an enrichment of data specific to these taxa and free of falsely predicted pathways. Both databases have been used to update recently created Nicotiana-specific databases for Nicotiana tabacum, Nicotiana benthamiana, Nicotiana sylvestris and Nicotiana tomentosiformis by propagating verifiable data into those PGDBs. In addition, in-depth curation of the pathways in N. tabacum has been carried out, which resulted in the elimination of 156 pathways from the 569 pathways predicted by Pathway Tools. Together, in-depth curation of the predicted pathway network and the supplementation with curated data from taxon-specific databases has substantially improved the curation status of the species-specific N. tabacum PGDB. The implementation of this strategy will significantly advance the curation status of all organism-specific databases in SolCyc resulting in the improvement on database accuracy, data analysis and visualization of biochemical networks in those species. Database URL: https://solgenomics.net/tools/solcyc/ PMID:29762652

  16. Automated PDF highlighting to support faster curation of literature for Parkinson's and Alzheimer's disease.

    PubMed

    Wu, Honghan; Oellrich, Anika; Girges, Christine; de Bono, Bernard; Hubbard, Tim J P; Dobson, Richard J B

    2017-01-01

    Neurodegenerative disorders such as Parkinson's and Alzheimer's disease are devastating and costly illnesses, a source of major global burden. In order to provide successful interventions for patients and reduce costs, both causes and pathological processes need to be understood. The ApiNATOMY project aims to contribute to our understanding of neurodegenerative disorders by manually curating and abstracting data from the vast body of literature amassed on these illnesses. As curation is labour-intensive, we aimed to speed up the process by automatically highlighting those parts of the PDF document of primary importance to the curator. Using techniques similar to those of summarisation, we developed an algorithm that relies on linguistic, semantic and spatial features. Employing this algorithm on a test set manually corrected for tool imprecision, we achieved a macro F1-measure of 0.51, which is an increase of 132% compared to the best bag-of-words baseline model. A user-based evaluation was also conducted to assess the usefulness of the methodology on 40 unseen publications, which reveals that in 85% of cases all highlighted sentences are relevant to the curation task and in about 65% of the cases, the highlights are sufficient to support the knowledge curation task without needing to consult the full text. In conclusion, we believe that these are promising results for a step in automating the recognition of curation-relevant sentences. Refining our approach to pre-digest papers will lead to faster processing and cost reduction in the curation process. https://github.com/KHP-Informatics/NapEasy. © The Author(s) 2017. Published by Oxford University Press.
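
    The feature-based sentence scoring described here can be pictured with a toy example: combine a lexical cue count with a positional weight and rank sentences by the total. The cue terms, weights, and sentences below are invented for illustration; the published algorithm's linguistic, semantic and spatial features are considerably richer.

      # Sketch of scoring sentences for curation-relevant highlighting.
      CUE_TERMS = {"pathway", "neuron", "receptor", "lesion", "substantia"}

      def score_sentence(sentence, position, total_sentences):
          words = set(sentence.lower().split())
          lexical = len(words & CUE_TERMS)             # semantic cue hits
          spatial = 1.0 - position / total_sentences   # earlier text weighs more
          return lexical + 0.5 * spatial

      doc = [
          "Dopaminergic neuron loss occurs in the substantia nigra.",
          "The authors thank the funding agency.",
          "Receptor binding was altered along the signalling pathway.",
      ]

      scored = [(score_sentence(s, i, len(doc)), s) for i, s in enumerate(doc)]
      for sc, s in sorted(scored, reverse=True):
          print(f"{sc:.2f}  {s}")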

  17. Automated PDF highlighting to support faster curation of literature for Parkinson’s and Alzheimer’s disease

    PubMed Central

    Oellrich, Anika; Girges, Christine; de Bono, Bernard; Hubbard, Tim J.P.; Dobson, Richard J.B.

    2017-01-01

    Neurodegenerative disorders such as Parkinson’s and Alzheimer’s disease are devastating and costly illnesses, a source of major global burden. In order to provide successful interventions for patients and reduce costs, both causes and pathological processes need to be understood. The ApiNATOMY project aims to contribute to our understanding of neurodegenerative disorders by manually curating and abstracting data from the vast body of literature amassed on these illnesses. As curation is labour-intensive, we aimed to speed up the process by automatically highlighting those parts of the PDF document of primary importance to the curator. Using techniques similar to those of summarisation, we developed an algorithm that relies on linguistic, semantic and spatial features. Employing this algorithm on a test set manually corrected for tool imprecision, we achieved a macro F1-measure of 0.51, which is an increase of 132% compared to the best bag-of-words baseline model. A user-based evaluation was also conducted to assess the usefulness of the methodology on 40 unseen publications, which reveals that in 85% of cases all highlighted sentences are relevant to the curation task and in about 65% of the cases, the highlights are sufficient to support the knowledge curation task without needing to consult the full text. In conclusion, we believe that these are promising results for a step in automating the recognition of curation-relevant sentences. Refining our approach to pre-digest papers will lead to faster processing and cost reduction in the curation process. Database URL: https://github.com/KHP-Informatics/NapEasy PMID:28365743

  18. Design of Community Resource Inventories as a Component of Scalable Earth Science Infrastructure: Experience of the Earthcube CINERGI Project

    NASA Astrophysics Data System (ADS)

    Zaslavsky, I.; Richard, S. M.; Valentine, D. W., Jr.; Grethe, J. S.; Hsu, L.; Malik, T.; Bermudez, L. E.; Gupta, A.; Lehnert, K. A.; Whitenack, T.; Ozyurt, I. B.; Condit, C.; Calderon, R.; Musil, L.

    2014-12-01

    EarthCube is envisioned as a cyberinfrastructure that fosters new, transformational geoscience by enabling sharing, understanding and scientifically-sound and efficient re-use of formerly unconnected data resources, software, models, repositories, and computational power. Its purpose is to enable science enterprise and workforce development via an extensible and adaptable collaboration and resource integration framework. A key component of this vision is development of comprehensive inventories supporting resource discovery and re-use across geoscience domains. The goal of the EarthCube CINERGI (Community Inventory of EarthCube Resources for Geoscience Interoperability) project is to create a methodology and assemble a large inventory of high-quality information resources with standard metadata descriptions and traceable provenance. The inventory is compiled from metadata catalogs maintained by geoscience data facilities, as well as from user contributions. The latter mechanism relies on community resource viewers: online applications that support update and curation of metadata records. Once harvested into CINERGI, metadata records from domain catalogs and community resource viewers are loaded into a staging database implemented in MongoDB, and validated for compliance with the ISO 19139 metadata schema. Several types of metadata defects detected by the validation engine are automatically corrected with the help of several information extractors, or are flagged for manual curation. The metadata harvesting, validation and processing components generate provenance statements using W3C PROV notation, which are stored in a Neo4J database. The curated metadata, along with the provenance information, are then re-published and can be accessed programmatically or via a CINERGI online application. This presentation focuses on the role of resource inventories in a scalable and adaptable information infrastructure, and on the CINERGI metadata pipeline and its implementation challenges. Key project components are described at the project's website (http://workspace.earthcube.org/cinergi), which also provides access to the initial resource inventory, the inventory metadata model, metadata entry forms and a collection of the community resource viewers.
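
    The harvest-validate-curate pattern described here can be sketched in a few lines: check a record against a set of required fields and emit a PROV-style statement for each action taken. The field names, the required-field rule, and the provenance record format below are simplified assumptions for illustration, not the CINERGI schema or ISO 19139 itself.

      # Sketch of a validation step with PROV-like provenance statements.
      import datetime

      REQUIRED_FIELDS = {"title", "abstract", "contact"}  # stand-in for schema rules

      def validate(record):
          """Return the set of missing required fields (empty = valid)."""
          return REQUIRED_FIELDS - record.keys()

      def provenance(activity, record_id, agent="harvest-pipeline"):
          """Who did what to which entity, and when."""
          return {"entity": record_id, "activity": activity, "agent": agent,
                  "time": datetime.datetime.utcnow().isoformat()}

      record = {"id": "rec-001", "title": "Stream gauge data", "contact": "x@y.org"}
      log = [provenance("harvest", record["id"])]
      missing = validate(record)
      if missing:
          log.append(provenance(f"flagged: missing {sorted(missing)}", record["id"]))
      print(log)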

  19. Biodiversity and molecular ecology of Russula and Lactarius in Alaska based on soil and sporocarp DNA sequences

    Treesearch

    Geml J.; D.L. Taylor

    2013-01-01

    Although critical for the functioning of ecosystems, fungi are poorly known in high-latitude regions. This paper summarizes the results of the first genetic diversity assessments of Russula and Lactarius, two of the most diverse and abundant fungal genera in Alaska. LSU rDNA sequences from both curated sporocarp collections and soil PCR clone libraries sampled in...

  20. Individualized Mutation Detection in Circulating Tumor DNA for Monitoring Colorectal Tumor Burden Using a Cancer-Associated Gene Sequencing Panel.

    PubMed

    Sato, Kei A; Hachiya, Tsuyoshi; Iwaya, Takeshi; Kume, Kohei; Matsuo, Teppei; Kawasaki, Keisuke; Abiko, Yukito; Akasaka, Risaburo; Matsumoto, Takayuki; Otsuka, Koki; Nishizuka, Satoshi S

    2016-01-01

    Circulating tumor DNA (ctDNA) carries information on tumor burden. However, the mutation spectrum is different among tumors. This study was designed to examine the utility of ctDNA for monitoring tumor burden based on an individual mutation profile. DNA was extracted from a total of 176 samples, including pre- and post-operational plasma, primary tumors, and peripheral blood mononuclear cells (PBMC), from 44 individuals with colorectal tumors who underwent curative resection, as well as nine healthy individuals. Using a panel of 50 cancer-associated genes, tumor-unique mutations were identified by comparing the single nucleotide variants (SNVs) from tumors and PBMCs with an Ion PGM sequencer. A group of the tumor-unique mutations from individual tumors were designated as individual marker mutations (MMs) to trace tumor burden by ctDNA using droplet digital PCR (ddPCR). From these experiments, three major objectives were assessed: (a) tumor-unique mutations; (b) the mutation spectrum of a tumor; and (c) changes in allele frequency of the MMs in ctDNA after curative resection of the tumor. A total of 128 gene point mutations were identified in 27 colorectal tumors. Twenty-six genes were mutated in at least 1 sample, while 14 genes were found to be mutated in only 1 sample, respectively. An average of 2.7 genes were mutated per tumor. Subsequently, 24 MMs were selected from SNVs for tumor burden monitoring. Among the MMs found by ddPCR with > 0.1% variant allele frequency in plasma DNA, 100% (8 out of 8) exhibited a decrease in post-operative ctDNA, whereas none of the 16 MMs found by ddPCR with < 0.1% variant allele frequency in plasma DNA showed a decrease. This panel of 50 cancer-associated genes appeared to be sufficient to identify individual, tumor-unique, mutated ctDNA markers in cancer patients. The MMs showed clinical utility for monitoring curatively treated colorectal tumor burden if the allele frequency of MMs in plasma DNA is above 0.1%.
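
    The 0.1% decision threshold above is applied to a variant allele frequency estimated from droplet counts. As a minimal sketch of that arithmetic, the snippet below applies the standard Poisson correction used in digital PCR to convert positive-droplet fractions into target concentrations; the droplet counts are invented, and the study's actual quantification workflow may differ.

      # Sketch: variant allele frequency from ddPCR droplet counts.
      import math

      def poisson_targets(positive, total):
          """Mean targets per droplet given the positive-droplet fraction."""
          return -math.log(1.0 - positive / total)

      def vaf(mut_pos, wt_pos, total_droplets):
          mut = poisson_targets(mut_pos, total_droplets)
          wt = poisson_targets(wt_pos, total_droplets)
          return mut / (mut + wt)

      f = vaf(mut_pos=14, wt_pos=9500, total_droplets=18000)
      print(f"VAF = {f:.4%}", "-> trackable" if f > 0.001 else "-> below 0.1%")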

  1. Gender inequities in curative and preventive health care use among infants in Bihar, India.

    PubMed

    Vilms, Rohan J; McDougal, Lotus; Atmavilas, Yamini; Hay, Katherine; Triplett, Daniel P; Silverman, Jay; Raj, Anita

    2017-12-01

    India has the highest rate of excess female infant deaths in the world. Studies with decade-old data suggest gender inequities in infant health care seeking, but little new large-scale research has examined this issue. We assessed differences in health care utilization by sex of the child, using 2014 data for Bihar, India. This was a cross-sectional analysis of statewide representative survey data collected for a non-blinded maternal and child health evaluation study. Participants included mothers of living singleton infants (n = 11 570). Sex was the main exposure. Outcomes included neonatal illness, care seeking for neonatal illness, hospitalization, facility-based postnatal visits, immunizations, and postnatal home visits by frontline workers. Analyses were conducted via multiple logistic regression with survey weights. The estimated infant sex ratio was 863 females per 1000 males. Females had lower rates of reported neonatal illness (odds ratio (OR) = 0.7, 95% confidence interval (CI) = 0.6-0.9) and hospitalization during infancy (OR = 0.4, 95% CI = 0.3-0.6). Girl neonates had a significantly lower odds of receiving care if ill (80.6% vs 89.1%; OR = 0.5; 95% CI = 0.3-0.8) and lower odds of having a postnatal checkup visit within one month of birth (5.4% vs 7.3%; OR = 0.7, 95% CI = 0.6-0.9). The gender inequity in care seeking was more profound at lower wealth and higher numbers of siblings. Gender differences in immunization and frontline worker visits were not seen. Girls in Bihar have lower odds than boys of receiving facility-based curative and preventive care, and this inequity may partially explain the persistent sex ratio imbalance and excess female mortality. Frontline worker home visits may offer a means of helping better support care for girls.

  2. The Library as Partner in University Data Curation: A Case Study in Collaboration

    ERIC Educational Resources Information Center

    Latham, Bethany; Poe, Jodi Welch

    2012-01-01

    Data curation is a concept with many facets. Curation goes beyond research-generated data, and its principles can support the preservation of institutions' historical data. Libraries are well-positioned to bring relevant expertise to such problems, especially those requiring collaboration, because of their experience as neutral caretakers and…

  3. Curating the Shelves

    ERIC Educational Resources Information Center

    Schiano, Deborah

    2013-01-01

    Curation: to gather, organize, and present resources in a way that meets information needs and interests, makes sense for virtual as well as physical resources. A Northern New Jersey middle school library made the decision to curate its physical resources according to the needs of its users, and, in so doing, created a shelving system that is,…

  4. Data Curation

    ERIC Educational Resources Information Center

    Mallon, Melissa, Ed.

    2012-01-01

    In their Top Trends of 2012, the Association of College and Research Libraries (ACRL) named data curation as one of the issues to watch in academic libraries in the near future (ACRL, 2012, p. 312). Data curation can be summarized as "the active and ongoing management of data through its life cycle of interest and usefulness to scholarship,…

  5. Health Care Reform and Concurrent Curative Care for Terminally Ill Children: A Policy Analysis

    PubMed Central

    Lindley, Lisa C.

    2012-01-01

    Within the Patient Protection and Affordable Care Act of 2010, or health care reform, is a relatively small provision about concurrent curative care that significantly affects terminally ill children. Effective on March 23, 2010, terminally ill children who are enrolled in a Medicaid or state Children’s Health Insurance Plans (CHIP) hospice benefit may concurrently receive curative care related to their terminal health condition. The purpose of this article was to conduct a policy analysis of the concurrent curative care legislation by examining the intended goals of the policy to improve access to care and enhance quality of end-of-life care for terminally ill children. In addition, the policy analysis explored the political feasibility of implementing concurrent curative care at the state level. Based on this policy analysis, the federal policy of concurrent curative care for children would generally achieve its intended goals. However, important policy omissions focus attention on the need for further federal end-of-life care legislation for children. These findings have implications for nurses. PMID:22822304

  6. The Nanomaterial Data Curation Initiative: A collaborative approach to assessing, evaluating, and advancing the state of the field

    PubMed Central

    Powers, Christina M; Hoover, Mark D; Harper, Stacey L

    2015-01-01

    The Nanomaterial Data Curation Initiative (NDCI), a project of the National Cancer Informatics Program Nanotechnology Working Group (NCIP NanoWG), explores the critical aspect of data curation within the development of informatics approaches to understanding nanomaterial behavior. Data repositories and tools for integrating and interrogating complex nanomaterial datasets are gaining widespread interest, with multiple projects now appearing in the US and the EU. Even in these early stages of development, a single common aspect shared across all nanoinformatics resources is that data must be curated into them. Through exploration of sub-topics related to all activities necessary to enable, execute, and improve the curation process, the NDCI will provide a substantive analysis of nanomaterial data curation itself, as well as a platform for multiple other important discussions to advance the field of nanoinformatics. This article outlines the NDCI project and lays the foundation for a series of papers on nanomaterial data curation. The NDCI purpose is to: 1) present and evaluate the current state of nanomaterial data curation across the field on multiple specific data curation topics, 2) propose ways to leverage and advance progress for both individual efforts and the nanomaterial data community as a whole, and 3) provide opportunities for similar publication series on the details of the interactive needs and workflows of data customers, data creators, and data analysts. Initial responses from stakeholder liaisons throughout the nanoinformatics community reveal a shared view that it will be critical to focus on integration of datasets with specific orientation toward the purposes for which the individual resources were created, as well as the purpose for integrating multiple resources. Early acknowledgement and undertaking of complex topics such as uncertainty, reproducibility, and interoperability is proposed as an important path to addressing key challenges within the nanomaterial community, such as reducing collateral negative impacts and decreasing the time from development to market for this new class of technologies. PMID:26425427

  7. Implementation of patient charges at primary care facilities in Kenya: implications of low adherence to user fee policy for users and facility revenue

    PubMed Central

    Opwora, Antony; Waweru, Evelyn; Toda, Mitsuru; Noor, Abdisalan; Edwards, Tansy; Fegan, Greg; Molyneux, Sassy; Goodman, Catherine

    2015-01-01

    With user fees now seen as a major hindrance to universal health coverage, many countries have introduced fee reduction or elimination policies, but there is growing evidence that adherence to reduced fees is often highly imperfect. In 2004, Kenya adopted a reduced and uniform user fee policy providing fee exemptions to many groups. We present data on user fee implementation, revenue and expenditure from a nationally representative survey of Kenyan primary health facilities. Data were collected from 248 randomly selected public health centres and dispensaries in 2010, comprising an interview with the health worker in charge, exit interviews with curative outpatients, and a financial record review. Adherence to user fee policy was assessed for eight tracer conditions based on health worker reports, and patients were asked about actual amounts paid. No facilities adhered fully to the user fee policy across all eight tracers, with adherence ranging from 62.2% for an adult with tuberculosis to 4.2% for an adult with malaria. Three quarters of exit interviewees had paid some fees, with a median payment of US dollars (USD) 0.39, and a quarter of interviewees were required to purchase additional medical supplies at a later stage from a private drug retailer. No consistent pattern of association was identified between facility characteristics and policy adherence. User fee revenues accounted for almost all facility cash income, with average revenue of USD 683 per facility per year. Fee revenue was mainly used to cover support staff, non-drug supplies and travel allowances. Adherence to user fee policy was very low, leading to concerns about the impact on access and the financial burden on households. However, the potential to ensure adherence was constrained by the facilities’ need for revenue to cover basic operating costs, highlighting the need for alternative funding strategies for peripheral health facilities. PMID:24837638

  8. Treatment at high-volume facilities and academic centers is independently associated with improved survival in patients with locally advanced head and neck cancer.

    PubMed

    David, John M; Ho, Allen S; Luu, Michael; Yoshida, Emi J; Kim, Sungjin; Mita, Alain C; Scher, Kevin S; Shiao, Stephen L; Tighiouart, Mourad; Zumsteg, Zachary S

    2017-10-15

    The treatment of head and neck cancers is complex and associated with significant morbidity, requiring multidisciplinary care and physician expertise. Thus, facility characteristics, such as clinical volume and academic status, may influence outcomes. The current study included 46,567 patients taken from the National Cancer Data Base who were diagnosed with locally advanced invasive squamous cell carcinomas of the oropharynx, larynx, and hypopharynx and were undergoing definitive radiotherapy. High-volume facilities (HVFs) were defined as the top 1% of centers by the number of patients treated from 2004 through 2012. Multivariable Cox regression and propensity score matching were performed to account for imbalances in covariates. The median follow-up was 55.1 months. Treatment at a HVF (hazard ratio, 0.798; 95% confidence interval, 0.753-0.845 [P<.001]) and treatment at an academic facility (hazard ratio, 0.897; 95% confidence interval, 0.871-0.923 [P<.001]) were found to be independently associated with improved overall survival in multivariable analysis. In propensity score-matched cohorts, the 5-year overall survival rate was 61.6% versus 55.5% for patients treated at an HVF versus lower-volume facilities, respectively (P<.001). Similarly, the 5-year overall survival rate was 52.3% versus 49.7% for patients treated at academic versus nonacademic facilities (P<.001). Analysis of facility volume as a continuous variable demonstrated continual improvement in survival with an increased number of patients treated. The impact of facility volume and academic designation on survival was observed when using a variety of thresholds to define HVF, and across the vast majority of subgroups, including both oropharyngeal and nonoropharyngeal subsites. Patients with locally advanced head and neck squamous cell carcinoma who are undergoing curative radiotherapy at HVFs and academic centers appear to have improved survival. Cancer 2017;123:3933-42. © 2017 American Cancer Society.

  9. A semi-automated methodology for finding lipid-related GO terms.

    PubMed

    Fan, Mengyuan; Low, Hong Sang; Wenk, Markus R; Wong, Limsoon

    2014-01-01

    Although semantic similarity in Gene Ontology (GO) and other approaches may be used to find similar GO terms, there is as yet no method to systematically find a class of GO terms sharing a common property with high accuracy (e.g., one involving human curation). We have developed a methodology to address this issue and applied it to identify lipid-related GO terms, owing to the important and varied roles of lipids in many biological processes. Our methodology finds lipid-related GO terms in a semi-automated manner, requiring only moderate manual curation. We first obtain a list of lipid-related gold-standard GO terms by keyword search and manual curation. Then, based on the hypothesis that co-annotated GO terms share similar properties, we develop a machine learning method that expands the list of lipid-related terms from the gold standard. Those terms predicted most likely to be lipid related are examined by a human curator following specific curation rules to confirm the class labels. The structure of GO is also exploited to help reduce the curation effort. The prediction and curation cycle is repeated until no further lipid-related term is found. Our approach has covered a high proportion, if not all, of lipid-related terms with relatively high efficiency. http://compbio.ddns.comp.nus.edu.sg/~lipidgo. © The Author(s) 2014. Published by Oxford University Press.
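
    The prediction-and-curation cycle described above has a simple skeleton: expand the confirmed set via co-annotation links, filter candidates through a curation check, and stop when nothing new is confirmed. In the sketch below, the co-annotation graph and the keyword-based "curator" are toy stand-ins for the trained model and the human review with curation rules that the paper describes.

      # Sketch of the iterative expand-and-curate loop.
      co_annotated = {                       # term -> terms it co-occurs with
          "lipid biosynthesis":   {"fatty acid synthesis", "sterol transport"},
          "fatty acid synthesis": {"acetyl-CoA metabolism"},
          "sterol transport":     {"membrane organization"},
      }

      def curator_confirms(term):
          """Toy stand-in for manual review against curation rules."""
          return any(w in term for w in ("lipid", "fatty", "sterol"))

      confirmed = {"lipid biosynthesis"}     # gold-standard seed
      while True:
          candidates = set()
          for t in confirmed:
              candidates |= co_annotated.get(t, set())
          new = {c for c in candidates - confirmed if curator_confirms(c)}
          if not new:
              break
          confirmed |= new

      print(sorted(confirmed))
      # -> ['fatty acid synthesis', 'lipid biosynthesis', 'sterol transport']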

  10. The curation paradigm and application tool used for manual curation of the scientific literature at the Comparative Toxicogenomics Database

    PubMed Central

    Davis, Allan Peter; Wiegers, Thomas C.; Murphy, Cynthia G.; Mattingly, Carolyn J.

    2011-01-01

    The Comparative Toxicogenomics Database (CTD) is a public resource that promotes understanding about the effects of environmental chemicals on human health. CTD biocurators read the scientific literature and convert free-text information into a structured format using official nomenclature, integrating third party controlled vocabularies for chemicals, genes, diseases and organisms, and a novel controlled vocabulary for molecular interactions. Manual curation produces a robust, richly annotated dataset of highly accurate and detailed information. Currently, CTD describes over 349 000 molecular interactions between 6800 chemicals, 20 900 genes (for 330 organisms) and 4300 diseases that have been manually curated from over 25 400 peer-reviewed articles. These manually curated data are further integrated with other third party data (e.g. Gene Ontology, KEGG and Reactome annotations) to generate a wealth of toxicogenomic relationships. Here, we describe our approach to manual curation that uses a powerful and efficient paradigm involving mnemonic codes. This strategy allows biocurators to quickly capture detailed information from articles by generating simple statements using codes to represent the relationships between data types. The paradigm is versatile, expandable, and able to accommodate new data challenges that arise. We have incorporated this strategy into a web-based curation tool to further increase efficiency and productivity, implement quality control in real-time and accommodate biocurators working remotely. Database URL: http://ctd.mdibl.org PMID:21933848
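
    To make the mnemonic-code idea concrete, here is a hypothetical sketch; the codes and the statement format are invented for illustration and are not CTD's actual scheme:

      # Hypothetical mnemonic-code expansion (codes and format invented for
      # illustration; CTD's real vocabulary and syntax differ).
      ACTION_CODES = {
          "exp": "increases expression",
          "act": "increases activity",
          "b": "binds",
      }

      def expand(statement):
          # e.g. "chem:aspirin exp gene:PTGS2" -> a structured interaction record
          subject, code, obj = statement.split()
          return {"subject": subject, "action": ACTION_CODES[code], "object": obj}

      print(expand("chem:aspirin exp gene:PTGS2"))
      # {'subject': 'chem:aspirin', 'action': 'increases expression', 'object': 'gene:PTGS2'}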

  11. Receptivity to Library Involvement in Scientific Data Curation: A Case Study at the University of Colorado Boulder

    ERIC Educational Resources Information Center

    Lage, Kathryn; Losoff, Barbara; Maness, Jack

    2011-01-01

    Increasingly libraries are expected to play a role in scientific data curation initiatives, i.e., "the management and preservation of digital data over the long-term." This case study offers a novel approach for identifying researchers who are receptive toward library involvement in data curation. The authors interviewed researchers at…

  12. Co-Curate: Working with Schools and Communities to Add Value to Open Collections

    ERIC Educational Resources Information Center

    Cotterill, Simon; Hudson, Martyn; Lloyd, Katherine; Outterside, James; Peterson, John; Coburn, John; Thomas, Ulrike; Tiplady, Lucy; Robinson, Phil; Heslop, Phil

    2016-01-01

    Co-Curate North East is a cross-disciplinary initiative involving Newcastle University and partner organisations, working with schools and community groups in the North East of England. Co-curation builds on the concept of the "ecomuseum" model for heritage based around a virtual territory, social memory and participative input from the…

  13. Digital Curation as a Core Competency in Current Learning and Literacy: A Higher Education Perspective

    ERIC Educational Resources Information Center

    Ungerer, Leona M.

    2016-01-01

    Digital curation may be regarded as a core competency in higher education since it contributes to establishing a sense of metaliteracy (an essential requirement for optimally functioning in a modern media environment) among students. Digital curation is gradually finding its way into higher education curricula aimed at fostering social media…

  14. The preventive-curative conflict in primary health care.

    PubMed

    De Sa, C

    1993-04-01

    Approximately 80% of the rural population in developing countries do not have access to appropriate curative care. The primary health care (PHC) approach emphasizes promotive and preventive services. Yet most people in developing countries consider curative care to be more important. Thus, PHC should include curative and rehabilitative care along with preventive and promotive care. The conflict between preventive and curative care is apparent at the community level, among health workers from all levels of the health system, and among policy makers. Community members are sometimes willing to pay for curative services but not preventive services. Further, they believe that they already know enough to prevent illness. Community health workers (CHWs), the mainstays of most PHC projects, are trained in preventive efforts, but this hinders their effectiveness, since the community expects curative care. Moreover, 66% of villagers' health problems require curative care. Further, CHWs are isolated from health professionals, adding to their inability to effect positive change. Health professionals are often unable to set up a relationship of trust with the community, largely due to their urban-based medical education. They tend either not to explain treatment to patients or to simplify explanations in a condescending manner. They also mystify diseases, preventing people from understanding their own bodies and managing their illnesses. National governments often misinterpret national health policies promoting PHC and implement them with a top-down approach rather than the bottom-up approach that PHC advocates. Nongovernmental organizations (NGOs) and international agencies also interpret PHC in different ways. Still, strong partnerships between government, NGOs, private sector, and international agencies are needed for effective implementation of PHC. Yet, many countries continue to have complex hierarchical social structures, inequitable distribution, and inadequate resources, making it difficult to implement effective PHC.

  15. Text Mining Genotype-Phenotype Relationships from Biomedical Literature for Database Curation and Precision Medicine.

    PubMed

    Singhal, Ayush; Simmons, Michael; Lu, Zhiyong

    2016-11-01

    The practice of precision medicine will ultimately require databases of genes and mutations for healthcare providers to reference in order to understand the clinical implications of each patient's genetic makeup. Although the highest quality databases require manual curation, text mining tools can facilitate the curation process, increasing accuracy, coverage, and productivity. However, to date there are no available text mining tools that offer high-accuracy performance for extracting disease-gene-variant triplets from biomedical literature. In this paper, we propose a high-performance machine learning approach to automate the extraction of disease-gene-variant triplets from biomedical literature. Our approach is unique because we identify the genes and protein products associated with each mutation from not just the local text content, but from a global context as well (from the Internet and from all literature in PubMed). Our approach also incorporates protein sequence validation and disease association using a novel text-mining-based machine learning approach. We extract disease-gene-variant triplets from all abstracts in PubMed related to a set of ten important diseases (breast cancer, prostate cancer, pancreatic cancer, lung cancer, acute myeloid leukemia, Alzheimer's disease, hemochromatosis, age-related macular degeneration (AMD), diabetes mellitus, and cystic fibrosis). We then evaluate our approach in two ways: (1) a direct comparison with the state of the art using benchmark datasets; (2) a validation study comparing the results of our approach with entries in a popular human-curated database (UniProt) for each of the previously mentioned diseases. In the benchmark comparison, our full approach achieves a 28% improvement in F1-measure (from 0.62 to 0.79) over the state-of-the-art results. For the validation study with UniProt Knowledgebase (KB), we present a thorough analysis of the results and errors. Across all diseases, our approach returned 272 triplets (disease-gene-variant) that overlapped with entries in UniProt and 5,384 triplets without overlap in UniProt. Analysis of the overlapping triplets and of a stratified sample of the non-overlapping triplets revealed accuracies of 93% and 80% for the respective categories (cumulative accuracy, 77%). We conclude that our process represents an important and broadly applicable improvement to the state of the art for curation of disease-gene-variant relationships.
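
    As a deliberately simplified illustration of the triplet target (regex heuristics on toy text, not the paper's machine learning approach), candidate gene symbols and protein-level variant mentions can be pulled from an abstract and paired by proximity:

      # Regex toy: extract candidate (disease, gene, variant) triplets from text.
      # A simplification for illustration only; note that the crude gene pattern
      # also matches variant tokens, and resolving exactly this kind of ambiguity
      # from global context is what the paper's approach addresses.
      import re

      TEXT = ("We identified the BRCA2 K3326X variant and the HFE C282Y "
              "substitution in patients with hereditary hemochromatosis.")

      GENE = re.compile(r"\b[A-Z][A-Z0-9]{2,5}\b")
      VARIANT = re.compile(r"\b[ACDEFGHIKLMNPQRSTVWY]\d+[ACDEFGHIKLMNPQRSTVWYX]\b")

      disease = "hereditary hemochromatosis"   # assumed known from the query context
      triplets = []
      for m in VARIANT.finditer(TEXT):
          genes_before = [g for g in GENE.finditer(TEXT)
                          if g.end() <= m.start() and g.group() != m.group()]
          if genes_before:   # pair each variant with the nearest preceding gene
              triplets.append((disease, genes_before[-1].group(), m.group()))

      print(triplets)
      # [('hereditary hemochromatosis', 'BRCA2', 'K3326X'),
      #  ('hereditary hemochromatosis', 'HFE', 'C282Y')]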

  16. Data and the Shift in Systems, Services, and Literacy

    ERIC Educational Resources Information Center

    Mitchell, Erik T.

    2012-01-01

    This month, the "Journal of Web Librarianship" is exploring the idea of data curation and its uses in libraries. The word "data" is as universal now as the word "cloud" was last year, and it is no accident that libraries are exploring how best to support data curation services. Data curation involves library activities in just about every way,…

  17. Characteristics of Quality OER Textbook Curators: An Analysis of Tullahoma City Schools' Flexbook Curators

    ERIC Educational Resources Information Center

    Hodge, Zach

    2017-01-01

    Tullahoma City Schools, a rural district in Middle Tennessee, recently switched from traditional static textbooks to an online, open educational resource platform. As a result of this change the role of curator, a teacher who creates the Flexbook by compiling and organizing content, was created. This research project sought to add to the limited…

  18. Jointly creating digital abstracts: dealing with synonymy and polysemy

    PubMed Central

    2012-01-01

    Background Ideally, each Life Science article should get a ‘structured digital abstract’. This is a structured summary of the paper’s findings that is both human-verified and machine-readable. But articles can contain a large variety of information types and contextual details that all need to be reconciled with appropriate names, terms and identifiers, which poses a challenge to any curator. Current approaches mostly use tagging or limited entry-forms for semantic encoding. Findings We implemented a ‘controlled language’ as a more expressive representation method. We studied how usable this format was for wet-lab biologists who volunteered as curators. We assessed some issues that arise with the usability of ontologies and other controlled vocabularies, for the encoding of structured information by ‘untrained’ curators. We take a user-oriented viewpoint, and make recommendations that may prove useful for creating a better curation environment: one that can engage a large community of volunteer curators. Conclusions Entering information in a biocuration environment could improve in expressiveness and user-friendliness, if curators would be enabled to use synonymous and polysemous terms literally, whereby each term stays linked to an identifier. PMID:23110757

  19. Research Problems in Data Curation: Outcomes from the Data Curation Education in Research Centers Program

    NASA Astrophysics Data System (ADS)

    Palmer, C. L.; Mayernik, M. S.; Weber, N.; Baker, K. S.; Kelly, K.; Marlino, M. R.; Thompson, C. A.

    2013-12-01

    The need for data curation is being recognized in numerous institutional settings as national research funding agencies extend data archiving mandates to cover more types of research grants. Data curation, however, is not only a practical challenge. It presents many conceptual and theoretical challenges that must be investigated to design appropriate technical systems, social practices and institutions, policies, and services. This presentation reports on outcomes from an investigation of research problems in data curation conducted as part of the Data Curation Education in Research Centers (DCERC) program. DCERC is developing a new model for educating data professionals to contribute to scientific research. The program is organized around foundational courses and field experiences in research and data centers for both master's and doctoral students. The initiative is led by the Graduate School of Library and Information Science at the University of Illinois at Urbana-Champaign, in collaboration with the School of Information Sciences at the University of Tennessee, and library and data professionals at the National Center for Atmospheric Research (NCAR). At the doctoral level, DCERC is educating future faculty and researchers in data curation and establishing a research agenda to advance the field. The doctoral seminar, Research Problems in Data Curation, was developed and taught in 2012 by the DCERC principal investigator and two doctoral fellows at the University of Illinois. It was designed to define the problem space of data curation, examine relevant concepts and theories related to both technical and social perspectives, and articulate research questions that are either unexplored or under-theorized in the current literature. There was a particular emphasis on the Earth and environmental sciences, with guest speakers brought in from NCAR, National Snow and Ice Data Center (NSIDC), and Rensselaer Polytechnic Institute. Through the assignments, students constructed dozens of research questions informed by class readings, presentations, and discussions. A technical report is in progress on the resulting research agenda covering: data standards; infrastructure; research context; data reuse; sharing and access; preservation; and conceptual foundations. This presentation will discuss the agenda and its importance for the geosciences, highlighting high priority research questions. It will also introduce the related research to be undertaken by two DCERC doctoral students at NCAR during the 2013-2014 academic year and other data curation research in progress by the doctoral DCERC team.

  20. A curated transcriptomic dataset collection relevant to embryonic development associated with in vitro fertilization in healthy individuals and patients with polycystic ovary syndrome.

    PubMed

    Mackeh, Rafah; Boughorbel, Sabri; Chaussabel, Damien; Kino, Tomoshige

    2017-01-01

    The collection of large-scale datasets available in public repositories is rapidly growing and providing opportunities to identify and fill gaps in different fields of biomedical research. However, users of these datasets should be able to selectively browse datasets related to their field of interest. Here we made available a collection of transcriptome datasets related to human follicular cells from normal individuals or patients with polycystic ovary syndrome, in the process of their development, during in vitro fertilization. After RNA-seq dataset exclusion and careful selection based on study description and sample information, 12 datasets, encompassing a total of 85 unique transcriptome profiles, were identified in NCBI Gene Expression Omnibus and uploaded to the Gene Expression Browser (GXB), a web application specifically designed for interactive query and visualization of integrated large-scale data. Once annotated in GXB, multiple sample groupings were made in order to create rank lists that allow easy data interpretation and comparison. The GXB tool also allows the users to browse a single gene across multiple projects to evaluate its expression profiles in multiple biological systems/conditions in web-based customized graphical views. The curated dataset is accessible at the following link: http://ivf.gxbsidra.org/dm3/landing.gsp.

  1. A curated transcriptomic dataset collection relevant to embryonic development associated with in vitro fertilization in healthy individuals and patients with polycystic ovary syndrome

    PubMed Central

    Mackeh, Rafah; Boughorbel, Sabri; Chaussabel, Damien; Kino, Tomoshige

    2017-01-01

    The collection of large-scale datasets available in public repositories is rapidly growing and providing opportunities to identify and fill gaps in different fields of biomedical research. However, users of these datasets should be able to selectively browse datasets related to their field of interest. Here we made available a collection of transcriptome datasets related to human follicular cells from normal individuals or patients with polycystic ovary syndrome, in the process of their development, during in vitro fertilization. After RNA-seq dataset exclusion and careful selection based on study description and sample information, 12 datasets, encompassing a total of 85 unique transcriptome profiles, were identified in NCBI Gene Expression Omnibus and uploaded to the Gene Expression Browser (GXB), a web application specifically designed for interactive query and visualization of integrated large-scale data. Once annotated in GXB, multiple sample groupings were made in order to create rank lists that allow easy data interpretation and comparison. The GXB tool also allows the users to browse a single gene across multiple projects to evaluate its expression profiles in multiple biological systems/conditions in web-based customized graphical views. The curated dataset is accessible at the following link: http://ivf.gxbsidra.org/dm3/landing.gsp. PMID:28413616

  2. Atmospheric Radiation Measurement's Data Management Facility captures metadata and uses visualization tools to assist in routine data management.

    NASA Astrophysics Data System (ADS)

    Keck, N. N.; Macduff, M.; Martin, T.

    2017-12-01

    The Atmospheric Radiation Measurement's (ARM) Data Management Facility (DMF) plays a critical support role in processing and curating data generated by the Department of Energy's ARM Program. Data are collected near real time from hundreds of observational instruments spread out all over the globe. Data are then ingested hourly to provide time series data in NetCDF (network Common Data Format) and include standardized metadata. Based on automated processes and a variety of user reviews, the data may need to be reprocessed. Final data sets are then stored and accessed by users through the ARM Archive. Over the course of 20 years, a suite of data visualization tools has been developed to facilitate the operational processes to manage and maintain the more than 18,000 real-time events that move 1.3 TB of data each day through the various stages of the DMF's data system. This poster will present the resources and methodology used to capture metadata and the tools that assist in routine data management and discoverability.
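
    A minimal sketch of writing an hourly time series to NetCDF with global metadata attributes follows; the file, variable and attribute names are illustrative only and do not reflect ARM's actual data standards:

      # Minimal NetCDF time-series sketch (names illustrative, not ARM standards).
      import numpy as np
      from netCDF4 import Dataset

      with Dataset("example_ingest.nc", "w") as nc:
          nc.title = "Example hourly ingest"       # standardized global metadata
          nc.institution = "example facility"
          nc.createDimension("time", None)         # unlimited, so hourly data can append
          t = nc.createVariable("time", "f8", ("time",))
          t.units = "hours since 2017-01-01 00:00:00"
          temp = nc.createVariable("temperature", "f4", ("time",))
          temp.units = "K"
          t[:] = np.arange(24)
          temp[:] = 280.0 + 5.0 * np.sin(np.arange(24) / 24.0 * 2.0 * np.pi)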

  3. Data Curation Education Grounded in Earth Sciences and the Science of Data

    NASA Astrophysics Data System (ADS)

    Palmer, C. L.

    2015-12-01

    This presentation looks back over ten years of experience advancing data curation education at two Information Schools, highlighting the vital role of earth science case studies, expertise, and collaborations in development of curriculum and internships. We also consider current data curation practices and workforce demand in data centers in the geosciences, drawing on studies conducted in the Data Curation Education in Research Centers (DCERC) initiative and the Site-Based Data Curation project. Outcomes from this decade of data curation research and education have reinforced the importance of key areas of information science in preparing data professionals to respond to the needs of user communities, provide services across disciplines, invest in standards and interoperability, and promote open data practices. However, a serious void remains in principles to guide education and practice that are distinct to the development of data systems and services that meet both local and global aims. We identify principles emerging from recent empirical studies on the reuse value of data in the earth sciences and propose an approach for advancing data curation education that depends on systematic coordination with data intensive research and propagation of current best practices from data centers into curriculum. This collaborative model can increase both domain-based and cross-disciplinary expertise among data professionals, ultimately improving data systems and services in our universities and data centers while building the new base of knowledge needed for a foundational science of data.

  4. The MR-Base platform supports systematic causal inference across the human phenome

    PubMed Central

    Wade, Kaitlin H; Haberland, Valeriia; Baird, Denis; Laurin, Charles; Burgess, Stephen; Bowden, Jack; Langdon, Ryan; Tan, Vanessa Y; Yarmolinsky, James; Shihab, Hashem A; Timpson, Nicholas J; Evans, David M; Relton, Caroline; Martin, Richard M; Davey Smith, George

    2018-01-01

    Results from genome-wide association studies (GWAS) can be used to infer causal relationships between phenotypes, using a strategy known as 2-sample Mendelian randomization (2SMR) and bypassing the need for individual-level data. However, 2SMR methods are evolving rapidly and GWAS results are often insufficiently curated, undermining efficient implementation of the approach. We therefore developed MR-Base (http://www.mrbase.org): a platform that integrates a curated database of complete GWAS results (no restrictions according to statistical significance) with an application programming interface, web app and R packages that automate 2SMR. The software includes several sensitivity analyses for assessing the impact of horizontal pleiotropy and other violations of assumptions. The database currently comprises 11 billion single nucleotide polymorphism-trait associations from 1673 GWAS and is updated on a regular basis. Integrating data with software ensures more rigorous application of hypothesis-driven analyses and allows millions of potential causal relationships to be efficiently evaluated in phenome-wide association studies. PMID:29846171
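
    The core 2SMR computation is compact enough to sketch. Below is a toy inverse-variance-weighted (IVW) estimate from made-up summary statistics; the platform's R packages implement this along with the sensitivity analyses mentioned above:

      # Toy IVW two-sample Mendelian randomization estimate (made-up numbers).
      import numpy as np

      beta_exposure = np.array([0.12, 0.08, 0.15])    # SNP -> exposure effects (GWAS 1)
      beta_outcome = np.array([0.030, 0.018, 0.040])  # SNP -> outcome effects (GWAS 2)
      se_outcome = np.array([0.010, 0.009, 0.012])

      ratio = beta_outcome / beta_exposure            # per-SNP Wald ratio estimates
      se_ratio = se_outcome / np.abs(beta_exposure)   # first-order standard errors
      w = 1.0 / se_ratio**2                           # inverse-variance weights

      beta_ivw = np.sum(w * ratio) / np.sum(w)        # pooled causal effect estimate
      se_ivw = np.sqrt(1.0 / np.sum(w))
      print(f"IVW estimate: {beta_ivw:.3f} +/- {se_ivw:.3f}")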

  5. Using Image Pro Plus Software to Develop Particle Mapping on Genesis Solar Wind Collector Surfaces

    NASA Technical Reports Server (NTRS)

    Rodriquez, Melissa C.; Allton, J. H.; Burkett, P. J.

    2012-01-01

    The continued success of the Genesis mission science team in analyzing solar wind collector array samples is partially based on close collaboration of the JSC curation team with science team members who develop cleaning techniques and those who assess elemental cleanliness at the levels of detection. The goal of this collaboration is to develop a reservoir of solar wind collectors of known cleanliness to be available to investigators. The heart and driving force behind this effort is Genesis mission PI Don Burnett. While JSC contributes characterization, safe clean storage, and benign collector cleaning with ultrapure water (UPW) and UV ozone, Burnett has coordinated more exotic and rigorous cleaning, which is contributed by science team members. He also coordinates cleanliness assessment requiring expertise and instruments not available in curation, such as XPS, TRXRF [1,2] and synchrotron TRXRF. JSC participates by optically documenting the particle distributions as cleaning steps progress. Thus, optical documentation supplements SEM imaging and analysis, and elemental assessment by TRXRF.

  6. Rock and Core Repository Coming Digital

    NASA Astrophysics Data System (ADS)

    Maicher, Doris; Fleischer, Dirk; Czerniak, Andreas

    2016-04-01

    In times when whole city centres are available at a mouse click to walk through virtually in 3D, reality sometimes becomes neglected. To the rising generation of scientists, it is hard to believe that scientific sample collections have not been digitised down to the essence of molecules, isotopes and electrons. Just like any other geological institute, the Helmholtz Centre for Ocean Research GEOMAR has accumulated thousands of specimens. The samples, collected mainly during marine expeditions, date back as far as 1964. Today GEOMAR houses a central geological sample collection of at least 17 000 m of sediment core and more than 4 500 boxes with hard rock samples and refined sample specimens. This repository, having been dormant, missed the onset of the interconnected digital age. Physical samples without barcodes, QR codes or RFID tags need to be migrated and reconnected, urgently. In our use case, GEOMAR opted for the International Geo Sample Number (IGSN) as the persistent identifier. Consequently, the software CurationDIS by smartcube GmbH was selected as the central component of this project. The software is designed to handle acquisition and administration of sample material and sample archiving in storage places. In addition, the software allows direct embedding of IGSN. We plan to adopt IGSN as a future asset, while for the initial inventory of our sample material, simple but unique QR codes act as "bridging identifiers" during the process. Currently we are compiling an overview of the broad variety of sample types and their associated data. QR-coding of the boxes of rock samples and sediment cores is near completion, delineating their location in the repository and linking a particular sample to any information available about the object. Planning is in progress to streamline the flow from receiving new samples, to their curation, to sharing samples and information publicly. Additionally, interface planning for linkage to the GEOMAR databases OceanRep (publications) and OSIS (expeditions), as well as for external data retrieval, is in the pipeline. Implementing IGSN, and taking on board lessons learned from earlier generations, will enable compliance with our institute's open science policy. It will also allow newly collected samples to be registered during ship expeditions, so that they receive their "birth certificate" promptly in this ever faster-moving scientific world.
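
    The "bridging identifier" step lends itself to a simple sketch. In the hypothetical example below (the payload URL and inventory format are invented), each sample box receives a QR label that resolves to its record until full IGSN registration:

      # Sketch of minting "bridging" QR labels for sample boxes (payload URL and
      # inventory layout are hypothetical; production labels would map to IGSNs).
      import csv
      import qrcode

      inventory = [("GEO-0001", "sediment core, aisle 3, shelf B"),
                   ("GEO-0002", "hard rock box, aisle 7, shelf A")]

      with open("inventory.csv", "w", newline="") as fh:
          writer = csv.writer(fh)
          writer.writerow(["sample_id", "location"])
          for sample_id, location in inventory:
              writer.writerow([sample_id, location])
              # Each label encodes a resolvable link to the sample's record.
              img = qrcode.make(f"https://example.org/samples/{sample_id}")
              img.save(f"{sample_id}.png")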

  7. OntoMate: a text-mining tool aiding curation at the Rat Genome Database

    PubMed Central

    Liu, Weisong; Laulederkind, Stanley J. F.; Hayman, G. Thomas; Wang, Shur-Jen; Nigam, Rajni; Smith, Jennifer R.; De Pons, Jeff; Dwinell, Melinda R.; Shimoyama, Mary

    2015-01-01

    The Rat Genome Database (RGD) is the premier repository of rat genomic, genetic and physiologic data. Converting data from free text in the scientific literature to a structured format is one of the main tasks of all model organism databases. RGD spends considerable effort manually curating gene, Quantitative Trait Locus (QTL) and strain information. The rapidly growing volume of biomedical literature and the active research in the biological natural language processing (bioNLP) community have given RGD the impetus to adopt text-mining tools to improve curation efficiency. Recently, RGD has initiated a project to use OntoMate, an ontology-driven, concept-based literature search engine developed at RGD, as a replacement for the PubMed (http://www.ncbi.nlm.nih.gov/pubmed) search engine in the gene curation workflow. OntoMate tags abstracts with gene names, gene mutations, organism name and most of the 16 ontologies/vocabularies used at RGD. All terms/entities tagged to an abstract are listed with the abstract in the search results. All listed terms are linked both to data entry boxes and a term browser in the curation tool. OntoMate also provides user-activated filters for species, date and other parameters relevant to the literature search. Using the system for literature search and import has streamlined the process compared to using PubMed. The system was built with a scalable and open architecture, including features specifically designed to accelerate the RGD gene curation process. With the use of bioNLP tools, RGD has added more automation to its curation workflow. Database URL: http://rgd.mcw.edu PMID:25619558

  8. TargetSearch--a Bioconductor package for the efficient preprocessing of GC-MS metabolite profiling data.

    PubMed

    Cuadros-Inostroza, Alvaro; Caldana, Camila; Redestig, Henning; Kusano, Miyako; Lisec, Jan; Peña-Cortés, Hugo; Willmitzer, Lothar; Hannah, Matthew A

    2009-12-16

    Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas-chromatography hyphenated with mass spectrometry (GC-MS). The high-throughput of this technology coupled with a demand for large experiments has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis against both a set of known chemical standard mixtures and of a biological experiment. In addition we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data.
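
    The retention index at the heart of this approach is a linear interpolation between bracketing n-alkane markers. A minimal sketch of the classic calculation follows (TargetSearch itself is implemented in R and C, with its iterative correction layered on top of this idea):

      # Retention-index sketch: interpolate between bracketing n-alkane markers
      # (the classic Kovats/van den Dool formula; toy numbers).
      def retention_index(rt, markers):
          """markers: sorted (retention_time, carbon_number) pairs for n-alkanes."""
          for (t_lo, n_lo), (t_hi, n_hi) in zip(markers, markers[1:]):
              if t_lo <= rt <= t_hi:
                  return 100.0 * (n_lo + (n_hi - n_lo) * (rt - t_lo) / (t_hi - t_lo))
          raise ValueError("retention time outside the marker range")

      alkanes = [(312.0, 10), (418.0, 12), (545.0, 14)]   # (seconds, carbon number)
      print(retention_index(480.0, alkanes))               # ~1297.6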

  9. TargetSearch - a Bioconductor package for the efficient preprocessing of GC-MS metabolite profiling data

    PubMed Central

    2009-01-01

    Background Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas-chromatography hyphenated with mass spectrometry (GC-MS). The high-throughput of this technology coupled with a demand for large experiments has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. Results We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. Conclusions TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis against both a set of known chemical standard mixtures and of a biological experiment. In addition we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data. PMID:20015393

  10. A hybrid human and machine resource curation pipeline for the Neuroscience Information Framework.

    PubMed

    Bandrowski, A E; Cachat, J; Li, Y; Müller, H M; Sternberg, P W; Ciccarese, P; Clark, T; Marenco, L; Wang, R; Astakhov, V; Grethe, J S; Martone, M E

    2012-01-01

    The breadth of information resources available to researchers on the Internet continues to expand, particularly in light of recently implemented data-sharing policies required by funding agencies. However, the nature of dense, multifaceted neuroscience data and the design of contemporary search engine systems makes efficient, reliable and relevant discovery of such information a significant challenge. This challenge is specifically pertinent for online databases, whose dynamic content is 'hidden' from search engines. The Neuroscience Information Framework (NIF; http://www.neuinfo.org) was funded by the NIH Blueprint for Neuroscience Research to address the problem of finding and utilizing neuroscience-relevant resources such as software tools, data sets, experimental animals and antibodies across the Internet. From the outset, NIF sought to provide an accounting of available resources, while developing technical solutions to finding, accessing and utilizing them. The curators, therefore, are tasked with identifying and registering resources, examining data, writing configuration files to index and display data and keeping the contents current. In the initial phases of the project, all aspects of the registration and curation processes were manual. However, as the number of resources grew, manual curation became impractical. This report describes our experiences and successes with developing automated resource discovery and semiautomated type characterization with text-mining scripts that facilitate curation team efforts to discover, integrate and display new content. We also describe the DISCO framework, a suite of automated web services that significantly reduce manual curation efforts to periodically check for resource updates. Lastly, we discuss DOMEO, a semi-automated annotation tool that improves the discovery and curation of resources that are not necessarily website-based (i.e. reagents, software tools). Although the ultimate goal of automation was to reduce the workload of the curators, it has resulted in valuable analytic by-products that address accessibility, use and citation of resources that can now be shared with resource owners and the larger scientific community. DATABASE URL: http://neuinfo.org.
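
    The periodic update checks that DISCO automates can be illustrated with a minimal content-hashing sketch (hypothetical resource URLs; this shows the general idea, not NIF's actual implementation):

      # Minimal update-detection sketch via content hashing (hypothetical URLs).
      import hashlib
      import json
      import urllib.request

      STATE_FILE = "resource_hashes.json"
      RESOURCES = ["https://example.org/database1", "https://example.org/tool2"]

      def fetch_hash(url):
          with urllib.request.urlopen(url, timeout=30) as resp:
              return hashlib.sha256(resp.read()).hexdigest()

      try:
          with open(STATE_FILE) as fh:
              previous = json.load(fh)
      except FileNotFoundError:
          previous = {}

      current = {url: fetch_hash(url) for url in RESOURCES}
      changed = [url for url in RESOURCES if previous.get(url) != current[url]]
      print("resources needing curator attention:", changed)

      with open(STATE_FILE, "w") as fh:
          json.dump(current, fh)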

  11. A hybrid human and machine resource curation pipeline for the Neuroscience Information Framework

    PubMed Central

    Bandrowski, A. E.; Cachat, J.; Li, Y.; Müller, H. M.; Sternberg, P. W.; Ciccarese, P.; Clark, T.; Marenco, L.; Wang, R.; Astakhov, V.; Grethe, J. S.; Martone, M. E.

    2012-01-01

    The breadth of information resources available to researchers on the Internet continues to expand, particularly in light of recently implemented data-sharing policies required by funding agencies. However, the nature of dense, multifaceted neuroscience data and the design of contemporary search engine systems makes efficient, reliable and relevant discovery of such information a significant challenge. This challenge is specifically pertinent for online databases, whose dynamic content is ‘hidden’ from search engines. The Neuroscience Information Framework (NIF; http://www.neuinfo.org) was funded by the NIH Blueprint for Neuroscience Research to address the problem of finding and utilizing neuroscience-relevant resources such as software tools, data sets, experimental animals and antibodies across the Internet. From the outset, NIF sought to provide an accounting of available resources, while developing technical solutions to finding, accessing and utilizing them. The curators, therefore, are tasked with identifying and registering resources, examining data, writing configuration files to index and display data and keeping the contents current. In the initial phases of the project, all aspects of the registration and curation processes were manual. However, as the number of resources grew, manual curation became impractical. This report describes our experiences and successes with developing automated resource discovery and semiautomated type characterization with text-mining scripts that facilitate curation team efforts to discover, integrate and display new content. We also describe the DISCO framework, a suite of automated web services that significantly reduce manual curation efforts to periodically check for resource updates. Lastly, we discuss DOMEO, a semi-automated annotation tool that improves the discovery and curation of resources that are not necessarily website-based (i.e. reagents, software tools). Although the ultimate goal of automation was to reduce the workload of the curators, it has resulted in valuable analytic by-products that address accessibility, use and citation of resources that can now be shared with resource owners and the larger scientific community. Database URL: http://neuinfo.org PMID:22434839

  12. Patient-reported symptoms during radiotherapy : Clinically relevant symptom burden in patients treated with palliative and curative intent.

    PubMed

    Körner, Philipp; Ehrmann, Katja; Hartmannsgruber, Johann; Metz, Michaela; Steigerwald, Sabrina; Flentje, Michael; van Oorschot, Birgitt

    2017-07-01

    The benefits of patient-reported symptom assessment combined with integrated palliative care are well documented. This study assessed the symptom burden of palliative and curative-intent radiation oncology patients. Prior to the first consultation and at the end of radiotherapy (RT), all adult cancer patients planned to receive fractionated percutaneous RT were asked to answer the Edmonton Symptom Assessment Scale (ESAS; nine symptoms, from 0 = no symptoms to 10 = worst possible symptoms). Mean values were used for curative vs. palliative and pre-post comparisons, and the clinical relevance was evaluated (symptom values ≥ 4). Of 163 participating patients, 151 patients (90.9%) completed both surveys (116 curative and 35 palliative patients). Before beginning RT, 88.6% of palliative and 72.3% of curative patients showed at least one clinically relevant symptom. Curative patients most frequently named decreased general wellbeing (38.6%), followed by tiredness (35.0%), anxiety (32.4%), depression (30.0%), pain (26.3%), lack of appetite (23.5%), dyspnea (17.8%), drowsiness (8.0%) and nausea (6.1%). Palliative patients most frequently named decreased general wellbeing (62.8%), followed by pain (62.8%), tiredness (60.0%), lack of appetite (40.0%), anxiety (38.0%), depression (33.3%), dyspnea (28.5%), drowsiness (25.7%) and nausea (14.2%). At the end of RT, the proportion of curative and palliative patients with a clinically relevant symptom had increased significantly to 79.8 and 91.4%, respectively; whereas the proportion of patients reporting clinically relevant pain had decreased significantly (42.8 vs. 62.8%, respectively). Palliative patients had significantly increased tiredness. Curative patients reported significant increases in pain, tiredness, nausea, drowsiness, lack of appetite and restrictions in general wellbeing. Assessment of patient-reported symptoms was successfully realized in routine radiation oncology practice. Overall, both groups showed a high symptom burden. The results prove the need for systematic symptom assessment and programs for early integrated supportive and palliative care in radiation oncology.

  13. Trust, but verify: On the importance of chemical structure curation in cheminformatics and QSAR modeling research

    PubMed Central

    Fourches, Denis; Muratov, Eugene; Tropsha, Alexander

    2010-01-01

    Molecular modelers and cheminformaticians typically analyze experimental data generated by other scientists. Consequently, when it comes to data accuracy, cheminformaticians are always at the mercy of data providers who may inadvertently publish (partially) erroneous data. Thus, dataset curation is crucial for any cheminformatics analysis such as similarity searching, clustering, QSAR modeling, virtual screening, etc., especially nowadays when the availability of chemical datasets in the public domain has skyrocketed in recent years. Despite the obvious importance of this preliminary step in the computational analysis of any dataset, there appears to be no commonly accepted guidance or set of procedures for chemical data curation. The main objective of this paper is to emphasize the need for a standardized chemical data curation strategy that should be followed at the onset of any molecular modeling investigation. Herein, we discuss several simple but important steps for cleaning chemical records in a database including the removal of a fraction of the data that cannot be appropriately handled by conventional cheminformatics techniques. Such steps include the removal of inorganic and organometallic compounds, counterions, salts and mixtures; structure validation; ring aromatization; normalization of specific chemotypes; curation of tautomeric forms; and the deletion of duplicates. To emphasize the importance of data curation as a mandatory step in data analysis, we discuss several case studies where chemical curation of the original “raw” database enabled the successful modeling study (specifically, QSAR analysis) or resulted in a significant improvement of the model's prediction accuracy. We also demonstrate that in some cases rigorously developed QSAR models could even be used to correct erroneous biological data associated with chemical compounds. We believe that good practices for curation of chemical records outlined in this paper will be of value to all scientists working in the fields of molecular modeling, cheminformatics, and QSAR studies. PMID:20572635
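
    Several of the cleaning steps listed above (structure validation, salt stripping, normalization, de-duplication) can be sketched with the open-source RDKit toolkit. This is one possible realization for illustration, not the authors' workflow:

      # One possible realization of a few curation steps with RDKit (illustrative).
      from rdkit import Chem
      from rdkit.Chem.SaltRemover import SaltRemover

      raw_smiles = ["CC(=O)Oc1ccccc1C(=O)O",        # aspirin
                    "CC(=O)Oc1ccccc1C(=O)O.[Na]",   # aspirin sodium salt
                    "Cl[Pt]Cl",                     # inorganic/organometallic -> drop
                    "not_a_smiles"]                 # unparsable record -> drop

      remover = SaltRemover()
      seen, curated = set(), []
      for smi in raw_smiles:
          mol = Chem.MolFromSmiles(smi)
          if mol is None:                           # structure validation
              continue
          mol = remover.StripMol(mol)               # remove counterions/salts
          atoms = {a.GetSymbol() for a in mol.GetAtoms()}
          if "C" not in atoms:                      # crude inorganic/organometallic filter
              continue
          canonical = Chem.MolToSmiles(mol)         # normalized canonical form
          if canonical not in seen:                 # delete duplicates
              seen.add(canonical)
              curated.append(canonical)

      print(curated)   # a single canonical aspirin entry survives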

  14. Implementation of patient charges at primary care facilities in Kenya: implications of low adherence to user fee policy for users and facility revenue.

    PubMed

    Opwora, Antony; Waweru, Evelyn; Toda, Mitsuru; Noor, Abdisalan; Edwards, Tansy; Fegan, Greg; Molyneux, Sassy; Goodman, Catherine

    2015-05-01

    With user fees now seen as a major hindrance to universal health coverage, many countries have introduced fee reduction or elimination policies, but there is growing evidence that adherence to reduced fees is often highly imperfect. In 2004, Kenya adopted a reduced and uniform user fee policy providing fee exemptions to many groups. We present data on user fee implementation, revenue and expenditure from a nationally representative survey of Kenyan primary health facilities. Data were collected from 248 randomly selected public health centres and dispensaries in 2010, comprising an interview with the health worker in charge, exit interviews with curative outpatients, and a financial record review. Adherence to user fee policy was assessed for eight tracer conditions based on health worker reports, and patients were asked about actual amounts paid. No facilities adhered fully to the user fee policy across all eight tracers, with adherence ranging from 62.2% for an adult with tuberculosis to 4.2% for an adult with malaria. Three quarters of exit interviewees had paid some fees, with a median payment of US dollars (USD) 0.39, and a quarter of interviewees were required to purchase additional medical supplies at a later stage from a private drug retailer. No consistent pattern of association was identified between facility characteristics and policy adherence. User fee revenues accounted for almost all facility cash income, with average revenue of USD 683 per facility per year. Fee revenue was mainly used to cover support staff, non-drug supplies and travel allowances. Adherence to user fee policy was very low, leading to concerns about the impact on access and the financial burden on households. However, the potential to ensure adherence was constrained by the facilities' need for revenue to cover basic operating costs, highlighting the need for alternative funding strategies for peripheral health facilities. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014.

  15. Between land and sea: divergent data stewardship practices in deep-sea biosphere research

    NASA Astrophysics Data System (ADS)

    Cummings, R.; Darch, P.

    2013-12-01

    Data in deep-sea biosphere research often live a double life. While the original data generated on IODP expeditions are highly structured, professionally curated, and widely shared, the downstream data practices of deep-sea biosphere laboratories are far more localized and ad hoc. These divergent data practices make it difficult to track the provenance of datasets from the cruise ships to the laboratory or to integrate IODP data with laboratory data. An in-depth study of the divergent data practices in deep-sea biosphere research allows us to:
    - Better understand the social and technical forces that shape data stewardship throughout the data lifecycle;
    - Develop policy, infrastructure, and best practices to improve data stewardship in small labs;
    - Track provenance of datasets from IODP cruises to labs and publications;
    - Create linkages between laboratory findings, cruise data, and IODP samples.
    In this paper, we present findings from the first year of a case study of the Center for Dark Energy Biosphere Investigations (C-DEBI), an NSF Science and Technology Center that studies life beneath the seafloor. Our methods include observation in laboratories, interviews, document analysis, and participation in scientific meetings. Our research uncovers the data stewardship norms of geologists, biologists, chemists, and hydrologists conducting multi-disciplinary research. Our research team found that data stewardship on cruises is a clearly defined task performed by an IODP curator, while downstream it is a distributed task that develops in response to local need and to the extent necessary for the immediate research team. IODP data are expensive to collect and challenging to obtain, often costing $50,000/day and requiring researchers to work twelve hours a day onboard the ships. To maximize this research investment, a highly trained IODP data curator controls data stewardship on the cruise and applies best practices such as standardized formats, proper labeling, and centralized storage. In the laboratory, a scientist is his or her own curator. In contrast to the IODP research parties, laboratory research teams analyze diverse datasets, share them internally, implement ad hoc data management practices, optimize methods for their specific research questions, and release data on request through personal transactions. We discovered that while these workflows help small research teams retain flexibility and local control - crucial in exploratory deep-sea biosphere research - they also hinder data interoperability, discoverability, and consistency of methods from one research team to the next. Additional consequences of this contrast between IODP and lab practices are that it is difficult to track the provenance of data and to create linkages between laboratory findings, cruise data, and archived IODP samples. The ability to track provenance would add value to datasets and provide a clearer picture of the decisions made throughout the data lifecycle. Better linkages between the original data, laboratory data, and samples would allow secondary researchers to locate IODP data that may be useful to their research after laboratory findings are published. Our case study is funded by the Sloan Foundation and NSF.

  16. Curatr: a web application for creating, curating and sharing a mass spectral library.

    PubMed

    Palmer, Andrew; Phapale, Prasad; Fay, Dominik; Alexandrov, Theodore

    2018-04-15

    We have developed a web application curatr for the rapid generation of high quality mass spectral fragmentation libraries from liquid-chromatography mass spectrometry datasets. Curatr handles datasets from single or multiplexed standards and extracts chromatographic profiles and potential fragmentation spectra for multiple adducts. An intuitive interface helps users to select high quality spectra that are stored along with searchable molecular information, the provenance of each standard and experimental metadata. Curatr supports exports to several standard formats for use with third party software or submission to repositories. We demonstrate the use of curatr to generate the EMBL Metabolomics Core Facility spectral library http://curatr.mcf.embl.de. Source code and example data are at http://github.com/alexandrovteam/curatr/. palmer@embl.de. Supplementary data are available at Bioinformatics online.

  17. An overview of the BioCreative 2012 Workshop Track III: interactive text mining task

    PubMed Central

    Arighi, Cecilia N.; Carterette, Ben; Cohen, K. Bretonnel; Krallinger, Martin; Wilbur, W. John; Fey, Petra; Dodson, Robert; Cooper, Laurel; Van Slyke, Ceri E.; Dahdul, Wasila; Mabee, Paula; Li, Donghui; Harris, Bethany; Gillespie, Marc; Jimenez, Silvia; Roberts, Phoebe; Matthews, Lisa; Becker, Kevin; Drabkin, Harold; Bello, Susan; Licata, Luana; Chatr-aryamontri, Andrew; Schaeffer, Mary L.; Park, Julie; Haendel, Melissa; Van Auken, Kimberly; Li, Yuling; Chan, Juancarlos; Muller, Hans-Michael; Cui, Hong; Balhoff, James P.; Chi-Yang Wu, Johnny; Lu, Zhiyong; Wei, Chih-Hsuan; Tudor, Catalina O.; Raja, Kalpana; Subramani, Suresh; Natarajan, Jeyakumar; Cejuela, Juan Miguel; Dubey, Pratibha; Wu, Cathy

    2013-01-01

    In many databases, biocuration primarily involves literature curation, which usually involves retrieving relevant articles, extracting information that will translate into annotations and identifying new incoming literature. As the volume of biological literature increases, the use of text mining to assist in biocuration becomes increasingly relevant. A number of groups have developed tools for text mining from a computer science/linguistics perspective, and there are many initiatives to curate some aspect of biology from the literature. Some biocuration efforts already make use of a text mining tool, but there have not been many broad-based systematic efforts to study which aspects of a text mining tool contribute to its usefulness for a curation task. Here, we report on an effort to bring together text mining tool developers and database biocurators to test the utility and usability of tools. Six text mining systems presenting diverse biocuration tasks participated in a formal evaluation, and appropriate biocurators were recruited for testing. The performance results from this evaluation indicate that some of the systems were able to improve efficiency of curation by speeding up the curation task significantly (∼1.7- to 2.5-fold) over manual curation. In addition, some of the systems were able to improve annotation accuracy when compared with the performance on the manually curated set. In terms of inter-annotator agreement, the factors that contributed to significant differences for some of the systems included the expertise of the biocurator on the given curation task, the inherent difficulty of the curation and attention to annotation guidelines. After the task, annotators were asked to complete a survey to help identify strengths and weaknesses of the various systems. The analysis of this survey highlights how important task completion is to the biocurators’ overall experience of a system, regardless of the system’s high score on design, learnability and usability. In addition, strategies to refine the annotation guidelines and systems documentation, to adapt the tools to the needs and query types the end user might have and to evaluate performance in terms of efficiency, user interface, result export and traditional evaluation metrics have been analyzed during this task. This analysis will help to plan for a more intense study in BioCreative IV. PMID:23327936

  18. An overview of the BioCreative 2012 Workshop Track III: interactive text mining task.

    PubMed

    Arighi, Cecilia N; Carterette, Ben; Cohen, K Bretonnel; Krallinger, Martin; Wilbur, W John; Fey, Petra; Dodson, Robert; Cooper, Laurel; Van Slyke, Ceri E; Dahdul, Wasila; Mabee, Paula; Li, Donghui; Harris, Bethany; Gillespie, Marc; Jimenez, Silvia; Roberts, Phoebe; Matthews, Lisa; Becker, Kevin; Drabkin, Harold; Bello, Susan; Licata, Luana; Chatr-aryamontri, Andrew; Schaeffer, Mary L; Park, Julie; Haendel, Melissa; Van Auken, Kimberly; Li, Yuling; Chan, Juancarlos; Muller, Hans-Michael; Cui, Hong; Balhoff, James P; Chi-Yang Wu, Johnny; Lu, Zhiyong; Wei, Chih-Hsuan; Tudor, Catalina O; Raja, Kalpana; Subramani, Suresh; Natarajan, Jeyakumar; Cejuela, Juan Miguel; Dubey, Pratibha; Wu, Cathy

    2013-01-01

    In many databases, biocuration primarily involves literature curation, which usually involves retrieving relevant articles, extracting information that will translate into annotations and identifying new incoming literature. As the volume of biological literature increases, the use of text mining to assist in biocuration becomes increasingly relevant. A number of groups have developed tools for text mining from a computer science/linguistics perspective, and there are many initiatives to curate some aspect of biology from the literature. Some biocuration efforts already make use of a text mining tool, but there have not been many broad-based systematic efforts to study which aspects of a text mining tool contribute to its usefulness for a curation task. Here, we report on an effort to bring together text mining tool developers and database biocurators to test the utility and usability of tools. Six text mining systems presenting diverse biocuration tasks participated in a formal evaluation, and appropriate biocurators were recruited for testing. The performance results from this evaluation indicate that some of the systems were able to improve efficiency of curation by speeding up the curation task significantly (∼1.7- to 2.5-fold) over manual curation. In addition, some of the systems were able to improve annotation accuracy when compared with the performance on the manually curated set. In terms of inter-annotator agreement, the factors that contributed to significant differences for some of the systems included the expertise of the biocurator on the given curation task, the inherent difficulty of the curation and attention to annotation guidelines. After the task, annotators were asked to complete a survey to help identify strengths and weaknesses of the various systems. The analysis of this survey highlights how important task completion is to the biocurators' overall experience of a system, regardless of the system's high score on design, learnability and usability. In addition, strategies to refine the annotation guidelines and systems documentation, to adapt the tools to the needs and query types the end user might have and to evaluate performance in terms of efficiency, user interface, result export and traditional evaluation metrics have been analyzed during this task. This analysis will help to plan for a more intense study in BioCreative IV.

  19. Income-related inequalities in preventive and curative dental care use among working-age Japanese adults in urban areas: a cross-sectional study.

    PubMed

    Murakami, Keiko; Aida, Jun; Ohkubo, Takayoshi; Hashimoto, Hideki

    2014-09-19

    Preventive dental care use remains relatively low in Japan, especially among working-age adults. Universal health insurance in Japan covers curative dental care with an out-of-pocket payment limit, though its coverage of preventive dental care is limited. The aim of this study was to test the hypothesis that income inequality in dental care use is found in preventive, but not curative, dental care among working-age Japanese adults. A cross-sectional survey was conducted using a computer-assisted, self-administered format for community residents aged 25-50 years. In all, 4357 residents agreed to participate and complete the questionnaire (valid response rate: 31.3%). Preventive dental care use was measured according to whether the participant had visited a dentist or a dental hygienist during the past year for dental scaling or fluoride or orthodontic treatments. Curative dental care use was assessed by dental visits for other reasons. The main explanatory variable was equivalent household income. Logistic regression analyses with linear trend tests were conducted to determine whether there were significant income-related gradients in curative or preventive dental care use. Among the respondents, 40.0% of men and 41.5% of women had used curative dental care in the past year; 24.1% of men and 34.1% of women had used preventive care. We found no significant income-related gradients in curative dental care among either men or women (p = 0.234 and p = 0.270, respectively). Significant income-related gradients in preventive care were observed among both men and women (p < 0.001 and p = 0.003, respectively). Among women, however, income-related differences were no longer significant (p = 0.126) after adjusting for education and other covariates. Compared with men in the lowest-income group, those in the highest-income group had a significantly higher (1.79-fold) probability of using preventive dental care. The prevalence of preventive dental care use was lower than that of curative care. The results showed income-related inequality in preventive dental care use among men, though there were no significant income-related gradients in curative dental care use among either men or women. Educational attainment had a positive association with preventive dental care use only among women.
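
    The linear trend test mentioned above amounts to entering income as an ordinal score in a logistic model for care use; a toy sketch on simulated data (all variable names and effect sizes invented) follows:

      # Toy linear-trend test: income quartile as an ordinal score in a logistic
      # model for preventive-care use (simulated data; invented effect sizes).
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 2000
      income_quartile = rng.integers(1, 5, size=n)    # 1 (lowest) .. 4 (highest)
      p_use = 1.0 / (1.0 + np.exp(-(-1.5 + 0.25 * income_quartile)))
      used_preventive = (rng.random(n) < p_use).astype(float)

      X = sm.add_constant(income_quartile.astype(float))
      fit = sm.Logit(used_preventive, X).fit(disp=False)
      print(fit.params, fit.pvalues)   # the slope's p-value is the trend test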

  20. Analysis on curative effects and safety of 2% liranaftate ointment in treating tinea pedis and tinea corporis & cruris.

    PubMed

    Sulaiman, Akebaier; Wan, Xuefeng; Fan, Junwei; Kasimu, Hadiliya; Dong, Xiaoyang; Wang, Xiaodong; Zhang, Lijuan; Abliz, Paride; Upur, Halmurat

    2017-05-01

    This paper analyzes and evaluates the curative effect and safety of 2% liranaftate ointment in treating patients with tinea pedis and tinea corporis & cruris. 1,100 patients with tinea pedis and tinea corporis & cruris were selected as research subjects and divided into two groups using the random number table method: 550 patients in the observation group were treated with 2% liranaftate ointment for external use, and the remaining 550 patients in the control group were treated with 1% bifonazole cream. The treatment time was two weeks for patients with tinea corporis & cruris and four weeks for those with tinea pedis. A one-month follow-up visit was conducted to compare the curative effects of the two groups. After the medication, the curative effectiveness rate was 87.65% (482/550) in the observation group and 84.91% (467/550) in the control group. After the follow-up (average 15.5±2.4), the curative effectiveness rate was 96.55% (531/550) in the observation group and 91.45% (503/550) in the control group. Both groups of patients recovered well with a low incidence of adverse reactions during treatment, and the overall curative effect was good; the inter-group difference was not statistically significant (P>0.05). Treatment with 2% liranaftate ointment is safe and effective for tinea pedis and tinea corporis & cruris, and is thus valuable for clinical popularization and application.
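
    The comparison of cure rates between the two arms can be checked with a standard two-proportion test, sketched below with statsmodels on the post-medication counts reported above; the choice of test is ours and is not necessarily the one used in the paper.

        from statsmodels.stats.proportion import proportions_ztest

        # Post-medication cured counts and group sizes, as reported above.
        cured = [482, 467]   # observation (liranaftate), control (bifonazole)
        total = [550, 550]

        # Two-sided z-test for equality of the two proportions.
        stat, p_value = proportions_ztest(cured, total)
        print(f"z = {stat:.2f}, p = {p_value:.3f}")   # p > 0.05, as reported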

  1. The hepatocurative effects of Cynara scolymus L. leaf extract on carbon tetrachloride-induced oxidative stress and hepatic injury in rats.

    PubMed

    Colak, Emine; Ustuner, Mehmet Cengiz; Tekin, Neslihan; Colak, Ertugrul; Burukoglu, Dilek; Degirmenci, Irfan; Gunes, Hasan Veysi

    2016-01-01

    Cynara scolymus is a pharmacologically important medicinal plant containing phenolic acids and flavonoids. Experimental studies indicate antioxidant and hepatoprotective effects of C. scolymus, but its therapeutic effects on liver diseases have not yet been studied. In the present study, the hepatocurative effects of C. scolymus leaf extract on carbon tetrachloride (CCl4)-induced oxidative stress and hepatic injury in rats were investigated via serum hepatic enzyme levels, an oxidative stress indicator (malondialdehyde, MDA), endogenous antioxidants, DNA fragmentation, p53, caspase 3 and histopathology. Animals were divided into six groups: control, olive oil, CCl4, C. scolymus leaf extract, recovery and curative. CCl4 was administered at a dose of 0.2 mL/kg twice daily to the CCl4, recovery and curative groups. Cynara scolymus extract was given orally for 2 weeks at a dose of 1.5 g/kg after CCl4 application in the curative group. Significant decreases in serum alanine-aminotransferase (ALT) and aspartate-aminotransferase (AST) levels were determined in the curative group. MDA levels were significantly lower in the curative group. A significant increase in superoxide dismutase (SOD) and catalase (CAT) activity in the curative group was determined. In the curative group, C. scolymus leaf extract application shifted the DNA fragmentation (%), p53 and caspase 3 levels of liver tissues towards the normal range. Our results indicated that C. scolymus leaf extract has hepatocurative effects on CCl4-induced oxidative stress and hepatic injury by reducing lipid peroxidation and restoring affected antioxidant systems towards the normal range. It also had positive effects on the regulatory mechanisms allowing repair of DNA damage in CCl4-induced hepatotoxicity.

  2. Hospital of Diagnosis Influences the Probability of Receiving Curative Treatment for Esophageal Cancer.

    PubMed

    van Putten, Margreet; Koëter, Marijn; van Laarhoven, Hanneke W M; Lemmens, Valery E P P; Siersema, Peter D; Hulshof, Maarten C C M; Verhoeven, Rob H A; Nieuwenhuijzen, Grard A P

    2018-02-01

    The aim of this article was to study the influence of hospital of diagnosis on the probability of receiving curative treatment and its impact on survival among patients with esophageal cancer (EC). Although EC surgery is centralized in the Netherlands, the disease is often diagnosed in hospitals that do not perform this procedure. Patients with potentially curable esophageal or gastroesophageal junction tumors (cT1-3,X, any N, M0,X) diagnosed between 2005 and 2013 were selected from the Netherlands Cancer Registry. Multilevel logistic regression was performed to examine the probability of undergoing curative treatment (resection with or without neoadjuvant treatment, definitive chemoradiotherapy, or local tumor excision) according to hospital of diagnosis. Effects of variation in probability of undergoing curative treatment among these hospitals on survival were investigated by Cox regression. All 13,017 patients with potentially curable EC, diagnosed in 91 hospitals, were included. The proportion of patients receiving curative treatment ranged from 37% to 83% and from 45% to 86% in the periods 2005-2009 and 2010-2013, respectively, depending on hospital of diagnosis. After adjustment for patient- and hospital-related characteristics these proportions ranged from 41% to 77% and from 50% to 82%, respectively (both P < 0.001). Multivariable survival analyses showed that patients diagnosed in hospitals with a low probability of undergoing curative treatment had a worse overall survival (hazard ratio = 1.13, 95% confidence interval 1.06-1.20; hazard ratio = 1.15, 95% confidence interval 1.07-1.24). The variation between hospitals of diagnosis in the probability of undergoing potentially curative treatment for EC, and its impact on survival, indicates that treatment decision making in EC may be improved.
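
    A minimal sketch of the survival step described above, using the lifelines library (our choice of tool; the paper does not specify software). The patient records and variable names are invented.

        import pandas as pd
        from lifelines import CoxPHFitter

        # Hypothetical records: follow-up (months), death event, and whether
        # the hospital of diagnosis had a low probability of curative treatment.
        df = pd.DataFrame({
            "months":        [14, 30, 7, 22, 41, 11, 18, 26, 5, 35],
            "died":          [1, 0, 1, 1, 0, 1, 0, 1, 1, 0],
            "low_prob_hosp": [1, 0, 1, 0, 0, 1, 1, 0, 1, 0],
        })

        # Cox proportional hazards model; exp(coef) is the hazard ratio.
        cph = CoxPHFitter()
        cph.fit(df, duration_col="months", event_col="died")
        cph.print_summary()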

  3. Automatic reconstruction of a bacterial regulatory network using Natural Language Processing

    PubMed Central

    Rodríguez-Penagos, Carlos; Salgado, Heladia; Martínez-Flores, Irma; Collado-Vides, Julio

    2007-01-01

    Background Manual curation of biological databases, an expensive and labor-intensive process, is essential for high quality integrated data. In this paper we report the implementation of a state-of-the-art Natural Language Processing system that creates computer-readable networks of regulatory interactions directly from different collections of abstracts and full-text papers. Our major aim is to understand how automatic annotation using Text-Mining techniques can complement manual curation of biological databases. We implemented a rule-based system to generate networks from different sets of documents dealing with regulation in Escherichia coli K-12. Results Performance evaluation is based on the most comprehensive transcriptional regulation database for any organism, the manually-curated RegulonDB, 45% of which we were able to recreate automatically. From our automated analysis we were also able to find some new interactions from papers not already curated, or that were missed in the manual filtering and review of the literature. We also put forward a novel Regulatory Interaction Markup Language better suited than SBML for simultaneously representing data of interest for biologists and text miners. Conclusion Manual curation of the output of automatic processing of text is a good way to complement a more detailed review of the literature, either for validating the results of what has been already annotated, or for discovering facts and information that might have been overlooked at the triage or curation stages. PMID:17683642
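
    A toy illustration of the rule-based idea: surface patterns applied to sentences yield candidate regulator-target pairs. This is a drastically simplified sketch of our own devising, not the authors' grammar; real systems add entity dictionaries, syntactic analysis and many more rules.

        import re

        # Toy sentences in the style of E. coli regulation literature.
        sentences = [
            "CRP activates the expression of araC.",
            "Fur represses entF under iron-replete conditions.",
            "The lacI gene product represses lacZ.",
        ]

        # One naive surface pattern: <regulator> activates/represses <target>.
        pattern = re.compile(
            r"(\w+)(?: gene product)? (activates|represses)"
            r"(?: the expression of| the)? (\w+)")

        for s in sentences:
            m = pattern.search(s)
            if m:
                regulator, verb, target = m.groups()
                print(f"{regulator} --{verb}--> {target}")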

  4. WikiGenomes: an open web application for community consumption and curation of gene annotation data in Wikidata

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Putman, Tim E.; Lelong, Sebastien; Burgstaller-Muehlbacher, Sebastian

    With the advancement of genome-sequencing technologies, new genomes are being sequenced daily. Although these sequences are deposited in publicly available data warehouses, their functional and genomic annotations (beyond genes which are predicted automatically) mostly reside in the text of primary publications. Professional curators are hard at work extracting those annotations from the literature for the most studied organisms and depositing them in structured databases. However, the resources don't exist to fund the comprehensive curation of the thousands of newly sequenced organisms in this manner. Here, we describe WikiGenomes (wikigenomes.org), a web application that facilitates the consumption and curation of genomic data by the entire scientific community. WikiGenomes is based on Wikidata, an openly editable knowledge graph with the goal of aggregating published knowledge into a free and open database. WikiGenomes empowers the individual genomic researcher to contribute their expertise to the curation effort and integrates the knowledge into Wikidata, enabling it to be accessed by anyone without restriction.
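
    Because WikiGenomes is built on Wikidata, its underlying gene annotations can also be queried directly through the public Wikidata SPARQL endpoint. A minimal sketch with the requests library; the identifiers used (P31 'instance of', Q7187 'gene', P703 'found in taxon') reflect our reading of the Wikidata schema and should be verified before relying on them.

        import requests

        # Fetch a few items typed as genes, with the taxon they belong to.
        query = """
        SELECT ?gene ?geneLabel ?taxonLabel WHERE {
          ?gene wdt:P31 wd:Q7187 ;   # instance of: gene (assumed Q-id)
                wdt:P703 ?taxon .    # found in taxon (assumed P-id)
          SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
        }
        LIMIT 5
        """
        r = requests.get("https://query.wikidata.org/sparql",
                         params={"query": query, "format": "json"},
                         headers={"User-Agent": "curation-demo/0.1"})
        for row in r.json()["results"]["bindings"]:
            print(row["geneLabel"]["value"], "-", row["taxonLabel"]["value"])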

  5. [Current status in the commercialization and application of genetically modified plants and their effects on human and livestock health and phytoremediation].

    PubMed

    Yoshimatsu, Kayo; Kawano, Noriaki; Kawahara, Nobuo; Akiyama, Hiroshi; Teshima, Reiko; Nishijima, Masahiro

    2012-01-01

    Developments in the use of genetically modified plants for human and livestock health and phytoremediation were surveyed using information retrieved from Entrez PubMed, Chemical Abstracts Service, Google, congress abstracts and proceedings of related scientific societies, scientific journals, etc. The information obtained was classified into 8 categories according to the research objective and usage of the transgenic plants: 1: nutraceuticals (functional foods), 2: oral vaccines, 3: edible curatives, 4: vaccine antigens, 5: therapeutic antibodies, 6: curatives, 7: diagnostic agents and reagents, and 8: phytoremediation. In total, 405 cases were collected from 2006 to 2010. The numbers of cases were 120 for nutraceuticals, 65 for oral vaccines, 25 for edible curatives, 36 for vaccine antigens, 36 for therapeutic antibodies, 76 for curatives, 15 for diagnostic agents and reagents, and 40 for phytoremediation (the category counts sum to 413 because some reports related to several categories). Nutraceuticals, oral vaccines and curatives were predominant. The most frequently used edible crop was rice (51 cases), followed by tomato (28 cases), lettuce (22 cases), potato (18 cases) and corn (15 cases).

  6. WikiGenomes: an open web application for community consumption and curation of gene annotation data in Wikidata

    DOE PAGES

    Putman, Tim E.; Lelong, Sebastien; Burgstaller-Muehlbacher, Sebastian; ...

    2017-03-06

    With the advancement of genome-sequencing technologies, new genomes are being sequenced daily. Although these sequences are deposited in publicly available data warehouses, their functional and genomic annotations (beyond genes which are predicted automatically) mostly reside in the text of primary publications. Professional curators are hard at work extracting those annotations from the literature for the most studied organisms and depositing them in structured databases. However, the resources don't exist to fund the comprehensive curation of the thousands of newly sequenced organisms in this manner. Here, we describe WikiGenomes (wikigenomes.org), a web application that facilitates the consumption and curation of genomic data by the entire scientific community. WikiGenomes is based on Wikidata, an openly editable knowledge graph with the goal of aggregating published knowledge into a free and open database. WikiGenomes empowers the individual genomic researcher to contribute their expertise to the curation effort and integrates the knowledge into Wikidata, enabling it to be accessed by anyone without restriction.

  7. STOP using just GO: a multi-ontology hypothesis generation tool for high throughput experimentation

    PubMed Central

    2013-01-01

    Background Gene Ontology (GO) enrichment analysis remains one of the most common methods for hypothesis generation from high throughput datasets. However, we believe that researchers strive to test other hypotheses that fall outside of GO. Here, we developed and evaluated a tool for hypothesis generation from gene or protein lists using ontological concepts present in manually curated text that describes those genes and proteins. Results As a consequence we have developed the method Statistical Tracking of Ontological Phrases (STOP) that expands the realm of testable hypotheses in gene set enrichment analyses by integrating automated annotations of genes to terms from over 200 biomedical ontologies. While not as precise as manually curated terms, we find that the additional enriched concepts have value when coupled with traditional enrichment analyses using curated terms. Conclusion Multiple ontologies have been developed for gene and protein annotation, by using a dataset of both manually curated GO terms and automatically recognized concepts from curated text we can expand the realm of hypotheses that can be discovered. The web application STOP is available at http://mooneygroup.org/stop/. PMID:23409969
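
    The core computation behind this style of enrichment analysis is a hypergeometric test: given a list of n genes drawn from a background of M genes, K of which are annotated to a concept, how surprising is an overlap of k? A minimal sketch with SciPy; the counts are invented.

        from scipy.stats import hypergeom

        M = 20000   # background: all annotated genes
        K = 150     # background genes annotated to the concept
        n = 300     # genes in the submitted list
        k = 12      # list genes annotated to the concept

        # P(X >= k) is the survival function evaluated at k - 1.
        p_value = hypergeom.sf(k - 1, M, K, n)
        print(f"enrichment p = {p_value:.3g}")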

  8. An Analysis of the Climate Data Initiative's Data Collection

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Bugbee, K.

    2015-12-01

    The Climate Data Initiative (CDI) is a broad multi-agency effort of the U.S. government that seeks to leverage the extensive existing federal climate-relevant data to stimulate innovation and private-sector entrepreneurship to support national climate-change preparedness. The CDI project is a systematic effort to manually curate and share openly available climate data from various federal agencies. To date, the CDI has curated seven themes, or topics, relevant to climate change resiliency. These themes include Coastal Flooding, Food Resilience, Water, Ecosystem Vulnerability, Human Health, Energy Infrastructure, and Transportation. Each theme was curated by subject matter experts who selected datasets relevant to the topic at hand. An analysis of the entire Climate Data Initiative data collection and the data curated for each theme offers insights into which datasets are considered most relevant in addressing climate resiliency. Other aspects of the data collection will be examined including which datasets were the most visited or popular and which datasets were the most sought after for curation by the theme teams. Results from the analysis of the CDI collection will be presented in this talk.

  9. CARD 2017: expansion and model-centric curation of the comprehensive antibiotic resistance database

    PubMed Central

    Jia, Baofeng; Raphenya, Amogelang R.; Alcock, Brian; Waglechner, Nicholas; Guo, Peiyao; Tsang, Kara K.; Lago, Briony A.; Dave, Biren M.; Pereira, Sheldon; Sharma, Arjun N.; Doshi, Sachin; Courtot, Mélanie; Lo, Raymond; Williams, Laura E.; Frye, Jonathan G.; Elsayegh, Tariq; Sardar, Daim; Westman, Erin L.; Pawlowski, Andrew C.; Johnson, Timothy A.; Brinkman, Fiona S.L.; Wright, Gerard D.; McArthur, Andrew G.

    2017-01-01

    The Comprehensive Antibiotic Resistance Database (CARD; http://arpcard.mcmaster.ca) is a manually curated resource containing high quality reference data on the molecular basis of antimicrobial resistance (AMR), with an emphasis on the genes, proteins and mutations involved in AMR. CARD is ontologically structured, model centric, and spans the breadth of AMR drug classes and resistance mechanisms, including intrinsic, mutation-driven and acquired resistance. It is built upon the Antibiotic Resistance Ontology (ARO), a custom built, interconnected and hierarchical controlled vocabulary allowing advanced data sharing and organization. Its design allows the development of novel genome analysis tools, such as the Resistance Gene Identifier (RGI) for resistome prediction from raw genome sequence. Recent improvements include extensive curation of additional reference sequences and mutations, development of a unique Model Ontology and accompanying AMR detection models to power sequence analysis, new visualization tools, and expansion of the RGI for detection of emergent AMR threats. CARD curation is updated monthly based on an interplay of manual literature curation, computational text mining, and genome analysis. PMID:27789705

  10. MicRhoDE: a curated database for the analysis of microbial rhodopsin diversity and evolution

    PubMed Central

    Boeuf, Dominique; Audic, Stéphane; Brillet-Guéguen, Loraine; Caron, Christophe; Jeanthon, Christian

    2015-01-01

    Microbial rhodopsins are a diverse group of photoactive transmembrane proteins found in all three domains of life and in viruses. Today, microbial rhodopsin research is a flourishing research field in which new understandings of rhodopsin diversity, function and evolution are contributing to broader microbiological and molecular knowledge. Here, we describe MicRhoDE, a comprehensive, high-quality and freely accessible database that facilitates analysis of the diversity and evolution of microbial rhodopsins. Rhodopsin sequences isolated from a vast array of marine and terrestrial environments were manually collected and curated. To each rhodopsin sequence are associated related metadata, including predicted spectral tuning of the protein, putative activity and function, taxonomy for sequences that can be linked to a 16S rRNA gene, sampling date and location, and supporting literature. The database currently covers 7857 aligned sequences from more than 450 environmental samples or organisms. Based on a robust phylogenetic analysis, we introduce an operational classification system with multiple phylogenetic levels ranging from superclusters to species-level operational taxonomic units. An integrated pipeline for online sequence alignment and phylogenetic tree construction is also provided. With a user-friendly interface and integrated online bioinformatics tools, this unique resource should be highly valuable for upcoming studies of the biogeography, diversity, distribution and evolution of microbial rhodopsins. Database URL: http://micrhode.sb-roscoff.fr. PMID:26286928

  11. MicRhoDE: a curated database for the analysis of microbial rhodopsin diversity and evolution.

    PubMed

    Boeuf, Dominique; Audic, Stéphane; Brillet-Guéguen, Loraine; Caron, Christophe; Jeanthon, Christian

    2015-01-01

    Microbial rhodopsins are a diverse group of photoactive transmembrane proteins found in all three domains of life and in viruses. Today, microbial rhodopsin research is a flourishing research field in which new understandings of rhodopsin diversity, function and evolution are contributing to broader microbiological and molecular knowledge. Here, we describe MicRhoDE, a comprehensive, high-quality and freely accessible database that facilitates analysis of the diversity and evolution of microbial rhodopsins. Rhodopsin sequences isolated from a vast array of marine and terrestrial environments were manually collected and curated. To each rhodopsin sequence are associated related metadata, including predicted spectral tuning of the protein, putative activity and function, taxonomy for sequences that can be linked to a 16S rRNA gene, sampling date and location, and supporting literature. The database currently covers 7857 aligned sequences from more than 450 environmental samples or organisms. Based on a robust phylogenetic analysis, we introduce an operational classification system with multiple phylogenetic levels ranging from superclusters to species-level operational taxonomic units. An integrated pipeline for online sequence alignment and phylogenetic tree construction is also provided. With a user-friendly interface and integrated online bioinformatics tools, this unique resource should be highly valuable for upcoming studies of the biogeography, diversity, distribution and evolution of microbial rhodopsins. Database URL: http://micrhode.sb-roscoff.fr. © The Author(s) 2015. Published by Oxford University Press.

  12. Omics Metadata Management Software (OMMS).

    PubMed

    Perez-Arriaga, Martha O; Wilson, Susan; Williams, Kelly P; Schoeniger, Joseph; Waymire, Russel L; Powell, Amy Jo

    2015-01-01

    Next-generation sequencing projects have underappreciated information management tasks requiring detailed attention to specimen curation, nucleic acid sample preparation and sequence production methods required for downstream data processing, comparison, interpretation, sharing and reuse. The few existing metadata management tools for genome-based studies provide weak curatorial frameworks for experimentalists to store and manage idiosyncratic, project-specific information, typically offering no automation supporting unified naming and numbering conventions for sequencing production environments that routinely deal with hundreds, if not thousands, of samples at a time. Moreover, existing tools are not readily interfaced with bioinformatics executables (e.g., BLAST, Bowtie2, custom pipelines). Our application, the Omics Metadata Management Software (OMMS), answers both needs, empowering experimentalists to generate intuitive, consistent metadata, and perform analyses and information management tasks via an intuitive web-based interface. Several use cases with short-read sequence datasets are provided to validate installation and integrated function, and suggest possible methodological road maps for prospective users. The provided examples highlight possible OMMS workflows for metadata curation, multistep analyses, and results management and downloading. The OMMS can be implemented as a stand-alone package for individual laboratories, or can be configured for web-based deployment supporting geographically dispersed projects. The OMMS was developed using an open-source software base, is flexible, extensible and easily installed and executed. The OMMS can be obtained at http://omms.sandia.gov.

  13. Filovirus RefSeq Entries: Evaluation and Selection of Filovirus Type Variants, Type Sequences, and Names

    PubMed Central

    Kuhn, Jens H.; Andersen, Kristian G.; Bào, Yīmíng; Bavari, Sina; Becker, Stephan; Bennett, Richard S.; Bergman, Nicholas H.; Blinkova, Olga; Bradfute, Steven; Brister, J. Rodney; Bukreyev, Alexander; Chandran, Kartik; Chepurnov, Alexander A.; Davey, Robert A.; Dietzgen, Ralf G.; Doggett, Norman A.; Dolnik, Olga; Dye, John M.; Enterlein, Sven; Fenimore, Paul W.; Formenty, Pierre; Freiberg, Alexander N.; Garry, Robert F.; Garza, Nicole L.; Gire, Stephen K.; Gonzalez, Jean-Paul; Griffiths, Anthony; Happi, Christian T.; Hensley, Lisa E.; Herbert, Andrew S.; Hevey, Michael C.; Hoenen, Thomas; Honko, Anna N.; Ignatyev, Georgy M.; Jahrling, Peter B.; Johnson, Joshua C.; Johnson, Karl M.; Kindrachuk, Jason; Klenk, Hans-Dieter; Kobinger, Gary; Kochel, Tadeusz J.; Lackemeyer, Matthew G.; Lackner, Daniel F.; Leroy, Eric M.; Lever, Mark S.; Mühlberger, Elke; Netesov, Sergey V.; Olinger, Gene G.; Omilabu, Sunday A.; Palacios, Gustavo; Panchal, Rekha G.; Park, Daniel J.; Patterson, Jean L.; Paweska, Janusz T.; Peters, Clarence J.; Pettitt, James; Pitt, Louise; Radoshitzky, Sheli R.; Ryabchikova, Elena I.; Saphire, Erica Ollmann; Sabeti, Pardis C.; Sealfon, Rachel; Shestopalov, Aleksandr M.; Smither, Sophie J.; Sullivan, Nancy J.; Swanepoel, Robert; Takada, Ayato; Towner, Jonathan S.; van der Groen, Guido; Volchkov, Viktor E.; Volchkova, Valentina A.; Wahl-Jensen, Victoria; Warren, Travis K.; Warfield, Kelly L.; Weidmann, Manfred; Nichol, Stuart T.

    2014-01-01

    Sequence determination of complete or coding-complete genomes of viruses is becoming common practice for supporting the work of epidemiologists, ecologists, virologists, and taxonomists. Sequencing duration and costs are rapidly decreasing, sequencing hardware is under modification for use by non-experts, and software is constantly being improved to simplify sequence data management and analysis. Thus, analysis of virus disease outbreaks on the molecular level is now feasible, including characterization of the evolution of individual virus populations in single patients over time. The increasing accumulation of sequencing data creates a management problem for the curators of commonly used sequence databases and an entry retrieval problem for end users. Therefore, utilizing the data to their fullest potential will require setting nomenclature and annotation standards for virus isolates and associated genomic sequences. The National Center for Biotechnology Information's (NCBI's) RefSeq is a non-redundant, curated database for reference (or type) nucleotide sequence records that supplies source data to numerous other databases. Building on recently proposed templates for filovirus variant naming, we report consensus decisions from a majority of past and currently active filovirus experts on the eight filovirus type variants and isolates to be represented in RefSeq, their final designations, and their associated sequences. PMID:25256396

  14. Omics Metadata Management Software (OMMS)

    PubMed Central

    Perez-Arriaga, Martha O; Wilson, Susan; Williams, Kelly P; Schoeniger, Joseph; Waymire, Russel L; Powell, Amy Jo

    2015-01-01

    Next-generation sequencing projects have underappreciated information management tasks requiring detailed attention to specimen curation, nucleic acid sample preparation and sequence production methods required for downstream data processing, comparison, interpretation, sharing and reuse. The few existing metadata management tools for genome-based studies provide weak curatorial frameworks for experimentalists to store and manage idiosyncratic, project-specific information, typically offering no automation supporting unified naming and numbering conventions for sequencing production environments that routinely deal with hundreds, if not thousands, of samples at a time. Moreover, existing tools are not readily interfaced with bioinformatics executables (e.g., BLAST, Bowtie2, custom pipelines). Our application, the Omics Metadata Management Software (OMMS), answers both needs, empowering experimentalists to generate intuitive, consistent metadata, and perform analyses and information management tasks via an intuitive web-based interface. Several use cases with short-read sequence datasets are provided to validate installation and integrated function, and suggest possible methodological road maps for prospective users. The provided examples highlight possible OMMS workflows for metadata curation, multistep analyses, and results management and downloading. The OMMS can be implemented as a stand-alone package for individual laboratories, or can be configured for web-based deployment supporting geographically dispersed projects. The OMMS was developed using an open-source software base, is flexible, extensible and easily installed and executed. Availability: The OMMS can be obtained at http://omms.sandia.gov. PMID:26124554

  15. Payao: a community platform for SBML pathway model curation

    PubMed Central

    Matsuoka, Yukiko; Ghosh, Samik; Kikuchi, Norihiro; Kitano, Hiroaki

    2010-01-01

    Summary: Payao is a community-based, collaborative web service platform for gene-regulatory and biochemical pathway model curation. The system combines Web 2.0 technologies and online model visualization functions to enable a collaborative community to annotate and curate biological models. Payao reads models in Systems Biology Markup Language format, displays them with CellDesigner, a process diagram editor that complies with the Systems Biology Graphical Notation, and provides an interface for model enrichment (adding tags and comments to the models) for access-controlled community members. Availability and implementation: Freely available for model curation service at http://www.payaologue.org. Web site implemented in Seasar Framework 2.0 with S2Flex2, MySQL 5.0 and Tomcat 5.5, with all major browsers supported. Contact: kitano@sbi.jp PMID:20371497

  16. Curating and Integrating Data from Multiple Sources to Support Healthcare Analytics.

    PubMed

    Ng, Kenney; Kakkanatt, Chris; Benigno, Michael; Thompson, Clay; Jackson, Margaret; Cahan, Amos; Zhu, Xinxin; Zhang, Ping; Huang, Paul

    2015-01-01

    As the volume and variety of healthcare related data continues to grow, the analysis and use of this data will increasingly depend on the ability to appropriately collect, curate and integrate disparate data from many different sources. We describe our approach to and highlight our experiences with the development of a robust data collection, curation and integration infrastructure that supports healthcare analytics. This system has been successfully applied to the processing of a variety of data types including clinical data from electronic health records and observational studies, genomic data, microbiomic data, self-reported data from surveys and self-tracked data from wearable devices from over 600 subjects. The curated data is currently being used to support healthcare analytic applications such as data visualization, patient stratification and predictive modeling.

  17. Building an efficient curation workflow for the Arabidopsis literature corpus

    PubMed Central

    Li, Donghui; Berardini, Tanya Z.; Muller, Robert J.; Huala, Eva

    2012-01-01

    TAIR (The Arabidopsis Information Resource) is the model organism database (MOD) for Arabidopsis thaliana, a model plant with a literature corpus of about 39 000 articles in PubMed, with over 4300 new articles added in 2011. We have developed a literature curation workflow incorporating both automated and manual elements to cope with this flood of new research articles. The current workflow can be divided into two phases: article selection and curation. Structured controlled vocabularies, such as the Gene Ontology and Plant Ontology are used to capture free text information in the literature as succinct ontology-based annotations suitable for the application of computational analysis methods. We also describe our curation platform and the use of text mining tools in our workflow. Database URL: www.arabidopsis.org PMID:23221298

  18. Optimising mHealth helpdesk responsiveness in South Africa: towards automated message triage

    PubMed Central

    Engelhard, Matthew; Copley, Charles; Watson, Jacqui; Pillay, Yogan; Barron, Peter

    2018-01-01

    In South Africa, a national-level helpdesk was established in August 2014 as a social accountability mechanism for improving governance, allowing recipients of public sector services to send complaints, compliments and questions directly to a team of National Department of Health (NDoH) staff members via text message. As demand increases, mechanisms to streamline and improve the helpdesk must be explored. This work aims to evaluate the need for and feasibility of automated message triage to improve helpdesk responsiveness to high-priority messages. Drawing from 65 768 messages submitted between October 2016 and July 2017, the quality of helpdesk message handling was evaluated via detailed inspection of (1) a random sample of 481 messages and (2) messages reporting mistreatment of women, as identified using expert-curated keywords. Automated triage was explored by training a naïve Bayes classifier to replicate message labels assigned by NDoH staff. Classifier performance was evaluated on 12 526 messages withheld from the training set. 90 of 481 (18.7%) NDoH responses were scored as suboptimal or incorrect, with median response time of 4.0 hours. 32 reports of facility-based mistreatment and 39 of partner and family violence were identified; NDoH response time and appropriateness for these messages were not superior to the random sample (P>0.05). The naïve Bayes classifier had average accuracy of 85.4%, with ≥98% specificity for infrequently appearing (<50%) labels. These results show that helpdesk handling of mistreatment of women could be improved. Keyword matching and naïve Bayes effectively identified uncommon messages of interest and could support automated triage to improve handling of high-priority messages. PMID:29713508
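
    A minimal sketch of the naive Bayes triage idea using scikit-learn; the messages and labels below are invented stand-ins for the NDoH data, which is not public.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        # Invented training messages with triage labels.
        messages = [
            "the nurse shouted at me during my visit",
            "thank you for the helpful staff at the clinic",
            "when should my baby get the next vaccination",
            "I was turned away and treated badly at the facility",
            "great service today, very quick",
            "what are the clinic opening hours",
        ]
        labels = ["complaint", "compliment", "question",
                  "complaint", "compliment", "question"]

        model = make_pipeline(CountVectorizer(), MultinomialNB())
        model.fit(messages, labels)

        # Route a new incoming message to a label for prioritisation.
        print(model.predict(["the staff were rude and refused to help me"]))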

  19. Rescue of Long-Tail Data from the Ocean Bottom to the Moon

    NASA Astrophysics Data System (ADS)

    Hsu, L.; Lehnert, K. A.; Carbotte, S. M.; Ferrini, V.; Delano, J. W.; Gill, J. B.; Tivey, M.

    2013-12-01

    IEDA (Integrated Earth Data Applications, www.iedadata.org), the NSF facility that operates EarthChem, the Marine Geoscience Data System, and the System for Earth Sample Registration, launched a Data Rescue Initiative in 2013 to advance preservation and re-use of valuable legacy datasets that are in danger of being lost. As part of this initiative, IEDA established a competition for Data Rescue Mini-Awards that provide modest funds to investigators to properly compile, document, and transfer legacy data sets to IEDA. Applications from late-career and near-retirement investigators were specifically encouraged. Awardees were given approximately six months to complete their data rescue activities. Three projects were awarded in 2013: (1) Geochemistry of Lunar Glasses: assembly of major element, trace element, volatile element, and isotope ratio data for lunar volcanic glasses and lunar impact glasses, (2) Geochemical and Geochronological data from Fiji, Izu-Bonin-Marianas, and Endeavor segment: assembly of published and unpublished data and metadata from large rock sample collections, and (3) Near-bottom Magnetic Data: curation and archival of 35 years of high-resolution, near-bottom magnetic field data from deep-towed platforms, submersibles, and ROVs. IEDA is working closely with the awardees to guide and support the data rescue effort and to assist with specific challenges related to outdated storage media or technology, diversity of platforms over decades of research, and the lack of established standards for data documentation. In this contribution we describe procedures and tools used for each project, summarize lessons learned and best practices, and present the final output of each data rescue project. Depending on the experiences of this first year and the availability of funds, we plan to continue the competition in future years.

  20. Application of curative therapy in the ward. 1920.

    PubMed

    Marble, Henry Chase

    2009-06-01

    This Classic article is a reprint of the original work by Henry Chase Marble, Application of Curative Therapy in the Ward. An accompanying biographical sketch on Henry Chase Marble, MD, is available at DOI 10.1007/s11999-009-0789-7 . The Classic Article is (c)1920 by the Journal of Bone and Joint Surgery, Inc. and is reprinted with permission from Marble HC. Application of curative therapy in the ward. J Bone Joint Surg Am. 1920;2:136-138.

  1. A new dimensional-reducing variable obtained from original inflammatory scores is highly associated to morbidity after curative surgery for colorectal cancer.

    PubMed

    Bailon-Cuadrado, Martin; Perez-Saborido, Baltasar; Sanchez-Gonzalez, Javier; Rodriguez-Lopez, Mario; Mayo-Iscar, Agustin; Pacheco-Sanchez, David

    2018-06-20

    Several scores have been developed to define the inflammatory status of oncological patients. We suspect they share redundant information. Our hypothesis is that we may summarise their information into one or two new independent variables, which will help us predict more accurately which patients are at increased risk of suffering postoperative complications after curative surgery for CRC. We conducted an observational prospective study of patients undergoing curative surgery for CRC between September 2015 and February 2017. We analysed the influence of inflammatory scores (PNI, GPS, NLR, PLR) on postoperative morbidity (overall and severe complications, anastomotic leakage and reoperation). In total, 168 patients were analysed. We confirmed that the four original scores are interrelated. Using a complex and innovative statistical method, we created two new independent variables (resultant A and resultant B) which summarise the information contained in them. One of these two new variables (resultant A) was statistically associated with overall complications (OR, 2.239; 95% CI, 1.541-3.253; p = 0.0001), severe complications (OR, 1.773; 95% CI, 1.129-2.785; p = 0.013), anastomotic leakage (OR, 3.208; 95% CI, 1.416-7.268; p = 0.005) and reoperation (OR, 2.349; 95% CI, 1.281-4.305; p = 0.006). We showed that the four original scores share redundant information and created two new independent variables that summarise it. In our sample of patients, one of these variables proved to be a strong predictive factor for all four complications analysed.
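
    The abstract does not name its dimension-reduction method; principal component analysis is one standard way to collapse four correlated scores into two uncorrelated summary variables, as sketched below with scikit-learn on invented data.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # Invented inflammatory scores for eight patients; columns stand in
        # for PNI, GPS, NLR and PLR and are correlated by construction.
        rng = np.random.default_rng(0)
        base = rng.normal(size=(8, 1))
        scores = np.hstack([base + 0.3 * rng.normal(size=(8, 1))
                            for _ in range(4)])

        # Standardise, then project onto two orthogonal summary axes,
        # analogous to "resultant A" and "resultant B" above.
        X = StandardScaler().fit_transform(scores)
        resultants = PCA(n_components=2).fit_transform(X)
        print(resultants)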

  2. A human functional protein interaction network and its application to cancer data analysis

    PubMed Central

    2010-01-01

    Background One challenge facing biologists is to tease out useful information from massive data sets for further analysis. A pathway-based analysis may shed light by projecting candidate genes onto protein functional relationship networks. We are building such a pathway-based analysis system. Results We have constructed a protein functional interaction network by extending curated pathways with non-curated sources of information, including protein-protein interactions, gene coexpression, protein domain interaction, Gene Ontology (GO) annotations and text-mined protein interactions, which cover close to 50% of the human proteome. By applying this network to two glioblastoma multiforme (GBM) data sets and projecting cancer candidate genes onto the network, we found that the majority of GBM candidate genes form a cluster and are closer than expected by chance, and the majority of GBM samples have sequence-altered genes in two network modules, one mainly comprising genes whose products are localized in the cytoplasm and plasma membrane, and another comprising gene products in the nucleus. Both modules are highly enriched in known oncogenes, tumor suppressors and genes involved in signal transduction. Similar network patterns were also found in breast, colorectal and pancreatic cancers. Conclusions We have built a highly reliable functional interaction network upon expert-curated pathways and applied this network to the analysis of two genome-wide GBM and several other cancer data sets. The network patterns revealed from our results suggest common mechanisms in the cancer biology. Our system should provide a foundation for a network or pathway-based analysis platform for cancer and other diseases. PMID:20482850

  3. ARACNe-based inference, using curated microarray data, of Arabidopsis thaliana root transcriptional regulatory networks

    PubMed Central

    2014-01-01

    Background Uncovering the complex transcriptional regulatory networks (TRNs) that underlie plant and animal development remains a challenge. However, a vast amount of data from public microarray experiments is available, which can be subject to inference algorithms in order to recover reliable TRN architectures. Results In this study we present a simple bioinformatics methodology that uses public, carefully curated microarray data and the mutual information algorithm ARACNe in order to obtain a database of transcriptional interactions. We used data from Arabidopsis thaliana root samples to show that the transcriptional regulatory networks derived from this database successfully recover previously identified root transcriptional modules and to propose new transcription factors for the SHORT ROOT/SCARECROW and PLETHORA pathways. We further show that these networks are a powerful tool to integrate and analyze high-throughput expression data, as exemplified by our analysis of a SHORT ROOT induction time-course microarray dataset, and are a reliable source for the prediction of novel root gene functions. In particular, we used our database to predict novel genes involved in root secondary cell-wall synthesis and identified the MADS-box TF XAL1/AGL12 as an unexpected participant in this process. Conclusions This study demonstrates that network inference using carefully curated microarray data yields reliable TRN architectures. In contrast to previous efforts to obtain root TRNs, that have focused on particular functional modules or tissues, our root transcriptional interactions provide an overview of the transcriptional pathways present in Arabidopsis thaliana roots and will likely yield a plethora of novel hypotheses to be tested experimentally. PMID:24739361
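
    ARACNe scores candidate regulatory edges by the mutual information between expression profiles. That scoring step can be sketched with scikit-learn's nearest-neighbour MI estimator, as below; the expression values are invented, and ARACNe's data-processing-inequality pruning is omitted.

        import numpy as np
        from sklearn.feature_selection import mutual_info_regression

        # Invented expression profiles across ten microarray samples.
        rng = np.random.default_rng(1)
        tf = rng.normal(size=10)                   # a transcription factor
        candidates = np.column_stack([
            2.0 * tf + 0.1 * rng.normal(size=10),  # strongly dependent target
            rng.normal(size=10),                   # unrelated gene
        ])

        # Mutual information between the TF and each candidate target.
        mi = mutual_info_regression(candidates, tf, random_state=0)
        print(dict(zip(["targetA", "targetB"], mi)))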

  4. Lunar Samples: Apollo Collection Tools, Curation Handling, Surveyor III and Soviet Luna Samples

    NASA Technical Reports Server (NTRS)

    Allton, J.H.

    2009-01-01

    The 6 Apollo missions that landed on the lunar surface returned 2196 samples totaling 382 kg. The 58 samples weighing 21.5 kg collected on Apollo 11 expanded to 741 samples weighing 110.5 kg by the time of Apollo 17. The main goal on Apollo 11 was to obtain some material and return it safely to Earth. As we gained experience, the sampling tools and a more specific sampling strategy evolved. A summary of the sample types returned is shown in Table 1. By year 1989, some statistics on allocation by sample type were compiled [2]. The "scientific interest index" is based on the assumption that the more allocations per gram of sample, the higher the scientific interest. It is basically a reflection of the amount of diversity within a given sample type. Samples were also set aside for biohazard testing. The samples set aside and used for biohazard testing were representative, as opposed to diverse. They tended to be larger and to be comprised of less scientifically valuable material, such as dust and debris in the bottom of sample containers.

  5. The BioC-BioGRID corpus: full text articles annotated for curation of protein–protein and genetic interactions

    PubMed Central

    Kim, Sun; Chatr-aryamontri, Andrew; Chang, Christie S.; Oughtred, Rose; Rust, Jennifer; Wilbur, W. John; Comeau, Donald C.; Dolinski, Kara; Tyers, Mike

    2017-01-01

    A great deal of information on the molecular genetics and biochemistry of model organisms has been reported in the scientific literature. However, this data is typically described in free text form and is not readily amenable to computational analyses. To this end, the BioGRID database systematically curates the biomedical literature for genetic and protein interaction data. This data is provided in a standardized computationally tractable format and includes structured annotation of experimental evidence. BioGRID curation necessarily involves substantial human effort by expert curators who must read each publication to extract the relevant information. Computational text-mining methods offer the potential to augment and accelerate manual curation. To facilitate the development of practical text-mining strategies, a new challenge was organized in BioCreative V for the BioC task, the collaborative Biocurator Assistant Task. This was a non-competitive, cooperative task in which the participants worked together to build BioC-compatible modules into an integrated pipeline to assist BioGRID curators. As an integral part of this task, a test collection of full text articles was developed that contained both biological entity annotations (gene/protein and organism/species) and molecular interaction annotations (protein–protein and genetic interactions (PPIs and GIs)). This collection, which we call the BioC-BioGRID corpus, was annotated by four BioGRID curators over three rounds of annotation and contains 120 full text articles curated in a dataset representing two major model organisms, namely budding yeast and human. The BioC-BioGRID corpus contains annotations for 6409 mentions of genes and their Entrez Gene IDs, 186 mentions of organism names and their NCBI Taxonomy IDs, 1867 mentions of PPIs and 701 annotations of PPI experimental evidence statements, 856 mentions of GIs and 399 annotations of GI evidence statements. The purpose, characteristics and possible future uses of the BioC-BioGRID corpus are detailed in this report. Database URL: http://bioc.sourceforge.net/BioC-BioGRID.html PMID:28077563

  6. neXtA5: accelerating annotation of articles via automated approaches in neXtProt.

    PubMed

    Mottin, Luc; Gobeill, Julien; Pasche, Emilie; Michel, Pierre-André; Cusin, Isabelle; Gaudet, Pascale; Ruch, Patrick

    2016-01-01

    The rapid increase in the number of published articles poses a challenge for curated databases to remain up-to-date. To help the scientific community and database curators deal with this issue, we have developed an application, neXtA5, which prioritizes the literature for specific curation requirements. Our system, neXtA5, is a curation service composed of three main elements. The first component is a named-entity recognition module, which annotates MEDLINE over some predefined axes. This report focuses on three axes: Diseases, the Molecular Function and Biological Process sub-ontologies of the Gene Ontology (GO). The automatic annotations are then stored in a local database, BioMed, for each annotation axis. Additional entities such as species and chemical compounds are also identified. The second component is an existing search engine, which retrieves the most relevant MEDLINE records for any given query. The third component uses the content of BioMed to generate an axis-specific ranking, which takes into account the density of named-entities as stored in the Biomed database. The two ranked lists are ultimately merged using a linear combination, which has been specifically tuned to support the annotation of each axis. The fine-tuning of the coefficients is formally reported for each axis-driven search. Compared with PubMed, which is the system used by most curators, the improvement is the following: +231% for Diseases, +236% for Molecular Functions and +3153% for Biological Process when measuring the precision of the top-returned PMID (P0 or mean reciprocal rank). The current search methods significantly improve the search effectiveness of curators for three important curation axes. Further experiments are being performed to extend the curation types, in particular protein-protein interactions, which require specific relationship extraction capabilities. In parallel, user-friendly interfaces powered with a set of JSON web services are currently being implemented into the neXtProt annotation pipeline. Available on: http://babar.unige.ch:8082/neXtA5. Database URL: http://babar.unige.ch:8082/neXtA5/fetcher.jsp. © The Author(s) 2016. Published by Oxford University Press.
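
    The merging step described above (combining a relevance ranking with an entity-density ranking through a tuned linear combination) can be sketched in a few lines of plain Python; the weight and scores below are invented.

        # Hypothetical normalised scores for five PMIDs from the two components.
        search_score  = {"p1": 0.91, "p2": 0.72, "p3": 0.55, "p4": 0.48, "p5": 0.30}
        density_score = {"p1": 0.20, "p2": 0.85, "p3": 0.90, "p4": 0.10, "p5": 0.75}

        ALPHA = 0.6   # axis-specific weight, tuned separately per curation axis

        combined = {pmid: ALPHA * search_score[pmid]
                          + (1 - ALPHA) * density_score[pmid]
                    for pmid in search_score}

        for pmid, score in sorted(combined.items(), key=lambda kv: -kv[1]):
            print(pmid, round(score, 3))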

  7. Variations in Receipt of Curative-Intent Surgery for Early-Stage Non-Small Cell Lung Cancer (NSCLC) by State.

    PubMed

    Sineshaw, Helmneh M; Wu, Xiao-Cheng; Flanders, W Dana; Osarogiagbon, Raymond Uyiosa; Jemal, Ahmedin

    2016-06-01

    Previous studies reported racial and socioeconomic disparities in receipt of curative-intent surgery for early-stage non-small cell lung cancer (NSCLC) in the United States. We examined variation in receipt of surgery and whether the racial disparity varies by state. Patients in whom stage I or II NSCLC was diagnosed from 2007 to 2011 were identified from 38 state and the District of Columbia population-based cancer registries compiled by the North American Association of Central Cancer Registries. The percentage of patients receiving curative-intent surgery was calculated for each registry. Adjusted risk ratios were generated by using modified Poisson regression to control for sociodemographic (e.g., age, sex, race, insurance) and clinical (e.g., grade, stage) factors. Non-Hispanic (NH) whites and Massachusetts were used as references for comparisons because they had the lowest uninsured rates. In all registries combined, 66.4% of patients with early-stage NSCLC (73,475 of 110,711) received curative-intent surgery. Receipt of curative-intent surgery for early-stage NSCLC varied substantially by state, ranging from 52.2% to 56.1% in Wyoming, Louisiana, and New Mexico to 75.2% to 77.2% in Massachusetts, New Jersey, and Utah. In a multivariable analysis, the likelihood of receiving curative-intent surgery was significantly lower in all but nine states/registries compared with Massachusetts, ranging from 7% lower in California to 25% lower in Wyoming. Receipt of curative-intent surgery for early-stage NSCLC was lower for NH blacks than for NH whites in every state, although only statistically significant in Florida and Texas. Receipt of curative-intent surgery for early-stage NSCLC varies substantially across states in the United States, with northeastern states generally showing the highest rates. Further, receipt of treatment appeared to be lower in NH blacks than in NH whites in every state, although only statistically significant in Florida and Texas. Copyright © 2016 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.
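
    "Modified Poisson regression" here denotes a Poisson GLM with a log link and robust (sandwich) standard errors, used to estimate adjusted risk ratios for a binary outcome. A minimal sketch with statsmodels on invented data; exponentiated coefficients are the adjusted risk ratios relative to the Massachusetts reference.

        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Invented registry extract: receipt of curative-intent surgery,
        # state of diagnosis and age.
        df = pd.DataFrame({
            "surgery": [1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0],
            "state":   ["MA", "MA", "MA", "MA", "WY", "WY",
                        "WY", "WY", "FL", "FL", "FL", "FL"],
            "age":     [66, 72, 80, 59, 75, 82, 63, 70, 77, 61, 69, 84],
        })

        # Poisson GLM with robust (HC0) errors; exp(coef) is a risk ratio.
        model = smf.glm("surgery ~ C(state, Treatment(reference='MA')) + age",
                        data=df, family=sm.families.Poisson())
        print(model.fit(cov_type="HC0").summary())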

  8. Advancing the application of systems thinking in health: why cure crowds out prevention.

    PubMed

    Bishai, David; Paina, Ligia; Li, Qingfeng; Peters, David H; Hyder, Adnan A

    2014-06-16

    This paper presents a system dynamics computer simulation model to illustrate unintended consequences of apparently rational allocations to curative and preventive services. A modeled population is subject to only two diseases. Disease A is a curable disease that can be shortened by curative care. Disease B is an instantly fatal but preventable disease. Curative care workers are financed by public spending and private fees to cure disease A. Non-personal, preventive services are delivered by public health workers supported solely by public spending to prevent disease B. Each type of worker tries to tilt the balance of government spending towards their interests. Their influence on the government is proportional to their accumulated revenue. The model demonstrates effects on lost disability-adjusted life years and costs over the course of several epidemics of each disease. Policy interventions are tested including: i) an outside donor rationally donates extra money to each type of disease precisely in proportion to the size of epidemics of each disease; ii) lobbying is eliminated; iii) fees for personal health services are eliminated; iv) the government continually rebalances the funding for prevention by ring-fencing it to protect it from lobbying. The model exhibits a "spend more get less" equilibrium in which higher revenue by the curative sector is used to influence government allocations away from prevention towards cure. Spending more on curing disease A leads paradoxically to a higher overall disease burden of unprevented cases of disease B. This paradoxical behavior of the model can be stopped by eliminating lobbying, eliminating fees for curative services, and ring-fencing public health funding. We have created an artificial system as a laboratory to gain insights about the trade-offs between curative and preventive health allocations, and the effect of indicative policy interventions. The underlying dynamics of this artificial system resemble features of modern health systems where a self-perpetuating industry has grown up around disease-specific curative programs like HIV/AIDS or malaria. The model shows how the growth of curative care services can crowd out both fiscal and policy space for the practice of population-level prevention work, requiring dramatic interventions to overcome these trends.
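
    The feedback loop at the heart of the model can be re-implemented as a heavily simplified toy simulation: curative revenue buys lobbying influence, influence tilts the public budget away from prevention, and the burden of the preventable disease grows. The parameters and functional forms below are our own illustrative choices, not the paper's.

        # Toy system-dynamics loop for the "spend more get less" equilibrium.
        budget = 100.0
        curative_revenue = 10.0     # accumulated revenue = lobbying power
        prevention_power = 10.0     # public health sector's fixed influence
        burden_b = []               # deaths from preventable disease B

        for year in range(30):
            # Budget allocation tilts in proportion to accumulated influence.
            curative_share = curative_revenue / (curative_revenue + prevention_power)
            prevention_spend = budget * (1 - curative_share)

            # Preventable deaths rise as prevention spending falls.
            burden_b.append(max(0.0, 50.0 - 0.8 * prevention_spend))

            # The curative sector accumulates public money plus private fees.
            curative_revenue += budget * curative_share * 0.1 + 5.0

        print([round(d, 1) for d in burden_b[::5]])   # burden drifts upward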

  9. A statistical approach to identify, monitor, and manage incomplete curated data sets.

    PubMed

    Howe, Douglas G

    2018-04-02

    Many biological knowledge bases gather data through expert curation of published literature. High data volume, selective partial curation, delays in access, and publication of data prior to the ability to curate it can result in incomplete curation of published data. Knowing which data sets are incomplete and how incomplete they are remains a challenge. Awareness that a data set may be incomplete is important for proper interpretation, to avoid flawed hypothesis generation, and can justify further exploration of published literature for additional relevant data. Computational methods to assess data set completeness are needed. One such method is presented here. In this work, a multivariate linear regression model was used to identify genes in the Zebrafish Information Network (ZFIN) Database having incomplete curated gene expression data sets. Starting with 36,655 gene records from ZFIN, data aggregation, cleansing, and filtering reduced the set to 9870 gene records suitable for training and testing the model to predict the number of expression experiments per gene. Feature engineering and selection identified the following predictive variables: the number of journal publications; the number of journal publications already attributed for gene expression annotation; the percent of journal publications already attributed for expression data; the gene symbol; and the number of transgenic constructs associated with each gene. Twenty-five percent of the gene records (2483 genes) were used to train the model. The remaining 7387 genes were used to test the model. Of the 7387 tested genes, 122 and 165 were identified as missing expression annotations based on their residuals falling outside the model's lower or upper 95% confidence interval, respectively. The model had precision of 0.97 and recall of 0.71 at the negative 95% confidence interval and precision of 0.76 and recall of 0.73 at the positive 95% confidence interval. This method can be used to identify data sets that are incompletely curated, as demonstrated using the gene expression data set from ZFIN. This information can help both database resources and data consumers gauge when it may be useful to look further for published data to augment the existing expertly curated information.
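
    The residual-based flagging described above can be sketched with an ordinary least squares fit and 95% prediction intervals in statsmodels: genes whose observed annotation counts fall below the interval become candidates for incomplete curation. The data and single predictor below are invented; the real model used several engineered features.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Invented gene records: publication counts and curated expression
        # experiment counts.
        rng = np.random.default_rng(3)
        pubs = rng.integers(1, 200, size=200)
        expts = 0.4 * pubs + rng.normal(0, 4, size=200)
        df = pd.DataFrame({"pubs": pubs, "expts": expts})

        fit = smf.ols("expts ~ pubs", data=df).fit()

        # 95% prediction interval for each gene's expected experiment count.
        frame = fit.get_prediction(df).summary_frame(alpha=0.05)
        flagged = df[df["expts"] < frame["obs_ci_lower"]]
        print(len(flagged), "genes flagged as possibly under-curated")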

  10. neXtA5: accelerating annotation of articles via automated approaches in neXtProt

    PubMed Central

    Mottin, Luc; Gobeill, Julien; Pasche, Emilie; Michel, Pierre-André; Cusin, Isabelle; Gaudet, Pascale; Ruch, Patrick

    2016-01-01

    The rapid increase in the number of published articles poses a challenge for curated databases to remain up-to-date. To help the scientific community and database curators deal with this issue, we have developed an application, neXtA5, which prioritizes the literature for specific curation requirements. Our system, neXtA5, is a curation service composed of three main elements. The first component is a named-entity recognition module, which annotates MEDLINE over some predefined axes. This report focuses on three axes: Diseases, the Molecular Function and Biological Process sub-ontologies of the Gene Ontology (GO). The automatic annotations are then stored in a local database, BioMed, for each annotation axis. Additional entities such as species and chemical compounds are also identified. The second component is an existing search engine, which retrieves the most relevant MEDLINE records for any given query. The third component uses the content of BioMed to generate an axis-specific ranking, which takes into account the density of named-entities as stored in the Biomed database. The two ranked lists are ultimately merged using a linear combination, which has been specifically tuned to support the annotation of each axis. The fine-tuning of the coefficients is formally reported for each axis-driven search. Compared with PubMed, which is the system used by most curators, the improvement is the following: +231% for Diseases, +236% for Molecular Functions and +3153% for Biological Process when measuring the precision of the top-returned PMID (P0 or mean reciprocal rank). The current search methods significantly improve the search effectiveness of curators for three important curation axes. Further experiments are being performed to extend the curation types, in particular protein–protein interactions, which require specific relationship extraction capabilities. In parallel, user-friendly interfaces powered with a set of JSON web services are currently being implemented into the neXtProt annotation pipeline. Available on: http://babar.unige.ch:8082/neXtA5 Database URL: http://babar.unige.ch:8082/neXtA5/fetcher.jsp PMID:27374119

  11. OSIRIS-REx Asteroid Sample Return Mission

    NASA Technical Reports Server (NTRS)

    Nakamura-Messenger, Keiko; Connolly, Harold C. Jr.; Messenger, Scott; Lauretta, Dante S.

    2017-01-01

    OSIRIS-REx is NASA's third New Frontiers Program mission, following New Horizons, which completed a flyby of Pluto in 2015, and the Juno mission to Jupiter, which has just begun science operations. The OSIRIS-REx mission's primary objective is to collect pristine surface samples of a carbonaceous asteroid and return them to Earth for analysis. Carbonaceous asteroids and comets are 'primitive' bodies that preserve remnants of the Solar System's starting materials, and through their study scientists can learn about the origin and earliest evolution of the Solar System. The OSIRIS-REx spacecraft was successfully launched on September 8, 2016, beginning its seven-year journey to asteroid 101955 Bennu. The robotic arm will collect 60-2000 grams of material from the surface of Bennu, which will return to Earth in 2023 for worldwide distribution by the Astromaterials Curation Facility at NASA Johnson Space Center. The name OSIRIS-REx embodies the mission objectives: (1) Origins: Return and analyze a sample of a carbonaceous asteroid; (2) Spectral Interpretation: Provide ground-truth for remote observation of asteroids; (3) Resource Identification: Determine the mineral and chemical makeup of a near-Earth asteroid; (4) Security: Measure the non-gravitational forces that change asteroid orbits; and (5) Regolith Explorer: Determine the properties of the material covering an asteroid's surface. Asteroid Bennu may preserve remnants of stardust, interstellar materials, the first solids to form in the Solar System, and the molecular precursors to the origin of life and the Earth's oceans. Bennu is a potentially hazardous asteroid, with an approximately 1 in 2700 chance of impacting the Earth late in the 22nd century. What OSIRIS-REx collects and learns at Bennu will also help formulate the types of operations and mission activities that astronauts will perform during future expeditions. Such information is crucial in preparing for humanity's next steps beyond low Earth orbit and on to deep space destinations.

  12. MetaRNA-Seq: An Interactive Tool to Browse and Annotate Metadata from RNA-Seq Studies.

    PubMed

    Kumar, Pankaj; Halama, Anna; Hayat, Shahina; Billing, Anja M; Gupta, Manish; Yousri, Noha A; Smith, Gregory M; Suhre, Karsten

    2015-01-01

    The number of RNA-Seq studies has grown in recent years. The design of RNA-Seq studies varies from very simple (e.g., two-condition case-control) to very complicated (e.g., time series involving multiple samples at each time point with separate drug treatments). Most of these publicly available RNA-Seq studies are deposited in NCBI databases, but their metadata are scattered throughout four different databases: Sequence Read Archive (SRA), BioSample, BioProject, and Gene Expression Omnibus (GEO). Although the NCBI web interface is able to provide all of the metadata information, it often requires significant effort to retrieve study- or project-level information by traversing multiple hyperlinks and visiting additional pages. Moreover, project- and study-level metadata lack manual or automatic curation by categories, such as disease type, time series, case-control, or replicate type, which are vital to comprehending any RNA-Seq study. Here we describe "MetaRNA-Seq," a new tool for interactively browsing, searching, and annotating RNA-Seq metadata with the capability of semiautomatic curation at the study level.
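
    Because the NCBI E-utilities expose the cross-database links mentioned above, a small script can already collect linked records programmatically. The sketch below uses the public elink endpoint to hop from GEO DataSets (gds) to SRA; the example UID is made up, and this is not the MetaRNA-Seq tool itself.

```python
# Hedged sketch: follow NCBI cross-database links with the E-utilities elink API.
import json
import urllib.parse
import urllib.request

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi"

def linked_uids(dbfrom, db, uid):
    """Return UIDs in `db` that NCBI links to `uid` in `dbfrom`."""
    params = urllib.parse.urlencode(
        {"dbfrom": dbfrom, "db": db, "id": uid, "retmode": "json"}
    )
    with urllib.request.urlopen(f"{EUTILS}?{params}") as resp:
        payload = json.load(resp)
    links = []
    for linkset in payload.get("linksets", []):
        for linksetdb in linkset.get("linksetdbs", []):
            links.extend(linksetdb.get("links", []))
    return links

# e.g. find SRA records linked to a (hypothetical) GEO DataSets UID
print(linked_uids("gds", "sra", "200000000"))
```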

  13. A framework for organizing cancer-related variations from existing databases, publications and NGS data using a High-performance Integrated Virtual Environment (HIVE).

    PubMed

    Wu, Tsung-Jung; Shamsaddini, Amirhossein; Pan, Yang; Smith, Krista; Crichton, Daniel J; Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    Years of sequence feature curation by UniProtKB/Swiss-Prot, PIR-PSD, NCBI-CDD, RefSeq and other database biocurators have led to a rich repository of information on functional sites of genes and proteins. This information, along with variation-related annotation, can be used to scan human short sequence reads from next-generation sequencing (NGS) pipelines for the presence of non-synonymous single-nucleotide variations (nsSNVs) that affect functional sites. This and similar workflows are becoming more important because thousands of NGS data sets are being made available through projects such as The Cancer Genome Atlas (TCGA), and researchers want to evaluate their biomarkers in genomic data. BioMuta, an integrated sequence feature database, provides a framework for automated and manual curation and integration of cancer-related sequence features so that they can be used in NGS analysis pipelines. Sequence feature information in BioMuta is collected from the Catalogue of Somatic Mutations in Cancer (COSMIC), ClinVar, UniProtKB and through biocuration of information available from publications. Additionally, nsSNVs identified through automated analysis of NGS data from TCGA are also included in the database. Because of the petabytes of data and information present in NGS primary repositories, a platform, HIVE (High-performance Integrated Virtual Environment), for storing, analyzing, computing and curating NGS data and associated metadata has been developed. Using HIVE, 31 979 nsSNVs were identified in TCGA-derived NGS data from breast cancer patients. All variations identified through this process are stored in a Curated Short Read archive, and the nsSNVs from the tumor samples are included in BioMuta. Currently, BioMuta has 26 cancer types with 13 896 small-scale and 308 986 large-scale study-derived variations. Integration of variation data allows identification of novel or common nsSNVs that can be prioritized in validation studies. Database URL: BioMuta: http://hive.biochemistry.gwu.edu/tools/biomuta/index.php; CSR: http://hive.biochemistry.gwu.edu/dna.cgi?cmd=csr; HIVE: http://hive.biochemistry.gwu.edu.
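
    The core scanning idea, checking whether variant positions fall within curated functional-site intervals, can be sketched in a few lines. The gene, coordinates and site labels below are invented for illustration and do not come from BioMuta.

```python
# Toy functional-site lookup: flag variants that land inside annotated intervals.
functional_sites = {
    "TP53": [(120, 125, "DNA-binding residue"), (175, 175, "mutation hotspot")],
}

def sites_hit(gene, variant_positions):
    """Return (position, site description) pairs for variants in annotated sites."""
    hits = []
    for pos in variant_positions:
        for start, end, label in functional_sites.get(gene, []):
            if start <= pos <= end:
                hits.append((pos, label))
    return hits

# -> [(122, 'DNA-binding residue'), (175, 'mutation hotspot')]
print(sites_hit("TP53", [122, 300, 175]))
```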

  14. Liverome: a curated database of liver cancer-related gene signatures with self-contained context information.

    PubMed

    Lee, Langho; Wang, Kai; Li, Gang; Xie, Zhi; Wang, Yuli; Xu, Jiangchun; Sun, Shaoxian; Pocalyko, David; Bhak, Jong; Kim, Chulhong; Lee, Kee-Ho; Jang, Ye Jin; Yeom, Young Il; Yoo, Hyang-Sook; Hwang, Seungwoo

    2011-11-30

    Hepatocellular carcinoma (HCC) is the fifth most common cancer worldwide. A number of molecular profiling studies have investigated the changes in gene and protein expression that are associated with various clinicopathological characteristics of HCC and generated a wealth of scattered information, usually in the form of gene signature tables. A database of the published HCC gene signatures would be useful to liver cancer researchers seeking to retrieve existing differential expression information on a candidate gene and to make comparisons between signatures for prioritization of common genes. A challenge in constructing such a database is that a direct import of the signatures as they appeared in articles would lead to a loss or ambiguity of their context information, which is essential for a correct biological interpretation of a gene's expression change. This challenge arises because the designation of compared sample groups is most often abbreviated, ad hoc, or even missing from published signature tables. Without manual curation, the context information becomes lost, leading to uninformative database contents. Although several databases of gene signatures are available, none of them contains signatures in an informative form or shows comprehensive coverage of liver cancer. We therefore constructed Liverome, a curated database of liver cancer-related gene signatures with self-contained context information. Liverome's data coverage is more than three times larger than that of any other signature database, consisting of 143 signatures taken from 98 HCC studies, mostly microarray and proteome, and involving 6,927 genes. The signatures were post-processed into an informative and uniform representation and annotated with an itemized summary so that all context information is unambiguously self-contained within the database. The signatures were further informatively named and meaningfully organized according to ten functional categories for guided browsing. Its web interface enables straightforward retrieval of known differential expression information on a query gene and comparison of signatures to prioritize common genes. The utility of Liverome-collected data is shown by case studies in which useful biological insights on HCC are produced. The Liverome database provides a comprehensive collection of well-curated HCC gene signatures and straightforward interfaces for gene search and signature comparison as well. Liverome is available at http://liverome.kobic.re.kr.
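
    A minimal sketch of the signature-comparison use case is shown below: counting how many curated signatures contain each gene in order to prioritize common genes. The signature names and gene sets are fabricated, not Liverome contents.

```python
# Prioritize genes by how many curated signatures contain them.
from collections import Counter

signatures = {
    "sig_early_HCC_up": {"AFP", "GPC3", "SPINK1"},
    "sig_vascular_invasion_up": {"GPC3", "MDK", "SPINK1"},
    "sig_poor_survival_up": {"SPINK1", "AFP"},
}

gene_counts = Counter(g for genes in signatures.values() for g in genes)
for gene, n in gene_counts.most_common():
    print(f"{gene}: appears in {n} signatures")
```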

  15. Identification of New Lithic Clasts in Lunar Breccia 14305 by Micro-CT and Micro-XRF Analysis

    NASA Technical Reports Server (NTRS)

    Zeigler, Ryan A.; Carpenter, Paul K.; Jolliff, Bradley L.

    2014-01-01

    From 1969 to 1972, Apollo astronauts collected 382 kg of rocks, soils, and core samples from six locations on the surface of the Moon. The samples were initially characterized, largely by binocular examination, in a custom-built facility at Johnson Space Center (JSC), and the samples have been curated at JSC ever since. Despite over 40 years of study, demand for samples remains high (500 subsamples per year are allocated to scientists around the world), particularly for plutonic (e.g., anorthosites, norites) and evolved (e.g., granites, KREEP basalts) lithologies. The reason for the prolonged interest is that as new scientists and new techniques examine the samples, our understanding of how the Moon, Earth, and other inner Solar System bodies formed and evolved continues to grow. Scientists continually clamor for new samples to test their emerging hypotheses. Although all of the large Apollo samples that are igneous rocks have been classified, many Apollo samples are complex polymict breccias that have previously yielded large (cm-sized) igneous clasts. In this work we present initial efforts to use the non-destructive techniques of micro-computed tomography (micro-CT) and micro X-ray fluorescence (micro-XRF) to identify large lithic clasts in Apollo 14 polymict breccia sample 14305. The sample in this study is 14305,483, a 150 g slab of regolith breccia 14305 measuring 10x6x2 cm (Figure 1a). The sample was scanned at the University of Texas High-Resolution X-ray CT Facility on an Xradia MicroXCT scanner. Two adjacent overlapping volumes were acquired at 49.2 micrometer resolution and stitched together, resulting in 1766 slices. Each volume was acquired at 100 kV accelerating voltage and 98 mA beam current with a 1 mm CaF2 filter, with 2161 views gathered over 360deg at 3 seconds acquisition time per view. Micro-XRF analyses were done at Washington University in St. Louis, Missouri on an EDAX Orbis PC micro-XRF instrument. Multiple scans were made at 40 kV accelerating voltage, 800 mA beam current, 30 micrometer beam diameter, and a beam spacing of 30-120 micrometers. The micro-CT scan of 14305,483 (Figure 2) identified several large lithic clasts (approx. 1 cm) within the interior of the slab. These clasts will be exposed by band-sawing or chipping of the slab, and their compositions more fully characterized by subsequent micro-XRF analysis. In addition to lithic clasts, the micro-CT scans identified numerous mineral clasts, including many FeNi metal grains, as well as the prominent fractures within the slab. The micro-XRF analyses (Figure 1b,c) of the slab surfaces revealed the qualitative bulk chemical compositions of the different clast types observed. In particular, by looking at the ratios of major elements (e.g., Ca:Mg:Fe), differences among the many observed clast types are readily apparent. Moreover, several clasts not apparent to the naked eye were revealed in the K:Al:Si ratio map. The scans are also able to identify small grains of Zr- and P-rich minerals (not shown), which could in turn yield important ages for the samples.

  16. DEMO: ECOTOX Knowledgebase

    EPA Science Inventory

    The ECOTOXicology Knowledgebase (ECOTOX) is a comprehensive, curated database that summarizes toxicology data from single-chemical exposure studies on aquatic life, terrestrial plants, and wildlife. The ECOTOX Knowledgebase currently has curated data from over 47,000 references a...

  17. Ci4SeR--curation interface for semantic resources--evaluation with adverse drug reactions.

    PubMed

    Souvignet, Julien; Asfari, Hadyl; Declerck, Gunnar; Lardon, Jérémy; Trombert-Paviot, Béatrice; Jaulent, Marie-Christine; Bousquet, Cédric

    2014-01-01

    Evaluation and validation have become a crucial problem for the development of semantic resources. We developed Ci4SeR, a graphical user interface to optimize curation work (not taking into account structural aspects), suitable for any type of resource with lightweight description logic. We tested it on OntoADR, an ontology of adverse drug reactions. A single curator reviewed 326 terms (1020 axioms) in an estimated time of 120 hours (2.71 concepts and 8.5 axioms reviewed per hour) and added 1874 new axioms (15.6 axioms per hour). Compared with previous manual endeavours, the interface increases the rate of concept review by 68% and of axiom addition by 486%. A wider use of Ci4SeR would help the curation of semantic resources and improve the completeness of knowledge modelling.

  18. Text mining for the biocuration workflow

    PubMed Central

    Hirschman, Lynette; Burns, Gully A. P. C; Krallinger, Martin; Arighi, Cecilia; Cohen, K. Bretonnel; Valencia, Alfonso; Wu, Cathy H.; Chatr-Aryamontri, Andrew; Dowell, Karen G.; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G.

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on ‘Text Mining for the BioCuration Workflow’ at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community. PMID:22513129

  19. Text mining for the biocuration workflow.

    PubMed

    Hirschman, Lynette; Burns, Gully A P C; Krallinger, Martin; Arighi, Cecilia; Cohen, K Bretonnel; Valencia, Alfonso; Wu, Cathy H; Chatr-Aryamontri, Andrew; Dowell, Karen G; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on 'Text Mining for the BioCuration Workflow' at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community.

  20. Improving links between literature and biological data with text mining: a case study with GEO, PDB and MEDLINE.

    PubMed

    Névéol, Aurélie; Wilbur, W John; Lu, Zhiyong

    2012-01-01

    High-throughput experiments and bioinformatics techniques are creating an exploding volume of data that are becoming overwhelming to keep track of for biologists and researchers who need to access, analyze and process existing data. Much of the available data are being deposited in specialized databases, such as the Gene Expression Omnibus (GEO) for microarrays or the Protein Data Bank (PDB) for protein structures and coordinates. Data sets are also being described by their authors in publications archived in literature databases such as MEDLINE and PubMed Central. Currently, the curation of links between biological databases and the literature mainly relies on manual labour, which makes it a time-consuming and daunting task. Herein, we analysed the current state of link curation between GEO, PDB and MEDLINE. We found that the link curation is heterogeneous depending on the sources and databases involved, and that overlap between sources is low, <50% for PDB and GEO. Furthermore, we showed that text-mining tools can automatically provide valuable evidence to help curators broaden the scope of articles and database entries that they review. As a result, we made recommendations to improve the coverage of curated links, as well as the consistency of information available from different databases while maintaining high-quality curation. Database URLs: http://www.ncbi.nlm.nih.gov/PubMed, http://www.ncbi.nlm.nih.gov/geo/, http://www.rcsb.org/pdb/
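
    The overlap analysis itself reduces to set arithmetic over (database entry, article) link pairs. A toy version, with invented accession/PMID pairs rather than the study's actual data, might look like this:

```python
# Compare article-database links curated by two different sources.
links_from_geo = {("GSE0001", "20000001"), ("GSE0002", "20000002")}
links_from_medline = {("GSE0001", "20000001"), ("GSE0003", "20000003")}

shared = links_from_geo & links_from_medline
union = links_from_geo | links_from_medline
# Jaccard-style overlap between the two curated link sets.
print(f"overlap: {len(shared)}/{len(union)} = {len(shared) / len(union):.0%}")
```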

  1. Improving links between literature and biological data with text mining: a case study with GEO, PDB and MEDLINE

    PubMed Central

    Névéol, Aurélie; Wilbur, W. John; Lu, Zhiyong

    2012-01-01

    High-throughput experiments and bioinformatics techniques are creating an exploding volume of data that are becoming overwhelming to keep track of for biologists and researchers who need to access, analyze and process existing data. Much of the available data are being deposited in specialized databases, such as the Gene Expression Omnibus (GEO) for microarrays or the Protein Data Bank (PDB) for protein structures and coordinates. Data sets are also being described by their authors in publications archived in literature databases such as MEDLINE and PubMed Central. Currently, the curation of links between biological databases and the literature mainly relies on manual labour, which makes it a time-consuming and daunting task. Herein, we analysed the current state of link curation between GEO, PDB and MEDLINE. We found that the link curation is heterogeneous depending on the sources and databases involved, and that overlap between sources is low, <50% for PDB and GEO. Furthermore, we showed that text-mining tools can automatically provide valuable evidence to help curators broaden the scope of articles and database entries that they review. As a result, we made recommendations to improve the coverage of curated links, as well as the consistency of information available from different databases while maintaining high-quality curation. Database URLs: http://www.ncbi.nlm.nih.gov/PubMed, http://www.ncbi.nlm.nih.gov/geo/, http://www.rcsb.org/pdb/ PMID:22685160

  2. Transterm—extended search facilities and improved integration with other databases

    PubMed Central

    Jacobs, Grant H.; Stockwell, Peter A.; Tate, Warren P.; Brown, Chris M.

    2006-01-01

    Transterm has now been publicly available for >10 years. Major changes have been made since its last description in this database issue in 2002. The current database provides data for key regions of mRNA sequences, a curated database of mRNA motifs and tools to allow users to investigate their own motifs or mRNA sequences. The key mRNA regions database is derived computationally from Genbank. It contains 3′ and 5′ flanking regions, the initiation and termination signal context and coding sequence for annotated CDS features from Genbank and RefSeq. The database is non-redundant, enabling summary files and statistics to be prepared for each species. Advances include extended search facilities: the database may now be searched by BLAST in addition to regular expressions (patterns), allowing users to search for motifs such as known miRNA sequences; RefSeq data have also been included. The database contains >40 motifs or structural patterns important for translational control. In this release, patterns from UTRsite and Rfam are also incorporated with cross-referencing. Users may search their sequence data with Transterm or user-defined patterns. The system is accessible at . PMID:16381889

  3. The Apollo Lunar Sample Image Collection: Digital Archiving and Online Access

    NASA Technical Reports Server (NTRS)

    Todd, Nancy S.; Lofgren, Gary E.; Stefanov, William L.; Garcia, Patricia A.

    2014-01-01

    The primary goal of the Apollo Program was to land human beings on the Moon and bring them safely back to Earth. This goal was achieved during six missions - Apollo 11, 12, 14, 15, 16, and 17 - that took place between 1969 and 1972. Among the many noteworthy engineering and scientific accomplishments of these missions, perhaps the most important in terms of scientific impact was the return of 382 kg (842 lb.) of lunar rocks, core samples, pebbles, sand, and dust from the lunar surface to Earth. Returned samples were curated at JSC (then known as the Manned Spacecraft Center) and, as part of the original processing, high-quality photographs were taken of each sample. The top, bottom, and sides of each rock sample were photographed, along with 16 stereo image pairs taken at 45-degree intervals. Photographs were also taken whenever a sample was subdivided and when thin sections were made. This collection of lunar sample images consists of roughly 36,000 photographs; all six Apollo missions are represented.

  4. Favourable prognosis of cystadeno- over adenocarcinoma of the pancreas after curative resection.

    PubMed

    Ridder, G J; Maschek, H; Klempnauer, J

    1996-06-01

    This report details nine patients after curative surgical resection of histologically proven mucinous cystadenocarcinoma of the pancreas and compares the prognosis with ductal adenocarcinomas. Cystadenocarcinomas represented 2.1% (10/466) of a total of 466 patients who underwent surgical exploration and 5.5% of all curatively resected carcinomas of the exocrine pancreas at Hanover Medical School from 1971 to 1994. Forty percent of adenocarcinomas and 90% of cystadenocarcinomas were resectable. A curative R0 resection was possible in all patients with cystadenocarcinoma and 85% with adenocarcinoma. Six of the patients with cystadenocarcinoma were female and three were male. Their median age was 54 +/- 12 years (range: 44 to 81 years). Four cystic neoplasms were located in the head, one in the head and body, three in the tail, and one in the body and tail of the pancreas. There was no hospital mortality in this group. The prognosis after resection of cystadenocarcinomas was significantly better compared to ductal adenocarcinomas of the pancreas. The Kaplan-Meier survival was 89% vs 52% after 1 year, and 56% vs 13% at 5 years. Our results indicate the favourable prognosis of cystadeno- over ductal adenocarcinomas of the pancreas in a cohort of patients with curative tumour resection.

  5. Standards-based curation of a decade-old digital repository dataset of molecular information.

    PubMed

    Harvey, Matthew J; Mason, Nicholas J; McLean, Andrew; Murray-Rust, Peter; Rzepa, Henry S; Stewart, James J P

    2015-01-01

    The curation of 158,122 molecular geometries derived from the NCI set of reference molecules, together with associated properties computed using the MOPAC semi-empirical quantum mechanical method and originally deposited in 2005 into the Cambridge DSpace repository as a data collection, is reported. The procedures involved in the curation included annotation of the original data using new MOPAC methods, updating the syntax of the CML documents used to express the data to ensure schema conformance, and adding new metadata describing the entries, together with an XML schema transformation to map the metadata schema to that used by the DataCite organisation. We have adopted a granularity model in which a DataCite persistent identifier (DOI) is created for each individual molecule to enable data discovery and data metrics at this level using DataCite tools. We recommend that the future research data management (RDM) of the scientific and chemical data components associated with journal articles (the "supporting information") should be conducted in a manner that facilitates automatic periodic curation.
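
    A hedged sketch of the per-molecule granularity model follows: building one DataCite-style metadata record per molecule so each entry can receive its own DOI. The DOI prefix, field values and helper function are placeholders, not the actual repository records or the DataCite client.

```python
# Build one DataCite-style metadata record per molecule (illustrative fields only).
def datacite_record(molecule_id, inchi, year=2015):
    return {
        "identifier": {"identifierType": "DOI", "identifier": f"10.XXXX/{molecule_id}"},
        "titles": [{"title": f"MOPAC-optimised geometry for NCI molecule {molecule_id}"}],
        "publisher": "Institutional repository",
        "publicationYear": str(year),
        "resourceType": {"resourceTypeGeneral": "Dataset"},
        "descriptions": [{"description": inchi, "descriptionType": "TechnicalInfo"}],
    }

print(datacite_record("NCI_000042", "InChI=1S/CH4/h1H4"))
```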

  6. An automated curation procedure for addressing chemical errors and inconsistencies in public datasets used in QSAR modelling.

    PubMed

    Mansouri, K; Grulke, C M; Richard, A M; Judson, R S; Williams, A J

    2016-11-01

    The increasing availability of large collections of chemical structures and associated experimental data provides an opportunity to build robust QSAR models for applications in different fields. One common concern is the quality of both the chemical structure information and the associated experimental data. Here we describe the development of an automated KNIME workflow to curate and correct errors in the structure and identity of chemicals using the publicly available PHYSPROP physicochemical properties and environmental fate datasets. The workflow first assembles structure-identity pairs using up to four provided chemical identifiers, including chemical name, CASRNs, SMILES, and MolBlock. Problems detected included errors and mismatches in chemical structure formats and identifiers, and various structure validation issues, including hypervalency and stereochemistry descriptions. Subsequently, a machine learning procedure was applied to evaluate the impact of this curation process. The performance of QSAR models built on only the highest-quality subset of the original dataset was compared with that of models built on the larger curated and corrected dataset. The latter showed statistically improved predictive performance. The final workflow was used to curate the full list of PHYSPROP datasets, and is being made publicly available for further usage and integration by the scientific community.
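
    One representative structure-consistency check from such a workflow is verifying that a record's SMILES and MolBlock describe the same structure, for example by comparing InChIKeys with RDKit. The record below is invented, and this is an illustrative fragment rather than the published KNIME workflow.

```python
# Check SMILES/MolBlock agreement via InChIKeys (requires RDKit with InChI support).
from rdkit import Chem

record = {
    "name": "ethanol",
    "smiles": "CCO",
    "molblock": Chem.MolToMolBlock(Chem.MolFromSmiles("CCO")),
}

mol_from_smiles = Chem.MolFromSmiles(record["smiles"])
mol_from_block = Chem.MolFromMolBlock(record["molblock"])

if mol_from_smiles is None or mol_from_block is None:
    print("structure failed to parse -> flag for manual curation")
elif Chem.MolToInchiKey(mol_from_smiles) != Chem.MolToInchiKey(mol_from_block):
    print("SMILES/MolBlock mismatch -> flag for manual curation")
else:
    print("identifiers are consistent")
```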

  7. OntoBrowser: a collaborative tool for curation of ontologies by subject matter experts.

    PubMed

    Ravagli, Carlo; Pognan, Francois; Marc, Philippe

    2017-01-01

    The lack of controlled terminology and ontology usage leads to incomplete search results and poor interoperability between databases. One of the major underlying challenges of data integration is curating data to adhere to controlled terminologies and/or ontologies. Finding subject matter experts with the time and skills required to perform data curation is often problematic. In addition, existing tools are not designed for continuous data integration and collaborative curation. This results in time-consuming curation workflows that often become unsustainable. The primary objective of OntoBrowser is to provide an easy-to-use online collaborative solution for subject matter experts to map reported terms to preferred ontology (or code list) terms and facilitate ontology evolution. Additional features include web service access to data, visualization of ontologies in hierarchical/graph format and a peer review/approval workflow with alerting. The source code is freely available under the Apache v2.0 license. Source code and installation instructions are available at http://opensource.nibr.com. This software is designed to run on a Java EE application server and store data in a relational database. Contact: philippe.marc@novartis.com. © The Author 2016. Published by Oxford University Press.

  8. OntoBrowser: a collaborative tool for curation of ontologies by subject matter experts

    PubMed Central

    Ravagli, Carlo; Pognan, Francois

    2017-01-01

    Summary: The lack of controlled terminology and ontology usage leads to incomplete search results and poor interoperability between databases. One of the major underlying challenges of data integration is curating data to adhere to controlled terminologies and/or ontologies. Finding subject matter experts with the time and skills required to perform data curation is often problematic. In addition, existing tools are not designed for continuous data integration and collaborative curation. This results in time-consuming curation workflows that often become unsustainable. The primary objective of OntoBrowser is to provide an easy-to-use online collaborative solution for subject matter experts to map reported terms to preferred ontology (or code list) terms and facilitate ontology evolution. Additional features include web service access to data, visualization of ontologies in hierarchical/graph format and a peer review/approval workflow with alerting. Availability and implementation: The source code is freely available under the Apache v2.0 license. Source code and installation instructions are available at http://opensource.nibr.com. This software is designed to run on a Java EE application server and store data in a relational database. Contact: philippe.marc@novartis.com PMID:27605099

  9. Quantitative PCR monitoring of the effect of azoxystrobin treatments on Mycosphaerella graminicola epidemics in the field.

    PubMed

    Rohel, Eric A; Laurent, Paul; Fraaije, Bart A; Cavelier, Nadine; Hollomon, Derek W

    2002-03-01

    Quantitative PCR and visual monitoring of Mycosphaerella graminicola epidemics were performed to investigate the effect of curative and preventative applications of azoxystrobin in wheat field crops. A non-systemic protectant and a systemic curative fungicide, chlorothalonil and epoxiconazole, respectively, were used as references. PCR diagnosis detected leaf infection by M. graminicola 3 weeks before symptom appearance, thereby allowing a clear distinction between curative and preventative treatments. When applied 1 week after the beginning of infection, azoxystrobin curative activity was intermediate between chlorothalonil (low effect) and epoxiconazole. When applied preventatively, none of the fungicides completely prevented leaf infection. There was some indication that azoxystrobin preventative treatments may delay fungal DNA increase more than epoxiconazole at the beginning of leaf infection. Both curative and preventative treatments increased the time lapse between the earliest PCR detection and the measurement of a 10% necrotic leaf area. Azoxystrobin only slightly decreased the speed of necrotic area increase compared with epoxiconazole. Hence, azoxystrobin activity toward M. graminicola mainly resides in lengthening the time lapse between the earliest PCR detection and the measurement of a 10% necrotic leaf area. Information generated in this way is useful for optimal positioning of azoxystrobin treatments on M. graminicola.

  10. [Curative Effects of Hydroxyurea on the Patients with β-thalassaemia Intermedia].

    PubMed

    Huang, Li; Yao, Hong-Xia

    2016-06-01

    To investigate the clinical features of β-thalassaemia intermedia (TI) patients and the curative effect and side reactions of hydroxyurea therapy, twenty-nine patients with TI were divided into a hydroxyurea therapy group and a no-hydroxyurea therapy group; the curative effect and side reactions in the 2 groups were compared, and blood transfusion requirements in the 2 groups were evaluated. In the hydroxyurea therapy group, the hemoglobin level increased after treatment for 3 months; the reticulocyte percentage obviously decreased after treatment for 12 months; and the serum ferritin was maintained at a low level. In the no-hydroxyurea therapy group, the levels of hemoglobin and reticulocytes were not significantly improved after treatment, and the serum ferritin level gradually increased. In the hydroxyurea therapy group, 12 patients no longer required blood transfusion after treatment for 12 months, and the effective rate of treatment was 85.71%; in the no-hydroxyurea therapy group, blood transfusion dependency was not improved after treatment. No serious side reactions were found in any of the hydroxyurea-treated patients. Hydroxyurea shows a good curative effect in TI patients, and no serious side reactions occurred in the treated patients, but the long-term curative effect and side reactions should be observed continuously.

  11. R-Syst::diatom: an open-access and curated barcode database for diatoms and freshwater monitoring.

    PubMed

    Rimet, Frédéric; Chaumeil, Philippe; Keck, François; Kermarrec, Lenaïg; Vasselon, Valentin; Kahlert, Maria; Franc, Alain; Bouchez, Agnès

    2016-01-01

    Diatoms are micro-algal indicators of freshwater pollution. Current standardized methodologies are based on microscopic determinations, which are time-consuming and prone to identification uncertainties. The use of DNA-barcoding has been proposed as a way to avoid these flaws. Combining barcoding with next-generation sequencing enables collection of a large quantity of barcodes from natural samples. These barcodes are identified as certain diatom taxa by comparing the sequences to a reference barcoding library using algorithms. Proof of concept was recently demonstrated for synthetic and natural communities and underlined the importance of the quality of this reference library. We present an open-access and curated reference barcoding database for diatoms, called R-Syst::diatom, developed in the framework of R-Syst, the network of systematics supported by INRA (French National Institute for Agricultural Research); see http://www.rsyst.inra.fr/en. R-Syst::diatom links DNA barcodes to their taxonomical identifications and is dedicated to identifying barcodes from natural samples. The data come from two sources: a culture collection of freshwater algae maintained at INRA, in which new strains are regularly deposited and barcoded, and the NCBI (National Center for Biotechnology Information) nucleotide database. Two kinds of barcodes were chosen to support the database, 18S (18S ribosomal RNA) and rbcL (ribulose-1,5-bisphosphate carboxylase/oxygenase), because of their efficiency. Data are curated using innovative (Declic) and classical bioinformatic tools (BLAST, classical phylogenies) and up-to-date taxonomy (catalogues and peer-reviewed papers). R-Syst::diatom is updated every 6 months. The database is available through the R-Syst microalgae website (http://www.rsyst.inra.fr/) and a platform dedicated to next-generation sequencing data analysis, virtual_BiodiversityL@b (https://galaxy-pgtp.pierroton.inra.fr/). We present here the content of the library regarding the number of barcodes and diatom taxa. In addition to this information, morphological features (e.g. biovolumes, chloroplasts…), life-forms (mobility, colony-type) and ecological features (taxa preferenda to pollution) are indicated in R-Syst::diatom. Database URL: http://www.rsyst.inra.fr/. © The Author(s) 2016. Published by Oxford University Press.

  12. R-Syst::diatom: an open-access and curated barcode database for diatoms and freshwater monitoring

    PubMed Central

    Rimet, Frédéric; Chaumeil, Philippe; Keck, François; Kermarrec, Lenaïg; Vasselon, Valentin; Kahlert, Maria; Franc, Alain; Bouchez, Agnès

    2016-01-01

    Diatoms are micro-algal indicators of freshwater pollution. Current standardized methodologies are based on microscopic determinations, which are time-consuming and prone to identification uncertainties. The use of DNA-barcoding has been proposed as a way to avoid these flaws. Combining barcoding with next-generation sequencing enables collection of a large quantity of barcodes from natural samples. These barcodes are identified as certain diatom taxa by comparing the sequences to a reference barcoding library using algorithms. Proof of concept was recently demonstrated for synthetic and natural communities and underlined the importance of the quality of this reference library. We present an open-access and curated reference barcoding database for diatoms, called R-Syst::diatom, developed in the framework of R-Syst, the network of systematics supported by INRA (French National Institute for Agricultural Research); see http://www.rsyst.inra.fr/en. R-Syst::diatom links DNA barcodes to their taxonomical identifications and is dedicated to identifying barcodes from natural samples. The data come from two sources: a culture collection of freshwater algae maintained at INRA, in which new strains are regularly deposited and barcoded, and the NCBI (National Center for Biotechnology Information) nucleotide database. Two kinds of barcodes were chosen to support the database, 18S (18S ribosomal RNA) and rbcL (ribulose-1,5-bisphosphate carboxylase/oxygenase), because of their efficiency. Data are curated using innovative (Declic) and classical bioinformatic tools (BLAST, classical phylogenies) and up-to-date taxonomy (catalogues and peer-reviewed papers). R-Syst::diatom is updated every 6 months. The database is available through the R-Syst microalgae website (http://www.rsyst.inra.fr/) and a platform dedicated to next-generation sequencing data analysis, virtual_BiodiversityL@b (https://galaxy-pgtp.pierroton.inra.fr/). We present here the content of the library regarding the number of barcodes and diatom taxa. In addition to this information, morphological features (e.g. biovolumes, chloroplasts…), life-forms (mobility, colony-type) and ecological features (taxa preferenda to pollution) are indicated in R-Syst::diatom. Database URL: http://www.rsyst.inra.fr/ PMID:26989149

  13. CMR Metadata Curation

    NASA Technical Reports Server (NTRS)

    Shum, Dana; Bugbee, Kaylin

    2017-01-01

    This talk explains the ongoing metadata curation activities in the Common Metadata Repository. It explores tools that exist today which are useful for building quality metadata and also opens up the floor for discussions on other potentially useful tools.

  14. Contamination of the cold water distribution system of health care facilities by Legionella pneumophila: do we know the true dimension?

    PubMed

    Arvand, M; Jungkind, K; Hack, A

    2011-04-21

    German water guidelines do not recommend routine assessment of cold water for Legionella in healthcare facilities, except if the water temperature at distal sites exceeds 25°C. This study evaluates Legionella contamination in the cold and warm water supplies of healthcare facilities in Hesse, Germany, and analyses the relationship between cold water temperature and Legionella contamination. Samples were collected from four facilities with cases of healthcare-associated Legionnaires' disease or notable contamination of their water supply. Fifty-nine samples were from central lines and 625 from distal sites, comprising 316 cold and 309 warm water samples. Legionella was isolated from central lines in two facilities and from distal sites in four facilities. 17% of all central and 32% of all distal samples were contaminated. At distal sites, cold water samples were more frequently contaminated with Legionella (40% vs 23%, p<0.001) and with higher concentrations of Legionella (≥1,000 colony-forming units/100 ml) (16% vs 6%, p<0.001) than warm water samples. There was no clear correlation between the cold water temperature at sampling time and the contamination rate. 35% of cold water samples under 20°C at collection were contaminated. Our data highlight the importance of assessing the cold water supply of healthcare facilities for Legionella in the context of an intensified analysis.

  15. Science Support: The Building Blocks of Active Data Curation

    NASA Astrophysics Data System (ADS)

    Guillory, A.

    2013-12-01

    While the scientific method is built on reproducibility and transparency, and results are published in peer-reviewed literature, we have come to the digital age of very large datasets (now of the order of petabytes and soon exabytes) which cannot be published in the traditional way. To preserve reproducibility and transparency, active curation is necessary to keep and protect the information in the long term, and 'science support' activities provide the building blocks for active data curation. With the explosive growth of data in all fields in recent years, there is a pressing need for data centres to provide adequate services to ensure long-term preservation and digital curation of project data outputs, however complex those may be. Science support provides advice and support to science projects on data and information management, from file formats through to general data management awareness. Another purpose of science support is to raise awareness in the science community of data and metadata standards and best practice, engendering a culture where data outputs are seen as valued assets. At the heart of science support is the Data Management Plan (DMP), which sets out a coherent approach to data issues pertaining to the data-generating project. It provides an agreed record of the data management needs and issues within the project. The DMP is agreed upon with project investigators to ensure that a high-quality, documented data archive is created. It includes conditions of use and deposit to clearly express the ownership, responsibilities and rights associated with the data. Project-specific needs are also identified for data processing, visualization tools and data sharing services. As part of the National Centre for Atmospheric Science (NCAS) and National Centre for Earth Observation (NCEO), the Centre for Environmental Data Archival (CEDA) fulfills this science support role of facilitating atmospheric and Earth observation data-generating projects to ensure successful management of the data and accompanying information for reuse and repurposing. Specific examples at CEDA include science support provided to FAAM (Facility for Airborne Atmospheric Measurements) aircraft campaigns and large-scale modelling projects such as UPSCALE, the largest ever PRACE (Partnership for Advanced Computing in Europe) computational project, which depends on CEDA to provide the high-performance storage, transfer capability and data analysis environment on the 'super-data-cluster' JASMIN. The impact of science support on scientific research is conspicuous: better-documented datasets with a growing collection of metadata associated with the archived data, easier data sharing through the use of standard formats and metadata, and data citation. These establish high-quality data management, ensuring long-term preservation and enabling reuse by peer scientists, which ultimately leads to faster-paced progress in science.

  16. Prevalence of Clostridium difficile in uncooked ground meat products from Pittsburgh, Pennsylvania.

    PubMed

    Curry, Scott R; Marsh, Jane W; Schlackman, Jessica L; Harrison, Lee H

    2012-06-01

    The prevalence of Clostridium difficile in retail meat samples has varied widely. The food supply may be a source for C. difficile infections. A total of 102 ground meat and sausage samples from 3 grocers in Pittsburgh, PA, were cultured for C. difficile. Brand A pork sausages were resampled between May 2011 and January 2012. Two out of 102 (2.0%) meat products initially sampled were positive for C. difficile; both were pork sausage from brand A from the same processing facility (facility A). On subsequent sampling of brand A products, 10/19 samples from processing facility A and 1/10 samples from 3 other facilities were positive for C. difficile. The isolates recovered were inferred ribotype 078, comprising 6 genotypes. The prevalence of C. difficile in retail meat may not be as high as previously reported in North America. When contamination occurs, it may be related to events at processing facilities.

  17. Contamination Control and Evaluation for Manufacturing, Ground Tests, Flight Operation and Post-Retrieval Analyses of the TANPOPO Exposed Panels and Capture Panels

    NASA Astrophysics Data System (ADS)

    Yano, Hajime; Hashimoto, Hirofumi; Kawaguchi, Yuko; Yokobori, Shin-ichi; Uchihori, Yukio; Tabata, Makoto; Yamagishi, Akihiko; Sasaki, Satoshi; Imai, Eiichi

    The TANPOPO (“dandelion” in Japanese) is Japan’s first astrobiology space experiment to be exposed on and retrieved from the ISS-Kibo Exposed Facility in the 2014-5 timeframe. During 1-3 years of continuous exposure operation in low Earth orbit (LEO), it aims to test key questions of the “quasi-panspermia” hypothesis, a theory of an exogenous origin of life and the transport of life and its precursors among celestial bodies. The TANPOPO experiment consists of the following six sub-themes (ST): 1) the first intact capture of terrestrial microbial colonies in LEO, 2) survival tests of terrestrial microbes exposed long-term in LEO, 3) alteration tests of artificially composed “astronomical organic analogs” exposed long-term in LEO, 4) intact capture of organic-bearing micrometeoroids at the lowest peak temperature ever achieved in LEO, 5) space flight verification of the world’s lowest-density aerogels for intact capture of microparticles, and 6) meteoroid and orbital debris flux assessment that can only be measured in situ in LEO. Each will utilize one or more Capture Panel (CP) and Exposure Panel (EP) samples on various pointing faces of the Kibo Exposed Facility, i.e., the anti-Earth pointing face (Space), leading face (East) and anti-Pressurized Facility face (North), as the ISS is an Earth gravity gradient three-axis stabilized satellite. In order to satisfy both scientific requirements and planetary protection policy, contamination control and evaluation protocols are implemented for the whole process of manufacturing, ground tests, flight operation and post-retrieval initial analyses of both CPs and EPs. The CPs employ blocks of 0.01 g/cc ultra-low-density aerogel on their surfaces to intact-capture impacting solid microparticles such as organic-bearing micrometeoroids, artificial orbital debris and possible terrestrial aerosols temporarily lofted to LEO, for assessing the possibility of interplanetary transport of life and its precursors. By analyzing particles captured along the tracks formed inside the aerogels, we will learn what kinds of extraterrestrial organic compounds in their pristine states inside micrometeoroids can be transported to the Earth from primitive bodies and how they are altered in outer space. Also, if we discover microparticles of terrestrial origin, we can examine whether they represent aerosols embedding microbial colonies by DNA and other analytical techniques on the ground, in order to propose a yet-unknown mechanism for terrestrial life forms to be released, even temporarily, to outer space. Either case of “sample return missions from LEO” is compliant with the COSPAR planetary protection policy. The EPs will contain a number of different UV-resistant and other terrestrial extremophile microbes and astronomical organic analogs to be exposed in LEO under glass covers. In order to assess the synergistic effects of the space environmental factors properly, the EPs on each exposed face will simultaneously log peak temperatures, UV irradiation and cosmic ray radiation dosage with respective passive sensors, which will be visually recorded in orbit and evaluated in ground laboratories after retrieval. We will also keep identical blank samples inside the Kibo Pressurized Facility (PF) for the same duration as the TANPOPO exposure. These are compliant with both the COSPAR planetary protection policy and NASA human spaceflight safety regulations, while maintaining the scientific value of the samples under suitable contamination control measures.
TANPOPO’s Initial Sample Analysis and Curation (ISAC) is planned and will be conducted by its Preliminary Examination Team (PET). The ISAC plan for CPs covers the receipt of retrieved samples, their initial inspection and documentation, processing and distribution of the samples for the detailed analyses of all sub-themes, and cataloging for data archiving and sample storage. For initial inspection and documentation, the team will map and measure aerogel penetration tracks and captured particles (e.g., incoming angle, track depth and track volume). They will then process keystones containing the microparticles to be inspected further and their penetration tracks for allocation to the respective sub-theme researchers, in accordance with their requests for the subsequent detailed analyses.

  18. Textpresso Central: a customizable platform for searching, text mining, viewing, and curating biomedical literature.

    PubMed

    Müller, H-M; Van Auken, K M; Li, Y; Sternberg, P W

    2018-03-09

    The biomedical literature continues to grow at a rapid pace, making the challenge of knowledge retrieval and extraction ever greater. Tools that provide a means to search and mine the full text of literature thus represent an important way by which the efficiency of these processes can be improved. We describe the next generation of the Textpresso information retrieval system, Textpresso Central (TPC). TPC builds on the strengths of the original system by expanding the full text corpus to include the PubMed Central Open Access Subset (PMC OA), as well as the WormBase C. elegans bibliography. In addition, TPC allows users to create a customized corpus by uploading and processing documents of their choosing. TPC is UIMA compliant, to facilitate compatibility with external processing modules, and takes advantage of Lucene indexing and search technology for efficient handling of millions of full text documents. Like Textpresso, TPC searches can be performed using keywords and/or categories (semantically related groups of terms), but to provide better context for interpreting and validating queries, search results may now be viewed as highlighted passages in the context of full text. To facilitate biocuration efforts, TPC also allows users to select text spans from the full text and annotate them, create customized curation forms for any data type, and send resulting annotations to external curation databases. As an example of such a curation form, we describe integration of TPC with the Noctua curation tool developed by the Gene Ontology (GO) Consortium. Textpresso Central is an online literature search and curation platform that enables biocurators and biomedical researchers to search and mine the full text of literature by integrating keyword and category searches with viewing search results in the context of the full text. It also allows users to create customized curation interfaces, use those interfaces to make annotations linked to supporting evidence statements, and then send those annotations to any database in the world. Textpresso Central URL: http://www.textpresso.org/tpc.
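
    The combined keyword-and-category retrieval can be illustrated with a toy in-memory version: return only sentences that contain the query keyword and at least one term from a semantic category. The category vocabulary and two-sentence corpus are invented, and TPC itself uses Lucene indexing rather than this linear scan.

```python
# Toy keyword + category search over an in-memory corpus.
import re

categories = {"regulation": {"activates", "represses", "inhibits"}}
corpus = [
    "daf-16 activates stress-response genes.",
    "daf-16 is expressed in the intestine.",
]

def search(keyword, category):
    """Return sentences containing the keyword and any term of the category."""
    terms = categories[category]
    return [
        s for s in corpus
        if re.search(rf"\b{re.escape(keyword)}\b", s, re.I)
        and any(t in s.lower() for t in terms)
    ]

print(search("daf-16", "regulation"))  # -> first sentence only
```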

  19. Targeted journal curation as a method to improve data currency at the Comparative Toxicogenomics Database

    PubMed Central

    Davis, Allan Peter; Johnson, Robin J.; Lennon-Hopkins, Kelley; Sciaky, Daniela; Rosenstein, Michael C.; Wiegers, Thomas C.; Mattingly, Carolyn J.

    2012-01-01

    The Comparative Toxicogenomics Database (CTD) is a public resource that promotes understanding about the effects of environmental chemicals on human health. CTD biocurators read the scientific literature and manually curate a triad of chemical–gene, chemical–disease and gene–disease interactions. Typically, articles for CTD are selected using a chemical-centric approach by querying PubMed to retrieve a corpus containing the chemical of interest. Although this technique ensures adequate coverage of knowledge about the chemical (i.e. data completeness), it does not necessarily reflect the most current state of all toxicological research in the community at large (i.e. data currency). Keeping databases current with the most recent scientific results, as well as providing a rich historical background from legacy articles, is a challenging process. To address this issue of data currency, CTD designed and tested a journal-centric approach of curation to complement our chemical-centric method. We first identified priority journals based on defined criteria. Next, over 7 weeks, three biocurators reviewed 2425 articles from three consecutive years (2009–2011) of three targeted journals. From this corpus, 1252 articles contained relevant data for CTD and 52 752 interactions were manually curated. Here, we describe our journal selection process, two methods of document delivery for the biocurators and the analysis of the resulting curation metrics, including data currency, and both intra-journal and inter-journal comparisons of research topics. Based on our results, we expect that curation by select journals can (i) be easily incorporated into the curation pipeline to complement our chemical-centric approach; (ii) build content more evenly for chemicals, genes and diseases in CTD (rather than biasing data by chemicals-of-interest); (iii) reflect developing areas in environmental health and (iv) improve overall data currency for chemicals, genes and diseases. Database URL: http://ctdbase.org/ PMID:23221299

  20. Closing the loop: from paper to protein annotation using supervised Gene Ontology classification.

    PubMed

    Gobeill, Julien; Pasche, Emilie; Vishnyakova, Dina; Ruch, Patrick

    2014-01-01

    Gene function curation of the literature with Gene Ontology (GO) concepts is one particularly time-consuming task in genomics, and help from bioinformatics is in high demand to keep up with the flow of publications. In 2004, the first BioCreative challenge already designed a task of automatic GO concept assignment from a full text. At that time, results were judged far from reaching the performance required by real curation workflows. In particular, supervised approaches produced the most disappointing results because of a lack of training data. Ten years later, the available curation data have massively grown. In 2013, the BioCreative IV GO task revisited the automatic GO assignment task. For this challenge, we investigated the power of our supervised classifier, GOCat. GOCat computes similarities between an input text and already curated instances contained in a knowledge base to infer GO concepts. Subtask A consisted in selecting GO evidence sentences for a relevant gene in a full text. For this, we designed a state-of-the-art supervised statistical approach, using a naïve Bayes classifier and the official training set, and obtained fair results. Subtask B consisted in predicting GO concepts from the previous output. For this, we applied GOCat and reached leading results, up to 65% for hierarchical recall in the top 20 returned concepts. Contrary to previous competitions, machine learning this time outperformed standard dictionary-based approaches. Thanks to BioCreative IV, we were able to design a complete workflow for curation: given a gene name and a full text, this system is able to select evidence sentences for curation and to deliver highly relevant GO concepts. Observed performances are sufficient for use in a real semiautomatic curation workflow. GOCat is available at http://eagl.unige.ch/GOCat/. http://eagl.unige.ch/GOCat4FT/. © The Author(s) 2014. Published by Oxford University Press.
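
    Subtask A's evidence-sentence selection can be approximated with a few lines of scikit-learn, offered here only as a hedged sketch of a naïve Bayes text classifier; the training sentences and labels are fabricated, and this is not the authors' actual feature set.

```python
# Minimal naive Bayes evidence-sentence classifier (bag-of-words features).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

sentences = [
    "The protein localizes to the mitochondrion.",
    "Samples were incubated overnight at 4C.",
    "Kinase activity was measured in vitro.",
    "The authors thank the sequencing core.",
]
is_evidence = [1, 0, 1, 0]  # 1 = curatable GO evidence sentence

clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
clf.fit(sentences, is_evidence)
print(clf.predict(["Phosphatase activity was detected in cell extracts."]))
```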

  1. 46 CFR 162.050-15 - Designation of facilities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... This is the mean and standard deviation, respectively, of the differences between the known sample... sample analysis, and the materials necessary to perform the tests; (2) Each facility test rig must be of... facilities. (a) Each request for designation as a facility authorized to perform approval tests must be...

  2. 77 FR 58313 - Revisions to the California State Implementation Plan, San Diego County, Antelope Valley and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-20

    ..., closures and coils, from graphic arts operations, from the provision of sampling and testing facilities... Provision of Sampling and Revised 03/21/01....... 05/31/01 Testing Facilities. AVAQMD 1168 Adhesive and... District (1) Rule 205, ``Provision of Sampling and Testing Facilities,'' revised on March 21, 2001...

  3. University of Maryland MRSEC - Facilities: SEM/STM/AFM

    Science.gov Websites

    Shared Experimental Facilities: SEM/STM/AFM instrumentation for conducting and non-conducting samples. The sample stage permits electronic device imaging under operational conditions. Specifications: image modes STM, STS, MFM, EFM, SKPM, and contact- and non-contact AFM; three sample contacts; 0.1 nm.

  4. Comparative Studies of the Proteome, Glycoproteome, and N-Glycome of Clear Cell Renal Cell Carcinoma Plasma before and after Curative Nephrectomy

    PubMed Central

    2015-01-01

    Clear cell renal cell carcinoma is the most prevalent of all reported kidney cancer cases, and currently there are no markers for early diagnosis. This has stimulated great research interest recently because early detection of the disease can significantly improve the low survival rate. Combining the proteome, glycoproteome, and N-glycome data from clear cell renal cell carcinoma plasma has the potential to identify candidate markers for early diagnosis and prognosis and/or to monitor disease recurrence. Here, we report on the utilization of a multi-dimensional fractionation approach (12P-M-LAC) and LC–MS/MS to comprehensively investigate clear cell renal cell carcinoma plasma collected before (disease) and after (non-disease) curative nephrectomy (n = 40). Proteins detected in the subproteomes were investigated via label-free quantification. Protein abundance analysis revealed a number of low-level proteins with significantly different expression levels in disease samples, including HSPG2, CD146, ECM1, SELL, SYNE1, and VCAM1. Importantly, we observed a strong correlation between differentially expressed proteins and the clinical status of the patient. Investigation of the glycoproteome returned 13 candidate glycoproteins with significant differential M-LAC column binding. Qualitative analysis indicated that 62% of selected candidate glycoproteins showed higher levels (upregulation) in the M-LAC bound fraction of disease samples. This observation was further confirmed by released N-glycan data, in which 53% of identified N-glycans were present at different levels in the disease vs. non-disease plasma samples. This striking result demonstrates the potential for significant protein glycosylation alterations in clear cell renal cell carcinoma plasma. With future validation in a larger cohort, information derived from this study may lead to the development of clear cell renal cell carcinoma candidate biomarkers. PMID:25184692

  5. Enabling phenotypic big data with PheNorm.

    PubMed

    Yu, Sheng; Ma, Yumeng; Gronsbell, Jessica; Cai, Tianrun; Ananthakrishnan, Ashwin N; Gainer, Vivian S; Churchill, Susanne E; Szolovits, Peter; Murphy, Shawn N; Kohane, Isaac S; Liao, Katherine P; Cai, Tianxi

    2018-01-01

    Electronic health record (EHR)-based phenotyping infers whether a patient has a disease based on the information in his or her EHR. A human-annotated training set with gold-standard disease status labels is usually required to build an algorithm for phenotyping based on a set of predictive features. The time intensiveness of annotation and feature curation severely limits the ability to achieve high-throughput phenotyping. While previous studies have successfully automated feature curation, annotation remains a major bottleneck. In this paper, we present PheNorm, a phenotyping algorithm that does not require expert-labeled samples for training. The most predictive features, such as the number of International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes or mentions of the target phenotype, are normalized to resemble a normal mixture distribution with a high area under the receiver operating characteristic curve (AUC) for prediction. The transformed features are then denoised and combined into a score for accurate disease classification. We validated the accuracy of PheNorm with 4 phenotypes: coronary artery disease, rheumatoid arthritis, Crohn's disease, and ulcerative colitis. The AUCs of the PheNorm score reached 0.90, 0.94, 0.95, and 0.94 for the 4 phenotypes, respectively, which were comparable to the accuracy of supervised algorithms trained with sample sizes of 100-300, with no statistically significant difference. The accuracy of the PheNorm algorithms is on par with algorithms trained with annotated samples. PheNorm fully automates the generation of accurate phenotyping algorithms and demonstrates the capacity for EHR-driven annotations to scale to the next level: phenotypic big data. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
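
    The central trick in PheNorm, as summarized above, is unsupervised: a surrogate feature is transformed so it resembles a two-component normal mixture, and the posterior probability of the high-mean component serves as the phenotype score. Below is a minimal sketch on simulated data, assuming a log1p transform and a Gaussian mixture stand in for the paper's normalization-and-denoising pipeline; it is not the authors' implementation.

    ```python
    # Sketch of the PheNorm idea on simulated data: log-transform a count
    # surrogate (e.g. number of ICD-9-CM codes for the phenotype), fit a
    # two-component normal mixture, and score patients by the posterior
    # probability of the high-mean ("disease") component. The denoising
    # step of the published algorithm is omitted.
    import numpy as np
    from sklearn.metrics import roc_auc_score
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # simulate code counts: 800 controls (low counts), 200 cases (high counts)
    counts = np.concatenate([rng.poisson(0.3, 800), rng.poisson(8.0, 200)])

    x = np.log1p(counts).reshape(-1, 1)                # normalization step
    gmm = GaussianMixture(n_components=2, random_state=0).fit(x)

    disease_comp = int(np.argmax(gmm.means_.ravel()))  # higher-mean component
    score = gmm.predict_proba(x)[:, disease_comp]      # unsupervised score

    truth = np.r_[np.zeros(800), np.ones(200)]
    print(f"AUC on simulated data: {roc_auc_score(truth, score):.2f}")
    ```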

  6. Endoscopic submucosal dissection for early esophageal neoplasms using the stag beetle knife

    PubMed Central

    Kuwai, Toshio; Yamaguchi, Toshiki; Imagawa, Hiroki; Miura, Ryoichi; Sumida, Yuki; Takasago, Takeshi; Miyasako, Yuki; Nishimura, Tomoyuki; Iio, Sumio; Yamaguchi, Atsushi; Kouno, Hirotaka; Kohno, Hiroshi; Ishaq, Sauid

    2018-01-01

    AIM To determine short- and long-term outcomes of endoscopic submucosal dissection (ESD) using the stag beetle (SB) knife, a scissor-shaped device. METHODS Seventy consecutive patients with 96 early esophageal neoplasms, who underwent ESD using an SB knife at Kure Medical Center and Chugoku Cancer Center, Japan, between April 2010 and August 2016, were retrospectively evaluated. Clinicopathological characteristics of lesions and procedural adverse events were assessed. Therapeutic success was evaluated on the basis of en bloc, histologically complete, and curative or non-curative resection rates. Overall and tumor-specific survival, local or distant recurrence, and 3- and 5-year cumulative overall metachronous cancer rates were also assessed. RESULTS Eligible patients had dysplasia/intraepithelial neoplasia (22%) or early cancers (squamous cell carcinoma, 78%). The median procedural time was 60 min and, on average, the lesions measured 24 mm in diameter, yielding 33-mm tissue defects. The en bloc resection rate was 100%, with 95% and 81% of dissections deemed histologically complete and curative, respectively. All procedures were completed without accidental incisions/perforations or delayed bleeding. During follow-up (mean, 35 ± 23 mo), no local recurrences or metastases were observed. The 3- and 5-year survival rates were 83% and 70%, respectively, with corresponding rates of 85% and 75% for curative resections and 74% and 49% for non-curative resections. The 3- and 5-year cumulative rates of metachronous cancer in the patients with curative resections were 14% and 26%, respectively. CONCLUSION ESD procedures using the SB knife are feasible, safe, and effective for treating early esophageal neoplasms, yielding favorable short- and long-term outcomes. PMID:29686470

  7. Endoscopic submucosal dissection for early esophageal neoplasms using the stag beetle knife.

    PubMed

    Kuwai, Toshio; Yamaguchi, Toshiki; Imagawa, Hiroki; Miura, Ryoichi; Sumida, Yuki; Takasago, Takeshi; Miyasako, Yuki; Nishimura, Tomoyuki; Iio, Sumio; Yamaguchi, Atsushi; Kouno, Hirotaka; Kohno, Hiroshi; Ishaq, Sauid

    2018-04-21

    To determine short- and long-term outcomes of endoscopic submucosal dissection (ESD) using the stag beetle (SB) knife, a scissor-shaped device. Seventy consecutive patients with 96 early esophageal neoplasms, who underwent ESD using an SB knife at Kure Medical Center and Chugoku Cancer Center, Japan, between April 2010 and August 2016, were retrospectively evaluated. Clinicopathological characteristics of lesions and procedural adverse events were assessed. Therapeutic success was evaluated on the basis of en bloc, histologically complete, and curative or non-curative resection rates. Overall and tumor-specific survival, local or distant recurrence, and 3- and 5-year cumulative overall metachronous cancer rates were also assessed. Eligible patients had dysplasia/intraepithelial neoplasia (22%) or early cancers (squamous cell carcinoma, 78%). The median procedural time was 60 min and, on average, the lesions measured 24 mm in diameter, yielding 33-mm tissue defects. The en bloc resection rate was 100%, with 95% and 81% of dissections deemed histologically complete and curative, respectively. All procedures were completed without accidental incisions/perforations or delayed bleeding. During follow-up (mean, 35 ± 23 mo), no local recurrences or metastases were observed. The 3- and 5-year survival rates were 83% and 70%, respectively, with corresponding rates of 85% and 75% for curative resections and 74% and 49% for non-curative resections. The 3- and 5-year cumulative rates of metachronous cancer in the patients with curative resections were 14% and 26%, respectively. ESD procedures using the SB knife are feasible, safe, and effective for treating early esophageal neoplasms, yielding favorable short- and long-term outcomes.

  8. Sequencing Data Discovery and Integration for Earth System Science with MetaSeek

    NASA Astrophysics Data System (ADS)

    Hoarfrost, A.; Brown, N.; Arnosti, C.

    2017-12-01

    Microbial communities play a central role in biogeochemical cycles. Sequencing data resources from environmental sources have grown exponentially in recent years, and represent a singular opportunity to investigate microbial interactions with Earth system processes. Carrying out such meta-analyses depends on our ability to discover and curate sequencing data into large-scale integrated datasets. However, such integration efforts are currently challenging and time-consuming, with sequencing data scattered across multiple repositories and metadata that is not easily or comprehensively searchable. MetaSeek is a sequencing data discovery tool that integrates sequencing metadata from all the major data repositories, allowing the user to search and filter on datasets in a lightweight application with an intuitive, easy-to-use web-based interface. Users can save and share curated datasets, while other users can browse these data integrations or use them as a jumping off point for their own curation. Missing and/or erroneous metadata are inferred automatically where possible, and where not possible, users are prompted to contribute to the improvement of the sequencing metadata pool by correcting and amending metadata errors. Once an integrated dataset has been curated, users can follow simple instructions to download their raw data and quickly begin their investigations. In addition to the online interface, the MetaSeek database is easily queryable via an open API, further enabling users and facilitating integrations of MetaSeek with other data curation tools. This tool lowers the barriers to curation and integration of environmental sequencing data, clearing the path forward to illuminating the ecosystem-scale interactions between biological and abiotic processes.

  9. The Antaeus Project - An orbital quarantine facility for analysis of planetary return samples

    NASA Technical Reports Server (NTRS)

    Sweet, H. C.; Bagby, J. R.; Devincenzi, D. L.

    1983-01-01

    A design is presented for an earth-orbiting facility for the analysis of planetary return samples under conditions of maximum protection against contamination but minimal damage to the sample. The design is keyed to a Mars sample return mission profile, returning 1 kg of documented subsamples, to be analyzed in low earth orbit by a small crew aided by automated procedures, tissue culture and microassay. The facility itself would consist of Spacelab shells, formed into five modules of different sizes with purposes of power supply, habitation, supplies and waste storage, the linking of the facility, and both quarantine and investigation of the samples. Three barriers are envisioned to protect the biosphere from any putative extraterrestrial organisms: sealed biological containment cabinets within the Laboratory Module, the Laboratory Module itself, and the conditions of space surrounding the facility.

  10. [Thoracoscopic diagnosis and treatment of postoperative residual cavities].

    PubMed

    Ioffe, D Ts; Dashiev, V A; Amanov, S A

    1987-03-01

    Investigations performed in 41 patients with postoperative residual cavities after surgical interventions of varying extent demonstrated the high value of thoracoscopy as an additional diagnostic and curative method. The endoscopic findings determine the further curative tactics: surgery or conservative therapy.

  11. Advancing the application of systems thinking in health: why cure crowds out prevention

    PubMed Central

    2014-01-01

    Introduction This paper presents a system dynamics computer simulation model to illustrate unintended consequences of apparently rational allocations to curative and preventive services. Methods A modeled population is subject to only two diseases. Disease A is a curable disease that can be shortened by curative care. Disease B is an instantly fatal but preventable disease. Curative care workers are financed by public spending and private fees to cure disease A. Non-personal, preventive services are delivered by public health workers supported solely by public spending to prevent disease B. Each type of worker tries to tilt the balance of government spending towards their interests. Their influence on the government is proportional to their accumulated revenue. Results The model demonstrates effects on lost disability-adjusted life years and costs over the course of several epidemics of each disease. Policy interventions are tested, including: i) an outside donor rationally donates extra money to each type of disease precisely in proportion to the size of epidemics of each disease; ii) lobbying is eliminated; iii) fees for personal health services are eliminated; iv) the government continually rebalances the funding for prevention by ring-fencing it to protect it from lobbying. The model exhibits a “spend more get less” equilibrium in which higher revenue by the curative sector is used to influence government allocations away from prevention towards cure. Spending more on curing disease A leads paradoxically to a higher overall disease burden of unprevented cases of disease B. This paradoxical behavior of the model can be stopped by eliminating lobbying, eliminating fees for curative services, and ring-fencing public health funding. Conclusions We have created an artificial system as a laboratory to gain insights about the trade-offs between curative and preventive health allocations, and the effect of indicative policy interventions. The underlying dynamics of this artificial system resemble features of modern health systems where a self-perpetuating industry has grown up around disease-specific curative programs like HIV/AIDS or malaria. The model shows how the growth of curative care services can crowd out both fiscal and policy space for the practice of population-level prevention work, requiring dramatic interventions to overcome these trends. PMID:24935344
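
    The "spend more, get less" loop described above is easy to reproduce as a toy stock-and-flow simulation: curative revenue buys lobbying influence, influence tilts the public budget toward cure, and prevention is crowded out. The sketch below is illustrative only; its parameters are arbitrary inventions, not those of the published model.

    ```python
    # Toy stock-and-flow sketch of the "spend more, get less" loop: revenue
    # accumulated by the curative sector buys lobbying influence, influence
    # shifts the public budget toward cure, and prevention of the instantly
    # fatal disease B is crowded out.
    BUDGET = 100.0       # public spending per time step
    FEE = 0.5            # private fee income per case of disease A cured
    LOBBY_GAIN = 0.002   # budget-share shift per unit of accumulated revenue

    cure_revenue = 0.0
    deaths_b = 0.0

    for step in range(50):
        # the cure share of the budget grows with accumulated revenue
        cure_share = min(0.9, 0.5 + LOBBY_GAIN * cure_revenue)
        cure_funding = BUDGET * cure_share
        prevention_funding = BUDGET * (1.0 - cure_share)

        cases_a_cured = cure_funding                 # one cure per unit spent
        cure_revenue += cure_funding + FEE * cases_a_cured

        # disease B incidence rises as prevention funding falls
        deaths_b += max(0.0, 60.0 - prevention_funding)

    print(f"final cure share of budget:  {cure_share:.2f}")  # locks in at 0.90
    print(f"cumulative disease-B deaths: {deaths_b:.0f}")
    ```

    Even in this stripped-down form, the reinforcing loop locks the budget at the cure-share ceiling within a few steps, after which disease-B deaths accumulate at a constant, elevated rate.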

  12. The art, science and philosophy of newborn care.

    PubMed

    Singh, Meharban

    2014-06-01

    Neonates truly constitute the foundation of a nation, and no sensible government can afford to neglect their needs and rights. In the last 50 y, technology has revolutionized neonatology, and we have moved from an exceedingly passive or "hands-off" philosophy to an extremely aggressive or mechanistic approach. Deaths during the first 28 d of life account for over 60% of all infant deaths and 40% of all deaths of under-5 children. If we are to further reduce the infant mortality rate in our country, we must focus our strategies on improving the health and survival of newborn babies. There should be equitable distribution of resources for the care of mothers and babies in the community and establishment of high-tech newborn care facilities. In the 21st century, we must sever our dependence on traditional birth attendants, or dais, and develop the necessary infrastructure and facilities to ensure that every pregnant woman is provided with essential antenatal care and that all deliveries take place at health care facilities, conducted by trained health care professionals. In the best pediatric tradition, there is a need for greater focus on preventive rather than curative health care strategies, because a large number of neonatal deaths occur due to potentially preventable disorders like birth asphyxia, hypothermia, hypoglycemia and infections. The art and science of neonatology should be integrated, and we should follow a "middle path," striking a balance between art and technology in the care of newborns.

  13. Lived experiences of everyday life during curative radiotherapy in patients with non-small-cell lung cancer: A phenomenological study

    PubMed Central

    Petri, Suzanne; Berthelsen, Connie B.

    2015-01-01

    Aim To explore and describe the essential meaning of lived experiences of the phenomenon: Everyday life during curative radiotherapy in patients with non-small-cell lung cancer (NSCLC). Background Radiotherapy treatment in patients with NSCLC is associated with severe side effects such as fatigue, anxiety, and reduced quality of life. However, little is known about the patients’ experience of everyday life during the care trajectory. Design This study takes a reflective lifeworld approach using an empirical application of phenomenological philosophy described by Dahlberg and colleagues. Method A sample of three patients treated with curative radiotherapy for NSCLC was interviewed 3 weeks after the end of radiotherapy treatment about their experiences of everyday life during their treatment. Data were collected in 2014 and interviews and analysis were conducted within the descriptive phenomenological framework. Findings The essential meaning structure of the phenomenon studied was described as “Hope for recovery serving as a compass in a changed everyday life,” which was a guide for the patients through the radiotherapy treatment to support their efforts in coping with side effects. The constituents of the structure were: Radiotherapy as a life priority, A struggle for acceptance of an altered everyday life, Interpersonal relationships for better or worse, and Meeting the health care system. Conclusion The meaning of hope was essential during radiotherapy treatment and our results suggest that interpersonal relationships can be a prerequisite to the experience of hope. “Hope for recovery serving as a compass in a changed everyday life,” furthermore identifies the essentials in the patients’ assertive approach to believing in recovery and thereby enabling hope in a serious situation. PMID:26610116

  14. Lived experiences of everyday life during curative radiotherapy in patients with non-small-cell lung cancer: A phenomenological study.

    PubMed

    Petri, Suzanne; Berthelsen, Connie B

    2015-01-01

    To explore and describe the essential meaning of lived experiences of the phenomenon: Everyday life during curative radiotherapy in patients with non-small-cell lung cancer (NSCLC). Radiotherapy treatment in patients with NSCLC is associated with severe side effects such as fatigue, anxiety, and reduced quality of life. However, little is known about the patients' experience of everyday life during the care trajectory. This study takes a reflective lifeworld approach using an empirical application of phenomenological philosophy described by Dahlberg and colleagues. A sample of three patients treated with curative radiotherapy for NSCLC was interviewed 3 weeks after the end of radiotherapy treatment about their experiences of everyday life during their treatment. Data were collected in 2014 and interviews and analysis were conducted within the descriptive phenomenological framework. The essential meaning structure of the phenomenon studied was described as "Hope for recovery serving as a compass in a changed everyday life," which was a guide for the patients through the radiotherapy treatment to support their efforts in coping with side effects. The constituents of the structure were: Radiotherapy as a life priority, A struggle for acceptance of an altered everyday life, Interpersonal relationships for better or worse, and Meeting the health care system. The meaning of hope was essential during radiotherapy treatment and our results suggest that interpersonal relationships can be a prerequisite to the experience of hope. "Hope for recovery serving as a compass in a changed everyday life," furthermore identifies the essentials in the patients' assertive approach to believing in recovery and thereby enabling hope in a serious situation.

  15. Reasons doctors provide futile treatment at the end of life: a qualitative study.

    PubMed

    Willmott, Lindy; White, Benjamin; Gallois, Cindy; Parker, Malcolm; Graves, Nicholas; Winch, Sarah; Callaway, Leonie Kaye; Shepherd, Nicole; Close, Eliana

    2016-08-01

    Futile treatment, which by definition cannot benefit a patient, is undesirable. This research investigated why doctors believe that treatment that they consider to be futile is sometimes provided at the end of a patient's life. Semistructured in-depth interviews. Three large tertiary public hospitals in Brisbane, Australia. 96 doctors from emergency, intensive care, palliative care, oncology, renal medicine, internal medicine, respiratory medicine, surgery, cardiology, geriatric medicine and medical administration departments. Participants were recruited using purposive maximum variation sampling. Doctors attributed the provision of futile treatment to a wide range of inter-related factors. The first was the characteristics of treating doctors, including their orientation towards curative treatment, discomfort or inexperience with death and dying, concerns about legal risk and poor communication skills. The second was the attributes of the patient and family, including their requests or demands for further treatment, prognostic uncertainty and lack of information about patient wishes. The third was hospital factors, including a high degree of specialisation, the availability of routine tests and interventions, and organisational barriers to diverting a patient from a curative to a palliative pathway. Doctors nominated family or patient request, and doctors being locked into a curative role, as the main reasons for futile care. Doctors believe that a range of factors contribute to the provision of futile treatment. A combination of strategies is necessary to reduce futile treatment, including better training for doctors who treat patients at the end of life, educating the community about the limits of medicine and the need to plan for death and dying, and structural reform at the hospital level. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  16. LitPathExplorer: a confidence-based visual text analytics tool for exploring literature-enriched pathway models.

    PubMed

    Soto, Axel J; Zerva, Chrysoula; Batista-Navarro, Riza; Ananiadou, Sophia

    2018-04-15

    Pathway models are valuable resources that help us understand the various mechanisms underpinning complex biological processes. Their curation is typically carried out through manual inspection of published scientific literature to find information relevant to a model, which is a laborious and knowledge-intensive task. Furthermore, models curated manually cannot be easily updated and maintained with new evidence extracted from the literature without automated support. We have developed LitPathExplorer, a visual text analytics tool that integrates advanced text mining, semi-supervised learning and interactive visualization, to facilitate the exploration and analysis of pathway models using statements (i.e. events) extracted automatically from the literature and organized according to levels of confidence. LitPathExplorer supports pathway modellers and curators alike by: (i) extracting events from the literature that corroborate existing models with evidence; (ii) discovering new events which can update models; and (iii) providing a confidence value for each event that is automatically computed based on linguistic features and article metadata. Our evaluation of event extraction showed a precision of 89% and a recall of 71%. Evaluation of our confidence measure, when used for ranking sampled events, showed an average precision ranging between 61 and 73%, which can be improved to 95% when the user is involved in the semi-supervised learning process. Qualitative evaluation using pair analytics based on the feedback of three domain experts confirmed the utility of our tool within the context of pathway model exploration. LitPathExplorer is available at http://nactem.ac.uk/LitPathExplorer_BI/. sophia.ananiadou@manchester.ac.uk. Supplementary data are available at Bioinformatics online.
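
    The confidence-based triage described above amounts to scoring each extracted event from linguistic and metadata features and judging the resulting ranking by average precision. A minimal sketch of that evaluation loop follows; the features, weights, and event records are invented for illustration and do not represent the LitPathExplorer model.

    ```python
    # Sketch of confidence-ranked event triage: score each extracted event
    # from simple linguistic/metadata features, rank, and compute average
    # precision against curator judgements. Invented features and weights.
    events = [
        # (hedged?, cites_experiment?, journal_impact, curator_says_correct?)
        (0, 1, 8.0, True),
        (1, 0, 2.0, False),
        (0, 1, 5.0, True),
        (1, 1, 3.0, True),
        (1, 0, 1.0, False),
    ]

    def confidence(hedged, cites_exp, impact):
        # toy linear score: hedging lowers confidence; experimental
        # evidence and venue raise it
        return -1.5 * hedged + 1.0 * cites_exp + 0.2 * impact

    ranked = sorted(events, key=lambda e: confidence(*e[:3]), reverse=True)

    # average precision of the ranking against curator judgements
    hits, precisions = 0, []
    for rank, event in enumerate(ranked, start=1):
        if event[3]:
            hits += 1
            precisions.append(hits / rank)
    ap = sum(precisions) / max(1, hits)
    print(f"average precision of confidence ranking: {ap:.2f}")
    ```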

  17. 76 FR 43712 - Notice of Inventory Completion: U.S. Department of the Interior, Bureau of Indian Affairs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-21

    ... Science and Industry at the address below by August 22, 2011. ADDRESSES: Lori Erickson, Curator, Oregon... the human remains should contact Lori Erickson, Curator, Oregon Museum of Science and Industry, 1945...

  18. [Biochemical failure after curative treatment for localized prostate cancer].

    PubMed

    Zouhair, Abderrahim; Jichlinski, Patrice; Mirimanoff, René-Olivier

    2005-12-07

    Biochemical failure after curative treatment for localized prostate cancer is frequent. The diagnosis of biochemical failure is clear when PSA levels rise after radical prostatectomy, but may be more difficult after external beam radiation therapy. The main difficulty once biochemical failure is diagnosed is to distinguish between local and distant failure, given the low sensitivity of standard work-up exams. Metabolic imaging techniques currently under evaluation may in the future help us to localize the site of failures. There are several therapeutic options depending on the initial curative treatment, each with morbidity risks that should be considered in multidisciplinary decision-making.

  19. Factors Affecting Health-Related Quality of Life in Children Undergoing Curative Treatment for Cancer: A Review of the Literature.

    PubMed

    Momani, Tha'er G; Hathaway, Donna K; Mandrell, Belinda N

    2016-01-01

    Health-related quality of life (HRQoL) is an important measure to evaluate a child's reported treatment experience. Although there are numerous studies of HRQoL in children undergoing curative cancer treatment, there is limited literature on factors that influence this. To review published studies that describe the HRQoL and associated factors in children undergoing curative cancer treatment. Full-text publications in English from January 2005 to March 2013 were searched in PubMed, PsychINFO, and CINAHL for children ≤18 years of age undergoing curative cancer treatment. HRQoL-associated factors were categorized as cancer diagnosis, treatment, child, family, and community. Twenty-six studies met the inclusion criteria. The most frequently used generic and cancer-specific instruments were PedsQL (Pediatric Quality of Life Inventory) Generic and PedsQL Cancer, respectively. Cancer diagnosis and treatment were the most frequently identified variables; fewer studies measured family and community domains. Gender, treatment intensity, type of cancer treatments, time in treatment, and cancer diagnosis were correlated with HRQoL. Our study highlights the need to develop interventions based on diagnosis and treatment regimen to improve the HRQoL in children undergoing curative cancer treatment. © 2015 by Association of Pediatric Hematology/Oncology Nurses.

  20. Curation of food-relevant chemicals in ToxCast.

    PubMed

    Karmaus, Agnes L; Trautman, Thomas D; Krishan, Mansi; Filer, Dayne L; Fix, Laurel A

    2017-05-01

    High-throughput in vitro assays and exposure prediction efforts are paving the way for modeling chemical risk; however, the utility of such extensive datasets can be limited or misleading when annotation fails to capture current chemical usage. To address this data gap and provide context for food-use in the United States (US), manual curation of food-relevant chemicals in ToxCast was conducted. Chemicals were categorized into three food-use categories: (1) direct food additives, (2) indirect food additives, or (3) pesticide residues. Manual curation resulted in 30% of chemicals having new annotation, as well as the removal of 319 chemicals, most due to cancellation or foreign-only usage. These results highlight that manual curation of chemical use information provided significant insight affecting the overall inventory and chemical categorization. In total, 1211 chemicals were confirmed as current-day food-use in the US by manual curation; 1154 of these chemicals were also identified as food-related in the globally sourced chemical use information from the Chemical/Product Categories database (CPCat). The refined list of food-use chemicals and the sources highlighted for compiling annotated information required to confirm food-use are valuable resources for providing needed context when evaluating large-scale inventories such as ToxCast. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
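
    The cross-referencing step reported above (1211 manually confirmed food-use chemicals, 1154 of them also flagged in CPCat) can be expressed as a simple table join. A hedged sketch with invented records and column names, not the authors' actual inventory:

    ```python
    # Sketch of cross-referencing a manually curated food-use list against
    # CPCat-style use annotations. Chemical IDs, categories, and column
    # names are illustrative only.
    import pandas as pd

    curated = pd.DataFrame({
        "casrn": ["50-00-0", "80-05-7", "1912-24-9", "121-75-5"],
        "food_use_category": ["direct additive", "indirect additive",
                              "pesticide residue", "pesticide residue"],
    })

    cpcat = pd.DataFrame({
        "casrn": ["50-00-0", "1912-24-9", "121-75-5", "7439-92-1"],
        "cpcat_terms": [["food_contact"], ["pesticide", "food_residue"],
                        ["food_residue"], ["industrial"]],
    })

    merged = curated.merge(cpcat, on="casrn", how="left")
    # a chemical is "confirmed" if any of its CPCat terms mention food
    merged["cpcat_food_related"] = merged["cpcat_terms"].apply(
        lambda terms: isinstance(terms, list)
        and any("food" in t for t in terms))

    print(merged[["casrn", "food_use_category", "cpcat_food_related"]])
    print(f"confirmed by CPCat: {merged['cpcat_food_related'].mean():.0%}")
    ```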

  1. Opening Data in the Long Tail for Community Discovery, Curation and Action Using Active and Social Curation

    NASA Astrophysics Data System (ADS)

    Hedstrom, M. L.; Kumar, P.; Myers, J.; Plale, B. A.

    2012-12-01

    In data science, the most common sequence of steps for data curation is to 1) curate data, 2) enable data discovery, and 3) provide for data reuse. The Sustainable Environments - Actionable Data (SEAD) project, funded through NSF's DataNet program, is creating an environment for sustainability scientists to discover data first, reuse data next, and curate data through an ongoing process that we call Active and Social Curation. For active curation we are developing tools and services that support data discovery, data management, and data enhancement for the community while the data is still being used actively for research. We are creating an Active Content Repository, using drop box, semantic web technologies, and a Flickr-like interface for researchers to "drop" data into a repository where it will be replicated and minimally discoverable. For social curation, we are deploying a social networking tool, VIVO, which will allow researchers to discover data-publications-people (e.g. expertise) through a route that can start at any of those entry points. The other dimension of social curation is developing mechanisms to open data for community input, for example, using ranking and commenting mechanisms for data sets and a community-sourcing capability to add tags, clean up and validate data sets. SEAD's strategies and services are aimed at the sustainability science community, which faces numerous challenges including discovery of useful data, cleaning noisy observational data, synthesizing data of different types, defining appropriate models, managing and preserving their research data, and conveying holistic results to colleagues, students, decision makers, and the public. Sustainability researchers make significant use of centrally managed data from satellites and national sensor networks, national scientific and statistical agencies, and data archives. At the same time, locally collected data and custom derived data products that combine observations and measurements from local, national, and global sources are critical resources that have disproportionately high value relative to their size. Sustainability science includes a diverse and growing community of domain scientists, policy makers, private sector investors, green manufacturers, citizen scientists, and informed consumers. These communities need actionable data in order to assess the impacts of alternate scenarios, evaluate the cost-benefit tradeoffs of different solutions, and defend their recommendations and decisions. SEAD's goal is to extend its services to other communities in the "long tail" that may benefit from new approaches to infrastructure development which take into account the social and economic characteristics of diverse and dispersed data producers and consumers. For example, one barrier to data reuse is the difficulty of discovering data that might be valuable for a particular study, model, or decision. Making data minimally discoverable saves the community time expended on futile searches and creates a market, of sorts, for the data. Creating very low barriers to entry to a network where data can be discovered and acted upon vastly reduces this disincentive to sharing data. SEAD's approach allows communities to make small incremental improvements in data curation based on their own priorities and needs.

  2. Making Metadata Better with CMR and MMT

    NASA Technical Reports Server (NTRS)

    Gilman, Jason Arthur; Shum, Dana

    2016-01-01

    Ensuring complete, consistent, and high-quality metadata is a challenge for metadata providers and curators. The CMR and MMT systems give providers and curators options to build in metadata quality from the start, and also to assess and improve the quality of existing metadata.

  3. A CTD–Pfizer collaboration: manual curation of 88 000 scientific articles text mined for drug–disease and drug–phenotype interactions

    PubMed Central

    Davis, Allan Peter; Wiegers, Thomas C.; Roberts, Phoebe M.; King, Benjamin L.; Lay, Jean M.; Lennon-Hopkins, Kelley; Sciaky, Daniela; Johnson, Robin; Keating, Heather; Greene, Nigel; Hernandez, Robert; McConnell, Kevin J.; Enayetallah, Ahmed E.; Mattingly, Carolyn J.

    2013-01-01

    Improving the prediction of chemical toxicity is a goal common to both environmental health research and pharmaceutical drug development. To improve safety detection assays, it is critical to have a reference set of molecules with well-defined toxicity annotations for training and validation purposes. Here, we describe a collaboration between safety researchers at Pfizer and the research team at the Comparative Toxicogenomics Database (CTD) to text mine and manually review a collection of 88 629 articles relating over 1 200 pharmaceutical drugs to their potential involvement in cardiovascular, neurological, renal and hepatic toxicity. In 1 year, CTD biocurators curated 254 173 toxicogenomic interactions (152 173 chemical–disease, 58 572 chemical–gene, 5 345 gene–disease and 38 083 phenotype interactions). All chemical–gene–disease interactions are fully integrated with public CTD, and phenotype interactions can be downloaded. We describe Pfizer’s text-mining process to collate the articles, and CTD’s curation strategy, performance metrics, enhanced data content and new module to curate phenotype information. As well, we show how data integration can connect phenotypes to diseases. This curation can be leveraged for information about toxic endpoints important to drug safety and help develop testable hypotheses for drug–disease events. The availability of these detailed, contextualized, high-quality annotations curated from seven decades’ worth of the scientific literature should help facilitate new mechanistic screening assays for pharmaceutical compound survival. This unique partnership demonstrates the importance of resource sharing and collaboration between public and private entities and underscores the complementary needs of the environmental health science and pharmaceutical communities. Database URL: http://ctdbase.org/ PMID:24288140

  4. A CTD-Pfizer collaboration: manual curation of 88,000 scientific articles text mined for drug-disease and drug-phenotype interactions.

    PubMed

    Davis, Allan Peter; Wiegers, Thomas C; Roberts, Phoebe M; King, Benjamin L; Lay, Jean M; Lennon-Hopkins, Kelley; Sciaky, Daniela; Johnson, Robin; Keating, Heather; Greene, Nigel; Hernandez, Robert; McConnell, Kevin J; Enayetallah, Ahmed E; Mattingly, Carolyn J

    2013-01-01

    Improving the prediction of chemical toxicity is a goal common to both environmental health research and pharmaceutical drug development. To improve safety detection assays, it is critical to have a reference set of molecules with well-defined toxicity annotations for training and validation purposes. Here, we describe a collaboration between safety researchers at Pfizer and the research team at the Comparative Toxicogenomics Database (CTD) to text mine and manually review a collection of 88,629 articles relating over 1,200 pharmaceutical drugs to their potential involvement in cardiovascular, neurological, renal and hepatic toxicity. In 1 year, CTD biocurators curated 254,173 toxicogenomic interactions (152,173 chemical-disease, 58,572 chemical-gene, 5,345 gene-disease and 38,083 phenotype interactions). All chemical-gene-disease interactions are fully integrated with public CTD, and phenotype interactions can be downloaded. We describe Pfizer's text-mining process to collate the articles, and CTD's curation strategy, performance metrics, enhanced data content and new module to curate phenotype information. As well, we show how data integration can connect phenotypes to diseases. This curation can be leveraged for information about toxic endpoints important to drug safety and help develop testable hypotheses for drug-disease events. The availability of these detailed, contextualized, high-quality annotations curated from seven decades' worth of the scientific literature should help facilitate new mechanistic screening assays for pharmaceutical compound survival. This unique partnership demonstrates the importance of resource sharing and collaboration between public and private entities and underscores the complementary needs of the environmental health science and pharmaceutical communities. Database URL: http://ctdbase.org/

  5. Text mining and expert curation to develop a database on psychiatric diseases and their genes

    PubMed Central

    Gutiérrez-Sacristán, Alba; Bravo, Àlex; Portero-Tresserra, Marta; Valverde, Olga; Armario, Antonio; Blanco-Gandía, M.C.; Farré, Adriana; Fernández-Ibarrondo, Lierni; Fonseca, Francina; Giraldo, Jesús; Leis, Angela; Mané, Anna; Mayer, M.A.; Montagud-Romero, Sandra; Nadal, Roser; Ortiz, Jordi; Pavon, Francisco Javier; Perez, Ezequiel Jesús; Rodríguez-Arias, Marta; Serrano, Antonia; Torrens, Marta; Warnault, Vincent; Sanz, Ferran

    2017-01-01

    Psychiatric disorders constitute one of the main causes of disability worldwide. In recent years, considerable research has been conducted on the genetic architecture of such diseases, although little understanding of their etiology has been achieved. The difficulty of accessing up-to-date, relevant genotype–phenotype information has hampered the application of this wealth of knowledge to translational research and clinical practice in order to improve diagnosis and treatment of psychiatric patients. PsyGeNET (http://www.psygenet.org/) has been developed with the aim of supporting research on the genetic architecture of psychiatric diseases, by providing integrated and structured accessibility to their genotype–phenotype association data, together with analysis and visualization tools. In this article, we describe the protocol developed for the sustainable update of this knowledge resource. It includes the recruitment of a team of domain experts in order to perform the curation of the data extracted by text mining. Annotation guidelines and a web-based annotation tool were developed to support the curators’ tasks. A curation workflow was designed including a pilot phase and two rounds of curation and analysis phases. Negative evidence from the literature on gene–disease associations (GDAs) was taken into account in the curation process. We report the results of the application of this workflow to the curation of GDAs for PsyGeNET, including the analysis of inter-annotator agreement, and suggest this model as a suitable approach for the sustainable development and update of knowledge resources. Database URL: http://www.psygenet.org PsyGeNET corpus: http://www.psygenet.org/ds/PsyGeNET/results/psygenetCorpus.tar PMID:29220439

  6. Sampling Strategy and Curation Plan of "Hayabusa" Asteroid Sample Return Mission

    NASA Technical Reports Server (NTRS)

    Yano, H.; Fujiwara, A.; Abe, M.; Hasegawa, S.; Kushiro, I.; Zolensky, M. E.

    2004-01-01

    On 9 May 2003 JST, the Japanese spacecraft MUSES-C was successfully launched from Uchinoura. The spacecraft was directly inserted into an interplanetary trajectory and renamed Hayabusa, or "Falcon," becoming the world's first sample return spacecraft targeting a near Earth asteroid (NEA). The NEA (25143) Itokawa (formerly known as "1998SF36") is its mission target. Its orbital and physical characteristics are well observed: the size is (490 +/- 100) x (250 +/- 55) x (180 +/- 50) m with an approximately 12-hour rotation period. It has a red-sloped S(IV)-type spectrum with strong 1- and 2-micron absorption bands, analogous to ordinary LL chondrites affected by space weathering. Assuming a bulk density, the surface gravity of Itokawa is on the order of 10 micro-G, and its escape velocity is approximately 20 cm/s.
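
    The quoted surface gravity and escape velocity follow directly from the measured dimensions once a bulk density is assumed; the abstract does not state a density, so the value used below (2.3 g/cm³, in the range typical of LL chondrite material with porosity) is an assumption for illustration:

    ```python
    # Back-of-the-envelope check of the quoted surface gravity (~10 micro-G)
    # and escape velocity (~20 cm/s) from the stated dimensions. The bulk
    # density is an assumed value, not one given in the abstract.
    import math

    G = 6.674e-11                   # gravitational constant, m^3 kg^-1 s^-2
    a, b, c = 490/2, 250/2, 180/2   # ellipsoid semi-axes in m, from the size
    rho = 2300.0                    # assumed bulk density, kg/m^3

    volume = 4/3 * math.pi * a * b * c
    mass = rho * volume
    r_eq = (a * b * c) ** (1/3)     # radius of the volume-equivalent sphere

    g_surface = G * mass / r_eq**2            # m/s^2
    v_escape = math.sqrt(2 * G * mass / r_eq) # m/s

    print(f"surface gravity: {g_surface / 9.81e-6:.1f} micro-G")  # ~9
    print(f"escape velocity: {v_escape * 100:.0f} cm/s")          # ~16
    ```

    Both results land within the orders of magnitude quoted above, which is as much agreement as the assumed density and the spherical simplification warrant.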

  7. Antarctic Meteorite Classification and Petrographic Database

    NASA Technical Reports Server (NTRS)

    Todd, Nancy S.; Satterwhite, C. E.; Righter, Kevin

    2011-01-01

    The Antarctic Meteorite collection, which comprises over 18,700 meteorites, is one of the largest collections of meteorites in the world. These meteorites have been collected since the late 1970s as part of a three-agency agreement between NASA, the National Science Foundation, and the Smithsonian Institution [1]. Samples collected each season are analyzed at NASA's Meteorite Lab and the Smithsonian Institution, and results are published twice a year in the Antarctic Meteorite Newsletter, which has been in publication since 1978. Each newsletter lists the samples collected and processed and provides more in-depth details on selected samples of importance to the scientific community. Data about these meteorites are also published on the NASA Curation website [2] and made available through the Meteorite Classification Database, allowing scientists to search by a variety of parameters.

  8. Entomopathogen ID: a curated sequence resource for entomopathogenic fungi

    USDA-ARS?s Scientific Manuscript database

    We report the development of a publicly accessible, curated database of Hypocrealean entomopathogenic fungi sequence data. The goal is to provide a platform for users to easily access sequence data from reference strains. The database can be used to accurately identify unknown entomopathogenic fungi...

  9. Cognitive Curations of Collaborative Curricula

    ERIC Educational Resources Information Center

    Ackerman, Amy S.

    2015-01-01

    Assuming the role of learning curators, 22 graduate students (in-service teachers) addressed authentic problems (challenges) within their respective classrooms by selecting digital tools as part of implementation of interdisciplinary lesson plans. Students focused on formative assessment tools as a means to gather evidence to make improvements in…

  10. MaizeGDB: New tools and resource

    USDA-ARS?s Scientific Manuscript database

    MaizeGDB, the USDA-ARS genetics and genomics database, is a highly curated, community-oriented informatics service to researchers focused on the crop plant and model organism Zea mays. MaizeGDB facilitates maize research by curating, integrating, and maintaining a database that serves as the central...

  11. Teacher Training in Curative Education.

    ERIC Educational Resources Information Center

    Juul, Kristen D.; Maier, Manfred

    1992-01-01

    This article considers the application of the philosophical and educational principles of Rudolf Steiner, called "anthroposophy," to the training of teachers and curative educators in the Waldorf schools. Special emphasis is on the Camphill movement which focuses on therapeutic schools and communities for children with special needs. (DB)

  12. [Curative effect of ozone hydrotherapy for pemphigus].

    PubMed

    Jiang, Fuqiong; Deng, Danqi; Li, Xiaolan; Wang, Wenfang; Xie, Hong; Wu, Yongzhuo; Luan, Chunyan; Yang, Binbin

    2018-02-28

    To determine clinical curative effects of ozone therapy for pemphigus vulgaris.
 Methods: Ozone hydrotherapy was used as an adjunct treatment for 32 patients with pemphigus vulgaris; hydropathic compresses of potassium permanganate solution in 34 patients with pemphigus vulgaris served as the control. The main treatment in both groups was glucocorticoids and immunosuppressants. Lesions, bacterial infection, antibiotic usage, patient satisfaction, and clinical curative effect were evaluated in the two groups.
 Results: There was no significant difference in curative effect or in the average length of hospital stay between the two groups (P>0.05), but the rate of antibiotic usage was significantly lower in the ozone hydrotherapy group (P=0.039). Patients were more satisfied with ozone hydrotherapy than with the potassium permanganate compresses after 7 days of therapy (P>0.05).
 Conclusion: Ozone hydrotherapy is a safe and effective adjunct treatment for pemphigus vulgaris and can reduce antibiotic usage.

  13. Low Prevalence of Substandard and Falsified Antimalarial and Antibiotic Medicines in Public and Faith-Based Health Facilities of Southern Malawi

    PubMed Central

    Khuluza, Felix; Kigera, Stephen; Heide, Lutz

    2017-01-01

    Substandard and falsified antimalarial and antibiotic medicines represent a serious problem for public health, especially in low- and middle-income countries. However, information on the prevalence of poor-quality medicines is limited. In the present study, samples of six antimalarial and six antibiotic medicines were collected from 31 health facilities and drug outlets in southern Malawi. Random sampling was used in the selection of health facilities. For sample collection, an overt approach was used in licensed facilities, and a mystery shopper approach in nonlicensed outlets. One hundred and fifty-five samples were analyzed by visual and physical examination and by rapid prescreening tests, that is, disintegration testing and thin-layer chromatography using the GPHF-Minilab. Fifty-six of the samples were analyzed according to pharmacopeial monographs in a World Health Organization-prequalified quality control laboratory. Seven out-of-specification medicines were identified. One sample was classified as falsified, lacking the declared active ingredients, and containing other active ingredients instead. Three samples were classified as substandard with extreme deviations from the pharmacopeial standards, and three further samples as substandard with nonextreme deviations. Of the substandard medicines, three failed in dissolution testing, two in the assay for the content of the active pharmaceutical ingredient, and one failed in both dissolution testing and assay. Six of the seven out-of-specification medicines were from private facilities. Only one out-of-specification medicine was found within the samples from public and faith-based health facilities. Although the observed presence of substandard and falsified medicines in Malawi requires action, their low prevalence in public and faith-based health facilities is encouraging. PMID:28219993

  14. Prepared to react? Assessing the functional capacity of the primary health care system in rural Orissa, India to respond to the devastating flood of September 2008.

    PubMed

    Phalkey, Revati; Dash, Shisir R; Mukhopadhyay, Alok; Runge-Ranzinger, Silvia; Marx, Michael

    2012-01-01

    Early detection of an impending flood and the availability of countermeasures to deal with it can significantly reduce its health impacts. In developing countries like India, public primary health care facilities are frontline organizations that deal with disasters, particularly in rural settings. To develop robust response systems, evaluating preparedness capacities within existing systems becomes necessary. The objective of the study is to assess the functional capacity of the primary health care system in Jagatsinghpur district of rural Orissa, India, to respond to the devastating flood of September 2008. An onsite survey was conducted in all 29 primary and secondary facilities in five rural blocks (administrative units) of Jagatsinghpur district in Orissa state. A pre-tested structured questionnaire was administered face to face in the facilities. The data were entered, processed, and analyzed using STATA® 10. Data from our primary survey clearly show that the healthcare facilities are ill-prepared to handle the flood despite facing it annually. Basic utilities like electricity backup and essential medical supplies are lacking during floods. Lack of human resources along with missing standard operating procedures; pre-identified communication and incident command systems; effective leadership; and weak financial structures are the main hindering factors in mounting an adequate response to the floods. The 2008 flood challenged the primary curative and preventive health care services in Jagatsinghpur. Simple steps like developing facility-specific preparedness plans which detail standard operating procedures during floods and identify clear lines of command will go a long way in strengthening the response to future floods. Performance critiques provided by the grassroots workers, like this one, should be used for institutional learning and effective preparedness planning. Additionally, each facility should maintain contingency funds for emergency response along with local vendor agreements to ensure stock supplies during floods. The facilities should ensure that baseline public health standards for health care delivery identified by the Government are met in non-flood periods in order to improve the response during floods. Building strong public primary health care systems is a development challenge. The recovery phases of disasters should be seen as an opportunity to expand and improve services and facilities.

  15. Prepared to react? Assessing the functional capacity of the primary health care system in rural Orissa, India to respond to the devastating flood of September 2008

    PubMed Central

    Phalkey, Revati; Dash, Shisir R.; Mukhopadhyay, Alok; Runge-Ranzinger, Silvia; Marx, Michael

    2012-01-01

    Background Early detection of an impending flood and the availability of countermeasures to deal with it can significantly reduce its health impacts. In developing countries like India, public primary health care facilities are frontline organizations that deal with disasters, particularly in rural settings. To develop robust response systems, evaluating preparedness capacities within existing systems becomes necessary. Objective The objective of the study is to assess the functional capacity of the primary health care system in Jagatsinghpur district of rural Orissa, India, to respond to the devastating flood of September 2008. Methods An onsite survey was conducted in all 29 primary and secondary facilities in five rural blocks (administrative units) of Jagatsinghpur district in Orissa state. A pre-tested structured questionnaire was administered face to face in the facilities. The data were entered, processed, and analyzed using STATA® 10. Results Data from our primary survey clearly show that the healthcare facilities are ill-prepared to handle the flood despite facing it annually. Basic utilities like electricity backup and essential medical supplies are lacking during floods. Lack of human resources along with missing standard operating procedures; pre-identified communication and incident command systems; effective leadership; and weak financial structures are the main hindering factors in mounting an adequate response to the floods. Conclusion The 2008 flood challenged the primary curative and preventive health care services in Jagatsinghpur. Simple steps like developing facility-specific preparedness plans which detail standard operating procedures during floods and identify clear lines of command will go a long way in strengthening the response to future floods. Performance critiques provided by the grassroots workers, like this one, should be used for institutional learning and effective preparedness planning. Additionally, each facility should maintain contingency funds for emergency response along with local vendor agreements to ensure stock supplies during floods. The facilities should ensure that baseline public health standards for health care delivery identified by the Government are met in non-flood periods in order to improve the response during floods. Building strong public primary health care systems is a development challenge. The recovery phases of disasters should be seen as an opportunity to expand and improve services and facilities. PMID:22435044

  16. Outcomes of the 'Data Curation for Geobiology at Yellowstone National Park' Workshop

    NASA Astrophysics Data System (ADS)

    Thomer, A.; Palmer, C. L.; Fouke, B. W.; Rodman, A.; Choudhury, G. S.; Baker, K. S.; Asangba, A. E.; Wickett, K.; DiLauro, T.; Varvel, V.

    2013-12-01

    The continuing proliferation of geological and biological data generated at scientifically significant sites (such as hot springs, coral reefs, volcanic fields and other unique, data-rich locales) has created a clear need for the curation and active management of these data. However, there has been little exploration of what these curation processes and policies would entail. To that end, the Site-Based Data Curation (SBDC) project is developing a framework of guidelines and processes for the curation of research data generated at scientifically significant sites. A workshop was held in April 2013 at Yellowstone National Park (YNP) to gather input from scientists and stakeholders. Workshop participants included nine researchers actively conducting geobiology research at YNP, and seven YNP representatives, including permitting staff and information professionals from the YNP research library and archive. Researchers came from a range of research areas -- geology, molecular and microbial biology, ecology, environmental engineering, and science education. Through group discussions, breakout sessions and hands-on activities, we sought to generate policy recommendations and curation guidelines for the collection, representation, sharing and quality control of geobiological datasets. We report on key themes that emerged from workshop discussions, including:
    - participants' broad conceptions of the long-term usefulness, reusability and value of data;
    - the benefits of aggregating site-specific data in general, and geobiological data in particular;
    - the importance of capturing a dataset's originating context, and the potential usefulness of photographs as a reliable and easy way of documenting context;
    - researchers' and resource managers' overlapping priorities with regard to 'big picture' data collection and management in the long term.
    Overall, we found that workshop participants were enthusiastic and optimistic about future collaboration and development of community approaches to data sharing. We hope to continue discussion of geobiology data curation challenges and potential strategies at AGU. Outcomes from the workshop are guiding next steps in the SBDC project, led by investigators at the Center for Informatics Research in Science and Scholarship and Institute for Genomic Biology at the University of Illinois, in collaboration with partners at Johns Hopkins University and YNP.

  17. Ebola Preparedness in the Netherlands: The Need for Coordination Between the Public Health and the Curative Sector.

    PubMed

    Swaan, Corien M; Öry, Alexander V; Schol, Lianne G C; Jacobi, André; Richardus, Jan Hendrik; Timen, Aura

    During the Ebola outbreak in West Africa in 2014-2015, close cooperation between the curative sector and the public health sector in the Netherlands was necessary for timely identification, referral, and investigation of patients with suspected Ebola virus disease (EVD). In this study, we evaluated experiences in preparedness among stakeholders of both the curative and public health sectors to formulate recommendations for optimizing preparedness protocols. Timeliness of referral of patients with suspected EVD was used as an indicator of preparedness. In focus group sessions and semistructured interviews, experiences of curative and public health stakeholders with the regional and national process of preparedness and response were recorded. Timeliness recordings of all referred patients with suspected EVD (13) were collected, from first date of illness until arrival in the referral academic hospital. Ebola preparedness was considered extensive compared with the risk of an actual patient, but nevertheless necessary. Regional coordination varied between regions. More standardization of regional preparation and operational guidelines was requested, as well as nationally standardized contingency criteria, and the National Centre for Infectious Disease Control was expected to coordinate the development of these guidelines. For the timeliness of referred patients with suspected EVD, the median delay between first date of illness and triage was 2.0 days (range: 0-10 days), and between triage and arrival in the referral hospital it was 5.0 hours (range: 2-7.5 hours). Ebola infection was confirmed in none of these patients. Coordination between the public health sector and the curative sector needs improvement to reduce delay in patient management in emerging infectious diseases. Standardization of preparedness and response practices, through guidelines for institutional preparedness and blueprints for regional and national coordination, is necessary, as preparedness for emerging infectious diseases needs a multidisciplinary approach overarching both the public health sector and the curative sector. In the Netherlands, a national platform for preparedness, in which both the curative and public health sectors participate, has been established to implement the outcomes of this study.

  18. ITEP: an integrated toolkit for exploration of microbial pan-genomes.

    PubMed

    Benedict, Matthew N; Henriksen, James R; Metcalf, William W; Whitaker, Rachel J; Price, Nathan D

    2014-01-03

    Comparative genomics is a powerful approach for studying variation in physiological traits as well as the evolution and ecology of microorganisms. Recent technological advances have enabled sequencing large numbers of related genomes in a single project, requiring computational tools for their integrated analysis. In particular, accurate annotations and identification of gene presence and absence are critical for understanding and modeling the cellular physiology of newly sequenced genomes. Although many tools are available to compare the gene contents of related genomes, new tools are necessary to enable close examination and curation of protein families from large numbers of closely related organisms, to integrate curation with the analysis of gain and loss, and to generate metabolic networks linking the annotations to observed phenotypes. We have developed ITEP, an Integrated Toolkit for Exploration of microbial Pan-genomes, to curate protein families, compute similarities to externally defined domains, analyze gene gain and loss, and generate draft metabolic networks from one or more curated reference network reconstructions, in groups of related microbial species in which the combination of core and variable genes constitutes their "pan-genome". The ITEP toolkit consists of: (1) a series of modular command-line scripts for identification, comparison, curation, and analysis of protein families and their distribution across many genomes; (2) a set of Python libraries for programmatic access to the same data; and (3) pre-packaged scripts to perform common analysis workflows on a collection of genomes. ITEP's capabilities include de novo protein family prediction, ortholog detection, analysis of functional domains, identification of core and variable genes and gene regions, sequence alignments and tree generation, annotation curation, and the integration of cross-genome analysis and metabolic networks for the study of metabolic network evolution. ITEP is a powerful, flexible toolkit for the generation and curation of protein families. ITEP's modular design allows for straightforward extension as analysis methods and tools evolve. By integrating comparative genomics with the development of draft metabolic networks, ITEP harnesses the power of comparative genomics to build confidence in links between genotype and phenotype and helps disambiguate gene annotations when they are evaluated in both evolutionary and metabolic network contexts.
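
    Since ITEP exposes Python libraries, the core/variable partitioning at the heart of pan-genome analysis is easy to illustrate. The sketch below is not ITEP's actual API; the function, family and genome names are hypothetical, and it assumes protein families have already been computed.

        # Minimal sketch of core/variable gene partitioning from a
        # presence/absence mapping (hypothetical data, not ITEP's API).
        def partition_pan_genome(presence):
            """presence: dict mapping protein family -> set of genomes carrying it."""
            genomes = set().union(*presence.values())
            core = {fam for fam, gs in presence.items() if gs == genomes}
            return core, set(presence) - core

        families = {
            "famA": {"g1", "g2", "g3"},  # present in every genome -> core
            "famB": {"g1", "g3"},        # missing from g2 -> variable
            "famC": {"g2"},              # genome-specific -> variable
        }
        core, variable = partition_pan_genome(families)
        print(sorted(core), sorted(variable))  # ['famA'] ['famB', 'famC']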

  19. Overview of the gene ontology task at BioCreative IV.

    PubMed

    Mao, Yuqing; Van Auken, Kimberly; Li, Donghui; Arighi, Cecilia N; McQuilton, Peter; Hayman, G Thomas; Tweedie, Susan; Schaeffer, Mary L; Laulederkind, Stanley J F; Wang, Shur-Jen; Gobeill, Julien; Ruch, Patrick; Luu, Anh Tuan; Kim, Jung-Jae; Chiang, Jung-Hsien; Chen, Yu-De; Yang, Chia-Jung; Liu, Hongfang; Zhu, Dongqing; Li, Yanpeng; Yu, Hong; Emadzadeh, Ehsan; Gonzalez, Graciela; Chen, Jian-Ming; Dai, Hong-Jie; Lu, Zhiyong

    2014-01-01

    Gene ontology (GO) annotation is a common task among model organism databases (MODs) for capturing gene function data from journal articles. It is a time-consuming and labor-intensive task, and is thus often considered one of the bottlenecks in literature curation. There is a growing need for semiautomated or fully automated GO curation techniques that will help database curators to rapidly and accurately identify gene function information in full-length articles. Despite multiple attempts in the past, few studies have proven useful with regard to assisting real-world GO curation. The shortage of sentence-level training data and of opportunities for interaction between text-mining developers and GO curators has limited the advances in algorithm development and corresponding use in practical circumstances. To this end, we organized a text-mining challenge task for literature-based GO annotation in BioCreative IV. More specifically, we developed two subtasks: (i) to automatically locate text passages that contain GO-relevant information (a text retrieval task) and (ii) to automatically identify relevant GO terms for the genes in a given article (a concept-recognition task). With support from five MODs, we provided teams with >4000 unique text passages that served as the basis for each GO annotation in our task data. Such evidence text information has long been recognized as critical for text-mining algorithm development but was never made available because of the high cost of curation. In total, seven teams participated in the challenge task. From the team results, we conclude that the state of the art in automatically mining GO terms from literature has improved over the past decade, while much progress is still needed for computer-assisted GO curation. Future work should focus on addressing remaining technical challenges for improved performance of automatic GO concept recognition and on incorporating practical benefits of text-mining tools into real-world GO annotation. http://www.biocreative.org/tasks/biocreative-iv/track-4-GO/. Published by Oxford University Press 2014. This work is written by US Government employees and is in the public domain in the US.
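
    To make subtask (ii) concrete, the toy sketch below recognizes GO concepts by naive dictionary lookup of term names in a passage; real participating systems were far more sophisticated, and the two-term mini-ontology is hypothetical.

        # Toy GO concept recognition: exact-name dictionary lookup
        # (illustration only; real systems handle synonyms, variants, context).
        go_terms = {
            "GO:0006915": "apoptotic process",
            "GO:0016301": "kinase activity",
        }

        def recognize(passage):
            text = passage.lower()
            return [go_id for go_id, name in go_terms.items() if name in text]

        print(recognize("Overexpression of the gene increased kinase activity."))
        # -> ['GO:0016301']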

  20. Southern African Treatment Resistance Network (SATuRN) RegaDB HIV drug resistance and clinical management database: supporting patient management, surveillance and research in southern Africa

    PubMed Central

    Manasa, Justen; Lessells, Richard; Rossouw, Theresa; Naidu, Kevindra; Van Vuuren, Cloete; Goedhals, Dominique; van Zyl, Gert; Bester, Armand; Skingsley, Andrew; Stott, Katharine; Danaviah, Siva; Chetty, Terusha; Singh, Lavanya; Moodley, Pravi; Iwuji, Collins; McGrath, Nuala; Seebregts, Christopher J.; de Oliveira, Tulio

    2014-01-01

    Abstract Substantial amounts of data have been generated from patient management and academic exercises designed to better understand the human immunodeficiency virus (HIV) epidemic and design interventions to control it. A number of specialized databases have been designed to manage huge data sets from HIV cohort, vaccine, host genomic and drug resistance studies. Apart from databases from cohort studies, most of the online databases contain limited curated data and are thus sequence repositories. HIV drug resistance has been shown to have great potential to derail the progress made thus far through antiretroviral therapy. Substantial resources have therefore been invested in generating drug resistance data for patient management and surveillance purposes. Unfortunately, most of the data currently available relate to subtype B, even though >60% of the epidemic is caused by HIV-1 subtype C. A consortium of clinicians, scientists, public health experts and policy makers working in southern Africa came together and formed a network, the Southern African Treatment and Resistance Network (SATuRN), with the aim of increasing curated HIV-1 subtype C and tuberculosis drug resistance data. This article describes the HIV-1 data curation process using the SATuRN Rega database. Data curation is a manual and time-consuming process carried out by clinical, laboratory and data curation specialists. Access to the highly curated data sets is through applications that are reviewed by the SATuRN executive committee. Examples of research outputs from the analysis of the curated data include trends in the level of transmitted drug resistance in South Africa, analysis of the levels of acquired resistance among patients failing therapy, and factors associated with the absence of genotypic evidence of drug resistance among patients failing therapy. All these studies have been important for informing first- and second-line therapy. The database is free, password protected and open source, available on www.bioafrica.net. Database URL: http://www.bioafrica.net/regadb/ PMID:24504151

  1. Rates and Durability of Response to Salvage Radiation Therapy Among Patients With Refractory or Relapsed Aggressive Non-Hodgkin Lymphoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tseng, Yolanda D., E-mail: ydt2@uw.edu; Chen, Yu-Hui; Catalano, Paul J.

    Purpose: To evaluate the response rate (RR) and time to local recurrence (TTLR) among patients who received salvage radiation therapy for relapsed or refractory aggressive non-Hodgkin lymphoma (NHL) and investigate whether RR and TTLR differed according to disease characteristics. Methods and Materials: A retrospective review was performed for all patients who completed a course of salvage radiation therapy between January 2001 and May 2011 at Brigham and Women's Hospital/Dana-Farber Cancer Institute. Separate analyses were conducted for patients treated with palliative and curative intent. Predictors of RR for each subgroup were assessed using a generalized estimating equation model. For patients treated with curative intent, local control (LC) and progression-free survival were estimated with the Kaplan-Meier method; predictors for TTLR were evaluated using a Cox proportional hazards regression model. Results: Salvage radiation therapy was used to treat 110 patients to 121 sites (76 curative, 45 palliative). Salvage radiation therapy was given as part of consolidation in 18% of patients treated with curative intent. Median dose was 37.8 Gy, with 58% and 36% of curative and palliative patients, respectively, receiving 39.6 Gy or higher. The RR was high (86% curative, 84% palliative). With a median follow-up of 4.8 years among living patients, 5-year LC and progression-free survival for curative patients were 66% and 34%, respectively. Refractory disease (hazard ratio 3.3; P=.024) and lack of response to initial chemotherapy (hazard ratio 4.3; P=.007) but not dose (P=.93) were associated with shorter TTLR. Despite doses of 39.6 Gy or higher, 2-year LC was only 61% for definitive patients with refractory disease or disease that did not respond to initial chemotherapy. Conclusions: Relapsed or refractory aggressive NHL is responsive to salvage radiation therapy, and durable LC can be achieved in some cases. However, refractory disease is associated with a shorter TTLR, suggesting that radiation dose escalation, addition of radiosensitizers, or a combination of both may be indicated in these patients.
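
    For readers unfamiliar with the survival methods named here, the sketch below shows the same style of analysis (Kaplan-Meier local control, Cox regression for time to local recurrence) using the open-source lifelines package on a small synthetic table; it does not reproduce the study's data or exact model specification.

        # Kaplan-Meier and Cox PH sketch with synthetic data (lifelines).
        import pandas as pd
        from lifelines import KaplanMeierFitter, CoxPHFitter

        df = pd.DataFrame({
            "years_to_lr": [1.2, 3.4, 0.8, 5.0, 2.2, 4.1],  # follow-up time
            "recurred":    [1,   0,   1,   0,   1,   0],    # 1 = local recurrence
            "refractory":  [1,   0,   1,   0,   0,   1],    # covariate of interest
        })

        kmf = KaplanMeierFitter()
        kmf.fit(df["years_to_lr"], event_observed=df["recurred"])
        print(kmf.survival_function_)       # estimated local control over time

        cph = CoxPHFitter()
        cph.fit(df, duration_col="years_to_lr", event_col="recurred")
        cph.print_summary()                 # hazard ratio for refractory disease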

  2. Managing the data deluge: data-driven GO category assignment improves while complexity of functional annotation increases.

    PubMed

    Gobeill, Julien; Pasche, Emilie; Vishnyakova, Dina; Ruch, Patrick

    2013-01-01

    The available curated data lag behind current biological knowledge contained in the literature. Text mining can assist biologists and curators to locate and access this knowledge, for instance by characterizing the functional profile of publications. Gene Ontology (GO) category assignment in free text already supports various applications, such as powering ontology-based search engines, finding curation-relevant articles (triage) or helping the curator to identify and encode functions. Popular text mining tools for GO classification are based on so-called thesaurus-based (or dictionary-based) approaches, which exploit similarities between the input text and GO terms themselves. But their effectiveness remains limited owing to the complex nature of GO terms, which rarely occur in text. In contrast, machine learning approaches exploit similarities between the input text and already curated instances contained in a knowledge base to infer a functional profile. GO Annotations (GOA) and MEDLINE make it possible to exploit a growing number of curated abstracts (97 000 in November 2012) to populate this knowledge base. Our study compares a state-of-the-art thesaurus-based system with a machine learning system (based on a k-Nearest Neighbours algorithm) for the task of proposing a functional profile for unseen MEDLINE abstracts, and shows how resources and performances have evolved. Systems are evaluated on their ability to propose, for a given abstract, the GO terms (2.8 on average) used for curation in GOA. We show that since 2006, although a massive effort was put into adding synonyms in GO (+300%), the effectiveness of our thesaurus-based system has remained roughly constant, between 0.28 and 0.31 for Recall at 20 (R20). In contrast, thanks to the growth of its knowledge base, our machine learning system has steadily improved, from 0.38 for R20 in 2006 to 0.56 in 2012. Integrated in semi-automatic workflows or in fully automatic pipelines, such systems provide increasingly effective assistance to biologists. DATABASE URL: http://eagl.unige.ch/GOCat/
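
    The k-Nearest Neighbours idea can be sketched in a few lines: represent curated abstracts as TF-IDF vectors and let the GO terms of an unseen abstract's nearest neighbours vote on its functional profile. The corpus and annotations below are hypothetical stand-ins for GOA/MEDLINE, and GOCat itself is a much richer system.

        # k-NN functional profiling sketch (scikit-learn; invented mini-corpus).
        from collections import Counter
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.neighbors import NearestNeighbors

        curated_abstracts = [
            "the kinase phosphorylates the receptor",
            "caspase activation triggers apoptotic cell death",
            "receptor kinase signalling cascade in yeast",
        ]
        curated_go = [["GO:0016301"], ["GO:0006915"], ["GO:0016301"]]

        vec = TfidfVectorizer()
        X = vec.fit_transform(curated_abstracts)
        knn = NearestNeighbors(n_neighbors=2, metric="cosine").fit(X)

        def functional_profile(abstract, top=3):
            _, idx = knn.kneighbors(vec.transform([abstract]))
            votes = Counter(go for i in idx[0] for go in curated_go[i])
            return [go for go, _ in votes.most_common(top)]

        print(functional_profile("a novel receptor kinase pathway"))
        # -> ['GO:0016301']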

  3. Off-site movement of pesticide-contaminated fill from agrichemical facilities during the 1993 flooding in Illinois

    USGS Publications Warehouse

    Roy, W.R.; Chou, S.-F.J.; Krapac, I.G.

    1995-01-01

    Twenty retail agrichemical facilities were flooded. There was a concern that pesticide-contaminated road fill at these facilities had been transported into residential areas by the flooding. Forty fill and flood-related sediment samples were collected at six facilities. No significant accumulation of sediments was present at any of the six facilities. At five of the six facilities, it did not appear that road fill had been transported off-site. Pesticides were detected in sediment samples collected off-site adjacent to five of the facilities. Of the 21 samples collected off-site, atrazine (2-chloro-4-ethylamino-6-isopropylamino-1,3,5-triazine) and metolachlor (2-chloro-6'-ethyl-N-(2-methoxy-1-methylethyl)acet-o-toluidide) were detected in 86 and 81% of the samples, respectively. When compared with on-site concentrations, off-site pesticide concentrations were either at similar levels or as much as three orders of magnitude lower. The interpretation of the pesticide data was difficult and often inconclusive, because there were no background data on the occurrence and distribution of pesticides at each site before flooding.

  4. Design of a solar concentrator considering arbitrary surfaces

    NASA Astrophysics Data System (ADS)

    Jiménez-Rodríguez, Martín.; Avendaño-Alejo, Maximino; Verduzco-Grajeda, Lidia Elizabeth; Martínez-Enríquez, Arturo I.; García-Díaz, Reyes; Díaz-Uribe, Rufino

    2017-10-01

    We study the propagation of light in order to efficiently redirect reflected light onto photocatalytic samples placed inside a commercial solar simulator, and we have designed a small-scale prototype of Cycloidal Collectors (CCs) resembling a compound parabolic collector. The prototype consists of either a cycloidal trough or a rotationally symmetric cycloidal collector, designed by exact ray tracing assuming a bundle of rays propagating parallel to the optical axis and impinging on a curtate cycloidal surface, for which we obtain the caustic surface produced by reflection.

  5. Facility Concepts for Mars Returned Sample Handling

    NASA Technical Reports Server (NTRS)

    Cohen, Marc M.; Briggs, Geoff (Technical Monitor)

    2001-01-01

    Samples returned from Mars must be held in quarantine until their biological safety has been determined. A significant challenge, unique to NASA's needs, is how to contain the samples (to protect the biosphere) while simultaneously protecting their pristine nature. This paper presents a comparative analysis of several quarantine facility concepts for handling and analyzing these samples. The considerations in this design analysis include: modes of manipulation; capability for destructive as well as non-destructive testing; avoidance of cross-contamination; linear versus recursive processing; and sample storage and retrieval within a closed system. The ability to rigorously contain biologically hazardous materials has been amply demonstrated by facilities that meet the specifications of the Centers for Disease Control Biosafety Level 4. The newly defined Planetary Protection Level Alpha must provide comparable containment while assuring that the samples remain pristine; the latter requirement is based on the need to avoid compromising science analyses by instrumentation of the highest possible sensitivity (among other things, this will assure that there is no false-positive detection of organisms or organic molecules - a situation that would delay or prevent the release of the samples from quarantine). Protection of the samples against contamination by terrestrial organisms and organic molecules has a considerable impact on the sample handling facility. The use of glove boxes appears to be impractical because of their tendency to leak and their susceptibility to surges. As a result, a returned-sample quarantine facility must consider the use of automation and remote manipulation to carry out the various functions of sample handling and transfer within the system. The problem of maintaining sensitive and bulky instrumentation under the constraints of simultaneous sample containment and contamination protection also places demands on the architectural configuration of the facility that houses it.

  6. [Determination of aconitine, hypaconitine and mesaconitine in Shenfu injection].

    PubMed

    Zhang, Pan-Pan; Zhang, Jun-Zhen; Wang, Zhao-Hong; Lu, Yong-Jiang; Jiang, Ye

    2013-05-01

    To establish a method for determining the content of the index compounds used to measure aconitine-type alkaloids in Shenfu injection, in order to provide a basis for evaluating the curative effect of monkshood in Shenfu injection. The samples were purified and enriched with hollow-fiber liquid-phase microextraction (HF-LPME). An ACQUITY UPLC BEH C18 column (2.1 mm x 50 mm, 1.7 microm) was adopted and eluted with a gradient program, with acetonitrile-10 mmol x L(-1) NH4HCO3 (pH 10) as the mobile phases. The flow rate was 0.45 mL x min(-1). The content was determined with ESI and MRM. The results showed that aconitine, hypaconitine and mesaconitine each exhibited good linearity (r > 0.999) within the range of 0.1-100 ng x L(-1). The recoveries were 100.1%, 97.4% and 97.5%, with RSDs of 1.2%, 1.1% and 1.5%, respectively. This method was used to verify the safety of Shenfu injection, and provides a scientific basis for the correct evaluation of the curative effect of monkshood, as well as a reliable, simple and practical means of quality control for monkshood-containing Chinese materia medica preparations.
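
    The validation figures quoted (linearity, recovery, RSD) follow standard calibration arithmetic, sketched below with invented instrument responses; the numbers are illustrative only, not the study's data.

        # Calibration-curve and recovery arithmetic sketch (numpy; made-up data).
        import numpy as np

        conc = np.array([0.1, 1, 10, 50, 100])         # standards, ng/L
        resp = np.array([0.9, 10.2, 99.5, 501, 1003])  # detector response (hypothetical)

        slope, intercept = np.polyfit(conc, resp, 1)   # linear calibration
        r = np.corrcoef(conc, resp)[0, 1]
        print(f"r = {r:.4f}")                          # linearity check (r > 0.999)

        spiked = np.array([101.0, 98.0, 99.5])         # replicate responses at 10 ng/L
        found = (spiked - intercept) / slope
        recovery = found / 10 * 100
        print(f"mean recovery {recovery.mean():.1f}%, "
              f"RSD {recovery.std(ddof=1) / recovery.mean() * 100:.1f}%")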

  7. Phylotranscriptomic consolidation of the jawed vertebrate timetree.

    PubMed

    Irisarri, Iker; Baurain, Denis; Brinkmann, Henner; Delsuc, Frédéric; Sire, Jean-Yves; Kupfer, Alexander; Petersen, Jörn; Jarek, Michael; Meyer, Axel; Vences, Miguel; Philippe, Hervé

    2017-09-01

    Phylogenomics is extremely powerful but introduces new challenges, as no agreement exists on "standards" for data selection, curation and tree inference. We use jawed vertebrates (Gnathostomata) as a model to address these issues. Despite considerable efforts in resolving their evolutionary history and macroevolution, few studies have included the full phylogenetic diversity of gnathostomes, and some relationships remain controversial. We tested a novel bioinformatic pipeline to assemble large and accurate phylogenomic datasets from RNA sequencing and find this phylotranscriptomic approach successful and highly cost-effective. Increased sequencing effort up to ca. 10 Gbp allows recovering more genes, but shallower sequencing (1.5 Gbp) is sufficient to obtain thousands of full-length orthologous transcripts. We reconstruct a robust and strongly supported timetree of jawed vertebrates using 7,189 nuclear genes from 100 taxa, including 23 new transcriptomes from previously unsampled key species. Gene jackknifing of genomic data corroborates the robustness of our tree and allows calculating genome-wide divergence times by overcoming gene sampling bias. Mitochondrial genomes prove insufficient to resolve the deepest relationships because of limited signal and among-lineage rate heterogeneity. Our analyses emphasize the importance of large curated nuclear datasets for increasing the accuracy of phylogenomics and provide a reference framework for the evolutionary history of jawed vertebrates.

  8. Facilitators and Barriers to Implementing Clinical Governance: A Qualitative Study among Senior Managers in Iran.

    PubMed

    Ravaghi, Hamid; Rafiei, Sima; Heidarpour, Peigham; Mohseni, Maryam

    2014-09-01

    Health care systems should make quality improvement their main mission. Clinical governance (CG) is a key strategy for improving the quality of health care services. The Iranian Ministry of Health and Medical Education (MOHME) has promoted CG as a framework for safeguarding quality and safety in all hospitals since 2009. The purpose of this study was to explore facilitators of and barriers to implementing CG as perceived by deputies for curative affairs of Iranian medical universities. A qualitative study was conducted using face-to-face interviews with a purposeful sample of 43 deputies for curative affairs of Iranian medical universities, together with document review. Thematic analysis was used to analyze the data. Five themes were explored: knowledge of and attitude toward CG, culture, organizational factors, managerial factors and barriers. The main perceived facilitating factors were adequate knowledge and a positive attitude toward CG, a supportive culture, managers' commitment, effective communication and well-designed incentives. Perceived barriers were the reverse of the facilitators noted above, in addition to insufficient resources, legal challenges, workload and parallel quality programs. Successful implementation of CG in Iran will require identifying the barriers and challenges standing in the way of CG implementation and mitigating them by drawing on the appropriate facilitators.

  9. TEMPUS: A facility for containerless electromagnetic processing onboard spacelab

    NASA Technical Reports Server (NTRS)

    Lenski, H.; Willnecker, R.

    1990-01-01

    The electromagnetic containerless processing facility TEMPUS was recently assigned to a flight on the IML-2 mission. In comparison with the TEMPUS facility already flown on a sounding rocket, several improvements had to be implemented, in particular relating to safety, resource management, and the ability to process different samples with different requirements in one mission. The basic design of this facility as well as the expected processing capabilities are presented. Two operational aspects turned out to strongly influence the facility design: control of sample motion (first experimental results indicate that crew or ground interaction will be necessary to minimize residual sample motions during processing); and exchange of RF coils (during processing in vacuum, evaporated sample materials will condense on the cold surfaces and may force a coil exchange when a critical thickness is exceeded).

  10. Integrating query of relational and textual data in clinical databases: a case study.

    PubMed

    Fisk, John M; Mutalik, Pradeep; Levin, Forrest W; Erdos, Joseph; Taylor, Caroline; Nadkarni, Prakash

    2003-01-01

    The authors designed and implemented a clinical data mart composed of an integrated information retrieval (IR) and relational database management system (RDBMS). Using commodity software, which supports interactive, attribute-centric text and relational searches, the mart houses 2.8 million documents that span a five-year period and supports basic IR features such as Boolean searches, stemming, and proximity and fuzzy searching. Results are relevance-ranked using either "total documents per patient" or "report type weighting." Non-curated medical text has a significant degree of malformation with respect to spelling and punctuation, which creates difficulties for text indexing and searching. Presently, the IR facilities of RDBMS packages lack the features necessary to handle such malformed text adequately. A robust IR+RDBMS system can be developed, but it requires integrating RDBMSs with third-party IR software. RDBMS vendors need to make their IR offerings more accessible to non-programmers.
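
    The integration the authors describe - full-text (IR) search joined to relational attributes - can be sketched with SQLite's FTS5 extension, which ships with most modern Python builds. The schema and rows below are invented, and this is not the authors' actual commodity software stack.

        # IR + RDBMS integration sketch: full-text MATCH joined to a
        # relational filter (SQLite FTS5; hypothetical schema and data).
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
            CREATE TABLE reports(id INTEGER PRIMARY KEY, patient_id, report_type);
            CREATE VIRTUAL TABLE report_text USING fts5(body);
            INSERT INTO reports VALUES (1, 'p42', 'radiology'), (2, 'p42', 'pathology');
            INSERT INTO report_text(rowid, body) VALUES
                (1, 'no acute infiltrate, mild cardiomegaly'),
                (2, 'specimen shows chronic inflammation');
        """)

        rows = con.execute("""
            SELECT r.id, r.report_type
            FROM report_text JOIN reports r ON r.id = report_text.rowid
            WHERE report_text MATCH 'cardiomegaly' AND r.report_type = 'radiology'
        """).fetchall()
        print(rows)  # [(1, 'radiology')]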

  11. Molecular digital pathology: progress and potential of exchanging molecular data.

    PubMed

    Roy, Somak; Pfeifer, John D; LaFramboise, William A; Pantanowitz, Liron

    2016-09-01

    Many of the demands of performing next-generation sequencing (NGS) in the clinical laboratory can be met using the principles of telepathology. Molecular telepathology can allow facilities to outsource all or a portion of their NGS operation, such as cloud computing, bioinformatics pipelines, variant data management, and knowledge curation. Clinical pathology laboratories can electronically share diverse types of molecular data with reference laboratories, technology service providers, and/or regulatory agencies. Exchange of electronic molecular data allows laboratories to perform validation of rare diseases using external data, check the accuracy of their test results against benchmarks, and leverage in silico proficiency testing. This review covers the emerging subject of molecular telepathology, describes clinical use cases for the appropriate exchange of molecular data, and highlights key issues such as data integrity, interoperable formats for massive genomic datasets, security, malpractice and emerging regulations involved in this novel practice.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Einck, John P., E-mail: jeinck@ucsd.edu; Hudson, Alana; Shulman, Adam C.

    West Africa has one of the highest incidence rates of carcinoma of the cervix in the world. The vast majority of women do not have access to screening or disease treatment, leading to presentation at advanced stages and to high mortality rates. Compounding this problem is the lack of radiation treatment facilities in Senegal and many other parts of the African continent. Senegal, a country of 13 million people, had a single 60Co teletherapy unit before our involvement and no brachytherapy capabilities. Radiating Hope, a nonprofit organization whose mission is to provide radiation therapy equipment to countries in the developing world, provided a high-dose-rate afterloading unit to the cancer center for curative cervical cancer treatment. Here we describe the implementation of high-dose-rate brachytherapy in Senegal, requiring a nonstandard fractionation schedule and a novel treatment planning approach, as a possible blueprint for providing this technology to other developing countries.

  13. 16. VIEW OF ROBERT VOGEL, CURATOR, DIVISION OF MECHANICAL & ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. VIEW OF ROBERT VOGEL, CURATOR, DIVISION OF MECHANICAL & CIVIL ENGINEERING, NATIONAL MUSEUM OF AMERICAN HISTORY, SMITHSONIAN INSTITUTION, SITTING IN ELEVATOR CAR. MR. VOGEL IS RESPONSIBLE FOR THE RELOCATION OF THE ELEVATOR TO THE SMITHSONIAN INSTITUTION - 72 Marlborough Street, Residential Hydraulic Elevator, Boston, Suffolk County, MA

  14. Triage by ranking to support the curation of protein interactions

    PubMed Central

    Pasche, Emilie; Gobeill, Julien; Rech de Laval, Valentine; Gleizes, Anne; Michel, Pierre-André; Bairoch, Amos

    2017-01-01

    Abstract Today, molecular biology databases are the cornerstone of knowledge sharing for the life and health sciences. The curation and maintenance of these resources are labour intensive. Although text mining is gaining impetus among curators, its integration in curation workflows has not yet been widely adopted. The Swiss Institute of Bioinformatics Text Mining and CALIPHO groups joined forces to design a new curation support system named neXtA5. In this report, we explore the integration of novel triage services to support the curation of two types of biological data: protein–protein interactions (PPIs) and post-translational modifications (PTMs). The recognition of PPIs and PTMs poses a special challenge, as it requires not only the identification of biological entities (proteins or residues), but also that of particular relationships (e.g. binding or position). These relationships cannot be described with onto-terminological descriptors such as the Gene Ontology for molecular functions, which makes the triage task more challenging. Prioritizing papers for these tasks thus requires the development of different approaches. In this report, we propose a new method to prioritize articles containing information specific to PPIs and PTMs. The new resources (RESTful APIs, a semantically annotated MEDLINE library) enrich the neXtA5 platform. We tuned the article prioritization model on a set of 100 proteins previously annotated by the CALIPHO group. The effectiveness of the triage service was tested with a dataset of 200 annotated proteins. We defined two sets of descriptors to support automatic triage: the first set to enrich for papers with PPI data, and the second for PTMs. All occurrences of these descriptors were marked up in MEDLINE and indexed, thus constituting a semantically annotated version of MEDLINE. These annotations were then used to estimate the relevance of a particular article with respect to the chosen annotation type. This relevance score was combined with a local vector-space search engine to generate a ranked list of PMIDs. We also evaluated a query refinement strategy, which adds specific keywords (such as 'binds' or 'interacts') to the original query. Compared to PubMed, the search effectiveness of the neXtA5 triage service is improved by 190% for the prioritization of papers with PPI information and by 260% for papers with PTM information. Combining advanced retrieval and query refinement strategies with automatically enriched MEDLINE contents is effective for improving triage in complex curation tasks such as the curation of PPIs and PTMs. Database URL: http://candy.hesge.ch/nextA5 PMID:29220432
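
    The two ingredients named here - a vector-space relevance score and query refinement with interaction keywords - can be combined in a few lines. The documents and keyword list below are hypothetical stand-ins for the annotated MEDLINE library, not the neXtA5 implementation.

        # Triage-by-ranking sketch: TF-IDF relevance plus query refinement.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        docs = [
            "protein A binds protein B in a yeast two-hybrid assay",
            "expression of the gene varies across tissues",
            "phosphorylation of Ser15 regulates the interaction",
        ]
        PPI_KEYWORDS = "binds interacts complex two-hybrid"  # refinement terms

        def rank_for_ppi(query):
            vec = TfidfVectorizer().fit(docs)
            refined = query + " " + PPI_KEYWORDS             # query refinement
            sims = cosine_similarity(vec.transform([refined]),
                                     vec.transform(docs))[0]
            return sorted(zip(docs, sims), key=lambda p: -p[1])

        for doc, score in rank_for_ppi("protein A"):
            print(round(float(score), 3), doc)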

  15. Prognostic impacts of postoperative complications in patients with intrahepatic cholangiocarcinoma after curative operations.

    PubMed

    Miyata, Tatsunori; Yamashita, Yo-Ichi; Yamao, Takanobu; Umezaki, Naoki; Tsukamoto, Masayo; Kitano, Yuki; Yamamura, Kensuke; Arima, Kota; Kaida, Takayoshi; Nakagawa, Shigeki; Imai, Katsunori; Hashimoto, Daisuke; Chikamoto, Akira; Ishiko, Takatoshi; Baba, Hideo

    2017-06-01

    Postoperative complications are an indicator of poor prognosis in patients with several gastroenterological cancers after curative operations. Here, we examined the prognostic impact of postoperative complications in patients with intrahepatic cholangiocarcinoma after curative operations. We retrospectively analyzed 60 patients with intrahepatic cholangiocarcinoma who underwent primary curative operations from June 2002 to February 2016. Prognostic impacts of postoperative complications were analyzed using the log-rank test and a Cox proportional hazard model. Postoperative complications (Clavien-Dindo classification grade 3 or more) occurred in 13 patients (21.7%). Overall survival of patients without postoperative complications was significantly better than that of patients with postoperative complications (p = 0.025). Postoperative complications were an independent prognostic factor for overall survival (hazard ratio 3.02; p = 0.030). In addition, bile duct resection and reconstruction (odds ratio 59.1; p = 0.002), hepatitis C virus antibody positivity (odds ratio 7.14; p = 0.022), and lymph node dissection (odds ratio 6.28; p = 0.040) were independent predictors of postoperative complications. Postoperative complications may be an independent predictor of poorer survival in patients with intrahepatic cholangiocarcinoma after curative operations. Lymph node dissection and bile duct resection and reconstruction were risk factors for postoperative complications; particular attention should therefore be paid when performing lymph node dissection or bile duct resection and reconstruction in patients with intrahepatic cholangiocarcinoma.

  16. Correcting Inconsistencies and Errors in Bacterial Genome Metadata Using an Automated Curation Tool in Excel (AutoCurE).

    PubMed

    Schmedes, Sarah E; King, Jonathan L; Budowle, Bruce

    2015-01-01

    Whole-genome data are invaluable for large-scale comparative genomic studies. Current sequencing technologies have made it feasible to sequence entire bacterial genomes with relative ease and speed, at a substantially reduced cost per nucleotide and hence per genome. More than 3,000 bacterial genomes have been sequenced and are available at finished status. Publicly available genomes can be readily downloaded; however, there are challenges in verifying the specific supporting data contained within the download and in identifying errors and inconsistencies that may be present within the organizational data content and metadata. AutoCurE, an automated tool for bacterial genome database curation in Excel, was developed to facilitate local curation of the supporting data that accompany genomes downloaded from the National Center for Biotechnology Information. AutoCurE provides an automated approach to curating local genomic databases by flagging inconsistencies or errors, comparing the downloaded supporting data with the genome reports to verify genome names, RefSeq accession numbers, the presence of archaea, BioProjects/UIDs, and sequence file descriptions. Flags are generated for nine metadata fields if there are inconsistencies between the downloaded genomes and the genome reports, or if erroneous or missing data are evident. AutoCurE is an easy-to-use tool for local database curation of large-scale genome data prior to downstream analyses.
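
    AutoCurE itself runs in Excel, but the flagging logic it describes - field-by-field comparison of downloaded records against a genome report - is simple to sketch. The field names and records below are hypothetical, not AutoCurE's actual nine metadata fields.

        # Metadata consistency flagging sketch (hypothetical fields/records).
        def flag_inconsistencies(downloaded, report, fields):
            flags = []
            for acc, rec in downloaded.items():
                ref = report.get(acc)
                if ref is None:
                    flags.append((acc, "missing from genome report"))
                    continue
                for f in fields:
                    if rec.get(f) != ref.get(f):
                        flags.append((acc, f"mismatch in {f!r}: "
                                           f"{rec.get(f)!r} vs {ref.get(f)!r}"))
            return flags

        downloaded = {"NC_000913": {"name": "Escherichia coli K-12",
                                    "bioproject": "PRJNA57779"}}
        report = {"NC_000913": {"name": "Escherichia coli str. K-12",
                                "bioproject": "PRJNA57779"}}
        for acc, msg in flag_inconsistencies(downloaded, report,
                                             ["name", "bioproject"]):
            print(acc, msg)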

  17. PHI-base: a new interface and further additions for the multi-species pathogen–host interactions database

    PubMed Central

    Urban, Martin; Cuzick, Alayne; Rutherford, Kim; Irvine, Alistair; Pedro, Helder; Pant, Rashmi; Sadanadan, Vidyendra; Khamari, Lokanath; Billal, Santoshkumar; Mohanty, Sagar; Hammond-Kosack, Kim E.

    2017-01-01

    The pathogen–host interactions database (PHI-base) is available at www.phi-base.org. PHI-base contains expertly curated molecular and biological information on genes proven to affect the outcome of pathogen–host interactions reported in peer-reviewed research articles. In addition, literature indicating that specific gene alterations did not affect the disease interaction phenotype is curated, to provide complete datasets for comparative purposes. Viruses are not included. Here we describe a revised PHI-base Version 4 data platform with improved search, filtering and extended data display functions. A PHIB-BLAST search function is provided, together with a link to PHI-Canto, a tool for authors to directly curate their own published data into PHI-base. The new release of PHI-base Version 4.2 (October 2016) has increased data content, with information from 2219 manually curated references. The data provide information on 4460 genes from 264 pathogens tested on 176 hosts in 8046 interactions. Prokaryotic and eukaryotic pathogens are represented in almost equal numbers. Approximately 70% of host species are plants; the remaining 30% are other species of medical and/or environmental importance. Additional data types included in PHI-base 4 are the direct targets of pathogen effector proteins in experimental and natural host organisms. The curation problems encountered and the future directions of the PHI-base project are briefly discussed. PMID:27915230

  18. The BioGRID Interaction Database: 2011 update

    PubMed Central

    Stark, Chris; Breitkreutz, Bobby-Joe; Chatr-aryamontri, Andrew; Boucher, Lorrie; Oughtred, Rose; Livstone, Michael S.; Nixon, Julie; Van Auken, Kimberly; Wang, Xiaodong; Shi, Xiaoqi; Reguly, Teresa; Rust, Jennifer M.; Winter, Andrew; Dolinski, Kara; Tyers, Mike

    2011-01-01

    The Biological General Repository for Interaction Datasets (BioGRID) is a public database that archives and disseminates genetic and protein interaction data from model organisms and humans (http://www.thebiogrid.org). BioGRID currently holds 347 966 interactions (170 162 genetic, 177 804 protein) curated from both high-throughput data sets and individual focused studies, as derived from over 23 000 publications in the primary literature. Complete coverage of the entire literature is maintained for budding yeast (Saccharomyces cerevisiae), fission yeast (Schizosaccharomyces pombe) and thale cress (Arabidopsis thaliana), and efforts to expand curation across multiple metazoan species are underway. The BioGRID houses 48 831 human protein interactions that have been curated from 10 247 publications. Current curation drives are focused on particular areas of biology to enable insights into conserved networks and pathways that are relevant to human health. The BioGRID 3.0 web interface contains new search and display features that enable rapid queries across multiple data types and sources. An automated Interaction Management System (IMS) is used to prioritize, coordinate and track curation across international sites and projects. BioGRID provides interaction data to several model organism databases, resources such as Entrez-Gene and other interaction meta-databases. The entire BioGRID 3.0 data collection may be downloaded in multiple file formats, including PSI MI XML. Source code for BioGRID 3.0 is freely available without any restrictions. PMID:21071413
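
    Because BioGRID's full data collection can be downloaded in flat-file formats, a first analysis is often a simple scan of the interaction table. The sketch below counts interaction partners per gene from a tab-delimited export; the file name and column positions are assumptions - check the header of the actual release file before use.

        # Count interactions per gene from a tab-delimited BioGRID-style
        # export (assumed layout: interactor columns at positions 0 and 1).
        import csv
        from collections import Counter

        def partners_per_gene(path, col_a=0, col_b=1):
            counts = Counter()
            with open(path, newline="") as fh:
                reader = csv.reader(fh, delimiter="\t")
                next(reader)                 # skip the header row
                for row in reader:
                    counts[row[col_a]] += 1
                    counts[row[col_b]] += 1
            return counts

        # counts = partners_per_gene("BIOGRID-ALL.tab.txt")  # hypothetical file
        # print(counts.most_common(10))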

  19. Metavir and FIB-4 scores are associated with patient prognosis after curative hepatectomy in hepatitis B virus-related hepatocellular carcinoma: a retrospective cohort study at two centers in China.

    PubMed

    Liao, Rui; Fu, Yi-Peng; Wang, Ting; Deng, Zhi-Gang; Li, De-Wei; Fan, Jia; Zhou, Jian; Feng, Gen-Sheng; Qiu, Shuang-Jian; Du, Cheng-You

    2017-01-03

    Although Metavir and Fibrosis-4 (FIB-4) scores are typically used to assess the severity of liver fibrosis, the relationship between these scores and patient outcome in hepatocellular carcinoma (HCC) is unclear. The aim of this study was to evaluate the prognostic value of the severity of hepatic fibrosis in HBV-related HCC patients after curative resection. We examined the prognostic roles of the Metavir and preoperative FIB-4 scores in 432 HBV-HCC patients who underwent curative resection at two different medical centers located in western (Chongqing) and eastern (Shanghai) China. In the testing set (n = 108), the Metavir, FIB-4, and combined Metavir/FIB-4 scores were predictive of overall survival (OS) and recurrence-free survival (RFS). Additionally, they were associated with several clinicopathologic variables. In the validation set (n = 324), the Metavir, FIB-4, and combined Metavir/FIB-4 scores were associated with poor prognosis in HCC patients after curative resection. Importantly, in the negative alpha-fetoprotein subgroup (≤ 20 ng/mL), the FIB-4 index (I vs. II) could discriminate between patient outcomes (high or low OS and RFS). Thus Metavir, preoperative FIB-4, and combined Metavir/FIB-4 scores are prognostic markers in HBV-HCC patients after curative hepatectomy.
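
    The FIB-4 score used throughout the study has a standard published formula (Sterling et al., 2006): FIB-4 = (age [years] x AST [U/L]) / (platelet count [10^9/L] x sqrt(ALT [U/L])). A minimal calculator is shown below; the class I/II cutoff applied by the authors is not reproduced here, and the example values are invented.

        # FIB-4 index calculator (standard formula; example values invented).
        import math

        def fib4(age_years, ast, alt, platelets_10e9_per_l):
            return (age_years * ast) / (platelets_10e9_per_l * math.sqrt(alt))

        print(round(fib4(55, 48, 40, 150), 2))  # -> 2.78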

  20. Metavir and FIB-4 scores are associated with patient prognosis after curative hepatectomy in hepatitis B virus-related hepatocellular carcinoma: a retrospective cohort study at two centers in China

    PubMed Central

    Li, De-Wei; Fan, Jia; Zhou, Jian; Feng, Gen-Sheng; Qiu, Shuang-Jian; Du, Cheng-You

    2017-01-01

    Although Metavir and Fibrosis-4 (FIB-4) scores are typically used to assess the severity of liver fibrosis, the relationship between these scores and patient outcome in hepatocellular carcinoma (HCC) is unclear. The aim of this study was to evaluate the prognostic value of the severity of hepatic fibrosis in HBV-related HCC patients after curative resection. We examined the prognostic roles of the Metavir and preoperative FIB-4 scores in 432 HBV-HCC patients who underwent curative resection at two different medical centers located in western (Chongqing) and eastern (Shanghai) China. In the testing set (n = 108), the Metavir, FIB-4, and combined Metavir/FIB-4 scores were predictive of overall survival (OS) and recurrence-free survival (RFS). Additionally, they were associated with several clinicopathologic variables. In the validation set (n = 324), the Metavir, FIB-4, and combined Metavir/FIB-4 scores were associated with poor prognosis in HCC patients after curative resection. Importantly, in the negative alpha-fetoprotein subgroup (≤ 20 ng/mL), the FIB-4 index (I vs. II) could discriminate between patient outcomes (high or low OS and RFS). Thus Metavir, preoperative FIB-4, and combined Metavir/FIB-4 scores are prognostic markers in HBV-HCC patients after curative hepatectomy. PMID:27662665

  1. Lessons learned from surface wipe sampling for lead in three workplaces.

    PubMed

    Beaucham, Catherine; Ceballos, Diana; King, Bradley

    2017-08-01

    Surface wipe sampling in the occupational environment is a technique widely used by industrial hygienists. Although several organizations have promulgated standards for sampling lead and other metals, uncertainty still exists when trying to determine an appropriate wipe sampling strategy and how to interpret sampling results. Investigators from the National Institute for Occupational Safety and Health (NIOSH) Health Hazard Evaluation Program have used surface wipe sampling as part of their exposure assessment sampling strategies in a wide range of workplaces. This article discusses wipe sampling for measuring lead on surfaces in three facilities: (1) a battery recycling facility; (2) a firing range and gun store; and (3) an electronic scrap recycling facility. We summarize our findings from the facilities and what we learned by integrating wipe sampling into our sampling plan. Wipe sampling demonstrated lead on non-production surfaces in all three workplaces, indicating the potential for employees to take lead home to their families. We also found that the presence of metals such as tin can interfere with colorimetric results. We also discuss the advantages and disadvantages of colorimetric analysis of surface wipe samples and the challenges we faced when interpreting wipe sampling results.

  2. Phase II clinical trial of local use of GM-CSF for prevention and treatment of chemotherapy- and concomitant chemoradiotherapy-induced severe oral mucositis in advanced head and neck cancer patients: an evaluation of effectiveness, safety and costs.

    PubMed

    Mantovani, Giovanni; Massa, Elena; Astara, Giorgio; Murgia, Viviana; Gramignano, Giulia; Lusso, Maria Rita; Camboni, Paolo; Ferreli, Luca; Mocci, Miria; Perboni, Simona; Mura, Loredana; Madeddu, Clelia; Macciò, Antonio

    2003-01-01

    In the present open, non-randomized phase II study we evaluated the effectiveness, safety, tolerability and costs of locally applied GM-CSF for preventing or treating mucositis in patients receiving chemotherapy or chemoradiotherapy for head and neck cancer. In addition to a clinical mucositis scoring system, the effects of treatment with GM-CSF were evaluated through its impact on patient quality of life and through laboratory immunological assays such as serum proinflammatory cytokines, IL-2 and leptin. The trial was designed to assess the effectiveness of local GM-CSF treatment in two different settings: i) prophylaxis of mucositis; ii) treatment of mucositis. Prophylaxis was chosen for chemoradiotherapy regimens of high mucosotoxic potential, while curative treatment was reserved for chemotherapy or chemoradiotherapy regimens with a lesser potential for inducing mucositis. From January 1998 to December 2001, 68 patients entered the study. The great majority of patients in both groups had head and neck cancer, were stage IV, PS ECOG 0-1, were habitual smokers and were treated with chemotherapy and concomitant (or sequential) chemoradiotherapy. Forty-six patients were included in the 'prophylactic' setting and 22 patients in the 'curative' setting. The main findings of our study are: only 50% of patients included in the 'prophylactic' setting developed mucositis; the duration of oral mucositis from appearance until complete remission was significantly shorter in the 'prophylactic' than in the 'curative' setting; the mean grade of oral mucositis at baseline, on day 3 of therapy and on day 6 of therapy was significantly lower in the 'prophylactic' than in the 'curative' setting; 24 (55.82%) patients in the 'prophylactic' setting had grade 3/4 oral mucositis at baseline compared to 25 (80.60%) patients in the 'curative' setting (p=0.048); 13 (30.23%) patients in the 'prophylactic' setting had grade 3/4 oral mucositis on day 3 of therapy compared to 19 (61.29%) patients in the 'curative' setting (p=0.015); and the 'prophylactic' setting shortened grade 3/4 oral mucositis to grade 0/1 more effectively than the 'curative' one on day 6 of therapy (p=0.05). The present clinical trial is to date by far the largest study assessing the effectiveness of topical GM-CSF, and it is the first study comparing the efficacy of topical GM-CSF in the 'prophylactic' setting, i.e., with the aim of preventing chemoradiotherapy-induced oral mucositis, with that in the 'curative' setting, i.e., therapy for established oral mucositis. Topical application of GM-CSF was demonstrated to be effective for oral mucositis induced by chemotherapy and chemoradiotherapy regimens. Moreover, the 'prophylactic' setting was demonstrated to be more effective than the 'curative' one.

  3. Changing the Curation Equation: A Data Lifecycle Approach to Lowering Costs and Increasing Value

    NASA Astrophysics Data System (ADS)

    Myers, J.; Hedstrom, M.; Plale, B. A.; Kumar, P.; McDonald, R.; Kooper, R.; Marini, L.; Kouper, I.; Chandrasekar, K.

    2013-12-01

    What if everything that researchers know about their data, and everything their applications know, were directly available to curators? What if all the information that data consumers discover and infer about data were also available? What if curation and preservation activities occurred incrementally, during research projects instead of after they end, and could be leveraged to make it easier to manage research data from the moment of its creation? These are questions that the Sustainable Environments - Actionable Data (SEAD) project, funded as part of the National Science Foundation's DataNet partnership, was designed to answer. Data curation is challenging, but it is made more difficult by the historical separation of data production, data use, and formal curation activities across organizations, locations, and applications, and across time. Modern computing and networking technologies allow a much different approach in which data and metadata can easily flow between these activities throughout the data lifecycle, and in which heterogeneous and evolving data and metadata can be managed. Sustainability research, SEAD's initial focus area, is a clear example of an area where the nature of the research (cross-disciplinary, integrating heterogeneous data from independent sources, small teams, rapid evolution of sensing and analysis techniques) and the barriers and costs inherent in traditional methods have limited adoption of existing curation tools and techniques, to the detriment of overall scientific progress. To explore these ideas and create a sustainable curation capability for communities such as sustainability research, the SEAD team has developed and is now deploying an interacting set of open source data services that demonstrate this approach. These services provide end-to-end support for management of data during research projects; publication of that data into long-term archives; and integration of it into community networks of publications, research center activities, and synthesis efforts. They build on a flexible 'semantic content management' architecture and incorporate notions of 'active' and 'social' curation - continuous, incremental curation activities performed by the data producers (active) and the community (social) that are motivated by a range of direct benefits. Examples include the use of metadata (tags) to allow generation of custom geospatial maps, automated metadata extraction to generate rich data pages for known formats, and the use of information about data authorship to allow automatic updates of personal and project research profiles when data is published. In this presentation, we describe the core capabilities of SEAD's services and their application in sustainability research. We also outline the key features of the SEAD architecture - the use of global semantic identifiers, extensible data and metadata models, web services to manage context shifts, scalable cloud storage - and highlight how this approach is particularly well suited to extension by independent third parties. We conclude with thoughts on how this approach can be applied to challenging issues such as exposing 'dark' data and reducing duplicate creation of derived data products, and can provide a new level of analytics for community analysis and coordination.

  4. BioData: a national aquatic bioassessment database

    USGS Publications Warehouse

    MacCoy, Dorene

    2011-01-01

    BioData is a U.S. Geological Survey (USGS) web-enabled database that for the first time provides for the capture, curation, integration, and delivery of bioassessment data collected by local, regional, and national USGS projects. BioData offers field biologists advanced capabilities for entering, editing, and reviewing the macroinvertebrate, algae, fish, and supporting habitat data from rivers and streams. It offers data archival and curation capabilities that protect and maintain data for the long term. BioData provides the Federal, State, and local governments, as well as the scientific community, resource managers, the private sector, and the public with easy access to tens of thousands of samples collected nationwide from thousands of stream and river sites. BioData also provides the USGS with centralized data storage for delivering data to other systems and applications through automated web services. BioData allows users to combine data sets of known quality from different projects in various locations over time. It provides a nationally aggregated database for users to leverage data from many independent projects that, until now, was not feasible at this scale. For example, from 1991 to 2011, the USGS Idaho Water Science Center collected more than 816 bioassessment samples from 63 sites for the National Water Quality Assessment (NAWQA) Program and more than 477 samples from 39 sites for a cooperative USGS and State of Idaho Statewide Water Quality Network (fig. 1). Using BioData, 20 years of samples collected for both of these projects can be combined for analysis. BioData delivers all of the data using current taxonomic nomenclature, thus relieving users of the difficult and time-consuming task of harmonizing taxonomy among samples collected during different time periods. Fish data are reported using the Integrated Taxonomic Information Service (ITIS) Taxonomic Serial Numbers (TSN's). A simple web-data input interface and self-guided, public data-retrieval web site provides access to bioassessment data. BioData currently accepts data collected using two national protocols: (1) NAWQA and (2) U.S. Environmental Protection Agency (USEPA) National Rivers and Streams Assessment (NRSA). Additional collection protocols are planned for future versions.

  5. The Palomar Transient Factory: High Quality Realtime Data Processing in a Cost-Constrained Environment

    NASA Astrophysics Data System (ADS)

    Surace, J.; Laher, R.; Masci, F.; Grillmair, C.; Helou, G.

    2015-09-01

    The Palomar Transient Factory (PTF) is a synoptic sky survey in operation since 2009. PTF utilizes a 7.1 square degree camera on the Palomar 48-inch Schmidt telescope to survey the sky primarily at a single wavelength (R-band) at a rate of 1000-3000 square degrees a night. The data are used to detect and study transient and moving objects such as gamma ray bursts, supernovae and asteroids, as well as variable phenomena such as quasars and Galactic stars. The data processing system at IPAC handles realtime processing and detection of transients, solar system object processing, high photometric precision processing and light curve generation, and long-term archiving and curation. This was developed under an extremely limited budget profile in an unusually agile development environment. Here we discuss the mechanics of this system and our overall development approach. Although a significant scientific installation in and of itself, PTF also serves as the prototype for our next generation project, the Zwicky Transient Facility (ZTF). Beginning operations in 2017, ZTF will feature a 50 square degree camera which will enable scanning of the entire northern visible sky every night. ZTF in turn will serve as a stepping stone to the Large Synoptic Survey Telescope (LSST), a major NSF facility scheduled to begin operations in the early 2020s.

  6. Irradiation treatment for the protection and conservation of cultural heritage artefacts in Croatia

    NASA Astrophysics Data System (ADS)

    Katušin-Ražem, Branka; Ražem, Dušan; Braun, Mario

    2009-07-01

    The application of irradiation treatment for the protection of cultural heritage artefacts in Croatia was made possible by the development of radiation processing procedures at the Radiation Chemistry and Dosimetry Laboratory of the Ruđer Bošković Institute. After the upgrading of the 60Co gamma irradiation source in the panoramic irradiation facility in 1983, it became possible to perform both research and pilot plant-scale irradiations for sterilization, pasteurization and decontamination of various materials, including medical supplies, pharmaceuticals, cosmetics and foods, as well as for disinfestation of cultural heritage artefacts. The demand for irradiation treatment of cultural heritage objects increased in particular as growing numbers of these objects, especially polychrome wooden sculptures, required salvage, restoration and conservation as a consequence of the direct and indirect damage inflicted on them during the war in Croatia, 1991-1995. The irradiation facility at the Ruđer Bošković Institute is briefly described and an account of its fifteen years of activities in the irradiation treatment of cultural heritage objects is given. Some case studies performed in cooperation with the Croatian Conservation Institute and other interested parties are presented, as well as some cases of protective and curative treatments for disinfestation and decontamination. International cooperation and activities are also mentioned.

  7. Effects of beam interruption time on tumor control probability in single-fractionated carbon-ion radiotherapy for non-small cell lung cancer

    NASA Astrophysics Data System (ADS)

    Inaniwa, T.; Kanematsu, N.; Suzuki, M.; Hawkins, R. B.

    2015-05-01

    Carbon-ion radiotherapy treatment plans are designed on the assumption that the beams are delivered instantaneously, irrespective of the actual dose-delivery time structure in a treatment session. As the beam lines are fixed in the vertical and horizontal directions at our facility, beam delivery is interrupted in multi-field treatment due to the necessity of patient repositioning within the fields. Single-fractionated treatment for non-small cell lung cancer (NSCLC) is such a case, in which four treatment fields in multiple directions are delivered in one session with patient repositioning during the session. The purpose of this study was to investigate the effects of the period of dose delivery, including interruptions due to patient repositioning, on tumor control probability (TCP) of NSCLC. All clinical doses were weighted by relative biological effectiveness (RBE) evaluated for instantaneous irradiation. The rate equations defined in the microdosimetric kinetic model (MKM) for primary lesions induced in DNA were applied to the single-fractionated treatment of NSCLC. Treatment plans were made for an NSCLC case for various prescribed doses ranging from 25 to 50 Gy (RBE), on the assumption of instantaneous beam delivery. These plans were recalculated by varying the interruption time τ ranging from 0 to 120 min between the second and third fields for continuous irradiations of 3 min per field based on the MKM. The curative doses that would result in a TCP of 90% were deduced for the respective interruption times. The curative dose was 34.5 Gy (RBE) for instantaneous irradiation and 36.6 Gy (RBE), 39.2 Gy (RBE), 41.2 Gy (RBE), 43.3 Gy (RBE) and 44.4 Gy (RBE) for τ = 0 min, 15 min, 30 min, 60 min and 120 min, respectively. The realistic biological effectiveness of the therapeutic carbon-ion beam decreased with increasing interruption time. These data suggest that the curative dose can increase by 20% or more compared to the planned dose if the interruption time extends to 30 min or longer. These effects should be considered in carbon-ion radiotherapy treatment planning if a longer dose-delivery procedure time is anticipated.
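
    The reported dose escalations can be restated as relative increases over the instantaneous-delivery plan; the short calculation below uses only the values quoted in the abstract.

```python
# Relative increase in curative dose versus interruption time tau,
# computed from the abstract's reported values (doses in Gy (RBE)).
D_INSTANT = 34.5
CURATIVE = {0: 36.6, 15: 39.2, 30: 41.2, 60: 43.3, 120: 44.4}

for tau_min, dose in CURATIVE.items():
    pct = 100.0 * (dose - D_INSTANT) / D_INSTANT
    print(f"tau = {tau_min:3d} min: {dose:.1f} Gy (RBE), +{pct:.1f}%")
# tau = 30 min gives ~+19%, matching the ~20% figure quoted above.
```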

  8. Competency-Based Curriculum: An Effective Approach to Digital Curation Education

    ERIC Educational Resources Information Center

    Kim, Jeonghyun

    2015-01-01

    The University of North Texas conducted a project involving rigorous curriculum development and instructional design to address the goal of building capacity in the Library and Information Sciences curriculum. To prepare information professionals with the competencies needed for digital curation and data management practice, the project developed…

  9. CARD 2017: expansion and model-centric curation of the Comprehensive Antibiotic Resistance Database

    USDA-ARS?s Scientific Manuscript database

    The Comprehensive Antibiotic Resistance Database (CARD; http://arpcard.mcmaster.ca) is a manually curated resource containing high quality reference data on the molecular basis of antimicrobial resistance (AMR), with an emphasis on the genes, proteins, and mutations involved in AMR. CARD is ontologi...

  10. 7 CFR 504.4 - Exemptions from user fee charges.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... OF AGRICULTURE USER FEES § 504.4 Exemptions from user fee charges. (a) USDA laboratories and ARS cooperators designated by the Curator of the ARS Patent Culture Collection are exempt from fee assessments. (b) The Curator of the ARS Patent Culture Collection is delegated the authority to approve and revoke...

  11. 7 CFR 504.4 - Exemptions from user fee charges.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... OF AGRICULTURE USER FEES § 504.4 Exemptions from user fee charges. (a) USDA laboratories and ARS cooperators designated by the Curator of the ARS Patent Culture Collection are exempt from fee assessments. (b) The Curator of the ARS Patent Culture Collection is delegated the authority to approve and revoke...

  12. 7 CFR 504.4 - Exemptions from user fee charges.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... OF AGRICULTURE USER FEES § 504.4 Exemptions from user fee charges. (a) USDA laboratories and ARS cooperators designated by the Curator of the ARS Patent Culture Collection are exempt from fee assessments. (b) The Curator of the ARS Patent Culture Collection is delegated the authority to approve and revoke...

  13. 7 CFR 504.4 - Exemptions from user fee charges.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... OF AGRICULTURE USER FEES § 504.4 Exemptions from user fee charges. (a) USDA laboratories and ARS cooperators designated by the Curator of the ARS Patent Culture Collection are exempt from fee assessments. (b) The Curator of the ARS Patent Culture Collection is delegated the authority to approve and revoke...

  14. 7 CFR 504.4 - Exemptions from user fee charges.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... OF AGRICULTURE USER FEES § 504.4 Exemptions from user fee charges. (a) USDA laboratories and ARS cooperators designated by the Curator of the ARS Patent Culture Collection are exempt from fee assessments. (b) The Curator of the ARS Patent Culture Collection is delegated the authority to approve and revoke...

  15. The art of curation at a biological database: principles and application

    USDA-ARS?s Scientific Manuscript database

    The variety and quantity of data being produced by biological research has grown dramatically in recent years, resulting in an expansion of our understanding of biological systems. However, this abundance of data has brought new challenges, especially in curation. The role of biocurators is in part ...

  16. Curating and Nudging in Virtual CLIL Environments

    ERIC Educational Resources Information Center

    Nielsen, Helle Lykke

    2014-01-01

    Foreign language teachers can benefit substantially from the notions of curation and nudging when scaffolding CLIL activities on the internet. This article shows how these principles can be integrated into CLILstore, a free multimedia-rich learning tool with seamless access to online dictionaries, and presents feedback from first and second year…

  17. Kids as Curators: Virtual Art at the Seattle Museum.

    ERIC Educational Resources Information Center

    Scanlan, Laura Wolff

    2000-01-01

    Discusses the use of technology at the Seattle Art Museum (Washington). Includes a Web site that enables students in grades six through ten to act as curators and offers integrations of technology in the exhibition "Leonardo Lives: The Codex Leicester and Leonardo da Vinci's Legacy of Art and Science." (CMK)

  18. Microbial Condition of Water Samples from Foreign Fuel Storage Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, C.J.; Fliermans, C.B.; Santo Domingo, J.

    1997-10-30

    In order to assess the microbial condition of foreign nuclear fuel storage facilities, fourteen different water samples were received from facilities outside the United States that have sent spent nuclear fuel to SRS for wet storage. Each water sample was analyzed for microbial content and activity as determined by total bacteria, viable aerobic bacteria, viable anaerobic bacteria, viable sulfate-reducing bacteria, viable acid-producing bacteria and enzyme diversity. The results for each water sample were then compared to the other foreign samples and to data from the Receiving Basin for Offsite Fuel (RBOF) at SRS.

  19. 300 Area treated effluent disposal facility sampling schedule

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loll, C.M.

    1994-10-11

    This document is the interface between the 300 Area Liquid Effluent Process Engineering (LEPE) group and the Waste Sampling and Characterization Facility (WSCF), concerning process control samples. It contains a schedule for process control samples at the 300 Area TEDF which describes the parameters to be measured, the frequency of sampling and analysis, the sampling point, and the purpose for each parameter.

  20. Rescue and Preservation of Sample Data from the Apollo Missions to the Moon

    NASA Technical Reports Server (NTRS)

    Todd, Nancy S.; Zeigler, Ryan A.; Evans, Cindy A.; Lehnert, Kerstin

    2016-01-01

    Six Apollo missions landed on the Moon from 1969-72, returning to Earth 382 kg of lunar rock, soil, and core samples. These samples are among the best documented and preserved samples on Earth that have supported a robust research program for 45 years. From mission planning through sample collection, preliminary examination, and subsequent research, strict protocols and procedures are followed for handling and allocating Apollo subsamples, resulting in the production of vast amounts of documentation. Even today, hundreds of samples are allocated for research each year, building on the science foundation laid down by the early Apollo sample studies and combining new data from today's instrumentation, lunar remote sensing missions and lunar meteorites. Much sample information is available to researchers at curator.jsc.nasa.gov. Decades of analyses on lunar samples are published in LPSC proceedings volumes and other peer-reviewed journals, and tabulated in lunar sample compendia entries. However, for much of the 1969-1995 period, the processing documentation, individual and consortia analyses, and unpublished results exist only in analog forms or primitive digital formats that are either inaccessible or at risk of being lost forever because critical data from early investigators remain unpublished.

  1. A new AMS facility at Inter University Accelerator Centre, New Delhi

    NASA Astrophysics Data System (ADS)

    Kumar, Pankaj; Chopra, S.; Pattanaik, J. K.; Ojha, S.; Gargari, S.; Joshi, R.; Kanjilal, D.

    2015-10-01

    Inter University Accelerator Centre (IUAC), a national facility of the government of India, operates a 15UD Pelletron accelerator for multidisciplinary ion-beam-based research programs. Recently, a new accelerator mass spectrometry (AMS) facility has been developed by incorporating many changes in the existing 15UD Pelletron accelerator. A clean chemistry laboratory for 10Be and 26Al with modern facilities has also been developed for the chemical processing of samples. 10Be measurements on sediment samples, inter-laboratory comparison results and 26Al measurements on standard samples are presented in this paper. In addition to the 10Be and 26Al AMS facilities, a new 14C AMS facility based on a dedicated 500 kV tandem ion accelerator with two cesium sputter ion sources is also being set up at IUAC.

  2. Curation of US Martian Meteorites Collected in Antarctica

    NASA Technical Reports Server (NTRS)

    Lindstrom, M.; Satterwhite, C.; Allton, J.; Stansbury, E.

    1998-01-01

    To date the ANSMET field team has collected five martian meteorites (see below) in Antarctica and returned them for curation at the Johnson Space Center (JSC) Meteorite Processing Laboratory (MPL). The meteorites were collected with the clean procedures used by ANSMET in collecting all meteorites: they were handled with JSC-cleaned tools, packaged in clean bags, and shipped frozen to JSC. The five martian meteorites vary significantly in size (12-7942 g) and rock type (basalts, lherzolites, and orthopyroxenite). Detailed descriptions are provided in the Mars Meteorite Compendium, which describes classification, curation and research results. A table gives the names, classifications and original and curatorial masses of the martian meteorites. The MPL and measures for contamination control are described.

  3. Curated protein information in the Saccharomyces genome database.

    PubMed

    Hellerstedt, Sage T; Nash, Robert S; Weng, Shuai; Paskov, Kelley M; Wong, Edith D; Karra, Kalpana; Engel, Stacia R; Cherry, J Michael

    2017-01-01

    Due to recent advancements in the production of experimental proteomic data, the Saccharomyces genome database (SGD; www.yeastgenome.org) has been expanding our protein curation activities to make new data types available to our users. Because of broad interest in post-translational modifications (PTM) and their importance to protein function and regulation, we have recently started incorporating expertly curated PTM information on individual protein pages. Here we also present the inclusion of new abundance and protein half-life data obtained from high-throughput proteome studies. These new data types have been included with the aim of facilitating cellular biology research. Database URL: www.yeastgenome.org

  4. Waste Sampling & Characterization Facility (WSCF) Complex Safety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MELOY, R.T.

    2002-04-01

    This document was prepared to analyze the Waste Sampling and Characterization Facility for safety consequences by: Determining radionuclide and highly hazardous chemical inventories; Comparing these inventories to the appropriate regulatory limits; Documenting the compliance status with respect to these limits; and Identifying the administrative controls necessary to maintain this status. The primary purpose of the Waste Sampling and Characterization Facility (WSCF) is to perform low-level radiological and chemical analyses on various types of samples taken from the Hanford Site. These analyses will support the fulfillment of federal, Washington State, and Department of Energy requirements.

  5. The evolutionary history of ferns inferred from 25 low-copy nuclear genes.

    PubMed

    Rothfels, Carl J; Li, Fay-Wei; Sigel, Erin M; Huiet, Layne; Larsson, Anders; Burge, Dylan O; Ruhsam, Markus; Deyholos, Michael; Soltis, Douglas E; Stewart, C Neal; Shaw, Shane W; Pokorny, Lisa; Chen, Tao; dePamphilis, Claude; DeGironimo, Lisa; Chen, Li; Wei, Xiaofeng; Sun, Xiao; Korall, Petra; Stevenson, Dennis W; Graham, Sean W; Wong, Gane K-S; Pryer, Kathleen M

    2015-07-01

    Understanding fern (monilophyte) phylogeny and its evolutionary timescale is critical for broad investigations of the evolution of land plants, and for providing the point of comparison necessary for studying the evolution of the fern sister group, seed plants. Molecular phylogenetic investigations have revolutionized our understanding of fern phylogeny; however, to date, these studies have relied almost exclusively on plastid data. Here we take a curated phylogenomics approach to infer the first broad fern phylogeny from multiple nuclear loci, by combining broad taxon sampling (73 ferns and 12 outgroup species) with focused character sampling (25 loci comprising 35877 bp), along with rigorous alignment, orthology inference and model selection. Our phylogeny corroborates some earlier inferences and provides novel insights; in particular, we find strong support for Equisetales as sister to the rest of ferns, Marattiales as sister to leptosporangiate ferns, and Dennstaedtiaceae as sister to the eupolypods. Our divergence-time analyses reveal that divergences among the extant fern orders all occurred prior to ∼200 MYA. Finally, our species-tree inferences are congruent with analyses of concatenated data, but generally with lower support. Those cases where species-tree support values are higher than expected involve relationships that have been supported by smaller plastid datasets, suggesting that deep coalescence may be reducing support from the concatenated nuclear data. Our study demonstrates the utility of a curated phylogenomics approach to inferring fern phylogeny, and highlights the need to consider underlying data characteristics, along with data quantity, in phylogenetic studies.

  6. Adventitious Carbon on Primary Sample Containment Metal Surfaces

    NASA Technical Reports Server (NTRS)

    Calaway, M. J.; Fries, M. D.

    2015-01-01

    Future missions that return astromaterials with trace carbonaceous signatures will require strict protocols for reducing and controlling terrestrial carbon contamination. Adventitious carbon (AC) on primary sample containers and related hardware is an important source of that contamination. AC is a thin-film layer or heterogeneously dispersed carbonaceous material that naturally accrues from the environment on the surfaces of metal parts exposed to the atmosphere. To test basic cleaning techniques for AC control, metal surfaces commonly used for flight hardware and for curating astromaterials at JSC were cleaned using a basic cleaning protocol and characterized for AC residue. Two electropolished stainless steel 316L (SS-316L) and two Al 6061 (Al-6061) test coupons (2.5 cm diameter by 0.3 cm thick) were subjected to precision cleaning in the JSC Genesis ISO class 4 cleanroom Precision Cleaning Laboratory. Afterwards, the samples were analyzed by X-ray photoelectron spectroscopy (XPS) and Raman spectroscopy.

  7. Three Proposed Compendia for Genesis Solar Wind Samples: Science Results, Collector Materials Characterization and Cleaning Techniques

    NASA Technical Reports Server (NTRS)

    Allton, J. H.; Calaway, M. J.; Nyquist, L. E.; Jurewicz, A. J. G.; Burnett, D. S.

    2018-01-01

    Introduction: Planetary material and cosmochemistry research using Genesis solar wind samples (including the development and implementation of cleaning and analytical techniques) has matured sufficiently that compilations on several topics, if made publicly accessible, would be beneficial for researchers and reviewers. We propose here three compendia based on content, organization and source of documents (e.g., published peer-reviewed documents, internal memos, archives). For planning purposes, suggestions are solicited from potential users of Genesis solar wind samples for the type of science content and/or organizational style that would be most useful to them. These compendia are proposed as living documents, periodically updated. Similar to the existing compendia described below, the curation compendia are like library or archival finding aids: they are guides to published or archival documents and should not be cited as primary sources.

  8. Science-Ready Data Production in the DKIST Data Center

    NASA Astrophysics Data System (ADS)

    Reardon, Kevin; Berukoff, Steven; Hays, Tony; Spiess, DJ; Watson, Fraser

    2015-08-01

    The NSO's new flagship solar observatory, the four-meter Daniel K. Inouye Solar Telescope (DKIST), is under construction on Haleakala, Hawaii and slated for first light in 2019. The facility will operate an initial suite of five complementary spectroscopic and polarimetric instruments, with up to 11 detectors running simultaneously at typical cadences of 5-30 frames per second, or more. The instruments will generate data of notable volume, dimensionality, cardinality, and diversity. The facility is expected to record several hundred million images per year, for a total data volume in excess of 4 petabytes. Beyond the crucial informatics infrastructure necessary to transport, store, and curate this deluge of data, there are significant challenges in developing the robust calibration workflows that can autonomously process the range of data to generate science-ready datasets for a heterogeneous and growing community. Efforts will be made to improve our ability to compensate for the effects of the Earth's atmosphere, to identify and assess instrument and facility contributions to the measured signal, and to use quality and fitness-for-use metrics to characterize and advertise datasets. In this talk, we will provide an overview of the methods and tools we are using to define and evaluate the calibration workflows. We will review the types of datasets that may be made available to scientists at the time of initial DKIST operations, as well as the potential mechanisms for the search and delivery of those data products. We will also suggest some of the likely secondary data products that could be developed over time in collaboration with the community.
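
    A quick consistency check on these volume figures is possible with the numbers quoted above; the frame rate and observing duty cycle below are purely assumptions for illustration.

```python
# Rough consistency check on the DKIST data-volume figures above.
images_per_year = 300e6      # "several hundred million images per year"
total_volume_bytes = 4e15    # "in excess of 4 petabytes"
print(f"Implied mean image size: {total_volume_bytes / images_per_year / 1e6:.1f} MB")

# Image count implied by 11 detectors at an assumed 10 frames/s and an
# assumed ~8% annual observing duty cycle.
n_detectors, fps, duty_cycle = 11, 10, 0.08
estimate = n_detectors * fps * duty_cycle * 365 * 86400
print(f"Estimated images per year under these assumptions: {estimate:.2e}")
```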

  9. 300 Area treated effluent disposal facility sampling schedule. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loll, C.M.

    1995-03-28

    This document is the interface between the 300 Area liquid effluent process engineering (LEPE) group and the waste sampling and characterization facility (WSCF), concerning process control samples. It contains a schedule for process control samples at the 300 Area TEDF which describes the parameters to be measured, the frequency of sampling and analysis, the sampling point, and the purpose for each parameter.

  10. Enhanced Cleaning of Genesis Solar Wind Sample 61348 for Film Residue Removal

    NASA Technical Reports Server (NTRS)

    Allums, K. K.; Gonzalez, C. P.; Kuhlman, K. R.; Allton, J. H.

    2015-01-01

    The Genesis mission returned to Earth on September 8, 2004, experiencing a non-nominal reentry. During the recovery of the collector materials from the capsule, many of the collector fragments were placed on the adhesive portion of post-it notes to prevent the fragments from moving during transport back to Johnson Space Center. This unknowingly introduced an additional contaminant that would prove difficult to remove with the limited chemistries allowed in the Genesis Curation Laboratory. Generally, when collector material samples are prepared for allocation to PIs, the samples are cleaned on the front side only with Ultra-Pure Water (UPW) via megasonic dispersion to the collector surface to remove crash debris and contamination. While this cleaning method works well on samples that were not placed on post-its during recovery, in at least two cases it has caused residue from the back of a sample to migrate to the front. Therefore, samples placed on the adhesive portion of a post-it note require enhanced cleaning methods, since post-it residue has proved resistant to UPW cleaning.

  11. Determination of gross alpha and gross beta in soil around repository facility at Bukit Kledang, Perak, Malaysia

    NASA Astrophysics Data System (ADS)

    Adziz, Mohd Izwan Abdul; Siong, Khoo Kok

    2018-04-01

    Recently, the Long Term Storage Facility (LTSF) in Bukit Kledang, Perak, Malaysia, was upgraded to a repository facility upon completion of the decontamination and decommissioning (D&D) process. Thorium waste and contaminated material that may contain minor amounts of thorium hydroxide were disposed of in this facility. This study was conducted to determine the concentrations of gross alpha and gross beta radioactivity in soil samples collected around the repository facility. A total of 12 soil samples were collected, consisting of 10 samples from around the facility and 2 from a selected residential area near the facility. In addition, the respective dose rates were measured 5 cm and 1 m above the ground using a survey meter with Geiger-Muller (GM) and sodium iodide (NaI) detectors. Soil samples were collected using a hand auger and then taken back to the laboratory for further analysis. Samples were cleaned, dried, pulverized and sieved prior to analysis. Gross alpha and gross beta activity measurements were carried out using a gas flow proportional counter, the Canberra Series 5 XLB Automatic Low Background Alpha and Beta Counting System. The results show that the gross alpha and gross beta activity concentrations ranged from 1.55 to 5.34 Bq/g with a mean value of 3.47 ± 0.09 Bq/g, and from 1.64 to 5.78 Bq/g with a mean value of 3.49 ± 0.09 Bq/g, respectively. These results can serve as additional terrestrial radioactivity baseline data for the Malaysian environment, and as a baseline for detecting any future contamination, especially around the repository facility area.
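
    For context, gross alpha/beta results from a proportional counter reduce to activity concentration through a standard relationship (net count rate divided by counting efficiency and sample mass); the sketch below applies it with invented placeholder numbers, not data from this study.

```python
# Standard gross alpha/beta reduction: activity concentration equals the
# net count rate divided by counting efficiency and sample mass.
def activity_bq_per_g(gross_cps: float, bkg_cps: float,
                      efficiency: float, mass_g: float) -> float:
    """Activity concentration in Bq/g from gross and background rates."""
    return (gross_cps - bkg_cps) / (efficiency * mass_g)

# Placeholder inputs chosen to land near the study's ~3.5 Bq/g mean.
print(activity_bq_per_g(gross_cps=1.2, bkg_cps=0.05,
                        efficiency=0.33, mass_g=1.0))
```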

  12. The Five Cs of Digital Curation: Supporting Twenty-First-Century Teaching and Learning

    ERIC Educational Resources Information Center

    Deschaine, Mark E.; Sharma, Sue Ann

    2015-01-01

    Digital curation is a process that allows university professors to adapt and adopt resources from multidisciplinary fields to meet the educational needs of twenty-first-century learners. Looking through the lens of new media literacy studies (Vasquez, Harste, & Albers, 2010) and new literacies studies (Gee, 2010), we propose that university…

  13. BC4GO: a full-text corpus for the BioCreative IV GO Task

    USDA-ARS?s Scientific Manuscript database

    Gene function curation via Gene Ontology (GO) annotation is a common task among Model Organism Database (MOD) groups. Due to its manual nature, this task is time-consuming and labor-intensive, and thus considered one of the bottlenecks in literature curation. There have been many previous attempts a...

  14. Participants' Perception of Therapeutic Factors in Groups for Incest Survivors.

    ERIC Educational Resources Information Center

    Wheeler, Inese; And Others

    1992-01-01

    Investigated member-perceived curative factors in an incest-survivor group, comparing therapeutic factors reported in closed, time-limited incest survivor group to those in Bonney et al.'s open, long-term survivor group and to Yalom's therapy groups. Findings suggest that relative importance of curative factors may be related to group stages.…

  15. Edited Excerpts from a Smithsonian Seminar Series: Part I: The Arts.

    ERIC Educational Resources Information Center

    Zilczar, Judith K.; And Others

    1991-01-01

    In this first of three excerpts from seminars sponsored by the Smithsonian Institution on collaborative knowledge generation in the arts, the sciences, and the humanities, two art curators and a filmmaker discuss the meaning of collaboration in their fields. Topics discussed include twentieth-century artists and art curators, Chinese art, and…

  16. Current Trends and Future Directions in Data Curation Research and Education

    ERIC Educational Resources Information Center

    Weber, Nicholas M.; Palmer, Carole L.; Chao, Tiffany C.

    2012-01-01

    Digital research data have introduced a new set of collection, preservation, and service demands into the tradition of digital librarianship. Consequently, the role of an information professional has evolved to include the activities of data curation. This new field more specifically addresses the needs of stewarding and preserving digital…

  17. Social Media Selves: College Students' Curation of Self and Others through Facebook

    ERIC Educational Resources Information Center

    Kasch, David Michael

    2013-01-01

    This qualitative study used cyber-ethnography and grounded theory to explore the ways in which 35 undergraduate students crafted and refined self-presentations on the social network site Facebook. Findings included the identification of two unique forms of self-presentation that students enacted: a "curated self" and a "commodified…

  18. Geospatial Data Curation at the University of Idaho

    ERIC Educational Resources Information Center

    Kenyon, Jeremy; Godfrey, Bruce; Eckwright, Gail Z.

    2012-01-01

    The management and curation of digital geospatial data has become a central concern for many academic libraries. Geospatial data is a complex type of data critical to many different disciplines, and its use has become more expansive in the past decade. The University of Idaho Library maintains a geospatial data repository called the Interactive…

  19. A Study of Faculty Data Curation Behaviors and Attitudes at a Teaching-Centered University

    ERIC Educational Resources Information Center

    Scaramozzino, Jeanine Marie; Ramírez, Marisa L.; McGaughey, Karen J.

    2012-01-01

    Academic libraries need reliable information on researcher data needs, data curation practices, and attitudes to identify and craft appropriate services that support outreach and teaching. This paper describes information gathered from a survey distributed to the College of Science and Mathematics faculty at California Polytechnic State…

  20. New Roles for New Times: Digital Curation for Preservation

    ERIC Educational Resources Information Center

    Walters, Tyler; Skinner, Katherine

    2011-01-01

    Digital curation refers to the actions people take to maintain and add value to digital information over its lifecycle, including the processes used when creating digital content. Digital preservation focuses on the "series of managed activities necessary to ensure continued access to digital materials for as long as necessary." In this…

  1. Surveying the maize community for their diversity and pedigree visualization needs to prioritize tool development and curation

    USDA-ARS?s Scientific Manuscript database

    The Maize Genetics and Genomics Database (MaizeGDB) team prepared a survey to identify breeders’ needs for visualizing pedigrees, diversity data, and haplotypes in order to prioritize tool development and curation efforts at MaizeGDB. The survey was distributed to the maize research community on beh...

  2. Curating Media Learning: Towards a Porous Expertise

    ERIC Educational Resources Information Center

    McDougall, Julian; Potter, John

    2015-01-01

    This article combines research results from a range of projects with two consistent themes. Firstly, we explore the potential for curation to offer a productive metaphor for the convergence of digital media learning across and between home/lifeworld and formal educational/system-world spaces--or between the public and private spheres. Secondly, we…

  3. Student-Curated Exhibits: A Vehicle towards Student Engagement, Retention, and Success

    ERIC Educational Resources Information Center

    Marsee, Mickey; Davies-Wilson, Dennis

    2014-01-01

    In looking for ways to combine course content literacy and information literacy with active learning, in 2007, the English Department and Library at The University of New Mexico-Los Alamos combined efforts and created a course project for students to curate exhibits that would demonstrate their understanding of course material through library…

  4. A Relevancy Algorithm for Curating Earth Science Data Around Phenomenon

    NASA Technical Reports Server (NTRS)

    Maskey, Manil; Ramachandran, Rahul; Li, Xiang; Weigel, Amanda; Bugbee, Kaylin; Gatlin, Patrick; Miller, J. J.

    2017-01-01

    Earth science data are being collected for various science needs and applications, processed using different algorithms at multiple resolutions and coverages, and then archived at different archiving centers for distribution and stewardship, causing difficulty in data discovery. Curation, which typically occurs in museums, art galleries, and libraries, is traditionally defined as the process of collecting and organizing information around a common subject matter or a topic of interest. Curating data sets around topics or areas of interest addresses some of the data discovery needs in the field of Earth science, especially for unanticipated users of data. This paper describes a methodology to automate search and selection of data around specific phenomena. Different components of the methodology including the assumptions, the process, and the relevancy ranking algorithm are described. The paper makes two unique contributions to improving data search and discovery capabilities. First, the paper describes a novel methodology developed for automatically curating data around a topic using Earth science metadata records. Second, the methodology has been implemented as a standalone web service that is utilized to augment search and usability of data in a variety of tools.
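
    As a toy illustration of relevancy ranking over metadata records, the sketch below scores a record by the weighted overlap of its keywords with a phenomenon-specific keyword set; the algorithm described in the paper is considerably richer, and all names and weights here are illustrative assumptions.

```python
# Toy keyword-overlap relevancy score for a metadata record against a
# phenomenon-specific keyword set (illustrative only).
from collections import Counter

def relevancy(record_keywords: list[str], topic_keywords: set[str]) -> float:
    """Fraction of a record's keyword occurrences that match the topic."""
    counts = Counter(kw.lower() for kw in record_keywords)
    hits = sum(n for kw, n in counts.items() if kw in topic_keywords)
    return hits / max(1, sum(counts.values()))

hurricane_topic = {"hurricane", "precipitation", "sea surface temperature"}
record = ["Precipitation", "hurricane", "radar", "reflectivity"]
print(f"relevancy = {relevancy(record, hurricane_topic):.2f}")  # 0.50
```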

  5. A relevancy algorithm for curating earth science data around phenomenon

    NASA Astrophysics Data System (ADS)

    Maskey, Manil; Ramachandran, Rahul; Li, Xiang; Weigel, Amanda; Bugbee, Kaylin; Gatlin, Patrick; Miller, J. J.

    2017-09-01

    Earth science data are being collected for various science needs and applications, processed using different algorithms at multiple resolutions and coverages, and then archived at different archiving centers for distribution and stewardship causing difficulty in data discovery. Curation, which typically occurs in museums, art galleries, and libraries, is traditionally defined as the process of collecting and organizing information around a common subject matter or a topic of interest. Curating data sets around topics or areas of interest addresses some of the data discovery needs in the field of Earth science, especially for unanticipated users of data. This paper describes a methodology to automate search and selection of data around specific phenomena. Different components of the methodology including the assumptions, the process, and the relevancy ranking algorithm are described. The paper makes two unique contributions to improving data search and discovery capabilities. First, the paper describes a novel methodology developed for automatically curating data around a topic using Earth science metadata records. Second, the methodology has been implemented as a stand-alone web service that is utilized to augment search and usability of data in a variety of tools.

  6. Utility of Inflammatory Marker- and Nutritional Status-based Prognostic Factors for Predicting the Prognosis of Stage IV Gastric Cancer Patients Undergoing Non-curative Surgery.

    PubMed

    Mimatsu, Kenji; Fukino, Nobutada; Ogasawara, Yasuo; Saino, Yoko; Oida, Takatsugu

    2017-08-01

    The present study aimed to compare the utility of various inflammatory marker- and nutritional status-based prognostic factors, including many previous established prognostic factors, for predicting the prognosis of stage IV gastric cancer patients undergoing non-curative surgery. A total of 33 patients with stage IV gastric cancer who had undergone palliative gastrectomy and gastrojejunostomy were included in the study. Univariate and multivariate analyses were performed to evaluate the relationships between the mGPS, PNI, NLR, PLR, the CONUT, various clinicopathological factors and cancer-specific survival (CS). Among patients who received non-curative surgery, univariate analysis of CS identified the following significant risk factors: chemotherapy, mGPS and NLR, and multivariate analysis revealed that the mGPS was independently associated with CS. The mGPS was a more useful prognostic factor than the PNI, NLR, PLR and CONUT in patients undergoing non-curative surgery for stage IV gastric cancer.

  7. Data Albums: An Event Driven Search, Aggregation and Curation Tool for Earth Science

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Kulkarni, Ajinkya; Maskey, Manil; Bakare, Rohan; Basyal, Sabin; Li, Xiang; Flynn, Shannon

    2014-01-01

    Approaches used in Earth science research such as case study analysis and climatology studies involve discovering and gathering diverse data sets and information to support the research goals. To gather relevant data and information for case studies and climatology analysis is both tedious and time consuming. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. In cases where researchers are interested in studying a significant event, they have to manually assemble a variety of datasets relevant to it by searching the different distributed data systems. This paper presents a specialized search, aggregation and curation tool for Earth science to address these challenges. The search tool automatically creates curated 'Data Albums', aggregated collections of information related to a specific event, containing links to relevant data files [granules] from different instruments, tools and services for visualization and analysis, and information about the event contained in news reports, images or videos to supplement research analysis. Curation in the tool is driven via an ontology-based relevancy ranking algorithm to filter out non-relevant information and data.

  8. The transprofessional model: blending intents in terminal care of AIDS.

    PubMed

    Cherin, D A; Simmons, W J; Hillary, K

    1998-01-01

    Current terminal care services present dying patients and their families with a dichotomy in service delivery and in the intent of care between curative treatments and palliative treatments. This arbitrary dichotomy reduces patients' quality of life in many cases and robs patients and families of the psychosocial benefits of treatment until the last few weeks of life. This article presents a blended model of care, the Transprofessional Model, in which patients receive both curative and palliative services throughout their care process. The blended-intent model differs from traditional home care in that services are provided by a care coordination team composed of nurses and social workers; the traditional model of care is often case-managed by a single registered nurse. The combination of the multidisciplinary approach to care coordination and training in both curative and palliative services in the Transprofessional Model demonstrates that this blended model of care brings a bio-psychosocial focus to terminal care, as compared to the primarily curative focus of the traditional model of home care.

  9. The immune epitope database (IEDB) 3.0

    PubMed Central

    Vita, Randi; Overton, James A.; Greenbaum, Jason A.; Ponomarenko, Julia; Clark, Jason D.; Cantrell, Jason R.; Wheeler, Daniel K.; Gabbard, Joseph L.; Hix, Deborah; Sette, Alessandro; Peters, Bjoern

    2015-01-01

    The IEDB, www.iedb.org, contains information on immune epitopes—the molecular targets of adaptive immune responses—curated from the published literature and submitted by National Institutes of Health-funded epitope discovery efforts. From 2004 to 2012 the IEDB curation of journal articles published since 1960 has caught up to the present day, with >95% of relevant published literature manually curated, amounting to more than 15 000 journal articles and more than 704 000 experiments to date. The revised curation target since 2012 has been to make recent research findings quickly available in the IEDB and thereby ensure that it continues to be an up-to-date resource. Having gathered a comprehensive dataset in the IEDB, a complete redesign of the query and reporting interface has been performed in the IEDB 3.0 release to improve how end users can access this information in an intuitive and biologically accurate manner. We here present this most recent release of the IEDB and describe the user testing procedures as well as the use of external ontologies that have enabled it. PMID:25300482

  10. Disease model curation improvements at Mouse Genome Informatics

    PubMed Central

    Bello, Susan M.; Richardson, Joel E.; Davis, Allan P.; Wiegers, Thomas C.; Mattingly, Carolyn J.; Dolan, Mary E.; Smith, Cynthia L.; Blake, Judith A.; Eppig, Janan T.

    2012-01-01

    Optimal curation of human diseases requires an ontology or structured vocabulary that contains terms familiar to end users, is robust enough to support multiple levels of annotation granularity, is limited to disease terms and is stable enough to avoid extensive reannotation following updates. At Mouse Genome Informatics (MGI), we currently use disease terms from Online Mendelian Inheritance in Man (OMIM) to curate mouse models of human disease. While OMIM provides highly detailed disease records that are familiar to many in the medical community, it lacks structure to support multilevel annotation. To improve disease annotation at MGI, we evaluated the merged Medical Subject Headings (MeSH) and OMIM disease vocabulary created by the Comparative Toxicogenomics Database (CTD) project. Overlaying MeSH onto OMIM provides hierarchical access to broad disease terms, a feature missing from OMIM. We created an extended version of the vocabulary to meet the genetic disease-specific curation needs at MGI. Here we describe our evaluation of the CTD application, the extensions made by MGI and discuss the strengths and weaknesses of this approach. Database URL: http://www.informatics.jax.org/ PMID:22434831

  11. WormBase 2014: new views of curated biology

    PubMed Central

    Harris, Todd W.; Baran, Joachim; Bieri, Tamberlyn; Cabunoc, Abigail; Chan, Juancarlos; Chen, Wen J.; Davis, Paul; Done, James; Grove, Christian; Howe, Kevin; Kishore, Ranjana; Lee, Raymond; Li, Yuling; Muller, Hans-Michael; Nakamura, Cecilia; Ozersky, Philip; Paulini, Michael; Raciti, Daniela; Schindelman, Gary; Tuli, Mary Ann; Auken, Kimberly Van; Wang, Daniel; Wang, Xiaodong; Williams, Gary; Wong, J. D.; Yook, Karen; Schedl, Tim; Hodgkin, Jonathan; Berriman, Matthew; Kersey, Paul; Spieth, John; Stein, Lincoln; Sternberg, Paul W.

    2014-01-01

    WormBase (http://www.wormbase.org/) is a highly curated resource dedicated to supporting research using the model organism Caenorhabditis elegans. With an electronic history predating the World Wide Web, WormBase contains information ranging from the sequence and phenotype of individual alleles to genome-wide studies generated using next-generation sequencing technologies. In recent years, we have expanded the contents to include data on additional nematodes of agricultural and medical significance, bringing the knowledge of C. elegans to bear on these systems and providing support for underserved research communities. Manual curation of the primary literature remains a central focus of the WormBase project, providing users with reliable, up-to-date and highly cross-linked information. In this update, we describe efforts to organize the original atomized and highly contextualized curated data into integrated syntheses of discrete biological topics. Next, we discuss our experiences coping with the vast increase in available genome sequences made possible through next-generation sequencing platforms. Finally, we describe some of the features and tools of the new WormBase Web site that help users better find and explore data of interest. PMID:24194605

  12. The Internet of Scientific Research Things

    NASA Astrophysics Data System (ADS)

    Chandler, Cynthia; Shepherd, Adam; Arko, Robert; Leadbetter, Adam; Groman, Robert; Kinkade, Danie; Rauch, Shannon; Allison, Molly; Copley, Nancy; Gegg, Stephen; Wiebe, Peter; Glover, David

    2016-04-01

    The whole is greater than the sum of its parts, but for scientific research how do we identify the parts when they are curated at distributed locations? Results from environmental research represent an enormous investment and constitute essential knowledge required to understand our planet in this time of rapid change. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) curates data from US NSF Ocean Sciences funded research awards, but BCO-DMO is only one repository in a landscape that includes many other sites that carefully curate the results of scientific research. Recent efforts to use persistent identifiers (PIDs), most notably Open Researcher and Contributor ID (ORCiD) for people, Digital Object Identifier (DOI) for publications including data sets, and Open Funder Registry (FundRef) codes for research grants and awards, are realizing success in unambiguously identifying the pieces that represent the results of environmental research. This presentation uses BCO-DMO as a test case for adding PIDs to the locally curated information published out as standards-compliant metadata records. We present a summary of progress made thus far: what has worked and why, and thoughts on logical next steps.
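
    One concrete way the PIDs mentioned above become actionable is DOI content negotiation against doi.org, which returns machine-readable citation metadata; the sketch below uses that standard mechanism, with a placeholder DOI rather than an actual BCO-DMO identifier.

```python
# Resolve a DOI to CSL-JSON citation metadata via standard doi.org
# content negotiation. The DOI below is a placeholder.
import requests

def resolve_doi(doi: str) -> dict:
    """Fetch CSL-JSON metadata for a DOI via content negotiation."""
    resp = requests.get(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/vnd.citationstyles.csl+json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

meta = resolve_doi("10.1000/xyz123")  # placeholder DOI
print(meta.get("title"))
```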

  13. Eye health seeking habits and barriers to accessing curative services among blind beggars in an urban community in Northern Nigeria.

    PubMed

    Balarabe, Aliyu Hamza; Hassan, Ramatu; Fatai, Olatunji O

    2014-01-01

    The aim of this study was to determine the types of intervention sought by blind street beggars and to assess the barriers to accessing available eye care services. This cross-sectional study was conducted among consenting blind street beggars in Sokoto, Nigeria between May and June, 2009. A semi-structured interview was conducted to probe the historical antecedents of the blindness and eye health seeking behavior, including the use of traditional eye medications. Barriers to accessing curative services among the blind persons were also explored. Questions were asked and the individual responses were recorded in the questionnaire under the appropriate sections. Two hundred and two of 216 (94.7%) of the examined subjects were found to be blind and included in the analysis. The principal cause of blindness was corneal opacity. Overall, 82% of the blindness was due to avoidable causes, with the majority irreversibly blind. Only 38 subjects (18.8%) sought intervention in hospitals; others resorted to self-medication (42.1%), medicine stores (31.2%) and traditional facilities (7.9%). Those that accessed treatment at a hospital did so mainly at a primary health center (50.0%) or a general hospital (34.2%). The barriers to accessing treatment at a hospital were mainly "not taken to any hospital" by parents/relatives (50.3%) and "services not available" (25.2%). Most respondents resorted to ocular self-medication, particularly traditional eye medicines. We advocate the provision of affordable, accessible and high-quality eye care services with a strong health education component on avoidable causes of blindness.

  14. Fish Karyome version 2.1: a chromosome database of fishes and other aquatic organisms

    PubMed Central

    Nagpure, Naresh Sahebrao; Pathak, Ajey Kumar; Pati, Rameshwar; Rashid, Iliyas; Sharma, Jyoti; Singh, Shri Prakash; Singh, Mahender; Sarkar, Uttam Kumar; Kushwaha, Basdeo; Kumar, Ravindra; Murali, S.

    2016-01-01

    Voluminous information is available from karyological studies of fishes; however, limited effort has been made to compile and curate the available karyological data in digital form. The 'Fish Karyome' database was a preliminary attempt to compile and digitize the available karyological information on finfishes of the Indian subcontinent. But the database had limitations, since it covered only Indian finfishes and offered limited search options. In response to user feedback and the database's utility in fish cytogenetic studies, Fish Karyome was upgraded by applying Linux, Apache, MySQL and PHP (LAMP) technologies. In the present version, the scope of the system was increased by compiling and curating the available chromosomal information from around the globe on fishes and other aquatic organisms, such as echinoderms, molluscs and arthropods, especially those of aquaculture importance. Thus, Fish Karyome version 2.1 presently covers 866 chromosomal records for 726 species supported by 253 published articles, and the information is updated regularly. The database provides information on chromosome number and morphology, sex chromosomes, chromosome banding, molecular cytogenetic markers, etc., supported by fish and karyotype images through interactive tools. It also enables users to browse and view chromosomal information based on habitat, family, conservation status and chromosome number. The system also displays chromosome numbers in model organisms, a protocol for chromosome preparation and allied techniques, and a glossary of cytogenetic terms. A data submission facility is provided through a data submission panel. The database can serve as a unique and useful resource for cytogenetic characterization, sex determination, chromosomal mapping, cytotaxonomy, karyo-evolution and systematics of fishes. Database URL: http://mail.nbfgr.res.in/Fish_Karyome PMID:26980518

  15. Fish Karyome version 2.1: a chromosome database of fishes and other aquatic organisms.

    PubMed

    Nagpure, Naresh Sahebrao; Pathak, Ajey Kumar; Pati, Rameshwar; Rashid, Iliyas; Sharma, Jyoti; Singh, Shri Prakash; Singh, Mahender; Sarkar, Uttam Kumar; Kushwaha, Basdeo; Kumar, Ravindra; Murali, S

    2016-01-01

    Voluminous information is available from karyological studies of fishes; however, limited effort has been made to compile and curate the available karyological data in digital form. The 'Fish Karyome' database was a preliminary attempt to compile and digitize the available karyological information on finfishes of the Indian subcontinent. But the database had limitations, since it covered only Indian finfishes and offered limited search options. In response to user feedback and the database's utility in fish cytogenetic studies, Fish Karyome was upgraded by applying Linux, Apache, MySQL and PHP (LAMP) technologies. In the present version, the scope of the system was increased by compiling and curating the available chromosomal information from around the globe on fishes and other aquatic organisms, such as echinoderms, molluscs and arthropods, especially those of aquaculture importance. Thus, Fish Karyome version 2.1 presently covers 866 chromosomal records for 726 species supported by 253 published articles, and the information is updated regularly. The database provides information on chromosome number and morphology, sex chromosomes, chromosome banding, molecular cytogenetic markers, etc., supported by fish and karyotype images through interactive tools. It also enables users to browse and view chromosomal information based on habitat, family, conservation status and chromosome number. The system also displays chromosome numbers in model organisms, a protocol for chromosome preparation and allied techniques, and a glossary of cytogenetic terms. A data submission facility is provided through a data submission panel. The database can serve as a unique and useful resource for cytogenetic characterization, sex determination, chromosomal mapping, cytotaxonomy, karyo-evolution and systematics of fishes. Database URL: http://mail.nbfgr.res.in/Fish_Karyome.
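
    The faceted browsing the database offers (habitat, family, conservation status, chromosome number) maps naturally onto a simple record-filtering pattern; the sketch below is illustrative only, with invented field names and example records rather than actual Fish Karyome entries.

```python
# Illustrative faceted filter over karyotype records (field names and
# records are invented for the sketch, not taken from Fish Karyome).
from dataclasses import dataclass

@dataclass
class KaryotypeRecord:
    species: str
    family: str
    habitat: str
    conservation_status: str
    chromosome_number: int

records = [
    KaryotypeRecord("Labeo rohita", "Cyprinidae", "freshwater", "LC", 50),
    KaryotypeRecord("Tor putitora", "Cyprinidae", "freshwater", "EN", 100),
]

# Facet query: freshwater cyprinids with 2n = 50.
hits = [r.species for r in records
        if r.habitat == "freshwater"
        and r.family == "Cyprinidae"
        and r.chromosome_number == 50]
print(hits)
```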

  16. EarthCube Data Discovery Hub: Enhancing, Curating and Finding Data across Multiple Geoscience Data Sources.

    NASA Astrophysics Data System (ADS)

    Zaslavsky, I.; Valentine, D.; Richard, S. M.; Gupta, A.; Meier, O.; Peucker-Ehrenbrink, B.; Hudman, G.; Stocks, K. I.; Hsu, L.; Whitenack, T.; Grethe, J. S.; Ozyurt, I. B.

    2017-12-01

    EarthCube Data Discovery Hub (DDH) is an EarthCube Building Block project using technologies developed in CINERGI (Community Inventory of EarthCube Resources for Geoscience Interoperability) to enable geoscience users to explore a growing portfolio of EarthCube-created and other geoscience-related resources. Over 1 million metadata records are available for discovery through the project portal (cinergi.sdsc.edu). These records are retrieved from data facilities, including federal, state and academic sources, or contributed by geoscientists through workshops, surveys, or other channels. CINERGI metadata augmentation pipeline components 1) provide semantic enhancement based on a large ontology of geoscience terms, using text analytics to generate keywords with references to ontology classes, 2) add spatial extents based on place names found in the metadata record, and 3) add organization identifiers to the metadata. The records are indexed and can be searched via a web portal and standard search APIs. The added metadata content improves discoverability and interoperability of the registered resources. Specifically, the addition of ontology-anchored keywords enables faceted browsing and lets users navigate to datasets related by variables measured, equipment used, science domain, processes described, geospatial features studied, and other dataset characteristics that are generated by the pipeline. DDH also lets data curators access and edit the automatically generated metadata records using the CINERGI metadata editor, accept or reject the enhanced metadata content, and consider it in updating their metadata descriptions. We consider several complex data discovery workflows in environmental seismology (quantifying sediment and water fluxes using seismic data), marine biology (determining available temperature, location, weather and bleaching characteristics of coral reefs related to measurements in a given coral reef survey), and river geochemistry (discovering observations relevant to geochemical measurements outside the tidal zone, given specific discharge conditions).
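
    The semantic-enhancement step described above, attaching ontology-anchored keywords found in a record's free text, can be sketched as follows; the tiny term map stands in for CINERGI's large geoscience ontology, and all names are illustrative assumptions.

```python
# Minimal sketch of ontology-keyword augmentation of a metadata record.
# The term-to-class map is a stand-in for a full geoscience ontology.
ONTOLOGY = {
    "seismometer": "Instrument/Seismometer",
    "discharge": "Property/StreamDischarge",
    "coral reef": "Feature/CoralReef",
}

def augment(record: dict) -> dict:
    """Attach ontology-anchored keywords found in the record's abstract."""
    text = record.get("abstract", "").lower()
    found = [cls for term, cls in ONTOLOGY.items() if term in text]
    record.setdefault("keywords", []).extend(found)
    return record

rec = {"title": "River gauging dataset",
       "abstract": "Daily discharge measured outside the tidal zone."}
print(augment(rec)["keywords"])  # ['Property/StreamDischarge']
```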

  17. Active and Social Data Curation: Reinventing the Business of Community-scale Lifecycle Data Management

    NASA Astrophysics Data System (ADS)

    McDonald, R. H.; Kumar, P.; Plale, B. A.; Myers, J.; Hedstrom, M. L.

    2012-12-01

    Effective long-term curation and preservation of data for community use has historically been limited to high-value and homogeneous collections produced by mission-oriented organizations. The technologies and practices that have been applied in these cases, e.g. relational data bases, development of comprehensive standardized vocabularies, and centralized support for reference data collections, are arguably applicable to the much broader range of data generated by the long tail of investigator-led research, with the logical conclusion of such an argument leading to the call for training, evangelism, and vastly increased funding as the best means of broadening community-scale data management. In this paper, we question this reasoning and explore how alternative approaches focused on the overall data lifecycle and the sociological and business realities of distributed multi-disciplinary research communities might dramatically lower costs, increase value, and consequently drive dramatic advances in our ability to use and re-use data, and ultimately enable more rapid scientific advance. Specifically, we introduce the concepts of active and social curation as a means to decrease coordination costs, align costs and values for individual data producers and data consumers, and improve the immediacy of returns for data curation investments. Further, we describe the specific architecture and services for active and social curation that are being prototyped within the Sustainable Environment - Actionable Data (SEAD) project within NSF's DataNet network and discuss how they are motivated by the long-tail dynamics in the cross-disciplinary sustainability research community.

  18. PHI-base: a new interface and further additions for the multi-species pathogen-host interactions database.

    PubMed

    Urban, Martin; Cuzick, Alayne; Rutherford, Kim; Irvine, Alistair; Pedro, Helder; Pant, Rashmi; Sadanadan, Vidyendra; Khamari, Lokanath; Billal, Santoshkumar; Mohanty, Sagar; Hammond-Kosack, Kim E

    2017-01-04

    The pathogen-host interactions database (PHI-base) is available at www.phi-base.org. PHI-base contains expertly curated molecular and biological information on genes proven to affect the outcome of pathogen-host interactions reported in peer-reviewed research articles. In addition, literature that indicates specific gene alterations that did not affect the disease interaction phenotype is curated to provide complete datasets for comparative purposes. Viruses are not included. Here we describe a revised PHI-base Version 4 data platform with improved search, filtering and extended data display functions. A PHIB-BLAST search function is provided, along with a link to PHI-Canto, a tool for authors to directly curate their own published data into PHI-base. The new release of PHI-base Version 4.2 (October 2016) has increased data content containing information from 2219 manually curated references. The data provide information on 4460 genes from 264 pathogens tested on 176 hosts in 8046 interactions. Prokaryotic and eukaryotic pathogens are represented in almost equal numbers. Approximately 70% of host species are plants, with the remaining 30% being other species of medical and/or environmental importance. Additional data types included in PHI-base 4 are the direct targets of pathogen effector proteins in experimental and natural host organisms. The curation problems encountered and the future directions of the PHI-base project are briefly discussed.

  19. Long-Term Survival and Tumor Recurrence in Patients with Superficial Esophageal Cancer after Complete Non-Curative Endoscopic Resection: A Single-Center Case Series.

    PubMed

    Lee, Ji Wan; Cho, Charles J; Kim, Do Hoon; Ahn, Ji Yong; Lee, Jeong Hoon; Choi, Kee Don; Song, Ho June; Park, Sook Ryun; Lee, Hyun Joo; Kim, Yong Hee; Lee, Gin Hyug; Jung, Hwoon-Yong; Kim, Sung-Bae; Kim, Jong Hoon; Park, Seung-Il

    2018-06-01

    This single-center case series reports long-term survival and tumor recurrence outcomes in patients with superficial esophageal cancer (SEC) after complete but non-curative endoscopic resection (ER). We retrieved ER data for 24 patients with non-curatively resected SEC; non-curative resection was defined as the presence of submucosal and/or lymphovascular invasion on ER pathology. Relevant clinical and tumor-specific parameters were reviewed. The mean age of the 24 study patients was 66.3±8.3 years. Ten patients were closely followed up without further treatment, while 14 received additional treatment. During a mean follow-up of 59.0±33.2 months, the 3- and 5-year survival rates for all cases were 90.7% and 77.6%, respectively. The 5-year overall survival rates were 72.9% in the close-observation group and 82.1% in the additional-treatment group (p=0.958). The 5-year cumulative incidences of overall recurrence (25.0% vs. 43.3%, p=0.388), primary EC recurrence (10.0% vs. 16.4%, p=0.558), and metachronous EC recurrence (16.7% vs. 26.7%, p=0.667) were also similar between the two groups. Patients with non-curatively resected SEC showed good long-term survival outcomes. Given the similar oncologic outcomes, close observation may be a reasonable option, with appropriate caution, for patients who are medically unfit for additional therapy.
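
    For readers unfamiliar with how figures such as the 5-year survival rates and the p=0.958 group comparison above are typically derived, the sketch below applies a Kaplan-Meier estimator and a log-rank test using the lifelines library. The data are synthetic, generated purely for illustration; they are not the study's patient records.

```python
"""Kaplan-Meier survival and log-rank comparison on SYNTHETIC data,
illustrating the standard methodology behind the figures quoted above."""
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)

# Synthetic follow-up times (months) and death indicators for two groups.
t_obs = rng.exponential(scale=120, size=10)   # close observation
e_obs = rng.random(10) < 0.4
t_trt = rng.exponential(scale=130, size=14)   # additional treatment
e_trt = rng.random(14) < 0.35

# Estimate the survival curve and read off survival at 60 months (5 years).
kmf = KaplanMeierFitter()
kmf.fit(t_obs, event_observed=e_obs, label="close observation")
print("5-year survival:", float(kmf.survival_function_at_times(60).iloc[0]))

# Log-rank test for a between-group difference (cf. p=0.958 in the study).
result = logrank_test(t_obs, t_trt, event_observed_A=e_obs, event_observed_B=e_trt)
print("log-rank p-value:", result.p_value)
```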

  20. Collaborative biocuration--text-mining development task for document prioritization for curation.

    PubMed

    Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J

    2012-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation is a community-wide effort to evaluate text mining and information extraction systems for the biological domain. The 'BioCreative Workshop 2012' subcommittee identified three areas, or tracks, comprising independent but complementary aspects of data curation in which it sought community input: literature triage (Track I); curation workflow (Track II); and text mining/natural language processing (NLP) systems (Track III). Track I participants were invited to develop tools or systems that would effectively triage and prioritize articles for curation and to present results in a prototype web interface. Training and test datasets were derived from the Comparative Toxicogenomics Database (CTD; http://ctdbase.org) and consisted of manuscripts from which chemical-gene-disease data had been manually curated. A total of seven groups participated in Track I. For the triage component, the effectiveness of participant systems was measured by aggregate gene, disease and chemical 'named-entity recognition' (NER) across articles; the effectiveness of 'information retrieval' (IR) was also measured, based on 'mean average precision' (MAP). Top recall scores for gene, disease and chemical NER were 49%, 65% and 82%, respectively; the top MAP score was 80%. Each participating group also developed a prototype web interface; these interfaces were evaluated for functionality and ease of use by CTD's biocuration project manager. In this article, we present a detailed description of the challenge and a summary of the results.
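
    Since mean average precision (MAP) is the headline retrieval metric for the Track I triage results, the sketch below shows how it is computed from ranked document lists. The document IDs and relevance judgments are toy data, not the CTD corpus.

```python
"""Mean average precision (MAP), the IR metric used to score Track I
triage systems, computed on toy rankings."""
from typing import Sequence


def average_precision(ranked_ids: Sequence[str], relevant: set[str]) -> float:
    """AP = mean of precision@k over each rank k holding a relevant doc."""
    hits, precision_sum = 0, 0.0
    for k, doc_id in enumerate(ranked_ids, start=1):
        if doc_id in relevant:
            hits += 1
            precision_sum += hits / k
    return precision_sum / len(relevant) if relevant else 0.0


def mean_average_precision(runs) -> float:
    """MAP over (ranking, relevant-set) pairs, one pair per query/topic."""
    return sum(average_precision(r, rel) for r, rel in runs) / len(runs)


# Two toy 'queries' (e.g., curation topics) with ranked article IDs.
runs = [
    (["d1", "d2", "d3", "d4"], {"d1", "d3"}),  # AP = (1/1 + 2/3) / 2
    (["d2", "d1", "d4", "d3"], {"d1"}),        # AP = (1/2) / 1
]
print(f"MAP = {mean_average_precision(runs):.3f}")  # MAP = 0.667
```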
