Sample records for tooling technologies standardization

  1. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 1: Executive Summary, of a 15-Volume Set of Skills Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology (MAST) consortium was formed to address the shortage of skilled workers for the machine tools and metals-related industries. Featuring six of the nation's leading advanced technology centers, the MAST consortium developed, tested, and disseminated industry-specific skill standards and model curricula for…

  2. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 4: Manufacturing Engineering Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  3. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 7: Industrial Maintenance Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  4. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 9: Tool and Die, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  5. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 13: Laser Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  6. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 6: Welding, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  7. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 12: Instrumentation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  8. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 5: Mold Making, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  9. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 3: Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  10. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 8: Sheet Metal & Composites, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  11. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 14: Automated Equipment Technician (CIM), of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  12. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 10: Computer-Aided Drafting & Design, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  13. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 11: Computer-Aided Manufacturing & Advanced CNC, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  14. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 2: Career Development, General Education and Remediation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  15. Morphometric Assessment of Convergent Tool Technology and Function during the Early Middle Palaeolithic: The Case of Payre, France

    PubMed Central

    Chacón, M Gema; Détroit, Florent; Coudenneau, Aude; Moncel, Marie-Hélène

    2016-01-01

    There appears to be little doubt as to the existence of an intentional technological resolve to produce convergent tools during the Middle Palaeolithic. However, the use of these pieces as pointed tools is still subject to debate: i.e., handheld tool vs. hafted tool. Present-day technological analysis has begun to apply new methodologies in order to quantify shape variability and to decipher the role of the morphology of these pieces in relation to function; for instance, geometric morphometric analyses have recently been applied with successful results. This paper presents a study of this type of analysis on 37 convergent tools from level Ga of Payre site (France), dated to MIS 8–7. These pieces are non-standardized knapping products produced by discoidal and orthogonal core technologies. Moreover, macro-wear studies attest to various activities on diverse materials with no evidence of hafting or projectile use. The aim of this paper is to test the geometric morphometric approach on non-standardized artefacts applying the Elliptical Fourier analysis (EFA) to 3D contours and to assess the potential relationship between size and shape, technology and function. This study is innovative in that it is the first time that this method, considered to be a valuable complement for describing technological and functional attributes, is applied to 3D contours of lithic products. Our results show that this methodology ensures a very good degree of accuracy in describing shape variations of the sharp edges of technologically non-standardized convergent tools. EFA on 3D contours indicates variations in deviations of the outline along the third dimension (i.e., dorso-ventrally) and yields quantitative and insightful information on the actual shape variations of tools. Several statistically significant relationships are found between shape variation and use-wear attributes, though the results emphasize the large variability of the shape of the convergent tools, which, in general, does not show a strong direct association with technological features and function. This is in good agreement with the technological context of this chronological period, characterized by a wide diversity of non-standardized tools adapted to multipurpose functions for varied subsistence activities. PMID:27191164

  16. Morphometric Assessment of Convergent Tool Technology and Function during the Early Middle Palaeolithic: The Case of Payre, France.

    PubMed

    Chacón, M Gema; Détroit, Florent; Coudenneau, Aude; Moncel, Marie-Hélène

    2016-01-01

    There appears to be little doubt as to the existence of an intentional technological resolve to produce convergent tools during the Middle Palaeolithic. However, the use of these pieces as pointed tools is still subject to debate: i.e., handheld tool vs. hafted tool. Present-day technological analysis has begun to apply new methodologies in order to quantify shape variability and to decipher the role of the morphology of these pieces in relation to function; for instance, geometric morphometric analyses have recently been applied with successful results. This paper presents a study of this type of analysis on 37 convergent tools from level Ga of Payre site (France), dated to MIS 8-7. These pieces are non-standardized knapping products produced by discoidal and orthogonal core technologies. Moreover, macro-wear studies attest to various activities on diverse materials with no evidence of hafting or projectile use. The aim of this paper is to test the geometric morphometric approach on non-standardized artefacts applying the Elliptical Fourier analysis (EFA) to 3D contours and to assess the potential relationship between size and shape, technology and function. This study is innovative in that it is the first time that this method, considered to be a valuable complement for describing technological and functional attributes, is applied to 3D contours of lithic products. Our results show that this methodology ensures a very good degree of accuracy in describing shape variations of the sharp edges of technologically non-standardized convergent tools. EFA on 3D contours indicates variations in deviations of the outline along the third dimension (i.e., dorso-ventrally) and yields quantitative and insightful information on the actual shape variations of tools. Several statistically significant relationships are found between shape variation and use-wear attributes, though the results emphasize the large variability of the shape of the convergent tools, which, in general, does not show a strong direct association with technological features and function. This is in good agreement with the technological context of this chronological period, characterized by a wide diversity of non-standardized tools adapted to multipurpose functions for varied subsistence activities.

  17. Keeping up with Our Students: The Evolution of Technology and Standards in Art Education

    ERIC Educational Resources Information Center

    Patton, Ryan M.; Buffington, Melanie L.

    2016-01-01

    This article addresses the standards of technology in the visual arts, arguing the standards function as de facto policy, the guidelines that shape what teachers teach. In this study, we investigate how art education standards approach technology as a teaching tool and artmaking medium, analyzing the current National Visual Arts Standards, the…

  18. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 15: Administrative Information, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This volume developed by the Machine Tool Advanced Skill Technology (MAST) program contains key administrative documents and provides additional sources for machine tool and precision manufacturing information and important points of contact in the industry. The document contains the following sections: a foreword; grant award letter; timeline for…

  19. Automated Formal Testing of C API Using T2C Framework

    NASA Astrophysics Data System (ADS)

    Khoroshilov, Alexey V.; Rubanov, Vladimir V.; Shatokhin, Eugene A.

    A problem of automated test development for checking the basic functionality of program interfaces (APIs) is discussed. Different technologies and corresponding tools are surveyed, and the T2C technology developed at ISPRAS is presented. The technology and associated tools facilitate the development of "medium quality" (and "medium cost") tests. An important feature of T2C technology is that it enforces that each check in a developed test is explicitly linked to the corresponding place in the standard; T2C tools provide convenient means to create such linkage. The results of using T2C are illustrated by a project for testing interfaces of Linux system libraries defined by the LSB standard.

  20. Introducing a design exigency to promote student learning through assessment: A case study.

    PubMed

    Grealish, Laurie A; Shaw, Julie M

    2018-02-01

    Assessment technologies are often used to classify student and newly qualified nurse performance as 'pass' or 'fail', with little attention to how these decisions are achieved. Examining the design exigencies of classification technologies, such as performance assessment technologies, provides opportunities to explore flexibility and change in the process of using those technologies. Evaluate an established assessment technology for nursing performance as a classification system. A case study analysis that is focused on the assessment approach and a priori design exigencies of performance assessment technology, in this case the Australian Nursing Standards Assessment Tool 2016. Nurse assessors are required to draw upon their expertise to judge performance, but that judgement is described as a source of bias, creating confusion. The definition of satisfactory performance is 'ready to enter practice'. To pass, the performance on each criterion must be at least satisfactory, indicating to the student that no further improvement is required. The Australian Nursing Standards Assessment Tool 2016 does not have a third 'other' category, which is usually found in classification systems. Introducing a 'not yet competent' category and creating a two-part, mixed methods assessment process can improve the Australian Nursing Standards Assessment Tool 2016 assessment technology. Using a standards approach in the first part, judgement is valued and can generate learning opportunities across a program. Using a measurement approach in the second part, student performance can be 'not yet competent' but still meet criteria for year level performance and a graded pass. Subjecting the Australian Nursing Standards Assessment Tool 2016 assessment technology to analysis as a classification system provides opportunities for innovation in design. This design innovation has the potential to support students who move between programs and clinicians who assess students from different universities. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. The Health Information Technology Competencies Tool: Does It Translate for Nursing Informatics in the United States?

    PubMed

    Sipes, Carolyn; Hunter, Kathleen; McGonigle, Dee; West, Karen; Hill, Taryn; Hebda, Toni

    2017-12-01

    Information technology use in healthcare delivery mandates a prepared workforce. The initial Health Information Technology Competencies tool resulted from a 2-year transatlantic effort by experts from the US and European Union to identify approaches to develop skills and knowledge needed by healthcare workers. It was determined that competencies must be identified before strategies are established, resulting in a searchable database of more than 1000 competencies representing five domains, five skill levels, and more than 250 roles. Health Information Technology Competencies is available at no cost and supports role- or competency-based queries. Health Information Technology Competencies developers suggest its use for curriculum planning, job descriptions, and professional development. The Chamberlain College of Nursing informatics research team examined Health Information Technology Competencies for its possible application to our research and our curricular development, comparing it originally with the TIGER-based Assessment of Nursing Informatics Competencies and Nursing Informatics Competency Assessment of Level 3 and Level 4 tools, which examine informatics competencies at four levels of nursing practice. Additional analysis involved the 2015 Nursing Informatics: Scope and Standards of Practice. Informatics is a Health Information Technology Competencies domain, so clear delineation of nursing-informatics competencies was expected. Researchers found TIGER-based Assessment of Nursing Informatics Competencies and Nursing Informatics Competency Assessment of Level 3 and Level 4 differed from Health Information Technology Competencies 2016 in focus, definitions, ascribed competencies, and defined levels of expertise. When Health Information Technology Competencies 2017 was compared against the nursing informatics scope and standards, researchers found an increase in the number of informatics competencies but not to a significant degree. This is not surprising, given that Health Information Technology Competencies includes all healthcare workers, while the TIGER-based Assessment of Nursing Informatics Competencies and Nursing Informatics Competency Assessment of Level 3 and Level 4 tools and the American Nurses Association Nursing Informatics: Scope and Standards of Practice are nurse specific. No clear cross mapping across these tools and the standards of nursing informatics practice exists. Further examination and review are needed to translate Health Information Technology Competencies as a viable tool for nursing informatics use in the US.

  2. A Natural Fit: Problem-based Learning and Technology Standards.

    ERIC Educational Resources Information Center

    Sage, Sara M.

    2000-01-01

    Discusses the use of problem-based learning to meet technology standards. Highlights include technology as a tool for locating and organizing information; the Wolf Wars problem for elementary and secondary school students that provides resources, including Web sites, for information; Web-based problems; and technology as assessment and as a…

  3. Maximum Achievable Control Technology Standards in Region 7

    EPA Pesticide Factsheets

    Maximum Achievable Control Technology Standards (MACTs) are applicable requirements under the Title V operating permit program. This is a resource for permit writers and reviewers to learn about the rules and explore other helpful tools.

  4. Standardized and Repeatable Technology Evaluation for Cybersecurity Acquisition

    DTIC Science & Technology

    2017-02-01

    methodology for evaluating cybersecurity technologies. In this report, we introduce the Department of Defense (DoD)-centric and Independent Technology Evaluation Capability (DITEC), an experimental decision support service within the U.S. DoD which aims to provide a standardized framework for… The Technology Matching Tool: A Recommender System for Security Non-Experts

  5. Standardizing practices: a socio-history of experimental systems in classical genetic and virological cancer research, ca. 1920-1978.

    PubMed

    Fujimura, J H

    1996-01-01

    This paper presents a narrative history of technologies in cancer research circa 1920-1978 and a theoretical perspective on the complex, intertwined relationships between scientific problems, material practices and technologies, concepts and theories, and other historical circumstances. The history presents several active lines of research and technology development in the genetics of cancer in the United States which were constitutive of proto-oncogene work in its current form. I write this history from the perspective of technology development. Scientists participating in cancer research created tools with which to study their problems of interest, but the development of the tools also influenced the questions asked and answered in the form of concepts and theories developed. These tools included genetic ideas of the 1920s, inbred mouse colonies, chemicals and antibiotics developed during World War Two, tissue cultures and their technical procedures, and viruses. I examine these tools as standardized experimental systems that standardized materials as well as practices in laboratories. Inbred animals, tissue culture materials and methods, and tumor viruses as experimental systems gave materiality to "genes" and "cancer". They are technical-natural objects that stand in for nature in the laboratory.

  6. Mission Systems Open Architecture Science and Technology (MOAST) program

    NASA Astrophysics Data System (ADS)

    Littlejohn, Kenneth; Rajabian-Schwart, Vahid; Kovach, Nicholas; Satterthwaite, Charles P.

    2017-04-01

    The Mission Systems Open Architecture Science and Technology (MOAST) program is an AFRL effort that is developing and demonstrating Open System Architecture (OSA) component prototypes, along with methods and tools, to strategically evolve current OSA standards and technical approaches, promote affordable capability evolution, reduce integration risk, and address emerging challenges [1]. Within the context of open architectures, the program is conducting advanced research and concept development in the following areas: (1) Evolution of standards; (2) Cyber-Resiliency; (3) Emerging Concepts and Technologies; (4) Risk Reduction Studies and Experimentation; and (5) Advanced Technology Demonstrations. Current research includes the development of methods, tools, and techniques to characterize the performance of OMS data interconnection methods for representative mission system applications. Of particular interest are the OMS Critical Abstraction Layer (CAL), the Avionics Service Bus (ASB), and the Bulk Data Transfer interconnects. The program also aims to develop and demonstrate cybersecurity countermeasure techniques to detect and mitigate cyberattacks against open-architecture-based mission systems and ensure continued mission operations. Focus is on cybersecurity techniques that augment traditional cybersecurity controls and those currently defined within the Open Mission System and UCI standards. AFRL is also developing code generation tools and simulation tools to support evaluation and experimentation of OSA-compliant implementations.

  7. Cool Reaction: Go! Temp as a Tool for Science Teaching and Learning

    ERIC Educational Resources Information Center

    Kim, Hanna

    2005-01-01

    The National Science Education Standards (NSES; National Research Council [NRC], 1996) include Science and Technology as one of the eight categories of content standards. The science and technology standards establish connections between the natural and designed worlds and provide students with opportunities to develop decision-making abilities.…

  8. An Examination of Secondary School Teachers' Technology Integration Recommended by ISTE's National Educational Technology Standards for Teachers and School Principal Support for Teacher Technology Efforts

    ERIC Educational Resources Information Center

    Esposito, Maria

    2013-01-01

    The National Educational Technology Standards for teachers (NETS-T) was adopted by New York State, and was critical to the development of students entering a global society. This study examines teachers' use of digital tools to promote student learning and reflection, promote digital citizenship, communicate and collaborate with parents and…

  9. Teachers' Implementation of Pre-Constructed Dynamic Geometry Tasks in Technology-Intensive Algebra 1 Classrooms

    ERIC Educational Resources Information Center

    Cayton, Charity Sue-Adams

    2012-01-01

    Technology use and a focus on 21st century skills, coupled with recent adoption of Common Core State Standards for Mathematics, marks a new challenge for mathematics teachers. Communication, discourse, and tools for enhancing discourse (NCTM, 1991, 2000) play an integral role in successful implementation of technology and mathematics standards.…

  10. Pennsylvania Teachers' Perceptions and Use of Social Media Communication Technologies as a Pedagogical Tool

    ERIC Educational Resources Information Center

    Tozer, Brett C.

    2017-01-01

    A number of states and organizations have begun to add cross-content technology elements to their educational standards, providing teachers opportunities to use social media communication (SMC) technology in teaching and learning. Specifically, in the Commonwealth of Pennsylvania, the PA Core Standards, which are adapted from the national Common…

  11. Accomplishing PETE Learning Standards and Program Accreditation through Teacher Candidates' Technology-Based Service Learning Projects

    ERIC Educational Resources Information Center

    Gibbone, Anne; Mercier, Kevin

    2014-01-01

    Teacher candidates' use of technology is a component of physical education teacher education (PETE) program learning goals and accreditation standards. The methods presented in this article can help teacher candidates to learn about and apply technology as an instructional tool prior to and during field or clinical experiences. The goal in…

  12. MO/DSD online information server and global information repository access

    NASA Technical Reports Server (NTRS)

    Nguyen, Diem; Ghaffarian, Kam; Hogie, Keith; Mackey, William

    1994-01-01

    Often in the past, standards and new technology information have been available only in hardcopy form, with reproduction and mailing costs proving rather significant. In light of NASA's current budget constraints and in the interest of efficient communications, the Mission Operations and Data Systems Directorate (MO&DSD) New Technology and Data Standards Office recognizes the need for an online information server (OLIS). This server would allow: (1) dissemination of standards and new technology information throughout the Directorate more quickly and economically; (2) online browsing and retrieval of documents that have been published for and by MO&DSD; and (3) searching for current and past study activities on related topics within NASA before issuing a task. This paper explores a variety of available information servers and searching tools, their current capabilities and limitations, and the application of these tools to MO&DSD. Most importantly, the discussion focuses on the way this concept could be easily applied toward improving dissemination of standards and new technologies and improving documentation processes.

  13. Helping Teachers Embrace Standards

    ERIC Educational Resources Information Center

    Cardillo, Darlene S.

    2005-01-01

    This article details how the author, the director of educational technology for the schools in the Roman Catholic Diocese of Albany, New York, adapted the National Educational Technology Standards for Students (NETS.S) and created assessment tools for her teachers. It describes the system that was devised to hold teachers accountable for…

  14. Teachers Connect "with" Technology: Online Tools Build New Pathways to Collaboration

    ERIC Educational Resources Information Center

    Phillips, Vicki L.; Olson, Lynn

    2013-01-01

    Teachers, curriculum experts, and other educators work together using online tools developed by the Bill & Melinda Gates Foundation to create high-quality, useful lessons and research-based instructional tools incorporating the Common Core State Standards.

  15. Untangling home care's Gordian knot. The Home Care Information Management and Technology Forum.

    PubMed

    Wilhelm, Lawrence

    2003-03-01

    As home care and hospice technological tools have evolved over the past six years, there have been no efforts to standardize the collection, storage, and reporting of data among different systems. The rapid pace of technological change, increased use of wireless and remote technology, a greater reliance on tools for collaboration and networking, and the ever-increasing regulatory burden on home care and hospice providers have resulted in the need for policies and procedures for the standardization of data across the industry. Agency administrators, already strapped for cash and time, need to know which technology investments to make now in order to remain competitive in the future. The National Association for Home Care & Hospice has created a forum to address these concerns and to develop a blueprint for the future development of home care and hospice technology.

  16. Using component technologies for web based wavelet enhanced mammographic image visualization.

    PubMed

    Sakellaropoulos, P; Costaridou, L; Panayiotakis, G

    2000-01-01

    The poor contrast detectability of mammography can be dealt with by domain-specific software visualization tools. Remote desktop client access and time performance limitations of a previously reported visualization tool are addressed, aiming at more efficient visualization of mammographic image resources existing in web or PACS image servers. This effort is also motivated by the fact that, at present, web browsers do not support domain-specific medical image visualization. To deal with desktop client access, the tool was redesigned by exploring component technologies, enabling the integration of stand-alone domain-specific mammographic image functionality in a web browsing environment (web adaptation). The integration method is based on ActiveX Document Server technology. ActiveX Document is a part of Object Linking and Embedding (OLE) extensible systems object technology, offering new services in existing applications. The standard DICOM 3.0 part 10 compatible image-format specification Papyrus 3.0 is supported, in addition to standard digitization formats such as TIFF. The visualization functionality of the tool has been enhanced by including a fast wavelet transform implementation, which allows for real-time wavelet-based contrast enhancement and denoising operations. Initial use of the tool with mammograms of various breast structures demonstrated its potential in improving visualization of diagnostic mammographic features. Web adaptation and real-time wavelet processing enhance the potential of the previously reported tool in remote diagnosis and education in mammography.
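
    As a minimal sketch of the kind of operation the abstract describes, the following shows a single-level 1-D Haar wavelet transform with soft thresholding of the detail coefficients. It is illustrative only, not the authors' implementation (which is 2-D and optimized for real-time use); the signal values are invented.

    ```python
    # Illustrative sketch: wavelet-based denoising via a one-level Haar
    # transform and soft thresholding. Not the tool's actual algorithm.

    def haar_forward(signal):
        """One Haar level: pairwise averages (approximation) and details."""
        avg = [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]
        det = [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]
        return avg, det

    def soft_threshold(coeffs, t):
        """Shrink detail coefficients toward zero to suppress noise."""
        return [max(abs(c) - t, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

    def haar_inverse(avg, det):
        """Invert haar_forward: each (avg, det) pair restores two samples."""
        out = []
        for a, d in zip(avg, det):
            out.extend([a + d, a - d])
        return out

    # Invented noisy signal: small fluctuations around two intensity levels.
    noisy = [10.0, 10.2, 10.1, 9.9, 50.0, 50.3, 49.8, 50.1]
    avg, det = haar_forward(noisy)
    denoised = haar_inverse(avg, soft_threshold(det, 0.2))
    print(denoised)
    ```

    Thresholding the detail band removes small high-frequency fluctuations while preserving the large step between intensity levels, which is the same trade-off that wavelet-based contrast enhancement and denoising exploit in 2-D.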

  17. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    PubMed Central

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue has been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

  18. 15 CFR 292.3 - Technical tools, techniques, practices, and analyses projects.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... and Foreign Trade NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST... demonstrate that the tool or resource will be integrated into and will be of service to the NIST Manufacturing...

  19. 15 CFR 292.3 - Technical tools, techniques, practices, and analyses projects.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... and Foreign Trade NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST... demonstrate that the tool or resource will be integrated into and will be of service to the NIST Manufacturing...

  20. 15 CFR 292.3 - Technical tools, techniques, practices, and analyses projects.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and Foreign Trade NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST... demonstrate that the tool or resource will be integrated into and will be of service to the NIST Manufacturing...

  1. Introducing New Learning Tools into a Standard Classroom: A Multi-Tool Approach to Integrating Fuel-Cell Concepts into Introductory College Chemistry

    ERIC Educational Resources Information Center

    D'Amato, Matthew J.; Lux, Kenneth W.; Walz, Kenneth A.; Kerby, Holly Walter; Anderegg, Barbara

    2007-01-01

    A multi-tool approach incorporating traditional lectures, multimedia learning objects, and a laboratory activity was introduced to convey the concepts surrounding hydrogen fuel-cell technology in college chemistry courses. The new tools are adaptable, facilitating use in different educational environments, and address a variety of learning styles to…

  2. Assessing Adaptive Instructional Design Tools and Methods in ADAPT[IT].

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Spector, J. Michael

    ADAPT[IT] (Advanced Design Approach for Personalized Training - Interactive Tools) is a European project within the Information Society Technologies program that is providing design methods and tools to guide a training designer according to the latest cognitive science and standardization principles. ADAPT[IT] addresses users in two significantly…

  3. E-learning as a technological tool to meet the requirements of occupational standards in training of it specialists

    NASA Astrophysics Data System (ADS)

    Tokareva, N. A.; Tyatyushkina, O. Y.; Cheremisina, E. N.

    2016-09-01

    We discuss issues of updating educational programs to meet the requirements of the labor market and the occupational standards of the IT industry. We propose an e-learning technology that utilizes an open educational resource to enable employers' participation in the development of educational content and to intensify practical training.

  4. The Availability and Use of 21st Century Technology Tools in South Carolina Secondary Public School Library Media Centers

    ERIC Educational Resources Information Center

    DuRant, Kathleen D.

    2010-01-01

    The purpose of this study was to assess the readiness of South Carolina secondary school library media specialists to prepare students to meet the "AASL Standards for the 21st Century Learner" (American Association of School Librarians, 2009b) by investigating the availability of 21st century technology tools, the confidence level of…

  5. Relationship of Technology Skill Competencies and Reading and Math Standardized Test Scores

    ERIC Educational Resources Information Center

    Jordan, Stacie L.

    2012-01-01

    The purpose of this study was to determine if a relationship exists between technology skills and academic achievement among eighth-grade students. Previous studies investigated the relationship between the use of technology as a teaching tool and student outcomes, but none had specifically examined students' technology skill competencies with…

  6. Using Technology to Create and Administer Accessible Tests

    ERIC Educational Resources Information Center

    Salend, Spencer

    2009-01-01

    Technology is transforming many aspects of society including the ways teachers teach and students learn. Although technology has been firmly established as a teaching tool across a range of content areas, educators are realizing that technology also offers innovative ways to help their students take standardized tests that comply with the mandates…

  7. Energy-Saving Opportunities for Manufacturing Enterprises (International English Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This fact sheet provides information about the Industrial Technologies Program Save Energy Now energy audit process, software tools, training, energy management standards, and energy efficient technologies to help U.S. companies identify energy cost savings.

  8. Creation of an Integrated Environment to Supply e-Learning Platforms with Office Automation Features

    ERIC Educational Resources Information Center

    Palumbo, Emilio; Verga, Francesca

    2015-01-01

    Over the last years great efforts have been made within the University environment to implement e-learning technologies in the standard educational practice. These learning technologies distribute online educational multimedia contents through technological platforms. Even though specific e-learning tools for technical disciplines were already…

  9. Use and clinical efficacy of standard and health information technology fall risk assessment tools.

    PubMed

    Teh, Ruth C; Wilson, Anne; Ranasinghe, Damith; Visvanathan, Renuka

    2017-12-01

    To evaluate the health information technology (HIT) tool compared to the Fall Risk for Older Persons (FROP) tool in fall risk screening. A HIT tool trial was conducted on the geriatric evaluation and management (GEM, n = 111) and acute medical units (AMU, n = 424). Health information technology and FROP scores were higher on GEM versus AMU, with no differences between people who fell and people who did not fall. Both score completion rates were similar, and their values correlated marginally (Spearman's correlation coefficient 0.33, P < 0.01). HIT and FROP scores demonstrated similar sensitivity (80 vs 82%) and specificity (32 vs 36%) for detecting hospital falls. Hospital fall rates trended towards a reduction on AMU (4.20 vs 6.96, P = 0.15) and an increase on GEM (10.98 vs 6.52, P = 0.54) with HIT tool implementation. Health information technology tool acceptability and scoring were comparable to FROP screening, with mixed effects on fall rate with HIT tool implementation. Clinician partnership remains key to effective tool development.
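
    The screening metrics quoted above can be sketched in a few lines. The confusion-matrix counts below are hypothetical, chosen only to reproduce figures like the HIT tool's reported 80% sensitivity and 32% specificity; they are not the study's data.

    ```python
    # Sketch (not from the study): the sensitivity/specificity arithmetic
    # behind fall-risk screening-tool evaluation. Counts are invented.

    def sensitivity(true_pos: int, false_neg: int) -> float:
        """Proportion of actual fallers the tool flagged as at-risk."""
        return true_pos / (true_pos + false_neg)

    def specificity(true_neg: int, false_pos: int) -> float:
        """Proportion of non-fallers the tool correctly cleared."""
        return true_neg / (true_neg + false_pos)

    # Hypothetical counts for a screening tool:
    tp, fn = 40, 10     # fallers flagged / fallers missed
    tn, fp = 160, 340   # non-fallers cleared / non-fallers over-flagged

    print(f"sensitivity = {sensitivity(tp, fn):.0%}")  # 80%
    print(f"specificity = {specificity(tn, fp):.0%}")  # 32%
    ```

    The pattern in the abstract, high sensitivity with low specificity, is typical of screening tools tuned to miss few true fallers at the cost of many false alarms.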

  10. Preparing Special Education Teachers to Use Educational Technology to Enhance Student Learning

    ERIC Educational Resources Information Center

    Wallace, Teresa; Georgina, David

    2014-01-01

    New standards require teachers to integrate the use of technology in their teaching and preparing teachers at the preservice level to integrate technology into the classroom is key. The way in which this is accomplished varies across institutions though often a technology tools course stands as an individual course with the hope professors are…

  11. The Potential of Digital Technologies to Support Literacy Instruction Relevant to the Common Core State Standards

    ERIC Educational Resources Information Center

    Hutchison, Amy C.; Colwell, Jamie

    2014-01-01

    Digital tools have the potential to transform instruction and promote literacies outlined in the Common Core State Standards. Empirical research is examined to illustrate this potential in grades 6-12 instruction.

  12. A Multi-Year Plan for Research, Development, and Prototype Testing of Standard Modular Hydropower Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Brennan T.; Welch, Tim; Witt, Adam M.

    The Multi-Year Plan for Research, Development, and Prototype Testing of Standard Modular Hydropower Technology (MYRP) presents a strategy for specifying, designing, testing, and demonstrating the efficacy of standard modular hydropower (SMH) as an environmentally compatible and cost-optimized renewable electricity generation technology. The MYRP provides the context, background, and vision for testing the SMH hypothesis: if standardization, modularity, and preservation of stream functionality become essential and fully realized features of hydropower technology, project design, and regulatory processes, they will enable previously unrealized levels of new project development with increased acceptance, reduced costs, increased predictability of outcomes, and increased value to stakeholders. To achieve success in this effort, the MYRP outlines a framework of stakeholder-validated criteria, models, design tools, testing facilities, and assessment protocols that will facilitate the development of next-generation hydropower technologies.

  13. Evaluating geographic information systems technology

    USGS Publications Warehouse

    Guptill, Stephen C.

    1989-01-01

    Computerized geographic information systems (GISs) are emerging as the spatial data handling tools of choice for solving complex geographical problems. However, few guidelines exist for assisting potential users in identifying suitable hardware and software. A process to be followed in evaluating the merits of GIS technology is presented. Related standards and guidelines, software functions, hardware components, and benchmarking are discussed. By making users aware of all aspects of adopting GIS technology, they can decide if GIS is an appropriate tool for their application and, if so, which GIS should be used.

  14. Information Technology: A Tool to Cut Health Care Costs

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Maly, K. J.; Overstreet, C. M.; Foudriat, E. C.

    1996-01-01

    Old Dominion University embarked on a project to see how current computer technology could be applied to reduce the cost and/or improve the efficiency of health care services. We designed and built a prototype for an integrated medical record system (MRS). The MRS is written in Tool Command Language/Toolkit (Tcl/Tk). While the initial version of the prototype had patient information hard-coded into the system, later versions used an INGRES database for storing patient information. Currently, we have proposed an object-oriented model for implementing the MRS. These projects involve developing information systems for physicians and medical researchers to enhance their ability to deliver improved treatment at reduced costs. The move to computerized patient records is well underway, several standards exist for laboratory records, and several groups are working on standards for other portions of the patient record.

  15. Developing a Web-Based Tool Using Information and Communication Technologies to Expand the Reach and Impact of Photovoice

    ERIC Educational Resources Information Center

    Strack, Robert W.; Orsini, Muhsin Michael; Fearnow-Kenney, Melodie; Herget, Jennifer; Milroy, Jeffrey J.; Wyrick, David L.

    2015-01-01

    Information and communication technologies are opening up vast new arenas for conducting the work of health promotion. Technology-based health promotions expand reach, standardize information and its delivery, provide opportunities for tailoring, create engaging interactivity within content delivery, provide for privacy and autonomy, improve…

  16. New Technology in Schools: Is There a Payoff? CEE DP 55

    ERIC Educational Resources Information Center

    Machin, Stephen; McNally, Sandra; Silva, Olmo

    2006-01-01

    In recent years the role of investment in Information and Communication Technology (ICT) as an effective tool to raise educational standards has attracted growing attention from both policy makers and academic researchers. While the former tend to express enthusiastic claims about the use of new technologies in schools, the latter have raised…

  17. Section 508 Standards

    EPA Pesticide Factsheets

    Guidelines, design tips, and tools for ensuring that all Information and Communications Technology, including websites, software, hardware, multimedia, and telecommunications, is accessible to users with disabilities.

  18. ResearchEHR: use of semantic web technologies and archetypes for the description of EHRs.

    PubMed

    Robles, Montserrat; Fernández-Breis, Jesualdo Tomás; Maldonado, Jose A; Moner, David; Martínez-Costa, Catalina; Bosca, Diego; Menárguez-Tortosa, Marcos

    2010-01-01

    In this paper, we present the ResearchEHR project. It focuses on the usability of Electronic Health Record (EHR) sources and EHR standards for building advanced clinical systems. The aim is to support healthcare professionals, institutions, and authorities by providing a set of generic methods and tools for the capture, standardization, integration, description, and dissemination of health-related information. ResearchEHR combines several tools to manage EHRs at two different levels: an internal level, which deals with the normalization and semantic upgrading of existing EHRs by using archetypes, and an external level, which uses Semantic Web technologies to specify clinical archetypes for advanced EHR architectures and systems.

  19. EUV mask pilot line at Intel Corporation

    NASA Astrophysics Data System (ADS)

    Stivers, Alan R.; Yan, Pei-Yang; Zhang, Guojing; Liang, Ted; Shu, Emily Y.; Tejnil, Edita; Lieberman, Barry; Nagpal, Rajesh; Hsia, Kangmin; Penn, Michael; Lo, Fu-Chang

    2004-12-01

    The introduction of extreme ultraviolet (EUV) lithography into high volume manufacturing requires the development of a new mask technology. In support of this, Intel Corporation has established a pilot line devoted to encountering and eliminating barriers to manufacturability of EUV masks. It concentrates on EUV-specific process modules and makes use of the captive standard photomask fabrication capability of Intel Corporation. The goal of the pilot line is to accelerate EUV mask development to intersect the 32nm technology node. This requires EUV mask technology to be comparable to standard photomask technology by the beginning of the silicon wafer process development phase for that technology node. The pilot line embodies Intel's strategy to lead EUV mask development in the areas of the mask patterning process, mask fabrication tools, the starting material (blanks) and the understanding of process interdependencies. The patterning process includes all steps from blank defect inspection through final pattern inspection and repair. We have specified and ordered the EUV-specific tools and most will be installed in 2004. We have worked with International Sematech and others to provide for the next generation of EUV-specific mask tools. Our process of record is run repeatedly to ensure its robustness. This primes the supply chain and collects information needed for blank improvement.

  20. The role of standards in the development and implementation of clinical laboratory tests: a domestic and global perspective.

    PubMed

    Michaud, Ginette Y

    2005-01-01

    In the field of clinical laboratory medicine, standardization is aimed at increasing the trueness and reliability of measured values. Standardization relies on the use of written standards, reference measurement procedures and reference materials. These are important tools for the design and validation of new tests, and for establishing the metrological traceability of diagnostic assays. Their use supports the translation of research technologies into new diagnostic assays and leads to more rapid advances in science and medicine, as well as improvements in the quality of patient care. The various standardization tools are described, as are the procedures by which written standards, reference procedures and reference materials are developed. Recent efforts to develop standards for use in the field of molecular diagnostics are discussed. The recognition of standardization tools by the FDA and other regulatory authorities is noted as evidence of their important role in ensuring the safety and performance of in vitro diagnostic devices.

  1. cDNA Microarray Screening in Food Safety

    PubMed Central

    ROY, SASHWATI; SEN, CHANDAN K

    2009-01-01

    The cDNA microarray technology and related bioinformatics tools present a wide range of novel application opportunities. The technology may be productively applied to address food safety. In this mini-review article, we present an update highlighting the late-breaking discoveries that demonstrate the vitality of cDNA microarray technology as a tool to analyze food safety with reference to microbial pathogens and genetically modified foods. In order to bring microarray technology to mainstream food safety, it is important to develop robust, user-friendly tools that may be applied in a field setting. In addition, there needs to be a standardized process for regulatory agencies to interpret and act upon microarray-based data. The cDNA microarray approach is an emergent technology in diagnostics. Its value lies in being able to provide complementary molecular insight when employed in addition to traditional tests for food safety, as part of a more comprehensive battery of tests. PMID:16466843

  2. The BiSciCol Triplifier: bringing biodiversity data to the Semantic Web.

    PubMed

    Stucky, Brian J; Deck, John; Conlin, Tom; Ziemba, Lukasz; Cellinese, Nico; Guralnick, Robert

    2014-07-29

    Recent years have brought great progress in efforts to digitize the world's biodiversity data, but integrating data from many different providers, and across research domains, remains challenging. Semantic Web technologies have been widely recognized by biodiversity scientists for their potential to help solve this problem, yet these technologies have so far seen little use for biodiversity data. Such slow uptake has been due, in part, to the relative complexity of Semantic Web technologies along with a lack of domain-specific software tools to help non-experts publish their data to the Semantic Web. The BiSciCol Triplifier is new software that greatly simplifies the process of converting biodiversity data in standard, tabular formats, such as Darwin Core-Archives, into Semantic Web-ready Resource Description Framework (RDF) representations. The Triplifier uses a vocabulary based on the popular Darwin Core standard, includes both Web-based and command-line interfaces, and is fully open-source software. Unlike most other RDF conversion tools, the Triplifier does not require detailed familiarity with core Semantic Web technologies, and it is tailored to a widely popular biodiversity data format and vocabulary standard. As a result, the Triplifier can often fully automate the conversion of biodiversity data to RDF, thereby making the Semantic Web much more accessible to biodiversity scientists who might otherwise have relatively little knowledge of Semantic Web technologies. Easy availability of biodiversity data as RDF will allow researchers to combine data from disparate sources and analyze them with powerful linked data querying tools. However, before software like the Triplifier, and Semantic Web technologies in general, can reach their full potential for biodiversity science, the biodiversity informatics community must address several critical challenges, such as the widespread failure to use robust, globally unique identifiers for biodiversity data.
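
    The tabular-to-RDF conversion the Triplifier automates can be sketched as follows. This is not the Triplifier's code: the term IRIs use the published Darwin Core namespace, but the record-IRI scheme and the sample data are hypothetical, and a real converter would type literals and handle identifiers far more carefully.

    ```python
    # Minimal sketch of converting a tabular Darwin Core record into RDF,
    # serialized as N-Triples. Record IRI and data are invented examples.

    DWC = "http://rs.tdwg.org/dwc/terms/"  # Darwin Core term namespace

    def row_to_ntriples(record_iri: str, row: dict) -> list:
        """Emit one N-Triples line per populated Darwin Core field."""
        triples = []
        for term, value in row.items():
            if value:  # skip empty cells
                triples.append(f'<{record_iri}> <{DWC}{term}> "{value}" .')
        return triples

    row = {"scientificName": "Puma concolor", "country": "Brazil", "locality": ""}
    for line in row_to_ntriples("http://example.org/occ/1", row):
        print(line)
    ```

    Once records are expressed as triples against a shared vocabulary, data from different providers can be merged and queried together, which is the linked-data payoff the abstract describes.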

  3. Synergistic control center development utilizing commercial technology and industry standards. [NASA space programs

    NASA Technical Reports Server (NTRS)

    Anderson, Brian L.

    1993-01-01

    The development of the Control Center Complex (CCC), a synergistic control center supporting both the Space Station Freedom and the Space Shuttle Program, is described. To provide maximum growth and flexibility, the CCC uses commercial off-the-shelf technology and industry standards. The discussion covers the development philosophy, CCC architecture, data distribution, the software platform concept, workstation platform, commercial tools for the CCC, and benefits of synergy.

  4. Measuring Up: Online Technology Assessment Tools Ease the Teacher's Burden and Help Students Learn

    ERIC Educational Resources Information Center

    Roland, Jennifer

    2006-01-01

    Standards are a reality in all academic disciplines, and they can be hard to measure using conventional methods. Technology skills in particular are hard to assess using multiple-choice, paper-based tests. A new generation of online assessments of student technology skills allows students to prove proficiency by completing tasks in their natural…

  5. National Technology Standards for K-12 Schools: A Case Study of Unresolved Issues in Public Relations

    ERIC Educational Resources Information Center

    Mullen, Carol A.; Kealy, William A.; Sullivan, Ashley

    2004-01-01

    This article addresses an important need--the dissemination of information relating to technology as a public relations tool--and the associated exigency for administrator and teacher technology training. Specifically, we identify the increased expectations for the performance of school leaders and teachers, as well as unresolved issues in public…

  6. Science 2.0: Communicating Science Creatively

    ERIC Educational Resources Information Center

    Smith, Ben; Mader, Jared

    2017-01-01

    This column shares web tools that support learning. The authors have been covering the International Society for Technology in Education (ISTE) standards in every issue since September 2016. This article examines the final standard, called Creative Communicator, which requires students to communicate effectively and creatively express themselves…

  7. Lessons learned from a secret Facebook support group.

    PubMed

    Oliver, Debra Parker; Washington, Karla; Wittenberg-Lyles, Elaine; Gage, Ashley; Mooney, Megan; Demiris, George

    2015-05-01

    The National Association of Social Workers developed practice standards for social workers using technology in their practice. These standards were derived from the foundation of the social work code of ethics and are helpful as social workers explore the use of new tools for the benefit of their clients. Hospice caregivers, both active and bereaved, are in great need of support but are often unable to attend traditional support groups. Facebook secret groups offer social workers a potential tool, given the geographic barriers that exist for traditional face-to-face support groups. The authors' experience with a secret Facebook group indicates that the technology can be useful when managed by a social worker facilitator. As social workers continue to explore helpful ways to use technology with clients, it is critical that they evaluate that practice and assess the clinical outcomes to establish an evidence base behind this practice.

  8. The future is now: Technology's impact on the practice of genetic counseling.

    PubMed

    Gordon, Erynn S; Babu, Deepti; Laney, Dawn A

    2018-03-01

    Smartphones, artificial intelligence, automation, digital communication, and other types of technology are playing an increasingly important role in our daily lives. It is no surprise that technology is also shaping the practice of medicine, and more specifically the practice of genetic counseling. While digital tools have been part of the practice of medical genetics for decades, such as internet- or CD-ROM-based tools like Online Mendelian Inheritance in Man and Pictures of Standard Syndromes and Undiagnosed Malformations in the 1980s, the potential for emerging tools to change how we practice and the way patients consume information is startling. Technology has the potential to aid in at-risk patient identification, assist in generating a differential diagnosis, improve efficiency in medical history collection and risk assessment, provide educational support for patients, and streamline follow-up. Here we review the historic and current uses of technology in genetic counseling, identify challenges to integration, and propose future applications of technology that can shape the practice of genetic counseling.

  9. 48 CFR 23.705 - Electronic products environmental assessment tool.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... REGULATION SOCIOECONOMIC PROGRAMS ENVIRONMENT, ENERGY AND WATER EFFICIENCY, RENEWABLE ENERGY TECHNOLOGIES... standard for personal computer products— (i) Was issued by the Institute of Electrical and Electronics.... 104-113, the “National Technology Transfer and Advancement Act of 1995”, (see 11.102(c)); (iii) Meets...

  10. 48 CFR 23.704 - Electronic products environmental assessment tool.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... REGULATION SOCIOECONOMIC PROGRAMS ENVIRONMENT, ENERGY AND WATER EFFICIENCY, RENEWABLE ENERGY TECHNOLOGIES... standard for personal computer products— (i) Was issued by the Institute of Electrical and Electronics.... 104-113, the “National Technology Transfer and Advancement Act of 1995”, (see 11.102(c)); (iii) Meets...

  11. 48 CFR 23.704 - Electronic products environmental assessment tool.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... REGULATION SOCIOECONOMIC PROGRAMS ENVIRONMENT, ENERGY AND WATER EFFICIENCY, RENEWABLE ENERGY TECHNOLOGIES... standard for personal computer products— (i) Was issued by the Institute of Electrical and Electronics.... 104-113, the “National Technology Transfer and Advancement Act of 1995”, (see 11.102(c)); (iii) Meets...

  12. Investigating Methods for Serving Visualizations of Vertical Profiles

    NASA Astrophysics Data System (ADS)

    Roberts, J. T.; Cechini, M. F.; Lanjewar, K.; Rodriguez, J.; Boller, R. A.; Baynes, K.

    2017-12-01

    Several geospatial web servers, web service standards, and mapping clients exist for the visualization of two-dimensional raster and vector-based Earth science data products. However, data products with a vertical component (i.e., vertical profiles) do not have the same mature set of technologies and pose a greater technical challenge when it comes to visualization. There are a variety of tools and proposed standards, but no obvious solution that can handle the range of visualizations found with vertical profiles. An effort is being led by members of the NASA Global Imagery Browse Services (GIBS) team to gather a list of technologies relevant to existing vertical profile data products and user stories. The goal is to find a subset of technologies, standards, and tools that can be used to build publicly accessible web services that can handle the greatest number of use cases for the widest audience possible. This presentation will describe the results of the investigation and offer directions for moving forward with building a system that is capable of effectively and efficiently serving visualizations of vertical profiles.

  13. COMPASS: A general purpose computer aided scheduling tool

    NASA Technical Reports Server (NTRS)

    Mcmahon, Mary Beth; Fox, Barry; Culbert, Chris

    1991-01-01

COMPASS is a generic scheduling system developed by McDonnell Douglas under the direction of the Software Technology Branch at JSC. COMPASS is intended to illustrate the latest advances in scheduling technology and provide a basis from which custom scheduling systems can be built. COMPASS was written in Ada to promote readability and to conform to potential NASA Space Station Freedom standards. COMPASS has some unique characteristics that distinguish it from commercial products. These characteristics are discussed and used to illustrate some differences between scheduling tools.

  14. The development of internet based ship design support system for small and medium sized shipyards

    NASA Astrophysics Data System (ADS)

    Shin, Sung-Chul; Lee, Soon-Sup; Kang, Dong-Hoon; Lee, Kyung-Ho

    2012-03-01

In this paper, a prototype ship basic planning system is implemented for small and medium sized shipyards, based on internet technology and the concurrent engineering concept. The system is designed from user requirements; accordingly, a standardized development environment and tools are selected. These tools are used during system development to define and evaluate core application technologies. The system will contribute to increasing the competitiveness of small and medium sized shipyards in the 21st century industrial environment.

  15. Improving Nursing Satisfaction with Bedside-Information, Technology-Enhanced Handoffs

    ERIC Educational Resources Information Center

    Chapman, Yvonne L.

    2014-01-01

    Due to renewed national focus on patient safety and patient outcomes, the advent of the electronic health record (EHR) and standardization of data management has prompted the utilization of information technology (IT) tools to enhance nursing bedside handoff. However, there is limited literature regarding the nurses' satisfaction with the…

  16. Science Teaching Orientations and Technology-Enhanced Tools for Student Learning

    NASA Astrophysics Data System (ADS)

    Campbell, Todd; Longhurst, Max; Duffy, Aaron M.; Wolf, Paul G.; Shelton, Brett E.

    2013-10-01

This qualitative study examines teacher orientations and technology-enhanced tools for student learning within a science literacy framework. Data for this study came from a group of 10 eighth grade science teachers. Each of these teachers was a participant in a professional development (PD) project focused on reformed and technology-enhanced science instruction shaped by national standards documents. The research is focused on identifying teacher orientations and use of technology-enhanced tools prior to or unaffected by PD. The primary data sources for this study are drawn from learning journals and classroom observations. Qualitative methods were used to analyze the learning journals, while descriptive statistics from the classroom observations were used to further explore and triangulate the emergent qualitative findings. Two teacher orientation profiles were developed to reveal the emergent teacher orientation dimensions and technology-enhanced tool categories found: a "more traditional teacher orientation profile" and a "toward a reformed-based teacher orientation profile." Both profiles were founded on "knowledge of" beliefs about the goals and purposes for science education, while neither profile revealed sophisticated beliefs about the nature of science. The "traditional" profile revealed more teacher-centered beliefs about science teaching and learning, and the "towards reformed-based" profile revealed student-centered beliefs. Finally, only technology-enhanced tools supportive of collaborative construction of science knowledge were found connected to the "towards reformed-based" profile. This research concludes with a proposed "reformed-based teacher orientation profile" as a future target for science teaching and learning with technology-enhanced tools in a science literacy framework.

  17. Development of the public information and communication technology assessment tool.

    PubMed

    Ripat, Jacquie; Watzke, James; Birch, Gary

    2008-09-01

    Public information and communication technologies, such as information kiosks, automated banking machines and ticket dispensers, allow people to access services in a convenient and timely manner. However, the development of these technologies has occurred largely without consideration of access by people with disabilities. Inaccessible technical features make operation of a public technology difficult and barriers in the environment create navigational challenges, limiting the opportunity of people with disabilities to use these devices and access the services they provide. This paper describes the development of a tool that individuals, disability advocacy groups, business owners, healthcare providers, and urban planners can use to evaluate the accessibility of public technologies and the surrounding environment. Evaluation results can then be used to develop recommendations and advocate for technical and environmental changes to improve access. Tool development consisted of a review of the literature and key Canadian Standards Association documents, task analysis, and consultation with accessibility experts. Studies of content validity, tool usability, inter-rater and test-retest reliability were conducted in sites across Canada. Accessibility experts verified the content validity of the tool. The current version of the tool has incorporated the findings of a usability study. Initial testing indicated excellent agreement for inter-rater and test-retest reliability scores. Social exclusion can arise when public technologies are not accessible. This newly developed instrument provides detailed information that can be used to advocate for more accessible and inclusive public information and communication technologies.
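The inter-rater and test-retest agreement reported for this tool is the kind of result usually quantified with a chance-corrected statistic such as Cohen's kappa. As an illustrative sketch only (the abstract does not say which agreement statistic was used, and the rater data below is hypothetical), two raters scoring the same accessibility checklist items could be compared like this:

```python
def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal label frequencies.
    labels = set(rater_a) | set(rater_b)
    p_e = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)

# Two raters scoring six accessibility checklist items (hypothetical data).
a = ["pass", "pass", "fail", "pass", "fail", "fail"]
b = ["pass", "pass", "fail", "fail", "fail", "fail"]
print(round(cohens_kappa(a, b), 3))  # → 0.667
```

Values near 1.0 indicate the "excellent agreement" the study reports; values near 0 indicate agreement no better than chance.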

  18. The Semiautomated Test System: A Tool for Standardized Performance Testing.

    ERIC Educational Resources Information Center

    Ramsey, H. Rudy

    For performance tests to be truly standardized, they must be administered in a way that will minimize variation due to operator intervention and errors. Through such technological developments as low-cost digital computers and digital logic modules, automatic test administration without restriction of test content has become possible. A…

  19. Data Acceptance Criteria for Standardized Human-Associated Fecal Source Identification Quantitative Real-Time PCR Methods

    EPA Science Inventory

There is a growing interest in the application of human-associated fecal source identification quantitative real-time PCR (qPCR) technologies for water quality management. The transition from a research tool to a standardized protocol requires a high degree of confidence in data q...

  20. ARBOOK: Development and Assessment of a Tool Based on Augmented Reality for Anatomy

    NASA Astrophysics Data System (ADS)

    Ferrer-Torregrosa, J.; Torralba, J.; Jimenez, M. A.; García, S.; Barcia, J. M.

    2015-02-01

Technologies and new tools for educational purposes continue to evolve and grow. This work presents the experience of a new tool based on augmented reality (AR) focusing on the anatomy of the lower limb. ARBOOK was constructed and developed based on TC and MRN images, dissections and drawings. For ARBOOK evaluation, a specific questionnaire of three blocks was developed and validated according to the Delphi method. The questionnaire included motivation and attention tasks, autonomous work, and three-dimensional interpretation tasks. A total of 211 students from 7 public and private Spanish universities were divided into two groups. The control group received standard teaching sessions supported by books and video. The ARBOOK group received the same standard sessions but additionally used the ARBOOK tool. At the end of the training, a written test on lower limb anatomy was completed by the students. Statistically significantly better scores for the ARBOOK group were found on the attention-motivation, autonomous work and three-dimensional comprehension tasks. Additionally, significantly better scores were obtained by the ARBOOK group on the written test. The results strongly suggest that the use of AR is suitable for anatomical teaching. Specifically, the results indicate how this technology helps with student motivation, autonomous work and spatial interpretation. This type of technology deserves all the more attention at the present moment, when new technologies are naturally incorporated into our daily lives.

  1. An Examination of the Changes in Science Teaching Orientations and Technology-Enhanced Tools for Student Learning in the Context of Professional Development

    NASA Astrophysics Data System (ADS)

    Campbell, Todd; Zuwallack, Rebecca; Longhurst, Max; Shelton, Brett E.; Wolf, Paul G.

    2014-07-01

This research examines how science teaching orientations and beliefs about technology-enhanced tools change over time in professional development (PD). The primary data sources for this study came from the learning journals of 8 eighth grade science teachers at the beginning and conclusion of a year of PD. Based on the analysis completed, Information Transmission (IT) and Struggling with Standards-Based Reform (SSBR) profiles were found at the beginning of the PD, while SSBR and Standards-Based Reform (SBR) profiles were identified at its conclusion. All profiles exhibited Vision I beliefs about the goals and purposes for science education, while only the SBR profile exhibited Vision II goals and purposes for science teaching. The IT profile demonstrated naïve or unrevealed beliefs about the nature of science, while the SSBR and SBR profiles had more sophisticated beliefs in this area. The IT profile was grounded in more teacher-centered beliefs about science teaching and learning, while the other two profiles revealed more student-centered beliefs. While no beliefs about technology-enhanced tools were found for the IT profile, these were found for the other two profiles. Our findings suggest promising implications for (a) Roberts' Vision II as a central support for reform efforts, (b) situating technology-enhanced tools within the beliefs about science teaching and learning dimension of science teaching orientations, and (c) revealing how teacher orientations develop as a result of PD.

  2. funcLAB/G-service-oriented architecture for standards-based analysis of functional magnetic resonance imaging in HealthGrids.

    PubMed

    Erberich, Stephan G; Bhandekar, Manasee; Chervenak, Ann; Kesselman, Carl; Nelson, Marvin D

    2007-01-01

Functional MRI is successfully being used in clinical and research applications including preoperative planning, language mapping, and outcome monitoring. However, clinical use of fMRI is limited by the complexity of imaging, image workflow, and post-processing, and by a lack of algorithmic standards that hinders result comparability. As a consequence, widespread adoption of fMRI as a clinical tool is low, contributing to uncertainty among community physicians about how to integrate fMRI into practice. In addition, training of physicians with fMRI is in its infancy and requires both clinical and technical understanding. Therefore, many institutions that perform fMRI rely on a team of basic researchers and physicians to run fMRI as a routine imaging tool. In order to provide fMRI as an advanced diagnostic tool to the benefit of a larger patient population, image acquisition and post-processing must be streamlined, standardized, and made available to institutions that do not have these resources in-house. Here we describe a software architecture, the functional imaging laboratory (funcLAB/G), which addresses (i) standardized image processing using Statistical Parametric Mapping and (ii) its extension to secure sharing and availability for the community using standards-based Grid technology (Globus Toolkit). funcLAB/G carries the potential to overcome the limitations of fMRI in clinical use and thus makes standardized fMRI available to the broader healthcare enterprise utilizing the Internet and HealthGrid Web Services technology.

  3. Examining the Technology Integration Planning Cycle Model of Professional Development to Support Teachers' Instructional Practices

    ERIC Educational Resources Information Center

    Hutchison, Amy C.; Woodward, Lindsay

    2018-01-01

    Background: Presently, models of professional development aimed at supporting teachers' technology integration efforts are often short and decontextualized. With many schools across the country utilizing standards that require students to engage with digital tools, a situative model that supports building teachers' knowledge within their…

  4. Problem Solving in the Digital Age: New Ideas for Secondary Mathematics Teacher Education

    ERIC Educational Resources Information Center

    Abramovich, Sergei; Connell, Michael

    2017-01-01

    The paper reflects on an earlier research on the use of technology in secondary mathematics teacher education through the lenses of newer digital tools (Wolfram Alpha, Maple), most recent standards for teaching mathematics, and recommendations for the preparation of schoolteachers. New ideas of technology integration into mathematics education…

  5. 15 CFR 291.3 - Environmental tools and techniques projects.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS... resource will be integrated into and will be of service to the NIST Manufacturing Extension Centers...

  6. 15 CFR 291.3 - Environmental tools and techniques projects.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS... resource will be integrated into and will be of service to the NIST Manufacturing Extension Centers...

  7. 15 CFR 291.3 - Environmental tools and techniques projects.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS... resource will be integrated into and will be of service to the NIST Manufacturing Extension Centers...

  8. ART-Ada: An Ada-based expert system tool

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel; Allen, Bradley P.

    1990-01-01

    The Department of Defense mandate to standardize on Ada as the language for software systems development has resulted in an increased interest in making expert systems technology readily available in Ada environments. NASA's Space Station Freedom is an example of the large Ada software development projects that will require expert systems in the 1990's. Another large scale application that can benefit from Ada based expert system tool technology is the Pilot's Associate (PA) expert system project for military combat aircraft. The Automated Reasoning Tool-Ada (ART-Ada), an Ada expert system tool, is explained. ART-Ada allows applications of a C-based expert system tool called ART-IM to be deployed in various Ada environments. ART-Ada is being used to implement several prototype expert systems for NASA's Space Station Freedom program and the U.S. Air Force.

  9. ART-Ada: An Ada-based expert system tool

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel; Allen, Bradley P.

    1991-01-01

The Department of Defense mandate to standardize on Ada as the language for software systems development has resulted in increased interest in making expert systems technology readily available in Ada environments. NASA's Space Station Freedom is an example of the large Ada software development projects that will require expert systems in the 1990's. Another large scale application that can benefit from Ada-based expert system tool technology is the Pilot's Associate (PA) expert system project for military combat aircraft. The Automated Reasoning Tool-Ada (ART-Ada), an Ada expert system tool, is described. ART-Ada allows applications of a C-based expert system tool called ART-IM to be deployed in various Ada environments. ART-Ada is being used to implement several prototype expert systems for NASA's Space Station Freedom Program and the U.S. Air Force.

  10. Social, ethical and legal barriers to e-health.

    PubMed

    Anderson, James G

    2007-01-01

Information technology such as electronic medical records (EMRs), electronic prescribing, and decision support systems are recognized as essential tools in Europe, the U.S., Canada, Australia, and New Zealand. But significant barriers impede wide-scale adoption of these tools, especially EMR systems. The objectives of this study were to investigate the present status of information technology in health care and the benefits and barriers perceived by primary care physicians. Literature analysis and survey data from primary care physicians on adoption of information technology are reviewed. The U.S. trails European countries as well as Canada, Australia, and New Zealand in the use of information technology in primary care. The results of the study indicate that physicians in general perceive benefits to information technology, but also cite major barriers to its implementation in their practices. These barriers include lack of access to capital by health care providers, complex systems, lack of data standards that permit exchange of clinical data, privacy concerns, and legal barriers. Overcoming these barriers will require subsidies and performance incentives from payers and government; certification and standardization of vendor applications that permit clinical data exchange; removal of legal barriers; and greater security of medical data to convince practitioners and patients of the value of EMRs.

  11. Student Attitudes toward Web-Enhanced and Web-Based Versions of a Learning Tools Course

    ERIC Educational Resources Information Center

    Davidson-Shivers, Gayle V.; Wimberg, Jane E.; Jackson, M. Katherine

    2004-01-01

    The presentation describes the revisions to a course and the resulting student attitudes and learning. Learning Tools was revised in 2003 from oncampus only to both oncampus and online delivery. Revisions were made by standardizing the two versions, updating the technology applications presented, and modifying the instructional strategies used.…

  12. Visualization and modeling of smoke transport over landscape scales

    Treesearch

    Glenn P. Forney; William Mell

    2007-01-01

    Computational tools have been developed at the National Institute of Standards and Technology (NIST) for modeling fire spread and smoke transport. These tools have been adapted to address fire scenarios that occur in the wildland urban interface (WUI) over kilometer-scale distances. These models include the smoke plume transport model ALOFT (A Large Open Fire plume...

  13. Connecting Research to Teaching: Evaluating and Writing Dynamic Geometry Tasks

    ERIC Educational Resources Information Center

    Trocki, Aaron

    2014-01-01

    The advent of dynamic geometry software has changed the way students draw, construct, and measure by using virtual tools instead of or along with physical tools. Use of technology in general and of dynamic geometry in particular has gained traction in mathematics education, as evidenced in the Common Core State Standards for Mathematics (CCSSI…

  14. A Planning Cycle for Integrating Digital Technology into Literacy Instruction

    ERIC Educational Resources Information Center

    Hutchison, Amy; Woodward, Lindsay

    2014-01-01

    With the adoption of the Common Core State Standards by most states, the use of digital tools in literacy and language arts instruction has become of critical importance to educators. These changes produce a need for a better understanding of how literacy and language arts teachers can successfully integrate digital tools into their instruction…

  15. Advice Networks and Local Diffusion of Technological Innovations

    NASA Astrophysics Data System (ADS)

    Barahona, Juan Carlos; Pentland, Alex Sandy

Classical writers such as John Stuart Mill and Karl Marx speculated that the standard of living could not rise indefinitely unless advances in technology increased the yield of the means of production. Neoclassical growth theory, based on capital accumulation, supports this intuition [1]. Digital tools increase personal productivity. Communication technologies enhance coordination among individuals and increase the efficacy and efficiency of collective efforts. In both ways, technology contributes to wealth creation and the overall welfare of the community.

  16. Leveraging Open Standards and Technologies to Search and Display Planetary Image Data

    NASA Astrophysics Data System (ADS)

    Rose, M.; Schauer, C.; Quinol, M.; Trimble, J.

    2011-12-01

Mars and the Moon have both been visited by multiple NASA spacecraft. A large number of images and other data have been gathered by the spacecraft and are publicly available in NASA's Planetary Data System. Through a collaboration with Google, Inc., the User Centered Technologies group at NASA Ames Research Center has developed a tool for searching and browsing among images from multiple Mars and Moon missions. Development of this tool was facilitated by the use of several open technologies and standards. First, an open-source full-text search engine is used both to search place names on the target body and to find images matching a geographic region. Second, the published API of the Google Earth browser plugin is used to geolocate the images on a virtual globe and allow the user to navigate on the globe to see related images. The structure of the application also employs standard protocols and services. The back-end is exposed as RESTful APIs, which could be reused by other client systems in the future. Further, the communication between the front- and back-end portions of the system utilizes open data standards including XML and KML (Keyhole Markup Language) for representation of textual and geographic data. The creation of the search index was facilitated by reuse of existing, publicly available metadata, including the Gazetteer of Planetary Nomenclature from the USGS, available in KML format. The image metadata was likewise reused from standards-compliant archives in the Planetary Data System. The system also supports collaboration with other tools by allowing export of search results in KML, and the ability to display those results in the Google Earth desktop application. We will demonstrate the search and visualization capabilities of the system, with emphasis on how the system facilitates reuse of data and services through the adoption of open standards.
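The KML export of search results described here can be illustrated with a short sketch. The element names follow the public KML 2.2 schema; the result fields and the sample Mars feature are hypothetical stand-ins, not the actual service's output format:

```python
import xml.etree.ElementTree as ET

def results_to_kml(results):
    """Serialize geolocated image search results as a KML document string."""
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    document = ET.SubElement(kml, "Document")
    for r in results:
        placemark = ET.SubElement(document, "Placemark")
        ET.SubElement(placemark, "name").text = r["name"]
        point = ET.SubElement(placemark, "Point")
        # KML coordinates are ordered lon,lat[,altitude].
        ET.SubElement(point, "coordinates").text = f"{r['lon']},{r['lat']},0"
    return ET.tostring(kml, encoding="unicode")

# Hypothetical search result for a Mars feature.
print(results_to_kml([{"name": "Gale Crater", "lon": 137.8, "lat": -5.4}]))
```

A file produced this way can be opened directly in the Google Earth desktop application, matching the export path the abstract describes.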

  17. Setting the standards for signal transduction research.

    PubMed

    Saez-Rodriguez, Julio; Alexopoulos, Leonidas G; Stolovitzky, Gustavo

    2011-02-15

Major advances in high-throughput technology platforms, coupled with increasingly sophisticated computational methods for systematic data analysis, have provided scientists with tools to better understand the complexity of signaling networks. In this era of massive and diverse data collection, standardization efforts that streamline data gathering, analysis, storage, and sharing are becoming a necessity. Here, we give an overview of current technologies to study signal transduction. We argue that along with the opportunities the new technologies open, their heterogeneous nature poses critical challenges for data handling, challenges that grow further when data are to be integrated into mathematical models. Efficient standardization through markup languages and data annotation is a sine qua non for a systems-level analysis of signaling processes. It remains to be seen to what extent, and how quickly, the emerging standardization efforts will be embraced by the signaling community.

  18. A survey of enabling technologies in synthetic biology

    PubMed Central

    2013-01-01

Background Realizing constructive applications of synthetic biology requires continued development of enabling technologies as well as policies and practices to ensure these technologies remain accessible for research. Broadly defined, enabling technologies for synthetic biology include any reagent or method that, alone or in combination with associated technologies, provides the means to generate any new research tool or application. Because applications of synthetic biology likely will embody multiple patented inventions, it will be important to create structures for managing intellectual property rights that best promote continued innovation. Monitoring the enabling technologies of synthetic biology will facilitate the systematic investigation of property rights coupled to these technologies and help shape policies and practices that impact the use, regulation, patenting, and licensing of these technologies. Results We conducted a survey among a self-identifying community of practitioners engaged in synthetic biology research to obtain their opinions and experiences with technologies that support the engineering of biological systems. Technologies widely used and considered enabling by survey participants included public and private registries of biological parts, standard methods for physical assembly of DNA constructs, genomic databases, software tools for search, alignment, analysis, and editing of DNA sequences, and commercial services for DNA synthesis and sequencing. Standards and methods supporting measurement, functional composition, and data exchange were less widely used, though still considered enabling by a subset of survey participants. Conclusions The set of enabling technologies compiled from this survey provides insight into the many and varied technologies that support innovation in synthetic biology.
Many of these technologies are widely accessible for use, either by virtue of being in the public domain or through legal tools such as non-exclusive licensing. Access to some patent protected technologies is less clear and use of these technologies may be subject to restrictions imposed by material transfer agreements or other contract terms. We expect the technologies considered enabling for synthetic biology to change as the field advances. By monitoring the enabling technologies of synthetic biology and addressing the policies and practices that impact their development and use, our hope is that the field will be better able to realize its full potential. PMID:23663447

  19. The Unknown Oldowan: ~1.7-Million-Year-Old Standardized Obsidian Small Tools from Garba IV, Melka Kunture, Ethiopia

    PubMed Central

    2015-01-01

    The Oldowan Industrial Complex has long been thought to have been static, with limited internal variability, embracing techno-complexes essentially focused on small-to-medium flake production. The flakes were rarely modified by retouch to produce small tools, which do not show any standardized pattern. Usually, the manufacture of small standardized tools has been interpreted as a more complex behavior emerging with the Acheulean technology. Here we report on the ~1.7 Ma Oldowan assemblages from Garba IVE-F at Melka Kunture in the Ethiopian highland. This industry is structured by technical criteria shared by the other East African Oldowan assemblages. However, there is also evidence of a specific technical process never recorded before, i.e. the systematic production of standardized small pointed tools strictly linked to the obsidian exploitation. Standardization and raw material selection in the manufacture of small tools disappear at Melka Kunture during the Lower Pleistocene Acheulean. This proves that 1) the emergence of a certain degree of standardization in tool-kits does not reflect in itself a major step in cultural evolution; and that 2) the Oldowan knappers, when driven by functional needs and supported by a highly suitable raw material, were occasionally able to develop specific technical solutions. The small tool production at ~1.7 Ma, at a time when the Acheulean was already emerging elsewhere in East Africa, adds to the growing amount of evidence of Oldowan techno-economic variability and flexibility, further challenging the view that early stone knapping was static over hundreds of thousands of years. PMID:26690569

  20. The Unknown Oldowan: ~1.7-Million-Year-Old Standardized Obsidian Small Tools from Garba IV, Melka Kunture, Ethiopia.

    PubMed

    Gallotti, Rosalia; Mussi, Margherita

    2015-01-01

    The Oldowan Industrial Complex has long been thought to have been static, with limited internal variability, embracing techno-complexes essentially focused on small-to-medium flake production. The flakes were rarely modified by retouch to produce small tools, which do not show any standardized pattern. Usually, the manufacture of small standardized tools has been interpreted as a more complex behavior emerging with the Acheulean technology. Here we report on the ~1.7 Ma Oldowan assemblages from Garba IVE-F at Melka Kunture in the Ethiopian highland. This industry is structured by technical criteria shared by the other East African Oldowan assemblages. However, there is also evidence of a specific technical process never recorded before, i.e. the systematic production of standardized small pointed tools strictly linked to the obsidian exploitation. Standardization and raw material selection in the manufacture of small tools disappear at Melka Kunture during the Lower Pleistocene Acheulean. This proves that 1) the emergence of a certain degree of standardization in tool-kits does not reflect in itself a major step in cultural evolution; and that 2) the Oldowan knappers, when driven by functional needs and supported by a highly suitable raw material, were occasionally able to develop specific technical solutions. The small tool production at ~1.7 Ma, at a time when the Acheulean was already emerging elsewhere in East Africa, adds to the growing amount of evidence of Oldowan techno-economic variability and flexibility, further challenging the view that early stone knapping was static over hundreds of thousands of years.

  1. Roadwaste management : a tool for developing district plans.

    DOT National Transportation Integrated Search

    2000-10-01

    The Oregon Department of Transportation (ODOT) conducted a study to examine roadwaste management options. Phase 1 consisted of a thorough review of regulations and standards, roadwaste characterization, current management practices, and new technolog...

  2. Attitudes Concerning Collaborative E-Learning Tool Usage for Information Technology and Information Assurance Organizational Efficiency

    ERIC Educational Resources Information Center

    Sipper, Joshua A.

    2012-01-01

    Information Assurance (IA) is a relatively new field in the Information Technology (IT) construct. Strategies for establishing standards, training, and evaluations for IA are still developing and growing across the IT field. As new threats and vulnerabilities are identified in IA, new information, policies, and procedures must be established and…

  3. Integration of tablet technologies in the e-laboratory of cytology: a health technology assessment.

    PubMed

    Giansanti, Daniele; Pochini, Marco; Giovagnoli, Maria Rosaria

    2014-10-01

Although tablet systems are becoming a powerful technology, particularly useful in every application of medical imaging, to date no one has investigated the acceptance and performance of this technology in digital cytology. The specific aims of the work were (1) to design a health technology assessment (HTA) tool to assess, in terms of performance and acceptance, the introduction of tablet technologies (wearable, portable, and non-portable) in the e-laboratories of cytology and (2) to test the tool in a first significant application of digital cytology. An HTA tool was proposed operating on a domain of five dimensions of investigation comprising the basic information of the digital cytology product, the perceived subjective quality of images, the assessment of virtual navigation on the e-slide, the assessment of the information and communication technology features, and the diagnostic power. Six e-slides regarding studies of cervicovaginal cytology, digitalized by means of an Aperio (www.aperio.com) scanner and uploaded onto the www.digitalslide.it Web site, were used for testing the methodology on three different network connections. Three experts of cytology successfully tested the methodology on seven tablets found suitable for the study in their own standard configuration. Specific indexes furnished by the tool indicated both a high degree of performance and subjective acceptance of the investigated technology. The HTA tool thus could be useful for investigating new tablet technologies in digital cytology and furnishing stakeholders with useful information that may help them make decisions involving the healthcare system. From a global point of view, the study demonstrates the feasibility of using tablet technology in digital cytology.

  4. Overview of 'Omics Technologies for Military Occupational Health Surveillance and Medicine.

    PubMed

    Bradburne, Christopher; Graham, David; Kingston, H M; Brenner, Ruth; Pamuku, Matt; Carruth, Lucy

    2015-10-01

    Systems biology ('omics) technologies are emerging as tools for the comprehensive analysis and monitoring of human health. In order for these tools to be used in military medicine, clinical sampling and biobanking will need to be optimized to be compatible with downstream processing and analysis for each class of molecule measured. This article provides an overview of 'omics technologies, including instrumentation, tools, and methods, and their potential application for warfighter exposure monitoring. We discuss the current state and the potential utility of personalized data from a variety of 'omics sources including genomics, epigenomics, transcriptomics, metabolomics, proteomics, lipidomics, and efforts to combine their use. Issues in the "sample-to-answer" workflow, including collection and biobanking are discussed, as well as national efforts for standardization and clinical interpretation. Establishment of these emerging capabilities, along with accurate xenobiotic monitoring, for the Department of Defense could provide new and effective tools for environmental health monitoring at all duty stations, including deployed locations. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.

  5. Health Information Technology as a Universal Donor to Bioethics Education.

    PubMed

    Goodman, Kenneth W

    2017-04-01

    Health information technology, sometimes called biomedical informatics, is the use of computers and networks in the health professions. This technology has become widespread, from electronic health records to decision support tools to patient access through personal health records. These computational and information-based tools have engendered their own ethics literature and now present an opportunity to shape the standard medical and nursing ethics curricula. It is suggested that each of four core components in the professional education of clinicians-privacy, end-of-life care, access to healthcare and valid consent, and clinician-patient communication-offers an opportunity to leverage health information technology for curricular improvement. Using informatics in ethics education freshens ethics pedagogy and increases its utility, and does so without additional demands on overburdened curricula.

  6. Integrating Apps with the Core Arts Standards in the 21st-Century Elementary Music Classroom

    ERIC Educational Resources Information Center

    Heath-Reynolds, Julia; VanWeelden, Kimberly

    2015-01-01

    The implementation of the National Core Arts Standards has amplified the need for multiple approaches and opportunities for student responses and may compel music educators to use new tools. There are currently over one million available apps, and with the popularity of smart devices, student access to technology is increasing exponentially. Music…

  7. Multidisciplinary life cycle metrics and tools for green buildings.

    PubMed

    Helgeson, Jennifer F; Lippiatt, Barbara C

    2009-07-01

    Building sector stakeholders need compelling metrics, tools, data, and case studies to support major investments in sustainable technologies. Proponents of green building widely claim that buildings integrating sustainable technologies are cost effective, but often these claims are based on incomplete, anecdotal evidence that is difficult to reproduce and defend. The claims suffer from two main weaknesses: 1) buildings on which claims are based are not necessarily "green" in a science-based, life cycle assessment (LCA) sense and 2) measures of cost effectiveness often are not based on standard methods for measuring economic worth. Yet, the building industry demands compelling metrics to justify sustainable building designs. The problem is hard to solve because, until now, neither methods nor robust data supporting defensible business cases were available. The US National Institute of Standards and Technology (NIST) Building and Fire Research Laboratory is beginning to address these needs by developing metrics and tools for assessing the life cycle economic and environmental performance of buildings. Economic performance is measured with the use of standard life cycle costing methods. Environmental performance is measured by LCA methods that assess the "carbon footprint" of buildings, as well as 11 other sustainability metrics, including fossil fuel depletion, smog formation, water use, habitat alteration, indoor air quality, and effects on human health. Carbon efficiency ratios and other eco-efficiency metrics are established to yield science-based measures of the relative worth, or "business cases," for green buildings. Here, the approach is illustrated through a realistic building case study focused on heating, ventilation, and air conditioning (HVAC) technologies with different energy efficiencies. Additionally, the evolution of the Building for Environmental and Economic Sustainability multidisciplinary team and future plans in this area are described.
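    The eco-efficiency reasoning described in this record can be illustrated with a toy calculation. The sketch below is not BEES or any NIST tool; the `life_cycle_cost` helper and all cost and emission figures are invented assumptions, and real life cycle costing follows standard methods with far richer data.

```python
# Hedged sketch of pairing life cycle cost (LCC) with a carbon footprint to
# compare building alternatives, in the spirit of the eco-efficiency ratios
# described above. All figures are invented for illustration.

def life_cycle_cost(initial, annual, years, discount_rate):
    """Present value of an initial cost plus a constant annual cost stream."""
    pv_annual = sum(annual / (1 + discount_rate) ** t for t in range(1, years + 1))
    return initial + pv_annual

years, rate = 20, 0.03
# (initial cost $, annual energy cost $, kg CO2e emitted per year)
baseline  = (100_000, 12_000, 50_000)
efficient = (160_000, 10_000, 35_000)

lcc_base = life_cycle_cost(baseline[0], baseline[1], years, rate)
lcc_eff  = life_cycle_cost(efficient[0], efficient[1], years, rate)
co2_avoided = (baseline[2] - efficient[2]) * years  # kg CO2e over the study period

# One eco-efficiency-style metric: extra life cycle dollars per kg CO2e avoided.
cost_per_kg_avoided = (lcc_eff - lcc_base) / co2_avoided
print(f"extra LCC: ${lcc_eff - lcc_base:,.0f}; ${cost_per_kg_avoided:.3f} per kg CO2e avoided")
```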

  8. A generative tool for building health applications driven by ISO 13606 archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás

    2012-10-01

    The use of Electronic Healthcare Records (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of information and knowledge. However, the impact of such standards is limited by the scarce availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been designed generically, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. These properties result from combining standards for the representation of generic user interfaces with model-driven engineering techniques.
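    As a hedged illustration of the generative approach this record describes (not the authors' actual implementation, and not real ISO 13606 ADL), a dual-model pipeline can be sketched as mapping archetype field definitions to platform-neutral widget descriptions. The archetype, field types, and `generate_ui` helper below are invented:

```python
# Hypothetical sketch: an "archetype" (a plain dict standing in for an ISO
# 13606 archetype) describing clinical data fields is translated into a
# platform-neutral UI description, the core idea of model-driven generation.

archetype = {
    "name": "blood_pressure",
    "fields": [
        {"id": "systolic",  "type": "quantity", "unit": "mmHg"},
        {"id": "diastolic", "type": "quantity", "unit": "mmHg"},
        {"id": "position",  "type": "coded_text",
         "options": ["sitting", "standing", "lying"]},
    ],
}

def generate_ui(archetype):
    """Map each archetype field type to a generic widget description."""
    widget_for = {"quantity": "number_input", "coded_text": "dropdown"}
    return [{"widget": widget_for[f["type"]], **f} for f in archetype["fields"]]

for w in generate_ui(archetype):
    print(w["id"], "->", w["widget"])
```

    A platform-specific backend (web, mobile, desktop) would then render each generic widget description with its own toolkit, which is what keeps the generator adaptable to multiple technological platforms.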

  9. Synergistic Role of Newer Techniques for Forensic and Postmortem CT Examinations.

    PubMed

    Blum, Alain; Kolopp, Martin; Teixeira, Pedro Gondim; Stroud, Tyler; Noirtin, Philippe; Coudane, Henry; Martrille, Laurent

    2018-04-30

    The aim of this article is to provide an overview of newer techniques and postprocessing tools that improve the potential impact of CT in forensic situations. CT has become a standard tool in medicolegal practice. Postmortem CT is an essential aid to the pathologist during autopsies. Advances in technology and software are constantly leading to advances in its performance.

  10. Vision Algorithms Catch Defects in Screen Displays

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Andrew Watson, a senior scientist at Ames Research Center, developed a tool called the Spatial Standard Observer (SSO), which models human vision for use in robotic applications. Redmond, Washington-based Radiant Zemax LLC licensed the technology from NASA and combined it with its imaging colorimeter system, creating a powerful tool that high-volume manufacturers of flat-panel displays use to catch defects in screens.

  11. Design Of Measurements For Evaluating Readiness Of Technoware Components To Meet The Required Standard Of Products

    NASA Astrophysics Data System (ADS)

    Fauzi, Ilham; Muharram Hasby, Fariz; Irianto, Dradjad

    2018-03-01

    Although the government is able to issue mandatory standards that industry must obey, the industries themselves often have difficulty fulfilling the requirements described in those standards. This is especially true in many small and medium sized enterprises that lack the capital required to invest in standard-compliant equipment and machinery. This study aims to develop a set of measurement tools for evaluating the readiness of production technology with respect to the requirements of a product standard, based on the quality function deployment (QFD) method. By combining the QFD methodology, the UNESCAP Technometric model [9], and the Analytic Hierarchy Process (AHP), this model is used to measure a firm’s capability to fulfill a government standard in the toy making industry. Expert opinions from both the governmental officers responsible for setting and implementing standards and the industry practitioners responsible for managing manufacturing processes were collected and processed to identify the technological capabilities the firm should improve to fulfill the existing standard. This study showed that the proposed model can be used successfully to measure the gap between the requirements of the standard and the readiness of the technoware technological component in a particular firm.
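    The AHP step mentioned in this record can be sketched concretely. The criteria, pairwise judgments, and use of the common geometric-mean approximation below are illustrative assumptions, not the authors' actual model:

```python
# Hedged sketch of deriving AHP priority weights for technology-readiness
# criteria from a pairwise comparison matrix (geometric-mean approximation).
# Criteria and judgments are invented for illustration.
import math

criteria = ["precision", "automation", "safety"]
# pairwise[i][j] = how much more important criterion i is than j (Saaty scale)
pairwise = [
    [1.0, 3.0, 0.5],
    [1/3, 1.0, 0.25],
    [2.0, 4.0, 1.0],
]

geo_means = [math.prod(row) ** (1 / len(row)) for row in pairwise]
total = sum(geo_means)
weights = [g / total for g in geo_means]

for c, w in zip(criteria, weights):
    print(f"{c}: {w:.3f}")
# The resulting weights sum to 1 and can then scale QFD/technometric
# readiness scores for each technology component.
```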

  12. Large Scale eHealth Deployment in Europe: Insights from Concurrent Use of Standards.

    PubMed

    Eichelberg, Marco; Chronaki, Catherine

    2016-01-01

    Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem including both legacy systems and new systems reflecting technological trends and progress. There is not a single standard that would cover all needs of an eHealth project, and there is a multitude of overlapping and perhaps competing standards that can be employed to define document formats, terminology, communication protocols mirroring alternative technical approaches and schools of thought. eHealth projects need to respond to the important question of how alternative or inconsistently implemented standards and specifications can be used to ensure practical interoperability and long-term sustainability in large scale eHealth deployment. In the eStandards project, 19 European case studies reporting from R&D and large-scale eHealth deployment and policy projects were analyzed. Although this study is not exhaustive, reflecting on the concepts, standards, and tools for concurrent use and the successes, failures, and lessons learned, this paper offers practical insights on how eHealth deployment projects can make the most of the available eHealth standards and tools and how standards and profile developing organizations can serve the users embracing sustainability and technical innovation.

  13. Group Collaboration in Organizations: Architectures, Methodologies and Tools

    DTIC Science & Technology

    2002-03-01

    collaboration, its definition and characteristics was completed. Next, existing technologies and standards were studied as well as the ... (2000). For effective collaboration, the technology must support the dynamic world of work, be it individual, group, and/or teamwork, as well as ... develop it or simply use it as the basis of discussion. If collaborators are all contributing to the development of a

  14. A Meta-Analysis of the Educational Effectiveness of Three-Dimensional Visualization Technologies in Teaching Anatomy

    ERIC Educational Resources Information Center

    Yammine, Kaissar; Violato, Claudio

    2015-01-01

    Many medical graduates are deficient in anatomy knowledge and perhaps below the standards for safe medical practice. Three-dimensional visualization technology (3DVT) has been advanced as a promising tool to enhance anatomy knowledge. The purpose of this review is to conduct a meta-analysis of the effectiveness of 3DVT in teaching and learning…

  15. Patient Safety—Incorporating Drawing Software into Root Cause Analysis Software

    PubMed Central

    Williams, Linda; Grayson, Diana; Gosbee, John

    2001-01-01

    Drawing software from Lassalle Technologies (France) designed for Visual Basic is the tool we used to standardize the creation, storage, and retrieval of flow diagrams containing information about adverse events and close calls.

  16. Patient Safety—Incorporating Drawing Software into Root Cause Analysis Software

    PubMed Central

    Williams, Linda; Grayson, Diana; Gosbee, John

    2002-01-01

    Drawing software from Lassalle Technologies (France) designed for Visual Basic is the tool we used to standardize the creation, storage, and retrieval of flow diagrams containing information about adverse events and close calls.

  17. Preservice Mathematics Teachers' Perceptions of Using a Web 2.0 Technology as a Supportive Teaching-Learning Tool in a College Euclidean Geometry Course

    ERIC Educational Resources Information Center

    Hossain, Md. Mokter

    2012-01-01

    This mixed methods study examined preservice secondary mathematics teachers' perceptions of a blogging activity used as a supportive teaching-learning tool in a college Euclidean Geometry course. The effect of a 12-week blogging activity that was a standard component of a college Euclidean Geometry course offered for preservice secondary…

  18. Technology and Technique Standards for Camera-Acquired Digital Dermatologic Images: A Systematic Review.

    PubMed

    Quigley, Elizabeth A; Tokay, Barbara A; Jewell, Sarah T; Marchetti, Michael A; Halpern, Allan C

    2015-08-01

    Photographs are invaluable dermatologic diagnostic, management, research, teaching, and documentation tools. Digital Imaging and Communications in Medicine (DICOM) standards exist for many types of digital medical images, but there are no DICOM standards for camera-acquired dermatologic images to date. To identify and describe existing or proposed technology and technique standards for camera-acquired dermatologic images in the scientific literature. Systematic searches of the PubMed, EMBASE, and Cochrane databases were performed in January 2013 using photography and digital imaging, standardization, and medical specialty and medical illustration search terms and augmented by a gray literature search of 14 websites using Google. Two reviewers independently screened titles of 7371 unique publications, followed by 3 sequential full-text reviews, leading to the selection of 49 publications with the most recent (1985-2013) or detailed description of technology or technique standards related to the acquisition or use of images of skin disease (or related conditions). No universally accepted existing technology or technique standards for camera-based digital images in dermatology were identified. Recommendations are summarized for technology imaging standards, including spatial resolution, color resolution, reproduction (magnification) ratios, postacquisition image processing, color calibration, compression, output, archiving and storage, and security during storage and transmission. Recommendations are also summarized for technique imaging standards, including environmental conditions (lighting, background, and camera position), patient pose and standard view sets, and patient consent, privacy, and confidentiality. Proposed standards for specific-use cases in total body photography, teledermatology, and dermoscopy are described. 
The literature is replete with descriptions of obtaining photographs of skin disease, but universal imaging standards have not been developed, validated, and adopted to date. Dermatologic imaging is evolving without defined standards for camera-acquired images, leading to variable image quality and limited exchangeability. The development and adoption of universal technology and technique standards may first emerge in scenarios when image use is most associated with a defined clinical benefit.

  19. Development of a Screening Tool to Facilitate Technology Transfer of an Innovative Technology to Treat Perchlorate-Contaminated Water

    DTIC Science & Technology

    2008-03-01

    foods such as fruits, vegetables, and beverages (U.S. FDA, 2004). If the U.S. EPA ultimately establishes a drinking water standard for perchlorate ... TREAT PERCHLORATE-CONTAMINATED WATER THESIS Daniel A. Craig, Captain, USAF AFIT/GEM/ENV/08-M06 DEPARTMENT OF THE AIR FORCE AIR UNIVERSITY ... OF AN INNOVATIVE TECHNOLOGY TO TREAT PERCHLORATE-CONTAMINATED WATER THESIS Presented to the Faculty Department of Systems and Engineering

  20. Utilizing the NASA Data-enhanced Investigations for Climate Change Education Resource for Elementary Pre-service Teachers in a Technology Integration Education Course.

    NASA Astrophysics Data System (ADS)

    Howard, E. M.; Moore, T.; Hale, S. R.; Hayden, L. B.; Johnson, D.

    2014-12-01

    The preservice teachers enrolled in EDUC 203, Introduction to Computer Instructional Technology, a course intended primarily for elementary-level candidates, created climate change educational lessons based upon their use of the NASA Data-enhanced Investigations for Climate Change Education (DICCE). NASA climate education datasets and tools were introduced to faculty of Minority Serving Institutions through a grant from the NASA Innovations in Climate Education program. These lessons were developed to study various ocean processes involving phytoplankton's chlorophyll production over time for specific geographic areas using the Giovanni NASA software tool. The preservice teachers designed climate change content to help K-4 learners identify and predict phytoplankton sources attributed to sea surface temperatures, nutrient levels, sunlight, and atmospheric carbon dioxide associated with annual chlorophyll production. Drawing on the EDUC 203 course content, the preservice teachers applied the three phases of the technology integration planning (TIP) model in developing their lessons. The Zunal website (http://www.zunal.com) served as a hypermedia tool for online instructional delivery, presenting the climate change content, the NASA climate datasets, and the visualization tools used for the production of elementary learning units. A rubric was developed to assess whether students' webquests met the overall learning objectives and specific climate education objectives. Accompanying each webquest is a rubric with a defined table of criteria that a teacher can use to assess students completing the required tasks for each lesson. 
Two primary challenges of technology integration for elementary pre-service teachers were 1) motivating pre-service teachers to be interested in climate education and 2) aligning elementary learning objectives with the Next Generation science standards of climate education that are non-existent in the Common Core State Standards.

  1. Evaluating opportunities to improve material and energy impacts in commodity supply chains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanes, Rebecca J.; Carpenter, Alberta

    When evaluated at the scale of individual processes, next-generation technologies may be more energy and emissions intensive than current technology. Furthermore, many advanced technologies have the potential to reduce material and energy consumption in upstream or downstream processing stages. In order to fully understand the benefits and consequences of technology deployment, next-generation technologies should be evaluated in context, as part of a supply chain. This work presents the Materials Flow through Industry (MFI) supply chain modeling tool. The MFI tool is a cradle-to-gate linear network model of the US industrial sector that can model a wide range of manufacturing scenarios, including changes in production technology and increases in industrial energy efficiency. The MFI tool was developed to perform supply chain scale analyses in order to quantify the impacts and benefits of next-generation technologies and materials at that scale. For the analysis presented in this paper, the MFI tool is utilized to explore a case study comparing three lightweight vehicle supply chains to the supply chain of a conventional, standard weight vehicle. Several of the lightweight vehicle supply chains are evaluated under manufacturing scenarios that include next-generation production technologies and next-generation materials. Results indicate that producing lightweight vehicles is more energy and emissions intensive than producing the non-lightweight vehicle, but the fuel saved during vehicle use offsets this increase. In this case study, greater reductions in supply chain energy and emissions were achieved through the application of the next-generation technologies than from application of energy efficiency increases.

  2. Evaluating opportunities to improve material and energy impacts in commodity supply chains

    DOE PAGES

    Hanes, Rebecca J.; Carpenter, Alberta

    2017-01-10

    When evaluated at the scale of individual processes, next-generation technologies may be more energy and emissions intensive than current technology. Furthermore, many advanced technologies have the potential to reduce material and energy consumption in upstream or downstream processing stages. In order to fully understand the benefits and consequences of technology deployment, next-generation technologies should be evaluated in context, as part of a supply chain. This work presents the Materials Flow through Industry (MFI) supply chain modeling tool. The MFI tool is a cradle-to-gate linear network model of the US industrial sector that can model a wide range of manufacturing scenarios, including changes in production technology and increases in industrial energy efficiency. The MFI tool was developed to perform supply chain scale analyses in order to quantify the impacts and benefits of next-generation technologies and materials at that scale. For the analysis presented in this paper, the MFI tool is utilized to explore a case study comparing three lightweight vehicle supply chains to the supply chain of a conventional, standard weight vehicle. Several of the lightweight vehicle supply chains are evaluated under manufacturing scenarios that include next-generation production technologies and next-generation materials. Results indicate that producing lightweight vehicles is more energy and emissions intensive than producing the non-lightweight vehicle, but the fuel saved during vehicle use offsets this increase. In this case study, greater reductions in supply chain energy and emissions were achieved through the application of the next-generation technologies than from application of energy efficiency increases.
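    As a hedged illustration only (the MFI tool's internals are not given in this record), a cradle-to-gate linear network model can be expressed as a Leontief system: process throughputs x satisfy x = Ax + d for a final demand d, and supply chain energy is an energy-intensity vector dotted with x. The three-process network and all coefficients below are invented:

```python
# Sketch of a cradle-to-gate linear network model in the spirit of MFI.
# Solving (I - A) x = d gives total process throughputs x needed to deliver
# final demand d through the supply chain.

def solve_linear(A, d):
    """Solve (I - A) x = d by Gauss-Jordan elimination with partial pivoting."""
    n = len(d)
    M = [[(1.0 if i == j else 0.0) - A[i][j] for j in range(n)] + [d[i]]
         for i in range(n)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Processes: 0 = steel, 1 = aluminum, 2 = vehicle assembly.
# A[i][j] = units of process i output consumed per unit of process j output.
A = [[0.0, 0.0, 0.6],          # t steel per vehicle
     [0.0, 0.0, 0.3],          # t aluminum per vehicle
     [0.0, 0.0, 0.0]]
d = [0.0, 0.0, 1.0]            # final demand: one vehicle
energy = [20.0, 60.0, 5.0]     # GJ per unit output of each process (invented)

x = solve_linear(A, d)
total_energy = sum(e * xi for e, xi in zip(energy, x))
print(f"throughputs: {x}, supply chain energy: {total_energy} GJ")
```

    Swapping in a lighter-weight material mix or a more efficient process simply changes A or the energy vector, which is how scenario comparisons like the lightweight-vehicle case study can be run at supply chain scale.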

  3. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  4. Software engineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Hiott, Jim; Golej, Jim; Plumb, Allan

    1993-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. The Johnson Space Center (JSC) created a significant set of tools to develop and maintain FORTRAN and C code during development of the space shuttle. This tool set forms the basis for an integrated environment to reengineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. The latest release of the environment was in Feb. 1992.

  5. A Tool for Rating the Resilience of Critical Infrastructures in Extreme Fires

    DTIC Science & Technology

    2014-05-01

    provide a tool for NRC to help the Canadian industry develop extreme fire protection materials and technologies for critical infrastructures. Future ... supported by the Canadian Safety and Security Program (CSSP), which is led by Defence Research and Development Canada’s Centre for Security Science, in ... in oil refinery and chemical industry facilities. The only available standard in North America that addresses the transportation infrastructure is

  6. Blending technology in teaching advanced health assessment in a family nurse practitioner program: using personal digital assistants in a simulation laboratory.

    PubMed

    Elliott, Lydia; DeCristofaro, Claire; Carpenter, Alesia

    2012-09-01

    This article describes the development and implementation of integrated use of personal handheld devices (personal digital assistants, PDAs) and high-fidelity simulation in an advanced health assessment course in a graduate family nurse practitioner (NP) program. A teaching tool was developed that can be utilized as a template for clinical case scenarios blending these separate technologies. Review of the evidence-based literature, including peer-reviewed articles and reviews. Blending the technologies of high-fidelity simulation and handheld devices (PDAs) provided a positive learning experience for graduate NP students in a teaching laboratory setting. Combining both technologies in clinical case scenarios offered a more real-world learning experience, with a focus on point-of-care service and integration of interview and physical assessment skills with existing standards of care and external clinical resources. Faculty modeling and advance training with PDA technology was crucial to success. Faculty developed a general template tool and systems-based clinical scenarios integrating PDA and high-fidelity simulation. Faculty observations, the general template tool, and one scenario example are included in this article. ©2012 The Author(s) Journal compilation ©2012 American Academy of Nurse Practitioners.

  7. State Landmarks.

    ERIC Educational Resources Information Center

    Pappas, Marjorie L.

    2003-01-01

    Explains how to develop lesson plans to help students become effective researchers using electronic searching tools. Uses a unit developed for Kansas landmarks to discuss information skills, competency standards, inquiry, technology use, information literacy and process skills, finding information, and an example of a research log. (LRW)

  8. NREL and IBM Improve Solar Forecasting with Big Data | Energy Systems

    Science.gov Websites

    forecasting model using deep-machine-learning technology. The multi-scale, multi-model tool, named Watt-sun ... the first standard suite of metrics for this purpose. Validating Watt-sun at multiple sites across the ...
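    The record does not spell out the metric suite; as an illustrative assumption only, a common way to score a solar forecast is a skill score against a persistence baseline, skill = 1 - RMSE(forecast) / RMSE(persistence). All data below is invented:

```python
# Hedged sketch of a forecast skill metric of the kind used to validate
# solar forecasting tools. Irradiance values are invented for illustration.
import math

observed    = [500, 520, 480, 450, 400]  # W/m^2, hours 1..5
forecast    = [510, 515, 470, 455, 410]  # model output
persistence = [490] * 5                  # naive "same as last value" baseline

def rmse(pred, obs):
    """Root mean square error between predictions and observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

skill = 1 - rmse(forecast, observed) / rmse(persistence, observed)
print(f"forecast RMSE: {rmse(forecast, observed):.1f} W/m^2, skill: {skill:.2f}")
```

    A skill near 1 means the model far outperforms persistence; a skill at or below 0 means it adds nothing over the naive baseline.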

  9. SNPConvert: SNP Array Standardization and Integration in Livestock Species.

    PubMed

    Nicolazzi, Ezequiel Luis; Marras, Gabriele; Stella, Alessandra

    2016-06-09

    One of the main advantages of single nucleotide polymorphism (SNP) array technology is providing genotype calls for a specific number of SNP markers at a relatively low cost. Since its first application in animal genetics, the number of available SNP arrays for each species has been constantly increasing. However, in contrast to whole genome sequence data analysis, SNP array data does not have a common set of file formats or coding conventions for allele calling. Therefore, the standardization and integration of SNP array data from multiple sources have become an obstacle, especially for users with basic or no programming skills. Here, we describe the difficulties related to handling SNP array data, focusing on file formats, SNP allele coding, and mapping. We also present the SNPConvert suite, a multi-platform, open-source, and user-friendly set of tools to overcome these issues. This tool, which can be integrated with open-source and open-access tools already available, is a first step towards an integrated system to standardize and integrate any type of raw SNP array data. The tool is available at: https://github.com/nicolazzie/SNPConvert.git.
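    The allele-coding problem described above can be sketched in a few lines. This is not SNPConvert's actual code; the marker names, the A/B-to-nucleotide map, and the `convert_call` helper are invented for illustration:

```python
# Hypothetical sketch: SNP array vendors report genotypes in different allele
# codings (e.g. abstract "AB" calls vs. actual nucleotides). Given a
# per-marker map of A/B to nucleotide alleles, AB-coded calls can be
# rewritten in a common nucleotide coding.

ab_to_nt = {                  # marker -> (allele A, allele B), invented
    "SNP001": ("A", "G"),
    "SNP002": ("C", "T"),
}

def convert_call(marker, call):
    """Translate an AB-coded call like 'AB' into nucleotides like 'AG'."""
    a, b = ab_to_nt[marker]
    table = {"A": a, "B": b, "-": "-"}   # '-' marks a missing call
    return "".join(table[c] for c in call)

print(convert_call("SNP001", "AB"))  # AG
print(convert_call("SNP002", "BB"))  # TT
```

    Real converters must also handle strand orientation (TOP/BOT, forward/reverse) and per-chip manifests, which is where most of the standardization difficulty lies.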

  10. PubMed Central

    DAMONTI, A.; MORELLI, P.; MUSSI, M.; PATREGNANI, C.; GARAGIOLA, E.; FOGLIA, E.; PAGANI, R.; CARMINATI, R.; PORAZZI, E.

    2015-01-01

    Summary. Introduction. The objective of this paper is the comparison between two different technologies used for the removal of a uterine myoma, a frequent benign tumor: the standard technology currently used, laparoscopy, and an innovative one, colpoceliotomy. It was considered relevant to evaluate the real and potential effects of implementing the two technologies and, in addition, the consequences that the introduction or exclusion of the innovative technology would have for both the National Health System (NHS) and the entire community. Methods. The comparison between the two technologies, the standard and the innovative one, was conducted using a Health Technology Assessment (HTA). In particular, in order to analyse their differences, a multi-dimensional approach was used: effectiveness, cost, and budget impact data were collected, applying different instruments, such as the Activity Based Costing methodology (ABC), Cost-Effectiveness Analysis (CEA), and Budget Impact Analysis (BIA). Organisational, equity, and social impacts were also evaluated. Results. The results showed that the introduction of colpoceliotomy would provide significant economic savings to the Regional and National Health Service; in particular, a saving of € 453.27 for each surgical procedure. Discussion. The introduction of the innovative technology, colpoceliotomy, could be considered a valuable option, offering advantages in reduced invasiveness and a shorter surgical procedure relative to the standard technology currently used (laparoscopy). PMID:26900330
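    The budget impact arithmetic implied by this record can be sketched. The per-procedure saving comes from the abstract; the annual case volume and adoption rate are invented assumptions:

```python
# Back-of-envelope sketch of budget impact analysis (BIA) logic: a
# per-procedure saving scaled over an assumed case volume and adoption rate.

saving_per_procedure = 453.27   # EUR, reported in the abstract
annual_procedures = 10_000      # assumed annual national case volume
adoption_rate = 0.4             # assumed share switching to colpoceliotomy

budget_impact = saving_per_procedure * annual_procedures * adoption_rate
print(f"Projected annual saving: EUR {budget_impact:,.2f}")
```

    A full BIA would also model the phase-in over several years and any one-off costs of introducing the new technology.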

  11. Is there a superior simulator for human anatomy education? How virtual dissection can overcome the anatomic and pedagogic limitations of cadaveric dissection.

    PubMed

    Darras, Kathryn E; de Bruin, Anique B H; Nicolaou, Savvas; Dahlström, Nils; Persson, Anders; van Merriënboer, Jeroen; Forster, Bruce B

    2018-03-23

    Educators must select the best tools to teach anatomy to future physicians and traditionally, cadavers have always been considered the "gold standard" simulator for living anatomy. However, new advances in technology and radiology have created new teaching tools, such as virtual dissection, which provide students with new learning opportunities. Virtual dissection is a novel way of studying human anatomy through patient computed tomography (CT) scans. Through touchscreen technology, students can work together in groups to "virtually dissect" the CT scans to better understand complex anatomic relationships. This article presents the anatomic and pedagogic limitations of cadaveric dissection and explains what virtual dissection is and how this new technology may be used to overcome these limitations.

  12. Developing a Web-based Tool Using Information and Communication Technologies to Expand the Reach and Impact of Photovoice.

    PubMed

    Strack, Robert W; Orsini, Muhsin Michael; Fearnow-Kenney, Melodie; Herget, Jennifer; Milroy, Jeffrey J; Wyrick, David L

    Information and communication technologies are opening up vast new arenas for conducting the work of health promotion. Technology-based health promotions expand reach, standardize information and its delivery, provide opportunities for tailoring, create engaging interactivity within content delivery, provide for privacy and autonomy, improve portability, and lower delivery costs. This commentary describes the ongoing exploration and development of a web-based tool for enhancing the reach and impact of photovoice as a community change intervention. Features of the tool use information and communication technologies that integrate the use of an online learning management system, tailored messaging, gaming technology, interactive features, and the application of social media's power to increase the capacity of communities to employ comprehensive strategies to improve the health of their communities. It will enable individuals and groups to use photos and captions to assess the physical environment, social norms and behaviors of communities; raise community awareness of the factors contributing to ill-health in their communities; mobilize stakeholders; and inform environmental strategies and policy changes. We believe it will enhance the delivery of educational content about conducting photovoice projects, provide features unavailable without the application of information and communication technologies, and be a substantive advancement over existing photovoice resources.

  13. JNDMS Task Authorization 2 Report

    DTIC Science & Technology

    2013-10-01

    uses Barnyard to store alarms from all DREnet Snort sensors in a MySQL database. Barnyard is an open source tool designed to work with Snort to take...Technology ITI Information Technology Infrastructure J2EE Java 2 Enterprise Edition JAR Java Archive. This is an archive file format defined by Java ...standards. JDBC Java Database Connectivity JDW JNDMS Data Warehouse JNDMS Joint Network and Defence Management System JNDMS Joint Network Defence and

  14. Quantification of Soil Redoximorphic Features by Standardized Color Identification

    USDA-ARS?s Scientific Manuscript database

    Photography has been a welcome tool for documenting and conveying qualitative soil information. Greater availability of digital cameras with increased information storage capabilities has promoted novel uses of this technology in investigations of water movement patterns, organic matter conte...

  15. Successful Clicker Standardization

    ERIC Educational Resources Information Center

    Twetten, Jim; Smith, M. K.; Julius, Jim; Murphy-Boyer, Linda

    2007-01-01

    Student response systems, commonly referred to as "clickers," have become an important learning tool in higher education. With a growing number of faculty using the technology to promote active learning, student engagement, and assessment, most campuses have seen increasing clicker use. And with faculty bombarded by multiple,…

  16. The Origin of The Acheulean: The 1.7 Million-Year-Old Site of FLK West, Olduvai Gorge (Tanzania)

    PubMed Central

    Diez-Martín, F.; Sánchez Yustos, P.; Uribelarrea, D.; Baquedano, E.; Mark, D. F.; Mabulla, A.; Fraile, C.; Duque, J.; Díaz, I.; Pérez-González, A.; Yravedra, J.; Egeland, C. P.; Organista, E.; Domínguez-Rodrigo, M.

    2015-01-01

    The appearance of the Acheulean is one of the hallmarks of human evolution. It represents the emergence of a complex behavior, expressed in the recurrent manufacture of large-sized tools with standardized forms, implying more advanced forethought and planning by hominins than that required by the preceding Oldowan technology. The earliest known evidence of this technology dates back to c. 1.7 Ma and is limited to two sites (Kokiselei [Kenya] and Konso [Ethiopia]), both of which lack functionally-associated fauna. The functionality of these earliest Acheulean assemblages remains unknown. Here we present the discovery of another early Acheulean site also dating to c. 1.7 Ma from Olduvai Gorge. This site provides evidence of the earliest steps in developing the Acheulean technology and is the oldest Acheulean site in which stone tools occur spatially and functionally associated with the exploitation of fauna. Simple and elaborate large-cutting tools (LCT) and bifacial handaxes co-exist at FLK West, showing that complex cognition was present from the earliest stages of the Acheulean. Here we provide a detailed technological study and evidence of the use of these tools in the butchery and consumption of fauna, probably by early Homo erectus sensu lato. PMID:26639785

  17. Quicksilver: Middleware for Scalable Self-Regenerative Systems

    DTIC Science & Technology

    2006-04-01

    Applications can be coded in any of about 25 programming languages ranging from the obvious ones to some very obscure languages , such as OCaml ...technology. Like Tempest, Quicksilver can support applications written in any of a wide range of programming languages supported by .NET. However, whereas...so that developers can work in standard languages and with standard tools and still exploit those solutions. Vendors need to see some success

  18. Impact of newer self-monitoring technology and brief phone-based intervention on weight loss: A randomized pilot study.

    PubMed

    Ross, Kathryn M; Wing, Rena R

    2016-08-01

    Despite the proliferation of newer self-monitoring technology (e.g., activity monitors and smartphone apps), their impact on weight loss outside of structured in-person behavioral intervention is unknown. A randomized, controlled pilot study was conducted to examine efficacy of self-monitoring technology, with and without phone-based intervention, on 6-month weight loss in adults with overweight and obesity. Eighty participants were randomized to receive standard self-monitoring tools (ST, n = 26), technology-based self-monitoring tools (TECH, n = 27), or technology-based tools combined with phone-based intervention (TECH + PHONE, n = 27). All participants attended one introductory weight loss session and completed assessments at baseline, 3 months, and 6 months. Weight loss from baseline to 6 months differed significantly between groups (P = 0.042); there was a trend for TECH + PHONE (-6.4 ± 1.2 kg) to lose more weight than ST (-1.3 ± 1.2 kg); weight loss in TECH (-4.1 ± 1.4 kg) was between ST and TECH + PHONE. Fewer ST (15%) achieved ≥5% weight losses compared with TECH and TECH + PHONE (44%), P = 0.039. Adherence to self-monitoring caloric intake was higher in TECH + PHONE than in TECH or ST, Ps < 0.05. These results suggest use of newer self-monitoring technology plus brief phone-based intervention improves adherence and weight loss compared with traditional self-monitoring tools. Further research should determine cost-effectiveness of adding phone-based intervention when providing self-monitoring technology. © 2016 The Obesity Society.

  19. Impact of newer self-monitoring technology and brief phone-based intervention on weight loss: a randomized pilot study

    PubMed Central

    Ross, Kathryn M.; Wing, Rena R.

    2016-01-01

    Objective Despite the proliferation of newer self-monitoring technology (e.g., activity monitors and smartphone apps), their impact on weight loss outside of structured in-person behavioral intervention is unknown. Methods A randomized, controlled pilot study was conducted to examine efficacy of self-monitoring technology, with and without phone-based intervention, on 6-month weight loss in adults with overweight and obesity. Eighty participants were randomized to receive standard self-monitoring tools (ST, n=26), technology-based self-monitoring tools (TECH, n=27), or technology-based tools combined with phone-based intervention (TECH+PHONE, n=27). All participants attended one introductory weight loss session and completed assessments at baseline, 3 months, and 6 months. Results Weight loss from baseline to 6 months differed significantly between groups (p=.042); there was a trend for TECH+PHONE (−6.4±1.2kg) to lose more weight than ST (−1.3±1.2kg); weight loss in TECH (−4.1±1.4kg) was between ST and TECH+PHONE. Fewer ST (15%) achieved ≥5% weight losses compared to TECH and TECH+PHONE (44%), p=.039. Adherence to self-monitoring caloric intake was higher in TECH+PHONE than TECH or ST, ps<.05. Conclusion These results suggest use of newer self-monitoring technology plus brief phone-based intervention improves adherence and weight loss compared to traditional self-monitoring tools. Further research should determine cost-effectiveness of adding phone-based intervention when providing self-monitoring technology. PMID:27367614

  20. Open Technology Approaches to Geospatial Interface Design

    NASA Astrophysics Data System (ADS)

    Crevensten, B.; Simmons, D.; Alaska Satellite Facility

    2011-12-01

    What problems do you not want your software developers to be solving? Choosing open technologies across the entire stack of software development, from low-level shared libraries to high-level user interaction implementations, is a way to help ensure that customized software yields innovative and valuable tools for Earth Scientists. This demonstration will review developments in web application technologies and the recurring patterns of interaction design regarding exploration and discovery of geospatial data through Vertex, ASF's Dataportal interface, a project utilizing current open web application standards and technologies including HTML5, jQueryUI, Backbone.js and the Jasmine unit testing framework.

  1. Applications of surface metrology in firearm identification

    NASA Astrophysics Data System (ADS)

    Zheng, X.; Soons, J.; Vorburger, T. V.; Song, J.; Renegar, T.; Thompson, R.

    2014-01-01

    Surface metrology is commonly used to characterize functional engineering surfaces. The technologies developed offer opportunities to improve forensic toolmark identification. Toolmarks are created when a hard surface, the tool, comes into contact with a softer surface and causes plastic deformation. Toolmarks are commonly found on fired bullets and cartridge cases. Trained firearms examiners use these toolmarks to link an evidence bullet or cartridge case to a specific firearm, which can lead to a criminal conviction. Currently, identification is typically based on qualitative visual comparison by a trained examiner using a comparison microscope. In 2009, a report by the National Academies called this method into question. Amongst other issues, they questioned the objectivity of visual toolmark identification by firearms examiners. The National Academies recommended the development of objective toolmark identification criteria and confidence limits. The National Institute of Standards and Technology (NIST) has applied its experience in surface metrology to develop objective identification criteria, measurement methods, and reference artefacts for toolmark identification. NIST developed the Standard Reference Material SRM 2460 standard bullet and SRM 2461 standard cartridge case to facilitate quality control and traceability of identifications performed in crime laboratories. Objectivity is improved through measurement of surface topography and application of unambiguous surface similarity metrics, such as the maximum value (ACCFMAX) of the areal cross correlation function. Case studies were performed on consecutively manufactured tools, such as gun barrels and breech faces, to demonstrate that, even in this worst-case scenario, all the tested tools imparted unique surface topographies that were identifiable. These studies provide scientific support for toolmark evidence admissibility in criminal court cases.
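    The similarity metric named above, the maximum of a cross correlation function, can be sketched in one dimension. This is an illustration only, assuming zero-mean, equal-length profile traces; NIST's actual metric operates on areal (2-D) topography data and the function name here is invented.

```python
import numpy as np

def accf_max(profile_a, profile_b):
    """Peak of the normalized cross-correlation between two surface
    profiles -- a 1-D illustration of the ACCFmax idea; the real
    metric is computed on areal (2-D) topography measurements."""
    a = profile_a - np.mean(profile_a)
    b = profile_b - np.mean(profile_b)
    # Cross-correlation over all relative shifts of the two traces
    cc = np.correlate(a, b, mode="full")
    # Normalize so that identical traces score exactly 1.0
    norm = np.sqrt(np.sum(a * a) * np.sum(b * b))
    return float(np.max(cc) / norm)
```

    Two marks from the same tool would score near 1.0 under this scheme, while unrelated surfaces score much lower, which is what makes the metric usable as an objective identification criterion.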

  2. The NASA Program Management Tool: A New Vision in Business Intelligence

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Swanson, Keith; Putz, Peter; Bell, David G.; Gawdiak, Yuri

    2006-01-01

    This paper describes a novel approach to business intelligence and program management for large technology enterprises like the U.S. National Aeronautics and Space Administration (NASA). Two key distinctions of the approach are that 1) standard business documents are the user interface, and 2) a "schema-less" XML database enables flexible integration of technology information for use by both humans and machines in a highly dynamic environment. The implementation utilizes patent-pending NASA software called the NASA Program Management Tool (PMT) and its underlying "schema-less" XML database called Netmark. Initial benefits of PMT include elimination of discrepancies between business documents that use the same information and "paperwork reduction" for program and project management in the form of reducing the effort required to understand standard reporting requirements and to comply with those reporting requirements. We project that the underlying approach to business intelligence will enable significant benefits in the timeliness, integrity and depth of business information available to decision makers on all organizational levels.
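    The "schema-less" integration idea described above can be illustrated with a toy example: heterogeneous XML business documents queried by tag name alone, with no shared schema. The document shapes and helper function below are invented for this sketch and are not Netmark's actual formats or API.

```python
import xml.etree.ElementTree as ET

# Two "business documents" with different structures but overlapping
# content -- invented examples, not Netmark's document formats.
doc_a = "<statusReport><project>X1</project><budget>500</budget></statusReport>"
doc_b = "<review><meta><project>X1</project></meta><risk>low</risk></review>"

def find_values(tag, *docs):
    """Collect every value of `tag` wherever it appears, assuming
    no schema: the query works by tag name alone."""
    values = []
    for doc in docs:
        root = ET.fromstring(doc)
        values.extend(el.text for el in root.iter(tag))
    return values
```

    Because the lookup never consults a schema, new document types can be added to the store without migrations, which is the flexibility the abstract attributes to the approach.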

  3. Translations on Eastern Europe Political, Sociological, and Military Affairs No. 1359.

    DTIC Science & Technology

    1977-03-01

    of the development of productive forces, science and technology and cannot ensure the right to work, an increase in the standard of living and the...winning of international peace and security. In this struggle the chief tool, in addition to political enlightenment activity, is the organization...development of the trade exchange and to find and utilize new forms of cooperation in the economy, science, technology and culture. Mr Prime Minister

  4. “Standardization through Mechanization”

    PubMed Central

    KIRK, ROBERT G. W.

    2012-01-01

    “In all his work,” Science News-Letter reported on 17 August 1940, “Reyniers follows a slogan of his own, follows it so zealously as to make it almost a fetish: standardization through mechanization.”1 Utilizing new technologies that he designed and built, James Reyniers came to “wide notice in the world of science” due to his innovative approach to standardizing organisms for use as experimental tools. “Ordinarily, when a scientist wants to study an unknown germ (or drug, or nutrient) he tries it out on an experimental animal,” Life magazine explained in September 1949 when reporting Reyniers’s innovative technologies. “But since all laboratory animals are invariably contaminated by a host of unknown germs, he can never be absolutely sure that results he sees are really caused by the agent he is testing. This problem … can be solved only by using animals whose bodies contain no germs at all. Now, for the first time, such animals are available.”2 Reyniers had extended the bacteriological ideal of pure culture to encompass the whole organism, creating “bacteriologically blank” organisms, or “biological tabula rasa,” which he believed formed ideal tools for experimental science. PMID:22530388

  5. Application of ESE Data and Tools to Air Quality Management: Services for Helping the Air Quality Community use ESE Data (SHAirED)

    NASA Technical Reports Server (NTRS)

    Falke, Stefan; Husar, Rudolf

    2011-01-01

    The goal of this REASoN applications and technology project is to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US States and others. Project goals were achieved through access to satellite and ground observation data, web services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project will develop access to distributed data, build Web infrastructure, and create tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self describing and modular data access and processing tools, and service oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions through the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications and established community-oriented infrastructures were: developing access to distributed data (surface and satellite); building Web infrastructure to support data access, processing and analysis; creating tools for data processing and analysis; and fostering air quality community collaboration and interoperability.

  6. Immunochemistry for high-throughput screening of human exhaled breath condensate (EBC) media: implementation of automated quanterix SIMOA instrumentation

    EPA Science Inventory

    Immunochemistry is an important clinical tool for indicating biological pathways leading towards disease. Standard enzyme-linked immunosorbent assays (ELISA) are labor intensive and lack sensitivity at low-level concentrations. Here we report on emerging technology implementing f...

  7. ePortfolio as a Measure of Reflective Practice

    ERIC Educational Resources Information Center

    Parkes, Kelly A.; Dredger, Katie S.; Hicks, David

    2013-01-01

    This instructional article outlines the qualities of effective ePortfolios and how reflection and student growth is measured. Student exemplars and assessment rubrics show how, despite changing tools and evolving standards, sustained collaboration and student coaching yields reflective practitioners in content areas and in technological knowledge.…

  8. Strategically Fostering Dynamic Interactive Environments

    ERIC Educational Resources Information Center

    Özgün-Koca, S. Asli

    2016-01-01

    The Common Core State Standards (CCSSI 2010) and NCTM's (2014) "Principles to Actions" agree that "for meaningful learning of mathematics, tools and technology must be indispensable features of the classroom . . . that support students in exploring mathematics as well as in making sense of concepts and procedures and engaging in…

  9. Using a formal requirements management tool for system engineering: first results at ESO

    NASA Astrophysics Data System (ADS)

    Zamparelli, Michele

    2006-06-01

    The attention to proper requirement analysis and maintenance is growing in modern astronomical undertakings. The increasing degree of complexity that current and future generations of projects have reached requires substantial system engineering efforts and the usage of all available technology to keep project development under control. One such technology is a tool which helps manage relationships between deliverables at various development stages, and across functional subsystems and disciplines as different as software, mechanics, optics and electronics. The immediate benefits are traceability and the possibility to do impact analysis. An industrially proven tool for requirements management is presented, together with the first results across some projects at ESO and a cost/benefit analysis of its usage. Experience gathered so far shows that the extensibility and configurability of the tool on the one hand, and its integration with common documentation formats and standards on the other, make it a promising solution for even small-scale system development.

  10. Visualizing astronomy data using VRML

    NASA Astrophysics Data System (ADS)

    Beeson, Brett; Lancaster, Michael; Barnes, David G.; Bourke, Paul D.; Rixon, Guy T.

    2004-09-01

    Visualisation is a powerful tool for understanding the large data sets typical of astronomical surveys and can reveal unsuspected relationships and anomalous regions of parameter space which may be difficult to find programmatically. Visualisation is a classic information technology for optimising scientific return. We are developing a number of generic on-line visualisation tools as a component of the Australian Virtual Observatory project. The tools will be deployed within the framework of the International Virtual Observatory Alliance (IVOA), and follow agreed-upon standards to make them accessible by other programs and people. We and our IVOA partners plan to utilise new information technologies (such as grid computing and web services) to advance the scientific return of existing and future instrumentation. Here we present a new tool - VOlume - which visualises point data. Visualisation of astronomical data normally requires the local installation of complex software, the downloading of potentially large datasets, and very often time-consuming and tedious data format conversions. VOlume enables the astronomer to visualise data using just a web browser and plug-in. This is achieved using IVOA standards which allow us to pass data between Web Services, Java Servlet Technology and Common Gateway Interface programs. Data from a catalogue server can be streamed in eXtensible Mark-up Language format to a servlet which produces Virtual Reality Modeling Language output. The user selects elements of the catalogue to map to geometry and then visualises the result in a browser plug-in such as Cortona or FreeWRL. Other than requiring an input VOTable format file, VOlume is very general. While its major use will likely be to display and explore astronomical source catalogues, it can easily render other important parameter fields such as the sky and redshift coverage of proposed surveys or the sampling of the visibility plane by a rotation-synthesis interferometer.
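    The catalogue-to-geometry step described above can be sketched as a small function that maps (x, y, z) points to a VRML97 PointSet node. This is a minimal sketch of the kind of output such a servlet might emit; the function name is invented and the real VOlume service produces richer geometry than this.

```python
def points_to_vrml(points):
    """Render (x, y, z) tuples as a minimal VRML97 PointSet string.

    Illustrative only: a sketch of the catalogue-to-geometry step,
    not the actual VOlume servlet output."""
    coords = ",\n        ".join(f"{x} {y} {z}" for x, y, z in points)
    return (
        "#VRML V2.0 utf8\n"
        "Shape {\n"
        "  geometry PointSet {\n"
        "    coord Coordinate {\n"
        "      point [\n"
        "        " + coords + "\n"
        "      ]\n"
        "    }\n"
        "  }\n"
        "}\n"
    )
```

    A string like this is all a VRML plug-in such as Cortona or FreeWRL needs to render the point cloud, which is why serving it from a servlet lets the astronomer avoid any local software installation.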

  11. Technologic advances for evaluation of cervical cytology: is newer better?

    PubMed

    Hartmann, K E; Nanda, K; Hall, S; Myers, E

    2001-12-01

    Among those women who have cervical cancer and have been screened, 14% to 33% of the cases represent failure to detect abnormalities that existed at the time of screening. New technologies intended to improve detection of cytologic abnormalities include liquid-based, thin-layer cytology (ThinPrep, AutoCyte), computerized rescreening (PAPNET), and algorithm-based computer rescreening (AutoPap). This report combines evidence reviews conducted for the U.S. Preventive Services Task Force and the Agency for Healthcare Research and Quality, in which we systematically identified articles on cervical neoplasia, cervical dysplasia, and screening published between January 1966 and March 2001. We note the challenges for improving screening methods, providing an overview of methods for collecting and evaluating cytologic samples, and examining the evidence about the diagnostic performance of new technologies for detecting cervical lesions. Using standard criteria for evaluation of the diagnostic tests, we determined that knowledge about the sensitivity, specificity, and predictive values of new technologies is meager. Only one study of liquid-based cytology used a reference standard of colposcopy, with histology as indicated, to assess participants with normal screening results. Lack of an adequate reference standard is the overwhelming reason that test characteristics cannot be properly assessed or compared. Most publications compare results of screening using the new technology with expert panel review of the cytologic specimen. In that case, the tests are not independent measures and do nothing to relate the screening test findings to the true status of the cervix, making determination of false-negatives, and thus sensitivity, specificity, and negative predictive value, impossible. We did not identify any literature about health outcomes or cost effectiveness of using these tools in a system of screening. For the purposes of guiding decision making about choice of screening tools, the current evidence is inadequate to gauge whether new technologies are "better" than conventional cytology.
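    The test characteristics the review says cannot be assessed without a reference standard are simple ratios once a gold-standard 2x2 table exists. A minimal sketch (the function name and the example counts are hypothetical):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and predictive values from a 2x2
    table of test results against a reference ("gold") standard.
    Counts are hypothetical; without a reference standard none of
    these quantities can be computed, which is the review's point."""
    return {
        "sensitivity": tp / (tp + fn),  # diseased correctly flagged
        "specificity": tn / (tn + fp),  # healthy correctly cleared
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }
```

    When a study instead compares the new test against expert panel review of the same specimen, the fn and tn columns relative to the true cervical status are unknown, so sensitivity, specificity, and npv drop out of reach exactly as the review describes.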

  12. Priorities for Standards and Measurements to Accelerate Innovations in Nano-Electrotechnologies: Analysis of the NIST-Energetics-IEC TC 113 Survey+,*

    PubMed Central

    Bennett, Herbert S.; Andres, Howard; Pellegrino, Joan; Kwok, Winnie; Fabricius, Norbert; Chapin, J. Thomas

    2009-01-01

    In 2008, the National Institute of Standards and Technology and Energetics Incorporated collaborated with the International Electrotechnical Commission Technical Committee 113 (IEC TC 113) on nano-electrotechnologies to survey members of the international nanotechnologies community about priorities for standards and measurements to accelerate innovations in nano-electrotechnologies. In this paper, we analyze the 459 survey responses from 45 countries as one means to begin building a consensus on a framework leading to nano-electrotechnologies standards development by standards organizations and national measurement institutes. The distributions of priority rankings from all 459 respondents are such that there are perceived distinctions with statistical confidence between the relative international priorities for the several items ranked in each of the following five Survey category types: 1) Nano-electrotechnology Properties, 2) Nano-electrotechnology Taxonomy: Products, 3) Nano-electrotechnology Taxonomy: Cross-Cutting Technologies, 4) IEC General Discipline Areas, and 5) Stages of the Linear Economic Model. The global consensus prioritizations for ranked items in the above five category types suggest that the IEC TC 113 should focus initially on standards and measurements for electronic and electrical properties of sensors and fabrication tools that support performance assessments of nano-technology enabled sub-assemblies used in energy, medical, and computer products. PMID:27504216

  13. Improving Project Management Using Formal Models and Architectures

    NASA Technical Reports Server (NTRS)

    Kahn, Theodore; Sturken, Ian

    2011-01-01

    This talk discusses the advantages that formal modeling and architecture bring to project management. These emerging technologies have both great potential and challenges for improving information available for decision-making. The presentation covers standards, tools, and cultural issues needing consideration, and includes lessons learned from projects the presenters have worked on.

  14. Proceedings, Conference on the Computing Environment for Mathematical Software

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Recent advances in software and hardware technology which make it economical to create computing environments appropriate for specialized applications are addressed. Topics included software tools, FORTRAN standards activity, and features of languages, operating systems, and hardware that are important for the development, testing, and maintenance of mathematical software.

  15. Inter-laboratory Comparison of Real-time PCR Methods for Quantification of General Fecal Indicator Bacteria

    EPA Science Inventory

    The application of quantitative real-time PCR (qPCR) technologies for the rapid identification of fecal bacteria in environmental waters is being considered for use as a national water quality metric in the United States. The transition from research tool to a standardized prot...

  16. Interlaboratory Comparison of Real-time PCR Protocols for Quantification of General Fecal Indicator Bacteria

    EPA Science Inventory

    The application of quantitative real-time PCR (qPCR) technologies for the rapid identification of fecal bacteria in environmental waters is being considered for use as a national water quality metric in the United States. The transition from research tool to a standardized proto...

  17. Designing e-Portfolios to Support Professional Teacher Preparation

    ERIC Educational Resources Information Center

    Tran, Tu; Baker, Robert; Pensavalle, Margo

    2006-01-01

    Tu Tran, Robert Baker, and Margo Pensavalle present e-portfolio technology as an effective tool in teacher preparation. Because e-portfolios chronicle students' learning outcomes, they provide a picture of students' development that can be used in response to increased demands for assessment of student teachers and increasingly standards-based…

  18. "Aid to Thought"--Just Simulate It!

    ERIC Educational Resources Information Center

    Kinczkowski, Linda; Cardon, Phillip; Speelman, Pamela

    2015-01-01

    This paper provides examples of Aid-to-Thought uses in urban decision making, classroom laboratory planning, and in a ship antiaircraft defense system. Aid-to-Thought modeling and simulations are tools students can use effectively in a STEM classroom while meeting Standards for Technological Literacy Benchmarks O and R. These projects prepare…

  19. Evaluation of the Effectiveness of Stormwater Decision Support Tools for Infrastructure Selection and the Barriers to Implementation

    NASA Astrophysics Data System (ADS)

    Spahr, K.; Hogue, T. S.

    2016-12-01

    Selecting the most appropriate green, gray, and/or hybrid system for stormwater treatment and conveyance can prove challenging to decision makers across all scales, from site managers to large municipalities. To help streamline the selection process, a multi-disciplinary team of academics and professionals is developing an industry standard for selecting and evaluating the most appropriate stormwater management technology for different regions. To make the tool more robust and comprehensive, life-cycle cost assessment and optimization modules will be included to evaluate non-monetized and ecosystem benefits of selected technologies. Initial work includes surveying advisory board members based in cities that use existing decision support tools in their infrastructure planning process. These surveys will characterize the decisions currently being made and identify challenges within the current planning process across a range of hydroclimatic regions and city sizes. Analysis of social and other non-technical barriers to adoption of the existing tools is also being performed, with identification of regional differences and institutional challenges. Surveys will also gauge the regional appropriateness of certain stormwater technologies based on experiences in implementing stormwater treatment and conveyance plans. In addition to compiling qualitative data on existing decision support tools, a technical review of the components of each decision support tool will be performed. Gaps in each tool's analysis, like the lack of certain critical functionalities, will be identified and ease of use will be evaluated. Conclusions drawn from both the qualitative and quantitative analyses will be used to inform the development of the new decision support tool and its eventual dissemination.

  20. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  1. Effects of 3D Printing Project-based Learning on Preservice Elementary Teachers' Science Attitudes, Science Content Knowledge, and Anxiety About Teaching Science

    NASA Astrophysics Data System (ADS)

    Novak, Elena; Wisdom, Sonya

    2018-05-01

    3D printing technology is a powerful educational tool that can promote integrative STEM education by connecting engineering, technology, and applications of science concepts. Yet, research on the integration of 3D printing technology in formal educational contexts is extremely limited. This study engaged preservice elementary teachers (N = 42) in a 3D Printing Science Project that modeled a science experiment in the elementary classroom on why things float or sink using 3D printed boats. The goal was to explore how collaborative 3D printing inquiry-based learning experiences affected preservice teachers' science teaching self-efficacy beliefs, anxiety toward teaching science, interest in science, perceived competence in K-3 technology and engineering science standards, and science content knowledge. The 3D printing project intervention significantly decreased participants' science teaching anxiety and improved their science teaching efficacy, science interest, and perceived competence in K-3 technological and engineering design science standards. Moreover, an analysis of students' project reflections and boat designs provided an insight into their collaborative 3D modeling design experiences. The study makes a contribution to the scarce body of knowledge on how teacher preparation programs can utilize 3D printing technology as a means of preparing prospective teachers to implement the recently adopted engineering and technology standards in K-12 science education.

  2. Trends in modeling Biomedical Complex Systems

    PubMed Central

    Milanesi, Luciano; Romano, Paolo; Castellani, Gastone; Remondini, Daniel; Liò, Petro

    2009-01-01

    In this paper we provide an introduction to techniques for modeling multi-scale complex biological systems, from the single bio-molecule to the cell, combining theoretical modeling, experiments, and informatics tools and technologies suitable for biological and biomedical research, which is becoming increasingly multidisciplinary, multidimensional, and information-driven. The most important concepts of mathematical modeling methodologies and statistical inference, bioinformatics, and standards tools for investigating complex biomedical systems are discussed, and the prominent literature useful to both the practitioner and the theoretician is presented. PMID:19828068

  3. simuwatt - A Tablet Based Electronic Auditing Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macumber, Daniel; Parker, Andrew; Lisell, Lars

    2014-05-08

    'simuwatt Energy Auditor' (TM) is a new tablet-based electronic auditing tool that is designed to dramatically reduce the time and cost to perform investment-grade audits and improve quality and consistency. The tool uses the U.S. Department of Energy's OpenStudio modeling platform and integrated Building Component Library to automate modeling and analysis. simuwatt's software-guided workflow helps users gather required data, and provides the data in a standard electronic format that is automatically converted to a baseline OpenStudio model for energy analysis. The baseline energy model is calibrated against actual monthly energy use to ASHRAE Standard 14 guidelines. Energy conservation measures from the Building Component Library are then evaluated using OpenStudio's parametric analysis capability. Automated reporting creates audit documents that describe recommended packages of energy conservation measures. The development of this tool was partially funded by the U.S. Department of Defense's Environmental Security Technology Certification Program. As part of this program, the tool is being tested at 13 buildings on 5 Department of Defense sites across the United States. Results of the first simuwatt audit tool demonstration are presented in this paper.

  4. Systematic Omics Analysis Review (SOAR) Tool to Support Risk Assessment

    PubMed Central

    McConnell, Emma R.; Bell, Shannon M.; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J.; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D.

    2014-01-01

    Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884
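    The screening logic described in the abstract above (weighted questions grouped into topic sections, with a suitability cutoff per section) can be sketched as follows. The four section names follow the abstract, but the weights, cutoffs, and question counts here are invented for illustration; the real SOAR tool uses 35 expert-developed questions.

```python
# Hypothetical sketch of SOAR-style screening: weighted yes/no questions,
# per-section scores, and a suitability cutoff for each topic section.
# Weights and cutoffs below are illustrative, not the tool's actual values.

SECTIONS = {
    "test system":         {"weights": [2, 1, 1],    "cutoff": 0.6},
    "test substance":      {"weights": [2, 2],       "cutoff": 0.7},
    "experimental design": {"weights": [3, 1, 1, 1], "cutoff": 0.6},
    "microarray data":     {"weights": [2, 2, 1],    "cutoff": 0.5},
}

def section_score(answers, weights):
    """Weighted fraction of 'yes' answers (answers are booleans)."""
    earned = sum(w for a, w in zip(answers, weights) if a)
    return earned / sum(weights)

def passes_screen(study_answers):
    """A study passes only if every section meets its suitability cutoff."""
    return all(
        section_score(study_answers[name], cfg["weights"]) >= cfg["cutoff"]
        for name, cfg in SECTIONS.items()
    )
```

    A study failing any single section (e.g., poorly documented microarray data) is screened out even if its other sections score perfectly, which mirrors the per-topic cutoffs described in the abstract.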

  5. Usability evaluation of a medication reconciliation tool: Embedding safety probes to assess users' detection of medication discrepancies.

    PubMed

    Russ, Alissa L; Jahn, Michelle A; Patel, Himalaya; Porter, Brian W; Nguyen, Khoa A; Zillich, Alan J; Linsky, Amy; Simon, Steven R

    2018-06-01

    An electronic medication reconciliation tool was previously developed by another research team to aid provider-patient communication for medication reconciliation. To evaluate the usability of this tool, we integrated artificial safety probes into standard usability methods. The objective of this article is to describe this method of using safety probes, which enabled us to evaluate how well the tool supports users' detection of medication discrepancies. We completed a mixed-method usability evaluation in a simulated setting with 30 participants: 20 healthcare professionals (HCPs) and 10 patients. We used factual scenarios but embedded three artificial safety probes: (1) a missing medication (i.e., omission); (2) an extraneous medication (i.e., commission); and (3) an inaccurate dose (i.e., dose discrepancy). We measured users' detection of each probe to estimate the probability that an HCP or patient would detect these discrepancies. Additionally, we recorded participants' detection of naturally occurring discrepancies. Each safety probe was detected by ≤50% of HCPs. Patients' detection rates were generally higher. Estimates indicate that an HCP and patient together would detect 44.8% of these medication discrepancies. Additionally, HCPs and patients detected 25 and 45 naturally occurring discrepancies, respectively. Overall, detection of medication discrepancies was low. Findings indicate that more advanced interface designs are warranted. Future research is needed on how technologies can be designed to better aid HCPs' and patients' detection of medication discrepancies. This is one of the first studies to evaluate the usability of a collaborative medication reconciliation tool and assess HCPs' and patients' detection of medication discrepancies. Results demonstrate that embedded safety probes can enhance standard usability methods by measuring additional, clinically focused usability outcomes. The novel safety probes we used may serve as an initial, standard set for future medication reconciliation research. More prevalent use of safety probes could strengthen usability research for a variety of health information technologies. Published by Elsevier Inc.
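    A joint estimate like the 44.8% figure above combines individual detection rates. A minimal sketch, assuming (purely as an illustration, not the study's actual estimation method) that an HCP and a patient detect a given discrepancy independently:

```python
def combined_detection(p_hcp, p_patient):
    """Probability that at least one of an HCP/patient pair detects a
    discrepancy, assuming independent detection: 1 - P(both miss)."""
    return 1.0 - (1.0 - p_hcp) * (1.0 - p_patient)

# Illustrative rates only, not the study's per-probe figures:
combined_detection(0.5, 0.5)  # → 0.75
```

    Under independence the pair always outperforms either individual, which is the intuition behind pairing HCP review with patient review during reconciliation.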

  6. Enhancing Access to Land Remote Sensing Data through Mainstream Social Media Channels

    NASA Astrophysics Data System (ADS)

    Sohre, T.; Maiersperger, T.

    2011-12-01

    Social media tools are changing the way that people discover information, communicate, and collaborate. Government agencies supporting the Land Remote Sensing user community have begun taking advantage of standard social media tools and capabilities. National Aeronautics and Space Administration (NASA) Earth Observing System (EOS) data centers have started providing outreach utilizing services including Facebook, Twitter, and YouTube videos. Really Simple Syndication (RSS) feeds have become a standard means of sharing information, and a DataCasting tool was created as a NASA Technology Infusion effort to make RSS-based technology for accessing Earth Science information available. The United States Geological Survey (USGS) has also started using social media to allow the community to access news feeds and real-time earthquake alerts; listen to podcasts; get updates on new USGS publications, videos, and photographs; and more. Twitter feeds were implemented in 2011 for the USGS Land Cover and Landsat user communities. In early 2011, the NASA Land Processes Distributed Active Archive Center (LP DAAC) user working group suggested the investigation of concepts for creating and distributing "bundles" of data, which would aggregate theme-based data sets from multiple sources. The LP DAAC is planning to explore the use of standard social bookmarking tools to support community-developed bundles through the use of tools such as Delicious, Digg, or StumbleUpon. This concept would allow science users to organize and discover common links to data resources based on community-developed tags, or a folksonomy. There are challenges that will need to be addressed, such as maintaining the quality of tags, but a social bookmarking system may have advantages over traditional search engines or formal ontologies for identifying and labeling various data sets relevant to a theme. As classification is done by the community of scientists who understand the data, the tagged data sets will result in a growing inventory of useful bundles.

  7. OpenMI: the essential concepts and their implications for legacy software

    NASA Astrophysics Data System (ADS)

    Gregersen, J. B.; Gijsbers, P. J. A.; Westen, S. J. P.; Blind, M.

    2005-08-01

    Information & Communication Technology (ICT) tools such as computational models are very helpful in designing river basin management plans (RBMPs). However, in the scientific world there is consensus that a single integrated modelling system to support, e.g., the implementation of the Water Framework Directive cannot be developed, and that integrated systems need to be tailored to the local situation. As a consequence, there is an urgent need to increase the flexibility of modelling systems, such that dedicated model systems can be developed from available building blocks. The HarmonIT project aims at precisely that. Its objective is to develop and implement a standard interface for modelling components and other relevant tools: the Open Modelling Interface (OpenMI) standard. The OpenMI standard has been completed and documented. It relies entirely on the "pull" principle, where data are pulled by one model from the previous model in the chain. This paper gives an overview of the OpenMI standard, explains the foremost concepts, and the rationale behind them.
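    The "pull" principle described above can be sketched as a chain of linkable components, each of which requests values for a given time from the component upstream of it on demand. The class and method names below are illustrative only, not the actual OpenMI API (OpenMI defines its own linkable-component interface).

```python
# Minimal sketch of pull-based model linking: the downstream component asks
# for values at a time, and each component recursively pulls its input from
# the provider linked upstream. Names are illustrative, not the OpenMI API.

class LinkableComponent:
    def __init__(self, name, transform, upstream=None):
        self.name = name
        self.transform = transform   # model computation for one request
        self.upstream = upstream     # provider this component pulls from

    def get_values(self, time):
        # Pull input from the upstream provider on demand, then compute.
        inflow = self.upstream.get_values(time) if self.upstream else 0.0
        return self.transform(time, inflow)

# Chain: (fake) rainfall-runoff model -> river model that adds baseflow
runoff = LinkableComponent("runoff", lambda t, _: 2.0 * t)
river = LinkableComponent("river", lambda t, q: q + 1.0, upstream=runoff)

print(river.get_values(3.0))  # river pulls runoff(3.0) = 6.0, prints 7.0
```

    The key property is that no central driver pushes data: asking the last component in the chain for a value at a time transparently triggers whatever upstream computation is needed.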

  8. COMPASS: An Ada based scheduler

    NASA Technical Reports Server (NTRS)

    Mcmahon, Mary Beth; Culbert, Chris

    1992-01-01

    COMPASS is a generic scheduling system developed by McDonnell Douglas and funded by the Software Technology Branch of NASA Johnson Space Center. The motivation behind COMPASS is to illustrate scheduling technology and provide a basis from which custom scheduling systems can be built. COMPASS was written in Ada to promote readability and to conform to DOD standards. COMPASS has some unique characteristics that distinguish it from commercial products. This paper discusses these characteristics and uses them to illustrate some differences between scheduling tools.

  9. Behavioral Health and Performance (BHP) Work-Rest Cycles

    NASA Technical Reports Server (NTRS)

    Leveton, Lauren B.; Whitmire, Alexandra

    2011-01-01

    BHP Program Element Goal: Identify, characterize, and prevent or reduce behavioral health and performance risks associated with space travel, exploration and return to terrestrial life. BHP Requirements: a) Characterize and assess risks (e.g., likelihood and consequences). b) Develop tools and technologies to prevent, monitor, and treat adverse outcomes. c) Inform standards. d) Develop technologies to: 1) reduce risks and human systems resource requirements (e.g., crew time, mass, volume, power) and 2) ensure effective human-system integration across exploration mission.

  10. Using collaborative technologies in remote lab delivery systems for topics in automation

    NASA Astrophysics Data System (ADS)

    Ashby, Joe E.

    Lab exercises are a pedagogically essential component of engineering and technology education. Distance-education remote labs are being developed that enable students to access lab facilities via the Internet. Collaboration, students working in teams, enhances learning activity through the development of communication skills, sharing observations, and problem solving. Web meeting communication tools are currently used in remote labs. The problem identified for investigation was that no standards of practice or paradigms exist to guide remote lab designers in the selection of collaboration tools that best support learning achievement. The goal of this work was to add to the body of knowledge involving the selection and use of remote lab collaboration tools. Experimental research was conducted in which participants were randomly assigned to three communication treatments and learning achievement was measured via assessments at the completion of each of six remote-lab-based lessons. Quantitative instruments for assessing learning achievement were implemented, along with a survey to correlate user preference with collaboration treatments. A total of 53 undergraduate technology students worked in two-person teams, where each team was assigned one of the treatments, namely (a) text messaging chat, (b) voice chat, or (c) webcam video with voice chat. Each had little experience with the subject matter involving automation, but possessed the necessary technical background. Analysis of the assessment score data included mean and standard deviation, confirmation of the homogeneity of variance, a one-way ANOVA test, and post hoc comparisons. The quantitative and qualitative data indicated that text messaging chat negatively impacted learning achievement and that text messaging chat was not preferred. The data also suggested that the subjects were equally divided on preference for voice chat versus webcam video with voice chat. For designing collaborative communication tools for remote labs involving automation equipment, the results of this work point to making voice chat the default method of communication, though the webcam video with voice chat option should be included. Standards are only beginning to be developed for the design of remote lab systems. Research, design, and innovation involving collaboration and presence should be included.
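    The analysis pipeline described above (group means and standard deviations, then a one-way ANOVA across the three treatments) centers on the F statistic, which compares between-group to within-group variance. A minimal pure-Python sketch of just the F statistic, omitting the p-value lookup and post hoc comparisons the study also performed:

```python
def one_way_anova_F(*groups):
    """One-way ANOVA F statistic: ratio of between-group to
    within-group mean squares, for two or more groups of scores."""
    k = len(groups)                       # number of treatment groups
    n = sum(len(g) for g in groups)       # total number of observations
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: how far group means sit from the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: spread of scores around their own group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)

# Three small illustrative score groups (not the study's data):
one_way_anova_F([1, 2, 3], [2, 3, 4], [3, 4, 5])  # → 3.0
```

    A large F means the treatment means differ by more than within-group noise would explain; the study would then compare F against the critical value for (2, 50) degrees of freedom for its 53 participants in three groups.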

  11. Implications of intelligent, integrated microsystems for product design and development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MYERS,DAVID R.; MCWHORTER,PAUL J.

    2000-04-19

    Intelligent, integrated microsystems combine some or all of the functions of sensing, processing information, actuation, and communication within a single integrated package, and preferably upon a single silicon chip. As the elements of these highly integrated solutions interact strongly with each other, the microsystem can be neither designed nor fabricated piecemeal, in contrast to the more familiar assembled products. Driven by technological imperatives, microsystems will best be developed by multi-disciplinary teams, most likely within flatter, less hierarchical organizations. Standardization of design and process tools around a single, dominant technology will expedite economically viable operation under a common production infrastructure. The production base for intelligent, integrated microsystems has elements in common with the mathematical theory of chaos. Similar to chaos theory, the development of microsystems technology will be strongly dependent on, and optimized to, the initial product requirements that will drive standardization--thereby further rewarding early entrants to integrated microsystem technology.

  12. The coming paradigm shift: A transition from manual to automated microscopy.

    PubMed

    Farahani, Navid; Monteith, Corey E

    2016-01-01

    The field of pathology has used light microscopy (LM) extensively since the mid-19th century for the examination of histological tissue preparations. This technology has remained the foremost tool in use by pathologists, even as other fields have been transformed in recent years by new technologies. However, as new microscopy techniques are perfected and made available, this reliance on standard LM will likely begin to change. Advanced imaging, involving both diffraction-limited and subdiffraction techniques, is bringing nondestructive, high-resolution, molecular-level imaging to pathology. Some of these technologies can produce three-dimensional (3D) datasets from sampled tissues. In addition, block-face/tissue-sectioning techniques are already providing automated, large-scale 3D datasets of whole specimens. These datasets allow pathologists to see an entire sample with all of its spatial information intact, and furthermore allow image analyses such as detection, segmentation, and classification, which are impossible in standard LM. It is likely that these technologies herald a major paradigm shift in the field of pathology.

  13. Studying Behaviors Among Neurosurgery Residents Using Web 2.0 Analytic Tools.

    PubMed

    Davidson, Benjamin; Alotaibi, Naif M; Guha, Daipayan; Amaral, Sandi; Kulkarni, Abhaya V; Lozano, Andres M

    Web 2.0 technologies (e.g., blogs, social networks, and wikis) are increasingly being used by medical schools and postgraduate training programs as tools for information dissemination. These technologies offer the unique opportunity to track metrics of user engagement and interaction. Here, we employ Web 2.0 tools to assess academic behaviors among neurosurgery residents. We performed a retrospective review of all educational lectures, part of the core Neurosurgery Residency curriculum at the University of Toronto, posted on our teaching website (www.TheBrainSchool.net). Our website was developed using publicly available Web 2.0 platforms. Lecture usage was assessed by the number of clicks, and associations were explored with lecturer academic position, timing of examinations, and lecture/subspecialty topic. The overall number of clicks on 77 lectures was 1079. Most of these clicks occurred during the in-training examination month (43%). Click numbers were significantly higher on lectures presented by faculty (mean = 18.6, standard deviation ± 4.1) compared to those delivered by residents (mean = 8.4, standard deviation ± 2.1) (p = 0.031). Lectures covering topics in functional neurosurgery received the most clicks (47%), followed by pediatric neurosurgery (22%). This study demonstrates the value of Web 2.0 analytic tools in examining resident study behavior. Residents tend to "cram" by downloading lectures in the same month as training examinations and display a preference for faculty-delivered lectures. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  14. New Visions for Transforming Teaching

    ERIC Educational Resources Information Center

    Epler, James W.

    2009-01-01

    Those who work in schools understand the demands placed on teachers in the face of mounting pressures from parents, administrators, and standardized tests. It would be foolish to assume these demands do not overshadow aspirations to remain current in educational technology trends and tools. That is why it is more important than ever for ed tech…

  15. An Open and Scalable Learning Infrastructure for Food Safety

    ERIC Educational Resources Information Center

    Manouselis, Nikos; Thanopoulos, Charalampos; Vignare, Karen; Geith, Christine

    2013-01-01

    In the last several years, a variety of approaches and tools have been developed for giving access to open educational resources (OER) related to food safety, security, and food standards, as well to various targeted audiences (e.g., farmers, agronomists). The aim of this paper is to present a technology infrastructure currently in demonstration…

  16. Students' Use of Technological Tools for Verification Purposes in Geometry Problem Solving

    ERIC Educational Resources Information Center

    Papadopoulos, Ioannis; Dagdilelis, Vassilios

    2008-01-01

    Despite its importance in mathematical problem solving, verification receives rather little attention by the students in classrooms, especially at the primary school level. Under the hypotheses that (a) non-standard tasks create a feeling of uncertainty that stimulates the students to proceed to verification processes and (b) computational…

  17. TARDEC’s VICTORY SIL is a Key Tool for Advancing Standardized Ground Vehicle Electronic Architecture

    DTIC Science & Technology

    2012-08-06

    SUPPLEMENTARY NOTES Submitted to the 2012 NDIA Ground Vehicle Systems Engineering and Technology Symposium, August 14-16, Troy, Michigan. ABSTRACT VICTORY...Timing, Threat and Remote Weapons Station. The results were very encouraging, with very low power consumption (3.15 Watts) and less than 1% system

  18. Preservice Teachers Experience with Online Modules about TPACK

    ERIC Educational Resources Information Center

    White, Bruce; Geer, Ruth

    2013-01-01

    Despite the fact that Information and Communication Technology (ICT) is valued as a tool for learning, the modelling for preservice teachers of ICT integration in the curriculum areas is often limited. In the recently approved AITSL standards for Initial Teacher Education Programs, knowledge of ICTs is explicitly mentioned in three of the…

  19. Assessing Customer Satisfaction at the NIST Research Library: Essential Tool for Future Planning

    ERIC Educational Resources Information Center

    Liu, Rosa; Allmang, Nancy

    2008-01-01

    This article describes a campus-wide customer satisfaction survey undertaken by the National Institute of Standards and Technology (NIST) Research Library in 2007. The methodology, survey instrument, data analysis, results, and actions taken in response to the survey are described. The outcome and recommendations will guide the library both…

  20. Augmented Reality Games: Using Technology on a Budget

    ERIC Educational Resources Information Center

    Annetta, Leonard; Burton, Erin Peters; Frazier, Wendy; Cheng, Rebecca; Chmiel, Margaret

    2012-01-01

    As smartphones become more ubiquitous among adolescents, there is increasing potential for these as a tool to engage students in science instruction through innovative learning environments such as augmented reality (AR). Aligned with the National Science Education Standards (NRC 1996) and integrating the three dimensions of "A Framework for K-12…

  1. Literacy Learning in Networked Classrooms: Using the Internet with Middle-Level Students

    ERIC Educational Resources Information Center

    McNabb, Mary L.; Thurber, Bonnie B.; Dibuz, Balazs; McDermott, Pamela A.; Lee, Carol Ann

    2006-01-01

    Middle-level teachers, librarians, and media specialists can use this book to meet current English language arts and technology standards and to prepare students to be literate citizens in the 21st century. Additional teaching tools include timelines of classroom events, reproducible rubrics for assessing curriculum units, suggested Web resources,…

  2. IR-drop analysis for validating power grids and standard cell architectures in sub-10nm node designs

    NASA Astrophysics Data System (ADS)

    Ban, Yongchan; Wang, Chenchen; Zeng, Jia; Kye, Jongwook

    2017-03-01

    Since chip performance and power are highly dependent on the operating voltage, a robust power distribution network (PDN) is of utmost importance in designs to provide a reliable supply voltage without excessive voltage (IR) drop. However, the rapid increase of parasitic resistance and capacitance (RC) in interconnects makes IR drop much worse with technology scaling. This paper presents various IR-drop analyses in sub-10nm designs. The major objectives are to validate standard cell architectures, where different sizes of power/ground and metal tracks are evaluated, and to validate the PDN architecture, where different power hook-up approaches are evaluated via IR-drop calculation. To estimate IR drop in 10nm-and-below technologies, we first prepare physically routed designs from standard cell libraries: we use an open RISC RTL, synthesize the CPU, and apply placement and routing with process design kits (PDKs). Then, static and dynamic IR-drop flows are set up with commercial tools. Using the IR-drop flow, we compare standard cell architectures and analyze impacts on performance, power, and area (PPA) relative to previous technology-node designs. With this flow, we can optimize the best PDN structure against IR drop, as well as the choice of standard cell library.
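    At its simplest, the static IR drop the abstract analyzes is cumulative resistive voltage drop along a power rail: each rail segment carries the supply current of every cell tapping the rail beyond it. A back-of-the-envelope sketch, with tap currents and segment resistance invented for illustration (not from any PDK or the paper's commercial flow):

```python
# Static IR-drop along one power rail: n cells tap the rail at equal
# spacing, each drawing tap_currents[i]; every segment has resistance
# r_seg (ohms). The pad (voltage source) sits before segment 0.
# All numbers below are illustrative only.

def rail_ir_drop(tap_currents, r_seg):
    """Voltage drop (V) seen at each tap, farthest tap last."""
    drops, v = [], 0.0
    for j in range(len(tap_currents)):
        # Segment j carries the current of all taps at or beyond it.
        seg_current = sum(tap_currents[j:])
        v += seg_current * r_seg
        drops.append(v)
    return drops

# Four cells drawing 1 mA each, 0.5 ohm per rail segment:
drops = rail_ir_drop([1e-3] * 4, 0.5)  # drop grows toward the far end
```

    This already shows the design levers the paper evaluates: widening the power track (lower r_seg) or adding more power hook-up points (shorter current paths) both reduce the worst-case drop at the far end of the rail.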

  3. Science in the cloud (SIC): A use case in MRI connectomics

    PubMed Central

    Gorgolewski, Krzysztof J.; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A.; Wiener, Martin; Vogelstein, R. Jacob; Burns, Randal

    2017-01-01

    Abstract Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called ‘science in the cloud’ (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. PMID:28327935

  4. Emerging Issues and Future Developments in Capsule Endoscopy

    PubMed Central

    Slawinski, Piotr R.; Obstein, Keith L.; Valdastri, Pietro

    2015-01-01

    Capsule endoscopy (CE) has transformed from a research venture into a widely used clinical tool and the primary means for diagnosing small bowel pathology. These orally administered capsules traverse passively through the gastrointestinal tract via peristalsis and are used in the esophagus, stomach, small bowel, and colon. The primary focus of CE research in recent years has been enabling active CE manipulation and extension of the technology to therapeutic functionality, thus widening the scope of the procedure. This review outlines clinical standards of the technology as well as recent advances in CE research. Clinical capsule applications are discussed with respect to each portion of the gastrointestinal tract. Promising research efforts are presented with an emphasis on enabling active capsule locomotion. The presented studies suggest, in particular, that the most viable solution for active capsule manipulation is actuation of a capsule via an exterior permanent magnet held by a robot. Developing capsule procedures adhering to current healthcare standards, such as enabling a tool channel or irrigation in a therapeutic device, is a vital phase in the adaptation of CE in the clinical setting. PMID:26028956

  5. Science in the cloud (SIC): A use case in MRI connectomics.

    PubMed

    Kiar, Gregory; Gorgolewski, Krzysztof J; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A; Wiener, Martin; Vogelstein, R Jacob; Burns, Randal; Vogelstein, Joshua T

    2017-05-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called 'science in the cloud' (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. © The Author 2017. Published by Oxford University Press.

  6. Inventory on the dietary assessment tools available and needed in africa: a prerequisite for setting up a common methodological research infrastructure for nutritional surveillance, research, and prevention of diet-related non-communicable diseases.

    PubMed

    Pisa, Pedro T; Landais, Edwige; Margetts, Barrie; Vorster, Hester H; Friedenreich, Christine M; Huybrechts, Inge; Martin-Prevel, Yves; Branca, Francesco; Lee, Warren T K; Leclercq, Catherine; Jerling, Johann; Zotor, Francis; Amuna, Paul; Al Jawaldeh, Ayoub; Aderibigbe, Olaide Ruth; Amoussa, Waliou Hounkpatin; Anderson, Cheryl A M; Aounallah-Skhiri, Hajer; Atek, Madjid; Benhura, Chakare; Chifamba, Jephat; Covic, Namukolo; Dary, Omar; Delisle, Hélène; El Ati, Jalila; El Hamdouchi, Asmaa; El Rhazi, Karima; Faber, Mieke; Kalimbira, Alexander; Korkalo, Liisa; Kruger, Annamarie; Ledo, James; Machiweni, Tatenda; Mahachi, Carol; Mathe, Nonsikelelo; Mokori, Alex; Mouquet-Rivier, Claire; Mutie, Catherine; Nashandi, Hilde Liisa; Norris, Shane A; Onabanjo, Oluseye Olusegun; Rambeloson, Zo; Saha, Foudjo Brice U; Ubaoji, Kingsley Ikechukwu; Zaghloul, Sahar; Slimani, Nadia

    2018-01-02

    To carry out an inventory of the availability, challenges, and needs of dietary assessment (DA) methods in Africa as a prerequisite to providing evidence and setting directions (strategies) for implementing common dietary methods and supporting web-research infrastructure across countries. The inventory was performed within the framework of the "Africa's Study on Physical Activity and Dietary Assessment Methods" (AS-PADAM) project. It involves international institutional and African networks. An inventory questionnaire was developed and disseminated through the networks. Eighteen countries responded to the dietary inventory questionnaire. Various DA tools were reported in Africa; the 24-Hour Dietary Recall and the Food Frequency Questionnaire were the most commonly used tools. Few tools were validated and tested for reliability. Face-to-face interview was the most common method of administration. No computerized software or other new (web) technologies were reported. No tools were standardized across countries. The lack of comparable DA methods across the represented countries is a major obstacle to implementing comprehensive and joint nutrition-related programmes for surveillance, programme evaluation, research, and prevention. There is a need to develop new DA methods or adapt existing ones across countries, employing related research infrastructure that has been validated and standardized in other settings, with a view to standardizing methods for wider use.

  7. Technology and Information Tool Preferences of Academics in the Field of Anaesthesiology

    PubMed Central

    Akkaya, Akcan; Bilgi, Murat; Demirhan, Abdullah; Kurt, Adem Deniz; Tekelioğlu, Ümit Yaşar; Akkaya, Kadir; Koçoğlu, Hasan; Tekçe, Hikmet

    2014-01-01

    Objective Researchers use a large number of information technology tools from the beginning of a scientific study until its publication. The aim of this study is to investigate the technology and data processing tool preferences of academics who produce scientific publications in the field of anaesthesiology. Methods A multiple-choice survey, including 18 questions regarding the use of technology, was administered to assess the preferences of academicians. Results PubMed was the most preferred article search portal, and the second was Google Academic. Medscape was the most preferred medical innovation tracking website. Only 12% of academicians obtained a clinical trial registration number for their randomized clinical research. In total, 28% of respondents used the Consolidated Standards of Reporting Trials checklist in their clinical trials. Of all participants, 21% were using Dropbox and 9% were using Google Drive for sharing files. Google Chrome was the most preferred internet browser (32.25%) for academic purposes. English language editing services were obtained from the Scribendi (21%) and Textcheck (12%) websites. For statistical requirements, half of the academics received help from a specialist through a personal relationship, 27% did the work themselves, and 24% obtained professional assistance. Sixty percent of the participants were not using a reference editing program, and 21% were using EndNote. Nine percent of the academics spent money on article writing, at a mean cost of 1287 Turkish Liras/year. Conclusion Academics in the field of anaesthesiology benefit significantly from technology and informatics tools to produce scientific publications. PMID:27366448

  8. Energy Management Challenges and Opportunities with Increased Intermittent Renewable Generation on the California Electrical Grid

    NASA Astrophysics Data System (ADS)

    Eichman, Joshua David

    Renewable resources, including wind, solar, geothermal, biomass, hydroelectric, wave, and tidal, represent an opportunity for environmentally preferred generation of electricity that also increases energy security and independence. California is very proactive in encouraging the implementation of renewable energy, in part through legislation like Assembly Bill 32 and the development and execution of Renewable Portfolio Standards (RPS); however, renewable technologies are not without challenges. All renewable resources have some resource limitations, whether in location, capacity, cost, or availability. Technologies like wind and solar are intermittent in nature but represent some of the most abundant resources for generating renewable electricity. If RPS goals are to be achieved, high levels of intermittent renewables must be considered. This work explores the effects of high penetration of renewables on a grid system with respect to resource availability and identifies the key challenges, from the perspective of the grid, of introducing these resources. The HiGRID tool was developed for this analysis because no other tool could explore grid operation, while maintaining system reliability, with a diverse set of renewable resources and a wide array of complementary technologies, including energy efficiency, demand response, energy storage technologies, and electric transportation. This tool resolves the hourly operation of conventional generation resources (nuclear, coal, geothermal, natural gas, and hydro). The resulting behavior from introducing additional renewable resources and the lifetime costs for each technology are analyzed.

  9. Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with developing an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry, the LMMP Portal, by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze, and annotate the data. The infrastructure and Portal are based on a web service-oriented architecture. We designed the system to support solar system bodies in general, including asteroids, Earth, and the planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware, and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and a large-screen multi-touch display with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC)'s Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS).
Its data management services have been built on top of a set of open technologies including: Object Oriented Data Technology (OODT), an open-source data catalog, archive, file management, and data grid framework; OpenSSO, an open-source access management and federation platform; Solr, an open-source enterprise search platform; Redmine, an open-source project collaboration and management framework; GDAL, an open-source geospatial data abstraction library; and others. Its data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.
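
    The OGC web services named above are plain HTTP requests with standardized query parameters. As an illustration only (the endpoint URL and layer name below are hypothetical, not the actual LMMP service), a WMS 1.1.1 GetMap request for a lunar basemap might be assembled like this:

```python
from urllib.parse import urlencode

# Hypothetical WMS endpoint and layer name, for illustration only.
BASE_URL = "https://example.org/lmmp/wms"

def wms_getmap_url(layer, bbox, width=512, height=512):
    """Build an OGC WMS 1.1.1 GetMap request URL.

    bbox is (min_lon, min_lat, max_lon, max_lat) in the requested SRS.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": "EPSG:4326",   # lat/lon grid; lunar data may use other codes
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": "image/png",
    }
    return BASE_URL + "?" + urlencode(params)

url = wms_getmap_url("LRO_WAC_Mosaic", (-180, -90, 180, 90))
print(url)
```

Because the request is just a URL, the same service can back a web portal, a mobile app, or a desktop GIS client without any client-specific server code.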

  10. Leveraging Available Technologies for Improved Interoperability and Visualization of Remote Sensing and In-situ Oceanographic data at the PO.DAAC

    NASA Astrophysics Data System (ADS)

    Tsontos, V. M.; Arms, S. C.; Thompson, C. K.; Quach, N.; Lam, T.

    2016-12-01

    Earth science applications increasingly rely on the integration of multivariate data from diverse observational platforms. Whether for satellite mission cal/val, science or decision support, the coupling of remote sensing and in-situ field data is integral also to oceanographic workflows. This has prompted archives such as the PO.DAAC, NASA's physical oceanographic data archive, that historically has had a remote sensing focus, to adapt to better accommodate complex field campaign datasets. However, the inherent heterogeneity of in-situ datasets and their variable adherence to meta/data standards poses a significant impediment to interoperability, a problem originating early in the data lifecycle and significantly impacting stewardship and usability of these data long-term. Here we introduce a new initiative underway at PO.DAAC that seeks to catalyze efforts to address these challenges. It involves the enhancement and integration of available high TRL (Technology Readiness level) components for improved interoperability and support of in-situ data with a focus on a novel yet representative class of oceanographic field data: data from electronic tags deployed on a variety of marine species as biological sampling platforms in support of fisheries management and ocean observation efforts. This project seeks to demonstrate, deliver and ultimately sustain operationally a reusable and accessible set of tools to: 1) mediate reconciliation of heterogeneous source data into a tractable number of standardized formats consistent with earth science data standards; 2) harmonize existing metadata models for satellite and field datasets; 3) demonstrate the value added of integrated data access via a range of available tools and services hosted at the PO.DAAC, including a web-based visualization tool for comprehensive mapping of satellite and in-situ data. 
An innovative part of our project plan involves partnering with the leading electronic tag manufacturer to promote the adoption of appropriate data standards in their processing software. The proposed project thus adopts a model lifecycle approach, complemented by broadly applicable technologies, to address key data management and interoperability issues for in-situ data.

  11. Sustainable Biosphere Initiative Project

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The goal of the Advanced Technology in Ecological Sciences project is to gain broad participation within the environmental scientific community in developing a research agenda addressing the development and refinement of technologies instrumental to research that responds to these challenges (e.g., global climate change, unsustainable resource use, and threats to biological diversity). The following activities have been completed: (1) a listserv, 'eco-tech', was set up to serve as a clearinghouse of information about activities and events relating to advanced technologies; (2) a series of conference calls was organized on specific topics, including data visualization and spatial analysis, and remote sensing; and (3) two meetings were organized at the 1996 ESA Annual Meeting in Providence, Rhode Island. Topics covered included concerns about tool and data sharing; interest in expanded development of ground-based remote sensing technologies for monitoring; issues involved in training for using new technologies and increasing data streams, and associated implications of data processing capabilities; questions about how to develop appropriate standards (i.e., surface morphology classification standards) that facilitate the exchange and comparison of analytical results; and some thoughts about remote sensing platforms and vehicles.

  12. A review of odour impact criteria in selected countries around the world.

    PubMed

    Brancher, Marlon; Griffiths, K David; Franco, Davide; de Melo Lisboa, Henrique

    2017-02-01

    Exposure to environmental odour can result in annoyance, health effects, and depreciation of property values. Therefore, many jurisdictions classify odour as an atmospheric pollutant and regulate emissions and/or impacts from odour-generating activities at a national, state, or municipal level. In this work, a critical review of odour regulations in selected jurisdictions of 28 countries is presented. Individual approaches were identified as: comparing ambient air odour concentration and individual chemical statistics against impact criteria (maximum impact standard); using fixed and variable separation distances (separation distance standard); limiting the maximum emission rate for mixtures of odorants and individual chemical species (maximum emission standard); limiting the number of complaints received or the annoyance level determined via community surveys (maximum annoyance standard); and requiring use of best available technologies (BAT) to minimize odour emissions (technology standard). The comparison of model-predicted odour concentration statistics against odour impact criteria (OIC) is identified as one of the most common tools used by regulators to evaluate the risk of odour impacts in planning-stage assessments and is also used to inform assessment of odour impacts of existing facilities. Special emphasis is given to summarizing OIC (concentration percentile and threshold) and the manner in which they are applied. The way in which short-term odour peak to model time-step mean (peak-to-mean) effects are captured is also reviewed. Furthermore, the fundamentals of odorant properties, dimensions of nuisance odour, odour sampling and analysis methods, and dispersion modelling guidance are provided. Common elements of mature and effective odour regulation frameworks are identified, and an integrated multi-tool strategy is recommended. Copyright © 2016 Elsevier Ltd. All rights reserved.
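
    The most common regulatory tool described above, comparing a model-predicted concentration statistic against an OIC, reduces to a percentile test on a modelled hourly time series at a receptor. A minimal sketch, assuming an example criterion of a 98th-percentile hourly concentration not exceeding 2 ou/m³ (criterion values vary widely by jurisdiction; this threshold is an assumption for illustration, not a value from the review):

```python
import numpy as np

# Assumed example criterion: 98th percentile of hourly odour
# concentrations at a receptor must not exceed 2 ou/m^3.
def exceeds_oic(hourly_conc, percentile=98.0, threshold=2.0):
    """Compare a modelled hourly odour concentration series (ou/m^3)
    against an odour impact criterion; return (exceeds, statistic)."""
    stat = np.percentile(hourly_conc, percentile)
    return stat > threshold, stat

# Synthetic stand-in for dispersion-model output: one year of hourly
# concentrations at a single receptor (lognormal is a common shape).
rng = np.random.default_rng(0)
conc = rng.lognormal(mean=-1.0, sigma=1.0, size=8760)

breach, stat = exceeds_oic(conc)
print(f"98th percentile = {stat:.2f} ou/m^3, exceeds criterion: {bool(breach)}")
```

In practice this test is repeated over a grid of receptors, and peak-to-mean adjustment factors may be applied to the hourly means before the comparison.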

  13. Standardized Semantic Markup for Reference Terminologies, Thesauri and Coding Systems: Benefits for distributed E-Health Applications.

    PubMed

    Hoelzer, Simon; Schweiger, Ralf K; Liu, Raymond; Rudolf, Dirk; Rieger, Joerg; Dudeck, Joachim

    2005-01-01

    With the introduction of the ICD-10 as the standard for diagnosis, the development of an electronic representation of its complete content, inherent semantics, and coding rules is necessary. Our concept refers to current efforts of the CEN/TC 251 to establish a European standard for hierarchical classification systems in healthcare. We have developed an electronic representation of the ICD-10 with the Extensible Markup Language (XML) that facilitates integration into current information systems or coding software, taking into account different languages and versions. In this context, XML offers a complete framework of related technologies and standard tools for processing that helps to develop interoperable applications.
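
    To illustrate the general idea of an XML representation of a hierarchical classification, here is a minimal sketch; the element names, attributes, and titles are invented for illustration and do not reproduce the CEN/TC 251 or WHO schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified XML rendering of an ICD-10 hierarchy fragment
# (illustrative only; the actual standardized schema differs).
ICD10_XML = """
<classification system="ICD-10" version="2005" language="en">
  <chapter code="X" title="Diseases of the respiratory system">
    <block code="J09-J18" title="Influenza and pneumonia">
      <category code="J10" title="Influenza due to identified virus"/>
    </block>
  </chapter>
</classification>
"""

def find_title(root, code):
    """Look up the title of a code anywhere in the hierarchy."""
    for elem in root.iter():
        if elem.get("code") == code:
            return elem.get("title")
    return None

root = ET.fromstring(ICD10_XML)
print(find_title(root, "J10"))
```

Because the hierarchy is explicit in the markup, the same file can drive coding software, browsers, or validation rules, and parallel files can carry other languages or versions.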

  14. Distributed geospatial model sharing based on open interoperability standards

    USGS Publications Warehouse

    Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin

    2009-01-01

    Numerous geospatial computational models have been developed based on sound principles and published in journals or presented at conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering the development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared as model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
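
    A WPS-compliant model service is driven by three standard operations (GetCapabilities, DescribeProcess, Execute) that can be invoked as simple key-value-pair HTTP requests. A sketch of building such requests, with a hypothetical endpoint, process identifier, and input names standing in for the actual USGS service:

```python
from urllib.parse import urlencode

# Hypothetical WPS endpoint hosting a shared hydrological model
# (illustrative; the actual service and process names are not given here).
WPS_URL = "https://example.org/wps"

def describe_process(identifier):
    """Build a WPS 1.0.0 DescribeProcess request URL."""
    return WPS_URL + "?" + urlencode({
        "service": "WPS",
        "version": "1.0.0",
        "request": "DescribeProcess",
        "identifier": identifier,
    })

def execute(identifier, inputs):
    """Build a KVP Execute request; inputs maps input-id -> value."""
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    return WPS_URL + "?" + urlencode({
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,
        "datainputs": data_inputs,
    })

print(execute("WetlandHydroModel", {"precip_mm": 25, "year": 2007}))
```

Any WPS-aware client can first call DescribeProcess to learn the model's inputs and outputs, then Execute it, which is what makes a published model reusable without access to its source code.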

  15. Validation results of specifications for motion control interoperability

    NASA Astrophysics Data System (ADS)

    Szabo, Sandor; Proctor, Frederick M.

    1997-01-01

    The National Institute of Standards and Technology (NIST) is participating in the Department of Energy Technologies Enabling Agile Manufacturing (TEAM) program to establish interface standards for machine tool, robot, and coordinate measuring machine controllers. At NIST, the focus is to validate potential application programming interfaces (APIs) that make it possible to exchange machine controller components with a minimal impact on the rest of the system. This validation is taking place in the enhanced machine controller (EMC) consortium and is in cooperation with users and vendors of motion control equipment. An area of interest is motion control, including closed-loop control of individual axes and coordinated path planning. Initial tests of the motion control APIs are complete. The APIs were implemented on two commercial motion control boards that run on two different machine tools. The results for a baseline set of APIs look promising, but several issues were raised. These include resolving differing approaches in how motions are programmed and defining a standard measurement of performance for motion control. This paper starts with a summary of the process used in developing a set of specifications for motion control interoperability. Next, the EMC architecture and its classification of motion control APIs into two classes, Servo Control and Trajectory Planning, are reviewed. Selected APIs are presented to explain the basic functionality and some of the major issues involved in porting the APIs to other motion controllers. The paper concludes with a summary of the main issues and ways to continue the standards process.
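
    The two API classes described, Servo Control and Trajectory Planning, can be pictured as abstract interfaces that any controller vendor implements; the sketch below is a hypothetical Python rendering (the names and signatures are invented for illustration and are not the NIST/EMC specification):

```python
from abc import ABC, abstractmethod

class ServoControl(ABC):
    """Closed-loop control of a single axis (hypothetical interface)."""

    @abstractmethod
    def set_gains(self, p: float, i: float, d: float) -> None: ...

    @abstractmethod
    def command_position(self, counts: int) -> None: ...

class TrajectoryPlanner(ABC):
    """Coordinated multi-axis path planning (hypothetical interface)."""

    @abstractmethod
    def move_linear(self, target: tuple, feed_rate: float) -> None: ...

# Exchanging a controller component then means supplying another
# implementation of the same interface, leaving the rest of the
# system unchanged. A trivial implementation that just records calls:
class LoggingServo(ServoControl):
    def __init__(self):
        self.log = []

    def set_gains(self, p, i, d):
        self.log.append(("gains", p, i, d))

    def command_position(self, counts):
        self.log.append(("pos", counts))

servo = LoggingServo()
servo.set_gains(1.0, 0.1, 0.01)
servo.command_position(1000)
print(servo.log)
```

The interoperability issues the paper raises (how motions are programmed, how performance is measured) live behind these method boundaries, which is why a common API alone does not settle them.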

  16. Medicaid information technology architecture: an overview.

    PubMed

    Friedman, Richard H

    2006-01-01

    The Medicaid Information Technology Architecture (MITA) is a roadmap and tool-kit for States to transform their Medicaid Management Information System (MMIS) into an enterprise-wide, beneficiary-centric system. MITA will enable State Medicaid agencies to align their information technology (IT) opportunities with their evolving business needs. It also addresses long-standing issues of interoperability, adaptability, and data sharing, including clinical data, across organizational boundaries by creating models based on nationally accepted technical standards. Perhaps most significantly, MITA allows State Medicaid Programs to actively participate in the DHHS Secretary's vision of a transparent health care market that utilizes electronic health records (EHRs), ePrescribing and personal health records (PHRs).

  17. THE APPLICATION AND DEVELOPMENT OF APPROPRIATE TOOLS AND TECHNOLOGIES FOR COST-EFFECTIVE CARBON SEQUESTRATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bill Stanley; Sandra Brown; Ellen Hawes

    2002-09-01

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas impacts. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: advanced videography testing; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of a new project software screening tool.

  18. The role of informatics in patient-centered care and personalized medicine.

    PubMed

    Hanna, Matthew G; Pantanowitz, Liron

    2017-06-01

    The practice of cytopathology has dramatically changed due to advances in genomics and information technology. Cytology laboratories have accordingly become increasingly dependent on pathology informatics support to meet the emerging demands of precision medicine. Pathology informatics deals with information technology in the laboratory and the impact of this technology on workflow processes and the staff who interact with these tools. This article covers the critical role that laboratory information systems, electronic medical records, and digital imaging play in patient-centered personalized medicine. The value of integrated diagnostic reports, clinical decision support, and the use of whole-slide imaging to better evaluate cytology samples destined for molecular testing is discussed. Image analysis that offers more precise and quantitative measurements in cytology is addressed, as well as the role of bioinformatics tools in coping with Big Data from next-generation sequencing. This article also highlights the barriers to the widespread adoption of these disruptive technologies due to regulatory obstacles, limited commercial solutions, poor interoperability, and lack of standardization. Cancer Cytopathol 2017;125(6 suppl):494-501. © 2017 American Cancer Society.

  19. A Blended Learning Framework for Curriculum Design and Professional Development

    ERIC Educational Resources Information Center

    Mirriahi, Negin; Alonzo, Dennis; Fox, Bob

    2015-01-01

    The need for flexibility in learning and the affordances of technology provided the impetus for the rise of blended learning (BL) globally across higher education institutions. However, the adoption of BL practices continues at a low pace due to academics' low digital fluency, various views and BL definitions, and limited standards-based tools to…

  20. The Flipside: Concerns about the "New Literacies" Paths Educators Might Take

    ERIC Educational Resources Information Center

    Kellinger, Janna Jackson

    2012-01-01

    This article explores some of the ways in which educators are using the tools of new literacies without the mind-set. The author poses the possibility that this might be a result of increased pressure on teachers to differentiate for standardization. The author then presents ways in which new literacies, particularly those grounded in technology,…

  1. Clinical applications of breath testing

    PubMed Central

    Paschke, Kelly M; Mashir, Alquam

    2010-01-01

    Breath testing has the potential to benefit the medical field as a cost-effective, non-invasive diagnostic tool for diseases of the lung and beyond. With growing evidence of clinical worth, standardization of methods, and new sensor and detection technologies, the stage is set for breath testing to gain considerable attention and wider application in upcoming years. PMID:21173863

  2. Architectural and Functional Design and Evaluation of E-Learning VUIS Based on the Proposed IEEE LTSA Reference Model.

    ERIC Educational Resources Information Center

    O'Droma, Mairtin S.; Ganchev, Ivan; McDonnell, Fergal

    2003-01-01

    Presents a comparative analysis from the Institute of Electrical and Electronics Engineers (IEEE) Learning Technology Standards Committee's (LTSC) of the architectural and functional design of e-learning delivery platforms and applications, e-learning course authoring tools, and learning management systems (LMSs), with a view of assessing how…

  3. A Season of Change: How Science Librarians Can Remain Relevant with Open Access and Scholarly Communications Initiatives

    ERIC Educational Resources Information Center

    Brown, Elizabeth

    2009-01-01

    The current rate of change suggests that scholarly communications issues, such as new publication models and technology to connect library and research tools, will continue into the foreseeable future. As models evolve, standards develop, and scientists evolve in their communication patterns, librarians will need to embrace transitional…

  4. Impacts | Wind | NREL

    Science.gov Websites

    Headlines from this NREL wind research page include: Framework Transforms FAST Wind Turbine Modeling Tool; NREL Assesses National Design Standards for Offshore Wind Resource; NREL Identifies Investments for Wind Turbine Drivetrain Technologies; and R&D 100 Awards.

  5. Investigation of ERP Teaching and Practitioner Experiences Related to ISO 9000 Core Standards

    ERIC Educational Resources Information Center

    Wiggins, Charles

    2010-01-01

    Enterprise Resource Planning (ERP) systems have greatly enhanced the efficiency and continuity of business processes and the flow of information technology in support of organizations. ERP was intended to be used as a tool for manufacturing in an effort to build more cohesive customer relationships. Lately many "Fortune" 500 companies…

  6. ChemVoyage: A Web-Based, Simulated Learning Environment with Scaffolding and Linking Visualization to Conceptualization

    ERIC Educational Resources Information Center

    McRae, Christopher; Karuso, Peter; Liu, Fei

    2012-01-01

    The Web is now a standard tool for information access and dissemination in higher education. The prospect of Web-based, simulated learning platforms and technologies, however, remains underexplored. We have developed a Web-based tutorial program (ChemVoyage) for a third-year organic chemistry class on the topic of pericyclic reactions to…

  7. User Evaluation of Automatically Generated Semantic Hypertext Links in a Heavily Used Procedural Manual.

    ERIC Educational Resources Information Center

    Tebbutt, John

    1999-01-01

    Discusses efforts at National Institute of Standards and Technology (NIST) to construct an information discovery tool through the fusion of hypertext and information retrieval that works by parsing a contiguous document base into smaller documents and inserting semantic links between them. Also presents a case study that evaluated user reactions.…

  8. Loop-Mediated Isothermal Amplification Test for Trypanosoma gambiense Group 1 with Stem Primers: A Molecular Xenomonitoring Test for Sleeping Sickness.

    PubMed

    Njiru, Zablon K; Mbae, Cecilia K; Mburugu, Gitonga N

    2017-01-01

    The World Health Organization has targeted Human African Trypanosomiasis (HAT) for elimination by 2020, with zero incidence by 2030. To achieve and sustain this goal, accurate and easy-to-deploy diagnostic tests for Gambian trypanosomiasis, which accounts for over 98% of reported cases, will play a crucial role. Most needed will be tools for surveillance of the pathogen in vectors (xenomonitoring), since population screening tests are readily available. The development of new tests is expensive and takes a long time, while incremental improvement of existing technologies that have potential for xenomonitoring may offer a shorter pathway to tools for HAT surveillance. We have investigated the effect of including a second set of reaction-accelerating primers (stem primers) in the standard T. brucei gambiense LAMP test format. The new test format was analyzed with and without outer primers. Amplification was carried out using the Rotorgene 6000 and the portable ESE Quant amplification unit capable of real-time data output. The stem LAMP formats showed a shorter time to results (~8 min), were 10-100-fold more sensitive, and indicated higher diagnostic sensitivity and accuracy compared with the standard LAMP test. It was possible to confirm the predicted product using ESE melt curves, demonstrating the potential of combining LAMP and real-time technologies as a possible tool for HAT molecular xenomonitoring.

  9. Methods for transition toward computer assisted cognitive examination.

    PubMed

    Jurica, P; Valenzi, S; Struzik, Z R; Cichocki, A

    2015-01-01

    We present a software framework which enables the extension of current methods for the assessment of cognitive fitness using recent technological advances. Screening for cognitive impairment is becoming more important as the world's population grows older, and current methods could be enhanced by the use of computers. Introduction of new methods to clinics requires basic tools for the collection and communication of collected data. Our objective was to develop tools that, with minimal interference, offer new opportunities for the enhancement of current interview-based cognitive examinations. We suggest methods and discuss the process by which established cognitive tests can be adapted for data collection through digitization by pen-enabled tablets. We discuss a number of methods for evaluation of the collected data, which promise to increase the resolution and objectivity of the common scoring strategy based on visual inspection. By involving computers in the roles of both instructing and scoring, we aim to increase the precision and reproducibility of cognitive examination. The tools provided in the Python framework CogExTools, available at http://bsp.brain.riken.jp/cogextools/, enable the design, application, and evaluation of screening tests for the assessment of cognitive impairment. The toolbox is a research platform: it represents a foundation for further collaborative development by the wider research community and enthusiasts. It is free to download and use, and open-source. We introduce a set of open-source tools that facilitate the design and development of new cognitive tests for modern technology. We provide these tools in order to enable the adaptation of technology for cognitive examination in clinical settings, as a first step in a possible transition toward standardized mental state examination using computers.

  10. The Virtual Physiological Human ToolKit.

    PubMed

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.

  11. The capability of lithography simulation based on MVM-SEM® system

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Shingo; Fujii, Nobuaki; Kanno, Koichi; Imai, Hidemichi; Hayano, Katsuya; Miyashita, Hiroyuki; Shida, Soichi; Murakawa, Tsutomu; Kuribara, Masayuki; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hara, Daisuke; Pang, Linyong

    2015-10-01

    Lithography at the 1X nm technology node uses SMO-ILT, NTD or even more complex patterns. In mask defect inspection, defect verification therefore becomes more difficult, because many nuisance defects are detected in aggressive mask features. One key technology of mask manufacture is defect verification using an aerial image simulator or other printability simulation. AIMS™ technology correlates excellently with the wafer and is the standard tool for defect verification; however, it is impractical for verifying a hundred or more defects. We previously reported the capability of defect verification based on lithography simulation with a SEM system, whose architecture and software showed excellent correlation for simple line-and-space patterns [1]. In this paper, we use a next-generation SEM system combined with a lithography simulation tool for SMO-ILT, NTD and other complex pattern lithography. Furthermore, we use three-dimensional (3D) lithography simulation based on the Multi Vision Metrology SEM (MVM-SEM®) system. Finally, we confirm the performance of 2D and 3D lithography simulation based on the SEM system for photomask verification.

  12. Update on Smoking Cessation: E-Cigarettes, Emerging Tobacco Products Trends, and New Technology-Based Interventions.

    PubMed

    Das, Smita; Tonelli, Makenzie; Ziedonis, Douglas

    2016-05-01

    Tobacco use disorders (TUDs) continue to be overrepresented among patients treated in mental health and addiction treatment settings. TUD is the most common substance use disorder (SUD) and the leading cause of health disparities and increased morbidity/mortality among individuals with a psychiatric disorder. There are seven Food and Drug Administration (FDA) approved medications and excellent evidence-based psychosocial treatment interventions for use in TUD treatment. In the past few years, access to and use of emerging tobacco and nicotine products have risen, including the highly publicized electronic cigarette (e-cigarette). There has also been a proliferation of technology-based interventions to support standard TUD treatment, including mobile apps and web-based interventions; these tools are easily accessed 24/7 to support outpatient treatment. This update reviews the emerging products and counter-measure intervention technologies, including how clinicians can integrate these tools and other community-based resources into their practice.

  13. Infusing informatics into interprofessional education: the iTEAM (Interprofessional Technology Enhanced Advanced practice Model) project.

    PubMed

    Skiba, Diane J; Barton, Amy J; Knapfel, Sarah; Moore, Gina; Trinkley, Katy

    2014-01-01

    The iTEAM goal is to prepare advanced practice nurses, physicians and pharmacists with the interprofessional (IP) core competencies (informatics, patient-centric, quality-focused, evidence-based care) to provide technology-enhanced collaborative care by offering technology-enhanced learning opportunities: a required informatics course, advanced practice courses (team-based experiences with both standardized and virtual patients) and team-based clinical experiences, including e-health experiences. The innovative features of the iTEAM project will be achieved through the use of social media strategies, a web-accessible Electronic Health Record (EHR) system, a Virtual Clinic/Hospital in Second Life, and various e-health applications, including traditional telehealth tools and consumer-oriented tools such as patient portals, social media consumer groups and mobile health (m-health) applications for health and wellness functions. The project builds upon the schools' rich history of IP education and includes clinical partners, such as the VA and other clinical sites focused on care for underserved patient populations.

  14. A typology of educationally focused medical simulation tools.

    PubMed

    Alinier, Guillaume

    2007-10-01

    The concept of simulation as an educational tool in healthcare is not a new idea, but its use has really blossomed over the last few years. This enthusiasm is partly driven by an attempt to increase patient safety and also because the technology is becoming more affordable and advanced. Simulation is becoming more commonly used for initial training purposes as well as for continuing professional development, but people often have very different perceptions of the definition of the term simulation, especially in an educational context. This highlights the need for a clear classification not only of the available technology but also of the method and teaching approach employed. The aims of this paper are to discuss the current range of simulation approaches and to propose a clear typology of simulation teaching aids. Commonly used simulation techniques have been identified and discussed in order to create a classification that reports simulation techniques, their usual mode of delivery, the skills they can address, the facilities required, their typical use, and their pros and cons. This paper presents a clear classification scheme of educational simulation tools and techniques with six different technological levels. They are, respectively: written simulations, three-dimensional models, screen-based simulators, standardized patients, intermediate fidelity patient simulators, and interactive patient simulators. This typology allows the accurate description of the simulation technology and the teaching methods applied. Thus valid comparisons of educational tools can be made as to their potential effectiveness and verisimilitude at different training stages. The proposed typology of simulation methodologies available for educational purposes provides a helpful guide for educators and participants which should help them to realise the potential learning outcomes at different technological simulation levels in relation to the training approach employed. It should also be a useful resource for simulation users who are trying to improve their educational practice.

  15. TENI: A comprehensive battery for cognitive assessment based on games and technology.

    PubMed

    Delgado, Marcela Tenorio; Uribe, Paulina Arango; Alonso, Andrés Aparicio; Díaz, Ricardo Rosas

    2016-01-01

    TENI (Test de Evaluación Neuropsicológica Infantil) is an instrument developed to assess cognitive abilities in children between 3 and 9 years of age. It is based on a model that incorporates games and technology as tools to improve the assessment of children's capacities. The test was standardized with two Chilean samples of 524 and 82 children living in urban zones. Evidence of reliability and validity based on current standards is presented. Data show good levels of reliability for all subtests. Some evidence of validity in terms of content, test structure, and association with other variables is presented. This instrument represents a novel approach and a new frontier in cognitive assessment. Further studies with clinical, rural, and cross-cultural populations are required.

  16. Calibrated thermal microscopy of the tool-chip interface in machining

    NASA Astrophysics Data System (ADS)

    Yoon, Howard W.; Davies, Matthew A.; Burns, Timothy J.; Kennedy, M. D.

    2000-03-01

    A critical parameter in predicting tool wear during machining, and in accurate computer simulations of machining, is the spatially resolved temperature at the tool-chip interface. We describe the development and calibration of a nearly diffraction-limited thermal-imaging microscope to measure spatially resolved temperatures during the machining of AISI 1045 steel with a tungsten-carbide tool bit. The microscope images a 0.5 mm x 0.5 mm target area with < 5 micrometer spatial resolution and is based on a commercial InSb 128 x 128 focal plane array with an all-reflective microscope objective. The minimum frame acquisition time is < 1 ms. The microscope is calibrated using a standard blackbody source from the radiance temperature calibration laboratory at the National Institute of Standards and Technology, and the emissivity of the machined material is deduced from infrared reflectivity measurements. The steady-state thermal images from the machining of 1045 steel are compared with previous determinations of tool temperatures from micro-hardness measurements and are found to be in agreement with those studies. The measured average chip temperatures also agree with the temperature rise estimated from energy-balance considerations. From these calculations, and from the agreement between the experimental and calculated determinations of the emissivity of the 1045 steel, the standard uncertainty of the temperature measurements is estimated to be about 45 °C at 900 °C.
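The emissivity correction underlying such a calibration follows directly from Planck's law: a graybody of emissivity eps at true temperature T has the radiance of a blackbody at a lower radiance temperature T_rad. A minimal sketch (the wavelength, emissivity and temperature values below are illustrative, not values from the paper) inverts this relation:

```python
from math import exp, log

C2 = 1.4388e-2  # second radiation constant hc/k, in m*K

def true_temperature(t_rad_k, emissivity, wavelength_m):
    """Convert a measured radiance temperature (K) to the true surface
    temperature (K) for a target of known spectral emissivity.
    Derivation: L_bb(lambda, T_rad) = eps * L_bb(lambda, T), with
    L_bb given by Planck's law, solved for T."""
    x = C2 / (wavelength_m * t_rad_k)
    return C2 / (wavelength_m * log(1.0 + emissivity * (exp(x) - 1.0)))

# A blackbody (emissivity 1) needs no correction:
print(true_temperature(1173.0, 1.0, 5e-6))  # -> 1173.0 (i.e. 900 C)
# A gray target with emissivity 0.4 at 5 um is hotter than it appears:
print(true_temperature(1173.0, 0.4, 5e-6))
```

Since emissivity is below one for real machined surfaces, the correction always raises the temperature, which is why an independent reflectivity-based emissivity estimate matters for the stated uncertainty.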

  17. Electronic Medical Records and the Technological Imperative: The Retrieval of Dialogue in Community-Based Primary Care.

    PubMed

    Franz, Berkeley; Murphy, John W

    2015-01-01

    Electronic medical records are regarded as an important tool in primary health-care settings. Because these records are thought to standardize medical information, facilitate provider communication, and improve office efficiency, many practices are transitioning to these systems. However, much of the concern with improving the practice of record keeping has related to technological innovations and human-computer interaction. Drawing on the philosophical reflection raised in Jacques Ellul's work, this article questions the technological imperative that may be supporting medical record keeping. Furthermore, given the growing emphasis on community-based care, this article discusses important non-technological aspects of electronic medical records that might bring the use of these records in line with participatory primary-care medicine.

  18. Moving Beyond the 10,000 Ways That Don't Work

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Arctur, D. K.; Rueda, C.

    2009-12-01

    From his research developing light-bulb filaments, Thomas Edison provides us with a good lesson for advancing any venture. He said, "I have not failed, I've just found 10,000 ways that won't work." Advancing data and access interoperability is one such venture, difficult to achieve because of the differences among the participating communities. Even within the marine domain, different communities exist, and with them different technologies (formats and protocols) for publishing data and their descriptions, and different vocabularies for naming things (e.g. parameters, sensor types). Simplifying this heterogeneity of technologies is accomplished not only by adopting standards, but also by creating profiles and advancing tools that use those standards. In some cases, standards are advanced by building from existing tools. But what is the best strategy? Edison provides a hint. Prototypes and test beds are essential to achieve interoperability among geospatial communities. The Open Geospatial Consortium (OGC) calls them interoperability experiments; the World Wide Web Consortium (W3C) calls them incubator projects. Prototypes help test and refine specifications. The Marine Metadata Interoperability (MMI) Initiative, which is advancing marine data integration and re-use by promoting community solutions, understood this strategy and started an interoperability demonstration with the SURA Coastal Ocean Observing and Prediction (SCOOP) program. This interoperability demonstration evolved into the OGC Ocean Science Interoperability Experiment (Oceans IE). The Oceans IE brings together the ocean-observing community to advance the interoperability of ocean observing systems using OGC standards. The Oceans IE Phase I investigated the use of the OGC Web Feature Service (WFS) and OGC Sensor Observation Service (SOS) standards for representing and exchanging point data records from fixed in-situ marine platforms.
The Oceans IE Phase I produced an engineering best practices report, advanced reference implementations, and submitted various change requests that are now being considered by the OGC SOS working group. Building on Phase I, and with a focus on semantically-enabled services, Oceans IE Phase II will continue the use and improvement of OGC specifications in the marine community. We will present the lessons learned and in particular the strategy of experimenting with technologies to advance standards to publish data in marine communities, which could also help advance interoperability in other geospatial communities. We will also discuss the growing collaborations among ocean-observing standards organizations that will bring about the institutional acceptance needed for these technologies and practices to gain traction globally.

  19. Field-programmable lab-on-a-chip based on microelectrode dot array architecture.

    PubMed

    Wang, Gary; Teng, Daniel; Lai, Yi-Tse; Lu, Yi-Wen; Ho, Yingchieh; Lee, Chen-Yi

    2014-09-01

    The fundamentals of electrowetting-on-dielectric (EWOD) digital microfluidics are very strong: advantageous capabilities in the manipulation of fluids, small test volumes, precise dynamic control and detection, and microscale systems. These advantages are very important for future biochip developments, but the development of EWOD microfluidics has been hindered by the absence of integrated detector technology, standard commercial components, on-chip sample preparation, standard manufacturing technology and end-to-end system integration. A field-programmable lab-on-a-chip (FPLOC) system based on a microelectrode dot array (MEDA) architecture is presented in this research. The MEDA architecture proposes a standard EWOD microfluidic component called the 'microelectrode cell', which can be dynamically configured into microfluidic components to perform the microfluidic operations of the biochip. A proof-of-concept prototype FPLOC, containing a 30 × 30 MEDA, was developed using generic integrated-circuit computer-aided design tools, and it was manufactured with standard low-voltage complementary metal-oxide-semiconductor technology, which allows smooth on-chip integration of microfluidics and microelectronics. By integrating 900 droplet-detection circuits into the microelectrode cells, the FPLOC has achieved large-scale integration of microfluidics and microelectronics. Compared to full-custom and bottom-up design methods, the FPLOC provides a hierarchical top-down design approach, field-programmability and dynamic manipulation of droplets for advanced microfluidic operations.
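The idea of dynamically configuring microelectrode cells can be sketched abstractly. The toy example below is hypothetical (it is not the FPLOC's actual control interface): it models the 30 × 30 array as a grid and computes the sequence of cells to activate to pull a droplet step by step to a destination, the basic electrowetting transport operation:

```python
# Hypothetical sketch of MEDA-style droplet routing: activating the cell
# adjacent to a droplet pulls it one step via electrowetting.

GRID = 30  # a 30 x 30 microelectrode dot array, as in the prototype

def route(start, goal):
    """Return the sequence of (row, col) cells to activate to move a
    droplet from start to goal, using simple L-shaped Manhattan routing."""
    (r, c), (gr, gc) = start, goal
    path = []
    while r != gr:                      # move along rows first
        r += 1 if gr > r else -1
        path.append((r, c))
    while c != gc:                      # then along columns
        c += 1 if gc > c else -1
        path.append((r, c))
    assert all(0 <= pr < GRID and 0 <= pc < GRID for pr, pc in path)
    return path

print(route((0, 0), (2, 3)))  # -> [(1, 0), (2, 0), (2, 1), (2, 2), (2, 3)]
```

A real controller would additionally consult the 900 on-chip droplet-detection circuits to confirm each move before activating the next cell.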

  20. [Good practices and techniques for prevention of accidents at work and occupational diseases. New database of Inail].

    PubMed

    Bindi, L; Ossicini, A

    2007-01-01

    The project "The publication of good practices and good techniques for prevention" is one of the priorities of Inail. This computerized system for the collection of good practices (BP) and good techniques (BT) is aimed at the health and safety of workers. The basic objective of the database is to provide a valuable, usable, dynamic and regularly updated tool that facilitates and directs access to BP and BT by the people responsible for occupational health and safety (SSL). At the same time, it constitutes a strategically important tool for enterprises (especially SMEs) in terms of technological innovation and competitiveness related to prevention and to the safety and health of workers. The realization of this project has involved many professionals (chemists, engineers, doctors, biologists, geologists, etc.), each contributing qualified professional competence.

  1. Grape RNA-Seq analysis pipeline environment

    PubMed Central

    Knowles, David G.; Röder, Maik; Merkel, Angelika; Guigó, Roderic

    2013-01-01

    Motivation: The avalanche of data arriving since the development of NGS technologies has prompted the need for developing fast, accurate and easily automated bioinformatic tools capable of dealing with massive datasets. Among the most productive applications of NGS technologies is the sequencing of cellular RNA, known as RNA-Seq. Although RNA-Seq provides similar or superior dynamic range to microarrays at similar or lower cost, the lack of standard and user-friendly pipelines is a bottleneck preventing RNA-Seq from becoming the standard for transcriptome analysis. Results: In this work we present a pipeline for processing and analyzing RNA-Seq data, which we have named Grape (Grape RNA-Seq Analysis Pipeline Environment). Grape supports raw sequencing reads produced by a variety of technologies, either in FASTA or FASTQ format, or as prealigned reads in SAM/BAM format. A minimal Grape configuration consists of the file location of the raw sequencing reads, the genome of the species and the corresponding gene and transcript annotation. Grape first runs a set of quality control steps, and then aligns the reads to the genome, a step that is omitted for prealigned read formats. Grape next estimates gene and transcript expression levels, calculates exon inclusion levels and identifies novel transcripts. Grape can be run on a single computer or in parallel on a computer cluster. It is distributed with specific mapping and quantification tools, but given its modular design, any tool supporting popular data interchange formats can be integrated. Availability: Grape can be obtained from the Bioinformatics and Genomics website at: http://big.crg.cat/services/grape. Contact: david.gonzalez@crg.eu or roderic.guigo@crg.eu PMID:23329413
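The modular structure the abstract describes (a minimal configuration, a fixed stage order, alignment skipped for prealigned input, any stage swappable) can be sketched as follows. This is an illustrative outline only, not Grape's actual configuration format or API; all keys and stage names here are invented:

```python
# Illustrative pipeline skeleton in the spirit of the abstract's
# description: minimal config, QC -> align -> quantify, with the
# alignment stage skipped for prealigned (SAM/BAM-style) input.

config = {
    "reads": "sample.fastq",    # raw or prealigned input
    "genome": "genome.fa",
    "annotation": "genes.gtf",
    "prealigned": False,
}

def quality_control(cfg):  return f"qc({cfg['reads']})"
def align(cfg):            return f"align({cfg['reads']} -> {cfg['genome']})"
def quantify(cfg):         return f"quantify(using {cfg['annotation']})"

def run_pipeline(cfg):
    """Run the stages in order; any stage could be swapped for another
    tool that reads and writes standard interchange formats."""
    stages = [quality_control]
    if not cfg["prealigned"]:
        stages.append(align)    # omitted for prealigned reads
    stages.append(quantify)
    return [stage(cfg) for stage in stages]

print(run_pipeline(config))
```

The swappable-stage design is what lets a pipeline like this integrate different mappers or quantifiers without changing the driver code.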

  2. Fuel Economy Regulations and Efficiency Technology Improvements in U.S. Cars Since 1975

    NASA Astrophysics Data System (ADS)

    MacKenzie, Donald Warren

    Light-duty vehicles account for 43% of petroleum consumption and 23% of greenhouse gas emissions in the United States. Corporate Average Fuel Economy (CAFE) standards are the primary policy tool addressing petroleum consumption in the U.S., and are set to tighten substantially through 2025. In this dissertation, I address several interconnected questions on the technical, policy, and market aspects of fuel consumption reduction. I begin by quantifying historic improvements in fuel efficiency technologies since the 1970s. First, I develop a linear regression model of acceleration performance conditional on power, weight, powertrain, and body characteristics, showing that vehicles today accelerate 20-30% faster than vehicles with similar specifications in the 1970s. Second, I find that growing use of alternative materials and a switch to more weight-efficient vehicle architectures since 1975 have cut the weight of today's new cars by approximately 790 kg (46%). Integrating these results with model-level specification data, I estimate that the average fuel economy of new cars could have tripled from 1975-2009, if not for changes in performance, size, and features over this period. The pace of improvements was not uniform, averaging 5% annually from 1975-1990, but only 2% annually since then. I conclude that the 2025 standards can be met through improvements in efficiency technology, if we can return to 1980s rates of improvement and growth in acceleration performance and feature content is curtailed. I next test the hypotheses that higher fuel prices and more stringent CAFE standards cause automotive firms to deploy efficiency technologies more rapidly. I find some evidence that higher fuel prices cause more rapid changes in technology, but little to no evidence that tighter CAFE standards increase rates of technology change. I conclude that standards alone, without continued high gasoline prices, may not drive technology improvements at the rates needed to meet the 2025 CAFE standards. Finally, I examine the factors determining industry support for nationwide fuel economy regulations. (Copies available exclusively from MIT Libraries, libraries.mit.edu/docs - docs@mit.edu)
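A toy version of such a regression can be fit in closed form. The sketch below uses synthetic, illustrative numbers and a single power-to-weight predictor rather than the dissertation's full conditional specification; it shows the log-log OLS mechanics only:

```python
from math import log, exp

# Synthetic (hypothetical) data: power-to-weight ratio (kW/kg) versus
# 0-60 mph acceleration time (s). Real model-level specification data
# would replace these values.
pw    = [0.05, 0.07, 0.10, 0.14, 0.20]
times = [14.0, 10.8,  8.2,  6.4,  4.9]

def fit_loglog(x, y):
    """Closed-form OLS fit of log(y) = a + b*log(x); returns (a, b)."""
    lx, ly = [log(v) for v in x], [log(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) / \
        sum((u - mx) ** 2 for u in lx)
    return my - b * mx, b

a, b = fit_loglog(pw, times)
print(f"t ~ {exp(a):.2f} * (P/W)^{b:.2f}")  # b is negative: more power
                                            # per kg, faster acceleration
```

Conditioning on additional covariates (weight, powertrain, body type), as the dissertation does, turns this into a multiple regression but the estimation idea is the same.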

  3. Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bill Stanley; Patrick Gonzalez; Sandra Brown

    2005-10-01

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1st, 2005 and June 30th, 2005. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.

  4. Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bill Stanley; Patrick Gonzalez; Sandra Brown

    2006-01-01

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1st, 2005 and June 30th, 2005. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.

  5. Trans-Proteomic Pipeline, a standardized data processing pipeline for large-scale reproducible proteomics informatics

    PubMed Central

    Deutsch, Eric W.; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L.

    2015-01-01

    Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include mass spectrometry to define protein sequence, protein:protein interactions, and protein post-translational modifications. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative mass spectrometry proteomics. It supports all major operating systems and instrument vendors via open data formats. Here we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of tandem mass spectrometry datasets, as well as some major upcoming features. PMID:25631240

  6. Trans-Proteomic Pipeline, a standardized data processing pipeline for large-scale reproducible proteomics informatics.

    PubMed

    Deutsch, Eric W; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L

    2015-08-01

    Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include MS to define protein sequence, protein:protein interactions, and protein PTMs. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative MS proteomics. It supports all major operating systems and instrument vendors via open data formats. Here, we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of MS/MS datasets, as well as some major upcoming features. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Evaluation results of a new EUV reticle pod based on SEMI E152

    NASA Astrophysics Data System (ADS)

    Ota, Kazuya; Yonekawa, Masami; Taguchi, Takao; Suga, Osamu

    2010-04-01

    To protect the reticle during shipping, storage and tool handling, various reticle pod concepts have been proposed and evaluated over the last 10 years. MIRAI-Selete has been developing EUV reticle handling technology and has evaluated EUV reticle pods designed using the "Dual Pod Concept" for four years. The concept was jointly proposed by Canon and Nikon at the EUV mask technology and standards workshop at Miyazaki in November 2004: a mask is doubly protected by an inner pod and an outer pod, and is carried into an exposure tool within the inner pod. Canon, Nikon and Entegris started collaborating in 2005 and developed three types of EUV pod prototype: alpha, beta and gamma. The gamma pods were evaluated by MIRAI-Selete, and the superiority of the dual pod concept was verified with extensive experimental data on shipping, storage and tool handling. The dual pod concept was standardized as SEMI E152-0709 "Mechanical Specification of EUV Pods for 150mm EUVL Reticles" in 2009. Canon, Nikon and Entegris have developed a new pod design compatible with SEMI E152; it has a Type A inner baseplate for use with EUV exposure tools. The baseplate has two alignment windows, a window for a data matrix symbol and five pockets serving as the front-edge grip exclusion volumes. In addition to the new features, there are some differences between the new SEMI-compliant pod design and the former "CNE-gamma" design; e.g., the material of the inner cover was changed to metal to reduce the outgassing rate, and the gap between the reticle and the side supports was widened to satisfy a requirement of the standard. MIRAI-Selete has evaluated the particle-protection capability of the new SEMI-compliant "cnPod" pods during shipping, storage and tool handling in vacuum, and found that the cnPod has excellent particle-protection capability and that the dual pod concept can be used not only for an EUVL pilot line but also for EUVL high-volume manufacturing.

  8. Using Process and Inquiry to Teach Content: Projectile Motion and Graphing

    ERIC Educational Resources Information Center

    Rhea, Marilyn; Lucido, Patricia; Gregerson-Malm, Cheryl

    2005-01-01

    This series of lessons uses the process of student inquiry to teach the concepts of force and motion identified in the National Science Education Standards for grades 5-8. The lesson plan also uses technology as a teaching tool through interactive Web sites. The lessons are built on the 5-E format and feature embedded assessments.

  9. ScreenBEAM: a novel meta-analysis algorithm for functional genomics screens via Bayesian hierarchical modeling | Office of Cancer Genomics

    Cancer.gov

    Functional genomics (FG) screens, using RNAi or CRISPR technology, have become a standard tool for systematic, genome-wide loss-of-function studies for therapeutic target discovery. As in many large-scale assays, however, off-target effects, variable reagent potency and experimental noise must be accounted for to appropriately control for false positives.

  10. National Software Reference Library (NSRL)

    National Institute of Standards and Technology Data Gateway

    National Software Reference Library (NSRL) (PC database for purchase)   A collaboration of the National Institute of Standards and Technology (NIST), the National Institute of Justice (NIJ), the Federal Bureau of Investigation (FBI), the Defense Computer Forensics Laboratory (DCFL), the U.S. Customs Service, software vendors, and state and local law enforcement organizations, the NSRL is a tool to assist in fighting crime involving computers.

  11. YouTube as a Teacher Training Tool: Information and Communication Technology as a Delivery Instrument for Professional Development

    ERIC Educational Resources Information Center

    Copper, Jenna; Semich, George

    2014-01-01

    High-stakes student testing, accountability for students' outcomes, new educational trends, and revised curricula and standards are only a few of the reasons that teachers must learn to teach complex material with skilled and intentional practices. As a result, professional development for educators is in critical demand. Nevertheless, research in…

  12. Lebron FINAL REPORT Standardized Procedures for Use of Nucleic Acid-Based Tools SERDP PROJECT ER-1561. Strategic Environmental Research and Development Program, Washington, DC, USA

    EPA Science Inventory

    Technical Approach: A technology review on the status of MBTs was performed at the beginning of the project to determine MBT use in other industries. The review focused on project goals and activities, which included: 1) Comparing qPCR to non-PCR-based enumeration methods to valid...

  13. Interoperability of Neuroscience Modeling Software

    PubMed Central

    Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik

    2009-01-01

    Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374

  14. An AFDX Network for Spacecraft Data Handling

    NASA Astrophysics Data System (ADS)

    Deredempt, Marie-Helene; Kollias, Vangelis; Sun, Zhili; Canamares, Ernest; Ricco, Philippe

    2014-08-01

    In the aeronautical domain, the ARINC 664 Part 7 specification (AFDX) [4] provides the enabling technology for interfacing equipment in Integrated Modular Avionics (IMA) architectures. The complementary part of AFDX for complete interoperability, the Time and Space Partitioning concepts of ARINC 653 [1], has already been studied as part of the ESA roadmap for the space domain (i.e., the IMA4Space project). A standardized IMA-based architecture is already considered in the aeronautical domain to be more flexible, reliable, and secure. Integration and validation become simpler, using a common set of tools and databases, and can be carried out in parts on different facilities with the same definition (hardware and software test benches, flight control or alarm test benches, simulators, and flight test installations). In some areas, requirements in terms of data processing are quite similar in the space domain, and the concept could be applied to take benefit of the technology itself and of the range of hardware and software solutions and tools available on the market. The Mission project (Methodology and assessment for the applicability of ARINC 664 (AFDX) in Satellite/Spacecraft on-board communicatION networks), an FP7 initiative for bringing terrestrial SME research into the space domain, started to evaluate the applicability of the standard in the space domain.
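    The traffic contract at the heart of ARINC 664 Part 7 can be illustrated with a small calculation: each virtual link (VL) may send at most one maximum-size frame (Lmax bytes) per Bandwidth Allocation Gap (BAG), so its worst-case reserved bandwidth is Lmax x 8 / BAG. The VL table below is hypothetical, and Ethernet frame overhead (preamble, inter-frame gap) is ignored for simplicity.

```python
def vl_bandwidth_bps(lmax_bytes, bag_ms):
    """Worst-case bandwidth reserved by one AFDX virtual link:
    one frame of at most lmax_bytes every BAG milliseconds.
    BAG is a power of two between 1 and 128 ms (ARINC 664 Part 7)."""
    assert bag_ms in {1, 2, 4, 8, 16, 32, 64, 128}
    return lmax_bytes * 8 * 1000 // bag_ms

# Hypothetical VL configuration table: (Lmax bytes, BAG ms)
vls = [(1518, 2), (256, 8), (64, 16)]
total = sum(vl_bandwidth_bps(l, b) for l, b in vls)
fits = total <= 100_000_000  # does the allocation fit a 100 Mb/s link?
```

    Summing the per-VL reservations against the physical link rate is exactly the kind of static schedulability check that makes AFDX integration and validation tractable with common tools.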

  15. Combinative Particle Size Reduction Technologies for the Production of Drug Nanocrystals

    PubMed Central

    Salazar, Jaime; Müller, Rainer H.; Möschwitzer, Jan P.

    2014-01-01

    Nanosizing is a suitable method to enhance the dissolution rate and therefore the bioavailability of poorly soluble drugs. The success of the particle size reduction processes depends on critical factors such as the employed technology, equipment, and drug physicochemical properties. High pressure homogenization and wet bead milling are standard comminution techniques that have been already employed to successfully formulate poorly soluble drugs and bring them to market. However, these techniques have limitations in their particle size reduction performance, such as long production times and the necessity of employing a micronized drug as the starting material. This review article discusses the development of combinative methods, such as the NANOEDGE, H 96, H 69, H 42, and CT technologies. These processes were developed to improve the particle size reduction effectiveness of the standard techniques. These novel technologies can combine bottom-up and/or top-down techniques in a two-step process. The combinative processes lead in general to improved particle size reduction effectiveness. Faster production of drug nanocrystals and smaller final mean particle sizes are among the main advantages. The combinative particle size reduction technologies are very useful formulation tools, and they will continue acquiring importance for the production of drug nanocrystals. PMID:26556191

  16. A New Paradigm for Tissue Diagnostics: Tools and Techniques to Standardize Tissue Collection, Transport, and Fixation.

    PubMed

    Bauer, Daniel R; Otter, Michael; Chafin, David R

    2018-01-01

    Studying and developing preanalytical tools and technologies for the purpose of obtaining high-quality samples for histological assays is a growing field. Currently, there does not exist a standard practice for collecting, fixing, and monitoring these precious samples. There has been some advancement in standardizing collection for the highest profile tumor types, such as breast, where HER2 testing drives therapeutic decisions. This review examines the area of tissue collection, transport, and monitoring of formalin diffusion and details a prototype system that could be used to help standardize tissue collection efforts. We have surveyed recent primary literature sources and conducted several site visits to understand the most error-prone processes in histology laboratories. This effort identified errors that resulted from sample collection techniques and subsequent transport delays from the operating room (OR) to the histology laboratories. We have therefore devised a prototype sample collection and transport concept. The system consists of a custom data logger and cold transport box and takes advantage of a novel cold + warm (named 2 + 2) fixation method. This review highlights the beneficial aspects of standardizing tissue collection, fixation, and monitoring. In addition, a prototype system is introduced that could help standardize these processes and is compatible with use directly in the OR and from remote sites.

  17. NextGen Technologies on the FAA's Standard Terminal Automation Replacement System

    NASA Technical Reports Server (NTRS)

    Witzberger, Kevin; Swenson, Harry; Martin, Lynne; Lin, Melody; Cheng, Jinn-Hwei

    2014-01-01

    This paper describes the integration, evaluation, and results from a high-fidelity human-in-the-loop (HITL) simulation of key NASA Air Traffic Management Technology Demonstration - 1 (ATD- 1) technologies implemented in an enhanced version of the FAA's Standard Terminal Automation Replacement System (STARS) platform. These ATD-1 technologies include: (1) a NASA enhanced version of the FAA's Time-Based Flow Management, (2) a NASA ground-based automation technology known as controller-managed spacing (CMS), and (3) a NASA advanced avionics airborne technology known as flight-deck interval management (FIM). These ATD-1 technologies have been extensively tested in large-scale HITL simulations using general-purpose workstations to study air transportation technologies. These general purpose workstations perform multiple functions and are collectively referred to as the Multi-Aircraft Control System (MACS). Researchers at NASA Ames Research Center and Raytheon collaborated to augment the STARS platform by including CMS and FIM advisory tools to validate the feasibility of integrating these automation enhancements into the current FAA automation infrastructure. NASA Ames acquired three STARS terminal controller workstations, and then integrated the ATD-1 technologies. HITL simulations were conducted to evaluate the ATD-1 technologies when using the STARS platform. These results were compared with the results obtained when the ATD-1 technologies were tested in the MACS environment. Results collected from the numerical data show acceptably minor differences, and, together with the subjective controller questionnaires showing a trend towards preferring STARS, validate the ATD-1/STARS integration.

  18. IgSimulator: a versatile immunosequencing simulator.

    PubMed

    Safonova, Yana; Lapidus, Alla; Lill, Jennie

    2015-10-01

    The recent introduction of next-generation sequencing technologies to antibody studies has resulted in a growing number of immunoinformatics tools for antibody repertoire analysis. However, benchmarking these newly emerging tools remains problematic, since the gold-standard datasets needed to validate them are typically not available. Since simulating antibody repertoires is often the only feasible way to benchmark new immunoinformatics tools, we developed the IgSimulator tool, which addresses various complications in generating realistic antibody repertoires. IgSimulator's code has a modular structure and can be easily adapted to new simulation requirements. IgSimulator is open source and freely available as a C++ and Python program running on all Unix-compatible platforms. The source code is available from yana-safonova.github.io/ig_simulator. Contact: safonova.yana@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
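    A toy version of what such a simulator must do, greatly simplified relative to IgSimulator (whose germline segment libraries and somatic hypermutation model are far richer), is to splice random V, D, and J segments with random junction insertions and point mutations:

```python
import random

def simulate_read(v_genes, d_genes, j_genes, mut_rate=0.02, rng=None):
    """Toy antibody-read simulator: concatenate random V, D, and J
    segments with short random junction insertions, then apply point
    mutations as a crude stand-in for somatic hypermutation.
    Illustrative only, not IgSimulator's actual procedure."""
    rng = rng or random.Random(0)  # seeded for reproducibility
    seq = (rng.choice(v_genes)
           + "".join(rng.choice("ACGT") for _ in range(rng.randint(0, 6)))
           + rng.choice(d_genes)
           + "".join(rng.choice("ACGT") for _ in range(rng.randint(0, 6)))
           + rng.choice(j_genes))
    # Point mutations: each base is resampled with probability mut_rate.
    return "".join(rng.choice("ACGT") if rng.random() < mut_rate else c
                   for c in seq)
```

    Because the ground-truth V/D/J assignment of every simulated read is known, such output can serve as the benchmarking "gold standard" the abstract says real data cannot provide.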

  19. Validation of next generation sequencing technologies in comparison to current diagnostic gold standards for BRAF, EGFR and KRAS mutational analysis.

    PubMed

    McCourt, Clare M; McArt, Darragh G; Mills, Ken; Catherwood, Mark A; Maxwell, Perry; Waugh, David J; Hamilton, Peter; O'Sullivan, Joe M; Salto-Tellez, Manuel

    2013-01-01

    Next Generation Sequencing (NGS) has the potential of becoming an important tool in clinical diagnosis and therapeutic decision-making in oncology owing to its enhanced sensitivity in DNA mutation detection, fast-turnaround of samples in comparison to current gold standard methods and the potential to sequence a large number of cancer-driving genes at the one time. We aim to test the diagnostic accuracy of current NGS technology in the analysis of mutations that represent current standard-of-care, and its reliability to generate concomitant information on other key genes in human oncogenesis. Thirteen clinical samples (8 lung adenocarcinomas, 3 colon carcinomas and 2 malignant melanomas) already genotyped for EGFR, KRAS and BRAF mutations by current standard-of-care methods (Sanger Sequencing and q-PCR), were analysed for detection of mutations in the same three genes using two NGS platforms and an additional 43 genes with one of these platforms. The results were analysed using closed platform-specific proprietary bioinformatics software as well as open third party applications. Our results indicate that the existing format of the NGS technology performed well in detecting the clinically relevant mutations stated above but may not be reliable for a broader unsupervised analysis of the wider genome in its current design. Our study represents a diagnostically lead validation of the major strengths and weaknesses of this technology before consideration for diagnostic use.
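    The core validation arithmetic in a study like this, agreement of NGS calls with the gold-standard genotype, can be sketched as follows; the per-sample boolean call layout is a simplifying assumption, not the paper's data format:

```python
def concordance(ngs_calls, gold_calls):
    """Agreement between NGS mutation calls and gold-standard calls
    (e.g. Sanger sequencing / q-PCR), plus sensitivity and specificity
    for the mutant class. Each call is True if a mutation is present."""
    pairs = list(zip(ngs_calls, gold_calls))
    tp = sum(1 for n, g in pairs if n and g)
    tn = sum(1 for n, g in pairs if not n and not g)
    fp = sum(1 for n, g in pairs if n and not g)
    fn = sum(1 for n, g in pairs if not n and g)
    return {
        "concordance": (tp + tn) / len(pairs),
        "sensitivity": tp / (tp + fn) if tp + fn else None,
        "specificity": tn / (tn + fp) if tn + fp else None,
    }
```

    On hypothetical data with one false positive out of four samples, concordance is 0.75 while sensitivity stays at 1.0, which is why diagnostic validations report these metrics separately.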

  20. Portable Diagnostics Technology Assessment for Space Missions. Part 2; Market Survey

    NASA Technical Reports Server (NTRS)

    Nelson, Emily S.; Chait, Arnon

    2010-01-01

    A mission to Mars of several years duration requires more demanding standards for all onboard instruments than a 6-month mission to the Moon or the International Space Station. In Part 1, we evaluated generic technologies and suitability to NASA needs. This prior work considered crew safety, device maturity and flightworthiness, resource consumption, and medical value. In Part 2, we continue the study by assessing the current marketplace for reliable Point-of-Care diagnostics. The ultimate goal of this project is to provide a set of objective analytical tools to suggest efficient strategies for reaching specific medical targets for any given space mission as program needs, technological development, and scientific understanding evolve.

  1. Development of a site analysis tool for distributed wind projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, Shawn

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool’s interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.
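    The energy-output side of such a site analysis can be caricatured in a few lines: weight a turbine power curve by an assumed wind-speed distribution (a Rayleigh distribution here) to estimate annual energy production. This is only a sketch of the general method; DSAT's actual site models (obstructions, terrain, shear) are not described in this record.

```python
import math

def aep_kwh(power_curve, mean_speed, hours=8760):
    """Rough annual energy production for a small wind turbine:
    weight a power curve {wind speed m/s: power kW} by a Rayleigh
    wind-speed distribution with the given mean speed. The power
    curve points are treated as 1 m/s bins."""
    def rayleigh_pdf(v):
        return (math.pi * v / (2 * mean_speed ** 2)
                * math.exp(-math.pi * v * v / (4 * mean_speed ** 2)))
    return hours * sum(p * rayleigh_pdf(v) for v, p in power_curve.items())
```

    A windier site (higher mean speed) yields more energy for an increasing power curve, which is the basic feasibility signal a site analysis tool reports.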

  2. Enhancing Lay Counselor Capacity to Improve Patient Outcomes with Multimedia Technology.

    PubMed

    Robbins, Reuben N; Mellins, Claude A; Leu, Cheng-Shiun; Rowe, Jessica; Warne, Patricia; Abrams, Elaine J; Witte, Susan; Stein, Dan J; Remien, Robert H

    2015-06-01

    Multimedia technologies offer powerful tools to increase capacity of health workers to deliver standardized, effective, and engaging antiretroviral medication adherence counseling. Masivukeni is an innovative multimedia-based, computer-driven, lay counselor-delivered intervention designed to help people living with HIV in resource-limited settings achieve optimal adherence. This pilot study examined medication adherence and key psychosocial outcomes among 55 non-adherent South African HIV+ patients, on antiretroviral therapy (ART) for at least 6 months, who were randomized to receive either Masivukeni or standard of care (SOC) counseling for ART non-adherence. At baseline, there were no significant differences between the SOC and Masivukeni groups on any outcome variables. At post-intervention (approximately 5-6 weeks after baseline), clinic-based pill count adherence data available for 20 participants (10 per intervention arm) showed a 10% improvement for Masivukeni participants and a decrease of 8% for SOC participants. Masivukeni participants reported significantly more positive attitudes towards disclosure and medication social support, less social rejection, and better clinic-patient relationships than did SOC participants. Masivukeni shows promise to promote optimal adherence and provides preliminary evidence that multimedia, computer-based technology can help lay counselors offer better adherence counseling than standard approaches.

  3. Enhancing Lay Counselor Capacity to Improve Patient Outcomes with Multimedia Technology

    PubMed Central

    Robbins, Reuben N.; Mellins, Claude A.; Leu, Cheng-Shiun; Rowe, Jessica; Warne, Patricia; Abrams, Elaine J.; Witte, Susan; Stein, Dan J.; Remien, Robert H.

    2015-01-01

    Multimedia technologies offer powerful tools to increase capacity of health workers to deliver standardized, effective, and engaging antiretroviral medication adherence counseling. Masivukeni is an innovative multimedia-based, computer-driven, lay counselor-delivered intervention designed to help people living with HIV in resource-limited settings achieve optimal adherence. This pilot study examined medication adherence and key psychosocial outcomes among 55 non-adherent South African HIV+ patients, on ART for at least 6 months, who were randomized to receive either Masivukeni or standard of care (SOC) counseling for ART non-adherence. At baseline, there were no significant differences between the SOC and Masivukeni groups on any outcome variables. At post-intervention (approximately 5–6 weeks after baseline), clinic-based pill count adherence data available for 20 participants (10 per intervention arm) showed a 10% improvement for Masivukeni participants and a decrease of 8% for SOC participants. Masivukeni participants reported significantly more positive attitudes towards disclosure and medication social support, less social rejection, and better clinic-patient relationships than did SOC participants. Masivukeni shows promise to promote optimal adherence and provides preliminary evidence that multimedia, computer-based technology can help lay counselors offer better adherence counseling than standard approaches. PMID:25566763

  4. An evaluation of remote sensing technologies for the detection of fugitive contamination at selected Superfund hazardous waste sites in Pennsylvania

    USGS Publications Warehouse

    Slonecker, E. Terrence; Fisher, Gary B.

    2014-01-01

    This evaluation was conducted to assess the potential for using both traditional remote sensing, such as aerial imagery, and emerging remote sensing technology, such as hyperspectral imaging, as tools for postclosure monitoring of selected hazardous waste sites. Sixteen deleted Superfund (SF) National Priorities List (NPL) sites in Pennsylvania were imaged with a Civil Air Patrol (CAP) Airborne Real-Time Cueing Hyperspectral Enhanced Reconnaissance (ARCHER) sensor between 2009 and 2012. Deleted sites are those sites that have been remediated and removed from the NPL. The imagery was processed to radiance and atmospherically corrected to relative reflectance with standard software routines using the Environment for Visualizing Imagery (ENVI, ITT–VIS, Boulder, Colorado) software. Standard routines for anomaly detection, endmember collection, vegetation stress, and spectral analysis were applied.
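    One of the standard vegetation-stress screens such workflows rely on is a reflectance index like NDVI; the record does not name the specific routines applied to the ARCHER imagery, so this is purely illustrative:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and
    red reflectance: healthy vegetation reflects strongly in the NIR,
    so stressed or sparse vegetation scores lower on the -1..1 scale."""
    return (nir - red) / (nir + red)

healthy = ndvi(0.50, 0.08)   # dense canopy: high NDVI
stressed = ndvi(0.35, 0.15)  # stressed canopy: lower NDVI
```

    Pixels whose index drops relative to their surroundings are candidates for follow-up, which is how spectral screens can flag fugitive contamination at remediated sites.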

  5. Nursing informatics, outcomes, and quality improvement.

    PubMed

    Charters, Kathleen G

    2003-08-01

    Nursing informatics actively supports nursing by providing standard language systems, databases, decision support, readily accessible research results, and technology assessments. Through normalized datasets spanning an entire enterprise or other large demographic, nursing informatics tools support improvement of healthcare by answering questions about patient outcomes and quality improvement on an enterprise scale, and by providing documentation for business process definition, business process engineering, and strategic planning. Nursing informatics tools provide a way for advanced practice nurses to examine their practice and the effect of their actions on patient outcomes. Analysis of patient outcomes may lead to initiatives for quality improvement. Supported by nursing informatics tools, successful advance practice nurses leverage their quality improvement initiatives against the enterprise strategic plan to gain leadership support and resources.

  6. Software-assisted stacking of gene modules using GoldenBraid 2.0 DNA-assembly framework.

    PubMed

    Vazquez-Vilar, Marta; Sarrion-Perdigones, Alejandro; Ziarsolo, Peio; Blanca, Jose; Granell, Antonio; Orzaez, Diego

    2015-01-01

    GoldenBraid (GB) is a modular DNA assembly technology for plant multigene engineering based on type IIS restriction enzymes. GB speeds up the assembly of transcriptional units from standard genetic parts and facilitates the stacking of several genes within the same T-DNA in a few days. GB cloning is software-assisted with a set of online tools. The GB Domesticator tool assists in the adaptation of DNA parts to the GB standard. The combination of GB-adapted parts to build new transcriptional units is assisted by the GB TU Assembler tool. Finally, the assembly of multigene modules is simulated by the GB Binary Assembler. All the software tools are available at www.gbcloning.org. Here, we describe in detail the assembly methodology to create a multigene construct with three transcriptional units for polyphenol metabolic engineering in plants.
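    The junction logic behind type IIS assembly can be sketched in a few lines: parts ligate in order only where adjacent 4-nt overhangs match. The part format and overhang sequences below are hypothetical illustrations, not the actual GB software interface:

```python
def assemble(parts):
    """Toy type-IIS (Golden Gate-style) assembly check: each part is
    (left_overhang, sequence, right_overhang). Parts join in order
    only if each right overhang equals the next part's left overhang,
    mimicking the directional assembly GoldenBraid builds on."""
    for (_, _, right), (left, _, _) in zip(parts, parts[1:]):
        if right != left:
            return None  # incompatible junction: no assembly forms
    return "".join(seq for _, seq, _ in parts)
```

    This is why standardized overhangs matter: any promoter part ending in the CDS-start overhang can be combined with any CDS part, which is what makes software-assisted combination of GB-adapted parts tractable.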

  7. Automatic Generation of OpenMP Directives and Its Application to Computational Fluid Dynamics Codes

    NASA Technical Reports Server (NTRS)

    Yan, Jerry; Jin, Haoqiang; Frumkin, Michael; Yan, Jerry (Technical Monitor)

    2000-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared-memory parallel computers. With advances in hardware and software technologies, the performance of parallel programs using compiler directives has improved substantially. The introduction of OpenMP directives, the industry standard for shared-memory programming, has minimized the issue of portability. In this study, we have extended CAPTools, a computer-aided parallelization toolkit, to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline techniques used in the implementation of the tool and discuss the application of this tool to the NAS Parallel Benchmarks and several computational fluid dynamics codes. This work demonstrates the great potential of using the tool to quickly port parallel programs and also achieve good performance that exceeds that of some commercial tools.

  8. APPLICATION AND DEVELOPMENT OF APPROPRIATE TOOLS AND TECHNOLOGIES FOR COST-EFFECTIVE CARBON SEQUESTRATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bill Stanley; Sandra Brown; Ellen Hawes

    2003-09-01

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas impacts. The research described in this report occurred between July 1, 2002 and June 30, 2003. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: advanced videography testing; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.
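    The inventory arithmetic underlying carbon-offset estimates of this kind can be sketched in one line: dry biomass times a carbon fraction, converted to CO2 equivalents by the 44/12 molar-mass ratio. This is a generic sketch of Task 1-style inventory math, not the project's actual tool.

```python
def co2e_tonnes(dry_biomass_t, carbon_fraction=0.47):
    """CO2-equivalent sequestration implied by a biomass stock:
    dry biomass x carbon fraction (~0.47 is a common default for
    woody biomass), scaled by 44/12 (molar mass of CO2 over C)."""
    return dry_biomass_t * carbon_fraction * 44.0 / 12.0

offset = co2e_tonnes(1000.0)  # 1,000 dry tonnes of woody biomass
```

    Standardizing the measurement of the dry-biomass term, via plots, allometry, or the videography mentioned in Task 2, is where most of the cost and uncertainty in a project inventory lies.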

  9. Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bill Stanley; Sandra Brown; Patrick Gonzalez

    2004-07-10

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas impacts. The research described in this report occurred between July 1, 2002 and June 30, 2003. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: remote sensing for carbon analysis; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.

  10. Secondary Organic Aerosol Production from Gasoline Vehicle Exhaust: Effects of Engine Technology, Cold Start, and Emission Certification Standard.

    PubMed

    Zhao, Yunliang; Lambe, Andrew T; Saleh, Rawad; Saliba, Georges; Robinson, Allen L

    2018-02-06

    Secondary organic aerosol (SOA) formation from dilute exhaust from 16 gasoline vehicles was investigated using a potential aerosol mass (PAM) oxidation flow reactor during chassis dynamometer testing using the cold-start unified cycle (UC). Ten vehicles were equipped with gasoline direct injection engines (GDI vehicles) and six with port fuel injection engines (PFI vehicles), certified to a wide range of emissions standards. We measured similar SOA production from GDI and PFI vehicles certified to the same emissions standard; less SOA production from vehicles certified to stricter emissions standards; and, after accounting for differences in gas-particle partitioning, similar effective SOA yields across different engine technologies and certification standards. Therefore, the ongoing, dramatic shift from PFI to GDI vehicles in the United States should not alter the contribution of gasoline vehicles to ambient SOA, and the natural replacement of older vehicles with newer ones certified to stricter emissions standards should reduce atmospheric SOA levels. Compared to hot operations, cold-start exhaust had lower effective SOA yields, but still contributed more SOA overall because of substantially higher organic gas emissions. We demonstrate that the PAM reactor can be used as a screening tool for vehicle SOA production by carefully accounting for the effects of the large variations in emission rates.
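    The cold-start conclusion follows from simple arithmetic: SOA production scales as organic-gas emissions times effective yield, so a lower cold-start yield can still dominate when emissions are much higher. The numbers below are hypothetical, chosen only to illustrate the trade-off:

```python
def soa_production(organic_gas_mg_per_km, effective_yield):
    """SOA produced per km = organic-gas emission rate x effective SOA
    yield (mass of aerosol formed per mass of gas reacted, assuming
    the precursor is fully oxidized in the PAM reactor)."""
    return organic_gas_mg_per_km * effective_yield

cold = soa_production(120.0, 0.08)  # cold start: high emissions, lower yield
hot = soa_production(15.0, 0.15)    # hot running: low emissions, higher yield
```

    With these illustrative values the cold-start contribution is roughly four times the hot-running one despite the lower yield, mirroring the abstract's finding.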

  11. Spreadsheets for Analyzing and Optimizing Space Missions

    NASA Technical Reports Server (NTRS)

    Some, Raphael R.; Agrawal, Anil K.; Czikmantory, Akos J.; Weisbin, Charles R.; Hua, Hook; Neff, Jon M.; Cowdin, Mark A.; Lewis, Brian S.; Iroz, Juana; Ross, Rick

    2009-01-01

    XCALIBR (XML Capability Analysis LIBRary) is a set of Extensible Markup Language (XML) database and spreadsheet-based analysis software tools designed to assist in technology-return-on-investment analysis and optimization of technology portfolios pertaining to outer-space missions. XCALIBR is also being examined for use in planning, tracking, and documentation of projects. An XCALIBR database contains information on mission requirements and technological capabilities, which are related by use of an XML taxonomy. XCALIBR incorporates a standardized interface for exporting data and analysis templates to an Excel spreadsheet. Unique features of XCALIBR include the following: It is inherently hierarchical by virtue of its XML basis. The XML taxonomy codifies a comprehensive data structure and data dictionary that includes performance metrics for spacecraft, sensors, and spacecraft systems other than sensors. The taxonomy contains >700 nodes representing all levels, from system through subsystem to individual parts. All entries are searchable and machine readable. There is an intuitive Web-based user interface. The software automatically matches technologies to mission requirements. The software automatically generates, and makes the required entries in, an Excel return-on-investment analysis software tool. The results of an analysis are presented in both tabular and graphical displays.
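    The automatic matching of technologies to mission requirements can be sketched as a filter over taxonomy nodes and performance metrics; the record layout below (node, metric) is hypothetical, not XCALIBR's actual schema:

```python
def match(technologies, requirement):
    """Toy capability-to-requirement matching: keep technologies that
    sit on the required taxonomy node and meet the minimum performance
    metric, a simplified version of what XCALIBR automates."""
    return [t["name"] for t in technologies
            if t["node"] == requirement["node"]
            and t["metric"] >= requirement["min_metric"]]

sensors = [
    {"name": "imager-a", "node": "sensor/optical", "metric": 0.8},
    {"name": "imager-b", "node": "sensor/optical", "metric": 0.4},
    {"name": "radar-a", "node": "sensor/radar", "metric": 0.9},
]
need = {"node": "sensor/optical", "min_metric": 0.5}
hits = match(sensors, need)  # only imager-a meets the requirement
```

    Because the taxonomy is hierarchical, a production system would match against subtrees rather than exact node strings; exact matching keeps the sketch short.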

  12. Current progress in patient-specific modeling

    PubMed Central

    2010-01-01

    We present a survey of recent advancements in the emerging field of patient-specific modeling (PSM). Researchers in this field are currently simulating a wide variety of tissue and organ dynamics to address challenges in various clinical domains. The majority of this research employs three-dimensional, image-based modeling techniques. Recent PSM publications mostly represent feasibility or preliminary validation studies on modeling technologies, and these systems will require further clinical validation and usability testing before they can become a standard of care. We anticipate that with further testing and research, PSM-derived technologies will eventually become valuable, versatile clinical tools. PMID:19955236

  13. Lessons Learned from Conducting a K-12 Project to Revitalize Achievement by Using Instrumentation in Science Education

    ERIC Educational Resources Information Center

    Kapila, Vikram; Iskander, Magued

    2014-01-01

    A student's first introduction to engineering and technology is typically through high school science labs. Unfortunately, in many high schools, science labs often make use of antiquated tools that fail to deliver exciting lab content. As a result, many students are turned off by science, fail to excel on standardized science exams, and do not…

  14. Repurposing mainstream CNC machine tools for laser-based additive manufacturing

    NASA Astrophysics Data System (ADS)

    Jones, Jason B.

    2016-04-01

    The advent of laser technology has been a key enabler for industrial 3D printing, known as Additive Manufacturing (AM). Despite its commercial success and unique technical capabilities, laser-based AM systems are not yet able to produce parts with the same accuracy and surface finish as CNC machining. To enable the geometry and material freedoms afforded by AM, yet achieve the precision and productivity of CNC machining, hybrid combinations of these two processes have started to gain traction. To achieve the benefits of combined processing, laser technology has been integrated into mainstream CNC machines, effectively repurposing them as hybrid manufacturing platforms. This paper reviews how this engineering challenge has prompted beam delivery innovations to allow automated changeover between laser processing and machining, using standard CNC tool changers. Handling laser-processing heads with the tool changer also enables automated changeover between different types of laser-processing heads, further expanding the breadth of laser-processing flexibility in a hybrid CNC. This paper highlights the development, challenges and future impact of hybrid CNCs on laser processing.

  15. A methodology and decision support tool for informing state-level bioenergy policymaking: New Jersey biofuels as a case study

    NASA Astrophysics Data System (ADS)

    Brennan-Tonetta, Margaret

    This dissertation seeks to provide key information and a decision support tool that states can use to support long-term goals of fossil fuel displacement and greenhouse gas reductions. The research yields three outcomes: (1) A methodology that allows for a comprehensive and consistent inventory and assessment of bioenergy feedstocks in terms of type, quantity, and energy potential. Development of a standardized methodology for consistent inventorying of biomass resources fosters research and business development of promising technologies that are compatible with the state's biomass resource base. (2) A unique interactive decision support tool that allows for systematic bioenergy analysis and evaluation of policy alternatives through the generation of biomass inventory and energy potential data for a wide variety of feedstocks and applicable technologies, using New Jersey as a case study. Development of a database that can assess the major components of a bioenergy system in one tool allows for easy evaluation of technology, feedstock, and policy options. The methodology and decision support tool are applicable to other states and regions (with location-specific modifications), thus contributing to the achievement of state and federal goals of renewable energy utilization. (3) Development of policy recommendations based on the results of the decision support tool that will help to guide New Jersey into a sustainable renewable energy future. The database developed in this research represents the first ever assessment of bioenergy potential for New Jersey. It can serve as a foundation for future research and modifications that could increase its power as a more robust policy analysis tool. As such, the current database is not able to perform analysis of tradeoffs across broad policy objectives such as economic development vs. CO2 emissions, or energy independence vs. source reduction of solid waste. Instead, it operates one level below that, with comparisons of kWh or GGE generated by different feedstock/technology combinations at the state and county level. Modification of the model to incorporate factors that enable the analysis of broader energy policy issues, such as those mentioned above, is recommended for future research.
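    The core comparison the tool performs at the state and county level can be sketched as a ranking of feedstock/technology pairs by annual energy potential. This is an illustrative sketch only: the feedstock names, tonnages, and kWh-per-ton conversion factors below are hypothetical placeholders, not values from the New Jersey study.

```python
# Illustrative sketch of the tool's core comparison: ranking
# feedstock/technology combinations by annual energy potential.
# All names, tonnages, and kWh-per-ton factors are hypothetical
# placeholders, not values from the New Jersey study.

# (feedstock, technology) -> (available tons/year, net kWh per ton)
combos = {
    ("corn stover", "anaerobic digestion"): (120_000, 250.0),
    ("corn stover", "gasification"): (120_000, 900.0),
    ("food waste", "anaerobic digestion"): (80_000, 400.0),
}

def annual_kwh(tons_per_year, kwh_per_ton):
    """Energy potential of one feedstock/technology combination."""
    return tons_per_year * kwh_per_ton

# Rank combinations, as the decision support tool does per state/county.
ranked = sorted(
    ((annual_kwh(t, k), fs, tech) for (fs, tech), (t, k) in combos.items()),
    reverse=True,
)
for kwh, fs, tech in ranked:
    print(f"{fs} + {tech}: {kwh:,.0f} kWh/yr")
```

    The same table structure extends naturally to GGE instead of kWh, and to per-county rather than statewide inventories, which is the level at which the dissertation's database operates.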

  16. Enriching and improving the quality of linked data with GIS

    NASA Astrophysics Data System (ADS)

    Iwaniak, Adam; Kaczmarek, Iwona; Strzelecki, Marek; Lukowicz, Jaromar; Jankowski, Piotr

    2016-06-01

    Standardization of methods for data exchange in GIS has a long history predating the creation of the World Wide Web. The advent of the World Wide Web brought the emergence of new solutions for data exchange and sharing, including, more recently, standards proposed by the W3C for data exchange involving Semantic Web technologies and linked data. Despite the growing interest in integration, GIS and linked data are still two separate paradigms for describing and publishing spatial data on the Web. At the same time, both paradigms offer complementary ways of representing real-world phenomena and means of analysis using different processing functions. The complementarity of linked data and GIS can be leveraged to synergize both paradigms, resulting in richer data content and more powerful inferencing. The article presents an approach aimed at integrating linked data with GIS. The approach relies on the use of GIS tools for the integration, verification, and enrichment of linked data. The GIS tools are employed to enrich linked data by furnishing access to collections of data resources, defining relationships between data resources, and subsequently facilitating GIS data integration with linked data. The proposed approach is demonstrated with examples using data from DBpedia, OSM, and tools developed by the authors for standard GIS software.

  17. Current limitations into the application of virtual reality to mental health research.

    PubMed

    Huang, M P; Alessi, N E

    1998-01-01

    Virtual Reality (VR) environments have significant potential as a tool in mental health research, but are limited by technical factors and by mental health research factors. Technical difficulties include cost and complexity of virtual environment creation. Mental health research difficulties include current inadequacy of standards to specify needed details for virtual environment design. Technical difficulties are disappearing with technological advances, but the mental health research difficulties will take a concerted effort to overcome. Some of this effort will need to be directed at the formation of collaborative projects and standards for how such collaborations should proceed.

  18. Optical levitation of a mirror for reaching the standard quantum limit.

    PubMed

    Michimura, Yuta; Kuwahara, Yuya; Ushiba, Takafumi; Matsumoto, Nobuyuki; Ando, Masaki

    2017-06-12

    We propose a new method to optically levitate a macroscopic mirror with two vertical Fabry-Pérot cavities linearly aligned. This configuration gives the simplest possible optical levitation in which the number of laser beams used is the minimum of two. We demonstrate that reaching the standard quantum limit (SQL) of a displacement measurement with our system is feasible with current technology. The cavity geometry and the levitated mirror parameters are designed to ensure that the Brownian vibration of the mirror surface is smaller than the SQL. Our scheme provides a promising tool for testing macroscopic quantum mechanics.
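    For context, the standard quantum limit referenced in this abstract has a well-known textbook form for the displacement power spectral density of a free mass; the expression below is that general form, not a result taken from this paper:

```latex
% SQL power spectral density for the displacement of a free mass m
% probed at angular frequency \Omega:
S_x^{\mathrm{SQL}}(\Omega) = \frac{2\hbar}{m\,\Omega^2}
```

    A lighter mirror and a lower measurement frequency raise this limit, which is why keeping the Brownian vibration of the mirror surface below the SQL constrains the choice of cavity geometry and mirror parameters.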

  19. Optical levitation of a mirror for reaching the standard quantum limit

    NASA Astrophysics Data System (ADS)

    Michimura, Yuta; Kuwahara, Yuya; Ushiba, Takafumi; Matsumoto, Nobuyuki; Ando, Masaki

    2017-06-01

    We propose a new method to optically levitate a macroscopic mirror with two vertical Fabry-Pérot cavities linearly aligned. This configuration gives the simplest possible optical levitation in which the number of laser beams used is the minimum of two. We demonstrate that reaching the standard quantum limit (SQL) of a displacement measurement with our system is feasible with current technology. The cavity geometry and the levitated mirror parameters are designed to ensure that the Brownian vibration of the mirror surface is smaller than the SQL. Our scheme provides a promising tool for testing macroscopic quantum mechanics.

  20. Direct write fabrication of waveguides and interconnects for optical printed wiring boards

    NASA Astrophysics Data System (ADS)

    Dingeldein, Joseph C.

    Current copper based circuit technology is becoming a limiting factor in high speed data transfer applications as processors are improving at a faster rate than are developments to increase on board data transfer. One solution is to utilize optical waveguide technology to overcome these bandwidth and loss restrictions. The use of this technology virtually eliminates the heat and cross-talk loss seen in copper circuitry, while also operating at a higher bandwidth. Transitioning current fabrication techniques from small scale laboratory environments to large scale manufacturing presents significant challenges. Optical-to-electrical connections and out-of-plane coupling are significant hurdles in the advancement of optical interconnects. The main goals of this research are the development of direct write material deposition and patterning tools for the fabrication of waveguide systems on large substrates, and the development of out-of-plane coupler components compatible with standard fiber optic cabling. Combining these elements with standard printed circuit boards allows for the fabrication of fully functional optical-electrical-printed-wiring-boards (OEPWBs). A direct dispense tool was designed, assembled, and characterized for the repeatable dispensing of blanket waveguide layers over a range of thicknesses (25-225 μm), eliminating waste material and affording the ability to utilize large substrates. This tool was used to directly dispense multimode waveguide cores which required no UV definition or development. These cores had circular cross sections and were comparable in optical performance to lithographically fabricated square waveguides. Laser direct writing is a non-contact process that allows for the dynamic UV patterning of waveguide material on large substrates, eliminating the need for high resolution masks. 
A laser direct write tool was designed, assembled, and characterized for direct-write patterning of waveguides that were comparable in quality to those produced using standard lithographic practices (0.047 dB/cm loss for laser-written waveguides compared to 0.043 dB/cm for lithographic waveguides). Straight waveguides and waveguide turns were patterned at multimode and single-mode sizes, and the process was characterized and documented. Support structures such as angled reflectors and vertical posts were produced, showing the versatility of the laser direct write tool. Commercially available components were implanted into the optical layer for out-of-plane routing of the optical signals. These devices featured spherical lenses on the input and output sides of a total internal reflection (TIR) mirror, as well as alignment pins compatible with standard MT design. Fully functional OEPWBs were fabricated featuring input and output out-of-plane optical signal routing with total optical losses not exceeding 10 dB. These prototypes survived thermal cycling (-40°C to 85°C) and humidity exposure (95±4% humidity), showing minimal degradation in optical performance. Operational failure occurred after environmental aging life testing at 110°C for 216 hours.
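    The quoted propagation losses translate into link budgets in a straightforward way; the short sketch below applies the reported per-centimeter losses to a hypothetical 10 cm waveguide (the length is an assumption for illustration, not a value from the dissertation).

```python
def total_loss_db(loss_db_per_cm, length_cm):
    """Accumulated propagation loss along a waveguide of a given length."""
    return loss_db_per_cm * length_cm

def transmitted_fraction(loss_db):
    """Convert a loss in dB to the linear fraction of power transmitted."""
    return 10 ** (-loss_db / 10)

length_cm = 10.0  # hypothetical link length, for illustration only
for name, per_cm in [("laser-written", 0.047), ("lithographic", 0.043)]:
    loss = total_loss_db(per_cm, length_cm)
    print(f"{name}: {loss:.2f} dB over {length_cm:g} cm, "
          f"{transmitted_fraction(loss) * 100:.1f}% of power transmitted")
```

    At these loss levels the two fabrication routes differ by only about a percent of transmitted power over board-scale distances, consistent with the claim that the laser-written waveguides are comparable in quality to lithographic ones.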

  1. Post place and route design-technology co-optimization for scaling at single-digit nodes with constant ground rules

    NASA Astrophysics Data System (ADS)

    Mattii, Luca; Milojevic, Dragomir; Debacker, Peter; Berekovic, Mladen; Sherazi, Syed Muhammad Yasser; Chava, Bharani; Bardon, Marie Garcia; Schuddinck, Pieter; Rodopoulos, Dimitrios; Baert, Rogier; Gerousis, Vassilios; Ryckaert, Julien; Raghavan, Praveen

    2018-01-01

    Standard-cell design, technology choices, and place and route (P&R) efficiency are deeply interrelated in CMOS technology nodes below 10 nm, where cells with fewer tracks and higher pin densities pose increasingly challenging problems to the router in terms of congestion and pin accessibility. To evaluate and downselect the best solutions, a holistic design-technology co-optimization approach leveraging state-of-the-art P&R tools is thus necessary. We adopt such an approach using the imec N7 technology platform, with a contacted poly pitch of 42 nm and a tightest metal pitch of 32 nm, by comparing post-P&R area of an IP block for different standard cell configurations, technology options, and cell heights. Keeping the technology node and the set of ground rules unchanged, we demonstrate that a careful combination of these solutions can enable area gains of up to 50%, comparable with the area benefits of migrating to another node. We further demonstrate that these area benefits can be achieved at isoperformance with >20% reduced power. As conventional scaling enacted through pitch reduction is made more and more challenging at the end of the CMOS roadmap by constraints imposed by lithography limits, material resistivity, manufacturability, and ultimately wafer cost, the approach shown here offers a valid, attractive, and low-cost alternative.

  2. Using Avatars to Model Weight Loss Behaviors: Participant Attitudes and Technology Development

    PubMed Central

    Napolitano, Melissa A.; Hayes, Sharon; Russo, Giuseppe; Muresu, Debora; Giordano, Antonio; Foster, Gary D.

    2013-01-01

    Background: Virtual reality and other avatar-based technologies are potential methods for demonstrating and modeling weight loss behaviors. This study examined avatar-based technology as a tool for modeling weight loss behaviors. Methods: This study consisted of two phases: (1) an online survey to obtain feedback about using avatars for modeling weight loss behaviors and (2) technology development and usability testing to create an avatar-based technology program for modeling weight loss behaviors. Results: Results of phase 1 (n = 128) revealed that interest was high, with 88.3% stating that they would participate in a program that used an avatar to help practice weight loss skills in a virtual environment. In phase 2, avatars and modules to model weight loss skills were developed. Eight women were recruited to participate in a 4-week usability test, with 100% reporting they would recommend the program and that it influenced their diet/exercise behavior. Most women (87.5%) indicated that the virtual models were helpful. After 4 weeks, average weight loss was 1.6 kg (standard deviation = 1.7). Conclusion: This investigation revealed a high level of interest in an avatar-based program, with formative work indicating promise. Given the high costs associated with in vivo exposure and practice, this study demonstrates the potential use of avatar-based technology as a tool for modeling weight loss behaviors. PMID:23911189

  3. A Web-based cost-effective training tool with possible application to brain injury rehabilitation.

    PubMed

    Wang, Peijun; Kreutzer, Ina Anna; Bjärnemo, Robert; Davies, Roy C

    2004-06-01

    Virtual reality (VR) has provoked enormous interest in the medical community. In particular, VR offers therapists new approaches for improving rehabilitation effects. However, most of these VR assistant tools are not very portable, extensible or economical. Due to the vast amount of 3D data, they are not suitable for Internet transfer. Furthermore, in order to run these VR systems smoothly, special hardware devices are needed. As a result, existing VR assistant tools tend to be available in hospitals but not in patients' homes. To overcome these disadvantages, as a case study, this paper proposes a Web-based Virtual Ticket Machine, called WBVTM, using VRML [VRML Consortium, The Virtual Reality Modeling Language: International Standard ISO/IEC DIS 14772-1, 1997, available at ], Java and EAI (External Authoring Interface) [Silicon Graphics, Inc., The External Authoring Interface (EAI), available at ], to help people with acquired brain injury (ABI) to relearn basic living skills at home at a low cost. As these technologies are open standard and feature usability on the Internet, WBVTM achieves the goals of portability, easy accessibility and cost-effectiveness.

  4. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov Websites

    NREL's Vehicle Technology Simulation and Analysis Tools support the evaluation of vehicle technologies with the potential to achieve significant fuel savings and emission reductions. Among them, the Automotive Deployment Options Projection Tool (ADOPT) is a modeling tool that estimates vehicle technology…

  5. Technical Progress Report on Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bill Stanley; Patrick Gonzalez; Sandra Brown

    2006-06-30

    The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1 and July 30, 2006. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of a new project software screening tool. Work is being carried out in Brazil, Belize, Chile, Peru, and the USA.

  6. A zero-footprint 3D visualization system utilizing mobile display technology for timely evaluation of stroke patients

    NASA Astrophysics Data System (ADS)

    Park, Young Woo; Guo, Bing; Mogensen, Monique; Wang, Kevin; Law, Meng; Liu, Brent

    2010-03-01

    When a patient suspected of stroke is admitted to the emergency room, time is of the utmost importance. The infarcted brain area suffers irreparable damage as soon as three hours after the onset of stroke symptoms. A CT scan is one of the standard first-line imaging investigations and is crucial to identify and properly triage stroke cases. The limited availability of an expert radiologist in the emergency environment to diagnose the stroke patient in a timely manner only adds to the challenges within the clinical workflow. Therefore, a truly zero-footprint web-based system with powerful advanced visualization tools for volumetric imaging, including 2D, MIP/MPR, and 3D display, can greatly facilitate this dynamic clinical workflow for stroke patients. Together with mobile technology, the proper visualization tools can be delivered at the point of decision, anywhere and anytime. We will present a small pilot project to evaluate the use of mobile technologies, using devices such as iPhones, in evaluating stroke patients. The results of the evaluation as well as any challenges in setting up the system will also be discussed.

  7. Metabolon, Inc.

    PubMed

    Ryals, John; Lawton, Kay; Stevens, Daniel; Milburn, Michael

    2007-07-01

    Metabolon is an emerging technology company developing proprietary analytical methods and software for biomarker discovery using metabolomics. The company's aim is to measure all small molecules (<1500 Da) in a biological sample. These small-molecule compounds include biochemicals of cellular metabolism and xenobiotics from diet and environment. Our proprietary mLIMS™ system contains advanced metabolomic software and automated data-processing tools that use a variety of data-analysis and quality-control algorithms to convert raw mass-spectrometry data to identified, quantitated compounds. Metabolon's primary focus is a fee-for-service business that exploits this technology for pharmaceutical and biotechnology companies, with additional clients in the consumer goods, cosmetics and agricultural industries. Fee-for-service studies are often collaborations with groups that employ a variety of technologies for biomarker discovery. Metabolon's goal is to develop technology that will automatically analyze any sample for the small-molecule components present and become a standard technology for applications in health and related sciences.

  8. Virtual microscopy and digital cytology: state of the art.

    PubMed

    Giansanti, Daniele; Grigioni, Mauro; D'Avenio, Giuseppe; Morelli, Sandra; Maccioni, Giovanni; Bondi, Arrigo; Giovagnoli, Maria Rosaria

    2010-01-01

    The paper examines a new technological scenario relevant to the introduction of digital cytology (D-CYT) into the health service. A detailed analysis of the state of the art of the introduction of D-CYT in the hospital and, more generally, across the dispersed territory has been conducted. The analysis was conducted in the form of a review and was arranged into two parts: the first part focused on the technological tools needed to carry out a successful service (client-server architectures, e-learning, quality assurance issues); the second part focused on issues oriented to help the introduction and evaluation of the technology (specific training in D-CYT, health technology assessment in routine application, data format standards and picture archiving and communication systems (PACS) implementation, image quality assessment, strategies of navigation, 3D virtual reality potentialities). The work highlights future scenarios of action relevant to the introduction of the technology.

  9. I-line stepper based overlay evaluation method for wafer bonding applications

    NASA Astrophysics Data System (ADS)

    Kulse, P.; Sasai, K.; Schulz, K.; Wietstruck, M.

    2018-03-01

    In the last decades, semiconductor technology has been driven by Moore's law, leading to high-performance CMOS technologies with feature sizes of less than 10 nm [1]. It has been pointed out that not only scaling but also the integration of novel components and technology modules into CMOS/BiCMOS technologies is becoming more attractive to realize smart and miniaturized systems [2]. Driven by new applications in the areas of communication, health, and automation, new components and technology modules such as BiCMOS-embedded RF-MEMS, high-Q passives, Si-based microfluidics, and InP-SiGe BiCMOS heterointegration have been demonstrated [3-6]. In contrast to standard VLSI processes fabricated on the front side of the silicon wafer, these new technology modules additionally require processing of the backside of the wafer and thus require an accurate alignment between the front and backside of the wafer. In previous work, an advanced back-to-front-side alignment technique and its implementation into IHP's 0.25/0.13 µm high-performance SiGe:C BiCMOS backside process module was presented [7]. The developed technique enables high-resolution and accurate lithography on the backside of BiCMOS wafers for additional backside processing. In addition to the aforementioned backside process technologies, new applications like Through-Silicon Vias (TSV) for interposers and advanced substrate technologies for 3D heterogeneous integration demand not only single-wafer fabrication but also processing of wafer stacks provided by temporary and permanent wafer bonding [8-9]. In this work, the non-contact infrared alignment system of the Nikon® i-line stepper NSR-SF150 is used for both the alignment and the overlay determination of bonded wafer stacks with embedded alignment marks, to achieve an accurate alignment between the different wafer sides. The embedded field image alignment (FIA) marks of the interface and the device wafer top layer are measured in a single measurement job.
By taking the offsets between all of the different FIAs into account, after correcting the wafer-rotation-induced FIA position errors, an overlay for the stacked wafers can be determined. The developed approach has been validated by a standard front side resist-in-resist experiment. After the successful validation of the developed technique, special wafer stacks with FIA alignment marks in the bonding interface were fabricated and exposed. The subsequent overlay calculation shows an overlay of less than 200 nm, which enables very accurate process conditions for highly scaled TSV integration and advanced substrate integration into IHP's 0.25/0.13 µm SiGe:C BiCMOS technology. The developed technique also allows the use of significantly smaller alignment marks (i.e., standard FIA alignment marks). Furthermore, in the case of wafer-bow-related overlay tool problems, the presented method is used for the overlay evaluation of the last two metal layers of production wafers prepared in IHP's standard 0.25/0.13 µm SiGe:C BiCMOS technology. In conclusion, the exposure and measurement jobs can be done with the same tool, minimizing the back-to-front-side/interface top layer misalignment, which leads to a significant device performance improvement of backside/TSV-integrated components and technologies.
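    The paper does not spell out its correction algorithm; a generic way to remove a wafer-rotation (plus shift) term from measured mark positions before reading off the residual overlay is a rigid least-squares (Procrustes) fit, sketched below with hypothetical mark coordinates.

```python
import math

def fit_rigid(nominal, measured):
    """Least-squares rotation + translation (2D Procrustes) mapping
    nominal mark positions onto measured ones."""
    n = len(nominal)
    cx = sum(p[0] for p in nominal) / n
    cy = sum(p[1] for p in nominal) / n
    dx = sum(p[0] for p in measured) / n
    dy = sum(p[1] for p in measured) / n
    num = den = 0.0
    for (x, y), (u, v) in zip(nominal, measured):
        x, y, u, v = x - cx, y - cy, u - dx, v - dy
        num += x * v - y * u
        den += x * u + y * v
    theta = math.atan2(num, den)  # best-fit wafer rotation
    return theta, (cx, cy), (dx, dy)

def residual_overlay(nominal, measured):
    """Per-mark overlay left after removing the fitted rotation/shift."""
    theta, (cx, cy), (dx, dy) = fit_rigid(nominal, measured)
    c, s = math.cos(theta), math.sin(theta)
    residuals = []
    for (x, y), (u, v) in zip(nominal, measured):
        px, py = x - cx, y - cy
        rx, ry = c * px - s * py + dx, s * px + c * py + dy
        residuals.append((u - rx, v - ry))
    return residuals

# Synthetic check with hypothetical marks: positions displaced by a pure
# 1 mrad rotation + shift should show near-zero residual overlay.
nominal = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]  # mm
c, s = math.cos(1e-3), math.sin(1e-3)
measured = [(c * x - s * y + 0.5, s * x + c * y - 0.2) for x, y in nominal]
print(max(math.hypot(dx, dy) for dx, dy in residual_overlay(nominal, measured)))
```

    In practice the residuals would be taken per FIA pair, so systematic wafer rotation and translation drop out and only the true interface-to-top-layer overlay remains.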

  10. Standard development at the Human Variome Project.

    PubMed

    Smith, Timothy D; Vihinen, Mauno

    2015-01-01

    The Human Variome Project (HVP) is a world organization working towards facilitating the collection, curation, interpretation and free and open sharing of genetic variation information. A key component of HVP activities is the development of standards and guidelines. HVP Standards are systems, procedures and technologies that the HVP Consortium has determined must be used by HVP-affiliated data sharing infrastructure and should be used by the broader community. HVP guidelines are considered to be beneficial for HVP affiliated data sharing infrastructure and the broader community to adopt. The HVP also maintains a process for assessing systems, processes and tools that implement HVP Standards and Guidelines. Recommended System Status is an accreditation process designed to encourage the adoption of HVP Standards and Guidelines. Here, we describe the HVP standards development process and discuss the accepted standards, guidelines and recommended systems as well as those under acceptance. Certain HVP Standards and Guidelines are already widely adopted by the community and there are committed users for the others. © The Author(s) 2015. Published by Oxford University Press.

  11. Standard development at the Human Variome Project

    PubMed Central

    Smith, Timothy D.; Vihinen, Mauno

    2015-01-01

    The Human Variome Project (HVP) is a world organization working towards facilitating the collection, curation, interpretation and free and open sharing of genetic variation information. A key component of HVP activities is the development of standards and guidelines. HVP Standards are systems, procedures and technologies that the HVP Consortium has determined must be used by HVP-affiliated data sharing infrastructure and should be used by the broader community. HVP guidelines are considered to be beneficial for HVP affiliated data sharing infrastructure and the broader community to adopt. The HVP also maintains a process for assessing systems, processes and tools that implement HVP Standards and Guidelines. Recommended System Status is an accreditation process designed to encourage the adoption of HVP Standards and Guidelines. Here, we describe the HVP standards development process and discuss the accepted standards, guidelines and recommended systems as well as those under acceptance. Certain HVP Standards and Guidelines are already widely adopted by the community and there are committed users for the others. PMID:25818894

  12. DNA fingerprinting, DNA barcoding, and next generation sequencing technology in plants.

    PubMed

    Sucher, Nikolaus J; Hennell, James R; Carles, Maria C

    2012-01-01

    DNA fingerprinting of plants has become an invaluable tool in forensic, scientific, and industrial laboratories all over the world. PCR has become part of virtually every variation of the plethora of approaches used for DNA fingerprinting today. DNA sequencing is increasingly used either in combination with or as a replacement for traditional DNA fingerprinting techniques. A prime example is the use of short, standardized regions of the genome as taxon barcodes for biological identification of plants. Rapid advances in "next generation sequencing" (NGS) technology are driving down the cost of sequencing and bringing large-scale sequencing projects into the reach of individual investigators. We present an overview of recent publications that demonstrate the use of "NGS" technology for DNA fingerprinting and DNA barcoding applications.

  13. Deep Aquifer Remediation Tools (DARTs): A new technology for ground-water remediation

    USGS Publications Warehouse

    Naftz, David L.; Davis, James A.

    1999-01-01

    Potable ground-water supplies throughout the world are contaminated or threatened by advancing plumes containing radionuclides, metals, and organic compounds. Currently (1999), the most widely used method of ground-water remediation is a combination of extraction, ex-situ treatment, and discharge of the treated water, commonly known as pump and treat. Pump-and-treat methods are costly and often ineffective in meeting long-term protection standards (Travis and Doty, 1990; Gillham and Burris, 1992; National Research Council, 1994). This fact sheet describes a new and potentially cost-effective technology for removal of organic and inorganic contaminants from ground water. The U.S. Geological Survey (USGS) is currently exploring the possibilities of obtaining a U.S. Patent for this technology.

  14. Wearable Technology for Global Surgical Teleproctoring.

    PubMed

    Datta, Néha; MacQueen, Ian T; Schroeder, Alexander D; Wilson, Jessica J; Espinoza, Juan C; Wagner, Justin P; Filipi, Charles J; Chen, David C

    2015-01-01

    In underserved communities around the world, inguinal hernias represent a significant burden of surgically-treatable disease. With traditional models of international surgical assistance limited to mission trips, a standardized framework to strengthen local healthcare systems is lacking. We established a surgical education model using web-based tools and wearable technology to allow for long-term proctoring and assessment in a resource-poor setting. This is a feasibility study examining wearable technology and web-based performance rating tools for long-term proctoring in an international setting. Using the Lichtenstein inguinal hernia repair as the index surgical procedure, local surgeons in Paraguay and Brazil were trained in person by visiting international expert trainers using a formal, standardized teaching protocol. Surgeries were captured in real time using Google Glass and transmitted wirelessly to an online video stream, permitting real-time observation and proctoring by mentoring surgeon experts in remote locations around the world. A system for ongoing remote evaluation and support by experienced surgeons was established using the Lichtenstein-specific Operative Performance Rating Scale. Data were collected from 4 sequential training operations for surgeons trained in both Paraguay and Brazil. With continuous internet connectivity, live streaming of the surgeries was successful. The Operative Performance Rating Scale was used immediately after each operation. Both surgeons demonstrated proficiency at the completion of the fourth case. A sustainable model for surgical training and proctoring to empower local surgeons in resource-poor locations and "train trainers" is feasible with wearable technology and web-based communication. Capacity building by maximizing use of local resources and expertise offers a long-term solution to reducing the global burden of surgically-treatable disease. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  15. Reference metrology in a research fab: the NIST clean calibrations thrust

    NASA Astrophysics Data System (ADS)

    Dixson, Ronald; Fu, Joe; Orji, Ndubuisi; Renegar, Thomas; Zheng, Alan; Vorburger, Theodore; Hilton, Al; Cangemi, Marc; Chen, Lei; Hernandez, Mike; Hajdaj, Russell; Bishop, Michael; Cordes, Aaron

    2009-03-01

    In 2004, the National Institute of Standards and Technology (NIST) commissioned the Advanced Measurement Laboratory (AML), a state-of-the-art, five-wing laboratory complex for leading-edge NIST research. The NIST NanoFab, a 1765 m² (19,000 ft²) clean room with 743 m² (8000 ft²) of Class 100 space, is the anchor of this facility and an integral component of the new Center for Nanoscale Science and Technology (CNST) at NIST. Although the CNST/NanoFab is a nanotechnology research facility with a different strategic focus than a current high-volume semiconductor fab, metrology tools still play an important role in the nanofabrication research conducted here. Some of the metrology tools available to users of the NanoFab include stylus profiling, scanning electron microscopy (SEM), and atomic force microscopy (AFM). Since 2001, NIST has collaborated with SEMATECH to implement a reference measurement system (RMS) using critical dimension atomic force microscopy (CD-AFM). NIST brought metrology expertise to the table, and SEMATECH provided access to leading-edge metrology tools in its clean room facility in Austin. Now, in the newly launched "clean calibrations" thrust at NIST, we are implementing the reference metrology paradigm on several tools in the CNST/NanoFab. Initially, we have focused on calibration, monitoring, and uncertainty analysis for a three-tool set consisting of a stylus profiler, an SEM, and an AFM. Our larger goal is the development of new and supplemental calibrations and standards that will benefit from the Class 100 environment available in the NanoFab, and the offering of calibration options to our customers that do not require exposing their samples to less clean environments. Toward this end, we have completed a preliminary evaluation of the performance of these instruments. The results of these evaluations suggest that the achievable uncertainties are generally consistent with our measurement goals.

  16. Minimum information required for a DMET experiment reporting.

    PubMed

    Kumuthini, Judit; Mbiyavanga, Mamana; Chimusa, Emile R; Pathak, Jyotishman; Somervuo, Panu; Van Schaik, Ron Hn; Dolzan, Vita; Mizzi, Clint; Kalideen, Kusha; Ramesar, Raj S; Macek, Milan; Patrinos, George P; Squassina, Alessio

    2016-09-01

    The aim of this work is to provide pharmacogenomics reporting guidelines, together with the information and tools required for reporting to public omic databases. For effective DMET (drug-metabolizing enzymes and transporters) data interpretation, sharing, interoperability, reproducibility and reporting, we propose the Minimum Information required for a DMET Experiment (MIDE) reporting guidelines. MIDE describes the information required for reporting, data storage and data sharing in the form of XML. The MIDE guidelines will benefit the scientific community conducting pharmacogenomics experiments, including the reporting of pharmacogenomics data from other technology platforms, with tools that ease and automate the generation of such reports using the standardized MIDE XML schema, facilitating the sharing, dissemination and reanalysis of datasets through accessible and transparent pharmacogenomics data reporting.
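
    As an illustration of what XML-based minimum-information reporting looks like in practice, the sketch below serializes a toy DMET experiment description using Python's standard library. The element names are hypothetical placeholders for illustration, not the official MIDE schema.

```python
import xml.etree.ElementTree as ET

def build_mide_report(experiment_id, platform, samples):
    """Serialize a DMET experiment description as a MIDE-style XML report.

    Element and attribute names here are illustrative placeholders,
    not the published MIDE XML schema."""
    root = ET.Element("mideReport", version="1.0")
    exp = ET.SubElement(root, "experiment", id=experiment_id)
    ET.SubElement(exp, "platform").text = platform
    samples_el = ET.SubElement(exp, "samples")
    for sample_id, genotype in samples:
        s = ET.SubElement(samples_el, "sample", id=sample_id)
        ET.SubElement(s, "genotype").text = genotype
    return ET.tostring(root, encoding="unicode")

xml_text = build_mide_report("EXP-001", "DMET Plus",
                             [("S1", "CYP2D6*1/*4"), ("S2", "CYP2C19*2/*2")])
print(xml_text)
```

    Because the report is plain XML, it can be validated against a schema and exchanged between databases without any platform-specific tooling.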

  17. Free and open source enabling technologies for patient-centric, guideline-based clinical decision support: a survey.

    PubMed

    Leong, T Y; Kaiser, K; Miksch, S

    2007-01-01

    Guideline-based clinical decision support is an emerging paradigm to help reduce error, lower cost, and improve quality in evidence-based medicine. The free and open source (FOS) approach is a promising alternative for delivering cost-effective information technology (IT) solutions in health care. In this paper, we survey the current FOS enabling technologies for patient-centric, guideline-based care, and discuss the current trends and future directions of their role in clinical decision support. We searched PubMed, major biomedical informatics websites, and the web in general for papers and links related to FOS health care IT systems. We also relied on our background and knowledge for specific subtopics. We focused on the functionalities of guideline modeling tools, and briefly examined the supporting technologies for terminology, data exchange and electronic health record (EHR) standards. To effectively support patient-centric, guideline-based care, the computerized guidelines and protocols need to be integrated with existing clinical information systems or EHRs. Technologies that enable such integration should be accessible, interoperable, and scalable. A plethora of FOS tools and techniques is available to support the different knowledge management and quality assurance tasks involved. Many challenges, however, remain in their implementation. There are active and growing trends of deploying FOS enabling technologies for integrating clinical guidelines, protocols, and pathways into the main care processes. The continuing development and maturation of such technologies are likely to make increasingly significant contributions to patient-centric, guideline-based clinical decision support.

  18. Process Improvement in a Radically Changing Organization

    NASA Technical Reports Server (NTRS)

    Varga, Denise M.; Wilson, Barbara M.

    2007-01-01

    This presentation describes how the NASA Glenn Research Center planned and implemented a process improvement effort in response to a radically changing environment. As a result of a presidential decision to redefine the Agency's mission, many ongoing projects were canceled and future workload would be awarded based on relevance to the Exploration Initiative. NASA imposed a new Procedural Requirements standard on all future software development, and the Center needed to redesign its processes from CMM Level 2 objectives to meet the new standard and position itself for CMMI. The intended audience for this presentation is systems/software developers and managers in a large, research-oriented organization that may need to respond to imposed standards while also pursuing CMMI Maturity Level goals. A set of internally developed tools will be presented, including an overall Process Improvement Action Item database, a formal inspection/peer review tool, a metrics collection spreadsheet, and other related technologies. The Center also found a need to charter Technical Working Groups (TWGs) to address particular Process Areas. In addition, a Marketing TWG was needed to communicate the process changes to the development community, including an innovative web site portal.

  19. Evaluating Mobile Survey Tools (MSTs) for Field-Level Monitoring and Data Collection: Development of a Novel Evaluation Framework, and Application to MSTs for Rural Water and Sanitation Monitoring

    PubMed Central

    Fisher, Michael B.; Mann, Benjamin H.; Cronk, Ryan D.; Shields, Katherine F.; Klug, Tori L.; Ramaswamy, Rohit

    2016-01-01

    Information and communications technologies (ICTs) such as mobile survey tools (MSTs) can facilitate field-level data collection to drive improvements in national and international development programs. MSTs allow users to gather and transmit field data in real time, standardize data storage and management, automate routine analyses, and visualize data. Dozens of diverse MST options are available, and users may struggle to select suitable options. We developed a systematic MST Evaluation Framework (EF), based on International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) software quality modeling standards, to objectively assess MSTs and assist program implementers in identifying suitable MST options. The EF is applicable to MSTs for a broad variety of applications. We also conducted an MST user survey to elucidate needs and priorities of current MST users. Finally, the EF was used to assess seven MSTs currently used for water and sanitation monitoring, as a validation exercise. The results suggest that the EF is a promising method for evaluating MSTs. PMID:27563916

  20. Evaluating Mobile Survey Tools (MSTs) for Field-Level Monitoring and Data Collection: Development of a Novel Evaluation Framework, and Application to MSTs for Rural Water and Sanitation Monitoring.

    PubMed

    Fisher, Michael B; Mann, Benjamin H; Cronk, Ryan D; Shields, Katherine F; Klug, Tori L; Ramaswamy, Rohit

    2016-08-23

    Information and communications technologies (ICTs) such as mobile survey tools (MSTs) can facilitate field-level data collection to drive improvements in national and international development programs. MSTs allow users to gather and transmit field data in real time, standardize data storage and management, automate routine analyses, and visualize data. Dozens of diverse MST options are available, and users may struggle to select suitable options. We developed a systematic MST Evaluation Framework (EF), based on International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) software quality modeling standards, to objectively assess MSTs and assist program implementers in identifying suitable MST options. The EF is applicable to MSTs for a broad variety of applications. We also conducted an MST user survey to elucidate needs and priorities of current MST users. Finally, the EF was used to assess seven MSTs currently used for water and sanitation monitoring, as a validation exercise. The results suggest that the EF is a promising method for evaluating MSTs.
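
    Evaluation frameworks of this kind typically reduce to scoring each candidate tool on weighted quality criteria drawn from an ISO/IEC-style quality model. The sketch below uses hypothetical criteria and weights for illustration; it is not the published EF scoring scheme.

```python
def ef_score(ratings, weights):
    """Weighted aggregate score for one MST under an ISO/IEC-style
    quality model. Criterion names and weights are illustrative,
    not the published Evaluation Framework values."""
    total_w = sum(weights.values())
    return sum(ratings[c] * w for c, w in weights.items()) / total_w

# Hypothetical criteria (loosely modeled on ISO/IEC 25010 characteristics)
weights = {"functional_suitability": 3, "usability": 2,
           "portability": 1, "cost": 2}

# Hypothetical 1-5 ratings for two candidate tools
tools = {
    "ToolA": {"functional_suitability": 4, "usability": 3,
              "portability": 5, "cost": 2},
    "ToolB": {"functional_suitability": 5, "usability": 4,
              "portability": 2, "cost": 4},
}

ranked = sorted(tools, key=lambda t: ef_score(tools[t], weights), reverse=True)
print(ranked)  # ToolB scores 4.125 vs ToolA's 3.375 under these weights
```

    Program implementers can adjust the weights to reflect their own priorities (e.g., emphasizing offline portability for rural monitoring) and re-rank candidates without changing the scoring logic.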

  1. Tackling the 2nd V: Big Data, Variety and the Need for Representation Consistency

    NASA Astrophysics Data System (ADS)

    Clune, T.; Kuo, K. S.

    2016-12-01

    While Big Data technologies are transforming our ability to analyze ever larger volumes of Earth science data, practical constraints continue to limit our ability to compare data across datasets from different sources in an efficient and robust manner. Within a single data collection, invariants such as file format, grid type, and spatial resolution greatly simplify many types of analysis (often implicitly). However, when analysis combines data across multiple data collections, researchers are generally required to implement data transformations (i.e., "data preparation") to provide appropriate invariants. These transformations include changing file formats, ingesting into a database, and/or regridding to a common spatial representation, and they can be performed either once, statically, or each time the data are accessed. At the very least, this process is inefficient from the perspective of the community, as each team selects its own representation and privately implements the appropriate transformations. No doubt there are disadvantages to any "universal" representation, but we posit that major benefits would be obtained if a suitably flexible spatial representation could be standardized along with tools for transforming to/from that representation. We regard this as part of the historic trend in data publishing. Early datasets used ad hoc formats and lacked metadata. As better tools evolved, published data began to use standardized formats (e.g., HDF and netCDF) with attached metadata. We propose that the modern need to perform analysis across data sets should drive a new generation of tools that support a standardized spatial representation. More specifically, we propose the hierarchical triangular mesh (HTM) as a suitably "generic" representation that permits standard transformations to/from the native representations in use today, as well as tools to convert/regrid existing datasets onto that representation.
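
    The subdivision rule at the heart of the HTM is simple: each spherical triangle is split into four children using the unit-sphere midpoints of its edges. A minimal sketch of that rule (an illustration of the scheme, not any project's actual tooling):

```python
import math

def normalize(v):
    """Project a 3-vector onto the unit sphere."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def midpoint(a, b):
    """Unit-sphere midpoint of two triangle vertices."""
    return normalize(tuple((x + y) / 2 for x, y in zip(a, b)))

def subdivide(triangle):
    """Split one spherical triangle into its 4 HTM children."""
    v0, v1, v2 = triangle
    w0, w1, w2 = midpoint(v1, v2), midpoint(v0, v2), midpoint(v0, v1)
    return [(v0, w2, w1), (v1, w0, w2), (v2, w1, w0), (w0, w1, w2)]

# One face of the octahedron that seeds the HTM at level 0
tri = ((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))
level1 = subdivide(tri)
print(len(level1))  # 4 children per triangle
```

    Recursive application gives 4^n triangles per octahedron face at depth n, so resolution can be chosen per dataset while all datasets share one hierarchical indexing scheme.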

  2. Actualities and Development of Heavy-Duty CNC Machine Tool Thermal Error Monitoring Technology

    NASA Astrophysics Data System (ADS)

    Zhou, Zu-De; Gui, Lin; Tan, Yue-Gang; Liu, Ming-Yao; Liu, Yi; Li, Rui-Ya

    2017-09-01

    Thermal error monitoring technology is the key technological support for solving the thermal error problem of heavy-duty CNC (computer numerical control) machine tools. Many review articles introduce thermal error research on CNC machine tools, but they mainly focus on thermal issues in small and medium-sized machines and seldom cover thermal error monitoring technologies. This paper gives an overview of research on the thermal error of CNC machine tools and emphasizes the study of thermal error in heavy-duty CNC machine tools in three areas: the causes of thermal error, temperature monitoring technology, and thermal deformation monitoring technology. A new optical measurement technology for heavy-duty CNC machine tools, fiber Bragg grating (FBG) distributed sensing, is introduced in detail; it forms an intelligent sensing and monitoring system for these machines. This paper fills a gap in the review literature, aims to guide development in this industrial field, and opens up new areas of research on heavy-duty CNC machine tool thermal error.
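
    As background on how FBG sensing converts a measured spectrum into a temperature, the sketch below applies the standard first-order Bragg relation delta_lambda / lambda_0 = (alpha + xi) * delta_T with typical textbook coefficients for silica fiber. The coefficient values are illustrative assumptions, not calibration constants from the surveyed systems.

```python
def fbg_delta_t(lambda_0_nm, lambda_measured_nm,
                alpha=0.55e-6, xi=8.6e-6):
    """Estimate a temperature change from a fiber Bragg grating
    wavelength shift using the first-order relation
        delta_lambda / lambda_0 = (alpha + xi) * delta_T,
    where alpha is the thermal-expansion coefficient of silica and
    xi its thermo-optic coefficient. Typical textbook values are
    used here; a real system would use calibrated constants."""
    delta_lambda = lambda_measured_nm - lambda_0_nm
    return delta_lambda / (lambda_0_nm * (alpha + xi))

# With these coefficients a 1550 nm grating shifts about 14.2 pm/degC,
# so a ~0.142 nm shift corresponds to roughly +10 degC
print(round(fbg_delta_t(1550.0, 1550.1418), 1))
```

    Distributed sensing strings many such gratings, each at a distinct nominal wavelength, along one fiber, which is what makes dense temperature mapping of a large machine structure practical.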

  3. VIPER: Visualization Pipeline for RNA-seq, a Snakemake workflow for efficient and complete RNA-seq analysis.

    PubMed

    Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W

    2018-04-12

    RNA sequencing has become a ubiquitous technology used throughout life sciences as an effective method of measuring RNA abundance quantitatively in tissues and cells. The increase in use of RNA-seq technology has led to the continuous development of new tools for every step of analysis from alignment to downstream pathway analysis. However, effectively using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake we have developed a user-friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand its capabilities. This capacity has already been exploited to include recently developed tools that explore immune infiltrate and reconstruct T-cell CDRs (complementarity-determining regions). The pipeline has been conveniently packaged such that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively with a built-in capacity for customization and expansion.

  4. Navigational Guidance and Ablation Planning Tools for Interventional Radiology.

    PubMed

    Sánchez, Yadiel; Anvari, Arash; Samir, Anthony E; Arellano, Ronald S; Prabhakar, Anand M; Uppot, Raul N

    Image-guided biopsy and ablation rely on successful identification and targeting of lesions. Currently, image-guided procedures are routinely performed under ultrasound, fluoroscopy, magnetic resonance imaging, or computed tomography (CT) guidance. However, these modalities have their limitations, including inadequate visibility of the lesion; lesion, organ, or patient motion; compatibility of instruments in a magnetic resonance imaging field; and, for CT and fluoroscopy cases, radiation exposure. Recent advances in technology have resulted in the development of a new generation of navigational guidance tools that can aid in targeting lesions for biopsy or ablation. These navigational guidance tools have evolved from simple hand-held trajectory guidance tools, to electronic needle visualization, to image fusion, to the development of a body global positioning system, to growth in cone-beam CT, and to ablation volume planning. These navigational systems are promising technologies that not only have the potential to improve lesion targeting (thereby increasing the diagnostic yield of a biopsy or the success of tumor ablation) but also have the potential to decrease radiation exposure to the patient and staff, decrease procedure time, decrease sedation requirements, and improve patient safety. The purpose of this article is to describe the challenges in current standard image-guided techniques, provide a definition and overview for these next-generation navigational devices, and describe the current limitations of these still-evolving navigational guidance tools. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. FDA's Activities Supporting Regulatory Application of "Next Gen" Sequencing Technologies.

    PubMed

    Wilson, Carolyn A; Simonyan, Vahan

    2014-01-01

    Applications of next-generation sequencing (NGS) technologies require availability and access to an information technology (IT) infrastructure and bioinformatics tools for large amounts of data storage and analyses. The U.S. Food and Drug Administration (FDA) anticipates that the use of NGS data to support regulatory submissions will continue to increase as the scientific and clinical communities become more familiar with the technologies and identify more ways to apply these advanced methods to support development and evaluation of new biomedical products. FDA laboratories are conducting research on different NGS platforms and developing the IT infrastructure and bioinformatics tools needed to enable regulatory evaluation of the technologies and the data sponsors will submit. A High-performance Integrated Virtual Environment, or HIVE, has been launched, and development and refinement continues as a collaborative effort between the FDA and George Washington University to provide the tools to support these needs. The use of a highly parallelized environment facilitated by use of distributed cloud storage and computation has resulted in a platform that is both rapid and responsive to changing scientific needs. The FDA plans to further develop in-house capacity in this area, while also supporting engagement by the external community, by sponsoring an open, public workshop to discuss NGS technologies and data formats standardization, and to promote the adoption of interoperability protocols in September 2014. Next-generation sequencing (NGS) technologies are enabling breakthroughs in how the biomedical community is developing and evaluating medical products. One example is the potential application of this method to the detection and identification of microbial contaminants in biologic products. In order for the U.S. Food and Drug Administration (FDA) to be able to evaluate the utility of this technology, we need to have the information technology infrastructure and bioinformatics tools to be able to store and analyze large amounts of data. To address this need, we have developed the High-performance Integrated Virtual Environment, or HIVE. HIVE uses a combination of distributed cloud storage and distributed cloud computations to provide a platform that is both rapid and responsive to support the growing and increasingly diverse scientific and regulatory needs of FDA scientists in their evaluation of NGS in research and ultimately for evaluation of NGS data in regulatory submissions. © PDA, Inc. 2014.

  6. Evolution of microbiological analytical methods for dairy industry needs

    PubMed Central

    Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence

    2014-01-01

    Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus offer new perspectives for integrating microbial physiology monitoring into the improvement of industrial processes. This review summarizes the methods described for enumerating and characterizing physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry’s needs. Recent studies show that polymerase chain reaction (PCR)-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potential. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards. PMID:24570675

  7. Evolution of microbiological analytical methods for dairy industry needs.

    PubMed

    Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence

    2014-01-01

    Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus offer new perspectives for integrating microbial physiology monitoring into the improvement of industrial processes. This review summarizes the methods described for enumerating and characterizing physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry's needs. Recent studies show that polymerase chain reaction (PCR)-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potential. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards.

  8. Fault Injection Validation of a Safety-Critical TMR System

    NASA Astrophysics Data System (ADS)

    Irrera, Ivano; Madeira, Henrique; Zentai, Andras; Hergovics, Beata

    2016-08-01

    Digital systems and their software are the core technology for controlling and monitoring industrial systems in practically all activity domains. Functional safety standards such as the European standard EN 50128 for railway applications define the procedures and technical requirements for the development of software for railway control and protection systems. The validation of such systems is a highly demanding task. In this paper we discuss the use of fault injection techniques, which have been used extensively in several domains, particularly the space domain, to complement the traditional procedures for validating a SIL (Safety Integrity Level) 4 system for railway signalling that implements a TMR (Triple Modular Redundancy) architecture. The fault injection tool is based on JTAG technology. The results of our injection campaign showed a high degree of tolerance to most of the injected faults, but several cases of unexpected behaviour were also observed, helping us to understand worst-case scenarios.

  9. JavaScript Access to DICOM Network and Objects in Web Browser.

    PubMed

    Drnasin, Ivan; Grgić, Mislav; Gogić, Goran

    2017-10-01

    The digital imaging and communications in medicine (DICOM) 3.0 standard provides the baseline for picture archiving and communication systems (PACS). The development of the Internet and various communication media initiated demand for non-DICOM access to PACS systems. Ever-increasing utilization of web browsers, laptops and handheld devices, as opposed to desktop applications and static organizational computers, led to the development of different web technologies, which the DICOM standards body subsequently accepted as alternative means of access. This paper provides an overview of the current state of development of web access technology to DICOM repositories. It presents a different approach that uses HTML5 features of web browsers through the JavaScript language and the WebSocket protocol, enabling real-time communication with DICOM repositories. A JavaScript DICOM network library, a DICOM-to-WebSocket proxy and a proof-of-concept web application that qualifies as a DICOM 3.0 device were developed.

  10. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between the disciplines become stronger as systems are designed to be lighter weight for space applications, and this coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications and requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  11. SECIMTools: a suite of metabolomics data analysis tools.

    PubMed

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies produce a set of identified features. Galaxy is an open access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares-discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and elastic net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
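
    As an illustration of the kind of per-feature basic statistics such a suite applies, here is a minimal pure-Python Welch's unequal-variance t-test, comparing one metabolite feature between two sample groups. This is a sketch of the general method, not SECIMTools code.

```python
import math

def welch_t(a, b):
    """Welch's unequal-variance t statistic and its approximate
    degrees of freedom (Welch-Satterthwaite), a standard per-feature
    two-group comparison in omics data analysis."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical peak intensities for one feature in two sample groups
t, df = welch_t([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
print(round(t, 3), round(df, 2))  # t = -1.549, df = 2.94
```

    In a real pipeline this test is applied feature-by-feature across thousands of metabolites, with the resulting p-values adjusted for multiple testing.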

  12. Traumatic Brain Injury Diffusion Magnetic Resonance Imaging Research Roadmap Development Project

    DTIC Science & Technology

    2011-10-01

    A promising technology on the horizon is diffusion tensor imaging (DTI), a magnetic resonance imaging (MRI)-based ... in the brain. The potential for DTI to improve our understanding of TBI has not been fully explored, and challenges associated with non-existent ... processing tools, quality control standards, and a shared image repository. The recommendations will be disseminated and pilot tested. A DTI of TBI

  13. Cost Modeling for Space Telescope

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2011-01-01

    Parametric cost models are an important tool for planning missions, comparing concepts, and justifying technology investments. This paper presents ongoing efforts to develop single-variable and multi-variable cost models for the space telescope optical telescope assembly (OTA). These models are based on data collected from historical space telescope missions. Standard statistical methods are used to derive cost estimating relationships (CERs) for OTA cost versus aperture diameter and mass. The results are compared with previously published models.
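
    A single-variable CER of the common power-law form cost = a * D**b is typically derived by ordinary least squares on log-transformed data. The sketch below fits such a model to synthetic data generated from assumed coefficients; the actual CERs come from the historical mission data, which are not reproduced here.

```python
import math

def fit_power_law(diams, costs):
    """Fit cost = a * D**b by ordinary least squares in log-log space,
    the standard way a single-variable CER is derived."""
    xs = [math.log(d) for d in diams]
    ys = [math.log(c) for c in costs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

# Synthetic OTA costs generated from the assumed model cost = 100 * D**1.7
diams = [0.5, 1.0, 2.0, 4.0]
costs = [100 * d ** 1.7 for d in diams]
a, b = fit_power_law(diams, costs)
print(round(a, 2), round(b, 3))  # recovers a = 100.0, b = 1.7
```

    With noisy historical data the same fit yields an exponent with confidence bounds, which is what makes the diameter-versus-mass comparison in such studies meaningful.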

  14. Does Mathletics, a Supplementary Digital Math Tool, Improve Student Learning and Teaching Methods at Three Private Catholic Schools in Florida?--A Mixed Methods Study

    ERIC Educational Resources Information Center

    Stephan, Kelly Purdy

    2017-01-01

    Improving mathematical student performance in K-12 education has been a focus in the U.S. Students in the U.S. score lower on standardized math assessments than students in other countries. Preparing students for a successful future in a global society requires schools to integrate effective digital technologies in math classroom curricula.…

  15. Basics of Compounding: 3D Printing--Pharmacy Applications, Part 2.

    PubMed

    Allen, Loyd V

    2017-01-01

    3D printing is a standard tool in the automotive, aerospace, and consumer goods industries and is gaining traction in pharmaceutical manufacturing, which has introduced a new element into dosage-form development. This article, which represents part 2 of a 3-part article on the topic of 3D printing, discusses the different technologies available for 3D printing. Copyright© by International Journal of Pharmaceutical Compounding, Inc.

  16. Nanowire systems: technology and design

    PubMed Central

    Gaillardon, Pierre-Emmanuel; Amarù, Luca Gaetano; Bobba, Shashikanth; De Marchi, Michele; Sacchetto, Davide; De Micheli, Giovanni

    2014-01-01

    Nanosystems are large-scale integrated systems exploiting nanoelectronic devices. In this study, we consider double independent gate, vertically stacked nanowire field effect transistors (FETs) with gate-all-around structures and a typical diameter of 20 nm. These devices, which we have successfully fabricated and evaluated, control the ambipolar behaviour of the nanostructure by selectively enabling one type of carriers. These transistors work as switches with electrically programmable polarity and thus realize an exclusive-OR (XOR) operation. The intrinsically higher expressive power of these FETs, when compared with standard complementary metal oxide semiconductor technology, enables us to realize more efficient logic gates, which we organize as tiles to realize nanowire systems by regular arrays. This article surveys both the technology for double independent gate FETs and the physical and logic design tools needed to realize digital systems with this fabrication technology. PMID:24567471
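
    A behavioral sketch may help show why a polarity-programmable device yields XOR so cheaply. The model below is a deliberate simplification of the fabricated device, for illustration only: the polarity gate selects n-type or p-type behaviour, and a complementary pair of such devices computes XOR of the two inputs.

```python
def ambipolar_conducts(control, polarity):
    """Behavioral model of a double-independent-gate ambipolar FET:
    with the polarity gate high the device acts n-type (conducts when
    the control gate is high); with it low, p-type (conducts when the
    control gate is low). A simplification for illustration only."""
    return control == polarity

def xor_gate(a, b):
    """Complementary pair of polarity-programmable devices: input b
    programs both polarities, so the output is pulled high exactly
    when the inputs differ."""
    pull_down = ambipolar_conducts(a, b)       # ties output low when a == b
    pull_up = ambipolar_conducts(a, 1 - b)     # drives output high when a != b
    return 1 if pull_up and not pull_down else 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_gate(a, b))
```

    In static CMOS the same function needs several transistor pairs, which is the expressive-power advantage the article refers to.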

  17. In-Situ XRF Measurements in Lunar Surface Exploration Using Apollo Samples as a Standard

    NASA Technical Reports Server (NTRS)

    Young, Kelsey E.; Evans, C.; Allen, C.; Mosie, A.; Hodges, K. V.

    2011-01-01

    Samples were collected during the Apollo lunar surface missions and returned to Earth by astronauts with varying degrees of geological experience. The technology used in these EVAs, or extravehicular activities, included nothing more advanced than traditional terrestrial field instruments: rock hammer, scoop, claw tool, and sample bags. Forty years after Apollo, technology is being developed that will allow a high-resolution geochemical map to be created in the field in real time. Handheld x-ray fluorescence (XRF) spectrometry is one such technology. We use handheld XRF to enable a broad in-situ characterization of a geologic site of interest based on fairly rapid techniques that can be implemented by either an astronaut or a robotic explorer. The handheld XRF instrument we used for this study was the Innov-X Systems Delta XRF spectrometer.

  18. Producing Production Level Tooling in Prototype Timing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mc Hugh, Kevin Matthew; Knirsch, J.

    A new rapid solidification process machine will be able to produce eight-inch diameter by six-inch thick finished cavities at the rate of one per hour - a rate that will change the tooling industry dramatically. Global Metal Technologies, Inc. (GMTI) (Solon, OH) has signed an exclusive license with the Idaho National Engineering and Environmental Laboratory (INEEL) (Idaho Falls, ID) for the development and commercialization of the rapid solidification process (RSP tooling). The first production machine is scheduled for delivery in July 2001. The RSP tooling process is a method of producing production-level tooling in prototype timing. The process's inventor, Kevin McHugh, describes it as a rapid solidification method, which differentiates it from the standard spray forming methods. RSP itself is relatively straightforward. Molten metal is sprayed against the ceramic pattern, replicating the pattern's contours, surface texture and details. After spraying, the molten tool steel is cooled at room temperature and separated from the pattern. The irregular periphery of the freshly sprayed insert is squared off, either by machining or, in the case of harder tool steels, by wire EDM.

  19. The European Classical Swine Fever Virus Database: Blueprint for a Pathogen-Specific Sequence Database with Integrated Sequence Analysis Tools

    PubMed Central

    Postel, Alexander; Schmeiser, Stefanie; Zimmermann, Bernd; Becher, Paul

    2016-01-01

    Molecular epidemiology has become an indispensable tool in the diagnosis of diseases and in tracing the infection routes of pathogens. Due to advances in conventional sequencing and the development of high throughput technologies, the field of sequence determination is in the process of being revolutionized. Platforms for sharing sequence information and providing standardized tools for phylogenetic analyses are becoming increasingly important. The database (DB) of the European Union (EU) and World Organisation for Animal Health (OIE) Reference Laboratory for classical swine fever offers one of the world’s largest semi-public virus-specific sequence collections combined with a module for phylogenetic analysis. The classical swine fever (CSF) DB (CSF-DB) became a valuable tool for supporting diagnosis and epidemiological investigations of this highly contagious disease in pigs with high socio-economic impacts worldwide. The DB has been re-designed and now allows for the storage and analysis of traditionally used, well established genomic regions and of larger genomic regions including complete viral genomes. We present an application example for the analysis of highly similar viral sequences obtained in an endemic disease situation and introduce the new geographic “CSF Maps” tool. The concept of this standardized and easy-to-use DB with an integrated genetic typing module is suited to serve as a blueprint for similar platforms for other human or animal viruses. PMID:27827988

  20. Systems biology driven software design for the research enterprise.

    PubMed

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-06-25

    In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high-throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. We illustrate an approach, through the discussion of a purpose-built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a light-weight software architecture can become the focal point through which scientists can both access and analyse the plethora of experimentally derived data.
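    The integration strategy summarized above (a common identity system plus dynamically discovered interoperable services) can be illustrated with a minimal sketch. All class, capability, and identifier names below are hypothetical and not taken from the paper:

```python
# Minimal sketch of discovery-based integration: independently written
# tools agree only on capability names and a shared identifier scheme,
# not on a pre-existing object model. Names here are illustrative.

class ServiceRegistry:
    """Maps a declared capability to the service that provides it."""
    def __init__(self):
        self._services = {}

    def register(self, capability, service):
        self._services[capability] = service

    def discover(self, capability):
        """Look up a service at run time by capability name."""
        return self._services[capability]

registry = ServiceRegistry()
# Two toy data sources, registered independently of each other.
registry.register("expression-data", lambda sample_id: {"id": sample_id, "values": [1.2, 3.4]})
registry.register("annotation", lambda sample_id: {"id": sample_id, "gene": "TP53"})

# A client joins the two sources purely through the shared identity.
sample_id = "GSM-0001"
data = registry.discover("expression-data")(sample_id)
notes = registry.discover("annotation")(sample_id)
assert data["id"] == notes["id"]  # records correlated via the common ID
```

    Because callers depend only on capability names, a new data source or analysis tool can be added by registering it, without changing existing clients.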

  1. A New Network Modeling Tool for the Ground-based Nuclear Explosion Monitoring Community

    NASA Astrophysics Data System (ADS)

    Merchant, B. J.; Chael, E. P.; Young, C. J.

    2013-12-01

    Network simulations have long been used to assess the performance of monitoring networks to detect events, for such purposes as planning station deployments and evaluating network resilience to outages. The standard tool has been the SAIC-developed NetSim package. With correct parameters, NetSim can produce useful simulations; however, the package has several shortcomings: an older language (FORTRAN), an emphasis on seismic monitoring with limited support for other technologies, limited documentation, and a limited parameter set. Thus, we are developing NetMOD (Network Monitoring for Optimal Detection), a Java-based tool designed to assess the performance of ground-based networks. NetMOD's advantages include: it is coded in a modern, multi-platform language; it takes advantage of modern computing performance (e.g., multi-core processors); it incorporates monitoring technologies other than seismic; and it includes a well-validated default parameter set for the IMS stations. NetMOD is designed to be extendable through a plugin infrastructure, so new phenomenological models can be added. Development of the Seismic Detection Plugin is being pursued first. Seismic location and infrasound and hydroacoustic detection plugins will follow. By making NetMOD an open-release package, we hope to provide a common tool that the monitoring community can use to produce assessments of monitoring networks and to verify assessments made by others.
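    The core quantity such network simulations estimate can be sketched in a few lines: the probability that enough stations detect an event, given per-station detection probabilities. The function name, the independence assumption, and the example probabilities are illustrative and do not come from NetSim or NetMOD:

```python
def network_detection_prob(station_probs, min_stations=3):
    """Probability that at least `min_stations` stations detect an
    event, assuming independent per-station detection probabilities.
    Computed by dynamic programming over the stations."""
    dist = [1.0]  # dist[k] = P(exactly k detections among stations seen so far)
    for p in station_probs:
        new = [0.0] * (len(dist) + 1)
        for k, prob in enumerate(dist):
            new[k] += prob * (1 - p)   # this station misses the event
            new[k + 1] += prob * p     # this station detects the event
        dist = new
    return sum(dist[min_stations:])

# Four hypothetical stations; require detections at 3 or more of them.
p_detect = network_detection_prob([0.9, 0.8, 0.7, 0.6], min_stations=3)  # ≈ 0.7428
```

    Simulating a station outage is then just a matter of dropping that station's probability from the list and recomputing.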

  2. A Knowledge Portal and Collaboration Environment for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    D'Agnese, F. A.

    2008-12-01

    Earth Knowledge is developing a web-based 'Knowledge Portal and Collaboration Environment' that will serve as the information-technology-based foundation of a modular Internet-based Earth-Systems Monitoring, Analysis, and Management Tool. This 'Knowledge Portal' is essentially a 'mash-up' of web-based and client-based tools and services that support on-line collaboration, community discussion, and broad public dissemination of earth and environmental science information in a wide-area distributed network. In contrast to specialized knowledge-management or geographic-information systems developed for long-term and incremental scientific analysis, this system will exploit familiar software tools using industry standard protocols, formats, and APIs to discover, process, fuse, and visualize existing environmental datasets using Google Earth and Google Maps. An early form of these tools and services is being used by Earth Knowledge to facilitate the investigations and conversations of scientists, resource managers, and citizen-stakeholders addressing water resource sustainability issues in the Great Basin region of the desert southwestern United States. These ongoing projects will serve as use cases for the further development of this information-technology infrastructure. This 'Knowledge Portal' will accelerate the deployment of Earth-system data and information into an operational knowledge management system that may be used by decision-makers concerned with stewardship of water resources in the American Desert Southwest.

  3. The experiences of undergraduate nursing students with bots in Second Life®

    NASA Astrophysics Data System (ADS)

    Rose, Lesele H.

    As technology continues to transform education from the status quo of traditional lecture-style instruction to an interactive, engaging learning experience, students' experiences within the learning environment continue to change as well. This dissertation addressed the need for continuing research in advancing the implementation of technology in higher education. The purpose of this phenomenological study was to discover more about the experiences of undergraduate nursing students using standardized geriatric evaluation tools when interacting with scripted geriatric patient bots in a simulated instructional intake setting. Data were collected through a Demographics questionnaire, an Experiential questionnaire, and a Reflection questionnaire. Triangulation of data collection occurred through an automatically created log of the interactions with the two bots, and by an automatically recorded log of the participants' movements while in the simulated geriatric intake interview. The data analysis consisted of an iterative review of the questionnaires and the participants' logs in an effort to identify common themes, recurring comments, and issues which would benefit from further exploration. Findings revealed that the interactions with the bots were perceived as a valuable experience for the participants from the perspective of interacting with the Geriatric Evaluation Tools in the role of an intake nurse. Further research is indicated to explore instructional interactions with bots in effectively mastering the use of established Geriatric Evaluation Tools.

  4. The VeTOOLS Project: an example of how to strengthen collaboration between scientists and Civil Protections in disaster risk reduction

    NASA Astrophysics Data System (ADS)

    Marti, Joan; Bartolini, Stefania; Becerril, Laura

    2016-04-01

    VeTOOLS is a project funded by the European Commission's Humanitarian Aid and Civil Protection department (ECHO), and aims at creating an integrated software platform specially designed to assess and manage volcanic risk. The project facilitates interaction and cooperation between scientists and Civil Protection Agencies in order to share, unify, and exchange procedures, methodologies, and technologies to effectively reduce the impacts of volcanic disasters. The project aims at 1) improving and developing volcanic risk assessment and management capacities in active volcanic regions; 2) developing universal methodologies, scenario definitions, response strategies, and alert protocols to cope with the full range of volcanic threats; 3) improving quantitative methods and tools for vulnerability and risk assessment; and 4) defining thresholds and protocols for civil protection. With these objectives, the VeTOOLS project addresses two of the Sendai Framework resolutions: i) provide guidance on methodologies and standards for risk assessments, disaster risk modelling and the use of data; and ii) promote and support the availability and application of science and technology to decision-making. It thereby offers a good example of how close collaboration between science and civil protection can contribute effectively to disaster risk reduction (DRR). European Commission ECHO Grant SI2.695524

  5. Image-enhanced endoscopy with I-scan technology for the evaluation of duodenal villous patterns.

    PubMed

    Cammarota, Giovanni; Ianiro, Gianluca; Sparano, Lucia; La Mura, Rossella; Ricci, Riccardo; Larocca, Luigi M; Landolfi, Raffaele; Gasbarrini, Antonio

    2013-05-01

    I-scan technology is a newly developed endoscopic tool that works in real time and utilizes a digital contrast method to enhance the endoscopic image. We performed a feasibility study aimed at determining the diagnostic accuracy of i-scan technology for the evaluation of duodenal villous patterns, with histology as the reference standard. In this prospective, single-center, open study, patients undergoing upper endoscopy for a histological evaluation of the duodenal mucosa were enrolled. All patients underwent upper endoscopy using high-resolution view in association with i-scan technology. During endoscopy, duodenal villous patterns were evaluated and classified as normal, partial villous atrophy, or marked villous atrophy. Results were then compared with histology. One hundred fifteen subjects were recruited in this study. The endoscopist was able to find marked villous atrophy of the duodenum in 12 subjects, partial villous atrophy in 25, and normal villi in the remaining 78 individuals. The i-scan system demonstrated complete accuracy (100%) in the detection of marked villous atrophy patterns. I-scan technology showed somewhat lower accuracy in determining partial villous atrophy or normal villous patterns (90% for both). Image-enhancing endoscopic technology allows a clear visualization of villous patterns in the duodenum. By switching from the standard to the i-scan view, it is possible to optimize the accuracy of endoscopy in recognizing villous alterations in subjects undergoing endoscopic evaluation.

  6. Information technology principles for management, reporting, and research.

    PubMed

    Gillam, Michael; Rothenhaus, Todd; Smith, Vernon; Kanhouwa, Meera

    2004-11-01

    Information technology holds the promise to enhance the ability of individuals and organizations to manage emergency departments, improve data sharing and reporting, and facilitate research. The Society for Academic Emergency Medicine (SAEM) Consensus Committee has identified nine principles to outline a path of optimal features and designs for current and future information technology systems. The principles roughly summarized include the following: utilize open database standards with clear data dictionaries, provide administrative access to necessary data, appoint and recognize individuals with emergency department informatics expertise, allow automated alert and proper identification for enrollment of cases into research, provide visual and statistical tools and training to analyze data, embed automated configurable alarm functionality for clinical and nonclinical systems, allow multiexport standard and format configurable reporting, strategically acquire mission-critical equipment that is networked and capable of automated feedback regarding functional status and location, and dedicate resources toward informatics research and development. The SAEM Consensus Committee concludes that the diligent application of these principles will enhance emergency department management, reporting, and research and ultimately improve the quality of delivered health care.

  7. Using CCSDS Standards to Reduce Mission Costs

    NASA Technical Reports Server (NTRS)

    Wilmot, Jonathan

    2017-01-01

    NASA's open source Core Flight System (cFS) software framework has been using several Consultative Committee for Space Data Systems (CCSDS) standards since its inception. Recently developed CCSDS standards are now being applied by NASA, ESA, and other organizations to streamline and automate aspects of mission development, test, and operations, accelerating mission schedules and reducing mission costs. This paper will present the new CCSDS Spacecraft Onboard Interfaces Services (SOIS) Electronic Data Sheet (EDS) standards and show how they are being applied to data interfaces in the cFS software framework, tool chain, and ground systems across a range of missions at NASA. Although NASA is focusing on the cFS, it is expected that these technologies are well suited for use in other system architectures and can lower costs for a wide range of both large and small satellites.

  8. Indexing method of digital audiovisual medical resources with semantic Web integration.

    PubMed

    Cuggia, Marc; Mougin, Fleur; Le Beux, Pierre

    2005-03-01

    Digitalization of audiovisual resources and network capability offer many possibilities which are the subject of intensive work in scientific and industrial sectors. Indexing such resources is a major challenge. Recently, the Motion Pictures Expert Group (MPEG) has developed MPEG-7, a standard for describing multimedia content. The goal of this standard is to develop a rich set of standardized tools to enable efficient retrieval from digital archives or the filtering of audiovisual broadcasts on the Internet. How could this kind of technology be used in the medical context? In this paper, we propose a simpler indexing system, based on the Dublin Core standard and compliant to MPEG-7. We use MeSH and the UMLS to introduce conceptual navigation. We also present a video-platform which enables encoding and gives access to audiovisual resources in streaming mode.
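    A Dublin Core record of the kind proposed could be assembled as sketched below; the helper function, element choices, and sample MeSH headings are illustrative assumptions, not the authors' actual schema:

```python
import xml.etree.ElementTree as ET

# Dublin Core element namespace (the standard DCMES 1.1 URI).
DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC)

def dublin_core_record(title, creator, subjects, fmt="video/mp4"):
    """Build a minimal Dublin Core description of an audiovisual
    resource; `subjects` would carry MeSH headings to support the
    conceptual navigation described in the abstract."""
    record = ET.Element("record")
    ET.SubElement(record, f"{{{DC}}}title").text = title
    ET.SubElement(record, f"{{{DC}}}creator").text = creator
    ET.SubElement(record, f"{{{DC}}}format").text = fmt
    for term in subjects:
        ET.SubElement(record, f"{{{DC}}}subject").text = term
    return record

# Hypothetical medical teaching video described with MeSH subjects.
rec = dublin_core_record(
    "Laparoscopic cholecystectomy, teaching recording",
    "University Hospital",
    ["Cholecystectomy, Laparoscopic", "Education, Medical"],
)
xml = ET.tostring(rec, encoding="unicode")
```

    Keeping the descriptors in a controlled vocabulary such as MeSH is what makes UMLS-based conceptual navigation across records possible.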

  9. High-resolution PET [Positron Emission Tomography] for Medical Science Studies

    DOE R&D Accomplishments Database

    Budinger, T. F.; Derenzo, S. E.; Huesman, R. H.; Jagust, W. J.; Valk, P. E.

    1989-09-01

    One of the unexpected fruits of basic physics research and the computer revolution is the noninvasive imaging power available to today's physician. Technologies that were strictly the province of research scientists only a decade or two ago now serve as the foundations for such standard diagnostic tools as x-ray computed tomography (CT), magnetic resonance imaging (MRI), magnetic resonance spectroscopy (MRS), ultrasound, single photon emission computed tomography (SPECT), and positron emission tomography (PET). Furthermore, prompted by the needs of both the practicing physician and the clinical researcher, efforts to improve these technologies continue. This booklet endeavors to describe the advantages of achieving high resolution in PET imaging.

  10. Community health workers and mobile technology: a systematic review of the literature.

    PubMed

    Braun, Rebecca; Catalani, Caricia; Wimbush, Julian; Israelski, Dennis

    2013-01-01

    In low-resource settings, community health workers are frontline providers who shoulder the health service delivery burden. Increasingly, mobile technologies are developed, tested, and deployed with community health workers to facilitate tasks and improve outcomes. We reviewed the evidence for the use of mobile technology by community health workers to identify opportunities and challenges for strengthening health systems in resource-constrained settings. We conducted a systematic review of peer-reviewed literature from health, medical, social science, and engineering databases, using PRISMA guidelines. We identified a total of 25 unique full-text research articles on community health workers and their use of mobile technology for the delivery of health services. Community health workers have used mobile tools to advance a broad range of health aims throughout the globe, particularly maternal and child health, HIV/AIDS, and sexual and reproductive health. Most commonly, community health workers use mobile technology to collect field-based health data, receive alerts and reminders, facilitate health education sessions, and conduct person-to-person communication. Programmatic efforts to strengthen health service delivery focus on improving adherence to standards and guidelines, community education and training, and programmatic leadership and management practices. Those studies that evaluated program outcomes provided some evidence that mobile tools help community health workers to improve the quality of care provided, efficiency of services, and capacity for program monitoring. Evidence suggests mobile technology presents promising opportunities to improve the range and quality of services provided by community health workers. 
Small-scale efforts, pilot projects, and preliminary descriptive studies are increasing, and there is a trend toward using feasible and acceptable interventions that lead to positive program outcomes through operational improvements and rigorous study designs. Programmatic and scientific gaps will need to be addressed by global leaders as they advance the use and assessment of mobile technology tools for community health workers.

  11. An approach for software-driven and standard-based support of cross-enterprise tumor boards.

    PubMed

    Mangesius, Patrick; Fischer, Bernd; Schabetsberger, Thomas

    2015-01-01

    For tumor boards, the networking of different medical disciplines' expertise continues to gain importance. However, interdisciplinary tumor boards spread across several institutions are rarely supported by information technology tools today. The aim of this paper is to outline an approach for a tumor board management system prototype. For analyzing the requirements, an incremental process was used. The requirements were surveyed using Informal Conversational Interview and documented with Use Case Diagrams defined by the Unified Modeling Language (UML). Analyses of current EHR standards were conducted to evaluate technical requirements. Functional and technical requirements of clinical conference applications were evaluated and documented. In several steps, workflows were derived and application mockups were created. Although there is a vast amount of common understanding concerning how clinical conferences should be conducted and how their workflows should be structured, these are hardly standardized, neither on a functional nor on a technical level. This results in drawbacks for participants and patients. Using modern EHR technologies based on profiles such as IHE Cross Enterprise Document Sharing (XDS), these deficits could be overcome.

  12. The AmeriFlux Data Activity and Data System: An Evolving Collection of Data Management Techniques, Tools, Products and Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boden, Thomas A; Krassovski, Misha B; Yang, Bai

    2013-01-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the U.S. Department of Energy and international climate change science since 1982. Over this period, climate change science has expanded from research focusing on basic understanding of geochemical cycles, particularly the carbon cycle, to integrated research addressing climate change impacts, vulnerability, adaptation, and mitigation. Interests in climate change data and information worldwide have grown remarkably and, as a result, so have demands and expectations for CDIAC's data systems. To meet the growing demands, CDIAC's strategy has been to design flexible data systems using proven technologies blended with new, evolving technologies and standards. CDIAC development teams are multidisciplinary and include computer science and information technology expertise, but also scientific expertise necessary to address data quality and documentation issues and to identify data products and system capabilities needed by climate change scientists. CDIAC has learned there is rarely a single commercial tool or product readily available to satisfy long-term scientific data system requirements (i.e., one size does not fit all, and the breadth and diversity of environmental data are often too complex for easy use with commercial products) and typically deploys a variety of tools and data products in an effort to provide credible data freely to users worldwide. Like many scientific data management applications, CDIAC's data systems are highly customized to satisfy specific scientific usage requirements (e.g., developing data products specific for model use) but are also designed to be flexible and interoperable to take advantage of new software engineering techniques, standards (e.g., metadata standards) and tools and to support future Earth system data efforts (e.g., ocean acidification).
CDIAC has provided data management support for numerous long-term measurement projects crucial to climate change science. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in North, Central, and South America and offers important insight about carbon cycling in terrestrial ecosystems. We share our approaches in satisfying the challenges of delivering AmeriFlux data worldwide to benefit others with similar challenges handling climate change data, further heighten awareness and use of an outstanding ecological data resource, and highlight expanded software engineering applications being used for climate change measurement data.

  13. Information Technology and Community Restoration Studies/Task 1: Information Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upton, Jaki F.; Lesperance, Ann M.; Stein, Steven L.

    2009-11-19

    Executive Summary The Interagency Biological Restoration Demonstration—a program jointly funded by the Department of Defense's Defense Threat Reduction Agency and the Department of Homeland Security's (DHS's) Science and Technology Directorate—is developing policies, methods, plans, and applied technologies to restore large urban areas, critical infrastructures, and Department of Defense installations following the intentional release of a biological agent (anthrax) by terrorists. There is a perception that there should be a common system that can share information both vertically and horizontally amongst participating organizations as well as support analyses. A key question is: "How far away from this are we?" As part of this program, Pacific Northwest National Laboratory conducted research to identify the current information technology tools that would be used by organizations in the greater Seattle urban area in such a scenario, to define criteria for use in evaluating information technology tools, and to identify current gaps. Researchers interviewed 28 individuals representing 25 agencies in civilian and military organizations to identify the tools they currently use to capture data needed to support operations and decision making. The organizations can be grouped into five broad categories: defense (Department of Defense), environmental/ecological (Environmental Protection Agency/Ecology), public health and medical services, emergency management, and critical infrastructure. The types of information that would be communicated in a biological terrorism incident include critical infrastructure and resource status, safety and protection information, laboratory test results, and general emergency information.
The most commonly used tools are WebEOC (web-enabled crisis information management systems with real-time information sharing), mass notification software, resource tracking software, and NW WARN (web-based information to protect critical infrastructure systems). It appears that the current information management tools are used primarily for information gathering and sharing - not decision making. Respondents identified the following criteria for a future software system: it is easy to learn, updates information in real time, works with all agencies, is secure, uses a visualization or geographic information system feature, enables varying permission levels, flows information from one stage to another, works with other databases, feeds decision support tools, is compliant with appropriate standards, and is reasonably priced. Current tools have security issues, lack visual/mapping functions and critical infrastructure status, and do not integrate with other tools. It is clear that there is a need for an integrated, common operating system. The system would need to be accessible by all the organizations that would have a role in managing an anthrax incident to enable regional decision making. The most useful tool would feature a GIS visualization that would allow for a common operating picture that is updated in real time. To capitalize on information gained from the interviews, the following activities are recommended:
• Rate emergency management decision tools against the criteria specified by the interviewees.
• Identify and analyze other current activities focused on information sharing in the greater Seattle urban area.
• Identify and analyze information sharing systems/tools used in other regions.

  14. Staying connected: online education engagement and retention using educational technology tools.

    PubMed

    Salazar, Jose

    2010-01-01

    The objective of this article is to inform educators about the use of currently available educational technology tools to promote student retention, engagement and interaction in online courses. Educational technology tools include content management systems, podcasts, video lecture capture technology and electronic discussion boards. Successful use of educational technology tools requires planning, organization and use of effective learning strategies.

  15. Evaluating Spatiotemporal Image Correlation Technology as a Tool for Training Nonexpert Sonographers to Perform Examinations of the Fetal Heart.

    PubMed

    Avnet, Hagai; Mazaaki, Eyal; Shen, Ori; Cohen, Sarah; Yagel, Simcha

    2016-01-01

    We aimed to evaluate the use of spatiotemporal image correlation (STIC) as a tool for training nonexpert examiners to perform screening examinations of the fetal heart by acquiring and examining STIC volumes according to a standardized questionnaire based on the 5 transverse planes of the fetal heart. We conducted a prospective study at 2 tertiary care centers. Two sonographers without formal training in fetal echocardiography received theoretical instruction on the 5 fetal echocardiographic transverse planes, as well as STIC technology. Only women with conditions allowing 4-dimensional STIC volume acquisitions (grayscale and Doppler) were included in the study. Acquired volumes were evaluated offline according to a standardized protocol that required the trainee to mark 30 specified structures on 5 required axial planes. Volumes were then reviewed by an expert examiner for quality of acquisition and correct identification of specified structures. Ninety-six of 112 pregnant women examined entered the study. Patients had singleton pregnancies between 20 and 32 weeks' gestation. After an initial learning curve of 20 examinations, trainees succeeded in identifying 97% to 98% of structures, with a highly significant degree of agreement with the expert's analysis (P < .001). A median of 2 STIC volumes for each examination was necessary for maximal structure identification. Acquisition quality scores were high (8.6-8.7 of a maximal score of 10) and were found to correlate with identification rates (P = .017). After an initial learning curve and under expert guidance, STIC is an excellent tool for trainees to master extended screening examinations of the fetal heart.

  16. Freva - Freie Univ Evaluation System Framework for Scientific Infrastructures in Earth System Modeling

    NASA Astrophysics Data System (ADS)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Schartner, Thomas; Kirchner, Ingo; Rust, Henning W.; Cubasch, Ulrich; Ulbrich, Uwe

    2016-04-01

    The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science. Freva runs on high performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the meta data information of the self-describing model, reanalysis and observational data sets in a database. This implemented meta data system with its advanced but easy-to-handle search tool supports users, developers and their plugins to retrieve the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitation of the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. The integrated web-shell (shellinabox) adds a degree of freedom in the choice of the working environment and can be used as a gate to the research project's HPC. Plugins can integrate their results (e.g., post-processed data) into the user's database. This allows, for example, post-processing plugins to feed statistical analysis plugins, which fosters an active exchange between plugin developers of a research project. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a database.
Configurations and results of the tools can be shared among scientists via the shell or web system, so plugged-in tools benefit from transparency and reproducibility. Furthermore, if the configuration of a newly started evaluation plugin matches an earlier run, the system suggests reusing the results already produced by other users, saving CPU hours, I/O, disk space, and time. This efficient interaction between different technologies improves the Earth system modeling science framed by Freva.
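The result-reuse idea described in this record can be illustrated with a minimal sketch: fingerprint each plugin configuration deterministically, and return a stored result when an identical configuration has already been evaluated. All names here (`ResultCache`, `run_plugin`) are hypothetical illustrations, not Freva's actual API.

```python
import hashlib
import json

class ResultCache:
    """Stores analysis results keyed by a configuration fingerprint."""

    def __init__(self):
        self._store = {}  # fingerprint -> stored result

    @staticmethod
    def fingerprint(plugin_name, config):
        # Serialize the configuration deterministically so that logically
        # equal configs always hash to the same key.
        payload = json.dumps([plugin_name, config], sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    def lookup(self, plugin_name, config):
        return self._store.get(self.fingerprint(plugin_name, config))

    def record(self, plugin_name, config, result):
        self._store[self.fingerprint(plugin_name, config)] = result


def run_plugin(cache, plugin_name, config, compute):
    """Run compute(config) unless an identical run is already cached."""
    cached = cache.lookup(plugin_name, config)
    if cached is not None:
        return cached, True   # reuse the earlier result, saving CPU and I/O
    result = compute(config)
    cache.record(plugin_name, config, result)
    return result, False
```

Because the fingerprint sorts dictionary keys before hashing, two users who supply the same configuration in a different key order still match the same cached result.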

  17. Teaching chemistry and other sciences to blind and low-vision students through hands-on learning experiences in high school science laboratories

    NASA Astrophysics Data System (ADS)

    Supalo, Cary Alan

    2010-11-01

    Students with blindness and low vision (BLV) have traditionally been underrepresented in the sciences as a result of technological and attitudinal barriers to equal access in science laboratory classrooms. The Independent Laboratory Access for the Blind (ILAB) project developed and evaluated a suite of talking and audible hardware/software tools to empower students with BLV to have multisensory, hands-on laboratory learning experiences. This dissertation focuses on the first year of ILAB tool testing in mainstream science laboratory classrooms, and comprises a detailed multi-case study of four students with BLV who were enrolled in high school science classes during 2007--08 alongside sighted students. Participants attended different schools; curricula included chemistry, AP chemistry, and AP physics. The ILAB tools were designed to provide multisensory means for students with BLV to make observations and collect data during standard laboratory lessons on an equivalent basis with their sighted peers. Various qualitative and quantitative data collection instruments were used to determine whether the hands-on experiences facilitated by the ILAB tools had led to increased involvement in laboratory-goal-directed actions, greater peer acceptance in the students' lab groups, improved attitudes toward science, and increased interest in science. Premier among the ILAB tools was the JAWS/Logger Pro software interface, which made audible all information gathered through standard Vernier laboratory probes and visually displayed through Logger Pro. ILAB tools also included a talking balance, a submersible audible light sensor, a scientific talking stopwatch, and a variety of other high-tech and low-tech devices and techniques. While results were mixed, all four participating BLV students seemed to have experienced at least some benefit, with the effect being stronger for some than for others. 
Not all of the data collection instruments were found to reveal improvements for all of the participating students, but each of the types of data sets provided evidence of benefit for varying subgroups of participants. It is the expectation of the ILAB team that continuing to implement adaptive/assistive technologies for BLV students in science laboratory classrooms will foster enhanced opportunities in science classes and professions.

  18. The National Institute of Justice's Technology Efforts to Meet the Evolving Needs of the Responder Community

    NASA Astrophysics Data System (ADS)

    Boyd, D.

    2002-05-01

    The National Institute of Justice (NIJ) is the research arm of the Department of Justice. Through its Office of Science & Technology (OS&T), NIJ has actively pursued the development of better tools for public safety agencies to combat terrorism since 1997, when, pursuant to the Antiterrorism and Effective Death Penalty Act of 1996 (P.L. 104-132), it began developing technology to better enable law enforcement agencies to combat terrorism. NIJ quickly realized that effectively combating terrorism required a multidisciplinary, multi-agency response. Additionally, it came to understand that, as noted by the Gilmore Commission, the best way to prepare the responder community to deal with the consequences of terrorist incidents was to ``emphasize programs and initiatives that build appropriately on existing State and local capabilities for other emergencies and disasters.'' For example, an effective critical incident management system is just as important to the ability to deal with a terrorist attack, such as occurred at the World Trade Center, as with a major natural disaster or the crash of a commercial airliner or passenger train. Consequently, NIJ's efforts have evolved to focus on the responder community's common, unaddressed needs for better tools to deal with critical incidents. The Institute's efforts focus on five technology areas: infrastructure security; personnel location; explosives detection and remediation; communications and information technology; and training and the development of standards.

  19. Critical issues in medical education and the implications for telemedicine technology.

    PubMed

    Mahapatra, Ashok Kumar; Mishra, Saroj Kanta; Kapoor, Lily; Singh, Indra Pratap

    2009-01-01

    Ensuring quality medical education in all medical colleges across India, based on a uniform curriculum prescribed by a regulatory body, and maintaining a uniform standard depend on the availability of excellent infrastructure. Such infrastructure includes qualified teachers, knowledge resources, learning materials, and advanced education technology, which is a challenge in developing countries due to financial and logistic constraints. Advances in telecommunication, information science, and technology provide an opportunity to exchange knowledge and skills across geographically dispersed organizations by networking academic medical centers of excellence with medical colleges and institutes to practice distance learning using information and communication technology (ICT)-based tools. These may be as basic as commonly used Web-based tools or as advanced as virtual reality, simulation, and telepresence-based collaborative learning environments. The scenario in India is no different from that of any developing country, but there is considerable progress due to technical advancement in these sectors. Telemedicine and tele-education in health science are gradually being adopted into the Indian health system after decade-long pilot studies across the country. A recent recommendation of the National Knowledge Commission, once implemented, would ensure a gigabit network across all educational institutions of the country, including medical colleges. The availability of indigenous satellite communication technology and the government policy of providing free bandwidth for the societal development sector have strengthened the case for setting up infrastructure to pilot several telemedicine education projects across the country.

  20. Review of electronic decision-support tools for diabetes care: a viable option for low- and middle-income countries?

    PubMed

    Ali, Mohammed K; Shah, Seema; Tandon, Nikhil

    2011-05-01

    Diabetes care is complex, requiring motivated patients, providers, and systems that enable guideline-based preventative care processes, intensive risk-factor control, and positive lifestyle choices. However, care delivery in low- and middle-income countries (LMIC) is hindered by a compendium of systemic and personal factors. While electronic medical records (EMR) and computerized clinical decision-support systems (CDSS) have held great promise as interventions that will overcome system-level challenges to improving evidence-based health care delivery, evaluation of these quality improvement interventions for diabetes care in LMICs is lacking. OBJECTIVE AND DATA SOURCES: We reviewed the published medical literature (systematic search of MEDLINE database supplemented by manual searches) to assess the quantifiable and qualitative impacts of combined EMR-CDSS tools on physician performance and patient outcomes and their applicability in LMICs. Inclusion criteria prespecified the population (type 1 or 2 diabetes patients), intervention (clinical EMR-CDSS tools with enhanced functionalities), and outcomes (any process, self-care, or patient-level data) of interest. Case, review, or methods reports and studies focused on nondiabetes, nonclinical, or in-patient uses of EMR-CDSS were excluded. Quantitative and qualitative data were extracted from studies by separate single reviewers, respectively, and relevant data were synthesized. Thirty-three studies met inclusion criteria, originating exclusively from high-income country settings. Among predominantly experimental study designs, process improvements were consistently observed along with small, variable improvements in risk-factor control, compared with baseline and/or control groups (where applicable). 
Intervention benefits varied by baseline patient characteristics, features of the EMR-CDSS interventions, motivation and access to technology among patients and providers, and whether EMR-CDSS tools were combined with other quality improvement strategies (e.g., workflow changes, case managers, algorithms, incentives). Patients shared experiences of feeling empowered and benefiting from increased provider attention and feedback but also frustration with technical difficulties of EMR-CDSS tools. Providers reported more efficient and standardized processes plus continuity of care but also role tensions and "mechanization" of care. This narrative review supports EMR-CDSS tools as innovative conduits for structuring and standardizing care processes but also highlights setting and selection limitations of the evidence reviewed. In the context of limited resources, individual economic hardships, and lack of structured systems or trained human capital, this review reinforces the need for well-designed investigations evaluating the role and feasibility of technological interventions (customized to each LMIC's locality) in clinical decision making for diabetes care. © 2011 Diabetes Technology Society.

  1. Effectiveness of a Technology-Based Intervention to Teach Evidence-Based Practice: The EBR Tool.

    PubMed

    Long, JoAnn D; Gannaway, Paula; Ford, Cindy; Doumit, Rita; Zeeni, Nadine; Sukkarieh-Haraty, Ola; Milane, Aline; Byers, Beverly; Harrison, LaNell; Hatch, Daniel; Brown, Justin; Proper, Sharlan; White, Patricia; Song, Huaxin

    2016-02-01

    As the world becomes increasingly digital, advances in technology have changed how students access evidence-based information. Research suggests that students overestimate their ability to locate quality online research and lack the skills needed to evaluate the scientific literature. Clinical nurses report relying on personal experience to answer clinical questions rather than searching evidence-based sources. To address the problem, a web-based, evidence-based research (EBR) tool that is usable from a computer, smartphone, or iPad was developed and tested. The purpose of the EBR tool is to guide students through the basic steps needed to locate and critically appraise the online scientific literature while linking users to quality electronic resources to support evidence-based practice (EBP). The tool was tested in a mixed-method design combining a quasi-experimental study and a two-population randomized controlled trial (RCT) at universities in the U.S. and the Middle East. A statistically significant improvement in overall research skills was found in the quasi-experimental nursing student group and the RCT nutrition student group using the EBR tool. A statistically significant proportional difference was found in the RCT nutrition and PharmD intervention groups in participants' ability to distinguish the credibility of online source materials compared with controls. The majority of participants could correctly apply PICOTS to a case study when using the tool. The data from this preliminary study suggest that the EBR tool enhanced students' overall research skills and selected EBP skills while generating data for assessment of learning outcomes. The EBR tool places evidence-based resources at the fingertips of users by addressing some of the most commonly cited barriers to research utilization while exposing users to information and online literacy standards of practice, meeting a growing need within nursing curricula. © 2016 Sigma Theta Tau International.

  2. SOLE: Applying Semantics and Social Web to Support Technology Enhanced Learning in Software Engineering

    NASA Astrophysics Data System (ADS)

    Colomo-Palacios, Ricardo; Jiménez-López, Diego; García-Crespo, Ángel; Blanco-Iglesias, Borja

    eLearning educative processes are a challenge for educational institutions and education professionals. In an environment in which learning resources are being produced, catalogued, and stored in innovative ways, SOLE provides a platform in which exam questions can be produced with the support of Web 2.0 tools, catalogued and labeled via the semantic web, and stored and distributed using eLearning standards. This paper presents SOLE, a social network for sharing exam questions, particularized for the Software Engineering domain, based on semantics and built using semantic web and eLearning standards such as the IMS Question and Test Interoperability specification 2.1.

  3. Study on establishment of Body of Knowledge of Taiwan's Traditional Wooden Structure Technology

    NASA Astrophysics Data System (ADS)

    Huang, M. T.; Chiou, S. C.; Hsu, T. W.; Su, P. C.

    2015-08-01

    The timber technology of Taiwanese traditional architecture was brought by early immigrants from Southern Fujian, China, and has been passed down for over a hundred years. In the past, these traditional timber skills were taught through apprenticeship; however, as society changed, new construction of traditional architecture faded away and was gradually replaced by restoration work. As a result, the construction methods, the use of tools, and other aspects of the timber technology now differ considerably from earlier practice, and the core technology faces the risk of being lost. Many studies have addressed architectural style, construction methods, schools of craftsmanship, and craftsmen's technical capacity, or have preserved the technology through oral histories and skill studies; yet for the timber craftsmen carrying out restoration work on the front line, whether the original construction methods and the required repair quality of the core technology can be maintained remains an open question. 
This paper classified timber technology knowledge using document analysis and expert interviews, analysed its hierarchical structure, and built a preliminary framework for the timber technology knowledge system of Taiwanese traditional architecture. Based on this knowledge system, standards for craftsman training and skills certification were formulated, so that changes in the system of knowledge transmission do not degrade craftsmen's technical capacity and, in turn, the repair quality of traditional architecture. In addition, a database system can be derived from the knowledge structure to keep the content describing core technical capacity consistent. This content can serve as interpretive material; standardizing the knowledge and establishing authority files provides a technical specification, so that the technology itself is standardized and loss or distortion is avoided.

  4. The Joy of Playing with Oceanographic Data

    NASA Astrophysics Data System (ADS)

    Smith, A. T.; Xing, Z.; Armstrong, E. M.; Thompson, C. K.; Huang, T.

    2013-12-01

    The web is no longer just an afterthought. It is no longer just a presentation layer filled with HTML, CSS, JavaScript, frameworks, 3D, and more. It has become the medium of our communication. It is the database of all databases. It is the computing platform of all platforms. It has transformed the way we do science. Web services are the de facto method for communication between machines over the web, and Representational State Transfer (REST) has standardized the way we architect services and their interfaces. In the Earth Science domain, we are familiar with tools and services such as the Open-source Project for a Network Data Access Protocol (OPeNDAP), Thematic Realtime Environmental Distributed Data Services (THREDDS), and the Live Access Server (LAS). We are also familiar with various data formats such as NetCDF3/4, HDF4/5, GRIB, TIFF, etc. One of the challenges for the Earth Science community is accessing information within these data. There are community-accepted readers that our users can download and install; however, the Application Programming Interface (API) differs between these readers and is not standardized, which leads to non-portable applications. Webification (w10n) is an emerging technology, developed at the Jet Propulsion Laboratory, which exploits the hierarchical nature of a science data artifact (e.g., a granule file) to assign a URL to each element within the artifact. By embracing standards such as JSON, XML, and HTML5, together with predictable URLs, w10n provides a simple interface that enables tool builders and researchers to develop portable tools and applications to interact with artifacts of various formats. The NASA Physical Oceanography Distributed Active Archive Center (PO.DAAC) is the designated data center for observational products relevant to the physical state of the ocean.
Over the past year PO.DAAC has been evaluating w10n technology by webifying its archive holdings to provide simplified access to oceanographic science artifacts and as a service to enable future tools and services development. In this talk, we will focus on a w10n-based system called Distributed Oceanographic Webification Service (DOWS) being developed at PO.DAAC to provide a newer and simpler method for working with observational data artifacts. As a continued effort at PO.DAAC to provide better tools and services to visualize our data, the talk will discuss the latest in web-based data visualization tools/frameworks (such as d3.js, Three.js, Leaflet.js, and more) and techniques for working with webified oceanographic science data in both a 2D and 3D web approach.
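The core webification idea described in this record can be sketched in a few lines: walk a hierarchical data artifact and assign every group and attribute its own predictable URL path. This is an illustration of the concept only, not JPL's actual w10n implementation; the nested-dict "granule" and the `webify` function are invented for the example.

```python
def webify(node, base=""):
    """Map a nested-dict 'artifact' to {url: leaf value or None}."""
    urls = {}
    for name, child in node.items():
        path = f"{base}/{name}"
        if isinstance(child, dict):
            urls[path] = None          # inner node: listable, like a directory
            urls.update(webify(child, path))
        else:
            urls[path] = child         # leaf: a directly addressable value
    return urls

# A toy stand-in for a granule file's internal hierarchy.
granule = {
    "sea_surface_temp": {"units": "kelvin", "data": [291.2, 291.5]},
    "lat": {"units": "degrees_north", "data": [10.0, 10.25]},
}
urls = webify(granule)
# urls["/sea_surface_temp/units"] == "kelvin"
```

In a real w10n service each such path would be served over HTTP, with inner nodes returning JSON listings of their children and leaves returning the addressed values.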

  5. Advancing perinatal patient safety through application of safety science principles using health IT.

    PubMed

    Webb, Jennifer; Sorensen, Asta; Sommerness, Samantha; Lasater, Beth; Mistry, Kamila; Kahwati, Leila

    2017-12-19

    The use of health information technology (IT) has been shown to promote patient safety in Labor and Delivery (L&D) units. The use of health IT to apply safety science principles (e.g., standardization) to L&D unit processes may further advance perinatal safety. Semi-structured interviews were conducted with L&D units participating in the Agency for Healthcare Research and Quality's (AHRQ's) Safety Program for Perinatal Care (SPPC) to assess units' experience with program implementation. Analysis of interview transcripts was used to characterize the process and experience of using health IT for applying safety science principles to L&D unit processes. Forty-six L&D units from 10 states completed participation in SPPC program implementation; thirty-two (70%) reported the use of health IT as an enabling strategy for their local implementation. Health IT was used to improve standardization of processes, use of independent checks, and to facilitate learning from defects. L&D units standardized care processes through use of electronic health record (EHR)-based order sets and use of smart pumps and other technology to improve medication safety. Units also standardized EHR documentation, particularly related to electronic fetal monitoring (EFM) and shoulder dystocia. Cognitive aids and tools were integrated into EHR and care workflows to create independent checks such as checklists, risk assessments, and communication handoff tools. Units also used data from EHRs to monitor processes of care to learn from defects. Units experienced several challenges incorporating health IT, including obtaining organization approval, working with their busy IT departments, and retrieving standardized data from health IT systems. Use of health IT played an integral part in the planning and implementation of SPPC for participating L&D units. 
Use of health IT is an encouraging approach for incorporating safety science principles into care to improve perinatal safety and should be incorporated into materials to facilitate the implementation of perinatal safety initiatives.

  6. Integrated System Health Management: Foundational Concepts, Approach, and Implementation

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando

    2009-01-01

    A sound basis to guide the community in the conception and implementation of ISHM (Integrated System Health Management) capability in operational systems was provided. The concept of an "ISHM Model of a System" and a related architecture, defined as a unique Data, Information, and Knowledge (DIaK) architecture, were described. The ISHM architecture is independent of the typical system architecture, which is based on grouping physical elements that are assembled to make up a subsystem, with subsystems combining to form systems, etc. It was emphasized that ISHM capability needs to be implemented first at a low functional capability level (FCL), i.e., with limited ability to detect anomalies, diagnose, determine consequences, etc. As algorithms and tools to augment or improve the FCL are identified, they should be incorporated into the system. This means that the architecture, DIaK management, and software must be modular and standards-based, in order to enable systematic augmentation of the FCL (no ad-hoc modifications). A set of technologies (and tools) needed to implement ISHM was described. One essential tool is a software environment to create the ISHM Model. The software environment encapsulates DIaK and an infrastructure to focus DIaK on determining health (detecting anomalies, determining causes, determining effects, and providing integrated awareness of the system to the operator). The environment includes gateways to communicate in accordance with standards, especially the IEEE 1451.1 Standard for Smart Sensors and Actuators.
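The modular augmentation of the functional capability level described above can be sketched as a registry of interchangeable anomaly detectors: new algorithms are added without modifying existing ones. The class and detector names below are hypothetical illustrations, not part of the ISHM toolset's actual API.

```python
class HealthMonitor:
    """Minimal sketch: detection algorithms registered as plugins."""

    def __init__(self):
        self._detectors = []

    def register(self, name, detector):
        # Each new algorithm raises the functional capability level
        # without touching the detectors already in place.
        self._detectors.append((name, detector))

    def assess(self, measurement):
        """Return the names of all detectors flagging an anomaly."""
        return [name for name, det in self._detectors if det(measurement)]

# Two invented detectors for a sensor whose nominal range is 0-100.
monitor = HealthMonitor()
monitor.register("range-check", lambda x: not (0.0 <= x <= 100.0))
monitor.register("spike-check", lambda x: x > 90.0)

monitor.assess(50.0)   # no anomaly flagged
monitor.assess(120.0)  # both detectors flag
```

Keeping the detector interface uniform is what allows the systematic, non-ad-hoc augmentation the record emphasizes.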

  7. NASA Briefing for Unidata

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher

    2016-01-01

    The NASA representative to the Unidata Strategic Committee presented a semiannual update on NASAs work with and use of Unidata technologies. The talk covered the program of cloud computing prototypes being undertaken for the Earth Observing System Data and Information System (EOSDIS). Also discussed were dataset interoperability recommendations ratified via the EOSDIS Standards Office and the HDF Product Designer tool with respect to its possible applicability to data in network Common Data Form (NetCDF) version 4.

  8. Promoting scientific collaboration and research through integrated social networking capabilities within the OpenTopography Portal

    NASA Astrophysics Data System (ADS)

    Nandigam, V.; Crosby, C. J.; Baru, C.

    2009-04-01

    LiDAR (Light Detection And Ranging) topography data offer earth scientists the opportunity to study the earth's surface at very high resolutions. As a result, the popularity of these data is growing dramatically. However, the management, distribution, and analysis of community LiDAR data sets is a challenge due to their massive size (multi-billion point, multi-terabyte). We have also found that many earth science users of these data sets lack the computing resources and expertise required to process them. We have developed the OpenTopography Portal to democratize access to these large and computationally challenging data sets. The OpenTopography Portal uses cyberinfrastructure technology developed by the GEON project to provide access to LiDAR data in a variety of formats. Available LiDAR data products range from simple Google Earth visualizations of LiDAR-derived hillshades to 1 km2 tiles of standard digital elevation model (DEM) products, as well as LiDAR point cloud data and user-generated custom DEMs. We have found that the wide spectrum of LiDAR users have variable scientific applications, computing resources, and technical experience, and thus require a data system with multiple distribution mechanisms and platforms to serve a broader range of user communities. Because the volume of LiDAR topography data available is rapidly expanding and data analysis techniques are evolving, the user community needs to be able to communicate and interact to share knowledge and experiences. To address this need, the OpenTopography Portal enables social networking through a variety of collaboration tools, web 2.0 technologies, and customized usage-pattern tracking. Fundamentally, these tools offer users the ability to communicate, to access and share documents, to participate in discussions, and to keep up to date on upcoming events and emerging technologies.
The OpenTopography Portal achieves these social networking capabilities by integrating various software technologies and platforms. These include the ExpressionEngine Content Management System (CMS), which comes with pre-packaged collaboration tools like blogs and wikis; the Gridsphere portal framework, which contains the primary GEON LiDAR System portlet with user job monitoring capabilities; and a Java web-based discussion forum application (Jforums), all seamlessly integrated under one portal. The OpenTopography Portal also provides an integrated authentication mechanism between the various CMS collaboration tools and the core Gridsphere-based portlets. The integration of these technologies allows for enhanced user interaction within the portal. By integrating popular collaboration tools like discussion forums and blogs, we can promote conversation and openness among users. The ability to ask questions and share expertise in forum discussions allows users to easily find information and interact with users facing similar challenges. The OpenTopography Blog enables our domain experts to post ideas, news items, commentary, and other resources in order to foster discussion and information sharing. The content management capabilities of the portal allow for easy updates to information in the form of publications, documents, and news articles. Access to the most current information fosters better decision-making. As has become standard for web 2.0 technologies, the OpenTopography Portal is fully RSS enabled, allowing users to keep track of news items, forum discussions, blog updates, and system outages. We are currently exploring how the information captured by the user and job monitoring components of the Gridsphere-based GEON LiDAR System can be harnessed to provide a recommender system that helps users identify appropriate processing parameters and locate related documents and data.
By seamlessly integrating these platforms and technologies under one portal, we can take advantage of popular online collaboration tools that would otherwise be standalone or restricted to a single software platform. The availability of these collaboration tools alongside the data will foster more community interaction and increase the strength and vibrancy of the LiDAR topography user community.

  9. An assessment of advanced displays and controls technology applicable to future space transportation systems

    NASA Technical Reports Server (NTRS)

    Hatfield, Jack J.; Villarreal, Diana

    1990-01-01

    The topic of advanced display and control technology is addressed, along with the major objectives of this technology, the current state of the art, major accomplishments, research programs and facilities, future trends, technology issues, space transportation system applications, and projected technology readiness for those applications. Gaps that may exist between the technology needs of the transportation systems and the research currently under way are addressed, and cultural changes that might facilitate the incorporation of these advanced technologies into future space transportation systems are recommended. The objectives include reducing life cycle costs, improving reliability and fault tolerance, using standards to incorporate advancing technology, and reducing weight, volume, and power. Pilot workload can be reduced and the pilot's situational awareness improved, resulting in improved flight safety and operating efficiency. This could be accomplished through the use of integrated electronic pictorial displays, consolidated controls, artificial intelligence, and human-centered automation tools. The Orbiter Glass Cockpit Display is examined as an example.

  10. Surgical Technology Integration with Tools for Cognitive Human Factors (STITCH)

    DTIC Science & Technology

    2008-10-24

    Award Number: W81XWH-06-1-0761. Title: Surgical Technology Integration with Tools for Cognitive Human Factors (STITCH). Principal Investigator: W... Period covered: 25 Sep 2007 - 24 Sep 2008.

  11. Non-thermal inactivation of Noroviruses in food

    NASA Astrophysics Data System (ADS)

    Velebit, B.; Petronijević, R.; Bošković, T.

    2017-09-01

    An increased incidence of foodborne illnesses caused by Norovirus and consumer demand for fresh, convenient, and safe foods have prompted research into alternative antiviral processing technologies. Chlorine dioxide, UV treatment, and thermal processing are standard anti-noroviral technologies that have been employed for some time; however, they tend to be ineffective in modern processing due to residue concerns (ClO2), shadowing effects (UV), and low energy efficiency (heat treatment). Alternative technologies such as ozone treatment, high-pressure processing, and pulsed electric fields have been validated. Although these techniques are promising, none of them alone can render food free of Norovirus, and further research on their effects on Norovirus in various food matrices is required. Good manufacturing practices and proper sanitation procedures remain the “gold standard” safety tools in the food business.

  12. Midwifery education and technology enhanced learning: Evaluating online story telling in preregistration midwifery education.

    PubMed

    Scamell, Mandie; Hanley, Thomas

    2018-03-01

    A major issue regarding the implementation of blended learning for preregistration health programmes is the analysis of students' perceptions and attitudes towards their learning. It is the extent to which Technology Enhanced Learning (TEL) is embedded in the higher education curriculum that makes this analysis so vital. This paper reports on the quantitative results of a UK-based study set up to respond to the apparent disconnect between technology enhanced education provision and reliable student evaluation of this mode of learning. Employing a mixed methods research design, the research described here was carried out to develop a reliable and valid evaluation tool to measure acceptability of and satisfaction with a blended learning approach, specifically designed for a preregistration midwifery module offered at level 4. Feasibility testing of 46 completed blended learning evaluation questionnaires - the Student Midwife Evaluation of Online Learning Effectiveness (SMEOLE) - was carried out using descriptive statistics and tests of reliability and internal consistency. Standard deviations and mean scores all followed the predicted pattern. Results from the reliability and internal consistency testing confirm the feasibility of SMEOLE as an effective tool for measuring student satisfaction with a blended learning approach to preregistration learning. The analysis presented in this paper suggests that we have been successful in our aim to produce an evaluation tool capable of assessing the quality of technology enhanced, University-level learning in midwifery. This work can provide a future benchmark against which midwifery, and other health, blended learning curriculum planning could be structured and evaluated. Copyright © 2017 Elsevier Ltd. All rights reserved.
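Internal-consistency testing of the kind reported for SMEOLE is commonly summarized with Cronbach's alpha: alpha = (k/(k-1)) * (1 - sum of item variances / variance of total scores), where k is the number of items. A minimal sketch (the record does not state which statistic was used, so this is an assumption; the sample data below are invented):

```python
def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def variance(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Each respondent's total score across all items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(variance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))
```

Perfectly correlated items yield alpha = 1.0, while uncorrelated items drive alpha toward 0; values around 0.7 or higher are conventionally read as acceptable internal consistency.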

  13. ServAR: An augmented reality tool to guide the serving of food.

    PubMed

    Rollo, Megan E; Bucher, Tamara; Smith, Shamus P; Collins, Clare E

    2017-05-12

    Accurate estimation of food portion size is a difficult task. Visual cues are important mediators of portion size, and therefore technology-based aids may assist consumers when serving and estimating food portions. The current study evaluated the usability and impact on estimation error of standard food servings of a novel augmented reality food serving aid, ServAR. Participants were randomised into one of three groups: 1) no information/aid (control); 2) verbal information on standard serving sizes; or 3) ServAR, an aid which overlaid virtual food servings onto a plate using a tablet computer. Participants were asked to estimate the standard serving sizes of nine foods (broccoli, carrots, cauliflower, green beans, kidney beans, potato, pasta, rice, and sweetcorn) using validated food replicas. Wilcoxon signed-rank tests compared the median served weights of each food to reference standard serving size weights. Percentage error was used to compare serving size estimation accuracy between the three groups. All participants also performed a usability test using the ServAR tool to guide the serving of one randomly selected food. Ninety adults (78.9% female; mean (95% CI) age 25.8 (24.9-26.7) years; BMI 24.2 (23.2-25.2) kg/m²) completed the study. The median servings were significantly different from the reference portions for five foods in the ServAR group, compared to eight foods in the information-only group and seven foods in the control group. The cumulative proportion of total estimations per group within ±10%, ±25%, and ±50% of the reference portion was greater for those using ServAR (30.7%, 65.2%, and 90.7%, respectively) compared to the information-only group (19.6%, 47.4%, and 77.4%) and the control group (10.0%, 33.7%, and 68.9%). Participants generally found the ServAR tool easy to use and agreed that it showed potential to support optimal portion size selection. However, some refinements to the ServAR tool are required to improve the user experience.
Use of the augmented reality tool improved accuracy and consistency of estimating standard serve sizes compared to the information only and control conditions. ServAR demonstrates potential as a practical tool to guide the serving of food. Further evaluation across a broad range of foods, portion sizes and settings is warranted.
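The percentage-error and ±tolerance metrics reported above can be sketched in Python; the served weights and the 75 g reference serving below are illustrative values, not data from the study.

```python
import numpy as np

def percentage_error(served_g, reference_g):
    """Signed percentage error of each served weight vs the reference serving."""
    served_g = np.asarray(served_g, dtype=float)
    return 100.0 * (served_g - reference_g) / reference_g

def within_tolerance(errors, tolerance_pct):
    """Proportion of estimates whose absolute error falls within ±tolerance_pct."""
    errors = np.asarray(errors, dtype=float)
    return float(np.mean(np.abs(errors) <= tolerance_pct))

# Illustrative served weights (g) for one food, with a hypothetical 75 g standard serve
reference = 75.0
served = [62, 70, 81, 95, 74, 110, 68, 77, 55, 79]
errors = percentage_error(served, reference)
for tol in (10, 25, 50):
    print(f"within ±{tol}%: {within_tolerance(errors, tol):.0%}")
```

Computing these cumulative proportions per group, as the study did, gives a direct between-group comparison of estimation accuracy at several tolerance levels.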

  14. The automatic back-check mechanism of mask tooling database and automatic transmission of mask tooling data

    NASA Astrophysics Data System (ADS)

    Xu, Zhe; Peng, M. G.; Tu, Lin Hsin; Lee, Cedric; Lin, J. K.; Jan, Jian Feng; Yin, Alb; Wang, Pei

    2006-10-01

    Nowadays, most foundries pay close attention to reducing CD width. Although lithography technologies have developed drastically, mask data accuracy is a bigger challenge than before. Moreover, mask (reticle) prices have also risen sharply, so data accuracy needs special treatment. We have developed a system called eFDMS to guarantee mask data accuracy. eFDMS performs automatic back-checks of the mask tooling database and automatic transmission of mask tooling data. We integrate eFDMS with the standard mask tooling system K2 so that the upstream and downstream processes around the mask tooling main body, K2, run smoothly and correctly as anticipated. Competition in the IC marketplace is gradually shifting from high-tech processes to lower prices, so controlling product cost plays an increasingly significant role for foundries; the cost task should be prepared for before this competition intensifies.

  15. Data management in clinical research: An overview

    PubMed Central

    Krishnankutty, Binny; Bellary, Shantala; Kumar, Naveen B.R.; Moodahadu, Latha S.

    2012-01-01

    Clinical Data Management (CDM) is a critical phase in clinical research that leads to the generation of high-quality, reliable, and statistically sound data from clinical trials and drastically reduces the time from drug development to marketing. Team members of CDM are actively involved in all stages of a clinical trial, from inception to completion. They should have adequate process knowledge that helps maintain the quality standards of CDM processes. Various procedures in CDM, including Case Report Form (CRF) designing, CRF annotation, database designing, data entry, data validation, discrepancy management, medical coding, data extraction, and database locking, are assessed for quality at regular intervals during a trial. In the present scenario, there is an increased demand to improve CDM standards to meet regulatory requirements and stay ahead of the competition through faster commercialization of the product. With the implementation of regulatory-compliant data management tools, the CDM team can meet these demands. Additionally, it is becoming mandatory for companies to submit data electronically. CDM professionals should meet appropriate expectations, set standards for data quality, and have the drive to adapt to rapidly changing technology. This article highlights the processes involved and provides the reader an overview of the tools and standards adopted, as well as the roles and responsibilities in CDM. PMID:22529469

  16. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation

    PubMed Central

    2011-01-01

    Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. 
Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner very similar to data housed in static triple-stores, thus facilitating the intersection of Web services and Semantic Web technologies. PMID:22024447

  17. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation.

    PubMed

    Wilkinson, Mark D; Vandervalk, Benjamin; McCarthy, Luke

    2011-10-24

    The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. 
Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner very similar to data housed in static triple-stores, thus facilitating the intersection of Web services and Semantic Web technologies.

  18. Veterinarians and Humane Endings: When Is It the Right Time to Euthanize a Companion Animal?

    PubMed Central

    Knesl, Oliver; Hart, Benjamin L.; Fine, Aubrey H.; Cooper, Leslie; Patterson-Kane, Emily; Houlihan, Kendall Elizabeth; Anthony, Raymond

    2017-01-01

    Current advances in technologies and treatments provide pet owners and veterinarians with more options for prolonging the life of beloved pets, but can simultaneously lead to ethical dilemmas relating to what is best for both animal and owner. Key tools for improving end-of-life outcomes include (1) sufficient training to understand the valid ethical approaches to determining when euthanasia is appropriate, (2) regular training in client communication skills, and (3) a standard end-of-life protocol that includes the use of quality of life assessment tools, euthanasia consent forms, and pet owner resources for coping with the loss of a pet. Using these tools will improve outcomes for animals and their owners and reduce the heavy burden of stress and burnout currently being experienced by the veterinary profession. PMID:28470002

  19. Behavioral Health and Performance Element: Tools and Technologies

    NASA Technical Reports Server (NTRS)

    Leveton, Lauren B.

    2009-01-01

    This slide presentation reviews research into the Behavioral Health and Performance (BHP) element of the Human Research Program. The element's goal is to identify, characterize, and prevent or reduce behavioral health and performance risks associated with space travel, exploration, and return to terrestrial life. To accomplish this goal, the program focuses on applied research designed to yield deliverables that reduce risk. Several areas are of particular interest: behavioral medicine; sleep; and team composition and teamwork. To help assure the success of NASA missions, the Human Research Program develops and validates standards for each of these areas. The impact on BHP during long-duration missions is discussed, and the prospective tools being created to address BHP concerns are reviewed.

  20. Computing tools for implementing standards for single-case designs.

    PubMed

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook (the WWC standards). These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were found to be accurate. When missing data were present, most tools either gave an error message or conducted the analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.
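The two missing-data behaviours noted above (raising an error vs. analyzing the available data) can be illustrated with a simple SCD effect size such as the Percentage of Nonoverlapping Data (PND); the data and the `on_missing` switch below are illustrative, not taken from any reviewed tool.

```python
import math

def pnd(baseline, treatment, *, on_missing="error"):
    """Percentage of Nonoverlapping Data (PND), a common SCD effect size.

    on_missing: "error"     -> raise, like tools that refuse incomplete data;
                "available" -> drop missing points and analyze the rest.
    """
    def clean(phase):
        is_missing = lambda x: x is None or (isinstance(x, float) and math.isnan(x))
        if any(is_missing(x) for x in phase) and on_missing == "error":
            raise ValueError("missing observations in the data series")
        return [x for x in phase if not is_missing(x)]

    base, treat = clean(baseline), clean(treatment)
    peak = max(base)  # highest baseline point (assumes improvement = increase)
    return 100.0 * sum(x > peak for x in treat) / len(treat)

# Illustrative series: baseline phase, then intervention phase with one missing session
baseline = [2, 3, 2, 4]
treatment = [5, 6, float("nan"), 7, 4]
print(pnd(baseline, treatment, on_missing="available"))  # 75.0: 3 of 4 points exceed 4
```

Calling the same function with the default `on_missing="error"` raises instead, mirroring the tools that refuse to run on incomplete series.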

  1. Technology of machine tools. Volume 4. Machine tool controls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-10-01

    The Machine Tool Task Force (MTTF) was formed to characterize the state of the art of machine tool technology and to identify promising future directions of this technology. This volume is one of a five-volume series that presents the MTTF findings; reports on various areas of the technology were contributed by experts in those areas.

  2. Technology of machine tools. Volume 3. Machine tool mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tlusty, J.

    1980-10-01

    The Machine Tool Task Force (MTTF) was formed to characterize the state of the art of machine tool technology and to identify promising future directions of this technology. This volume is one of a five-volume series that presents the MTTF findings; reports on various areas of the technology were contributed by experts in those areas.

  3. Technology of machine tools. Volume 5. Machine tool accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hocken, R.J.

    1980-10-01

    The Machine Tool Task Force (MTTF) was formed to characterize the state of the art of machine tool technology and to identify promising future directions of this technology. This volume is one of a five-volume series that presents the MTTF findings; reports on various areas of the technology were contributed by experts in those areas.

  4. metAlignID: a high-throughput software tool set for automated detection of trace level contaminants in comprehensive LECO two-dimensional gas chromatography time-of-flight mass spectrometry data.

    PubMed

    Lommen, Arjen; van der Kamp, Henk J; Kools, Harrie J; van der Lee, Martijn K; van der Weg, Guido; Mol, Hans G J

    2012-11-09

    A new alternative data processing tool set, metAlignID, has been developed for automated pre-processing and library-based identification and concentration estimation of target compounds after analysis by comprehensive two-dimensional gas chromatography with mass spectrometric detection. The tool set has been developed for and tested on LECO data. The software runs multi-threaded (one thread per processor core) on a standard PC (personal computer) under different operating systems and is thus capable of processing multiple data sets simultaneously. Raw data files are converted into netCDF (network Common Data Form) format using a fast conversion tool. They are then preprocessed using previously developed algorithms originating from the metAlign software. Next, the resulting reduced data files are searched against a user-composed library (derived from user or commercial NIST-compatible libraries; NIST = National Institute of Standards and Technology), and the identified compounds, including an indicative concentration, are reported in Excel format. Data can be processed in batches. The overall time needed to convert, process, and search 30 raw data sets for 560 compounds is routinely under an hour. The screening performance is evaluated for the detection of pesticides and contaminants in raw data obtained from soil and plant samples. Results are compared to the existing data-handling routine based on proprietary software (LECO, ChromaTOF). The developed software tool set, which is freely downloadable at www.metalign.nl, greatly accelerates data analysis and offers more options for fine-tuning automated identification toward specific application needs. The quality of the results obtained is slightly better than with the standard processing, and a quantitative estimate is also added. 
The software tool set in combination with two-dimensional gas chromatography coupled to time-of-flight mass spectrometry shows great potential as a highly-automated and fast multi-residue instrumental screening method. Copyright © 2012 Elsevier B.V. All rights reserved.
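The one-thread-per-core batch model described above can be sketched generically in Python; `process_dataset` and the file names are hypothetical stand-ins, not metAlignID's actual API.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def process_dataset(path: str) -> str:
    """Hypothetical stand-in for convert-to-netCDF + preprocess + library search."""
    # Real work would parse the raw file, reduce it, and search the library here.
    return f"{path}: done"

raw_files = [f"run_{i:02d}.raw" for i in range(30)]  # e.g. 30 raw data sets

# One worker thread per processor core, mirroring the one-thread-per-core design
with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
    results = list(pool.map(process_dataset, raw_files))

print(len(results))  # 30
```

For pure-Python CPU-bound work a `ProcessPoolExecutor` would parallelize better; threads as shown suit native conversion code that releases the interpreter lock.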

  5. The Pedometer as a Tool to Enrich Science Learning in a Public Health Context

    NASA Astrophysics Data System (ADS)

    Rye, James A.; Zizzi, Samuel J.; Vitullo, Elizabeth A.; Tompkins, Nancy O'hara

    2005-12-01

    The United States is experiencing an obesity epidemic: a science-technology-society public health issue tied to our built environment, which is characterized by heavy dependence on automobiles and reduced opportunities to walk and bicycle for transportation. This presents an informal science education opportunity within "science in personal and social perspectives" to use pedometer technology for enhancing students' understandings about human energy balance. An exploratory study was conducted with 29 teachers to investigate how pedometers could be used for providing academic enrichment to secondary students participating in after-school Health Sciences and Technology Academy clubs. Frequency analysis revealed that the pedometer activities often investigated kilocalorie expenditure and/or incorporated hypothesis testing/experimenting. Teachers' perspectives on learning outcomes most frequently conveyed that students increased their awareness of the importance of health habits relative to kilocalorie intake and expenditure. Pedometers have considerable merit for the regular science curriculum as they allow for numerous mathematics applications and inquiry learning and target concepts such as energy and equilibrium that cut across the National Science Education Standards. Pedometers and associated resources on human energy balance are important tools that science teachers can employ in helping schools respond to the national call to prevent childhood obesity.
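Classroom kilocalorie-expenditure investigations like those mentioned above typically start from a steps-to-energy estimate; a rough sketch follows, whose step-length and kcal-per-kg-per-km constants are common rules of thumb, not values from the study.

```python
def steps_to_kcal(steps, weight_kg, step_length_m=0.75, kcal_per_kg_km=0.53):
    """Rough kilocalories expended walking, estimated from a pedometer step count.

    The defaults (0.75 m step length, ~0.5 kcal per kg of body weight per km
    walked) are illustrative rules of thumb, not measured values.
    """
    distance_km = steps * step_length_m / 1000.0
    return distance_km * weight_kg * kcal_per_kg_km

# A 60 kg student logging 10,000 steps on a club pedometer
print(steps_to_kcal(10_000, 60))  # roughly 240 kcal with these defaults
```

A classroom exercise can then vary weight or step length and compare the estimates against kilocalorie intake, the energy-balance concept the clubs targeted.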

  6. A reference guide for tree analysis and visualization

    PubMed Central

    2010-01-01

    The quantities of data obtained by new high-throughput technologies, such as microarrays or ChIP-chip arrays, and by large-scale OMICS approaches, such as genomics, proteomics, and transcriptomics, are becoming vast. Sequencing technologies are becoming cheaper and easier to use, and thus large-scale evolutionary studies towards the origins of life for all species and their evolution become more and more challenging. Databases holding information about how data are related and how they are hierarchically organized are expanding rapidly. Clustering analysis is becoming more and more difficult to apply to very large amounts of data, since the results of these algorithms cannot be efficiently visualized. Most of the available visualization tools that can represent such hierarchies project data in 2D and often lack the necessary user-friendliness and interactivity. For example, current phylogenetic tree visualization tools cannot display easy-to-understand large-scale trees with more than a few thousand nodes. In this study, we review tools currently available for the visualization and analysis of biological trees, mainly developed during the last decade. We describe the uniform, standard, computer-readable formats used to represent tree hierarchies, and we comment on the functionality and limitations of these tools. We also discuss how these tools can be developed further and become integrated with various data sources. Here we focus on freely available software that offers users various tree-representation methodologies for biological data analysis. PMID:20175922

  7. Empowering Lay-Counsellors with Technology: Masivukeni, a Standardized Multimedia Counselling Support Tool to Deliver ART Counselling.

    PubMed

    Gouse, H; Robbins, R N; Mellins, C A; Kingon, A; Rowe, J; Henry, M; Remien, R H; Pearson, A; Victor, F; Joska, J A

    2018-05-19

    Lay-counsellors in resource-limited settings convey critical HIV and ART information and face challenges including limited training and variable application of counselling. This study explored lay-counsellors' and Department of Health (DoH) perspectives on the utility of a multimedia adherence counselling program. Masivukeni, an mHealth application that provides scaffolding for delivering standardized ART counselling, was used in a 3-year randomized controlled trial at two primary health care clinics in Cape Town, South Africa. In this programmatic and descriptive narrative report, we describe the application; lay-counsellors' responses to open-ended questions regarding their experience with using Masivukeni; and the perspectives of the City of Cape Town and Western Cape Government DoH, obtained through ongoing engagements and feedback sessions. Counsellors reported that Masivukeni empowered them to provide high-quality counselling. The DoH indicated strong support for a future implementation study assessing feasibility for larger-scale roll-out. Masivukeni has potential as a counselling tool in resource-limited settings.

  8. Indexing method of digital audiovisual medical resources with semantic Web integration.

    PubMed

    Cuggia, Marc; Mougin, Fleur; Le Beux, Pierre

    2003-01-01

    Digitalization of audio-visual resources, combined with the performance of networks, offers many possibilities that are the subject of intensive work in the scientific and industrial sectors. Indexing such resources is a major challenge. Recently, the Moving Picture Experts Group (MPEG) has been developing MPEG-7, a standard for describing multimedia content. The goal of this standard is to develop a rich set of standardized tools enabling fast, efficient retrieval from digital archives or the filtering of audiovisual broadcasts on the Internet. How could this kind of technology be used in the medical context? In this paper, we propose a simpler indexing system, based on the Dublin Core standard and compliant with MPEG-7. We use MeSH and the UMLS to introduce conceptual navigation. We also present a video platform which enables encoding of, and access to, audio-visual resources in streaming mode.

  9. Technology of machine tools. Volume 2. Machine tool systems management and utilization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomson, A.R.

    1980-10-01

    The Machine Tool Task Force (MTTF) was formed to characterize the state of the art of machine tool technology and to identify promising future directions of this technology. This volume is one of a five-volume series that presents the MTTF findings; reports on various areas of the technology were contributed by experts in those areas.

  10. CLAST: CUDA implemented large-scale alignment search tool.

    PubMed

    Yano, Masahiro; Mori, Hiroshi; Akiyama, Yutaka; Yamada, Takuji; Kurokawa, Ken

    2014-12-11

    Metagenomics is a powerful methodology to study microbial communities, but it is highly dependent on nucleotide sequence similarity searching against sequence databases. Metagenomic analyses with next-generation sequencing technologies produce enormous numbers of reads from microbial communities, and many reads are derived from microbes whose genomes have not yet been sequenced, limiting the usefulness of existing sequence similarity search tools. Therefore, there is a clear need for a sequence similarity search tool that can rapidly detect weak similarity in large datasets. We developed a tool, which we named CLAST (CUDA implemented large-scale alignment search tool), that enables analyses of millions of reads and thousands of reference genome sequences, and runs on NVIDIA Fermi architecture graphics processing units. CLAST has four main advantages over existing alignment tools. First, CLAST was capable of identifying sequence similarities ~80.8 times faster than BLAST and 9.6 times faster than BLAT. Second, CLAST executes global alignment as the default (local alignment is also an option), enabling CLAST to assign reads to taxonomic and functional groups based on evolutionarily distant nucleotide sequences with high accuracy. Third, CLAST does not need a preprocessed sequence database like Burrows-Wheeler Transform-based tools, and this enables CLAST to incorporate large, frequently updated sequence databases. Fourth, CLAST requires <2 GB of main memory, making it possible to run CLAST on a standard desktop computer or server node. CLAST achieved very high speed (similar to the Burrows-Wheeler Transform-based Bowtie 2 for long reads) and sensitivity (equal to BLAST, BLAT, and FR-HIT) without the need for extensive database preprocessing or a specialized computing platform. 
Our results demonstrate that CLAST has the potential to be one of the most powerful and realistic approaches to analyze the massive amount of sequence data from next-generation sequencing technologies.
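Global alignment, CLAST's default mode, follows the classic Needleman-Wunsch dynamic-programming scheme; a minimal scoring sketch is shown below (CLAST's GPU implementation differs in detail, and the scoring parameters here are illustrative).

```python
def global_alignment_score(a: str, b: str, match=1, mismatch=-1, gap=-2) -> int:
    """Needleman-Wunsch global alignment score for two sequences.

    Illustrative scoring: +1 match, -1 mismatch, -2 per gap position.
    """
    rows, cols = len(a) + 1, len(b) + 1
    score = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):          # aligning a prefix of `a` against nothing
        score[i][0] = i * gap
    for j in range(1, cols):          # aligning a prefix of `b` against nothing
        score[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[-1][-1]              # best end-to-end alignment score

print(global_alignment_score("GATTACA", "GATGACA"))  # 5: six matches, one mismatch
```

Because the whole read must align end to end, global alignment penalizes partial matches that local alignment would accept, which is why it helps assign reads to evolutionarily distant references with fewer spurious hits.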

  11. Advanced Sensors and Controls for Building Applications: Market Assessment and Potential R&D Pathways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brambley, Michael R.; Haves, Philip; McDonald, Sean C.

    2005-04-13

    Significant energy savings can be achieved in commercial building operation, along with increased comfort and control for occupants, through the implementation of advanced technologies. This document provides a market assessment of existing building sensors and controls and presents a range of technology pathways (R&D options) for pursuing advanced sensors and building control strategies. This paper is a synthesis of five white papers: the first describes the market assessment, including estimates of market potential and energy savings for sensors and control strategies currently on the market, as well as a discussion of market barriers to these technologies. The other four cover technology pathways: (1) current applications and strategies for new applications, (2) sensors and controls, (3) networking, security, and protocols and standards, and (4) automated diagnostics, performance monitoring, commissioning, optimal control, and tools. Each technology pathway chapter gives an overview of the technology or application, followed by a discussion of needs and the current status of the technology. Finally, a series of research topics is proposed.

  12. AsTeRICS.

    PubMed

    Drajsajtl, Tomáš; Struk, Petr; Bednárová, Alice

    2013-01-01

    AsTeRICS - the "Assistive Technology Rapid Integration & Construction Set" - is a construction set for assistive technologies which can be adapted to the motor abilities of end-users. AsTeRICS allows access to different devices such as PCs, cell phones, and smart home devices, all integrated in a platform adapted as much as possible to each user. People with motor disabilities in the upper limbs, with no cognitive impairment, no perceptual limitations (neither visual nor auditory), and with basic skills in using technologies such as PCs, cell phones, and electronic agendas have available a flexible and adaptable technology which enables them to access Human-Machine Interfaces (HMI) on the standard desktop and beyond. AsTeRICS provides graphical model design tools, a middleware, and hardware support for the creation of tailored AT solutions involving bioelectric signal acquisition, Brain-/Neural Computer Interfaces, computer-vision techniques, and standardized actuator and device controls, and it allows combining several off-the-shelf AT devices in any desired combination. Novel, end-user-ready solutions can be created and adapted via a graphical editor without additional programming effort. The AsTeRICS open-source framework provides resources for utilization and extension of the system to developers and researchers. AsTeRICS was developed by the AsTeRICS project and was partially funded by the EC.

  13. Review on thin-film transistor technology, its applications, and possible new applications to biological cells

    NASA Astrophysics Data System (ADS)

    Tixier-Mita, Agnès; Ihida, Satoshi; Ségard, Bertrand-David; Cathcart, Grant A.; Takahashi, Takuya; Fujita, Hiroyuki; Toshiyoshi, Hiroshi

    2016-04-01

    This paper presents a review of the state of the art of thin-film transistor (TFT) technology and its wide range of applications, not only in liquid crystal displays (TFT-LCDs) but also in sensing devices. The history of the evolution of the technology is given first. Then the standard applications in TFT-LCDs and X-ray detectors are presented, followed by state-of-the-art applications in the field of chemical and biochemical sensing. TFT technology allows the fabrication of dense arrays of independent and transparent microelectrodes on large glass substrates. The potential of these devices as electrical substrates for biological cell applications is then described. The possibility of using TFT array substrates as new tools for electrical experiments on biological cells has been investigated for the first time by our group. Dielectrophoresis experiments and impedance measurements on yeast cells are presented here. These promising results open the door towards new applications of TFT technology.

  14. A computer-based specification methodology

    NASA Technical Reports Server (NTRS)

    Munck, Robert G.

    1986-01-01

    Standard practices for creating and using system specifications are inadequate for large, advanced-technology systems. A need exists to break away from paper documents in favor of documents that are stored in computers and which are read and otherwise used with the help of computers. An SADT-based system, running on the proposed Space Station data management network, could be a powerful tool for doing much of the required technical work of the Station, including creating and operating the network itself.

  15. Database Technology Activities and Assessment for Defense Modeling and Simulation Office (DMSO) (August 1991-November 1992). A Documented Briefing

    DTIC Science & Technology

    1994-01-01

    ... databases and identifying new data entities, data elements, and relationships ... standard data naming conventions, schema, and definition processes ... management system. The use of such a tool could offer: (1) structured support for representation of objects and their relationships to each other (and ... their relationships to related multimedia objects, such as an engineering drawing of the tank object or a satellite image that contains the installation ...

  16. Parallel-Processing Test Bed For Simulation Software

    NASA Technical Reports Server (NTRS)

    Blech, Richard; Cole, Gary; Townsend, Scott

    1996-01-01

    Second-generation Hypercluster computing system is multiprocessor test bed for research on parallel algorithms for simulation in fluid dynamics, electromagnetics, chemistry, and other fields with large computational requirements but relatively low input/output requirements. Built from standard, off-the-shelf hardware readily upgraded as improved technology becomes available. System used for experiments with such parallel-processing concepts as message-passing algorithms, debugging software tools, and computational steering. First-generation Hypercluster system described in "Hypercluster Parallel Processor" (LEW-15283).

  17. Integrating Information Technologies Into Large Organizations

    NASA Technical Reports Server (NTRS)

    Gottlich, Gretchen; Meyer, John M.; Nelson, Michael L.; Bianco, David J.

    1997-01-01

    NASA Langley Research Center's product is aerospace research information. To this end, Langley uses information technology tools in three distinct ways. First, information technology tools are used in the production of information via computation, analysis, data collection and reduction. Second, information technology tools assist in streamlining business processes, particularly those that are primarily communication based. By applying these information tools to administrative activities, Langley spends fewer resources on managing itself and can allocate more resources for research. Third, Langley uses information technology tools to disseminate its aerospace research information, resulting in faster turn around time from the laboratory to the end-customer.

  18. Successful isolation and PCR amplification of DNA from National Institute of Standards and Technology herbal dietary supplement standard reference material powders and extracts.

    PubMed

    Cimino, Matthew T

    2010-03-01

    Twenty-four herbal dietary supplement powder and extract reference standards provided by the National Institute of Standards and Technology (NIST) were investigated using three different commercially available DNA extraction kits to evaluate DNA availability for downstream nucleotide-based applications. The material included samples of Camellia, Citrus, Ephedra, Ginkgo, Hypericum, Serenoa, and Vaccinium. Protocols from Qiagen, MoBio, and Phytopure were used to isolate and purify DNA from the NIST standards. The resulting DNA concentration was quantified using SYBR Green fluorometry. Each of the 24 samples yielded DNA, though the concentration of DNA from each approach was notably different. The Phytopure method consistently yielded more DNA. The average yield ratio was 22 : 3 : 1 (ng/microL; Phytopure : Qiagen : MoBio). Amplification of the internal transcribed spacer II region using PCR was ultimately successful in 22 of the 24 samples. Direct sequencing chromatograms of the amplified material suggested that most of the samples were composed of mixtures. However, the sequencing chromatograms of 12 of the 24 samples were sufficient to confirm the identity of the target material. The successful extraction, amplification, and sequencing of DNA from these herbal dietary supplement extracts and powders supports a continued effort to explore nucleotide sequence-based tools for the authentication and identification of plants in dietary supplements. (c) Georg Thieme Verlag KG Stuttgart · New York.

  19. Systems biology driven software design for the research enterprise

    PubMed Central

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-01-01

    Background In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. Results We illustrate an approach, through the discussion of a purpose built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. Conclusion By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a light-weight software architecture can become the focal point through which scientists can both get access to and analyse the plethora of experimentally derived data. PMID:18578887
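The integration strategy the abstract names, runtime discovery of services registered under common names rather than a shared object model, can be sketched minimally. All names here (`ServiceRegistry`, the "normalize" capability) are hypothetical illustrations, not the paper's actual API.

```python
# Minimal sketch of dynamically discovered interoperable services:
# groups register tools against capability names, and clients discover
# them at runtime without any shared object model.
class ServiceRegistry:
    def __init__(self):
        self._services = {}

    def register(self, capability, handler):
        """Register a callable under a capability name."""
        self._services.setdefault(capability, []).append(handler)

    def discover(self, capability):
        """Return all handlers registered for a capability (may be empty)."""
        return self._services.get(capability, [])

registry = ServiceRegistry()
# One group contributes a normalization tool; others need only know the name.
registry.register("normalize", lambda data: [x / max(data) for x in data])

for service in registry.discover("normalize"):
    print(service([1.0, 2.0, 4.0]))  # -> [0.25, 0.5, 1.0]
```

The point of the pattern is that adding a new tool requires only a `register` call, not changes to existing consumers.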

  20. A Grid Infrastructure for Supporting Space-based Science Operations

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Redman, Sandra H.; McNair, Ann R. (Technical Monitor)

    2002-01-01

    Emerging technologies for computational grid infrastructures have the potential for revolutionizing the way computers are used in all aspects of our lives. Computational grids are currently being implemented to provide large-scale, dynamic, and secure research and engineering environments based on standards and next-generation reusable software, enabling greater science and engineering productivity through shared resources and distributed computing for less cost than traditional architectures. Combined with the emerging technologies of high-performance networks, grids provide researchers, scientists and engineers the first real opportunity for an effective distributed collaborative environment with access to resources such as computational and storage systems, instruments, and software tools and services for the most computationally challenging applications.

  1. Characteristics of a semi-custom library development system

    NASA Technical Reports Server (NTRS)

    Yancey, M.; Cannon, R.

    1990-01-01

    Standard cell and gate array macro libraries are in common use with workstation computer-aided design (CAD) tools for application-specific integrated circuit (ASIC) semi-custom applications and have resulted in significant improvements in overall design efficiency as contrasted with custom design methodologies. Similar design methodology enhancements providing for the efficient development of library cells are an important factor in responding to the need for continuous technology improvement. The characteristics of a library development system that provides design flexibility and productivity enhancements for the library development engineer working in state-of-the-art process technologies are presented. An overview of Gould's library development system ('Accolade') is also presented.

  2. Laboratory and Workplace Assessments of Rivet Bucking Bar Vibration Emissions

    PubMed Central

    McDowell, Thomas W.; Warren, Christopher; Xu, Xueyan S.; Welcome, Daniel E.; Dong, Ren G.

    2016-01-01

    Sheet metal workers operating rivet bucking bars are at risk of developing hand and wrist musculoskeletal disorders associated with exposures to hand-transmitted vibrations and forceful exertions required to operate these hand tools. New bucking bar technologies have been introduced in efforts to reduce workplace vibration exposures to these workers. However, the efficacy of these new bucking bar designs has not been well documented. While there are standardized laboratory-based methodologies for assessing the vibration emissions of many types of powered hand tools, no such standard exists for rivet bucking bars. Therefore, this study included the development of a laboratory-based method for assessing bucking bar vibrations which utilizes a simulated riveting task. With this method, this study evaluated three traditional steel bucking bars, three similarly shaped tungsten alloy bars, and three bars featuring spring-dampeners. For comparison the bucking bar vibrations were also assessed during three typical riveting tasks at a large aircraft maintenance facility. The bucking bars were rank-ordered in terms of unweighted and frequency-weighted acceleration measured at the hand-tool interface. The results suggest that the developed laboratory method is a reasonable technique for ranking bucking bar vibration emissions; the lab-based riveting simulations produced similar rankings to the workplace rankings. However, the laboratory-based acceleration averages were considerably lower than the workplace measurements. These observations suggest that the laboratory test results are acceptable for comparing and screening bucking bars, but the laboratory measurements should not be directly used for assessing the risk of workplace bucking bar vibration exposures. The newer bucking bar technologies exhibited significantly reduced vibrations compared to the traditional steel bars. 
The results of this study, together with other information such as rivet quality, productivity, tool weight, comfort, worker acceptance, and initial cost can be used to make informed bucking bar selections. PMID:25381185

  3. Laboratory and workplace assessments of rivet bucking bar vibration emissions.

    PubMed

    McDowell, Thomas W; Warren, Christopher; Xu, Xueyan S; Welcome, Daniel E; Dong, Ren G

    2015-04-01

    Sheet metal workers operating rivet bucking bars are at risk of developing hand and wrist musculoskeletal disorders associated with exposures to hand-transmitted vibrations and forceful exertions required to operate these hand tools. New bucking bar technologies have been introduced in efforts to reduce workplace vibration exposures to these workers. However, the efficacy of these new bucking bar designs has not been well documented. While there are standardized laboratory-based methodologies for assessing the vibration emissions of many types of powered hand tools, no such standard exists for rivet bucking bars. Therefore, this study included the development of a laboratory-based method for assessing bucking bar vibrations which utilizes a simulated riveting task. With this method, this study evaluated three traditional steel bucking bars, three similarly shaped tungsten alloy bars, and three bars featuring spring-dampeners. For comparison the bucking bar vibrations were also assessed during three typical riveting tasks at a large aircraft maintenance facility. The bucking bars were rank-ordered in terms of unweighted and frequency-weighted acceleration measured at the hand-tool interface. The results suggest that the developed laboratory method is a reasonable technique for ranking bucking bar vibration emissions; the lab-based riveting simulations produced similar rankings to the workplace rankings. However, the laboratory-based acceleration averages were considerably lower than the workplace measurements. These observations suggest that the laboratory test results are acceptable for comparing and screening bucking bars, but the laboratory measurements should not be directly used for assessing the risk of workplace bucking bar vibration exposures. The newer bucking bar technologies exhibited significantly reduced vibrations compared to the traditional steel bars. 
The results of this study, together with other information such as rivet quality, productivity, tool weight, comfort, worker acceptance, and initial cost can be used to make informed bucking bar selections. Published by Oxford University Press on behalf of the British Occupational Hygiene Society 2014.
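The ranking metric used in the two studies above, frequency-weighted acceleration, combines per-band accelerations as a root-sum-of-squares of weighted components, in the style of hand-arm vibration standards such as ISO 5349-1. The band values and weighting factors below are illustrative assumptions, not measurements or official weights from the studies.

```python
import math

def weighted_rms(band_accels, weights):
    """Overall acceleration a_w = sqrt(sum((W_i * a_i)^2)): the
    root-sum-of-squares combination of weighted one-third-octave band
    RMS accelerations used in hand-arm vibration assessment."""
    return math.sqrt(sum((w * a) ** 2 for w, a in zip(weights, band_accels)))

# Illustrative band RMS accelerations (m/s^2) and example weighting
# factors; hand-arm weighting de-emphasizes higher-frequency bands.
accels  = [2.0, 5.0, 8.0, 12.0]
weights = [0.9, 0.7, 0.4, 0.1]

unweighted = weighted_rms(accels, [1.0] * 4)
weighted   = weighted_rms(accels, weights)
print(f"unweighted: {unweighted:.2f} m/s^2")  # ~15.39
print(f"weighted:   {weighted:.2f} m/s^2")    # ~5.21
```

Because the weighting suppresses high-frequency bands, two tools can rank the same on unweighted acceleration yet differently on the weighted metric, which is why the studies report both.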

  4. A practical workflow for making anatomical atlases for biological research.

    PubMed

    Wan, Yong; Lewis, A Kelsey; Colasanto, Mary; van Langeveld, Mark; Kardon, Gabrielle; Hansen, Charles

    2012-01-01

    The anatomical atlas has been at the intersection of science and art for centuries. These atlases are essential to biological research, but high-quality atlases are often scarce. Recent advances in imaging technology have made high-quality 3D atlases possible. However, until now there has been a lack of practical workflows using standard tools to generate atlases from images of biological samples. With certain adaptations, CG artists' workflow and tools, traditionally used in the film industry, are practical for building high-quality biological atlases. Researchers have developed a workflow for generating a 3D anatomical atlas using accessible artists' tools. They used this workflow to build a mouse limb atlas for studying the musculoskeletal system's development. This research aims to raise the awareness of using artists' tools in scientific research and promote interdisciplinary collaborations between artists and scientists. This video (http://youtu.be/g61C-nia9ms) demonstrates a workflow for creating an anatomical atlas.

  5. Using Galaxy to Perform Large-Scale Interactive Data Analyses

    PubMed Central

    Hillman-Jackson, Jennifer; Clements, Dave; Blankenberg, Daniel; Taylor, James; Nekrutenko, Anton

    2014-01-01

    Innovations in biomedical research technologies continue to provide experimental biologists with novel and increasingly large genomic and high-throughput data resources to be analyzed. As creating and obtaining data has become easier, the key decision faced by many researchers is a practical one: where and how should an analysis be performed? Datasets are large and analysis tool set-up and use is riddled with complexities outside of the scope of core research activities. The authors believe that Galaxy provides a powerful solution that simplifies data acquisition and analysis in an intuitive Web application, granting all researchers access to key informatics tools previously only available to computational specialists working in Unix-based environments. We will demonstrate through a series of biomedically relevant protocols how Galaxy specifically brings together (1) data retrieval from public and private sources, for example, UCSC's Eukaryote and Microbial Genome Browsers, (2) custom tools (wrapped Unix functions, format standardization/conversions, interval operations), and (3) third-party analysis tools. PMID:22700312
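The "interval operations" mentioned among Galaxy's custom tools can be illustrated with a toy intersection over sorted, half-open genomic intervals (a single-chromosome simplification of BED-style data, not Galaxy's own implementation):

```python
def intersect(a_intervals, b_intervals):
    """Intersect two sorted lists of (start, end) half-open intervals,
    a minimal version of a genomic interval operation."""
    out, i, j = [], 0, 0
    while i < len(a_intervals) and j < len(b_intervals):
        a_start, a_end = a_intervals[i]
        b_start, b_end = b_intervals[j]
        start, end = max(a_start, b_start), min(a_end, b_end)
        if start < end:
            out.append((start, end))
        # Advance whichever interval ends first.
        if a_end < b_end:
            i += 1
        else:
            j += 1
    return out

print(intersect([(0, 100), (200, 300)], [(50, 250)]))
# -> [(50, 100), (200, 250)]
```

The sweep over both sorted lists runs in linear time, which is what makes such operations practical on genome-scale datasets.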

  6. An Overview of Public Domain Tools for Measuring the Sustainability of Environmental Remediation - 12060

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Claypool, John E.; Rogers, Scott

    The application of sustainability principles to the investigation and remediation of contaminated sites is an area of rapid development within the environmental profession, with new business practices, tools, and performance standards for identifying, evaluating, and managing the 'collateral' impacts of cleanup projects to the environment, economy and society coming from many organizations. Guidelines, frameworks, and standards of practice for 'green and sustainable remediation' (GSR) have been released and are under development by the Sustainable Remediation Forum (SURF), the American Society for Testing and Materials (ASTM), the Interstate Technology and Regulatory Council (ITRC) and other organizations in the U.S. and internationally. In response to Executive Orders from the President, Federal government agencies have developed policies, procedures and guidelines for evaluating and reporting the sustainability of their environmental restoration projects. Private sector companies in the petroleum, utility, manufacturing, defense, and other sectors are developing their own corporate GSR programs to improve day-to-day management of contaminated sites and to support external reporting as part of their corporate social responsibility (CSR) efforts. The explosion of mandates, policy, procedures and guidance raises the question of how to determine whether a remediation technology or cleanup approach is green and/or sustainable. The environmental profession has responded to this question by designing, developing and deploying a wide array of tools, calculators, and databases that enable regulatory agencies, site managers and environmental professionals to calculate the collateral impacts of their remediation projects in the environmental, social, and economic domains. 
    Many of these tools are proprietary ones developed by environmental engineering/consulting firms for use in their consulting engagements and/or tailored specifically to meet the needs of their clients. When it comes to the public domain, Federal government agencies are spearheading the development of software tools to measure and report emissions of air pollutants (e.g., carbon dioxide, other greenhouse gases, criteria air pollutants); consumption of energy, water and natural resources; accident and safety risks; project costs and other economic metrics. Most of the tools developed for the Government are available to environmental practitioners without charge, so they are growing in usage and popularity. The key features and metrics calculated by the available public-domain tools for measuring the sustainability of environmental remediation projects share some commonalities, but there are differences amongst the tools. The SiteWise™ sustainability tool developed for the Navy and US Army will be compared with the Sustainable Remediation Tool (SRT™) developed for the US Air Force (USAF). In addition, the USAF's Clean Solar and Wind Energy in Environmental Programs (CleanSWEEP), a soon-to-be-released tool for evaluating the economic feasibility of utilizing renewable energy for powering remediation systems, will be described in the paper. (authors)

  7. Fire service and first responder thermal imaging camera (TIC) advances and standards

    NASA Astrophysics Data System (ADS)

    Konsin, Lawrence S.; Nixdorff, Stuart

    2007-04-01

    Fire Service and First Responder Thermal Imaging Camera (TIC) applications are growing, saving lives and preventing injury and property damage. Firefighters face a wide range of serious hazards. TICs help mitigate the risks by protecting Firefighters and preventing injury, while reducing time spent fighting the fire and resources needed to do so. Most fire safety equipment is covered by performance standards. Fire TICs, however, are not covered by such standards and are also subject to inadequate operational performance and insufficient user training. Meanwhile, advancements in Fire TICs and lower costs are driving product demand. The need for a Fire TIC Standard was spurred in late 2004 through a Government-sponsored Workshop where experts from the First Responder community, component manufacturers, firefighter training, and those doing research on TICs discussed strategies, technologies, procedures, best practices and R&D that could improve Fire TICs. The workshop identified pressing image quality, performance metrics, and standards issues. Durability and ruggedness metrics and standard testing methods were also seen as important, as was TIC training and certification of end-users. A progress report on several efforts in these areas and their impact on the IR sensor industry will be given. This paper is a follow-up to the SPIE Orlando 2004 paper on Fire TIC usage (entitled Emergency Responders' Critical Infrared) which explored the technological development of this IR industry segment from the viewpoint of the end user, in light of the studies and reports that had established TICs as a mission critical tool for firefighters.

  8. Micro- and nanoengineering for stem cell biology: the promise with a caution.

    PubMed

    Kshitiz; Kim, Deok-Ho; Beebe, David J; Levchenko, Andre

    2011-08-01

    Current techniques used in stem cell research only crudely mimic the physiological complexity of the stem cell niches. Recent advances in the field of micro- and nanoengineering have brought an array of in vitro cell culture models that have enabled development of novel, highly precise and standardized tools that capture physiological details in a single platform, with greater control, consistency, and throughput. In this review, we describe the micro- and nanotechnology-driven modern toolkit for stem cell biologists to design novel experiments in more physiological microenvironments with increased precision and standardization, and caution them against potential challenges that the modern technologies might present. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. A multiarchitecture parallel-processing development environment

    NASA Technical Reports Server (NTRS)

    Townsend, Scott; Blech, Richard; Cole, Gary

    1993-01-01

    A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.
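The hypercube topology underlying the Hypercluster has a simple rule for message routing: each of the 2^d nodes is directly connected to the d nodes whose binary IDs differ from its own in exactly one bit. A small sketch of that neighbor computation (illustrative, not the Hypercluster's actual software):

```python
def hypercube_neighbors(node, dims):
    """Neighbors of `node` in a `dims`-dimensional hypercube: flip each
    bit of the node ID in turn, since nodes whose IDs differ in exactly
    one bit share a direct link."""
    return [node ^ (1 << k) for k in range(dims)]

# 3-dimensional hypercube: 8 nodes, each with exactly 3 neighbors.
print(hypercube_neighbors(0b101, 3))  # node 5 -> [4, 7, 1]
```

This bit-flip structure is why a message between any two nodes needs at most d hops: correct one differing bit per hop.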

  10. Achieving the Meaningful Use Standard: A Model for Implementing Change Within Medical Practices.

    PubMed

    Fryefield, David C; Staggs, Stuart; Herman, William; Stickler, Alan; Ahmad, Asif; Patt, Debra A; Beveridge, Roy A

    2014-03-01

    Change management in medical practices is often an uphill battle. Lack of agreement on standards, ineffective leadership, inertia, inconsistent access to data, and inability to clearly define and communicate the benefits of change represent significant barriers to success. In 2009, the Health Information Technology for Economic and Clinical Health (HITECH) Act created the meaningful use (MU) incentive program administered through the Centers for Medicare and Medicaid Services (CMS). To earn financial incentive payments, eligible physicians adopt certified electronic health record (EHR) technology and use it to meet specified objectives. In response, leadership of the US Oncology Network launched an MU initiative designed to create a comprehensive system of tools, education, performance feedback, and support that would facilitate successful achievement of the MU standards. The EHR used by the majority of network physicians was modified according to the MU specifications, and EHR certification was obtained. Baseline compliance data were measured for each of the MU standards and for each of the eligible physicians. Physician and staff workflow processes necessary for consistent data input and compliance were outlined for each standard. Each practice identified one or more staff members who would act as MU leads. Training modules were developed for the MU leads as well as for physicians, mid-level providers, nurses, medical assistants, and office staff. An MU measurement tool was created, designed to target areas for MU process improvement and automate reporting. Data were updated and verified weekly to provide timely feedback to practices, including individual physician detail and links to individual patient records. A total of 943 practitioners within the US Oncology Network met eligibility criteria for the MU program. At baseline, compliance with each MU standard ranged from 0% (clinical summaries) to 100% (computerized order entry). 
In many cases, data were simply not being entered into the EHR. Time from program launch to first submission of MU attestation was 18 months. As of March 2013, 781 practitioners (83%) had achieved the MU standards. In comparison, CMS reported that 44% of all eligible physicians and 26% of hematologists and oncologists had successfully achieved Medicare MU standards and received payment. Successful change management in medical practices can be accomplished through a comprehensive system of leadership, education, support, timely feedback of data, and clearly defined incentives. Incentives alone may be far less effective.
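The weekly per-measure compliance feedback described above amounts to computing a compliance rate per MU measure and flagging those below a target. A hypothetical sketch, with invented measure names, counts, and threshold:

```python
# Hypothetical per-measure counts for one physician; none of these
# numbers or the threshold are from the program described above.
records = {
    "clinical_summaries":       {"eligible": 120, "compliant": 0},
    "computerized_order_entry": {"eligible": 120, "compliant": 120},
    "e_prescribing":            {"eligible": 90,  "compliant": 63},
}

TARGET = 0.50  # illustrative improvement threshold

rates = {m: c["compliant"] / c["eligible"] for m, c in records.items()}
for measure, rate in rates.items():
    flag = "OK " if rate >= TARGET else "LOW"
    print(f"{flag} {measure}: {rate:.0%}")
```

Aggregating such per-physician rates across a practice is what lets the feedback system target specific workflow steps (like clinical summaries at 0% above) rather than reporting a single overall score.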

  11. The Promise of Information and Communication Technology in Healthcare: Extracting Value From the Chaos.

    PubMed

    Mamlin, Burke W; Tierney, William M

    2016-01-01

    Healthcare is an information business with expanding use of information and communication technologies (ICTs). Current ICT tools are immature, but a brighter future looms. We examine 7 areas of ICT in healthcare: electronic health records (EHRs), health information exchange (HIE), patient portals, telemedicine, social media, mobile devices and wearable sensors and monitors, and privacy and security. In each of these areas, we examine the current status and future promise, highlighting how each might reach its promise. Steps to better EHRs include a universal programming interface, universal patient identifiers, improved documentation and improved data analysis. HIEs require federal subsidies for sustainability and support from EHR vendors, targeting seamless sharing of EHR data. Patient portals must bring patients into the EHR with better design and training, greater provider engagement and leveraging HIEs. Telemedicine needs sustainable payment models, clear rules of engagement, quality measures and monitoring. Social media needs consensus on rules of engagement for providers, better data mining tools and approaches to counter disinformation. Mobile and wearable devices benefit from a universal programming interface, improved infrastructure, more rigorous research and integration with EHRs and HIEs. Laws for privacy and security need updating to match current technologies, and data stewards should share information on breaches and standardize best practices. ICT tools are evolving quickly in healthcare and require a rational and well-funded national agenda for development, use and assessment. Copyright © 2016 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.

  12. FastaValidator: an open-source Java library to parse and validate FASTA formatted sequences.

    PubMed

    Waldmann, Jost; Gerken, Jan; Hankeln, Wolfgang; Schweer, Timmy; Glöckner, Frank Oliver

    2014-06-14

    Advances in sequencing technologies challenge the efficient importing and validation of FASTA formatted sequence data, which is still a prerequisite for most bioinformatic tools and pipelines. Comparative analysis of commonly used Bio*-frameworks (BioPerl, BioJava and Biopython) shows that their scalability and accuracy are hampered. FastaValidator represents a platform-independent, standardized, light-weight software library written in the Java programming language. It targets computer scientists and bioinformaticians writing software which needs to quickly and accurately parse large amounts of sequence data. For end-users FastaValidator includes an interactive out-of-the-box validation of FASTA formatted files, as well as a non-interactive mode designed for high-throughput validation in software pipelines. The accuracy and performance of the FastaValidator library qualifies it for large data sets such as those commonly produced by massively parallel (NGS) technologies. It offers scientists a fast, accurate and standardized method for parsing and validating FASTA formatted sequence data.
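The kind of structural check such a validator performs can be illustrated with a minimal sketch. FastaValidator itself is a Java library with stricter rules; this Python fragment is an assumption-laden illustration of the format's basic invariant (every `>` header is followed by at least one sequence line), not the library's API.

```python
def validate_fasta(text):
    """Minimal FASTA sanity check: the file starts with a '>' header and
    every header is followed by at least one non-empty sequence line.
    Illustrative only; real validators also check alphabets, line
    lengths, and duplicate identifiers."""
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    if not lines or not lines[0].startswith(">"):
        return False
    seq_seen = True  # sentinel so the first header does not trip the check
    for ln in lines:
        if ln.startswith(">"):
            if not seq_seen:
                return False  # previous header had no sequence lines
            seq_seen = False
        else:
            seq_seen = True
    return seq_seen

print(validate_fasta(">seq1\nACGT\n>seq2\nGGTA\n"))  # True
print(validate_fasta(">seq1\n>seq2\nGGTA\n"))        # False
```

A streaming variant of the same state machine, reading line by line instead of loading the whole file, is what makes validation feasible at NGS scale.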

  13. Open-source platforms for navigated image-guided interventions.

    PubMed

    Ungi, Tamas; Lasso, Andras; Fichtinger, Gabor

    2016-10-01

    Navigation technology is changing the clinical standards in medical interventions by making existing procedures more accurate, and new procedures possible. Navigation is based on preoperative or intraoperative imaging combined with 3-dimensional position tracking of interventional tools registered to the images. Research of navigation technology in medical interventions requires significant engineering efforts. The difficulty of developing such complex systems has been limiting the clinical translation of new methods and ideas. A key to the future success of this field is to provide researchers with platforms that allow rapid implementation of applications with minimal resources spent on reimplementing existing system features. A number of platforms have been already developed that can share data in real time through standard interfaces. Complete navigation systems can be built using these platforms using a layered software architecture. In this paper, we review the most popular platforms, and show an effective way to take advantage of them through an example surgical navigation application. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Going virtual with quicktime VR: new methods and standardized tools for interactive dynamic visualization of anatomical structures.

    PubMed

    Trelease, R B; Nieder, G L; Dørup, J; Hansen, M S

    2000-04-15

    Continuing evolution of computer-based multimedia technologies has produced QuickTime, a multiplatform digital media standard that is supported by stand-alone commercial programs and World Wide Web browsers. While its core functions might be most commonly employed for production and delivery of conventional video programs (e.g., lecture videos), additional QuickTime VR "virtual reality" features can be used to produce photorealistic, interactive "non-linear movies" of anatomical structures ranging in size from microscopic through gross anatomic. But what is really included in QuickTime VR and how can it be easily used to produce novel and innovative visualizations for education and research? This tutorial introduces the QuickTime multimedia environment, its QuickTime VR extensions, basic linear and non-linear digital video technologies, image acquisition, and other specialized QuickTime VR production methods. Four separate practical applications are presented for light and electron microscopy, dissectable preserved specimens, and explorable functional anatomy in magnetic resonance cinegrams.

  15. Non-thermal plasma technologies: new tools for bio-decontamination.

    PubMed

    Moreau, M; Orange, N; Feuilloley, M G J

    2008-01-01

    Bacterial control and decontamination are crucial to industrial safety assessments. However, most recently developed materials are not compatible with standard heat sterilization treatments. Advanced oxidation processes, and particularly non-thermal plasmas, are emerging and promising technologies for sanitation because they are both efficient and cheap. The applications of non-thermal plasma to bacterial control remain poorly known for several reasons: this technique was not developed for biological applications and most of the literature is in the fields of physics and chemistry. Moreover, the diversity of the devices and complexity of the plasmas made any general evaluation of the potential of the technique difficult. Finally, no experimental equipment for non-thermal plasma sterilization is commercially available and reference articles for microbiologists are rare. The present review aims to give an overview of the principles of action and applications of plasma technologies in biodecontamination.

  16. Informatics in radiology: an information model of the DICOM standard.

    PubMed

    Kahn, Charles E; Langlotz, Curtis P; Channin, David S; Rubin, Daniel L

    2011-01-01

    The Digital Imaging and Communications in Medicine (DICOM) Standard is a key foundational technology for radiology. However, its complexity creates challenges for information system developers because the current DICOM specification requires human interpretation and is subject to nonstandard implementation. To address this problem, a formally sound and computationally accessible information model of the DICOM Standard was created. The DICOM Standard was modeled as an ontology, a machine-accessible and human-interpretable representation that may be viewed and manipulated by information-modeling tools. The DICOM Ontology includes a real-world model and a DICOM entity model. The real-world model describes patients, studies, images, and other features of medical imaging. The DICOM entity model describes connections between real-world entities and the classes that model the corresponding DICOM information entities. The DICOM Ontology was created to support the Cancer Biomedical Informatics Grid (caBIG) initiative, and it may be extended to encompass the entire DICOM Standard and serve as a foundation of medical imaging systems for research and patient care. RSNA, 2010
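The two-layer structure described above, a real-world model of patients, studies, and images with links to the corresponding DICOM information entities, can be sketched as follows. The class and entity names here are hypothetical simplifications, not the DICOM Ontology's actual vocabulary.

```python
from dataclasses import dataclass, field

@dataclass
class RealWorldEntity:
    """A real-world thing (patient, study, image) linked to the name of
    the DICOM information entity that encodes it."""
    name: str
    dicom_entity: str
    children: list = field(default_factory=list)

# A tiny patient -> study -> image hierarchy with its DICOM-side links.
image   = RealWorldEntity("ChestRadiograph", "Image IE")
study   = RealWorldEntity("ImagingStudy", "Study IE", [image])
patient = RealWorldEntity("Patient", "Patient IE", [study])

def walk(entity, depth=0):
    """Print the hierarchy with each entity's DICOM counterpart."""
    print("  " * depth + f"{entity.name} -> {entity.dicom_entity}")
    for child in entity.children:
        walk(child, depth + 1)

walk(patient)
```

Separating the real-world layer from the DICOM entity layer is what lets tools reason about patients and studies without parsing the full DICOM specification.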

  17. New Technologies to Assist Training in Hospitality Sector

    ERIC Educational Resources Information Center

    Balta, Sabah

    2007-01-01

    The hospitality sector needs new technological training tools that can help improve employees' skills and the quality of service. The sector might become more interactive when these technological training tools are used in on-the-job training programs. This study addresses the issue of illuminating new technological tools that reinforce training in which…

  18. The FOT tool kit concept

    NASA Technical Reports Server (NTRS)

    Fatig, Michael

    1993-01-01

    Flight operations and the preparation for them have become increasingly complex as mission complexity increases. Further, the mission model dictates that a significant increase in flight operations activities is upon us. Finally, there is a need for process improvement and economy in the operations arena. It is therefore time to recognize flight operations as a complex process requiring a defined, structured, life-cycle approach vitally linked to the space segment, ground segment, and science operations processes. With this recognition, an FOT Tool Kit consisting of six major components was developed to provide tools that guide flight operations activities throughout the mission life cycle. The major components of the FOT Tool Kit, and the concepts behind the flight operations life cycle process as developed at NASA's GSFC for GSFC-based missions, are addressed. The Tool Kit is intended to improve the productivity, quality, cost, and schedule performance of flight operations tasks through the use of documented, structured methodologies; knowledge of past lessons learned and upcoming new technology; and the reuse and sharing of key products and special application programs made possible by standardized key-product and special-program directories.

  19. Secure Web-based Ground System User Interfaces over the Open Internet

    NASA Technical Reports Server (NTRS)

    Langston, James H.; Murray, Henry L.; Hunt, Gary R.

    1998-01-01

    A prototype has been developed which makes use of commercially available products in conjunction with the Java programming language to provide a secure user interface for command and control over the open Internet. This paper reports the successful demonstration of: (1) Security over the Internet, including encryption and certification; (2) Integration of Java applets with a COTS command and control product; (3) Remote spacecraft commanding using the Internet. The Java-based Spacecraft Web Interface to Telemetry and Command Handling (Jswitch) ground system prototype provides these capabilities. This activity demonstrates the use and integration of current technologies to enable a spacecraft engineer or flight operator to monitor and control a spacecraft from a user interface communicating over the open Internet using standard World Wide Web (WWW) protocols and commercial off-the-shelf (COTS) products. The core command and control functions are provided by the COTS Epoch 2000 product. The standard WWW tools and browsers are used in conjunction with the Java programming technology. Security is provided with the current encryption and certification technology. This system prototype is a step in the direction of giving scientists and flight operators Web-based access to instrument, payload, and spacecraft data.

  20. Assay optimisation and technology transfer for multi-site immuno-monitoring in vaccine trials

    PubMed Central

    Harris, Stephanie A.; Satti, Iman; Bryan, Donna; Walker, K. Barry; Dockrell, Hazel M.; McShane, Helen; Ho, Mei Mei

    2017-01-01

    Cellular immunological assays are important tools for the monitoring of responses to T-cell-inducing vaccine candidates. As these bioassays are often technically complex and require considerable experience, careful technology transfer between laboratories is critical if high-quality, reproducible data that allow comparison between sites are to be generated. The aim of this study, funded by the European Union Framework Programme 7 TRANSVAC project, was to optimise Standard Operating Procedures and the technology transfer process to maximise the reproducibility of three bioassays for interferon-gamma responses: enzyme-linked immunosorbent assay (ELISA), ex-vivo enzyme-linked immunospot and intracellular cytokine staining. We found that the initial variability in results generated across three different laboratories was reduced by a combination of Standard Operating Procedure harmonisation and side-by-side training sessions in which assay operators performed each assay in the presence of an assay ‘lead’ operator. Mean inter-site coefficients of variance were reduced after this training session when compared with the pre-training values, most notably for the ELISA assay. There was a trend for increased inter-site variability at lower response magnitudes for the ELISA and intracellular cytokine staining assays. In conclusion, we recommend that on-site operator training is an essential component of the assay technology transfer process and, combined with harmonised Standard Operating Procedures, will improve the quality, reproducibility and comparability of data produced across different laboratories. These data may be helpful in ongoing discussions of the potential risks and benefits of centralised immunological assay strategies for large clinical trials versus decentralised units. PMID:29020010
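    The inter-site coefficient of variance used here to quantify reproducibility is simply the across-site standard deviation divided by the mean. A sketch with invented readings (the study's actual values are not reproduced here):

```python
from statistics import mean, stdev

def inter_site_cv(values):
    """Inter-site coefficient of variance, as a percentage."""
    return 100 * stdev(values) / mean(values)

# Illustrative interferon-gamma ELISA readings (pg/ml) for one sample
# measured at three sites, before and after SOP harmonisation and
# side-by-side training. The numbers are invented, not the study's data.
before = [410.0, 250.0, 330.0]
after = [340.0, 310.0, 330.0]

print(round(inter_site_cv(before), 1), "->", round(inter_site_cv(after), 1))
```

    A drop in this percentage after training is exactly the improvement the abstract reports, most notably for the ELISA assay.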

  1. Food Safety Informatics: A Public Health Imperative

    PubMed Central

    Tucker, Cynthia A.; Larkin, Stephanie N.; Akers, Timothy A.

    2011-01-01

    To date, little has been written about the use of food safety informatics as a technological tool to protect consumers, in real time, against foodborne illnesses. Food safety outbreaks have become a major public health problem, causing an estimated 48 million illnesses, 128,000 hospitalizations, and 3,000 deaths in the U.S. each year. Yet government inspectors and regulators who monitor foodservice operations struggle with how to collect, organize, and analyze data and how to implement, monitor, and enforce safe food systems. Currently, standardized technologies have not been implemented to efficiently establish “near-in-time” or “just-in-time” electronic awareness to enhance the early detection of public health threats regarding food safety. To assess the potential impact of the collection, organization, and analysis of data in a foodservice operation, a wireless food safety informatics (FSI) tool was pilot tested at a university student foodservice center. The technological platform in this test collected data every six minutes over a 24-hour period across two primary domains: time and temperature within freezers, walk-in refrigerators, and dry storage areas. The results of this pilot study illustrate how technology can assist in food safety surveillance and monitoring by efficiently detecting food safety abnormalities related to time and temperature, so that an efficient and proper response can be made in “real time” to prevent potential foodborne illnesses. PMID:23569605
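    The pilot's surveillance logic (readings every six minutes, checked against per-domain temperature limits) can be sketched roughly as follows; the safe ranges and readings are invented for illustration, not taken from the study:

```python
from datetime import datetime, timedelta

# Hypothetical safe ranges (degrees C) per storage domain. Actual
# thresholds come from food-safety regulations, not from this sketch.
SAFE_RANGE = {"freezer": (-23.0, -18.0), "walk_in": (0.0, 4.0), "dry": (10.0, 21.0)}

def detect_abnormalities(readings):
    """Flag any 6-minute reading outside its domain's safe range."""
    alerts = []
    for ts, domain, temp in readings:
        lo, hi = SAFE_RANGE[domain]
        if not (lo <= temp <= hi):
            alerts.append((ts, domain, temp))
    return alerts

start = datetime(2011, 1, 1)
readings = [
    (start, "freezer", -20.0),
    (start + timedelta(minutes=6), "walk_in", 6.5),  # too warm
    (start + timedelta(minutes=12), "dry", 15.0),
]
print(detect_abnormalities(readings))
```

    Raising an alert as soon as a reading falls outside its range is what gives the "near-in-time" electronic awareness the abstract argues for.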

  2. Evaluation of a Framework to Implement Electronic Health Record Systems Based on the openEHR Standard

    NASA Astrophysics Data System (ADS)

    Orellana, Diego A.; Salas, Alberto A.; Solarz, Pablo F.; Medina Ruiz, Luis; Rotger, Viviana I.

    2016-04-01

    The production of clinical information about each patient is constantly increasing, and it is noteworthy that this information is created in different formats and at diverse points of care, resulting in fragmented, incomplete, inaccurate and isolated health information. The use of health information technology has been promoted as having a decisive impact on the efficiency, cost-effectiveness, quality and safety of medical care delivery. In developing countries, however, the utilization of health information technology is insufficient and lacks standards, among other problems. In the present work we evaluate the EHRGen framework, based on the openEHR standard, as a means to achieve the generation and availability of patient-centered information. The framework has been evaluated through the tools provided for end users, that is, without the intervention of computer experts. It makes the openEHR ideas easier to adopt and provides an open source basis with a set of services, although some limitations in its current state work against interoperability and usability. Despite these limitations with respect to usability and semantic interoperability, EHRGen is, at least regionally, a considerable step toward EHR adoption and interoperability, and it should therefore be supported by academic and administrative institutions.

  3. Application of telecom planar lightwave circuits for homeland security sensing

    NASA Astrophysics Data System (ADS)

    Veldhuis, Gert J.; Elders, Job; van Weerden, Harm; Amersfoort, Martin

    2004-03-01

    Over the past decade, a massive effort has been made in the development of planar lightwave circuits (PLCs) for application in optical telecommunications. Major advances have been made on both the technological and the functional-performance fronts. These developments are supported by highly sophisticated software tools that are used to tailor designs to the required functional performance. In addition, extensive know-how in the field of packaging, testing, and failure mode and effects analysis (FMEA) has been built up in the struggle to meet the stringent Telcordia requirements that apply to telecom products. As an example, silica-on-silicon is now a mature technology available at several industrial foundries around the world, where, on the performance front, the arrayed-waveguide grating (AWG) has evolved into an off-the-shelf product. The field of optical chemical-biological (CB) sensors for homeland security applications can greatly benefit from the advances described above. In this paper we discuss the currently available technologies, device concepts, and modeling tools that have emerged from the telecommunications arena and that can effectively be applied to the field of homeland security. Using this profound telecom knowledge base, standard telecom components can readily be tailored for detecting CB agents. Designs for telecom components aim at complete isolation from the environment, to exclude the impact of environmental parameters on optical performance. For sensing applications, however, the optical path must be exposed to the measurand; in this area, additional development is required beyond what has already been achieved in telecom development. We have tackled this problem and are now in a position to apply standard telecom components to CB sensing. As an example, the application of an AWG as a refractometer is demonstrated and its performance evaluated.

  4. SeaDataNet Pan-European infrastructure for Ocean & Marine Data Management

    NASA Astrophysics Data System (ADS)

    Manzella, G. M.; Maillard, C.; Maudire, G.; Schaap, D.; Rickards, L.; Nast, F.; Balopoulos, E.; Mikhailov, N.; Vladymyrov, V.; Pissierssens, P.; Schlitzer, R.; Beckers, J. M.; Barale, V.

    2007-12-01

    SeaDataNet is developing a pan-European data management infrastructure to ensure access to a large number of marine environmental data (i.e. temperature, salinity, current, sea level, and chemical, physical and biological properties), as well as their safeguarding and long-term archiving. Data are derived from many different sensors installed on board research vessels, on satellites, and on the various platforms of the marine observing system. SeaDataNet provides information on real-time and archived marine environmental data collected at a pan-European level, through directories of marine environmental data and projects. It gives access to the most comprehensive multidisciplinary sets of marine in-situ and remote sensing data, from about 40 laboratories, through user-friendly tools. Data selection and access are operated through the Common Data Index (CDI): XML files compliant with ISO standards and unified dictionaries. Technical developments carried out by SeaDataNet include: a library of standards - metadata standards, compliant with ISO 19115, for communication and interoperability between the data platforms; software for an interoperable online system - interconnection of distributed data centres by interfacing adapted communication technology tools; off-line data management software - the software constituting the minimum equipment of all the data centres, developed by AWI as Ocean Data View (ODV); and training, education and capacity building - training 'on the job' is carried out by IOC-UNESCO in Ostende, and the SeaDataNet Virtual Educational Centre internet portal provides basic tools for informal education.
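    The CDI mechanism (XML records compliant with ISO standards, queried to select data) can be illustrated with a toy record; real CDI files use the full ISO 19115 namespaces and element names, which this sketch omits for brevity:

```python
import xml.etree.ElementTree as ET

# A toy CDI-like record. Element names and the identifier are invented;
# actual SeaDataNet CDI records follow the ISO 19115 schema.
cdi_xml = """
<record>
  <identifier>CDI-000123</identifier>
  <parameter>temperature</parameter>
  <parameter>salinity</parameter>
  <boundingBox west="-5.0" east="36.0" south="30.0" north="46.0"/>
</record>
"""

root = ET.fromstring(cdi_xml)
params = [p.text for p in root.findall("parameter")]
bbox = root.find("boundingBox").attrib
print(root.findtext("identifier"), params, bbox["north"])
```

    Because every data centre publishes the same metadata structure against unified dictionaries, one query tool can search holdings across all of them.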

  5. Research on the tool holder mode in high speed machining

    NASA Astrophysics Data System (ADS)

    Zhenyu, Zhao; Yongquan, Zhou; Houming, Zhou; Xiaomei, Xu; Haibin, Xiao

    2018-03-01

    High-speed machining technology can improve processing efficiency and precision while also reducing processing cost, and it is therefore highly regarded in industry. With the extensive application of high-speed machining technology, high-speed tool systems place ever higher requirements on the tool chuck. At present, several new kinds of tool holders are used in high-speed precision machining, including the heat-shrinkage tool holder, the high-precision spring chuck, the hydraulic tool holder, and the three-rib deformation chuck. Among them, the heat-shrinkage tool holder has the advantages of high precision, high clamping force, high bending rigidity, and good dynamic balance, and is therefore widely used. It is thus of great significance to study the new requirements placed on the machining tool system. To meet the requirements of high-speed precision machining, this paper describes the common tool holder technologies of high-precision machining, analyzes the characteristics and existing problems of tool clamping systems, and proposes how to correctly select a tool clamping system in practice.

  6. A Simulated Learning Environment for Teaching Medicine Dispensing Skills

    PubMed Central

    Styles, Kim; Sewell, Keith; Trinder, Peta; Marriott, Jennifer; Maher, Sheryl; Naidu, Som

    2016-01-01

    Objective. To develop an authentic simulation of the professional practice dispensary context for students to develop their dispensing skills in a risk-free environment. Design. A development team used an Agile software development method to create MyDispense, a web-based simulation. Modeled on elements of virtual learning environments, the software employed widely available standards-based technologies to create a virtual community pharmacy environment. Assessment. First-year pharmacy students who used the software in their tutorials were surveyed at the end of the second semester on their prior dispensing experience and their perceptions of MyDispense as a tool for learning dispensing skills. Conclusion. The dispensary simulation is an effective tool for helping students develop dispensing competency and knowledge in a safe environment. PMID:26941437

  7. [Activities using websites and social networks: tools and indicators for evaluation].

    PubMed

    López, María José; Continente, Xavier; Sánchez, Esther; Bartroli, Montse

    In the field of health, information and communication technology (ICT) can create a space that, regardless of place or time, enables information to be shared and disseminated quickly. In addition to the usual challenges of evaluating public health activities, other difficulties arise when evaluating activities that use ICT, such as the lack of previous standards, unknown individual exposure, or the lack of information on the characteristics of those exposed. The aim of this paper is to describe some tools and indicators that may help to assess the scope, use and parameters related to website positioning on search engines, as well as the connected social networks. Copyright © 2017 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  8. On the utility of GPU accelerated high-order methods for unsteady flow simulations: A comparison with industry-standard tools

    NASA Astrophysics Data System (ADS)

    Vermeire, B. C.; Witherden, F. D.; Vincent, P. E.

    2017-04-01

    First- and second-order accurate numerical methods, implemented for CPUs, underpin the majority of industrial CFD solvers. Whilst this technology has proven very successful at solving steady-state problems via a Reynolds Averaged Navier-Stokes approach, its utility for undertaking scale-resolving simulations of unsteady flows is less clear. High-order methods for unstructured grids and GPU accelerators have been proposed as an enabling technology for unsteady scale-resolving simulations of flow over complex geometries. In this study we systematically compare accuracy and cost of the high-order Flux Reconstruction solver PyFR running on GPUs and the industry-standard solver STAR-CCM+ running on CPUs when applied to a range of unsteady flow problems. Specifically, we perform comparisons of accuracy and cost for isentropic vortex advection (EV), decay of the Taylor-Green vortex (TGV), turbulent flow over a circular cylinder, and turbulent flow over an SD7003 aerofoil. We consider two configurations of STAR-CCM+: a second-order configuration, and a third-order configuration, where the latter was recommended by CD-adapco for more effective computation of unsteady flow problems. Results from both PyFR and STAR-CCM+ demonstrate that third-order schemes can be more accurate than second-order schemes for a given cost; e.g., going from second- to third-order, the PyFR simulations of the EV and TGV achieve 75× and 3× error reductions, respectively, for the same or reduced cost, and the STAR-CCM+ simulations of the cylinder recovered wake statistics significantly more accurately for only twice the cost. Moreover, advancing to higher-order schemes on GPUs with PyFR was found to offer even further accuracy vs. cost benefits relative to industry-standard tools.
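    The accuracy-per-cost argument rests on the standard convergence estimate that, for a smooth problem, discretisation error scales like h**p for a p-th-order scheme with mesh spacing h. A back-of-the-envelope sketch with illustrative constants, not values fitted to the PyFR or STAR-CCM+ results:

```python
# Toy error model: error ~ C * h**p for a p-th-order scheme on a
# smooth problem. The constant and spacing below are illustrative.
def error(h, p, C=1.0):
    return C * h ** p

h = 0.05  # same mesh spacing, i.e. roughly comparable cost per step
second_order = error(h, 2)
third_order = error(h, 3)
print(third_order < second_order)   # True: higher order wins at matched h
print(second_order / third_order)   # error-reduction factor, here 1/h
```

    Real solver comparisons are muddier (constants differ, cost per degree of freedom differs), which is why the study measures accuracy and wall-clock cost directly rather than relying on asymptotic rates.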

  9. On the utility of GPU accelerated high-order methods for unsteady flow simulations: A comparison with industry-standard tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vermeire, B.C., E-mail: brian.vermeire@concordia.ca; Witherden, F.D.; Vincent, P.E.

    First- and second-order accurate numerical methods, implemented for CPUs, underpin the majority of industrial CFD solvers. Whilst this technology has proven very successful at solving steady-state problems via a Reynolds Averaged Navier–Stokes approach, its utility for undertaking scale-resolving simulations of unsteady flows is less clear. High-order methods for unstructured grids and GPU accelerators have been proposed as an enabling technology for unsteady scale-resolving simulations of flow over complex geometries. In this study we systematically compare accuracy and cost of the high-order Flux Reconstruction solver PyFR running on GPUs and the industry-standard solver STAR-CCM+ running on CPUs when applied to a range of unsteady flow problems. Specifically, we perform comparisons of accuracy and cost for isentropic vortex advection (EV), decay of the Taylor–Green vortex (TGV), turbulent flow over a circular cylinder, and turbulent flow over an SD7003 aerofoil. We consider two configurations of STAR-CCM+: a second-order configuration, and a third-order configuration, where the latter was recommended by CD-adapco for more effective computation of unsteady flow problems. Results from both PyFR and STAR-CCM+ demonstrate that third-order schemes can be more accurate than second-order schemes for a given cost; e.g., going from second- to third-order, the PyFR simulations of the EV and TGV achieve 75× and 3× error reductions, respectively, for the same or reduced cost, and the STAR-CCM+ simulations of the cylinder recovered wake statistics significantly more accurately for only twice the cost. Moreover, advancing to higher-order schemes on GPUs with PyFR was found to offer even further accuracy vs. cost benefits relative to industry-standard tools.

  10. Behavioral Health Program Element

    NASA Technical Reports Server (NTRS)

    Leveton, Lauren B.

    2006-01-01

    The project goal is to develop a behavioral health prevention and maintenance system for continued crew health, safety, and performance during exploration missions. The basic scope includes: a) operationally relevant research related to the clinical, cognitive, and behavioral health of crewmembers; b) ground-based studies using analog environments (Antarctic, NEEMO, simulations, and other testbeds); c) ISS studies (ISSMP) focusing on operational issues related to behavioral health outcomes and standards; d) technology development activities for monitoring and diagnostic tools; and e) cross-disciplinary research (e.g., human factors and habitability research, skeletal muscle, radiation).

  11. ISA software suite: supporting standards-compliant experimental annotation and enabling curation at the community level

    PubMed Central

    Rocca-Serra, Philippe; Brandizi, Marco; Maguire, Eamonn; Sklyar, Nataliya; Taylor, Chris; Begley, Kimberly; Field, Dawn; Harris, Stephen; Hide, Winston; Hofmann, Oliver; Neumann, Steffen; Sterk, Peter; Tong, Weida; Sansone, Susanna-Assunta

    2010-01-01

    Summary: The first open source software suite for experimentalists and curators that (i) assists in the annotation and local management of experimental metadata from high-throughput studies employing one or a combination of omics and other technologies; (ii) empowers users to take up community-defined checklists and ontologies; and (iii) facilitates submission to international public repositories. Availability and Implementation: Software, documentation, case studies and implementations are available at http://www.isa-tools.org. Contact: isatools@googlegroups.com PMID:20679334

  12. The visible human and digital anatomy learning initiative.

    PubMed

    Dev, Parvati; Senger, Steven

    2005-01-01

    A collaborative initiative is starting within the Internet2 Health Science community to explore the development of a framework for providing access to digital anatomical teaching resources over Internet2. This is a cross-cutting initiative with broad applicability and will require the involvement of a diverse collection of communities. It will seize an opportunity created by a convergence of needs and technical capabilities to identify the technologies and standards needed to support a sophisticated collection of tools for teaching anatomy.

  13. Navigating freely-available software tools for metabolomics analysis.

    PubMed

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas and in regard to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. This review aims to compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. Tools were selected for inclusion by either having ≥ 50 citations on Web of Science (as of 08/09/16) or their use being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) for which they are designed. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools, which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised by their main functionality. As future work, we suggest a direct comparison of the tools' abilities to perform specific data analysis tasks, e.g. peak picking.
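    The review's two-axis categorisation (instrumental data type by functionality) amounts to a filterable catalogue. A sketch with a few example entries; the categories assigned here are illustrative, and the curated list itself lives in the linked GitHub repository:

```python
# Toy catalogue using the review's two axes: instrumental data type
# and functionality. Entries and category labels are illustrative.
tools = [
    {"name": "XCMS", "data": "LC-MS", "function": "pre-processing"},
    {"name": "MetaboAnalyst", "data": "LC-MS", "function": "statistical analysis"},
    {"name": "BATMAN", "data": "NMR", "function": "pre-processing"},
]

def find_tools(data=None, function=None):
    """Filter the catalogue by either axis; None means 'any'."""
    return [t["name"] for t in tools
            if (data is None or t["data"] == data)
            and (function is None or t["function"] == function)]

print(find_tools(data="LC-MS"))             # ['XCMS', 'MetaboAnalyst']
print(find_tools(function="pre-processing"))
```

    A simple controlled vocabulary, as used by the extended list, is essentially this filter with the category labels fixed in advance.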

  14. Predictive Accuracy of Sweep Frequency Impedance Technology in Identifying Conductive Conditions in Newborns.

    PubMed

    Aithal, Venkatesh; Kei, Joseph; Driscoll, Carlie; Murakoshi, Michio; Wada, Hiroshi

    2018-02-01

    Diagnosing conductive conditions in newborns is challenging for both audiologists and otolaryngologists. Although high-frequency tympanometry (HFT), acoustic stapedial reflex tests, and wideband absorbance measures are useful diagnostic tools, there is performance measure variability in their detection of middle ear conditions. Additional diagnostic sensitivity and specificity measures gained through new technology such as sweep frequency impedance (SFI) measures may assist in the diagnosis of middle ear dysfunction in newborns. The purpose of this study was to determine the test performance of SFI to predict the status of the outer and middle ear in newborns against commonly used reference standards. Automated auditory brainstem response (AABR), HFT (1000 Hz), transient evoked otoacoustic emission (TEOAE), distortion product otoacoustic emission (DPOAE), and SFI tests were administered to the study sample. A total of 188 neonates (98 males and 90 females) with a mean gestational age of 39.4 weeks were included in the sample. Mean age at the time of testing was 44.4 hr. Diagnostic accuracy of SFI was assessed in terms of its ability to identify conductive conditions in neonates when compared with nine different reference standards (including four single tests [AABR, HFT, TEOAE, and DPOAE] and five test batteries [HFT + DPOAE, HFT + TEOAE, DPOAE + TEOAE, DPOAE + AABR, and TEOAE + AABR]), using receiver operating characteristic (ROC) analysis and traditional test performance measures such as sensitivity and specificity. The test performance of SFI against the test battery reference standard of HFT + DPOAE and single reference standard of HFT was high with an area under the ROC curve (AROC) of 0.87 and 0.82, respectively. Although the HFT + DPOAE test battery reference standard performed better than the HFT reference standard in predicting middle ear conductive conditions in neonates, the difference in AROC was not significant. 
    Further analysis revealed that the highest sensitivity and specificity for SFI (86% and 88%, respectively) were obtained when compared with the reference standard of HFT + DPOAE. Among the four single reference standards, SFI had the highest sensitivity and specificity (76% and 88%, respectively) when compared against the HFT reference standard. The high test performance of SFI against the HFT and HFT + DPOAE reference standards indicates that the SFI measure has appropriate diagnostic accuracy in the detection of conductive conditions in newborns. Hence, the SFI test could be used as an adjunct tool to identify conductive conditions in universal newborn hearing screening programs, and can also be used in diagnostic follow-up assessments. American Academy of Audiology
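    The sensitivity and specificity figures quoted above come from a standard 2×2 comparison of SFI outcomes against a reference standard. A sketch of the arithmetic; the counts are invented (chosen only so that the percentages match the reported 86% and 88%), not taken from the study:

```python
# Sensitivity and specificity from a 2x2 table of test outcomes
# against a reference standard. Counts below are illustrative.
def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # true positives / all reference-positives
    specificity = tn / (tn + fp)   # true negatives / all reference-negatives
    return sensitivity, specificity

# e.g. 43 of 50 reference-positive ears flagged, 121 of 138 negatives passed
se, sp = sens_spec(tp=43, fn=7, tn=121, fp=17)
print(f"sensitivity={se:.0%} specificity={sp:.0%}")
```

    ROC analysis then sweeps the test's decision threshold and plots sensitivity against 1 - specificity; the area under that curve is the AROC reported above.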

  15. Perception of Science Standards' Effectiveness and Their Implementation by Science Teachers

    NASA Astrophysics Data System (ADS)

    Klieger, Aviva; Yakobovitch, Anat

    2011-06-01

    The introduction of standards into the education system poses numerous challenges and difficulties. As with any change, plans should be made for teachers to understand and implement the standards. This study examined science teachers' perceptions of the effectiveness of the standards for teaching and learning, and the extent and ease or difficulty of implementing science standards in different grades. The research used a mixed methods approach, combining qualitative and quantitative research methods. The research tools were questionnaires administered to elementary school science teachers. The majority of the teachers perceived the standards in science as effective for teaching and learning, and only a small minority viewed them as restricting their pedagogical autonomy. Differences were found in the extent of implementation of the different standards and between different grades. The teachers perceived different degrees of difficulty in implementing the different standards. The standards experienced as easiest to implement were in the fields of biology and materials, whereas the standards in earth sciences and the universe and in technology were the most difficult to implement; these are also the ones the teachers evaluated as being implemented to the least extent. Exposing teachers' perceptions of the effectiveness of standards and of their implementation may aid policymakers in the future planning of teachers' professional development for implementing standards.

  16. European solvent industry group generic exposure scenario risk and exposure tool

    PubMed Central

    Zaleski, Rosemary T; Qian, Hua; Zelenka, Michael P; George-Ares, Anita; Money, Chris

    2014-01-01

    The European Solvents Industry Group (ESIG) Generic Exposure Scenario (GES) Risk and Exposure Tool (EGRET) was developed to facilitate the safety evaluation of consumer uses of solvents, as required by the European Union Registration, Evaluation and Authorization of Chemicals (REACH) Regulation. This exposure-based risk assessment tool provides estimates of both exposure and risk characterization ratios for consumer uses. It builds upon the consumer portion of the European Center for Ecotoxicology and Toxicology of Chemicals (ECETOC) Targeted Risk Assessment (TRA) tool by implementing refinements described in ECETOC TR107. Technical enhancements included the use of additional data to refine scenario defaults and the ability to include additional parameters in exposure calculations. Scenarios were also added to cover all frequently encountered consumer uses of solvents. The TRA tool structure was modified to automatically determine conditions necessary for safe use. EGRET reports results using specific standard phrases in a format consistent with REACH exposure scenario guidance, in order that the outputs can be readily assimilated within safety data sheets and similar information technology systems. Evaluation of tool predictions for a range of commonly encountered consumer uses of solvents found it provides reasonable yet still conservative exposure estimates. PMID:23361440
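    The core arithmetic behind such a tool is an exposure estimate divided by a derived no-effect level (DNEL), giving a risk characterisation ratio (RCR) with safe use indicated by RCR < 1. A deliberately simplified sketch; the one-box dilution model and all numbers are illustrative, and are not EGRET's actual algorithms or defaults:

```python
# Simplified risk-characterisation arithmetic: a one-box, fully mixed
# room model for an inhalation exposure event. Illustrative only.
def inhalation_exposure(amount_g, fraction_released, room_m3):
    """Event airborne concentration in mg/m3 in a fully mixed room."""
    return amount_g * 1000 * fraction_released / room_m3

def rcr(exposure, dnel):
    """Risk characterisation ratio; < 1 indicates safe use."""
    return exposure / dnel

# e.g. 50 g of product, 10% of solvent released to air, 20 m3 room
exposure = inhalation_exposure(amount_g=50, fraction_released=0.1, room_m3=20)
print(exposure)                      # 250.0 mg/m3
print(rcr(exposure, dnel=500) < 1)   # True -> conditions indicate safe use
```

    Tools like EGRET automate this loop over scenario defaults, tightening the conditions of use until the RCR falls below 1, and then report those conditions in standard phrases for safety data sheets.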

  17. European solvent industry group generic exposure scenario risk and exposure tool.

    PubMed

    Zaleski, Rosemary T; Qian, Hua; Zelenka, Michael P; George-Ares, Anita; Money, Chris

    2014-01-01

    The European Solvents Industry Group (ESIG) Generic Exposure Scenario (GES) Risk and Exposure Tool (EGRET) was developed to facilitate the safety evaluation of consumer uses of solvents, as required by the European Union Registration, Evaluation and Authorization of Chemicals (REACH) Regulation. This exposure-based risk assessment tool provides estimates of both exposure and risk characterization ratios for consumer uses. It builds upon the consumer portion of the European Center for Ecotoxicology and Toxicology of Chemicals (ECETOC) Targeted Risk Assessment (TRA) tool by implementing refinements described in ECETOC TR107. Technical enhancements included the use of additional data to refine scenario defaults and the ability to include additional parameters in exposure calculations. Scenarios were also added to cover all frequently encountered consumer uses of solvents. The TRA tool structure was modified to automatically determine conditions necessary for safe use. EGRET reports results using specific standard phrases in a format consistent with REACH exposure scenario guidance, in order that the outputs can be readily assimilated within safety data sheets and similar information technology systems. Evaluation of tool predictions for a range of commonly encountered consumer uses of solvents found it provides reasonable yet still conservative exposure estimates.

  18. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.
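Parametric sizing and cost equations of the kind ATLAS applies typically take a power-law form. The sketch below illustrates that general form in Python; the coefficients and subsystem names are hypothetical, not values from the ATLAS Technology Tool Box.

```python
# Sketch of a parametric cost-estimating relationship (CER) of the general
# form cost = a * mass**b. Coefficients here are hypothetical placeholders.

def subsystem_cost(mass_kg, a=25.0, b=0.7):
    """Power-law CER: development cost (in notional $k) versus dry mass."""
    return a * mass_kg ** b

def vehicle_cost(subsystem_masses):
    """Roll subsystem estimates up to a vehicle-level figure."""
    return sum(subsystem_cost(m) for m in subsystem_masses.values())

masses = {"structure": 1200.0, "power": 400.0, "avionics": 150.0}
total = vehicle_cost(masses)
```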

  19. Data hosting infrastructure for primary biodiversity data

    PubMed Central

    2011-01-01

    Background Today, an unprecedented volume of primary biodiversity data are being generated worldwide, yet significant amounts of these data have been and will continue to be lost after the conclusion of the projects tasked with collecting them. To get the most value out of these data it is imperative to seek a solution whereby these data are rescued, archived and made available to the biodiversity community. To this end, the biodiversity informatics community requires investment in processes and infrastructure to mitigate data loss and provide solutions for long-term hosting and sharing of biodiversity data. Discussion We review the current state of biodiversity data hosting and investigate the technological and sociological barriers to proper data management. We further explore the rescuing and re-hosting of legacy data, the state of existing toolsets and propose a future direction for the development of new discovery tools. We also explore the role of data standards and licensing in the context of data hosting and preservation. We provide five recommendations for the biodiversity community that will foster better data preservation and access: (1) encourage the community's use of data standards, (2) promote the public domain licensing of data, (3) establish a community of those involved in data hosting and archival, (4) establish hosting centers for biodiversity data, and (5) develop tools for data discovery. Conclusion The community's adoption of standards and development of tools to enable data discovery is essential to sustainable data preservation. Furthermore, the increased adoption of open content licensing, the establishment of data hosting infrastructure and the creation of a data hosting and archiving community are all necessary steps towards the community ensuring that data archival policies become standardized. PMID:22373257

  20. LiPD and CSciBox: A Case Study in Why Data Standards are Important for Paleoscience

    NASA Astrophysics Data System (ADS)

    Weiss, I.; Bradley, E.; McKay, N.; Emile-Geay, J.; de Vesine, L. R.; Anderson, K. A.; White, J. W. C.; Marchitto, T. M., Jr.

    2016-12-01

    CSciBox [1] is an integrated software system that helps geoscientists build and evaluate age models. Its user chooses from a number of built-in analysis tools, composing them into an analysis workflow and applying it to paleoclimate proxy datasets. CSciBox employs modern database technology to store both the data and the analysis results in an easily accessible and searchable form, and offers the user access to the computational toolbox, the data, and the results via a graphical user interface and a sophisticated plotter. Standards are a staple of modern life, and underlie any form of automation. Without data standards, it is difficult, if not impossible, to construct effective computer tools for paleoscience analysis. The LiPD (Linked Paleo Data) framework [2] enables the storage of both data and metadata in systematic, meaningful, machine-readable ways. LiPD has been a primary enabler of CSciBox's goals of usability, interoperability, and reproducibility. Building LiPD capabilities into CSciBox's importer, for instance, eliminated the need to ask the user about file formats, variable names, relationships between columns in the input file, etc. Building LiPD capabilities into the exporter facilitated the storage of complete details about the input data (provenance, preprocessing steps, etc.) as well as full descriptions of any analyses that were performed using the CSciBox tool, along with citations to appropriate references. This comprehensive collection of data and metadata, which is all linked together in a semantically meaningful, machine-readable way, not only completely documents the analyses and makes them reproducible. It also enables interoperability with any other software system that employs the LiPD standard. [1] www.cs.colorado.edu/~lizb/cscience.html [2] McKay & Emile-Geay, Climate of the Past 12:1093 (2016)
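A LiPD file bundles machine-readable JSON metadata with tabular data, which is what lets an importer discover variable names and column relationships without asking the user. The sketch below illustrates that idea on a heavily simplified, hypothetical metadata fragment; it is not the official LiPD utilities and the dataset name is invented.

```python
import json

# Simplified stand-in for LiPD-style metadata: a real LiPD file is an
# archive pairing JSON metadata like this with CSV data tables.
lipd_metadata = json.loads("""{
  "dataSetName": "ExampleLake.Proxy.2016",
  "paleoData": [{
    "measurementTable": [{
      "columns": [
        {"variableName": "depth", "units": "cm"},
        {"variableName": "d18O", "units": "permil"}
      ]
    }]
  }]
}""")

def variable_names(meta):
    """Collect every variableName declared in the measurement tables."""
    names = []
    for paleo in meta.get("paleoData", []):
        for table in paleo.get("measurementTable", []):
            names.extend(col["variableName"] for col in table.get("columns", []))
    return names

print(variable_names(lipd_metadata))  # ['depth', 'd18O']
```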

  1. A Meta-Analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    ERIC Educational Resources Information Center

    Zhang, Lin

    2014-01-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tools. This paper discusses the emerging issues, such as how can learning effectiveness be understood in relation to…

  2. Technology of machine tools. Volume 1. Executive summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutton, G.P.

    1980-10-01

    The Machine Tool Task Force (MTTF) was formed to characterize the state of the art of machine tool technology and to identify promising future directions of this technology. This volume is one of a five-volume series that presents the MTTF findings; reports on various areas of the technology were contributed by experts in those areas.

  3. Standardized sign-out reduces intern perception of medical errors on the general internal medicine ward.

    PubMed

    Salerno, Stephen M; Arnett, Michael V; Domanski, Jeremy P

    2009-01-01

    Prior research on reducing variation in housestaff handoff procedures has depended on proprietary checkout software. Use of low-technology standardization techniques has not been widely studied. We wished to determine if standardizing the process of intern sign-out using low-technology sign-out tools could reduce perception of errors and missing handoff data. We conducted a pre-post prospective study of a cohort of 34 interns on a general internal medicine ward. Night interns coming off duty and day interns reassuming care were surveyed on their perception of erroneous sign-out data, mistakes made by the night intern overnight, and occurrences unanticipated by sign-out. Trainee satisfaction with the sign-out process was assessed with a 5-point Likert survey. There were 399 intern surveys performed 8 weeks before and 6 weeks after the introduction of a standardized sign-out form. The response rate was 95% for the night interns and 70% for the interns reassuming care in the morning. After the standardized form was introduced, night interns were significantly (p < .003) less likely to detect missing sign-out data including missing important diseases, contingency plans, or medications. Standardized sign-out did not significantly alter the frequency of dropped tasks or missed lab and X-ray data as perceived by the night intern. However, the day teams thought there were significantly fewer perceived errors on the part of the night intern (p = .001) after introduction of the standardized sign-out sheet. There was no difference in mean Likert scores of resident satisfaction with sign-out before and after the intervention. Standardized written sign-out sheets significantly improve the completeness and effectiveness of handoffs between night and day interns. Further research is needed to determine if these process improvements are related to better patient outcomes.

  4. Spatial Information Processing: Standards-Based Open Source Visualization Technology

    NASA Astrophysics Data System (ADS)

    Hogan, P.

    2009-12-01

    Spatial information intelligence is a global issue that will increasingly affect our ability to survive as a species. Collectively we must better appreciate the complex relationships that make life on Earth possible. Providing spatial information in its native context can accelerate our ability to process that information. To maximize this ability to process information, three basic elements are required: data delivery (server technology), data access (client technology), and data processing (information intelligence). NASA World Wind provides open source client and server technologies based on open standards. The possibilities for data processing and data sharing are enhanced by this inclusive infrastructure for geographic information. It is interesting that this open source and open standards approach, unfettered by proprietary constraints, simultaneously provides for entirely proprietary use of this same technology. 1. WHY WORLD WIND? NASA World Wind began as a single program with specific functionality, to deliver NASA content. But as the possibilities for virtual globe technology became more apparent, we found that while enabling a new class of information technology, we were also getting in the way. Researchers, developers and even users expressed their desire for World Wind functionality in ways that would service their specific needs. They want it in their web pages. They want to add their own features. They want to manage their own data. They told us that only with this kind of flexibility, could their objectives and the potential for this technology be truly realized. World Wind client technology is a set of development tools, a software development kit (SDK) that allows a software engineer to create applications requiring geographic visualization technology. 2. MODULAR COMPONENTRY Accelerated evolution of a technology requires that the essential elements of that technology be modular components such that each can advance independent of the other elements. World Wind therefore changed its mission from providing a single information browser to enabling a whole class of 3D geographic applications. Instead of creating a single program, World Wind is a suite of components that can be selectively used in any number of programs. World Wind technology can be a part of any application, or it can be a window in a web page. Or it can be extended with additional functionalities by application and web developers. World Wind makes it possible to include virtual globe visualization and server technology in support of any objective. The world community can continually benefit from advances made in the technology by NASA in concert with the world community. 3. OPEN SOURCE AND OPEN STANDARDS NASA World Wind is NASA Open Source software. This means that the source code is fully accessible for anyone to freely use, even in association with proprietary technology. Imagery and other data provided by the World Wind servers reside in the public domain, including the data server technology itself. This allows others to deliver their own geospatial data and to provide custom solutions based on users' specific needs.
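Standards-based data delivery of the kind described here commonly rests on OGC interfaces such as the Web Map Service. The sketch below builds a WMS 1.3.0 GetMap request URL in Python; the server address and layer name are hypothetical, and this is a generic illustration of the standard rather than World Wind's own API.

```python
from urllib.parse import urlencode

# Hedged sketch: composing an OGC WMS 1.3.0 GetMap request, the kind of
# open-standard exchange a virtual-globe client makes to a tile/map server.

def wms_getmap_url(base_url, layer, bbox, width=512, height=512):
    """Build a GetMap URL for one layer and bounding box.
    For WMS 1.3.0 with EPSG:4326, BBOX axis order is lat,lon."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minlat,minlon,maxlat,maxlon
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical server and layer.
url = wms_getmap_url("https://example.org/wms", "BlueMarble", (-90, -180, 90, 180))
```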

  5. Tools and technologies needed for conducting planetary field geology while on EVA: Insights from the 2010 Desert RATS geologist crewmembers

    NASA Astrophysics Data System (ADS)

    Young, Kelsey; Hurtado, José M.; Bleacher, Jacob E.; Brent Garry, W.; Bleisath, Scott; Buffington, Jesse; Rice, James W.

    2013-10-01

    The tools used by crews while on extravehicular activity during future missions to other bodies in the Solar System will be a combination of traditional geologic field tools (e.g. hammers, rakes, sample bags) and state-of-the-art technologies (e.g. high definition cameras, digital situational awareness devices, and new geologic tools). In the 2010 Desert Research and Technology Studies (RATS) field test, four crews, each consisting of an astronaut/engineer and field geologist, tested and evaluated various technologies during two weeks of simulated spacewalks in the San Francisco volcanic field, Arizona. These tools consisted of both Apollo-style field geology tools and modern technological equipment not used during the six Apollo lunar landings. The underlying exploration driver for this field test was to establish the protocols and technology needed for an eventual manned mission to an asteroid, the Moon, or Mars. The authors of this paper represent Desert RATS geologist crewmembers as well as two engineers who worked on technology development. Here we present an evaluation and assessment of these tools and technologies based on our first-hand experience of using them during the analog field test. We intend this to serve as a basis for continued development of technologies and protocols used for conducting planetary field geology as the Solar System exploration community moves forward into the next generation of planetary surface exploration.

  6. Exploring Operational Test and Evaluation of Unmanned Aircraft Systems: A Qualitative Case Study

    NASA Astrophysics Data System (ADS)

    Saliceti, Jose A.

    The purpose of this qualitative case study was to explore and identify strategies that may potentially remedy operational test and evaluation procedures used to evaluate Unmanned Aircraft Systems (UAS) technology. The sample for analysis consisted of organizations testing and evaluating UASs (e.g., U.S. Air Force, U.S. Navy, U.S. Army, U.S. Marine Corps, U.S. Coast Guard, and Customs Border Protection). A purposeful sampling technique was used to select 15 subject matter experts in the field of operational test and evaluation of UASs. A questionnaire was provided to participants to support descriptive and robust research. Analysis of responses revealed themes related to each research question. Findings revealed operational testers utilized requirements documents to extrapolate measures for testing UAS technology and develop critical operational issues. The requirements documents were (a) developed without the contribution of stakeholders and operational testers, (b) developed with vague or unrealistic measures, and (c) developed without a systematic method to derive requirements from mission tasks. Four approaches are recommended to develop testable operational requirements and assist operational testers: (a) use a mission task analysis tool to derive requirements for mission essential tasks for the system, (b) exercise collaboration among stakeholders and testers to ensure testable operational requirements based on mission tasks, (c) ensure testable measures are used in requirements documents, and (d) create a repository list of critical operational issues by mission areas. The preparation of operational test and evaluation processes for UAS technology is not uniform across testers. The processes in place are not standardized, thus test plan preparation and reporting are different among participants. A standard method should therefore be used when preparing and reporting on UAS technology tests. 
Using a systematic process, such as mission-based test design, resonated among participants as an analytical method to link UAS mission tasks and measures of performance to the capabilities of the system under test when developing operational test plans. Further research should examine system engineering designs for system requirements traceability matrix of mission tasks and subtasks while using an analysis tool that adequately evaluates UASs with an acceptable level of confidence in the results.

  7. Development of an Acoustic Signal Analysis Tool “Auto-F” Based on the Temperament Scale

    NASA Astrophysics Data System (ADS)

    Modegi, Toshio

    The MIDI interface was originally designed for electronic musical instruments, but we consider that this music-note-based coding concept can be extended to general acoustic signal description. We proposed applying the MIDI technology to coding of bio-medical auscultation sound signals such as heart sounds for retrieving medical records and performing telemedicine. Then we have tried to extend our encoding targets to include vocal sounds, natural sounds and electronic bio-signals such as ECG, using the Generalized Harmonic Analysis method. Currently, we are trying to separate vocal sounds included in popular songs and encode both vocal sounds and background instrumental sounds into separate MIDI channels. And also, we are trying to extract articulation parameters such as MIDI pitch-bend parameters in order to reproduce natural acoustic sounds using a GM-standard MIDI tone generator. In this paper, we present an overall algorithm of our developed acoustic signal analysis tool, based on those research works, which can analyze given time-based signals on the musical temperament scale. The prominent feature of this tool is producing high-precision MIDI codes, which reproduce signals similar to the given source signal using a GM-standard MIDI tone generator, and also providing analyzed texts in the XML format.
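Mapping a measured frequency onto the equal-temperament scale, as a MIDI-based encoder must, follows a standard formula: the exact MIDI note value is 69 + 12·log2(f/440), and the residue from the nearest note (in cents) is what a pitch-bend parameter can carry. A minimal Python sketch of this step, independent of the Auto-F tool itself:

```python
import math

# Sketch of quantizing a frequency onto the 12-tone equal-temperament scale:
# nearest MIDI note number plus a detune in cents (pitch-bend material).

def frequency_to_midi(freq_hz, a4_hz=440.0):
    """Return (midi_note, cents_offset) for a frequency in Hz."""
    exact = 69 + 12 * math.log2(freq_hz / a4_hz)   # 69 is the MIDI note for A4
    note = round(exact)
    cents = 100 * (exact - note)                    # detune from that note
    return note, cents

note, cents = frequency_to_midi(440.0)   # A4 -> note 69, 0 cents
note_c4, _ = frequency_to_midi(261.63)   # middle C -> note 60
```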

  8. Expanding the Design Space: Forging the Transition from 3D Printing to Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Amend, Matthew

    The synergy of Additive Manufacturing and Computational Geometry has the potential to radically expand the "design space" of solutions available to designers. Additive Manufacturing (AM) is capable of fabricating objects that are highly complex both in geometry and material properties. However, the introduction of any new technology can have a disruptive effect on established design practices and organizations. Before "Design for Additive Manufacturing" (DFAM) is a commonplace means of producing objects employed in "real world" products, appropriate design knowledge must be sufficiently integrated within industry. First, materials suited to additive manufacturing methods must be developed to satisfy existing industry standards and specifications, or new standards must be developed. Second, a new class of design representation (CAD) tools will need to be developed. Third, designers and design organizations will need to develop strategies for employing such tools. This thesis describes three DFAM exercises intended to demonstrate the potential for innovative design when using advanced additive materials, tools, and printers. These design exercises included 1) a light-weight composite layup mold developed with topology optimization, 2) a low-pressure fluid duct enhanced with an external lattice structure, and 3) an airline seat tray designed using a non-uniform lattice structure optimized with topology optimization.

  9. Utilizing social media for informal ocean conservation and education: The BioOceanography Project

    NASA Astrophysics Data System (ADS)

    Payette, J.

    2016-02-01

    Science communication through the use of social media is a rapidly evolving and growing pursuit in academic and scientific circles. Online tools and social media are being used in not only scientific communication but also scientific publication, education, and outreach. Standards and usage of social media as well as other online tools for communication, networking, outreach, and publication are always in development. Caution and a conservative attitude towards these novel "Science 2.0" tools are understandable because of their rapidly changing nature and the lack of professional standards for using them. However there are some key benefits and unique ways social media, online systems, and other Open or Open Source technologies, software, and "Science 2.0" tools can be utilized for academic purposes such as education and outreach. Diverse efforts for ocean conservation and education will continue to utilize social media for a variety of purposes. The BioOceanography project is an informal communication, education, outreach, and conservation initiative created for enhancing knowledge related to Oceanography and Marine Science with an unbiased yet conservation-minded approach and in an Open Source format. The BioOceanography project is ongoing and still evolving, but has already contributed to ocean education and conservation communication in key ways through a concerted web presence since 2013, including a curated Twitter account @_Oceanography and BioOceanography blog style website. Social media tools like those used in this project, if used properly, can be highly effective and valuable for encouraging students, networking with researchers, and educating the general public in Oceanography.

  10. Single molecule real-time (SMRT) sequencing comes of age: applications and utilities for medical diagnostics

    PubMed Central

    Ardui, Simon; Ameur, Adam; Vermeesch, Joris R; Hestand, Matthew S

    2018-01-01

    Abstract Short read massive parallel sequencing has emerged as a standard diagnostic tool in the medical setting. However, short read technologies have inherent limitations such as GC bias, difficulties mapping to repetitive elements, trouble discriminating paralogous sequences, and difficulties in phasing alleles. Long read single molecule sequencers resolve these obstacles. Moreover, they offer higher consensus accuracies and can detect epigenetic modifications from native DNA. The first commercially available long read single molecule platform was the RS system based on PacBio's single molecule real-time (SMRT) sequencing technology, which has since evolved into their RSII and Sequel systems. Here we capsulize how SMRT sequencing is revolutionizing constitutional, reproductive, cancer, microbial and viral genetic testing. PMID:29401301

  11. Potential Futures for Information.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ackermann, Mark R.

    Information is one of the most powerful tools available today. All advances in technology may be used, as David Sarnoff said, for the benefit or harm of society. Information can be used to shape the future by free people, or used to control people by less than benevolent governments, as has been demonstrated since the mid-1930s, and with growing frequency over the past 50 years. What once promised to set people free and fuel an industrial revolution that might improve the standard of living over most of the world has also been used to manipulate and enslave entire populations. The future of information is tied to the future of technologies that support the collection of data, the processing of those data into information and knowledge, and their distribution. Technologies supporting the future of information must include technologies that help protect the integrity of data and information, and help to guarantee its discoverability and appropriate availability -- often to the whole of society.

  12. [Investigation of the Campaign of "Eliminating the Four Pests and Paying Attention to Hygiene" in Shanxi province in the 1950s, with special reference to technological innovation].

    PubMed

    Wu, X Y

    2016-07-28

    In order to improve people's living and health standards, increase the output of cereal crops, achieve the country's economic recovery and development, and consolidate the new state power, a Campaign of "Eliminating the Four Pests and Paying Attention to Hygiene" in the field of health was launched by the Central Government in the 1950s. In response to the call of the government, Shanxi province actively organized the people to participate in this Campaign. To improve the efficiency of eliminating the four pests, people widely pursued technological innovation, constantly creating and inventing advanced tools, and gained fruitful technological achievements. Through technological innovation, working efficiency and quality were enhanced. More importantly, the interaction between the national and local authorities was promoted: national political authority was extended over the local level, while an "owner" awareness and a conception of the state took shape among the people.

  13. Wireless local area network for the dental office.

    PubMed

    Mupparapu, Muralidhar

    2004-01-01

    Dental offices are no exception to the implementation of new and advanced technology, especially if it enhances productivity. In a rapidly transforming digital world, wireless technology has a special place, as it has truly "retired the wire" and contributed to the ease and efficient access to patient data and other software-based applications for diagnosis and treatment. If the office or the clinic is networked, access to patient management software, imaging software and treatment planning tools is enhanced. Access will be further enhanced and unrestricted if the entire network is wireless. As with any new, emerging technology, there will be issues that should be kept in mind before adapting to the wireless environment. Foremost is the network security involved in the installation and use of these wireless networks. This short, technical manuscript deals with standards and choices in wireless technology currently available for implementation within a dental office. The benefits of each network security protocol available to protect patient data and boost the efficiency of a modern dental office are discussed.

  14. Application of Mathematical and Three-Dimensional Computer Modeling Tools in the Planning of Processes of Fuel and Energy Complexes

    NASA Astrophysics Data System (ADS)

    Aksenova, Olesya; Nikolaeva, Evgenia; Cehlár, Michal

    2017-11-01

    This work investigates the effectiveness of mathematical and three-dimensional computer modeling tools in the planning of fuel and energy complex processes at the planning and design phase of a thermal power plant (TPP). A solution for the purification of gas emissions at the design development phase of waste treatment systems is proposed, employing mathematical and three-dimensional computer modeling: the E-nets apparatus and the development of a 3D model of the future gas emission purification system. This approach allows the designed result to be visualized, an economically feasible technology to be selected and scientifically justified, and a high environmental and social effect of the developed waste treatment system to be ensured. The authors present the planned technological processes and the gas emission purification system expressed in terms of E-nets, using mathematical modeling in the Simulink application, which allowed a model of the device to be created from the library of standard blocks and calculations to be performed. A three-dimensional model of the gas emission purification system has also been constructed; it makes it possible to visualize the technological processes, compare them with the theoretical calculations at the design phase of a TPP and, if necessary, make adjustments.

  15. Data Curation for the Exploitation of Large Earth Observation Products Databases - The MEA system

    NASA Astrophysics Data System (ADS)

    Mantovani, Simone; Natali, Stefano; Barboni, Damiano; Cavicchi, Mario; Della Vecchia, Andrea

    2014-05-01

    National Space Agencies, under the umbrella of the European Space Agency, are working intensively to handle and provide solutions for Big Data and related knowledge (metadata, software tools and services) management and exploitation. The continuously increasing amount of long-term and historic data in EO facilities in the form of online datasets and archives, the incoming satellite observation platforms that will generate an impressive amount of new data, and the new EU approach on the data distribution policy make it necessary to address technologies for the long-term management of these data sets, including their consolidation, preservation, distribution, continuation and curation across multiple missions. The management of long EO data time series of continuing or historic missions, with more than 20 years of data already available today, requires technical solutions and technologies which differ considerably from the ones exploited by existing systems. Several tools, both open source and commercial, are already providing technologies to handle data and metadata preparation, access and visualization via OGC standard interfaces. This study describes the Multi-sensor Evolution Analysis (MEA) system and the Data Curation concept as approached and implemented within the ASIM and EarthServer projects, funded by the European Space Agency and the European Commission, respectively.

  16. How we used two social media tools to enhance aspects of active learning during lectures.

    PubMed

    George, Daniel R; Dreibelbis, Tomi D; Aumiller, Betsy

    2013-12-01

    Medical education is evolving to include active learning approaches, yet some courses will remain lecture-based. Social media tools used by students may foster collaborative learning during lectures. We present preliminary results from a pilot study that integrated two 'social' technologies, Google Docs and SurveyMonkey, into 22 hour-long lectures for a course called "Social Influences on Health" attended by 154 students. At the conclusion of the semester, we reviewed student usage patterns with both technologies and collected data from students via course evaluations that included a standard Likert Scale. We used thematic analysis to identify emergent themes from evaluations. On average, students contributed 6 comments/questions to the Google Doc in each lecture, and 35 students participated in SurveyMonkey. Engagement with both technologies increased throughout the semester and no unprofessional incidents were observed. The mean student rating for integration of Google Docs and SurveyMonkey was 3.4 or "above average" (SD = 1.17). Thematic analysis identified perceived strengths of this approach as well as areas for improvement. Social media such as Google Docs and SurveyMonkey can facilitate interaction and provide students with control over content and flow of lecture-based courses, but educators must be mindful of practical and conceptual limitations.
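The evaluation summary reported above (a mean Likert rating with a standard deviation) is a simple computation to reproduce. A minimal Python sketch follows; the ratings used are fabricated for illustration, not the study's data.

```python
import statistics

# Sketch: summarizing 5-point Likert course evaluations as mean and
# sample standard deviation. Ratings below are invented examples.

def summarize_likert(ratings):
    """Return (mean, sample standard deviation) for a list of 1-5 ratings."""
    if any(r < 1 or r > 5 for r in ratings):
        raise ValueError("Likert ratings must lie between 1 and 5")
    return statistics.mean(ratings), statistics.stdev(ratings)

mean, sd = summarize_likert([4, 3, 5, 2, 4, 3, 4])
```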

  17. Evolution of Force Sensing Technologies.

    PubMed

    Shah, Dipen

    2017-06-01

    In order to improve the procedural success and long-term outcomes of catheter ablation techniques for atrial fibrillation (AF), an important unfulfilled requirement is to create durable electrophysiologically complete lesions. Measurement of contact force (CF) between the catheter tip and the target tissue can guide physicians to optimise both mapping and ablation procedures. Contact force can affect lesion size and clinical outcomes following catheter ablation of AF. Force sensing technologies have matured since their advent several years ago, and now allow the direct measurement of CF between the catheter tip and the target myocardium in real time. In order to obtain complete durable lesions, catheter tip spatial stability and stable contact force are important. Suboptimal energy delivery, lesion density/contiguity and/or excessive wall thickness of the pulmonary vein-left atrial (PV-LA) junction may result in conduction recovery at these sites. Lesion assessment tools may help predict and localise electrical weak points resulting in conduction recovery during and after ablation. There is increasing clinical evidence to show that optimal use of CF sensing during ablation can reduce acute PV re-conduction, although prospective randomised studies are desirable to confirm long-term favourable clinical outcomes. In combination with optimised lesion assessment tools, contact force sensing technology has the potential to become the standard of care for all patients undergoing AF catheter ablation.

  18. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  19. Ethical and regulatory challenges of research using pervasive sensing and other emerging technologies: IRB perspectives.

    PubMed

    Nebeker, Camille; Harlow, John; Espinoza Giacinto, Rebeca; Orozco-Linares, Rubi; Bloss, Cinnamon S; Weibel, Nadir

    2017-01-01

Vast quantities of personal health information and private identifiable information are being created through mobile apps, wearable sensors, and social networks. While new strategies and tools for obtaining health data have expanded researchers' abilities to design and test personalized and adaptive health interventions, the deployment of pervasive sensing and computational techniques to gather research data is raising ethical challenges for Institutional Review Boards (IRBs) charged with protecting research participants. To explore experiences with, and perceptions about, technology-enabled research, and to identify solutions for promoting responsible conduct of this research, we conducted focus groups with human research protection program and IRB affiliates. Our findings outline the need for increased collaboration across stakeholders in terms of: (1) shared and dynamic resources that improve awareness of technologies and decrease potential threats to participant privacy and data confidentiality, and (2) development of appropriate and dynamic standards through collaboration with stakeholders in the research ethics community.

  20. Nucleic acid delivery using magnetic nanoparticles: the Magnetofection technology.

    PubMed

    Laurentt, Nicolas; Sapet, Cédric; Le Gourrierec, Loic; Bertosio, Elodie; Zelphati, Olivier

    2011-04-01

    In recent years, gene therapy has received considerable interest as a potential method for the treatment of numerous inherited and acquired diseases. However, successes have so far been hampered by several limitations, including safety issues of viral-based nucleic acid vectors and poor in vivo efficiency of nonviral vectors. Magnetofection has been introduced as a novel and powerful tool to deliver genetic material into cells. This technology is defined as the delivery of nucleic acids, either 'naked' or packaged (as complexes with lipids or polymers, and viruses) using magnetic nanoparticles under the guidance of an external magnetic field. This article first discusses the principles of the Magnetofection technology and its benefits as compared with standard transfection methods. A number of relevant examples of its use, both in vitro and in vivo, will then be highlighted. Future trends in the development of new magnetic nanoparticle formulations will also be outlined.

  1. Miniaturisation of Pressure-Sensitive Paint Measurement Systems Using Low-Cost, Miniaturised Machine Vision Cameras.

    PubMed

    Quinn, Mark Kenneth; Spinosa, Emanuele; Roberts, David A

    2017-07-25

Measurements of pressure-sensitive paint (PSP) have been performed using new or non-scientific imaging technology based on machine vision tools. Machine vision camera systems are typically used for automated inspection or process monitoring. Such devices offer the benefits of lower cost and reduced size compared with typical scientific-grade cameras; however, their optical qualities and suitability have yet to be determined. This research intends to show relevant imaging characteristics and demonstrate the applicability of such imaging technology for PSP. Details of camera performance are benchmarked and compared to standard scientific imaging equipment, and subsequent PSP tests are conducted using a static calibration chamber. The findings demonstrate that machine vision technology can be used for PSP measurements, opening up the possibility of performing measurements on-board small-scale models such as those used for wind tunnel testing, or measurements in confined spaces with limited optical access.

  2. Miniaturisation of Pressure-Sensitive Paint Measurement Systems Using Low-Cost, Miniaturised Machine Vision Cameras

    PubMed Central

    Spinosa, Emanuele; Roberts, David A.

    2017-01-01

Measurements of pressure-sensitive paint (PSP) have been performed using new or non-scientific imaging technology based on machine vision tools. Machine vision camera systems are typically used for automated inspection or process monitoring. Such devices offer the benefits of lower cost and reduced size compared with typical scientific-grade cameras; however, their optical qualities and suitability have yet to be determined. This research intends to show relevant imaging characteristics and demonstrate the applicability of such imaging technology for PSP. Details of camera performance are benchmarked and compared to standard scientific imaging equipment, and subsequent PSP tests are conducted using a static calibration chamber. The findings demonstrate that machine vision technology can be used for PSP measurements, opening up the possibility of performing measurements on-board small-scale models such as those used for wind tunnel testing, or measurements in confined spaces with limited optical access. PMID:28757553

  3. Fundamentals of bipolar high-frequency surgery.

    PubMed

    Reidenbach, H D

    1993-04-01

    In endoscopic surgery a very precise surgical dissection technique and an efficient hemostasis are of decisive importance. The bipolar technique may be regarded as a method which satisfies both requirements, especially regarding a high safety standard in application. In this context the biophysical and technical fundamentals of this method, which have been known in principle for a long time, are described with regard to the special demands of a newly developed field of modern surgery. After classification of this method into a general and a quasi-bipolar mode, various technological solutions of specific bipolar probes, in a strict and in a generalized sense, are characterized in terms of indication. Experimental results obtained with different bipolar instruments and probes are given. The application of modern microprocessor-controlled high-frequency surgery equipment and, wherever necessary, the integration of additional ancillary technology into the specialized bipolar instruments may result in most useful and efficient tools of a key technology in endoscopic surgery.

  4. Use of Natural Diversity and Biotechnology to Increase the Quality and Nutritional Content of Tomato and Grape

    PubMed Central

    Gascuel, Quentin; Diretto, Gianfranco; Monforte, Antonio J.; Fortes, Ana M.; Granell, Antonio

    2017-01-01

Improving fruit quality has become a major goal in plant breeding. Direct approaches to tackling fruit quality traits specifically linked to consumer preferences and environmental friendliness, such as improved flavor, nutraceutical compounds, and sustainability, have slowly been added to a breeder priority list that already includes traits like productivity, efficiency, and, especially, pest and disease control. Breeders already use molecular genetic tools to improve fruit quality, although most advances have been made in producer and industrial quality standards. Furthermore, progress has largely been limited to simple, easy-to-observe agronomic traits, whereas the vast majority of quality attributes, specifically those relating to flavor and nutrition, are complex and have mostly been neglected. Fortunately, wild germplasm, which is used for resistance against/tolerance of environmental stresses (including pathogens), is still available and harbors significant genetic variation for taste and health-promoting traits. Similarly, heirloom/traditional varieties could be used to identify which genes contribute to flavor and health quality and, at the same time, serve as a good source of the best alleles for organoleptic quality improvement. Grape (Vitis vinifera L.) and tomato (Solanum lycopersicum L.) produce fleshy, berry-type fruits, among the most consumed in the world. Both have undergone important domestication and selection processes that have dramatically reduced their genetic variability and strongly standardized fruit traits. Moreover, more and more consumers are asking for sustainable production, which is incompatible with the current wide use of chemical inputs. In the present paper, we review the genetic resources available to tomato/grape breeders, and the recent technological progress that facilitates the identification of genes/alleles of interest within the natural or generated variability gene pool. 
These technologies include omics, high-throughput phenotyping/phenomics, and biotech approaches. Our review also covers a range of technologies used to transfer to tomato and grape those alleles considered of interest for fruit quality. These include traditional breeding, TILLING (Targeting Induced Local Lesions in Genomes), genetic engineering, and NPBT (New Plant Breeding Technologies). Altogether, the combined exploitation of genetic variability and innovative biotechnological tools may help breeders improve fruit quality while taking consumer standards more fully into account and moving toward more sustainable farming practices. PMID:28553296

  5. Evidence based radiation oncology with existing technology

    PubMed Central

    Isa, Nicolas

    2013-01-01

Aim: To assess the real contribution of modern radiation therapy (RT) technology in the most common tumor types in Central America, the Caribbean, and South America. Background: RT is an essential tool in the management of cancer. RT can be either palliative or of curative intent. In general, for palliative radiotherapy, major technologies are not needed. Materials and methods: We analyzed the contribution of RT technology based on published evidence for breast, lung, gastric, gallbladder, colorectal, prostate, and cervix cancer in terms of disease control, survival, or toxicity, with a special focus on Latin America. Results: Findings indicate that three-dimensional conformal radiation therapy (3D RT) is the gold standard in the most common types of cancer in the studied regions. Prostate cancer is probably the pathology that benefits most from new RT technology such as intensity-modulated radiation therapy (IMRT) versus 3D RT in terms of toxicity and biochemical progression-free survival. Conclusions: In light of the changes in technology, the ever-increasing access of developing countries to such technology, and its current coverage in Latin America, any efforts in this area should be aimed at improving the quality of the radiotherapy departments and centers that are already in place. PMID:25061519

  6. Energy-Efficiency Labels and Standards: A Guidebook forAppliances, Equipment, and Lighting - 2nd Edition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiel, Stephen; McMahon, James E.

    2005-04-28

Energy-performance improvements in consumer products are an essential element in any government's portfolio of energy-efficiency and climate change mitigation programs. Governments need to develop balanced programs, both voluntary and regulatory, that remove cost-ineffective, energy-wasting products from the marketplace and stimulate the development of cost-effective, energy-efficient technology. Energy-efficiency labels and standards for appliances, equipment, and lighting products deserve to be among the first policy tools considered by a country's energy policy makers. The U.S. Agency for International Development (USAID) and several other organizations identified on the cover of this guidebook recognize the need to support policy makers in their efforts to implement energy-efficiency standards and labeling programs and have developed this guidebook, together with the Collaborative Labeling and Appliance Standards Program (CLASP), as a primary reference. This second edition of the guidebook was prepared over the course of the past year, four years after the preparation of the first edition, with a significant contribution from the authors and reviewers mentioned previously. Their diligent participation helps maintain this book as the international guidance tool it has become. The lead authors would like to thank the members of the Communications Office of the Environmental Energy Technologies Division, Lawrence Berkeley National Laboratory, for their support in the development, production, and distribution of the guidebook. This guidebook is designed as a manual for government officials and others around the world responsible for developing, implementing, enforcing, monitoring, and maintaining labeling and standards-setting programs. It discusses the pros and cons of adopting energy-efficiency labels and standards and describes the data, facilities, and institutional and human resources needed for these programs. 
It provides guidance on the design, development, implementation, maintenance, and evaluation of the programs and on the design of the labels and standards themselves. In addition, it directs the reader to references and other resources likely to be useful in conducting the activities described and includes a chapter on energy policies and programs that complement appliance efficiency labels and standards. This guidebook attempts to reflect the essential framework of labeling and standards programs. It is the intent of the authors and sponsor to distribute copies of this book worldwide, at no charge, for the general public benefit. The guidebook is also available on the web at www.clasponline.org and may be downloaded to be used intact or piecemeal for whatever beneficial purposes readers may conceive.

  7. Technologies for Assessment of Motor Disorders in Parkinson’s Disease: A Review

    PubMed Central

    Oung, Qi Wei; Muthusamy, Hariharan; Lee, Hoi Leong; Basah, Shafriza Nisha; Yaacob, Sazali; Sarillee, Mohamed; Lee, Chia Hau

    2015-01-01

Parkinson’s Disease (PD) is one of the most common neurodegenerative illnesses, gradually degenerating the central nervous system. The goal of this review is to summarize the recent progress of the numerous forms of sensors and systems related to the diagnosis of PD over the past decades. The paper reviews the substantial research on the application of technological tools (objective techniques) in the PD field using different types of sensors proposed by previous researchers. In addition, it also covers the use of clinical tools (subjective techniques) for PD assessment, for instance, patient self-reports, patient diaries, and the international gold-standard reference scale, the Unified Parkinson Disease Rating Scale (UPDRS). Comparative studies and critical descriptions of these approaches are highlighted in this paper, giving an insight into the current state of the art. This is followed by an explanation of the merits of a multiple-sensor fusion platform compared with a single-sensor platform for better monitoring of PD progression, and the review ends with thoughts on the future direction towards multimodal sensor integration platforms for the assessment of PD. PMID:26404288

  8. Tools and collaborative environments for bioinformatics research

    PubMed Central

    Giugno, Rosalba; Pulvirenti, Alfredo

    2011-01-01

    Advanced research requires intensive interaction among a multitude of actors, often possessing different expertise and usually working at a distance from each other. The field of collaborative research aims to establish suitable models and technologies to properly support these interactions. In this article, we first present the reasons for an interest of Bioinformatics in this context by also suggesting some research domains that could benefit from collaborative research. We then review the principles and some of the most relevant applications of social networking, with a special attention to networks supporting scientific collaboration, by also highlighting some critical issues, such as identification of users and standardization of formats. We then introduce some systems for collaborative document creation, including wiki systems and tools for ontology development, and review some of the most interesting biological wikis. We also review the principles of Collaborative Development Environments for software and show some examples in Bioinformatics. Finally, we present the principles and some examples of Learning Management Systems. In conclusion, we try to devise some of the goals to be achieved in the short term for the exploitation of these technologies. PMID:21984743

  9. LipidQC: Method Validation Tool for Visual Comparison to SRM 1950 Using NIST Interlaboratory Comparison Exercise Lipid Consensus Mean Estimate Values.

    PubMed

    Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A

    2017-12-19

    As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
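The LipidQC comparison described above, checking a laboratory's own SRM 1950 measurements against benchmark consensus means, can be sketched as a simple deviation test. A minimal illustration follows; the lipid names, concentrations, and uncertainties are invented for the example and are not actual NIST consensus values, and LipidQC itself performs this comparison visually rather than with this exact logic.

```python
# Hypothetical sketch of a LipidQC-style check: flag experimental SRM 1950
# lipid concentrations that deviate from the consensus mean by more than
# a chosen number of consensus standard deviations.

def flag_outliers(experimental, consensus, n_sd=2.0):
    """Return {lipid: z-score} for lipids whose measured value lies more
    than n_sd consensus standard deviations from the consensus mean."""
    flagged = {}
    for lipid, measured in experimental.items():
        mean, sd = consensus[lipid]
        z = (measured - mean) / sd
        if abs(z) > n_sd:
            flagged[lipid] = round(z, 2)
    return flagged

# Illustrative values (nmol/mL) -- not real SRM 1950 consensus data.
consensus = {"PC 16:0_18:1": (310.0, 25.0), "TG 52:2": (150.0, 30.0)}
experimental = {"PC 16:0_18:1": 372.0, "TG 52:2": 155.0}
print(flag_outliers(experimental, consensus))  # → {'PC 16:0_18:1': 2.48}
```

A real validation run would substitute the published consensus mean estimates and plot each lipid's deviation, which is the visual comparison the tool automates.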

  10. Decision support and disease management: a logic engineering approach.

    PubMed

    Fox, J; Thomson, R

    1998-12-01

    This paper describes the development and application of PROforma, a unified technology for clinical decision support and disease management. Work leading to the implementation of PROforma has been carried out in a series of projects funded by European agencies over the past 13 years. The work has been based on logic engineering, a distinct design and development methodology that combines concepts from knowledge engineering, logic programming, and software engineering. Several of the projects have used the approach to demonstrate a wide range of applications in primary and specialist care and clinical research. Concurrent academic research projects have provided a sound theoretical basis for the safety-critical elements of the methodology. The principal technical results of the work are the PROforma logic language for defining clinical processes and an associated suite of software tools for delivering applications, such as decision support and disease management procedures. The language supports four standard objects (decisions, plans, actions, and enquiries), each of which has an intuitive meaning with well-understood logical semantics. The development toolset includes a powerful visual programming environment for composing applications from these standard components, for verifying consistency and completeness of the resulting specification and for delivering stand-alone or embeddable applications. Tools and applications that have resulted from the work are described and illustrated, with examples from specialist cancer care and primary care. The results of a number of evaluation activities are included to illustrate the utility of the technology.
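The four standard PROforma objects named above (decisions, plans, actions, and enquiries) compose into clinical processes. The sketch below renders that composition idea as plain Python classes; it is only an illustration of the task taxonomy, not PROforma syntax or semantics, and the triage example and its field names are invented.

```python
# Illustrative model of PROforma's four task classes. A Plan sequences
# other tasks; an Enquiry gathers data; a Decision weighs candidates;
# an Action performs an external act. Names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str

@dataclass
class Enquiry(Task):   # requests data from the user or patient record
    asks_for: list = field(default_factory=list)

@dataclass
class Decision(Task):  # chooses among candidate options
    candidates: list = field(default_factory=list)

@dataclass
class Action(Task):    # an external act, e.g. prescribing or referral
    procedure: str = ""

@dataclass
class Plan(Task):      # a container that sequences component tasks
    components: list = field(default_factory=list)

# A toy guideline: gather data, decide risk, then act on the decision.
triage = Plan("chest_pain_triage", components=[
    Enquiry("history", asks_for=["age", "ecg_result"]),
    Decision("risk_level", candidates=["high", "low"]),
    Action("refer", procedure="admit to cardiology"),
])
print([type(t).__name__ for t in triage.components])
```

In the actual language each task additionally carries logical preconditions and argumentation rules with well-defined semantics, which is what the verification tooling described in the abstract checks.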

  11. How Can eHealth Technology Address Challenges Related to Multimorbidity? Perspectives from Patients with Multiple Chronic Conditions.

    PubMed

    Zulman, Donna M; Jenchura, Emily C; Cohen, Danielle M; Lewis, Eleanor T; Houston, Thomas K; Asch, Steven M

    2015-08-01

    Patient eHealth technology offers potential support for disease self-management, but the value of existing applications for patients with multiple chronic conditions (MCCs) is unclear. To understand self-management and health care navigation challenges that patients face due to MCCs and to identify opportunities to support these patients through new and enhanced eHealth technology. After administering a screening survey, we conducted 10 focus groups of 3-8 patients grouped by age, sex, and common chronic conditions. Patients discussed challenges associated with having MCCs and their use of (and desires from) technology to support self-management. Three investigators used standard content analysis methods to code the focus group transcripts. Emergent themes were reviewed with all collaborators, and final themes and representative quotes were validated with a sample of participants. Fifty-three individuals with ≥3 chronic conditions and experience using technology for health-related purposes. Focus group participants had an average of five chronic conditions. Participants reported using technology most frequently to search for health information (96%), communicate with health care providers (92%), track medical information (83%), track medications (77%), and support decision-making about treatment (55%). Three themes emerged to guide eHealth technology development: (1) Patients with MCCs manage a high volume of information, visits, and self-care tasks; (2) they need to coordinate, synthesize, and reconcile health information from multiple providers and about different conditions; (3) their unique position at the hub of multiple health issues requires self-advocacy and expertise. Focus groups identified desirable eHealth resources and tools that reflect these themes. Although patients with multiple health issues use eHealth technology to support self-care for specific conditions, they also desire tools that transcend disease boundaries. 
By addressing the holistic needs of patients with MCCs, eHealth technology can advance health care from a disease-centered to a patient-centered model.

  12. Machine tool task force

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutton, G.P.

    1980-10-22

The Machine Tool Task Force (MTTF) is a multidisciplined team of international experts whose mission was to investigate the state of the art of machine tool technology, to identify promising future directions of that technology for both the US government and private industry, and to disseminate the findings of its research in a conference and through the publication of a final report. MTTF was a two-and-one-half-year effort that involved the participation of 122 experts in the specialized technologies of machine tools and in the management of machine tool operations. The scope of the MTTF was limited to cutting-type or material-removal-type machine tools, because they represent the major share and value of all machine tools now installed or being built. The activities of the MTTF and the technical, commercial, and economic significance of recommended activities for improving machine tool technology are discussed. (LCL)

  13. Barriers to Technology Use in Large and Small School Districts

    ERIC Educational Resources Information Center

    Francom, Gregory M.

    2016-01-01

    Barriers to effective technology integration come in several different categories, including access to technology tools and resources, technology training and support, administrative support, time to plan and prepare for technology integration, and beliefs about the importance and usefulness of technology tools and resources. This study used…

  14. Surgical Technology Integration with Tools for Cognitive Human Factors (STITCH)

    DTIC Science & Technology

    2007-10-01

Award Number: W81XWH-06-1-0761. Title: Surgical Technology Integration with Tools for Cognitive Human Factors (STITCH). Report date: 23 Jul 2007. The project addresses surgical technology integration by considering cognitive and environmental factors such as mental workload, stress, situation awareness, and level of comfort with complex tools.

  15. Data and Tools | Transportation Research | NREL

    Science.gov Websites

Projection Tool Lite: tool for projecting consumer demand for electric vehicle charging infrastructure, or for selecting a technology to invest in. Transportation-Related Consumer Preference Data: consumer preference data related to alternative fuel and advanced vehicle technologies to support the …

  16. In-house coordination for organ donation--single-center experience in a pilot project in Germany (2006 to 2013).

    PubMed

    Kaiser, G M; Wirges, U; Becker, S; Baier, C; Radunz, S; Kraus, H; Paul, A

    2014-01-01

    A challenge for solid organ transplantation in Germany is the shortage of organs. In an effort to increase donation rates, some federal states mandated hospitals to install transplantation officers to coordinate, evaluate, and enhance the donation and transplantation processes. In 2009 the German Foundation for Organ Transplantation (DSO) implemented the In-House Coordination Project, which includes retrospective, quarterly, information technology-based case analyses of all deceased patients with primary or secondary brain injury in regard to the organ donation process in maximum care hospitals. From 2006 to 2008 an analysis of potential organ donors was performed in our hospital using a time-consuming, complex method using questionnaires, hand-written patient files, and the hospital IT documentation system (standard method). Analyses in the In-House Coordination Project are instead carried out by a proprietary semiautomated IT tool called Transplant Check, which uses easily accessible standard data records of the hospital controlling and accounting unit. The aim of our study was to compare the results of the standard method and Transplant Check in detecting and evaluating potential donors. To do so, the same period of time (2006 to 2008) was re-evaluated using the IT tool. Transplant Check was able to record significantly more patients who fulfilled the criteria for inclusion than the standard method (641 vs 424). The methods displayed a wide overlap, apart from 22 patients who were only recorded by the standard method. In these cases, the accompanying brain injury diagnosis was not recorded in the controlling and accounting unit data records due to little relative clinical significance. None of the 22 patients fulfilled the criteria for brain death. In summary, Transplant Check is an easy-to-use, reliable, and valid tool for evaluating donor potential in a maximum care hospital. 
Therefore, from 2010 on, analyses were performed exclusively with Transplant Check at our university hospital.

  17. Technical note: Harmonizing met-ocean model data via standard web services within small research groups

    NASA Astrophysics Data System (ADS)

    Signell, R. P.; Camossi, E.

    2015-11-01

Work over the last decade has resulted in standardized web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by: (1) making it simple for providers to enable web service access to existing output files; (2) using technology that is free, and that is easy to deploy and configure; and (3) providing tools to communicate with web services that work in existing research environments. We present a simple, local brokering approach that lets modelers continue producing custom data, but virtually aggregates and standardizes the data using NetCDF Markup Language. The THREDDS Data Server is used for data delivery, pycsw for data search, NCTOOLBOX (Matlab®) and Iris (Python) for data access, and Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS. (Mention of trade names or commercial products does not constitute endorsement or recommendation for use by the US Government.)
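The virtual aggregation step described above can be illustrated with a small NcML fragment: a NetCDF Markup Language document that joins several model output files along the time dimension without modifying the originals, which a THREDDS Data Server can then serve as one dataset. The sketch below generates such a fragment; the filenames and dimension name are illustrative assumptions, not the paper's actual configuration.

```python
# Sketch of NcML "joinExisting" aggregation: a virtual dataset stitched
# from per-month model output files along the time dimension. Filenames
# are hypothetical; a THREDDS server would consume the resulting XML.
import xml.etree.ElementTree as ET

NCML_NS = "http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2"

def make_time_aggregation(paths, dim="time"):
    """Build an NcML document aggregating `paths` along dimension `dim`."""
    ET.register_namespace("", NCML_NS)  # serialize without a prefix
    root = ET.Element(f"{{{NCML_NS}}}netcdf")
    agg = ET.SubElement(root, f"{{{NCML_NS}}}aggregation",
                        {"dimName": dim, "type": "joinExisting"})
    for p in paths:
        # Each child <netcdf location=...> points at an existing file.
        ET.SubElement(agg, f"{{{NCML_NS}}}netcdf", {"location": p})
    return ET.tostring(root, encoding="unicode")

ncml = make_time_aggregation(["run_2015_01.nc", "run_2015_02.nc"])
print(ncml)
```

The point of the design is that modelers keep writing whatever file layout they like; only this thin XML layer, not the data, needs to follow the standard.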

  18. Overview of MPEG internet video coding

    NASA Astrophysics Data System (ADS)

    Wang, R. G.; Li, G.; Park, S.; Kim, J.; Huang, T.; Jang, E. S.; Gao, W.

    2015-09-01

MPEG has produced standards that have provided the industry with the best video compression technologies. In order to address the diversified needs of the Internet, MPEG issued the Call for Proposals (CfP) for internet video coding in July 2011. It is anticipated that any patent declaration associated with the Baseline Profile of this standard will indicate that the patent owner is prepared to grant a free-of-charge license to an unrestricted number of applicants on a worldwide, non-discriminatory basis and under other reasonable terms and conditions to make, use, and sell implementations of the Baseline Profile of this standard in accordance with the ITU-T/ITU-R/ISO/IEC Common Patent Policy. Three codecs responded to the CfP: WVC, VCB, and IVC. WVC was proposed jointly by Apple, Cisco, Fraunhofer HHI, Magnum Semiconductor, Polycom, and RIM, among others; it is essentially AVC Baseline. VCB was proposed by Google and is essentially VP8. IVC was proposed by several universities (including Peking University, Tsinghua University, Zhejiang University, Hanyang University, and Korea Aerospace University), and its coding tools were developed from scratch. In this paper, we give an overview of the coding tools in IVC and evaluate its performance by comparing it with WVC, VCB, and AVC High Profile.

  19. Making the Connection: Technological Literacy and Technology Assessment.

    ERIC Educational Resources Information Center

    Deal, Walter F.

    2002-01-01

    Technology assessment is the process of identifying, analyzing, and evaluating potential consequences of technology. Tools for assessing and forecasting impact include relevance trees and futures wheels. Activities based on these tools can be used to teach assessment to technology education students. (SK)

  20. Electronic commerce: beyond the euphoria.

    PubMed

    Healy, J L; DeLuca, J M

    2000-01-01

    As the center of considerable media attention, case study articles, vendor research, and development efforts, electronic commerce technology is entering healthcare and having a profound effect. The simple truth, however, is that after the drama and excitement begins to wear off, completing a successful e-commerce implementation remains good old-fashioned hard, sometimes monotonous work. To be successful, e-commerce technologies must be planned and implemented with rigorous project standards, and incorporated with significant process and workflow reengineering to actually return significant value to the organization. This article briefs readers on the organizational issues they must consider in evaluating e-commerce readiness--cultural, executive and technological factors that either support or inhibit project and technology success. Readers will be left with the tools to conduct an electronic commerce "readiness assessment" to evaluate the immediate, mid- and long-term potential of electronic commerce; practical remediation strategies for better preparing the organization for the changes inherent in moving to an e-commerce-enabled business model; and comments from the field--advice from organizations that have successfully implemented e-commerce technologies into their ongoing operations.

  1. Tele-ICU "myth busters".

    PubMed

    Venditti, Angelo; Ronk, Chanda; Kopenhaver, Tracey; Fetterman, Susan

    2012-01-01

    Tele-intensive care unit (ICU) technology has been proven to bridge the gap between available resources and quality care for many health care systems across the country. Tele-ICUs allow the standardization of care and provide a second set of eyes traditionally not available in the ICU. A growing body of literature supports the use of tele-ICUs based on improved outcomes and reduction in errors. To date, the literature has not effectively outlined the limitations of this technology related to response to changes in patient care, interventions, and interaction with the care team. This information can potentially have a profound impact on service expectations. Some misconceptions about tele-ICU technology include the following: tele-ICU is "watching" 24 hours a day, 7 days a week; tele-ICU is a telemetry unit; tele-ICU is a stand-alone crisis intervention tool; tele-ICU decreases staffing at the bedside; tele-ICU clinical roles are clearly defined and understood; and tele-ICUs are not cost-effective to operate. This article outlines the purpose of tele-ICU technology, reviews outcomes, and "busts" myths about tele-ICU technology.

  2. Simulation-Based e-Learning Tools for Science,Engineering, and Technology Education(SimBeLT)

    NASA Astrophysics Data System (ADS)

    Davis, Doyle V.; Cherner, Y.

    2006-12-01

    The focus of Project SimBeLT is the research, development, testing, and dissemination of a new type of simulation-based, integrated e-learning modules for two-year college technical and engineering curricula in the areas of thermodynamics, fluid physics, and fiber optics that can also be used in secondary schools and four-year colleges. A collection of sophisticated virtual labs is the core component of the SimBeLT modules. These labs will be designed to enhance the understanding of technical concepts and the underlying fundamental principles of these topics, as well as to master certain performance-based skills online. SimBeLT software will help educators meet the National Science Education Standard that "learning science and technology is something that students do, not something that is done to them". A major component of Project SimBeLT is the development of multi-layered, technology-oriented virtual labs that realistically mimic workplace-like environments. Dynamic data exchange between simulations will be implemented, and links with instant instructional messages and data-handling tools will be realized. A second important goal of Project SimBeLT is to bridge technical skills and scientific knowledge by enhancing the teaching and learning of specific scientific or engineering subjects. SimBeLT builds upon research and outcomes of interactive teaching strategies and tools developed through prior NSF funding (http://webphysics.nhctc.edu/compact/index.html). (Project SimBeLT is partially supported by a grant from the National Science Foundation, DUE-0603277.)

  3. Enabling Interoperability and Servicing Multiple User Segments Through Web Services, Standards, and Data Tools

    NASA Astrophysics Data System (ADS)

    Palanisamy, Giriprakash; Wilson, Bruce E.; Cook, Robert B.; Lenhardt, Chris W.; Santhana Vannan, Suresh; Pan, Jerry; McMurry, Ben F.; Devarakonda, Ranjeet

    2010-12-01

    The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) is one of the science-oriented data centers in EOSDIS, aligned primarily with terrestrial ecology. The ORNL DAAC archives and serves data from NASA-funded field campaigns (such as BOREAS, FIFE, and LBA), regional and global data sets relevant to biogeochemical cycles, land validation studies for remote sensing, and source code for some terrestrial ecology models. Users of the ORNL DAAC include field ecologists, remote sensing scientists, modelers at various scales, synthesis science groups, a range of educational users (particularly baccalaureate and graduate instruction), and decision support analysts. It is clear that the wide range of users served by the ORNL DAAC have differing needs and differing capabilities for accessing and using data. It is also not possible for the ORNL DAAC, or the other data centers in EOSDIS, to develop all of the tools and interfaces to support even most of the potential uses of data directly. As is typical of information technology supporting a research enterprise, user needs will continue to evolve rapidly over time, and users themselves cannot predict future needs, as those needs depend on the results of current investigation. The ORNL DAAC is addressing these needs through targeted implementation of web services and tools that can be consumed by other applications, so that a modeler can retrieve data in netCDF format with the Climate and Forecast (CF) convention while a field ecologist can retrieve subsets of that same data in a comma-separated value format suitable for use in Excel or R. Tools such as our MODIS subsetting capability, the Spatial Data Access Tool (SDAT; based on OGC web services), and OPeNDAP-compliant servers such as THREDDS particularly enable such diverse means of access. We also seek interoperability of metadata, recognizing that terrestrial ecology is a field with a very large number of relevant data repositories. ORNL DAAC metadata is published to several metadata repositories using the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH), to increase the chances that users can find data holdings relevant to their particular scientific problem. ORNL also seeks to leverage technology across these various data projects and encourage standardization of processes and technical architecture. This standardization is behind current efforts involving the use of Drupal and Fedora Commons. This poster describes the current and planned approaches that the ORNL DAAC is taking to enable cost-effective interoperability among data centers, both across the NASA EOSDIS data centers and across the international spectrum of terrestrial ecology-related data centers. The poster will highlight the standards that we are currently using across data formats, metadata formats, and data protocols. References: [1] Devarakonda R., et al. Mercury: reusable metadata management, data discovery and access system. Earth Science Informatics (2010), 3(1): 87-94. [2] Devarakonda R., et al. Data sharing and retrieval using OAI-PMH. Earth Science Informatics (2011), 4(1): 1-5.
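
    The OAI-PMH harvesting described above can be sketched in a few lines. The base URL and sample record below are illustrative assumptions, not an actual ORNL DAAC service; the sketch builds a ListRecords request and extracts Dublin Core titles from a response:

```python
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

# Hypothetical repository endpoint; real repositories expose a similar base URL.
BASE_URL = "https://example.org/oai"

def list_records_url(metadata_prefix: str = "oai_dc", set_spec: str = "") -> str:
    """Build an OAI-PMH ListRecords request URL."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec:
        params["set"] = set_spec
    return BASE_URL + "?" + urlencode(params)

# A minimal, illustrative ListRecords response carrying one Dublin Core record.
SAMPLE_RESPONSE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>Sample terrestrial ecology data set</dc:title>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

def extract_titles(xml_text: str) -> list:
    """Pull dc:title values out of a ListRecords response."""
    root = ET.fromstring(xml_text)
    return [el.text for el in root.iter("{http://purl.org/dc/elements/1.1/}title")]

print(list_records_url(set_spec="terrestrial_ecology"))
print(extract_titles(SAMPLE_RESPONSE))
```

A harvester would issue such requests repeatedly, following the protocol's resumption tokens, to mirror a repository's metadata into its own catalog.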

  4. International Standards for Genomes, Transcriptomes, and Metagenomes

    PubMed Central

    Mason, Christopher E.; Afshinnekoo, Ebrahim; Tighe, Scott; Wu, Shixiu; Levy, Shawn

    2017-01-01

    Challenges and biases in preparing, characterizing, and sequencing DNA and RNA can have significant impacts on research in genomics across all kingdoms of life, including experiments in single-cells, RNA profiling, and metagenomics (across multiple genomes). Technical artifacts and contamination can arise at each point of sample manipulation, extraction, sequencing, and analysis. Thus, the measurement and benchmarking of these potential sources of error are of paramount importance as next-generation sequencing (NGS) projects become more global and ubiquitous. Fortunately, a variety of methods, standards, and technologies have recently emerged that improve measurements in genomics and sequencing, from the initial input material to the computational pipelines that process and annotate the data. Here we review current standards and their applications in genomics, including whole genomes, transcriptomes, mixed genomic samples (metagenomes), and the modified bases within each (epigenomes and epitranscriptomes). These standards, tools, and metrics are critical for quantifying the accuracy of NGS methods, which will be essential for robust approaches in clinical genomics and precision medicine. PMID:28337071

  5. Evaluation of the methods for enumerating coliform bacteria from water samples using precise reference standards.

    PubMed

    Wohlsen, T; Bates, J; Vesey, G; Robinson, W A; Katouli, M

    2006-04-01

    To use BioBall cultures as a precise reference standard to evaluate methods for enumeration of Escherichia coli and other coliform bacteria in water samples. Eight methods were evaluated including membrane filtration, standard plate count (pour and spread plate methods), defined substrate technology methods (Colilert and Colisure), the most probable number method and the Petrifilm disposable plate method. Escherichia coli and Enterobacter aerogenes BioBall cultures containing 30 organisms each were used. All tests were performed using 10 replicates. The mean recovery of both bacteria varied with the different methods employed. The best and most consistent results were obtained with Petrifilm and the pour plate method. Other methods either yielded a low recovery or showed significantly high variability between replicates. The BioBall is a very suitable quality control tool for evaluating the efficiency of methods for bacterial enumeration in water samples.

  6. The Frontlines of Medicine Project: a proposal for the standardized communication of emergency department data for public health uses including syndromic surveillance for biological and chemical terrorism.

    PubMed

    Barthell, Edward N; Cordell, William H; Moorhead, John C; Handler, Jonathan; Feied, Craig; Smith, Mark S; Cochrane, Dennis G; Felton, Christopher W; Collins, Michael A

    2002-04-01

    The Frontlines of Medicine Project is a collaborative effort of emergency medicine (including emergency medical services and clinical toxicology), public health, emergency government, law enforcement, and informatics. This collaboration proposes to develop a nonproprietary, "open systems" approach for reporting emergency department patient data. The common element is a standard approach to sending messages from individual EDs to regional oversight entities that could then analyze the data received. ED encounter data could be used for various public health initiatives, including syndromic surveillance for chemical and biological terrorism. The interlinking of these regional systems could also permit public health surveillance at a national level based on ED patient encounter data. Advancements in the Internet and Web-based technologies could allow the deployment of these standardized tools in a rapid time frame.
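
    The message-based approach described above can be made concrete with a small, purely hypothetical sketch: the field names, message shape, and validation rule below are illustrative assumptions, not the Frontlines proposal itself.

```python
import json

# Illustrative message shape only; every field name here is hypothetical.
REQUIRED_FIELDS = {"facility_id", "encounter_time", "chief_complaint", "syndrome_category"}
IDENTIFIER_FIELDS = {"name", "ssn", "address"}

def validate_encounter(message_json: str) -> bool:
    """Accept a report only if it carries the fields a regional surveillance
    hub would need and contains no direct patient identifiers."""
    msg = json.loads(message_json)
    return REQUIRED_FIELDS <= msg.keys() and not (IDENTIFIER_FIELDS & msg.keys())

# A de-identified ED encounter report sent from a hospital to a regional hub.
example = json.dumps({
    "facility_id": "ED-042",
    "encounter_time": "2002-04-01T13:20:00",
    "chief_complaint": "fever and cough",
    "syndrome_category": "respiratory",
})

print(validate_encounter(example))  # True
```

The point of such a shared schema is that any regional entity can validate and aggregate reports from heterogeneous ED systems without proprietary adapters.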

  7. Overview of the Machine-Tool Task Force

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutton, G.P.

    1981-06-08

    The Machine Tool Task Force (MTTF) surveyed the state of the art of machine tool technology for material removal over a period of two and one-half years. This overview gives a brief summary of the approach, the specific subjects covered, the principal conclusions, and some of the key recommendations aimed at improving the technology and advancing the productivity of machine tools. The Task Force consisted of 123 experts from the US and other countries. Their findings are documented in a five-volume report, Technology of Machine Tools.

  8. Multi-disciplinary communication networks for skin risk assessment in nursing homes with high IT sophistication.

    PubMed

    Alexander, Gregory L; Pasupathy, Kalyan S; Steege, Linsey M; Strecker, E Bradley; Carley, Kathleen M

    2014-08-01

    The role of nursing home (NH) information technology (IT) in quality improvement has not been clearly established, and its impacts on communication between caregivers and patient outcomes in these settings deserve further attention. In this research, we describe a mixed-methods approach to explore communication strategies used by healthcare providers for resident skin risk in NHs with high IT sophistication (ITS). The sample included NHs participating in the statewide survey of ITS. We incorporated rigorous observation of 8- and 12-hour shifts, and focus groups to identify how NH IT and a range of synchronous and asynchronous tools are used. Social network analysis tools and qualitative analysis were used to analyze data and identify relationships between ITS dimensions and communication interactions between care providers. Two of the nine ITS dimensions (resident care-technological and administrative activities-technological) and total ITS were significantly negatively correlated with the number of unique interactions: as more processes in resident care and administrative activities are supported by technology, the number of observed unique interactions falls. Additionally, four thematic areas emerged from staff focus groups that demonstrate how important IT is to resident care in these facilities, including providing resident-centered care, teamwork and collaboration, maintaining safety and quality, and using standardized information resources. Our findings confirm prior research that as technology support (resident care and administrative activities) and overall ITS increase, observed interactions between staff members decrease. Conversations during staff interviews focused on how technology facilitated resident-centered care through enhanced information sharing, greater virtual collaboration between team members, and improved care delivery. These results provide evidence for improving the design and implementation of IT in long-term care systems to support communication and associated resident outcomes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
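
    The negative relationships reported above are correlations across facilities. As a minimal sketch of how such a coefficient is computed, on synthetic facility-level numbers (not the study's data):

```python
import numpy as np

def pearson_r(x, y) -> float:
    """Pearson correlation coefficient between two paired samples."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc * yc).sum() / np.sqrt((xc ** 2).sum() * (yc ** 2).sum()))

# Synthetic, illustrative data: higher IT sophistication scores paired with
# fewer observed unique staff interactions per facility.
its_score = [20, 35, 40, 55, 60, 75, 80, 90]
unique_interactions = [48, 45, 40, 34, 30, 25, 22, 18]

r = pearson_r(its_score, unique_interactions)
print(round(r, 3))  # strongly negative
```

A study would additionally test whether r differs significantly from zero; here the sketch only shows the direction and strength of the association.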

  9. Tools, courses, and learning pathways offered by the National Interagency Fuels, Fire, and Vegetation Technology Transfer

    Treesearch

    Eva K. Strand; Kathy H. Schon; Jeff Jones

    2010-01-01

    Technological advances in the area of fuel and wildland fire management have created a need for effective decision support tools and technology training. The National Interagency Fuels Committee and LANDFIRE have chartered a team to develop science-based learning tools for assessment of fire and fuels and to provide online training and technology transfer to help...

  10. Technology as Mediation Tool for Improving Teaching Profession in Higher Education Practices

    ERIC Educational Resources Information Center

    Altinay-Gazi, Zehra; Altinay-Aksal, Fahriye

    2017-01-01

    Technology has become a mediation tool for forming information and developing skills in teacher education programs at higher education institutions, because technological tools can be used for self-reflection on prospective teachers' teaching performances. Practical implementation of teacher education programmes is a part of quality indicators in higher…

  11. Technology Tools to Support Reading in the Digital Age

    ERIC Educational Resources Information Center

    Biancarosa, Gina; Griffiths, Gina G.

    2012-01-01

    Advances in digital technologies are dramatically altering the texts and tools available to teachers and students. These technological advances have created excitement among many for their potential to be used as instructional tools for literacy education. Yet with the promise of these advances come issues that can exacerbate the literacy…

  12. 45 CFR 170.205 - Content exchange standards and implementation specifications for exchanging electronic health...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... HEALTH AND HUMAN SERVICES HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS... TECHNOLOGY Standards and Implementation Specifications for Health Information Technology § 170.205 Content.... The Healthcare Information Technology Standards Panel (HITSP) Summary Documents Using HL7 CCD...

  13. 45 CFR 170.205 - Content exchange standards and implementation specifications for exchanging electronic health...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... HEALTH AND HUMAN SERVICES HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS... TECHNOLOGY Standards and Implementation Specifications for Health Information Technology § 170.205 Content.... The Healthcare Information Technology Standards Panel (HITSP) Summary Documents Using HL7 CCD...

  14. A Standards-Based Grading and Reporting Tool for Faculty: Design and Implications

    ERIC Educational Resources Information Center

    Sadik, Alaa M.

    2011-01-01

    The use of standards-based assessment, grading, and reporting tools is essential to ensure that assessment meets acceptable levels of quality and standardization. This study reports the design, development, and evaluation of a standards-based assessment tool for instructors at Sultan Qaboos University, Sultanate of Oman. The Rapid Applications…

  15. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    NASA Technical Reports Server (NTRS)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network development. The changes were to include evaluation tools that can be applied to neural networks at each phase of the software engineering life cycle. The result was a formal evaluation approach to increase the product quality of systems that use neural networks for their implementation.

  16. Modular workcells: modern methods for laboratory automation.

    PubMed

    Felder, R A

    1998-12-01

    Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation' to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that allows vendor-independent modular configurations will assure the success of this revolutionary new technology.

  17. Precision medicine in chronic disease management: the MS BioScreen

    PubMed Central

    Gourraud, Pierre-Antoine; Henry, Roland; Cree, Bruce AC; Crane, Jason C; Lizee, Antoine; Olson, Marram P; Santaniello, Adam V.; Datta, Esha; Zhu, Alyssa H.; Bevan, Carolyn J.; Gelfand, Jeffrey M.; Graves, Jennifer A.; Goodin, Douglas E.; Green, Ari; von Büdingen, H.-Christian; Waubant, Emmanuelle; Zamvil, Scott S.; Crabtree-Hartman, Elizabeth; Nelson, Sarah; Baranzini, Sergio E.; Hauser, Stephen L.

    2014-01-01

    We present a precision medicine application developed for multiple sclerosis (MS): the MS BioScreen. This new tool addresses the challenges of dynamic management of a complex chronic disease; the interaction of clinicians and patients with such a tool illustrates the extent to which translational digital medicine, i.e., the application of information technology to medicine, has the potential to radically transform medical practice. We introduce three key evolutionary phases in displaying data to health care providers, patients, and researchers: visualization (accessing data), contextualization (understanding the data), and actionable interpretation (real-time use of the data to assist decision-making). Together these form the stepping-stones that are expected to accelerate standardization of data across platforms, promote evidence-based medicine, support shared decision-making, and ultimately lead to improved outcomes. PMID:25263997

  18. NASA System-Level Design, Analysis and Simulation Tools Research on NextGen

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge

    2011-01-01

    A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) thrust of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts, and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program, to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes the accomplishments of baseline research and the advancements in design studies and system-level assessment, including the cluster analysis as an annualization standard of the air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.

  19. SRM 2460/2461 Standard Bullets and Casings Project

    PubMed Central

    Song, J.; Whitenton, E.; Kelley, D.; Clary, R.; Ma, L.; Ballou, S.; Ols, M.

    2004-01-01

    The National Institute of Standards and Technology Standard Reference Material (SRM) 2460/2461 standard bullets and casings project will provide support to firearms examiners and to the National Integrated Ballistics Information Network (NIBIN) in the United States. The SRM bullet is designed as both a virtual and a physical bullet profile signature standard. The virtual standard is a set of six digitized bullet profile signatures originally traced from six master bullets fired at the Bureau of Alcohol, Tobacco and Firearms (ATF) and the Federal Bureau of Investigation (FBI). By using the virtual signature standard to control the tool path on a numerically controlled diamond turning machine, 40 SRM bullets were produced. A profile signature measurement system was established for the SRM bullets. The profile signature differences are quantified by the maximum of the cross correlation function and by the signature difference between pairs of compared profile signatures measured on different SRM bullets. Initial measurement results showed high reproducibility for both the measurement system and production process of the SRM bullets. A traceability scheme has been proposed to establish the measurement traceability for nationwide bullet signature measurements to NIST, ATF and FBI. Prototype SRM casings have also been developed. PMID:27366632
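
    The similarity measure named above, the maximum of the cross-correlation function between two profile signatures, can be sketched as follows. The signatures here are synthetic sine profiles, and this is a generic illustration, not NIST's implementation:

```python
import numpy as np

def ccf_max(a: np.ndarray, b: np.ndarray) -> float:
    """Maximum of the normalized cross-correlation between two equal-length
    profile signatures, searched over all relative shifts (1.0 = same shape)."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.max(np.correlate(a, b, mode="full")))

# Synthetic profile signatures: b is a slightly shifted copy of a,
# c is an unrelated noise profile.
rng = np.random.default_rng(1)
a = np.sin(np.linspace(0.0, 8.0 * np.pi, 400))
b = np.roll(a, 5)
c = rng.normal(size=400)

print(ccf_max(a, b) > 0.9)   # shifted copy: CCF max near 1 (True)
print(ccf_max(a, c) < 0.5)   # unrelated profile: low CCF max (True)
```

Searching over shifts makes the measure insensitive to where along the bullet's circumference each trace begins, which is why a CCF maximum near 1 indicates matching signature shape.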

  20. Developing Indicators for a Classroom Observation Tool on Pedagogy and Technology Integration: A Delphi Study

    ERIC Educational Resources Information Center

    Elmendorf, Douglas C.; Song, Liyan

    2015-01-01

    Rapid advances in technology and increased access to technology tools have created new instructional demands and expectations on teachers. Due to the ubiquitous presence of technology in K-12 schools, teachers are being observed on both their pedagogical and technology integration practices. Applying the technological pedagogical and content…

  1. Development of a nursing handoff tool: a web-based application to enhance patient safety.

    PubMed

    Goldsmith, Denise; Boomhower, Marc; Lancaster, Diane R; Antonelli, Mary; Kenyon, Mary Anne Murphy; Benoit, Angela; Chang, Frank; Dykes, Patricia C

    2010-11-13

    Dynamic and complex clinical environments present many challenges for effective communication among health care providers. The omission of accurate, timely, easily accessible vital information by health care providers significantly increases the risk of patient harm and can have devastating consequences for patient care. An effective nursing handoff supports the standardized transfer of accurate, timely, critical patient information, as well as continuity of care and treatment, resulting in enhanced patient safety. The Brigham and Women's/Faulkner Hospital Healthcare Information Technology Innovation Program (HIP) is supporting the development of a web-based nursing handoff tool (NHT). The goal of this project is to develop a "proof of concept" handoff application to be evaluated by nurses on the inpatient intermediate care units. The handoff tool would enable nurses to use existing knowledge of evidence-based handoff methodology in their everyday practice to improve patient care and safety. In this paper, we discuss the results of nursing focus groups designed to identify the current state of handoff practice as well as the functional and data element requirements of the NHT.

  2. Game playbooks: tools to guide multidisciplinary teams in developing videogame-based behavior change interventions.

    PubMed

    Duncan, Lindsay R; Hieftje, Kimberly D; Culyba, Sabrina; Fiellin, Lynn E

    2014-03-01

    As mobile technologies and videogaming platforms are becoming increasingly prevalent in the realm of health and healthcare, so are the opportunities to use these resources to conduct behavioral interventions. The creation and empirical testing of game style interventions, however, is challenged by the requisite collaboration of multidisciplinary teams, including researchers and game developers who have different cultures, terminologies, and standards of evidence. Thus, traditional intervention development tools such as logic models and intervention manuals may need to be augmented by creating what we have termed "Game Playbooks" which are intervention guidebooks that are created by, understood by, and acceptable to all members of the multidisciplinary game development team. The purpose of this paper is to describe the importance and content of a Game Playbook created to aide in the development of a videogame intervention designed specifically for health behavior change in young teens as well as the process for creating such a tool. We draw on the experience of our research and game design team to describe the critical components of the Game Playbook and the necessity of creating such a tool.

  3. "The wondrous eyes of a new technology"-a history of the early electroencephalography (EEG) of psychopathy, delinquency, and immorality.

    PubMed

    Schirmann, Felix

    2014-01-01

    This article presents a history of the early electroencephalography (EEG) of psychopathy, delinquency, and immorality in Great Britain and the United States in the 1940s and 1950s. At the time, EEG was a novel research tool that promised ground-breaking insights in psychiatry and criminology. Experts explored its potential regarding the diagnosis, classification, etiology, and treatment of unethical and unlawful persons. This line of research yielded tentative and inconsistent findings, which the experts attributed to methodological and theoretical shortcomings. Accordingly, the scientific community discussed the reliability, validity, and utility of EEG, and launched initiatives to calibrate and standardize the novel tool. The analysis shows that knowledge production, gauging of the research tool, and attempts to establish credibility for EEG in the study of immoral persons occurred simultaneously. The paper concludes with a reflection on the similarities between EEG and neuroimaging, the prime research tool in the current neuroscience of morality, and calls for a critical assessment of their potentials and limitations in the study of immorality and crime.

  4. Current State and Model for Development of Technology-Based Care for Attention Deficit Hyperactivity Disorder.

    PubMed

    Benyakorn, Songpoom; Riley, Steven J; Calub, Catrina A; Schweitzer, Julie B

    2016-09-01

    Care (i.e., evaluation and intervention) delivered through technology is used in many areas of mental health services, including for persons with attention deficit hyperactivity disorder (ADHD). Technology can facilitate care for individuals with ADHD, their parents, and their care providers. The adoption of technological tools for ADHD care requires evidence-based studies to support the transition from development to integration into use in the home, school, or work for persons with the disorder. The initial phase, which is development of technological tools, has begun in earnest; however, the evidence base for many of these tools is lacking. In some instances, the uptake of a piece of technology into home use or clinical practice may be further along than the research to support its use. In this study, we review the current evidence regarding technology for ADHD and also propose a model to evaluate the support for other tools that have yet to be tested. We propose using the Research Domain Criteria as a framework for evaluating the tools' relationships to dimensions related to ADHD. This article concludes with recommendations for testing new tools that may have promise in improving the evaluation or treatment of persons with ADHD.

  5. Usability testing of a monitoring and feedback tool to stimulate physical activity.

    PubMed

    van der Weegen, Sanne; Verwey, Renée; Tange, Huibert J; Spreeuwenberg, Marieke D; de Witte, Luc P

    2014-01-01

    A monitoring and feedback tool to stimulate physical activity, consisting of an activity sensor, smartphone application (app), and website for patients and their practice nurses, has been developed: the 'It's LiFe!' tool. In this study the usability of the tool was evaluated by technology experts and end users (people with chronic obstructive pulmonary disease or type 2 diabetes, aged 40-70 years), to improve the user interfaces and content of the tool. The study had four phases: 1) a heuristic evaluation with six technology experts; 2) a usability test in a laboratory by five patients; 3) a pilot in real life wherein 20 patients used the tool for 3 months; and 4) a final lab test by five patients. In both lab tests (phases 2 and 4), qualitative data were collected through a thinking-aloud procedure and video recordings, and quantitative data through questions about task complexity, text comprehensiveness, and readability. In addition, the Post-Study System Usability Questionnaire (PSSUQ) was completed for the app and the website. In the pilot test (phase 3), all patients were interviewed three times and the Software Usability Measurement Inventory (SUMI) was completed. After each phase, improvements were made, mainly to the layout and text. The main improvement was a refresh button for active data synchronization between activity sensor, app, and server, implemented after connectivity problems in the pilot test. The mean score on the PSSUQ for the website improved from 5.6 (standard deviation [SD] 1.3) to 6.5 (SD 0.5), and for the app from 5.4 (SD 1.5) to 6.2 (SD 1.1). Satisfaction in the pilot was not very high according to the SUMI. The use of laboratory versus real-life tests and expert-based versus user-based tests revealed a wide range of usability issues. The usability of the It's LiFe! tool improved considerably during the study.

  6. Standardization in synthetic biology: an engineering discipline coming of age.

    PubMed

    Decoene, Thomas; De Paepe, Brecht; Maertens, Jo; Coussement, Pieter; Peters, Gert; De Maeseneire, Sofie L; De Mey, Marjan

    2018-08-01

    Leaping DNA read-and-write technologies, and extensive automation and miniaturization are radically transforming the field of biological experimentation by providing the tools that enable the cost-effective high-throughput required to address the enormous complexity of biological systems. However, standardization of the synthetic biology workflow has not kept abreast with dwindling technical and resource constraints, leading, for example, to the collection of multi-level and multi-omics large data sets that end up disconnected or remain under- or even unexploited. In this contribution, we critically evaluate the various efforts, and the (limited) success thereof, in order to introduce standards for defining, designing, assembling, characterizing, and sharing synthetic biology parts. The causes for this success or the lack thereof, as well as possible solutions to overcome these, are discussed. Akin to other engineering disciplines, extensive standardization will undoubtedly speed up and reduce the cost of bioprocess development. In this respect, further implementation of synthetic biology standards will be crucial for the field in order to redeem its promise, i.e. to enable predictable forward engineering.

  7. Image standards in tissue-based diagnosis (diagnostic surgical pathology).

    PubMed

    Kayser, Klaus; Görtler, Jürgen; Goldmann, Torsten; Vollmer, Ekkehard; Hufnagl, Peter; Kayser, Gian

    2008-04-18

    Progress in automated image analysis, virtual microscopy, hospital information systems, and interdisciplinary data exchange require image standards to be applied in tissue-based diagnosis. To describe the theoretical background, practical experiences and comparable solutions in other medical fields to promote image standards applicable for diagnostic pathology. THEORY AND EXPERIENCES: Images used in tissue-based diagnosis present with pathology-specific characteristics. It seems appropriate to discuss their characteristics and potential standardization in relation to the levels of hierarchy in which they appear. All levels can be divided into legal, medical, and technological properties. Standards applied to the first level include regulations or aims to be fulfilled. In legal properties, they have to regulate features of privacy, image documentation, transmission, and presentation; in medical properties, features of disease-image combination, human-diagnostics, automated information extraction, archive retrieval and access; and in technological properties features of image acquisition, display, formats, transfer speed, safety, and system dynamics. The next lower second level has to implement the prescriptions of the upper one, i.e. describe how they are implemented. Legal aspects should demand secure encryption for privacy of all patient related data, image archives that include all images used for diagnostics for a period of 10 years at minimum, accurate annotations of dates and viewing, and precise hardware and software information. Medical aspects should demand standardized patients' files such as DICOM 3 or HL 7 including history and previous examinations, information of image display hardware and software, of image resolution and fields of view, of relation between sizes of biological objects and image sizes, and of access to archives and retrieval. 
Technological aspects should deal with image acquisition systems (resolution, colour temperature, focus, brightness, and quality evaluation procedures), display resolution data, implemented image formats, storage, cycle frequency, backup procedures, operation system, and external system accessibility. The lowest third level describes the permitted limits and threshold in detail. At present, an applicable standard including all mentioned features does not exist to our knowledge; some aspects can be taken from radiological standards (PACS, DICOM 3); others require specific solutions or are not covered yet. The progress in virtual microscopy and application of artificial intelligence (AI) in tissue-based diagnosis demands fast preparation and implementation of an internationally acceptable standard. The described hierarchic order as well as analytic investigation in all potentially necessary aspects and details offers an appropriate tool to specifically determine standardized requirements.

  8. A phenomenological study on middle-school science teachers' perspectives on utilization of technology in the science classroom and its effect on their pedagogy

    NASA Astrophysics Data System (ADS)

    Rajbanshi, Roshani

    With broad access to technology and mainstream expectations of its use, technology in the classroom has become essential. However, the problem in science education is that even in classrooms filled with technological equipment, the teaching style is didactic, and teachers employ traditional teacher-centered methods. In addition, results of international assessments indicate that students' science learning needs to be improved. The purpose of this study is to analyze and document the lived experience of middle-school science teachers and their use of technology in their personal and professional lives as well as in their classrooms, and to describe the phenomenon of middle-school science teachers' technological beliefs for the integration of digital devices or technology as an instructional delivery tool, knowledge construction tool, and learning tool. For this study, technology is defined as digital devices such as computers, laptops, digital cameras, and iPads that are used in the science classroom as an instructional delivery tool, as a learning tool, and as a knowledge construction tool. Constructivism is the lens, the theoretical framework, that guides this qualitative phenomenological research. Observation, interviews, personal journals, photo elicitation, and journal reflection were used as methods of data collection. Data were analyzed based on a constructivist theoretical framework to construct knowledge and draw conclusions. MAXQDA, a qualitative analysis software package, was also used to analyze the data. The findings indicate that middle-school science teachers use technology in various ways to engage and motivate students in science learning; however, multiple factors influence teachers' technology use in the class. 
    In conclusion, teachers, students, and technology form a triangle in which technology acts as the bridge connecting teachers' content knowledge to students through a tool with which students are familiar. Keywords: teachers' beliefs, science and technology, knowledge construction.

  9. Using Web-Based Technologies and Tools in Future Choreographers' Training: British Experience

    ERIC Educational Resources Information Center

    Bidyuk, Dmytro

    2016-01-01

    In the paper the problem of using effective web-based technologies and tools in teaching choreography in British higher education institutions has been discussed. Research on the usage of web-based technologies and tools for practical dance courses in choreographers' professional training at British higher education institutions by such British…

  10. A Teacher's Utilization of Information and Communication Technology as a Pedagogical Tool

    ERIC Educational Resources Information Center

    Golas, Jennifer Lynn

    2013-01-01

    This research study investigated the factors that either contribute to or inhibit a teacher's use of information and communication technology as an educational tool in the classroom. Factors such as teachers' perceived level of proficiency with technology, technology-related professional development, planning time, and the technology resources available…

  11. Image is everything: pearls and pitfalls of digital photography and PowerPoint presentations for the cosmetic surgeon.

    PubMed

    Niamtu, Joseph

    2004-01-01

    Cosmetic surgery and photography are inseparable. Clinical photographs serve as diagnostic aids, medical records, legal protection, and marketing tools. In the past, taking high-quality, standardized images and maintaining and using them for presentations were tasks of significant proportion when done correctly. Although the cosmetic literature is replete with articles on standardized photography, this has eluded many practitioners in part due to its complexity. A paradigm shift has occurred in the past decade, and digital technology has revolutionized clinical photography and presentations. Digital technology has made it easier than ever to take high-quality, standardized images and to use them in a multitude of ways to enhance the practice of cosmetic surgery. PowerPoint presentations have become the standard for academic presentations, but many pitfalls exist, especially when taking a backup disc to play on an alternate computer at a lecture venue. Embracing digital technology has a mild to moderate learning curve but is complicated by old habits and holdovers from the days of slide photography, macro lenses, and specialized flashes. Discussion is presented to circumvent common problems involving computer glitches with PowerPoint presentations. In the past, high-quality clinical photography was complex and sometimes beyond the confines of a busy clinical practice. The digital revolution of the past decade has removed many of these associated barriers, and it has never been easier or more affordable to take images and use them in a multitude of ways for learning, judging surgical outcomes, teaching and lecturing, and marketing. Even though this technology has existed for years, many practitioners have failed to embrace it for various reasons or fears. By following a few simple techniques, even the most novice practitioner can be on the forefront of digital imaging technology. 
By observing a number of modified techniques with digital cameras, any practitioner can take high-quality, standardized clinical photographs and can make and use these images to enhance his or her practice. This article deals with common pitfalls of digital photography and PowerPoint presentations and presents multiple pearls to achieve proficiency quickly with digital photography and imaging as well as avoid malfunction of PowerPoint presentations in an academic lecture venue.

  12. Telemedicine Technologies for Diabetes in Pregnancy: A Systematic Review and Meta-Analysis.

    PubMed

    Ming, Wai-Kit; Mackillop, Lucy H; Farmer, Andrew J; Loerup, Lise; Bartlett, Katy; Levy, Jonathan C; Tarassenko, Lionel; Velardo, Carmelo; Kenworthy, Yvonne; Hirst, Jane E

    2016-11-09

    Diabetes in pregnancy is a global problem. Technological innovations present exciting opportunities for novel approaches to improve clinical care delivery for gestational and other forms of diabetes in pregnancy. The objective was to perform an updated and comprehensive systematic review and meta-analysis of the literature to determine whether telemedicine solutions offer any advantages compared with the standard care for women with diabetes in pregnancy. The review was developed using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework. Randomized controlled trials (RCTs) in women with diabetes in pregnancy that compared telemedicine blood glucose monitoring with the standard care were identified. Searches were performed in SCOPUS and PubMed, limited to English language publications between January 2000 and January 2016. Trials that met the eligibility criteria were scored for risk of bias using the Cochrane Collaboration's Risk of Bias Tool. A meta-analysis was performed using Review Manager software version 5.3 (Nordic Cochrane Centre, Cochrane Collaboration). A total of 7 trials were identified. Meta-analysis demonstrated a modest but statistically significant improvement in HbA1c associated with the use of a telemedicine technology. The mean HbA1c of women using telemedicine was 5.33% (SD 0.70) compared with 5.45% (SD 0.58) in the standard care group, representing a mean difference of -0.12% (95% CI -0.23% to -0.02%). When this comparison was limited to women with gestational diabetes mellitus (GDM) only, the mean HbA1c of women using telemedicine was 5.22% (SD 0.70) compared with 5.37% (SD 0.61) in the standard care group, mean difference -0.14% (95% CI -0.25% to -0.04%). There were no differences in other maternal and neonatal outcomes reported. There is currently insufficient evidence that telemedicine technology is superior to standard care for women with diabetes in pregnancy; however, there was no evidence of harm. 
No trials were identified that assessed patient satisfaction or cost of care delivery, and it may be in these areas that these technologies prove most valuable. ©Wai-Kit Ming, Lucy H Mackillop, Andrew J Farmer, Lise Loerup, Katy Bartlett, Jonathan C Levy, Lionel Tarassenko, Carmelo Velardo, Yvonne Kenworthy, Jane E Hirst. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 09.11.2016.
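
    The pooled estimate in the record above comes from a fixed-effect, inverse-variance meta-analysis of per-study mean differences, the standard Review Manager approach. A minimal sketch of that calculation follows; the per-study values are invented for illustration and are not the data from the 7 trials:

    ```python
    from math import sqrt

    def pooled_mean_difference(studies):
        """Fixed-effect (inverse-variance) pooled mean difference.

        Each study is a (mean_difference, standard_error) pair; weights are
        the reciprocal of each study's variance."""
        weights = [1.0 / se ** 2 for _, se in studies]
        md = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
        se = sqrt(1.0 / sum(weights))
        return md, (md - 1.96 * se, md + 1.96 * se)  # estimate, 95% CI

    # Illustrative per-study mean differences in HbA1c (%), not the trial data.
    md, ci = pooled_mean_difference([(-0.10, 0.08), (-0.20, 0.10), (-0.05, 0.12)])
    print(round(md, 2), round(ci[0], 3), round(ci[1], 3))  # -> -0.12 -0.229 -0.011
    ```

    Larger trials (smaller standard errors) pull the pooled estimate toward their own result, which is the point of inverse-variance weighting.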

  13. 40 CFR 268.42 - Treatment standards expressed as specified technologies.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... specified technologies. 268.42 Section 268.42 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... standards expressed as specified technologies. Note: For the requirements previously found in this section in Table 2—Technology-Based Standards By RCRA Waste Code, and Table 3—Technology-Based Standards for...

  14. 40 CFR 268.42 - Treatment standards expressed as specified technologies.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... specified technologies. 268.42 Section 268.42 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... standards expressed as specified technologies. Note: For the requirements previously found in this section in Table 2—Technology-Based Standards By RCRA Waste Code, and Table 3—Technology-Based Standards for...

  15. State-of-the-art and emerging technologies for atrial fibrillation ablation.

    PubMed

    Dewire, Jane; Calkins, Hugh

    2010-03-01

    Catheter ablation is an important treatment modality for patients with atrial fibrillation (AF). Although the superiority of catheter ablation over antiarrhythmic drug therapy has been demonstrated in middle-aged patients with paroxysmal AF, the role of the procedure in other patient subgroups, particularly those with long-standing persistent AF, has not been well defined. Furthermore, although AF ablation can be performed with reasonable efficacy and safety by experienced operators, long-term success rates for single procedures are suboptimal. Fortunately, extensive ongoing research will improve our understanding of the mechanisms of AF, and considerable funds are being invested in developing new ablation technologies to improve patient outcomes. These technologies include ablation catheters designed to electrically isolate the pulmonary veins with improved safety, efficacy, and speed; catheters designed to deliver radiofrequency energy with improved precision; robotic systems to address the technological demands of the procedure; improved imaging and electrical mapping systems; and MRI-guided ablation strategies. The tools, technologies, and techniques that will ultimately stand the test of time and become the standard approach to AF ablation in the future remain unclear. However, technological advances are sure to result in the necessary improvements in the safety and efficacy of AF ablation procedures.

  16. A Transforming Electricity System: Understanding the Interactions Between Clean Energy Technologies, Markets, and Policies

    NASA Astrophysics Data System (ADS)

    Mooney, David

    The U.S. electricity system is currently undergoing a dramatic transformation. State-level renewable portfolio standards, abundant natural gas at low prices, and rapidly falling prices for wind and solar technologies are among the factors that have ushered in this transformation. With objective, rigorous, technology-neutral analysis, NREL aims to increase the understanding of energy policies, markets, resources, technologies, and infrastructure and their connections with economic, environmental, and security priorities. The results of these analyses are meant to inform R&D, policy, and investment decisions as energy-efficient and renewable energy technologies advance from concept to commercial application to market penetration. This talk will provide an overview of how NREL uses high-fidelity data, deep knowledge of energy technology cost and performance, and advanced models and tools to provide the information needed to ensure this transformation occurs economically, while maintaining system reliability. Examples will be explored and will include analysis of tax credit impacts on wind and solar deployment and power sector emissions, as well as analysis of power systems operations in the Eastern Interconnection under 30% wind and solar penetration scenarios.

  17. Hosted Services for Advanced V and V Technologies: An Approach to Achieving Adoption without the Woes of Usage

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.; OMalley, Owen; Brew, William A.

    2003-01-01

    Attempts to achieve widespread use of software verification tools have been notably unsuccessful. Even 'straightforward', classic, and potentially effective verification tools such as lint-like tools face limits on their acceptance. These limits are imposed by the expertise required to apply the tools and interpret the results, the high false positive rate of many verification tools, and the need to integrate the tools into development environments. The barriers are even greater for more complex advanced technologies such as model checking. Web-hosted services for advanced verification technologies may mitigate these problems by centralizing tool expertise. The possible benefits of this approach include eliminating the need for software developer expertise in tool application and results filtering, and improving integration with other development tools.

  18. Current state-of-art of STR sequencing in forensic genetics.

    PubMed

    Alonso, Antonio; Barrio, Pedro A; Müller, Petra; Köcher, Steffi; Berger, Burkhard; Martin, Pablo; Bodner, Martin; Willuweit, Sascha; Parson, Walther; Roewer, Lutz; Budowle, Bruce

    2018-05-11

    The current state of validation and implementation strategies of MPS technology for the analysis of STR markers for forensic genetics use is described, covering the topics of the current catalogue of commercial MPS-STR panels, leading MPS-platforms, and MPS-STR data analysis tools. In addition, the developmental and internal validation studies carried out to date to evaluate reliability, sensitivity, mixture analysis, concordance, and the ability to analyze challenged samples are summarized. The results of various MPS-STR population studies that showed a large number of new STR sequence variants that increase the power of discrimination in several forensically-relevant loci are also presented. Finally, various initiatives developed by several international projects and standardization (or guidelines) groups to facilitate application of MPS technology for STR marker analyses are discussed in regard to promoting a standard STR sequence nomenclature, performing population studies to detect sequence variants, and developing a universal system to translate sequence variants into a simple STR nomenclature (numbers and letters) compatible with national STR databases. This article is protected by copyright. All rights reserved.

  19. A Collaborative Decision Environment for UAV Operations

    NASA Technical Reports Server (NTRS)

    D'Ortenzio, Matthew V.; Enomoto, Francis Y.; Johan, Sandra L.

    2005-01-01

    NASA is developing Intelligent Mission Management (IMM) technology for science missions employing long-endurance unmanned aerial vehicles (UAVs). The IMM ground-based component is the Collaborative Decision Environment (CDE), a ground system that provides the Mission/Science team with situational awareness, collaboration, and decision-making tools. The CDE is used for pre-flight planning, mission monitoring, and visualization of acquired data. It integrates external data products used for planning and executing a mission, such as weather, satellite data products, and topographic maps, by leveraging established and emerging Open Geospatial Consortium (OGC) standards to acquire external data products via the Internet, and an industry standard geographic information system (GIS) toolkit for visualization. As a Science/Mission team may be geographically dispersed, the CDE is capable of providing access to remote users across wide area networks using Web Services technology. A prototype CDE is being developed for an instrument checkout flight on a manned aircraft in the fall of 2005, in preparation for a full deployment in support of the US Forest Service and NASA Ames Western States Fire Mission in 2006.

  20. 45 CFR 170.207 - Vocabulary standards for representing electronic health information.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY Standards and Implementation Specifications for Health Information Technology § 170.207 Vocabulary standards for representing electronic...

  1. 45 CFR 170.207 - Vocabulary standards for representing electronic health information.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY Standards and Implementation Specifications for Health Information Technology § 170.207 Vocabulary standards for representing electronic...

  2. 45 CFR 170.207 - Vocabulary standards for representing electronic health information.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY Standards and Implementation Specifications for Health Information Technology § 170.207 Vocabulary standards for representing electronic...

  3. 45 CFR 170.207 - Vocabulary standards for representing electronic health information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY Standards and Implementation Specifications for Health Information Technology § 170.207 Vocabulary standards for representing electronic...

  4. Productivity improvement through cycle time analysis

    NASA Astrophysics Data System (ADS)

    Bonal, Javier; Rios, Luis; Ortega, Carlos; Aparicio, Santiago; Fernandez, Manuel; Rosendo, Maria; Sanchez, Alejandro; Malvar, Sergio

    1996-09-01

    A cycle time (CT) reduction methodology has been developed in the Lucent Technologies facility (formerly AT&T) in Madrid, Spain. It is based on a comparison of the contribution of each process step in each technology with a target generated by a cycle time model. These targeted cycle times are obtained using capacity data of the machines processing those steps, queuing theory, and theory of constraints (TOC) principles (buffers to protect the bottleneck and low cycle time/inventory everywhere else). Overall equipment effectiveness (OEE)-like analysis is done in the machine groups with major differences between their target cycle times and real values. Comparisons between the current values of the parameters that govern their capacity (process times, availability, idles, reworks, etc.) and the engineering standards are done to detect the causes of exceeding their contribution to the cycle time. Several friendly, graphical tools have been developed to track and analyze those capacity parameters. Two tools have proved especially important: ASAP (analysis of scheduling, arrivals and performance) and Performer, which analyzes interrelation problems among machines, procedures, and direct labor. Performer is designed for a detailed, daily analysis of an isolated machine. The extensive use of this tool by the whole labor force has demonstrated impressive results in the elimination of multiple small inefficiencies, with direct positive implications for OEE. As for ASAP, it shows the lots in process/queue for different machines at the same time. ASAP is a powerful tool for analyzing product flow management and the assigned capacity for interdependent operations such as cleaning and oxidation/diffusion. Additional tools have been developed to track, analyze, and improve process times and availability.
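
    The two quantities this methodology leans on, queueing-based cycle time targets and OEE-style capacity accounting, reduce to short formulas. A hedged sketch of both, with invented example numbers rather than actual factory data (the M/M/1 form is one common simplification; the fab's cycle time model may be more elaborate):

    ```python
    def oee(availability, performance, quality):
        """Overall equipment effectiveness: the product of the three rates,
        each expressed as a fraction of the ideal (1.0 = perfect)."""
        return availability * performance * quality

    def mm1_cycle_time(service_time, utilization):
        """M/M/1 queueing approximation for total time at a station:
        cycle time = service time / (1 - utilization)."""
        return service_time / (1.0 - utilization)

    # Illustrative shift figures: 90% uptime, 95% of ideal speed, 98% good units.
    print(round(oee(0.90, 0.95, 0.98), 3))           # -> 0.838
    # A 2-hour process step at 80% utilization implies a 10-hour cycle time target.
    print(round(mm1_cycle_time(2.0, 0.80), 2))       # -> 10.0
    ```

    The queueing formula makes the TOC point concrete: as utilization approaches 1.0, cycle time explodes, which is why only the bottleneck is run near full load.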

  5. The MMI Semantic Framework: Rosetta Stones for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Rueda, C.; Bermudez, L. E.; Graybeal, J.; Alexander, P.

    2009-12-01

    Semantic interoperability—the exchange of meaning among computer systems—is needed to successfully share data in Ocean Science and across all Earth sciences. The best approach toward semantic interoperability requires a designed framework, and operationally tested tools and infrastructure within that framework. Currently available technologies make a scientific semantic framework feasible, but its development requires sustainable architectural vision and development processes. This presentation outlines the MMI Semantic Framework, including recent progress on it and its client applications. The MMI Semantic Framework consists of tools, infrastructure, and operational and community procedures and best practices, to meet short-term and long-term semantic interoperability goals. The design and prioritization of the semantic framework capabilities are based on real-world scenarios in Earth observation systems. We describe some key use cases, as well as the associated requirements for building the overall infrastructure, which is realized through the MMI Ontology Registry and Repository. This system includes support for community creation and sharing of semantic content, ontology registration, version management, and seamless integration of user-friendly tools and application programming interfaces. The presentation describes the architectural components for semantic mediation, registry and repository for vocabularies, ontology, and term mappings. We show how the technologies and approaches in the framework can address community needs for managing and exchanging semantic information. We will demonstrate how different types of users and client applications exploit the tools and services for data aggregation, visualization, archiving, and integration. Specific examples from OOSTethys (http://www.oostethys.org) and the Ocean Observatories Initiative Cyberinfrastructure (http://www.oceanobservatories.org) will be cited. 
Finally, we show how semantic augmentation of web services standards could be performed using framework tools.
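
    The term mappings such a registry manages can be pictured as cross-vocabulary lookups, a "Rosetta stone" between controlled vocabularies. A minimal sketch; the vocabulary labels and mapping entries here are invented for illustration and are not actual MMI registry content:

    ```python
    # Registered mappings: (source vocabulary, term) -> (target vocabulary, term).
    MAPPINGS = {
        ("cf", "sea_water_temperature"): ("ioos", "seaWaterTemperature"),
        ("cf", "sea_water_salinity"): ("ioos", "seaWaterSalinity"),
    }

    def map_term(vocab, term, target_vocab):
        """Translate a term into target_vocab via the registered mappings;
        return None when no mapping has been registered."""
        hit = MAPPINGS.get((vocab, term))
        return hit[1] if hit and hit[0] == target_vocab else None

    print(map_term("cf", "sea_water_temperature", "ioos"))  # -> seaWaterTemperature
    print(map_term("cf", "wind_speed", "ioos"))             # -> None
    ```

    A real registry adds provenance, versioning, and ontology-backed relations (exact match, broader/narrower) on top of this lookup, but the mediation step a client performs is essentially this translation.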

  6. Arachne—A web-based event viewer for MINERνA

    NASA Astrophysics Data System (ADS)

    Tagg, N.; Brangham, J.; Chvojka, J.; Clairemont, M.; Day, M.; Eberly, B.; Felix, J.; Fields, L.; Gago, A. M.; Gran, R.; Harris, D. A.; Kordosky, M.; Lee, H.; Maggi, G.; Maher, E.; Mann, W. A.; Marshall, C. M.; McFarland, K. S.; McGowan, A. M.; Mislivec, A.; Mousseau, J.; Osmanov, B.; Osta, J.; Paolone, V.; Perdue, G.; Ransome, R. D.; Ray, H.; Schellman, H.; Schmitz, D. W.; Simon, C.; Solano Salinas, C. J.; Tice, B. G.; Walding, J.; Walton, T.; Wolcott, J.; Zhang, D.; Ziemer, B. P.; MinerνA Collaboration

    2012-06-01

    Neutrino interaction events in the MINERνA detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINERνA to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  7. Arachne - A web-based event viewer for MINERvA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tagg, N.; /Otterbein Coll.; Brangham, J.

    2011-11-01

    Neutrino interaction events in the MINERvA detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINERvA to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  8. Security concept in 'MyAngelWeb' a website for the individual patient at risk of emergency.

    PubMed

    Pinciroli, F; Nahaissi, D; Boschini, M; Ferrari, R; Meloni, G; Camnasio, M; Spaggiari, P; Carnerone, G

    2000-11-01

    We describe the Security Plan for the 'MyAngelWeb' service. The different actors involved in the service are subject to different security procedures. The core of the security system is implemented at the host site by means of a DBMS and standard Information Technology tools. Hardware requirements for sustainable security are needed at the web-site construction sites. They are not needed at the emergency physician's site. At the emergency physician's site, a two-way authentication system (password and test phrase method) is implemented.

  9. Fast-mode duplex qPCR for BCR-ABL1 molecular monitoring: innovation, automation, and harmonization.

    PubMed

    Gerrard, Gareth; Mudge, Katherine; Foskett, Pierre; Stevens, David; Alikian, Mary; White, Helen E; Cross, Nicholas C P; Apperley, Jane; Foroni, Letizia

    2012-07-01

    Reverse transcription quantitative polymerase chain reaction (RT-qPCR) is currently the most sensitive tool available for the routine monitoring of disease level in patients undergoing treatment for BCR-ABL1-associated malignancies. Considerable effort has been invested at both the local and international levels to standardise the methodology and reporting criteria used to assess this critical metric. In an effort to accommodate the demands of increasing sample throughput and greater standardization, we adapted the current best-practice guidelines to encompass automation platforms and improved multiplex RT-qPCR technology.

  10. The promise and limits of PET texture analysis.

    PubMed

    Cheng, Nai-Ming; Fang, Yu-Hua Dean; Yen, Tzu-Chen

    2013-11-01

    Metabolic heterogeneity is a recognized characteristic of malignant tumors. Positron emission tomography (PET) texture analysis evaluates intratumoral heterogeneity in the uptake of (18)F-fluorodeoxyglucose. There is recent evidence that PET textural features are of prognostic significance in patients with different solid tumors. Unfortunately, crucial standardization challenges remain before PET texture parameters can move from their current use as research tools into the arena of validated technologies for use in oncology practice. Testing their generalizability, robustness, consistency, and limitations is necessary before implementing them in daily patient care.

  11. Use of CRISPR/Cas Genome Editing Technology for Targeted Mutagenesis in Rice.

    PubMed

    Xu, Rongfang; Wei, Pengcheng; Yang, Jianbo

    2017-01-01

    The Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)/CRISPR-associated protein (Cas) system is a newly emerging mutagenesis (gene-editing) tool in genetic engineering. Several genes in agriculturally important crops have been successfully mutated with the system, and some agronomically important traits have been rapidly generated, indicating its potential applications in both scientific research and plant breeding. In this chapter, we describe a standard gene-editing procedure for effectively targeting rice genes and generating specific rice mutants with the CRISPR/Cas9 system via Agrobacterium-mediated transformation.

  12. Security concept in 'MyAngelWeb((R))' a website for the individual patient at risk of emergency.

    PubMed

    Pinciroli; Nahaissi; Boschini; Ferrari; Meloni; Camnasio; Spaggiari; Carnerone

    2000-11-01

    We describe the Security Plan for the 'MyAngelWeb' service. The different actors involved in the service are subject to different security procedures. The core of the security system is implemented at the host site by means of a DBMS and standard Information Technology tools. Hardware requirements for sustainable security apply at the web-site construction sites but not at the emergency physician's site, where a two-way authentication system (password and test-phrase method) is implemented instead.

  13. Grid Modernization Laboratory Consortium - Testing and Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroposki, Benjamin; Skare, Paul; Pratt, Rob

    This paper highlights some of the unique testing capabilities and projects being performed at several national laboratories as part of the U.S. Department of Energy Grid Modernization Laboratory Consortium. As part of this effort, the Grid Modernization Laboratory Consortium Testing Network is being developed to accelerate grid modernization by enabling access to a comprehensive testing infrastructure and creating a repository of validated models and simulation tools that will be publicly available. This work is key to accelerating the development, validation, standardization, adoption, and deployment of new grid technologies to help meet U.S. energy goals.

  14. The RITA Network: How High-Energy Physics Tools Can Be Used to Transmit Clinical Hadrontherapy Data

    NASA Astrophysics Data System (ADS)

    Ferraris, M.; Risso, P.; Squarcia, S.

    We present the work done on the organization, selection, and transmission of radiotherapy data and images: the choice of a standard healthcare record based on stereotactic and/or conformational radiotherapy, the implementation of the healthcare file in a distributed database using the World Wide Web platform for data presentation and transmission, and its availability over the network. The solution chosen is a good example of technology transfer between high-energy physics and medicine, and it opens interesting new avenues in this field.

  15. A Web Portal-Based Time-Aware KML Animation Tool for Exploring Spatiotemporal Dynamics of Hydrological Events

    NASA Astrophysics Data System (ADS)

    Bao, X.; Cai, X.; Liu, Y.

    2009-12-01

    Understanding the spatiotemporal dynamics of hydrological events such as storms and droughts is highly valuable for decision making on disaster mitigation and recovery. Virtual Globe-based technologies such as Google Earth and the Open Geospatial Consortium KML standard show great promise for collaborative exploration of such events using visual analytical approaches. However, two barriers currently limit the wider use of such approaches. First, there is no easy way, using open-source tools, to convert legacy or existing data formats such as shapefiles, GeoTIFF, or web service-based data sources to KML and to produce time-aware KML files. Second, an integrated web portal-based time-aware animation tool is not yet available: users can share their files in a portal but have no means of visually exploring them without leaving the portal environment they are familiar with. We have developed a web portal-based time-aware KML animation tool for viewing extreme hydrologic events. The tool is based on the Google Earth JavaScript API and the Java Portlet 2.0 standard (JSR-286), and it is currently deployable in one of the most popular open-source portal frameworks, Liferay. We have also developed an open-source toolkit, kml-soc-ncsa (http://code.google.com/p/kml-soc-ncsa/), to facilitate the conversion of multiple formats into KML and the creation of time-aware KML files. We illustrate the tool with example cases in which drought and storm events can be explored in both their time and space dimensions within the web-based KML animation portlet. The tool provides an easy-to-use, browser-based portal environment in which multiple users can collaboratively share and explore their time-aware KML files and improve their understanding of the spatiotemporal dynamics of hydrological events.
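    The time-aware KML production described above centers on KML's TimeSpan element, which tells a Virtual Globe client when a feature should be visible during animation. A minimal sketch of producing such a file with only the Python standard library (the function name and sample coordinates are illustrative, not part of the kml-soc-ncsa toolkit):

    ```python
    # Sketch: build a minimal time-aware KML document containing one Placemark
    # with a TimeSpan element, the construct a time-aware animation tool drives.
    import xml.etree.ElementTree as ET

    KML_NS = "http://www.opengis.net/kml/2.2"

    def timespan_placemark(name, lon, lat, begin, end):
        """Return a KML string with a single point Placemark valid from begin to end."""
        ET.register_namespace("", KML_NS)  # serialize KML as the default namespace
        kml = ET.Element(f"{{{KML_NS}}}kml")
        doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
        pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
        ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
        ts = ET.SubElement(pm, f"{{{KML_NS}}}TimeSpan")
        ET.SubElement(ts, f"{{{KML_NS}}}begin").text = begin  # ISO 8601 timestamps
        ET.SubElement(ts, f"{{{KML_NS}}}end").text = end
        pt = ET.SubElement(pm, f"{{{KML_NS}}}Point")
        ET.SubElement(pt, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat},0"
        return ET.tostring(kml, encoding="unicode")

    kml_text = timespan_placemark("Storm cell", -88.2, 40.1,
                                  "2008-06-01T00:00:00Z", "2008-06-02T00:00:00Z")
    ```

    A converter would emit one such Placemark (or polygon) per feature and time step; a Virtual Globe client's time slider then animates whichever features' TimeSpans overlap the current time window.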

  16. Web-Based Tools for Data Visualization and Decision Support for South Asia

    NASA Astrophysics Data System (ADS)

    Jones, N.; Nelson, J.; Pulla, S. T.; Ames, D. P.; Souffront, M.; David, C. H.; Zaitchik, B. F.; Gatlin, P. N.; Matin, M. A.

    2017-12-01

    The objective of the NASA SERVIR project is to assist developing countries in using information provided by Earth-observing satellites to assess and manage climate risks, land use, and water resources. We present a collection of web apps that integrate Earth observations and in situ data to facilitate the deployment of data and water resources models as decision-making tools in support of this effort. The interactive nature of web apps makes them an excellent medium for creating decision support tools that harness cutting-edge modeling techniques. Thin-client apps hosted in a cloud portal eliminate the need for decision makers to procure and maintain the high-performance hardware required by the models, deal with issues related to software installation and platform incompatibilities, or monitor and install software updates; this problem is exacerbated at many of the regional SERVIR hubs, where both financial and technical capacity may be limited. All that is needed to use the system is an Internet connection and a web browser. We take advantage of these technologies to develop tools that can be centrally maintained yet openly accessible. Advanced mapping and visualization make results intuitive and the information derived actionable. We also take advantage of emerging standards for sharing water information across the web, namely the OGC- and WMO-approved WaterML standards. This makes our tools interoperable and extensible via application programming interfaces (APIs), so that tools and data from other projects can both consume and share the tools developed in our project. Our approach enables the integration of multiple types of data and models, thus facilitating collaboration between science teams in SERVIR.
The apps developed thus far by our team process time-varying netCDF files from Earth observations and large-scale computer simulations and allow visualization and exploration via raster animation and extraction of time series at selected points and/or regions.
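    The time-series extraction step mentioned above reduces to locating the grid cell nearest a user-selected point in a (time, lat, lon) field and slicing along the time axis. A minimal sketch of that logic, assuming a regular coordinate grid (all names and the toy data are illustrative, not the project's API):

    ```python
    # Sketch: extract a time series at the grid cell nearest a query point
    # from a gridded field laid out as field[time][lat_index][lon_index].

    def nearest_index(axis, value):
        """Index of the coordinate along axis closest to value."""
        return min(range(len(axis)), key=lambda i: abs(axis[i] - value))

    def extract_series(field, lats, lons, lat, lon):
        """Time series of field values at the cell nearest (lat, lon)."""
        i = nearest_index(lats, lat)
        j = nearest_index(lons, lon)
        return [field[t][i][j] for t in range(len(field))]

    # Toy field: 2 time steps on a 3x3 grid, value = t*10 + i*3 + j
    lats, lons = [30.0, 31.0, 32.0], [-90.0, -89.0, -88.0]
    field = [[[t * 10 + i * 3 + j for j in range(3)] for i in range(3)]
             for t in range(2)]
    series = extract_series(field, lats, lons, 31.2, -88.9)  # nearest cell: i=1, j=1
    # series == [4, 14]
    ```

    In practice the field would come from a netCDF variable rather than nested lists, but the nearest-cell lookup and time-axis slice are the same.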

  17. Recent Developments in Microsystems Fabricated by the Liga-Technique

    NASA Technical Reports Server (NTRS)

    Schulz, J.; Bade, K.; El-Kholi, A.; Hein, H.; Mohr, J.

    1995-01-01

    As an example of microsystems fabricated by the LIGA technique (x-ray lithography, electroplating, and molding), three systems are described and characterized: a triaxial acceleration sensor system, a micro-optical switch, and a microsystem for the analysis of pollutants. The fabrication technologies are reviewed with respect to the key components of the three systems: an acceleration sensor, an electrostatic actuator, and a spectrometer made by the LIGA technique. A micro-pump and micro-valve, made using micromachined tools for molding, and optical fiber imaging are made possible by combining LIGA with anisotropic etching of silicon in a batch process. These examples show that the combination of technologies and components is the key to complex microsystems. The design of such microsystems will be facilitated if standardized interfaces are available.

  18. Fleet DNA (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walkowicz, K.; Duran, A.

    2014-06-01

    The Fleet DNA project objectives include capturing and quantifying drive cycle and technology variation for the multitude of medium- and heavy-duty vocations; providing a common data storage warehouse for medium- and heavy-duty vehicle fleet data across DOE activities and laboratories; and integrating existing DOE tools, models, and analyses to provide data-driven decision-making capabilities. Fleet DNA advantages include: for government, providing in-use data for standard drive cycle development, R&D, tech targets, and rulemaking; for OEMs, real-world usage datasets that provide concrete examples of customer use profiles; for fleets, vocational datasets that help illustrate how to maximize return on technology investments; for funding agencies, ways to optimize the impact of financial incentive offers; and for researchers, a data source for modeling and simulation.

  19. Freva - Freie Univ Evaluation System Framework for Scientific HPC Infrastructures in Earth System Modeling

    NASA Astrophysics Data System (ADS)

    Kadow, C.; Illing, S.; Schartner, T.; Grieger, J.; Kirchner, I.; Rust, H.; Cubasch, U.; Ulbrich, U.

    2017-12-01

    The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science (e.g. www-miklip.dkrz.de, cmip-eval.dkrz.de). Freva runs on high-performance computers to handle customizable evaluation systems of research projects, institutes, or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environments. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the metadata of the self-describing model, reanalysis, and observational data sets in a database. This metadata system, with its advanced but easy-to-use search tool, supports users, developers, and their plugins in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools to the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. The integrated web shell (shellinabox) adds a degree of freedom in the choice of working environment and can be used as a gateway to the research project's HPC system. Plugins can integrate their results, e.g. post-processed output, into the user's database; this allows post-processing plugins to feed statistical analysis plugins, which fosters an active exchange between the plugin developers of a research project. Additionally, the history and configuration subsystem stores every analysis performed with the evaluation system in a database. Configurations and results of the tools can be shared among scientists via the shell or web system. Furthermore, if configurations match when an evaluation plugin is started, the system suggests using results already produced by other users, saving CPU hours, I/O, disk space, and time. The efficient interaction between different technologies improves the Earth system modeling science framed by Freva.
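    The configuration-matching reuse described above amounts to keying stored results by a canonical digest of the plugin configuration, so that two runs with the same settings (regardless of option order) map to the same entry. A minimal sketch of that idea (the function names and in-memory store are illustrative, not Freva's actual API):

    ```python
    # Sketch: reuse a prior result when a plugin is started with an identical
    # configuration, by keying a cache on a canonical hash of the config dict.
    import hashlib
    import json

    _results = {}  # config digest -> stored result (stand-in for the history DB)

    def config_key(plugin, config):
        """Order-independent digest of a plugin name plus its configuration."""
        canonical = json.dumps(config, sort_keys=True)
        return hashlib.sha256(f"{plugin}:{canonical}".encode()).hexdigest()

    def run_plugin(plugin, config, compute):
        """Return (result, reused) - reuse a stored result on a config match."""
        key = config_key(plugin, config)
        if key in _results:              # identical run already exists
            return _results[key], True
        result = compute(config)         # otherwise run the analysis
        _results[key] = result
        return result, False

    r1, reused1 = run_plugin("anomaly", {"var": "tas", "season": "DJF"},
                             lambda c: f"computed {c['var']}")
    # Same settings in a different key order: digest matches, result is reused.
    r2, reused2 = run_plugin("anomaly", {"season": "DJF", "var": "tas"},
                             lambda c: f"computed {c['var']}")
    ```

    Sorting the keys before hashing is what makes the lookup insensitive to the order in which options were supplied; a real system would also persist the store and let the user decline the suggestion and recompute.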

  20. The Interactive Candidate Assessment Tool: A New Way to Interview Residents.

    PubMed

    Platt, Michael P; Akhtar-Khavari, Vafa; Ortega, Rafael; Schneider, Jeffrey I; Fineberg, Tabitha; Grundfast, Kenneth M

    2017-06-01

    The purpose of the residency interview is to determine the extent to which a well-qualified applicant is a good fit with a residency program. However, the questions asked during residency interviews tend to be standard and repetitive, and they may not elicit the information that best differentiates one applicant from another. The iCAT (interactive Candidate Assessment Tool) is a novel interview instrument that allows both interviewers and interviewees to learn about each other in a meaningful way. The iCAT uses a tablet computer to let the candidate select questions from an array of video and non-video vignettes: some icons open recorded videos about aspects of the program, while others present questions within recognizable categories. Post-interview surveys demonstrated advantages over traditional interview methods, with 93% of respondents agreeing that the iCAT is an innovative and effective tool for conducting residency program interviews. The iCAT is a technological advancement that facilitates in-depth candidate assessment.
